Abstracts from the Literature

By James Ross, Volume 11 No. 2

Staub D, et al. Effectiveness of a repellent containing DEET and EBAAP for preventing tick bites. Wilderness Environ Med 2002; 13(1): 12-20.

OBJECTIVE

TOPICAL REPELLENTS can provide effective personal protection from tick-borne diseases by preventing the attachment of ticks. The goal of this study was to assess the effectiveness of a commercially available repellent spray containing both N,N-diethyl-3-methylbenzamide, previously known as N,N-diethyl-m-toluamide (DEET), and ethyl butylacetylaminopropionate (EBAAP) against tick bites in a population at risk in Switzerland under real-life conditions.

METHODS

The effectiveness of an insect repellent spray containing both DEET and EBAAP was evaluated in a randomised double-blind placebo-controlled study. The study, requiring simple application of the repellent to exposed skin, was carried out on 276 forestry workers and orienteers under everyday conditions in Switzerland from May to September 1999. We measured total effectiveness of the repellent by the following formula: percentage effectiveness = 100 x (Tp - Tr) / Tp, where Tp and Tr were the average number of tick bites per hour spent in the wooded areas for the placebo and repellent groups, respectively.
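As a quick arithmetic check (not part of the paper), a minimal Python sketch of this formula, using the group means reported in the results below:

def repellent_effectiveness(tp, tr):
    # Percentage effectiveness = 100 * (Tp - Tr) / Tp, with Tp the placebo
    # group mean and Tr the repellent group mean (tick bites per exposure-hour)
    return 100.0 * (tp - tr) / tp

# Reported group means: placebo 0.17, repellent 0.10 bites per hour
print(round(repellent_effectiveness(0.17, 0.10), 1))  # 41.2, in line with the reported 41.1%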

RESULTS
The average number of tick bites per hour of exposure to wooded areas differed significantly between the placebo (n=138) and repellent (n=138) groups, 0.17 vs 0.10 (P<0.05). Total repellent effectiveness against tick attachment was 41.1% (95% CI, 2.5-79.6). On the arms an effectiveness of 66% (95% CI 17.3-114.7) was observed. No significant difference in the average number of unattached ticks could be found.

CONCLUSIONS
This study found that an easily applied repellent is moderately effective in reducing the risk of tick bites.

COMMENT
Only moderately effective. The subjects were told to apply repellent twice a day, but not how extensively to apply it. As part of a study they are likely to be far more rigorous in application, both in remembering and in the amount used. Thus, the practical effectiveness is likely to be considerably less than stated in this study. So, and this applies to protection from any vector, such protection is only partly successful and must be part of a range of protective measures.

Military Medicine 2002; Volume 167(2)

Supplement I: Proceedings of the international conference on low-level radiation injury and medical countermeasures.

This supplement provides some very useful articles on the current state of knowledge and areas of research in this field. The following are a couple of the offerings.

Blakely WF et al. Overview of low-level radiation exposure assessment: Biodosimetry. Mil Med 2002;167(2 Suppl): 20-24.

The capability to make diagnostic assessments of radiation exposure is needed to support triage of radiation casualties and medical treatment decisions in military operations. At the International Conference on low-level radiation injury and medical countermeasures session on biodosimetry in the military, participants reviewed the field of biomarkers, covering a wide range of biological endpoints. Participants evaluated early changes associated with exposure to ionising radiation, including chromosomal and DNA damage, gene expression and associated proteins, and DNA mutations. The use and development of advanced monitoring and diagnostic technologies compatible with military operations was emphasised. Conventional radiation bioassays require a substantial amount of time between when the sample is taken and when the data can be provided for decision making. These “reach back” bioassays are evaluated in laboratories that are not in the field: these laboratories routinely measure exposures of 25 cGy (photon equivalent levels). Detection thresholds can be reduced approximately fivefold by the addition of significant and tiresome scoring efforts. Alternative real-time biomarkers that can be measured in field laboratories or with handheld detection devices show promise as screening and clinical diagnostic tools, but they require further development and validation studies.

Kumar KS, et al. Nutritional approaches to radioprotection: Vitamin E. Mil Med 2002;167 (2 Suppl): 57-59.
Low-level radiation injury is dependent on the radiation dose and dose rate. The major military use of any potential radioprotectant is to prevent the short-term effects of lethality and the long-term effects of cancer and other pathologies from radiation exposure that may occur on a nuclear battlefield or in a nuclear material-contaminated field of operations. Therefore, a radioprotectant should not affect the ability of military personnel to perform tasks. Because exposure to ionising radiation induces free radical species, effective antioxidants, either alone or in combination with other agents, can be used as potential bioprotectors. To test this hypothesis, we studied vitamin E for its radioprotective efficacy. Using CD2F1 male mice as the model system, we observed that vitamin E at a dose of 400 IU/kg acts as a good radioprotectant against lethal doses of cobalt-60 radiation. Vitamin E was more efficacious when given subcutaneously than when given orally.

COMMENT
Vitamin E is advantageous as it is non-toxic and thus will not interfere with primary duties. Just what the mechanism is remains speculation: a ‘sink’ for free radicals is the most likely.

McDiarmid M et al. Health Effects and Biological Monitoring Results of Gulf War Veterans Exposed to Depleted Uranium. Mil Med 2002;167(2 Suppl): 123-124.
A small group of Gulf War veterans have retained fragments of depleted uranium (DU) shrapnel, the long-term health consequences of which are undetermined.

We evaluated the clinical health effects of DU exposure in Gulf War veterans compared with nonexposed Gulf War veterans. History and follow-up medical examinations were performed on 29 exposed veterans and 38 nonexposed veterans. Outcome measures used were urinary uranium determinations, clinical laboratory values, and psychiatric and neurocognitive assessment. Gulf War veterans with retained DU metal fragments were found to be still excreting elevated levels of urinary uranium 7 years after first exposure to DU (range for exposed individuals 0.01-30.7 µg/g creatinine vs 0.01-0.05 µg/g creatinine in the nonexposed). The persistence of the elevated urine uranium suggests ongoing mobilisation of uranium from a storage depot, resulting in chronic systemic exposure. Adverse effects in the kidney, a presumed target organ, were not seen at the time of the study; however, other subtle effects were observed in the reproductive and central nervous systems of the DU-exposed veterans.

COMMENT
If any significant health effects do occur from DU, they must have a very long latency period. DU shrapnel has produced no medium-term health effects, so the concerns about short-term exposure can be laid to rest, unless there is a unique exposure.

International Conference on the Operational Impact of Psychological Casualties from Weapons of Mass Destruction – Proceedings. Mil Med 2001;166 (Suppl 2).

Another proceedings produced by Military Medicine as a complete supplementary issue.

Holmstedt MB, et al. The psychological aspect of the Anthrax Vaccine: ‘The Dover Experience’. Mil Med 2001;166 (Suppl 2): 36-40.

Discussion focuses on the Department of Defense (DoD) Anthrax Vaccination Implementation Program (AVIP) at Dover Air Force Base (AFB), Delaware. The history relates the effects of an organised ‘anthrax-no’ group in using the Internet and the media to launch an intense rumour campaign against the DoD AVIP, which leads to mistrust of, and lack of confidence in, senior leaders and the medical community. For many airmen, the fear of the vaccine takes precedence over the threat of the disease.

Dover Wing and Medical Group personnel combat the ‘anthrax-no’ campaign by providing increased education on the AVIP. Experts in the field are brought into Dover to answer questions and to provide information to begin the process of rumour control. Policies are established by the medical group to assure the community’s medical needs are expediently, thoroughly and competently met. The primary focus is to provide a safe, non-judgmental environment for members to voice their concerns and to resolve their medical issues.

COMMENT
The DoD policy on anthrax vaccine for the military in the US has just changed (end Jun 02), not because of the risk profile of the vaccine but because of difficulty in obtaining sufficient stocks, the need to stockpile for possible use in civilians, and the recognition that the risk of a widespread outbreak is remote. That the vaccine is 1960s technology at best, requiring 6 shots over 18 months plus annual boosters, and is very expensive are other relevant issues. Some 450 US military personnel have been disciplined in some way as a result of refusing the vaccine.

Pastel R. Collective Behaviours: Mass Panic and Outbreaks of Multiple Unexplained Symptoms. Mil Med 2001;166 (Suppl 2): 44-46.

The general public, the mass media, and many government officials believe that the use of weapons of mass destruction (WMD) will inevitably lead to mass panic and/or mass hysteria. However, studies of disasters and wars show that disorganised flight in the presence of a real or perceived danger (i.e. mass panic) is rare. On the other hand, in a real or perceived WMD scenario, outbreaks of multiple unexplained symptoms (i.e. mass psychogenic illness, mass sociogenic illness, mass hysteria or epidemic hysteria) may be prevalent. Many of these symptoms (fatigue, nausea, vomiting, headache, dizziness, lightheadedness, and anorexia) are common in combat and after toxic chemical exposure, chemical weapon exposure, prodromal infectious disease, and acute radiation sickness.

COMMENT
Always a fascinating subject, I feel, mass psychogenic illness. It will likely be a significant confounding factor in the assessment of possible attacks. As noted above, the classic symptoms of MPI can easily mimic the poorly differentiated symptoms seen early in an epidemic of biological cause, or at any time in a chemical incident. The more expectant of an outbreak people are, the more likely they are to convince themselves of its reality, particularly if other people are complaining of symptoms.

Ramalingaswami V. Psychosocial effects of the 1994 Plague Outbreak in Surat, India. Mil Med 2001;166 (Suppl 2): 29-30.

The plague outbreak in Surat, India in September 1994 stirred a nationwide panic and near international isolation of India. These are aspects that need serious attention. A large amount of damage to India’s image and an immense economic loss occurred. Some advice for the future is suggested.

COMMENT

This was a remarkable event, and the blind hopelessness that has marked mankind’s response to the Black Death resurfaced as if we had learned nothing. I recall that the ADF sent a doctor to India to placate the members of the Australian expatriate community, particularly the diplomats. ‘In one night, an avoidable exodus of 600,000 people fled Surat by whatever means available, including horse carriage, ox cart, or even on foot… Doctors fled the city saying “this plague, nothing can be done”.’

Dervay JP, Powell MR, Butler B, and Fife CE. The Effect of Exercise and Rest Duration on the Generation of Venous Gas Bubbles at Altitude. Aviat Space Environ Med 2002; 73:22-7

BACKGROUND

Decompression, as occurs with aviators and astronauts undergoing high altitude operations or with deep-sea divers returning to the surface, can cause gas bubbles to form within the organism. The pressure changes required to evoke bubble formation in vivo during depressurisation are several orders of magnitude less than those required for gas-phase formation in vitro in quiescent liquids. Preformed micronuclei acting as ‘seeds’ have been proposed, dating back to the 1940s. These tissue gas micronuclei have been attributed to a minute gas phase located in hydrophobic cavities, to surfactant-stabilised microbubbles, or to musculoskeletal activity. The lifetimes of these micronuclei have been presumed to be from a few minutes to several weeks.

HYPOTHESIS

The greatest incidence of venous gas emboli (VGE) will be detected by precordial Doppler ultrasound with depressurisation immediately following lower extremity exercise, with progressively reduced levels of VGE observed as the interval from exercise to depressurisation lengthens.

METHODS

In a blinded cross-over design, 20 individuals (15 men, 5 women) at sea level exercised by performing knee-bend squats (150 knee flexes over 10 min, 235 kcal·h⁻¹) either at the beginning, middle, or end of a 2-h chair-rest period without an oxygen pre-breathe. Seated subjects were then depressurised to 6.2 psia (6706 m or 22,000 ft altitude equivalent) for 120 min with no exercise performed at altitude.

RESULTS

Of the 20 subjects with VGE in the pulmonary artery, 10 demonstrated a greater incidence of bubbles with exercise performed just prior to depressurisation, compared with decreasing bubble grades and incidence as the interval of rest increased prior to depressurisation. No decompression illness was reported.

CONCLUSIONS
There is a significant increase in decompression-induced bubble formation at 6.2 psia when lower extremity exercise is performed just prior to depressurisation as compared with longer rest intervals. Analysis indicated that micronuclei half-life is on the order of an hour under these hypobaric conditions.
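To put a micronuclei half-life on the order of an hour in concrete terms, here is a minimal sketch assuming simple first-order (exponential) decay of exercise-generated micronuclei; the decay model is an illustrative assumption, not the authors' analysis.

def fraction_remaining(rest_minutes, half_life_minutes=60.0):
    # Fraction of micronuclei assumed to survive a given rest interval,
    # under first-order (exponential) decay with a ~1 h half-life
    return 0.5 ** (rest_minutes / half_life_minutes)

for rest in (0, 60, 120):  # exercise at the end, middle or start of the 2-h chair rest
    print(rest, round(fraction_remaining(rest), 2))
# 0 1.0, 60 0.5, 120 0.25: consistent with fewer VGE as the rest interval lengthens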

COMMENT
This involved pre-depressurisation exercise, not exercise during depressurisation. And, of course, there were bubbles but not symptoms, which raises the question of what actually causes symptoms. Just because there are more bubbles does not necessarily mean the risk of DCI is increased.

Grimson JM, Schallhorn SC, and Kaupp SE. Contrast Sensitivity: Establishing Normative Data for Use in Screening Prospective Naval Pilots. Aviat Space Environ Med 2002; 73:28-35.

BACKGROUND

Contrast sensitivity testing can be a useful supplement to standard visual acuity tests. Currently, there are no standards for contrast sensitivity in military aviation. Student naval pilots, who often have better-than-average visual acuity, could be expected to have better-than-average contrast sensitivity. Any attempt to establish contrast sensitivity standards for military aviation should begin with establishing normative data, particularly data gathered from the military aviation community.

HYPOTHESIS
Student naval pilots differ from the general military population on Small Letter Contrast Test measurements.

METHODS
Contrast sensitivity was measured in a group of student naval pilots (n = 107) and compared with results from aviation and non-aviation personnel. The Small Letter Contrast Test (SLCT) was used (19). Other subjects consisted of student naval flight officers (n = 40), experienced naval pilots and flight officers (n = 35 and 86, respectively), enlisted aircrew (n = 175), and other military personnel tested before undergoing photorefractive keratectomy (n = 185).

RESULTS
Data collected provide large-group demographic characteristics and normative values for contrast sensitivity measured with the SLCT. Of the non-aviation controls, 95% scored at least 0.62 (read at least seven lines plus two of ten letters on the eighth line of the chart), and 95% of the student pilots scored at least 0.81 (read at least nine lines plus one letter on the 10th line).

CONCLUSION
Student naval pilots scored significantly better on the SLCT than the military control population. The SLCT shows potential as a screening device during induction physical examinations of military pilots.

COMMENT
Just because someone has better than average visual acuity does not mean (with certainty) they will have better than average contrast sensitivity. They are different parameters and may not be linked. It would be useful to be able to define the minimum acceptable contrast sensitivity for recruitment, and then also for post-refractive or other eye surgery. Eventually, a consensus will be reached on a standard.

Smith SD. Characterising the Effects of Airborne Vibration on Human Body Vibration Response. Aviat Space Environ Med 2002; 73:36-45.

BACKGROUND
Exposure to high intensity, low-frequency noise can cause whole-body vibration. Such exposures to airborne vibration can reach the limits of human tolerance and have been associated with physiological and pathological disorders. The objective of this study was to characterise human body vibration response during exposures to operational airborne vibration.

METHODS
Triaxial body accelerations were collected at multiple anatomical sites with the subject located at selected crew positions during ground-based engine runup tests on several military tactical aircraft. The acceleration time histories were processed in one-third octave frequency bands and compared with the one-third octave band noise data.

RESULTS

The most significant finding was the occurrence of a resonance peak in the fore-and-aft (X) chest acceleration in the frequency bands between 63 and 100 Hz. Both the chest acceleration and associated noise level increased as the subject moved aft of the exhaust outlet, coinciding with the report of increasing chest vibration. A relatively linear relationship was found between the overall chest accelerations and noise levels between 5 and 250 Hz. An approach to developing combined noise and vibration exposure criteria was proposed.

CONCLUSIONS

The resonance observed in the upper torso strongly suggested that airborne vibration in the 60 to 100 Hz frequency band may be an important contributing factor in the generation of subjective symptoms and possibly physiological and pathological disorders. Additional field and laboratory studies are required to validate the relationship between the biodynamic responses, noise levels, and physiological and pathological consequences.

COMMENT
Vibroacoustic disease is a field of continuing research and increasing concern. Infrasound can cause a huge range of problems, from interference with concentration to psychological effects to significant long-term disease. The range of 60-100 Hz is higher than is generally viewed to be the main culprit.

Koda E. Could foot and mouth disease be a biological warfare incident? Mil Med 2002;167(2): 91-92.

COMMENT
This editorial postulates that the foot and mouth outbreak in the UK in 2001 could have been an act of bioterrorism. While the idea is worth considering, the detail lacks any credibility and in the end I was left with the feeling that this should never have made it into print. All concerned should have backed off and realised it was simply not the case - at least not on this occasion. Truly a silly article.

Ohrui N, et al. Physiological Incidents During 39 Years of Hypobaric Chamber Training in Japan. Aviat Space Environ Med 2002; 73(4): 395-398.

BACKGROUND
Hypobaric chamber training for military aircrew is very important for flight safety. Since we began hypobaric training in our laboratory in 1960, some trainees have suffered physiological incidents. This study characterises the physiological incidents during hypobaric chamber training at the Japan Air Self-Defence Force (JASDF).

METHODS
All available training records from 1960-1998 were reviewed and the frequency of physiological incidents counted and analysed.

RESULTS
There were 29,677 trainees and 58,454 exposures. The overall frequency of physiological incidents was 6.3%. Physiological incidents included ear pain, paranasal sinus pain, abdominal pain, hypoxia, hyperventilation, joint pain, and toothache. Decompression sickness (DCS-1, simple joint pain only) was rare. In cases of DCS-1, joint pain was easily relieved with controlled descent. During the last three decades, the overall prevalence of physiological incidents gradually increased from 5.3-6.1% before 1991 to 6.8-9.9% after 1991. However, the prevalence rate showed no change throughout the period when ear pain was factored out. The increase in prevalence was entirely due to an increased frequency of ear pain: 3.6-4.6% before 1991, and 5.4-7.2% after 1991.

CONCLUSIONS
DCS has not been a problem in the ASDF hypobaric chamber training experience. The majority of physiological incidents during hypobaric chamber training in ASDF have been ear pain, a minor but frequent obstacle to hypobaric training. The exact cause of the observed increase in frequency of Eustachian tube dysfunction currently remains unclear.

COMMENT
A continuing topic of discussion, and perplexity, in the aviation medicine fraternity is the widely differing rates of decompression illness resulting from seemingly analogous exposures at various training establishments around the globe. Australia is firmly in the high-rate camp, to the extent that the ADF no longer runs hypobaric chamber profiles above 10,000 feet, with hypoxia instead being experienced by breathing a gas mix with a reduced oxygen percentage. This report puts Japan firmly in the low-rate camp.

Ireland R. Pharmacologic Considerations for Serotonin Reuptake Inhibitor Use by Aviators. Aviat Space Environ Med 2002; 73:421-9

Physicians frequently use serotonin reuptake inhibitors (SRIs) to treat a variety of psychiatric and medical conditions, many of which occur in aviators. SRIs are efficacious for treating acute conditions and may also prove useful for prophylaxis against recurrence through maintenance dosing. Aviators must meet standard safety criteria in order to use medications while performing flying duties and must receive individual waivers as well. This article reviews the particular threats that serotonergic agents pose to aviation safety. Some SRIs may prove safer than others to use in the aviation environment, but such medications will require appropriate ground testing and must provide aeromedically safe control of the symptoms for which they are prescribed.

COMMENT

SRIs are used for long-term control of depression, as well as other psychological and dependence conditions. In aviation, the banning of SRIs in aircrew will likely lead to underground use or to poorly treated depressives. This review strongly suggests that SRIs have the potential to be safely used by aviators, provided appropriate protocols are followed. Further studies will follow.

Contributed by Darrell Duncan

Chaffin J, King JE, Fretwell LD. US Army Dental Emergency Rates in Bosnia. Mil Med 2001; 166 (12).

OBJECTIVE
Our objective is to report the overall dental emergency rate by dental classification in a US Army peacekeeping operation longer than 6 months in the year 2000.

MATERIALS AND METHODS

This study was a retrospective cohort analysis of dental emergencies experienced by soldiers of the 3rd Armored Cavalry Regiment as part of Stabilisation Force VII. Before the deployment, all soldiers received dental examinations and the necessary dental treatment to make them class 1 or 2. A dental emergency was identified from the field treatment records when a soldier presented to the clinic for a ‘sick call’, emergency, or trauma visit.

RESULTS

Retrospective review of the records identified 211 dental emergencies. Class 1 soldiers experienced 75 dental emergencies and class 2 soldiers experienced 136 emergencies. 3rd Armored Cavalry Regiment soldiers spent an average of 201.95 days deployed. The overall emergency rate was 156 dental emergencies per 1000 soldiers per year. Class 1 and 2 rates were 121 and 185 dental emergencies per 1000 troops per year respectively.
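The quoted rates are emergencies per 1000 soldiers per year of deployed time; a minimal sketch of that person-time calculation follows (the cohort size below is a hypothetical figure for illustration only, not taken from the paper).

def de_rate_per_1000_soldier_years(emergencies, n_soldiers, avg_days_deployed):
    # Dental emergencies per 1000 soldiers per year of deployment
    soldier_years = n_soldiers * avg_days_deployed / 365.0
    return 1000.0 * emergencies / soldier_years

# 211 emergencies over an average of 201.95 deployed days; a cohort of about
# 2,450 soldiers (hypothetical) reproduces the published overall rate of ~156
print(round(de_rate_per_1000_soldier_years(211, 2450, 201.95)))  # 156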

CONCLUSIONS
The results tend to confirm that dental emergencies continue to be a threat to overall readiness in deployed environments. Military planners need to ensure that the dental component of future forces is sufficient to care for the expected emergencies.

COMMENT
This is one of the few studies of dental disease rates that I have seen. The authors acknowledge such studies are sparse. This study has a number of limitations, as indicated by the authors. Comparison with other results, and in particular the ADF experience, should be done with caution, in particular noting any differences in definitions of a dental emergency. The authors rightly point out that more studies of this nature are desirable to assist in the development of predictive models of dental casualties to aid planners. For instance, it would be of interest to see studies comparing the presentation rate to sick parades for each of the ADF dental categories. This would allow validation of our current policy with respect to dental readiness.

Pijnenburg ACM, Glas AS, de Roos MAJ, Bogaard K, Lijmer JG, Bossuyt PMM, Butzelaar RMJM, Keeman JN. Radiography in Acute Ankle Injuries: The Ottawa Ankle Rules Versus Local Diagnostic Decision Rules. Ann Emerg Med 2002;39(6).

STUDY OBJECTIVES

We validate the Ottawa Ankle Rules and two Dutch ankle rules in distinguishing clinically significant fractures from insignificant fractures and other injuries in patients with a painful ankle presenting to the emergency department.

METHODS
This prospective comparison of three ankle rules was conducted in the ED of a 580-bed community teaching hospital in Amsterdam from January 1998 to April 1999.

Participants included 647 consecutive patients aged 18 years or older presenting with a painful ankle after trauma. All physicians received extensive and pictorial training on how to correctly score the respective items of the rules. The physician on call recorded these items, derived from history and physical examination, on a standardised datasheet. All patients subsequently underwent standard radiographic assessment. A radiologist and a trauma surgeon evaluated the radiographs blinded to the results of the datasheet and the treatment given. The diagnostic performance of the three rules was measured in terms of sensitivity, specificity and the reduction in radiographs. Receiver operating characteristic (ROC) curves were constructed and the area under the ROC curves was calculated and compared.

RESULTS
Seventy-four fractures were seen, of which 41 were clinically significant. The Ottawa Ankle Rules had a sensitivity of 98% for identifying clinically significant fractures; the local rules scored 88% and 59% respectively. The potential savings in radiographs for the 3 decision rules were 24%, 54% and 82% respectively. The area under the ROC curve was better for both the local rules (0.84 and 0.83) compared with the Ottawa Ankle Rules (0.76).
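For readers unfamiliar with how these figures are derived, a brief sketch of the two performance measures follows; the counts used are hypothetical, chosen only to be consistent with the Ottawa figures reported above.

def sensitivity(true_positives, false_negatives):
    # Proportion of clinically significant fractures flagged by the rule
    return true_positives / (true_positives + false_negatives)

def radiograph_saving(rule_negatives, total_patients):
    # Proportion of patients in whom the rule would have avoided a radiograph
    return rule_negatives / total_patients

# Hypothetical split: the rule misses 1 of 41 significant fractures and is
# negative in 155 of the 647 patients
print(round(sensitivity(40, 1), 2))            # 0.98
print(round(radiograph_saving(155, 647), 2))   # 0.24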

CONCLUSION
Because the identification of all relevant fractures is more important than a reduction in radiographs, the higher sensitivity of the Ottawa Ankle Rules makes these most suitable for implementation in the Netherlands.

COMMENT
This study looked to see if a clinical rule validated elsewhere (the Ottawa Ankle Rules1,2) could be applied in a different environment. The age range of the patients in the study was wider than in the ADF population (18-92, with an average of 35). Despite this difference in populations, it is not unreasonable to expect that the Ottawa Ankle Rules could be used within the ADF to reduce the number of ankle radiographs ordered and to reduce the likelihood of missed fractures. If the incidence of ankle injuries and ankle radiographs is known, the potential benefit of this measure could be calculated.

References

  1. Stiell IG, Greenberg GH, McKnight RD, et al. A study to develop clinical decision rules for the use of radiography in acute ankle injuries. Ann Emerg Med 1992; 21: 384-390.
