Clinical journal of sport medicine: official journal of the Canadian Academy of Sport Medicine

Published by Lippincott Williams & Wilkins
Print ISSN: 1050-642X
To identify the nature and extent of research in sport injury prevention with respect to 3 main categories: (1) training, (2) equipment, and (3) rules and regulations. We searched PubMed, CINAHL, Web of Science, Embase, and SPORTDiscus to retrieve all sports injury prevention publications. Articles were categorized according to the Translating Research into Injury Prevention Practice (TRIPP) model. We retrieved 11 859 articles published since 1938. Fifty-six percent (n = 6641) of publications were nonresearch (review articles and editorials). Publications documenting incidence (n = 1354) and etiology (n = 2558) were the most common original research articles (33% of total). Articles reporting preventive measures (n = 708) and efficacy (n = 460) were less common (10% of the total), and those investigating implementation (n = 162) and effectiveness (n = 32) were rare (1% of total). Six hundred seventy-seven studies focused on equipment and devices to protect against injury, whereas 551 investigated various forms of physical training related to injury prevention. Surprisingly, publications studying changes in rules and regulations aimed at increasing safety and reducing injuries were rare (<1%; n = 63), with a peak of only 20 articles over the most recent 5-year period and an average of 10 articles over the preceding 5-year blocks of time. Only 492 of 11 859 publications actually assessed the effectiveness of sports injury prevention interventions or their implementation. Research in the area of regulatory change is underrepresented and might represent one of the greatest opportunities to prevent injury.
To analyze differences in sports injury characteristics of the upper and lower extremity and to identify factors that contribute to the risk of sustaining an upper extremity injury compared with the risk of sustaining a lower extremity injury. Retrospective cohort study. An emergency department of a large European level I trauma center. A total of 25 120 patients with a simple sports injury, attending during 1990-2005. Independent variables used to assess risk factors were extracted from a local database. These included age, sex, type of injury, site and side of the injury, type of sport, injury mechanism, and data on admission. The main outcome measure was the relation of various risk factors to the occurrence of either upper or lower extremity injury. Logistic regression analysis was used to identify predictors for upper extremity injury. Thirty-five percent of injuries involved the upper extremity and 53% the lower extremity. Most injuries were sustained when playing soccer (36%). Fractures were diagnosed more frequently in the upper than in the lower extremities (44% and 14%, respectively), especially in children. Falling was the main cause of upper extremity injury. Further risk factors were young age and playing individual sports, no-contact sports, or no-ball sports. Women were at risk in speed skating, inline skating, and basketball, whereas men were injured mostly during skiing and snowboarding. A high percentage of sports injuries involve the upper extremity. Different risk factors were identified for each sex. These risk factors should be taken into account when designing preventive measures.
Objective: Describe ankle injury epidemiology among US high school athletes in 20 sports. Design: Descriptive prospective epidemiology study. Setting: Sports injury data for the 2005/06 to 2010/11 academic years were collected using an Internet-based injury surveillance system, Reporting Information Online. Participants: A nationwide convenience sample of US high schools. Assessment of risk factors: Injuries sustained as a function of sport and gender. Main outcome measures: Ankle sprain rates and patterns, outcomes, and mechanisms. Results: From 2005/06 to 2010/11, certified athletic trainers reported 5373 ankle sprains in 17,172,376 athlete exposures (AEs), for a rate of 3.13 ankle sprains per 10,000 AEs. Rates were higher for girls than for boys (rate ratio [RR], 1.25; 95% confidence interval [CI], 1.17-1.34) in gender-comparable sports and higher in competition than practice for boys (RR, 3.42; 95% CI, 3.20-3.66) and girls (RR, 2.71; 95% CI, 2.48-2.95). The anterior talofibular ligament was most commonly injured (involved in 85.3% of sprains). Overall, 49.7% of sprains resulted in loss of participation from 1 to 6 days. Although 0.5% of all ankle sprains required surgery, 6.6% of those involving the deltoid ligament also required surgery. The athletes were wearing ankle braces in 10.6% of all the sprains. The most common injury mechanism was contact with another person (42.4% of all ankle sprains). Conclusions: Ankle sprains are a serious problem in high school sports, with high rates of recurrent injury and loss of participation from sport.
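The headline figure in the abstract above is a standard exposure-based calculation: injuries divided by athlete exposures, scaled to 10,000. A minimal sketch in Python, using the counts reported in the abstract (the function name is illustrative, not from the study):

```python
def rate_per_10000(injuries: int, exposures: int) -> float:
    """Injury rate per 10,000 athlete exposures (AEs)."""
    return injuries / exposures * 10_000

# Figures reported in the abstract: 5373 ankle sprains in 17,172,376 AEs.
sprain_rate = rate_per_10000(5373, 17_172_376)
print(round(sprain_rate, 2))  # → 3.13
```

The same per-10,000-AE scaling underlies the competition-versus-practice rate ratios quoted in the results.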
To investigate changes in serum concentrations of the biochemical markers of brain damage S-100B and neuron-specific enolase (NSE) in ice hockey and basketball players during games. Descriptive clinical research. Competitive games of the Swedish Elite Ice Hockey League and the Swedish Elite Basketball League. Twenty-six male ice hockey players (from two teams) and 18 basketball players (from two teams). None. S-100B and NSE were analyzed using two-site immunoluminometric assays. The numbers of acceleration/deceleration events were assessed from videotape recordings of the games. Head trauma-related symptoms were monitored 24 hours after the game using the Rivermead Post Concussion Symptoms Questionnaire. Changes in serum concentrations of S-100B (postgame - pregame values) were statistically significant after both games (ice hockey, 0.072 ± 0.108 μg/L, P = 0.00004; basketball, 0.076 ± 0.091 μg/L, P = 0.001). In basketball, there was a significant correlation between the change in S-100B (postgame - pregame values) and jumps, the most frequent acceleration/deceleration event (r = 0.706, P = 0.002). For NSE, no statistically significant change in serum concentration was found in either game. For one ice hockey player who experienced a concussion during play, S-100B increased more than in the other players. S-100B was released into the blood of the players as a consequence of game-related activities and events. Analysis of biochemical brain damage markers (in particular S-100B) seems to have the potential to become a valuable additional tool for assessing the degree of brain tissue damage in sport-related head trauma and possibly for decision making about return to play.
Objective: To evaluate whether the Fédération Internationale de Football Association's "The 11+" injury prevention program improves physical fitness and technical performance in youth futsal players. Design: Randomized cohort study. Setting: Futsal club. Participants: Thirty-six futsal players (17.3 ± 0.7 years). Intervention: Players were randomized to an intervention group (n = 18) or a control group (n = 18). The intervention group performed "The 11+" twice per week for 12 weeks. Main outcome measures: Isokinetic testing to assess maximal quadriceps (Q) and hamstring (H) strength; vertical jump (squat jump, SJ; countermovement jump, CMJ), 5-m and 30-m sprint, agility, slalom, and balance performances were also measured. Results: The intervention group increased (P < 0.05) quadriceps concentric (14.7%-27.3%) and hamstrings concentric (9.3%-13.3%) and eccentric (12.7%) peak torque. The intervention group improved the functional H:Q ratio by 1.8% to 8.5% (P < 0.05) and improved (P < 0.05) SJ (13.8%), CMJ (9.9%), 5-m and 30-m sprint (8.9% and 3.3%, respectively), agility (4.7%), and slalom (4.8%) performances. The intervention group also improved balance, decreasing the number of falls by 30% in the nondominant limb. No changes were observed in the control group. Conclusions: The results suggest that "The 11+" can be used as an effective conditioning means for improving physical fitness and technical performance of youth futsal players.
Quadriceps contusions often result in significant time loss and the possibility of myositis ossificans. The objective of this descriptive case series was to document the results of an initial treatment regimen instituted within 10 minutes from the time of the injury. This study was a prospective case series of 47 midshipmen who sustained quadriceps contusions between August 1987 and December 2005 and who were treated identically and followed by serial examinations until the return to unrestricted full athletic activities. United States Naval Academy (USNA), Annapolis, Maryland. USNA midshipmen who sustained quadriceps contusions while participating in sports activities. Inclusion criteria were (1) stated inability at the time of the injury to continue participation and (2) the inability to perform a pain-free, isometric quadriceps contraction and maintain the knee in full extension with a straight leg lift. On diagnosis, the knee was passively flexed painlessly to 120 degrees and held continuously in that position in a brace for 24 hours. Use of the brace was discontinued at 24 hours, and the midshipman was instructed to perform active, pain-free quadriceps stretching several times a day and to perform pain-free isometric quadriceps strengthening exercises as soon as possible. Goals included pain-free knee flexion and quadriceps size and firmness equal to the uninjured side. The main outcome measure was the average time from the day of the injury to return to unrestricted full athletic activities with no disability. The average time to return to unrestricted full athletic activities with no disability was 3.5 days (range, 2 to 5 days). Radiographic examination of the first 23 midshipmen at 3 and 6 months following the injury revealed 1 case of myositis ossificans. Placing and holding the knee in 120 degrees of flexion immediately following a quadriceps contusion appears to shorten the time to return to unrestricted full athletic activities compared with reports in other studies.
To document the conditions seen by medical practitioners at a multidisciplinary sports medicine clinic during a 12-month period on the basis of site of injury, pathology, and sport played. A coding system for anatomical region, pathology, and sport played was designed. The total number of patient diagnoses coded and entered for analysis was 2,429. The most common sports involved were Australian football 322 (13.3%), distance running 299 (12.3%), netball/basketball 210 (8.6%), racquet sports 140 (5.8%), and track running 135 (5.6%). The most commonly injured region was the knee with 668 presentations (27.5%), followed by the upper limb (8.8%). The most frequently diagnosed pathology was overuse/inflammation with 1,115 (45.9%). Other pathologies diagnosed were partial ligament sprains 316 (13.0%), muscle strain 99 (4.1%), compartment syndrome 85 (3.5%), and third-degree ligament tear (3.5%). The most common diagnoses seen were patellofemoral syndrome, lumbar spine disorders, rotator cuff tendinitis, lateral ligament ankle sprain, medial meniscus tear, medial collateral ligament knee sprain, lateral meniscus tear, Achilles tendinosis, anterior cruciate ligament tear, and sacroiliac joint inflammation. A study of this nature provides valuable information to both the epidemiologist and the clinician.
Objective: To compare baseline scores of middle and high school students on the Sport Concussion Assessment Tool 2 (SCAT2) by sex and age. Design: Cross-sectional study. Setting: Single private school athletic program. Participants: Three hundred sixty-one middle and high school student-athletes. Intervention: The preseason SCAT2 was administered to student-athletes before athletic participation. Main outcome measures: Total SCAT2 score, symptoms, symptom severity, Glasgow coma scale, modified Balance Error Scoring System (BESS), coordination, and Standardized Assessment of Concussion (SAC) with subsections: Orientation, Immediate Memory, Concentration, and Delayed Recall. Results: No differences were found in total SCAT2 scores between sexes (P = 0.463) or ages (P = 0.21). Differences were found in subcomponents of the SCAT2. Twelve-year-olds had significantly lower concentration scores (3.3 ± 1.2) than 15- and 18-year-olds (3.9 ± 1.0 and 4.2 ± 1.0, respectively). The 12-year-olds also had the lowest percentage of correct responses for the SAC's concentration 5-digit (46%), 6-digit (21%), and months-backward (67%) tasks. Females presented with more symptoms (20.0 ± 2.2 vs. 20.6 ± 2.1, P = 0.007), better immediate memory (14.6 ± 0.9 vs. 14.3 ± 1.0, P = 0.022), and better BESS scores (27.2 ± 2.3 vs. 26.6 ± 2.6, P = 0.043) than their male counterparts. Conclusions: Normative values for total SCAT2 and subscale scores show differences in concentration between ages, whereas symptoms, BESS, and immediate memory differed between sexes. We also found that 12-year-olds have increased difficulty with the advanced concentration tasks, which lends support to the development of a separate instrument, such as the Child-SCAT3. The presence of developmental differences in the younger age groups suggests the need for annual baseline testing. Clinical relevance: Subtle differences between ages and sexes have been identified in many components of the SCAT2 assessment. These differences may support the current evolution of concussion assessment tools to provide the most appropriate test. Baseline testing should be used when available, and clinicians should be aware of potential differences when using normalized values.
Nandrolone is an anabolic steroid widely used in several sports. The numerous nandrolone-positive cases in recent years (International Olympic Committee statistics) have led to several studies in the antidoping field. Nevertheless, essential questions pertaining to endogenous nandrolone production, the effects of physical exercise on the excretion of nandrolone metabolites, and contamination from nutritional supplements must still be addressed. The purpose of this study was to evaluate the influence of exhaustive exercise on 19-norandrosterone (19-NA) and 19-noretiocholanolone (19-NE) urinary excretion rates after administration of labeled nandrolone. A total of 34 healthy male Caucasian volunteers from the Institute of Sports Sciences and Physical Education (University of Lausanne) applied to participate in the study. All subjects were free from any physical drug addiction and were instructed strictly to avoid any nutritional supplement or steroid before and during the study. The participants were randomly assigned to 2 groups in a double-blind manner: a placebo group and a group treated with 13C-labeled nandrolone. The urinary concentrations of the 2 main nandrolone metabolites, 19-NA and 19-NE, were measured using gas chromatography coupled with mass spectrometry. In addition, clinical parameters such as creatinine, total protein, and beta2-microglobulin levels were determined using immunologic assays. After an oral ingestion of a 25 mg 3,4-13C2-nandrolone dose, followed by a second identical dose 24 hours later, 19-NA and 19-NE could be detected in the urine for a period of 6 days after the initial intake. Despite several interesting observations, the measurements were very scattered and did not appear to be significantly influenced by exercise sessions in the athlete population. The results of this study suggest that physical exercise cannot be considered a reliable parameter that systematically affects nandrolone metabolite concentrations in the urine.
To investigate the outcome of subchondral stress fractures (SSF) of the knee after treatment with the prostacyclin analogue iloprost or the opioid analgesic tramadol. Case series/retrospective review. Tertiary care center. Fourteen patients with at least a single subchondral stress fracture of the knee, surrounded by bone marrow edema, visible on T1-weighted and short tau inversion recovery magnetic resonance images. Nine patients had been treated with oral iloprost (group 1; 11 SSF) and 5 patients with tramadol (group 2; 5 SSF) for 4 weeks in the course of a double-blind, randomized clinical trial. MR images were obtained at baseline (1 day before the start of treatment), after 3 months, and after 1 year. SSF volumes and their rates of change between baseline and follow-up examinations were determined on T1-weighted images by computer-assisted quantification. After 3 months, the SSF volumes had decreased by a median of 42.2% in group 1 and increased by a median of 2.2% in group 2 (P = 0.008). After 1 year, the median decrease in SSF volumes was 100.0% in group 1 and 65.7% in group 2 (P = 0.017). This small case series suggests that healing of SSF is more pronounced after iloprost treatment.
To determine the effect of a 12-month intensive ballet training regimen on hip and ankle range of motion in male and female, first- and second-year professional dancers. 12-month longitudinal follow-up. National classical ballet school in Australia. Twenty-eight female and 20 male full-time ballet students, aged (mean ± 1 SD) 16.8 ± 0.8 and 17.7 ± 1.2 years, respectively. Degrees of range of motion of left and right sides for the following movements: standing plié in parallel (passive ankle dorsiflexion, DF); standing turnout in the balletic first position (lower leg external rotation, LLER); supine hip external rotation (ER); supine hip internal rotation (IR). An additional range of motion was calculated: external rotation below the hip joint (BHER), derived by subtracting hip ER from LLER. In all subjects combined, hip and ankle ranges increased statistically significantly on the right. However, the amount was generally minimal and mostly at the borderline of the error associated with the measurement tool. While there was no change in LLER, there was a decrease in BHER. There were no overall gender differences, and year differences existed only for left hip ER and total hip ER, with first-year dancers showing significant improvements in these ranges. For DF and the sum of hip IR, first-year males and second-year females had increases in range. There was a negative relationship between baseline range and the amount of change over the 12 months. Dancers aged 16-18 years who enter full-time ballet training did not augment their ankle dorsiflexion to any appreciable degree. Some, but certainly not all, increased their hip active external rotation over 12 months without increasing their total lower limb turnout. Hip ER was more likely to improve in first-year rather than second-year students in this elite full-time training school.
To examine the effects of 16 weeks of intensive cycling training on seminal reactive oxygen species (ROS), malondialdehyde (MDA), superoxide dismutase (SOD), catalase, and total antioxidant capacity (TAC) in male road cyclists. Repeated measures design. The Exercise Physiology Laboratory of Urmia University. Twenty-four healthy nonprofessional male road cyclists (aged 17-26 years) participated in this study. All subjects participated in 16 weeks of intensive cycling training. Semen samples were collected at baseline (T1); immediately (T2), 12 (T3), and 24 (T4) hours after the last training session in week 8; immediately (T5), 12 (T6), and 24 (T7) hours after the last training session in week 16; and 7 (T8) and 30 (T9) days after the last training session in week 16. Total antioxidant capacity and SOD were measured by colorimetric assay. The levels of ROS were measured by a chemiluminescence assay. Malondialdehyde levels were measured by thiobarbituric acid reactive substance assay. Catalase was measured by monitoring the initial rate of disappearance of hydrogen peroxide (initial concentration 10 mM) at 240 nm. The levels of seminal ROS and MDA increased (P < 0.008) and remained high after 30 days of recovery. The levels of seminal SOD, catalase, and TAC decreased (P < 0.008) and remained low after 30 days of recovery (P < 0.008). Sixteen weeks of intensive cycling training may have deleterious consequences for spermatozoa and hence may affect sperm health parameters in male cyclists.
To describe physiologic alterations in runners competing in a 160-km endurance event and to evaluate the utility of weight and blood pressure measurements in the assessment of runner performance. Prospective cohort study. A 160-km ultramarathon. Ninety-one of the 101 participants in the 2010 Tahoe Rim 100 Mile Endurance Run. Brachial blood pressure, heart rate, and weight were assessed before competition, at 80 km, and at 160 km. Alterations in brachial blood pressure, heart rate, and weight were assessed in finishers. Weight loss, brachial blood pressure, pulse pressure, and heart rate at 80 km were assessed in all participants for their ability to predict failure to finish the race. Participants who finished 160 km (57%) experienced their fastest heart rates (P < 0.05), lowest systolic pressures (P < 0.05), highest diastolic pressures (P < 0.05), narrowest pulse pressures (P < 0.05), and lowest weights (P < 0.05) at 80 km. High rates of finishing were seen in those who lost >5% of their prerace weight (87%). Categorical weight loss (<3%, 3%-5%, and >5%) was not associated with the ability to finish (P > 0.05) or finishing time (P > 0.05), whereas the presence of a narrow pulse pressure was associated with a high likelihood (likelihood ratio = 9.84; P = 0.002) of failure to finish. Greater intracompetition weight loss is not associated with impaired performance but rather may be an aspect of superior performance. A narrow pulse pressure was associated with a high likelihood of failure to finish.
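Two of the derived measures in the abstract above are simple computations: pulse pressure (systolic minus diastolic pressure) and percentage change from pre-race weight, which defines the <3%, 3%-5%, and >5% loss categories. A minimal sketch, with a hypothetical runner's values (not study data):

```python
def pulse_pressure(systolic_mmhg: float, diastolic_mmhg: float) -> float:
    """Pulse pressure (mm Hg) = systolic minus diastolic pressure."""
    return systolic_mmhg - diastolic_mmhg

def pct_weight_change(prerace_kg: float, current_kg: float) -> float:
    """Percentage change from pre-race body weight (negative = loss)."""
    return (current_kg - prerace_kg) / prerace_kg * 100

# Hypothetical runner: 70.0 kg pre-race, 67.2 kg at the 80-km checkpoint,
# blood pressure 110/82 mm Hg at the same checkpoint.
print(round(pct_weight_change(70.0, 67.2), 1))  # → -4.0 (a 3%-5% loss)
print(pulse_pressure(110, 82))                  # → 28
```

The study did not report a numeric cutoff for "narrow" pulse pressure in this abstract, so no threshold is encoded here.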
To relate changes in body mass, total body water (TBW), extracellular fluid (ECF), and serum sodium concentration ([Na]) from a 161-km ultramarathon to finish time and incidence of hyponatremia. Observational. The 2008 Rio Del Lago 100-Mile (161-km) Endurance Run in Granite Bay, California. Forty-five runners. Pre-race and post-race body mass, TBW, ECF, and serum [Na]. Body mass and serum [Na] significantly decreased 2% to 3% (P < 0.001) from pre-race to post-race, but TBW and ECF were unchanged. Significant relationships were observed between finish time and percentage change in body mass (r = 0.36; P = 0.01), TBW (r = 0.50; P = 0.007), and ECF (r = 0.61; P = 0.003). No associations were found between post-race serum [Na] and percentage change in body mass (r = -0.04; P = 0.94) or finish time (r = 0.5; P = 0.77). Hyponatremia (serum [Na] < 135 mmol/L) was present among 51.2% of finishers. A logistic regression prediction equation including pre-race TBW and percentage changes in TBW and ECF had an 87.5% concordance with the classification of hyponatremia. Hyponatremia occurred in over half of the 161-km ultramarathon finishers but was not predicted by change in body mass. The combination of pre-race TBW and percentage changes in TBW and ECF explained 87.5% of the variation in the incidence of hyponatremia. Exercise-associated hyponatremia can occur simultaneously with dehydration and cannot be predicted by weight checks at races.
This is a retrospective study of 98 hockey players who underwent 107 surgical explorations for refractory lower abdominal and groin pain that prevented them from playing hockey at an elite level. Retrospective chart review combined with a complete follow-up examination and questionnaire. The players were treated in the ambulatory care unit of a university tertiary care centre. A total of 98 elite hockey players underwent 107 surgical groin explorations for intractable groin pain preventing their play. Follow-up was 100%. Each player had repair of a tear of the external oblique muscle and fascia, reinforced by a Gore-Tex mesh. The ilioinguinal nerve was resected in each patient. The main outcome measure was absence of groin pain on return to playing hockey at an elite level. In all, 97 of 98 players returned to play after the surgical procedures. No morbidity was attributed to division of the ilioinguinal nerve. Surgical exploration of the involved groin, with repair of the torn external oblique muscle and division of the ilioinguinal nerve, has resulted in resolution of refractory groin pain and return to play in the elite hockey player. The surgical procedure is associated with low morbidity. Recent observations on dynamic ultrasound show promise in accurately diagnosing this injury.
To review the cases of stress fracture seen over a 2-year period at a sports medicine clinic. One hundred and eighty cases diagnosed as stress fractures on the basis of the clinical picture and radiological evidence were reviewed. The following features of each stress fracture were noted: age, sex, site, and sport/activity. A sports medicine centre in Melbourne, Australia. The average age was 21.8 years. Seventy-eight of these stress fractures were seen in women, 102 in men. The most common sites of stress fractures were the metatarsal bones (n = 42), tibia (n = 36), fibula (n = 30), tarsal navicular (n = 26), and pars interarticularis (n = 17). The most common sport was track (n = 54). Other common sports activities were jogging/distance running (n = 35), dance (n = 32), and Australian football (n = 14). The distribution of sites of stress fractures varied from sport to sport. Among the track athletes (n = 54), the navicular (n = 19), tibia (n = 14), and metatarsal (n = 9) were the most common stress fracture sites. The distance runners (n = 35) predominantly sustained tibia (n = 15) and fibula (n = 8) stress fractures, while metatarsal stress fractures (n = 18) were the most common among dancers. The distribution of sports varied with the site of the stress fracture. In the metatarsal stress fractures (n = 42), dance was the most common activity. Distance running (n = 15) and track (n = 14) were the most common sports in the group that sustained tibia stress fractures (n = 36). Track athletes (n = 14) were particularly prevalent in the navicular stress fracture group (n = 26). The distribution of sites of stress fractures in this study shows some differences from previously published studies.
To examine the results from doping controls conducted by the Norwegian Confederation of Sport (NCS) from 1977 to 1995. Data were collected by combining three computerized databases and manual records on samples taken and results from analyses in the International Olympic Committee (IOC)-accredited laboratories in London, Huddinge, Cologne, and Oslo. Samples were declared positive if they contained any banned substance on the IOC list that was in effect at any given time. A total of 15,208 samples were taken; most of them (12,870; 85%) were from Norwegian athletes (90% unannounced tests) belonging to national federations under NCS jurisdiction (NCS members), 461 (3%) were from external Norwegian athletes (either users of private gyms or athletes in organized sports federations not affiliated with the NCS), and 1,874 (12%) were from foreign athletes (three cases with unknown affiliation). There were 130 positive samples and 24 refusals among NCS members (1.2%; men, 1.4%; women, 0.3%), 86 positive samples and 8 refusals among external Norwegian athletes (20%; men, 24%; women, 8%), and 39 positive samples and 1 refusal among foreign athletes (1.6%; men, 2.1%; women, 0.7%). A gradual decrease in the percentage of positive samples was observed among NCS members as testing frequency was increased gradually from 1987 to 1995 in the three high-prevalence sports: powerlifting, weightlifting, and athletics. An increase in the test frequency of doping tests was associated with a decrease in the percentage of positive samples in targeted sports.
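The percentages in the abstract above combine positive samples and refusals over the number of samples taken. A minimal check against two of the reported figures (the function name is illustrative, not from the study):

```python
def positive_pct(positives: int, refusals: int, samples: int) -> float:
    """Percentage of samples that were positive or refused."""
    return (positives + refusals) / samples * 100

# NCS members: 130 positive samples + 24 refusals in 12,870 samples.
print(round(positive_pct(130, 24, 12_870), 1))  # → 1.2
# External Norwegian athletes: 86 positives + 8 refusals in 461 samples.
print(round(positive_pct(86, 8, 461)))          # → 20
```

Both values reproduce the overall rates quoted in the abstract (1.2% for NCS members, 20% for external Norwegian athletes).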
To examine the number and rates of head injuries occurring in the community as a whole for the team sports of ice hockey, soccer, and football by analyzing data from patients presenting to US emergency departments (EDs) from 1990 to 1999. Retrospective analysis. Data compiled for the US Consumer Product Safety Commission using the National Electronic Injury Surveillance System were used to generate estimates for the total number of head injuries, concussions, internal head injuries, and skull fractures occurring on a national level from the years 1990 to 1999. These data were combined with yearly participation figures to generate rates of injuries presenting to the ED for each sport. There were an estimated 17,008 head injuries from ice hockey, 86,697 from soccer, and 204,802 from football that presented to US EDs from 1990 to 1999. The total number of concussions presenting to EDs in the United States over the same period was estimated to be 4820 from ice hockey, 21,715 from soccer, and 68,861 from football. While the rates of head injuries, concussions, and combined concussions/internal head injuries/skull fractures presenting to EDs per 10,000 players were not always statistically similar for all 3 sports in each year data were available, they were usually comparable. While the total numbers of head injuries, concussions, and combined concussions/skull fractures/internal head injuries presenting to EDs in the United States are different for ice hockey, soccer, and football for the years studied, the yearly rates for these injuries are comparable among all 3 sports.
To analyze the patterns and causes of tennis-related injuries using, for the first time, a nationally representative data set. A retrospective cohort analysis was performed using the National Electronic Injury Surveillance System database. All tennis-related injuries treated in US emergency departments (EDs) from 1990 to 2011 were analyzed. During the study period, an estimated 492 002 (95% confidence interval, 364 668-619 336) individuals, aged 5 to 94 years, presented to US EDs for tennis-related injuries. Independent variables include patient age and gender, mechanism of injury, and location of injury event. Outcome variables include injury diagnosis, body region injured, disposition from ED, and involvement of the net. Most injuries were sustained by a nonspecific mechanism during play (37.9%) and occurred at a sport or recreation facility (83.4%). Children aged 5 to 18 years had a higher mean injury rate than adults older than 19 years. The most commonly injured body regions were the lower extremities (42.2%) and upper extremities (26.7%). Sprains or strains (44.1%) were the most common type of injury. The number of tennis-related injuries decreased by 41.4% during the years 1990 to 2011, and the tennis-related injury rates decreased by more than 45% during the study period. Among the 3.4% of patients who were admitted to the hospital, two-thirds (65.6%) involved patients 56 years of age or older. Despite the decrease in tennis-related injuries, the growing popularity of this sport warrants increased efforts to prevent injuries, especially among child and older adult participants.
Objective: To examine the incidence of illness and highlight gender differences in tennis players competing in a major professional tennis tournament over a 16-year period between 1994 and 2009. Design: Descriptive epidemiology study of illness trends in professional tennis players. Setting: Archival data from the US Open Tennis Championships. Participants: Participants in the US Open Tennis Championships main draw from 1994 to 2009. Main outcome measures: Illness data collected at the US Open Tennis Championships between 1994 and 2009 were classified using guidelines presented in a sport-specific consensus statement. Each case was categorized according to the medical system affected and the impact on play availability during the tournament. Illness rates were determined based on the exposure of an athlete to a match event and were calculated as the ratio of illness cases per 1000 match exposures (ME). Results: The average number of illness cases over the 16-year period analyzed was 58.19 ± 12.02 per year (36.74 per 1000 ME) requiring assistance by the medical staff. Statistical analyses showed a significant fluctuation in illness cases related to the dermatological (DERM), gastrointestinal, renal/urogenital/gynecological, neurological, ophthalmic and otorhinolaryngological (ENT), and infectious medical systems (P < 0.05). ENT and DERM conditions were the most commonly reported types of illness for both men and women. Conclusions: Numerous medical systems are susceptible to illness in tennis players. Sport-specific factors may influence susceptibility to common illnesses experienced by professional tennis players.
The goals of this study were to assess the health care available to Wisconsin high school football players and to assess high schools' compliance with safety requirements of the Wisconsin Interscholastic Athletic Association (WIAA). The design was a cross-sectional survey-based study. The setting consisted of WIAA high schools. Athletic directors of WIAA high school football programs participated in the survey. The main outcome measures were the prevalence of medical coverage by physicians, certified athletic trainers, and ambulance personnel at football games and practice and the prevalence of compliance with WIAA requirements. Seventy-seven percent (302/392) of surveys were returned. Thirty-six percent of schools had a designated team physician. Eighty-seven percent had a trainer, and 86% were certified athletic trainers (Athletic Trainer Certified, ATC). At practice and scrimmage, 79% had an ambulance available or on call, 52% had a trainer present, and 28% had a physician on call. At football games, 71% had an ambulance, 67% a certified athletic trainer, 48% an emergency medical technician, and 45% a physician present. Regarding WIAA requirements, 9% had no accessible phone, 27% had no written emergency plan of action, 92% had gloves, and 92% had blood spill kits. Larger schools had better compliance with WIAA requirements than did smaller schools. Health care coverage was provided mainly by trainers and ambulance personnel, although physicians were routinely present at almost half of all games. Failure to comply with WIAA medical coverage requirements was not infrequent. This study forms the basis for an informational intervention, providing an opportunity to correct deficits.
To document use of diagnostic imaging during a multi-sport games to assist in planning for future such competitions. Medical records from the 1997 Canada Summer Games and from the Brandon General Hospital were reviewed. All uses of diagnostic imaging were compiled as were results of the imaging examinations. These data were correlated with demographic information. A total of 80 imaging examinations were performed during the 1997 Canada Summer Games. These were mainly plain radiographs (n = 77), with two nuclear medicine examinations and one computed tomography (CT) scan. Ultrasound and magnetic resonance imaging (MRI) were available but not used. Use of imaging examinations correlated well with the risk category of the sports, and was almost identical between female and male athletes; women accounted for 42.5% of the imaging examinations and 42.7% of the participants. These data may be helpful in planning for other multi-sport competitions. The mix of sports is of greater predictive value than the ratio of female to male athletes when predicting the demand for diagnostic imaging services.
To evaluate the clinical diagnostic skills of healthcare providers treating skin infections in Minnesota high school wrestlers. Data were collected from the Minnesota State Wrestling 3-class tournament over a 10-year period from 1997 through 2006. Male wrestlers 13 to 18 years of age who qualified by placing first or second in their section tournaments. On-site physicians screened athletes with suspicious lesions and reviewed those who had already been seen by their local healthcare providers. Athletes were allowed to compete if their lesions were considered noninfectious. After review of the data, a distinct difference was noted in the number of skin infections seen in the larger schools compared with the smaller ones. Available healthcare providers listed in the Minnesota Medical Association's database were correlated with the communities the infected wrestlers were from. Each of the 3 classes had 238 participants each year. A total of 299 skin infections were recorded over this 10-year period. Analysis of class comparison and number of skin infections reveals a significant difference, with class A (smaller schools) having 81 infections compared to class AAA (larger schools) having 119, P=0.0076. Comparison of healthcare providers revealed a distinct difference: communities serving the smaller schools averaged 1.5 providers per town, compared with 66.4 per town for the larger schools. Continuity, not availability, of healthcare is necessary to properly control skin infections in high school wrestling.
To examine the incidence and characteristics of concussions for one season in the Canadian Football League (CFL). Retrospective survey. 289 players reporting to CFL training camp. Of these, 154 players had played in the CFL during the 1997 season. Based on self-reported symptoms, calculations were made to determine the number of concussions experienced during the previous season, the duration of symptoms, the time for return to play after concussion, and any associated risk factors for concussions. Of all the athletes who played during the 1997 season, 44.8% experienced symptoms of a concussion. Only 18.8% of these concussed players recognized they had suffered a concussion. 69.6% of all concussed players experienced more than one episode. Symptoms lasted at least 1 day in 25.8% of cases. The odds of experiencing a concussion increased 13% with each game played. A past history of a loss of consciousness while playing football and a recognized concussion while playing football were both associated with increased odds of experiencing a concussion during the 1997 season. Many players experienced a concussion during the 1997 CFL season, but the majority of these players may not have recognized that fact. Players need to be better informed about the symptoms and effects of concussions.
To elucidate the epidemiology and the mechanisms of snowboarding wrist injuries, especially distal radial fractures. A prospective survey of snowboarders with distal radial fractures. From November 21, 1998, to April 22, 2001, we analyzed and interviewed 5110 injured snowboarders, and a total of 740 snowboarders with distal radial fractures were studied. On the basis of the medical records and radiographs, the severity of distal radial fracture was analyzed according to the AO classification. Distal radial fractures occurred at a rate of 0.31 per 1000 snowboarder visits. Most of the injured snowboarders were either of beginner (42.0%) or intermediate level (48.1%). The most common events leading to an injury in snowboarding were falling (59.6%) and jumping (36.1%). Comminuted and articular fractures classified as AO types A3, B, and C, which required surgical treatment, made up 63.2% of distal radial fractures in snowboarders. The most remarkable differences between the first-time or beginner group and the intermediate or expert group were that the former had a significantly higher proportion of extra-articular fractures classified as AO type A (P < 0.05), and the latter were significantly more likely to have compression or complex intra-articular fractures such as AO type C (P < 0.05). Furthermore, first-time or beginner snowboarders were more likely to be injured because of a simple fall than were the intermediates or experts (P < 0.05). Over 60% of distal radial fractures classified as AO type C in the intermediate or expert group resulted from jumping. Furthermore, the side opposite to the snowboarder's preferred direction of stance was more often affected. A high incidence of injury during opposite-side edging, which is used more frequently in snowboarding, was found in novice female snowboarders. This study suggested several patterns in the nature of wrist injuries sustained while snowboarding, and these facts should be taken into consideration in the diagnosis of wrist injuries in snowboarders.
The Female Athlete Triad is a medical condition often observed in physically active girls and women, and involves 3 components: (1) low energy availability with or without disordered eating, (2) menstrual dysfunction, and (3) low bone mineral density. Female athletes often present with 1 or more of the 3 Triad components, and an early intervention is essential to prevent its progression to serious endpoints that include clinical eating disorders, amenorrhea, and osteoporosis. This consensus statement represents a set of recommendations developed following the first (San Francisco, California) and second (Indianapolis, Indiana) International Symposia on the Female Athlete Triad. It is intended to provide clinical guidelines for physicians, athletic trainers, and other health care providers for the screening, diagnosis, and treatment of the Female Athlete Triad and to provide clear recommendations for return to play. The 2014 Female Athlete Triad Coalition Consensus Statement on Treatment and Return to Play of the Female Athlete Triad Expert Panel has proposed a risk stratification point system that takes into account magnitude of risk to assist the physician in decision-making regarding sport participation, clearance, and return to play. Guidelines are offered for clearance categories, management by a multidisciplinary team, and implementation of treatment contracts. This consensus paper has been endorsed by The Female Athlete Triad Coalition, an International Consortium of leading Triad researchers, physicians, and other health care professionals, the American College of Sports Medicine, and the American Medical Society for Sports Medicine.
A. Differences in collision rates in games from 2002 (large ice), 2003 (small ice) and 2004 (intermediate ice) World Junior Hockey championships. Significantly more collisions of all types occurred on the small ice compared to the large ice. Collision frequency on the intermediate size ice was intermediate between the small and large ice surfaces, with an inverse relationship between ice size and collision frequency. Box plots show medians (horizontal lines), 25th to 75th percentiles (shaded areas) and ranges (extreme bars). Single game outlier values in 2002 and 2004 marked by “○”. *p<0.05, ***p<0.005. ns: not significant.
B. Differences in head impacts in games from 2002 (large ice), 2003 (small ice) and 2004 (intermediate ice) World Junior Hockey championships. Significantly more head impacts of all types occurred on the small ice compared to the large ice. Box plots show medians (horizontal lines), 25th to 75th percentiles (shaded areas) and ranges (extreme bars). *p<0.05, **p<0.01, ***p<0.005. ns: not significant.
Collisions in World Junior Hockey games from 2002 (large ice), 2003 (small ice) and 2004 (intermediate ice) championships
Means (± 95% confidence levels) of collisions in games from 2002 (large ice), 2003 (small ice) and 2004 (intermediate ice) World Junior Hockey championships
Objective: To determine if collision rates and head impacts in elite junior hockey differed between games played on the small North American ice surface (85 ft wide), an intermediate-size Finnish ice surface (94 ft wide), and the large standard international ice surface (100 ft wide). Design: Videotape analysis of all games involving Team Canada from the 2002 (large ice, Czech Republic), 2003 (small ice, Canada), and 2004 (intermediate ice, Finland) World Junior Championships. All collisions were counted and separated into various categories (volitional player/player bodychecks, into boards or open ice, plus accidental/incidental player/boards, player/ice, head/stick, head/puck). Further subdivisions included collisions involving the head directly or indirectly and notably severe head impacts. Results: Small, intermediate, and large ice surface mean collisions/game, respectively, were 295, 258, 222, total collisions; 251, 220, 181, volitional bodychecks; 126, 115, 88, into boards; 125, 106, 93, open ice; 71, 52, 44, total head; 44, 36, 30, indirect head; 26, 16, 13, direct head; and 1.3, 0.5, 0.3, severe head (P < 0.05 for small-intermediate ice and intermediate-large ice differences in total collisions; P < 0.005 for small-large ice difference; P < 0.05 for small-intermediate ice differences in head impacts; P < 0.01 for small-large ice differences in total and severe head impacts). Conclusions: There is a significant inverse correlation between ice size and collision rates in elite hockey, including direct, indirect, and severe head impacts. These findings suggest that uniform usage of the larger international rinks could reduce the risk of injury, and specifically concussion, in elite hockey by decreasing the occurrence of collisions and head impacts.
To gather data and examine the use by elite Olympic athletes of food supplements and pharmaceutical preparations in total and per sport, country, and gender. Survey study. Athens 2004 Olympic Games (OG). Data from 2 sources were collected: athletes' declaration of medications/supplements intake recorded on the Doping Control Official Record during sample collection for doping control, and athletes' application forms for granting of a therapeutic use exemption (TUE) and through the abbreviated TUE process (aTUE). Classification of declared food supplements according to the active ingredient and medications according to therapeutic actions and active compounds. 24.3% of the athletes tested for doping control declared no use of medications or food supplements. Food supplements (45.3%) continue to be popular, with vitamins (43.2%) and proteins/amino acids (13.9%) in power sports being most widely used. Nonsteroidal anti-inflammatory agents and analgesics were also commonly used by athletes (11.1% and 3.7%, respectively). The use of the hemoderivative actovegin and several nonprohibited anabolic preparations are discussed. The prevalence of medication use for asthma and the dangers of drug interactions are also presented. Laboratory analysis data reveal that of the aTUEs received for inhaled glucocorticosteroids, only budesonide was detectable in a significant percentage (10.0%). Only 6.5% of the 445 athletes approved to use inhaled beta2-agonists produced an adverse analytical finding. This review demonstrates that overuse of food supplements was slightly reduced compared to previous OGs and a more rational approach to the use of medication is being adopted.
Objective: To describe the incidence and risk factors for high ankle sprains (ie, syndesmosis injuries) among National Collegiate Athletic Association (NCAA) football players. Design: Descriptive epidemiologic study. Setting: Data were examined from the NCAA's Injury Surveillance System (ISS) for 5 football seasons (from 2004-2005 to 2008-2009). Participants: All NCAA men's football programs participating in the ISS. Assessment of risk factors: No additional risk factors were introduced as a result of this analysis. Main outcome measures: For partial and complete syndesmosis injuries, outcome measures included incidence, time lost from participation, and requirement for surgical repair. Results: The overall incidence of high ankle sprains in NCAA football players was 0.24 per 1000 athlete exposures, accounting for 24.6% of all ankle sprains. Athletes were nearly 14 times more likely to sustain the injury during games compared with practice; complete syndesmosis injuries resulted in significantly greater time lost compared with partial injuries (31.3 vs 15.8 days). Less than 3% of syndesmosis injuries required surgical intervention. There was a significantly higher injury incidence on artificial surfaces compared with natural grass. The majority of injuries (75.2%) occurred during contact with another player. Conclusions: Our data suggest a significantly higher incidence of syndesmosis injuries during games, during running plays, and to running backs and interior defensive linemen. The wide range in time lost from participation for complete syndesmosis injuries underscores the need for improved understanding of injury mechanism and classification of injury severity such that prevention, safe return to play protocols, and outcomes can be further improved.
To describe the epidemiology of fractures among US high school athletes participating in 9 popular sports. Descriptive epidemiologic study. Sports injury data for the 2005-2009 academic years were collected using an Internet-based injury surveillance system, Reporting Information Online (RIO). A nationally representative sample of 100 US high schools. Injuries sustained as a function of sport and sex. Fracture injury rates, body site, outcome, surgery, and mechanism. Fractures (n = 568 177 nationally) accounted for 10.1% of all injuries sustained by US high school athletes. The highest rate of fractures was in football (4.61 per 10 000 athlete exposures) and the lowest in volleyball (0.52). Boys were more likely than girls to sustain a fracture in basketball (rate ratio, 1.35; 95% confidence interval, 1.06-1.72) and soccer (rate ratio, 1.34; 95% confidence interval, 1.05-1.71). Overall, the most frequently fractured body sites were the hand/finger (28.3%), wrist (10.4%), and lower leg (9.3%). Fractures were the most common injury to the nose (76.9%), forearm (56.4%), hand/finger (41.7%), and wrist (41.6%). Most fractures resulted in >3 weeks' time lost (34.3%) or a medical disqualification from participation (24.2%) and were more likely to result in >3 weeks' time lost and medical disqualification than all other injuries combined. Fractures frequently required expensive medical diagnostic imaging examinations such as x-ray, computed tomographic scan, and magnetic resonance imaging. Additionally, 16.1% of fractures required surgical treatment, accounting for 26.9% of all injuries requiring surgery. Illegal activity was noted in 9.3% of all fractures with the highest proportion of fractures related to illegal activity in girls' soccer (27.9%). Fractures are a major concern for US high school athletes. They can severely affect the athlete's ability to continue sports participation and can impose substantial medical costs on the injured athletes' families. Targeted, evidence-based, effective fracture prevention programs are needed.
To measure the incidence of spinal injuries in Canadian ice hockey for the 6-year period 2000-2005 and to examine trends from 1943 to 2005. Data about spinal injuries with and without spinal cord injury in ice hockey have been collected by ThinkFirst's Canadian Ice Hockey Spinal Injuries Registry since 1981 through questionnaires from practitioners, ice hockey organizations, and media reports. All provinces and territories of Canada. All Canadian ice hockey players. Age, gender, level of play, location, and mechanism of injury. Incidence and nature of injuries. Forty cases occurred in 2000-2005, representing a decline in annual injuries and bringing the total registry cases to 311 during 1943-2005. Five (12.5%) of these 40 cases were severe, which includes all complete and incomplete spinal cord injuries, and is a decline from the previous 23.5% in this category. In the 311 cases, men comprised 97.7%, the median age was 18 years, 82.8% of the injuries were cervical, and 90.3% occurred in games in organized leagues. The most common mechanism of injury was impact with the boards (64.8%), and the most common cause was check/push from behind at 35.0%, which has declined. The major provincial differences in injury rates persist, with the highest in Ontario, British Columbia, New Brunswick, and Prince Edward Island and the lowest in Quebec and Newfoundland. There has been a recent decline in spinal injuries in Canadian ice hockey that may be related to improved education about injury prevention and/or specific rules against checking/pushing from behind.
To investigate the epidemiology of dislocations/separations in a nationally representative sample of high school student-athletes participating in 9 sports. Descriptive epidemiologic study. Sports injury data for the 2005-2009 academic years were collected using an Internet-based injury surveillance system, Reporting Information Online (RIO). A nationally representative sample of 100 US high schools. Injuries sustained as a function of sport and gender. Dislocation/separation rates, body site, outcome, surgery, and mechanism. Dislocations/separations represented 3.6% (n = 755) of all injuries. The most commonly injured body sites were the shoulder (54.9%), wrist/hand (16.5%), and knee (16.0%); 18.4% of dislocations/separations were recurrences of previous injuries at the same body site; 32.3% of injuries were severe (ie, student-athletes unable to return to play within 3 weeks of the injury date), and 11.8% required surgical repair. The most common mechanisms of injury were contact with another player (52.4%) and contact with the playing surface (26.4%). Injury rates varied by sport. In gender-comparable sports, few variations in patterns of injury existed. Rates were highest in football (2.10 per 10 000 athletic exposures) and wrestling (1.99) and lowest in baseball (0.24) and girls' soccer (0.27). Although dislocation/separation injuries represent a relatively small proportion of all injuries sustained by high school student-athletes, the severity of these injuries indicates a need for enhanced injury prevention efforts. Developing effective targeted preventive measures depends on increasing our knowledge of dislocation/separation rates, patterns, and risk factors among high school athletes.
To develop a precompetition medical assessment (PCMA) of elite football players aimed at identifying risk factors for sudden cardiac death. Retrospective analysis of the PCMA forms. Of the 32 national teams (each with 23 players), PCMA forms from 605 players (82%) were submitted after the final match. Data of 582 players were analyzed (79%). Recorded results of a standardized PCMA in all players before the 2006 FIFA World Cup including medical history, physical examination, resting/exercise electrocardiogram, and echocardiography were analyzed by 2 independent cardiologic reviewers. Apart from general deficits in data quality, at least 6 players (1.0%) could be identified as demanding further investigations to rule out a serious cardiovascular disease. Comprehensive cardiac testing is feasible in international elite football. To improve future results, the PCMA was revised. It is questionable whether exercise stress testing should be included in future PCMA. To ensure correct results, sports cardiologic expertise is essential. In the face of organizational challenges and variable medical standards, alternative approaches to the practical implementation of the PCMA need to be investigated.
To test the hypotheses that the hemoglobin (Hb) distribution curve in elite male and female long track speed skaters is not normally distributed and that there is a positive relationship between competitive success and Hb concentration. A venous blood sample was taken before the events from all skaters. The Hb concentration distribution curves of all skaters ranked 1 to 30 were tested for normality. In addition, individual Hb concentrations were plotted against ranking in the matching events. 2006 major championships and Olympic winter games. All elite male and female speed skaters (217 men and 200 women) competing in major international championships in 2006 and in the Olympic winter games 2006. Hb concentration and individual ranking in the matching event. The mean Hb levels in men and women were 15.7 +/- 0.8 g/dL and 14.0 +/- 0.7 g/dL, respectively. The distribution curve in men would meet the criteria for normal distribution when 4 values from 2 skaters with naturally high Hb levels were excluded. In the women, the distribution curve did not meet the criteria for normality because of a low frequency on the right side of the distribution curve and a high frequency on the left side. The curve failed to have a steep drop-off on the right side. When plotting Hb concentration against ranking, there is no correlation between Hb concentration and competitive success. The Hb concentrations are within the normal range for endurance athletes, and there is no indication that the values are titrated toward the upper allowed limit. In addition, there is no relationship between Hb concentration and competitive success in elite speed skaters.
To create a model for the precompetition medical assessment (PCMA) of international elite football players aimed at identifying risk factors and to assess the feasibility of standardized requirements for teams from countries with variable medical standards. Descriptive feasibility study. Medical assessment of professional football players before a major international competition. Thirty-two national football teams comprising 736 players participating in the 2006 FIFA World Cup. A standardized football-specific PCMA was developed, and all team physicians were asked to perform the PCMA before the 2006 FIFA World Cup. Response rate, completeness of documentation forms, and quality of data. Of all 32 teams participating in the 2006 FIFA World Cup, the precompetition assessment forms of 26 teams (response rate 81%) were returned. Of the initial target population of 736 players, the data of 582 players (79%) were analyzed. The average completeness of the forms ranged from 34% to 94% among teams and average completeness of the different sections of the forms from 78% to 98%. Quality of data provided varied considerably. The response rate of 81% demonstrated that a standardized approach is possible, whereas results and quality of data required adaptations of the form and review of the implementation procedure.
The aim of this study was to analyze all sports injuries incurred in competitions and/or training during the 2007 World Athletics Championships and to test the feasibility of the injury surveillance system developed for the 2008 Olympic Games for individual sports. Prospective recording of injuries. 11th IAAF World Championships in Athletics 2007 in Osaka, Japan. All national team physicians and physiotherapists; Local Organising Committee (LOC) physicians working in the Medical Centres at the stadium and warm-up area. Frequency, characteristics, and incidence of injuries. 192 injuries were reported, resulting in an incidence of 97 injuries per 1000 registered athletes. More than half of the injuries (56%) were expected to prevent the athlete from participating in competition or training. Eighty percent affected the lower extremity; the most common diagnosis was thigh strain (16%). In most cases, the injury was caused by overuse (44%). A quarter of the injuries were incurred during training and 137 (71%) in competition. On average, 72.4 injuries per 1000 competing athletes were incurred in competitions. The incidence of injury varied substantially among the disciplines. The risk of a time-loss injury was highest in heptathlon, women's 10,000 m, women's 3000 m steeplechase, decathlon, and men's marathon. The injury surveillance system proved feasible for individual sports. Risk of injury varied among the disciplines, with highest risk in combined disciplines, steeplechase, and long-distance runs. Preventive interventions should mainly focus on overuse injuries and adequate rehabilitation of previous injuries.
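The incidence figures quoted throughout these abstracts follow a common epidemiological convention: events divided by exposures, scaled to 1000, with group comparisons expressed as rate ratios. A minimal sketch of that arithmetic, assuming illustrative inputs (the athlete count of 1980 is inferred from the reported 192 injuries and 97 per 1000 registered athletes, not stated in the abstract):

```python
def rate_per_1000(events: int, exposures: float) -> float:
    """Incidence rate expressed per 1000 exposures
    (athlete exposures, match exposures, or registered athletes)."""
    return 1000 * events / exposures

def rate_ratio(events_a: int, exposures_a: float,
               events_b: int, exposures_b: float) -> float:
    """Ratio of two incidence rates (group A relative to group B)."""
    return (events_a / exposures_a) / (events_b / exposures_b)

# 192 injuries among ~1980 registered athletes (count inferred)
print(round(rate_per_1000(192, 1980)))  # -> 97, matching the reported incidence
```

The same helpers cover the other reporting styles above, e.g. rates per 10 000 athlete exposures are simply scaled by a different constant, and a rate ratio of 1.35 means group A's rate is 35% higher than group B's.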
Top-cited authors
Carolyn Emery
  • The University of Calgary
Nicola Maffulli
  • Università degli Studi di Salerno
Karim M Khan
  • University of British Columbia - Vancouver
Timothy D Noakes
  • Cape Peninsula University of Technology
Michael McCrea
  • Medical College of Wisconsin