Article

Performance success or failure is explained by weeks lost to injury and illness in elite Australian Track and Field athletes: a 5-year prospective study

Abstract

Objectives To investigate the impact of training modification on achieving performance goals. Previous research demonstrates an inverse relationship between injury burden and success in team sports. It is unknown whether this relationship exists within individual sports such as athletics. Design A prospective cohort study (n = 33 international Track and Field athletes; 76 athlete seasons) across five international competition seasons. Methods Athlete training status was recorded weekly over a 5-year period. Over the 6-month preparation season, relationships between training weeks completed, the number of injury/illness events and the success or failure of a performance goal at major championships were investigated. Two-by-two tables were constructed and attributable fractions in the exposed (AFE) calculated. A mixed-model logistic regression was used to determine the relationship between failure and burden per injury/illness. Receiver Operating Characteristic (ROC) curve analysis was performed to ascertain the optimal threshold of training week completion to maximise the chance of success. Results Likelihood of achieving a performance goal increased 7-fold in athletes who completed >80% of planned training weeks (AUC 0.72; 95% CI 0.64-0.81). Training availability accounted for 86% of successful seasons (AFE = 0.86, 95% CI 0.46 to 0.96). The majority of new injuries occurred within the first month of the preparation season (30%) and most illnesses occurred within 2 months of the event (50%). For every modified training week, the chance of success was significantly reduced (OR = 0.74, 95% CI 0.58 to 0.94). Conclusions Injuries and illnesses, and their influence on training availability, during preparation are major determinants of an athlete's chance of performance goal success or failure at the international level.
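
A back-of-the-envelope illustration of the effect sizes reported above, as a minimal Python sketch. The 2x2 counts are hypothetical (the abstract reports only the derived statistics), so take the structure of the calculation rather than the numbers; the per-week odds ratio of 0.74 is the value quoted in the abstract.

```python
# Hypothetical 2x2 counts for illustration only; the paper's raw counts are not given here.
# Rows: training availability (>80% vs <=80% of planned weeks completed)
# Columns: performance goal achieved (success) vs not (failure)
success_hi, failure_hi = 24, 8    # assumed counts, >80% of planned weeks completed
success_lo, failure_lo = 5, 39    # assumed counts, <=80% completed

risk_hi = success_hi / (success_hi + failure_hi)   # chance of success with high availability
risk_lo = success_lo / (success_lo + failure_lo)

relative_risk = risk_hi / risk_lo
# Attributable fraction in the exposed: proportion of successes in the
# high-availability group attributable to that availability.
afe = (risk_hi - risk_lo) / risk_hi
print(f"RR = {relative_risk:.1f}, AFE = {afe:.2f}")

# Per-week effect reported in the abstract: OR = 0.74 per modified training week,
# i.e. the odds of success shrink multiplicatively with each modified week.
odds_ratio_per_week = 0.74
for weeks in (1, 3, 5):
    print(f"{weeks} modified week(s): odds multiplied by {odds_ratio_per_week ** weeks:.2f}")
```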

... In athletics, few articles have analysed the potential relationships between performance and injuries (Raysmith and Drew, 2016; Edouard et al., 2019, 2021). During eight international athletics championships, Edouard et al. (2019) reported that lower numbers of injuries per registered athlete were correlated with higher numbers of medals and gold medals per registered athlete, when analysing country participation grouped according to country team sizes. ...
... Edouard et al. (2021) reported that being injured during international combined event competitions was associated with lower odds of winning a medal during the respective competition. To our knowledge, only Raysmith and Drew (2016) have reported results on the relationships between performance and injuries across an athletics season follow-up. They reported that injuries occurring during the 6-month preparation period for an international competition, and the related loss in training time, had a negative effect on performance success at the respective competition in international-level athletes. ...
... For the primary analysis, we only considered (i) any participation in a national championship, and (ii) any participation in an international competition. Indeed, participation in a national or international competition represents the overall result of a season for an athlete, and is often the goal of the season, as reported by Raysmith and Drew (2016). By contrast, a podium or a victory is the result of a single competition in which several confounding ("parasitic") factors can influence performance, such as the number of participants, climatic conditions and stress, all of which are diluted across the number of competitions when considering annual participation in national or international competitions. ...
Article
Full-text available
Background Performance success or failure in athletics (Track and Field) and the capacity to succeed are driven at the adult level, as in other sports, by many factors, injury being one of them. More information regarding the potential relationships between performance and injuries in athletics is needed. Objective To analyse the potential association between performance and occurrence of injuries in national-level athletics athletes from sprints, jumps and combined events through several seasons. Methods We performed a retrospective analysis of performance and injury data collected prospectively in 8 national-level athletics athletes followed for at least five consecutive seasons from 2009 to 2019. For each athlete, injury data [total injuries (injuries) and time-loss injuries (TLI)] were collected by the same sports medicine physician throughout the study period using a medical attention injury definition. Performances during official competitions were collected from the French Federation of Athletics website, and included (i) any participation in national championships, (ii) any participation in an international competition (i.e., being a national team member for an international competition), (iii) any podium at the national championships, (iv) any podium at an international competition, and (v) performance metrics normalised to the world record (WR) of the respective athletics speciality (%WR). For each athlete, we performed a descriptive analysis of the performances and injuries. We also performed four binomial logistic regressions with (1) national championships participation (yes/no) or (2) international competition participation (yes/no) as dependent variables, and injuries (yes/no) or TLI (yes/no) as independent variables, adjusted for individual athlete and number of seasons, and, in the models of international competition participation, also adjusted for national championship participation (yes/no), reporting odds ratios (OR) with 95% confidence intervals (95% CI). Results The 8 national-level athletics athletes included in the present study cumulated 155 injuries, including 52 TLI (33.5%). There was an average of 2.7 ± 1.7 injuries and 0.9 ± 0.6 TLI per athlete per season over the study period. The occurrence of injuries was significantly associated with higher odds of national championships participation (OR = 4.85 [95% CI 3.10 to 3050.5], p = 0.021). The occurrence of TLI was significantly associated with higher odds of national championships participation (OR = 133.6 [95% CI 4.92 to 14251.5], p = 0.013). The occurrence of injuries or TLI was associated with non-significantly lower odds of international competition participation. Conclusions Our present pilot study confirms that injuries are part of an athlete's life. The occurrence of at least one injury was associated with higher odds of participation in a national championship, whereas the absence of at least one injury was associated with higher odds of participation in an international championship. We hypothesised that the length of the season can play a role in the risk of injury occurrence, but if an athlete wants to reach his/her highest level, decreasing the risk of injury appears important. Despite the caution that should be taken in the interpretation of our results, our present study confirms the interest and relevance of an injury-risk-reduction approach in athletics.
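
For readers unfamiliar with how such adjusted odds ratios are obtained, the sketch below fits a binomial logistic regression of the kind described above on synthetic data using statsmodels; the variable names and the simulated outcome are assumptions for illustration, not the study's dataset or exact model specification.

```python
# A minimal sketch of an adjusted logistic regression on synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "athlete": rng.integers(0, 8, n),   # 8 athletes, mirroring the study design
    "seasons": rng.integers(5, 11, n),
    "injured": rng.integers(0, 2, n),   # any injury this season (yes/no)
})
# Synthetic outcome loosely tied to the predictors, for illustration only.
logit_p = -1.0 + 1.2 * df["injured"] + 0.05 * df["seasons"]
df["national_participation"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit("national_participation ~ injured + seasons + C(athlete)", data=df).fit(disp=False)
odds_ratios = np.exp(model.params)            # coefficients exponentiated to the OR scale
conf_int = np.exp(model.conf_int())           # 95% CI on the OR scale
print(odds_ratios["injured"], conf_int.loc["injured"].values)
```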
... Injuries are a reasonably common occurrence in high-level athletes both in and out of competition (Drew, Raysmith, & Charlton, 2017;Feddermann-Demont, Junge, Edouard, Branco, & Alonso, 2014;Raysmith & Drew, 2016). Masters runners appear to be at an increased risk of injury compared with their younger counterparts; for example, McKean, Manson, and Stanish (2006) reported that Masters runners were significantly more likely to get injured, and suffer more injuries, than younger adults. ...
... As the hamstring muscles are more susceptible to injury during sprinting (Chumanov, Schache, Heiderscheit, & Thelen, 2012) and age is a risk factor for hamstring strain injury (Gabbe, Bennell, & Finch, 2006), it is logical to suggest that Masters sprinters are at an increased risk of hamstring injury, although, to our knowledge, this has yet to be empirically quantified. Given the relationship between injury and decreased likelihood of achieving a performance goal (Drew et al., 2017;Raysmith & Drew, 2016) and that Masters athletes are more likely to experience such injuries (Gabbe et al., 2006;McKean et al., 2006), it appears logical to suggest that Masters athletes undertake preventative measures to mitigate their risk of injury. This should include eccentric loading activities for the hamstring (Bourne et al., 2018), regular exposure to high-speed running (Edouard et al., 2019), and calf strengthening exercises (Alfredson, Pietilä, Jonsson, & Lorentzon, 1998). ...
... This should include eccentric loading activities for the hamstring (Bourne et al., 2018), regular exposure to high-speed running (Edouard et al., 2019), and calf strengthening exercises (Alfredson, Pietilä, Jonsson, & Lorentzon, 1998). Inclusion of these exercise types within the Masters sprinters program should assist in enhancing training and competition availability and, as a result, performance (Raysmith & Drew, 2016). ...
Article
Elite sprint performances typically peak during an athlete’s twenties, and decline thereafter with age. The mechanisms underpinning this sprint performance decline are often reported to be strength-based in nature, with reductions in strength capacities driving increases in ground contact time and decreases in stride lengths and frequency. However, an as-of-yet under-explored aspect of Masters sprint performance is that of age-related degradation in neuromuscular infrastructure, which manifests as both declining strength and movement coordination. Here, we explore reductions in sprint performance in Masters athletes in an holistic fashion, blending discussion of strength and power changes with neuromuscular alterations, along with mechanical and technical age-related alterations. In doing so, we provide recommendations to Masters sprinters—and the aging population in general—as to how best to support sprint ability and general function with age, identifying nutritional interventions that support performance and function, and suggest useful programming strategies and injury-reduction techniques.
... Generally, the consequences of subsequent injuries (extended removal from sports participation and elevated medical cost) are a major concern and represent a real burden for the athlete and for society (Brooks, Fuller, Kemp, & Reddin, 2006;Hamilton et al., 2011). In addition, subsequent injuries may have detrimental effects on sports performance (Raysmith & Drew, 2016) and substantial psychological repercussions on the player (Guskiewicz et al., 2007). ...
... Since subsequent injuries could have catastrophic repercussions on psychological status (Guskiewicz et al., 2007), socioeconomic status (Brooks et al., 2006) and physical performance (Raysmith & Drew, 2016), with increased time loss from sports participation (Brooks et al., 2006;Hamilton et al., 2011), it is important to intervene to prevent these repercussions in athletes with UPT. Clinicians should take the risk of subsequent acute, non-contact lower extremity musculoskeletal injury into consideration when making management and return-to-play decisions for athletes with UPT, even after improvement on standard clinical tests. ...
Article
This study aimed to investigate static and dynamic postural balance inter-limb asymmetries in athletes with unilateral patellar tendinopathy (UPT) and estimate subsequent lower extremity musculoskeletal injury risk compared to controls. Twenty-eight athletes with UPT were recruited. Twenty-eight healthy athletes served as controls. Static postural balance inter-limb asymmetry (symmetry index: SI) was assessed based on differences in the mean center of pressure (CoP) velocity (CoPv) values between the affected leg (AL) and non-affected leg (NAL) for the UPT group, and the dominant leg (DL) and non-dominant leg (NDL) for controls. Outcome variables were dynamic postural balance, assessed with inter-limb asymmetry using the Y Balance Test (YBT), and injury risk. In static balance, SI values were significantly (P<0.001) higher in the UPT group compared to controls. In dynamic balance, normalized inter-limb asymmetry values were also significantly higher in athletes with UPT compared to controls in anterior (P<0.001), posteromedial (P<0.001) and posterolateral (P<0.01) directions, and in the composite score (P<0.001). Furthermore, the incidence of sustaining a non-contact lower extremity injury during the follow-up period (10 months) was significantly higher (P<0.05) in the UPT group compared to controls. Athletes with UPT had postural balance inter-limb asymmetries. Moreover, they had increased subsequent lower extremity musculoskeletal injury risk compared to controls. Since most athletes with UPT continue to train and compete, adequate training and rehabilitation programs should be implemented to prevent potential subsequent injury occurrence.
... Pre-season is a critical phase of training for elite rugby players to prepare for the physical demands of competition. Preventing illness, caused by opportunistic infection or reactivation of latent viruses, is paramount to optimize athletes' training availability and, ultimately, performance (Palmer-Green et al., 2013;Raysmith & Drew, 2016). However, intensified training, a typical characteristic of pre-season training in elite rugby union and league (Argus et al., 2010;Killen et al., 2010), can increase illness risk by suppressing various components of immune function (Gleeson & Pyne, 2016;C. ...
... These findings may be explained by the general consensus in the literature and elite sport, whereby athletes are typically only advised to avoid training if they are reporting "below-the-neck" (systemic) symptoms (Walsh, 2018). In the current study, it is possible that rugby players with URTS modified training (e.g., reduced volume and/or intensity of training), which can negatively affect sporting success (Raysmith & Drew, 2016); however, this information was not provided to the researchers. Further research is needed to explore the relationship between training modification and URTS episodes in elite team-sport athletes. ...
Article
This study examined possible predictors of upper respiratory tract symptom (URTS) episodes in elite rugby union and league players (n = 51) during intensive pre-season training. Baseline saliva and blood samples were collected in the first week of pre-season training for analysis of salivary secretory immunoglobulin A (SIgA) and cytomegalovirus. Thereafter, SIgA, URTS, internal training load and self-reported wellness data were repeatedly measured throughout a 10-week pre-season training period. Univariate frailty model analysis, which included 502 observations, was performed for each rugby code for the following independent predictor variables: SIgA concentration, internal training load, total wellness, sleep quantity, sleep quality and stress. Rugby union and league players experienced a similar number of URTS episodes; however, predictors of URTS episodes differed between the codes. No biomarkers or self-reported measures significantly predicted URTS risk in rugby union players, while reductions in self-reported total wellness (HR: 0.731, p = 0.004) and sleep quality (HR: 0.345, p = 0.001) predicted increased URTS risk in rugby league players. The findings from this study highlight that factors influencing URTS risk are perhaps sport specific and this may be attributed to different sporting demands and/or different management of players by team-practitioners.
... Elite athletes who can maintain training availability at 80% have a significantly greater chance of achieving their key performance goals. 1 Therefore, in order to maximize the likelihood of success in elite athletes, attention should be paid to the prevention of both injury and illness. 1,2 Injuries within professional cycling could be classified as injuries caused by accidents (ie, trauma due to falls) or overuse injuries. 3 Lower back pain and knee pain are the most common overuse injuries in elite cycling. ...
Article
Purpose: To determine if workload and seasonal periods (preseason vs in season) are associated with the incidence of injuries and illnesses in female professional cyclists. Methods: Session rating of perceived exertion was used to quantify internal workload and was collected from 15 professional female cyclists across 33 athlete seasons. One-week (acute) workload, 4-week (chronic) workload, and 3 acute:chronic workload models were analyzed. Two workload models are based on moving averages of the ratios: the acute:chronic workload ratio (ACWR) and the uncoupled ACWR (ACWRuncoup). The difference between the two lies in the chronic load: in the ACWR, the acute load is part of the chronic load, whereas in the ACWRuncoup, the acute and chronic loads are uncoupled. The third workload model is based on exponentially weighted moving averages of the ratios. In addition, the athlete season is divided into the preseason and in season. Results: Generalized estimating equations analysis was used to assess the associations between the workload ratios and the occurrence of injuries and illnesses. High values of acute workload (P = .048), ACWR (P = .02), ACWRuncoup (P = .02), exponentially weighted moving averages of the ratios (P = .01), and the in-season period (P = .0001) were significantly associated with the occurrence of injury. No significant associations were found between the workload models, the seasonal periods, and the occurrence of illnesses. Conclusions: These findings suggest the importance of monitoring workload and workload ratios in female professional cyclists to lower the risk of injuries and therefore improve their performances. Furthermore, these results indicate that, in the preseason, additional stressors occur, which could lead to an increased risk of injuries.
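
The three workload models named above can be reproduced in a few lines; the sketch below uses made-up daily session-RPE loads and the commonly published definitions of the coupled, uncoupled and EWMA ratios, which may differ in detail from the authors' implementation.

```python
# Coupled ACWR, uncoupled ACWR and an EWMA-based ratio on hypothetical daily loads.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
daily_load = pd.Series(rng.integers(100, 600, size=56).astype(float))  # 8 weeks of made-up daily sRPE (AU)

acute = daily_load.rolling(7).mean()                    # 7-day (acute) rolling load
chronic_coupled = daily_load.rolling(28).mean()         # 28-day (chronic) load; acute window included
acwr = acute / chronic_coupled

# Uncoupled variant: the chronic window is the 21 days preceding the acute week.
chronic_uncoupled = daily_load.shift(7).rolling(21).mean()
acwr_uncoupled = acute / chronic_uncoupled

# EWMA variant: exponentially weighted averages with 7- and 28-day spans.
ewma_ratio = daily_load.ewm(span=7, adjust=False).mean() / daily_load.ewm(span=28, adjust=False).mean()

print(pd.DataFrame({"ACWR": acwr, "ACWR_uncoupled": acwr_uncoupled,
                    "EWMA_ratio": ewma_ratio}).dropna().tail().round(2))
```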
... 10 Adult athletes experience an average burden of injury of two weeks per season. 11 In 45% of youth athletes, more than a week's loss of training is due to injury. 12 A resultant 65% reduction in the achievement of their performance goals occurs in athletes who sustain more than one injury requiring a modification of more than 20% of their seasonal training weeks. ...
... 12 A resultant 65% reduction in the achievement of their performance goals occurs in athletes who sustain more than one injury requiring a modification of more than 20% of their seasonal training weeks. 11 The impact of an injury on an athlete produces a physical, psychological, performance, and financial fallout that requires management over several seasons. 1,7,13 Prevention is better than cure in breaking the chain reaction that develops as a sequela of injury. ...
Article
Objective: This review aims to evaluate the effectiveness of exercise intervention versus no intervention or alternate intervention to prevent shoulder injuries in athletes. Introduction: Injury-prevention research has proven the effectiveness of exercise in preventing sports injuries in general and in the lower limb specifically. However, the results have been extrapolated to sport-related shoulder injuries from limited evidence. Similar reviews have been faced with insufficient high-quality evidence and limited studies due to restrictive target populations, resulting in reduced generalizability. Inclusion criteria: Peer-reviewed randomized controlled trials, with adequate control arms, investigating shoulder-injury events after exercise intervention in athletes training or competing in sports will be included. Studies with substitute end points for injury events and non-self-propelled athletes, or vehicle-assisted athletes, will be excluded. Methods: A comprehensive search of multiple databases will be used to find relevant studies. The databases will be searched from inception to April 2021, with no language restrictions imposed. Keywords and derivatives of "sport," "exercise intervention," "prevention," "shoulder injury," and "randomized controlled trials" will be used. Sources will include Academic Search Ultimate (EBSCOhost); CINAHL Plus (EBSCOhost); Cochrane Central Register of Controlled Trials (Wiley); MasterFILE Premier (EBSCOhost); MEDLINE (PubMed); Physiotherapy Evidence Database (PEDro); ProQuest Health and Medical Complete and Nursing and Allied Health Source (ProQuest Complete); ScienceDirect (Elsevier); Scopus (Elsevier); SPORTDiscus (EBSCOhost); and Web of Science (Clarivate Analytics). Data extraction and synthesis will follow the JBI Manual for Evidence Synthesis guidance for systematic reviews of effectiveness. Systematic review registration number: PROSPERO CRD42020204141.
... Traditionally thought to act as a simple scaffold network, recent technological advances have shown that the ECM is not only responsible for biological mechanotransduction, but also is dynamically involved in signaling and regulatory processes within muscle (Gillies & Lieber, 2011). Although metabolic and mechanical stress is required to initiate tissue remodeling and repair (Hyldahl & Hubal, 2014), accelerating the recovery from this damage may be favorable for directly improving athletic performance and for accumulating training volume and adaptation (Raysmith & Drew, 2016). ...
Article
The authors sought to determine whether consuming collagen peptides (CP) enhances musculoskeletal recovery of connective tissues following a damaging exercise bout. Resistance-trained males consumed 15 g/day of CP (n = 7) or placebo (n = 8), and after 7 days, maximal voluntary isometric contraction (MVIC), countermovement jump height, soreness, and collagen turnover were examined. Five sets of 20 drop jumps were performed and outcome measures were collected 24, 48, and 120 hr postexercise. Countermovement jump height was maintained in the CP group at 24 hr (PRE = 39.9 ± 8.8 cm vs. 24 hr = 37.9 ± 8.9 cm, p = .102), whereas the placebo group experienced a significant decline at 24 hr (PRE = 40.4 ± 7.9 cm vs. 24 hr = 35.5 ± 6.4 cm, p = .001; d = 0.32). In both groups, muscle soreness was significantly higher than PRE at 24 hr (p = .001) and 48 hr (p = .018) but not at 120 hr (p > .05). MVIC in both legs showed a significant time effect (left: p = .007; right: p = .010) over the 5-day postexercise period. Neither collagen biomarker changed significantly at any time point. CP supplementation attenuated performance decline 24 hr following muscle damage. Acute consumption of CP may provide a performance benefit the day following a bout of damaging exercise in resistance-trained males.
... In our study, the lower risk of disordered eating amongst international athletes and male national athletes compared to their recreational counterparts may represent a selection factor, such that the health and performance implications of improper fueling could preclude progression to the international level. A higher drive for thinness is associated with an increased incidence of musculoskeletal injuries in female athletes [47], and disordered-eating-related injury could certainly interfere with athletic success due to loss of training time [48]. Differences across athlete calibre may also relate to underlying motivations for training and competing, as initiating training to lose weight is associated with an increased risk of disordered eating development [49]. ...
Article
Full-text available
Both dietary and exercise behaviors need to be considered when examining underlying causes of low energy availability (LEA). The study assessed if exercise dependence is independently related to the risk of LEA with consideration of disordered eating and athlete calibre. Via survey response, female (n = 642) and male (n = 257) athletes were categorized by risk of: disordered eating, exercise dependence, disordered eating and exercise dependence, or if not presenting with disordered eating or exercise dependence as controls. Compared to female controls, the likelihood of being at risk of LEA was 2.5 times for female athletes with disordered eating and >5.5 times with combined disordered eating and exercise dependence. Male athletes with disordered eating, with or without exercise dependence, were more likely to report signs and symptoms compared to male controls-including suppression of morning erections (OR = 3.4; p < 0.0001), increased gas and bloating (OR = 4.0–5.2; p < 0.002) and were more likely to report a previous bone stress fracture (OR = 2.4; p = 0.01) and ≥22 missed training days due to overload injuries (OR = 5.7; p = 0.02). For both males and females, in the absence of disordered eating, athletes with exercise dependence were not at an increased risk of LEA or associated health outcomes. Compared to recreational athletes, female and male international caliber and male national calibre athletes were less likely to be classified with disordered eating.
... Training days lost due to acute illness may negatively impact athlete performance. 5 Availability of players in teams is important to achieve team and tournament success. 6 Acute illness not only decreases performance and reduces the ability to sustain high intensity training, 7 but also increases the risk of serious medical complications and even sudden death during strenuous exercise. ...
Article
Objectives To document incidence rate and severity of specific sub-categories of respiratory tract illness (RTill) in rugby players during the Super Rugby tournament. Design Cross-sectional study. Methods Team physicians completed daily illness logs in 537 professional male rugby players from South African teams participating in the Super Rugby Union tournaments (2013–2017) (1141 player-seasons, 102,738 player-days). The incidence rate (IR: illness episodes/1000 player-days) and severity [%RTill resulting in time-loss, illness burden (IB: days lost to illness/1000 player-days) and days until return-to-play (DRTP)/single illness (mean: 95% Confidence Intervals)] are reported for the following specific sub-categories of RTill: non-infective respiratory tract illness (RTnon-inf), respiratory tract infections (RTinf), influenza-like illness, infective sinusitis, upper respiratory tract infections (URTinf) and lower respiratory tract infections (LRTinf). Results The overall IR of RTill was 2.9 (2.6–3.3). IR was higher for RTinf (2.5; 2.2–2.9) vs. RTnon-inf (0.4; 0.3–0.6) (p < 0.001). Among sub-categories, the highest IR was for URTinf (1.9; 1.7–2.2), while the percentage of illnesses causing time loss was highest for influenza-like illness (100%), LRTinf (91.7%), infective sinusitis (55.6%) and URTinf (49.0%). IB was highest for URTinf (2.0; 1.6–2.5), and the DRTP/single illness was highest for LRTinf (3.2; 2.3–4.4) and influenza-like illness (2.1; 1.6–2.8). Conclusions RTinf accounted for >57% of all illness during the Super Rugby tournament, mostly URTinf. Influenza-like illness and LRTinf caused time loss in >90% of cases. URTinf, LRTinf and influenza-like illness resulted in the highest burden of illness and LRTinf caused the highest DRTP. Prevention strategies should focus on mitigating the risk of RTinf, specifically URTinf, LRTinf and influenza-like illness.
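
The rate measures used above (IR, IB and DRTP) are simple ratios; the short example below recomputes them from the reported exposure of 102,738 player-days and hypothetical episode counts, which are placeholders rather than the study's data.

```python
# Worked example of incidence rate, illness burden and days lost per episode.
player_days = 102_738            # exposure reported in the abstract

urt_inf_episodes = 200           # assumed number of URTinf episodes (placeholder)
urt_inf_days_lost = 205          # assumed total days lost to URTinf (placeholder)

incidence_rate = urt_inf_episodes / player_days * 1000    # episodes per 1000 player-days
illness_burden = urt_inf_days_lost / player_days * 1000   # days lost per 1000 player-days
days_lost_per_episode = urt_inf_days_lost / urt_inf_episodes

print(f"IR = {incidence_rate:.1f} /1000 player-days, "
      f"IB = {illness_burden:.1f} /1000 player-days, "
      f"DRTP = {days_lost_per_episode:.1f} days per episode")
```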
... 1 3 Reported recurrence rates in prospective follow-up studies range from 14% to 63%. 3 6 7 In track and field, hamstring injury is the most prevalent injury in competition and therefore has significant performance and financial implications for athletes, national governing bodies and international federations. [8][9][10] The British Athletics Muscle Injury Classification (BAMIC) is an MRI classification system with clearly defined, anatomically focused classes based on the site of injury: myofascial (class a), muscle-tendon junction (class b) or intratendon injury (class c) and a numerical grading system (0-4) based on the extent of injury (table 1). 11 It is a reliable classification system that is associated with return to play. ...
Article
Objectives The British Athletics Muscle Injury Classification (BAMIC) correlates with return to play in muscle injury. The aim of this study was to examine hamstring injury diagnoses and outcomes within elite track and field athletes following implementation of the British Athletics hamstring rehabilitation approach. Methods All hamstring injuries sustained by elite track and field athletes on the British Athletics World Class Programme between December 2015 and November 2019 that underwent an MRI and had British Athletics medical team-prescribed rehabilitation were included. Athlete demographics and specific injury details, including mechanism of injury, self-reported gait phase, MRI characteristics and time to return to full training (TRFT) were contemporaneously recorded. Results 70 hamstring injuries in 46 athletes (24 women and 22 men, 24.6±3.7 years) were included. BAMIC grade and the intratendon c classification correlated with increased TRFT. Mean TRFT was 18.6 days for the entire cohort. Mean TRFT for intratendon classifications was 34±7 days (2c) and 48±17 days (3c). The overall reinjury rate was 2.9% and no reinjuries were sustained in the intratendon classifications. MRI variables of length and cross-sectional area (CSA) of muscle oedema, CSA of tendon injury and loss of tendon tension were associated with TRFT. Longitudinal length of tendon injury, in the intratendon classes, was not associated with TRFT. Conclusion The application of BAMIC to inform hamstring rehabilitation in British Athletics results in low reinjury rates and favourable TRFT following hamstring injury. The key MRI variables associated with longer recovery are length and CSA of muscle oedema, CSA of tendon injury and loss of tendon tension.
... Perceived health, which refers to a person's general perception of her/his health, might be linked to injury (Messier et al., 2018;Raysmith and Drew, 2016). Low perceived health has been reported as a reason to discontinue running (Fokkema et al., 2019). ...
Article
Full-text available
Knowledge about prevalence and etiology of running-related injuries (RRIs) is important to design effective RRI prevention programs. Mental aspects and sleep quality seem to be important potential risk factors, yet their association with RRIs needs to be elucidated. The aims of this study are to investigate the epidemiology of RRIs in recreational runners and the association of mental aspects, sleep, and other potential factors with RRIs. An internet-based questionnaire was sent to recreational runners recruited through social media, asking for personal and training characteristics, mental aspects (obsessive passion, motivation to exercise), sleep quality, perceived health, quality of life, foot arch type, and RRIs over the past six months. Data were analyzed descriptively and using logistic regression. Self-reported data from 804 questionnaires were analyzed. Twenty-five potential risk factors for RRIs were investigated. 54% of runners reported at least one RRI. The knee was the most-affected location (45%), followed by the lower leg (19%). Patellofemoral pain syndrome was the most-reported injury (20%), followed by medial tibial stress syndrome (17%). Obsessive passionate attitude (odds ratio (OR):1.35; 95% confidence interval (CI):1.18-1.54), motivation to exercise (OR:1.09; CI:1.03-1.15), and sleep quality (OR:1.23; CI:1.15-1.31) were associated with RRIs, as were perceived health (OR:0.96; CI:0.94-0.97), running over 20 km/week (OR:1.58; CI:1.04-2.42), overweight (OR:2.17; CI:1.41-3.34), pes planus (OR:1.80; CI:1.12-2.88), hard-surface running (OR:1.37; CI:1.17-1.59), running company (OR:1.65; CI:1.16-2.35), and following a training program (OR:1.51; CI:1.09-2.10). These factors together explained 30% of the variance in RRIs. A separate regression analysis showed that mental aspects and sleep quality explain 15% of the variance in RRIs. The association of mental aspects and sleep quality with RRIs adds new insights into the multifactorial etiology of RRIs. We therefore recommend that besides common risk factors for RRI, mental aspects and sleep be incorporated into the advice on prevention and management of RRIs.
... Days of training missed due to illness or injury can limit training volume for endurance athletes [1] which can then have a detrimental effect on performance [2]. However, upper respiratory tract and pulmonary infection risk has been shown to increase 8-10% with each 10% increase in training load and by 50-70% during periods of intensified training in swimmers [3]. ...
Article
Full-text available
Introduction: Understanding the sport-specific immune response elicited during both training and competition is imperative to maximise athlete health and performance. Despite a growing population of professional enduro mountain bike athletes, little is known about the recovery of the immune system following enduro racing events. Methods: Nine international level elite enduro mountain bike athletes (age 24.3 ± 2.4 years, height 178.5 ± 8.7 cm, mass 76.5 ± 12.5 kg) completed a laboratory-based maximal exercise test (LAB) on a cycle ergometer and competed in an international mountain bike enduro race event (RACE). Blood samples were taken before, immediately after, and 1 h after LAB and before, 1 h after, and 17 h after RACE. Leukocyte subsets were enumerated using seven-colour flow cytometry. Lucia's training impulse (LuTRIMP) and vibration exposure (VIB) were quantified during RACE. Results: Seven participants were included in the final analyses. There was a significant (p < 0.05) increase in neutrophil count alongside a reduction of cytotoxic lymphocyte cell subsets of both the innate (CD3-/CD56+ NK-cells and CD3-/CD56dim NK-cells) and adaptive (CD8+/CD62L-/CD45RA- T-cells and CD8+/CD27+/CD28- T-cells) components of the immune system one hour after RACE. All cell counts returned to baseline values 17 h afterwards (p > 0.05). Cell subset redistribution from pre- to post-one-hour time points (%Δpre-post1h) in cell subsets with potent effector functions (Neutrophils, CD3-/CD56+ NK-cells, CD8+/CD62L-/CD45RA- T-cells, CD8+/CD27+/CD28- T-cells, and CD3-/CD56dim/CD57- NK-cells) was significantly greater at RACE than LAB (p < 0.05). VIB was shown to be a superior predictor of %Δpre-post1h CD4+ T-cells, CD4+ early T-cells, CD4+ naïve T-cells, and NK cells as compared with LuTRIMP on its own (ΔR2 = 0.63 - 0.89, p < 0.05). Conclusions: The race event offers a greater challenge to the immune system than LAB, and potentially, whole body vibration is a key component of training load measurement in mountain bike applications.
... Training in this CHO-restricted state also increases cortisol production and reduces antibody production and lymphocyte proliferation, thus potentially leading to a transient period of immunosuppression and consequent heightened infection risk (Gleeson et al., 2001). An illness may enforce an involuntary reduction in training frequency, and each week of unfinished planned training is understood to decrease the probability of success by 26% in elite athletes (Raysmith and Drew, 2016). Therefore, excessive use of this strategy may best be avoided in order to protect the immunity of athletes. ...
Article
Full-text available
Due to the importance of glycogen for energy production, research has traditionally recommended sufficient carbohydrate (CHO) availability to maximise exercise performance. However, recent evidence has suggested that undertaking some training sessions with low CHO availability may bring about greater physiological adaptations. This strategy has commonly been termed as 'train low'. Although desirable adaptations in gene expression related to mitochondrial biogenesis and the activity of enzymes related to aerobic metabolism have been observed, research is conflicted towards the ergogenic impact this technique has on exercise performance. Additionally, this strategy may produce maladaptations such as reduced training intensity, immunosuppression, protein oxidation and reduced pyruvate dehydrogenase (PDH) activity. Therefore, if athletes are to adopt this strategy, it is suggested that they periodise 'train low' to solely low-intensity sessions which won't be impaired by a drop in work rate, but otherwise maintaining sufficient daily CHO intake. Also, athletes could negate potential maladaptations by using caffeine and/or CHO mouth rinse to maintain exercise intensity, and increasing protein ingestion to counteract increased protein oxidation. Future research should directly compare the effect of 'train low' between use in all sessions and solely low-intensity sessions within one comprehensive study to better understand the mechanisms behind the apparent superiority of the latter strategy.
... It has been shown that LH travel can cause further risk to athlete health because of increased risk of venous thromboembolisms (5), upper respiratory tract infection (49,53), and gastrointestinal symptoms (53). If an athlete does suffer from illness after a LH flight, this is likely to have a detrimental effect on subsequent performance (35). ...
... Subjects also noted whether any training sessions were burdened or modified in response to injury or illness post-training (i.e., subjects were able to train but may have required medical attention peri-training or had their training modified) (21). As mentioned, if subjects could not complete scheduled training sessions because of injury or illness, they were excluded from the study. ...
Article
Full-text available
The main purpose of this investigation was to examine the influence of mental fatigue on sessional ratings of perceived exertion (sRPE) over a training week in elite athletes in open skill (OS, i.e., more unpredictable and externally paced sports) and closed skill (CS, i.e., more predictable and internally paced) sports. Visual analogue scales for mental fatigue, sRPE (CR-10 scale), and training duration were collected from an OS group (n = 27) of basketball and volleyball athletes and a CS group (n = 28) of weightlifting and track and field athletes during a typical training week 5 months before the 2016 Olympic Games. These variables were then examined using repeated-measures correlations and linear mixed models with the level of significance set for the study at p < 0.05. There was a small significant correlation between mental fatigue and sRPE in the OS group (r = 0.23, p = 0.01), but not in the CS group (r = 0.07, p = 0.38). Mental fatigue had a trivial influence on sRPE during individual sessions, but had a moderate effect on total sRPE over a week (p = 0.001, f2 = 0.265) when accounting for type of sport, training duration, and injury/illness burden. It seems mental fatigue may not significantly influence sRPE in individual training sessions, but may potentially have a cumulative effect that may affect the sRPE over a training week. This suggests monitoring mental fatigue independently of other training load (TL) measures may be worthwhile for strength and conditioning specialists and sports coaches managing their athletes, and for researchers conducting studies into TL and performance.
... Staying healthy and injury free is one of the most important factors for optimal performance in sports. 1 The machine learning algorithm that we use takes as input a set of feature vectors (data describing a training setup) and the corresponding label (injury or healthy). In an iterative approach the algorithm determines a predictive model that best maps the input feature vectors to the corresponding labels. ...
Purpose: Staying injury free is a major factor for success in sports. Although injuries are difficult to forecast, novel technologies and data-science applications could provide important insights. Our purpose was to use machine learning for the prediction of injuries in runners, based on detailed training logs. Methods: Prediction of injuries was evaluated on a new data set of 74 high-level middle- and long-distance runners, over a period of 7 years. Two analytic approaches were applied. First, the training load from the previous 7 days was expressed as a time series, with each day's training being described by 10 features. These features were a combination of objective data from a global positioning system watch (eg, duration, distance), together with subjective data about the exertion and success of the training. Second, a training week was summarized by 22 aggregate features, and a time window of 3 weeks before the injury was considered. Results: A predictive system based on bagged XGBoost machine-learning models resulted in receiver operating characteristic curves with average areas under the curves of 0.724 and 0.678 for the day and week approaches, respectively. The results of the day approach especially reflect a reasonably high probability that our system makes correct injury predictions. Conclusions: Our machine-learning-based approach predicts a sizable portion of the injuries, in particular when the model is based on training-load data in the days preceding an injury. Overall, these results demonstrate the possible merits of using machine learning to predict injuries and tailor training programs for athletes.
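
As a rough sketch of the day-level modelling approach described above, the code below bags several XGBoost classifiers over flattened 7-day, 10-feature training windows and scores them with ROC AUC; the random placeholder data, window construction and hyperparameters are assumptions, not the authors' pipeline.

```python
# Bagged XGBoost injury classifier on placeholder data (no real signal present).
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

rng = np.random.default_rng(42)
n_samples, n_days, n_features_per_day = 2000, 7, 10
X = rng.normal(size=(n_samples, n_days * n_features_per_day))   # flattened 7-day time series
y = rng.binomial(1, 0.05, size=n_samples)                        # rare injury label

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

# Bagging several XGBoost models, as the abstract describes ("bagged XGBoost").
model = BaggingClassifier(XGBClassifier(n_estimators=100, max_depth=3), n_estimators=10, random_state=0)
model.fit(X_train, y_train)

auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"ROC AUC on held-out data: {auc:.3f}")   # ~0.5 here, since the placeholder data carry no signal
```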
... 9 Recognition of the role of illness is growing in sports medicine; for instance, physiological changes associated with acute infective illness impair motor coordination, decrease muscle strength, 11 reduce peak VO2 (peak oxygen consumption) and endurance capacity and change metabolic function, 12 consequently affecting biathletes' ability to train, compete 2 and perform well. 2 Our current understanding of illness rates is limited to data from Winter Olympic Games (11% in Vancouver, 6 10% in Sochi 7 and 15% in PyeongChang 8 ). As with injury data, no longitudinal studies in cross-country skiing or biathlon are currently available to explore the impact of illness on these athletes. ...
Article
Full-text available
Introduction Reliably and accurately establishing injury and illness epidemiology in biathletes will provide insight into seasonal changes, provide potential to better embed innovative prevention strategies and advance sports medicine through the provision of effective healthcare to biathletes. The main objective of the Biathlon Injury and Illness Study (BIIS) is to provide the first comprehensive epidemiological profile of injury and illness in biathlon athletes during two consecutive Biathlon World Cup seasons over 2-years. Methods The BIIS study methodology is established in line with the International Olympic Committee (IOC) injury and illness surveillance protocols using a biathlon-specific injury and illness report form. Team medical staff will provide weekly data using injury and illness definitions of any injury or illness that receives medical attention regardless of time loss. Injuries or illness must be diagnosed and reported by a qualified medical professional (eg, team physician, physiotherapist) to ensure accurate and reliable diagnoses. Descriptive statistics will be used to identify the type, body region and nature of the injury or illness and athlete demographics such as age and gender. Summary measures of injury and illnesses per 1000 athlete-days will be calculated whereby the total number of athletes will be multiplied by the number of days in the season to calculate athlete-days. Ethics and Dissemination This study has been approved by the Bellbery Human Research Ethics Committee (HREC reference: 2017-10-757). Results will be published irrespective of negative or positive outcomes and disseminated through different platforms to reach a wide range of stakeholders.
... Hence, it was not possible to determine any causes of the self-reported illness episodes. While any reduction in or loss of training time as a consequence of illness is important, regardless of cause [10,44], future studies should aim to determine the causes of reported illness episodes in developing XC skiers. This may subsequently enable a reduction of illness within this population. ...
Article
Full-text available
Objective This study aimed to describe the endurance training and incidence of illnesses reported by a group of well-trained cross-country (XC) skiers throughout their transition from junior to senior level. Methods Changes in self-reported training and performance, from 31 well-trained XC skiers, were analyzed from the start of the season they turned 16 y until the end of the season they turned 22 y, using linear mixed-effects models. Differences in the incidence of self-reported illness episodes were analyzed using incidence rate ratios, and the relationships between self-reported illness and training volumes were analyzed using linear mixed-effects models in a sub-group of 23 of the skiers. Results In total, 145 seasons of training data (including 85,846 h of endurance training) and 109 person-years of illness data (including 380 self-reported illness episodes) were analyzed. The athletes progressively increased their annual endurance training volume from age 16 to 22 y in a linear fashion, from ~470 to ~730 h. Low- and high-intensity training volumes increased by 51.4 ± 2.4 h·y⁻¹ (p < .001) and 4.9 ± 0.6 h·y⁻¹ (p < .001), respectively. Sport-specific and non-specific training increased by 50.0 ± 2.2 h·y⁻¹ (p < .001) and 4.6 ± 2.0 h·y⁻¹ (p < .001), respectively. The athletes reported a median (range) of 3 (0–8) illness episodes and 17 (0–80) days of illness per year, and there was an inverse relationship between self-reported illness days and annual training volume (-0.046 ± 0.013 d·h⁻¹; p < .001). Conclusions This group of well-trained XC skiers increased their endurance training volume in a linear fashion by ~55 h annually. This was primarily achieved through an increase in low-intensity and sport-specific training. Furthermore, higher training volumes were associated with a lower number of self-reported illness days.
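
The linear mixed-effects analysis described above can be sketched as a random-intercept model of annual training volume on age; the example below uses statsmodels on synthetic data whose slope is only loosely inspired by the reported increase of roughly 50 h·y⁻¹, so the fitted coefficients are illustrative.

```python
# Random-intercept mixed model of annual training hours vs age, on synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
skiers, ages = 31, np.arange(16, 23)
df = pd.DataFrame([(s, a) for s in range(skiers) for a in ages], columns=["skier", "age"])

athlete_offset = rng.normal(0, 40, skiers)[df["skier"]]       # per-skier random intercept
df["annual_hours"] = 470 + 52 * (df["age"] - 16) + athlete_offset + rng.normal(0, 30, len(df))

model = smf.mixedlm("annual_hours ~ age", data=df, groups=df["skier"]).fit()
print(model.params["age"])   # estimated yearly increase in training volume (h per year of age)
```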
... The type and severity of an injury have a direct impact on the return-to-sports (RTS) time frame [29]. At the international level, training time loss due to an injury is considered a major determinant of success or failure [30]. The current results show that the duration of sporting time loss differed between injuries, with the longest downtime for ligament and muscle injuries of the elbow (36 and 39 weeks, respectively) in throwing sports. ...
Article
Full-text available
Objectives: To prevent injuries in a sport, exact knowledge of injury patterns is needed. The aim was to synthesize sport-specific injuries in track and field, comparing elite and recreational-level athletes as well as gender, and furthermore to analyze the time loss due to injury and the reduction in athletic performance. Methods: Injury type-specific frequencies were recorded according to discipline, gender and performance level. Injury severity was assessed by time loss duration and performance reduction. Results: 64% of athletes suffered at least one injury. In the top 10 ranking, 83% (n = 524) were located in the lower extremities. A muscle strain of the thigh had the highest prevalence in sprint (34%, n = 41), jump (15%, n = 15) and middle-distance running (16%, n = 6). More injuries occurred during training (75%, n = 165) as compared to competition (25%, n = 56). The longest time loss was documented in throwing with a downtime of 36 weeks after a ligament injury of the elbow and 39 weeks after a muscle injury of the elbow. The injury associated with the highest proportion of athletes reporting a reduced level of performance was foot ligament injury in sprint athletes (100%). Conclusion: When assessing time loss and performance reduction in athletics, there are discipline-specific injury patterns. This study points out the high prevalence of training injuries, highlighting the need for future investigations to adapt training management, improve medical care and rehabilitation with respect to every discipline.
... Calf muscle strain injuries (CMSI) are prevalent in elite sports [1,2] and contribute to the negative impact that any injury can have on team success [3][4][5]. The burden of CMSI can also be significant, with > 3 months time-loss reported for some cases in American football [6], football (soccer) [1] and Australian Football [7]. ...
Conference Paper
Full-text available
Background Despite being a common cause of time loss, information regarding best practice for calf muscle strain injuries (CMSI) in sport is scarce. Objective To establish best practice for the assessment and management of CMSI. Design Qualitative. Setting In-depth interviews. Patients (or Participants) 20 expert medical professionals working in elite sport and/or researchers specialising in the field; representing seven countries and seven sports. Interventions (or Assessment of Risk Factors) Semi-structured interviews using a schedule of questions canvassing pre-identified topics. Thematic coding to analyse findings. Main Outcome Measurements Data were evaluated in three key areas: (i) injury characteristics, (ii) injury management, and (iii) injury prevention. Results CMSI have unique injury characteristics compared to other common muscle strain injuries (e.g. hamstring), but a criteria-based approach can assist forming the most accurate impression of prognosis. Similarly, a structured approach should be followed to ensure the athlete returns to a high level of performance and the risk of re-injury is minimized, focusing on: re-strengthening, plyometric and ballistic exercises, as well as running-based reconditioning specific to the sport. For the best chance to prevent index CMSI, strategies should span multiple domains of athlete management: screening and monitoring, field-based exposure (e.g. workload data), and off-field interventions (e.g. strengthening). Injury prevention strategies should be tailored to the individual, considering extrinsic (the sport, position played, club culture/coach expectations) and intrinsic (previous injury history, age, training history) factors that may increase susceptibility to CMSI. Conclusions Knowledge about the unique injury characteristics of CSMI can clarify the likely prognosis and best approach to rehabilitation. Practitioners attempting to prevent CMSI should use a multi-faceted approach given that the aetiology of CMSI is complex and often unique to the individual.
... Maintaining athlete immune function is critical for optimising training availability and performance. 1 Strenuous, prolonged exercise depletes endogenous carbohydrate (CHO) stores (e.g. ...
Article
Full-text available
This study examined the effect of short-term adaptation to a ketogenic diet (KD) on resting and post-exercise immune markers. In a randomised, repeated-measures, cross-over study, eight trained, male, endurance athletes ingested a 31-day low-carbohydrate (CHO), KD (energy intake: 4% CHO; 78% fat) or their habitual diet (HD) (energy intake: 43% CHO; 38% fat). On days 0 and 31, participants ran to exhaustion at 70% VO2max . A high-CHO (2 g⋅kg-1 ) meal was ingested prior to the pre-HD, post-HD and pre-KD trials, with CHO (~55 g⋅h-1 ) ingested during exercise. Whereas, a low-CHO (<10 g) meal was ingested prior to the post-KD trial, with fat ingested during exercise. Blood and saliva samples were collected at pre-exercise, exhaustion and 1-h post-exhaustion. T-cell-related cytokine gene expression within peripheral blood mononuclear cells (PBMC) and whole-blood inflammatory cytokine production were determined using 24 h multi-antigen-stimulated whole-blood cultures. Multi-antigen-stimulated PBMC IFN-γ mRNA expression and the IFN-γ/IL-4 mRNA expression ratio were higher at exhaustion in the post- compared with pre-KD trial (p=0.003 and p=0.004); however, IL-4 and IL-10 mRNA expression were unaltered (p>0.05). Multi-antigen-stimulated whole-blood IL-10 production was higher in the post- compared with pre-KD trial (p=0.028); whereas, IL-1β, IL-2, IL-8 and IFN-γ production were lower in the post- compared with pre-HD trial (p<0.01). Salivary immunoglobulin A (SIgA) secretion rate was higher in the post- compared with pre-KD trial (p<0.001). In conclusion, short-term adaptation to a KD in endurance athletes may alter the pro- and anti-inflammatory immune cell cytokine response to a multi-antigen in vitro and SIgA secretion rate.
... [1][2][3]6,7 Injury has a detrimental impact on performance, with high levels of time lost from training being associated with athletes not reaching their performance goals. 8 Robust data is required on injury and illness to inform the development of effective prevention measures within a sport. 9,10 Across 16 major international athletics championships, muscle injuries were the most common injury type, representing 41% of all injuries. ...
Article
Full-text available
Background: Athletics (also known as track and field) is one of the most popular sports in the world and is the centrepiece of the Summer Olympic Games. Participation in athletics training and competition involves a risk of illness and injury. Purpose: To describe injury and illness in British Olympic track and field athletes over three full training and competition seasons. Study design: Descriptive Epidemiology Study. Methods: A total of 111 athletes on the British national program were followed prospectively for three consecutive seasons between 2015 and 2018. Team medical personnel recorded all injuries and illnesses during this time, following current consensus-based methods. All data pertaining to these records were reviewed and analyzed for sports injury and illness epidemiological descriptive statistics. Results: The average age of the athletes was 24 ± 4 years for both males and females. Total exposure for the three seasons was 79 205 athlete days (217 athlete years). Overuse injuries (56.4%) were more frequent than acute injuries (43.6%). The thigh was the most common injury location (0.6 per athlete year), followed by the lower leg (0.4 per athlete year) and foot (0.3 per athlete year). Muscle and tendon were the most commonly injured tissues, while strains and tears were the most common pathology type. Hamstring muscle strain was the most common diagnosis causing time loss, followed by Achilles tendinopathy and soleus muscle strain. Respiratory illness was the most common illness type (0.3 per athlete year). Conclusion: Hamstring strains, Achilles tendinopathy, and soleus strains are the most common injuries in athletics and have the highest burden. Respiratory illness is the most common illness and has the highest burden. Knowledge of this injury and illness profile within athletics could be utilised for the development of targeted prevention measures within the sport at the elite level. Level of evidence: 3.
... Calf muscle strain injuries (CMSI) are prevalent in elite sports [1,2] and contribute to the negative impact that any injury can have on team success [3][4][5]. The burden of CMSI can also be significant, with > 3 months time-loss reported for some cases in American football [6], football (soccer) [1] and Australian Football [7]. ...
Article
Full-text available
Background Despite calf muscle strain injuries (CMSI) being problematic in many sports, there is a dearth of research to guide clinicians dealing with these injuries. The aim of this study was to evaluate the current practices and perspectives of a select group of international experts regarding the assessment, management and prevention of CMSI using in-depth semi-structured interviews. Results Twenty expert clinicians working in elite sport and/or clinician-researchers specialising in the field completed interviews. A number of key points emerged from the interviews. Characteristics of CMSI were considered unique compared to other muscle strains. Rigor in the clinical approach clarifies the diagnosis, whereas ongoing monitoring of calf capacity and responses to loading exposure provides the most accurate estimate of prognosis. Athlete intrinsic characteristics, injury factors and sport demands shaped rehabilitation across six management phases, which were guided by key principles to optimise performance at return to play (RTP) while avoiding subsequent injury or recurrence. To prevent CMSI, periodic monitoring is common, but practices vary and data are collected to inform load-management and exercise selection rather than predict future CMSI. A universal injury prevention program for CMSI may not exist. Instead, individualised strategies should reflect athlete intrinsic characteristics and sport demands. Conclusions Information provided by experts enabled a recommended approach to clinically evaluate CMSI to be outlined, highlighting the injury characteristics considered most important for diagnosis and prognosis. Principles for optimal management after CMSI were also identified, which involved a systematic approach to rehabilitation and the RTP decision. Although CMSI were reportedly difficult to prevent, on- and off-field strategies were implemented by experts to mitigate risk, particularly in susceptible athletes.
... The list is neither exhaustive nor universal; other performance outcomes may be caused by LEA, while not all those with RED-S may experience all the listed impairments. Generally, more severe LEA and its consequences can cause lost training days due to injury or illness, and lost training days compromise performance 97 . Overall, coaches must internalize these basic considerations: proper training loads combined with adequate nutrition are two critical and inseparable factors for achieving peak athletic performance. ...
Article
Full-text available
The Female Athlete Triad (Triad) and the more encompassing Relative Energy Deficiency in Sport (RED-S) are disorders caused by low energy availability (LEA). LEA is a state of insufficient energy intake by an athlete relative to their energy expenditure. Persistent LEA results in the deleterious consequences to health and performance that comprise RED-S. With respect to both the Triad and RED-S, researchers have called for more education of those involved with sport, particularly coaches, to help reduce the incidence of these disorders. Recent studies have shown that as few as 15% of coaches are aware of the Triad, with up to 89% unable to identify even one of its symptoms. RED-S is a more recently established concept, such that coach knowledge regarding it has only begun to be assessed, but the results of these initial studies indicate similar trends as for the Triad. In this review, we synthesize research findings from 1986 to 2021 that pertain to LEA and RED-S, which coaches should know so they can better guide their athletes.
... This methodology has provided reliable and comparable data for this particular context of international championships [8,23]. However, if we broaden the focus to the whole track and field season, we find that only a few studies exist and that they use different methods [4,[24][25][26][27][28][29][30][31][32][33], which does not allow a true comparison of the data and could explain why injury data should now be presented separately for championships and for the whole season. A method was developed in 2014 at a consensus meeting of international and national athletics federations [11], and the IOC recently updated its consensus statement on methods for recording and reporting epidemiological data on injury and illness in sport in 2020 [10]; both are expected to support long-term cohort follow-ups over one or more seasons and comparison between studies. ...
Chapter
Track and field (athletics) is an Olympic sport composed of several different disciplines: sprints, hurdles, jumps, throws, combined events, middle and long distances, marathon, and race walking. The practice of track and field leads to a risk of injuries. A clear knowledge of the epidemiology of injuries is of great interest as a first step of injury prevention in track and field. In the context of international championships, several studies provide a clear view of the “risks” of injuries, with injury numbers, incidence, and characteristics varying with sex and discipline. For the whole season, which represents a significantly larger period of exposure, there are currently, to our knowledge, only a few studies reporting injury data, but their findings are consistent. About two thirds of athletes had at least one injury during the whole season. As an overview of the injury characteristics in track and field, the most common injury problems experienced are hamstring muscle injuries (especially in sprints, hurdles, and jumps), Achilles tendinopathies (in sprints, middle and long distances, and jumps), knee overuse injuries (in sprints, middle and long distances), shin splints and/or stress fractures (in sprints, middle and long distances), ankle sprains (in jumps), and low back pain (in jumps and throws).
... This makes reduction of muscle injuries the first challenge for athletes' health protection, and also for performance improvement, given the close relationships between health, injury and performance in athletics. 3,16,28 Hamstrings were the most frequently injured muscles in the majority of the disciplines involving running (>50% of muscle injuries in disciplines involving high running velocities: sprints, hurdles, jumps and combined events). This extends previous findings 2 placing hamstring muscle injury as the predominant injury diagnosis in international athletics championships. ...
Article
Objective To analyse the rates of lower limb muscle injuries in athletics disciplines requiring different running velocities during international athletics championships. Design Prospective total population study. Methods During 13 international athletics championships (2009–2019), national medical teams and local organizing committee physicians reported daily all newly incurred injuries using the same study design, injury definition and data collection procedures. In-competition lower limb muscle injuries of athletes participating in disciplines involving running (i.e. sprints, hurdles, jumps, combined events, middle distances, long distances, and marathon) were analysed. Results Among the 12,233 registered athletes, 344 in-competition lower limb muscle injuries were reported (36% of all in-competition injuries). The proportion, incidence rates and injury burden of lower limb muscle injuries differed between disciplines for female and male athletes. The most frequently injured muscle group was the hamstring in sprints, hurdles, jumps, combined events and male middle distance runners (43% to 75%), and the posterior lower leg in female middle distance, male long distance, and female marathon runners (44% to 60%). Hamstring muscle injuries led to the highest burden in all disciplines, except for female middle distance and marathon and male long distance runners. Hamstring muscle injury burden was generally higher in disciplines requiring higher running velocities, and posterior lower leg muscle injury burden higher in disciplines requiring lower running velocities. Conclusions The present study shows discipline-specific injury locations in the competition context. Our findings suggest that running velocity could be one of the factors that play a role in the occurrence/location of muscle injuries.
... Injuries can be highly disadvantageous. They can lead to reduced physical performance [39], high medical costs [40] and in extreme cases they can be career-ending [23,41,42]. ...
... 7 These injuries can lead to significant periods of time loss from sports with a negative impact on performance. 8 Long-term effects of joint injuries increase the risk of osteoarthritis development and joint instability, causing limitations in daily life, sports and/or work. 9 Thus, injuries in judo could potentially lead to important short-term and long-term problems. ...
Article
Full-text available
Objectives To systematically develop an injury prevention programme in judo and test its feasibility: Injury Prevention and Performance Optimization Netherlands (IPPON) intervention. Methods We used the five-step Knowledge Transfer Scheme (KTS) guidelines. In the first two steps, we described the injury problem in judo and showed possibilities to reduce the injury rates. In the third step, the Knowledge Transfer Group (KTG) translated this information into actions in judo practice. Expert meetings and practical sessions were held. In the fourth step, we developed the injury prevention programme and evaluated its feasibility in judo practice in a pilot study. As a final step, we will evaluate the effectiveness of the injury prevention programme in reducing injuries. Results In the first two steps, the information collected indicated the need to reduce judo injuries due to high incidence rates. Injury prevention programmes have been shown to be effective in reducing injuries in other sports. For judo, no injury prevention programme has yet been systematically developed. In the third step, the KTG reached consensus about the content: a trainer-based warm-up programme with dynamic exercises focusing on the shoulder, knee and ankle. In the fourth step, the intervention was developed. All exercises were approved in the pilot study. Based on the pilot study’s results, the IPPON intervention was extended and has become suitable for the final step. Conclusion We developed the IPPON intervention using the systematic guidance of the KTS. This trainer-based programme focuses on the prevention of shoulder, knee and ankle injuries in judo and consists of 36 exercises classified in three categories: (1) flexibility and agility, (2) balance and coordination and (3) strength and stability. The effectiveness and feasibility of the intervention for injury reduction among judo athletes will be evaluated in a randomised controlled trial.
... Total injury incidence in soccer (football) has been reported at rates of 2.0 to 19.4 per 1000 hours exposure in elite youth players, and 2.5 to 9.4 per 1000 hours exposure in professionals [1]. These injuries present a high burden to both players and their teams, as time lost to injury and illness is detrimental to success [2][3][4]. Clearly, player availability (e.g. fewer days lost) is an important factor for team success. ...
Article
In soccer (football), dominant limb kicking produces higher ball velocity and is used with greater frequency than the non-dominant limb. It is unclear whether limb dominance has an effect on injury incidence. The purpose of this systematic review with meta-analysis is to examine the relationship between limb dominance and soccer injuries. Studies were identified from four online databases according to PRISMA guidelines to identify studies of soccer players that reported lower extremity injuries by limb dominance. Relevant studies were assessed for inclusion and retained. Data from retained studies underwent meta-analyses to determine relative risk of dominant versus non-dominant limb injuries using random-effects models. Seventy-four studies were included, with 36 of them eligible for meta-analysis. For prospective lower extremity injury studies, soccer players demonstrated a 1.6 times greater risk of injury to the dominant limb (95% CI [1.3–1.8]). Grouped by injury location, hamstring (RR 1.3 [95% CI 1.1–1.4]) and hip/groin (RR 1.9 [95% CI 1.3–2.7]) injuries were more likely to occur to the dominant limb. Greater risk of injury was present in the dominant limb across playing levels (amateurs RR 2.6 [95% CI 2.1–3.2]; youths RR 1.5 [95% CI 1.26–1.67]; professionals RR 1.3 [95% CI 1.14–1.46]). Both males (RR 1.5 [95% CI 1.33–1.68]) and females (RR 1.5 [95% CI 1.14–1.89]) were more likely to sustain injuries to the dominant limb. Future studies investigating soccer injury should adjust for this confounding factor by using consistent methods for assigning limb dominance and tracking use of the dominant versus non-dominant limb.
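The pooled relative risks above come from random-effects meta-analysis. As a rough, hedged illustration of how per-study relative risks and their confidence intervals can be combined in this way, the sketch below implements a DerSimonian–Laird pooling step in Python; the input values are hypothetical placeholders, not data from the review.

```python
import numpy as np

def pooled_rr_dersimonian_laird(rr, ci_low, ci_high):
    """Pool relative risks with a DerSimonian-Laird random-effects model.

    rr, ci_low, ci_high: arrays of per-study relative risks and 95% CI bounds.
    Returns the pooled RR and its 95% CI. Hypothetical helper for illustration,
    not the authors' actual analysis code.
    """
    log_rr = np.log(rr)
    # Back-calculate each study's standard error on the log scale from its 95% CI
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
    w_fixed = 1 / se**2

    # DerSimonian-Laird estimate of between-study variance (tau^2)
    fixed_mean = np.sum(w_fixed * log_rr) / np.sum(w_fixed)
    q = np.sum(w_fixed * (log_rr - fixed_mean) ** 2)
    df = len(rr) - 1
    c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
    tau2 = max(0.0, (q - df) / c)

    # Random-effects weights and pooled estimate
    w_random = 1 / (se**2 + tau2)
    pooled = np.sum(w_random * log_rr) / np.sum(w_random)
    pooled_se = np.sqrt(1 / np.sum(w_random))
    return (np.exp(pooled),
            np.exp(pooled - 1.96 * pooled_se),
            np.exp(pooled + 1.96 * pooled_se))

# Illustrative per-study values only
rr, lo, hi = pooled_rr_dersimonian_laird(
    np.array([1.4, 1.7, 1.5]), np.array([1.1, 1.2, 1.0]), np.array([1.8, 2.4, 2.3])
)
print(round(rr, 2), round(lo, 2), round(hi, 2))
```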
... The previous study, whose results contrast with those of the current study, defined injuries as all injuries that occurred, even if the athlete continued to play, whereas in our current study injuries were defined as occurring when an athlete missed a game due to injury (7). Also, it should be noted that one study investigating Australian track and field athletes found a relationship between injuries and finish places at major events (31). Though based on a different definition, it is a similar concept, supporting the influence injuries can have on a team's success each season. ...
Article
Full-text available
An expert strength and conditioning coach can be an important component of a sports performance and medicine staff that trains athletes to help them become more resilient to injury. Previous research in a variety of sports has shown that teams whose players miss fewer games due to injury have achieved greater success. The purpose of this study was to determine whether a relationship exists between games missed due to injury by offensive and defensive starters on National Football League (NFL) rosters and an NFL team's ability to win during the 2010-19 NFL seasons. A Spearman rank-order correlation analysis, with significance set at p ≤ 0.01, indicated that fewer games missed by starters in the NFL is correlated with multiple variables associated with winning, such as games won per season and playoff appearances. These results were obtained after analyzing all 32 NFL teams from the 2010-19 seasons. Descriptive statistics were also used to further analyze the data set and showed that teams ranked in the top five in terms of fewest injuries outperformed the remainder of the teams in the NFL according to multiple variables associated with winning. The data in this study support the notion that NFL organizations whose athletes miss fewer games due to injury may have a better opportunity of achieving success.
... Indeed, the influence of injuries on individual and team performance has already been documented (Eirale et al. 2013; Hägglund et al. 2013; Raysmith and Drew 2016; Drew et al. 2017). Identifying an injury definition that aligns with stakeholders' perceptions has practical and real-world significance (Shrier 2020). Monitoring performance-limiting injury problems through the self-reported impact of an injury on performance provides a means to develop context-driven injury risk-mitigating strategies to reduce performance-limiting injuries. ...
Article
Background Injury perceptions and related risk-mitigating interventions are context-dependent. Despite this, most injury surveillance systems are not context-specific, as they do not integrate end-users' perspectives. Purpose To explore how Maltese national team football players, coaches, and health professionals perceive a football-related injury and how their context influences their perceptions and behaviours towards reporting and managing a football injury. Methods 13 semi-structured interviews with Maltese female and male national team football players (n=7), coaches (n=3), and health professionals (n=3) were conducted. Data were analysed using thematic analysis. Results Three themes were identified: (1) How do I perceive an injury? consisted of various constructs of a sports injury, yet commonly defined based on performance limitations; (2) How do I deal with an injury? encapsulated the process of managing the injury; (3) What influences my perception, reporting and management of an injury? comprised personal and contextual factors that influenced the perception and, consequently, the management of an injury. Conclusion Performance limitations should be used as part of future injury definitions in injury surveillance systems. Human interaction should be involved in all processes of an injury surveillance framework, emphasising its active role in guiding the injury management process.
Thesis
Full-text available
Variation between individuals in response to a stimulus is a well-established phenomenon. This thesis discusses the drivers of this inter-individual response, identifying three major determinants: genetic, environmental, and epigenetic variation between individuals. Focusing on genetic variation, the thesis explores how this information may be useful in elite sport, aiming to answer the question “Is there utility to genetic information in elite sport?” The current literature was critically analysed, with a finding that the majority of exercise genomics research explains what has happened previously, as opposed to assisting practitioners in modifying athlete preparation and enhancing performance. An exploration of the potential ways in which genetic information may be useful in elite sport then follows, including that of inter-individual variation in response to caffeine supplementation, the use of genetic information to assist in reducing hamstring injuries, and whether genetic information may help identify future elite athletes. These themes are then explored via empirical work. In the first study, an internet-based questionnaire assessed the frequency of genetic testing in elite athletes, finding that around 10% had undertaken such a test. The second study determined that a panel of five genetic variants could predict the magnitude of improvements in Yo-Yo test performance following a standardised training programme in youth soccer players. The third study demonstrated the effectiveness of a panel of seven genetic variants in predicting the magnitude of neuromuscular fatigue in youth soccer players. The fourth and final study recruited five current or former elite athletes, including an Olympic Champion, and created the most comprehensive Total Genotype Score in the published literature to date, to determine whether their scores deviated significantly from a control population of over 500 non-athletes. The genetic panels were unable to adequately discriminate the elite performers from non-athletes, suggesting that, at this time, genetic testing holds no utility in the identification of future elite performers. The wider utilisation of genetic information as a public health tool is discussed, and a framework for the implementation of genetic information in sport is also proposed. In summary, this thesis suggests that there is great potential for the use of genetic information to assist practitioners in the athlete management process in elite sport, and demonstrates the efficacy of some commercially available panels, whilst cautioning against the use of such information as a talent identification tool. The major limitation of the current thesis is the low sample sizes of many of the experimental chapters, a common issue in exercise genetics research. Future work should aim to further explore the implementation of genetic information in elite sporting environments.
Article
Full-text available
This article presents a systematic review of the literature on therapeutic treatments for tinnitus in adults that have shown positive results, in both national and international studies. A documentary study was therefore carried out, in which bibliographic sources related to the topic were searched and consulted. To this end, several stages of information collection were undertaken. First, a search was conducted in various databases, considered primary and secondary sources, in which 50 articles related to the topic were located through a defined search strategy. Second, the materials found were organised and described, and finally the findings were discussed. On this basis, it is established that various therapeutic treatments for tinnitus exist; however, there is as yet no definitive treatment capable of eliminating this symptom in 100% of patients. The use of retraining therapies remains a central approach, although these are still being refined to achieve greater progress towards complete reduction of the symptom. The health sciences specialised in this field have not determined standard therapies that can be applied in all cases of patients with tinnitus, given the wide variety of possible aetiologies that have not yet been verified. Consequently, therapeutic treatments reduce the effects of tinnitus in adults to a large extent, but they still require improvement.
Article
Objectives Full-contact football-code team sports offer a unique environment for illness risk. During training and match-play, players are exposed to high-intensity collisions which may result in skin-on-skin abrasions and transfer of bodily fluids. Understanding the incidence of all illnesses and infections and what impact they cause to time-loss from training and competition is important to improve athlete care within these sports. This review aimed to systematically report, quantify and compare the type, incidence, prevalence and count of illnesses across full-contact football-code team sports. Design/methods A systematic search of Cochrane Library, MEDLINE, SPORTDiscus, PsycINFO and CINAHL electronic databases was performed from inception to October 2019; keywords relating to illness, athletes and epidemiology were used. Studies were excluded if they did not quantify illness or infection, involve elite athletes, investigate full-contact football-code sports or were review articles. Results Twenty-eight studies met the eligibility criteria. Five different football-codes were reported: American football (n = 10), Australian rules football (n = 3), rugby league (n = 2), rugby sevens (n = 3) and rugby union (n = 9). One multi-sport study included both American football and rugby union. Full-contact football-code athletes are most commonly affected by respiratory system illnesses. There is a distinct lack of consensus of illness monitoring methodology. Conclusions Full-contact football-code team sport athletes are most commonly affected by respiratory system illnesses. Due to various monitoring methodologies, illness incidence could only be compared between studies that used matching incidence exposure measures. High-quality illness surveillance data collection is an essential component to undertake effective and targeted illness prevention in athletes.
Article
Objectives To describe the injury characteristics of male youth athletes exposed to year-round athletics programmes. Methods Injury surveillance data were prospectively collected by medical staff in a cohort of youth athletics athletes participating in a full-time sports academy from 2014–2015 to 2018–2019. Time-loss injuries (>1 day) were recorded following consensus procedures for athletics. Athletes were clustered into five event groups (sprints, jumps, endurance, throws and non-specialised) and the number of completed training and competition sessions (athletics exposures (AE)) were calculated for each athlete per completed season (one athlete season). Injury characteristics were reported overall and by event groups as injury incidence (injuries per 1000 AE) and injury burden (days lost per 1000 AE). Results One-hundred and seventy-eight boys (14.9±1.8 years old) completed 391 athlete seasons, sustaining 290 injuries. The overall incidence was 4.0 injuries per 1000 AE and the overall burden was 79.1 days lost per 1000 AE. The thigh was the most common injury location (19%). Muscle strains (0.7 injuries per 1000 AE) and bone stress injuries (0.5 injuries per 1000 AE) presented the highest incidence and stress fractures the highest burden (17.6 days lost per 1000 AE). The most burdensome injury types by event group were: bone stress injuries for endurance, hamstring strains for sprints, stress fractures for jumps, lesion of meniscus/cartilage for throws and growth plate injuries for non-specialised athletes. Conclusion Acute muscle strains, stress fractures and bone stress injuries were identified as the main injury concerns in this cohort of young male athletics athletes. The injury characteristics differed between event groups.
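The incidence and burden figures reported here are simple exposure-normalised rates. A minimal sketch of the arithmetic in Python follows, assuming injuries, days lost and athletics exposures (AE) have already been counted for a cohort; the totals used are illustrative, since the abstract does not report total AE directly.

```python
def incidence_per_1000_ae(n_injuries: float, athletics_exposures: float) -> float:
    """Injuries per 1000 athletics exposures (AE = completed training or competition sessions)."""
    return 1000.0 * n_injuries / athletics_exposures


def burden_per_1000_ae(days_lost: float, athletics_exposures: float) -> float:
    """Days lost to injury per 1000 athletics exposures."""
    return 1000.0 * days_lost / athletics_exposures


# Hypothetical cohort totals: 290 injuries over roughly 72,500 AE (assumed) would
# reproduce the reported incidence of about 4.0 injuries per 1000 AE.
print(round(incidence_per_1000_ae(290, 72_500), 1))   # ~4.0 injuries per 1000 AE
print(round(burden_per_1000_ae(5_735, 72_500), 1))    # ~79.1 days lost per 1000 AE (illustrative)
```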
Article
Objective: 1) Describe overall illness, and COVID-19-specific illness, in high school athletes in the 2019-2020 and 2020-2021 academic school years; and 2) describe and assess the risk of musculoskeletal injury following general infection and following COVID-19. Design: Ecological study. Methods: High schools (6 states; 176 high schools) were matched between the 2019-2020 and 2020-2021 academic school years, based on 2020-2021 high school sport participation. Illness and injury data were collected from the high school athletic trainers. Illness was stratified by overall illness, general infection, and COVID-19. Injuries following moderate or severe infections or COVID-19 were recorded. Illness and injury incidence rates per 100 athletes per year with 95% confidence intervals (95% CI) were calculated. Negative binomial models comparing injury following general infections and COVID-19 infections were calculated. Results: 98,487 and 72,521 athletes participated in the 2019-2020 and 2020-2021 years, respectively. The illness incidence rate was lower in the 2019-2020 year [0.30 (95% CI: 0.27-0.34)] than in the 2020-2021 year [1.1 (1.0-1.2)], a difference of 0.8 (95% CI: 0.7, 0.9). The COVID-19 incidence rate was 0.52 (0.47-0.58) in the 2020-2021 year. The incidence rate of injury following general infection was 27.9 injuries (21.4-34.5) per 100 athletes in 2019-2020 and 22.5 injuries (19.3-25.7) per 100 athletes in 2020-2021. There was no difference in injury risk following general infection and COVID-19 [rate ratio: 1.2 (95% CI: 0.7-2.4)]. Conclusions: The incidence rate for all illnesses in high school athletes was slightly greater (by 0.8 per 100 athletes per year) in the 2020-2021 academic year compared with the 2019-2020 year. Most of the increase in incidence was due to infections and COVID-19. Subsequent injury incidence following moderate and severe infections was similar between years and between general infections and COVID-19.
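For readers unfamiliar with the negative binomial approach used here to compare injury rates after COVID-19 versus after general infection, the sketch below shows the general shape of such a model in Python with statsmodels. The data frame is entirely hypothetical, the exposure offset is simplified to athlete counts, and the dispersion parameter is fixed rather than estimated, so this is only an outline of the technique, not a reproduction of the study's analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical school-level counts of injuries occurring after an infection,
# with the number of participating athletes used as the exposure (offset).
df = pd.DataFrame({
    "injuries_after_infection": [12, 8, 15, 5, 9, 11],
    "athletes":                 [550, 430, 610, 300, 480, 520],
    "after_covid":              [1, 0, 1, 0, 0, 1],  # 1 = after COVID-19, 0 = after general infection
})

X = sm.add_constant(df[["after_covid"]])
model = sm.GLM(
    df["injuries_after_infection"],
    X,
    family=sm.families.NegativeBinomial(alpha=1.0),  # dispersion fixed here for simplicity
    exposure=df["athletes"],                         # offset: log(number of athletes)
)
result = model.fit()

# Injury rate ratio, COVID-19 vs general infection
rate_ratio = float(np.exp(result.params["after_covid"]))
print(round(rate_ratio, 2))
```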
Article
Aim To describe the periodic health evaluation (PHE) practices of the top performing National Olympic Committees (NOCs). Methods We sent a survey to NOCs finishing in the top 8 for medal count at the 2016 Rio Olympic Games or 2018 PyeongChang Olympic Games. The survey included four sections: (1) PHE staff composition and roles, (2) beliefs regarding the PHE, (3) a ranking of risk factors for future injury and (4) details on the elements of the PHE. Results All 14 NOCs with top 8 finishes at the 2016 Rio Olympic Games or 2018 PyeongChang Olympic Games completed the survey. NOCs included a median of seven staff specialties in the PHE, with physicians and physiotherapists having the highest level of involvement. There was agreement that PHEs are effective in identifying current health conditions (13/14) and that athletes should receive individualised action plans after their PHE (14/14), but less agreement (6/14) that PHEs can predict future injury. The practices of NOCs were diverse and often specific to the athlete population being tested, but always included the patient’s health history, laboratory studies, cardiovascular screening and assessments of movement capacity. The top three risk factors for future injury were thought to be previous injury, age and training experience. Conclusions Among the top performing NOCs, the PHE is a comprehensive, multidisciplinary process aimed to identify existing conditions and provide baseline health and performance profiles in the event of future injury. Research linking PHEs to injury prevention is needed.
Article
Full-text available
Background: The Oslo Sports Trauma Research Center (OSTRC) questionnaires on overuse injuries and health problems were developed to register overuse injuries and health problems. However, these questionnaires have not been translated into or validated in Thai. Objective: To develop a Thai-language edition of the OSTRC questionnaire on overuse injuries and health problems and to examine the validity and reliability of the adapted scale. Materials and Methods: The development of the questionnaire followed the steps of translation, which included forward translation, translation merging, backward translation, and critique by the researcher, health professionals, speech professionals, athletes, and the translators for semantic and conceptual equivalence. A cohort of 65 Thai athletes was recruited. Cross-sectional surveillance data were used to record overuse injuries and health problems. Throughout the 12-week surveillance period, all participants were asked to complete the questionnaire within three days of receiving it by e-mail. Reliability was assessed in the 57 athletes who completed the 12-week surveillance period. Results: The OSTRC questionnaire on overuse injuries and health problems, Thai version (OSTRC-OT and OSTRC-HT), showed high internal consistency. Cronbach's α of the OSTRC-OT for the ankle, knee, and hip regions was 0.919, 0.973, and 0.976, respectively, and of the OSTRC-HT was 0.959, with excellent test-retest reliability: intraclass correlation coefficients of the OSTRC-OT for the ankle, knee, and hip regions were 0.994, 0.970, and 0.991, respectively, and of the OSTRC-HT 0.970; all p-values <0.001. For known-groups validity, the severity scores of the OSTRC-OT for the ankle, knee, and hip regions and the OSTRC-HT scores were significantly different between injury and non-injury groups. Conclusion: The validity and reliability of both questionnaires, the OSTRC-OT and the OSTRC-HT, were at an excellent level. Moreover, the OSTRC-OT and OSTRC-HT have an excellent ability to separate athletes who have an injury or health problem from those who do not. Keywords: Questionnaire, Sport, Overuse injury, Health problem
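Internal consistency here is summarised with Cronbach's α. For readers unfamiliar with the statistic, a small sketch of the standard formula, α = (k/(k-1)) * (1 - Σ item variances / variance of the total score), applied to made-up OSTRC-style severity scores:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) matrix of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                                # number of items
    item_variances = items.var(axis=0, ddof=1)        # variance of each item across respondents
    total_variance = items.sum(axis=1).var(ddof=1)    # variance of each respondent's total score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Illustrative data only: 5 athletes x 4 OSTRC-style severity items
scores = np.array([
    [0,  0,  0,  0],
    [8,  0,  0,  0],
    [17, 25, 25, 25],
    [25, 25, 50, 51],
    [17, 0,  25, 25],
])
print(round(cronbach_alpha(scores), 3))
```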
Article
Acute respiratory infections (ARinf) are common in athletes, but their effects on exercise and sports performance remain unclear. This systematic review aimed to determine the acute (short-term) and longer-term effects of ARinf, including SARS-CoV-2 infection, on exercise and sports performance outcomes in athletes. Data sources searched included PubMed, Web of Science, and EBSCOhost, from January 1990 to 31 December 2021. Eligibility criteria included original research studies published in English, measuring exercise and/or sports performance outcomes in athletes, physically active individuals or military personnel aged 15-65 years with ARinf. Information regarding the study cohort, diagnostic criteria, illness classification, and quantitative data on the effect on exercise/sports performance were extracted. Database searches identified 1707 studies. After full-text screening, 17 studies were included (n = 7793). Outcomes were acute or longer-term effects on exercise (cardiovascular or pulmonary responses), or sports performance (training modifications, change in standardised point scoring systems, running biomechanics, match performance or ability to start/finish an event). There was substantial methodological heterogeneity between studies. ARinf was associated with acute decrements in sports performance outcomes (4 studies) and pulmonary function (3 studies), but minimal effects on cardiorespiratory endurance (7 studies in mild ARinf). Findings on longer-term detrimental effects of ARinf on sports performance (6 studies) were divided. Training mileage, overall training load, standardised sports performance-dependent points and match play can be affected over time. Despite few studies, there is a trend towards impairment in acute and longer-term exercise and sports outcomes after ARinf in athletes. Future research should consider a uniform approach to explore relationships between ARinf and exercise/sports performance.
Thesis
Background: Elite youth athletes participate in intense and structured training programmes to realise their performance potential, but their development may be interrupted by injuries. To reduce the impact of injuries we first need to know which injuries affect participation the most and what the risk factors are. Growth and maturation represent two potential non-modifiable intrinsic risk factors that are unique to adolescent athletes. The literature published on this topic is, however, considered of low quality and findings in earlier studies are inconsistent. The aim of this thesis was therefore to identify the most common and burdensome injuries in elite male youth athletes participating in football (soccer) and athletics (track and field) and to explore growth and maturation as risk factors. Methods: All studies were based on data from routine monitoring of athletes at Aspire Academy, a national elite sports academy in Doha, Qatar. Participants were males aged 11 to 18 years participating in the football or athletics programmes. The first study (Paper I) was a methodological study where we investigated the effect on injury incidence when a broad medical-attention definition was used and recorders/supervisors were invested in research projects relying on the data. This study was based on injury data for the U16 through U18 squads from 2012/13 through 2016/17 (211 players). Papers II and III were descriptive epidemiological studies in athletics and football, respectively. Time-loss injuries were collected prospectively over five seasons in athletics (2014/15 through 2018/19, 179 athletes) and four seasons in football (2016/17 through 2019/20, 301 players) by physiotherapists. The most common (injury incidence) and burdensome (injury burden) combinations of injury location and type were identified, and injury patterns were examined for event groups (athletics; non-specialised, endurance, sprints, jumps, throws) and age groups (football; U13 through U18). In Papers IV and V, subsamples of athletes (74 in athletics, 103 in football) from the epidemiological studies with complete growth (anthropometric measures, i.e. height, leg length and body mass) and maturity (skeletal age, using the Fels method) assessments were included. Growth rates, maturity status and maturity tempo were then examined as risk factors for specific injury types. Main results: The level of investment in the injury surveillance programme by the injury recorder (team physiotherapist) or supervisor had a large impact on the incidence of non-time-loss injuries and injuries with a minimal day loss (1-3 days), while time-loss injuries overall were unaffected (Paper I). In athletics (Paper II), the main concerns were bone and muscle injuries, with thigh muscle strains/ruptures, lumbar spine stress fractures and lower leg bone stress injuries as the most burdensome location-type combinations. Injury patterns were, however, specific to each event group. In football (Paper III), typical “football injuries” (knee sprains, thigh strains and ankle sprains) were the most burdensome, followed by lumbosacral bone stress injuries and physis injuries to the hip/groin. Older athletes sustained more injuries relative to exposure (hours); muscle injuries were increasingly common and physis injuries less common with age. In Paper IV, younger skeletal age and greater changes in height, leg length and skeletal age over a season were associated with a greater incidence of bone and growth plate injuries in athletics. 
No associations with injury risk were found for changes in body mass, trunk height or body mass index. In football (Paper V), growth rates over shorter periods were not related to injury risk when accounting for age (chronological age or skeletal age) and load (weekly exposure). Older skeletal age was associated with significantly greater overall, sudden onset, muscle and joint sprain injury risk. The associations could, however, not be considered practically relevant due to the uncertain estimates for the odds ratios. Conclusion: Based on our findings, time-loss incidence should be used when multiple medical staff recorders are involved in the data collection. Injury patterns in elite male youth athletes are specific to the sport, event group and age group; tailoring injury reduction programmes may therefore be possible. A large proportion of lost training and competition days were attributed to bone injuries; these should be targeted to a larger degree in risk factor studies and in injury reduction programmes. Skeletal maturity appears to affect the risk of sustaining certain injury types in football and athletics, while growth rates were only related to injury risk in athletics. Practitioners and researchers may need to consider the full growth and maturity process, rather than analysing short isolated periods, to better understand the relationship between growth, maturation and injury risk.
Article
Introduction. Currently, modern technologies of integrated sports training in race walking are being developed, taking into account the whole range of requirements of modern elite sport, including means of recovery and stimulation of working capacity. Research hypothesis: there are opportunities to improve the quality of coaches' practical work with highly qualified athletes during pre-competition training in race walking. The purpose of the study: to clarify the problematic issues of pre-start preparation and to substantiate the possibilities for developing and purposefully using specific means of recovery and stimulation of working capacity in the pre-competition practice of highly qualified race walkers. Material and methods: a survey of 12 coaches (n = 12) specialising in training athletes in race walking was conducted: Merited Coaches of Ukraine – 3, highest category – 3, first category – 6. Respondents were interviewed using questionnaires with a rating scale. Results: the respondents showed a positive attitude towards modern means of recovery and increased performance, injury prevention, and improving technique in race walking; non-parametric methods were used to evaluate the data distribution, and the results are presented in table form. Conclusions: the prospects of using the research results to improve the effectiveness of coaches' cooperation with high-class athletes, in the context of the intensification and high integration of modern athlete preparation, are presented. Key words: race walking, pre-performance training, coaches, questionnaire.
Article
Nutrition plays a key role in training for, and competing in, competitive sport, and is essential for reducing risk of injury and illness, recovering and adapting between bouts of activity, and enhancing performance. Consumption of a Mediterranean diet (MedDiet) has been demonstrated to reduce risk of various non-communicable diseases and increase longevity. Following the key principles of a MedDiet could also represent a useful framework for good nutrition in competitive athletes under most circumstances, with potential benefits for health and performance parameters. In this review, we discuss the potential effects of a MedDiet, or individual foods and compounds readily available in this dietary pattern, on oxidative stress and inflammation, injury and illness risk, vascular and cognitive function, and exercise performance in competitive athletes. We also highlight potential modifications which could be made to the MedDiet (whilst otherwise adhering to the key principles of this dietary pattern) in accordance with contemporary sports nutrition practices, to maximise health and performance effects. In addition, we discuss potential directions for future research.
Article
Full-text available
Objective: To test the efficacy of the Athletics Injury Prevention Programme (AIPP) to reduce the percentage of athletes presenting at least one injury complaint leading to participation restriction (ICPR) over an athletics season. Methods: During the 2017-2018 athletics season, we included in this cluster randomised controlled trial (ClinicalTrials.gov Identifier: NCT03307434) 840 athletes randomly assigned (randomisation unit: athletics clubs) to a control group (regular training) or an intervention group (regular training plus the AIPP twice a week). Using a weekly online questionnaire, athletes reported ICPR, training and competition exposures, and, for the intervention group, compliance with the AIPP. The primary outcome was the percentage of athletes presenting at least one ICPR over the study follow-up. Results: A total of 449 and 391 athletes were included in the intervention and control groups, respectively. Of these, 68 (15.1%) and 100 (25.6%) athletes, respectively, provided 100% of the requested information during the follow-up (39 weeks). A total of 6 (8.8%) performed the AIPP twice a week or more. The proportion of athletes who had at least one ICPR over the follow-up period was similar in the intervention (64.7%) and control groups (65.0%), with an adjusted odds ratio of 0.81 (95% CI 0.36 to 1.85). There were no between-group differences when comparing separately the subgroups corresponding to the different compliance levels. Conclusion: This cluster randomised controlled trial showed no efficacy of the AIPP. However, the overall response proportion and compliance with the AIPP in the intervention group were low. In individual sports especially, efforts should first be made to improve the implementation and adoption of interventions.
Article
Full-text available
Objectives This study aimed to describe the injury epidemiology of domestic and international level male New Zealand cricketers from seasons 2009–2010 to 2014–2015 across all match formats, given the increasing popularity of T20 cricket. Methods Match exposure and injury surveillance data collected prospectively by New Zealand Cricket were analysed using international consensus recommendations for injury surveillance and reporting in cricket. Relationships between playing level, role and injury were statistically analysed. Results A total of 268 elite male New Zealand cricketers from seasons 2009–2010 to 2014–2015 were analysed from the New Zealand Cricket injury surveillance system. Total new match injury incidence rates were 37.0 and 58.0 injuries per 10 000 player hours in domestic and international cricket, respectively. Total new and recurrent match injury incidence in international cricket was approximately 1.7 times higher than in domestic cricket (277.6 vs 162.8 injuries per 1000 player days). Injury prevalence rates were 7.6% and 10.0% in domestic and international cricket. The hamstring (8.2%) in domestic cricket and the groin (13.5%) in international cricket were the most injured body sites. Most match days lost were due to lumbar spine injuries in domestic cricket (417 days) and groin injuries in international cricket (152 days). There were statistically significant differences in injury between domestic and international level cricketers (χ²=4.39, p=0.036), and between playing roles (χ²=42.29, p<0.0001). Conclusions Total injury incidence rates in elite New Zealand cricket increased in 2009–2015 compared with previous data. International-level players and pace bowlers were the most injured individuals.
Article
Full-text available
Introduction: Team sport athletes have increased susceptibility to upper respiratory symptoms (URS) during periods of intensified training and competition. Reactivation of Epstein–Barr Virus (EBV) may be a novel marker for risk of upper respiratory illness (URI) in professional athletes. Aims: To investigate changes in salivary EBV DNA (in addition to the well-established marker, salivary secretory immunoglobulin A), and incidence of URS in professional footballers. Methods: Over a 16-week period (August to November 2016), 15 male players from a professional English football League 1 club provided weekly unstimulated saliva samples (after a rest day) and recorded URS. Saliva samples were analyzed for secretory IgA (ELISA) and EBV DNA (qPCR). Results: Whole squad median (interquartile range) saliva IgA concentration and secretion rate significantly decreased (p < .05) between weeks 8 and 12 (concentration, 107 (76–150) mg/L healthy baseline to 51 (31–80) mg/L at week 12; secretion rate 51 (30–78) µg/min healthy baseline to 22 (18–43) µg/min at week 12). Two players reported URS episodes during week 10, both after IgA secretion rate decreased below 40% of the individual’s healthy baseline. EBV DNA was detected in the weeks before URS but also at other times and in healthy players (overall frequency 40%, range 11–78%) and frequency was similar between the URS and healthy group. Conclusion: These findings confirm salivary IgA as a useful marker of URS risk but EBV DNA was not. Further research capturing a greater number of URS episodes is required, however, to fully determine the utility of this marker.
Article
Full-text available
Objective To describe weekly illness prevalence and illness symptoms by sex in youth floorball players during one season. Design Prospective cohort study. Setting Players who were registered to play community level floorball during the 2017–2018 season (26 weeks) in two provinces in southern Sweden. Participants 471 youth players aged 12–17 years. Mean (SD) age for 329 male players 13.3 (1.0) years and 142 female players 13.7 (1.5) years. Primary and secondary outcome measures Weekly self-reported illness prevalence and illness symptoms according to the 2020 International Olympic Committee’s consensus recommendations. Results 61% of youth floorball players reported at least one illness week during the season, with an average weekly illness prevalence of 12% (95% CI 10.8% to 12.3%). The prevalence was slightly higher among females (13%, 95% CI 11.6% to 14.3%) than males (11%, 95% CI 9.9% to 11.7%), prevalence rate ratio 1.20 (95% CI 1.05 to 1.37, p=0.009). In total, 49% (53% male, 43% female) of illness reports indicated that the player could not participate in floorball (time loss), with a mean (SD) absence of 2.0 (1.7) days per illness week. Fever (30%), sore throat (16%) and cough (14%) were the most common symptoms. Female players more often reported difficulty in breathing/tight airways and fainting, and male players more often reported coughing, feeling tired/feverish and headache. Illness prevalence was highest in the peak winter months (late January/February) reaching 15%–18% during this period. Conclusions Our novel findings of the illness prevalence and symptoms in youth floorball may help direct prevention strategies. Athletes, coaches, parents and support personnel need to be educated about risk mitigation strategies. Trial registration number NCT03309904 .
Article
Full-text available
To determine the health status of athletes before the start of an international athletics championship and to determine whether preparticipation risk factors predicted in-championship injuries. At the beginning of the 2013 International Association of Athletics Federations (IAAF) World Championships, all registered athletes (n=1784) were invited to complete a preparticipation health questionnaire (PHQ) on health status during the month preceding the championships. New injuries that occurred at the championships were prospectively recorded. The PHQ was completed by 698 (39%) athletes; 204 (29.2%) reported an injury complaint during the month before the championships. The most common mode of onset of preparticipation injury complaints was gradual (43.6%). Forty-nine athletes in the study reported at least one injury during the championships. Athletes who reported a preparticipation injury complaint were at twofold increased risk for an in-championship injury (OR=2.09; 95% CI 1.16 to 3.77; p=0.014). Those who reported a preparticipation gradual-onset injury complaint were at an almost fourfold increased risk for an in-championship time-loss injury (OR=3.92; 95% CI 1.69 to 9.08; p=0.001). Importantly, the preparticipation injury complaint severity score was associated with the risk of sustaining an in-championship injury (OR=1.14; 95% CI 1.06 to 1.22; p=0.001). About one-third of the athletes participating in the study reported an injury complaint during the month before the championships, which represented a risk factor for sustaining an injury during the championship. This study emphasises the importance of the PHQ as a screening tool to identify athletes at risk of injuries before international championships.
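The odds ratios reported here arise from two-by-two tables of preparticipation complaint status against in-championship injury. A minimal sketch of that calculation with a Wald confidence interval follows; the cell counts are hypothetical, since the abstract does not give the full table.

```python
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Odds ratio with a Wald 95% CI from a 2x2 table.

    a: exposed (preparticipation complaint) with in-championship injury
    b: exposed without injury
    c: unexposed with injury
    d: unexposed without injury
    """
    odds_ratio = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(odds_ratio) - z * se_log_or)
    upper = math.exp(math.log(odds_ratio) + z * se_log_or)
    return odds_ratio, lower, upper

# Hypothetical cell counts for illustration only
or_, lo, hi = odds_ratio_ci(a=25, b=179, c=24, d=470)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```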
Article
Full-text available
The expectation that training enhances performance is well explored in professional sport. However, the additional challenges of physical and cognitive maturation may require careful consideration when determining workloads to enhance performance in adolescents. The objective of this study was to determine the state of knowledge on the relationship between workloads, physical performance, injury and/or illness in adolescent male football players. A systematic review of workloads, physical performance, injury and illness in male adolescent football players was conducted. Studies for this review were identified through a systematic search of six electronic databases (Academic Search Complete, CINAHL, PsycINFO, PubMed, SPORTDiscus, and Web of Science). For the purpose of this review, load was defined as the cumulative amount of stress placed on an individual from multiple training sessions and games over a period of time, expressed in terms of either the external workloads performed (e.g., resistance lifted, kilometres run) or the internal response (e.g., heart rate, rating of perceived exertion) to that workload. A total of 2,081 studies were initially retrieved from the six databases, of which 892 were duplicates. After screening the titles, abstracts and full texts, we identified 23 articles meeting our criteria around adolescent football players, workloads, physical performance, injury and/or illness. Seventeen articles addressed the relationship between load and physical performance, four articles addressed the relationship between load and injury and two articles addressed both. A wide range of training modalities were employed to improve the physical performance of adolescent football players, with strength training, high-intensity interval training, dribbling and small-sided games training, and a combination of these modalities in addition to normal football training, resulting in improved performances on a wide range of physiological and skill assessments. Furthermore, there was some (limited) evidence that higher workloads may be associated with the development of better physical qualities, with one study demonstrating enhanced submaximal interval shuttle run performance with each additional hour of training or game play. Of the few studies examining negative consequences associated with workloads, increases in training load led to increases in injury rates, while longer training duration was associated with a greater incidence of illness. The combined capacity for adolescent males to grow, train and improve physical performance highlights and underscores an exciting responsiveness to training in the football environment. However, the capacity to train has some established barriers for adolescents experiencing high workloads, which could also result in negative consequences. Additional research on stage-appropriate training for adolescent male footballers is required in order to address the knowledge gaps and enhance safe and efficient training practices.
Article
Full-text available
Movement towards sport safety in Athletics through the introduction of preventive strategies requires consensus on definitions and methods for reporting epidemiological data in the various populations of athletes. To define health-related incidents (injuries and illnesses) that should be recorded in epidemiological studies in Athletics, and the criteria for recording their nature, cause and severity, as well as standards for data collection and analysis procedures. A 1-day meeting of 14 experts from eight countries representing a range of Athletics stakeholders and sport science researchers was facilitated. Definitions of injuries and illnesses, study design and data collection for epidemiological studies in Athletics were discussed during the meeting. Two members of the group produced a draft statement after this meeting and distributed it to the group members for their input. A revision was prepared, and the procedure was repeated to finalise the consensus statement. Definitions of injuries and illnesses and categories for recording their nature, cause and severity were provided. Essential baseline information was listed. Guidelines on the recording of exposure data during competition and training and the calculation of prevalence and incidence were given. Finally, methodological guidance for consistent recording and reporting on injury and illness in athletics was described. This consensus statement provides definitions and methodological guidance for epidemiological studies in Athletics. Consistent use of the definitions and methodological guidance would lead to more reliable and comparable evidence.
Article
Full-text available
To determine if the comparison of acute and chronic workload is associated with increased injury risk in elite cricket fast bowlers. Data were collected from 28 fast bowlers who completed a total of 43 individual seasons over a 6-year period. Workloads were estimated by summarising the total number of balls bowled per week (external workload), and by multiplying the session rating of perceived exertion by the session duration (internal workload). One-week data (acute workload), together with 4-week rolling average data (chronic workload), were calculated for external and internal workloads. The size of the acute workload in relation to the chronic workload provided either a negative or positive training-stress balance. A negative training-stress balance was associated with an increased risk of injury in the week after exposure, for internal workload (relative risk (RR)=2.2 (CI 1.91 to 2.53), p=0.009), and external workload (RR=2.1 (CI 1.81 to 2.44), p=0.01). Fast bowlers with an internal workload training-stress balance of greater than 200% had a RR of injury of 4.5 (CI 3.43 to 5.90, p=0.009) compared with those with a training-stress balance between 50% and 99%. Fast bowlers with an external workload training-stress balance of more than 200% had a RR of injury of 3.3 (CI 1.50 to 7.25, p=0.033) in comparison to fast bowlers with an external workload training-stress balance between 50% and 99%. These findings demonstrate that large increases in acute workload are associated with increased injury risk in elite cricket fast bowlers.
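The acute-to-chronic comparison described here (the 1-week load expressed against the 4-week rolling average) can be sketched as a small Python helper; the weekly loads below are invented, and the interpretation of percentages above 100% as a "negative" training-stress balance is an assumption based on the abstract's wording rather than the authors' exact calculation code.

```python
import pandas as pd

def training_stress_balance(weekly_load: pd.Series) -> pd.Series:
    """Acute (1-week) workload as a percentage of chronic (4-week rolling average) workload.

    weekly_load: one value per week, e.g. session-RPE x session duration summed over the
    week (internal load) or balls bowled per week (external load).
    Values above 100% are assumed to indicate acute load exceeding chronic load
    (a 'negative' training-stress balance in the abstract's terminology).
    """
    chronic = weekly_load.rolling(window=4, min_periods=4).mean()  # 4-week rolling average
    return 100.0 * weekly_load / chronic

# Illustrative internal loads (arbitrary units), with a spike in week 5
loads = pd.Series([1500, 1600, 1400, 1550, 3400, 1500], name="internal_load")
print(training_stress_balance(loads).round(1))
```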
Article
Full-text available
Background: The influence of injuries on team performance in football has been only scarcely investigated. Aim: To study the association between injury rates and team performance in domestic league play and in European cups in male professional football. Methods: 24 football teams from nine European countries were followed prospectively for 11 seasons (2001-2012), including 155 team-seasons. Individual training and match exposure and time-loss injuries were registered. To analyse the effect of injury rates on performance, a Generalised Estimating Equation was used to fit a linear regression on team-level data. Each team's season injury rate and performance were evaluated using its own preceding season data for comparison in the analyses. Results: 7792 injuries were reported during 1 026 104 exposure hours. The total injury incidence was 7.7 injuries/1000 h, injury burden 130 injury days lost/1000 h and player match availability 86%. Lower injury burden (p=0.011) and higher match availability (p=0.031) were associated with higher final league ranking. Similarly, lower injury incidence (p=0.035), lower injury burden (p<0.001) and higher match availability (p<0.001) were associated with increased points per league match. Finally, lower injury burden (p=0.043) and higher match availability (p=0.048) were associated with an increase in the Union of European Football Associations (UEFA) Season Club Coefficient, reflecting success in the UEFA Champions League or Europa League. Conclusions: Injuries had a significant influence on performance in league play and in European cups in male professional football. The findings stress the importance of injury prevention to increase a team's chances of success.
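Injury incidence, injury burden and match availability as used in this kind of team-level analysis are straightforward to compute once exposure hours, days lost and match participation are tallied. A hedged sketch with made-up team-season numbers (not the study's data or code):

```python
def team_season_metrics(n_injuries: int, days_lost: int, exposure_hours: float,
                        match_slots: int, match_slots_available: int) -> dict:
    """Team-season injury metrics (hypothetical helper for illustration).

    incidence:    injuries per 1000 hours of training and match exposure
    burden:       injury days lost per 1000 exposure hours
    availability: percentage of player-match slots in which the player was fit to play
    """
    return {
        "incidence_per_1000h": 1000.0 * n_injuries / exposure_hours,
        "burden_days_per_1000h": 1000.0 * days_lost / exposure_hours,
        "match_availability_pct": 100.0 * match_slots_available / match_slots,
    }

# Invented season for one squad: 50 injuries, 840 days lost, 6500 exposure hours,
# 25 players x 38 matches = 950 player-match slots, 820 of them with the player available.
print(team_season_metrics(50, 840, 6500.0, 950, 820))
```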
Article
Full-text available
Background: The Olympic Movement Medical Code encourages all stakeholders to ensure that sport is practised without danger to the health of the athletes. Systematic surveillance of injuries and illnesses is the foundation for developing preventive measures in sport. Aim: To analyse the injuries and illnesses that occurred during the Games of the XXX Olympiad, held in London in 2012. Methods: We recorded the daily occurrence (or non-occurrence) of injuries and illnesses (1) through the reporting of all National Olympic Committee (NOC) medical teams and (2) in the polyclinic and medical venues by the London Organising Committee of the Olympic and Paralympic Games' (LOCOG) medical staff. Results: In total, 10 568 athletes (4676 women and 5892 men) from 204 NOCs participated in the study. NOC and LOCOG medical staff reported 1361 injuries and 758 illnesses, equalling incidences of 128.8 injuries and 71.7 illnesses per 1000 athletes. Altogether, 11% and 7% of the athletes incurred at least one injury or illness, respectively. The risk of an athlete being injured was the highest in taekwondo, football, BMX, handball, mountain bike, athletics, weightlifting, hockey and badminton, and the lowest in archery, canoe slalom and sprint, track cycling, rowing, shooting and equestrian. 35% of the injuries were expected to prevent the athlete from participating during competition or training. Women suffered 60% more illnesses than men (86.0 vs 53.3 illnesses per 1000 athletes). The rate of illness was the highest in athletics, beach volleyball, football, sailing, synchronised swimming and taekwondo. A total of 310 illnesses (41%) affected the respiratory system and the most common cause of illness was infection (n=347, 46%). Conclusions: At least 11% of the athletes incurred an injury during the Games and 7% an illness. The incidence of injuries and illnesses varied substantially among sports. Future initiatives should include the development of preventive measures tailored for each specific sport and the continued focus among sport bodies to institute and further develop scientific injury and illness surveillance systems.
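The incidences quoted above are simple rates per 1000 registered athletes; the short calculation below reproduces them from the counts reported in the abstract.

```python
# Incidence per 1000 registered athletes, using the counts reported above.
athletes = 10_568
injuries = 1_361
illnesses = 758

injury_incidence = 1000 * injuries / athletes     # ~128.8 injuries per 1000 athletes
illness_incidence = 1000 * illnesses / athletes   # ~71.7 illnesses per 1000 athletes

print(f"{injury_incidence:.1f} injuries and {illness_incidence:.1f} illnesses per 1000 athletes")
```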
Article
Full-text available
Objective Sports injuries are often recurrent and there is wide recognition that a subsequent injury (of either the same or a different type) can be strongly influenced by a previous injury. Correctly categorising subsequent injuries (multiple, recurrent, exacerbation or new) requires substantial clinical expertise, but there is also considerable value in combining this expertise with more objective statistical criteria. This paper presents a new model, the subsequent injury categorisation (SIC) model, for categorising subsequent sports injuries that takes into account the need to include both acute and overuse injuries and ten different dependency structures between injury types. Methods The suitability of the SIC model was demonstrated with date-ordered sports injury data from a large injury database of community Australian football players over one playing season. A subsequent injury was defined to have occurred in the subset of players with two or more reported injuries. Results 282 players sustained 469 subsequent injuries, of which 15.6% were coded to categories representing injuries that were directly related to previous index injuries. This demonstrates that players can sustain a number of injuries over one playing season. Many of these will be unrelated to previous injuries, but subsequent injuries that are related to previous injury occurrences are not uncommon. Conclusion The handling of subsequent sports injuries is a substantial challenge for the sports medicine field, both in terms of injury treatment and in epidemiological research to quantify them. Application of the SIC model allows for multiple different injury types and relationships within players, as well as different index injuries.
Article
Full-text available
Objectives: To examine the relationship between combined training and game loads and injury risk in elite Australian footballers. Design: Prospective cohort study. Methods: Forty-six elite Australian footballers (mean±SD age of 22.2±2.9 y) from one club were involved in a one-season study. Training and game loads (session-RPE multiplied by duration in min) and injuries were recorded each time an athlete was exposed to an exercise load. Rolling weekly sums and week-to-week changes in load were then modelled against injury data using a logistic regression model. Odds ratios (OR) were reported against a reference group of the lowest training load range. Results: Larger 1 weekly (>1750 AU, OR=2.44-3.38), 2 weekly (>4000 AU, OR=4.74) and previous-to-current week changes in load (>1250 AU, OR=2.58) were significantly related (p<0.05) to a larger injury risk throughout the in-season phase. Players with 2-3 and 4-6 years of experience had a significantly lower injury risk compared with players with 7+ years of experience (OR=0.22, OR=0.28) when the previous-to-current week change in load was more than 1000 AU. No significant relationships were found between any of the derived load values and injury risk during the pre-season phase. Conclusions: In-season, as the amount of 1-2 weekly load or the previous-to-current week increment in load increases, so does the risk of injury in elite Australian footballers. To reduce the risk of injury, derived training and game load values of weekly loads and previous week-to-week load changes should be individually monitored in elite Australian footballers.
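The derived load variables described above (rolling weekly sums and week-to-week changes) and their relationship with injury can be sketched as follows. This is a simplified illustration, not the study's analysis: the players, loads and injury flags are invented, and load is modelled as a continuous predictor rather than in the banded reference groups used in the study.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical weekly loads (session-RPE x duration, AU) and same-week injury flags for
# two players. Column names and values are illustrative only; they are not the study data.
weekly = pd.DataFrame({
    "player":  ["A"] * 8 + ["B"] * 8,
    "week":    list(range(1, 9)) * 2,
    "load_au": [1200, 1400, 1500, 2600, 1600, 1300, 2500, 1700,
                1000, 1100, 2200, 1250, 1300, 1350, 1400, 1450],
    "injured": [0, 0, 0, 1, 0, 0, 0, 1,
                0, 0, 0, 1, 0, 0, 0, 0],
})

# Derived load variables per player: 2-weekly rolling sum and previous-to-current week change.
weekly = weekly.sort_values(["player", "week"])
weekly["load_2wk"] = weekly.groupby("player")["load_au"].transform(
    lambda s: s.rolling(window=2).sum())
weekly["load_change"] = weekly.groupby("player")["load_au"].diff()

# Logistic regression of injury on weekly load and week-to-week change (kept continuous here).
data = weekly.dropna()
exog = sm.add_constant(data[["load_au", "load_change"]])
fit = sm.Logit(data["injured"], exog).fit(disp=0)
print(fit.summary())  # exponentiate the coefficients to obtain odds ratios
```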
Article
Full-text available
Little is known of injury patterns in track and field (athletics). Injury prevalence has been proposed as the most appropriate measure of the injury rate in sports where athletes are at risk for overuse problems. To ascertain 1-year retrospective and current prevalence of injury in elite track and field athletes to help plan injury prevention programs for this sport. Descriptive epidemiology study. Two hundred seventy-eight youth (16 years old) and adult athletes from an eligible study population of 321 athletes were included. The 1-year retrospective injury prevalence was 42.8% (95% confidence interval [CI], 36.9%-49.0%); the point prevalence was 35.4% (95% CI, 29.7%-41.4%). The diagnosis group displaying the highest injury prevalence was inflammation and pain in the gradual onset category (1-year prevalence, 20.9%; 95% CI, 16.2%-26.2%; and point prevalence, 23.2%; 95% CI, 18.4%-28.7%). A strong tendency for higher 1-year prevalence of 16.5% (95% CI, 12.2%-21.4%) than point prevalence of 8.5% (95% CI, 5.5%-12.5%) was recorded for sudden onset injuries in the diagnosis group sprain, strain, and rupture. The body region showing the highest injury prevalence was the knee and lower leg with 15.0% (95% CI, 11.0%-19.8%) 1-year prevalence and 13.7% (95% CI, 9.8%-18.3%) point prevalence, followed by the Achilles tendon, ankle, and foot/toe with 11.7% (95% CI, 8.2%-16.1%) 1-year prevalence and 11.4% (95% CI, 7.9%-15.8%) point prevalence. The injury prevalence is high among Swedish elite track and field athletes. Most of the injuries affect the lower extremities and are associated with a gradual onset. Although it is associated with a potential recall bias, the 1-year retrospective prevalence measure captured more sudden onset injuries than the point prevalence measure. Future prospective studies in track and field are needed to identify groups of athletes at increased risk.
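The prevalence estimates above are proportions with 95% confidence intervals. A minimal sketch of that calculation is shown below; the case count is an assumed round number of roughly one third of the cohort, not the study's actual data.

```python
from statsmodels.stats.proportion import proportion_confint

# Point prevalence with a 95% confidence interval (Wilson score interval).
# The case count below is assumed for illustration only.
n_athletes = 278
n_with_injury = 98          # assumed: roughly 35% of the cohort

prevalence = n_with_injury / n_athletes
low, high = proportion_confint(n_with_injury, n_athletes, alpha=0.05, method="wilson")

print(f"point prevalence {prevalence:.1%} (95% CI {low:.1%} to {high:.1%})")
```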
Article
Full-text available
To analyse the frequency and characteristics of sports injuries and illnesses incurred during the World Athletics Championships. Prospective recording of newly occurred injuries and illnesses. Twelfth International Association of Athletics Federations World Championships in Athletics 2009 in Berlin, Germany. National team physicians and physiotherapists and 1979 accredited athletes; Local Organising Committee physicians working in the Medical Centres. Incidence and characteristics of newly incurred injuries and illnesses. 236 injury incidents with 262 injured body parts and 269 different injury types were reported, representing an incidence of 135.4 injuries per 1000 registered athletes. Eighty percent affected the lower extremity. Thigh strain (13.8%) was the main diagnosis. Overuse (44.1%) was the predominant cause. Most injuries were incurred during competition (85.9%). About 43.8% of all injury events were expected to result in time-loss. 135 illnesses were reported, signifying an incidence of 68.2 per 1000 registered athletes. Upper respiratory tract infection was the most common condition (30.4%) and infection was the most frequent cause (32.6%). The incidence of injury and illnesses varied substantially among the events. The risk of injury varied with each discipline. Preventive measures should be specific and focused on minimising the potential for overuse. Attention should be paid to ensure adequate rehabilitation of previous injuries. The addition of the illness part to the injury surveillance system proved to be feasible. As most illnesses were caused by infection of the respiratory tract or were environmentally related, preventive interventions should focus on decreasing the risk of transmission, appropriate event scheduling and heat acclimatisation.
Article
Full-text available
Standardized assessment of sports injuries provides important epidemiological information and also directions for injury prevention. To analyze the frequency, characteristics, and causes of injuries incurred during the Summer Olympic Games 2008. Descriptive epidemiology study. The chief physicians and/or chief medical officers of the national teams were asked to report daily all injuries newly incurred during the Olympic Games on a standardized injury report form. In addition, injuries were reported daily by the physicians at the medical stations at the different Olympic venues and at the polyclinic in the Olympic Village. Physicians and/or therapists of 92 national teams covering 88% of the 10,977 registered athletes took part in the study. In total, 1055 injuries were reported, resulting in an incidence of 96.1 injuries per 1000 registered athletes. Half of the injuries (49.6%) were expected to prevent the athlete from participating in competition or training. The most prevalent diagnoses were ankle sprains and thigh strains. The majority (72.5%) of injuries were incurred in competition. One third of the injuries were caused by contact with another athlete, followed by overuse (22%) and noncontact incidents (20%). Injuries were reported from all sports, but their incidence and characteristics varied substantially. In relation to the number of registered athletes, the risk of incurring an injury was highest in soccer, taekwondo, hockey, handball, weightlifting, and boxing (all ≥15% of the athletes) and lowest for sailing, canoeing/kayaking, rowing, synchronized swimming, diving, fencing, and swimming. The data indicate that the injury surveillance system covered almost all of the participating athletes, and the results highlight areas of high risk for sport injury such as the in-competition period, the ankle and thigh, and specific sports. The identification of these factors should stimulate future research and subsequent policy change to prevent injury in elite athletes.
Article
Full-text available
The aim of this study was to analyze all sports injuries incurred in competitions and/or training during the 2007 World Athletics Championships and to prove the feasibility of the injury surveillance system developed for the 2008 Olympic Games for individual sports. Prospective recording of injuries. 11th IAAF World Championships in Athletics 2007 in Osaka, Japan. All national team physicians and physiotherapists; Local Organising Committee (LOC) physicians working in the Medical Centres at the stadium and warm-up area. Frequency, characteristics, and incidence of injuries. 192 injuries were reported, resulting in an incidence of 97 injuries per 1000 registered athletes. More than half of the injuries (56%) were expected to prevent the athlete from participating in competition or training. Eighty percent affected the lower extremity; the most common diagnosis was thigh strain (16%). In most cases, the injury was caused by overuse (44%). A quarter of the injuries were incurred during training and 137 (71%) in competition. On average, 72.4 injuries per 1000 competing athletes were incurred in competitions. The incidence of injury varied substantially among the disciplines. The risk of a time-loss injury was highest in heptathlon, women's 10,000 m, women's 3000 m steeplechase, decathlon, and men's marathon. The injury surveillance system proved feasible for individual sports. Risk of injury varied among the disciplines, with highest risk in combined disciplines, steeplechase, and long-distance runs. Preventive interventions should mainly focus on overuse injuries and adequate rehabilitation of previous injuries.
Article
Full-text available
To determine the incidence of lower-extremity injury among high school cross-country runners and to identify risk factors for injury, the authors prospectively monitored a cohort of 421 runners competing on 23 cross-country teams in 12 Seattle, Washington, high schools during the 1996 cross-country season. Collected were daily injury and athletic exposure (AE) reports, a baseline questionnaire on prior running and injury experience, anthropometric measurements, and coaches' training logs. The overall incidence rate of injury was 17.0/1,000 AEs. Girls had a significantly higher overall injury rate (19.6/1,000 AEs) than boys did (15.0/1,000 AEs) (incidence rate ratio = 1.3, 95% confidence interval: 1.0, 1.6). Compared with boys, girls had significantly higher rates of injuries resulting in ≥15 days of disability. For the overall sample and for girls, Cox regression revealed that a quadriceps angle of ≥20 degrees and an injury during summer running prior to the season were the most important predictors of injury. For boys, a quadriceps angle of ≥15 degrees and a history of multiple running injuries were most associated with injury. Results suggest that the incidence of lower-extremity injuries is high for cross-country runners, especially girls. Preseason screening to determine risk factors should be examined as a preventive approach for identifying high-risk runners.
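The rates and the incidence rate ratio reported above follow from dividing injury counts by athletic exposures, with a log-normal confidence interval for the ratio. The sketch below uses assumed counts chosen only to approximate the published rates, so the resulting interval will not exactly match the study's.

```python
import math

# Incidence rates per 1000 athletic exposures (AEs) and an incidence rate ratio with a
# log-normal 95% CI. The injury counts and exposure totals below are assumed values chosen
# only to reproduce the published rates approximately; they are not the study data.
girls_injuries, girls_aes = 90, 4_600
boys_injuries, boys_aes = 105, 7_000

girls_rate = 1000 * girls_injuries / girls_aes   # ~19.6 per 1000 AEs
boys_rate = 1000 * boys_injuries / boys_aes      # ~15.0 per 1000 AEs

irr = girls_rate / boys_rate
se_log_irr = math.sqrt(1 / girls_injuries + 1 / boys_injuries)
ci_low = irr * math.exp(-1.96 * se_log_irr)
ci_high = irr * math.exp(1.96 * se_log_irr)

print(f"IRR {irr:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")
```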
Article
Consistency in routines for reporting injury has been a focus of development efforts in sports epidemiology for a long time. To gain an improved understanding of current reporting practices, we applied the Injury Definitions Concept Framework (IDCF) in a review of injury reporting in a subset of the field. Meta-narrative review: an analysis of injury definitions reported in consensus statements for different sports and in studies of injury epidemiology in athletics (track and field) published in PubMed between 1980 and 2013 was performed. Separate narratives for each of the three reporting contexts in the IDCF were constructed from the data. Six consensus statements and 14 studies reporting on athletics injury epidemiology fulfilled the selection criteria. The narratives on the sports performance, clinical examination, and athlete self-report contexts were evenly represented in the eligible studies. The sports performance and athlete self-report narratives covered both professional and community athletes as well as training and competition settings. In the clinical examination narrative, data collection by health service professionals was linked to studies of professional athletes at international championships. From this application of the IDCF in a review of injury reporting in sports epidemiology, we observed a parallel usage of reporting contexts in this field of research. The co-existence of reporting methodologies does not necessarily reflect a problematic situation, provided that firm precautions are taken when comparing studies performed in the different contexts.
Conference Paper
Introduction The Australian Institute of Sport (AIS) has recently invested in an online data management system (AMS) which incorporates injury and illness data, clinical notes, training loads, wellness reporting, as well as many other performance-related variables. This management system has been adopted by 19 National Sporting Organisations (NSOs), with four more planned later in 2014, and four State Institutes of Sport/Academies of Sport (SISSAS) have formally agreed, with implementation at various stages throughout 2014. The adoption of this monitoring system has grown exponentially, with current estimations indicating approximately 10,000 injuries will be recorded over the next 5 years. Clinically, it is understood that any injury will disrupt training and expose the athlete to risk of injury on return to competition due to the relative shift in training volume and modality; thus injury prevention programs and ongoing monitoring systems are imperative for the success of Australian athletes. This presentation outlines the methodology that the AIS is undertaking to monitor and prevent subsequent injuries. Methods Monitoring of injuries and illness was undertaken across the 19 NSOs over a 6-month period. A data dictionary was created with common definitions of injury and illness. Injury and illness were defined as "Any physical or medical complaint that results in an athlete being unable to participate in training or competition, as planned by coaching staff, for greater than 24 hours". Subsequent injuries were defined as injuries sustained following an index injury (SIC model). Index injuries were defined as the first new injury within this period. Results 2318 athletes were monitored with 577 injuries recorded (new, n=414; recurrent, n=163; insidious onset, n=164; overuse, n=208; trauma, n=205). Thirty-seven percent of all injuries recorded were subsequent to an index injury (n=216; median=2 per injured athlete; range=0-6). The overall servicing cost of all injuries equated to 2187 physiotherapy treatments, 1980 "maintenance" treatments (treatment in the absence of current injury) and 1121 soft-tissue therapy treatments (21% directly related to injury management) spread across 60 physiotherapy staff nationally. The financial cost of these treatments was estimated at AU$650,000. Discussion The results of this study show a comparatively high subsequent injury rate compared with previous studies. They highlight the need to monitor and implement injury prevention programs for subsequent injuries. This study presents the methodology the Australian Institute of Sport is undertaking to monitor subsequent injuries in Australian National Sporting Organisations.
Article
Objectives To explore the performance of retrospective health data collected from athletes before Athletics championships for the analysis of risk factors for in-competition injury and illness (I&I). Methods For the 2013 European Athletics Indoor Championships, a self-report questionnaire (PHQ) was developed to record the health status of 127 athletes during the 4 weeks prior to the championship. Physician-based surveillance of in-competition I&I among all 577 athletes registered to compete was pursued during the championships. Results 74 athletes (58.3%) from the sample submitted a complete PHQ. 21 (28%) of these athletes sustained at least one injury and/or illness during the championships. Training more than 12 h/week predisposed athletes to sustaining an in-competition injury, and a recent health problem predisposed them to in-competition illness. Among the 577 registered athletes, 60 injuries (104/1000 registered athletes) were reported. 31% of injuries were caused by the track, and 29% by overuse. 29 illnesses were reported (50/1000 registered athletes); upper respiratory tract infection and gastro-enteritis/diarrhoea were the most reported diagnoses. Conclusions Pre-participation screening using the athletes' self-report PHQ showed promising results with regard to identification of individuals at risk. Indoor injury types could be attributed to extrinsic factors, such as small track size, track inclination, and race tactics.
Article
Objectives To survey injury/illness in the National Basketball Association (NBA) over a 25-year period and examine the relationship of injury/illness to team performance. Design A retrospective correlational design. Method Trends were examined in the reported numbers of players injured/ill during a season and games missed due to injury/illness from seasons ending in 1986 through 2005. This period was compared with the years 2006-2010, when NBA teams were allowed to increase the total number of players on the team from 12 to 15. Results There was a highly significant trend (p <0.0001) of increasing numbers of players injured/ill and games missed from 1986 through 2005. After the team expansion in 2006, these rates fell abruptly by 13% and 39% respectively (both p <0.0001 compared to the previous 5-year period). We also found a significant inverse association between games missed due to injury/illness and percent games won (r = -0.29, p <0.0001). Conclusions Results demonstrate an increased rate of injury in the NBA up until the expansion of team size in 2006. Following the team expansion in 2006, injury/illness rates decreased. The latter finding suggests the importance of maintaining a healthy roster with respect to winning outcomes.
Article
Objective To estimate the incidence, type and severity of musculoskeletal injuries in youth and adult elite athletics athletes and to explore risk factors for sustaining injuries. Design Prospective cohort study conducted during a 52-week period. Setting Male and female youth and adult athletics athletes ranked in the top 10 in Sweden (n=292). Results 199 (68%) athletes reported an injury during the study season. Ninety-six per cent of the reported injuries were non-traumatic (associated with overuse). Most injuries (51%) were severe, causing a period of absence from normal training exceeding 3 weeks. Log-rank tests revealed risk differences with regard to athlete category (p=0.046), recent previous injury (>3 weeks time-loss; p=0.039) and training load rank index (TLRI; p=0.019). Cox proportional hazards regression analyses showed that athletes in the third (HR 1.79; 95% CI 1.54 to 2.78) and fourth TLRI quartiles (HR 1.79; 95% CI 1.16 to 2.74) had almost a twofold increased risk of injury compared with their peers in the first quartile, and interaction effects between athlete category and previous injury; youth male athletes with a previous serious injury had more than a fourfold increased risk of injury (HR=4.39; 95% CI 2.20 to 8.77) compared with youth females with no previous injury. Conclusions The injury incidence among both youth and adult elite athletics athletes is high. A training load index combining hours and intensity, and a history of severe injury during the previous year, were predictors of injury. Further studies on measures to quantify training content and protocols for safe return to athletics are warranted.
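Time-to-injury analyses of this kind are typically fitted with Cox proportional hazards regression. The sketch below shows a minimal setup using the lifelines library; the data frame, variable names, coding and values are assumptions for illustration, not the study's data or model specification.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical athlete data: weeks until first injury (or censoring at season end), an
# event flag, a training load rank index (TLRI) quartile and a previous-injury indicator.
df = pd.DataFrame({
    "weeks_to_injury": [10, 52, 24, 8, 40, 30, 15, 45, 20, 36],
    "injured":         [1,  0,  1,  1, 1,  0,  1,  0,  0,  1],
    "tlri_quartile":   [4,  2,  1,  3, 2,  4,  3,  1,  4,  2],
    "previous_injury": [1,  0,  1,  0, 0,  1,  1,  0,  1,  0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="weeks_to_injury", event_col="injured")
cph.print_summary()  # exp(coef) gives the hazard ratios with 95% CIs
```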
Article
Tukey's jackknife estimate of variance for a statistic $S(X_1, X_2, \cdots, X_n)$ which is a symmetric function of i.i.d. random variables $X_i$, is investigated using an ANOVA-like decomposition of $S$. It is shown that the jackknife variance estimate tends always to be biased upwards, a theorem to this effect being proved for the natural jackknife estimate of $\operatorname{Var} S(X_1, X_2, \cdots, X_{n-1})$ based on $X_1, X_2, \cdots, X_n$.
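A minimal numerical sketch of the jackknife variance estimator discussed above, applied to the sample mean (for which the jackknife estimate equals the familiar s^2/n exactly):

```python
import numpy as np

def jackknife_variance(sample, statistic):
    """Tukey's jackknife estimate of the variance of a symmetric statistic.

    Recomputes the statistic with each observation left out in turn, then
    scales the spread of these leave-one-out values by (n - 1) / n.
    """
    x = np.asarray(sample)
    n = len(x)
    loo = np.array([statistic(np.delete(x, i)) for i in range(n)])
    return (n - 1) / n * np.sum((loo - loo.mean()) ** 2)

# Example: jackknife variance of the sample mean equals the usual s^2 / n.
rng = np.random.default_rng(0)
data = rng.normal(size=50)
print(jackknife_variance(data, np.mean))
print(data.var(ddof=1) / len(data))
```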
Article
Methods of evaluating and comparing the performance of diagnostic tests are of increasing importance as new tests are developed and marketed. When a test is based on an observed variable that lies on a continuous or graded scale, an assessment of the overall value of the test can be made through the use of a receiver operating characteristic (ROC) curve. The curve is constructed by varying the cutpoint used to determine which values of the observed variable will be considered abnormal and then plotting the resulting sensitivities against the corresponding false positive rates. When two or more empirical curves are constructed based on tests performed on the same individuals, statistical analysis on differences between curves must take into account the correlated nature of the data. This paper presents a nonparametric approach to the analysis of areas under correlated ROC curves, by using the theory on generalized U-statistics to generate an estimated covariance matrix.
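The area under an empirical ROC curve is the Mann-Whitney U-statistic of the test values, and the variance approach described above is built from its per-observation "placement" components. The sketch below computes the AUC and a DeLong-type variance for a single curve with made-up scores; comparing two correlated curves additionally requires the covariance between their placement vectors, which is omitted here.

```python
import numpy as np

def auc_and_variance(scores_pos, scores_neg):
    """Empirical ROC area via Mann-Whitney placements, with a DeLong-style variance.

    Each pairwise comparison scores 1 if the positive case outranks the negative,
    0.5 for ties. The variance combines the spread of the per-positive and
    per-negative placement values (the structural components of the U-statistic).
    """
    pos = np.asarray(scores_pos, dtype=float)
    neg = np.asarray(scores_neg, dtype=float)
    # Pairwise comparison matrix: psi[i, j] compares positive i with negative j.
    psi = (pos[:, None] > neg[None, :]).astype(float) + 0.5 * (pos[:, None] == neg[None, :])
    auc = psi.mean()
    v_pos = psi.mean(axis=1)            # placement of each positive among the negatives
    v_neg = psi.mean(axis=0)            # placement of each negative among the positives
    var = v_pos.var(ddof=1) / len(pos) + v_neg.var(ddof=1) / len(neg)
    return auc, var

# Example with made-up diagnostic scores.
auc, var = auc_and_variance([0.9, 0.8, 0.75, 0.6], [0.7, 0.5, 0.4, 0.3, 0.2])
print(f"AUC = {auc:.2f}, SE = {var ** 0.5:.3f}")
```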
Article
Sixty runners belonging to two clubs were followed for 1 year with regard to training and injury. There were 55 injuries in 39 athletes. The injury rate per 1,000 hours of training was 2.5 in long-distance/marathon runners and 5.6 to 5.8 in sprinters and middle-distance runners. There were significant differences in the injury rate in different periods of the 12 month study, the highest rates occurring in spring and summer. In marathon runners there was a significant correlation between the injury rate during any 1 month and the distance covered during the preceding month (r = 0.59). In a retrospective analysis of the cause of injury, a training error alone or in combination with other factors was the most common injury-provoking factor (72%). The injury pattern varied among the three groups of runners: hamstring strain and tendinitis were most common in sprinters, backache and hip problems were most common in middle-distance runners, and foot problems were most common in marathon runners.
Article
The training programmes and competitive performances of 147 track and field athletes, from many different clubs within the UK, were analysed retrospectively in order to study the incidence, severity and types of injuries which they had suffered during the year September 1989-September 1990. This information was then related to the particular event in which they specialized as well as to a number of hypothesised risk factors proposed to make them more prone to injury. Of the athletes, 96 (65.3%) were male and 51 (34.7%) were female; their ages ranged from 14 to 32 years, and their levels of competition ranged from 'competitive spectators' to UK internationals. A marked correlation was noted between their age, level of competition, number of supervised training sessions attended, and their incidence of injuries. However, certain other factors that were studied, such as their sex, the hours they trained and the particular event in which they specialized, appeared to show no obvious relationship with injury.
Article
This study evaluated the incidence, distribution and types of musculoskeletal injuries sustained by 95 track and field athletes in a 12 month period using a retrospective cohort design, and analysed selected training, anthropometric, menstrual and clinical biomechanical risk factors. Overall, 72 athletes sustained 130 injuries giving an athlete incidence rate of 76% and an injury exposure rate of 3.9 per 1000 training hours. The majority of injuries were overuse in nature and approximately one-third of all injuries were recurrent. The risk of injury was not influenced by gender or event group. The most common sites of injury were the leg (28%), thigh (22%) and knee (16%) with the most common diagnoses being stress fractures (21%) and hamstring strains (14%). Injury patterns varied between event groups with middle-distance and distance runners sustaining more overuse injuries, and sprinters, hurdlers, jumpers and multi event athletes more acute injuries (p < 0.05). Increasing age, greater overall flexibility and a greater prevalence of menstrual disturbances were associated with a greater likelihood of injury. The results of this study show that track and field athletes are at high risk for musculoskeletal injury and that it may be possible to identify those who are more likely to sustain an injury.
Article
Injury classification systems are generally used in sports medicine (1) to accurately classify diagnoses for summary studies, permitting easy grouping into parent categories for tabulation, and (2) to create a database from which cases can be extracted for research on specific injuries. Clarity is most important for the first purpose, whereas diagnostic detail is particularly important for the second. An ideal classification system is versatile and appropriate for all sports and all data collection scenarios. The Orchard Sports Injury Classification System (OSICS) was developed in 1992 primarily for the first purpose, in a specific study examining the incidence of injury at the elite level of football in Australia. As usage of the OSICS expanded into different sports, limitations were noted and many revisions have therefore been made. A recent study found that the OSICS-8, whilst superior to the International Classification of Diseases Australian Modification (ICD-10-AM) in both speed of use and 3-coder agreement, still achieved a lower level of agreement than expected. The study also revealed weaknesses in the OSICS-8 that needed to be addressed. A recent major revision resulted in the development of the new 4-character OSICS-10. This revision attempts to improve interuser agreement, partly by including more diagnoses encountered in a sports medicine setting. The OSICS-10 should provide far greater depth in classifications for the benefit of those looking to maintain diagnostic information. It is also structured to easily collapse down into parent classifications for those wanting to preserve basic information only. For researchers wanting information collected under broader injury headings, particularly those not using fully computerized systems, the simplicity of the OSICS-8 system may still suffice.
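The collapsing of detailed codes into parent categories described above can be illustrated with a toy example; the codes, their assumed structure and the truncation rule below are hypothetical stand-ins, not actual OSICS-10 codes.

```python
# Toy illustration of collapsing detailed hierarchical injury codes into parent
# categories by truncating the code string. The codes and the assumed scheme
# (first character = body region, first three characters = diagnosis group)
# are hypothetical placeholders, not actual OSICS-10 entries.
from collections import Counter

injury_codes = ["TMH1", "TMH2", "KJS1", "KJS1", "AXT3", "TMH1"]

by_region = Counter(code[0] for code in injury_codes)
by_diagnosis_group = Counter(code[:3] for code in injury_codes)

print(by_region)            # tabulation at the parent (body region) level
print(by_diagnosis_group)   # tabulation at the diagnosis-group level
```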
Data dictionary for the national injury and illness database.
  • Drew M.K.
  • Wallis M.
  • Hughes D.
In: AIS best practice handbook. 1st ed. Canberra: Australian Sports Commission; 2014. p. 1-9.
Preparticipation injury complaint is a risk factor for injury: a prospective study of the Moscow 2013 IAAF Championships.
  • Alonso J.-M.
  • Jacobsson J.
  • Timpka T.