Article · PDF available

The acute:chronic workload ratio predicts injury: high chronic workload may decrease injury risk in elite rugby league players

Author affiliations:
  • South Sydney Rabbitohs
  • Gabbett Performance Solutions

Abstract

Aim: To investigate whether acute workload (1-week total distance) and chronic workload (4-week average acute workload) predict injury in elite rugby league players. Methods: Data were collected from 53 elite players over two rugby league seasons. The 'acute:chronic workload ratio' was calculated by dividing acute workload by chronic workload; a value greater than 1 represented an acute workload greater than chronic workload. All workload data were classified into discrete ranges by z-scores. Results: Compared with all other ratios, a very-high acute:chronic workload ratio (≥2.11) demonstrated the greatest risk of injury in the current week (16.7% injury risk) and subsequent week (11.8% injury risk). High chronic workload (>16 095 m) combined with a very-high 2-week average acute:chronic workload ratio (≥1.54) was associated with the greatest risk of injury (28.6% injury risk). High chronic workload combined with a moderate workload ratio (1.02–1.18) had a smaller risk of injury than low chronic workload combined with several workload ratios (relative risk range 0.3 to 0.7, ×/÷ 1.4 to 4.4; likelihood range 88–94%, likely). Considering acute and chronic workloads in isolation (ie, not as ratios) did not consistently predict injury risk. Conclusions: Higher workloads can have either positive or negative influences on injury risk in elite rugby league players. Specifically, compared with players who have a low chronic workload, players with a high chronic workload are more resistant to injury with moderate-low through moderate-high (0.85–1.35) acute:chronic workload ratios, and less resistant to injury when subjected to 'spikes' in acute workload, that is, very-high acute:chronic workload ratios ∼1.5.
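The workload definitions in the abstract translate directly into code. The following is a minimal sketch under the stated definitions (1-week acute load as total distance, chronic load as the 4-week average of acute loads); the function name and the example distances are illustrative, not taken from the study:

```python
# Sketch of the acute:chronic workload ratio (ACWR) described in the
# abstract: acute load is one week's total distance, chronic load is
# the average of the most recent four weekly (acute) loads, and the
# ratio is acute divided by chronic. Names and values are illustrative.

def acwr(weekly_distances_m):
    """Return the ACWR for the most recent week.

    weekly_distances_m: list of weekly total distances (metres),
    oldest first, covering at least the last 4 weeks.
    """
    if len(weekly_distances_m) < 4:
        raise ValueError("need at least 4 weeks of data")
    acute = weekly_distances_m[-1]                # most recent week
    chronic = sum(weekly_distances_m[-4:]) / 4.0  # 4-week rolling average
    return acute / chronic

# A ratio > 1 means the player did more in the last week than their
# 4-week average -- the 'spike' the paper associates with injury risk.
print(round(acwr([16000, 16500, 15800, 24000]), 2))  # 1.33
```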
... Acute workload is typically measured daily over a seven-day period and represents fatigue in an athlete. Chronic workload is generally the average of 28 days of workload and represents the fitness of an athlete [1,2]. The optimal ACWR is a range between 0.8 and 1.5; an ACWR above 1.5 increases the risk of injury, and a value below 0.8 may result in loss of fitness [1]. ...
... Currently, most ACWR studies have been derived from professional, adult male athletes [1][2][3][4][5][6][7]. While ACWR has proven helpful in male sports, the research on female adolescent athletes is limited [5]. ...
... The calculation of the acute workload was the average workload during the most recent 7-day period, and ACWR values were calculated for the end of each week. The chronic workload using the RA method averaged the preceding 28-day workloads [1,2,4,7]. The chronic workload using the EWMA method utilized time decay to account for the most recent workload when calculating the chronic workload [4,22]. ...
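The two chronic-load calculations contrasted in this excerpt, rolling average (RA) and exponentially weighted moving average (EWMA), can be sketched as follows. The span-based decay constant lambda = 2/(N+1) is a common EWMA convention assumed here, not a formula taken from the cited studies:

```python
# Sketch of the RA and EWMA chronic-load methods: RA is a plain 28-day
# mean, while EWMA applies time decay so recent days count more.
# lambda = 2/(N+1) is an assumed (common) EWMA convention.

def rolling_average(daily_loads, n):
    """Mean of the most recent n daily loads."""
    window = daily_loads[-n:]
    return sum(window) / len(window)

def ewma(daily_loads, n):
    """Exponentially weighted moving average with a span of n days."""
    lam = 2.0 / (n + 1)
    value = daily_loads[0]
    for load in daily_loads[1:]:
        value = load * lam + (1 - lam) * value
    return value

def acwr_ra(daily_loads):
    """ACWR with 7-day acute and 28-day rolling-average chronic load."""
    return rolling_average(daily_loads, 7) / rolling_average(daily_loads, 28)

def acwr_ewma(daily_loads):
    """ACWR with 7-day and 28-day EWMA loads."""
    return ewma(daily_loads, 7) / ewma(daily_loads, 28)
```

With a perfectly constant load both methods give a ratio of 1; after a one-day spike the EWMA ratio reacts more strongly than the RA ratio, which is the practical difference between the two methods.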
Article
Full-text available
Monitoring training load using acute:chronic workload ratio (ACWR) enables coaches to maximize fitness potential while mitigating injury risks by maintaining an optimal ACWR range. There are two methods of determining ACWR: rolling average (RA) and exponentially weighted moving average (EWMA). This study aimed to (1) compare weekly changes in kinetic energy (KE) output in female youth athletes (n = 24) during the high school (HSVB) and club volleyball (CVB) seasons and (2) evaluate the agreement in RA and EWMA ACWR calculations during the HSVB and CVB seasons. Weekly load was measured using a wearable device, and RA and EWMA ACWRs were calculated using KE. The HSVB data showed spikes in ACWR at the onset of the season and during one week mid-season (p = 0.001-0.015), but most weeks were in the optimal ACWR range. The CVB data had greater weekly variations throughout the season (p < 0.05), and many weeks were outside of the optimal ACWR range. There were moderate correlations between the two ACWR methods (HSVB: r = 0.756, p < 0.001; CVB: r = 0.646, p < 0.001). Both methods can be used as a monitoring tool for consistent training like that in HSVB, but more research is needed to investigate appropriate methods for an inconsistent season like that of CVB.
... Moreover, when devices such as global positioning systems are not available, one way to control the training load could be the rating of perceived exertion (RPE), which can also be multiplied by session duration to generate the session-RPE (s-RPE) [6,7]. These measures allow for calculating other ratios, such as the acute: chronic workload ratio (ACWR) [8][9][10]. The mentioned ratio uses the accumulated load during one week (acute load) and the load of the past four weeks (chronic load) to understand their relationship [8]. ...
... These measures allow for calculating other ratios, such as the acute: chronic workload ratio (ACWR) [8][9][10]. The mentioned ratio uses the accumulated load during one week (acute load) and the load of the past four weeks (chronic load) to understand their relationship [8]. It also allows obtaining more knowledge about the players' status for a better training design. ...
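The session-RPE (s-RPE) measure described above is simply the perceived-exertion rating multiplied by session duration in minutes; a minimal sketch, with hypothetical session values:

```python
# Sketch of session-RPE (s-RPE) internal load: the athlete's rating of
# perceived exertion (CR-10 scale) multiplied by session duration in
# minutes. The example sessions below are hypothetical.

def session_rpe(rpe, duration_min):
    """Internal load in arbitrary units (AU): RPE x minutes."""
    if not 0 <= rpe <= 10:
        raise ValueError("CR-10 RPE must be between 0 and 10")
    return rpe * duration_min

# Weekly (acute) load is the sum of the week's sessions; the same
# ACWR logic can then be applied to these arbitrary units.
week = [session_rpe(7, 60), session_rpe(5, 45), session_rpe(8, 90)]
print(sum(week))  # 420 + 225 + 720 = 1365 AU
```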
Article
Full-text available
Background: The aim of this study was two-fold: (i) to determine the correlation between 2D:4D, maximal oxygen uptake (VO2max), body fat percentage (BF%), maximum heart rate (HRmax), change of direction (COD), and accumulated acute and chronic workload variables; (ii) to verify whether the length of the second digit divided by the fourth digit (2D:4D) can explain fitness variables and accumulated training load. Methods: Twenty elite young football players (age: 13.26 ± 0.19 years; height: 165.8 ± 11.67 cm; body mass: 50.70 ± 7.56 kg; VO2max: 48.22 ± 2.29 ml·kg−1·min−1) participated in the present study. Anthropometric and body composition variables (e.g., height, body mass, sitting height, age, BF%, body mass index, right and left finger 2D:4D ratios) were measured. The following fitness tests were also conducted: 30-15 Intermittent Fitness Test (VO2max and HRmax), COD (5-0-5 agility test), and speed (10–30 m sprint test). HRmax and the training load were also measured and monitored using the rating of perceived exertion during the 26 weeks. Results: There were associations between HRmax and VO2max, and between 2D and 4D lengths and the left- and right-hand ratios. Accumulated acute workload (AW) was associated with right and left 4D length, and chronic workload (CW) and the ACWR with right 4D length. There were other associations between physical test variables and workload variables. Conclusions: Under-14 soccer players with low right- and left-hand 2D:4D ratios did not perform better in the selected fitness tests to assess VO2max, COD, or sprint ability. However, it cannot be ruled out that the absence of statistically significant results may be related to the small sample size and the maturational heterogeneity of the participants.
... This condition is also highly prevalent in other throwing and overhead sports such as baseball (Amin et al., 2015). In terms of shoulder injury, load and physical attributes such as scapular dyskinesia, reduced strength and glenohumeral ROM are independent risk factors for injury (Amin et al., 2015;Clarsen et al., 2014;Drew & Finch, 2016;Hulin, Gabbett, Lawson, Caputi, & Sampson, 2016). This however, is the first study to investigate a combination of physical attributes and load and their impact on shoulder injuries. ...
... Analysis of the injuries showed that a large increase in exposure/ load to handball (>60 %) increased the shoulder injury risk (HR 1.91; 95% CI 1.0 to 3.70, p=.05). This has been supported in previous literature in multiple other sports such as Australian football and rugby league (Drew & Finch, 2016;Hulin et al., 2016). However, the most clinically relevant finding of that study was those athletes with reduced external rotation strength/scapular dyskinesia who increased their weekly load between 20% and 60%, were between 4.0 (HR 4.0; 95% CI 1.1 to 15.2 p=.04) and 4.8 (HR 4.8; 95% CI 1.4 to 12.8 p=.01) times more likely to sustain an injury when compared to the reference group. ...
... Related works of injury prediction have mainly used machine learning with risk factors. Hulin et al. [16] found that the acute:chronic workload ratio predicts injuries in elite rugby league players. Gabbett [17] modeled relationships between the training load and likelihood of injury to predict injuries in elite collision sport athletes. ...
Article
Full-text available
In sumo wrestling, a traditional sport in Japan, many wrestlers suffer injuries during bouts. In 2019, an average of 5.2 out of 42 wrestlers in the top division of professional sumo wrestling were absent from each grand sumo tournament due to injury. As the number of injury occurrences increases, professional sumo wrestling becomes less interesting for sumo fans, creating a need for systems that prevent future occurrences. Statistical injury prediction is a useful way to communicate the risk of injuries to wrestlers and their coaches. However, the existing statistical methods of injury prediction are not always accurate because they do not consider the long-term effects of injuries. Here, we propose a statistical model of injury occurrences for sumo wrestlers. The proposed model provides the estimated probability of the next potential injury occurrence for a wrestler. In addition, it can support making a risk-based injury prevention scenario for wrestlers. While a previous study modeled injury occurrences by using the Poisson process, we model them by using the Hawkes process to consider the long-term effect of injuries. The proposed model can also be applied to injury prediction for athletes of other sports.
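The modeling difference the abstract describes can be made concrete: a Poisson process has a constant event intensity, while a Hawkes process lets each past injury add a decaying excitation term. This is an illustrative sketch, not the authors' implementation, and mu, alpha, and beta are hypothetical parameters:

```python
import math

# Illustrative Hawkes-process conditional intensity:
#   lambda(t) = mu + sum over t_i < t of alpha * exp(-beta * (t - t_i))
# A plain Poisson process would be lambda(t) = mu (history ignored);
# here each past injury at t_i raises the intensity, and the effect
# fades exponentially. mu, alpha, beta are hypothetical parameters.

def hawkes_intensity(t, past_injury_times, mu=0.1, alpha=0.5, beta=0.2):
    """Conditional intensity of the next injury at time t (events per unit time)."""
    excitation = sum(
        alpha * math.exp(-beta * (t - ti))
        for ti in past_injury_times
        if ti < t
    )
    return mu + excitation

# With no injury history the intensity is just the baseline mu; a
# recent injury raises it more than a distant one.
```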
... Workload measurement and training-load monitoring have been a part of standard conditioning methods in football teams. According to some studies [22,23], they may reduce the injury ratio. ...
Chapter
Full-text available
Football players are prone to sports injuries such as ankle sprain, groin pain, ACL injury, and so on. Muscle strain injury also frequently occurs in football games or practice. As previous studies show, previously injured players have altered muscle and neural functions as well as tissue properties associated with muscle strain injury. They have altered vibration sense, tissue stiffness, and increases in micro-muscle damage. However, training load or conditioning programs are provided the same as those for uninjured players in most cases. In this chapter, the conditioning strategies for players who have previous muscle injuries will be suggested according to the phenomenon after muscle strain injury.
... To cope with these match demands and to provide an adequate training stimulus which optimizes performance and minimizes the risk of injuries [6,7], the assessment of TL and ML becomes crucial. In these terms, the assessment of TL and ML through ratings of perceived effort (RPE) has proved to be a valid measure of training load due to its relationships with internal and external load Internal workload in elite female football players during the whole in-season: starters vs non-starters large weekly changes in TL leading to a greater increase in injury risk. ...
Article
The aim of this study was to quantify weekly internal workload across the in-season and compare the workload variables between starter and non-starter Spanish female first league (Liga Iberdrola) football players. Twenty-six participants belonging to the same team (age, height, and mass: 25.4 ± 6.1 years, 167.4 ± 4.8 cm and 57.96 ± 6.28 kg, respectively) participated in this study. Training loads (TL) and match loads (ML) were assessed through breath-cardiovascular (RPEbreath), leg-musculature (RPEleg) and cognitive (RPEcog) rating of perceived exertion (RPE0–10) for each training session and match during the in-season phase (35 weeks). Session-RPE (sRPE) was calculated by multiplying each RPE value by session duration (minutes). Total weekly TL (weekly TL+ML), weekly TL, weekly ML, chronic workload, acute:chronic workload ratio, training monotony, and training strain were calculated. Linear mixed models were used to assess differences for each dependent variable, with playing time (starter vs non-starter players) used as a fixed factor, and athlete, week, and team as random factors. The results showed that total weekly TL (d = 1.23–2.04), weekly ML (d = 4.65–5.31), training monotony (d = 0.48–1.66) and training strain (d = 0.24–1.82) for RPEbreath, RPEleg and RPEcog were higher for starters in comparison with non-starters (p = 0.01). Coaches involved in elite female football should consider implementing differential sRPE monitoring strategies to optimize the weekly load distribution for starters and non-starters and to introduce compensatory strategies to equalize players’ total weekly load.
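Training monotony and training strain as used above are commonly computed with Foster's indices; assuming that convention (monotony = mean daily load divided by its standard deviation over the week, strain = weekly total load times monotony), a sketch with hypothetical daily s-RPE loads:

```python
import statistics

# Sketch of Foster's training-monotony and training-strain indices
# (an assumed convention; the study does not print its formulas):
#   monotony = mean(daily loads) / sd(daily loads) over one week
#   strain   = weekly total load * monotony
# Daily loads are s-RPE values in arbitrary units (AU).

def monotony(daily_loads):
    """Within-week load variation index; higher = more monotonous week."""
    sd = statistics.stdev(daily_loads)  # sample SD of the daily loads
    return statistics.mean(daily_loads) / sd

def strain(daily_loads):
    """Weekly strain: total load scaled by how monotonous the week was."""
    return sum(daily_loads) * monotony(daily_loads)

# Note: a week with identical daily loads has zero SD, so monotony is
# undefined; the hypothetical week below includes rest days (0 AU).
week = [400, 0, 600, 300, 0, 500, 200]
```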
... In theory, there is a range of ACWR in which the risk of injury is increased, while research shows that it should not be used in isolation to analyze the causality between load and injury. In [10] authors investigate whether acute workload and chronic workload predict injury in professional rugby league players. Data were gathered from 53 rugby players during two league seasons. ...
Article
Full-text available
The growing intensity and frequency of matches in professional football leagues are related to the increasing physical player load. An incorrect training model results in over- or undertraining, which is related to a raised probability of an injury. This research focuses on predicting non-contact lower body injuries coming from over- or undertraining. The purpose of this analysis was to create decision-making models based on data collected during both training and match, which will enable the preparation of a tool to model the load and report the increased risk of injury for a given player in the upcoming microcycle. For this purpose, three decision-making methods were implemented. Rule-based and fuzzy rule-based methods were prepared based on expert understanding. As a machine learning baseline XGBoost algorithm was considered. Taking into account the dataset used containing parameters related to the external load of the player, it is possible to predict the risk of injury with a certain precision, depending on the method used. The most promising results were achieved by the machine learning method XGBoost algorithm (Precision 92.4%, Recall 96.5%, and F1-score 94.4%).
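The precision, recall, and F1-score figures quoted for the XGBoost baseline are standard confusion-matrix metrics; a minimal sketch, with hypothetical prediction counts rather than the study's data:

```python
# Sketch of the evaluation metrics quoted above, computed from
# confusion-matrix counts for binary injury prediction. The counts in
# the usage line are hypothetical, not taken from the study.

def precision_recall_f1(tp, fp, fn):
    """Return (precision, recall, F1) from true/false positive and false negative counts."""
    precision = tp / (tp + fp)          # of predicted injuries, how many were real
    recall = tp / (tp + fn)             # of real injuries, how many were predicted
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
    return precision, recall, f1

p, r, f1 = precision_recall_f1(tp=90, fp=10, fn=5)
```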
Article
Background: Only a few studies have analyzed the real training programs of sprinters, although doing so is a valuable step toward understanding sprint training. The present study aimed to characterize track cycling sprint training in terms of training load and intensity distribution. Methods: Twenty-nine weeks of pre-championship training data were retrospectively analyzed for 6 world-class athletes. Training load was measured as the ratio of volume completed to maximal volume and categorized by five intensity zones (endurance: zones 1-2; sprinting: zones 3-5) and exercise type (on-bike or resistance). Intra-week (training monotony) and inter-week (acute:chronic workload ratio) variation was also studied. Results: On-bike training represented 77.4±15.3% of total training load; resistance training, 22.6±15.2% (note the high standard deviation). Total weekly training load varied significantly (P=0.0002), with a high acute:chronic workload ratio (12.0±3.2 weeks >1.5 or <0.8) but low intra-week variation (training monotony, 1.81±0.20). Zones 4 and 5 made up 74.4±16.9% of total training load; zone 1, 15.8±11%. Training load was seldom in zone 2 (6.4±5.3%) or zone 3 (3.3±4.2%). From the first to the second half of the period, zone 3-4 training load decreased (39.3±3.3 to 27.4±1.7%; P=0.01), while zone 5 increased (34.9±2.4 to 50±3.7%; P=0.002). Conclusions: In this small group of elite athletes, training consisted mainly of on-bike exercises in the highest intensity zones. As shown by the monotony and acute:chronic workload ratio, overloading and unloading were based on large variations across weeks, not days. In essence, this study describes a polarized intensity distribution concentrated at the highest intensities, which increased as the world championships approached.
Article
Background: Achilles tendinopathy (AT) is a common problem among runners. There is only limited evidence for risk factors for AT, and most studies have not defined the AT subcategories. No study has compared the incidence and risk factors between insertional AT and midportion AT, though they are considered distinct. This study aimed to assess incidence and risk factors of AT based on data from a large prospective cohort. The secondary aim was to explore differences in risk factors between insertional and midportion AT. Methods: Participants were recruited from among registered runners at registration for running events. Questionnaires were completed at baseline, 1 month before the event, 1 week before the event, and 1 month after the event. Information concerning demographics, training load, registered events, and running-related injuries was collected at baseline. The follow-up questionnaires collected information about new injuries. A pain map was used to diagnose midportion and insertional AT. The primary outcome was the incidence of AT. Multivariable logistic regression analysis was applied to identify risk factors for the onset. Results: 3379 participants were included with a mean follow-up of 20.4 weeks. The incidence of AT was 4.2%. The proportion of insertional AT was 27.7% and of midportion AT was 63.8%; the remaining proportion was a combined type of insertional and midportion AT. Men had a significantly higher incidence (5%, 95% confidence interval (95%CI): 4.1-6.0) than women (2.8%, 95%CI: 2.0-3.8). AT in the past 12 months was the most predominant risk factor for new-onset AT (odds ratio (OR) = 6.47, 95%CI: 4.27-9.81). This was similar for both subcategories of AT (insertional: OR = 5.45, 95%CI: 2.51-11.81; midportion: OR = 6.96, 95%CI: 4.24-11.40). Participants registering for an event with a distance of 10/10.55 km were less likely to develop new-onset AT (OR = 0.59, 95%CI: 0.36-0.97) or midportion AT (OR = 0.47, 95%CI: 0.23-0.93). Higher age had a significant negative association with insertional AT (OR = 0.97, 95%CI: 0.94-1.00). Conclusion: The incidence of new-onset AT among recreational runners was 4.2%. The proportion of insertional and midportion AT was 27.7% and 63.8%, respectively. AT in the past 12 months was the predominant risk factor for the onset of AT. Risk factors varied between insertional and midportion AT, but we could not identify clinically relevant differences between the 2 subtypes.
Article
Full-text available
The current study aimed to analyze accelerometer-based measures of acute workload, chronic workload, acute:chronic workload ratio, training monotony and training strain throughout a competitive soccer season, and to compare these variables between players from different playing positions. Twenty-one professional soccer players were monitored during the 48 weeks of the season and grouped according to position: four lateral defenders and four wingers formed the LDW group, four central defenders and four forwards the CDF group, and six midfielders the MDF group. Accelerometer-based variables were collected in training and match contexts and used to generate indicators of weekly acute and chronic workload, training monotony, training strain and metabolic power. A one-way ANOVA compared all dependent variables between groups, and effect sizes for pairwise comparisons were calculated. Results revealed variations in weekly load throughout the season, which demands caution from coaches to avoid injuries. There were no differences between groups in weekly loads for any dependent variable (P > 0.05, small-to-moderate effects). We conclude that the weekly load is not constant during a competitive season and that players from different positions have similar weekly loads. Possible match-related positional differences previously reported in the literature might therefore be compensated by differences in training-related loads, leading to a similar profile when the whole week is considered.
Article
Full-text available
This study investigated whether match intensities during predefined periods differed among successful and less-successful rugby league teams. Four semi-elite rugby league teams were split into 'high-success' and 'low-success' groups based on their success rates. Movement was recorded using a global positioning system (10 Hz) during 20 rugby league matches. Following the peak ball-in-play time period, the high-success group was able to maintain a ball-in-play time that was: (1) 22% greater than the low-success group (P=0.01) and (2) greater than their mean period of match-play (P=0.01). In the peak and mean periods of match play, hit-up forwards from the high-success group covered less total distance (P=0.02; P=0.01) and less high-intensity running distance (P=0.01; P=0.01), and were involved in a greater number of collisions (P=0.03; P=0.01), than hit-up forwards from the low-success group. These results demonstrate that greater amounts of high-intensity running and total distance are not related to competitive success in semi-elite rugby league. Rather, competitive success is associated with the involvement of hit-up forwards in a greater number of collisions and the ability of high-success teams to maintain a higher ball-in-play time following the peak period. Strength and conditioning programs that: (1) emphasize high-intensity running without combining these running demands with collisions, and (2) do not offer exposure to match-specific ball-in-play time demands, may not provide sufficient physiological preparation for teams to be successful in rugby league.
Article
Full-text available
Maximizing the hypertrophic response to resistance training (RT) is thought to be best achieved by proper manipulation of exercise program variables including exercise selection, exercise order, length of rest intervals, intensity of maximal load, and training volume. An often overlooked variable that also may impact muscle growth is repetition duration. Duration amounts to the sum total of the concentric, eccentric, and isometric components of a repetition, and is predicated on the tempo at which the repetition is performed. We conducted a systematic review and meta-analysis to determine whether alterations in repetition duration can amplify the hypertrophic response to RT. Studies were deemed eligible for inclusion if they met the following criteria: (1) were an experimental trial published in an English-language refereed journal; (2) directly compared different training tempos in dynamic exercise using both concentric and eccentric repetitions; (3) measured morphologic changes via biopsy, imaging, and/or densitometry; (4) had a minimum duration of 6 weeks; (5) carried out training to muscle failure, defined as the inability to complete another concentric repetition while maintaining proper form; and (6) used human subjects who did not have a chronic disease or injury. A total of eight studies were identified that investigated repetition duration in accordance with the criteria outlined. Results indicate that hypertrophic outcomes are similar when training with repetition durations ranging from 0.5 to 8 s. From a practical standpoint it would seem that a fairly wide range of repetition durations can be employed if the primary goal is to maximize muscle growth. Findings suggest that training at volitionally very slow durations (>10 s per repetition) is inferior from a hypertrophy standpoint, although a lack of controlled studies on the topic makes it difficult to draw definitive conclusions.
Article
Full-text available
Injuries in collegiate ice hockey can result in significant time lost from play. The identification of modifiable risk factors relating to a player's physical fitness allows the development of focused training and injury prevention programs targeted at reducing these risks. To determine the ability of preseason fitness outcomes to predict in-season on-ice injury in male collegiate ice hockey players. Prognostic cohort study. Level 3. Athlete demographics, percentage body fat, aerobic capacity (300-m shuttle run; 1-, 1.5-, 5-mile run), and strength assessment (sit-ups, push-ups, grip strength, bench press, Olympic cleans, squats) data were collected at the beginning of 8 successive seasons for 1 male collegiate ice hockey team. Hockey-related injury data and player-level practice/game athlete exposure (AE) data were also prospectively collected. Seventy-nine players participated (203 player-years). Injury was defined as any event that resulted in the athlete being unable to participate in 1 or more practices or games following the event. Multivariable logistic regression was performed to determine the ability of the independent variables to predict the occurrence of on-ice injury. There were 132 injuries (mean, 16.5 per year) in 55 athletes. The overall injury rate was 4.4 injuries per 1000 AEs. Forwards suffered 68% of the injuries. Seventy percent of injuries occurred during games with equal distribution between the 3 periods. The mean number of days lost due to injury was 7.8 ± 13.8 (range, 1-127 days). The most common mechanism of injury was contact with another player (54%). The odds of injury in a forward was 1.9 times (95% CI, 1.1-3.4) that of a defenseman and 3 times (95% CI, 1.2-7.7) that of a goalie. The odds of injury if the player's body mass index (BMI) was ≥25 kg/m(2) was 2.1 times (95% CI, 1.1-3.8) that of a player with a BMI <25 kg/m(2). 
The odds ratios for bench press, maximum sit-ups, and Olympic cleans were statistically significant but close to 1.0, and therefore the clinical relevance is unknown. Forwards have higher odds of injury relative to other player positions. BMI was predictive of on-ice injury. Aerobic fitness and maximum strength outcomes were not strongly predictive of on-ice injury.
Article
Full-text available
Objectives: Our objectives were to assess the magnitude of the disparity in lumbar spine bone mineral density (LSBMD) Z-scores generated by different reference databases and to evaluate whether the relationship between LSBMD Z-scores and vertebral fractures (VF) varies by choice of database. Patients and design: Children with leukemia underwent LSBMD by cross-calibrated dual-energy x-ray absorptiometry, with Z-scores generated according to Hologic and Lunar databases. VF were assessed by the Genant method on spine radiographs. Logistic regression was used to assess the association between fractures and LSBMD Z-scores. Net reclassification improvement and area under the receiver operating characteristic curve were calculated to assess the predictive accuracy of LSBMD Z-scores for VF. Results: For the 186 children from 0 to 18 years of age, 6 different age ranges were studied. The Z-scores generated for the 0 to 18 group were highly correlated (r ≥ 0.90), but the proportion of children with LSBMD Z-scores ≤-2.0 among those with VF varied substantially (from 38-66%). Odds ratios (OR) for the association between LSBMD Z-score and VF were similar regardless of database (OR = 1.92, 95% confidence interval 1.44, 2.56 to OR = 2.70, 95% confidence interval 1.70, 4.28). Area under the receiver operating characteristic curve and net reclassification improvement ranged from 0.71 to 0.75 and -0.15 to 0.07, respectively. Conclusions: Although the use of a LSBMD Z-score threshold as part of the definition of osteoporosis in a child with VF does not appear valid, the study of relationships between BMD and VF is valid regardless of the BMD database that is used.
Article
Full-text available
Purpose: To quantify activity profiles in approximately 5-min periods to determine if the intensity of rugby league match play changes after the most intense period of play and to determine if the intensity of activity during predefined periods of match play differ between successful and less-successful teams playing at an elite standard. Methods: Movement was recorded using a MinimaxX global positioning system (GPS) unit sampling at 10 Hz during 25 rugby league matches, equating to 200 GPS files. Data for each half of match play were separated into 8 equal periods. These periods represented the most intense phase of match play (peak period), the period after the most intense phase of match play (subsequent period), and the average demands of all other periods in a match (mean period). Two rugby league teams were split into a high-success and a low-success group based on their success rates throughout their season. Results: Compared with their less-successful counterparts, adjustables and hit-up forwards from the high-success team covered less total distance (P < .01) and less high-intensity-running distance (P < .01) and were involved in a greater number of collisions (P < .01) during the mean period of match play. Conclusions: Although a greater number of collisions during match play is linked with a greater rate of success, greater amounts of high-intensity running and total distance are not related to competitive success in elite rugby league. These results suggest that technical and tactical differences, rather than activity profiles, may be the distinguishing factor between successful and less-successful rugby league teams.
Article
Rugby league is an international collision sport played by junior, amateur, semiprofessional and professional players. The game requires participants to be involved in physically demanding activities such as running, tackling, passing and sprinting, and musculoskeletal injuries are common. A review of injuries in junior and senior rugby league players published in Sports Medicine in 2004 reported that injuries to the head and neck and muscular injuries were common in senior rugby league players, while fractures and injuries to the knee were common in junior players. This current review updates the descriptive data on rugby league epidemiology and adds information for semiprofessional, amateur and junior levels of participation in both match and training environments using studies identified through searches of PubMed, CINAHL, Ovid, MEDLINE, SCOPUS and SportDiscus databases. This review also discusses the issues surrounding the definitions of injury exposure, injury rate, injury severity and classification of injury site and type for rugby league injuries. Studies on the incidence of injuries in rugby league have suffered from inconsistencies in the injury definitions utilized. Some studies on rugby league injuries have utilized a criterion of a missed match as an injury definition, total injury incidences or a combination of both time-loss and non-time-loss injuries, while other studies have incorporated a medical treatment injury definition. Efforts to establish a standard definition for rugby league injuries have been difficult, especially as some researchers were not in favour of a definition that was all-encompassing and enabled non-time-loss injuries to be recorded. A definition of rugby league injury has been suggested based on agreement by 4 groups of international researchers. The majority of injuries occur in the match environment, with rates typically increasing as the playing level increases.
However, professional level injury rates were reportedly less than semiprofessional participation. Only a few studies have reported training injuries in rugby league, where injury rates were reported to be less than match injuries. Approximately 16-30% of all rugby league injuries have been reported as severe, which places demands upon other team members and, if the player returns to playing too early, places them at an increased risk of further injuries. Early research in rugby league identified that ligament and joint injuries were the common injuries, occurring primarily to the knee. More recently, studies have shown a change in anatomical injury sites at all levels of participation. Although the lower limb was the frequent injury region reported previously, the shoulder has now been reported to be the most common injury site. Changes in injury site and type could be used to prompt further research and development of injury reduction programmes to readdress the issue of injuries that occur as a result of participation in rugby league activities. Further research is warranted at all participation levels of rugby league in both the match and training environments to confirm the strongest risk factors for injury.
Article
We compared the accuracy of 2 GPS systems with different sampling rates for the determination of distances covered at high speed and metabolic power derived from a combination of running speed and acceleration. 8 participants performed 56 bouts of shuttle intermittent running wearing 2 portable GPS devices (SPI-Pro, GPS-5 Hz and MinimaxX, GPS-10 Hz). The GPS systems were compared with a radar system as a criterion measure. The variables investigated were: total distance (TD), high-speed distance (HSR>4.17 m·s−1), very high-speed distance (VHSR>5.56 m·s−1), mean power (Pmean), high metabolic power (HMP>20 W·kg−1) and very high metabolic power (VHMP>25 W·kg−1). GPS-5 Hz had low error for TD (2.8%) and Pmean (4.5%), while the errors for the other variables ranged from moderate to high (7.5-23.2%). GPS-10 Hz demonstrated a low error for TD (1.9%), HSR (4.7%), Pmean (2.4%) and HMP (4.5%), whereas the errors for VHSR (10.5%) and VHMP (6.2%) were moderate. In general, GPS accuracy increased with a higher sampling rate, but decreased with increasing speed of movement. Both systems could be used for calculating TD and Pmean, but they cannot be used interchangeably. Only GPS-10 Hz demonstrated a sufficient level of accuracy for quantifying distance covered at higher speeds or time spent at very high power.
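The error percentages quoted above are, in essence, absolute percentage errors of each GPS estimate against the radar criterion; a sketch with illustrative values, not the study's data:

```python
# Sketch of the accuracy metric implied above: each GPS estimate is
# scored against the radar criterion as an absolute percentage error.
# The example values are illustrative only.

def percent_error(gps_value, criterion_value):
    """Absolute error of the GPS estimate as % of the criterion measure."""
    return abs(gps_value - criterion_value) / criterion_value * 100.0

# e.g. a device reporting 1944 m against a 2000 m criterion distance:
print(round(percent_error(1944.0, 2000.0), 1))  # 2.8
```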