The Journal of Strength and Conditioning Research

Published by Lippincott, Williams & Wilkins
Online ISSN: 1533-4287
Print ISSN: 1064-8011
The purpose of this study was to investigate the reliability of two lower body reaction time (RT) tests and to determine differences in RT between genders and between compatible and incompatible conditions. Fifteen male and 15 female (n = 30) sport science students (22.63 ± 2.88 yrs; 175.31 ± 8.72 cm; 67.33 ± 9.71 kg) participated in this study. Subjects were required to complete two lower body RT tests responding to an arrow during compatible (same direction) and incompatible (opposite direction) stimulus-response conditions. The "simple" foot RT test required subjects to step quickly on the appropriate mat, as directed by the stimulus, with response time being measured. The "complex" foot RT test required subjects to leap off a force plate to the appropriate mat in response to the stimulus, with RT, movement time (MT), and total movement time (TMT) being measured. Intraclass correlation coefficient, coefficient of variation, and paired samples t-tests (p ≤ 0.05) were calculated for all variables. High reliability was observed for both tests during compatible and incompatible conditions. Significant differences (p ≤ 0.05) were observed between genders for RT during the "simple" RT test, and between compatible and incompatible conditions for MT and TMT during the "complex" RT test. In conclusion, both tests are reliable measures of lower body RT under both conditions. Because MT and TMT differed significantly during the "complex" RT test, MT may be the discriminating factor between conditions as well as genders. Examining lower body RT during a movement commonly observed in sport may provide coaches more detail about athletes' cognitive and athletic ability, enabling the components of RT to be trained.
 
A traditional progressive resistance exercise program consists of increasing the number of repetitions at a constant load until exceeding an established repetition range. Subsequently, the load is increased by 1.1 kg (2.5 lb) or more, and the lifter works at the new load until again exceeding the repetition range. This investigation examined the use of small incremental loads for 2 upper-body exercises (bench press and triceps press). Subjects were randomly assigned to traditional (TRAD) progressive resistance exercise (5 women, 5 men) and small increment (SI) progressive resistance exercise (5 women, 4 men) groups. Initially, both groups trained for 8 weeks using TRAD progressive resistance exercise. Subjects who achieved 7 repetitions on the final set of an exercise increased the load for the next session by 2.2 kg (bench press) or 1.1 kg (triceps press). Following the initial 8-week training period, the TRAD group continued for another 8 weeks under the same protocol, whereas the SI group trained for an additional 8 weeks, increasing the load by 0.22 kg (0.5 lb) when completing 7 or 8 repetitions and by 0.44 kg (1 lb) when achieving 9 or more repetitions. All groups except TRAD women made significant increases in 1 repetition maximum (1RM) for the bench press. Both TRAD men and SI men significantly increased 1RM triceps press. Groups that did not significantly increase 1RM, in either the bench press or triceps press, demonstrated similar trends. For TRAD men and SI men, the number of repetitions to failure for the bench press at 60% 1RM decreased after training. Both regimens proved effective for increasing strength over 8 weeks. In conclusion, SI progressive resistance exercise appears to be as effective as TRAD progressive resistance exercise for increasing strength during 8 weeks in short-term pretrained college-aged men and women.
However, preliminary data suggest that the TRAD progressive resistance exercise program might be a more effective method of increasing resistance during an extended period.
 
The purpose of this study was to determine the reliability of a computer software angle tool that measures thoracic (T), lumbar (L) and pelvic (P) angles as a means of evaluating spine and pelvic flexibility during the sit-and-reach (SR) test. Thirty adults performed the SR twice on separate days. The SR test was captured on video and later analyzed for T, L and P angles using the computer software angle tool. During the test, three markers were placed over T1, T12 and L5 vertebrae to identify T, L and P angles. Intraclass correlation (ICC) indicated a very high internal consistency (between trials) for T, L and P angles (.95 - .99); thus the average of trials was used for test-retest (between days) reliability. Mean (± SD) values did not differ between days for T (51.0 ± 14.3 vs 52.3 ± 16.2°), L (23.9 ± 7.1 vs 23.0 ± 6.9°) or P (98.4 ± 15.6 vs 98.3 ± 14.7°) angles. Test-retest reliability (ICC) was high for T (.96) and P (.97) angles, and moderate for L angle (.84). Both intrarater and interrater reliabilities were high for T (.95, .94) and P (.97, .97) angles, and moderate for L angle (.87, .82). Thus, the computer software angle tool is a highly objective method for assessing spine and pelvic flexibility during a video-captured SR test.
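For readers wanting to reproduce this type of reliability analysis, a minimal sketch of a two-way consistency intraclass correlation (the ICC(3,1) form, computed from ANOVA mean squares) is shown below. The function name and the example angle values are illustrative, not the study's data or code:

```python
# Sketch of a two-way consistency ICC(3,1), a common choice for
# test-retest reliability of repeated angle measurements.

def icc_3_1(data):
    """data: list of [trial1, trial2, ...] rows, one row per subject."""
    n = len(data)            # subjects
    k = len(data[0])         # trials
    grand = sum(sum(row) for row in data) / (n * k)
    subj_means = [sum(row) / k for row in data]
    trial_means = [sum(row[j] for row in data) / n for j in range(k)]
    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    ss_subj = k * sum((m - grand) ** 2 for m in subj_means)
    ss_trial = n * sum((m - grand) ** 2 for m in trial_means)
    ss_err = ss_total - ss_subj - ss_trial
    ms_subj = ss_subj / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_subj - ms_err) / (ms_subj + (k - 1) * ms_err)

# Invented example: trial 2 is a constant offset of trial 1,
# so consistency between trials is perfect.
angles = [[50.0, 52.0], [23.0, 25.0], [98.0, 100.0], [60.0, 62.0]]
print(round(icc_3_1(angles), 3))  # 1.0
```

Because ICC(3,1) measures consistency rather than absolute agreement, a constant offset between days does not lower the coefficient; an absolute-agreement form (ICC(2,1)) would penalize it.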
 
Schematic illustration of the video reactive agility test.  
Mean ± SD decision accuracy (% correct) at DT1 and DT2 for HS and LS footballers (effect size shown in parentheses).*
Mean ± SD agility, decision, and movement times for feint trials wholly inclusive of correct or incorrect decisions at either DT1 or DT2.*
Decision-making accuracy and the time cost of incorrect responses were compared between higher- (n=14) and lower-standard (n=14) Australian footballers during reactive agility tasks incorporating feint and non-feint scenarios. Accuracy was assessed as whether the subject turned in the correct direction to each stimulus. With skill groups pooled, decision accuracy at the first (or only) stimulus (decision time 1) was 94±7% and decreased to 83±20% at the second stimulus (decision time 2) (p=0.01; d=0.69). With skill groups separated, decision accuracy was similar between groups at decision time 1 (higher 95±6 vs. lower 92±7%; p=0.6; d=0.42) and somewhat better in the higher-standard group at decision time 2 (88±22% vs. 78±17%; p=0.08; d=0.50), but the decrease in accuracy from decision time 1 to 2 was significant in the lower-standard group only (92±7% to 78±17%; p=0.02; d=1.04). With skill groups pooled but agility times examined exclusively in trials involving correct or incorrect decisions, incorrect decisions at decision time 1 during feint trials resulted in a shorter agility time (1.73±0.24 vs. 2.03±0.39 sec; p = 0.008; d = 0.92), whereas agility time was significantly longer in feint trials incorrect at decision time 2 only (2.65±0.41 vs. 1.97±0.36 sec; p < 0.001; d = 1.76) and in non-feint trials (1.64±0.13 vs. 1.51±0.10 sec; p = 0.001; d = 1.13). Therefore, while decision-making errors typically worsen reactive agility performance, successful anticipation of a feint can produce performance improvements. Furthermore, higher-standard footballers are less susceptible to such feints, perhaps due to superior anticipation. Training to improve decision-making accuracy, particularly involving feint movements, may therefore principally benefit lesser-skilled players and should be practiced regularly.
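The d values reported above are pooled-standard-deviation effect sizes (Cohen's d). A minimal sketch of that calculation, with invented group data rather than the footballers' measurements:

```python
# Hedged sketch of Cohen's d with a pooled standard deviation,
# as commonly reported alongside p values in this literature.
import math

def cohens_d(a, b):
    """Effect size for the difference between two independent groups."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    pooled_sd = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled_sd

# Invented example: means differ by 2 units with unit pooled SD
print(round(cohens_d([3.0, 4.0, 5.0], [1.0, 2.0, 3.0]), 2))  # 2.0
```

By the usual conventions, d around 0.2 is small, 0.5 moderate, and 0.8 or more large, which matches the labels attached to the values above.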
 
Morphological characteristics and best performance time of the subjects.
Physiological parameters determined during maximal graded test.*
Olympic flat water kayaking races take place over distances of 500 and 1,000 m. This study was designed to determine the aerobic and anaerobic contributions to 500- and 1,000-m races during flat water paddling in open water, using the accumulated oxygen deficit (AOD) method. Seven internationally ranked athletes, specialized in 500-m races and familiar with 1,000-m races, participated in this study (age: 21.86 ± 1.68 years, body mass: 78.54 ± 3.41 kg, height: 1.84 ± 0.03 m, body fat: 10.14 ± 0.69%). All the participants performed 3 track-kayaking sessions. During the first session, maximal oxygen uptake and maximal aerobic speed were determined using a portable gas analyzer and a global positioning system. During the subsequent testing sessions, paddlers performed, in a randomized counterbalanced order, a 500- and a 1,000-m race under field conditions (open water track kayaking). The 500-m AOD was significantly higher than the 1,000-m AOD (18.16 ± 4.88 vs. 9.34 ± 1.38 ml·kg(-1), p < 0.05). The aerobic contribution was higher during the 1,000 m than during the 500 m (86.61 ± 1.86% vs. 78.30 ± 1.85%, respectively, p < 0.05). The results of this study showed that the 500- and 1,000-m races are 2 physiologically different kayaking events, with a higher aerobic contribution in the 1,000 m. The training prescription for elite athletes should emphasize aerobic high-intensity training for the 1,000-m race and anaerobic short-term training for the 500-m race.
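Once the O2 demand of the race pace has been estimated (typically by extrapolating the submaximal speed-VO2 relationship), the AOD method reduces to simple arithmetic: the deficit is the estimated demand minus the oxygen actually consumed, and the aerobic contribution is the consumed fraction of demand. A minimal sketch with invented numbers and an assumed 1-minute sampling interval, not the study's data:

```python
# Hedged sketch of the accumulated oxygen deficit (AOD) calculation.
# demand_ml_kg_min: estimated O2 demand of the race pace (ml/kg/min).
# measured_vo2: VO2 samples in ml/kg/min, one per minute (assumption).

def accumulated_o2_deficit(demand_ml_kg_min, duration_min, measured_vo2):
    total_demand = demand_ml_kg_min * duration_min   # ml/kg
    accumulated_uptake = sum(measured_vo2)           # ml/kg, 1-min bins
    return total_demand - accumulated_uptake

def aerobic_contribution_pct(demand_ml_kg_min, duration_min, measured_vo2):
    total_demand = demand_ml_kg_min * duration_min
    return 100.0 * sum(measured_vo2) / total_demand

# Invented example: demand of 60 ml/kg/min over a 2-minute effort
print(accumulated_o2_deficit(60.0, 2.0, [40.0, 50.0]))    # 30.0 ml/kg
print(aerobic_contribution_pct(60.0, 2.0, [40.0, 50.0]))  # 75.0 %
```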
 
Experimental design of the potentiated and normal warm-up conditions. RETT = rowing ergometer time trial.
Rowing requires strength, power, and strength-endurance for optimal performance. A rowing-based warm-up could be enhanced by exploiting the postactivation potentiation (PAP) phenomenon, acutely enhancing power output at the beginning of a race where it is needed most. Minimal research has investigated the effects of PAP on events of longer duration (i.e., 1000 m rowing). The purpose of this research was to investigate the effects of PAP on 1000 m rowing ergometer performance through the use of two different warm-up procedures: 1) a rowing warm-up combined with a series of isometric conditioning contractions, known as the potentiated warm-up (PW), and 2) a rowing warm-up only (NW). The isometric conditioning contractions in the PW were performed by "pulling" an immovable handle on the rowing ergometer, and consisted of 5 sets of 5 s (2 s at submaximal intensity and 3 s at maximal intensity), with a 15-s recovery between sets. The 1000 m rowing ergometer time trial was performed after each warm-up condition, with mean power output, mean stroke rate, and split time assessed every 100 m. Ten Australian national level rowers served as the subjects and performed both conditions in a counterbalanced order on separate days. The PW reduced 1000 m time by 0.8% (p > 0.05). The PW improved mean power output by 6.6% (p < 0.01) and mean stroke rate by 5.2% (p < 0.01) over the first 500 m, resulting in a 1.9% reduction in 500 m time (p < 0.01) compared to the NW. It appears that the inclusion of isometric conditioning contractions in the rowing warm-up enhances short-term rowing ergometer performance (especially at the start of a race) to a greater extent than a rowing warm-up alone.
 
Vernillo, G, Agnello, L, Drake, A, Padulo, J, Piacentini, MF, and Torre, AL. An observational study on the perceptive and physiological variables during a 10,000-m race walking competition. J Strength Cond Res 26(10): 2741-2747, 2012-In this study, we observed the variations in physiological and perceptual variables during a self-paced 10,000-m race walking (RW) event with the aim of tracing a preliminary performance profile of the distance. In 14 male athletes, the heart rate (HR) was monitored continuously throughout the event. The rating of perceived exertion (RPE) was collected using Borg's 6-20 RPE scale at each 1,000 m of an outdoor tartan track. Pacing data were retrieved from the official race results and presented as percent change compared with the first split time. The athletes spent 95.4% of the race at 90-100% of HRpeak, whereas the work at other intensities (4.6%) was negligible. During the race, a shift toward higher HR values was observed: %HRpeak increased by 3.6% in the last vs. the first 1,000-m sector (p = 0.002, effect size [ES] = 1.55 ± 0.68, large). The mean RPE reported by the athletes in the last 1,000 m was significantly higher than in the first 5 sectors (p < 0.02, ES = 1.93-2.96, large to very large). The mean percent change increased between the first 6 sectors and the last 1,000-m sector (p < 0.01, ES = 1.02-2.1, moderate to very large). The analysis of walking velocity at each 1,000-m sector suggested the adoption of negative pacing. In conclusion, the RPE may be a valid marker of exercise intensity even in real settings. Matching physiological and perceptual data with work rate is required to understand race-related regulatory processes. Pacing should be considered a conscious behavior decided by the athletes based on internal feedback during the race.
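The time-in-zone figure reported above (95.4% of the race at 90-100% of HRpeak) can be computed from a continuous HR recording in a few lines. This sketch assumes evenly spaced samples and uses invented values, not the athletes' data:

```python
# Hedged sketch: fraction of evenly spaced HR samples falling inside
# a zone defined as a fraction of peak heart rate.

def pct_time_in_zone(hr_samples, hr_peak, low_frac=0.90, high_frac=1.00):
    """Percentage of samples with low_frac*HRpeak <= HR <= high_frac*HRpeak."""
    in_zone = [h for h in hr_samples
               if low_frac * hr_peak <= h <= high_frac * hr_peak]
    return 100.0 * len(in_zone) / len(hr_samples)

# Invented example: 3 of 4 samples lie within 90-100% of a 190 bpm peak
print(pct_time_in_zone([180, 185, 190, 150], 190))  # 75.0
```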
 
Vänttinen, T, Blomqvist, M, Nyman, K, and Häkkinen, K. Changes in body composition, hormonal status, and physical fitness in 11-, 13-, and 15-year-old Finnish regional youth soccer players during a two-year follow-up. J Strength Cond Res 25(12): 3342-3351, 2011-The purpose of this study was to examine the changes in body composition, hormonal status, and physical fitness in 10.8 ± 0.3-year-old (n = 13), 12.7 ± 0.2-year-old (n = 14), and 14.7 ± 0.3-year-old (n = 12) Finnish regional youth soccer players during a 2-year monitoring period and to compare the physical fitness characteristics of soccer players with those of age-matched controls (10.7 ± 0.3 years, n = 13; 14.7 ± 0.3 years, n = 10) not participating in soccer. Body composition was measured in terms of height, weight, muscle mass, percentage of body fat, and lean body weight of trunk, legs, and arms. Hormonal status was monitored by concentrations of serum testosterone and cortisol. Physical fitness was measured in terms of sprinting speed, agility, isometric maximal strength (leg extensors, abdominal, back, grip), explosive strength, and endurance. Age-related development was detected in all measured variables except the percentage of body fat. The results showed that the physical fitness of regional soccer players was better than that of the control groups in all age groups, especially in cardiovascular endurance (p < 0.01-0.001) and in agility (p < 0.01-0.001). In conclusion, playing in a regional level soccer team seems to provide a training adaptation that is beyond normal development and that in all likelihood leads to positive health effects over a prolonged period of time.
 
The aim of the present study was to determine whether continuous-running training and intermittent-running training have comparable or distinct impacts on aerobic fitness in children. First, children were matched according to their chronological age, their biological age (secondary sexual stages), and their physical activity or training status. Then, after randomization, 3 groups were composed. Sixty-three children (mean age 9.6 ± 1.0 years) were divided into an intermittent-running training group (ITG, 11 girls and 11 boys), a continuous-running training group (CTG, 10 girls and 12 boys), and a control group (CG, 10 girls and 9 boys). Over 7 weeks, ITG and CTG participated in 3 running sessions per week. Before and after the training period, they underwent a maximal graded test to determine peak oxygen uptake (peak VO2) and maximal aerobic velocity (MAV). Intermittent training consisted of short intermittent runs with repeated exercise and recovery sequences lasting from 5/15 to 30/30 seconds. In the continuous training sessions, repeated exercise sequences lasted from 6 to 20 minutes. The threshold for statistical significance of training effects was set at p < 0.05. After training, peak VO2 was significantly improved in CTG (+7%, p < 0.001) and ITG (+4.8%, p < 0.001), whereas no difference occurred for the CG (-1.5%). Similarly, MAV increased significantly (p < 0.001) in both CTG (+8.7%) and ITG (+6.4%), with no significant change for CG. Our results demonstrated that both continuous- and intermittent-running sessions induced significant increases in peak VO2 and MAV. Therefore, when adequate combinations of exercise intensity and duration are offered to prepubertal children, many modalities of exercise can successfully be used to increase aerobic fitness. Aerobic running training is often made up of regular, long-distance running at moderate velocity, which can sometimes cause boredom in young children.
During the developmental years, it therefore seems worthwhile to use various training modalities to make this activity more attractive and thus create conditions for progress and enhanced motivation.
 
There are limited data on how coordinative sprint drills and maximal short burst activities affect children's sprint and agility performance. The purpose of the present study was to investigate the effect of short burst activities on sprint and agility performance in 11- to 12-year-old boys. A training group (TG) of 14 boys followed a 6-week, 1-hour·week(-1) training program consisting of different short burst competitive sprinting activities. Eleven boys of similar age served as controls (control group [CG]). Pre- and posttests assessed 10-m sprint, 20-m sprint, and agility performance. Results revealed significant performance improvements in all tests within the TG (p < 0.05), but no significant difference between TG and CG in the 10-m sprint test. Furthermore, the relationships between straight-line sprint and agility performances indicated a significant transfer effect (r = 0.68-0.75, p < 0.001). Findings from the present study indicate that competitive short burst activities executed with maximal effort may produce improvements in sprint and agility performance in 11- to 12-year-old boys.
 
Differences in trunk strength capacity due to gender and sports are well documented in adults. In contrast, data concerning young athletes are sparse. The purpose of this study was to assess the maximum trunk strength of adolescent athletes and to investigate differences between genders and age groups. A total of 520 young athletes were recruited, of whom 377 (n=233/144 m/f; 13±1 yrs; 1.62±0.11 m height; 51±12 kg mass; training: 4.5±2.6 yrs; training sessions/week: 4.3±3.0; various sports) were included in the final data analysis. Five age groups were differentiated (11, 12, 13, 14 and 15 yrs; n=90, 150, 42, 43 and 52, respectively). Maximum strength of trunk flexors (Flex) and extensors (Ext) was assessed in all subjects during isokinetic concentric measurements (60°/sec; 5 repetitions; ROM: 55°). Maximum strength was characterized by absolute peak torque (Flexabs, Extabs; Nm), peak torque normalized to body weight (Flexnorm, Extnorm; Nm/kg BW) and the Flexabs/Extabs ratio (RKquot). Descriptive data analysis (mean±SD) was completed, followed by ANOVA (α=0.05; Tukey-Kramer post hoc test). Mean maximum strength for all athletes was 97±34 Nm in Flexabs and 140±50 Nm in Extabs (Flexnorm 1.9±0.3 Nm/kg BW, Extnorm 2.8±0.6 Nm/kg BW). Males showed statistically significantly higher absolute and normalized values compared to females (p<0.001). Flexabs and Extabs rose almost two-fold with increasing age in both males and females (p<0.001). Flexnorm and Extnorm increased with age for males (p<0.001) but not for females (Flexnorm: p=0.26; Extnorm: p=0.20). RKquot (mean±SD: 0.71±0.16) did not reveal any differences regarding age (p=0.87) or gender (p=0.43). In adolescent athletes, maximum trunk strength must be discussed in a gender- and age-specific context. The flexor/extensor ratio revealed extensor dominance, which seems to be independent of age and gender.
The values assessed may serve as a basis to evaluate and discuss trunk strength in athletes.
 
The relative age effect (RAE) suggests that there is a clustering of birth dates just after the cut-off used for sports selection in age-grouped sports and that in such circumstances relatively older sportspeople may enjoy maturational and physical advantages over their younger peers. Few studies have examined this issue in non-selective groups of children and none have examined whether there is evidence of any RAE in skill performance. The aim of this study was to assess whether there were differences in fundamental movement skill proficiency within children placed in age groups according to the school year. Six fundamental movement skills (FMS: sprint, side gallop, balance, jump, catch, and throw) were assessed in 539 school children (258 boys, 281 girls) aged 6-11 years (mean age ± S.D. = 7.7 ± 1.7 years). We examined differences in these FMS between gender groups and children born in different quarters of the year after controlling for age and body mass index (BMI). For balance, chronological age was significant as a covariate (p = .0001) with increases in age associated with increases in balance. Boys had significantly higher sprint mastery compared to girls (p = .012) and increased BMI was associated with poorer sprint mastery (p = .001). Boys had higher catching mastery than girls (p = .003) and children born in Q1 had significantly greater catching mastery than children born in Q2 (p = .015), Q3 (p = .019) and Q4 (p = .01). Results for throwing mastery also indicated higher mastery in boys compared to girls (p = .013), and that children born in Q1 had higher throwing proficiency than children born in Q4 (p = .038). These results are important if coaches are basing sport selection on measures of skilled performance, particularly in object-control skills. 
Categorizing children's skilled performance based on rounded-down values of whole-year age may disadvantage children born relatively late in the selection year, whereas children born earlier in the selection year will likely show greater skill mastery and a subsequent advantage for selection purposes.
 
Anaerobic exercise is involved in many recreational and competitive sport activities. This study first established regression equations to predict maximal anaerobic power and then cross-validated these prediction equations. Using stepwise multiple regression analysis, prediction equations for relative (watts per kilogram of body mass) and absolute (watts) mean and peak anaerobic power, with the 30-second Wingate Test as the power measure, were determined for 40 boys (age, 11-13 years). Percentage of body fat, fat-free weight, midthigh circumference, and 30-m dash were the independent predictive variables, and the generated regression equations were subsequently cross-validated using 20 different boys (age, 11-13 years). Significant correlations (Pearson r) were found for the cross-validation subjects between the measured and predicted power outputs for relative mean power (r = 0.48, p < 0.05), absolute mean power (r = 0.77, p < 0.01), and absolute peak power (r = 0.76, p < 0.01). Using paired t-tests, no significant mean differences (p > 0.05) were found for the same subjects between actual and predicted power outputs for relative mean power, absolute mean power, and absolute peak power. Prediction of maximal anaerobic power from selected anthropometric measurements and the 30-m dash appears tenable in 11- to 13-year-old boys and can be accomplished in a simple, cost- and time-effective manner.
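The cross-validation step above reduces to correlating measured against predicted power outputs in the hold-out group. The sketch below implements the Pearson r used for that check; the power values are invented placeholders, not outputs of the study's regression equations:

```python
# Hedged sketch of the Pearson correlation used to cross-validate
# predicted against measured anaerobic power outputs.
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented example: measured vs. predicted absolute peak power (W)
measured = [420.0, 510.0, 465.0, 530.0]
predicted = [400.0, 495.0, 470.0, 520.0]
print(round(pearson_r(measured, predicted), 2))  # ≈ 0.98
```

A high r alone does not rule out systematic bias, which is why the study also checked mean differences between actual and predicted values with paired t-tests.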
 
The purpose of this study was to determine whether a relationship existed between upper-body power and class level among female club gymnasts. Sixty female gymnasts between the ages of 10 and 11 and between class levels 5 and 8 participated in the study. The distance of a medicine-ball throw was used to measure upper-body power. Three types of throws (overhead forward throw, overhead backward throw, and chest pass) were performed with a 6-lb rubber medicine ball. The mean distances of 2 trials were calculated and categorized by age group and class level. An analysis of variance design was used to determine the relationship between mean throw distance and throw type, age, and class level. No significant differences were found between mean throw distances and throw type, age, or class level. The results of this study show no relationship between the upper-body power of female gymnasts and throw type, age, or class level.
 
Mean sprint times and individual values for each subject in the unloaded running sprint tests: (A) 20 m; (B) 30 m; (C) 40 m; (D) 10–40 m sprint; (E) 20–30 m split; (F) 20–40 m split. Training groups: Low load (LL), medium load (ML), and high load (HL). Significant differences from pre- to posttraining: * p ≤ 0.05; ** p < 0.01.
Mean jump height and individual values for each subject in: (A) CMJ; (B) JS. Significant difference from pre- to posttraining: * p ≤ 0.05. # Significant differences between groups.
Mean propulsive velocity attained with common absolute loads at the pre- and posttests in the full SQ exercise, and individual values for each subject. Significant difference from pre- to posttraining: * p ≤ 0.05. SQ = full squat.
The optimal resisted load for sprint training has not yet been established, although it has been suggested that a resistance reducing the athlete's velocity by more than 10% from unloaded sprinting entails substantial changes in sprinting mechanics. This investigation evaluated the effects of a 7-week, 14-session resisted sled sprint training program on acceleration with three different loads set as a percentage of body mass (BM): low load (LL: 5% BM, n = 7), medium load (ML: 12.5% BM, n = 6) and high load (HL: 20% BM, n = 6), in young male students. In addition, the effects on untrained exercises, vertical jump (CMJ), loaded vertical jump (JS) and full squat (SQ), were analyzed. The three groups followed the same training program, consisting of maximal-effort sprint accelerations with the respective assigned loads. Significant differences between groups occurred only between LL and ML in CMJ (p<0.05), favoring ML. Paired t-tests demonstrated statistical improvements in 0-40 m sprint times for all three groups (p<0.05), and in 0-20 m (p<0.05) and 0-30 m (p<0.01) sprint times for HL. Sprint times in 10-40 m (p<0.01) and 20-40 m (p<0.05) improved in LL. Time intervals in 20-30 m and 20-40 m (p<0.05) were statistically reduced in ML. As regards the untrained exercises, CMJ and SQ improved for ML and HL (p<0.05), and JS improved for HL. The results show that, depending on the magnitude of the load used, the related effects will be attained in different phases of the 40 m. It would seem that to improve the initial acceleration phase up to 30 m, loads around 20% of BM should be used, whereas to improve high-speed acceleration phases, loads of around 5 to 12.5% of BM should be preferred. Moreover, resisted sprint training with ML and HL would enhance vertical jump and leg strength in moderately trained subjects.
 
Schematic representation of the T120S test setup.  
Body position during shuttles requiring the subject to dive on the ground at line B.
The aim of this study was to design a simple field test to measure the anaerobic endurance fitness of rugby league players, an important fitness quality in the game of rugby league. Twelve amateur football players with a mean (±SD) age of 21.5 (±2.2) years volunteered to participate in the study. The subjects completed 1 trial of the Wingate 60-second (W60) cycle test and 2 trials of the new Triple-120 meter shuttle (T120S) test. All trials were completed 4 days apart. The validity of the T120S was determined by comparing physiological responses (heart rate and blood lactate) and ratings of perceived exertion with those of the all-out W60 cycle test. The results indicate a significant relationship in maximum heart rate (r = 0.63 and 0.71) between the 2 trials of the T120S and the W60 cycle test. There was no significant relationship between the 2 trials and the W60 cycle test for post-3-minute blood lactate (r = 0.112 and 0.101) and rating of perceived exertion (r = 0.94 and 0.161). However, the T120S test elicited greater mean values for these measures than the W60 cycle test. The results indicate that the T120S is a valid test of anaerobic endurance and represents a sport-specific test of this quality that may provide useful information for players and coaches involved in the sport of rugby league.
 
Comparison graphs between forwards and backs for changes over time for pull-ups, MSSRT, and sprint times (10 and 40 m) during the 13-year period of the study. Data are shown as mean ± 95% CI. MSSRT = multistage shuttle run test; CI = confidence interval.
Percentage change for both groups combined over the 13-year period of the study (percentages calculated so that a faster sprint time is expressed as a positive change). The variation around each data point represents the 95% CI. CI = confidence interval. 
This study compared changes in the body size and physical characteristics of South African u20 rugby union players over a 13-year period. A total of 453 South African u20 players (Forwards: n = 256; Backs: n = 197) underwent measurements of body mass, stature, muscular strength, endurance, and 10 m and 40 m sprint times. A two-way analysis of variance was used to determine significant differences for the main effects of position (Forwards vs. Backs) and time (1998-2010). The pooled data showed that Forwards were significantly heavier (22%), taller (5%) and stronger (18%) than the Backs. However, when 1RM strength scores were adjusted for body mass, Backs were stronger per kg of body mass. Stature did not change over the 13-year period for either group. There were, however, significant increases in muscular strength (50%), body mass (20%), and muscular endurance (50%). Furthermore, improvements in sprint times over 40 m (4%) and 10 m (7%) were evident over the period of the study. In conclusion, the players became heavier, stronger, taller and improved their upper body muscular endurance over the 13 years of the study. Furthermore, sprint times over 10 m and 40 m improved over the same time period despite the increase in body mass. It can be speculated that the changes in the physical characteristics of the players over time are a consequence of: 1) adaptations to the changing demands of the game, and 2) advancements in training methods.
 
The purpose of the study was to investigate the influence of a 14-week swimming training program on psychological, hormonal, and performance parameters of elite women swimmers. Ten Olympic and international-level elite women swimmers were evaluated 4 times over the course of the experiment (T1, T2, T3, and T4). On the first day at 8:00 am, before blood was collected at rest for the determination of hormonal parameters, the athletes had their psychological parameters assessed by the profile of mood states (POMS) questionnaire. At 3:00 pm, the swimmers had their anaerobic threshold assessed. On the second day at 3:00 pm, the athletes had their alactic anaerobic performance measured. Vigor score and testosterone levels were lower (p ≤ 0.05) in T4 compared with T3. In addition, the ratio between the peak blood lactate concentration and the median velocity obtained in the alactic anaerobic performance test increased in T4 compared with T3 (p ≤ 0.05). As a practical application, swimming coaches should not use a taper with the present characteristics, to avoid unexpected results.
 
The aims of this study were (a) to identify and compare the speed and agility of 12- and 14-year-old elite male basketball players and (b) to investigate the relations between speed and agility for both age groups, to help coaches improve their work. Sixty-four players aged 12 (M = 11.98 years, SD = 0.311) and 54 players aged 14 (M = 14.092 years, SD = 0.275) were tested. Three agility tests (agility t-test, zigzag agility drill, and 4 × 15-m agility run) and three speed tests (20-m, 30-m, and 50-m runs) were applied. Fourteen-year-old players achieved significantly better results in all speed and agility tests compared with 12-year-old players. The correlation coefficient (r = 0.81, p = 0.001) showed that, for 12-year-old players, the 30- and 50-m runs measure the same ability. The other correlation coefficient (r = 0.59, p = 0.001) indicated that the 20- and 30-m runs reflect inherently different qualities. The correlation coefficients between agility tests were <0.71, and therefore each test in this group represents a specific task. In 14-year-old players, the correlation coefficients between the speed test results were <0.71. In contrast, the correlation coefficients between the agility tests were >0.71, which means that all 3 tests represent the same quality. During the speed training of 12-year-old players, it is advisable to focus on shorter running distances, up to 30 m. During the agility training of the same players, it is useful to apply exercises of various complexities. In the speed training of 14-year-old players, the 30- and 50-m runs should be applied, and agility training should include more specific basketball movements and activities.
 
Differences between line-drill (LD) test and retest measured against the average performance for the 2 test sessions. Differences are expressed as a percentage of the average of the LD test and retest. The bias line and the upper and lower 95% limits of agreement (LOAᵤ and LOAₗ, respectively) are also presented.
This study evaluated the validity and reliability of the line-drill (LD) test of anaerobic performance in 76 male basketball players aged 14.0-16.0 years. The Wingate Anaerobic Test (WAnT) was used as the reference for anaerobic performance. The WAnT and LD test were moderately correlated (r = 0.39 and 0.43, p < 0.01). Estimated age at peak height velocity (APHV) was moderately, negatively, and significantly (p < 0.01) correlated with WAnT peak (r = -0.69) and mean power (r = -0.71); earlier-maturing players had greater anaerobic power. Training experience was not associated with anaerobic performance, but chronological age (CA) and estimated APHV were significant covariates of the LD test (p < 0.05). National players were better than local players on the LD test (p < 0.01) after controlling for CA and body size. Short-term reliability of the LD test (n = 12, 1-week interval) was good: technical error of measurement = 0.44 seconds (95% confidence interval [CI] 0.31-0.75 seconds), intraclass correlation coefficient = 0.91 (95% CI 0.68-0.97), and coefficient of variation = 1.4% (95% CI 1.0-2.3%). Although the relationship between the LD test and WAnT was moderate, the LD test effectively distinguished local- and national-level adolescent basketball players. In contrast to the WAnT, the LD test was not influenced by estimated biological maturity status. Thus, the LD test may be suitable for field assessment of anaerobic performance in youth basketball players.
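The short-term reliability statistics reported above can be reproduced from test–retest pairs. A minimal sketch of the technical error of measurement (TEM) and the coefficient of variation derived from it; the LD times below are invented, and the intraclass correlation coefficient (a two-way ANOVA-based statistic) is omitted for brevity:

```python
import math

def tem(test, retest):
    """Technical error of measurement: sqrt(sum(d^2) / 2n) over test-retest differences d."""
    d = [a - b for a, b in zip(test, retest)]
    return math.sqrt(sum(x * x for x in d) / (2 * len(d)))

def cv_percent(test, retest):
    """Coefficient of variation: TEM expressed as a percentage of the grand mean."""
    grand_mean = (sum(test) + sum(retest)) / (2 * len(test))
    return 100 * tem(test, retest) / grand_mean

# Invented line-drill times (s) for three players, measured one week apart
week1 = [30.0, 31.2, 29.5]
week2 = [30.4, 31.0, 29.9]
error = tem(week1, week2)      # absolute test-retest error, in seconds
cv = cv_percent(week1, week2)  # the same error relative to the grand mean, in %
```

Expressing the TEM relative to the grand mean, as the study does, lets reliability be compared across tests with different units and magnitudes.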
 
This study examined the effects of on-field combined strength and power training (CSPT) on physical performance among U-14 young soccer players. Players were assigned to experimental (EG, n = 28) and control (CG, n = 23) groups. Both groups underwent preseason soccer training for 12 weeks. The EG performed CSPT twice a week, consisting of strength and power exercises that trained the major muscles of the core, upper, and lower body. CSPT significantly (p < 0.05) improved vertical jump height, ball-shooting speed, 10-m and 30-m sprint times, and Yo-Yo intermittent endurance run (YYIER) performance, and reduced submaximal running cost (RC). CSPT had moderate effects on vertical jump, ball-shooting, 30-m sprint, and YYIER, and small effects on 10-m sprint, RC, and maximal oxygen uptake. YYIER had significant (p < 0.05) correlations with 10-m (r = -0.47) and 30-m (r = -0.43) sprint times, ball-shooting speed (r = 0.51), and vertical jump (r = 0.34). CSPT can be performed together with soccer training with no concomitant interference with aerobic capacity and with improved explosive performance. In addition, it is suggested that CSPT be performed during the preseason period rather than in-season to avoid insufficient recovery/rest or overtraining.
 
The aim of this article was to identify differences in the anthropometric and physiological characteristics of first-team and reserve young soccer players (10-14 years old) at both the beginning and end of the soccer season. Body composition was calculated by measuring weight, height, skinfolds, limb circumference, and joint diameter. VO₂max was estimated by the Åstrand test. Sprint and jump tests were also performed. In general, first-team players (FTPs) were taller and leaner. However, the most relevant difference found at the beginning of the season was that FTPs had shorter sprint times than reserves in the 30-m test (both flat and with 10 cones). Moreover, these differences in sprint time were more marked at the end of the season. In addition, jump test performance by the reserves declined from the beginning to the end of the season. These results indicate that sprint time is an important factor associated with selection as an FTP between the ages of 10 and 14 years. The progression of the FTPs during the course of the season is better than that of the reserves and is associated with a different degree of growth and maturity. These findings should be taken into account by trainers and coaches to avoid a bias against late-maturing or younger soccer players.
 
The main purpose of the present investigation was to verify the responses of hematological parameters in men and women competitive swimmers during a 14-week training program. Twenty-three Olympic and international athletes were evaluated 4 times during the experiment: at the beginning of the endurance training phase (T1), at the end of the endurance training phase (T2), at the end of the quality phases (T3), and at the end of the taper period (T4). On the first day at 8:00 AM, each swimmer had a blood sample taken for the determination of hematological parameters. At 3:00 PM, the athletes had their aerobic performance measured by anaerobic threshold. On the second day at 8:00 AM, the swimmers had their aerobic performance measured by critical velocity. Hematocrit and mean corpuscular volume diminished (p ≤ 0.05) from T1 to T2 (men: 5.8 and 7.2%; women: 11.6 and 6.8%), and increased (p ≤ 0.05) from T2 to T3 (men: 7.2 and 6.0%; women: 7.4 and 5.2%). These results were related to the plasma volume changes of the athletes. However, these alterations do not seem to affect the swimmers' aerobic performance. For practical applications, time-trial performance is better than aerobic performance (i.e., anaerobic threshold and critical velocity) for monitoring training adaptations.
 
Study design: pre- and post-diagnostics with test procedures and the 5-week intervention phase of high-intensity interval training (HIIT) and high-volume training (HVT) (HRmax = maximal heart rate).
(A) Mean pre–post changes in maximal oxygen uptake (V̇O₂max) and 1,000-m running time (T₁₀₀₀) after short-term (5 weeks) high-intensity interval training (HIIT) and high-volume training (HVT). (B) Individual responses of V̇O₂max and T₁₀₀₀ after short-term (5 weeks) HIIT and HVT. Dashed lines show individual responses as pre–post differences in % (Δpre–post). Black lines represent the mean pre–post responses for either HIIT or HVT.
Pre–post values for HIIT and HVT*
Correlation of pre–post changes in maximal oxygen uptake (Δpre–post V̇O₂max) vs. pre–post changes in 1,000-m running time (Δpre–post T₁₀₀₀).
High-intensity interval training (HIIT) in junior and adult soccer has been shown to improve oxygen uptake (VO₂) and enhance soccer performance. The main purpose of this study was to examine the short-term effects of a 5-week HIIT vs. high-volume training (HVT) program in 14-year-old soccer players, regarding the effects on VO₂max and 1,000-m time (T₁₀₀₀) and on sprinting and jumping performance. Over a 5-week period, 19 male soccer players with a mean (SD) age of 13.5 ± 0.4 years performed HIIT at ~90% of maximal heart rate. The HVT intensity was set at 60-75% of maximal heart rate. VO₂max increased significantly (7.0%) from pre to post in HIIT but not after HVT. T₁₀₀₀ decreased significantly after HIIT (by ~10 seconds, vs. ~5 seconds after HVT). Sprint performance increased significantly in both groups from pre- to post-testing, without any changes in jumping performance.
 
The purpose of this study was to test if a simplified impulse-response (IR) model would correlate with competition performances in an elite middle-distance runner over a period of 7 years that encompassed two Olympiads. Daily recorded pace and time obtained from training logs of this individual for the years 2000 to 2006 were used to calculate the impulse (training stress score, or TSS). The daily TSS was used to generate acute and chronic training loads (ATL and CTL, respectively), and a model response output, or p(t), was calculated based on the relationship p(t) = CTL - ATL. Competition performances (800 m-1 mile) were converted to Mercier scores (MS) and compared to p(t) and model parameters TSS, ATL, and CTL. MS was positively correlated with model output response p(t) (p < 0.01) and negatively with ATL (p < 0.01). Quadratic relationships were also observed between MS and both p(t) and CTL (p < 0.001), potentially indicating an optimal balance between fitness, fatigue, and performance. The results of this study demonstrate that the output of this simplified IR modeling approach correlates with performance in at least 1 elite athlete. Further studies are necessary to determine the generalizability of this method, but coaches may wish to use this approach to analyze previous training and performance relationships and iteratively modify training to optimize performance.
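The simplified IR model above can be sketched in a few lines. The abstract defines p(t) = CTL − ATL; the exponential smoothing and the 7- and 42-day time constants for ATL and CTL used below are common defaults assumed for illustration, not values stated in the study:

```python
import math

def exp_moving_avg(daily_tss, time_constant_days):
    """Exponentially weighted moving average of daily training stress scores (TSS)."""
    alpha = 1.0 - math.exp(-1.0 / time_constant_days)
    avg, out = 0.0, []
    for tss in daily_tss:
        avg += alpha * (tss - avg)
        out.append(avg)
    return out

def model_response(daily_tss, atl_tc=7, ctl_tc=42):
    """p(t) = CTL - ATL (time constants of 7 d acute / 42 d chronic are assumed)."""
    atl = exp_moving_avg(daily_tss, atl_tc)   # acute training load (fatigue)
    ctl = exp_moving_avg(daily_tss, ctl_tc)   # chronic training load (fitness)
    return [c - a for a, c in zip(atl, ctl)]

# A constant 100-TSS/day block: fatigue (ATL) outpaces fitness (CTL) at first,
# so p(t) dips below zero, then recovers toward zero as the load becomes chronic.
p = model_response([100.0] * 200)
```

This recovery of p(t) once a load becomes chronic is the mechanism behind the quadratic relationships the study observed between performance and both p(t) and CTL.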
 
We investigated the musculoskeletal adaptations and efficacy of a whole-body eccentric progressive resistance-training (PRT) protocol in young women. Subjects (n = 37; mean age, 24.3) were randomly assigned to one of 3 groups: high-intensity eccentric PRT (HRT), low-intensity eccentric PRT (LRT), or control. Subjects performed 3 sets of 6 repetitions at 125% intensity or 3 sets of 10 repetitions at 75% intensity in the HRT and LRT groups, respectively, 2 times per week for 16 weeks. Strength was determined by the concentric 1-repetition maximum (1RM) standard. Bone mass and body composition were measured by dual-energy x-ray absorptiometry (DXA). Blood and urine samples were obtained for deoxypyridinoline, osteocalcin, creatine kinase, and creatinine. Data were analyzed by repeated-measures analysis of variance with post hoc comparisons. Strength increased 20-40% in both training groups. Lean body mass increased in the LRT (0.7 +/- 0.6 kg) and HRT (0.9 +/- 0.9 kg) groups. Bone mineral content increased (0.855 +/- 0.958 g) in the LRT group only. Deoxypyridinoline decreased and osteocalcin increased in the HRT and LRT groups, respectively. These findings suggest that submaximal eccentric training is optimal for musculoskeletal adaptations and that the intensity of eccentric training influences the early patterns of bone adaptation.
 
Estimated marginal means of position units 
Published literature suggests that one of the key determinants of success at rugby union international competitions is anthropometric profile. The Long Term Player Development (LTPD) model is a framework designed to guide the development of the tactical, physical, and psychological domains of sporting participation. In Ireland the Train-to-Train stage of the LTPD model is a critical stage, whereby the next developmental progression would include the transition of players into professional academies. To date, no previously published studies have examined the anthropometric profile of Irish Schools' rugby union players at the Train-to-Train stage of the LTPD model. The anthropometric profile of 136 male adolescent rugby union players at the Train-to-Train stage of the LTPD model was assessed using total-body dual-energy X-ray absorptiometry. Significant differences in height, body mass, body fat %, fat mass, lean mass, and fat-free mass were observed between players assigned to the forward and back units, as well as for specific position categorizations within each unit. Direct logistic regression revealed that body mass was a statistically significant (p < 0.01) predictor of unit position classification, with an odds ratio of 2.35, indicating that players with a higher body mass were more than twice as likely to be classified as forwards. The results of the present study indicate that at the Train-to-Train stage of the LTPD model, forward and back units have distinctly different anthropometric profiles. Furthermore, anthropometric differentiation also exists within specific position categorizations within each of these playing units. Thus, anthropometric profiling should be carried out on a systematic and periodic basis, as this will allow for the evaluation of the effectiveness of the implementation strategies of the LTPD model on a national basis.
 
Eccentric squat on decline board on unstable surface with extra load.
The aim of the study was to investigate the efficacy of a rehabilitation protocol applied during the competitive period for the treatment of patellar tendinopathy. A total of sixteen male volleyball players were divided into two groups. Fifteen from the experimental group (E) and 13 from the control group (C) completed the same tests three times: before the training program started (measurement 1), after 12 weeks (measurement 2), and after 24 weeks (measurement 3). The protocol included ultrasound (USG) imaging with color Doppler function, clinical testing, pain intensity evaluation with the VISA-P questionnaire, and leg muscle strength, power, and jumping ability measurements. The key element of the rehabilitation program was the eccentric squat on a decline board with an additional unstable surface. An essential factor of the protocol was a set of preventive functional exercises, with a focus on eccentric exercises of the hamstrings. Patellar tendinopathy was observed in 18% of the tested young volleyball players. Implementation of the presented rehabilitation protocol with eccentric squat on a decline board, applied during the sports season, lowered the pain level of the young volleyball players. The presented rehabilitation protocol, applied without interrupting the competitive period and combined with functional exercises, could be an effective method for the treatment of patellar tendinopathy.
 
The purpose of this study was to evaluate the hematological profile of military recruits in different settings and training programs and to investigate the link between anemia and iron deficiency and stress fracture (SF) occurrence. We surveyed × groups of recruits for 16 months: 221 women (F) and 78 men (M) from × different platoons of a gender-integrated combat battalion, and a control group (C F) of 121 female soldiers from a noncombat unit. Data were fully collected upon induction and at 4 and 16 months from 48 F, 21 M, and 31 C F. Blood tests, anthropometry, physical aerobic fitness, and SF occurrence were evaluated. On induction day, 18.0 and 19.0% of F and C F were found to be anemic, and 61.4 and 50.9%, respectively, were found to have iron deficiency, whereas 7.7% of M were found to be anemic and 10.2% iron deficient. During the 4 months of army basic training (ABT), anemia and iron deficiency prevalence did not change significantly in any group. After 16 months, anemia prevalence had decreased by 8% among F and C F and abated in M. Iron deficiency was prevalent in 50.0, 59.4, and 18.8% of F, C F, and M, respectively. Stress fractures were diagnosed in 14 F during ABT, and these recruits had a significantly higher prevalence (p < 0.05) of anemia and iron deficiency anemia compared with F without SFs. The observed link between anemia and iron deficiency on recruitment day and SFs suggests the importance of screening female combat recruits for these deficiencies. To minimize the health impact of army service on female soldiers, preventative measures related to anemia and iron deficiency should be administered. Further research is needed to evaluate the influence of low iron in kosher meat as a possible explanation for the high prevalence of iron deficiency among young Israeli recruits.
 
Plyometric training (PT) programs are widely used to improve explosive actions in soccer players of various ages, although there is debate about optimal training duration and the time course of improvement. Twenty-two early- to mid-puberty elite soccer players were assigned to a control group (CG, n=10, regular soccer training) or a plyometric training group (PTG, n=12, regular soccer training with two sessions each week substituted by PT). Both groups trained for 16 weeks during the in-season period. The CG performed tests only at baseline and post-intervention, whereas the PTG performed additional tests after 4, 8, and 12 weeks. During each test, subjects' performances in speed (10 m & 30 m; 5 m & 20 m), agility, shuttle run (SR), multiple 5 bounds (MB5), and standing long jump (LJ) were recorded. The PTG showed improved performance in 20-m sprint time (-3.2%), agility time (-6.1%), MB5 distance (+11.8%), and LJ distance (+7.3%) (all p<0.05) after 16 weeks. All these improvements were greater than in the CG (all p<0.05). The time course of improvement in the PTG showed that 20-m sprint time improved after 16 weeks (p=0.012); agility after 4 (p=0.047) and 8 weeks (p=0.004), but stopped after 12 weeks (p=0.007); MB5 after 8 (p=0.039), 12 (p=0.028), and 16 weeks (p<0.001); and LJ after 4 (p=0.045), 12 (p=0.008), and 16 weeks (p<0.001). PT seems to be an appropriate training tool to enhance some, but not all, explosive actions. The results indicate that the duration of a PT program depends heavily on which type of explosive action should be improved, or whether several explosive actions should be improved at the same time.
 
Foot strike pattern has not been examined during ultramarathons, where fatigue or avoidance of impact might have a greater effect on foot strike and other gait parameters than in shorter events. In this study, video analysis from three level sites at a 161-km ultramarathon was used to: (1) examine changes in foot strike pattern, stride rate, and stride length; (2) determine if foot strike pattern is related to performance; and (3) ascertain if post-race blood creatine phosphokinase (CK) concentrations differ by foot strike pattern. Rear-foot strike (RFS) prevalence was 79.9%, 89.0%, and 83.9% at 16.5, 90.3, and 161.1 km, respectively. There was a significant distance effect between the 90.3-km and 161.1-km sites for stride rate (p<0.05) and across all distances for stride length (p<0.0001), but stride rate and length were stable among the top-20 finishers. There was no effect (p=0.3) of foot strike pattern on performance. However, top-20 finishers made greater use (p=0.02) of a non-RFS pattern at 161.1 km than the remaining finishers. There was a trend toward greater post-race blood CK values among non-RFS compared with RFS runners, reaching significance at the 90.3-km site (p<0.05). Thus, the increased RFS prevalence by race mid-point was likely due to the greater muscular demands of non-RFS patterns, as supported by the higher post-race blood CK concentrations among non-RFS runners. Faster runners maintained higher stride rates and lengths throughout the race and made greater use of a non-RFS pattern at the end of the race compared with slower finishers.
 
The aim of this study was to provide percentile values for 9 different muscular strength tests for Spanish children (1,513 boys and 1,265 girls) aged 6 to 17.9 years. The influence of body weight on muscular strength level across age groups was also examined. Explosive strength was assessed by the ball throw test (upper body) and the standing broad jump and vertical jump tests (lower body). Upper-body muscular endurance was assessed by the push-ups, bent arm hang, and pull-ups tests, and abdominal muscular endurance was assessed by the sit-ups, curl-ups in 30 seconds, and curl-ups tests. Body mass index (BMI) was calculated. Participants were categorized according to the BMI international cut-off values as underweight, normalweight, overweight, and obese. Boys had significantly better scores than girls in all the studied tests, except in the 3 upper-body muscular endurance tests in the 6- to 7-year-old group and in the push-ups test in the 8- to 9-year-old group. Underweight and normalweight individuals showed similar strength levels. Both underweight and normalweight children and adolescents had significantly higher performance than their overweight and obese counterparts in the lower-body explosive strength tests, in the push-ups test in boys, and in the bent arm hang test in both boys and girls. In conclusion, percentile values for 9 muscular strength tests are provided. Percentile values are of interest to identify the target population for primary prevention and to estimate the proportion of adolescents with high or low muscular strength levels. The overweight and obese groups had worse scores than their underweight and normalweight counterparts, whereas the underweight group had a similar performance to the normalweight group.
 
Acceleration tubing club system used during the 6-week golf-specific strength training. 
Graphic display of the data shown in Table 5. Evolution of the variables for which analysis of variance showed significant time × group interaction effects (p < 0.05): body fat, muscle mass, squat jump (SJ), countermovement jump (CMJ), ball speed, club mean acceleration, and variables related to maximal strength (RM).
General training regimen of the golfers during the study.
Descriptive data for anthropometric features, isometric grip strength, explosive strength, ball speed, mean club acceleration, and maximal strength for the control group (CG) and the treatment group (TG) for each test occasion (mean ± SD).
The purpose of this study was to determine the effects of an 18-week strength training program on variables related to low-handicap golfers' performance. Ten right-handed male golfers, reporting a handicap of 5 or less, were randomly divided into two groups: the control group (CG) (N = 5, age: 23.9 ± 6.7 years) and the treatment group (TG) (N = 5, age: 24.2 ± 5.4 years). CG players followed the standard physical conditioning program for golf, which was partially modified for the TG. The TG participated in an 18-week strength training program divided into three parts: maximal strength training including weightlifting exercises (2 days a week for 6 weeks), explosive strength training with combined weights and plyometric exercises (2 days a week for 6 weeks), and golf-specific strength training, including swings with a weighted club and accelerated swings with an acceleration tubing system (3 days a week for 6 weeks). Body mass, body fat, muscle mass, jumping ability, isometric grip strength, maximal strength (RM), ball speed, and golf club mean acceleration were measured on five separate occasions. The TG demonstrated significant increases (p < 0.05) in maximal and explosive strength after 6 weeks of training and in driving performance after 12 weeks. These improvements remained unaltered during the 6-week golf-specific training period and even during a 5-week detraining period. It may be concluded that an 18-week strength training program can improve maximal and explosive strength and these increases can be transferred to driving performance; however, golfers need time to transfer the gains.
 
CMJ (centimeters), bench press (kilograms), full squat (kilograms), throwing velocity (kilometers per hour), and 20-m swim sprint (seconds) test performance of the S and C groups before and after the 18-week in-season training.* 
We examined the effect of 18 weeks of strength and high-intensity training on key sport performance measures of elite male water polo (WP) players. Twenty-seven players were randomly assigned to two groups: a control group (in-water training only) and a strength group (strength training sessions twice per week plus in-water training). In-water training was conducted five days per week. The 20-m maximal sprint swim, maximal dynamic strength (1RM) for the upper body (bench press, BP) and lower body (full squat, FS), countermovement jump (CMJ), and throwing velocity were measured before and after training. The training program included upper- and lower-body strength and high-intensity exercises (bench press, full squat, military press, pull-ups, loaded CMJ, abs). Baseline results showed no significant differences between the groups in any of the variables tested. No improvement was found in the control group; however, meaningful improvement was found in all variables in the experimental group: CMJ (2.38 cm, 6.9%, effect size (ES)=0.48), BP (9.06 kg, 10.53%, ES=0.66), FS (11.06 kg, 14.21%, ES=0.67), throwing velocity (1.76 km/h, 2.76%, ES=0.25), and 20-m maximal sprint swim (-0.26 s, 2.25%, ES=0.29). Specific strength and high-intensity training in male WP players for 18 weeks produced a positive effect on performance qualities highly specific to WP. Therefore, we propose modifying current training methodology for WP players to include strength and high-intensity training for athlete preparation in this sport.
 
The purpose of the study was to compare the seasonal changes (preparation period: PP and competition period: CP) in vertical jumping performance and knee muscle strength in a team of under-19 women volleyball players (N=12, 16.2 ± 1.5 yrs). The countermovement jump was used to evaluate jumping performance. The isometric knee extension moment at 150 ms from the onset of contraction (T150) and at maximum contraction (TMAX) was determined at nine knee angles (from 10° to 90°; full knee extension = 0°). The peak isokinetic knee extension (TISOK-EXT) and flexion (TISOK-FLEX) moments were determined at 60, 180, and 240°·s⁻¹. Repeated-measures analysis of variance was applied to the differences between PP and CP (p ≤ 0.05). Significant increases in jumping performance were found for jump height, peak impulse, total impulse, peak power, and take-off velocity (p ≤ 0.05). T150 was significantly increased (p ≤ 0.05) at knee flexion angles from 40° to 90°, whereas the increase was not significant at the more extended knee angles of 10°, 20°, and 30° (p > 0.05). TMAX was significantly increased only at 90° of knee flexion (p ≤ 0.05). With the exception of TISOK-FLEX at 60°·s⁻¹ (p ≤ 0.05), the increases in TISOK-EXT and TISOK-FLEX were not significant (p > 0.05). The TISOK-EXT/TISOK-FLEX ratios were not significantly changed (p > 0.05). The main application of the study is that it provides performance standards and potential criteria for variable selection in the seasonal evaluation of jumping performance and knee muscle strength.
 
Clinical laboratory tests.* 
The purpose of this investigation was to determine the metabolism of 2 over-the-counter steroids (Nortesten, which contains 36 mg of 19-nor-4-androstene-3,17-dione and 36 mg of 19-nor-4-androstene-3,17-diol) in healthy, resistance-trained men. Subjects were administered either low (72 mg) or high doses (144 mg) of Nortesten twice daily for 10 days. All subjects tested positive via urinalysis for the presence of nortestosterone at days 3, 5, 7, and 10. There was no change in the urine testosterone-epitestosterone ratio at any day. Furthermore, as determined by serum chemistry tests, there was no effect on renal, hepatic, hematological, or bone marrow function. Thus, short-term ingestion of 19-nor-4-androstene-3,17-dione and 19-nor-4-androstene-3,17-diol may result in a positive drug test result without any harmful side effects.
 
Week of training for team A. 
Week of training for team B. 
Number of training sessions and their duration for teams A and B. 
The purpose of this study was to assess the effect of the training executed by 2 under-19 teams from the first Spanish division on aerobic power, strength, and acceleration capacity. Two under-19 soccer teams that competed in the same league were evaluated on 2 occasions. The first evaluation (E₁) was done at the beginning of the competitive period, and the second evaluation (E₂) was done 16 weeks later, coinciding with the end of the first half of the regular season. The following were evaluated: lower-body strength through jump height with countermovement with and without load (CMJ/CMJ₂₀), speed of the Smith machine bar movement in a progressive load test of full squats (FSL), acceleration capacity over 10, 20, and 30 m (T₁₀, T₂₀, T₃₀, T₁₀₋₂₀, T₁₀₋₃₀, T₂₀₋₃₀), and maximal aerobic speed (MAS). Team A executed complementary strength training, with training loads determined from the speed at which each player moved the bar in the FSL. Between the evaluations, the training sessions of each team were recorded to assess their influence on the changes in E₂. Team A significantly improved its MAS (p < 0.01) and its application of strength in the CMJ₂₀ (p < 0.05) and FS₂₀₋₃₀₋₄₀ (p < 0.01), while significantly worsening its acceleration capacity in all the splits (p < 0.01). Team B slightly worsened its MAS and significantly improved its application of strength in the CMJ₂₀ (p < 0.01) and FS₅₀₋₆₀ (p < 0.05). Its acceleration capacity improved nonsignificantly, except in the 20- to 30-m interval (T₂₀₋₃₀, p < 0.05). The present study demonstrates that the use of loads as a function of movement speed, without the need to determine repetition maximums, is an adequate methodology for improving the application of strength in under-19 soccer players.
 
Total and partial time spent at different types of training and match loads for the under-19 volleyball players from weeks 1 to 9 and from weeks 10 to 18.
Values for the training-induced adaptations on jump capacity of the under-19 volleyball players at the beginning of the preparatory period (T1), after 9 weeks of training (T2), and after 18 weeks of training (T3).
Total time spent and percentages for two distinct periods of nine weeks of training (weeks 1-9) and (weeks 10-18) according to the different capacities developed in the conditioning sessions for the under-19 volleyball players.
The under-19 Brazilian volleyball national team has achieved great performances at international competitions. Because vertical jump capacity is critical for success in volleyball, the purpose of this study was to identify the training-induced adaptations in jump capacity assessed by general and specific tests during 3 different moments (i.e., T1, T2, and T3) of a macrocycle of preparation for the world championship. The sample was composed of 11 athletes from the Brazilian national team, the world champions (age, 18.0 ± 0.5 years; height, 198.7 ± 5.4 cm; body mass, 87.3 ± 5.9 kg). They were evaluated for jumping capacity by the following tests: squat jump (SJ), countermovement jump (CMJ), and jump anaerobic resistance (15 seconds) (JAR), as well as standing reach, height, and vertical jump tests for attack and block. Descriptive statistics were computed, and a repeated-measures analysis of variance was used. The Tukey-Kramer post hoc test was used when appropriate. Significance was set at p ≤ 0.05. The results showed that the training-induced adaptations in the SJ (3.9%) and CMJ (2.3%) were not statistically significant. The JAR showed statistical significance between T2 and T3 (9.6%), while attack height and block height presented significant differences between T1 and T2 (2.5% and 3.3%, respectively) and T1 and T3 (3.0% and 3.5%, respectively). The volume of training was quantified between weeks 1 and 9 (10,750 minutes, 1,194 ± 322 min·wk⁻¹) and between weeks 10 and 18 (8,722 minutes, 969 ± 329 min·wk⁻¹). In conclusion, this study showed that there were progressive and significant training-induced adaptations, mainly on the tests that simulated the specific skills, such as the spike and block, with the best results being reached after the first 9 weeks of training. This probably reflected not only the individuals' capacity to adapt but also the characteristics of the training loads prescribed during the entire macrocycle.
 
Changes in (A) body height, (B) body weight, and (C) body composition for college-level football players from 1959 to 2011 for all positions combined, mixed backs, mixed linemen, and mixed skilled players.
Figures 1 and 2 illustrate the change in body composition from 1959 to 2011 for college football players and from 1942 to 2011 for professional football linemen. The average change in body composition per year is between −0.133 and 0.127 %fat for mixed offensive backs, 0.046 to 0.275 %fat for mixed linemen, −0.053 to 0.164 %fat for mixed skilled, and 0.030 to 0.278 %fat for all positions combined.
Regression slopes, 95% confidence intervals (CI), and statistical significance for professional football players' body height, body weight, and body composition from 1942 to 2011 for all positions combined, mixed backs, mixed linemen, and mixed skilled players.
Regression slopes, 95% confidence intervals (CI), and statistical significance for college football players' body height, body weight, and body composition from 1959 to 2011 for all positions, mixed backs, mixed linemen, and mixed skilled players.
The purpose of this study was to document changes in height (cm), body weight (kg), and body composition (%fat) of American football players from 1942 to 2011. Published articles were identified from databases and cross-referencing of bibliographies. Studies selected met the requirements of: 1) having two of the three dependent variables (height, body weight, and body composition) reported in the results; 2) containing a skill level of college or professional; 3) providing measured, not self-reported, data; and 4) being published in English-language journals. The data were categorized into groups based on skill level (college and professional). The player positions were grouped into three categories: mixed linemen (offensive and defensive linemen, tight ends, and linebackers), mixed offensive backs (quarterbacks and running backs), and mixed skilled positions (defensive backs and wide receivers). Linear regression was used to provide slope estimates and 95% confidence intervals. Unpaired t-tests were used to determine whether an individual regression slope was significantly different from zero. Statistical significance was set at p < 0.017. College-level players in all position groups have significantly increased body weight over time (95% CI: mixed linemen .338-.900 kg/yr; mixed offensive backs .089-.298 kg/yr; mixed skilled .078-.334 kg/yr). The college-level mixed linemen showed a significant increase over time for height (95% CI .034-.188 cm/yr) and body composition (.046-.275 %fat/yr). Significant increases in body weight over time were found for professional-level mixed linemen (95% CI .098-.756 kg/yr) and mixed offensive backs (95% CI .180-.545 kg/yr). There were no other significant changes at the professional level. These data demonstrate that the body weights of college players at all positions and of professional mixed linemen have significantly increased from 1942 to 2011.
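The slope-with-95%-CI analysis this abstract describes can be sketched as follows; the year and body-weight values below are hypothetical illustrations, not data from the study:

```python
import numpy as np

# Hypothetical illustrative data: measurement year vs. mean body weight (kg)
# for one position group.
years = np.array([1959.0, 1970.0, 1980.0, 1990.0, 2000.0, 2011.0])
weight = np.array([95.0, 100.2, 105.5, 110.1, 116.8, 121.4])

n = len(years)
slope, intercept = np.polyfit(years, weight, 1)  # least-squares line
resid = weight - (slope * years + intercept)

# Standard error of the slope from the residual variance
se = np.sqrt(resid @ resid / (n - 2)) / np.sqrt(((years - years.mean()) ** 2).sum())

# 95% CI: slope +/- t(0.975, df = n - 2) * SE; t-critical for df = 4 is 2.776
t_crit = 2.776
ci = (slope - t_crit * se, slope + t_crit * se)
print(f"slope = {slope:.3f} kg/yr, 95% CI [{ci[0]:.3f}, {ci[1]:.3f}]")
```

A slope whose 95% CI excludes zero corresponds to the "significantly different from zero" criterion the abstract reports (the study additionally tightened alpha to 0.017 for multiple comparisons).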
 
The purpose of this cross-sectional study was to investigate the change in 100-km running performance and in the age of peak performance for 100-km ultra-marathoners. Age and running speed of the annual fastest women and men in all 100-km ultra-marathons held worldwide between 1960 and 2012 were analyzed across 148,017 finishes (18,998 women and 129,019 men) using single, multivariate, and non-linear regressions. Running speed of the annual fastest men increased from 8.67 to 15.65 km/h, and that of the annual fastest women from 8.06 to 13.22 km/h. For the annual ten fastest men, running speed increased from 10.23±1.22 to 15.05±0.29 km/h (p<0.0001), and for the annual ten fastest women from 7.18±1.54 to 13.03±0.18 km/h (p<0.0001). The sex difference decreased from 56.1% to 16.3% for the annual fastest finishers (p<0.0001) and from 46.7±8.7% to 14.0±1.2% for the annual ten fastest finishers (p<0.0001). The age of the annual fastest men increased from 29 to 40 years (p=0.025). For the annual fastest women, the age remained unchanged at 35.0±9.7 years (p=0.469). For the annual ten fastest women and men, the age remained unchanged at 34.9±3.2 (p=0.902) and 34.5±2.5 years (p=0.064), respectively. To summarize, 100-km ultra-marathoners became faster and the sex difference in performance decreased, but the age of the fastest finishers remained unchanged at ∼35 years. For athletes and coaches planning a career as a 100-km ultra-marathoner, it is noteworthy that the age of the fastest female and male 100-km ultra-marathoners remained unchanged at ∼35 years between 1960 and 2012, even though the runners improved their performance over time.
 
The purpose of this study was to compare normative data from present Division I National Collegiate Athletic Association football teams to those from 1987. Players were divided into 8 positions for comparisons: quarterbacks (QB), running backs (RB), receivers (WR), tight ends (TE), offensive linemen (OL), defensive linemen (DL), linebackers (LB), and defensive backs (DB). Comparisons included height, body mass, bench press and squat strength, vertical jump, vertical jump power, 40-yd-dash speed, and body composition. Independent t-tests were used to analyze the data with level of significance set at p < 0.01. Significant differences (p < 0.01) were found in 50 of 88 comparisons. From 1987 until 2000, Division I college football players in general have become bigger, stronger, faster, and more powerful. Further research is warranted to investigate if these trends will continue.
 
Women's 100-m freestyle world-best times, 1990–2011 (annotated).
Considerations for this phenomenon.
O'Connor, LM and Vozenilek, JA. Is it the athlete or the equipment? An analysis of the top swim performances from 1990 to 2010. J Strength Cond Res 25(12): 3239-3241, 2011-Forty-three world record swims were recorded at the 2009 Fédération Internationale de Natation (FINA) World Championship meet in Rome. Of the 20 FINA-recognized long-course (50-m pool) swimming events, men set new world records in 15 of those events, whereas women did the same in 17 events. Each of the men's world records and 14 of the 17 women's records still stand. These performances were unprecedented; never before had this many world records been broken in such a short period of time. There was much speculation that full-body, polyurethane, technical swimsuits were the reason for the conspicuous improvement in world records. Further analysis led the FINA to institute new rules on January 1, 2010, that limited the types of technical swimsuits that could be worn by athletes. No long-course world record has been broken since then. We sought to understand this phenomenon by analyzing publicly available race data and exploring other possible causes, including improvements in other sports, improvements in training science, changes in rules and regulations, gender differences, anaerobic vs. aerobic events, unique talent, and membership data.
 
The primary purpose of this study was to identify the most appropriate method for normalizing physical performance measures to body mass in American football players. Data were obtained from the population of players (n = 4,603) that completed the vertical jump, broad jump, 40-yd sprint, 20-yd shuttle, 3-cone drill, and bench press at the National Football League Scouting Combine from 1999 to 2014. Correlation coefficients were used to assess relationships between body mass and physical performance measures. For the entire group and each playing position, absolute (i.e., non-normalized) performance measures were significantly (p ≤ 0.05) correlated with body mass, indicating that normalization is warranted. Ratio scaling, however, was not appropriate for normalizing most performance measures, as it merely reversed (and increased in magnitude) the significant correlations between body mass and performance. Allometric scaling with derived allometric parameters was appropriate for normalizing all performance measures, as correlations between body mass and performance were near zero and no longer statistically significant. However, the derived allometric parameters differed by playing position. Thus, when normalizing physical performance measures to body mass, strength and conditioning professionals should use allometric scaling with test- and position-specific allometric parameters. Additionally, in the current study, percentile rankings were generated to provide test- and position-specific normative reference values for the absolute measures. Until body mass normalization techniques are adopted more broadly, strength and conditioning professionals can use these normative reference values to compare current players to those who have already participated in the Scouting Combine.
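The allometric approach this abstract describes can be sketched roughly as follows; the exponent, data, and sample size below are synthetic, not the Combine dataset or the study's derived parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic example: performance that scales with body mass^0.67, plus noise.
mass = rng.uniform(80.0, 160.0, 500)                       # body mass, kg
perf = 5.0 * mass ** 0.67 * np.exp(rng.normal(0, 0.05, 500))

# Derive the allometric exponent b as the slope of log(perf) on log(mass).
b, _ = np.polyfit(np.log(mass), np.log(perf), 1)

# Allometric normalization: performance / mass^b.
normalized = perf / mass ** b

# After normalization, the correlation with body mass should be near zero,
# which is the criterion the study used to judge a scaling method appropriate.
r_raw = np.corrcoef(mass, perf)[0, 1]
r_norm = np.corrcoef(mass, normalized)[0, 1]
print(f"b = {b:.3f}, r(raw) = {r_raw:.3f}, r(normalized) = {r_norm:.3f}")
```

Ratio scaling is the special case b = 1; when the true exponent is well below 1, dividing by mass overcorrects and flips the sign of the correlation, which matches the reversal the abstract reports.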
 
Magyari, PM and Churilla, JR. Association between lifting weights and metabolic syndrome among U.S. adults: 1999-2004 National Health and Nutrition Examination Survey. J Strength Cond Res 26(11): 3113-3117, 2012-The purpose of this cross-sectional study was to determine the proportion of U.S. adults who participate in the resistance exercise modality of lifting weights (LWs) by demographic characteristics and to investigate the impact of LWs on the prevalence and risk of metabolic syndrome (MetS) in a national representative sample of U.S. adults. The sample (n = 5,618) in this cross-sectional study included adults aged ≥20 years who participated in the 1999-2004 National Health and Nutrition Examination Survey. Approximately twice as many men (11.2%; 95% confidence interval [CI] 9.5, 13.1) reported LWs as women did (6.3%; 95% CI 5.2, 7.6) with non-Hispanic Whites (9.6%; 95% CI 8.1, 11.4) reporting the highest levels and Mexican Americans reporting the lowest levels (5.6%; 95% CI 4.4, 7.2) of engaging in LWs. Additionally, higher levels of socioeconomic status were associated with greater levels of self-reported LWs. MetS prevalence was found to be significantly lower among U.S. adults reporting LWs (24.6%; 95% CI 19.3, 30.9) compared with adults not reporting LWs (37.3%; 95% CI 35.5, 39.2) with associated risk reductions of 58% (p < 0.001) and 37% (p < 0.01) in the unadjusted model and model adjusted for demographic variables, respectively. These findings suggest that LWs may play a role in reducing the prevalence and risk of MetS among U.S. adults. Therefore, exercise professionals should strongly encourage the activity of LWs among adults of all ages to promote metabolic health and focus programs designed to increase the adoption of LWs among the subgroups who report the lowest levels of LWs.
 
Aerobic capacity and body composition were measured at 3 time points over a 1-year period in 26 Division 1A women soccer players from Texas A&M University, in order to determine whether there were seasonal changes in these parameters. Subjects were tested in December, immediately following a 4-month competitive season; in April, following 15 weeks of strength and conditioning; and immediately prior to the start of the regular season in August, following a 12-week summer strength and conditioning program. A periodized strength and conditioning program design was incorporated in order to optimize anaerobic and oxidative capacity immediately prior to the regular competitive season. Significant differences in VO2max were measured between August (49.24 +/- 4.38 ml x kg(-1) x min(-1)) and December (44.87 +/- 4.61 ml x kg(-1) x min(-1)). No significant changes in aerobic capacity were found between April (47.43 +/- 4.01 ml x kg(-1) x min(-1)) and August (49.64 +/- 5.25 ml x kg(-1) x min(-1)). Significant increases in body fat were measured between August (15.71 +/- 2.92%) and December (18.78 +/- 2.79%), before and after the competitive season, respectively. No significant changes in body fat were found between April (16.24 +/- 2.95%) and August (15.71 +/- 2.92%). The results of this study suggest that decreases in muscle mass over the course of a regular competitive season contribute to decreases in aerobic capacity in collegiate women soccer players. Although it is unknown whether this decrease in muscle mass is the result of inadequate training or a normal adaptation to the physiological demands imposed by soccer, the results of the current study suggest that resistance training volume should be maintained during the competitive season, in order to maintain preseason levels of muscle mass.
 
The precision of maximum strength assessment (1RM) is important for evaluating functional capacity and for prescribing and monitoring training loads. Several factors can affect the precision of 1RM tests, including the warm-up procedure. General and specific warm-up routines are recommended to enhance performance. The effects of a specific warm-up on performance are well established; the effects of a general warm-up, however, are unclear and seem to depend on its ability to increase muscle temperature while avoiding fatigue. Furthermore, temperature elevation depends on both the duration and the intensity of the activity, which may in turn affect 1RM performance. The objective of this study was to investigate the effect of different intensities and durations of general warm-up on 1RM performance. Sixteen strength-trained males were tested for leg-press 1RM under five conditions, each preceded by a specific warm-up: short-duration low-intensity (i.e., 5 minutes at 40% VO2max) (SDLI), long-duration low-intensity (i.e., 15 minutes at 40% VO2max) (LDLI), short-duration moderate-intensity (i.e., 5 minutes at 70% VO2max) (SDMI), long-duration moderate-intensity (i.e., 15 minutes at 70% VO2max) (LDMI), and a control (CTRL) condition with no general warm-up. Leg-press 1RM values were higher (on average 3%) when subjects performed LDLI (367.8 ± 70.1 kg; p=0.01) compared with the other four conditions. Following the LDMI condition, 1RM values were lower (on average -4%) than in the other four conditions (345.6 ± 70.5 kg; p=0.01). There were no differences among SDMI, SDLI, and CTRL (359.4 ± 69.2 kg, 359.1 ± 69.3 kg, and 359.4 ± 70.4 kg, respectively) (p=0.99). According to our results, a long-duration, low-intensity general warm-up seems to be appropriate for improving 1RM performance in strength-trained individuals.
 