Article (PDF available)

The importance of leadership in Soldiers' nutritional behaviors: results from the Soldier Fueling Initiative program evaluation

Abstract and Figures

Introduction: Improving Soldiers’ nutritional habits continues to be a concern of the US Army, especially amidst increasing obesity and high injury rates. This study examines leadership influence on nutritional behaviors within the context of the Soldier Fueling Initiative, a program providing nutrition education and improved dining facility menus to Soldiers in Basic Combat Training (BCT) and Advanced Individual Training (AIT). Methods: A mixed methods design using surveys (N=486) and focus groups (N=112) was used to collect data at Fort Jackson, SC, and Fort Eustis, VA, in 2011. Results: Survey results showed 75% of Soldiers in BCT believed their drill sergeant was helpful in making performance-enhancing food choices, and 86% agreed their drill sergeant believed it is important to eat for performance. Soldiers in AIT perceived their cadre as less helpful than their BCT drill sergeants and agreed less frequently that the AIT cadre believed it was important to eat for performance (P<.05). These measures of leader influence were significantly associated with nutritional attitudes and behaviors in both BCT and AIT. Focus groups revealed 5 key themes related to cadre influence and nutrition behavior (listed in order of most to least frequent): (1) cadre influence food choices through consequences related to selection, (2) cadre teach Soldiers how to eat, (3) cadre rush Soldiers to eat quickly to return to training, (4) cadre influence choice through example but often do not make healthy choices, and (5) cadre have no influence on food choices. Comment: Leaders influence most Soldiers’ nutrition practices within the training environment, particularly within BCT. 
Given that leader influence can impact Soldiers’ attitudes and behaviors, it is critical that military leaders become knowledgeable about optimal nutrition practices to disseminate appropriate information to their Soldiers, avoid reprimand associated with trainees’ food choices, reinforce key messages associated with nutrition programming, and lead by example in their own food choices.
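The BCT-versus-AIT difference reported above (P<.05) is the kind of result a chi-square test of two proportions provides. As a rough sketch using only the standard library, with hypothetical agree/disagree counts chosen to mirror the reported percentages (the study's actual cell counts are not given here):

```python
import math

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square (no continuity correction) for a 2x2 table
    [[a, b], [c, d]]; returns (chi2, p) with 1 degree of freedom."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    chi2 = 0.0
    for obs, row, col in ((a, row1, col1), (b, row1, col2),
                          (c, row2, col1), (d, row2, col2)):
        exp = row * col / n
        chi2 += (obs - exp) ** 2 / exp
    # For df=1 the chi-square survival function reduces to erfc(sqrt(x/2)).
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

# Hypothetical counts: 209/243 BCT Soldiers agree vs 182/243 in AIT
# (86% vs 75%, mirroring the reported percentages, not the real table).
chi2, p = chi_square_2x2(209, 34, 182, 61)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")  # p well below .05
```

With counts of this size the difference in proportions is clearly significant, consistent with the P<.05 reported above.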
... The U.S. Army 2010 Soldier Fueling Initiative was developed for use in Initial Military Training to establish a "fueling" standard for Soldiers. 9 The Soldier Fueling Initiative development group recognized that without appropriate fueling and performance-based dietary menu standards, Soldiers would be unable to maintain their health or sustain core mission competencies. They identified a number of aspects of DFAC feeding that could be modified to enhance nutrition quality and promote healthier eating, including menus, recipes, preparation methods, and portion sizes. ...
... 9 Nearly 2/3 (66.0%) reported using the labels at least once a day. 9,16 Soldiers also stated that the performance nutrition education provided to them generally helped them make performance-enhancing food choices. 5 Another important finding from the evaluation was that unit and command level leadership were key influencers of food choice and eating behaviors of Soldiers in the training environment, which indicates that leaders at every echelon should be educated on performance nutrition and good eating habits. ...
... 5 Another important finding from the evaluation was that unit and command level leadership were key influencers of food choice and eating behaviors of Soldiers in the training environment, which indicates that leaders at every echelon should be educated on performance nutrition and good eating habits. 9,17 Since 2008, the G4G program has evolved based on nutrition science, changes in nutrition guidance, data from conducted research, and feedback from key stakeholders ( Table I). The G4G version 2.0 (G4G 2.0) is a rebrand of the original G4G with a new and trademarked logo, which may not be used by commercial entities in advertising or product packaging labels. ...
Article
Introduction Go for Green® (G4G) is an evidence-based, multi-component nutrition program for military dining facilities (DFAC) to improve nutritional fitness among Service Members. The program evolved from supporting “fueling” during initial Army training into a robust intervention across all U.S. Military branches. The current G4G program consists of eight program requirements to optimize the nutrition environment, including traffic light labeling, nutritious menus, choice architecture, food promotion, marketing, and staff training. The evolution of the G4G program, development of standardized program requirements, and lessons learned are described. Materials and Methods The latest scientific evidence, best practices in health promotion and nutrition education, and results and data from G4G implementation in the military community support the current version of G4G. Feedback and observations from program developers, military branch foodservice headquarters, installation leadership, and local G4G DFAC teams provided insight into implementation challenges, successes, facilitators, and barriers. Results The G4G program has evolved and expanded from its inception over 10 years ago to its current version. Research studies, nutrition science, and feedback from military community stakeholders have informed programmatic changes and improvements. Conclusions G4G 2.0 is a robust, innovative, multi-component, performance nutrition program with clear program element requirements. Value was added to elevate the G4G program by setting program requirements, expanding program components, and establishing a centralized resource hub. Performance nutrition initiatives in local military DFACs, such as G4G 2.0, have great potential to impact the health and well-being of Service Members.
... We posit that cultural norms surrounding "soldierization" may help explain the observed changes in eating behaviors. In a study by Jackson et al. (Jackson et al., 2013), all six focus groups of Army recruits listed "leaders rushing recruits to eat" as a common theme describing how leadership may influence Soldiers' nutritional behaviors. Specific comments included insufficient time to taste food or eat enough food to feel full until the next meal, and occasional pressure to finish full plates within a short period of time (Jackson et al., 2013). ...
... In a study by Jackson et al. (Jackson et al., 2013), all six focus groups of Army recruits listed "leaders rushing recruits to eat" as a common theme describing how leadership may influence Soldiers' nutritional behaviors. Specific comments included insufficient time to taste food or eat enough food to feel full until the next meal, and occasional pressure to finish full plates within a short period of time (Jackson et al., 2013). Educational gaps may also explain the observed changes in eating behavior. ...
... However, modifiable eating behaviors like eating rate and intuitive eating are not Table 4 Associations between eating behaviors during initial military training (IMT) and changes in BMI, body composition, calorie intake, diet quality, and cardiometabolic biomarkers from pre to post-IMT a,b . addressed (Jackson et al., 2013), and leaders have expressed minimal confidence or interest in establishing healthy eating behaviors among recruits (Jayne et al., 2018). The concern with a greater prevalence of eating fast and a decreased reliance on internal satiety cues is that, if sustained long-term, these behavior changes may increase future risk for overweight/obesity and associated comorbidities. ...
Article
Full-text available
Eating behaviors such as eating fast and ignoring internal satiety cues are associated with overweight/obesity, and may be influenced by environmental factors. This study examined changes in those behaviors, and associations between those behaviors and BMI, cardiometabolic biomarkers, and diet quality in military recruits before and during initial military training (IMT), an environment wherein access to food is restricted. Eating rate and reliance on internal satiety cues were self-reported, and BMI, body fat, cardiometabolic biomarkers, and diet quality were measured in 1389 Army, Air Force and Marine recruits (45% female, mean ± SEM BMI = 24.1 ± 0.1 kg/m2) before and after IMT. Pre-IMT, habitually eating fast relative to slowly was associated with a 1.1 ± 0.3 kg/m2 higher BMI (P < 0.001), but not with other outcomes; whereas, habitually eating until no food is left (i.e., ignoring internal satiety cues) was associated with lower diet quality (P < 0.001) and, in men, 1.6 ± 0.6% lower body fat (P = 0.03) relative to those that habitually stopped eating before feeling full. More recruits reported eating fast (82% vs 39%) and a reduced reliance on internal satiety cues (55% vs 16%) during IMT relative to pre-IMT (P < 0.001). Findings suggest that eating behaviors correlate with body composition and/or diet quality in young, predominantly normal-weight recruits entering the military, and that IMT is associated with potentially unfavorable changes in these eating behaviors.
... Thus, in theory, recruits should be in equal or positive energy balance. However, others have reported that consumption of meals may be hindered by physical and verbal interference from drill instructors, as well as time restriction [40]. Thus, while adequate nutrition is offered, energy balance might not be achieved. ...
Article
Full-text available
Basic training is centered on developing the physical and tactical skills essential to train a recruit into a Marine. The abrupt increase in activity and energy expenditure in young recruits may contribute to high rates of musculoskeletal injuries, to which females are more susceptible. To date, the total workload of United States Marine Corps (USMC) boot camp is unknown and should include movement around the military base (e.g., to and from dining facilities, training locations, and classrooms). Thus, the purpose of this effort was to quantify workload and caloric expenditure, as well as qualitatively assess the impact of female reproductive health and injury rates in female recruits. Female recruits (n = 79; age: 19.1 ± 0.2 years, weight: 59.6 ± 0.8 kg, height: 161.6 ± 0.7 cm) wore physiological monitors daily throughout 10 weeks of USMC boot camp. Physical fitness test scores, physiological metrics from wearables, injury data, and menstrual cycle information were obtained. Female recruits on average expended 3096 ± 9 kcal per day, walked 11.0 ± 0.1 miles per day, and slept 5:43 ± 1:06 h:min per night throughout the 10 weeks of boot camp. About one-third (35%) of female recruits sustained an injury. In a subset of females who were not taking birth control and had previously been menstruating, 85% experienced cycle dysfunction during boot camp. High levels of physical activity and caloric expenditure, coupled with the stress of a new environment and insufficient sleep, may lead to alterations in female reproductive cycles and musculoskeletal injuries in young USMC recruits.
... Given the apparent need for an ecological approach within the military context, comparison using the theoretical domains framework offers the potential to identify effective mechanisms of change across the cognitive, affective, social, and environmental behavioral influence domains. 99 The military context contains strong social influences 100,101 and an environment over which personnel may not always have a high degree of control 102 (but which can be shaped by the military organization); therefore, interventions employing an ecological approach appear particularly suited to this context. Furthermore, given the positive outcomes observed in both the educationand environmental-based interventions in this review, it seems appropriate that future studies should strive to incorporate a multilevel/multicomponent ecological approach to further improve dietary outcomes. ...
Article
Context: Optimizing nutrition in military groups through improved diet quality and nutrition knowledge is key in supporting the high physical and cognitive demands. Objective: The objective of this investigation was to systematically review the effectiveness of nutrition interventions among military personnel in improving diet quality and/or nutrition knowledge. Data sources: Medline, Embase, CINAHL, and Scopus were searched from the earliest records to May 2020. Data extraction: Data were extracted by 2 reviewers. The primary outcomes were diet quality and/or nutrition knowledge. Data analysis: Twenty studies were included. The main intervention approaches identified were nutrition education of individuals (i.e., education-based studies; EB) (n = 12), and manipulation of the food service environment (i.e., dining facility studies; DFACs) (n = 8). The most common strategies were face-to-face lectures (n = 8) for EB, and healthier menus (n = 7) and education of catering staff (n = 6) for DFAC interventions. Most studies (18/20) demonstrated favorable within-group effects; however, dietary changes were inconsistent. Five of 10 studies using a comparator group demonstrated positive between-group differences. Conclusion: Although potential exists for improving diet quality and nutrition knowledge in military populations, the heterogeneity of the studies to date limits conclusions on the most efficacious strategies.
Article
Objective: Identify factors influencing eating behaviors among emerging adults in the military. Design: Focused ethnography using interviews, observations, and artifacts for data. Setting: Three US Naval installations. Participants: Thirty-two active-duty Sailors aged 18-25 years. Analysis: Qualitative data were organized in NVivo and analyzed sequentially to categorize culturally relevant domains and themes using a social ecological model (SEM). Descriptive statistics were used to describe questionnaire data in SPSS (version 27.0, IBM, 2020). Results: Leaders encouraged healthy eating through policies and messages, but cultural contradictions and environmental barriers undermined Sailors' efforts to eat healthily. Stress and resource constraints (intrapersonal), peer pressure (social), unhealthy food environments and lack of access to food preparation (environmental), and eating on the go because of mission-first norms (cultural) promoted unhealthy eating behaviors. Nutrition and culinary literacy (intrapersonal); peer support and leadership engagement (social); access to healthy, convenient, and low-cost foods (environmental); and indoctrination to healthy eating during recruit training (cultural) positively influenced eating behaviors. Conclusion and implications: The eating behaviors of service members are influenced by many modifiable factors. Targeted education, leadership engagement, and policies that make nutritious foods easily accessible, appealing, and preferred are needed.
Article
Full-text available
Objective: The aim was to develop, refine, and assess the usefulness of the Go for Green® (G4G) 2.0 Program Fidelity Assessment (PFA) tool. G4G 2.0 is a Department of Defense program designed to optimize access, availability, and knowledge of high-performance nutritious foods in military dining facilities (DFACs). Design: During a multi-site study to evaluate G4G 2.0 on meal quality and diner satisfaction, subject matter experts developed and refined a PFA tool based on eight program requirements. They identified tasks critical to program success and corresponding benchmarks, then proposed expansion of several program requirements and developed a scoring system to assess adherence. Three PFAs were conducted (Site 1, Site 2A, and Site 2B). Setting: Two DFACs in the U.S. implementing the G4G 2.0 program. Participants: Military DFACs participating in a G4G 2.0 evaluation study. Results: After G4G 2.0 implementation, Site 1 conducted a PFA and met benchmarks for 8 of 15 sections. At Site 2, a PFA was conducted after G4G 2.0 implementation (Site 2A) and again three months later (Site 2B), with 12 of 15 and 10 of 15 sections meeting benchmarks, respectively. Conclusion: Research highlights the need to maximize implementation quality to ensure interventions are effective, achievable, and efficient. Using a PFA tool to objectively assess nutrition interventions can inform program fidelity, successes, and opportunities for improvement. Results identify key areas that require additional training and resources to optimize access to nutrient-dense foods that support nutritional fitness. This feedback is critical for assessing potential program impact on Service Members.
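A benchmark tally like the "8 of 15 sections" result above can be sketched in a few lines. The section names and cut-offs below are invented for illustration; the real tool scores 15 sections against defined benchmarks:

```python
# Hypothetical PFA-style tally: compare observed section scores to
# benchmark cut-offs and count how many sections meet their benchmark.
benchmarks = {"traffic light labeling": 0.80, "menu standards": 0.75,
              "choice architecture": 0.70, "food promotion": 0.60,
              "staff training": 0.90}
observed = {"traffic light labeling": 0.85, "menu standards": 0.70,
            "choice architecture": 0.72, "food promotion": 0.55,
            "staff training": 0.95}

met = [section for section, cut in benchmarks.items()
       if observed[section] >= cut]
print(f"{len(met)} of {len(benchmarks)} sections met benchmarks")
```

The same pass/fail-per-section structure scales directly to the tool's 15 sections.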
Article
The prevalence of obesity continues to rise among youth and adults in the US and the US military. The US military is negatively impacted by the nation's growing waistline, with drastic reductions in eligible recruits, increasing risks for compromised physical endurance and performance, and ballooning health care costs. This perspective discusses the effects of the obesity epidemic on the US military, previous and continuing research and programming initiatives, the progress made to increase access to healthy foods, and opportunities for future directions in research and practice to combat the obesity epidemic.
Article
A military exists in a unique position. It is an organization in which active duty members knowingly join or are conscripted into service with the understanding that there is an increased risk of mental and/or bodily harm as compared to many other occupations. However, while the nature of the profession can inherently be dangerous, it does not follow that its members be placed at undue excess risk if that risk can be reasonably avoided or reduced. Social determinants of health are one example of influences under a military’s purview that impact health outcomes and well-being. Although the U.S. Military performs well across many health equity measures, disparities persist and require attention and redress. Military policies and practices deeply impact members’ lives during and after service, and the durability and profundity of these effects establish the ethical grounds upon which any military policy should be structured. The ethical obligation is fortified by the extent of control a military exercises over its personnel. Taken together, these factors necessitate a concerted effort by militaries to remain cognizant of the ethical impacts of their policies and practices and to ensure focus remains on the well-being and readiness of its personnel. As such, militaries have ethical responsibilities to promote healthy social determinants of health among their service members via policies and public health measures.
Article
Objective Examine associations between soldiers’ eating behaviors, compliance with body composition and fitness standards, and physical performance. Design Cross-sectional study. Setting Eight Army installations. Participants US Army Soldiers (n = 1,591; 84% male). Main Outcome Measures Characteristics, eating behaviors, compliance with body composition and physical fitness standards, and fitness level were assessed via questionnaire. Analysis Bivariate and multivariable logistic regression. Results Eating mostly at a dining facility was associated with lower odds of body composition failure (odds ratio [OR], 0.44; 95% confidence interval [CI], 0.26–0.73), whereas eating at a fast rate (OR, 1.51; 95% CI, 1.05–2.17) or often/always ignoring satiety cues (OR, 2.12; 95% CI, 1.06–4.27) was associated with higher odds of body composition failure. Eating mostly fast-food/convenience meals (OR, 1.75; 95% CI, 1.19–2.59) and eating at a fast rate (OR, 1.42; 95% CI, 1.04–1.93) were associated with higher odds of physical fitness failure. Skipping breakfast was associated with lower odds of high physical performance (OR, 0.41; 95% CI, 0.23–0.74), whereas nutrition education was associated with higher odds of high physical performance (OR, 1.02; 95% CI, 1.01–1.04). Conclusions and Implications As eating behaviors are modifiable, findings suggest opportunities for improving the specificity of Army health promotion and education programs.
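The odds ratios above come from logistic regression, but the basic quantity is easy to illustrate from a 2x2 table. A minimal sketch with invented counts (not the study's data): odds of body composition failure among fast versus slow eaters, with a Wald 95% CI computed on the log scale:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for the 2x2 table [[a, b], [c, d]] =
    [[exposed fail, exposed pass], [unexposed fail, unexposed pass]],
    with a Wald 95% confidence interval."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical: 40/100 fast eaters fail the standard vs 25/100 slow eaters.
or_, lo, hi = odds_ratio_ci(40, 60, 25, 75)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

An interval excluding 1.0, as here, corresponds to the significant associations reported in the abstract; the study's multivariable models additionally adjust for covariates.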
Article
Introduction Soldiers work in various extreme environments, including the High Arctic, where energy requirements are increased compared with temperate climates. Soldiers often do not reach their energy needs with combat rations and face additional challenges to feeding in the Arctic, which can hinder performance. The purpose of this study is to document soldiers’ perceptions of individual, dietary, and environmental factors influencing intake of combat rations during Arctic field training. Materials and Methods This qualitative phenomenological study included in-depth semi-structured individual interviews with 16 soldiers of the Canadian Armed Forces participating in the Arctic Operations Advisor training in Yellowknife (Northwest Territories) and Resolute Bay (Nunavut) from January to March 2019. Interviews were audio-recorded, transcribed verbatim, and then coded using a directed content analysis approach. Data were analyzed with NVivo qualitative data analysis software. Results Five themes related to the individual (personal preferences; mood and morale), the diet (water availability; food variety), or the environment (meal preparation time) were identified. A sixth theme was related to both the diet and the environment (food/water temperature). Soldiers explained that food and water were frozen, limiting water availability and greatly increasing meal preparation time. Food variety was deemed adequate by some, but others preferred more options. Individual food preferences and soldier mood and team morale could be barriers or facilitators to intake. Overall, the complexity of combat ration intake in the Arctic stemmed from the interaction of factors. Conclusions Various factors related to the individual, diet, and environment were found to influence intake of combat rations by participating soldiers during Arctic training.
Reducing barriers to combat ration consumption by enhancing operational suitability of rations for the Arctic environment could promote dietary intake. Bearing in mind many interrelated factors influenced intake of soldiers, the military would benefit from further assessing which challenges related to intake in the field could be addressed.
Article
Full-text available
Although positive associations have consistently been reported between sleep disruption and breast cancer, less is known about its potential role in prostate cancer. Within the prospective AGES-Reykjavik cohort study, we followed 2,102 men recruited in 2002-2006 until the end of 2009. Participants answered questions on sleep disruption. Information on the occurrence of prostate cancer was obtained through record linkages across the Icelandic Cancer Registry. We used Cox regression models with 95% confidence intervals (CI) to estimate HRs of prostate cancer by symptoms of sleep disruption. During follow-up, 135 men (6.4%) were diagnosed with prostate cancer. Compared with men without sleep disruption, those with problems falling and staying asleep were at significantly increased risk of prostate cancer [HR, 1.7 (95% CI, 1.0-2.9) and 2.1 (95% CI, 1.2-3.7)], respectively, with increasing sleep disruption severity. When restricted to advanced prostate cancer (≥ stage T3 or lethal disease), these associations became even stronger [HR 2.1 (95% CI, 0.7-6.2) and 3.2 (95% CI, 1.1-9.7)]. The results did not change after excluding from the analyses men who woke up during the night, indicative of nocturia, suggesting limited risk of reverse association. Our data suggest that certain aspects of sleep disruption may confer an increased risk of prostate cancer and call for additional, larger studies with longer follow-up times. Impact: Prostate cancer is one of the leading public health concerns in men; if confirmed in future studies, the association between sleep disruption and prostate cancer risk may open new avenues for prevention. Cancer Epidemiol Biomarkers Prev; 22(5); 872-9. ©2013 AACR.
Article
Full-text available
The purposes of the current study were to identify mental toughness profiles in adolescent cricketers and examine differences between these profiles on developmental assets and negative emotional states. A sample of 226 community cricketers (125 New Zealanders and 101 Australians; male n = 210) aged between 10 and 18 years (mean age = 14.41 years; SD = 2.11) completed a multisection, online survey containing measures of mental toughness, developmental assets, and negative emotional states. The results of hierarchical (Ward’s method) and nonhierarchical (k-means) cluster analyses revealed three mental toughness profiles characterized by low, moderate, and high levels of all five mental toughness assets (i.e., affective intelligence, desire to achieve, self-belief, attentional control, resilience). Those cricketers with high levels of mental toughness reported possession of more developmental assets and lower levels of negative emotional states when compared with cricketers with moderate levels of mental toughness. No statistically significant differences existed between the moderate and low mental toughness profiles. These findings provided preliminary evidence to suggest that mental toughness might be viewed not only from the traditional view of optimal performance but also from a stance that may represent a contextually salient representation of thriving in youth sport settings.
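The cluster-analysis step above can be illustrated with a toy k-means run on one-dimensional composite scores. The scores and initial centers below are invented; the real study clustered five mental toughness assets and seeded the k-means solution with Ward's method:

```python
def kmeans_1d(xs, centers, iters=20):
    """Plain k-means on 1-D scores with fixed initial centers,
    returning final centers and the grouped observations."""
    for _ in range(iters):
        groups = [[] for _ in centers]
        for x in xs:
            # Assign each score to its nearest current center.
            i = min(range(len(centers)), key=lambda j: abs(x - centers[j]))
            groups[i].append(x)
        # Recompute each center as its group's mean (keep empty ones).
        centers = [sum(g) / len(g) if g else c
                   for g, c in zip(groups, centers)]
    return centers, groups

# Invented composite scores separating into low/moderate/high profiles.
scores = [2.1, 2.4, 2.2, 3.4, 3.6, 3.5, 3.3, 4.6, 4.8, 4.7]
centers, groups = kmeans_1d(scores, centers=[2.0, 3.5, 5.0])
print([round(c, 2) for c in centers])
```

With well-separated data like this, the algorithm converges to three stable profile centers in a couple of iterations.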
Article
Full-text available
The goal of our study was to investigate different aspects of sleep, namely the sleep-wake cycle and sleep stages, in the vegetative state/unresponsive wakefulness syndrome (VS/UWS) and minimally conscious state (MCS). 24-h polysomnography was performed in 20 patients in a VS/UWS (n=10) or an MCS (n=10) due to brain injury. The data were first tested for the presence of a sleep-wake cycle, and the observed sleep patterns were compared to standard scoring criteria. Sleep spindles, slow wave sleep, and rapid eye movement sleep were quantified and their clinical value was investigated. According to our results, an electrophysiological sleep-wake cycle was identified in 5 MCS and 3 VS/UWS patients. Sleep stages did not always match the standard scoring criteria, which therefore needed to be adapted. Sleep spindles were more present in patients who clinically improved within 6 months. Slow wave sleep was present in 8 MCS and 3 VS/UWS patients but never in the ischemic etiology. Rapid eye movement sleep, and therefore dreaming, which is a form of consciousness, was present in all MCS and 3 VS/UWS patients. In conclusion, the presence of alternating periods of eyes-open/eyes-closed cycles does not necessarily imply preserved electrophysiological sleep architecture in the UWS and MCS, contrary to previous definitions. The investigation of sleep is a little-studied yet simple and informative way to evaluate the integrity of residual brain function in patients with disorders of consciousness, with possible clinical diagnostic and prognostic implications.
Article
Full-text available
Contemporary training for power sports involves diverse routines that place a wide array of physiological demands on the athlete. This requires a multi-faceted nutritional strategy to support both general training needs--tailored to specific training phases--as well as the acute demands of competition. Elite power sport athletes have high training intensities and volumes for most of the training season, so energy intake must be sufficient to support recovery and adaptation. Low pre-exercise muscle glycogen reduces high-intensity performance, so daily carbohydrate intake must be emphasized throughout training and competition phases. There is strong evidence to suggest that the timing, type, and amount of protein intake influence post-exercise recovery and adaptation. Most power sports feature demanding competition schedules, which require aggressive nutritional recovery strategies to optimize muscle glycogen resynthesis. Various power sports have different optimum body compositions and body weight requirements, but increasing the power-to-weight ratio during the championship season can lead to significant performance benefits for most athletes. Both intra- and extracellular buffering agents may enhance performance, but more research is needed to examine the potential long-term impact of buffering agents on training adaptation. Interactions between training, desired physiological adaptations, competition, and nutrition require an individual approach and should be continuously adjusted and adapted.
Article
Sleep habits among military populations are problematic. Poor sleep hygiene occurs in parallel with the global increase in obesity and metabolic syndrome and contributes to a decrease in performance. The extent of sleep issues needs to be quantified to provide feedback for optimizing warfighter performance and readiness. This study assessed various health behaviors and habits of US Army Soldiers and their relationship with poor sleep quality by introducing a set of new questions into the Comprehensive Soldier and Family Fitness (CSF2) Global Assessment Tool (GAT). Subjects included 14,148 US Army Active, Reserve, and National Guard members (83.4% male) who completed the GAT, a self-report questionnaire that measures 4 fitness dimensions: social, family, emotional, and spiritual. Approximately 60 new questions, including ones on sleep quality, within the fifth CSF2 dimension (physical) were also answered. A sleep score was calculated from 2 questions validated in the Pittsburgh Insomnia Rating Scale (0 to 6). Poor sleepers (5-6) were significantly (P<.001) more likely than good sleepers (0-1) to consider themselves in fair or poor health, be overweight or obese, and score in the lowest quartile of the emotional, social, family, and spiritual fitness dimensions. Additionally, poor sleepers were significantly (P<.001) less likely to have a healthy body mass index and waist circumference, eat breakfast 6 or more times a week, meet aerobic exercise and resistance training recommendations, and pass their Army Physical Fitness Test in the top quartile. This study examined sleep quality in a group of military personnel and indicated significant associations between quality of sleep and physical performance, nutritional habits, measures of obesity, lifestyle behaviors and measures of psychosocial status. Targeted educational interventions and resources are needed to improve sleep patterns based on behaviors that can be most easily modified.
Article
Background: Long and short sleep duration are associated with increased risk for coronary heart disease (CHD) and cardiovascular disease (CVD); however, evidence is inconsistent. We sought to identify whether self-reported sleep duration and insomnia, based on a validated questionnaire, are associated with increased incident CHD and CVD among postmenopausal women. Methods: Women's Health Initiative Observational Study participants (N=86,329; 50-79 years) who reported on sleep at baseline were followed for incident CVD events. Associations of sleep duration and insomnia with incident CHD and CVD were evaluated using Cox proportional hazards models over 10.3 years. Results: Women with high insomnia scores had elevated risk of CHD (38%) and CVD (27%) after adjustment for age and race, and in fully adjusted models (CHD: hazard ratio [HR]=1.19, 95% confidence interval [CI] 1.09-1.30; CVD: HR=1.11, 95% CI 1.03-2.00). Shorter (≤5 hours) and longer (≥10 hours) sleep duration demonstrated significantly higher incident CHD (25%) and CVD (19%) in age- and race-adjusted models, but not in fully adjusted models. Formal tests for interaction indicated significant interactions between sleep duration and insomnia for risk of CHD (P<.01) and CVD (P=.02). Women with high insomnia scores and long sleep demonstrated the greatest risk of incident CHD relative to midrange sleep duration (HR=1.93, 95% CI 1.06-3.51) in fully adjusted models. Conclusions: Sleep duration and insomnia are associated with CHD and CVD risk and may interact to nearly double the risk of CHD and CVD. Additional research is needed to understand how sleep quality modifies the association between prolonged sleep and cardiovascular outcomes.
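The hazard ratios and confidence intervals reported above come from exponentiating Cox model log-hazard coefficients. A minimal sketch of that conversion follows; the coefficient and standard error in the comment are back-calculated for illustration, not values taken from the study:

```python
import math


def hazard_ratio_ci(beta: float, se: float, z: float = 1.96):
    """Convert a Cox model log-hazard coefficient (beta) and its
    standard error (se) into a hazard ratio with a confidence
    interval (z = 1.96 gives a 95% CI)."""
    hr = math.exp(beta)
    lower = math.exp(beta - z * se)
    upper = math.exp(beta + z * se)
    return hr, lower, upper


# Illustrative back-calculation: beta = 0.174 with SE = 0.045
# yields HR ~ 1.19 with 95% CI ~ 1.09-1.30.
```

Because the interval is symmetric on the log scale, reported CIs such as 1.09-1.30 are asymmetric around the hazard ratio itself.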
Article
Although polysomnography is necessary for the diagnosis of most sleep disorders, it is also expensive, time-consuming, intrusive, and interferes with sleep. Field-based activity monitoring is increasingly used as an alternative measure that can answer certain clinical and research questions. The purpose of this study was to evaluate the reliability and validity of a novel activity monitoring device (Fitbit) compared to both polysomnography and standard actigraphy (Actiwatch-64). To test validity, a Fitbit and an actigraph were worn simultaneously during standard overnight polysomnography by 24 healthy adults at the West Virginia University sleep research laboratory. To test inter-Fitbit reliability, three participants also wore two of the Fitbit devices overnight at home. Fitbit showed high intradevice reliability (96.5-99.1). Fitbit and actigraph differed significantly from each other and from polysomnography on recorded total sleep time and sleep efficiency. Bland-Altman plots indicated that both Fitbit and actigraph overestimated sleep efficiency and total sleep time. Sensitivity of both Fitbit and actigraphy for accurately identifying sleep was high within all sleep stages and during arousals; specificity of both Fitbit and actigraph for accurately identifying wake was poor. Specificity of the actigraph was higher except for wake before sleep onset; sensitivity of the Fitbit was higher in all sleep stages and during arousals. The web-based Fitbit, available at a markedly reduced price and with several convenience factors compared to standard actigraphy, may be an acceptable activity measurement instrument for use with normative populations. However, Fitbit has the same specificity limitations as actigraphy; both devices consistently misidentify wake as sleep and thus overestimate both sleep time and quality. Use of the Fitbit will also require specific validation before it can be used to assess disordered populations and/or different age groups.
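The sensitivity/specificity comparison described above reduces to an epoch-by-epoch confusion-matrix calculation against polysomnography, sketched here with sleep as the positive class (the standard convention in actigraphy validation; this function and its True/False encoding are illustrative, not the study's code):

```python
def epoch_agreement(device, psg):
    """Epoch-by-epoch sensitivity and specificity of a wearable
    against polysomnography (PSG). Each epoch is coded True for
    sleep and False for wake; sleep is the positive class."""
    if len(device) != len(psg):
        raise ValueError("epoch sequences must be the same length")
    tp = sum(d and p for d, p in zip(device, psg))          # sleep scored as sleep
    tn = sum(not d and not p for d, p in zip(device, psg))  # wake scored as wake
    fn = sum(not d and p for d, p in zip(device, psg))      # sleep scored as wake
    fp = sum(d and not p for d, p in zip(device, psg))      # wake scored as sleep
    sensitivity = tp / (tp + fn)  # ability to identify true sleep
    specificity = tn / (tn + fp)  # ability to identify true wake
    return sensitivity, specificity
```

A device that scores nearly every epoch as sleep attains high sensitivity but poor specificity, which is the pattern reported above for both the Fitbit and the actigraph.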
Article
Gender-based differences in the physiological response to exercise have been studied extensively for the last four decades, and yet the study of post-exercise, gender-specific recovery has only been developing in more recent years. This review of the literature aims to present the current state of knowledge in this field, focusing on some of the most pertinent aspects of physiological recovery in female athletes and how metabolic, thermoregulatory, or inflammation and repair processes may differ from those observed in male athletes. Scientific investigations on the effect of gender on substrate utilization during exercise have yielded conflicting results. Factors contributing to the lack of agreement between studies include differences in subject dietary or training status, exercise intensity or duration, as well as the variations in ovarian hormone concentrations between different menstrual cycle phases in female subjects, as all are known to affect substrate metabolism during submaximal exercise. While greater fatty acid mobilization occurs in females than in males during prolonged exercise, the inverse is observed during the recovery phase. This could explain why, despite mobilizing lipids to a greater extent than males during exercise, females lose less fat mass than their male counterparts over the course of a physical training programme. Where nutritional strategies are concerned, no difference appears between males and females in their capacity to replenish glycogen stores; optimal timing for carbohydrate intake does not differ between genders, and athletes must consume carbohydrates as soon as possible after exercise in order to maximize glycogen store repletion. While lipid intake should be limited in the immediate post-exercise period in order to favour carbohydrate and protein intake, in the scope of the athlete's general diet, lipid intake should be maintained at an adequate level (30%).
This is particularly important for females specializing in long-duration events. With protein balance, it has been shown that a negative nitrogen balance is more often observed in female athletes than in male athletes. It is therefore especially important to monitor protein intake during periods of caloric restriction, especially when working with female athletes who tend to limit their daily caloric intake. In the post-exercise period, females display lower thermolytic capacities than males. Therefore, the use of cooling recovery methods following exercise, such as cold water immersion or the use of a cooling vest, appears particularly beneficial for female athletes. In addition, a greater decrease in arterial blood pressure is observed after exercise in females than in males. Given that the return to homeostasis after a brief intense exercise appears linked to maintaining good venous return, it is conceivable that female athletes would derive a greater advantage from active recovery modes than males. This article reviews some of the major gender differences in the metabolic, inflammatory and thermoregulatory response to exercise and its subsequent recovery. Particular attention is given to the identification of which recovery strategies may be the most pertinent to the design of training programmes for athletic females, in order to optimize the physiological adaptations sought for improving performance and maintaining health.
Article
Reduced sleep duration and quality appear to be endemic in modern society. Curtailment of the bedtime period to minimum tolerability is thought to be efficient and harmless by many. It has been known for several decades that sleep is a major modulator of hormonal release, glucose regulation and cardiovascular function. In particular, slow wave sleep (SWS), thought to be the most restorative sleep stage, is associated with decreased heart rate, blood pressure, sympathetic nervous activity and cerebral glucose utilization, compared with wakefulness. During SWS, the anabolic growth hormone is released while the stress hormone cortisol is inhibited. In recent years, laboratory and epidemiologic evidence have converged to indicate that sleep loss may be a novel risk factor for obesity and type 2 diabetes. The increased risk of obesity is possibly linked to the effect of sleep loss on hormones that play a major role in the central control of appetite and energy expenditure, such as leptin and ghrelin. Reduced leptin and increased ghrelin levels correlate with increases in subjective hunger when individuals are sleep restricted rather than well rested. Given the evidence, sleep curtailment appears to be an important, yet modifiable, risk factor for the metabolic syndrome, diabetes and obesity. The marked decrease in average sleep duration in the last 50 years coinciding with the increased prevalence of obesity, together with the observed adverse effects of recurrent partial sleep deprivation on metabolism and hormonal processes, may have important implications for public health.
Article
Strength and power athletes are primarily interested in enhancing power relative to body weight and thus almost all undertake some form of resistance training. While athletes may periodically attempt to promote skeletal muscle hypertrophy, key nutritional issues are broader than those pertinent to hypertrophy and include an appreciation of the sports supplement industry, the strategic timing of nutrient intake to maximize fuelling and recovery objectives, plus achievement of pre-competition body mass requirements. Total energy and macronutrient intakes of strength-power athletes are generally high but intakes tend to be unremarkable when expressed relative to body mass. Greater insight into optimization of dietary intake to achieve nutrition-related goals would be achieved from assessment of nutrient distribution over the day, especially intake before, during, and after exercise. This information is not readily available on strength-power athletes and research is warranted. There is a general void of scientific investigation relating specifically to this unique group of athletes. Until this is resolved, sports nutrition recommendations for strength-power athletes should be directed at the individual athlete, focusing on their specific nutrition-related goals, with an emphasis on the nutritional support of training.