Article

Controversy and Debate: Memory-Based Methods Paper 1: The Fatal Flaws of Food Frequency Questionnaires and other Memory-Based Dietary Assessment Methods


Abstract

There is an escalating debate over the value and validity of memory-based dietary assessment methods (M-BMs). Proponents argue that despite limitations, M-BMs such as food frequency questionnaires (FFQs) provide valid and valuable information about consumed foods and beverages, and therefore can be used to assess diet-disease relations and inform public policy. In fact, over the past 60 years thousands of research reports using these methods were published and used to populate the United States Department of Agriculture's National Evidence Library, inform public policy, and establish the Dietary Guidelines for Americans. Despite this impressive history, our position is that FFQs and other M-BMs are invalid and inadmissible for scientific research and cannot be employed in evidence-based policy making. Herein, we present the empirical evidence, and theoretic and philosophic perspectives, that render M-BM data both fatally flawed and pseudo-scientific. First, the use of M-BMs is founded upon two inter-related logical fallacies: a category error and reification. Second, human memory and recall are not valid instruments for scientific data collection. Third, in standard epidemiologic contexts, the measurement errors associated with self-reported data are non-falsifiable (i.e., pseudo-scientific) because there is no way to ascertain if the reported foods and beverages match the respondent's actual intake. Fourth, the assignment of nutrient and energy values to self-reported intake (i.e., the pseudo-quantification of qualitative/anecdotal data) is impermissible and violates the foundational tenets of measurement theory. Fifth, the proxy-estimates created via pseudo-quantification are physiologically implausible (i.e., meaningless numbers) and have little relation to actual nutrient and energy consumption. Finally, investigators engendered a fictional discourse on the health effects of dietary sugar, salt, fat and cholesterol when they failed to cite contrary evidence or address decades of research demonstrating the fatal measurement, analytic, and inferential flaws presented herein.

... There is very little doubt that food, diet, or nutrient intake plays a major role in the overall health status of individuals and populations in general (Buttriss et al., 2017; Johanningsmeier, Harris, & Klevorn, 2016; Margetts & Nelson, 1995). However, generating indisputable evidence of the role of specific nutrients, non-nutritive (antinutritional) food constituents, meals, diets, foods, food supplements, food groups or dietary habits or patterns in the etiology and/or pathogenesis of food-related health outcomes has been a challenging undertaking for nutritional epidemiologists (Archer, Marlow, & Lavie, 2018; Satija, Yu, Willett, & Hu, 2015). ...
... Food or diet inarguably is a complex mixture of constituents made up of nutrients and the non-nutritive components (Margetts & Nelson, 1995; Zhao & Singh, 2020). Nonetheless, nutritional epidemiologists and nutritional scientists have successfully developed reasonably valid and reliable food or dietary intake assessment methods, albeit in the face of harsh criticism of their usefulness and credibility (Archer & Blair, 2015; Archer, Marlow et al., 2018; Archer, Pavela, & Lavie, 2015; Lachat et al., 2016; Margetts & Nelson, 1995; Naska et al., 2017; Subar et al., 2015). ...
... Given the influence of the dichotomy of philosophical perspectives on the relationships between food intake and the health status of individuals and defined populations, other researchers have suggested an integrated, multidisciplinary and/or interdisciplinary approach to addressing research questions in nutritional epidemiology, as criticisms against its methodological approaches are unrelenting (Archer, Marlow et al., 2018; Boeing, 2013; Satija et al., 2015; Tapsell, Neale, Satija, & Hu, 2016). ...
... Given the current controversial debate on the plausibility of dietary recalls and memory-based assessment protocols, this is potentially a major limitation. Specifically, self-reported dietary intake data, such as those from FFQs, result in non-falsifiable (i.e., pseudo-scientific) and physiologically implausible data (Archer et al., 2013; Archer, Marlow, et al., 2018a; Archer et al., 2015). Therefore, the difference between self-reported and actual dietary food intake may make definitive results impossible when examining the consumption of meat on a continuous rather than a dichotomous scale (Archer et al., 2013; Archer, Marlow, et al., 2018a; Archer et al., 2015). ...
Article
In this systematic and meta-analytic review, we examined the current evidence on positive psychological variables between individuals who consumed meat and individuals who abstained from meat consumption. After systematically searching five online databases for primary research on positive psychological outcomes in meat consumers and meat abstainers, 19 studies with 94,204 participants (n = 82,449 meat consumers, n = 9,964 meat abstainers) met the inclusion/exclusion criteria. The primary outcomes were self-esteem, satisfaction with life, and positive mental health. The secondary outcomes were positive affect, psychological well-being, vigor, optimism, happiness, and meaning in life. Individuals who consumed meat had greater positive mental health (g = 0.21, 95% CI [0.08, 0.31], p = .001) than meat abstainers. No significant differences were found between the groups on self-esteem (g = 0.19, 95% CI [-0.01, 0.38], p = .06) and satisfaction with life (g = 0.02, 95% CI [-0.04, 0.07], p = .57). The majority of studies examining the secondary outcomes showed no group differences. The evidence was limited, requiring more studies to determine the role of study quality in diet-health relations. Study designs precluded inference of causal and temporal relations. With respect to clinical practice, our findings add to the current controversial diet-health debate.
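The pooled effects reported above (e.g., g = 0.21, 95% CI [0.08, 0.31]) are the typical output of a random-effects meta-analysis. As a rough, non-authoritative sketch of how such a pooled Hedges' g and its confidence interval can be obtained, the following Python snippet applies the DerSimonian-Laird estimator to made-up study-level effect sizes and variances; all numbers are illustrative assumptions, not the data from the review above.

```python
import numpy as np
from scipy import stats

# Hypothetical study-level Hedges' g values and their variances
# (illustrative numbers only, not the data from the review above).
g = np.array([0.15, 0.30, 0.10, 0.25])
v = np.array([0.010, 0.020, 0.015, 0.012])

# Fixed-effect weights and Cochran's Q
w = 1.0 / v
fixed_mean = np.sum(w * g) / np.sum(w)
q = np.sum(w * (g - fixed_mean) ** 2)

# DerSimonian-Laird estimate of the between-study variance tau^2
df = len(g) - 1
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (q - df) / c)

# Random-effects pooling: weights, pooled g, 95% CI, and z-test p-value
w_re = 1.0 / (v + tau2)
g_pooled = np.sum(w_re * g) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
lo, hi = g_pooled - 1.96 * se, g_pooled + 1.96 * se
p = 2 * (1 - stats.norm.cdf(abs(g_pooled / se)))

print(f"pooled g = {g_pooled:.2f}, 95% CI [{lo:.2f}, {hi:.2f}], p = {p:.3f}")
```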
... Estimating dietary intake is challenging due to random and systematic bias in self-reported diets. To quantify food intake, most researchers rely on memory-based dietary assessments, such as food frequency questionnaires (FFQ), food records, and 24 h dietary recalls, which have been debated over their value and validity [1,2]. Random and systematic errors that may occur during memory-based dietary assessments include (a) respondents systematically overreporting or underreporting foods [3]; (b) respondents unintentionally including or omitting foods [4,5]; and (c) respondents being unable to recall portion sizes [6]. ...
... Dietary assessment methods based on self-report are prone to systematic and random errors [1,2]. The objective of this study was to compare self-reported dietary intake using 24 h recalls to provided menu items in a controlled feeding intervention. ...
Article
Full-text available
Systematic and random errors based on self-reported diet may bias estimates of dietary intake. The objective of this pilot study was to describe errors in self-reported dietary intake by comparing 24 h dietary recalls to provided menu items in a controlled feeding study. This feeding study was a parallel randomized block design consisting of a standard diet (STD; 15% protein, 50% carbohydrate, 35% fat) followed by either a high-fat (HF; 15% protein, 25% carbohydrate, 60% fat) or a high-carbohydrate (HC; 15% protein, 75% carbohydrate, 10% fat) diet. During the intervention, participants reported dietary intake in 24 h recalls. Participants included 12 males (seven HC, five HF) and 12 females (six HC, six HF). The Nutrition Data System for Research was utilized to quantify energy, macronutrients, and serving size of food groups. Statistical analyses assessed differences in 24 h dietary recalls vs. provided menu items, considering intervention type (STD vs. HF vs. HC) (Student’s t-test). Caloric intake was consistent between self-reported intake and provided meals. Participants in the HF diet underreported energy-adjusted dietary fat and participants in the HC diet underreported energy-adjusted dietary carbohydrates. Energy-adjusted protein intake was overreported in each dietary intervention, specifically overreporting beef and poultry. Classifying misreported dietary components can lead to strategies to mitigate self-report errors for accurate dietary assessment.
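The comparison described above (self-reported 24 h recalls versus provided menu items, analysed with Student's t-test) can be illustrated with a small sketch. The Python snippet below runs a paired t-test on energy-adjusted fat intake for hypothetical participants; all values (reported/provided grams of fat and total energy) are made up for illustration, and the exact analysis used in the study may differ.

```python
import numpy as np
from scipy import stats

# Hypothetical per-participant values (not the trial's data): grams of fat
# reported in the 24 h recall vs. grams of fat actually provided on the menu,
# plus total energy so fat can be energy-adjusted (% of energy, 9 kcal/g fat).
reported_fat_g = np.array([78, 85, 90, 72, 95, 88, 80, 76])
provided_fat_g = np.array([92, 98, 101, 85, 110, 97, 93, 90])
reported_kcal = np.array([2100, 2300, 2450, 2000, 2600, 2350, 2200, 2050])
provided_kcal = np.array([2200, 2400, 2500, 2100, 2700, 2400, 2300, 2150])

reported_fat_pct = reported_fat_g * 9 / reported_kcal * 100
provided_fat_pct = provided_fat_g * 9 / provided_kcal * 100

# Paired comparison of self-reported vs. provided energy-adjusted fat intake
t, p = stats.ttest_rel(reported_fat_pct, provided_fat_pct)
print(f"reported {reported_fat_pct.mean():.1f}% vs provided "
      f"{provided_fat_pct.mean():.1f}% of energy from fat, t = {t:.2f}, p = {p:.3f}")
```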
... Thus, it is necessary to improve methods of dietary assessment. However, this is a complex analysis, and the available instruments present some hindrances for the evaluation and/or quantification of food consumption [5,6]. ...
... The FFQ is widely used due to its capability of estimating usual food consumption over longer periods of time and its ease of application [7]. However, FFQs can produce unreliable energy and nutrient estimates due to the number of food items, the limitations of the food list included in the FFQ, or the respondent's memory [6][7][8]. Moreover, the FFQ assesses the respondent's current food consumption, but this is an event with great variability [5,[7][8][9]. ...
Article
Full-text available
Background: Traditional methods for assessing individual energy consumption often involve lengthy and intricate procedures. This study aims to introduce an Energy Consumption Estimation Scale, utilizing Item Response Theory (IRT), for adolescents aged 18-19 years. Methods: This psychometric investigation applies IRT to 93 items extracted from a validated food frequency questionnaire. The study encompasses a representative sample of 2515 adolescents from the São Luís birth cohort in Brazil. The latent trait, energy intake, is derived using IRT and subsequently validated through hierarchical multiple linear regression modeling. Significance was established at p < 0.05. Results: Samejima's model was successfully fitted (CFI and TLI > 0.9 and RMSEA < 0.08), effectively capturing variations across all energy consumption levels. Factors associated with the latent trait demonstrate consistent behavioral patterns. Adolescents with higher energy intake exhibited increased consumption of dairy products, artificially sweetened beverages, and seasonal fruits and vegetables. Conclusions: The proposed Energy Consumption Estimation Scale demonstrates a reliable measurement of energy intake and serves as a practical and concise alternative for assessing energy consumption among adolescents. These findings suggest the potential for adapting similar models for different age groups and incorporating diverse food items based on the obtained results.
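The scale above is built on Samejima's graded response model. For reference, here is a minimal sketch of how category probabilities are computed under that model for a single FFQ-style item; the item parameters (discrimination a and thresholds b) and the latent trait value are hypothetical, whereas the study itself estimated parameters for 93 items with IRT software.

```python
import numpy as np

def graded_response_probs(theta, a, b):
    """Category probabilities under Samejima's graded response model.

    theta : latent trait value (here, the energy-intake trait)
    a     : item discrimination
    b     : ordered category thresholds, length K-1
    Returns an array of K category probabilities that sum to 1.
    """
    # Cumulative probabilities P(X >= k), k = 1..K-1 (2PL form)
    p_star = 1.0 / (1.0 + np.exp(-a * (theta - np.asarray(b, dtype=float))))
    # Pad with P(X >= 0) = 1 and P(X >= K) = 0, then take differences
    p_star = np.concatenate(([1.0], p_star, [0.0]))
    return p_star[:-1] - p_star[1:]

# Hypothetical FFQ item with four ordered frequency categories
probs = graded_response_probs(theta=0.5, a=1.8, b=[-1.0, 0.2, 1.5])
print(probs, probs.sum())  # the four probabilities sum to 1
```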
... Self-reported dietary intake as estimated by FFQs is an invalid measurement because FFQs have been shown to be pseudoscientific. The perception of usual intake and actual dietary intake are often assumed to be equivalent but are in fact very different due to social desirability bias (Archer et al., 2018; Hebert et al., 1995). Additionally, FFQs introduce recall bias (Archer et al., 2018). ...
... Further, details on the quality assessment scores are provided in the fully extracted file, which has been uploaded on OSF. ...
Article
Background: As the global population ages, there has been a growing incidence of neurodegenerative diseases such as Alzheimer's. More recently, studies exploring the relationship between dietary patterns and neuroimaging outcomes have received particular attention. This systematic literature review provides a structured overview of the association of dietary and nutrient patterns with neuroimaging outcomes and cognitive markers in middle-aged to older adults. A comprehensive literature search was conducted to find relevant articles published from 1999 to date using the following databases: Ovid MEDLINE, Embase, PubMed, Scopus and Web of Science. The inclusion criteria for the articles comprised studies reporting on the association between dietary patterns and neuroimaging outcomes, which includes both specific pathological hallmarks of neurodegenerative diseases such as Aβ and tau and nonspecific markers such as structural MRI and glucose metabolism. The risk of bias was evaluated using the Quality Assessment tool from the National Heart, Lung, and Blood Institute of the National Institutes of Health. The results were then organized into a summary of results table, collated based on synthesis without meta-analysis. After conducting the search, 6050 records were extracted and screened for eligibility, with 107 eligible for full-text screening and 42 articles ultimately being included in this review. The results of the systematic review indicate that there is some evidence suggesting that healthy dietary and nutrient patterns were associated with neuroimaging measures, indicative of a protective influence on neurodegeneration and brain ageing. Conversely, unhealthy dietary and nutrient patterns showed evidence pointing to decreased brain volumes, poorer cognition and increased Aβ deposition. Future research should focus on sensitive neuroimaging acquisition and analysis methods, to study early neurodegenerative changes and identify critical periods for interventions and prevention. Systematic review registration: PROSPERO registration no. CRD42020194444.
... As TFA are not endogenously synthesized, blood levels of TFA are a biomarker for the balance of dietary TFA intake, distribution volume and catabolism, independent of subjective memory-based assessment methods such as food frequency questionnaires [20]. They, like other blood fatty acid levels such as EPA and DHA levels [21], reliably reflect cardiac and other tissue TFA levels over approximately the preceding 3 months [22,23]. ...
... Second, 53% of the patients included in Aldo-DHF were female, resulting in a representative gender distribution for the condition HFpEF, which is slightly more common in females [45]. Third, blood fatty acid levels are an objective biomarker of TFA status unbiased by recall-based dietary intake assessment methods (e.g., food frequency questionnaires) used in nutritional epidemiology which may not accurately reflect an individual's consumption of nutrients due to limitations such as measurement error, recall bias, selective reporting and incompleteness of food composition databases [22]. Our analysis is not without limitations. ...
Article
Full-text available
Background Industrially processed trans-fatty acids (IP-TFA) have been linked to altered lipoprotein metabolism, inflammation and increased NT-proBNP. In patients with heart failure with preserved ejection fraction (HFpEF), associations of TFA blood levels with patient characteristics are unknown. Methods This is a secondary analysis of the Aldo-DHF-RCT. From 422 patients, individual blood TFA were analyzed at baseline in n = 404 using the HS-Omega-3-Index® methodology. Patient characteristics were: 67 ± 8 years, 53% female, NYHA II/III (87/13%), ejection fraction ≥ 50%, E/e′ 7.1 ± 1.5; NT-proBNP 158 ng/L (IQR 82–298). A principal component analysis was conducted but not used for further analysis as cumulative variance for the first two PCs was low. Spearman's correlation coefficients as well as linear regression analyses, using sex and age as covariates, were used to describe associations of whole blood TFA with metabolic phenotype, functional capacity, echocardiographic markers for LVDF and neurohumoral activation at baseline and after 12 months. Results Blood levels of the naturally occurring TFA C16:1n-7t were inversely associated with dyslipidemia, body mass index/truncal adiposity, surrogate markers for non-alcoholic fatty liver disease and inflammation at baseline/12 months. Conversely, IP-TFA C18:1n9t, C18:2n6tt and C18:2n6tc were positively associated with dyslipidemia and isomer C18:2n6ct with dysglycemia. C18:2n6tt and C18:2n6ct were inversely associated with submaximal aerobic capacity at baseline/12 months. No significant association was found between TFA and cardiac function. Conclusions In HFpEF patients, higher blood levels of IP-TFA, but not naturally occurring TFA, were associated with dyslipidemia, dysglycemia and lower functional capacity. Blood TFAs, in particular C16:1n-7t, warrant further investigation as prognostic markers in HFpEF. Graphical abstract: Higher blood levels of industrially processed TFA, but not of the naturally occurring TFA C16:1n-7t, are associated with a higher-risk cardiometabolic phenotype and prognostic of lower aerobic capacity in patients with HFpEF.
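The associations above rest on Spearman rank correlations plus linear regressions with age and sex as covariates. Below is a minimal, non-authoritative sketch of that kind of analysis on simulated data; the variable names (tfa, triglycerides, age, sex) and all values are illustrative assumptions, not the Aldo-DHF data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n = 200

# Simulated data: blood level of one TFA isomer (as % of total fatty acids),
# a metabolic marker (here labelled "triglycerides"), and age/sex covariates.
df = pd.DataFrame({
    "tfa": rng.normal(0.4, 0.1, n),
    "age": rng.normal(67, 8, n),
    "sex": rng.integers(0, 2, n),   # 0 = male, 1 = female
})
df["triglycerides"] = 1.2 + 1.5 * df["tfa"] + rng.normal(0, 0.3, n)

# Unadjusted association: Spearman rank correlation
rho, p = spearmanr(df["tfa"], df["triglycerides"])
print(f"Spearman rho = {rho:.2f}, p = {p:.3g}")

# Adjusted association: linear regression with age and sex as covariates
X = sm.add_constant(df[["tfa", "age", "sex"]])
fit = sm.OLS(df["triglycerides"], X).fit()
print(fit.params["tfa"], fit.pvalues["tfa"])
```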
... They, like other blood fatty acid levels such as EPA and DHA levels [21], reliably reflect cardiac and other tissue TFA levels over approximately the preceding three months [22,23]. We hypothesized that blood IP-TFA would directly correlate with cardiovascular risk factors, aerobic capacity and left ventricular diastolic function (LVDF), and neurohumoral activation in patients with HFpEF. ...
... food frequency questionnaires) used in nutritional epidemiology which may not accurately reflect an individual's consumption of nutrients due to limitations such as measurement error, recall bias, selective reporting, and incompleteness of food composition databases [22]. Our analysis is not without limitations. First, our analysis does not offer information on the prognostic impact of blood TFA levels in patients with HFpEF. ...
Preprint
Full-text available
Background: Industrially processed trans-fatty acids (IP-TFA) have been linked to altered lipoprotein metabolism, inflammation and increased NT-proBNP. In patients with heart failure with preserved ejection fraction (HFpEF), associations of TFA blood levels with patient characteristics are unknown. Methods: This is a secondary analysis of the Aldo-DHF-RCT. From 422 patients, individual blood TFA were analyzed at baseline in n=404 using the HS-Omega-3-Index® methodology. Patient characteristics were: 67±8 years, 53% female, NYHA II/III (87/13%), ejection fraction ≥50%, E/e′ 7.1±1.5; NT-proBNP 158 ng/L (IQR 82-298). A principal component analysis was conducted but not used for further analysis as cumulative variance for the first two PCs was low. Spearman's correlation coefficients as well as linear regression analyses, using sex and age as covariates, were used to describe associations of whole blood TFA with metabolic phenotype, functional capacity, echocardiographic markers for LVDF, and neurohumoral activation at baseline and after 12 months. Results: Blood levels of the naturally occurring TFA C16:1n-7t were inversely associated with dyslipidemia, body mass index/truncal adiposity, surrogate markers for non-alcoholic fatty liver disease and inflammation at baseline/12 months. Conversely, IP-TFA C18:1n9t, C18:2n6tt and C18:2n6tc were positively associated with dyslipidemia and isomer C18:2n6ct with dysglycemia. C18:2n6tt and C18:2n6ct were inversely associated with submaximal aerobic capacity at baseline/12 months. No significant association was found between TFA and cardiac function. Conclusions: In HFpEF patients, higher blood levels of IP-TFA, but not naturally occurring TFA, were associated with dyslipidemia, dysglycemia and lower functional capacity. Blood TFAs, in particular C16:1n-7t, warrant further investigation as prognostic markers in HFpEF.
... Second, while we employed an objective measure to assess the main outcome (i.e., salivary cortisol), other assessment tools relied solely on participants' self-report. As demonstrated in nutrition studies, self-reported measures produce a number of (intentional and non-intentional) distorting factors, such as misreporting and social desirability (Archer & Blair, 2015; Archer et al., 2013; Archer, Lavie, et al., 2018; Archer, Marlow, et al., 2018a, 2018b). Additionally, the internal consistency for the SOQ was lower than in previous studies (e.g., Miner-Rubino et al., 2002). ...
Article
Objectification theory and the psychological ramifications of self-objectification are well-established. For example, internalization of the third-person perspective (i.e., self-objectification) can lead to habitual body surveillance and psychological consequences, which in turn can result in serious mental health disorders such as anxiety and depression. However, relatively few studies examined psychophysiological responses to an objectified environment as a function of biological sex. To close this gap in the current body image literature, we explored the effects of increased state self-objectification on stress levels assessed via salivary cortisol in a single-blind experimental study of 159 undergraduate male and female students. State self-objectification was manipulated by participants wearing tight, revealing exercise clothing or baggy exercise attire. Our results showed that independent of time and biological sex, the experimental group experienced greater stress levels, as indicated by significantly higher cortisol than the control group. The interaction effects among time, conditions, and biological sex, as well as main effects for time, were not statistically significant. While our results are partially consistent with prior research examining the effects of an objectified environment on heart rate, they are inconsistent with the findings on gender differences in heart rate.
... When examining relations between diet and mental health, it is important to note that nutrition science suffers from severe conceptual flaws and methodologic issues that challenge its utility and validity (Archer et al., 2013, 2016, 2018b, 2018c, 2018d). Specifically, the use of memory-based dietary assessment methods such as food frequency questionnaires and dietary recalls led to "physiologically implausible", "pseudo-quantified", and "non-falsifiable" data. ...
Article
Background Over the past two decades, there has been an increase in the prevalence of psychological conditions, such as depression, anxiety, disordered eating, and body image disturbances. In concert with this trend, there was a substantial rise in the advocacy and practice of restrictive dietary patterns, such as veganism and vegetarianism. These parallel developments suggest a relation between diet and mental health, but to date, research has failed to offer clear answers on whether these associations are causal, coincidental, or more complex than superficial analyses suggest. Aim Given this context, the purpose of this commentary is to offer a consilient perspective on the role of vegan and vegetarian diets in mental health. Methods We performed a broad qualitative synthesis of the current literature on diet and mental health from sociologic and psychologic perspectives. Results Several empirically supported hypotheses were presented with equivocal support. Conclusion The current evidence suggests that if a nutritionally adequate diet is consumed, the avoidance/consumption of meat and other animal foods will have no significant effects on physical and mental health.
... Moreover, this overestimation can be related to changes in athletes' appetites on different days due to variations in their training programs, such as during competitions or during the training phase. Another possible reason for this overestimation is the number of food items, which makes it difficult for participants to recall their exact yearly consumption while filling in the FFQ with respect to the 24h DRs [48,59]; Pinto et al. (2010) proposed that this overestimation may be due to difficulties in accurately estimating the portion size consumed [60,61]. Consequently, the Bland Altman analyses were performed in order to demonstrate the agreement between the FFQ and the 24h DRs by showing the difference between the two methods against their averages. ...
Article
Full-text available
Background and objective Nutrition is a basic need for athletes; thus, adequate dietary intake is crucial for maintaining overall health, facilitating training adaptations and boosting athletic performance. Accurate dietary assessment tools are required to minimize the challenges faced by athletes. This study verifies the validity and reproducibility of a 157-item semi-quantitative food frequency questionnaire (FFQ) among Lebanese athletes. This is the only Arabic questionnaire in Lebanon that estimates food consumption for athletes, which can also be used in Arabic-speaking countries. There has been no previous validated food frequency questionnaire that estimated food consumption for athletes in Lebanon. Methods A total of 194 athletes were included in the study to assess the validity of the food frequency questionnaire against four days of dietary recalls by comparing the total nutrient intake values from the food frequency questionnaire with the mean values of four 24-hour dietary recalls using Spearman correlation coefficients and Bland Altman plots. In order to measure the reproducibility, the intra-class correlation coefficients were calculated by repeating the same food frequency questionnaire after one month. Results The intra-class correlation coefficient between the two food frequency questionnaires ranged from average (0.739 for carbohydrates) to good (0.870 for energy (kcal)) to excellent (0.919 for proteins) concerning macronutrients, and ranged from average (0.688 for vitamin D) to excellent (0.952 for vitamin B12), indicating an acceptable reproducibility. Spearman's correlation coefficients of dietary intake estimates from the food frequency questionnaire and the four dietary recalls varied between 0.304 for sodium, 0.469 for magnesium to 0.953 for caloric intake (kcal). Bland-Altman plots illustrated a percentage of agreement ranging between 94.3% for fats to 96.4% for proteins. Conclusion This food frequency questionnaire has acceptable validity and reproducibility to evaluate dietary intake and is an appropriate tool for future interventions to ensure the adoption of adequate eating strategies by athletes.
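The agreement statistics above come from Bland-Altman analysis. Below is a minimal sketch of how the bias, the limits of agreement, and the percentage of points falling inside those limits can be computed and plotted; the FFQ and recall energy intakes are hypothetical values, not the study's data.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical energy intakes (kcal/day): FFQ estimate vs. the mean of four
# 24-hour dietary recalls for the same athletes (illustrative values only).
ffq = np.array([2500, 3100, 2800, 2200, 3400, 2950, 2600, 3050])
recalls = np.array([2350, 2900, 2750, 2300, 3100, 2800, 2500, 2900])

diff = ffq - recalls
mean = (ffq + recalls) / 2
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)                 # half-width of the limits of agreement
pct_within = np.mean(np.abs(diff - bias) <= loa) * 100

print(f"bias = {bias:.0f} kcal/day, limits of agreement = [{bias - loa:.0f}, {bias + loa:.0f}]")
print(f"{pct_within:.1f}% of points lie within the limits of agreement")

# Bland-Altman plot: differences against means, with bias and limits of agreement
plt.scatter(mean, diff)
for y in (bias, bias - loa, bias + loa):
    plt.axhline(y, linestyle="--")
plt.xlabel("Mean of FFQ and 24 h recalls (kcal/day)")
plt.ylabel("FFQ minus 24 h recalls (kcal/day)")
plt.savefig("bland_altman.png")
```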
... Considering that one of the age-related physiological factors contributing to dehydration is a reduced sense of thirst, owing to the diminished responsiveness of the osmoreceptors and the decrease in angiotensin I levels [49], these results should be interpreted with caution. Moreover, it is important to note that these determinations were made using a subjective methodology that may be influenced by patients' memory, which may be altered due to their age, as previously reported by Archer et al. [50]. In any case, the prevalence of negative water balance in the study sample was 46.5%, slightly higher than that obtained in previous studies [10,12,13]. ...
Article
Full-text available
Hydration status plays a key role in healthy ageing, and it is potentially affected by several factors, including drug consumption. However, research on this issue to date is scarce, especially in highly vulnerable groups, such as the elderly. We aimed to study the relationship between hydration status, analysed by means of a validated questionnaire, 24 h urine analysis and body composition assessment, and drug consumption in a sample of older adults. A total of 144 elders were included in the study. Cardiovascular drug consumption was significantly associated with a lower water intake in men (β = −0.282, p = 0.029). Moreover, urinary analysis revealed that total drug intake as well as the consumption of diuretics and cardiovascular drugs were associated with poorer hydration status, whereas genito-urinary drugs were associated with an opposite effect, and these results were confirmed in terms of body composition. Hence, total drug consumption (β = −0.205), diuretic (β = −0.408), cardiovascular (β = −0.297), and genito-urinary drugs (β = 0.298) were significantly associated (p < 0.05) with total body water. The obtained results confirmed the impact of chronic treatment with certain drugs on hydration status. Nutritional interventions may be of great interest in certain population groups in order to prevent complications due to altered hydration status.
... Dietary intake is a key driver of chronic disease risk, making accurate assessment of dietary intake critically important to public health [1]. Bias and error in dietary self-report have led to much discourse and debate on diet-disease relationships, especially for nutrients of concern like added sugar (AS) [2][3][4][5]. The most widely used dietary assessment methods for human nutrition studies are 24-h dietary recalls and food-frequency questionnaires (FFQs), but limitations associated with these approaches are well documented [6][7][8]. ...
Article
Full-text available
Objective biomarkers of dietary intake are needed to advance nutrition research. The carbon isotope ratio (C¹³/C¹²; CIR) holds promise as an objective biomarker of added sugar (AS) and sugar-sweetened beverage (SSB) intake. This systematic scoping review presents the current evidence on CIRs from human studies. Search results (through April 12, 2024) yielded 6297 studies and 24 final articles. Studies were observational (n = 12), controlled feeding (n = 10), or dietary interventions (n = 2). CIRs were sampled from blood (n = 23), hair (n = 5), breath (n = 2), and/or adipose tissue (n = 1). Most (n = 17) conducted whole tissue (that is, bulk) analysis, 8 used compound specific isotope analysis (CSIA), and/or 2 studies used methods appropriate for analyzing breath. Studies were conducted in 3 concentrated geographic regions of the United States (n = 7 Virginia; n = 5 Arizona; n = 4 Alaska), with only 2 studies conducted in other countries. Studies that used CSIA to examine the CIR from the amino acid alanine (CIR-Ala; n = 4) and CIR analyzed from breath (n = 2) provided the most robust evidence for CIR as an objective biomarker of AS and SSBs (R² range 0.36–0.91). Studies using bulk analysis of hair or blood showed positive, but modest and more variable associations with AS and SSBs (R² range 0.05–0.48). Few studies showed no association, particularly in non-United States populations and those with low AS and SSB intakes. Two studies provided evidence for CIR to detect changes in SSB intake in response to dietary interventions. Overall, the most compelling evidence supports CIR-Ala as an objective indicator of AS intake and breath CIR as an indicator of short-term AS intake. Considering how to adjust for underlying dietary patterns remains an important area of future work and emerging methods using breath and CSIA warrant additional investigation. More evidence is needed to refine the utility and specificity of CIRs to measure AS and SSB intake.
... This is due to the dietary recall in this study being restricted to 1-year and not capturing longer-term dietary information. Furthermore, food frequency questionnaires are prone to recall and memory error, reproducibility problems, and portion size inaccuracies, not providing the precision and level of nutrient detail required for drawing definitive conclusions [47,48]. That said, even if the observed differences in butyrate metabolism are due to long-term dietary habits, this further validates diet as an important determinant of kidney stone disease risk, and raises the possibility that dietary compounds other than oxalate play important roles. ...
Article
Full-text available
Intestinal microbiome dysbiosis is a known risk factor for recurrent kidney stone disease (KSD) with prior data suggesting a role for dysfunctional metabolic pathways other than those directly utilizing oxalate. To identify alternative mechanisms, the current study analyzed differences in the metabolic potential of intestinal microbiomes of patients (n = 17) and live-in controls (n = 17) and determined their relevance to increased risk for KSD using shotgun metagenomic sequencing. We found no differences in the abundance of genes associated with known oxalate degradation pathways, supporting the notion that dysfunction in other metabolic pathways plays a role in KSD. Further analysis showed decreased abundance of key enzymes involved in butyrate biosynthesis in patient intestinal microbiomes. Furthermore, de novo construction of microbial genomes showed that the majority of genes significantly enriched in non-stone formers are affiliated with Faecalibacterium prausnitzii, a major butyrate producer. Specifically pertaining to butyrate metabolism, the majority of abundant genes mapped back to F. prausnitzii, Alistipes spp., and Akkermansia muciniphila. No differences were observed in ascorbate or glyoxylate metabolic pathways. Collectively, these data suggest that impaired bacterial-associated butyrate metabolism may be an oxalate-independent mechanism that contributes to an increased risk for recurrent KSD. This indicates that the role of the intestinal microbiome in recurrent KSD is multi-factorial, which is representative of the highly intertwined metabolic nature of this complex environment. Future bacteria-based treatments must not be restricted to targeting only oxalate metabolism.
... In Morocco, respondents to the national survey were asked to report their total annual food purchases from the previous year (Haut Commissariat au Plan, 2016); in Poland, estimated consumption per household was calculated from monthly purchases and divided by the national average of residents per household (GUS, 2019). These differences are a limitation of our methodology, since FFQ, 24 h food recall, and food records are more precise and trustworthy than household estimations relying on memory-based information (Henríquez-Sánchez et al., 2009; Archer et al., 2018). ...
Article
Full-text available
To solve the rising issue of how to feed our planet in the future, we need to enhance our knowledge of people's current eating patterns and analyze those in terms of their health and environmental impacts. Current studies about adherence to existing national and global dietary recommendations often lack the ability to cross-compare the results among countries. Therefore, this study aims to develop a methodology to evaluate adherence to food-based dietary guidelines (FBDGs) and the Planetary Health Diet (PHD) on a national level, which can be replicable in different countries. First, national dietary intake data was collected from surveys published by the respective responsible public institutions from five countries (Italy, Denmark, Germany, Morocco, and Poland). Second, food groups represented in the intake data and the FBDGs were mapped to establish a proposal for a new common grouping (i.e., comprehensive food groups) that enables cross-country comparison. Third, dietary intake was compared to the recommendations according to national FBDG and the PHD. The adherence to the recommended diets was assessed using an adapted version of the German Food Pyramid Index. Our results show that different ways of grouping foods may change adherence levels; when measuring adherence to the FBDGs with the food groups suggested in the FBDGs, average scores (45.5 ± 5.4) were lower than by using comprehensive food groups (46.9 ± 3.7). Higher adherence to the PHD (52.4 ± 6.1) was also found using the comprehensive food groups. In particular, grouping meats, eggs, and legumes into one group (i.e., protein equivalents) appears to influence the outcome of scores using the comprehensive food groups. This study developed a methodology to evaluate national dietary intake against national FBDGs and the PHD. Our study points out the fact that it is difficult to overcome the challenge that countries have different food grouping clusters. Yet, the combination of the methods developed enables cross-country comparisons and has the potential to be applied to different national settings globally.
... At a group level, people may be asked to report their consumption to inform nutritional guidelines and public policy; this information is also used to understand the association between diet and disease (e.g., Afshin et al., 2017;Forouzanfar et al., 2015). These reports of dietary consumption rely on people's memories, which has led to criticisms about the validity of the information reported because they are not direct measures of intake and there is no way to determine whether the reported consumption matches people's actual consumption (e.g., Archer et al., 2018; although see Martin-Calvo & Martinez-González, 2018). In the current study, we considered dietary consumption to be a repeated event because meals are similar types of experiences that involve the same actions, and often occur in the same locations with the same people (Dilevski, Paterson, Walker, & van Golde, 2021). ...
Article
Self‐reported dietary intake is commonly used to inform policy; however, memory‐based reports are subject to error. Our aim was to examine dietary reporting errors using a repeated‐events framework. Participants ( N = 102) completed a 3‐day food diary and 10 days later recalled what they had consumed on one self‐nominated day and one experimenter‐nominated day from the diary period. Self‐nominated day reports were more accurate than experimenter‐nominated day reports. Across both days, participants made more errors by reporting a food from the wrong day than by reporting foods not recorded in the diary at all. Unexpectedly, participants who completed their food‐diary across Sunday–Monday–Tuesday were more accurate than those who completed across Thursday–Friday–Saturday, and participants who completed the study in 2020 were more accurate than those who completed it in 2021/2. Overall, results are consistent with the repeated events literature and outline a new approach to better understand dietary self‐reporting.
... The study only focused on Gwanda, a rural based district, where consumption of edible insects tends to be higher than in urban areas [16]. One of the weaknesses of interviews is that participants tend to give socially acceptable responses [29]. Dietary intake assessment methods are based on respondent memory, and are therefore prone to recall bias [30]. ...
Article
Edible insects are nutritious with potential to improve nutritional outcomes and livelihoods in low-income countries. However, it is not clear whether consumption of edible insects is positively correlated with improved dietary diversity and food security indicators. Therefore, this cross-sectional study was designed to investigate the relationship between consumption of edible insects and diet diversity and food security indicators among children and adults from Gwanda district, Matabeleland province in Southern Zimbabwe. The survey collected data on the following: household sociodemographic characteristics, household dietary diversity score (HDDS), food consumption score (FCS), and child dietary diversity score (CDDS). Logistic regression was used to examine the associations between edible insect consumption and food security indicators. A total of 303 households were surveyed. A high proportion were edible insect consumers (80.9%) and the rest non-consumers (19.1%). The most consumed insect was mopani worms (Gonimbrasia belina, madora, amacimbi) (74.8%). The consumption of mopani worms was highest in the age group 20-49 years (34.4%) and significantly associated with being married and age of the household head. There was no difference between the mean CDDS for consumers (5.9±1.7) and for non-consumers (6.0±2.0) (p=0.802). The median (IQR) FCS for consumers was lower at 49 (35, 65) than for non-consumers at 53 (36.5, 64). This difference was not statistically significant (p=0.526). There was also no difference between the average HDDS for consuming households (6.2±1.7) and for non-consuming households (6.2±1.5) (p=0.866). There was no significant association between consumption of edible insects and CDDS (p=0.802), HDDS (p=0.866), and FCS (p=0.585). In conclusion, this study showed that Gonimbrasia belina (mopani worms, madora, amacimbi) was the most commonly consumed insect, mostly eaten as relish due to its palatable taste. Overall, the consumption of edible insects did not seem to improve diet diversity and food security indicators in this setting. National level studies with bigger sample sizes that investigate the contribution of edible insects to overall nutrient intake and dietary diversity are required. Furthermore, interventions to promote the consumption of edible insects, including their commercialization, should adopt a social ecological approach to maximise impact. Key words: Entomophagy, food security, mopani worms, stunting, gender, Zimbabwe
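The associations above were assessed with logistic regression linking edible insect consumption to food security indicators. Below is a hedged sketch of one such model on simulated household data; the variables (consumer, hdds, age_head) and every value are illustrative assumptions, not the survey data, and the actual model specification may have differed.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 300

# Simulated household data (not the survey data): edible insect consumption
# (1 = consumer, 0 = non-consumer), household dietary diversity score (HDDS),
# and age of the household head as an additional covariate.
df = pd.DataFrame({
    "consumer": rng.integers(0, 2, n),
    "hdds": rng.integers(3, 10, n),
    "age_head": rng.normal(45, 12, n),
})

# Logistic regression of consumption on HDDS, adjusting for age of household head
X = sm.add_constant(df[["hdds", "age_head"]])
fit = sm.Logit(df["consumer"], X).fit(disp=0)
print(np.exp(fit.params))   # odds ratios
print(fit.pvalues)
```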
... Absent associations of global EF indicators, as measured by the self- and parent-versions of the BRIEF, with EAH outcomes contrast with findings in 9-year-old children of significant associations between BRIEF self-report data and snack food intake based on a Food Frequency Questionnaire (FFQ; Riggs et al., 2010; Tate et al., 2015). However, because FFQs do not measure EAH, but rather capture memories of food intake with limited validity (Archer, Marlow, & Lavie, 2018), the results are not fully comparable. ...
Article
Eating in the absence of hunger (EAH) is one of the key behavioral features of binge-eating disorder (BED) in youth. Although preliminary evidence revealed that adolescent BED co-occurs with deficits in executive functions (EFs), it is unclear whether EFs are related to EAH. Thus, this study experimentally examined whether deficits in EFs predict EAH in adolescents with and without BED. Adolescents (12-20 years) with BED (n = 28) and age-, sex-, and weight-matched controls (n = 28) underwent an EAH paradigm in the laboratory, where they were offered snacks ad libitum after having established satiety during a lunch meal. Cognitive interference, cognitive flexibility, decision making, and EFs in daily life were assessed by neuropsychological tests and self-report. The BED group showed a significantly higher food intake in gram during the EAH trial than controls with medium effect, but no significant group differences in EFs emerged. Dysfunctional decision making in terms of risky decision making, but no other EFs, predicted increased EAH (g, kcal) in the total sample. Although increases in risky decision making over adolescence are well known, this study uniquely revealed that general decision-making abilities driven by short-term reward may account for disinhibited eating behavior. Interventions targeting decision making with focus on reward sensitivity should be evaluated for their efficacy in preventing and reducing disinhibited eating behavior in adolescents.
... Diet is hard to measure. Tools such as food frequency questionnaires or 24 h dietary recall instruments are commonly used to assess habitual dietary intakes. Despite efforts towards validating these tools and their ability to produce credible estimates of diet-disease associations, critics have called for them to be abandoned, considering them flawed because of their reliance on memory and cognition and issues of bias and measurement error [63,64]. Suggestions for suitable alternatives are sparse, however. ...
Article
Full-text available
Nutrition therapy has been emphasised for decades for people with type 2 diabetes, and the vital importance of diet and nutrition is now also recognised for type 2 diabetes prevention. However, the complexity of diet and mixed messages on what is unhealthy, healthy or optimal have led to confusion among people with diabetes and their physicians as well as the general public. What should people eat for the prevention, management and remission of type 2 diabetes? Recently, progress has been made in research evidence that has advanced our understanding in several areas of past uncertainty. This article examines some of these issues, focusing on the role of diet in weight management and in the prevention and management of type 2 diabetes. It considers nutritional strategies including low-energy, low-fat and low-carbohydrate diets, discusses inter-relationships between nutrients, foods and dietary patterns, and examines aspects of quantity and quality together with new developments, challenges and future directions. Graphical abstract
... There may be recall bias in the use of food diaries for data collection [66]. Confounding factors include age, co-morbidities, psycho-social factors, indications for the BD, mixed feeding methods (varying degrees of BD and CF) and the impact of multidisciplinary input, and studies acknowledge these. The studies involved many patients that were medically complex, which could bias the results against clinical benefit of the BD. ...
Article
Background: Interest and use of blended diets (BD) for young people who are tube fed has significantly increased in the last decade, driven primarily by the desires of motivated caregivers. This review identified, appraised, and synthesised the available evidence on the benefits and complications of BD versus commercial feeds. Methods: A systematic review following PRISMA guidance and registered with PROSPERO was conducted across PubMed, Embase, CINAHL, Scopus and Cochrane up to August 2022. Inclusion criteria: English language studies including (1) children, (2) original research (interventional and observational) and (3) examination of BD outcomes. Exclusion criteria were (1) unoriginal research or case reports, (2) focus on feeding management, preparations, or attitudes and (3) comparing commercial blends only. Data were synthesised using the narrative synthesis approach established by Popay et al., using the Mixed Methods Appraisal Tool. Results: 806 database results were identified and 61 were sought for retrieval. Full text article review revealed 7 eligible studies, involving 267 participants (age range 9 months - 26 years). Studies reported differences in GI symptoms (n=222), medication use (n=119), growth (n=189), and complications or adverse events (n=91). The results point towards positive outcomes, particularly in gastrointestinal symptom control, with few reports of mild adverse events in the included studies. Conclusion: There is a paucity of data in this area and much heterogeneity in included studies, but the available literature points towards positive outcomes. This is an important and highly relevant topic and more primary research, ideally using standardised reporting, is required to answer the key questions.
... fear of aversive consequences as a driver for their disordered eating [16][17][18], and this, rather than other ARFID presentations (i.e., sensory sensitivity to food characteristics, low appetite/lack of interest in eating), was most common among our patients. This study had several limitations related to its retrospective chart design, and specifically reliance on free-text documentation of diet in notes. Patient self-report of diet can be inaccurate [67], and we were not able to evaluate important factors around exclusion diet use (e.g., the duration of diet, adherence to diet, and motivation for trialing dietary therapy). Exclusion diets in pediatric and adult populations differ because of parental involvement and ...
Article
Full-text available
Background Exclusion diets for gastrointestinal symptom management have been hypothesized to be a risk factor for avoidant/restrictive food intake disorder (ARFID; a non‐body image‐based eating disorder). In a retrospective study of pediatric and adult neurogastroenterology patients, we aimed to (1) identify the prevalence and characteristics of an exclusion diet history and (2) evaluate if an exclusion diet history was concurrently associated with the presence of ARFID symptoms. Methods We conducted a chart review of 539 consecutive referrals (ages 6–90, 69% female) to adult (n = 410; January–December 2016) and pediatric (n = 129; January 2016–December 2018) neurogastroenterology clinics. Masked coders (n = 4) retrospectively applied DSM‐5 criteria for ARFID and a separate coder assessed documentation of exclusion diet history. We excluded patients with no documentation of diet in the chart (n = 35) or who were not orally fed (n = 9). Results Of 495 patients included, 194 (39%) had an exclusion diet history, and 118 (24%) had symptoms of ARFID. Of reported diets, dairy‐free was the most frequent (45%), followed by gluten‐free (36%). Where documented, exclusion diets were self‐initiated by patients/parents in 66% of cases, and recommended by gastroenterology providers in 30%. Exclusion diet history was significantly associated with the presence of ARFID symptoms (OR = 3.12[95% CI 1.92–5.14], p < 0.001). Conclusions History of following an exclusion diet was common and was most often patient‐initiated among pediatric and adult neurogastroenterology patients. As patients with self‐reported exclusion diet history were over three times as likely to have ARFID symptoms, providers should be cognizant of this potential association when considering dietary interventions.
... Although a large number of potential confounders, such as cancer type, study design, and region of the study, were adjusted for in most studies, we cannot exclude that some other dietary biologically active components may partly or wholly affect the association. Second, recall bias associated with the assessment methods of chili pepper exposure should be considered because FFQ or frequency-reported questionnaires are subject to measurement errors, which can attenuate or overestimate the observed association (55). Additional limitations relate to the different cooking and processing methods for chili pepper. ...
Article
Full-text available
Background Stimulating food is emerging as an important modifiable factor in the development of gastrointestinal (GI) tract cancers, but the association between chili pepper consumption and the risk of GI cancers is unclear. We aimed to evaluate the direction and magnitude of the association between chili pepper consumption and the risk of GI cancers. Methods A literature search was performed in PubMed, Embase, and Web of Science databases from inception to 22 December 2021. Observational studies reporting the association between chili pepper consumption and the risk of gastric cancer (GC), esophageal cancer (EC), and/or colorectal cancer (CRC) in adults were eligible for inclusion. Data extraction and quality assessment were conducted independently by two reviewers for the included literature. Summary odds ratios (ORs) and 95% confidence intervals (CIs) were calculated using a random-effects model. Subgroup analyses were also performed based on the cancer type, study design, region of the study, study quality, and adjustments. Results A total of 11,421 studies were screened, and 14 case-control studies were included involving 5009 GI cancers among 11,310 participants. The summary OR showed that high consumption of chili pepper was positively related to the risk of GI cancers (OR = 1.64; 95% CI: 1.00–2.70). A stronger positive relationship was observed between chili pepper consumption and EC risk (OR = 2.71; 95% CI: 1.54–4.75), but there was no statistically significant association between GC and CRC risk. In analyses stratified by geographical location, a positive association was found between chili pepper consumption and the risk of GI cancers in Asian studies (OR = 2.50; 95% CI: 1.23–5.08), African studies (OR = 1.62; 95% CI: 1.04–2.52), and North American studies (OR = 2.61; 95% CI: 1.34–5.08), but an inverse association was seen in South American studies (OR = 0.50; 95% CI: 0.29–0.87) and European studies (OR = 0.30; 95% CI: 0.15–0.61). Conclusion This meta-analysis suggests that chili pepper is a risk factor for certain GI cancers (e.g., EC). Geographical regions influence the risk of GI cancers, especially in Asian, African, and North American populations, which require more attention during dietary guidance. Systematic review registration [https://www.crd.york.ac.uk/PROSPERO/], identifier [CRD42022320670].
... FFQs and other forms of memory-based dietary assessment methods are useful tools in epidemiological studies to understand subjects' dietary intake [49]. Even though the limitations of these assessment methods are acknowledged, SQFFQs remain to this day the most widely used dietary assessment method to study dietary patterns. ...
Article
Full-text available
This study aims to evaluate the reproducibility and validity of a semi-quantitative food frequency questionnaire (SQFFQ) developed for vegetarians and omnivores in Harbin, China. Participants (36 vegetarians and 64 omnivores) completed the SQFFQ at baseline (SQFFQ1) and six months later (SQFFQ2) to assess the reproducibility. The 24 h recalls (24 HRs) for three consecutive days were completed between the administrations of the two SQFFQs to determine the validity. For reproducibility, Pearson correlation coefficients between SQFFQ1 and SQFFQ2 for vegetarians and omnivores were 0.45~0.88 and 0.44~0.84, respectively. For validity, unadjusted Pearson correlation coefficients were 0.46~0.83 with an average of 0.63 and 0.43~0.86 with an average of 0.61, respectively; energy-adjusted Pearson correlation coefficients were 0.43~0.82 with an average of 0.61 and 0.40~0.85 with an average of 0.59, respectively. The majority of the correlation coefficients for food groups and macronutrients decreased or remained unchanged after energy adjustment. Furthermore, all correlations were statistically significant (p < 0.05). Bland–Altman plots also showed reasonably acceptable agreement between the two methods. In conclusion, the SQFFQ developed in this study has reasonably acceptable reproducibility and validity.
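The abstract reports both crude and energy-adjusted correlation coefficients. One common way to energy-adjust nutrient intakes is the residual method (regressing each nutrient on total energy and correlating the residuals); whether this exact approach was used in the study above is an assumption on my part. A minimal sketch on simulated data:

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
n = 100

# Simulated intakes from an SQFFQ and from 24 h recalls (illustrative only)
energy_ffq = rng.normal(2200, 400, n)
protein_ffq = 0.04 * energy_ffq + rng.normal(0, 8, n)
energy_24hr = energy_ffq + rng.normal(0, 250, n)
protein_24hr = 0.04 * energy_24hr + rng.normal(0, 8, n)

def energy_adjust(nutrient, energy):
    """Residual-method energy adjustment: regress the nutrient on total energy
    and keep the residuals (plus the mean, so the scale stays interpretable)."""
    fit = sm.OLS(nutrient, sm.add_constant(energy)).fit()
    return fit.resid + nutrient.mean()

r_crude, _ = pearsonr(protein_ffq, protein_24hr)
r_adj, _ = pearsonr(energy_adjust(protein_ffq, energy_ffq),
                    energy_adjust(protein_24hr, energy_24hr))
print(f"crude r = {r_crude:.2f}, energy-adjusted r = {r_adj:.2f}")
```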
... Advantages of FFQs include scalability, ease of administration, and opportunity for serial measurement. Disadvantages include concerns about reproducibility, systematic errors and biases associated with self-reported, memory-based measurements, the inability to verify or falsify the data (161) or to obtain objective infant brain measurements (61), and data processing assumptions. ...
Article
Full-text available
Multimodal brain magnetic resonance imaging (MRI) can provide biomarkers of early influences on neurodevelopment such as nutrition, environmental and genetic factors. As the exposure to early influences can be separated from neurodevelopmental outcomes by many months or years, MRI markers can serve as an important intermediate outcome in multivariate analyses of neurodevelopmental determinants. Key to the success of such work are recent advances in data science as well as the growth of relevant data resources. Multimodal MRI assessment of neurodevelopment can be supplemented with other biomarkers of neurodevelopment such as electroencephalograms, magnetoencephalogram, and non-imaging biomarkers. This review focuses on how maternal nutrition impacts infant brain development, with three purposes: (1) to summarize the current knowledge about how nutrition in stages of pregnancy and breastfeeding impact infant brain development; (2) to discuss multimodal MRI and other measures of early neurodevelopment; and (3) to discuss potential opportunities for data science and artificial intelligence to advance precision nutrition. We hope this review can facilitate the collaborative march toward precision nutrition during pregnancy and the first year of life.
... Previous meta-analyses did not consider the differences between prescribed and actual carbohydrate intakes, and only performed a sensitivity analysis restricting the analyses to participants with high adherence to the prescribed diets (8). Although self-reported dietary intakes are subject to measurement error, especially in trials wherein participants are not blinded (105,106), they can present more accurate information about the amounts of carbohydrate intake in the trials than can prescribed data (40). We converted g/d to % calorie, and thereby harmonized the data across trials. ...
Article
Full-text available
Background Carbohydrate restriction is effective for type 2 diabetes management. Objectives We aimed to evaluate the dose-dependent effect of carbohydrate restriction in patients with type 2 diabetes. Methods We systematically searched PubMed, Scopus, and Web of Science to May 2021 for randomized controlled trials evaluating the effect of a carbohydrate-restricted diet (≤45% total calories) in patients with type 2 diabetes. The primary outcome was glycated hemoglobin (HbA1c). Secondary outcomes included fasting plasma glucose (FPG); body weight; serum total, LDL, and HDL cholesterol; triglyceride (TG); and systolic blood pressure (SBP). We performed random-effects dose-response meta-analyses to estimate mean differences (MDs) for a 10% decrease in carbohydrate intake. Results Fifty trials with 4291 patients were identified. At 6 months, compared with a carbohydrate intake between 55%–65% and through a maximum reduction down to 10%, each 10% reduction in carbohydrate intake reduced HbA1c (MD, −0.20%; 95% CI, −0.27% to −0.13%), FPG (MD, −0.34 mmol/L; 95% CI, −0.56 to −0.12 mmol/L), and body weight (MD, −1.44 kg; 95% CI, −1.82 to −1.06 kg). There were also reductions in total cholesterol, LDL cholesterol, TG, and SBP. Levels of HbA1c, FPG, body weight, TG, and SBP decreased linearly with the decrease in carbohydrate intake from 65% to 10%. A U-shaped effect was seen for total cholesterol and LDL cholesterol, with the greatest reduction at 40%. At 12 months, a linear reduction was seen for HbA1c and TG. A U-shaped effect was seen for body weight, with the greatest reduction at 35%. Conclusions Carbohydrate restriction can exert a significant and important reduction on levels of cardiometabolic risk factors in patients with type 2 diabetes. Levels of most cardiometabolic outcomes decreased linearly with the decrease in carbohydrate intake. U-shaped effects were seen for total cholesterol and LDL cholesterol at 6 months and for body weight at 12 months.
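The citing excerpt above notes that carbohydrate intakes reported in grams per day were converted to a percentage of energy to harmonize the trials. A minimal sketch of that conversion, assuming the standard Atwater factor of 4 kcal per gram of carbohydrate and invented example values, is given below.

```python
# Illustrative sketch of the grams-per-day to percent-of-energy harmonization
# mentioned in the citing excerpt, using the standard Atwater factor of
# 4 kcal per gram of carbohydrate. The example values are assumptions.
def carb_percent_of_energy(carb_g_per_day: float, total_kcal_per_day: float) -> float:
    """Carbohydrate intake expressed as a percentage of total energy intake."""
    CARB_KCAL_PER_G = 4.0
    return carb_g_per_day * CARB_KCAL_PER_G / total_kcal_per_day * 100.0

# e.g. 150 g/d of carbohydrate in a 2000 kcal/d diet is 30% of energy
print(f"{carb_percent_of_energy(150, 2000):.0f}% of energy")
```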
... Although controversy exists in the literature regarding the accuracy and validity of self-reported dietary intake as estimated by food frequency questionnaires, 24-h dietary interviews, and dietary records [46,47], the first limitation of our study is the lack of reporting of the athletes' dietary data. Further, we did not measure stool metabolites and therefore were unable to confirm the effects of any significant increase in intestinal SCFA producers at the end of the training period. ...
Article
Full-text available
Background Physical exercise has favorable effects on the structure of gut microbiota and metabolite production in sedentary subjects. However, little is known about whether adjustments to an athletic program impact the overall gut microbiome in high-level athletes. We therefore characterized fecal microbiota and serum metabolites in response to a 7-week, high-intensity training program and consumption of probiotic Bryndza cheese. Methods Fecal and blood samples and training logs were collected from young competitive male (n = 17) and female (n = 7) swimmers. Fecal microbiota were characterized using specific primers targeting the V1–V3 region of 16S rDNA, and serum metabolites were characterized by NMR-spectroscopic analysis and by multivariate statistical analysis, Spearman rank correlations, and Random Forest models. Results We found higher α-diversity, represented by the Shannon index value (HITB-pre 5.9 [± 0.4]; HITB-post 6.4 [± 0.4], p = 0.007; HIT-pre 5.5 [± 0.6]; HIT-post 5.9 [± 0.6], p = 0.015), after the end of the training program in both groups, independently of Bryndza cheese consumption. However, Lactococcus spp. increased in both groups, with a larger effect in the Bryndza cheese consumers (HITB-pre 0.0021 [± 0.0055]; HITB-post 0.0268 [± 0.0542], p = 0.008; HIT-pre 0.0014 [± 0.0036]; HIT-post 0.0068 [± 0.0095], p = 0.046). Concomitant with the increase in high-intensity exercise and the resulting increase in the proportion of anaerobic metabolism, pyruvate (p[HITB] = 0.003; p[HIT] = 0.000) and lactate (p[HITB] = 0.000; p[HIT] = 0.030) increased, whereas acetate (p[HITB] = 0.000; p[HIT] = 0.002) and butyrate (p[HITB] = 0.091; p[HIT] = 0.019) decreased. Conclusions Together, these data demonstrate a significant effect of high-intensity training (HIT) on both gut microbiota composition and serum energy metabolites. Thus, combining intensive athletic training with the use of natural probiotics is beneficial because of the increase in the relative abundance of lactic acid bacteria.
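The α-diversity comparison above rests on the Shannon index. As a minimal illustration, the sketch below computes the natural-log Shannon index from a made-up vector of taxon counts; it is not the study's pipeline.

```python
# Minimal sketch of the Shannon diversity index (H') used above, computed from
# relative abundances in a single sample; the taxon counts are invented.
import numpy as np

counts = np.array([120., 80., 40., 25., 10., 5.])  # reads per taxon
p = counts / counts.sum()                          # relative abundances
shannon = -np.sum(p * np.log(p))                   # natural-log Shannon index
print(f"Shannon H' = {shannon:.2f}")
```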
... Limitations of this study include attrition of the cohort at the 1-year visit (71% follow-up at 1 year) and that the dietary outcome is based on a self-reported food frequency questionnaire [54]; sources of measurement error in such instruments have been well documented. ...
Article
Full-text available
Background: Understanding the impact of maternal health behaviours and social conditions on childhood nutrition is important to inform strategies to promote health during childhood. Objective: To describe how maternal sociodemographic factors (e.g., socioeconomic status, education), health behaviours (e.g., diet), and traditional health care use during pregnancy impact infant diet at age 1 year. Methods: Data were collected from the Indigenous Birth Cohort (ABC) study, a prospective birth cohort formed in partnership with an Indigenous community-based Birthing Centre in southwestern Ontario, Canada. A total of 110 mother-infant dyads were enrolled between 2012 and 2017. Multiple linear regression analyses were performed to identify factors associated with infant diet scores at age 1 year, with a higher score indicating a diet with more healthy foods. Results: The mean age of women enrolled during pregnancy was 27.3 (5.9) years. Eighty percent of mothers had low or moderate social disadvantage, 47.3% had completed more than high school education, and 70% were cared for by a midwife during their pregnancy. Pre-pregnancy body mass index (BMI) was <25 in 34.5% of women, 15.5% of mothers smoked during pregnancy, and 14.5% had gestational diabetes. Being cared for by an Indigenous midwife was associated with a 0.9-point higher infant diet score (p = 0.001) at age 1 year, and lower maternal social disadvantage was associated with a 0.17-point higher infant diet quality score (p = 0.04). Conclusion: This study highlights the positive impact of health care provision by Indigenous midwives and confirms that higher maternal social advantage has a positive impact on child nutrition.
... However, others have observed similar perceptions of fasting appetite [10], and three-day self-reported energy intake [11] following programs of HIIT compared with MICT, and a recent meta-analysis of the effect of interval training on energy intake revealed no significant differences in energy intake following varying interventions of HIIT or SIT and MICT [12]. Importantly, all but one of the 16 studies included in this analysis relied on self-report measures of food intake, such as food diaries or food frequency questionnaires, which may provide erroneous and/or biased results [13], particularly given that participants in many of the included studies were instructed to maintain their habitual food consumption. The heterogeneity of energy intake assessment, together with the varied interval training protocols studied, suggests that conclusions about the efficacy of interval training protocols to influence appetite and food choices may be premature. ...
Article
Full-text available
An acute bout of sprint interval training (SIT) performed with psychological need-support incorporating autonomy, competence, and relatedness has been shown to attenuate energy intake at the post-exercise meal, but the long-term effects are not known. The aim of this trial was to investigate the effects of 12 weeks of SIT combined with need-support on post-exercise food consumption. Thirty-six physically inactive participants with overweight and obesity (BMI: 29.6 ± 3.8 kg·m−2; V˙O2peak 20.8 ± 4.1 mL·kg−1·min−1) completed three sessions per week of SIT (alternating cycling for 15 s at 170% V˙O2peak and 60 s at 32% V˙O2peak) with need-support or traditional moderate-intensity continuous training (MICT) without need-support (continuous cycling at 60% V˙O2peak). Assessments of appetite, appetite-related hormones, and ad libitum energy intake in response to acute exercise were conducted pre- and post-intervention. Fasting appetite and blood concentrations of active ghrelin, leptin, and insulin did not significantly differ between groups or following the training. Post-exercise energy intake from snacks decreased significantly from pre- (807 ± 550 kJ) to post- SIT (422 ± 468 kJ; p < 0.05) but remained unaltered following MICT. SIT with psychological need-support appears well-tolerated in a physically inactive population with overweight and offers an alternative to traditional exercise prescription where dietary intake is of concern.
... There has been some suggestion that obesity is the result of declining activity levels, but taking into account that heavier individuals require greater amounts of energy to sustain and move their bodyweight, the reduction in activity required to explain the rise in obesity is also too large to be plausible (Millward, 2013). This supports previous authors who question the validity of self-reported EI (Dhurandhar et al., 2015;Archer, Lavie and Hill, 2018;Archer, Marlow and Lavie, 2018). ...
Article
Full-text available
Objective: The aim of this study was to assess the extent of misreporting in obese and nonobese adults on an absolute, ratio-scaled, and allometrically-scaled basis. Method: Self-reported daily energy intake (EI) was compared with total energy expenditure (TEE) in 221 adults (106 male, 115 female; age 53 ± 17 years, stature 1.68 ± 0.09 m, mass 79.8 ± 17.2 kg) who participated in a doubly-labeled water (DLW) subsection of the 2013–2015 National Diet and Nutrition Survey. Data were log transformed and expressed as absolute values, according to simple ratio-standards (per kg body mass), and adjusted for body mass allometrically. Absolute and ratio-scaled misreporting were examined using full-factorial General Linear Models with repeated measures of the natural logarithms of TEE or EI as the within-subjects factor. The natural logarithm of body mass was included as a covariate in the allometric method. The categorical variables of gender, age, obesity, and physical activity level (PAL) were the between-factor variables. Results: On an absolute basis, self-reported EI (2759 ± 590 kcal·d−1) was significantly lower than TEE measured by DLW (2759 ± 590 kcal·d−1; F(1,205) = 598.81, p < .001, ηp² = 0.75). We identified significantly greater underreporting in individuals with an obese BMI (F(1,205) = 29.01, p < .001, ηp² = 0.12), in more active individuals (PAL > 1.75; F(1,205) = 34.15, p < .001, ηp² = 0.14), and in younger individuals (≤55 years; F(1,205) = 14.82, p < .001, ηp² = 0.07), which are all categories with higher energy needs. Ratio-scaling the data reduced the effect sizes. Allometric scaling removed the effect of body mass (F(1,205) = 0.02, p = 0.887, ηp² = 0.00). Conclusion: In weight-stable adults, obese individuals do not underreport dietary intake to a greater extent than nonobese individuals. These results contradict previous research demonstrating that obesity is associated with a greater degree of underreporting.
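A minimal sketch of the allometric adjustment described above: an exponent is estimated from a log-log regression of expenditure on body mass, and intake and expenditure are then scaled by mass raised to that exponent rather than by mass itself. The data and the use of a simple regression (rather than the study's repeated-measures General Linear Models) are illustrative assumptions.

```python
# Illustrative sketch of allometric scaling: estimate the mass exponent b from a
# log-log regression of expenditure on body mass, then express intake and
# expenditure per kg^b instead of per kg. Simple regression on invented data is
# used here in place of the study's repeated-measures General Linear Models.
import numpy as np
from scipy import stats

mass = np.array([62., 71., 80., 95., 110.])            # body mass, kg
tee = np.array([2200., 2450., 2600., 2900., 3100.])    # DLW total energy expenditure, kcal/d
ei = np.array([1900., 2000., 2200., 2250., 2300.])     # self-reported energy intake, kcal/d

b = stats.linregress(np.log(mass), np.log(tee)).slope  # allometric exponent

tee_scaled = tee / mass**b
ei_scaled = ei / mass**b
print(f"b = {b:.2f}; mean scaled misreporting = {np.mean(ei_scaled - tee_scaled):.1f} kcal/d per kg^b")
```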
... As limitations, we assessed dietary intake using dietary surveys (FFQ), a method that possesses limitations [69]. However, we have used this instrument previously, and it has been validated in this population with the focus on dietary sources of fat and fatty acid intake [31,37,41]. ...
Article
Full-text available
Obesity during pregnancy is a worrying public health problem worldwide. Maternal diet is critical for fatty acid (FA) placental transport and FA content in breast milk (BM). We evaluated FA composition in erythrocyte phospholipids (EP) and BM in pregnant women with (OBE, n = 30) and without (non-OBE, n = 31) obesity. Sixty-one healthy women were evaluated at their 20–24th gestational week and followed until the 6th month of lactation. Diet was evaluated through a food frequency questionnaire. FA composition of EP and BM was assessed by gas-liquid chromatography. The OBE group showed lower diet quality, but total n-6 and n-3 polyunsaturated FA (PUFA), ALA, EPA, and DHA dietary intakes were similar between groups. N-3 PUFA, ALA, DHA, and the n-6/n-3 PUFA ratio in EP were lower at the 6th lactation month in the OBE group. In BM, the arachidonic acid (AA) concentration was lower at the end of lactation, and DHA content showed an earlier and constant decline in the OBE group compared to the non-OBE group. In conclusion, n-3 PUFA, AA, and DHA levels were reduced in EP and BM in pregnant women with obesity. Strategies to increase n-3 PUFA are urgently needed during pregnancy and lactation, particularly in women with obesity.
... Studies were prone to inherent errors from portion size estimation, seasonal variations and recall bias due to score calculations being based on self-reported dietary intake (6,8,…). Memory-based tools such as FFQs and 24-h recalls have been cited for misreporting dietary intake as they report on participants' perceived intake rather than the actual intake (52). This was somewhat accounted for by studies via adjustments in their statistical models, such as excluding participants with an unreasonably high or low energy intake, though such exclusions have been criticised as an alteration of the data (53). ...
Article
Diet quality indices (DQIs) are tools used to evaluate the overall diet quality against dietary guidelines or known healthy dietary patterns. This review aimed to evaluate DQIs and their validation processes to facilitate decision-making in the selection of appropriate DQIs for use in Australian contexts. A search of CINAHL, PubMed and Scopus electronic databases was conducted for studies published between January 2010 – May 2020, which validated a DQI, measuring >1 dimension of diet quality (adequacy, balance, moderation, variety) and was applicable to the Australian context. Data on constructs, scoring, weighting and validation methods (construct validity, criterion validity, reliability and reproducibility) were extracted and summarised. The quality of the validation process was evaluated using COSMIN Risk of Bias and Joanna Briggs Appraisal checklists. The review identified 27 indices measuring adherence to: national dietary guidelines (n=13), Mediterranean diet (n=8), and specific population recommendations and chronic disease risk (n=6). Extensiveness of the validation process varied widely across and within categories. Construct validity was the most strongly assessed measurement property, while evaluation of measurement error was frequently inadequate. DQIs should capture multiple dimensions of diet quality, possess a reliable scoring system, and demonstrate adequate evidence in their validation framework to support use in the intended context. Researchers need to understand the limitations of newly developed DQIs and interpret results in view of the validation evidence. Future research on DQIs is indicated to improve evaluation of measurement error, reproducibility and reliability.
... Future studies should incorporate measurement of peripheral concentrations of SCFAs in addition to fecal sampling (81). Third, fiber intake was based on food recalls, and it is well known that self-report assessment tools can be inherently biased (82). The use of self-reported measures (i.e., fiber intake) in the analyses may have introduced residual confounding. ...
Article
Background The gut microbiome has been associated with cardiorespiratory fitness. Objective To assess the effects of oligofructose (FOS)-enriched inulin supplementation on the gut microbiome and the peak oxygen uptake (V̇O2peak) response to high-intensity interval training (HIIT). Methods The study was a randomized controlled trial. Forty sedentary and apparently healthy adults (n = 31 females; age = 31.8±9.8 years, BMI = 25.9±4.3 kg·m−2) were randomly allocated to: i) six weeks of supervised HIIT (4×4 min bouts at 85–95% HRpeak, interspersed with 3 min of active recovery, 3·week−1) + 12 g·day−1 of FOS-enriched inulin (HIIT-I) or ii) six weeks of supervised HIIT (3·week−1, 4×4 min bouts) + 12 g·day−1 of maltodextrin/placebo (HIIT-P). Each participant completed an incremental treadmill test to assess V̇O2peak and ventilatory thresholds (VTs), provided a stool and blood sample, and completed a 24-hour diet recall and food frequency questionnaire before and after the intervention. Gut microbiome analyses were performed using metagenomic sequencing. Fecal short-chain fatty acids were measured by mass spectrometry. Results There were no differences in the mean change in V̇O2peak response between groups (P = 0.58). HIIT-I had a greater improvement in VTs than HIIT-P (VT1 - lactate accumulation: mean difference +4.3% and VT2 – lactate threshold: +4.2%, P<0.05). HIIT-I had a greater increase in the abundance of Bifidobacterium taxa (False Discovery Rate (FDR) <0.05) and several metabolic processes related to exercise capacity (FDR <0.05). Exploratory analysis of merged data found participants with a greater response to HIIT (V̇O2peak ≥3.5mL·kg−¹·min−¹) had a 2.2-fold greater mean abundance of gellan degradation pathways (FDR <0.05) and a greater, but not significant, abundance of B. Uniformis spp. (P<0.00023, FDR = 0.08). Conclusions FOS-enriched inulin supplementation did not potentiate HIIT-induced improvements in V̇O2peak, but led to gut microbiome changes possibly associated with greater ventilatory threshold improvements in healthy inactive adults. Gellan degradation pathways and B.uniformis spp. were associated with greater V̇O2peak responses to HIIT. Clinical Trials Register ACTRN12618000501246.
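The "FDR < 0.05" thresholds quoted above imply a multiple-testing correction such as the Benjamini–Hochberg procedure. The sketch below shows that procedure on invented p-values; it is an illustration of the general technique, not the study's code.

```python
# Minimal sketch of the Benjamini-Hochberg procedure behind an "FDR < 0.05"
# threshold; the p-values are invented for illustration.
import numpy as np

pvals = np.array([0.001, 0.008, 0.020, 0.041, 0.300, 0.650])
m = len(pvals)
order = np.argsort(pvals)

# Adjusted p-values: p * m / rank, made monotone from the largest rank downwards
adj = pvals[order] * m / np.arange(1, m + 1)
adj = np.minimum(np.minimum.accumulate(adj[::-1])[::-1], 1.0)

qvals = np.empty(m)
qvals[order] = adj
print(np.round(qvals, 3))  # reject hypotheses where the q-value is below 0.05
```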
... (94) Dietary records and diaries can usually be considered more valid than recall-based methods, although all self-reported methods suffer from serious limitations (47,48,95,96). The validity of food frequency questionnaires and other recall-based methods also depends on the results of validation studies (48). A questionnaire may be sufficiently valid for some exposures and may not have been validated or may not be valid for other exposures, and so review authors should look for results of validation studies specific to the exposure being investigated. ...
Article
Full-text available
Background An essential component of systematic reviews is the assessment of risk of bias. To date, there has been no investigation of how reviews of non-randomised studies of nutritional exposures (called ‘nutritional epidemiologic studies’) assess risk of bias. Objective To describe methods for the assessment of risk of bias in reviews of nutritional epidemiologic studies. Methods We searched MEDLINE, EMBASE and the Cochrane Database of Systematic Reviews (Jan 2018–Aug 2019) and sampled 150 systematic reviews of nutritional epidemiologic studies. Results Most reviews (n=131/150; 87.3%) attempted to assess risk of bias. Commonly used tools neglected to address all important sources of bias, such as selective reporting (n=25/28; 89.3%), and frequently included constructs unrelated to risk of bias, such as reporting (n=14/28; 50.0%). Most reviews (n=66/101; 65.3%) did not incorporate risk of bias in the synthesis. While more than half of reviews considered biases due to confounding and misclassification of the exposure in their interpretation of findings, other biases, such as selective reporting, were rarely considered (n=1/150; 0.7%). Conclusion Reviews of nutritional epidemiologic studies have important limitations in their assessment of risk of bias.
... However, the reliability of these questionnaires has been challenged (e.g. Archer et al., 2018) and it is pertinent to note that measures of dietary intake that rely on memory for what has been consumed may lead to issues of circularity when considering the impact of diet on memory, and particularly on memory for what has been consumed. Studies of "whole diets" or the frequency of particular foods (e.g., sugar-sweetened beverages or "junk food") in the diet have identified a number of consistent patterns (Kim and Kang, 2017;Muñoz-García et al., 2020;Wiles et al., 2009). ...
Article
This paper reviews evidence demonstrating a bidirectional relationship between memory and eating in humans and rodents. In humans, amnesia is associated with impaired processing of hunger and satiety cues, disrupted memory of recent meals, and overconsumption. In healthy participants, meal-related memory limits subsequent ingestive behavior and obesity is associated with impaired memory and disturbances in the hippocampus. Evidence from rodents suggests that dorsal hippocampal neural activity contributes to the ability of meal-related memory to control future intake, that endocrine and neuropeptide systems act in the ventral hippocampus to provide cues regarding energy status and regulate learned aspects of eating, and that consumption of hypercaloric diets and obesity disrupt these processes. Collectively, this evidence indicates that diet-induced obesity may be caused and/or maintained, at least in part, by a vicious cycle wherein excess intake disrupts hippocampal functioning, which further increases intake. This perspective may advance our understanding of how the brain controls eating, the neural mechanisms that contribute to eating-related disorders, and identify how to treat diet-induced obesity.
... We were not able to confirm the associations between serum AGP and diet in multiple regression, which may indicate indirect relationships or may reflect the limited number of patients studied. Data on dietary intake of nutrients were obtained via 24 h recall, which is prone to distortions [52,53]. However, in the studied group of patients we were not able to measure the concentration of each nutrient using appropriate laboratory tests. ...
Article
Full-text available
Management of end-stage renal disease (ESRD) patients requires monitoring each of the components of malnutrition–inflammation–atherosclerosis (MIA) syndrome. Restrictive diet can negatively affect nutritional status and inflammation. An acute-phase protein—α1-acid glycoprotein (AGP), has been associated with energy metabolism in animal and human studies. The aim of our study was to look for a relationship between serum AGP concentrations, laboratory parameters, and nutrient intake in ESRD patients. The study included 59 patients treated with maintenance hemodialysis. A 24 h recall assessed dietary intake during four non-consecutive days—two days in the post-summer period, and two post-winter. Selected laboratory tests were performed: complete blood count, serum iron, total iron biding capacity (TIBC) and unsaturated iron biding capacity (UIBC), vitamin D, AGP, C-reactive protein (CRP), albumin, prealbumin, and phosphate–calcium metabolism markers (intact parathyroid hormone, calcium, phosphate). Recorded dietary intake was highly deficient. A majority of patients did not meet recommended daily requirements for energy, protein, fiber, iron, magnesium, folate, and vitamin D. AGP correlated positively with CRP (R = 0.66), platelets (R = 0.29), and negatively with iron (R = −0.27) and TIBC (R = −0.30). AGP correlated negatively with the dietary intake of plant protein (R = −0.40), potassium (R = −0.27), copper (R = −0.30), vitamin B6 (R = −0.27), and folates (R = −0.27), p < 0.05. However, in multiple regression adjusted for confounders, only CRP was significantly associated with AGP. Our results indicate that in hemodialyzed patients, serum AGP is weakly associated with dietary intake of several nutrients, including plant protein.
Article
This systematic review and meta‐analysis synthesized evidence pertaining to consummatory and appetitive responses to acute exercise in children and adolescents with and without obesity (5–18 years). Articles reporting on supervised, controlled trials of any modality, duration, or intensity with laboratory‐measured food intake were found using MEDLINE, EMBASE, and Cochrane up to July 2023. Differences between conditions in laboratory energy and macronutrient intake, appetite sensations, and food reward were quantitatively synthesized using random‐effects meta‐analyses. Thirty‐five studies were eligible for the systematic review of energy intake, consisting of 60 distinct intervention arms with lean ( n = 374) and overweight/obesity participants ( n = 325; k = 51 eligible for meta‐analysis). Study quality as indicated by the Effective Public Healthy Practice Project tool was rated as low and moderate risk of bias for 80% and 20% of studies, respectively. Acute exercise had no significant effect on energy intake during an ad libitum test meal (mean difference [MD] = −4.52 [−30.58, 21.54] kcal, p = .729). Whilst absolute carbohydrate intake was lower after exercise (23 arms; MD = −6.08 [−11.26, −0.91] g, p = .023), the proportion of carbohydrate was not (30 arms; MD = −0.62 [−3.36, 2.12] %, p = .647). A small elevation in hunger (27 arms; MD = 4.56 [0.75, 8.37] mm, p = .021) and prospective food consumption (27 arms; PFC; MD = 5.71 [1.62, 9.80] mm, p = .008) was observed post‐exercise, but not immediately prior to the test meal (Interval: Mdn = 30 min, Range = 0–180). Conversely, a modest decrease in explicit wanting for high‐fat foods was evident after exercise (10 arms; MD = −2.22 [−3.96, −0.47] mm, p = .019). Exercise intensity ( p = .033) and duration ( p = .013) moderated food intake only in youth with overweight/obesity, indicating lower intake at high intensity and short duration. Overall, acute exercise does not lead to compensation of energy intake or a meaningful elevation of appetite or food reward and might have a modest benefit in youth with overweight/obesity if sufficiently intense. However, conclusions are limited by substantial methodological heterogeneity and the small number of trials employing high‐intensity exercise, especially in youth with overweight/obesity.
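The pooled mean differences reported above come from random-effects meta-analysis. As a minimal, simplified illustration (not the review's moderator or subgroup models), the sketch below pools invented trial-level mean differences with the DerSimonian–Laird estimator of between-trial variance.

```python
# Simplified sketch of random-effects pooling of mean differences with the
# DerSimonian-Laird estimator; the per-trial estimates and standard errors are
# invented and the review's moderator analyses are not reproduced.
import numpy as np

md = np.array([-35., 10., -60., 25., -15.])   # trial mean differences in energy intake (kcal)
se = np.array([20., 18., 30., 22., 16.])      # their standard errors

w = 1.0 / se**2                                # fixed-effect (inverse-variance) weights
y_fixed = np.sum(w * md) / np.sum(w)
q = np.sum(w * (md - y_fixed)**2)              # Cochran's Q
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - (len(md) - 1)) / c)       # between-trial variance

w_re = 1.0 / (se**2 + tau2)                    # random-effects weights
pooled = np.sum(w_re * md) / np.sum(w_re)
se_pooled = np.sqrt(1.0 / np.sum(w_re))
print(f"pooled MD = {pooled:.1f} (95% CI {pooled - 1.96*se_pooled:.1f} to {pooled + 1.96*se_pooled:.1f})")
```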
Article
Full-text available
Personalized nutrition (PN) represents a transformative approach in dietary science, where individual genetic profiles guide tailored dietary recommendations, thereby optimizing health outcomes and managing chronic diseases more effectively. This review synthesizes key aspects of PN, emphasizing the genetic basis of dietary responses, contemporary research, and practical applications. We explore how individual genetic differences influence dietary metabolisms, thus underscoring the importance of nutrigenomics in developing personalized dietary guidelines. Current research in PN highlights significant gene–diet interactions that affect various conditions, including obesity and diabetes, suggesting that dietary interventions could be more precise and beneficial if they are customized to genetic profiles. Moreover, we discuss practical implementations of PN, including technological advancements in genetic testing that enable real-time dietary customization. Looking forward, this review identifies the robust integration of bioinformatics and genomics as critical for advancing PN. We advocate for multidisciplinary research to overcome current challenges, such as data privacy and ethical concerns associated with genetic testing. The future of PN lies in broader adoption across health and wellness sectors, promising significant advancements in public health and personalized medicine.
Article
Full-text available
Personalised nutrition (PN) has emerged over the past twenty years as a promising area of research in the postgenomic era and has been popularized as the new big thing out of molecular biology. Advocates of PN claim that previous approaches to nutrition sought general and universal guidance that applied to all people. In contrast, they contend that PN operates with the principle that “one size does not fit all” when it comes to dietary guidance. While the molecular mechanisms studied within PN are new, the notion of a personal dietary regime guided by medical advice has a much longer history that can be traced back to Galen’s “On Food and Diet” or Ibn Sina’s (westernized as Avicenna) “Canon of Medicine”. Yet this history is either wholly ignored or misleadingly appropriated by PN proponents. This (mis)use of history, we argue helps to sustain the hype of the novelty of the proposed field and potential commodification of molecular advice that undermines longer histories of food management in premodern and non-Western cultures. Moreover, it elides how the longer history of nutritional advice always happened in a heavily moralized, gendered, and racialized context deeply entwined with collective technologies of power, not just individual advice. This article aims at offering a wider appreciation of this longer history to nuance the hype and exceptionalism surrounding contemporary claims.
Article
Full-text available
Healthy dietary patterns such as the Mediterranean diet (MeDi), DASH and MIND have been evaluated for their potential association with health outcomes. However, the lack of standardisation in scoring methodologies can hinder reproducibility and meaningful cross-study comparisons. Here we provide a reproducible workflow for generating the MeDi, DASH, and MIND dietary pattern scores from frequently used dietary assessment tools including the 24-hour recall tool and two variations of food frequency questionnaires. Subjective aspects of the scoring process are highlighted and have led to a recommended reporting checklist. This checklist enables standardised reporting with sufficient detail to enhance the reproducibility and comparability of their outcomes. In addition to these aims, valuable insights in the strengths and limitations of each assessment tool for scoring the MeDi, DASH and MIND diet can be utilised by researchers and clinicians to determine which dietary assessment tool best meets their needs.
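Dietary pattern scores of this kind are ultimately built by awarding points when food-group intakes meet cut-offs. The toy sketch below shows that scoring step with invented food groups, cut-offs, and intakes; it does not reproduce the published MeDi, DASH, or MIND criteria or the paper's workflow.

```python
# Toy sketch of the scoring step such a workflow formalises: one point per food
# group whose weekly intake meets a cut-off. The groups, cut-offs, and intakes
# are invented and do not reproduce the published MeDi, DASH, or MIND criteria.
weekly_servings = {"vegetables": 21, "fruit": 14, "legumes": 3,
                   "fish": 2, "whole_grains": 18, "red_meat": 5}

criteria = {                       # cut-off and direction ("min" or "max")
    "vegetables": (14, "min"),
    "fruit": (14, "min"),
    "legumes": (3, "min"),
    "fish": (2, "min"),
    "whole_grains": (21, "min"),
    "red_meat": (4, "max"),
}

score = sum(
    int(weekly_servings[g] >= cut if direction == "min" else weekly_servings[g] <= cut)
    for g, (cut, direction) in criteria.items()
)
print(f"diet pattern score: {score}/{len(criteria)}")
```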
Article
Full-text available
The objective of this scoping review was to examine the breadth of the existing literature on the relation between meat consumption or meat abstention and positive psychological functioning. In April 2022, we conducted a systematic search of online databases (PubMed, PsycINFO, CINAHL Plus, Medline, Cochrane Library, and Web of Science) for primary research examining positive psychological functioning in meat consumers and those who abstain from meat. Thirteen studies met the inclusion/exclusion criteria, representing 89,138 participants (54,413 females and 33,863 males) with 78,562 meat consumers and 10,148 meat abstainers (13–102 years) from multiple geographic regions. The primary outcomes were life satisfaction, “positive mental health”, self-esteem, and vigor. The secondary outcomes were “meaning in life”, optimism, positive emotions, and psychological well-being. Eight of the 13 studies demonstrated no differences between the groups on positive psychological functioning, three studies showed mixed results, and two studies showed that compared to meat abstainers, meat consumers had greater self-esteem, “positive mental health”, and “meaning in life”. Studies varied substantially in methods and outcomes. Although a small minority of studies showed that meat consumers had more positive psychological functioning, no studies suggested that meat abstainers did. There was mixed evidence for temporal relations, but study designs precluded causal inferences. Our review demonstrates the need for future research given the equivocal nature of the extant literature on the relation between meat consumption and meat abstention and positive psychological functioning.
Article
Introduction: There is a growing interest in vegetarian and vegan diets, but both can potentially affect tissue fatty acid (FA) composition. We aimed to evaluate the effect of vegetarian diets on plasma, erythrocyte, and sperm n-3 polyunsaturated fatty acid (n-3 PUFA) status in healthy young men. Methods: Four groups were studied: i) men consuming a regular omnivore diet (OMV-1, n = 35); ii) men consuming an omnivore diet but excluding fish and seafood (OMV-2, n = 34); iii) men consuming a pescetarian diet (including dairy, eggs, fish, and seafood) (PESC, n = 36); and iv) men following a strict vegan diet (VEG, n = 35). Participants in each group had followed their diet for at least the previous 12 months. Diet was evaluated using a structured, validated food frequency questionnaire. FA composition was measured in plasma, erythrocyte phospholipids, and spermatozoa by gas-liquid chromatography, expressed as a mole percentage of the total FA content. Results: The main findings were higher alpha-linolenic acid (ALA) and total n-3 PUFA dietary intakes in the VEG group. In plasma, arachidonic and eicosapentaenoic acids were higher in the OMV and PESC groups, whereas the docosahexaenoic acid (DHA) level was lower in VEG. Higher ALA but reduced DHA and total n-3 PUFA levels were found in erythrocytes and spermatozoa in the VEG group. Conclusion: Higher dietary ALA intake was found in pescetarian and vegan men. However, the higher ALA intake was not reflected in higher DHA content in the evaluated tissues. Assessment of PUFA status, with particular emphasis on DHA, is necessary to improve PUFA status in vegan men.
Article
Full-text available
Nutrition science has been criticised for its methodology, apparently contradictory findings and generating controversy rather than consensus. However, while certain critiques of the field are valid and informative for developing a more cogent science, there are also unique considerations for the study of diet and nutrition that are either overlooked or omitted in these discourses. The ongoing critical discourse on the utility of nutrition sciences occurs at a time when the burden of non-communicable cardiometabolic disease continues to rise in the population. Nutrition science, along with other disciplinary fields, is tasked with producing a translational evidence-base fit for the purpose of improving population and individual health and reducing disease risk. Thus, an exploration of the unique methodological and epistemic considerations for nutrition research is important for nutrition researchers, students and practitioners, to further develop an improved scientific discipline for nutrition. This paper will expand on some of the challenges facing nutrition research, discussing methodological facets of nutritional epidemiology, randomised controlled trials and meta-analysis, and how these considerations may be applied to improve research methodology. A pragmatic research paradigm for nutrition science is also proposed, which places methodology at its centre, allowing for questions over both how we obtain knowledge and research design as the method to produce that knowledge to be connected, providing the field of nutrition research with a framework within which to capture the full complexity of nutrition and diet.
Article
Several epidemiological studies have investigated the association between sugar intake, the levels of systolic blood pressure (SBP) and diastolic blood pressure (DBP) and the risk of hypertension, but findings have been inconsistent. We carried out a systematic review and meta-analysis of observational studies to examine the associations between sugar intake, hypertension risk, and BP levels. Articles published up to February 2, 2021 were sourced through PubMed, EMBASE and Web of Science. Pooled relative risks (RRs) and 95% confidence intervals (CIs) were estimated using a fixed- or random-effects model. Restricted cubic splines were used to evaluate dose-response associations. Overall, 35 studies were included in the present meta-analysis (23 for hypertension and 12 for BP). Sugar-sweetened beverages (SSBs) and artificially sweetened beverages (ASBs) were positively associated with hypertension risk: 1.26 (95% CI, 1.15-1.37) and 1.10 (1.07-1.13) per 250-g/day increment, respectively. For SBP, only SSBs were significant with a pooled β value of 0.24 mmHg (95% CI, 0.12-0.36) per 250 g increase. Fructose, sucrose, and added sugar, however, were shown to be associated with elevated DBP with 0.83 mmHg (0.07-1.59), 1.10 mmHg (0.12-2.08), and 5.15 mmHg (0.09-10.21), respectively. Current evidence supports the harmful effects of sugar intake for hypertension and BP level, especially SSBs, ASBs, and total sugar intake.
Article
Full-text available
Critically ill patients frequently suffer from complex and severe immunological dysfunction. The differentiation and function of immune cells are governed to a large extent by metabolic processes. Novel immunonutrition concepts therefore attempt to positively influence the immune function of intensive care patients through enteral and parenteral nutrition. This review presents, in condensed form, the available evidence on the common isolated supplements (antioxidant substances, amino acids, essential fatty acids) and the problems associated with them. The second part introduces the resulting novel and more comprehensive concepts of immunonutrition aimed at influencing the intestinal microbiome and modulating macronutrient composition. Immunonutrition of the critically ill patient has enormous potential and may develop into a valuable clinical tool for modulating the immunometabolism of intensive care patients.
Article
Background and aims: This study was designed to obtain daily weighed food intake from participants engaged in Alternate Day Feeding (ADF). Prior ADF studies have used self-reported food intake, a method that has received criticism for its limited accuracy. Subjects and methods: Forty-nine university students received academic credit for participating in the study. Following a 10-day baseline period, participants underwent ADF for the next 8 days. Daily intake restricted to ∼75% of baseline food intake levels alternated with ad libitum intake. Food intake was weighed before and after each meal, and daily body weight was also tracked. Intervention: After the baseline period, participants underwent 8 days of ADF during which they consumed ∼75% of baseline energy intake by weight on restricted days, followed by ad libitum intake on alternate days. The trial concluded with 2 additional days of ad libitum feeding, for a total intervention duration of 10 days. Results: Daily food intake was constant during the baseline period (slope = -0.93 g/d, p = 0.56) and did not differ significantly (995 g, 95% CI [752, 1198]) from the total consumed on ad libitum ADF days (951 g, 95% CI [777, 1227]). Intake on ad libitum days showed no trend to increase during the intervention. Body weight declined significantly during ADF. Conclusions: ADF produces significant weight loss because food intake does not increase on ad libitum feeding days to compensate for reduced intake on restricted energy days. These data are consistent with prior work suggesting that humans do not fully compensate for imposed deficits in energy intake.
Article
Dietary assessment, one of the four basic components of nutritional assessment, is required yet difficult to measure correctly in order to establish accurate links with health outcomes. The current systematic review was conducted to study the effectiveness and limitations of various types of dietary assessment tools and to pool qualitative evidence comparing manual and digital versions of the same tool. Studies published between 2010 and June 2021 that compared various types and/or versions of dietary data collection tools and reported at least one outcome variable (reliability, feasibility, validity, and acceptability) were identified through searches of PubMed, Google Scholar (advanced search), and the Cochrane Library. A total of 33 studies were considered eligible for the final analyses. All studies comparing a tested dietary assessment tool against an adopted gold standard showed good agreement (n=25). Studies examining the feasibility and acceptability of digital versions of tools also showed positive results (n=9). Technological advances in dietary assessment can offer promising outcomes in research as well as clinical settings.
Article
Full-text available
Eating behavior problems are characteristic of children with autism spectrum disorders (ASD), who often have a highly restricted range of food choices, which may pose an associated risk of nutritional problems. Hence, detailed knowledge of the dietary patterns (DPs) and nutrient intakes of ASD patients is necessary to carry out intervention strategies if required. The present study aimed to determine the DPs and macro- and micronutrient intakes in a sample of Spanish preschool children with ASD compared to typically developing control children. Fifty-four children with ASD (two to six years of age, diagnosed according to DSM-5 criteria) and a control group of 57 typically developing children of similar ages were recruited. A validated food frequency questionnaire was used, and the intake of energy and nutrients was estimated through three non-consecutive 24-h dietary registrations. DPs were assessed using principal component analysis and hierarchical clustering analysis. Children with ASD exhibited a DP characterized by high energy and fat intakes and a low intake of vegetables and fruits. Likewise, meat intake of any type, both lean and fatty, was associated with higher consumption of fish and dietary fat. Furthermore, increased consumption of dairy products was associated with increased consumption of cereals and pasta. In addition, these children frequently consumed manufactured products of poor nutritional quality, e.g., beverages, sweets, snacks, and bakery products. The percentages of children with ASD meeting adequate nutrient intakes were higher for energy, saturated fat, calcium, and vitamin C, and lower for iron, iodine, and B-group vitamins, compared with control children. In conclusion, this study emphasizes the need to assess the DPs and nutrient intakes of children with ASD in order to correct these alterations and rule out potential nutritional disorders.
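Principal component analysis, as used above to derive dietary patterns, extracts linear combinations of food-group intakes that explain the most variance. The sketch below runs a bare-bones PCA via singular value decomposition on a small invented participants-by-food-groups matrix; the food groups and values are assumptions for illustration only.

```python
# Bare-bones sketch of deriving a dietary pattern by principal component
# analysis on a participants-by-food-groups matrix of standardised intakes.
# The matrix and food-group labels are invented for illustration.
import numpy as np

food_groups = ["vegetables", "fruit", "sweets", "snacks"]
X = np.array([[ 1.2,  0.9, -0.8, -1.0],
              [ 0.8,  1.1, -0.5, -0.7],
              [-1.0, -0.9,  1.2,  1.1],
              [-0.9, -1.2,  0.9,  1.3],
              [ 0.1,  0.0, -0.2,  0.1]])

Xc = X - X.mean(axis=0)                                 # centre each food group
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

explained = s**2 / np.sum(s**2)                         # variance explained per component
loadings = dict(zip(food_groups, np.round(Vt[0], 2)))   # first dietary pattern
scores = Xc @ Vt[0]                                     # each child's score on that pattern

print(loadings, f"explains {explained[0]:.0%} of variance")
print(np.round(scores, 2))
```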
Article
Full-text available
Animal-source foods (ASF) have the potential to enhance the nutritional adequacy of cereal-based diets in low- and middle-income countries, through the provision of high-quality protein and bioavailable micronutrients. The development of guidelines for including ASF in local diets requires an understanding of the nutrient content of available resources. This article reviews food composition tables (FCT) used in sub-Saharan Africa, examining the spectrum of ASF reported and exploring data sources for each reference. Compositional data are shown to be derived from a small number of existing data sets from analyses conducted largely in high-income nations, often many decades previously. There are limitations in using such values, which represent the products of intensively raised animals of commercial breeds, as a reference in resource-poor settings where indigenous breed livestock are commonly reared in low-input production systems, on mineral-deficient soils and not receiving nutritionally balanced feed. The FCT examined also revealed a lack of data on the full spectrum of ASF, including offal and wild foods, which correspond to local food preferences and represent valuable dietary resources in food-deficient settings. Using poultry products as an example, comparisons are made between compositional data from three high-income nations, and potential implications of differences in the published values for micronutrients of public health significance, including Fe, folate and vitamin A, are discussed. It is important that those working on nutritional interventions and on developing dietary recommendations for resource-poor settings understand the limitations of current food composition data and that opportunities to improve existing resources are more actively explored and supported.
Article
Full-text available
Background: Poor lifestyle behaviors are leading causes of preventable diseases globally. Added sugars contribute to a diet that is energy dense but nutrient poor and increase risk of developing obesity, cardiovascular disease, hypertension, obesity-related cancers, and dental caries. Methods and results: For this American Heart Association scientific statement, the writing group reviewed and graded the current scientific evidence for studies examining the cardiovascular health effects of added sugars on children. The available literature was subdivided into 5 broad subareas: effects on blood pressure, lipids, insulin resistance and diabetes mellitus, nonalcoholic fatty liver disease, and obesity. Conclusions: Associations between added sugars and increased cardiovascular disease risk factors among US children are present at levels far below current consumption levels. Strong evidence supports the association of added sugars with increased cardiovascular disease risk in children through increased energy intake, increased adiposity, and dyslipidemia. The committee found that it is reasonable to recommend that children consume ≤25 g (100 cal or ≈6 teaspoons) of added sugars per day and to avoid added sugars for children <2 years of age. Although added sugars most likely can be safely consumed in low amounts as part of a healthy diet, few children achieve such levels, making this an important public health target.
Article
Full-text available
This paper describes the Observing Protein and Energy Nutrition (OPEN) Study, conducted from September 1999 to March 2000. The purpose of the study was to assess dietary measurement error using two self-reported dietary instruments-the food frequency questionnaire (FFQ) and the 24-hour dietary recall (24HR)-and unbiased biomarkers of energy and protein intakes: doubly labeled water and urinary nitrogen. Participants were 484 men and women aged 40-69 years from Montgomery County, Maryland. Nine percent of men and 7% of women were defined as underreporters of both energy and protein intake on 24HRs; for FFQs, the comparable values were 35% for men and 23% for women. On average, men underreported energy intake compared with total energy expenditure by 12-14% on 24HRs and 31-36% on FFQs and underreported protein intake compared with a protein biomarker by 11-12% on 24HRs and 30-34% on FFQs. Women underreported energy intake on 24HRs by 16-20% and on FFQs by 34-38% and underreported protein intake by 11-15% on 24HRs and 27-32% on FFQs. There was little underreporting of the percentage of energy from protein for men or women. These findings have important implications for nutritional epidemiology and dietary surveillance.
Article
Full-text available
NHANES is the cornerstone for national nutrition monitoring to inform nutrition and health policy. Nutritional assessment in NHANES is described with a focus on dietary data collection, analysis, and uses in nutrition monitoring. NHANES has been collecting thorough data on diet, nutritional status, and chronic disease in cross-sectional surveys with nationally representative samples since the early 1970s. Continuous data collection began in 1999 with public data release in 2-y cycles on ∼10,000 participants. In 2002, the Continuing Survey of Food Intakes by Individuals and the NHANES dietary component were merged, forming a consolidated dietary data collection known as What We Eat in America; since then, 24-h recalls have been collected on 2 d using the USDA's Automated Multiple-Pass Method. Detailed and targeted food-frequency questionnaires have been collected in some NHANES cycles. Dietary supplement use data have been collected (in detail since 2007) so that total nutrient intakes can be described for the population. The continuous NHANES can adapt its content to address emerging public health needs and reflect federal priorities. Changes in data collection methods are made after expert input and validation/crossover studies. NHANES dietary data are used to describe intake of foods, nutrients, food groups, and dietary patterns by the US population and large sociodemographic groups to plan and evaluate nutrition programs and policies. Usual dietary intake distributions can be estimated after adjusting for day-to-day variation. NHANES remains open and flexible to incorporate improvements while maintaining data quality and providing timely data to track the nation's nutrition and health status. In summary, NHANES collects dietary data in the context of its broad, multipurpose goals; the strengths and limitations of these data are also discussed in this review.
Article
Full-text available
Recent reports have asserted that, because of energy underreporting, dietary self-report data suffer from measurement error so great that findings that rely on them are of no value. This commentary considers the amassed evidence that shows that self-report dietary intake data can successfully be used to inform dietary guidance and public health policy. Topics discussed include what is known and what can be done about the measurement error inherent in data collected by using self-report dietary assessment instruments and the extent and magnitude of underreporting energy vs. other nutrients and food groups. Also discussed is the overall impact of energy underreporting on dietary surveillance and nutritional epidemiology. In conclusion, 7 specific recommendations for collecting, analyzing, and interpreting self-report dietary data are provided: 1) continue to collect self-report dietary intake data because they contain valuable, rich, and critical information about foods and beverages consumed by populations that can be used to inform nutrition policy and assess diet-disease associations; 2) do not use self-reported energy intake as a measure of true energy intake; 3) do use self-reported energy intake for energy adjustment of other self-reported dietary constituents to improve risk estimation in studies of diet-health associations; 4) acknowledge the limitations of self-report dietary data and analyze and interpret them appropriately; 5) design studies and conduct analyses that allow adjustment for measurement error; 6) design new epidemiologic studies to collect dietary data from both short-term (recalls or food records) and long-term (food-frequency questionnaires) instruments on the entire study population to allow for maximizing the strengths of each instrument; and 7) continue to develop, evaluate, and further expand methods of dietary assessment, including dietary biomarkers and methods using new technologies.
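Recommendation 3 above refers to energy adjustment of self-reported nutrients. One common approach is the residual method, sketched below on invented data: the nutrient is regressed on reported energy, and the residuals (re-centred at the predicted intake for mean energy) serve as the energy-adjusted values. This is a generic illustration, not the authors' procedure.

```python
# Generic sketch of the residual method of energy adjustment: regress the
# nutrient on reported energy and keep the residuals, re-centred at the
# predicted intake for mean energy. Values are invented for illustration.
import numpy as np
from scipy import stats

energy = np.array([1800., 2100., 2400., 2700., 3000.])   # reported energy, kcal/d
protein = np.array([60., 75., 80., 95., 100.])            # reported protein, g/d

fit = stats.linregress(energy, protein)
predicted_at_mean = fit.intercept + fit.slope * energy.mean()
residuals = protein - (fit.intercept + fit.slope * energy)

protein_energy_adjusted = residuals + predicted_at_mean   # g/d, energy-adjusted
print(np.round(protein_energy_adjusted, 1))
```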
Article
Full-text available
Two experiments (modeled after J. Deese's 1959 study) revealed remarkable levels of false recall and false recognition in a list learning paradigm. In Experiment 1, subjects studied lists of 12 words (e.g., bed, rest, awake ); each list was composed of associates of 1 nonpresented word (e.g., sleep). On immediate free recall tests, the nonpresented associates were recalled 40% of the time and were later recognized with high confidence. In Experiment 2, a false recall rate of 55% was obtained with an expanded set of lists, and on a later recognition test, subjects produced false alarms to these items at a rate comparable to the hit rate. The act of recall enhanced later remembering of both studied and nonstudied material. The results reveal a powerful illusion of memory: People remember events that never happened.
Article
Full-text available
The Scientific Report of the 2015 Dietary Guidelines Advisory Committee was primarily informed by memory-based dietary assessment methods (M-BMs) (eg, interviews and surveys). The reliance on M-BMs to inform dietary policy continues despite decades of unequivocal evidence that M-BM data bear little relation to actual energy and nutrient consumption. Data from M-BMs are defended as valid and valuable despite no empirical support and no examination of the foundational assumptions regarding the validity of human memory and retrospective recall in dietary assessment. We assert that uncritical faith in the validity and value of M-BMs has wasted substantial resources and constitutes the greatest impediment to scientific progress in obesity and nutrition research. Herein, we present evidence that M-BMs are fundamentally and fatally flawed owing to well-established scientific facts and analytic truths. First, the assumption that human memory can provide accurate or precise reproductions of past ingestive behavior is indisputably false. Second, M-BMs require participants to submit to protocols that mimic procedures known to induce false recall. Third, the subjective (ie, not publicly accessible) mental phenomena (ie, memories) from which M-BM data are derived cannot be independently observed, quantified, or falsified; as such, these data are pseudoscientific and inadmissible in scientific research. Fourth, the failure to objectively measure physical activity in analyses renders inferences regarding diet-health relationships equivocal. Given the overwhelming evidence in support of our position, we conclude that M-BM data cannot be used to inform national dietary guidelines and that the continued funding of M-BMs constitutes an unscientific and major misuse of research resources. Copyright © 2015 Mayo Foundation for Medical Education and Research. Published by Elsevier Inc. All rights reserved.
Article
Full-text available
Studies on the role of diet in the development of chronic diseases often rely on self-report surveys of dietary intake. Unfortunately, many validity studies have demonstrated that self-reported dietary intake is subject to systematic under-reporting, although the vast majority of such studies have been conducted in industrialised countries. The aim of the present study was to investigate whether or not systematic reporting error exists among individuals of African ancestry (n = 324) in five countries distributed across the Human Development Index (HDI) scale, a UN statistic devised to rank countries on non-income factors plus economic indicators. Using two 24 h dietary recalls to assess energy intake and the doubly labelled water method to assess total energy expenditure, we calculated the difference between these two values (((self-report − expenditure)/expenditure) × 100) to identify under-reporting of habitual energy intake in selected communities in Ghana, South Africa, Seychelles, Jamaica and the USA. Under-reporting of habitual energy intake was observed in all five countries. The South African cohort exhibited the highest mean under-reporting (−52.1% of energy) compared with the cohorts of Ghana (−22.5%), Jamaica (−17.9%), Seychelles (−25.0%) and the USA (−18.5%). BMI was the most consistent predictor of under-reporting compared with other predictors. In conclusion, there is substantial under-reporting of dietary energy intake in populations across the whole range of the HDI, and this systematic reporting error increases according to the BMI of an individual.
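The misreporting statistic defined above, ((self-report − expenditure)/expenditure) × 100, is simple enough to capture in a one-line helper; the sketch below uses hypothetical intake and expenditure values.

```python
# The percentage misreporting statistic defined above as a small helper;
# the reported-intake and expenditure values in the example are hypothetical.
import numpy as np

def percent_misreporting(reported_ei_kcal, dlw_tee_kcal):
    """((self-report - expenditure) / expenditure) x 100; negative = under-reporting."""
    ei = np.asarray(reported_ei_kcal, dtype=float)
    tee = np.asarray(dlw_tee_kcal, dtype=float)
    return (ei - tee) / tee * 100.0

print(np.round(percent_misreporting([1600, 2100], [2400, 2500]), 1))  # -33.3% and -16.0%
```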
Article
The debate on the relative contributions of presumptive etiologic factors in the development of obesity is becoming increasingly speculative, insular, and partisan. As the global prevalence of obesity continues to rise, the sheer volume of unfounded conjecture threatens to obscure well-established evidence. We posit that the failure to distinguish between causal factors and mere statistical associations engendered the proliferation of misleading and demonstrably false research programs and failed public health initiatives. Nevertheless, scientific progress necessitates the elimination of unsupported speculation via critical examinations of contrary evidence. Thus, the purpose of this review is to present a concise survey of potentially falsifying evidence for the major presumptive etiologic factors inclusive of 'diet', 'genes', physical activity, and non-physiologic factors from the social sciences. Herein, we advance two 'Fundamental Questions of Obesity' that provide a conceptually clear but challenging constraint on conjecture. First, why would an individual (i.e., human or non-human animal) habitually consume more calories than s/he expends? And second, why would the excess calories be stored predominantly as 'fat' rather than as lean tissue? We posit that the conceptual constraint presented by these questions in concert with the parallel trends in body-mass, adiposity, and metabolic diseases in both human and non-human mammals offer a unique opportunity to refute the oversimplification, causal reductionism, and unrestrained speculation that impede progress. We conclude this review with an attempt at consilience and present two novel paradigms, the 'Metabolic Tipping Point' and the 'Maternal Resources Hypothesis', that offer interdisciplinary explanatory narratives on the etiology of obesity and metabolic diseases across mammalian species.
Article
Sugars are foundational to biological life and played essential roles in human evolution and dietary patterns for most of recorded history. The simple sugar glucose is so central to human health that it is one of the World Health Organization's Essential Medicines. Given these facts, it defies both logic and a large body of scientific evidence to claim that sugars and other nutrients that played fundamental roles in the substantial improvements in life- and health-spans over the past century are now suddenly responsible for increments in the prevalence of obesity and chronic non-communicable diseases. Thus, the purpose of this review is to provide a rigorous, evidence-based challenge to 'diet-centrism' and the disease-mongering of dietary sugar. The term 'diet-centrism' describes the naïve tendency of both researchers and the public to attribute a wide range of negative health outcomes exclusively to dietary factors while neglecting the essential and well-established role of individual differences in nutrient metabolism. The explicit conflation of dietary intake with both nutritional status and health inherent in 'diet-centrism' contravenes the fact that the human body is a complex biologic system in which the effects of dietary factors are dependent on the current state of that system. Thus, macronutrients cannot have health or metabolic effects independent of the physiologic context of the consuming individual (e.g., physical activity level). Therefore, given the unscientific hyperbole surrounding dietary sugars, I take an adversarial position and present highly replicated evidence from multiple domains to show that 'diet' is a necessary but trivial factor in metabolic health, and that anti-sugar rhetoric is simply diet-centric disease-mongering engendered by physiologic illiteracy. My position is that dietary sugars are not responsible for obesity or metabolic diseases and that the consumption of simple sugars and sugar-polymers (e.g., starches) up to 75% of total daily caloric intake is innocuous in healthy individuals.
Article
Background: Underreporting of food intake is common in obese subjects. Objective: One aim of this study was to assess to what extent underreporting by obese men is explained by underrecording (failure to record in a food diary everything that is consumed) or undereating. Another aim of the study was to find out whether there was an indication of selective underreporting. Design: Subjects were 30 obese men with a mean (±SD) body mass index (in kg/m²) of 34 ± 4. Total food intake was measured over 1 wk. Energy expenditure (EE) was measured with the doubly labeled water method, and water loss was estimated with deuterium-labeled water. Energy balance was verified by measuring body weight at the start and end of the food-recording week and 1 wk after the recording week. Results: Mean energy intake and EE were 10.4 ± 2.5 and 16.7 ± 2.4 MJ/d, respectively; underreporting was 37 ± 16%. The mean body mass loss of 1.0 ± 1.3 kg over the recording week was significantly different (P < 0.05) from the change in body mass over the nonrecording week and indicated 26% undereating. Water intake (reported + metabolic water) and water loss were significantly different from each other and indicated 12% underrecording. The reported percentage of energy from fat was a function of the level of underreporting: percentage of energy from fat = 46 − 0.2 × percentage of underreporting (r² = 0.28, P = 0.003). Conclusions: Total underreporting by the obese men was explained by underrecording and undereating. The obese men selectively underreported fat intake.
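To make the quantities in this abstract concrete, the sketch below recomputes underreporting from the reported mean intake and expenditure (10.4 and 16.7 MJ/d) and applies the published regression relating the reported percentage of energy from fat to the level of underreporting. The function names are ours and the exercise is purely illustrative, not a reanalysis of the study data.

```python
# Sketch of the quantities above. Underreporting is the shortfall of reported
# intake relative to DLW-measured expenditure; the published regression
# (fat% = 46 - 0.2 * underreporting%) illustrates selective underreporting of fat.

def underreporting_pct(reported_mj: float, expenditure_mj: float) -> float:
    """Underreporting as a percentage of measured energy expenditure."""
    return (expenditure_mj - reported_mj) / expenditure_mj * 100.0

def predicted_fat_energy_pct(underreporting: float) -> float:
    """Reported %energy from fat as a function of underreporting (per the abstract)."""
    return 46.0 - 0.2 * underreporting

reported, expenditure = 10.4, 16.7  # mean MJ/d from the abstract
ur = underreporting_pct(reported, expenditure)
print(f"Underreporting: {ur:.0f}% of expenditure")                       # ~38%
print(f"Predicted %energy from fat: {predicted_fat_energy_pct(ur):.1f}%")
```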
Article
Background: A limited number of studies have evaluated self-reported dietary intakes against objective recovery biomarkers. Objective: The aim was to compare dietary intakes of multiple Automated Self-Administered 24-h recalls (ASA24s), 4-d food records (4DFRs), and food-frequency questionnaires (FFQs) against recovery biomarkers and to estimate the prevalence of under- and overreporting. Design: Over 12 mo, 530 men and 545 women, aged 50-74 y, were asked to complete 6 ASA24s (2011 version), 2 unweighed 4DFRs, 2 FFQs, two 24-h urine collections (biomarkers for protein, potassium, and sodium intakes), and 1 administration of doubly labeled water (biomarker for energy intake). Absolute and density-based energy-adjusted nutrient intakes were calculated. The prevalence of under- and overreporting of self-report against biomarkers was estimated. Results: Ninety-two percent of men and 87% of women completed ≥3 ASA24s (mean ASA24s completed: 5.4 and 5.1 for men and women, respectively). Absolute intakes of energy, protein, potassium, and sodium assessed by all self-reported instruments were systematically lower than those from recovery biomarkers, with underreporting greater for energy than for other nutrients. On average, compared with the energy biomarker, intake was underestimated by 15-17% on ASA24s, 18-21% on 4DFRs, and 29-34% on FFQs. Underreporting was more prevalent on FFQs than on ASA24s and 4DFRs and among obese individuals. Mean protein and sodium densities on ASA24s, 4DFRs, and FFQs were similar to biomarker values, but potassium density on FFQs was 26-40% higher, leading to a substantial increase in the prevalence of overreporting compared with absolute potassium intake. Conclusions: Although misreporting is present in all self-report dietary assessment tools, multiple ASA24s and a 4DFR provided the best estimates of absolute dietary intakes for these few nutrients and outperformed FFQs. Energy adjustment improved estimates from FFQs for protein and sodium but not for potassium. The ASA24, which now can be used to collect both recalls and records, is a feasible means to collect dietary data for nutrition research.
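The comparison above distinguishes absolute intakes from density-based (energy-adjusted) intakes and flags under- and overreporting against recovery biomarkers. The sketch below, with hypothetical values and an arbitrary ±20% tolerance rather than the study's classification procedure, shows how a nutrient density (per 1000 kcal) is computed and how a report might be flagged against a biomarker.

```python
# Sketch: energy-adjusted (density) intake and a simple misreporting flag
# relative to a recovery biomarker. The values and the ±20% tolerance are
# illustrative assumptions, not the study's method.

def nutrient_density(nutrient_amount: float, energy_kcal: float, per_kcal: float = 1000.0) -> float:
    """Nutrient intake expressed per `per_kcal` kilocalories (default: per 1000 kcal)."""
    return nutrient_amount / energy_kcal * per_kcal

def reporting_status(reported: float, biomarker: float, tolerance: float = 0.20) -> str:
    """Flag a report as under-/over-/plausible relative to a biomarker, within ±tolerance."""
    ratio = reported / biomarker
    if ratio < 1.0 - tolerance:
        return "under-report"
    if ratio > 1.0 + tolerance:
        return "over-report"
    return "plausible"

# Hypothetical participant: protein (g/d) and energy (kcal/d) from an FFQ
# versus urinary-nitrogen and doubly labeled water biomarkers.
ffq_protein_g, ffq_energy = 62.0, 1700.0
bio_protein_g, bio_energy = 80.0, 2400.0

print("FFQ protein density:", round(nutrient_density(ffq_protein_g, ffq_energy), 1), "g/1000 kcal")
print("Biomarker protein density:", round(nutrient_density(bio_protein_g, bio_energy), 1), "g/1000 kcal")
print("Energy reporting status:", reporting_status(ffq_energy, bio_energy))
```

Note how the densities can remain close even when absolute energy is substantially under-reported, which mirrors the abstract's finding that energy adjustment improves some, but not all, estimates.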
Article
Which articles published in the JAMA Network of journals resonated with readers the most this year—and why? Those are the questions we set out to answer in our first-ever end-of-year roundup.
Article
Calibrating dietary self-report instruments is recommended as a way to adjust for measurement error when estimating diet-disease associations. Because biomarkers available for calibration are limited, most investigators use self-reports (e.g., 24-hour recalls (24HRs)) as the reference instrument. We evaluated the performance of 24HRs as reference instruments for calibrating food frequency questionnaires (FFQs), using data from the Validation Studies Pooling Project, comprising 5 large validation studies using recovery biomarkers. Using 24HRs as reference instruments, we estimated attenuation factors, correlations with truth, and calibration equations for FFQ-reported intakes of energy, protein, potassium, and sodium, and their densities, and we compared them with values derived using biomarkers. Based on 24HRs, FFQ attenuation factors were substantially overestimated for energy and sodium intakes, less so for protein and potassium, and minimally for nutrient densities. FFQ correlations with truth, based on 24HRs, were substantially overestimated for all dietary components. Calibration equations did not capture dependencies on body mass index. We also compared the predicted bias in estimated relative risks adjusted using 24HRs as reference instruments with the bias when making no adjustment. In disease models with energy and 1 or more nutrient intakes, predicted bias in estimated nutrient relative risks was reduced on average, but bias in the energy risk coefficient was unchanged.
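The analysis above concerns attenuation factors and calibration equations for FFQs. The sketch below illustrates, on simulated data, the standard regression-calibration estimate: regress the reference measurement on the FFQ report, take the slope as the attenuation factor, and use the fitted line as the calibration equation. It is not the Validation Studies Pooling Project analysis, and every simulation parameter is an assumption chosen for illustration.

```python
# Minimal regression-calibration sketch on simulated data: the attenuation
# factor is the slope of the regression of the reference measurement
# (biomarker or 24HR) on the FFQ report; the fitted intercept and slope form
# the calibration equation.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
true_intake = rng.normal(2300, 350, n)                  # unobserved "truth" (kcal/d)
ffq = 0.7 * true_intake + 200 + rng.normal(0, 450, n)   # FFQ: biased and noisy
biomarker = true_intake + rng.normal(0, 120, n)         # recovery biomarker: unbiased

# Regress the reference (here, the biomarker) on the FFQ report.
slope, intercept = np.polyfit(ffq, biomarker, 1)
print(f"Calibration equation: reference = {intercept:.0f} + {slope:.2f} * FFQ")
print(f"Attenuation factor (lambda): {slope:.2f}")

# If a noisy 24HR replaced the biomarker as the reference, error correlated
# between the FFQ and the 24HR could overstate lambda, which is the concern
# examined in the abstract above.
```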
Article
Importance: In the United States, national associations of individual dietary factors with specific cardiometabolic diseases are not well established. Objective: To estimate associations of intake of 10 specific dietary factors with mortality due to heart disease, stroke, and type 2 diabetes (cardiometabolic mortality) among US adults. Design, Setting, and Participants: A comparative risk assessment model incorporated data and corresponding uncertainty on population demographics and dietary habits from National Health and Nutrition Examination Surveys (1999-2002: n = 8104; 2009-2012: n = 8516); estimated associations of diet and disease from meta-analyses of prospective studies and clinical trials with validity analyses to assess potential bias; and estimated disease-specific national mortality from the National Center for Health Statistics. Exposures: Consumption of 10 foods/nutrients associated with cardiometabolic diseases: fruits, vegetables, nuts/seeds, whole grains, unprocessed red meats, processed meats, sugar-sweetened beverages (SSBs), polyunsaturated fats, seafood omega-3 fats, and sodium. Main Outcomes and Measures: Estimated absolute and percentage mortality due to heart disease, stroke, and type 2 diabetes in 2012. Disease-specific and demographic-specific (age, sex, race, and education) mortality and trends between 2002 and 2012 were also evaluated. Results: In 2012, 702 308 cardiometabolic deaths occurred in US adults, including 506 100 from heart disease (371 266 coronary heart disease, 35 019 hypertensive heart disease, and 99 815 other cardiovascular disease), 128 294 from stroke (16 125 ischemic, 32 591 hemorrhagic, and 79 578 other), and 67 914 from type 2 diabetes. Of these, an estimated 318 656 (95% uncertainty interval [UI], 306 064-329 755; 45.4%) cardiometabolic deaths per year were associated with suboptimal intakes: 48.6% (95% UI, 46.2%-50.9%) of cardiometabolic deaths in men and 41.8% (95% UI, 39.3%-44.2%) in women; 64.2% (95% UI, 60.6%-67.9%) at younger ages (25-34 years) and 35.7% (95% UI, 33.1%-38.1%) at older ages (≥75 years); 53.1% (95% UI, 51.6%-54.8%) among blacks, 50.0% (95% UI, 48.2%-51.8%) among Hispanics, and 42.8% (95% UI, 40.9%-44.5%) among whites; and 46.8% (95% UI, 44.9%-48.7%) among lower-, 45.7% (95% UI, 44.2%-47.4%) among medium-, and 39.1% (95% UI, 37.2%-41.2%) among higher-educated individuals. The largest numbers of estimated diet-related cardiometabolic deaths were related to high sodium (66 508 deaths in 2012; 9.5% of all cardiometabolic deaths), low nuts/seeds (59 374; 8.5%), high processed meats (57 766; 8.2%), low seafood omega-3 fats (54 626; 7.8%), low vegetables (53 410; 7.6%), low fruits (52 547; 7.5%), and high SSBs (51 694; 7.4%). Between 2002 and 2012, population-adjusted US cardiometabolic deaths per year decreased by 26.5%. The greatest decline was associated with insufficient polyunsaturated fats (−20.8% relative change [95% UI, −18.5% to −22.8%]), nuts/seeds (−18.0% [95% UI, −14.6% to −21.0%]), and excess SSBs (−14.5% [95% UI, −12.0% to −16.9%]). The greatest increase was associated with unprocessed red meats (+14.4% [95% UI, 9.1%-19.5%]). Conclusions and Relevance: Dietary factors were estimated to be associated with a substantial proportion of deaths from heart disease, stroke, and type 2 diabetes. These results should help identify priorities, guide public health planning, and inform strategies to alter dietary habits and improve health.
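The comparative risk assessment model summarized above attributes shares of cardiometabolic deaths to each dietary factor. The toy sketch below shows the core idea for a single factor treated as a binary exposure, using Levin's population attributable fraction. The relative risk, prevalence, and death count are hypothetical; the actual model uses continuous intake distributions, etiologic effect sizes from meta-analyses, and uncertainty propagation.

```python
# Toy population-attributable-fraction calculation for one dietary factor with
# a binary exposure (suboptimal vs. optimal intake). All inputs are hypothetical.

def attributable_fraction(prevalence: float, relative_risk: float) -> float:
    """Levin's population attributable fraction for a binary exposure."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (excess + 1.0)

prevalence_suboptimal = 0.60      # hypothetical share of adults with suboptimal intake
relative_risk = 1.25              # hypothetical RR of cardiometabolic death
cardiometabolic_deaths = 700_000  # order of magnitude of annual deaths in the abstract

paf = attributable_fraction(prevalence_suboptimal, relative_risk)
print(f"PAF: {paf:.1%}")
print(f"Estimated attributable deaths: {paf * cardiometabolic_deaths:,.0f}")
```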
Article
The purpose of this study was to examine the validity of the 1971-2010 United States Department of Agriculture's (USDA's) loss-adjusted food availability (LAFA) per capita caloric consumption estimates. Estimated total daily energy expenditure (TEE) was calculated for nationally representative samples of US adults, 20-74 years, using the Institute of Medicine's predictive equations with “low-active” (TEE L-ACT) and “sedentary” (TEE SED) physical activity values. TEE estimates were subtracted from LAFA estimates to create disparity values (kcal/d). A validated mathematical model was applied to calculate expected weight change in reference individuals resulting from the disparity. From 1971-2010, the disparity between LAFA and TEE L-ACT varied by 394 kcal/d (P < 0.001), from −205 kcal/d (95% CI: −214, −196) to +189 kcal/d (95% CI: 168, 209). The disparity between LAFA and TEE SED varied by 412 kcal/d (P < 0.001), from −84 kcal/d (95% CI: −93, −76) to +328 kcal/d (95% CI: 309, 348). Our model suggests that if LAFA estimates were actually consumed, reference individuals would have lost ~1-4 kg/y from 1971-1980 (an accumulated loss of ~12 to ~36 kg), and gained ~3-7 kg/y from 1988-2010 (an accumulated gain of ~42 to ~98 kg). These estimates differed from the actual measured increments of 10 kg and 9 kg in reference men and women, respectively, over the 39-year period. The USDA LAFA data provided inconsistent, divergent estimates of per capita caloric consumption over its 39-year history. The large, variable misestimation suggests that the USDA LAFA per capita caloric intake estimates lack validity and should not be used to inform public policy.
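The abstract above computes disparity values (LAFA minus estimated TEE) and converts persistent disparities into expected weight change via a validated dynamic model. The crude sketch below shows only the arithmetic of that logic, using the common static rule of thumb of roughly 7,700 kcal of energy imbalance per kilogram of body mass; this rule overstates long-run change relative to dynamic models and is explicitly not the model the authors used. All input values are illustrative.

```python
# Crude sketch of the disparity logic: the gap between per-capita food
# availability (LAFA) and estimated total energy expenditure (TEE), converted
# to an implied annual weight change. The static ~7700 kcal/kg conversion is a
# rule of thumb, NOT the validated dynamic model used by the authors.

KCAL_PER_KG = 7700.0  # rough energy content of 1 kg of body-mass change (assumption)

def disparity_kcal_per_day(lafa_kcal: float, tee_kcal: float) -> float:
    """Daily gap between loss-adjusted food availability and estimated expenditure."""
    return lafa_kcal - tee_kcal

def implied_weight_change_kg_per_year(disparity_kcal: float) -> float:
    """Weight change implied if the daily disparity were actually consumed."""
    return disparity_kcal * 365.0 / KCAL_PER_KG

for year, lafa, tee in [(1975, 2200, 2405), (2005, 2600, 2410)]:  # illustrative values
    d = disparity_kcal_per_day(lafa, tee)
    print(f"{year}: disparity {d:+.0f} kcal/d -> {implied_weight_change_kg_per_year(d):+.1f} kg/y")
```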
Article
Importance: Most studies of US dietary trends have evaluated major macronutrients or only a few dietary factors. Understanding trends in summary measures of diet quality for multiple individual foods and nutrients, and the corresponding disparities among population subgroups, is crucial to identify challenges and opportunities to improve dietary intake for all US adults. Objective: To characterize trends in overall diet quality and multiple dietary components related to major diseases among US adults, including by age, sex, race/ethnicity, education, and income. Design, setting, and participants: Repeated cross-sectional investigation using 24-hour dietary recalls in nationally representative samples including 33 932 noninstitutionalized US adults aged 20 years or older from 7 National Health and Nutrition Examination Survey (NHANES) cycles (1999-2012). The sample size per cycle ranged from 4237 to 5762. Exposures: Calendar year and population sociodemographic subgroups. Main outcomes and measures: Survey-weighted, energy-adjusted mean consumption and proportion meeting targets of the American Heart Association (AHA) 2020 continuous diet scores, AHA score components (primary: total fruits and vegetables, whole grains, fish and shellfish, sugar-sweetened beverages, and sodium; secondary: nuts, seeds, and legumes, processed meat, and saturated fat), and other individual food groups and nutrients. Results: Several overall dietary improvements were identified (P < .01 for trend for each). The AHA primary diet score (maximum of 50 points) improved from 19.0 to 21.2 (an improvement of 11.6%). The AHA secondary diet score (maximum of 80 points) improved from 35.1 to 38.5 (an improvement of 9.7%). Changes were attributable to increased consumption between 1999-2000 and 2011-2012 of whole grains (0.43 servings/d; 95% CI, 0.34-0.53 servings/d) and nuts or seeds (0.25 servings/d; 95% CI, 0.18-0.34 servings/d) (fish and shellfish intake also increased slightly) and to decreased consumption of sugar-sweetened beverages (0.49 servings/d; 95% CI, 0.28-0.70 servings/d). No significant trend was observed for other score components, including total fruits and vegetables, processed meat, saturated fat, or sodium. The estimated percentage of US adults with poor diets (defined as <40% adherence to the primary AHA diet score components) declined from 55.9% to 45.6%, whereas the percentage with intermediate diets (defined as 40% to 79.9% adherence to the primary AHA diet score components) increased from 43.5% to 52.9%. Other dietary trends included increased consumption of whole fruit (0.15 servings/d; 95% CI, 0.05-0.26 servings/d) and decreased consumption of 100% fruit juice (0.11 servings/d; 95% CI, 0.04-0.18 servings/d). Disparities in diet quality were observed by race/ethnicity, education, and income level; for example, the estimated percentage of non-Hispanic white adults with a poor diet significantly declined (53.9% to 42.8%), whereas similar improvements were not observed for non-Hispanic black or Mexican American adults. There was little evidence of reductions in these disparities and some evidence of worsening by income level. Conclusions and relevance: In nationally representative US surveys conducted between 1999 and 2012, several improvements in self-reported dietary habits were identified, with additional findings suggesting persistent or worsening disparities based on race/ethnicity, education level, and income level. 
These findings may inform discussions on emerging successes, areas for greater attention, and corresponding opportunities to improve the diets of individuals living in the United States.
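The poor and intermediate diet categories above are defined by percentage adherence to the primary AHA diet score (maximum of 50 points). A small sketch of that classification step follows; the component scoring is omitted, and the "ideal" label for 80% or higher adherence follows general AHA 2020 terminology rather than this abstract.

```python
# Sketch of the adherence classification described above: adherence below 40%
# of the 50-point primary AHA score is "poor", 40-79.9% is "intermediate", and
# >=80% is labeled "ideal" here (AHA terminology; not defined in the abstract).

AHA_PRIMARY_MAX = 50.0

def classify_adherence(primary_score: float) -> str:
    """Classify diet quality from the AHA primary diet score (0-50 points)."""
    adherence = primary_score / AHA_PRIMARY_MAX * 100.0
    if adherence < 40.0:
        return "poor"
    if adherence < 80.0:
        return "intermediate"
    return "ideal"

# 1999-2000 mean, 2011-2012 mean (from the abstract), and a hypothetical high scorer.
for score in (19.0, 21.2, 42.0):
    print(f"Primary score {score:.1f}/50 -> {classify_adherence(score)}")
```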