Nutrition Journal

Published by Springer Nature
Recent publications
Mean (SD) cost and nutritional data for 'regular' foods and toddler food categories
Aim To compare the cost and nutritional profiles of toddler-specific foods and milks with those of ‘regular’ foods and milks. Methods Cross-sectional audit of non-toddler-specific (‘regular’) foods and milks and secondary analysis of existing audit data on toddler-specific (12–36 months) foods and milks in Australia. Main findings All toddler-specific foods and milks cost more than their regular counterparts. Foods varied in nutritional content, but toddler foods mostly had a poorer nutritional profile than regular foods. Fresh milk cost, on average, $0.22 less per 100 mL than toddler milk. Toddler milks had higher mean sugar and carbohydrate levels and lower mean protein, fat, saturated fat, sodium and calcium levels per 100 mL than fresh full-fat cow’s milk. Conclusions Toddler-specific foods and milks cost more and do not represent value for money or good nutrition for young children.
 
Dietary factors may play a role in the etiology of endometriosis, and dietary intake of some food groups and nutrients could be associated with endometriosis risk. This systematic review and meta-analysis of observational studies was conducted to summarize the findings on the association between dietary intakes of selected food groups and nutrients (dairy, fats, fruits, vegetables, legumes, and animal-derived protein sources) and the risk of endometriosis among adult women. PubMed, Scopus, and ISI Web of Science were systematically searched up to September 2022. The inverse variance-weighted fixed-effect method was used to estimate the effect size and corresponding 95% CI. A total of 8 publications (5 cohort and 3 case-control studies), with sample sizes ranging from 156 to 116,607, were included in this study. A higher intake of total dairy [all low-fat and high-fat dairy foods] was associated with a decreased risk of endometriosis (RR 0.90; 95% CI, 0.85 to 0.95; P < 0.001; I² = 37.0%), but these associations were not observed for intakes of low-fat or high-fat dairy, cheese, or milk. An increased risk of endometriosis was associated with higher consumption of red meat (RR 1.17; 95% CI, 1.08 to 1.26; P < 0.001; I² = 82.4%), trans fatty acids (TFA) (RR 1.12; 95% CI, 1.02 to 1.23; P = 0.019; I² = 73.0%), and saturated fatty acids (SFA) (RR 1.06; 95% CI, 1.04 to 1.09; P < 0.001; I² = 57.3%). The results of this meta-analysis suggest differing associations between dietary intakes of dairy foods, red meat, SFAs, and TFAs and the risk of endometriosis. It may be useful to extend the analysis to other food groups and dietary patterns to obtain a complete picture. Additionally, further investigations are needed to clarify the role of diet in the incidence and progression of endometriosis. Trial registration: PROSPERO, CRD42020203939.
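For readers unfamiliar with the pooling step, the sketch below shows how an inverse variance-weighted fixed-effect estimate and the I² statistic can be computed from study-level relative risks. The input values are hypothetical, not the studies in this review.

```python
import numpy as np

# Hypothetical study-level relative risks with 95% CIs.
rr = np.array([0.85, 0.92, 0.88])
ci_lo = np.array([0.75, 0.80, 0.78])
ci_hi = np.array([0.96, 1.06, 0.99])

log_rr = np.log(rr)
se = (np.log(ci_hi) - np.log(ci_lo)) / (2 * 1.96)   # SE recovered from CI width
w = 1.0 / se**2                                     # inverse-variance weights

pooled = np.sum(w * log_rr) / np.sum(w)             # fixed-effect pooled log RR
pooled_se = np.sqrt(1.0 / np.sum(w))
lo, hi = np.exp(pooled - 1.96 * pooled_se), np.exp(pooled + 1.96 * pooled_se)

q = np.sum(w * (log_rr - pooled) ** 2)              # Cochran's Q
i2 = max(0.0, (q - (len(rr) - 1)) / q) * 100        # I-squared heterogeneity (%)
print(f"Pooled RR {np.exp(pooled):.2f} (95% CI {lo:.2f} to {hi:.2f}), I2 = {i2:.0f}%")
```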
 
Selection flow chart of participants in this study. The flow chart describes the exclusion criteria and the total number of participants, excluded participants, and eligible participants. FFQ: food frequency questionnaire
Background Although small for gestational age (SGA) is a serious problem worldwide, the association of dietary patterns before and during pregnancy with SGA risk is unclear. We evaluated this association among Japanese pregnant women using three methods: reduced rank regression (RRR) and partial least squares (PLS), methods for extracting dietary patterns that can explain the variation of response variables, and principal component analysis (PCA), a method for extracting dietary patterns of the population. Methods Between July 2013 and March 2017, 22,493 pregnant women were recruited to the Tohoku Medical Megabank Project Birth and Three-Generation Cohort Study, a population-based prospective birth cohort study in Japan. Information on dietary intake was obtained using food frequency questionnaires, and dietary patterns were extracted using RRR, PLS, and PCA. Information on birth weight was obtained from obstetric records, and the birth weight SD score and SGA were defined by the method of the Japan Pediatric Society. The associations of dietary patterns with birth weight SD score and SGA risk were investigated using multiple linear regression and multiple logistic regression, respectively. Results A total of 17,728 mother-child pairs were included. The birth weight SD score was 0.15 ± 0.96, and the prevalence of SGA was 6.3%. The dietary patterns extracted by RRR and PLS were similar and characterized by a high intake of cereals and fruits and a low intake of alcoholic and non-alcoholic beverages in both pre- to early pregnancy and from early to mid-pregnancy. Higher adoption of the RRR and PLS patterns in both periods was associated with an increased birth weight SD score and lower risk of SGA. In contrast, the PCA1 pattern was not associated with birth weight SD score or SGA risk in either period. Although the PCA2 pattern was associated with increased birth weight SD score from early to mid-pregnancy, no other associations with birth weight SD score or SGA risk were observed. Conclusions The dietary pattern with a high intake of cereals and fruits and a low intake of alcoholic and non-alcoholic beverages before and during pregnancy was associated with a decreased SGA risk in Japan.
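The three extraction methods differ in what they optimize: PCA summarizes variation among the foods themselves, whereas PLS (and RRR, for which no scikit-learn implementation exists) extracts patterns that covary with chosen response variables. A minimal sketch on synthetic data, with hypothetical dimensions:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))   # food-group intakes from an FFQ (synthetic)
Y = rng.normal(size=(500, 2))    # response variables, e.g. nutrients linked to SGA

X = StandardScaler().fit_transform(X)

# PCA: pattern scores that explain variance in the foods alone
pca_scores = PCA(n_components=2).fit_transform(X)

# PLS: pattern scores that explain covariance between foods and responses
pls = PLSRegression(n_components=2).fit(X, Y)
pls_scores = pls.x_scores_

# Each participant's pattern score could then enter a regression against
# birth weight SD score or SGA, as in the study above.
print(pca_scores.shape, pls_scores.shape)
```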
 
The population distribution and correlation matrix. A The distribution of nutritional scores and CI-AKI incidence. Bar plots depict the population distribution across categories of nutritional scores; the dashed line depicts the incidence of CI-AKI. Left axis, population count (persons); right axis, incidence rate of CI-AKI (%). B Correlation matrix of the proportion of Scr elevation and nutritional scores. Coefficients of Spearman rank-order correlations are displayed (all P values < 0.001). Stronger correlations are represented by lower transparency and narrower ellipses. Blue indicates positive correlation and red indicates negative correlation. NRS-2002 indicates nutritional risk screening 2002; CONUT, controlling nutritional status; PNI, prognostic nutritional index; GNRI, geriatric nutritional risk index; Scr, serum creatinine; CI-AKI, contrast-induced acute kidney injury
Restricted cubic spline analyses between nutritional risk and CI-AKI. The restricted cubic spline plot visualizes the association between CI-AKI and nutritional scores, including (A) NRS-2002, (B) CONUT, (C) PNI, and (D) GNRI. The spline models were adjusted for underlying clinical confounders, including age (except for NRS-2002), gender, diabetes, average SBP, eGFR, LVEF, hemoglobin, C-reactive protein, volume of contrast agent, type of contrast agent, and pre-procedure medications (statin, furosemide, and dopamine). Abbreviations refer to Fig. 1. *P < 0.05
Receiver operating characteristic (ROC) analyses between nutritional scores and CI-AKI. The ROC curves depict the predictive performance of (A) NRS-2002, (B) CONUT, (C) PNI, and (D) GNRI for CI-AKI, respectively. The maximum value of the Youden index determines the optimal cut-off point for CI-AKI and is marked with a cross in the plot. The AUC was calculated for each nutritional score. AUC indicates area under the curve; other abbreviations refer to Fig. 1
Subgroup analyses according to age. Patients were divided into groups according to age (< 70 or ≥ 70 years). Multivariable logistic regression analyses were performed. The category with the lowest nutritional score was set as the reference. P for trend was calculated by entering the median value of each category as a continuous variable in the models. Tests for interaction (nutritional categories × subgroup stratification) were performed with the likelihood ratio test. Abbreviations refer to Fig. 1. *P < 0.05
Background Nutritional risk is prevalent in various diseases, but its association with contrast-induced acute kidney injury (CI-AKI) remains unclear. This study aimed to explore this association in patients undergoing coronary angiography (CAG). Methods In this retrospective cross-sectional study, 4386 patients undergoing CAG were enrolled. Nutritional risks were estimated by nutritional risk screening 2002 (NRS-2002), controlling nutritional status (CONUT), prognostic nutritional index (PNI), and geriatric nutritional risk index (GNRI), respectively. CI-AKI was determined by the elevation of serum creatinine (Scr). Multivariable logistic regression analyses and receiver operating characteristic (ROC) analyses were conducted. Subgroup analyses were performed according to age (< 70/≥70 years), gender (male/female), percutaneous coronary intervention (with/without), and estimated glomerular filtration rate (< 60/≥60 ml/min/1.73m²). Results Overall, 787 (17.9%) patients were diagnosed with CI-AKI. The median scores of NRS-2002, CONUT, PNI, and GNRI were 1.0, 3.0, 45.8, and 98.6, respectively. Nutritional risk was associated with CI-AKI with each of the four nutritional tools: NRS-2002 ([3–7 vs. 0]: odds ratio [95% confidence interval], OR [95%CI] = 4.026 [2.732 to 5.932], P < 0.001), CONUT ([6–12 vs. 0–1]: OR [95%CI] = 2.230 [1.586 to 3.136], P < 0.001), PNI ([< 38 vs. ≥52]: OR [95%CI] = 2.349 [1.529 to 3.610], P < 0.001), and GNRI ([< 90 vs. ≥104]: OR [95%CI] = 1.822 [1.229 to 2.702], P = 0.003). Results were consistent in subgroup analyses. Furthermore, the nutritional scores were predictive of CI-AKI (area under the ROC curve: NRS-2002, 0.625; CONUT, 0.609; PNI, 0.629; and GNRI, 0.603). Conclusions Nutritional risks (high scores of NRS-2002 and CONUT; low scores of PNI and GNRI) were associated with CI-AKI in patients undergoing CAG.
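A minimal sketch of the ROC step on synthetic data: compute the curve, the AUC, and the Youden-index cut-off described in the figure legend above. Variable names and the data-generating model are assumptions for illustration.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(1)
score = rng.normal(size=1000)                       # a nutritional score (synthetic)
ci_aki = rng.binomial(1, 1 / (1 + np.exp(-score)))  # CI-AKI outcome (synthetic)

fpr, tpr, thresholds = roc_curve(ci_aki, score)
youden = tpr - fpr                                  # Youden index J = sens + spec - 1
cutoff = thresholds[np.argmax(youden)]              # optimal cut-off point
print(f"AUC = {roc_auc_score(ci_aki, score):.3f}, cut-off = {cutoff:.2f}")
```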
 
Inclusion and exclusion criteria for the analytic sample, CHNS, 2015. * Individuals may be missing more than one covariate
Associations of commonly consumed foods, uncommonly consumed foods, and dichotomous dietary variables with overall urbanization index*, CHNS 2015. A Commonly consumed foods (≥ 80% consumers): relative risk ratios for associations between quintiles of intake and a 1-standard-deviation change in overall urbanization index; the referent group was the lowest quintile, which included non-consumers†. B Uncommonly consumed foods (< 80% consumers): relative risk ratios for associations between quartiles of intake and a 1-standard-deviation change in overall urbanization index†. C Dichotomous dietary variables: odds ratios for associations between dichotomous dietary variables and a 1-unit change in overall urbanization index†. *Urbanization index is a validated multicomponent measure of urbanization in the CHNS [6]. †Unadjusted (multinomial) logistic regressions with overall urbanization index as the outcome
Odds ratios for associations between final Chinese Urbanized Diet Index and HTN, Overweight*, T2DM. *Overweight was defined as having a BMI of 24 kg/m² or greater, based on the Chinese overweight BMI cut point (Zhou, 2002). † The overall urbanization index is a validated multicomponent measure of urbanization in the CHNS [6]. ‡Models included logistic regressions with urbanized diet index as the exposure and CMDs as the outcomes
Background In recent decades, China has experienced rapid urbanization leading to a major nutrition transition, with increased refined carbohydrates, added sweeteners, edible oils, and animal-source foods, and reduced legumes, vegetables, and fruits. These changes have accompanied an increased prevalence of cardiometabolic disease (CMD). No single dietary measure summarizes the distinct food changes across regions and levels of urbanization. Methods Using a sample of adults (≥18 years) in the 2015 wave of the China Health and Nutrition Survey (CHNS; n = 14,024), we selected literature-based candidate dietary variables and tested their univariate associations with overall and within-region urbanization. Using iterative exclusion of select diet-related variables, we created six potential urbanized diet indices, which we examined relative to overall urbanization to select a final urbanized diet index based on a priori considerations, strength of association with urbanization, and minimal missingness. We tested stability of the final urbanized diet index across sociodemographic factors. To examine whether our new measure reflected health risk, we used mixed effects logistic regression models to examine associations between the final urbanized diet index and CMD risk factors (hypertension (HTN), overweight, and type 2 diabetes mellitus (T2DM)), adjusting for sociodemographics, overall urbanization, and physical activity, and including random intercepts to account for correlation at the community and household levels. Results We identified a final urbanized diet index that captured dietary information unique to consumption of an urbanized diet and performed well across regions. We found a positive association (R² = 0.17, SE 0.01) between the final urbanized diet index and overall urbanization in the fully adjusted model. The new measure was negatively associated with HTN [OR (95% CI) = 0.93 (0.88–0.99)] and positively associated with T2DM [OR = 1.13; 95% CI 1.05–1.21] in minimally adjusted models, but not in the fully adjusted models. Conclusion We derived an urbanized diet index that captured dietary urbanization distinct from overall urbanization and performed well across all regions of China. This urbanized diet index provides an alternative to measures of traditional versus urbanized diet that vary across regions due to different cultural dietary traditions. The new measure is best used in combination with diet quality, sociodemographic, and lifestyle measures to examine distinct pathways from urbanization to health in urbanizing countries.
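The paper's index-construction procedure is iterative, but its core idea can be sketched simply: standardize candidate dietary variables, combine them with signs reflecting "urbanized" versus "traditional" foods, and regress the urbanization index on the result. Everything below (food groups, weights, data) is hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
df = pd.DataFrame(rng.normal(size=(300, 4)),
                  columns=["edible_oils", "animal_foods", "legumes", "vegetables"])
signs = {"edible_oils": 1, "animal_foods": 1, "legumes": -1, "vegetables": -1}

# z-score each food group, then sum with signs to form the diet index
z = (df - df.mean()) / df.std()
df["urban_diet_index"] = sum(s * z[c] for c, s in signs.items())

# Hypothetical overall urbanization index correlated with the diet index
df["urbanization"] = 0.4 * df["urban_diet_index"] + rng.normal(size=300)
fit = sm.OLS(df["urbanization"], sm.add_constant(df["urban_diet_index"])).fit()
print(f"R-squared = {fit.rsquared:.2f}")
```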
 
Prevalence of MUO across tertiles of FRAP in the study population. A MUO based on IDF definition. B MUO based on IDF/HOMA-IR definition
Background Although several studies have evaluated the relationship between individual dietary antioxidants and metabolic health conditions, data on the association between dietary total antioxidant capacity and metabolic health among children and adolescents are limited. This study investigated the relationship between dietary total antioxidant capacity and metabolic health status in Iranian overweight/obese adolescents. Methods This cross-sectional study was conducted on 203 overweight/obese adolescents. Dietary intakes were evaluated by a validated food frequency questionnaire. Ferric Reducing-Antioxidant Power (FRAP) was used to indicate dietary total antioxidant capacity. Anthropometric parameters and blood pressure status were measured. Fasting blood samples were obtained to determine circulating insulin, glucose, and lipid profile. Two different methods (modified International Diabetes Federation (IDF) criteria, and IDF criteria along with insulin resistance) were applied to classify participants as metabolically healthy obese (MHO) or metabolically unhealthy obese (MUO). Results According to the IDF and IDF/HOMA-IR definitions, 79 (38.9%) and 67 (33.0%) adolescents, respectively, were classified as MUO. Considering IDF criteria, the highest tertile of FRAP was associated with lower odds of being MUO in the maximally adjusted model (OR: 0.40; 95%CI: 0.16–0.96), compared with the lowest tertile. However, based on the IDF/HOMA-IR criteria, no significant relation was found between FRAP and odds of MUO (OR: 0.49; 95%CI: 0.19–1.23) after adjusting for all potential confounders. Conclusions Adolescents with higher intakes of dietary antioxidants had lower odds of being MUO based on IDF criteria. However, no substantial relation was found with the HOMA-IR/IDF definition. Further prospective cohort studies are needed to confirm these findings.
 
Odds ratio of having a positive C-reactive protein across the quartiles of the dietary inflammatory index. P-trend was obtained from binary logistic regression
Relationship of dietary inflammatory index with individual factors of DAS-28 scores
Background Diet plays an important role in regulating inflammation, which is a hallmark of rheumatoid arthritis (RA). Our aim was to investigate the association between Dietary Inflammatory Index (DII) scores and RA activity. Methods This cross-sectional study was conducted on 184 patients with RA at a rheumatology clinic in Kermanshah city, Iran, in 2020. RA was diagnosed according to the criteria of the 2010 American College of Rheumatology/European League against Rheumatism. The overall inflammatory potential of the diet was extracted from a validated 168-item food frequency questionnaire (FFQ) using the DII. RA disease activity was assessed using Disease Activity Score 28 (DAS-28) scores. Logistic regression and one-way ANOVA/ANCOVA were conducted. Results Individuals in the highest DII quartile had significantly higher odds of positive C-reactive protein than those in the lowest quartile of the DII scores (OR 4.5, 95% CI 1.16–17.41, P = 0.029). A statistically significant downward linear trend in fat-free mass and weight was observed with increasing DII quartiles (P = 0.003 and P = 0.019, respectively). Patients in the highest DII quartile had higher DAS-28 scores than those in the first quartile (mean difference: 1.16, 95% CI 0.51–1.81, P < 0.001) and second quartile of the DII scores (mean difference: 1.0, 95% CI 0.34–1.65, P < 0.001). Conclusion Our results indicated that reducing inflammation through diet might be one of the therapeutic strategies to control and reduce disease activity in RA patients.
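The quartile-based odds ratios reported above come from a standard logistic model with the lowest quartile as reference; a minimal sketch on synthetic data (names and effect sizes are assumptions):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
df = pd.DataFrame({"dii": rng.normal(size=184)})          # DII scores (synthetic)
df["quartile"] = pd.qcut(df["dii"], 4, labels=["Q1", "Q2", "Q3", "Q4"])
df["crp_pos"] = rng.binomial(1, 0.2 + 0.1 * df["quartile"].cat.codes)

fit = smf.logit("crp_pos ~ C(quartile)", data=df).fit(disp=0)
print(np.exp(fit.params))      # odds ratios vs. Q1 (the reference quartile)
print(np.exp(fit.conf_int()))  # 95% confidence intervals
```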
 
Background The quality of foods eaten at breakfast can contribute to shaping overall diet quality. This study determined the regularity of breakfast consumption and breakfast quality based on the food, energy, and nutrient intakes of Filipinos. Materials and methods Data from the 2018 Expanded National Nutrition Survey (ENNS) were extracted for analysis. There were 63,655 individuals comprising 14,013 school-aged children (6–12 years old), 9,082 adolescents (13–18 years old), 32,255 adults (19–59 years old), and 8,305 elderly (60 years old and above). Two-day non-consecutive 24-h food recalls were used to measure food and nutrient intakes. Diet quality was measured using the Nutrient-Rich Food Index (NRF) 9.3. The sample was stratified by age group and NRF9.3 tertiles. Results and findings Results showed that 96–98% of Filipinos across age groups consumed breakfast. Children aged 6–12 years had the highest NRF9.3 average score (417), followed by the elderly (347), adolescents (340), and adults (330). These scores were very low in comparison with the maximum possible NRF score of 900. Essential nutrient intakes were significantly higher among those with the healthiest breakfast diet (Tertile 3) compared with those with the poorest breakfast diet (Tertile 1). However, even participants with the healthiest breakfast diet did not meet 20% of the recommendations for calcium, fiber, vitamin C, and potassium. Conclusion and recommendations This study revealed that the majority of the population are regular breakfast consumers. However, the breakfast regularly consumed by Filipinos was found to be nutritionally inadequate, and even those in Tertile 3, assumed to have a better-quality breakfast, still had nutrient inadequacies. Thus, the study suggests that Filipinos should consume a healthy breakfast that includes nutrient-dense foods such as fruits, vegetables, whole grains, fresh meat, and milk to provide at least 20–25% of daily energy and nutrient intakes.
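A hedged sketch of an NRF9.3-type score, consistent with the 900-point maximum cited above: the summed percent daily values of 9 qualifying nutrients (each capped at 100%) minus the summed percent maximum recommended values of 3 nutrients to limit. The nutrient lists and reference values below are placeholders, not the ENNS study's exact parameters.

```python
# Placeholder daily values (DV) for 9 qualifying nutrients
GOOD = {"protein": 50, "fiber": 28, "vitamin_a": 900, "vitamin_c": 90,
        "vitamin_d": 20, "calcium": 1300, "iron": 18, "potassium": 4700,
        "magnesium": 420}
# Placeholder maximum recommended values (MRV) for 3 nutrients to limit
LIMIT = {"sat_fat": 20, "added_sugar": 50, "sodium": 2300}

def nrf93(intakes: dict) -> float:
    """NRF9.3-style score; each qualifying %DV capped at 100, max score 900."""
    encourage = sum(min(intakes.get(n, 0) / dv * 100, 100) for n, dv in GOOD.items())
    limit = sum(intakes.get(n, 0) / mrv * 100 for n, mrv in LIMIT.items())
    return encourage - limit

print(nrf93({"protein": 25, "fiber": 10, "calcium": 400, "sodium": 800}))
```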
 
Fully adjusted model for the association between flesh meat group intake and odds of RA
Background Rheumatoid arthritis (RA) is a chronic, systemic inflammatory, and debilitating autoimmune illness. The objective of the present study was to evaluate the relationship between animal flesh food consumption and rheumatoid arthritis. Methods Meat consumption was assessed using a semi-quantitative food frequency questionnaire (168 items) in a case-control study of 297 subjects (100 newly diagnosed cases and 197 healthy controls). An expert rheumatologist diagnosed patients based on the 2010 American College of Rheumatology definitions. Multivariate logistic regression, adjusted for lifestyle and nutritional confounders, was used to evaluate the relationship between animal flesh food consumption and rheumatoid arthritis. Results Participants with greater consumption of fish and seafood were less likely to have RA (OR 0.52; 95% CI 0.27–0.98). Conversely, higher processed meat intake was associated with increased odds of RA (OR 3.45; 95% CI 1.78–6.68). However, no significant association was found between red meat or poultry consumption and the risk of RA in the fully adjusted model. Conclusions The present study suggests an inverse association between fish and seafood consumption and the risk of RA. On the contrary, a higher amount of processed meat intake was associated with increased odds of RA. However, further studies are warranted to confirm the veracity of our findings.
 
Schematic representation of the relationships among the main themes: A Students assessing the potential food outlet options (considering distance to outlet, opening hours, access to cookware, and time needed to prepare or procure food); B information environment (students mostly relied on their social network of friends, colleagues, and roommates, complemented by information from phone-based applications, for information on available food outlets), followed by knowledge from their own past experiences of using the foodscape (C) and exposure to outlets encountered in their daily classroom-residence journeys (D); E cravings (food environment exposure influenced individuals to form intentions to eat); F preference for quantity/satiety due to a mismatch between environmental-level and personal-level factors, leading to routine use of outlets offering more calories for the price (H); G the prevailing university food environment characteristics appeared to be a product of continuous patronage by students over the years
Background In recent decades, the food environment has seen rapid transformation globally, altering food availability and access along with how people interact with the food environment and make food-related choices. Objectives & method This explorative study aimed to identify the factors that shape the decision-making process for food outlet choices among emerging adults in a Ghanaian university food environment. The study used focus group discussions in combination with novel dyadic interviews with best-friend pairs. Verbatim transcripts were analysed thematically using NVivo 12. Results Drawing on the socio-ecological model (SEM) of behaviour, the study used testimony from 46 participants aged 18–25 (47% female), including individuals from the major ethnicities and religions in Ghana, and identified three interwoven levels of influence shaping emerging adults’ choices of food outlet. The main factors influencing food outlet choice were food prices, spatial accessibility, budget, and food quantity/satiety, with additional factors including hygiene, variety of foods, food quality, and taste preferences, as well as societal factors such as ambience and peer influence. Conclusion Multi-component approaches that combine structural-level interventions in food retailing with individual-level components may be effective at changing emerging adult consumption behaviour in sub-Saharan Africa (SSA), although this needs further study.
 
Flow diagram of study
The effect of soy isoflavone supplementation on CGRP level. *P-value from paired t-test comparing pre- and post-intervention values. #P-value from analysis of covariance in the adjusted models (adjusted for baseline value and certain dietary factors: vitamin D, thiamin, riboflavin, niacin, cobalamin, and magnesium)
Dietary intake and physical activity of the participants throughout the study
The effects of 8 weeks' soy isoflavones supplementation on migraine characteristics and clinical indices of migraine
The effects of 8 weeks' soy isoflavones on quality of life and mental status of migraine patients
Background The literature suggests a relationship between estrogen levels and migraine headache pathogenesis. However, the effect of soy isoflavones on migraine characteristics remains unclear. This study aimed to investigate the effect of soy isoflavones on migraine characteristics and calcitonin gene-related peptide (CGRP) levels in women with migraine. Methods Eighty-three participants completed a randomized double-blind controlled trial, receiving 50 mg per day soy isoflavones or placebo supplementation for 8 weeks. Migraine severity, migraine days per month, frequency and duration of attacks, mental status, quality of life, and serum CGRP levels were measured at baseline and at the end of the intervention. Bivariate comparison and intention-to-treat (ITT) analyses were used. Results Soy isoflavone intake resulted in significantly greater decreases in mean frequency (-2.36 vs -0.43, P < 0.001) and duration (-2.50 vs -0.02, P < 0.001) of migraine attacks and in CGRP level (-12.18 vs -8.62 ng/L, P = 0.002) compared with the placebo group. A significant improvement was also found in quality of life (16.76 vs 2.52, P < 0.001). However, reductions in migraine severity and changes in mental status did not reach statistical significance (P > 0.05). Conclusion Soy isoflavone supplementation may be considered as a complementary treatment for women with migraine to improve migraine characteristics and reduce the burden of disease.
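The analysis described in the table legends above pairs a within-group pre-post test with a baseline-adjusted between-group test (ANCOVA). A minimal sketch on synthetic data, with hypothetical effect sizes:

```python
import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 40
df = pd.DataFrame({"group": np.repeat(["soy", "placebo"], n // 2),
                   "freq_pre": rng.normal(6, 2, n)})     # attacks/month at baseline
df["freq_post"] = (df["freq_pre"]
                   - np.where(df["group"] == "soy", 2.4, 0.4)
                   + rng.normal(0, 1, n))

# Within-group pre-post comparison (paired t-test)
soy = df[df["group"] == "soy"]
print(stats.ttest_rel(soy["freq_pre"], soy["freq_post"]))

# Between-group comparison adjusted for baseline value (ANCOVA)
print(smf.ols("freq_post ~ group + freq_pre", data=df).fit().params)
```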
 
Warning labels as originally considered by the Mexican regulation, before it was reviewed and debated from August 2019 to January 24, 2020, when the modification was approved. A Traditional warning labels. B Numeric warning labels
A WL interpretation posters. B WL interpretation video
Example of a dummy product (breakfast cereal) with a front-of-pack area >10 cm² by group of study. A Nutrition Facts Panel (NF). B Nutrition Facts Panel with cartoon characters (NF + C). C Warning Labels (WL). D Warning Labels with cartoon character (WL + C)
Example of a dummy product (orange juice) with a front-of-pack area <10 cm² by group of study. A Nutrition Facts Panel (NF). B Nutrition Facts Panel with cartoon characters (NF + C). C Warning Labels (WL). D Warning Labels with cartoon characters (WL + C)
Percentage of children correctly choosing the healthiest option, the least healthy option and both, and time required to make these decisions, across numeric and traditional warning labels (WL)
Background: Warning labels (WL) highlight excessive amounts of critical nutrients in order to discourage consumption of unhealthful packaged food products. This study aimed to evaluate, among Mexican schoolchildren, the objective understanding of the traditional and numeric WL (the latter aimed at small products) considered by the Mexican regulation, and whether cartoon characters influenced the understanding of WL. We also tested communication strategies to facilitate the correct use of the WL. Methods: We carried out a randomized experiment in July 2019 in public elementary schools in Morelos, Mexico. Participants aged 6-13 years were randomly assigned to one of four groups: 1) Nutrient Facts Panel (NF) (n = 120), 2) Nutrient Facts Panel with cartoon characters (NF + C) (n = 83), considered the control groups, 3) Warning Labels (WL) (n = 109), and 4) Warning Labels with cartoon characters (WL + C) (n = 96). After allocation, children assigned to either WL group (WL or WL + C) were randomly assigned to watch two posters shown simultaneously or a video explaining how to correctly interpret WL. Logistic regression models adjusted for sex, age, and cluster (school) were fitted. Results: The percentage of children correctly choosing the healthiest or the unhealthiest option was higher in the WL groups (56.8, 95% CI: 40.8-72.8) than in the NF groups (24.3, 95% CI: 20.4-28.3, p < 0.05). Understanding of the traditional WL was higher (28.7, 95% CI: 22.8-35.4) than of the numeric WL (19.0, 95% CI: 14.2-25.0, p < 0.05), but correct answers for identifying healthy and unhealthy products were higher for numeric WL than for the NF groups. Cartoon characters reduced the percentage of correct answers for choosing the unhealthiest products (WL + C: 48.9, 95% CI: 25.6-72.4 vs WL: 58.7, 95% CI: 36.4-81.1, p < 0.05). The video was 2.23 times more helpful than the posters for the correct interpretation of the WL (p < 0.05). Conclusions: Among Mexican schoolchildren, traditional and numeric WL were useful for identifying healthier and unhealthier packaged products in comparison with NF, suggesting that both WL formats may effectively communicate the excessive content of nutrients of concern among children. Cartoon characters may reduce the objective understanding of the WL, underscoring the need to regulate advertising directed to children along with the implementation of front-of-pack labeling.
 
Flow chart of inclusion of participants in the study
Adjusted odds ratios and 95% confidence intervals (CI) for delayed development with low vs. high adherence to the New Nordic Diet (NND). Delayed development is defined by scoring 2 standard deviations below the mean on short forms of the Ages and Stages Questionnaire (ASQ) and the Child Development Inventory (CDI; 5 years, motor development)
Adjusted odds ratios and 95% confidence intervals (CI) for delayed development with medium vs. high adherence to the New Nordic Diet (NND). Delayed development is defined by scoring 2 standard deviations below the mean on short forms of the Ages and Stages Questionnaire (ASQ) and the Child Development Inventory (CDI; 5 years, motor development)
Background The rapid neurodevelopment that occurs during the first years of life hinges on adequate nutrition throughout fetal life and early childhood. Therefore, adhering to a dietary pattern based on healthy foods during pregnancy and the first years of life may be beneficial for future development. The aim of this paper was to investigate the relationship between adherence to a healthy and potentially sustainable Nordic diet during pregnancy and in early childhood and child development. Methods This study is based on the Norwegian Mother, Father and Child Cohort Study (MoBa) and uses data from the Medical Birth Registry of Norway (MBRN). In 83,800 mother-child pairs, maternal pregnancy diet and child diet at 6 months, 18 months and 3 years were scored according to adherence to the New Nordic Diet (NND). NND scores were calculated both as a total score and categorized into low, medium, or high adherence. Child communication and motor development skills were reported by parents at 6 months, 18 months, 3 and 5 years, using short forms of the Ages and Stages Questionnaire and the Child Development Inventory. Associations of NND adherence with child development were estimated with linear and logistic regression in crude and adjusted models. Results When examining the NND and child developmental scores as percentages of the total scores, we found positive associations between the NND scores (both maternal pregnancy diet and child diet) and higher scoring on child development (adjusted $\hat{\beta}$s [95% confidence intervals] ranging from 0.007 [0.004, 0.009] to 0.045 [0.040, 0.050]). We further found that low and medium adherence to the NND were associated with higher odds of later-emerging developmental skills compared with high NND adherence at nearly all measured timepoints (significant odds ratios [95% CI] ranging from 1.15 [1.03, 1.29] to 1.79 [1.55, 2.06] in adjusted analyses). Conclusions Our findings support that adherence to a healthy and potentially sustainable diet early in life is important for child development at every step from pregnancy until age 5 years.
 
Background: Food and nutrition literacy is a key factor in shaping healthy dietary behaviors and may help decrease the prevalence of overweight. Empirical research on food and nutrition literacy and its outcomes is limited, especially among children and adolescents. Thus, this study investigated the link between Food and Nutrition Literacy (FNLIT) and eating behaviors, academic performance, and overweight in 10-12-year-old students in Tehran, Iran. Methods: This study was performed in two phases: 1) proposing a conceptual model of the relationship between FNLIT and its determinants and outcomes, based on existing evidence and previous models, and 2) testing the proposed FNLIT model through a cross-sectional study of 803 primary school students (419 boys and 384 girls, from 34 public and 10 private primary schools), aged 10-12 years, using structural equation modeling. Demographic, socio-economic, and household food security characteristics were collected by interviewing the students and their mothers/caregivers using a questionnaire. FNLIT was measured by a self-administered, locally designed, and validated questionnaire. Results: The fit indices suggested a reasonably adequate fit of the data to the hypothesized model (χ²/df = 2.03, p < 0.001, goodness of fit index (GFI) = 0.90, adjusted goodness of fit index (AGFI) = 0.88, comparative fit index (CFI) = 0.91, incremental fit index (IFI) = 0.91, root mean square error of approximation (RMSEA) = 0.04, standardized root mean residual (SRMR) = 0.06). SES was directly and positively related to FNLIT and its subscales. FNLIT score had a positive direct (non-mediated) relationship with healthy eating behavior and academic performance; this pattern was strongly reversed for unhealthy eating behavior. The relationship between FNLIT and overweight/obesity was fully mediated by healthy eating behaviors. SES predicted academic performance partially through the mediating effect of Food Label Literacy (FLL). Despite the direct relationship between SES and academic performance, an indirect negative relationship existed through food insecurity. The findings also revealed a fully mediating role of Food Choice Literacy (FCL) in the relationship between demographic factors and healthy eating behaviors. Interactive Food and Nutrition Literacy (IFNL) protected against unhealthy eating behaviors, and FCL predicted healthy eating behaviors in children. Conclusion: Our study draws attention to FNLIT, especially the skills domain (IFNL, FCL, and FLL), as an important determinant of healthy eating behavior, academic performance, and weight status in school-age children, with the potential to reduce social inequalities in children's development. To ensure an adequate level of FNLIT, educators should assess and plan to enhance food literacy skills in children and adolescents.
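As a plausibility check, the reported RMSEA can be reproduced from the reported χ²/df ratio and the sample size using the standard single-group formula RMSEA = sqrt(max(χ² − df, 0) / (df · (N − 1))), which simplifies to sqrt((χ²/df − 1)/(N − 1)):

```python
import math

chi2_over_df = 2.03   # reported chi-square / degrees of freedom
n = 803               # reported sample size

rmsea = math.sqrt(max(chi2_over_df - 1, 0) / (n - 1))
print(f"RMSEA = {rmsea:.3f}")   # ~0.036, consistent with the reported 0.04
```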
 
Flow chart of the sampling of the study participants
Background: Studies have shown a direct association between diet and blood uric acid concentrations. However, research on the association of dietary patterns with blood uric acid concentrations and hyperuricemia remains limited. Objective: This study aims to evaluate the association of dietary patterns with blood uric acid concentrations and hyperuricemia. Methods: The relationship between dietary patterns and hyperuricemia was explored through a nutritional epidemiological survey in China (n = 4855). Three statistical methods, principal component analysis (PCA), reduced rank regression (RRR), and partial least squares regression, were used to extract dietary patterns. General linear regression and logistic regression analyses were used to explore the relationship of dietary patterns with blood uric acid concentrations and hyperuricemia. Results: After adjusting for potential confounding factors, the score for the plant-based dietary pattern was negatively correlated with blood uric acid levels (β = -3.225), and the score for the animal dietary pattern was positively correlated with blood uric acid levels (β = 3.645). Participants in the highest quartile of plant-based dietary pattern scores were at lower risk of hyperuricemia (OR = 0.699; 95% CI: 0.561-0.870, P < 0.05), whereas those in the highest quartile of animal dietary pattern scores were at higher risk of hyperuricemia (OR = 1.401; 95% CI: 1.129-1.739, P < 0.05). Participants in the third quartile of scores for the RRR dietary pattern, characterized by relatively high intake of poultry, sugary beverages, and animal organs and low intake of desserts and snacks, had a significantly higher risk of hyperuricemia than those in the first quartile (OR = 1.421; 95% CI: 1.146-1.763, P < 0.05). Conclusions: Our research indicated that the plant-based dietary pattern extracted by PCA was negatively associated with blood uric acid concentrations, while the animal-based dietary pattern was positively associated with blood uric acid concentrations. The RRR dietary pattern may have the potential to induce elevations in blood uric acid concentrations.
 
Intake of macronutrients and nutrients to limit in the M20-HEP and M25-HEP models. The solid black line represents the nutrient concentration per 2,000 kcal provided by the Healthy U.S.-Style Eating Pattern (HEP) and the dashed black lines represent the Acceptable Macronutrient Distribution Ranges (AMDR) for macronutrient intake by adults as established by the IOM [2], the Chronic Disease Risk Reduction (CDRR) level for sodium [25], and the cholesterol limit used in food pattern modeling exercises to support the 2020–2025 DGA [22]. Blue bars (1) are patterns with the same proportion of processed meat/poultry as the HEP (34% of total meat and 12% of total poultry as processed); red bars (2) are patterns with the current level of processed meat (4 ounce-eq per week) and processed poultry (1.5 ounce-eq per week); green bars (3) are patterns with a reduced level of processed meat (2 ounce-eq per week) and processed poultry (1 ounce-eq per week); and purple bars (4) are patterns with no processed meat/poultry
Background Dietary patterns developed by the USDA provide modest levels of protein (14–18% energy) within the Acceptable Macronutrient Distribution Range (AMDR) of 10–35% for adults, though diets providing a higher percentage of energy from protein may be beneficial for some individuals. The purpose of this study was to determine if it is feasible to modify the Healthy U.S.-Style Eating Pattern (“HEP”) to provide a higher percentage of energy from protein. Methods Using the framework implemented by the USDA in developing the HEP, energy from protein was set at 20%, 25%, and 30%. Amounts of protein foods were proportionally increased while amounts of other foods were adjusted iteratively within specified parameters. The models also disaggregated total meat/poultry into fresh and processed forms to develop patterns maintaining current proportions, current levels, reduced levels, or no processed meat/poultry. Nutrient intakes were compared with nutrient goals for representative U.S. populations with 2,000 kcal needs (females 19–30 years, males 51–70 years), with 90% of the Recommended Dietary Allowance or Adequate Intake regarded as sufficient. Results Dietary patterns with 20% energy from protein were constructed with minor deviations from the current 2,000 kcal HEP. Dietary patterns with 25% energy from protein were constructed for all levels of processed meat/poultry except the current-proportion model, though relative to the current HEP the constructed patterns reflect substantial reductions in refined grains and starchy vegetables and substantial increases in protein foods consumed as beans and peas, seafood, and soy products. It was not possible to develop a pattern with 30% energy from protein without reducing the percentage of energy from carbohydrate below the AMDR or violating other modeling constraints. Stepwise reductions in processed meat/poultry reduced sodium intake. Conclusions It is feasible to develop dietary patterns in a 2,000 kcal diet that mirror the HEP and meet recommended nutrient intakes with 20% or 25% energy from protein, though the pattern with 25% energy from protein may be more idealistic than realistic. Reduced levels of processed meat/poultry may translate to lower sodium intake.
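The pattern-modeling exercise is essentially a constrained optimization: choose food-group amounts that hit an energy total and a protein-energy target while respecting bounds. A toy linear-programming version with three food groups and hypothetical protein densities (not the USDA framework's actual food groups or constraints):

```python
import numpy as np
from scipy.optimize import linprog

# Fraction of each group's energy that comes from protein (hypothetical)
# Columns: [grains, protein_foods, vegetables], amounts in kcal
protein_frac = np.array([0.12, 0.60, 0.20])

# Equality constraints: total energy = 2,000 kcal; protein energy = 25% = 500 kcal
A_eq = np.vstack([np.ones(3), protein_frac])
b_eq = np.array([2000.0, 500.0])

# Stand-in linear objective (minimize grains); the bounds play the role of
# keeping each group within plausible pattern amounts.
res = linprog(c=[1, 0, 0], A_eq=A_eq, b_eq=b_eq,
              bounds=[(300, 1200), (200, 900), (100, 800)])
print(res.x if res.success else res.message)   # kcal from each group
```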
 
Design of the reproducibility and validity study of FFQ among older Lebanese. 24HDR: 24-hour dietary recall, FFQ: food frequency questionnaire
Agreement between average nutrient intakes measured by the FFQ and the mean of the 24HDRs. Agreement is measured by Bland–Altman plots for (A) caloric intake (kcal/day), (B) protein (g/day), (C) fat (g/day), (D) calcium (mg/day), (E) omega-3 fatty acids (g/day), and (F) vitamin B6 (mg/day). Solid lines represent the mean difference (FFQ − mean 24HDR) and dashed lines represent the lower and upper 95% limits of agreement (mean ± 2 SD)
Background A food frequency questionnaire (FFQ) is an easy and inexpensive tool that can be used to evaluate nutrient and dietary trends of groups and individuals. Few studies in the East Mediterranean region have tailored FFQs to describe the dietary intakes of older adults. The purpose of the study is therefore to assess the validity and reproducibility of an FFQ designed for use with older adults living in a Mediterranean Arabic-speaking country, Lebanon. Methods The FFQ is composed of a list of 90 food items commonly consumed by adults above 60 years of age. Validity of the FFQ was tested against the mean of two 24-hour dietary recalls (24HDR), and reproducibility by repeating the questionnaire within a one-month period, alongside the second dietary recall. Our study included 42 and 76 participants for the reproducibility and validity analyses, respectively. Subjects were randomly selected from 2 of the 8 governorates in the country. Results FFQ reproducibility showed a mean relative difference of 1.03% without any significant difference between all paired components of nutrients. Intraclass correlation (ICC) showed good to excellent reliability for caloric intake and all macronutrients, and moderate to good reliability for all remaining nutrients, except for poly-unsaturated fatty acids, vitamins A and B12, and fiber. Correlation coefficients for all nutrients were fair to strong. Both administrations of the FFQ showed good internal validity. Validation of the FFQ showed a mean relative difference between the FFQ and the mean 24HDR of 19.5%. Agreement between the 2 methods for classifying individuals in the same or adjacent quartile was 80% for nutrient intake and 78.2% for nutrient adequacy. The mean kappa coefficient was 0.56, and energy-adjusted correlations were within the recommended values for all items except vitamins A and B12. Adjusting for nutrient-dense food intake improved the agreement for these 2 vitamins to 0.49 and 0.56, respectively. Conclusion The proposed FFQ can be considered a valid tool to help describe the nutrient intake of older individuals in an Arabic-speaking Mediterranean country. It could serve for possible use in the East Mediterranean region for the evaluation of regular dietary intake of community-dwelling older adults.
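The Bland-Altman agreement analysis used for validation reduces to plotting paired differences against paired means with the bias and limits of agreement overlaid. A minimal sketch with synthetic caloric intakes (all values hypothetical):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(5)
ffq = rng.normal(1800, 400, 76)            # kcal/day from the FFQ (synthetic)
recall = ffq - rng.normal(300, 250, 76)    # mean of two 24HDRs (synthetic)

diff = ffq - recall
mean = (ffq + recall) / 2
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)              # ~95% limits of agreement

plt.scatter(mean, diff, s=10)
plt.axhline(bias)                          # mean difference (bias)
plt.axhline(bias - loa, linestyle="--")    # lower limit of agreement
plt.axhline(bias + loa, linestyle="--")    # upper limit of agreement
plt.xlabel("Mean of FFQ and 24HDR (kcal/day)")
plt.ylabel("FFQ - 24HDR (kcal/day)")
plt.show()
```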
 
The structure of the neural network
The relationship between serum pyridoxal 5′-phosphate concentration and predicted pyridoxal 5′-phosphate value based on deep learning algorithm (DLA)
Background Multivariable linear regression (MLR) models were previously used to predict serum pyridoxal 5′-phosphate (PLP) concentration, the active coenzyme form of vitamin B6, but with low predictability. We developed a deep learning algorithm (DLA) to predict serum PLP based on dietary intake, dietary supplements, and other potential predictors. Methods This cross-sectional analysis included 3778 participants aged ≥20 years in the National Health and Nutrition Examination Survey (NHANES) 2007-2010 with complete information on the studied variables. Dietary intake and supplement use were assessed with two 24-hour dietary recalls. We included potential predictors for serum PLP concentration in the models, including dietary intake and supplement use, sociodemographic variables (age, sex, race-ethnicity, income, and education), lifestyle variables (smoking status and physical activity level), body mass index, medication use, blood pressure, blood lipids, glucose, and C-reactive protein. We used a 4-hidden-layer deep neural network to predict PLP concentration, with 3401 (90%) participants for training and 377 (10%) for testing, selected by random sampling. We obtained outputs by forward propagation of the training-set features, constructed a loss function based on the distances between outputs and labels, and optimized it to find parameters that fit the training set. We also developed a prediction model using MLR. Results After training for 10⁵ steps with the Adam optimization method, the highest R² in the test dataset was 0.47 for the DLA and 0.18 for the MLR model. Similar results were observed in sensitivity analyses that excluded supplement users or included only variables identified by stepwise regression models. Conclusions The DLA achieved superior performance in predicting serum PLP concentration relative to the traditional MLR model, using a nationally representative sample. As a preliminary analysis, the current study sheds light on the use of DLA to understand a modifiable lifestyle factor.
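A hedged sketch of a 4-hidden-layer regression network of the kind described, written in PyTorch; layer widths, activations, learning rate, and the synthetic data are all assumptions, not the authors' architecture.

```python
import torch
from torch import nn

n_features = 30                           # number of predictors (assumed)
model = nn.Sequential(
    nn.Linear(n_features, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 32), nn.ReLU(),
    nn.Linear(32, 16), nn.ReLU(),
    nn.Linear(16, 1),                     # predicted serum PLP concentration
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

X = torch.randn(3401, n_features)         # training features (synthetic)
y = torch.randn(3401, 1)                  # serum PLP labels (synthetic)

for step in range(1000):                  # the paper reports 1e5 Adam steps
    opt.zero_grad()
    loss = loss_fn(model(X), y)           # forward propagation + loss
    loss.backward()                       # backpropagation
    opt.step()
```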
 
CONSORT diagram
Mean systolic blood pressure (A) and diastolic blood pressure (B) at baseline and after 10 years in adults consuming 0 servings of soft drinks at baseline who then maintained their intake or increased it by 1 or 2 servings. Fixed-effects models were used to predict mean blood pressure, adjusting for baseline age (centered at the mean), sex, body mass index (centered at the mean), physical activity, smoking status, alcohol intake, education, and energy intake
Background A few prospective studies have investigated the potential association of soft drink and non-caloric soft drink intake with high blood pressure using methods that adequately consider changes in intake over time and hypertensive status at baseline. Objective To prospectively examine the association of soft drink and non-caloric soft drink intake with systolic and diastolic blood pressure in a sample of Mexican adults, overall and by hypertension status. Methods We used data from the Health Workers Cohort Study spanning from 2004 to 2018 (n = 1,324 adults). Soft drink and non-caloric soft drink intake were assessed with a semiquantitative food frequency questionnaire. We fit multivariable-adjusted fixed-effects models to test the association of soft drink and non-caloric soft drink intake with systolic and diastolic blood pressure. The models were adjusted for potential confounders and considered the potential modifying effect of hypertension status at baseline. Results A one-serving increase in soft drink intake was associated with a 2.08 mm Hg (95% CI: 0.21, 3.94) increase in systolic blood pressure and a 2.09 mm Hg (95% CI: 0.81, 3.36) increase in diastolic blood pressure over ten years. A stronger association between soft drink intake and diastolic pressure was observed among participants with versus without hypertension at baseline. We found no association between non-caloric soft drink intake and blood pressure. Conclusions Our findings support the hypothesis that soft drink intake increases blood pressure. While further studies should be conducted to confirm our findings, food policies and recommendations to limit soft drink intake are likely to help reduce blood pressure at the population level. The absence of an association between non-caloric soft drink intake and blood pressure may reflect the low consumption of these beverages in this cohort; more studies are needed to understand their potential effect on blood pressure.
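Fixed-effects (within-person) models identify the coefficient only from changes over time, removing stable between-person confounding. A minimal sketch using the within transformation on synthetic two-wave data (variable names and effect sizes are assumptions):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n, t = 200, 2
df = pd.DataFrame({"person": np.repeat(np.arange(n), t),
                   "softdrinks": rng.integers(0, 3, n * t).astype(float)})
df["sbp"] = (115 + 2.0 * df["softdrinks"]
             + np.repeat(rng.normal(0, 8, n), t)   # stable person effect
             + rng.normal(0, 5, n * t))

# Within transformation: demean each variable within person so only
# changes over time identify the coefficient
for col in ("softdrinks", "sbp"):
    df[col + "_w"] = df[col] - df.groupby("person")[col].transform("mean")

print(smf.ols("sbp_w ~ softdrinks_w - 1", data=df).fit().params)
```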
 
Background Intermittent fasting (IF), consisting of either one day (IF1) or two consecutive days (IF2) of fasting per week, is commonly used for optimal body weight loss. Our laboratory has previously shown that an IF1 diet combined with 6 d/week of protein pacing (P; 4–5 meals/day evenly spaced, ~30% protein/day) significantly enhances weight loss, body composition, and cardiometabolic health in obese men and women. Whether an IF1-P or IF2-P diet, matched for weekly energy intake (EI) and expenditure (EE), is superior for weight loss, body composition, and cardiometabolic health is unknown. Methods This randomized controlled study directly compared an IF1-P (n = 10) versus an IF2-P (n = 10) diet on weight loss and body composition, cardiovascular (blood pressure and lipids), hormone, and hunger responses in 20 overweight men and women during a 4-week weight loss period. Participants received weekly dietary counseling and monitoring of compliance from a registered dietitian. All outcome variables were assessed pre (week 0) and post (week 5). Results Both groups significantly reduced body weight, waist circumference, percent body fat, fat mass, hunger, blood pressure, lipids, and glucose, and increased percent fat-free mass (p < 0.05). However, IF2-P resulted in significantly greater reductions in body weight (-29%) and waist circumference (-38%) compared with IF1-P (p < 0.05) and showed a strong tendency toward greater reductions in fat mass, glucose, and hunger levels (p < 0.10), despite similar weekly total EI (IF1-P, 9058 ± 692 vs. IF2-P, 8389 ± 438 kcals/week; p = 0.90), EE (~300 kcals/day; p = 0.79), and hormone responses (p > 0.10). Conclusions These findings support short-term IF1-P and IF2-P to optimize weight loss and improve body composition, cardiometabolic health, and hunger management, with IF2-P providing enhanced benefits in overweight women and men. Trial registration This trial was registered March 03, 2020 at www.clinicaltrials.gov as NCT04327141 .
 
We read with great enthusiasm the study by Motamedi et al. published in Nutrition Journal. This cross-sectional study used data from the Fasa Cohort Study, a branch of the Prospective Epidemiological Research Studies in Iran (PERSIAN) cohort. The authors presented associations of diet quality, assessed by the Mediterranean dietary score, dietary diversity score, healthy eating index-2015, and diet quality index-international, with the risk of hypertension in an Iranian population. However, we have some concerns regarding the methodology and interpretation of the study.
 
Study design flow chart for 37 Chinese young adults
Regression of daily iodine excretion on iodine intake. Each point represents one day of data for one subject. A Males: 139 samples from 14 young men; B Females: 220 samples from 23 young women; C All subjects: 359 samples from 37 young adults
Regression of the increment ratio on iodine intake. Ratio = Δexcretion (a)/Δintake (b), where (a) Δexcretion is the iodine excretion increment and (b) Δintake is the iodine intake increment. A Males: 91 samples from 14 young men; B Females: 144 samples from 23 young women; C All subjects: 235 samples from 37 young adults
Background: Appropriate iodine intake for adults is essential to reduce the prevalence of thyroid diseases, but there are few research data on the iodine requirement of the Chinese population. This study aimed to explore the iodine requirement of young adults to maintain a healthy status based on the 'overflow theory'. Methods: An iodine-balance experiment was performed. We conducted an 18-day study consisting of a 6-day acclimation period and 3 consecutive experimental stages in 37 Chinese healthy young adults (23 female and 14 male). Each stage lasted 4 days. Strictly controlled low-iodine diets were provided in the first period; an egg or 125 mL of milk was added in the second and third periods, respectively. Dietary samples, 24-h urine specimens, and faeces were collected daily to assess iodine intake and excretion. Results: Mean values of iodine intake (22.7±3.6, 35.1±3.7, and 52.2±3.8 μg/d), excretion (64.7±13.9, 62.3±12.6, and 94.3±14.5 μg/d), and iodine balance (-35.2±19.5, -21.0±19.8, and -33.5±26.9 μg/d) differed significantly among the three periods for males (P<0.001 for all); mean values of iodine intake (16.6±3.1, 29.7±2.7, and 48.0±2.7 μg/d) and excretion (47.0±9.9, 55.5±8.1, and 75.7±12.4 μg/d) differed significantly among the three periods for females (P<0.001 for all). No significant difference in iodine balance was observed among the 3 periods for females (-30.5±9.3, -25.9±7.3, and -27.6±12.1 μg/d). The linear regression of iodine excretion on iodine intake was Y = 0.979X + 37.04 (males) and Y = 0.895X + 31.48 (females). Compared with stage 2, the iodine excretion increment in stage 3 exceeded the iodine intake increment for men; the ratio of increments was 1.675 when the average iodine intake was 52.2 μg/d in stage 3. When the iodine excretion increment equaled the iodine intake increment, the daily iodine intake of men was 47.0 μg. Conclusion: We evaluated the iodine requirement of young adults in southern China based on the overflow theory. Our results indicate that the lower limit of the iodine requirement for Chinese young men is 47.0 μg/d. The trial was registered at www.chictr.org.cn as ChiCTR1800014877.
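The regression step reported above (Y = 0.979X + 37.04 for men) is an ordinary least-squares fit of daily excretion on daily intake; a minimal sketch with synthetic points generated around the reported line:

```python
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(7)
intake = np.array([20.0, 25.0, 33.0, 38.0, 50.0, 55.0])   # µg/d (synthetic)
excretion = 0.979 * intake + 37.04 + rng.normal(0, 3, intake.size)

fit = linregress(intake, excretion)
print(f"Y = {fit.slope:.3f}X + {fit.intercept:.2f}, r = {fit.rvalue:.2f}")
```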
 
Percent of energy from major food groups, as consumed, in 0–48 months Lebanese children. Legend: ¹ Includes any milk (breast milk, infant formula, cow’s milk, and goat’s milk) as well as dairy foods, cheeses, and yogurt. ² Includes bread, rolls, pita, saj, baby food cereals/grains, baby food finger food, cereals, crackers, pretzels, kaak, pancakes, French toast, pasta, rice, and other grains. ³ Includes all yogurt, grain, and meat based mixed dishes such as sandwiches, macaroni and cheese, spaghetti and lasagna, sandwiches, beans and rice, pizzas, Mahashi, and soups. ⁴ Includes any baby food and non-baby food meats, dried beans, peas and legumes, eggs, peanut butter, nuts and seeds. ⁵Includes baby food vegetables, canned, cooked and raw vegetables, white potatoes, and 100% vegetable juice. ⁶ Includes baby food fruits, canned, dried, and raw fruits, 100% baby food juices, and other 100% fruit juices. ⁷ Includes popcorn, potato chips, and corn chips. ⁸ Includes baby food desserts and cookies, non-baby food dessert items (cakes, pies, cookies, bars, brownies, biscuits, pastries, muffins, and traditional desserts), ice cream and dairy desserts, puddings, candy, cereal and nutrition bars, gelatins, ices, and sorbets, sugars, syrups, preserves, and jelly, fruit drinks and other sugar sweetened beverages. ⁹Includes butter, margarine, animal fats, dressings, oils, and olives. ¹⁰Includes condiments, herbs, seasonings, gravies, and sauces
Adherence to dietary recommendations pertinent to food group intake#, in children aged above 1 year. Legend: Adherence assessment was based on the recommended servings for the various food groups by age and gender based on the AHA/AAP Dietary Recommendations for Children [24, 25].# Recipes of composite foods were disaggregated prior to the assessment of adherence to dietary recommendations. * Indicates significant difference between the age groups in the proportion of children adhering to food group recommendations. Abbreviations: AHA/AAP: American Heart Association/American Academy of Pediatrics
Background This is the first study on dietary intakes of infants and young children in the Eastern Mediterranean Region, a region currently witnessing the nutrition transition. It aims to characterize food consumption patterns among 0–4-year-old children in Lebanon, evaluate their macro- and micronutrient intakes, and assess adherence to dietary recommendations. Methods Based on a national cross-sectional survey in 2012 (n = 866), the study collected data on sociodemographic and anthropometric characteristics, and one 24-hour dietary recall was administered. Nutrient intakes were compared with reference values: Estimated Average Requirement (EAR), Adequate Intake (AI), and Acceptable Macronutrient Distribution Range (AMDR). Results Milk was the highest contributor to energy intake (EI) in infants (95.8 and 56.5% in 0–5.9-month and 6–11.9-month-old infants, respectively), while its intake was lower among toddlers and preschoolers (35.4 and 15.1%, respectively). In contrast, intakes of sweets and sweetened beverages were highest in preschoolers, contributing 18.5% of EI. Compared with dietary guidelines, the lowest dietary adherence was found for vegetables (17.8–20.7%) and fruits (14.4–34.3%). Protein intake was within the recommendations for the vast majority of children. Although total fat intake was lower in toddlers and preschoolers than in infants, more than 40% of toddlers and preschoolers exceeded the AMDR for fat, and 87.3% of preschoolers exceeded the upper limit for saturated fat. Only 3.6% of toddlers and 11.5% of preschoolers exceeded the AI level for dietary fiber. Micronutrient intake assessment showed that mean intakes in infants exceeded the AI for all micronutrients except vitamin D and magnesium. In toddlers, vitamin D and calcium intakes were below the EAR in 84.7 and 44.6%, respectively. In preschoolers, most children (91.9%) had inadequate intakes of vitamin D, and a third had inadequate intakes of folate, calcium, and vitamin A. Conclusions This study identified priority issues for nutrition intervention in infants and young children in Lebanon. Concerted multi-stakeholder efforts are needed to instill healthier food consumption and nutrient intake patterns early in life.
 
Showing the metabolism of caffeine into its various metabolites. Theophylline, Paraxanthine, and Theobromine represent the major metabolites of caffeine, denoted by the bold arrows (4%, 84%, and 12%, respectively)
Showing the demographic data for the total cohort
Showing the laboratory and demographic data by quartile of caffeine concentration. Comparisons of laboratory and demographic data between quartile levels were performed using one-way ANOVA for continuous variables, and chi-square for categorical variables. Statistical significance was set at p < 0.05
Showing the weighted, multivariable regression coefficients for caffeine and each metabolite. Statistical significance was set at p < 0.05
Background Caffeine is one of the most commonly used psychoactive drugs in the world, and provides many health benefits, including alertness, improved memory, and reduced inflammation. Despite these benefits, caffeine has been implicated in a number of adverse health outcomes, possibly due to effects within the endocrine system, effects that may contribute to impaired reproductive function and low testosterone in men. Previous studies have investigated associations between caffeine consumption and testosterone levels in men, although these studies are limited in quantity and generalizability, and their results are conflicting and inconclusive. Methods Using data from a cross-sectional study of 372 adult men in the 2013–2014 NHANES survey cycle, the researchers set out to characterize the association between serum testosterone levels, caffeine, and 14 caffeine metabolites. Results Multivariable, weighted linear regression revealed a significant inverse association between caffeine and testosterone. Multivariable linear regression revealed significant inverse associations between 6 xanthine metabolic products of caffeine and testosterone. Inverse associations were observed between 5-methyluric acid products and testosterone, as well as between 5-acetylamino-6-amino-3-methyluracil and testosterone. A significant positive association was observed for 7-methylxanthine, 3,7-dimethyluric acid, and 7-methyluric acid. Logistic regression models to characterize the association between 2 biologically active metabolites of caffeine (theobromine and theophylline) and odds of low testosterone (< 300 ng/dL) were non-significant. Conclusions These findings suggest a potential role for caffeine in the etiology of low testosterone and biochemical androgen deficiency. Future studies are warranted to corroborate these findings and elucidate the biological mechanisms underlying this association.
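The "weighted" in the Results refers to survey-weighted regression. A minimal sketch on synthetic data follows; real NHANES analyses also incorporate strata and PSU design variables, which are omitted here, and all variable names are placeholders.

```python
# Survey-weighted linear regression of testosterone on caffeine (toy data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 372
caffeine = rng.gamma(2.0, 60.0, n)                          # mg/day, synthetic
testosterone = 500 - 0.2 * caffeine + rng.normal(0, 80, n)  # ng/dL, toy relation
weights = rng.uniform(0.5, 2.0, n)                          # stand-in survey weights

X = sm.add_constant(caffeine)
fit = sm.WLS(testosterone, X, weights=weights).fit()
print(fit.params[1], fit.conf_int()[1])  # slope for caffeine and its 95% CI
```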
 
Abbreviations 25OHD: 25-Hydroxyvitamin D; T1: Early pregnancy; T3: Late pregnancy; OR: Odds ratio; BMI: Body mass index; LC-MS/MS: Liquid chromatography tandem-mass spectrometry; SD: Standard deviation; CI: Confidence interval; ref: Reference category.
Background The relationship between maternal vitamin D status in pregnancy and the development of atopic diseases in the offspring has been frequently studied, but with contradictory results. Some previous studies have found an inverse relation between maternal vitamin D in pregnancy and the risk of atopic diseases in the child. In contrast, others have found higher maternal 25OHD to be related to a higher risk of atopic diseases. Thus, the aim was to investigate the associations of maternal vitamin D status and intake in pregnancy with asthma, eczema and food allergies in the children up to 5 years of age. In addition, effect modification by reported atopic heredity was studied. Methods Participants in the GraviD study had 25-hydroxyvitamin D (25OHD) analyzed in serum in early (T1) and late (T3) pregnancy. Maternal dietary vitamin D intake was estimated from a short food frequency questionnaire, and supplement use from questionnaires. At 5 years of age the child's history of asthma, eczema and food allergy, including atopic heredity, was reported by questionnaire. Multivariable logistic regression was used. Results The cumulative incidence of asthma was 13%, eczema 22%, and food allergy 18%. Only among children without reported atopic heredity, maternal 25OHD of 50–75 nmol/L in T1 was associated with lower odds of asthma (OR 0.271, 95% CI 0.127–0.580), compared to maternal 25OHD > 75 nmol/L. Additionally in these children, maternal 25OHD in T3 (continuous) was associated with asthma (OR 1.014, 95% CI 1.002–1.009), and dietary vitamin D intake with eczema (OR 1.141, 95% CI 1.011–1.288). Conclusions Among children without reported atopic heredity, higher maternal vitamin D status and intake during pregnancy were associated with increased risk of reported atopic disease.
 
Flow diagram of participant selection
Cumulative probability based on alcohol consumption. The cumulative probability of incidence of proteinuria (≥ 1+) in males (A) and females (B) and of low eGFR (eGFR < 60 mL/min/1.73 m² and a 25% decrease) in males (C) and females (D)
Background The difference in the clinical impact of alcohol consumption on kidney function based on sex remains to be elucidated. This study aimed to assess the association between the dose of alcohol consumption and the incidence of proteinuria and chronic kidney disease, stratified by sex. Methods This retrospective cohort study included 26,788 workers (19,702 men and 7086 women) with normal renal function (estimated glomerular filtration rate ≥ 60 mL/min/1.73 m²) at annual health examinations between January 2010 and March 2015 in Japan. The main exposure was alcohol consumption. The primary outcomes were the incidence of proteinuria (dipstick urinary protein ≥ 1+) and the incidence of low estimated glomerular filtration rate (eGFR; < 60 mL/min per 1.73 m² and a 25% decrease from the baseline eGFR). Results During a median observational period of 4 years (interquartile range: 2–6), 1993 (10.1%) men and 462 (6.5%) women developed proteinuria, whereas 667 (3.4%) men and 255 (3.6%) women developed low eGFR. After adjustment for clinically relevant factors using a Cox proportional hazards model, alcohol consumption of ≥ 46 g/day in women was significantly associated with the incidence of proteinuria (hazard ratio, 1.57; 95% confidence interval, 1.10–2.26) and low eGFR (hazard ratio, 1.62; 95% confidence interval, 1.04–2.53). However, no significant association between alcohol consumption and the primary outcomes was observed in men. Conclusions Daily higher alcohol consumption was significantly associated with a higher incidence of proteinuria and low eGFR among women, who may be more susceptible to alcohol-associated kidney dysfunction.
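As a sketch of the survival model named in the Methods, the snippet below fits a Cox proportional hazards model on synthetic data using the lifelines package; the variable names and data are illustrative assumptions, not the study's analysis code.

```python
# Cox proportional hazards model for incident proteinuria (toy data).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 1000
df = pd.DataFrame({
    "years": rng.exponential(4.0, n),          # follow-up time
    "proteinuria": rng.integers(0, 2, n),      # event indicator (1 = event)
    "alcohol_ge_46g": rng.integers(0, 2, n),   # >= 46 g/day vs less
    "age": rng.normal(45.0, 10.0, n),
})
cph = CoxPHFitter().fit(df, duration_col="years", event_col="proteinuria")
cph.print_summary()  # hazard ratios = exp(coef), with 95% CIs
```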
 
Association of Hcy levels and 5-MeTHF levels among all participants (A) and stratified by MTHFR C677T (B). Panel A was adjusted for sex, age, study site, MTHFR C677T, BMI, systolic blood pressure, diastolic blood pressure, estimated glomerular filtration rate, total cholesterol, triglycerides, high-density lipoprotein cholesterol, vitamin B12, vitamin D3, fasting glucose, folate, smoking and drinking at baseline. Panel B was adjusted for the variables in panel A but removed MTHFR C677T
Background and aims Clarifying the association between 5-methyltetrahydrofolate and homocysteine, and the modifying effect of methylene tetrahydrofolate reductase (MTHFR C677T), may contribute to the management of homocysteine and may serve as a significant reference for a randomized controlled trial of 5-methyltetrahydrofolate intervention. This study aimed to reveal the association between these two biochemical indices. Methods The study population was drawn from the baseline data of the China Stroke Primary Prevention Trial (CSPPT), including 2328 hypertensive participants. 5-methyltetrahydrofolate and homocysteine were determined by stable-isotope dilution liquid chromatography-tandem mass spectrometry and automatic clinical analyzers, respectively. MTHFR C677T polymorphisms were detected using a TaqMan assay. Multiple linear regression was performed to evaluate the association between serum 5-methyltetrahydrofolate and homocysteine. Results There was a significant inverse association between 5-methyltetrahydrofolate and homocysteine when 5-methyltetrahydrofolate was ≤ 10 ng/mL, and this association was modified by MTHFR C677T (per 1-ng/mL increment; All: β = − 0.50, P < 0.001; CC: β = − 0.14, P = 0.087; CT: β = − 0.20, P = 0.011; TT: β = − 1.19, P < 0.001). Moreover, the declining trend in genotype TT participants was steeper than in genotype CC participants (P for difference < 0.001) and genotype CT participants (P for difference < 0.001), while there was no significant difference between genotype CC and genotype CT participants (P for difference = 0.757). Conclusions Our data showed a non-linear association between serum homocysteine and 5-methyltetrahydrofolate among Chinese hypertensive adults; however, the association could be fitted as inverse and linear when serum 5-methyltetrahydrofolate was ≤ 10 ng/mL, and it was modified by MTHFR C677T.
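The threshold analysis described above amounts to refitting the linear model only where 5-methyltetrahydrofolate is at or below 10 ng/mL. A minimal sketch on synthetic data (variable names are placeholders):

```python
# Linear association of homocysteine with 5-MTHF restricted to <= 10 ng/mL.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
mthf = rng.uniform(2.0, 20.0, 2328)                        # ng/mL, synthetic
hcy = np.where(mthf <= 10, 20 - 0.5 * mthf, 15.0) + rng.normal(0, 2, mthf.size)

mask = mthf <= 10.0
fit = sm.OLS(hcy[mask], sm.add_constant(mthf[mask])).fit()
print(fit.params[1])  # beta per 1-ng/mL increment below the threshold
```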
 
Prevalence of Diet Modification Efforts to Consume More in the Past 12 Months among Male and Female Participants who Reported Weight Gain Attempts from Five Countries in the 2018 and 2019 International Food Policy Study. Note: Chi-square tests for sex differences (*p < .05 **p < .01 ***p < .001). Analyses included sample weights
Prevalence of Diet Modification Efforts to Consume More in the Past 12 Months among Male Participants who Reported Weight Gain Attempts, by Country, in the 2018 and 2019 International Food Policy Study. Note: Chi-square tests for country differences (*p < .05 **p < .01 ***p < .001). Analyses included sample weights
Prevalence of Diet Modification Efforts to Consume More in the Past 12 Months among Female Participants who Reported Weight Gain Attempts, by Country, in the 2018 and 2019 International Food Policy Study. Note: Chi-square tests for country differences (*p < .05 **p < .01 ***p < .001). Analyses included sample weights
Prevalence of Weight Gain Attempts by Self-Rated Diet Quality among Male and Female Participants in the 2018 and 2019 International Food Policy Study. Note: Chi-square tests for differences in self-rated diet (***p < .001). Analyses included sample weights
Background Recent research has emphasized a growing trend of weight gain attempts, particularly among adolescents and among boys and young men. Little research has investigated these efforts among adults, or the specific diet modifications made by individuals trying to gain weight. Therefore, the aims of this study were to characterize the diet modification efforts used by adults across five countries who reported engaging in weight gain attempts and to determine the associations between weight gain attempts and concerted diet modification efforts. Methods Cross-sectional data from the 2018 and 2019 International Food Policy Study, including participants from Australia, Canada, Mexico, the United Kingdom, and the United States (N = 42,108), were analyzed. In reference to the past 12 months, participants reported on weight gain attempts and diet modification efforts related to increased consumption of calories, protein, fiber, fruits and vegetables, whole grains, dairy products, all meats, red meat only, fats, sugar/added sugar, salt/sodium, and processed foods. Unadjusted (chi-square tests) and adjusted (modified Poisson regressions) analyses were conducted to examine associations between weight gain attempts and diet modification efforts. Results Weight gain attempts were significantly associated with higher likelihood of each of the 12 forms of diet modification efforts among male participants, and of 10 of the diet modification efforts among female participants. Notably, this included higher likelihood of efforts to consume more calories (males: adjusted prevalence ratio [aPR] 3.25, 95% confidence interval [CI] 2.94–3.59; females: aPR 4.05, 95% CI 3.50–4.70) and fats (males: aPR 2.71, 95% CI 2.42–3.03; females: aPR 3.03, 95% CI 2.58–3.55). Conclusions Overall, the patterns of association between weight gain attempts and diet modification efforts may be indicative of the phenomenon of muscularity-oriented eating behaviors. The findings further highlight the types of foods and nutrients adults from five countries may try to consume in attempts to gain weight.
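The "modified Poisson regression" named in the Methods is a Poisson GLM with robust (sandwich) standard errors, which yields prevalence ratios for binary outcomes. A minimal sketch on toy data, with placeholder variable names:

```python
# Modified Poisson regression: adjusted prevalence ratio for a binary outcome.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 5000
gain_attempt = rng.integers(0, 2, n)                          # exposure
more_calories = rng.binomial(1, np.where(gain_attempt == 1, 0.45, 0.15))

X = sm.add_constant(gain_attempt)
fit = sm.GLM(more_calories, X, family=sm.families.Poisson()).fit(cov_type="HC0")
print(np.exp(fit.params[1]), np.exp(fit.conf_int()[1]))       # PR and 95% CI
```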
 
Flowchart of subjects’ selection
Logistic regression odds ratios and 95% confidence intervals for chronic low back pain across categories of high protein (a) and energy dense diets (b)
Background Chronic low back pain (LBP) is the most common musculoskeletal pain and affects a person’s daily activities. The present study aimed to evaluate the relationship between major dietary patterns and chronic LBP. Methods This cross-sectional analysis examined 7686 Kurdish adults. Chronic LBP was diagnosed by the RaNCD cohort study physician. Dietary patterns were derived using principal component analysis. The three identified dietary patterns were named: 1) the vegetarian diet, including vegetables, whole grain, legumes, nuts, olive, vegetable oil, fruits, and fruit juice; 2) the high protein diet, related to higher adherence to red and white meat, legumes, nuts, and egg; and 3) the energy-dense diet, characterized by higher intake of salt, sweets, dessert, hydrogenated fat, soft drink, refined grain, tea, and coffee. Dietary pattern scores were divided into tertiles. Binary logistic regression with crude and adjusted odds ratios (OR) and 95% confidence intervals (CI) was used to determine this association. Results Twenty-two per cent of participants had chronic LBP. Higher adherence to the high protein dietary pattern was inversely associated with chronic LBP in the crude (OR: 0.79, 95% CI: 0.69–0.90) and adjusted models (for age, sex, smoking, drinking, diabetes, physical activity, body mass index, and waist circumference) (OR: 0.84, 95% CI: 0.72–0.97). In addition, after controlling for the mentioned potential confounders, the highest category of the energy-dense diet was positively associated with chronic LBP compared with the lowest category (OR: 1.13, 95% CI: 1.01–1.32). Conclusions Higher adherence to the high protein diet was inversely related to chronic LBP prevalence. In addition, we found that following an energy-dense diet was positively associated with chronic LBP.
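Deriving dietary patterns by principal component analysis, as in the Methods, can be sketched as follows; the food groups and data are synthetic placeholders, and real analyses typically also apply rotation and scree-plot criteria.

```python
# PCA-derived dietary patterns from standardized food-group intakes (toy data).
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
foods = pd.DataFrame(
    rng.gamma(2.0, 1.0, (500, 6)),
    columns=["vegetables", "meat", "legumes", "sweets", "grains", "tea"],
)
scores = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(foods))
tertiles = pd.qcut(scores[:, 0], 3, labels=["T1", "T2", "T3"])  # pattern tertiles
print(pd.Series(tertiles).value_counts())
```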
 
Flowchart for retrieving articles and selecting eligible studies. *CNKI, China National Knowledge Infrastructure; CI, confidence interval
Forest plots for the association between total/subclass flavonoid consumption and hormone-related cancer risk. A breast cancer; B ovarian cancer; C endometrial cancer; D thyroid cancer; E prostate cancer. (1) and (2) distinguish two different studies by the same author in the same year; two datasets of the same study are represented by (a) and (b). *ES, effect size; CI, confidence interval
Forest plots for the association between individual flavonoid compounds consumption and hormone-related cancer risk. A breast cancer; B ovarian cancer. *ES, effect size; CI, confidence interval
Funnel plots for the association between flavonoid consumption and hormone-related cancer risk. A breast cancer; B ovarian cancer; C endometrial cancer; D prostate cancer
Plots of sensitivity analyses by sequential removal of each study. A breast cancer; B ovarian cancer; C women-specific cancers; D prostate cancer; E men-specific cancers. *BC, breast cancer; EC, endometrial cancer; OC, ovarian cancer; PC, prostate cancer; TC, testicular cancer
Background Flavonoids appear to have hormone-like and anti-hormone properties, so the consumption of flavonoids may have potential effects on hormone-related cancers (HRCs), but the findings have been inconsistent so far. This meta-analysis aimed to explore the association between flavonoid intake and HRC risk among observational studies. Methods Qualified articles, published in PubMed, EMBASE, and China National Knowledge Infrastructure (CNKI) from January 1999 to March 2022 and focused on relationships between flavonoids (total, subclasses of, and individual flavonoids) and HRCs (breast, ovarian, endometrial, thyroid, prostate and testicular cancer), were retrieved for pooled analysis. Random effects models were used to calculate the pooled odds ratios (ORs) and corresponding 95% confidence intervals (CIs). Funnel plots and Begg’s/Egger’s tests were used to evaluate publication bias. Subgroup analyses and sensitivity analyses were conducted to explore the origins of heterogeneity. Results All included studies were rated as medium or high quality. Higher consumption of flavonols (OR = 0.85, 95% CI: 0.76–0.94), flavones (OR = 0.85, 95% CI: 0.77–0.95) and isoflavones (OR = 0.87, 95% CI: 0.82–0.92) was associated with a decreased risk of women-specific cancers (breast, ovarian and endometrial cancer), while higher intake of total flavonoids was linked to a significantly elevated risk of prostate cancer (OR = 1.11, 95% CI: 1.02–1.21). Limited evidence suggested that thyroid cancer risk was augmented with higher intake of flavones (OR = 1.24, 95% CI: 1.03–1.50) and flavanones (OR = 1.31, 95% CI: 1.09–1.57). Conclusions The present study provides evidence that intakes of total flavonoids, flavonols, flavones, flavanones, flavan-3-ols and isoflavones are associated with lower or higher risk of HRCs depending on the cancer site, which may help inform dietary guidelines. Trial registration This protocol has been registered on PROSPERO with registration number CRD42020200720.
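The random-effects pooling behind the ORs above is typically DerSimonian-Laird inverse-variance weighting. A minimal sketch with toy inputs (not the study's data):

```python
# DerSimonian-Laird random-effects pooling of log odds ratios (toy inputs).
import numpy as np

or_i = np.array([0.80, 0.92, 0.75, 1.05])   # study-level ORs (illustrative)
se_i = np.array([0.10, 0.08, 0.15, 0.12])   # standard errors of log(OR)

y, w = np.log(or_i), 1.0 / se_i**2
ybar = np.sum(w * y) / w.sum()                       # fixed-effect pooled log-OR
q = np.sum(w * (y - ybar) ** 2)                      # Cochran's Q
tau2 = max(0.0, (q - (len(y) - 1)) / (w.sum() - np.sum(w**2) / w.sum()))
w_re = 1.0 / (se_i**2 + tau2)                        # random-effects weights
pooled = np.sum(w_re * y) / w_re.sum()
se_p = np.sqrt(1.0 / w_re.sum())
print(np.exp([pooled, pooled - 1.96 * se_p, pooled + 1.96 * se_p]))  # OR, 95% CI
```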
 
Flow chart of Framingham Heart Study (FHS) Offspring and Generation 3 (Gen3) cohort participants
Background Previous studies reported that dairy foods are associated with higher areal bone mineral density (BMD) in older adults. However, data on bone texture are lacking. We determined the association of dairy food intake (milk, yogurt, cheese, milk + yogurt and milk + yogurt + cheese) with spinal trabecular bone score (TBS). Methods In this cross-sectional study, a validated semi-quantitative food frequency questionnaire was used to assess dairy food intake (servings/wk). TBS, an analysis of bone texture, was calculated from dual energy X-ray absorptiometry (DXA) scans. Sex-specific multivariable linear regression was used to estimate the association of dairy food intake (energy adjusted via residual methods) with each bone measure adjusting for covariates. Results Mean age of 4,740 participants was 49 (SD: 13) years and mean milk + yogurt + cheese intake was 10.1 (SD: 8.4) servings/week in men and 10.9 (SD: 8.0) servings/week in women. There were no associations between dairy food intake and spinal TBS in adjusted models. Conclusions In this cohort of primarily healthy adults, dairy intake was not associated with bone texture.
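The residual method for energy adjustment mentioned in the Methods regresses nutrient (or food) intake on total energy and keeps the residual plus the sample mean. A minimal sketch on synthetic data:

```python
# Residual-method energy adjustment of dairy intake (toy data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
energy = rng.normal(2000.0, 400.0, 1000)                 # kcal/day, synthetic
dairy = 0.004 * energy + rng.normal(0.0, 2.0, 1000)      # servings/week, toy

fit = sm.OLS(dairy, sm.add_constant(energy)).fit()
dairy_energy_adjusted = fit.resid + dairy.mean()         # adjusted intake
print(dairy_energy_adjusted[:5])
```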
 
Flow chart of participant selection process
Dose–response relationship between dietary fiber intake and severe AAC (p for linear < 0.001, p for nonlinear = 0.695), using the cutoff value of lowest quartile (Q1) of dietary fiber intake (10.95 g) as the reference. AAC24 score > 6 was defined as severe AAC. The restricted cubic spline model was adjusted by age, gender, ethnicity, BMI, education, DM, hypertension, smoking, TC/HDL-C, albumin, creatinine, total calcium, phosphorus, WBC, total 25-hydroxyvitamin D, and caloric intake
Background Abdominal aortic calcification (AAC) is recognized as a valuable predictor of cardiovascular diseases (CVDs). Dietary fiber is strongly correlated with CVDs. However, the effect of dietary fiber on AAC in the population is not well understood. Objective To assess the relationship between dietary fiber intake and AAC in the US adult population. Methods A total of 2671 individuals with both dietary fiber intake and AAC score data were enrolled from the 2013–2014 National Health and Nutrition Examination Survey (NHANES), a cross-sectional health examination in the US. Multinomial logistic regression was used to calculate the odds ratio (OR), with 95% confidence interval (CI). To reveal the relationship between dietary fiber intake and AAC, restricted cubic spline was also applied. Results Out of the total participants, 241 (9%) had severe AAC and 550 (20%) had mild-moderate AAC. Multinomial logistic regression indicated that higher intake of dietary fiber was associated with lower risk of severe AAC, but not with lower risk of mild-moderate AAC. For every one standard deviation increase (9.4 g/day) in dietary fiber intake, the odds of severe AAC were reduced by 28% [OR 0.72 (95% CI, 0.57–0.90), p = 0.004], after adjusting for confounding factors. Dose–response relationship revealed that dietary fiber intake was negatively correlated with severe AAC ( p for linear < 0.001, p for nonlinear = 0.695). Conclusions Dietary fiber intake was negatively associated with severe AAC, and showed a dose–response relationship in US adults.
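A restricted cubic spline dose-response, as used above, can be sketched with patsy's natural cubic spline basis inside a logistic model; the data below are synthetic, and the binary-logit simplification (severe AAC vs not) is an assumption of this sketch, since the study fit a multinomial model.

```python
# Logistic dose-response of severe AAC on dietary fiber via a spline basis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
df = pd.DataFrame({"fiber": rng.gamma(3.0, 6.0, 2671)})   # g/day, synthetic
p = 1.0 / (1.0 + np.exp(1.5 + 0.05 * df["fiber"]))        # decreasing toy risk
df["severe_aac"] = rng.binomial(1, p.to_numpy())

fit = smf.logit("severe_aac ~ cr(fiber, df=4)", data=df).fit()
print(fit.params)
```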
 
Flow diagram of study selection in this systematic review and meta-analysis
Forest plot of the differences in IBS-SSS scores between participants in the intervention group and placebo group before and after intervention. a The differences in IBS-SSS scores between participants in the intervention group and placebo group before intervention; b the differences in IBS-SSS scores between participants in the intervention group and placebo group after intervention
Forest plot of the improvement of IBS-SSS after intervention in participants receiving vitamin D supplementation and receiving placebo. a The improvement of IBS-SSS in participants receiving vitamin D supplementation; b the improvement of IBS-SSS in participants receiving placebo
Forest plot of the improvement of IBS-QoL after intervention in participants receiving vitamin D supplementation and receiving placebo. a The improvement of IBS-QoL in participants receiving vitamin D supplementation; b the improvement of IBS-QoL in participants receiving placebo
Background Irritable bowel syndrome (IBS) is a chronic gastrointestinal disorder involving gut-brain interactions with limited effective treatment options. Vitamin D deficiency is commonly observed in patients with IBS, but whether vitamin D supplementation ameliorates IBS is controversial in randomized controlled trials. The present systematic review and meta-analysis explored the efficacy of vitamin D supplementation in patients with IBS. Methods We performed a systematic search of potentially relevant publications from PubMed, EMBASE, the Cochrane Central Register of Controlled Trials and the Web of Science up to January 2022. We assessed the weighted mean difference (WMD) and 95% confidence interval (95% CI) of the IBS severity scoring system (IBS-SSS), IBS quality of life (IBS-QoL) and IBS total score (IBS-TS) before and after vitamin D supplementation intervention. Results We included four randomized, placebo-controlled trials involving 335 participants. The differences in IBS-SSS score between participants in the intervention group and the placebo group increased after intervention (WMD: -55.55, 95% CI: -70.22 to -40.87, I² = 53.7%, after intervention; WMD: -3.17, 95% CI: -18.15 to 11.81, I² = 0.0%, before intervention). Participants receiving vitamin D supplementation showed greater improvement in IBS-SSS after intervention than participants receiving placebo (WMD: -84.21, 95% CI: -111.38 to -57.05, I² = 73.2%; WMD: -28.29, 95% CI: -49.95 to -6.62, I² = 46.6%, respectively). Vitamin D supplementation was also superior to placebo in IBS-QoL improvement (WMD: 14.98, 95% CI: 12.06 to 17.90, I² = 0.0%; WMD: 6.55, 95% CI: -2.23 to 15.33, I² = 82.7%, respectively). Sensitivity analyses revealed an unstable pooled effect on IBS-TS in participants receiving vitamin D supplementation; therefore, we did not evaluate the efficacy of vitamin D intervention on IBS-TS. Conclusions This systematic review and meta-analysis suggested that vitamin D supplementation was superior to placebo for IBS treatment.
 
Background High sodium and low potassium intakes are associated with the early development of chronic diseases (e.g., hypertension, obesity). Given the limited data on sodium and potassium intakes assessed by 24-h urinary excretion in pre-adolescents and adolescents, we wished to determine baseline salt intake in Iranian subjects aged 11–18 years. Methods This was an observational study involving 374 pre-adolescents and adolescents (154 boys and 220 girls). Sodium and potassium intakes were ascertained by measuring sodium and potassium excretion in urine over 24 h. Creatinine level was used to validate the completeness of the urine collections. The association between sodium and potassium intake and adiposity was determined based on body fat percentage. Results The mean 24-h urinary sodium excretion was 3130 ± 2200 mg/day, equivalent to a salt intake of 7.961 ± 5.596 g/day. Approximately half of the study participants exceeded the upper limit of Na intake. The mean potassium intake was estimated at 1480 ± 1050 mg/day. There was a positive association between urinary sodium excretion and adiposity in the crude (OR 1.79; 95% CI: 1.08–2.74) and fully adjusted models (OR: 3.15; 95% CI: 2.28–4.63). Also, in subsample analyses, there was a positive correlation between urinary sodium and adiposity in both pre-adolescents (OR: 2.71; 95% CI: 2.29–3.93) and adolescents (OR: 3.55; 95% CI: 2.17–4.74). However, no significant association was found between 24-h urinary potassium and adiposity. Conclusion Sodium intake, as estimated by 24-h urinary excretion, was higher than recommended and was positively associated with adiposity. This study also found low compliance with potassium intake recommendations among Iranian pre-adolescents and adolescents aged 11–18 years. Health promotion interventions are needed to broaden public awareness of high sodium intake and potassium inadequacy to reduce chronic diseases.
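The salt equivalent reported above follows from the molar-mass ratio of NaCl to sodium (58.44/22.99 ≈ 2.542). A worked check against the reported mean:

```python
# Convert 24-h urinary sodium (mg/day) to salt equivalent (g/day).
sodium_mg = 3130
salt_g = sodium_mg * 2.542 / 1000
print(round(salt_g, 3))  # ~7.956 g/day, in line with the reported 7.961
```

The small difference from 7.961 plausibly reflects rounding of the underlying mean rather than a different conversion factor.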
 
Daily average consumption of caffeinated substances by gender and age
Background Although representative data on caffeine intake in Americans are available, these data do not include US service members (SMs). The few previous investigations in military personnel largely involve convenience samples. This cross-sectional study examined the prevalence of caffeine consumers, daily caffeine consumption, and factors associated with caffeine use among United States active duty military SMs. Methods A stratified random sample of SMs were asked to complete an on-line questionnaire on their personal characteristics and consumption of caffeinated products (exclusive of dietary supplements). Eighteen percent (n = 26,680) of successfully contacted SMs (n = 146,365) completed the questionnaire. Results Overall, 87% reported consuming caffeinated products ≥ 1 time/week. Mean ± standard error per-capita consumption (all participants) was 218 ± 2 and 167 ± 3 mg/day for men and women, respectively. Caffeine consumers ingested 243 ± 2 mg/day (251 ± 2 mg/day men, 195 ± 3 mg/day women). On a body-weight basis, men and women consumed similar amounts of caffeine (2.93 vs 2.85 mg/day/kg; p = 0.12). Among individual caffeinated products, coffee had the highest use (68%), followed by sodas (42%), teas (29%), energy drinks (29%) and gums/candy/medications (4%). In multivariable logistic regression, characteristics independently associated with caffeine use (≥ 1 time/week) included female gender, older age, white race/ethnicity, higher body mass index, current or former tobacco use, greater alcohol intake, and higher enlisted or officer rank. Conclusion Compared to National Health and Nutrition Examination Survey data, daily caffeine consumption (mg/day) by SMs was higher, perhaps reflecting the higher mental and physical occupational demands on SMs.
 
Flowchart of the study participants
Cut-off point of Ox-to-Ca ratio for CVD events (0.14, sensitivity = 67.3%, Youden index = 0.10)
Background and aim The potential cardiovascular impact of usual intakes of oxalate (Ox) is uninvestigated. We evaluated the effect of dietary Ox, and its interaction with dietary calcium (Ca), on incident cardiovascular disease (CVD). Methods We included 2966 adult men and women aged 19–84 y without known CVD at baseline enrollment (2006–2008) of the Tehran Lipid and Glucose Study. Dietary intakes were assessed using a validated FFQ, and incident CVD (i.e., coronary heart disease, stroke, and CVD mortality) was documented through March 2018. Results Incident CVD occurred in 7.1% of participants during a median follow-up of 10.6 y. After multivariable adjustment for traditional risk factors and key dietary nutrients, including total fat and fiber, Ox intakes ≥ 220 mg/d increased incident CVD (HR T3 vs. T1 = 1.47, 95% CI = 1.02–2.12). This association was potentiated (HR T3 vs. T1 = 2.42, 95% CI = 1.19–4.89) in subjects who had a lower intake of Ca (< 981 mg/d); in a low-Ca diet, an even lower amount of dietary Ox (second tertile, 148–220 mg/d) was related to a 92% increase in CVD events (HR = 1.92, 95% CI = 1.00–3.70). No association was observed between dietary Ox and CVD events at medium and high levels of Ca intake. The critical cut-off point of the Ox-to-Ca ratio for predicting CVD events was 0.14, above which the risk of CVD increased by 37% (HR = 1.37, 95% CI = 1.02–1.84). Conclusion Higher dietary Ox intake appeared to be associated with a modestly elevated risk of incident CVD, especially in diets with a lower amount of Ca.
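Since the Youden index is J = sensitivity + specificity − 1, the specificity implied by the reported cut-off can be recovered directly:

```python
# Specificity implied by the reported Youden index and sensitivity.
sensitivity, youden_j = 0.673, 0.10
specificity = 1 + youden_j - sensitivity
print(round(specificity, 3))  # ~0.427
```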
 
A flow chart of identification and selection of the studies in the current meta-analysis
Possible association between maternal FA supplementation and CHD. FA: folic acid; CHD: congenital heart disease; A. Odds ratio (OR) estimates for the overall association of maternal FA supplementation with the risk of CHD; B. OR estimates for the association between maternal FA supplement intake and CHD; C. OR estimates for the association between the initiation of FA supplementation and CHD; D. Estimated OR of the association between FA supplementation time and CHD
Possible association between maternal FA supplementation and ASD. FA: folic acid; ASD: atrial septal defects; A. Odds ratio (OR) estimates for the overall association of maternal FA supplementation with the risk of ASD; B. OR estimates for the association between maternal folic acid supplement intake and ASD; C. OR estimates for the association between the initiation of folic acid supplementation and ASD; D. Estimated OR of the association between folic acid supplementation time and ASD
Begg’s test for possible association between FA supplementation and CHD/ASD. FA: folic acid; CHD: congenital heart disease; ASD: atrial septal defects; A. Begg's test of studies examining the association between maternal folate supplementation and the risk of CHD; B. Begg's test of studies examining the association between maternal folate supplementation and the risk of ASD
Background Folic acid (FA), a synthetic form of folate, has been widely used for dietary supplementation in pregnant women. The preventive effect of FA supplementation on the occurrence and recurrence of fetal neural tube defects (NTD) has been confirmed. The incidence of congenital heart disease (CHD), however, has been increasing worldwide in parallel. The present study aimed to evaluate whether FA supplementation is associated with a decreased risk of CHD. Methods We searched the literature using PubMed, Web of Science and Google Scholar for peer-reviewed studies reporting on CHD and FA, and followed with a meta-analysis. Study-specific relative risks were used as summary statistics for the association between maternal FA supplementation and CHD risk. Cochran's Q and I² statistics were used to test for heterogeneity. Results Maternal FA supplementation was found to be associated with a decreased risk of CHD (OR = 0.82, 95% CI: 0.72–0.94). However, the heterogeneity of the association was high (P < 0.001, I² = 92.7%). FA supplementation within 1 month before and after pregnancy correlated positively with CHD (OR 1.10, 95% CI 0.99–1.23), and high-dose FA intake was positively associated with atrial septal defect (OR 1.23, 95% CI 0.64–2.34). Pregnant women with irrational FA use may be at increased risk for CHD. Conclusions Data from the present study indicate that the heterogeneity of the association between maternal FA supplementation and CHD is high, and suggest that the real relationship between maternal FA supplementation and CHD needs to be further investigated with well-designed clinical studies and biological experiments.
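The I² statistic quoted above expresses the share of variability in effect estimates beyond chance: I² = max(0, (Q − df)/Q) × 100, with df = k − 1 for k studies. A worked illustration (Q and k below are invented to reproduce the reported 92.7%):

```python
# I-squared from Cochran's Q (illustrative inputs, not the study's values).
def i_squared(q: float, k: int) -> float:
    return max(0.0, (q - (k - 1)) / q) * 100.0

print(round(i_squared(q=137.0, k=11), 1))  # 92.7
```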
 
Background Food insecurity (FI) is a dynamic phenomenon. Experiences of daily FI may impact dietary outcomes differently within a given month, across seasons, and before or during the COVID-19 pandemic. Objectives The aims of this study were to investigate the association of short-term FI with dietary quality and energy intake 1) over six weeks in two seasonal months and 2) before and during the COVID-19 pandemic. Methods Using an ecological momentary assessment framework on smartphones, this study tracked daily FI via the 6-item U.S. Adult Food Security Survey Module and dietary intake via food diaries in 29 low-income adults. A total of 324 person-days of data were collected during two three-week waves in fall and winter months. Generalized Estimating Equation models were applied to estimate the daily FI-diet relationship, accounting for intrapersonal variation and covariates. Results A one-unit increase in daily FI score was associated with a 7.10-point (95% CI: -11.04, -3.15) and a 3.80-point (95% CI: -6.08, -1.53) decrease in the Healthy Eating Index-2015 (HEI-2015) score in winter and during COVID-19, respectively. In winter months, a greater daily FI score was associated with lower consumption of total fruit (-0.17 cups, 95% CI: -0.32, -0.02), whole fruit (-0.18 cups, 95% CI: -0.30, -0.05), and whole grains (-0.57 oz, 95% CI: -0.99, -0.16) and higher consumption of refined grains (1.05 oz, 95% CI: 0.52, 1.59). During COVID-19, elevated daily FI scores were associated with lower intake of whole grains (-0.49 oz, 95% CI: -0.88, -0.09) and higher intake of salt (0.34 g, 95% CI: 0.15, 0.54). No association was observed in fall or in the pre-COVID-19 months. No association was found between daily FI and energy intake in either season, or in the pre- or during-COVID-19 months. Conclusion Daily FI is associated with compromised dietary quality in low-income adults in winter months and during the COVID-19 period. Future research should delve into the underlying factors of these observed relationships.
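Generalized Estimating Equations handle repeated daily observations nested within participants. A minimal sketch with an exchangeable working correlation on synthetic data (variable names are placeholders):

```python
# GEE of daily HEI-2015 score on daily food-insecurity score (toy data).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(8)
n_id, n_day = 29, 12
df = pd.DataFrame({
    "pid": np.repeat(np.arange(n_id), n_day),       # participant id
    "fi_score": rng.integers(0, 7, n_id * n_day),   # daily FI score (0-6)
})
df["hei"] = 60.0 - 4.0 * df["fi_score"] + rng.normal(0, 8, len(df))

fit = sm.GEE.from_formula("hei ~ fi_score", groups="pid", data=df,
                          cov_struct=sm.cov_struct.Exchangeable()).fit()
print(fit.params["fi_score"])  # change in HEI per one-unit daily FI increase
```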
 
Schematic representation of E. coli attachment by FimH tips of the type 1 pili adhesins to mannosylated uroplakins on the surface of uroepithelium
D-mannose, from supplementation to urine. Roughly one third of supplemented D-mannose ends up in urine, where it has the potential to block pathogenic Escherichia coli from adhering to uroepithelial cells. Some of the D-mannose can be detected in the feces, and some is utilized within the target tissues
Urinary tract infections (UTIs) are one of the most prevalent bacterial diseases worldwide. Despite the efficacy of antibiotics targeted against UTIs, recurrence rates remain high among patients. Furthermore, the development of antibiotic resistance is a major concern and creates a demand for alternative treatment options. D-mannose, a monosaccharide naturally found in fruits, is commonly marketed as a dietary supplement for reducing the risk of UTIs. Research suggests that supplemented D-mannose could be a promising alternative or complementary remedy, especially as prophylaxis for recurrent UTIs. When excreted in urine, D-mannose potentially inhibits Escherichia coli, the main causative organism of UTIs, from attaching to the urothelium and causing infection. In this review, we provide an overview of UTIs, E. coli pathogenesis and D-mannose, and outline the existing clinical evidence for D-mannose in reducing the risk of UTI and its recurrence. Furthermore, we discuss the potential effect mechanisms of D-mannose against uropathogenic E. coli.
 
Background The COVID-19 pandemic has impacted the Australian food supply through changed consumer purchasing patterns, and potentially, household food security. The aim of this study was to determine the impact of COVID-19 on the prevalence of food insecurity and food supply issues, and perspectives of food supply stakeholders in regional Australia. Methods A mixed-methods consumer survey and in-depth interviews with food supply stakeholders were conducted in regional Australia, more specifically South West Western Australia between May and July 2020, immediately after the 1st wave of the pandemic. Results The prevalence of food insecurity was 21% among consumers, and significantly more prevalent for those aged less than 30 years and living with a disability. Most consumers (73%) agreed that the COVID-19 pandemic had impacted the food supply. Food insecure respondents were more likely to report that food was more expensive, resulting in changes to the types and quantities of food bought. Food supply stakeholders perceived that consumers increased their intention to buy locally grown produce. Panic buying temporarily reduced the availability of food for both food suppliers and consumers, regardless of their food security status. Conclusions This study provided novel insights from South West Australian consumer and food supply stakeholder perceptions. Food insecure consumers provided insights about the high cost of food and the subsequent adaptation of their shopping habits, namely type and amount of food purchased. Stakeholder perceptions largely focused on supply chain issues and corroborated consumer reports.
 
Trends in global burden of ID from 1990 to 2017, in terms of age-standardized DALY rates in both sexes
Sex-specific age-standardized DALY rates among countries with different HDI in 2017. Lines inside the boxes indicate the medians; box edges, the 25th and 75th percentiles; whiskers, ±1.5 times the interquartile range; points, outliers. *** p < 0.001
Clustered HDI trends for 152 countries from 1990 to 2017 (a). We grouped 23, 77, and 52 countries into the low-, moderate- and high-HDI clusters, respectively. The median HDI of each cluster from 1990 to 2017 is plotted. Sex-specific age-standardized DALY rates from 1990 to 2017 by HDI cluster (b). The same 152 countries were included, with 23, 77, and 52 countries in the low-, moderate- and high-HDI clusters, respectively. The median age-standardized DALY rate of each cluster from 1990 to 2017 is plotted
Trends in the Gini coefficients (a) and the Concentration indexes (b) of ID burden across countries from 1990 to 2017 in both sexes
Background Iron deficiency (ID) impairs patients' physical activity, cognition and quality of life; it is difficult to perceive but should not be underestimated. Worldwide efforts have been made to lower the ID burden; however, whether it has decreased equally across regions and sexes is unclear. This study examined regional and sex inequalities in global ID from 1990 to 2017. Methods We conducted a longitudinal, comparative burden-of-disease study. Disability-adjusted life-years (DALYs) of ID were obtained from the Global Burden of Disease Report 2017. Human Development Index (HDI) data were obtained from the Human Development Report 2017. The Gini coefficient and the concentration index were calculated to assess the equity in the global burden of ID. Results A downward trend in the global ID burden (age-adjusted DALYs per 100,000 population, from 569.3 (95% Uncertainty Interval [UI]: 387.8–815.6) to 403.0 (95% UI: 272.4–586.6), p < 0.001) but an uptrend in its inequalities (Gini coefficient, from 0.366 to 0.431, p < 0.001) was observed between 1990 and 2017. The ID burden was heavier in women than in men (age-adjusted DALYs per 100,000 population, from 742.2 to 514.3 vs from 398.5 to 291.9), but its inequalities have been higher in men since 1990. The between-sex gap in ID burden narrowed with higher HDI (β = − 364.11, p < 0.001). The East Asia & Pacific and South Asia regions made major strides in ID control in both sexes over the decades (age-adjusted DALYs per 100,000 population, from 378.7 (95% UI: 255.8–551.7) in 1990 to 138.9 (95% UI: 91.8–206.5) in 2017), while a heavy burden among Sub-Saharan African men persisted (age-adjusted DALYs per 100,000 population, 572.5 (95% UI: 385.3–815) in 1990 and 562.6 (95% UI: 367.9–833.3) in 2017). Conclusions Redirecting attention and resources to countries with low HDI, particularly to women with low socioeconomic status (SES) and men under high ID burden, may help hold back the expanding ID inequality.
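The Gini coefficient used here quantifies how unevenly the ID burden is spread across countries (0 = perfectly even; values near 1 = highly concentrated). A minimal sketch of the computation on toy country-level rates:

```python
# Gini coefficient over country-level age-standardized DALY rates (toy data).
import numpy as np

def gini(x):
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    return 2 * np.sum(np.arange(1, n + 1) * x) / (n * x.sum()) - (n + 1) / n

daly_rates = np.array([120.0, 300.0, 450.0, 800.0, 1500.0])  # illustrative
print(round(gini(daly_rates), 3))
```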
 
Dietary intakes of EDII (Empirical Dietary Inflammatory Index) components of the study participants
Background Pro-inflammatory diet and lifestyle factors lead to diseases related to chronic systemic inflammation. We examined novel dietary/lifestyle indicators of inflammation, namely the dietary inflammation score (DIS), the lifestyle inflammation score (LIS) and the empirical dietary inflammatory index (EDII), in relation to the risk of breast cancer (BrCa) in Iranian women. Methods In this hospital-based case–control study, 253 patients with BrCa and 267 non-BrCa controls were enrolled. Food consumption was recorded to calculate the DIS, LIS and EDII using a semi-quantitative Food Frequency Questionnaire (FFQ). We estimated odds ratios (ORs) and 95% confidence intervals for the association of inflammatory potential with BrCa risk using binary logistic regression models modified for the case–control design. Results Mean ± SD age and BMI of the study participants were 47.92 ± 10.33 years and 29.43 ± 5.51 kg/m², respectively. After adjustment for confounders, individuals in the highest compared to the lowest quartile of DIS and EDII had a significantly higher risk of BrCa (DIS: 2.13 (1.15–3.92), p-trend: 0.012; EDII: 2.17 (1.12–4.22), p-trend: 0.024). However, no significant association was observed for LIS (p-trend: 0.374). Conclusion The findings of this study suggest that higher DIS and EDII increase the risk of BrCa; concerning LIS, further investigation is needed.
 
Flow chart of diet record (per batch) and FFQ inclusion. EFR: Estimated Food Record; FFQ: Food Frequency Questionnaire
100 most frequently consumed food items, and food co-occurrence network of food pairings consumed within a food record; A 100 most frequently appearing words within food item descriptions, with font size and colour depicting the importance of the food item; B 100 most frequently appearing food item descriptions; C co-occurrence network of the most common food pairings within individual participants' food records
Bland Altman plot comparing residual energy adjusted macronutrient and fibre intake of EFR vs FFQs; A EA protein (g), B EA carbohydrate (g), C EA fat (g), D EA NSP fibre (g)
Procrustes analysis comparing the total intake (energy adjusted nutrient composition of 44 variables) between different dietary assessment methods for an individual, and between the same dietary assessment methods (taken at two different timepoints) for an individual. Multi-dimensional dietary variables are transformed into one datapoint. Each datapoint is the composition of a participant’s diet, the line connecting the diets of the same participant. The closer the distance between each point the more similar the dietary composition. A FFQ vs FFQ (p = 0.001) 2356 participants, B Food Record vs Food Record (p = 0.001) 114 participants, C Food record vs FFQ (p = 0.001) 1224 participants
Moderate to high heritability of nutrients (n = 720; > 20% in the AE model) estimated using linear structural equation modelling considering A (additive genetic effects) and C (common environmental effects); A heritability of nutrients via EFR; B heritability of nutrients via FFQ
Background Estimated food records (EFR) are a common dietary assessment method. This investigation aimed to: (1) define the reporting quality of the EFR, (2) characterise acute dietary intake and eating behaviours, and (3) describe diet heritability. Methods A total of 1974 one-day EFR were collected from 1858 participants in the TwinsUK cohort between 2012 and 2017. EFR were assessed using a six-point scoring system to determine reporting quality. The frequency and co-occurrence of food items were examined using word clouds and co-occurrence networks. The impact of eating behaviours on weight, BMI and nutrient intake was explored using mixed-effect linear regression models. Finally, diet heritability was estimated using ACE modelling. Results We observed that 75% of EFR were of acceptable reporting quality (score > 5). Black tea and semi-skimmed milk were the most consumed items, both individually (8.27 and 6.25%, respectively) and paired (0.21%) as co-occurring items. Breakfast consumption was associated with significantly (p = 5.99 × 10⁻⁷) higher energy intake (kcal) (mean 1874.67 (± SD 532.42)) than skipping breakfast (1700.45 (± SD 620.98)); however, only the length of the eating window was significantly associated with body weight (kg) (effect size 0.21 (± SD 0.10), p = 0.05) and BMI (effect size 0.08 (± SD 0.04), p = 0.04) after adjustment for relevant covariates. Lastly, we found that both the length of the eating window (h² = 33%, CI 0.24; 0.41) and breakfast consumption (h² = 11%, CI 0.02; 0.21) were weakly heritable. Conclusions EFR describing acute dietary intake allow for eating behaviour characterisation and can supplement habitual diet intake assessments. The novel findings of heritability warrant further investigation.
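As a back-of-envelope complement to the ACE modelling, twin-based heritability is often approximated by Falconer's formula, h² = 2(rMZ − rDZ); the correlations below are invented to reproduce the order of the reported estimate for eating-window length.

```python
# Falconer's approximation of heritability from twin correlations (illustrative).
r_mz, r_dz = 0.45, 0.285   # invented MZ and DZ twin correlations
h2 = 2 * (r_mz - r_dz)
print(h2)                   # 0.33, i.e. ~33%
```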
 
Background Human milk oligosaccharides (HMOs) have important and diverse biological functions in early life. This study tested the safety and efficacy of a starter infant formula containing Limosilactobacillus (L.) reuteri DSM 17938 and supplemented with 2’-fucosyllactose (2’FL). Methods Healthy infants < 14 days old (n = 289) were randomly assigned to a bovine milk-based formula containing L. reuteri DSM 17938 at 1 × 10⁷ CFU/g (control group; CG) or the same formula with added 1.0 g/L 2’FL (experimental group; EG) until 6 months of age. A non-randomized breastfed group served as reference (BF; n = 60). The primary endpoint was weight gain through 4 months of age in the formula-fed infants. Secondary endpoints included additional anthropometric measures, gastrointestinal tolerance, stooling characteristics, adverse events (AEs), fecal microbiota and metabolism, and gut immunity and health biomarkers in all feeding groups. Results Weight gain in EG was non-inferior to CG, as shown by a mean difference [95% CI] of 0.26 [-1.26, 1.79] g/day, with the lower bound of the 95% CI above the non-inferiority margin (-3 g/day). Anthropometric Z-scores, parent-reported stooling characteristics, gastrointestinal symptoms and associated behaviors, and AEs were comparable between formula groups. Redundancy analysis indicated that the microbiota composition in EG differed from CG at ages 2 (p = 0.050) and 3 months (p = 0.052), approaching BF. Similarly, the between-sample phylogenetic distance (weighted UniFrac) for BF vs EG was smaller than for BF vs CG at 3 months of age (p = 0.045). At age 1 month, Clostridioides difficile counts were significantly lower in EG than in CG. Bifidobacterium relative abundance in EG tracked towards that in BF. Fecal biomarkers and metabolic profiles were comparable between CG and EG. Conclusion The L. reuteri-containing infant formula with 2’FL supports age-appropriate growth, is well tolerated and may play a role in shifting the gut microbial pattern towards that of breastfed infants. Trial Registration The trial was registered on ClinicalTrials.gov (NCT03090360) on 24/03/2017.
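The non-inferiority conclusion rests on a simple comparison of the confidence interval's lower bound with the prespecified margin, which can be made explicit:

```python
# Non-inferiority check: CI lower bound vs margin (values as reported above).
diff, ci_lo, ci_hi = 0.26, -1.26, 1.79   # mean difference in g/day, 95% CI
margin = -3.0                            # non-inferiority margin, g/day
print(ci_lo > margin)                    # True -> EG non-inferior to CG
```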
 
Sodium sources of restaurant dishes in China
Percentage of restaurant dishes containing salted condiments/−seasonings by dish category in China
Background Sodium intake in China is extremely high and eating in restaurants is increasingly popular, yet little research has explored the sodium levels of restaurant dishes. The present study aims to assess the content and sources of sodium in Chinese restaurant dishes. Methods Cross-sectional data were obtained from the baseline survey of the Restaurant-based Intervention Study (RIS) in 2019. A total of 8131 best-selling restaurant dishes with detailed recipes, from 192 restaurants in China, were included. Sodium content per 100 g and per serving was calculated according to the Chinese Food Composition Table. The proportion of restaurant dishes exceeding the daily sodium reference intake level in a single serving and the major sources of sodium were determined. Results The median sodium content of restaurant dishes was 487.3 mg per 100 g, 3.4 mg per kcal, and 2543.7 mg per serving. For a single serving, 74.9% of the dishes exceeded the Chinese adults’ daily adequate intake for sodium (AI, 1500 mg per day), and 62.6% of dishes exceeded the proposed intake for preventing non-communicable chronic diseases (PI, 2000 mg per day). Cooking salt was the leading source of sodium in Chinese restaurant dishes (45.8%), followed by monosodium glutamate (17.5%), food ingredients (17.1%), soy sauce (9.4%), and other condiments/seasonings (10.2%). Use of more types of salted condiments/seasonings was related to higher sodium levels. Conclusions The sodium levels in Chinese restaurant dishes are extremely high and variable. In addition to cooking salt, other salted condiments/seasonings also contribute a large proportion of sodium. Coordinated sodium reduction initiatives targeting the main sources of sodium in restaurant dishes are urgently needed.
 
Study design and schedule used in this study. A 76-item food frequency questionnaire (FFQ) was administered by face-to-face interview at baseline (FFQ1) and 1 month later (FFQ2). Two three-day dietary recalls (DRs), two weeks apart, were completed by participants between FFQ1 and FFQ2. Reproducibility was tested by comparing the results from FFQ1 and FFQ2, and validity was assessed by comparing results from FFQ1 with those from the DRs
Background This study aimed to assess the reproducibility and validity of a food frequency questionnaire (FFQ) developed for diet-related studies in a rural population. Methods One hundred fifty-four healthy residents were interviewed with a 76-item FFQ at baseline (FFQ1) and 1 month later (FFQ2) to assess reproducibility, and were required to complete two three-day dietary recalls (DRs) between the two FFQs to determine validity by comparing the DRs with FFQ1. Results Crude Spearman correlation coefficients between FFQ1 and FFQ2 ranged from 0.58 to 0.92 and energy-adjusted coefficients ranged from 0.62 to 0.92; weighted kappa statistics ranged from 0.45 to 0.81, indicating moderate to good agreement. For validity, there were moderate to strong correlations (0.40–0.68) for most nutrients and foods between FFQ1 and the DRs; weighted kappa statistics indicated fair to moderate agreement for nutrients and foods (0.21–0.49). Conclusions The results suggest that the FFQ has reasonable reproducibility and validity in measuring most nutrient and food intakes, and it can be used to explore dietary habits when studying diet-disease relationships in Chinese rural populations.
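The two agreement statistics used here can be sketched as follows; the data are synthetic, and the tertile grouping and linear weighting are illustrative assumptions.

```python
# Spearman correlation and weighted kappa between two FFQ administrations.
import numpy as np
from scipy.stats import spearmanr
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(9)
ffq1 = rng.normal(50.0, 10.0, 154)            # intake estimates, wave 1
ffq2 = ffq1 + rng.normal(0.0, 5.0, 154)       # wave 2, correlated with wave 1

rho, p = spearmanr(ffq1, ffq2)
t1 = np.digitize(ffq1, np.quantile(ffq1, [1/3, 2/3]))   # tertile categories
t2 = np.digitize(ffq2, np.quantile(ffq2, [1/3, 2/3]))
kappa = cohen_kappa_score(t1, t2, weights="linear")
print(round(rho, 2), round(kappa, 2))
```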
 