Article

Darwinian applications to nutrition: The value of evolutionary insights to teachers and students


Abstract

Evolutionary biology informs us that the living world is a product of evolution, guided by the Darwinian mechanism of natural selection. This recognition has been fruitfully applied to a number of issues in health and nutrition sciences; however, it has not been incorporated into education. Nutrition and dietetics students generally learn little or nothing about evolution, despite the fact that evolution is the process by which our genetically determined physiological traits and needs were shaped. In the present paper, three examples of topics (inflammatory diseases, nutrition transition, and food intolerance) that can benefit from evolutionary information and reasoning are given, with relevant lines of research and inquiry provided throughout. It is argued that the application of evolutionary science to these and other areas of nutrition education can facilitate a deeper and more coherent teaching and learning experience. By recognizing and reframing nutrition as an aspect and discipline of biology, grounded in the fundamental principle of adaptation, revelatory light is shed on physiological states and responses, contentious and unresolved issues, genomic, epigenomic, and microbiomic features, and optimal nutrient status and intakes.

Article
Mounting evidence demonstrates that a high-salt diet (HSD) not only affects hemodynamic changes but also disrupts immune homeostasis. T helper 17 (Th17) cells and regulatory T cells (Tregs) are susceptible to hypersalinity. However, research on the influence of sodium on Th2-mediated food allergies remains scarce. We aimed to investigate the effect of dietary sodium on the immune response to food allergies. Mice maintained on an HSD (4% NaCl), a low-salt diet (LSD; 0.4% NaCl), or a control diet (CTRL; 1.0% NaCl) were orally sensitized with ovalbumin (OVA) and a cholera toxin (CT) adjuvant, and then subjected to an intragastric OVA challenge. OVA-specific immunoglobulin G (IgG), IgG1, IgG2a, and IgE antibodies were significantly higher in the HSD group than in the CTRL group (p < 0.001, p < 0.05, p < 0.01, and p < 0.05, respectively). Mice on the HSD had significantly higher interleukin (IL)-4 levels than the CTRL group (p < 0.01). The IL-10 levels were significantly lower in the HSD group than in the CTRL group (p < 0.05). The serum levels of interferon-γ (IFN-γ), sodium, and chloride did not differ among the three groups. This study indicates that excessive salt intake promotes Th2 responses in a mouse model of food allergy.
Article
Infectious diseases and infections remain a leading cause of death in low-income countries and a major risk to vulnerable groups, such as infants and the elderly. The immune system plays a crucial role in the susceptibility, persistence, and clearance of these infections. With 70–80% of immune cells being present in the gut, there is an intricate interplay between the intestinal microbiota, the intestinal epithelial layer, and the local mucosal immune system. In addition to the local mucosal immune responses in the gut, it is increasingly recognized that the gut microbiome also affects systemic immunity. Clinicians are increasingly drawing on the growing knowledge of these complex interactions between the immune system, the gut microbiome, and human pathogens. The now well-recognized impact of nutrition on the composition of the gut microbiota and the immune system elucidates the role nutrition can play in improving health. This review describes the mechanisms involved in maintaining the intricate balance between the microbiota, gut health, the local immune response, and systemic immunity, linking this to infectious diseases throughout life, and highlights the impact of nutrition in infectious disease prevention and treatment.
Article
This study assessed the effect of probiotic yogurt fortified with Lactobacillus acidophilus and Bifidobacterium sp. in patients with lactose intolerance. Fifty-five patients suffering from lactose intolerance were randomly divided into a control group of 28 patients who received nonprobiotic yogurt (100 ml) and an experimental group of 27 patients who received probiotic yogurt (100 ml) fortified with L. acidophilus and Bifidobacterium sp. Each individual received yogurt for one week. Lactose intolerance was confirmed by administering 75 g of lactose and monitoring for intolerance symptoms from 30 min to 3 hr, together with a hydrogen breath test (HBT). After the intervention, the hydrogen level was lower in the experimental group than in the control group, and lactose intolerance symptoms were much milder in the experimental group. Our findings revealed that probiotic yogurt fortified with L. acidophilus and Bifidobacterium sp. could safely and effectively reduce lactose intolerance symptoms and HBT results, and this probiotic can therefore be recommended as a treatment of choice for patients with lactose intolerance.
Article
The relatively rapid shift from consuming preagricultural wild foods for thousands of years to consuming, less than 200 years ago, the postindustrial semi-processed and ultra-processed foods endemic to the Western world did not allow for evolutionary adaptation of the commensal microbial species that inhabit the human gastrointestinal (GI) tract, and this has significantly impacted gut health. The human gut microbiota, the diverse and dynamic population of microbes, has been demonstrated to have extensive and important interactions with the digestive, immune, and nervous systems. Western diet-induced dysbiosis of the gut microbiota has been shown to negatively impact human digestive physiology, to have pathogenic effects on the immune system, and, in turn, to cause exaggerated neuroinflammation. Given the tremendous amount of evidence linking neuroinflammation with neural dysfunction, it is no surprise that the Western diet has been implicated in the development of many diseases and disorders of the brain, including memory impairments, neurodegenerative disorders, and depression. In this review, we discuss each of these concepts to understand how what we eat can lead to cognitive and psychiatric diseases.
Article
In this paper, we present updated data on proximate composition, amino acid and fatty acid composition, and concentrations of dioxins, polychlorinated biphenyls (PCBs), and selected heavy metals in fillets from farmed (n = 20), escaped (n = 17), and wild (n = 23) Atlantic salmon (Salmo salar L.). The concentrations of dioxins (0.53 ± 0.12 pg toxic equivalents (TEQ)/g), dioxin-like PCBs (0.95 ± 0.48 pg TEQ/g), mercury (56.3 ± 12.9 µg/kg), and arsenic (2.56 ± 0.87 mg/kg) were three times higher in wild than in farmed salmon, but all were well below the EU-uniform maximum levels for contaminants in food. The concentrations of the six ICES (International Council for the Exploration of the Sea) PCBs were higher in wild salmon (5.09 ± 0.83 ng/g) than in the farmed fish (3.34 ± 0.46 ng/g). The protein content was slightly higher in wild salmon (16%) than in the farmed fish (15%), and the amounts of essential amino acids were similar. The fat content of farmed salmon (18%) was three times that of the wild fish, and the proportion of marine long-chain omega-3 fatty acids was substantially lower (8.9 vs. 24.1%). The omega-6 to omega-3 fatty acid ratio was higher in farmed than in wild salmon (0.7 vs. 0.05). Both farmed and wild Atlantic salmon remain valuable sources of eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA): one 150 g portion per week contributes more (2.1 g and 1.8 g, respectively) than the recommended weekly intake for adults.
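As a back-of-the-envelope check on the per-portion figures quoted above, the short sketch below recomputes EPA+DHA per 150 g portion from the fat contents and omega-3 proportions given in the abstract. It assumes, purely for illustration, that fatty acids make up about 85% of fillet fat by weight and that the marine long-chain omega-3 fraction can be equated with EPA+DHA; neither assumption comes from the paper.

```python
# Rough arithmetic check of EPA+DHA per 150 g salmon portion.
# Assumptions (not from the paper): fatty acids ~85% of total fat by weight,
# and the "marine long-chain omega-3" fraction is treated as EPA+DHA.
PORTION_G = 150
FA_FRACTION_OF_FAT = 0.85  # assumed conversion factor

def epa_dha_per_portion(fat_fraction, n3_fraction_of_fa):
    """Grams of EPA+DHA in one portion, given fillet fat content and the
    share of fatty acids that are marine long-chain omega-3."""
    return PORTION_G * fat_fraction * FA_FRACTION_OF_FAT * n3_fraction_of_fa

print(f"farmed: ~{epa_dha_per_portion(0.18, 0.089):.1f} g")  # abstract reports 2.1 g
print(f"wild:   ~{epa_dha_per_portion(0.06, 0.241):.1f} g")  # abstract reports 1.8 g
```

The results land close to the reported values, which is all such a sanity check can show given the assumed fatty-acid conversion factor.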
Article
Vitamin D is responsible for the regulation of calcium and phosphate metabolism and for maintaining a healthy mineralized skeleton. It is also known as an immunomodulatory hormone. Experimental studies have shown that 1,25-dihydroxyvitamin D, the active form of vitamin D, exerts immunologic activities on multiple components of the innate and adaptive immune system as well as on endothelial membrane stability. An association between low levels of serum 25-hydroxyvitamin D and increased risk of developing several immune-related diseases and disorders, including psoriasis, type 1 diabetes, multiple sclerosis, rheumatoid arthritis, tuberculosis, sepsis, respiratory infection, and COVID-19, has been observed. Accordingly, a number of clinical trials aiming to determine the efficacy of administration of vitamin D and its metabolites for treatment of these diseases have been conducted, with variable outcomes. Interestingly, recent evidence suggests that some individuals might benefit from vitamin D more or less than others, as high inter-individual differences in broad gene expression in human peripheral blood mononuclear cells in response to vitamin D supplementation have been observed. Although it is still debatable what level of serum 25-hydroxyvitamin D is optimal, it is advisable to increase vitamin D intake and have sensible sunlight exposure to maintain serum 25-hydroxyvitamin D at a level of at least 30 ng/mL (75 nmol/L), and preferably at 40–60 ng/mL (100–150 nmol/L), to achieve the optimal overall health benefits of vitamin D.
Article
Although there are many recognized health benefits for the consumption of omega-3 (n-3) long-chain polyunsaturated fatty acids (LCPUFA), intake in the United States remains below recommended amounts. This analysis was designed to provide an updated assessment of fish and n-3 LCPUFA intake (eicosapentaenoic (EPA), docosahexaenoic acid (DHA), and EPA+DHA) in the United States adult population, based on education, income, and race/ethnicity, using data from the 2003-2014 National Health and Nutrition Examination Survey (NHANES) (n = 44,585). Over this survey period, participants with less education and lower income had significantly lower n-3 LCPUFA intakes and fish intakes (p < 0.001 for all between group comparisons). N-3 LCPUFA intake differed significantly according to ethnicity (p < 0.001), with the highest intake of n-3 LCPUFA and fish in individuals in the “Other” category (including Asian Americans). Supplement use increased EPA + DHA intake, but only 7.4% of individuals consistently took supplements. Overall, n-3 LCPUFA intake in this study population was low, but our findings indicate that individuals with lower educational attainment and income are at even higher risk of lower n-3 LCPUFA and fish intake.
Article
Our understanding of how diet affects health is limited to 150 key nutritional components that are tracked and catalogued by the United States Department of Agriculture and other national databases. Although this knowledge has been transformative for health sciences, helping unveil the role of calories, sugar, fat, vitamins and other nutritional factors in the emergence of common diseases, these nutritional components represent only a small fraction of the more than 26,000 distinct, definable biochemicals present in our food—many of which have documented effects on health but remain unquantified in any systematic fashion across different individual foods. Using new advances such as machine learning, a high-resolution library of these biochemicals could enable the systematic study of the full biochemical spectrum of our diets, opening new avenues for understanding the composition of what we eat, and how it affects health and disease.
Article
The relationship of evolution with diet and environment can provide insights into modern disease. Fossil evidence shows that apes and early human ancestors were fruit-eaters living in environments with strongly seasonal climates. Rapid cooling at the end of the middle Miocene (15-12 Ma: millions of years ago) increased seasonality in Africa and Europe, and ape survival may be linked with a mutation in uric acid metabolism. Climate stabilized in the later Miocene and Pliocene (12-5 Ma), and fossil apes and early hominins were both adapted for life on the ground and in trees. Around 2.5 Ma, early species of Homo introduced more animal products into their diet, and this coincided with developing bipedalism, stone tool technology and an increase in brain size. Early species of Homo such as Homo habilis still lived in woodland habitats, and the major habitat shift in human evolution occurred at 1.8 Ma with the origin of Homo erectus. Homo erectus had increased body size, greater hunting skills, a diet rich in meat, control of fire and an understanding of cooking food, and moved from woodland to savanna. Group size may also have increased at the same time, facilitating the transmission of knowledge from one generation to the next. The earliest fossils of Homo sapiens appeared about 300 kyr ago, but the lineage had separated from Neanderthals by 480 kyr ago or earlier. The human diet shifted towards grain-based foods about 100 kyr ago, and settled agriculture developed about 10 kyr ago. This pattern remains for many populations to this day and provides important insights into the current burden of lifestyle diseases.
Article
Anatomically modern humans originated in Africa around 200 thousand years ago (ka) [1-4]. Although some of the oldest skeletal remains suggest an eastern African origin [2], southern Africa is home to contemporary populations that represent the earliest branch of human genetic phylogeny [5,6]. Here we generate, to our knowledge, the largest resource for the poorly represented and deepest-rooting maternal L0 mitochondrial DNA branch (198 new mitogenomes for a total of 1,217 mitogenomes) from contemporary southern Africans and show the geographical isolation of L0d1’2, L0k and L0g KhoeSan descendants south of the Zambezi river in Africa. By establishing mitogenomic timelines, frequencies and dispersals, we show that the L0 lineage emerged within the residual Makgadikgadi–Okavango palaeo-wetland of southern Africa [7], approximately 200 ka (95% confidence interval, 240–165 ka). Genetic divergence points to a sustained 70,000-year-long existence of the L0 lineage before an out-of-homeland northeast–southwest dispersal between 130 and 110 ka. Palaeo-climate proxy and model data suggest that increased humidity opened green corridors, first to the northeast then to the southwest. Subsequent drying of the homeland corresponds to a sustained effective population size (L0k), whereas wet–dry cycles and probable adaptation to marine foraging allowed the southwestern migrants to achieve population growth (L0d1’2), as supported by extensive south-coastal archaeological evidence [8-10]. Taken together, we propose a southern African origin of anatomically modern humans with sustained homeland occupation before the first migrations of people that appear to have been driven by regional climate changes.
Article
Background: Irritable bowel syndrome (IBS) is the most common functional digestive condition in the industrialized world. The gut microbiota plays a key role in disease pathogenesis. Objective: A systematic review and meta-analysis of case-control studies was conducted to determine whether there is gut microbial dysbiosis in participants with IBS in comparison with healthy controls and, if so, whether the dysbiosis pattern differs among IBS subtypes and geographic regions. Methods: This review was conducted and reported according to the MOOSE (Meta-Analysis of Observational Studies in Epidemiology) 2000 and PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) 2009 guidelines. Research articles published up to May 9, 2018 were identified through MEDLINE (PubMed), Cochrane Central Register of Controlled Trials (Cochrane Library), ClinicalTrials.gov, EMBASE, and Web of Science. Study quality was assessed using the Newcastle-Ottawa Scale. Case-control studies of participants with IBS who had undergone quantitative gut microbial stool analysis were included. The primary exposure measure of interest was log10 bacterial counts per gram of stool. Meta-analyses were performed to estimate the mean difference (MD) in gut microbiota between participants with IBS and healthy controls using the random-effects model with inverse variance in RevMan 5.3 and R 3.5.1. Publication bias was assessed with funnel plots and Egger's test. Between-study heterogeneity was analyzed using the Higgins I2 statistic with 95% CIs. Results: There were 6,333 unique articles identified; 52 qualified for full-text screening. Of these, 23 studies were included for analysis (n=1,340 participants from North America, Europe, and Asia). Overall, the studies were moderate in quality. Compared with healthy controls, participants with IBS had lower fecal Lactobacillus (MD = -0.57 log10 colony-forming units [CFU]/g; P<0.01) and Bifidobacterium (MD = -1.04 log10 CFU/g; P<0.01), higher Escherichia coli (MD = 0.60 log10 CFU/g; P<0.01), and marginally higher Enterobacter (MD = 0.74 log10 CFU/g; P=0.05). No difference was found between participants with IBS and healthy controls in fecal Bacteroides and Enterococcus (P=0.18 and 0.68, respectively). Publication bias was not observed except for Bifidobacterium (P=0.015). Subgroup analyses of participants with diarrhea-predominant and constipation-predominant IBS showed results consistent with the primary analysis. A subgroup analysis of Chinese studies was also consistent with the primary results, except for fecal Bacteroides, which was increased in participants with IBS vs healthy controls (MD=0.29; 95% CI 0.13 to 0.46; P<0.01). Although substantial heterogeneity was detected (I2>75%) in most comparisons, the direction of the effect estimates is relatively consistent across studies. Conclusions: IBS is characterized by gut microbial dysbiosis. Prospective, large-scale studies are needed to delineate how gut microbial profiles can be used to guide targeted therapies in this challenging patient population.
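For readers unfamiliar with the pooling step described above, the following is a minimal sketch of inverse-variance random-effects meta-analysis of mean differences (the DerSimonian-Laird approach commonly used in RevMan); the per-study values are invented placeholders, not data from the review.

```python
# Minimal DerSimonian-Laird random-effects pooling of mean differences (MD).
# Illustrative only: the study MDs and standard errors below are made up,
# not taken from the review summarized above.
import math

mds = [-0.9, -0.1, -0.7, -0.2]   # hypothetical per-study MDs (log10 CFU/g)
ses = [0.15, 0.15, 0.20, 0.20]   # hypothetical standard errors

w_fixed = [1 / se**2 for se in ses]               # fixed-effect (inverse-variance) weights
md_fixed = sum(w * m for w, m in zip(w_fixed, mds)) / sum(w_fixed)

# Cochran's Q and the DerSimonian-Laird estimate of between-study variance tau^2
q = sum(w * (m - md_fixed) ** 2 for w, m in zip(w_fixed, mds))
df = len(mds) - 1
c = sum(w_fixed) - sum(w**2 for w in w_fixed) / sum(w_fixed)
tau2 = max(0.0, (q - df) / c)

w_rand = [1 / (se**2 + tau2) for se in ses]       # random-effects weights
md_rand = sum(w * m for w, m in zip(w_rand, mds)) / sum(w_rand)
se_rand = math.sqrt(1 / sum(w_rand))
i2 = max(0.0, (q - df) / q) * 100                 # Higgins I^2 (%)

print(f"pooled MD = {md_rand:.2f} (95% CI {md_rand - 1.96*se_rand:.2f} "
      f"to {md_rand + 1.96*se_rand:.2f}), I^2 = {i2:.0f}%")
```

With these placeholder inputs the sketch also reproduces the kind of high I^2 (>75%) heterogeneity the review reports, which is why a random-effects rather than fixed-effect model is used.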
Article
Atherosclerotic cardiovascular disease is a leading cause of death and morbidity globally. Over the past several years, arterial inflammation has been implicated in the pathophysiology of athero-thrombosis, substantially confirming what pathologist Rudolf Virchow had observed in the 19th century. Lipid lowering, lifestyle changes, and modification of other risk factors have reduced cardiovascular complications of athero-thrombosis, but a substantial residual risk remains. In view of the pathogenic role of inflammation in athero-thrombosis, directly targeting inflammation has emerged as an additional potential therapeutic option; and some early promising results have been suggested by the Canakinumab Anti-inflammatory Thrombosis Outcome Study (CANTOS), in which canakinumab, a fully human monoclonal antibody targeting the pro-inflammatory and pro-atherogenic cytokine interleukin 1 beta, was shown to reduce cardiovascular events.
Article
Depression and fatigue are conditions responsible for a heavy global societal burden, especially in patients already suffering from chronic diseases. Those affected identify these symptoms as some of the most disabling, impairing both quality of life and productivity. While many factors play a role in the development of depression and fatigue, both have been associated with increased inflammatory activation of the immune system affecting both the periphery and the central nervous system (CNS). This is further supported by the well-described association between these symptoms and diseases that involve immune activation, such as autoimmune disorders like multiple sclerosis, and immune system activation in response to infections, like sepsis. Treatments for depression also support this immunopsychiatric link. Antidepressants have been shown to decrease inflammation, while higher levels of baseline inflammation predict lower treatment efficacy for most treatments. Patients with higher initial immune activation may, on the other hand, be more responsive to treatments targeting immune pathways, which have been found to be effective in treating depression and fatigue in some cases. These results strongly support the hypothesis that depression and fatigue are associated with increased activation of the immune system, which may serve as a valid target for treatment. Further studies focusing on the pathways involved in these symptoms, and on the development of treatments that target those pathways, will help us to better understand these conditions and devise more targeted treatments.
Article
Globally, ∼70% of adults are deficient in intestinal lactase, the enzyme required for the digestion of lactose. In these individuals, the consumption of lactose-containing milk and dairy products can lead to the development of various gastrointestinal (GI) symptoms. The primary solution to lactose intolerance is withdrawing lactose from the diet either by eliminating dairy products altogether or substituting lactose-free alternatives. However, studies have shown that certain individuals erroneously attribute their GI symptoms to lactose and thus prefer to consume lactose-free products. This has raised the question whether consuming lactose-free products reduces an individual's ability to absorb dietary lactose and if lactose-absorbers should thus avoid these products. This review summarizes the current knowledge regarding the acclimatization of lactose processing in humans. Human studies that have attempted to induce intestinal lactase expression with different lactose feeding protocols have consistently shown lack of enzyme induction. Similarly, withdrawing lactose from the diet does not reduce intestinal lactase expression. Evidence from cross-sectional studies shows that milk or dairy consumption is a poor indicator of lactase status, corroborating the results of intervention studies. However, in lactase-deficient individuals, lactose feeding supports the growth of lactose-digesting bacteria in the colon, which enhances colonic lactose processing and possibly results in the reduction of intolerance symptoms. This process is referred to as colonic adaptation. In conclusion, endogenous lactase expression does not depend on the presence of dietary lactose, but in susceptible individuals, dietary lactose might improve intolerance symptoms via colonic adaptation. For these individuals, lactose withdrawal results in the loss of colonic adaptation, which might lower the threshold for intolerance symptoms if lactose is reintroduced into the diet.
Article
Modern lifestyle limits our exposure to sunlight, which drives the photosynthesis of vitamin D in the skin, and the incidence of nutritional rickets has been resurging. Vitamin D is one of the most ancient hormones; it is photosynthesized in all organisms, from phytoplankton to mammals. A selective sweep of the promoter of the vitamin D receptor (VDR) occurred as soon as Homo sapiens migrated out of Africa; it co-adapted with skin color genes to provide adaptation to latitude and to the levels of exposure to ultraviolet B (UVB) radiation along the route out of Africa. UVB exposure must balance the need for vitamin D photosynthesis against the UVB-driven degradation of folic acid. Skin color follows a latitudinal distribution: the darkest populations dwell in the tropical belt, and the fair-skinned populations inhabit the northern countries. Because of their greater need for calcium during their reproductive life, the skin color of women is lighter than that of men. Vitamin D is essential for mineral homeostasis and has a wide variety of non-skeletal functions, of which the most important for natural selection is a regulatory function in the innate immune system. In the human fossil record, vitamin D deficiency coincided with bone tuberculosis. About 6,000 years ago, a diet that included cow's milk provided Neolithic humans with twice as much calcium and was more alkaline than that of their Paleolithic predecessors. Adiposity is negatively associated with vitamin D status, and obese individuals require 2–3 times more vitamin D than non-obese individuals to normalize circulating 25OHD levels. In an era of an obesity epidemic, more research is needed to determine whether adiposity should be considered when determining the dietary requirements for vitamin D and calcium and the optimal serum 25OHD levels.
Article
The microbiome is composed of hundreds of interacting species that have co-evolved with the host and alterations in microbiome composition have been associated with health and disease. Insights from evolutionary ecology may aid efforts to ameliorate microbiome-associated diseases. One step toward this goal involves recognition that the idea of commensalism has been applied too broadly to human/microbe symbioses. Commensalism is most accurately viewed on a symbiosis continuum as a dividing line that separates a spectrum of mutualisms of decreasing positive interdependence from parasitisms of increasing severity. Insights into the evolution of the gut microbial symbiosis continuum will help distinguish between human actions that will advance or hinder health. Theory and research indicate that a major benefit of mutualistic microbes will be protection against pathogens. Mismatches between current and ancestral diets may disfavor mutualists, resulting in microbiome effects on health problems, including obesity, diabetes, autism, and childhood allergy. Evolutionary theory indicates that mutualisms will be favored when symbionts depend on resources that are not used by the host. These resources, which are referred to as human-inaccessible microbiota-accessible carbohydrates (HIMACs), can be supplied naturally through diet. Public health interventions need to consider the position of gut microbes on the mutualist-parasite continuum and the specific associations between prebiotics, such as HIMACs, and the mutualists they support. Otherwise interventions may fail to restore the match between human adaptations, diet, and microbiome function and may thereby fail to improve health and even inadvertently promote illness.
Article
Lay Summary: Through an online survey of nutrition and dietetic professionals and students, we learned there is interest in incorporating evolutionary medicine into the nutrition and dietetics field and education programs. Background and objectives: Evolutionary medicine is an emerging field that examines the evolutionary significance of modern disease to develop new preventative strategies or treatments. While many areas of interest in evolutionary medicine and public health involve diet, we currently lack an understanding of whether nutrition and dietetics professionals and students appreciate the potential of evolutionary medicine. Methodology: Cross-sectional online survey to measure the level of appreciation, applicability and knowledge of evolutionary medicine among nutrition and dietetics professionals and students. We then examined the relationships between support of evolutionary medicine and (i) professionals and students, (ii) US region, (iii) religious belief and (iv) existing evolutionary knowledge. Results: A total of 2039 people participated: students (n = 893) and professionals (n = 1146). The majority of the participants agree they are knowledgeable on the theory of evolution (59%), an understanding of evolution can aid the nutrition and dietetics field (58%), an evolutionary perspective would be beneficial in dietetics education (51%) and it is equally important to understand both the evolutionary and direct causes of disease (71%). Significant differences in responses between professionals and students suggest students are currently learning more about evolution and are also more supportive of using an evolutionary perspective. Whereas differences in responses by US region were minimal, differences by religious belief and prior evolutionary knowledge were significant; however, all responses were either neutral or supportive at varying strengths. Conclusion and implications: There is interest among professionals and students in incorporating evolutionary medicine into the nutrition and dietetics field and education programs.
Article
Alzheimer's disease (AD) is a progressive neurodegenerative disorder that is characterized by cognitive decline and the presence of two core pathologies, amyloid β plaques and neurofibrillary tangles. Over the last decade, the presence of a sustained immune response in the brain has emerged as a third core pathology in AD. The sustained activation of the brain's resident macrophages (microglia) and other immune cells has been demonstrated to exacerbate both amyloid and tau pathology and may serve as a link in the pathogenesis of the disorder. In the following review, we provide an overview of inflammation in AD and a detailed coverage of a number of microglia-related signaling mechanisms that have been implicated in AD. Additional information on microglia signaling and a number of cytokines in AD are also reviewed. We also review the potential connection of risk factors for AD and how they may be related to inflammatory mechanisms.
Article
Background: Humans and other primates have evolved particular morphological and biological traits (e.g., larger brains, slower growth, longer-lived offspring) that distinguish them from most other mammals. The evolution of many distinctive human characteristics, such as our large brain sizes, reduced gut sizes, and high activity budgets, suggests major energetic and dietary shifts. Main body: Over the course of the last three million years, hominin brain sizes tripled. It is often taken for granted that the benefit of a larger brain is an increase in “intelligence” that makes us stand out among other mammals, including our nearest relatives, the primates. In the case of humans, brain expansion was associated with changes in diet, foraging, and energy metabolism. The first marked expansion occurred with the appearance of the genus Homo. Improved diet quality, allomaternal subsidies, cognitive buffering (through earlier weaning and longer juvenile periods), reduced costs of locomotion, cooperative behavior, and reduced allocation to production all operated simultaneously, thus enabling the extraordinary brain enlargement in our lineage. Conclusion: It appears that the major expansion of brain size in the human lineage is the product of synergistically interacting dietary/nutritional and social forces. Although dietary change was not the sole force responsible for the evolution of large brain size, the exploitation of high-quality foods likely fueled the energetic costs of larger brains and necessitated more complex behaviors that would have selected for greater brain size.
Article
Chronic or persistent fatigue is a common, debilitating symptom of several diseases. Persistent fatigue has been associated with low-grade inflammation in several models of fatigue, including cancer-related fatigue and chronic fatigue syndrome. However, it is unclear how low-grade inflammation leads to the experience of fatigue. We here propose a model of an imbalance in energy availability and energy expenditure as a consequence of low-grade inflammation. In this narrative review, we discuss how chronic low-grade inflammation can lead to reduced cellular-energy availability. Low-grade inflammation induces a metabolic switch from energy-efficient oxidative phosphorylation to fast-acting, but less efficient, aerobic glycolytic energy production; increases reactive oxygen species; and reduces insulin sensitivity. These effects result in reduced glucose availability and, thereby, reduced cellular energy. In addition, emerging evidence suggests that chronic low-grade inflammation is associated with increased willingness to exert effort under specific circumstances. Circadian-rhythm changes and sleep disturbances might mediate the effects of inflammation on cellular-energy availability and non-adaptive energy expenditure. In the second part of the review, we present evidence for these metabolic pathways in models of persistent fatigue, focusing on chronic fatigue syndrome and cancer-related fatigue. Most evidence for reduced cellular-energy availability in relation to fatigue comes from studies on chronic fatigue syndrome. While the mechanistic evidence from the cancer-related fatigue literature is still limited, the sparse results point to reduced cellular-energy availability as well. There is also mounting evidence that behavioral-energy expenditure exceeds the reduced cellular-energy availability in patients with persistent fatigue. This suggests that an inability to adjust energy expenditure to available resources might be one mechanism underlying persistent fatigue.
Article
Since 2006, the incidence of type 1 diabetes (T1D) in Finland has been decreasing, after an initial plateau, preceded by an increase in serum 25OHD that followed the authorities' decision to fortify dietary milk products with cholecalciferol. The role of vitamin D in innate and adaptive immunity is critical. A statistical error in the estimation of the Recommended Dietary Allowance (RDA) for vitamin D was recently discovered: correctly analyzing the same data used by the Institute of Medicine indicates that 8,895 IU/day are needed for 97.5% of individuals to achieve values of at least 50 nmol/l. These results were confirmed by analyses showing that 6,201 IU/day are needed to achieve 75 nmol/l and 9,122 IU/day to reach 100 nmol/l. The largest meta-analysis ever conducted, of studies published between January 1966 and January 2013, showed that 25(OH)D < 75 nmol/l may be too low for safety and is associated with higher all-cause mortality, demolishing the previously presumed U-shape. Since all-disease mortality is reduced to 1.0 with serum vitamin D at or above 100 nmol/l, we call on public health authorities to consider designating, as the RDA, at least 3/4 of the doses proposed by the Endocrine Society Expert Committee as safe upper tolerable daily intakes: 2,000 IU for those under 1 year, 4,000 IU for ages 1-18 years, and 10,000 IU for those over 18 years. This could translate to, for example, 1,000 IU for children under 1 year on enriched formula and 1,500 IU for breastfed children older than 6 months; 3,000 IU for children over 1 year; and around 8,000 IU for young adults and thereafter. Actions are urgently needed to safely protect global health from vitamin D deficiency.
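The "statistical error" referred to above concerns deriving the intake that covers 97.5% of individuals from the spread of study means rather than from person-to-person variability. The simulation below is only an illustration of that distinction; the dose-response coefficients and variance components are invented and are not the IOM's or the reanalysis's values.

```python
# Illustrative simulation of the statistical issue described above: an intake at which
# nearly all STUDY MEANS exceed 50 nmol/L can still leave many INDIVIDUALS below it.
# Slope, intercept, and SD are invented for illustration only.
import numpy as np

rng = np.random.default_rng(42)

def serum_25ohd(dose_iu, n, between_person_sd=18.0):
    """Hypothetical serum 25(OH)D (nmol/L) for n individuals at a given daily dose."""
    mean_response = 28.0 + 0.0135 * dose_iu          # assumed average dose-response
    return mean_response + rng.normal(0.0, between_person_sd, size=n)

dose = 2000                                           # hypothetical RDA candidate, IU/day
people = serum_25ohd(dose, n=100_000)
study_means = np.array([serum_25ohd(dose, n=60).mean() for _ in range(300)])

print(f"study means >= 50 nmol/L: {np.mean(study_means >= 50):.1%}")  # looks adequate
print(f"individuals >= 50 nmol/L: {np.mean(people >= 50):.1%}")       # far from 97.5%
```

In this toy setting almost every study mean clears the 50 nmol/L target while only around 60% of simulated individuals do, which is the gap the re-estimated RDAs cited above are meant to close.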
Article
The diverse microbial community that inhabits the human gut has an extensive metabolic repertoire that is distinct from, but complements the activity of mammalian enzymes in the liver and gut mucosa and includes functions essential for host digestion. As such, the gut microbiota is a key factor in shaping the biochemical profile of the diet and, therefore, its impact on host health and disease. The important role that the gut microbiota appears to play in human metabolism and health has stimulated research into the identification of specific microorganisms involved in different processes, and the elucidation of metabolic pathways, particularly those associated with metabolism of dietary components and some host-generated substances. In the first part of the review, we discuss the main gut microorganisms, particularly bacteria, and microbial pathways associated with the metabolism of dietary carbohydrates (to short chain fatty acids and gases), proteins, plant polyphenols, bile acids, and vitamins. The second part of the review focuses on the methodologies, existing and novel, that can be employed to explore gut microbial pathways of metabolism. These include mathematical models, omics techniques, isolated microbes, and enzyme assays.
Article
Vitamin D deficiency in pregnancy has negative clinical consequences, such as associations with glucose intolerance, and has been shown to be distributed differently in certain ethnic groups. In some countries, a difference in the rate of vitamin D deficiency was detected in pregnant women depending on their skin color. We examined the prevalence of vitamin D deficiency (<20 ng/mL) in women in early pregnancy in Switzerland and evaluated the association of skin color with vitamin D deficiency. In a single-center cohort study, the validated Fitzpatrick scale and the objective melanin index were used to determine skin color. Of the 204 pregnant women included, 63% were vitamin D deficient. The mean serum 25-hydroxyvitamin D concentration was 26.1 ng/mL (95% confidence interval (CI) 24.8–27.4) in vitamin D–sufficient women and 10.5 ng/mL (95% CI 9.7–11.5) in women with deficiency. In the most parsimonious model, women with dark skin color were statistically significantly more often vitamin D deficient than women with light skin color (OR 2.60; 95% CI 1.08–6.22; adjusted for age, season, vitamin D supplement use, body mass index, smoking, parity). This calls for more intensive counseling on vitamin D supplement use during pregnancy as one policy option to improve vitamin D status, in particular for women with darker skin color.
Article
Obesity is associated with physical inactivity, which exacerbates the health consequences of weight gain. However, the mechanisms that mediate this association are unknown. We hypothesized that deficits in dopamine signaling contribute to physical inactivity in obesity. To investigate this, we quantified multiple aspects of dopamine signaling in lean and obese mice. We found that D2-type receptor (D2R) binding in the striatum, but not D1-type receptor binding or dopamine levels, was reduced in obese mice. Genetically removing D2Rs from striatal medium spiny neurons was sufficient to reduce motor activity in lean mice, whereas restoring Gi signaling in these neurons increased activity in obese mice. Surprisingly, although mice with low D2Rs were less active, they were not more vulnerable to diet-induced weight gain than control mice. We conclude that deficits in striatal D2R signaling contribute to physical inactivity in obesity, but inactivity is more a consequence than a cause of obesity.
Article
Crosstalk between inflammatory pathways and neurocircuits in the brain can lead to behavioural responses, such as avoidance and alarm, that are likely to have provided early humans with an evolutionary advantage in their interactions with pathogens and predators. However, in modern times, such interactions between inflammation and the brain appear to drive the development of depression and may contribute to non-responsiveness to current antidepressant therapies. Recent data have elucidated the mechanisms by which the innate and adaptive immune systems interact with neurotransmitters and neurocircuits to influence the risk for depression. Here, we detail our current understanding of these pathways and discuss the therapeutic potential of targeting the immune system to treat depression.
Article
Objectives: To evaluate whether dietary vitamin D intake is adequate for sufficient vitamin D status during early winter in children living in Sweden, irrespective of latitude or skin color. Methods: As part of a prospective, comparative, two-center intervention study in northern (63°N) and southern (55°N) Sweden, dietary intake, serum 25-hydroxyvitamin D (S-25(OH)D), associated laboratory variables, and socio-demographic data were studied in 5- to 7-year-old children with fair and dark skin in November and December. Results: 206 children with fair/dark skin were included: 44/41 and 64/57 children in northern and southern Sweden, respectively. Dietary vitamin D intake was higher in northern than in southern Sweden (p=0.001), irrespective of skin color, partly due to higher consumption of fortified foods, but met only 50-70% of the national recommendation (10 μg/day). S-25(OH)D was higher in northern than in southern Sweden, in children with fair (67 vs. 59 nmol/L; p < 0.05) and dark skin (56 vs. 42 nmol/L; p < 0.001). S-25(OH)D was lower in dark- than fair-skinned children at both sites (p < 0.01), and below 50 nmol/L in 40% and 75% of dark-skinned children in northern and southern Sweden, respectively. Conclusions: Insufficient vitamin D status was common during early winter in children living in Sweden, particularly in those with dark skin. Although higher dietary vitamin D intake in northern than in southern Sweden attenuated the effects of latitude, northern residence combined with darker skin and a vitamin D intake below recommendations are important risk factors for vitamin D insufficiency.
Article
The evolution of the human brain has involved a combination of reorganization of brain components and increases in brain size, through both hyperplasia and hypertrophy during development, underlain by neurogenomic changes involving epigenetic modifications that largely affect the regulation of growth dynamics. While both genomics and comparative neuroanatomical studies are invaluable for understanding how brains and behavior correlate, it is paleoneurology, based on endocast studies (chapter “Virtual Anthropology and Biomechanics,” Vol. 1), that provides the direct evidence of volume changes through time. Some convolutional details of the underlying cerebral cortex do appear on the endocranial surface. These details allow one to recognize reorganizational changes that include (1) a reduction of primary visual cortex and relative enlargement of posterior association cortex, (2) expanded Broca’s regions, and (3) cerebral asymmetries. The size of the hominid brain increased from about 450 ml some 3.5 Ma ago to our current average volume of 1,350 ml, with a slight reduction since Neolithic times. Many more data from additional fossils will be necessary to decide how and when these changes in size and organization occurred and whether they were gradual or punctuated.
Article
We propose that plant foods containing high quantities of starch were essential for the evolution of the human phenotype during the Pleistocene. Although previous studies have highlighted a stone tool-mediated shift from primarily plant-based to primarily meat-based diets as critical in the development of the brain and other human traits, we argue that digestible carbohydrates were also necessary to accommodate the increased metabolic demands of a growing brain. Furthermore, we acknowledge the adaptive role cooking played in improving the digestibility and palatability of key carbohydrates. We provide evidence that cooked starch, a source of preformed glucose, greatly increased energy availability to human tissues with high glucose demands, such as the brain, red blood cells, and the developing fetus. We also highlight the auxiliary role copy number variation in the salivary amylase genes may have played in increasing the importance of starch in human evolution following the origins of cooking. Salivary amylases are largely ineffective on raw crystalline starch, but cooking substantially increases both their energy-yielding potential and glycemia. Although uncertainties remain regarding the antiquity of cooking and the origins of salivary amylase gene copy number variation, the hypothesis we present makes a testable prediction that these events are correlated.
Article
Researchers Olli Arjamaa and Timo Vuorisalo describe the biological and cultural evolution of hominid diets, concluding with three examples of cultural evolution that led to genetic changes in Homo sapiens. The first hominid species arose 10 to 7 million years ago in late Miocene Africa. Subsequent changes in diet were made possible in part by stone tools used to manipulate food items; the oldest known stone tools date back to 2.6 million years. Progressive changes in diet were associated with changes in body size and anatomy. Increased use of C4 plants was gradually followed by increased consumption of meat, either scavenged or hunted. Several factors contributed to increased meat availability; first among them, savanna ecosystems with several modern characteristics started to spread about 1.8 million years ago. A classic example of gene-culture co-evolution is lactase persistence (LP) in human adults.
Article
Cardiovascular disease is the leading cause of premature mortality in the developed world, and hypertension is its most important risk factor. Controlling hypertension is a major focus of public health initiatives, and dietary approaches have historically focused on sodium. While the potential benefits of sodium-reduction strategies are debatable, one fact about which there is little debate is that the predominant sources of sodium in the diet are industrially processed foods. Processed foods also happen to be generally high in added sugars, the consumption of which might be more strongly and directly associated with hypertension and cardiometabolic risk. Evidence from epidemiological studies and experimental trials in animals and humans suggests that added sugars, particularly fructose, may increase blood pressure and blood pressure variability, increase heart rate and myocardial oxygen demand, and contribute to inflammation, insulin resistance and broader metabolic dysfunction. Thus, while there is no argument that recommendations to reduce consumption of processed foods are highly appropriate and advisable, the arguments in this review are that the benefits of such recommendations might have less to do with sodium—minimally related to blood pressure and perhaps even inversely related to cardiovascular risk—and more to do with highly-refined carbohydrates. It is time for guideline committees to shift focus away from salt and focus greater attention to the likely more-consequential food additive: sugar. A reduction in the intake of added sugars, particularly fructose, and specifically in the quantities and context of industrially-manufactured consumables, would help not only curb hypertension rates, but might also help address broader problems related to cardiometabolic disease.
Article
Aim: The aim was to study vitamin D status in a healthy adolescent Norwegian population at 69°N. Methods: The data presented come from The Tromsø Study: Fit Futures, conducted during the school year 2010/2011 (not including the summer months), in which 1,038 adolescents (92% of those invited) participated. Physical examinations, questionnaires and blood samples were collected, and serum 25-hydroxyvitamin D (25(OH)D) was analyzed using LC-MS/MS. Results: Results are presented for 475 boys and 415 girls (15-18 years old) with available blood samples. A total of 60.2% had vitamin D deficiency or insufficiency (serum 25(OH)D <50 nmol/l), 16.5% were deficient (<25 nmol/l) and 1.6% had severe vitamin D deficiency (<12.5 nmol/l). Only 12.4% had levels >75 nmol/l. There was a significant gender difference, with a mean (SD) serum 25(OH)D level of 40.5 (20.5) nmol/l in boys and 54.2 (23.2) nmol/l in girls (p <0.01). Furthermore, 51.3% of girls had levels >50 nmol/l compared with 29.7% of boys (p <0.01). There was an inverse correlation between parathyroid hormone levels and 25(OH)D, rs = -0.30 (p<0.01). Explanatory factors significantly associated with serum 25(OH)D levels in multivariate models were use of snuff, consumption of vitamin D fortified milk, cod liver oil and vitamin/mineral supplements, physical activity, sunbathing holidays and use of a solarium in boys, and vitamin/mineral supplements, physical activity, sunbathing holidays and use of a solarium in girls. Conclusions: Vitamin D deficiency is prevalent during the school year among adolescents in northern Norway, particularly among boys.
Article
More than a quarter of human populations now suffer from hypertension, paralleling the marked increase in dietary salt intake over the past several decades. Despite overwhelming experimental and epidemiological evidence, some still debate the relation between salt and hypertension. Pointing to conflicting data in a few flawed studies, they argue that policy interventions to reduce the dietary intake of salt are premature and may be unsafe without further study. A brief review of the data relating salt intake to hypertension, along with an overview of the history of the introduction of salt into the human diet on historic and evolutionary time scales, should help dispel doubts about the effectiveness and safety of a low-salt diet. The recorded history confirms how rare and inaccessible salt has been until recent times. Like all other terrestrial life forms, humans evolved in a salt-free environment under intense evolutionary pressure for the selection of salt-conserving genes. Hypertension is a prototypical evolutionary maladaptation disorder of modern humans: a species exquisitely well adapted to low-salt conditions suddenly confronted with salt excess. The World Health Organization and many governments have finally taken action to reduce dietary intake of salt, which has already started to reduce the burden of hypertension and the associated cardiovascular morbidity and mortality. This brief review broadly examines the evidence linking salt to hypertension from a historic and evolutionary perspective, as well as touching upon some of the epidemiological and experimental data.
Article
While modern life entails numerous changes in human lifestyle, our diet has been gaining attention as a potential contributor to the increase in immune-mediated diseases. The Western diet is characterized by an overconsumption and reduced variety of refined sugars, salt, and saturated fat. Herein our objective is to detail the mechanisms by which the Western diet impacts immune function. The manuscript reviews the impacts and mechanisms of harm of our over-indulgence in sugar, salt, and fat, as well as the data outlining the impacts of artificial sweeteners, gluten, and genetically modified foods; attention is given to identifying where the literature on the immune impacts of macronutrients is limited to animal or in vitro models versus where human trials exist. Detailed attention is given to the dietary impact on the gut microbiome and the mechanisms by which our poor dietary choices are encoded into our gut, into our genes, and passed to our offspring. While today's modern diet may provide beneficial protection from micro- and macronutrient deficiencies, our overabundance of calories and the macronutrients that compose our diet may all lead to increased inflammation, reduced control of infection, increased rates of cancer, and increased risk for allergic and auto-inflammatory disease.
Article
In the wild, western lowland gorillas consume a diet high in fiber and low in caloric density. In contrast, many gorillas in zoos consume a diet that is high‐calorie and low in fiber. Some items commonly used in captive gorilla diets contain high levels of starch and sugars, which are minimal in the natural diet of gorillas. There is a growing concern that captive gorillas may qualify as obese. Furthermore, the leading cause of death for adult male gorillas in zoos is heart disease. In humans, a diet that is high in simple carbohydrates is associated with both obesity and the incidence of heart disease. In response to these issues, we implemented a biscuit‐free diet (free of biscuits and low in fruit) and measured serum biomarkers of obesity and insulin resistance pre‐ and post‐diet change at three institutions: North Carolina Zoological Garden, Cleveland Metroparks Zoo, and Columbus Zoo and Aquarium. We also added a resistant starch supplement to gorilla diets at two of the above institutions. We anticipated that these diet changes would positively affect biomarkers of obesity and insulin resistance. Both diet manipulations led to a reduction in insulin. Resistant starch also decreased overall serum cholesterol levels. Future research will examine these health changes in a greater number of individuals to determine if the results remain consistent with these preliminary findings. Zoo Biol. 33:74–80, 2014. © 2014 Wiley Periodicals, Inc.
Article
Objective: A diet low in fermentable oligosaccharides, disaccharides, monosaccharides, and polyols (FODMAP) is recommended for irritable bowel syndrome (IBS), if general lifestyle and dietary advice fails. However, although the impact of a low FODMAP diet on individual IBS symptoms has been examined in some randomised controlled trials (RCTs), there has been no recent systematic assessment, and individual trials have studied numerous alternative or control interventions, meaning the best comparator is unclear. We performed a network meta-analysis addressing these uncertainties. Design: We searched the medical literature through to 2 April 2021 to identify RCTs of a low FODMAP diet in IBS. Efficacy was judged using dichotomous assessment of improvement in global IBS symptoms or improvement in individual IBS symptoms, including abdominal pain, abdominal bloating or distension, and bowel habit. Data were pooled using a random effects model, with efficacy reported as pooled relative risks (RRs) with 95% CIs, and interventions ranked according to their P-score. Results: We identified 13 eligible RCTs (944 patients). Based on failure to achieve an improvement in global IBS symptoms, a low FODMAP diet ranked first vs habitual diet (RR of symptoms not improving=0.67; 95% CI 0.48 to 0.91, P-score=0.99), and was superior to all other interventions. Low FODMAP diet ranked first for abdominal pain severity, abdominal bloating or distension severity and bowel habit, although for the latter it was not superior to any other intervention. A low FODMAP diet was superior to British Dietetic Association (BDA)/National Institute for Health and Care Excellence (NICE) dietary advice for abdominal bloating or distension (RR=0.72; 95% CI 0.55 to 0.94). BDA/NICE dietary advice was not superior to any other intervention in any analysis. Conclusion: In a network analysis, low FODMAP diet ranked first for all endpoints studied. However, most trials were based in secondary or tertiary care and did not study effects of FODMAP reintroduction and personalisation on symptoms.
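As a reminder of how the dichotomous endpoint above is summarized, here is a small sketch that computes a relative risk of "symptoms not improving" with a Wald 95% CI from a hypothetical 2x2 table; the counts are illustrative and are not taken from the trials in the review.

```python
# Relative risk (RR) of "global IBS symptoms not improving" with a 95% CI,
# computed from a hypothetical 2x2 table; counts are illustrative, not trial data.
import math

def rr_with_ci(events_trt, n_trt, events_ctl, n_ctl, z=1.96):
    """RR and Wald 95% CI on the log scale for a binary outcome."""
    risk_trt = events_trt / n_trt
    risk_ctl = events_ctl / n_ctl
    rr = risk_trt / risk_ctl
    se_log_rr = math.sqrt(1/events_trt - 1/n_trt + 1/events_ctl - 1/n_ctl)
    lo = math.exp(math.log(rr) - z * se_log_rr)
    hi = math.exp(math.log(rr) + z * se_log_rr)
    return rr, lo, hi

# e.g. 28/60 not improving on a low FODMAP diet vs 44/60 on habitual diet (made-up counts)
rr, lo, hi = rr_with_ci(28, 60, 44, 60)
print(f"RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

An RR below 1 for "symptoms not improving" favors the intervention; the network meta-analysis pools such RRs across trials before ranking interventions by P-score.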
Article
In evolutionary terms, the transformations which humans have engendered in social, ecological and built environments are increasingly out of step with their biological makeup. We briefly review the evidence on the health-relevant practices and status of our Paleolithic ancestors and contrast these with current food, transportation, work and governance systems and their associated impacts on human health. As public health and planning practitioners engaged in the EcoHealth Ontario Collaborative, we argue for recognition of our hunter-gatherer nature to promote joint efforts in building sustainable and equitable community infrastructures, both built and green. Although such efforts are underway at multiple jurisdictional levels across Canada, the pace is frustratingly slow given the burden of endemic chronic disease and global environmental change which humans face. Reminding reluctant stakeholders of the hunter-gatherers in us all could prompt deeper reflection on the urgent work of redirecting community planning.
Article
Background: The temporal association between the mandated reduction of salt in processed food and the decline in deaths from stroke and ischemic heart disease, together with the association between hypertension and cardiovascular disease, led many public health organizations to recommend reducing dietary sodium to a maximum of 2,300 mg per day. It turns out that some nuance can be brought to this universally shared belief. Methods & Results: Indeed, consideration of health outcomes, instead of only blood pressure as a surrogate marker of cardiovascular disease and prognosis, gave contradictory results, with low sodium intake associated with an excess of deaths and cardiovascular events. Conclusions: Accordingly, sodium intake should be adapted to individual risk factors, and evidence is still clearly lacking to support indiscriminate recommendations in healthy people. By contrast, a restricted sodium diet is certainly useful in patients with chronic kidney disease exposed to salt retention; by reciprocity, a low sodium diet must be absolutely avoided in all patients presenting renal or extrarenal sodium wasting, in whom sodium depletion is a life-threatening condition.
Article
Although intermittent increases in inflammation are critical for survival during physical injury and infection, recent research has revealed that certain social, environmental and lifestyle factors can promote systemic chronic inflammation (SCI) that can, in turn, lead to several diseases that collectively represent the leading causes of disability and mortality worldwide, such as cardiovascular disease, cancer, diabetes mellitus, chronic kidney disease, non-alcoholic fatty liver disease and autoimmune and neurodegenerative disorders. In the present Perspective we describe the multi-level mechanisms underlying SCI and several risk factors that promote this health-damaging phenotype, including infections, physical inactivity, poor diet, environmental and industrial toxicants and psychological stress. Furthermore, we suggest potential strategies for advancing the early diagnosis, prevention and treatment of SCI. Systemic chronic inflammation increases with age and is linked to the development of several diseases, as presented in this Perspective.
Article
Salt intake as part of a Western diet currently exceeds both recommended limits and the small amount found in the natural diet enjoyed by our Paleolithic ancestors. Excess salt is associated with the development of hypertension and cardiovascular disease, but other adverse effects of excess salt intake are beginning to be recognized, including the development of autoimmune and inflammatory disease. Over the last decade there has been an increasing body of evidence demonstrating that salt affects multiple components of both the innate and adaptive immune systems. In this review we outline the recent laboratory, animal and human data, highlighting the effect of salt on immunity, with a particular focus on the relevance to inflammatory kidney disease.
Article
Maternal nutrition is an important factor for infant neurodevelopment. However, prior magnetic resonance imaging (MRI) studies on maternal nutrients and the infant brain have focused mostly on preterm infants or on a few specific nutrients and brain regions. We present a first study in term-born infants, comprehensively correlating 73 maternal nutrients with infant brain morphometry at the regional (61 regions) and voxel (over 300,000 voxels) levels. Both maternal nutrition intake diaries and infant MRI were collected at 1 month of life (0.9 ± 0.5 months) for 92 term-born infants (among them, 54 infants were exclusively breastfed and 19 were breastfed most of the time). Intake of nutrients was assessed via a standardized food frequency questionnaire. No nutrient was significantly correlated with any of the volumes of the 61 autosegmented brain regions. However, increased volumes within subregions of the frontal cortex and corpus callosum at the voxel level were positively correlated with maternal intake of omega-3 fatty acids, retinol (vitamin A) and vitamin B12, both with and without correction for postmenstrual age and sex (P < 0.05, q < 0.05 after false discovery rate correction). Omega-3 fatty acids remained significantly correlated with infant brain volumes after subsetting to the 54 infants who were exclusively breastfed, but retinol and vitamin B12 did not. This provides an impetus for future larger studies to better characterize the effect size of dietary variation and its correlation with neurodevelopmental outcomes, which can lead to improved nutritional guidance during pregnancy and lactation.
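The voxel-level findings above are reported as significant at q < 0.05 after false discovery rate correction. As a purely illustrative sketch of that kind of multiple-testing adjustment (not the authors' imaging pipeline), a Benjamini-Hochberg step-up procedure over a set of hypothetical correlation p-values might look like this:

```python
# Illustrative Benjamini-Hochberg FDR adjustment for a set of correlation p-values.
# The p-values below are invented; this is not the study's actual analysis code.
def benjamini_hochberg(p_values, q=0.05):
    """Return the indices of hypotheses rejected at FDR level q."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])    # indices sorted by ascending p-value
    largest_passing_rank = 0
    for rank, idx in enumerate(order, start=1):
        if p_values[idx] <= rank / m * q:                  # BH step-up criterion p_(k) <= k/m * q
            largest_passing_rank = rank
    return sorted(order[:largest_passing_rank])            # reject all hypotheses up to that rank

p_vals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.2, 0.5]
print(benjamini_hochberg(p_vals))   # indices of the tests deemed significant at q = 0.05
```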
Article
Free link to the manuscript: https://authors.elsevier.com/a/1XxZd,LqAZFslH Since the discovery of the first hominin fossils, East Africa has been in the spotlight of palaeo-anthropological investigation for its role as a potential cradle of humanity and as a gateway out of Africa. With the advent of the genomic era an ever increasing amount of information has started to complement this notion, and to place the area within a broader, Pan-African scenario. Here we examine the most recent genetic and fossil results that recapitulate the last hundreds of thousands of years of human evolution in the area, and point to a number of uncharted avenues that may complement the emerging scenario in the coming years.
Article
It is vital to provide appropriate nutrition to maintain healthy populations in conservation breeding programs. Knowledge of the wild diet of a species can be used to inform captive diet formulation. The nutritional content of the wild diet of the critically endangered mountain chicken frog (Leptodactylus fallax) is unknown, as is that of most amphibians. In this study, we analyzed the nutritional content of the food items that comprise 91% of the wild diet of L. fallax by dry weight, and of all food items offered to captive L. fallax at ZSL London Zoo and Jersey Zoo. We subsequently compared the nutritional content of the wild diet and the captive diet consumed by L. fallax at ZSL London Zoo. To the authors' knowledge, this is the first study to directly compare the nutritional content of the wild and captive diets of an anuran amphibian. The captive diet at ZSL London Zoo, without dusting of nutritional supplements, was higher in gross energy and crude fat and lower in ash, calcium and calcium:phosphorus ratio than the wild diet. Most food items in the captive diets had a high omega-6:omega-3 fatty acid ratio, whereas those in the wild diet had a low omega-6:omega-3 fatty acid ratio. We recommend a combination of modifications to the captive diets to better reflect the nutritional content of the wild diet. Nutritional analysis of captive and wild diets is recommended for other species in conservation breeding programs to improve captive husbandry and ultimately fitness.
Article
Humans are a colourful species of primate, with human skin, hair and eye coloration having been influenced by a great variety of evolutionary forces throughout prehistory. Functionally naked skin has been the physical interface between the physical environment and the human body for most of the history of the genus Homo, and hence skin coloration has been under intense natural selection. From an original condition of protective, dark, eumelanin-enriched coloration in early tropical-dwelling Homo and Homo sapiens, loss of melanin pigmentation occurred under natural selection as Homo sapiens dispersed into non-tropical latitudes of Africa and Eurasia. Genes responsible for skin, hair and eye coloration appear to have been affected significantly by population bottlenecks in the course of Homo sapiens dispersals. Because specific skin colour phenotypes can be created by different combinations of skin colour–associated genetic markers, loss of genetic variability due to genetic drift appears to have had negligible effects on the highly redundant genetic ‘palette’ for the skin colour. This does not appear to have been the case for hair and eye coloration, however, and these traits appear to have been more strongly influenced by genetic drift and, possibly, sexual selection. This article is part of the themed issue ‘Animal coloration: production, perception, function and application’.
Article
Background: Temporal trends in the US population's vitamin D status have been uncertain because of nonstandardized serum 25-hydroxyvitamin D [25(OH)D] measurements. Objective: To accurately assess vitamin D status trends among those aged ≥12 y, we used data from the cross-sectional NHANES surveys. Design: A liquid chromatography-tandem mass spectrometry (LC-MS/MS) method for measuring 25(OH)D (sum of 25-hydroxyvitamin D2 and 25-hydroxyvitamin D3), calibrated to standard reference materials, was used to predict LC-MS/MS-equivalent concentrations from radioimmunoassay data (1988-2006 surveys; n = 38,700) and to measure LC-MS/MS concentrations (2007-2010 surveys; n = 12,446). Weighted arithmetic means and the prevalence of 25(OH)D above or below cutoff concentrations were calculated to evaluate long-term trends. Results: Overall, mean predicted 25(OH)D showed no time trend from 1988 to 2006, but during 2007-2010 the mean measured 25(OH)D was 5-6 nmol/L higher. The groups showing the largest 25(OH)D increases (7-11 nmol/L) were older persons, females, non-Hispanic whites, and vitamin D supplement users. During 1988-2010, the proportions of persons with 25(OH)D <40 nmol/L were 14-18% (overall), 46-60% (non-Hispanic blacks), 21-28% (Mexican Americans), and 6-10% (non-Hispanic whites). Conclusions: An accurate method for measuring 25(OH)D showed stable mean concentrations in the US population (1988-2006) and recent modest increases (2007-2010). Although it is unclear to what extent supplement usage, as opposed to differences in laboratory methods, explains the increases in 25(OH)D, the use of higher vitamin D supplement dosages coincided with the increase. Marked race-ethnic differences in 25(OH)D concentrations were apparent. These data provide the first standardized information about temporal trends in the vitamin D status of the US population.
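To illustrate the kind of weighted summary described (survey-weighted means and the weighted proportion of the population below a 25(OH)D cutoff), here is a minimal sketch with fabricated measurements and sampling weights; real NHANES estimation additionally accounts for the complex survey design (strata and primary sampling units), which is omitted here.

```python
# Sketch of a survey-weighted mean 25(OH)D and the weighted prevalence below a
# 40 nmol/L cutoff. All values are fabricated for illustration only.
conc_nmol_l = [35.0, 52.5, 61.0, 28.4, 74.2, 44.9]   # hypothetical 25(OH)D measurements
weights     = [1.2,  0.8,  1.5,  2.0,  0.9,  1.1]    # hypothetical sampling weights

total_w = sum(weights)
weighted_mean = sum(w * x for w, x in zip(weights, conc_nmol_l)) / total_w
prevalence_below_40 = sum(w for w, x in zip(weights, conc_nmol_l) if x < 40) / total_w

print(f"weighted mean 25(OH)D = {weighted_mean:.1f} nmol/L")
print(f"weighted prevalence <40 nmol/L = {100 * prevalence_below_40:.1f}%")
```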
Article
Recently, a high serum 25-hydroxyvitamin D concentration (~110 nmol/L) was found in the Hadza, a tribe still keeping an ancient hunter-gatherer lifestyle. This level could serve as the optimal vitamin D level built up during millennia of human evolution. The personal vitamin D3-effective solar exposures of Hadza adults are estimated using radiative model simulations with input from satellite observations over Lake Eyasi (3.7°S, 35.0°E). The calculations are carried out assuming typical Hadza clothing habits and specific scenarios of outdoor activity comprising early morning and late afternoon working time in the sun and a prolonged midday siesta in the shade. The modeled doses received by the Hadza are converted to the vitamin D3-effective daily doses pertaining to lighter-skinned persons. We propose a novel formula for obtaining an adequate vitamin D level: exposure of 1/3 MED (minimal erythemal dose) around local noon to 1/3 of the whole body during the warm sub-period of the year in low- and mid-latitude regions. Such a daily solar exposure is equivalent to ~2000 IU of vitamin D3 taken orally. For many contemporary humans with limited outdoor activity, achieving such a daily norm requires vitamin D3 supplementation of 2000 IU throughout the whole year.
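As a back-of-the-envelope illustration of the stated equivalence (1/3 MED delivered to 1/3 of the body surface ≈ 2000 IU of oral vitamin D3), the sketch below scales that reference point linearly with both the fraction of an MED received and the fraction of skin exposed; the linear scaling and the example exposure values are illustrative assumptions, not results from the paper.

```python
# Back-of-the-envelope conversion built around the equivalence stated in the abstract:
# 1/3 MED delivered to 1/3 of the body surface ~ 2000 IU of oral vitamin D3.
# Linear scaling in both factors is an illustrative assumption, not a result of the paper.
REFERENCE_IU = 2000.0          # IU, stated equivalent of the reference exposure
REF_MED_FRACTION = 1.0 / 3.0   # fraction of a minimal erythemal dose (MED)
REF_BODY_FRACTION = 1.0 / 3.0  # fraction of body surface exposed

def estimated_iu(med_fraction: float, body_fraction: float) -> float:
    """Scale the stated reference equivalence linearly in both factors (assumption)."""
    return REFERENCE_IU * (med_fraction / REF_MED_FRACTION) * (body_fraction / REF_BODY_FRACTION)

# Example: half an MED on roughly 20% of the body surface (hypothetical exposure).
print(f"{estimated_iu(0.5, 0.20):.0f} IU-equivalent")   # ~1800 IU under these assumptions
```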
Article
Several health institutions recommend that sodium intake be reduced to below 2,300 mg, which means that 6–7 billion individuals should alter their diet to comply. Such a radical recommendation should be based on solid evidence. However, this review reveals that (i) there are no randomized controlled trials (RCTs) allocating individuals to below 2,300 mg and measuring health outcomes; (ii) RCTs allocating risk groups such as obese prehypertensive individuals and hypertensive individuals down to (but not below) 2,300 mg show no effect of sodium reduction on all-cause mortality; (iii) RCTs allocating individuals to below 2,300 mg show a minimal effect on blood pressure in the healthy population (less than 1 mm Hg) and significant increases in renin, aldosterone, noradrenaline, cholesterol, and triglycerides; and (iv) observational studies show that sodium intakes below 2,645 mg and above 4,945 mg are associated with increased mortality. Given that 90% of the world's population currently consumes sodium within the optimal range of 2,645–4,945 mg, there is no scientific basis for a public health recommendation to alter sodium intake.
Article
Regulatory components of the immune system are critical for preventing unintended activation of immune cells. Failure to prevent this unintended activation raises the risk of developing exaggerated inflammation and autoimmunity. In this issue of the JCI, Binger et al. and Hernandez et al. report that salt can play an important role in undermining regulatory mechanisms of the innate and adaptive immune systems. High salt levels interfere with alternative activation of macrophages (M2), which function in attenuating tissue inflammation and promoting wound healing. High salt also impairs Treg function by inducing IFNγ production in these cells. Together, these results provide evidence that environmental signals in the presence of high dietary salt enhance proinflammatory responses by interfering with both innate and adaptive regulatory mechanisms.
Article
Naturalistic feeding methods, such as the provision of whole carcasses to zoo animals, are potentially controversial because zoo visitors might not approve of them. However, since several species of zoo animals feed from large carcasses in the wild, this food type could benefit their welfare in captivity compared to other less-natural food types. Scavengers in particular almost exclusively live on carcasses in nature; therefore, their welfare in captivity could significantly depend on the opportunity to express behaviors related to carcass feeding. In this study, we assessed the frequency of carcass feeding for vultures in North American zoos and investigated the effect of different food types on the behavior of zoo-housed Andean condors (Vultur gryphus). We also evaluated the opinion of North American zoo visitors about carcass feeding. Our results show that small whole carcasses (rats, rabbits) are part of the diet of vultures in most North American zoos, but large whole carcasses (ungulates) are rarely fed. Our behavioral study indicated that Andean condors appear to be more motivated to feed on more natural food types, which also seem to physically engage the birds more and occupy them longer. Most zoo visitors approved of carcass feeding for captive vultures over a range of prey animals, and the majority would also like to observe the vultures eat. Collectively, our results demonstrate that carcass feeding, particularly with larger prey, potentially enriches both zoo-housed vultures as well as the visitor experience.
Article
Even though there are many factors which determine the composition of the human colon microbiota, diet is an important one because most microorganisms in the colon obtain energy for their growth by degrading complex dietary compounds, particularly dietary fibers. While the fiber carbohydrates that escape digestion in the upper gastrointestinal tract are recognized to have a range of structures, the vast number of chemical structures from the perspective of the bacteria is not well appreciated. In this article, the concept of a "discrete structure" is introduced, defined as a unique chemical structure, often within a fiber molecule, which aligns with encoded gene clusters in bacterial genomes. The multitude of discrete structures originates from the array of different fiber types coupled with structural variations within types due to genotype and growing environment, anatomical parts of the grain or plant, discrete regions within polymers, and the size of oligosaccharides and small polysaccharides. These thousands of discrete structures could conceivably be used to favor particular bacteria in the competitive colon environment. A global framework needs to be developed to better understand how dietary fibers can be used to obtain predicted changes in microbiota composition for improved health. This will require a multi-disciplinary effort that includes biological scientists, clinicians, and carbohydrate specialists.