Article

Dietary intake of people with dementia on acute hospital wards


Abstract

The aim of this service evaluation was to understand the factors affecting meal provision on a dementia ward and to assess whether meals-based dietary intake met estimated mean energy requirements and reference nutrient intakes (RNIs). A mixed-methods approach was adopted: (1) mapping the dietary intake of a cohort (n = 6) of inpatients with advanced dementia over four consecutive days; and (2) semi-structured interviews exploring the experiences of the professionals involved (n = 5). Two of the participants' four-day mean intakes did not meet their estimated energy requirement based on meal provision. All participants apart from one met the general-population target of 0.75 g of protein per kg body weight per day. Several of the nutrients consumed met or exceeded the RNI for adults of this age, although participants' mean dietary vitamin D intake was substantially below the RNI for the general population. Themes identified in the interviews included communication, time pressure and continuity of service provision. The meals provided were nutritionally sound, and individuals' nutritional status was improved by staff supporting them with eating. The contribution of drinks and snacks to nutrient intake warrants further exploration. Effective communication between food providers and ward areas is important, and mealtimes should reflect patient need, for example by serving the main meal in the evening and a lighter option at lunch; this is compatible with the benefits of a flexible breakfast time.
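
To make the protein benchmark in the abstract concrete, here is a minimal worked sketch (Python) of the 0.75 g/kg/day comparison; the body weight and intake figures below are hypothetical illustrations, not values from the study.

```python
# Hypothetical illustration of the 0.75 g protein/kg/day benchmark used in
# the evaluation; the weight and 4-day mean intake below are made up.
body_weight_kg = 70.0
protein_target_g_per_day = 0.75 * body_weight_kg   # 52.5 g/day
four_day_mean_intake_g = 48.0                      # e.g. from food-record mapping
meets_target = four_day_mean_intake_g >= protein_target_g_per_day
print(f"Target {protein_target_g_per_day:.1f} g/day; met: {meets_target}")
```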


... Estimated plate waste (n = 59, 38%), followed by food records (n = 43, 28%) and then 24 h recall (n = 23, 15%), were the most frequently reported assessment methods, with the remaining studies using a variety of approaches to quantification (Figure 2). Estimated plate waste was predominantly collected using paper-based forms (n = 40, 68%), with six studies (10%) using an electronic form and the remaining studies (n = 13, 22%) providing inadequate detail to enable classification of the recording approach. Similarly, food records were mainly completed using paper-based forms (n = 30, 70%), with two studies (5%) reporting use of an electronic form and eleven studies (25%) providing insufficient data to enable classification. ...
Article
Full-text available
Quantification of oral intake within the hospital setting is required to guide nutrition care. Multiple dietary assessment methods are available, yet details regarding their application in the acute care setting are scarce. This scoping review, conducted in accordance with JBI methodology, describes dietary assessment methods used to measure oral intake in acute and critical care hospital patients. The search was run across four databases to identify primary research conducted in adult acute or critical care from 1st January 2000 to 15th March 2023 which quantified oral diet with any dietary assessment method. In total, 155 articles were included, predominantly from the acute care setting (n=153, 99%). Studies were mainly single-center (n=138, 88%) and of observational design (n=135, 87%). Estimated plate waste (n=59, 38%) and food records (n=42, 27%) were the most frequent assessment methods, with energy and protein the main nutrients quantified (n=81, 52%). Validation was completed in 23 (15%) studies, with the majority of these using a reference method reliant on estimation (n=17, 74%). A quarter of studies (n=39) quantified completion (either as complete versus incomplete or degree of completeness) and four studies (2.5%) explored factors influencing completion. Findings indicate a lack of high-quality evidence to guide selection and application of existing dietary assessment methods to quantify oral intake, with a particular absence of evidence in the critical care setting. Further validation of existing tools and identification of factors influencing completion is needed to guide the optimal approach to quantification of oral intake in both research and clinical contexts.
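
As a concrete illustration of the most common method the review identifies, here is a minimal sketch (Python) of estimating intake from visually estimated plate waste: intake is taken as the served portion scaled by the fraction actually eaten. The served values and waste fraction are made-up illustration figures, not data from any study above.

```python
# Minimal sketch of intake estimation from visually estimated plate waste.
# Served portions and the waste fraction are hypothetical illustration values.

def intake_from_plate_waste(energy_served_kcal: float,
                            protein_served_g: float,
                            waste_fraction: float) -> tuple[float, float]:
    """Estimate consumed energy and protein given the fraction left on the plate."""
    eaten = 1.0 - waste_fraction
    return energy_served_kcal * eaten, protein_served_g * eaten

# Example: a 600 kcal / 25 g protein meal with half left uneaten.
print(intake_from_plate_waste(600, 25, 0.5))  # (300.0, 12.5)
```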
Article
Full-text available
In the last decade, the association between diet and cognitive function or dementia has been extensively investigated. In the present article, we systematically reviewed observational studies published in the last three years (2014-2016) on the relationship between dietary factors and late-life cognitive disorders at different levels of investigation (i.e., dietary patterns, foods and food groups, and dietary micro- and macronutrients), and the possible mechanisms underlying the proposed associations. Among the reviewed evidence, studies framed by the National Institute on Aging-Alzheimer's Association guidelines for Alzheimer's disease (AD) and cognitive decline due to AD pathology introduced some evidence suggesting a direct relation between diet and changes in brain structure and activity. There was also accumulating evidence that combinations of foods and nutrients in certain patterns may act synergistically to provide stronger health effects than those conferred by their individual dietary components. In particular, higher adherence to a Mediterranean-type diet was associated with decreased cognitive decline. Other emerging healthy dietary patterns, such as the Dietary Approaches to Stop Hypertension (DASH) and the Mediterranean-DASH Intervention for Neurodegenerative Delay (MIND) diets, were also associated with slower rates of cognitive decline and a significant reduction in the rate of AD. Furthermore, some foods or food groups traditionally considered harmful, such as eggs and red meat, have been partially rehabilitated. Cognitive function remains negatively correlated with intake of saturated fatty acids, while elevated fish consumption and high intakes of monounsaturated and polyunsaturated fatty acids (PUFA), particularly n-3 PUFA, appear protective against cognitive decline.
Article
Full-text available
The field of health and wellbeing scholarship has a strong tradition of qualitative research—and rightly so. Qualitative research offers rich and compelling insights into the real worlds, experiences, and perspectives of patients and health care professionals in ways that are completely different to, but also sometimes complementary to, the knowledge we can obtain through quantitative methods. There is a strong tradition of the use of grounded theory within the field—right from its very origins studying dying in hospital (Glaser & Strauss, 1965)—and this covers the epistemological spectrum from more positivist forms (Glaser, 1978, 1992) through to the constructivist approaches developed by Charmaz (2006) in, for instance, her compelling study of the loss of self in chronic illness (Charmaz, 1983). Similarly, narrative approaches (Riessman, 2007) have been used to provide rich and detailed accounts of the social formations shaping subjective experiences of health and well-being (e.g., Riessman, 2000). Phenomenological and hermeneutic approaches, including the more recently developed interpretative phenomenological analysis (Smith, Flowers, & Larkin, 2009), are similarly regularly used in health and wellbeing research, and they suit it well, oriented as they are to the experiential and interpretative realities of the participants themselves (e.g., Smith & Osborn, 2007).
Article
Full-text available
The role of selenium (Se) in human health and disease has been discussed in detail in several recent reviews, with the main conclusion being that selenium deficiency is recognised as a global problem which urgently needs resolution. Since the selenium content of plant-based food depends on its availability from soil, the level of this element in foods and feeds varies among regions. In general, eggs and meat are considered to be good sources of selenium in the human diet. When considering ways to improve human selenium intake, there are several potential options, including direct supplementation, soil fertilisation, supplementation of food staples such as flour, and production of functional foods. Analysing recent publications related to functional food production, it is evident that selenium-enriched eggs can be used as an important delivery system of this trace mineral for humans. In particular, the development and commercialisation of organic forms of selenium have initiated a new era in the availability of selenium-enriched products. It has been shown that egg selenium content can easily be manipulated to give increased levels, especially when organic selenium is included in hens' diets at levels that provide 0.3-0.5 mg/kg selenium in the feed. As a result, technology for the production of eggs delivering approximately 50% (30-35 µg) of the human selenium RDA has been developed and successfully tested. Companies all over the world currently market selenium-enriched eggs, including in the UK, Ireland, Mexico, Colombia, Malaysia, Thailand, Australia, Turkey, Russia and Ukraine. Prices for enriched eggs vary from country to country, typically being similar to those of free-range eggs. Selenium-enriched chicken, pork and beef can also be produced by using organic selenium in the diets of poultry and farm animals. The scientific, technological and other advantages and limitations of producing designer/modified eggs as functional foods are discussed in this review.
Article
Full-text available
To facilitate the Joint Food and Agriculture Organization/World Health Organization/United Nations University (FAO/WHO/UNU) Expert Consultation on Energy and Protein Requirements, which met in Rome in 1981, Schofield et al. reviewed the literature and produced predictive equations for both sexes for the following ages: 0-3, 3-10, 10-18, 18-30, 30-60 and >60 years. These formed the basis for the equations used in the 1985 FAO/WHO/UNU document, Energy and Protein Requirements. While Schofield's analysis has served a significant role in re-establishing the importance of using basal metabolic rate (BMR) to predict human energy requirements, recent workers have subsequently queried the universal validity and application of these equations. A survey of the most recent studies (1980-2000) of BMR suggests that in most cases the current FAO/WHO/UNU predictive equations overestimate BMR in many communities. The FAO/WHO/UNU equations were developed using a database that contained a disproportionate number (3388 out of 7173; 47%) of Italian subjects and relatively few subjects from the tropical region. The objective here is to review the historical development in the measurement and application of BMR and to critically review the Schofield et al. BMR database, presenting a series of new equations to predict BMR. This division, while arbitrary, will enable readers who wish to omit the historical review of BMR to concentrate on the evolution of the new BMR equations. BMR data were collected from published and measured values. A series of new equations (the Oxford equations) has been developed using a data set of 10,552 BMR values that (1) excluded all the Italian subjects and (2) included a much larger number (4018) of people from the tropics. In general, the Oxford equations tend to produce lower BMR values than the current FAO/WHO/UNU equations in 18-30 and 30-60 year old males and in all females over 18 years of age. This is an opportune moment to re-examine the role and place of BMR measurements in estimating total energy requirements today. The Oxford equations' future use and application will surely depend on their ability to predict more accurately the BMR of contemporary populations.
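
For illustration, here is a minimal sketch (Python) of the weight-only adult BMR prediction discussed above, using the kcal/day coefficients commonly attributed to the 1985 FAO/WHO/UNU report (derived from the Schofield analysis). These coefficients are quoted as widely published, not taken from the paper itself, so verify against the original tables before any practical use.

```python
# Weight-only adult BMR prediction with the 1985 FAO/WHO/UNU
# (Schofield-derived) coefficients in kcal/day, as commonly published.
# Verify against the original tables before clinical or research use.

FAO_WHO_UNU_1985 = {
    # (sex, age band): (kcal per kg body weight, intercept in kcal/day)
    ("M", "18-30"): (15.3, 679),
    ("M", "30-60"): (11.6, 879),
    ("M", "60+"):   (13.5, 487),
    ("F", "18-30"): (14.7, 496),
    ("F", "30-60"): (8.7,  829),
    ("F", "60+"):   (10.5, 596),
}

def predicted_bmr_kcal(sex: str, age_band: str, weight_kg: float) -> float:
    """Predicted basal metabolic rate in kcal/day from sex, age band and weight."""
    per_kg, intercept = FAO_WHO_UNU_1985[(sex, age_band)]
    return per_kg * weight_kg + intercept

# Example: a 65 kg woman over 60 years of age -> roughly 1279 kcal/day.
print(predicted_bmr_kcal("F", "60+", 65.0))
```

Per the abstract, the Oxford equations would be expected to return somewhat lower values than these for men aged 18-60 and for all women over 18.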
Article
Persons living with dementia have many health concerns, including poor nutritional status. This narrative review provides an overview of the literature on nutritional status in persons diagnosed with a dementing illness or condition. Poor food intake is a primary mechanism of malnutrition, and it occurs for many reasons, especially in the middle and later stages of the dementing illness. Research suggests a variety of interventions to improve food intake, and thus nutritional status and quality of life, in persons with dementia. Education programs have been the focus for family care partners, while in residential care a range of intervention activities has been examined, from tableware changes to retraining of self-feeding. It is likely that complex interventions are required to address poor food intake more fully, and future research needs to focus on diverse components. Specifically, modifying the psychosocial aspects of mealtimes is proposed as a means of improving food intake and quality of life and is, to date, a neglected area of intervention development and research.
Chapter
As the population of elderly people grows rapidly, the number of individuals with dementia and cognitive impairment is also increasing. One of the preventive measures against cognitive decline is diet, and different dietary factors have already been investigated. This chapter provides an overview of studies on dietary protein in relation to dementia and cognitive decline. Studies on individual amino acids, the structural units that make up proteins, which may be related to cognitive functioning, are also discussed. Overall, the role of dietary protein intake in both dementia and cognitive decline has hardly been studied. Data are also limited on the role of individual amino acids, especially in elderly populations. More research is needed to reach definitive conclusions and specific recommendations regarding intake of protein or of specific amino acids for maintaining optimal cognitive functioning.
Article
Multiple biological functions of selenium manifest themselves mainly via the 25 selenoproteins that have selenocysteine at their active centre. Selenium is vital for the brain and seems to participate in the pathology of disorders such as Alzheimer's disease, Parkinson's disease, amyotrophic lateral sclerosis and epilepsy. Since selenium has been shown to be involved in diverse functions of the central nervous system, such as motor performance, coordination, memory and cognition, a possible role of selenium and selenoproteins in brain signalling pathways may be assumed. The aim of the present review is to analyse possible relations between selenium and neurotransmission. Selenoproteins seem to be of special importance in the development and functioning of GABAergic (GABA, γ-aminobutyric acid) parvalbumin-positive interneurons of the cerebral cortex and hippocampus. The dopamine pathway might also be selenium dependent, as selenium shows neuroprotection in the nigrostriatal pathway yet exerts toxicity towards dopaminergic neurons at higher concentrations. Recent findings also point to involvement of acetylcholine neurotransmission. The role of selenium and selenoproteins in neurotransmission might not be limited to their antioxidant properties but may also extend to inflammation, influence on protein phosphorylation and ion channels, alteration of calcium homeostasis and brain cholesterol metabolism. Moreover, a direct signalling function has been proposed for selenoprotein P through interaction with the post-synaptic apolipoprotein E receptor 2 (ApoER2).
Article
Presented here is an overview of the pathway from early nutrient deficiency to long-term brain function, cognition, and productivity, focusing on research from low- and middle-income countries. Animal models have demonstrated the importance of adequate nutrition for the neurodevelopmental processes that occur rapidly during pregnancy and infancy, such as neuron proliferation and myelination. However, several factors influence whether nutrient deficiencies during this period cause permanent cognitive deficits in human populations, including the child's interaction with the environment, the timing and degree of nutrient deficiency, and the possibility of recovery. These factors should be taken into account in the design and interpretation of future research. Certain types of nutritional deficiency clearly impair brain development, including severe acute malnutrition, chronic undernutrition, iron deficiency, and iodine deficiency. While strategies such as salt iodization and micronutrient powders have been shown to improve these conditions, direct evidence of their impact on brain development is scarce. Other strategies also require further research, including supplementation with iron and other micronutrients, essential fatty acids, and fortified food supplements during pregnancy and infancy.
Article
New evidence shows that older adults need more dietary protein than do younger adults to support good health, promote recovery from illness, and maintain functionality. Older people need to make up for age-related changes in protein metabolism, such as high splanchnic extraction and declining anabolic responses to ingested protein. They also need more protein to offset inflammatory and catabolic conditions associated with chronic and acute diseases that occur commonly with aging. With the goal of developing updated, evidence-based recommendations for optimal protein intake by older people, the European Union Geriatric Medicine Society (EUGMS), in cooperation with other scientific organizations, appointed an international study group to review dietary protein needs with aging (PROT-AGE Study Group). To help older people (>65 years) maintain and regain lean body mass and function, the PROT-AGE study group recommends average daily intake at least in the range of 1.0 to 1.2 g protein per kilogram of body weight per day. Both endurance- and resistance-type exercises are recommended at individualized levels that are safe and tolerated, and higher protein intake (ie, ≥1.2 g/kg body weight/d) is advised for those who are exercising and otherwise active. Most older adults who have acute or chronic diseases need even more dietary protein (ie, 1.2-1.5 g/kg body weight/d). Older people with severe kidney disease (ie, estimated GFR <30 mL/min/1.73 m²), but who are not on dialysis, are an exception to this rule; these individuals may need to limit protein intake. Protein quality, timing of ingestion, and intake of other nutritional supplements may be relevant, but evidence is not yet sufficient to support specific recommendations. Older people are vulnerable to losses in physical function capacity, and such losses predict loss of independence, falls, and even mortality. Thus, future studies aimed at pinpointing optimal protein intake in specific populations of older people need to include measures of physical function.
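
A minimal sketch (Python) translating the PROT-AGE ranges quoted above into grams per day. The category keys are my own shorthand, not PROT-AGE terminology; the open-ended "at least 1.2 g/kg/d" band for active individuals is represented with no upper bound, and severe kidney disease (eGFR <30, not on dialysis) is deliberately omitted because intake may need individual restriction.

```python
# PROT-AGE daily protein intake ranges in g per kg body weight per day,
# as quoted in the abstract. Category names are shorthand, not the group's
# terminology. Severe kidney disease (not on dialysis) is excluded: intake
# may need to be limited on an individual basis.

PROT_AGE_G_PER_KG = {
    "healthy_older_adult": (1.0, 1.2),
    "exercising_active": (1.2, None),        # "at least 1.2 g/kg/d", no stated cap
    "acute_or_chronic_disease": (1.2, 1.5),
}

def daily_protein_range(weight_kg: float, category: str):
    """Return (low, high) grams of protein per day; high may be None (open-ended)."""
    low, high = PROT_AGE_G_PER_KG[category]
    return (low * weight_kg, high * weight_kg if high is not None else None)

# Example: a 70 kg older adult recovering from acute illness.
print(daily_protein_range(70.0, "acute_or_chronic_disease"))  # (84.0, 105.0)
```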
Article
Background: This review examines the associations between low vitamin B12 levels, neurodegenerative disease, and cognitive impairment. The potential impact of comorbidities and medications associated with vitamin B12 derangements was also investigated. In addition, we reviewed the evidence as to whether vitamin B12 therapy is efficacious for cognitive impairment and dementia. Methods: A systematic literature search identified 43 studies investigating the association of vitamin B12 with cognitive impairment or dementia. Seventeen studies reported on the efficacy of vitamin B12 therapy for these conditions. Results: Vitamin B12 levels in the subclinical low-normal range (<250 pmol/L) are associated with Alzheimer's disease, vascular dementia, and Parkinson's disease. Vegetarianism and metformin use contribute to depressed vitamin B12 levels and may independently increase the risk for cognitive impairment. Vitamin B12 deficiency (<150 pmol/L) is associated with cognitive impairment. Vitamin B12 supplements administered orally or parenterally at high dose (1 mg daily) were effective in correcting biochemical deficiency, but improved cognition only in patients with pre-existing vitamin B12 deficiency (serum vitamin B12 levels <150 pmol/L or serum homocysteine levels >19.9 μmol/L). Conclusion: Low serum vitamin B12 levels are associated with neurodegenerative disease and cognitive impairment. There is a small subset of dementias that are reversible with vitamin B12 therapy, and this treatment is inexpensive and safe. Vitamin B12 therapy does not improve cognition in patients without pre-existing deficiency. There is a need for large, well-resourced clinical trials to close the gaps in our current understanding of the nature of the associations between vitamin B12 insufficiency and neurodegenerative disease.
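
The serum thresholds in this abstract lend themselves to a simple banding; a minimal sketch (Python) follows. The numeric cut-offs come from the abstract, while the band labels are my own shorthand for the categories described.

```python
# Serum vitamin B12 bands per the thresholds quoted in the review abstract
# (pmol/L). Band names are shorthand, not terminology from the paper.

def b12_band(serum_b12_pmol_l: float) -> str:
    if serum_b12_pmol_l < 150:
        return "deficient"               # associated with cognitive impairment
    if serum_b12_pmol_l < 250:
        return "subclinical low-normal"  # associated with AD, vascular dementia, PD
    return "replete"

print(b12_band(140))  # deficient
print(b12_band(200))  # subclinical low-normal
```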
Article
To investigate the relationships between previous diet, biomarkers of selected B vitamins, nutritional status and length of stay (LOS). Cross-sectional study. Geriatric rehabilitation patients, Sydney, Australia. Fifty-two consenting patients with normal serum creatinine levels and no dementia. Serum vitamin B12, plasma vitamin B6, serum and erythrocyte folate, homocysteine and methylmalonic acid (MMA) concentrations; dietary intake using a validated semi-quantitative food frequency questionnaire; and nutritional assessment using the Mini Nutritional Assessment (MNA). Length of stay data were collected from medical records after discharge. Mean age was 80 ± 8 years (mean ± SD), BMI 26.4 ± 6.8 kg/m² and MNA score 22 ± 3, indicating some risk of malnutrition. Deficiencies of vitamins B6, B12 and folate were found in 30, 22 and 5 subjects respectively. Length of stay was positively correlated with age and MMA (Spearman's correlation 0.4, p<0.01 and 0.28, p<0.05 respectively) and negatively correlated with albumin, vitamin B6 and MNA score (Spearman's correlation -0.35, -0.33 and -0.29, p<0.05). After adjustment for age and sex, ln vitamin B6 and ln MMA concentrations were significant predictors of ln LOS (p=0.006 and p=0.014 respectively). The study indicates a high risk of vitamin B deficiencies in the elderly and suggests that deficiencies of vitamins B6 and B12 are associated with length of stay. This is concerning, as B vitamin status is rarely fully assessed.
Article
For individuals, a statistical approach is available to compare observed intakes to the Estimated Average Requirement (EAR) or Adequate Intake (AI), to assess adequacy, and to the Tolerable Upper Intake Level (UL), to assess risk of excess. A more qualitative assessment of intakes involves comparison directly to the Recommended Dietary Allowance (RDA) to evaluate adequacy, but this is accurate only if long-term usual intake is known. For groups of people, the prevalence of inadequacy can usually be estimated as the proportion with intakes below the EAR, while the prevalence of potentially excessive intakes is estimated as the proportion above the UL. The accuracy of all assessments depends on unbiased and accurate intake estimates as well as consideration of the effects of day-to-day variation in intake. Nutrition practitioners will find the new Dietary Reference Intakes (DRIs) useful for assessing diets in a variety of settings, and computerized assessment systems will be important tools when incorporating these theoretical concepts into dietetic practice.
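
The group-level logic described above reduces to a simple cut-point calculation: prevalence of inadequacy as the proportion of usual intakes below the EAR, and potential excess as the proportion above the UL. A minimal sketch (Python) follows; the intakes are hypothetical, and the EAR/UL figures are the commonly cited adult male vitamin C values, used purely for illustration.

```python
# EAR cut-point sketch for group-level assessment: prevalence of inadequacy
# = proportion of usual intakes below the EAR; potential excess = proportion
# above the UL. All numbers below are hypothetical illustrations.

def prevalence_below_ear(usual_intakes: list[float], ear: float) -> float:
    """Estimated prevalence of inadequacy for the group."""
    return sum(x < ear for x in usual_intakes) / len(usual_intakes)

def prevalence_above_ul(usual_intakes: list[float], ul: float) -> float:
    """Estimated prevalence of potentially excessive intakes."""
    return sum(x > ul for x in usual_intakes) / len(usual_intakes)

# Hypothetical usual vitamin C intakes (mg/day) for a small group;
# EAR = 75 and UL = 2000 mg/day (commonly cited adult male values).
intakes = [60, 80, 95, 70, 120, 55]
print(prevalence_below_ear(intakes, 75))   # 0.5
print(prevalence_above_ul(intakes, 2000))  # 0.0
```

Note that reliable estimates require usual (long-term) intakes; the abstract's caveat about day-to-day variation applies directly to the input data here.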
Improving nutrition and hydration in hospital: the nurse’s responsibility
National Diet and Nutrition Survey: Results from Years 5 and 6 (combined) of the Rolling Programme. Public Health England.