ABSTRACT: Binge eating is a key symptom of many eating disorders (e.g. binge eating disorder, bulimia nervosa, anorexia nervosa binge/purge type), yet the neurobiological underpinnings of binge eating are poorly understood. The mesocorticolimbic reward circuit, including the nucleus accumbens and the medial prefrontal cortex, is likely involved because this circuit mediates the hedonic value and incentive salience of palatable foods (PF). Here we tested the hypothesis that higher propensity for binge eating is associated with a heightened response (i.e., Fos induction) of the nucleus accumbens and medial prefrontal cortex to PF, using an animal model that identifies binge eating prone (BEP) and binge eating resistant (BER) rats. Forty adult female Sprague-Dawley rats were given intermittent access to PF (high fat pellets) 3×/week for 3 weeks. Based on a pattern of either consistently high or consistently low PF consumption across these feeding tests, 8 rats met criteria for categorization as BEP, and 11 rats met criteria for categorization as BER. One week after the final feeding test, BEP and BER rats were either exposed to PF in their home cages or were given no PF in their home cages for one hour prior to perfusion, leading to three experimental groups for the Fos analysis: BEPs given PF, BERs given PF, and a No PF control group. The total number of Fos-immunoreactive (Fos-ir) cells in the nucleus accumbens core and shell, and the cingulate, prelimbic, and infralimbic regions of the medial prefrontal cortex was estimated by stereological analysis. PF induced higher Fos expression in the nucleus accumbens shell and core and in the prelimbic and infralimbic cortex of BEP rats compared to No PF controls. Throughout the nucleus accumbens and medial prefrontal cortex, PF induced higher Fos expression in BEP than in BER rats, even after adjusting for differences in PF intake.
Differences in the neural activation pattern between BEP and BER rats were more robust in prefrontal cortex than in nucleus accumbens. These data confirm that PF activates brain regions responsible for encoding the incentive salience and hedonic properties of PF, and suggest that binge eating proneness is associated with enhanced responses to PF in brain regions that exert executive control over food reward.
ABSTRACT: Previous studies have shown significant within-person changes in binge eating and emotional eating across the menstrual cycle, with substantial increases in both phenotypes during post-ovulation. Increases in both estradiol and progesterone levels appear to account for these changes in phenotypic risk, possibly via increases in genetic effects. However, to date, no study has examined changes in genetic risk for binge phenotypes (or any other phenotype) across the menstrual cycle. The goal of the present study was to examine within-person changes in genetic risk for emotional eating scores across the menstrual cycle.
Participants were 230 female twin pairs (460 twins) from the Michigan State University Twin Registry who completed daily measures of emotional eating for 45 consecutive days. Menstrual cycle phase was coded based on dates of menstrual bleeding and daily ovarian hormone levels.
Findings revealed important shifts in genetic and environmental influences, where estimates of genetic influences were two times higher in post- as compared with pre-ovulation. Surprisingly, pre-ovulation was marked by a predominance of environmental influences, including shared environmental effects, which have not previously been detected for binge eating phenotypes in adulthood.
Our study was the first to examine within-person shifts in genetic and environmental influences on a behavioral phenotype across the menstrual cycle. Results highlight a potentially critical role for these shifts in risk for emotional eating across the menstrual cycle and underscore the need for additional, large-scale studies to identify the genetic and environmental factors contributing to menstrual cycle effects.
Psychological Medicine 07/2015; -1(15):1-11. DOI:10.1017/S0033291715001221 · 5.94 Impact Factor
ABSTRACT: Prenatal testosterone exposure may be protective against disordered eating. However, prior studies have produced mixed results. Developmental differences in prenatal testosterone's protective effects on disordered eating may explain these discrepancies. Indeed, studies have differed in the age of participants assessed, with data supporting prenatal testosterone effects on disordered eating in early adolescent and young adult samples but not in late adolescence. The present studies are the first to investigate age differences in prenatal testosterone's protective effects on disordered eating. Two indirect markers of higher prenatal testosterone were examined: (a) lower finger-length ratios (Study 1: index [2D]/ring [4D] finger [2D:4D]) and (b) lower disordered eating in females from opposite-sex twin pairs (who are thought to be exposed to higher prenatal testosterone from their male co-twin) relative to female controls (Study 2). Participants were twins from the Michigan State University Twin Registry (Study 1: n = 409; Study 2: n = 1,538) in early adolescence, late adolescence, or young adulthood. Disordered eating was assessed with well-validated questionnaires. Finger-length ratios were measured from hand scans, using electronic computer calipers. Findings were consistent across both studies. Higher prenatal testosterone (lower 2D:4D; females from opposite-sex twin pairs vs. controls) predicted lower disordered eating in early adolescence and young adulthood only. Prenatal testosterone-disordered eating associations were not observed during late adolescence. Results point to the possibility of developmental windows of expression for prenatal testosterone's protective effects on disordered eating and suggest that prior discrepant results may reflect age differences across samples. (PsycINFO Database Record (c) 2015 APA, all rights reserved).
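The 2D:4D marker used in Study 1 is a simple length ratio. A minimal sketch of the computation follows; the function name and the caliper measurements are hypothetical illustrations, not values from the study:

```python
def digit_ratio(index_len_mm, ring_len_mm):
    """2D:4D digit ratio: index-finger (2D) length divided by
    ring-finger (4D) length. Lower ratios are commonly used as an
    indirect marker of higher prenatal testosterone exposure."""
    if ring_len_mm <= 0:
        raise ValueError("ring-finger length must be positive")
    return index_len_mm / ring_len_mm

# Hypothetical caliper measurements in millimetres.
ratio = digit_ratio(70.2, 74.0)  # index shorter than ring, so ratio < 1.0
```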
ABSTRACT: Objective:
Mean-levels of thin-ideal internalization increase during adolescence and pubertal development, but it is unknown whether these phenotypic changes correspond to developmental changes in etiological (i.e., genetic and environmental) risk. Given the limited knowledge on risk for thin-ideal internalization, research is needed to guide the identification of specific types of risk factors during critical developmental periods. The present twin study examined genetic and environmental influences on thin-ideal internalization across adolescent and pubertal development.
Participants were 1,064 female twins (ages 8-25 years) from the Michigan State University Twin Registry. Thin-ideal internalization and pubertal development were assessed using self-report questionnaires. Twin moderation models were used to examine if age and/or pubertal development moderate genetic and environmental influences on thin-ideal internalization.
Phenotypic analyses indicated significant increases in thin-ideal internalization across age and pubertal development. Twin models suggested no significant differences in etiologic effects across development. Nonshared environmental influences were most important in the etiology of thin-ideal internalization, with genetic, shared environmental, and nonshared environmental influences accounting for approximately 8%, 15%, and 72%, respectively, of the total variance.
Despite mean-level increases in thin-ideal internalization across development, the relative influence of genetic versus environmental risk did not differ significantly across age or pubertal groups. The majority of variance in thin-ideal internalization was accounted for by environmental factors, suggesting that mean-level increases in thin-ideal internalization may reflect increases in the magnitude/strength of environmental risk across this period. Replication is needed, particularly with longitudinal designs that assess thin-ideal internalization across key developmental phases.
International Journal of Eating Disorders 11/2014; 47(7). DOI:10.1002/eat.22321 · 3.13 Impact Factor
ABSTRACT: Changes in ovarian hormones predict changes in emotional eating across the menstrual cycle. However, prior studies have not examined whether the nature of associations varies across dysregulated eating severity. The current study determined whether the strength and/or nature of hormone/dysregulated eating associations differ based on the presence of clinically diagnosed binge episodes (BEs). Participants included 28 women with BEs and 417 women without BEs who provided salivary hormone samples, ratings of emotional eating, and BE frequency for 45 days. Results revealed stronger associations between dysregulated eating and ovarian hormones in women with BEs as compared to women without BEs. The nature of associations also differed, as progesterone moderated the effects of lower estradiol levels on dysregulated eating in women with BEs only. Although hormone/dysregulated eating associations are present across the spectrum of pathology, the nature of associations may vary in ways that have implications for etiological models and treatment.
ABSTRACT: A classical twin study was used to estimate the magnitude of genetic and environmental influences on four measurements of within-person variability: dominance flux, warmth flux, spin and pulse. Flux refers to the variability of an individual's interpersonal dominance and warmth. Spin measures changes in the tone of interpersonal styles and pulse measures changes in the intensity of interpersonal styles. Daily reports of interpersonal styles were collected from 494 same-sex female twins (142 monozygotic pairs and 105 dizygotic pairs) over 45 days. For dominance flux, warmth flux, and spin, genetic effects accounted for a larger proportion of variance (37%, 24%, and 30%, respectively) than shared environmental effects (14%, 13%, and 0%, respectively), with the remaining variance due to the non-shared environment (62%, 50%, and 70%, respectively). Pulse appeared to be primarily influenced by the non-shared environment, although conclusions about the contribution of familial influences were difficult to draw from this study.
Social Psychological and Personality Science 04/2014; 6(3):300-308. DOI:10.1177/1948550614552729 · 2.56 Impact Factor
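The variance components reported in the twin study above come from the classical twin design, which compares monozygotic (MZ) and dizygotic (DZ) twin correlations. As a hedged sketch, Falconer's formulas recover additive genetic (A), shared environmental (C), and nonshared environmental (E) proportions from the two correlations; the input correlations below are made-up illustrations, not values from the study:

```python
def ace_estimates(r_mz, r_dz):
    """Falconer's point estimates for a classical ACE twin model,
    given monozygotic (r_mz) and dizygotic (r_dz) twin correlations."""
    a2 = 2 * (r_mz - r_dz)  # additive genetic variance (heritability)
    c2 = 2 * r_dz - r_mz    # shared environmental variance
    e2 = 1 - r_mz           # nonshared environment (incl. measurement error)
    return a2, c2, e2

# Hypothetical twin correlations, for illustration only.
a2, c2, e2 = ace_estimates(r_mz=0.51, r_dz=0.33)
```

Published estimates are typically obtained by fitting structural equation models rather than these closed-form expressions, but the formulas convey why a larger MZ-DZ correlation gap indexes stronger genetic influence.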
ABSTRACT: Binge eating is a significantly heritable phenotype, but efforts to detect specific risk genes have fallen short. Identification of animal strain differences in risk for binge eating could highlight genetic differences across individuals of the same species that can be exploited in future animal and molecular genetic research. The current study aimed to explore strain differences in risk for binge eating in Sprague-Dawley versus Wistar female rats using the Binge Eating Resistant/Binge Eating Prone model. A sample of male Sprague-Dawley rats, a known low-risk group for binge eating, was included as a comparison group. A total of 83 rats (23 Wistar female, 30 Sprague-Dawley female, 30 Sprague-Dawley male) completed a protocol of intermittently administered, palatable food. Binge eating prone (BEP) and binge eating resistant (BER) rats were identified using a tertile approach. Sprague-Dawley female rats consumed the highest amount of palatable food and were more likely to be classified as BEP compared to Wistar female and Sprague-Dawley male rats. Wistar female rats were not significantly different from Sprague-Dawley male rats in their palatable food intake and tendency to be classified as BER rather than BEP. Sprague-Dawley female rats appear to be a particularly vulnerable genotype for binge eating. Comparisons between this group and others could help identify specific genetic/biological factors that differentiate it from lower risk groups. The reward system, linked to binge eating in humans, is a possible candidate to explore. Strain differences in the reward system could help increase understanding of individual differences in risk for binge eating in humans.
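The tertile approach mentioned above can be sketched as follows: a rat is labeled BEP if its palatable-food intake falls in the top tertile on every feeding test, BER if it falls in the bottom tertile on every test, and neither otherwise. The function name, the all-tests strictness criterion, and the toy intake matrix are assumptions for illustration, not the published protocol:

```python
import numpy as np

def classify_tertile(intakes):
    """Label each rat 'BEP', 'BER', or None from a rats x tests matrix
    of palatable-food intakes, using per-test tertile cutoffs."""
    intakes = np.asarray(intakes, dtype=float)
    low_cut = np.percentile(intakes, 100 / 3, axis=0)   # lower tertile cutoff, per test
    high_cut = np.percentile(intakes, 200 / 3, axis=0)  # upper tertile cutoff, per test
    labels = []
    for row in intakes:
        if np.all(row > high_cut):      # top tertile on every test
            labels.append("BEP")
        elif np.all(row < low_cut):     # bottom tertile on every test
            labels.append("BER")
        else:
            labels.append(None)
    return labels

# Toy data: 3 rats x 3 feeding tests (grams of palatable food consumed).
toy = [[10.0, 12.0, 11.0],   # consistently highest intake
       [5.0, 6.0, 5.0],      # intermediate intake
       [1.0, 2.0, 1.0]]      # consistently lowest intake
```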
ABSTRACT: Testosterone may be a biological factor that protects males against eating disorders. Elevated prenatal testosterone exposure is linked to lower levels of disordered eating symptoms, but effects emerge only after mid-puberty. Whether circulating levels of testosterone account for decreased risk for disordered eating in boys after mid-puberty is currently unknown; however, animal data support this possibility. In rodents, prenatal testosterone's masculinizing effects on sex-differentiated behaviors emerge during puberty when circulating levels of testosterone increase and 'activate' the expression of masculinized phenotypes. This study investigated whether higher levels of circulating testosterone predict lower levels of disordered eating symptoms in adolescent boys, and in particular whether effects are associated with advancing pubertal maturation.
Participants were 213 male twins from the Michigan State University Twin Registry. The Minnesota Eating Behavior Survey and Eating Disorder Examination Questionnaire assessed several disordered eating symptoms. The Pubertal Development Scale assessed pubertal status. Afternoon saliva samples were assayed for testosterone using enzyme immunoassays.
Consistent with animal data, higher levels of circulating testosterone predicted lower levels of disordered eating symptoms in adolescent boys and effects emerged with advancing puberty. Results were not accounted for by several important covariates, including age, adiposity, or mood/anxiety symptoms.
Findings suggest that elevated circulating testosterone may be protective and underlie decreased risk for eating pathology in males during/after puberty, whereas lower levels of testosterone may increase risk and explain why some, albeit relatively few, males develop eating disorders.
Psychological Medicine 01/2014; 44(11):1-16. DOI:10.1017/S0033291713003073 · 5.94 Impact Factor
ABSTRACT: Previous research indicates that borderline personality disorder (BPD) is well conceptualized as a dimensional construct that can be represented using normal personality traits. A previous study successfully developed and validated a BPD measure embedded within a normal trait measure, the Minnesota Borderline Personality Disorder Scale (MBPD). The current study performed a further validation of the MBPD by examining its convergent validity, external correlates, and heritability in a sample of 429 female twins. The MBPD correlated strongly with the Structured Clinical Interview for DSM-IV Axis II Personality Disorders (SCID-II) screener for BPD and moderately with external correlates. Moreover, the MBPD and SCID-II screener exhibited very similar patterns of external correlations. Additionally, results indicated that the genetic and environmental influences on MBPD overlap with the genetic and environmental influences on the SCID-II screener, which suggests that these scales are measuring the same construct. These data provide further evidence for the construct validity of the MBPD.
ABSTRACT: Objective: Several efforts are underway to model binge eating in animals in order to advance neurobiological models of risk. However, knowledge of sex differences in these models is currently lacking. The goal of the present study was to examine sex differences in binge eating phenotypes using a well-established rodent model (i.e., the binge eating resistant/binge eating prone model).
Method: Thirty male and 30 female adult Sprague-Dawley rats were exposed to feeding tests consisting of intermittent access to palatable food (PF). Rats were then categorized as binge eating prone (BEP) or binge eating resistant (BER) based on the amount and consistency of PF consumption across tests.
Results: Across multiple methods for BEP classification, rates of BEP phenotypes were two to six times higher in female than male rats.
Discussion: Findings provide support for sex differences in rodent models of binge eating and highlight the promise of the BER/BEP model for understanding neurobiological mechanisms underlying sex differences in risk.
International Journal of Eating Disorders 11/2013; 46(7). DOI:10.1002/eat.22139 · 3.13 Impact Factor
ABSTRACT: Hormones exert powerful influences on mammalian nervous system development, particularly during developmental transitions associated with a change in reproductive state, including the perinatal period of sexual differentiation, puberty, pregnancy and lactation, and reproductive aging. This chapter focuses on the influences of gonadal steroid hormones, their sites and mechanisms of action, and behavioral outcomes during these reproductive transitions in mammals. The major emphasis of this chapter is on organizational influences of gonadal steroid hormones – that is, how early life exposures to hormones program neural and behavioral phenotypes expressed later in life. Organizational hormones, both endogenous and exogenous (such as endocrine disrupting chemicals), alter developmental trajectory, often irreversibly, and they program sensitivity and responsiveness of the nervous system to hormones during subsequent developmental transitions. Because hormonal influences during later periods of development depend to a large extent on hormonal events that occurred during earlier periods of development, the organizational effects of hormones are compounded over the lifespan. Thus, the overarching premise of this chapter is that hormonal life history underlies much of the complexity that characterizes individual differences in neural, behavioral, and other physiological responses, not only to endogenous hormones, but also to endocrine disruptors and hormonal therapies.
Neuroscience in the 21st Century, 11/2013: pages 1715-1752; ISBN: 978-1-4614-1996-9
ABSTRACT: This article is part of a Special Issue "Puberty and Adolescence". Sexual differentiation is the process by which the nervous system becomes structurally and functionally dissimilar in females and males. In mammals, this process has been thought to occur during prenatal and early postnatal development, when a transient increase in testosterone secretion masculinizes and defeminizes the developing male nervous system. Decades of research have led to the views that structural sexual dimorphisms created during perinatal development are passively maintained throughout life, and that ovarian hormones do not play an active role in feminization of the nervous system. Furthermore, perinatal testosterone was thought to determine sex differences in neuron number by regulating cell death and cell survival, and not by regulating cell proliferation. As investigations of neural development during adolescence became more prominent in the late 20th century and revealed the extent of brain remodeling during this time, each of these tenets has been challenged and modified. Here we review evidence from the animal literature that 1) the brain is further sexually differentiated during puberty and adolescence; 2) ovarian hormones play an active role in the feminization of the brain during puberty; and 3) hormonally modulated, sex-specific addition of new neurons and glial cells, as well as loss of neurons, contribute to sexual differentiation of hypothalamic, limbic, and cortical regions during adolescence. This architectural remodeling during the adolescent phase of sexual differentiation of the brain may underlie the known sex differences in vulnerability to addiction and psychiatric disorders that emerge during this developmental period.
Hormones and Behavior 07/2013; 64(2):203-10. DOI:10.1016/j.yhbeh.2013.05.010 · 4.63 Impact Factor
ABSTRACT: Research suggests that prenatal testosterone exposure may masculinize (i.e., lower) disordered eating (DE) attitudes and behaviors and influence the lower prevalence of eating disorders in males versus females. How or when these effects become prominent remains unknown, although puberty may be a critical developmental period. In animals, the masculinizing effects of early testosterone exposure become expressed during puberty when gonadal hormones activate sex-typical behaviors, including eating behaviors. This study examined whether the masculinizing effects of prenatal testosterone exposure on DE attitudes emerge during puberty in 394 twins from opposite-sex and same-sex pairs. Twin type (opposite sex vs. same sex) was used as a proxy for level of prenatal testosterone exposure because females from opposite-sex twin pairs are thought to be exposed to testosterone in utero from their male co-twin. Consistent with animal data, there were no differences in levels of DE attitudes between opposite-sex and same-sex twins during pre-early puberty. However, during mid-late puberty, females from opposite-sex twin pairs (i.e., females with a male co-twin) exhibited more masculinized (i.e., lower) DE attitudes than females from same-sex twin pairs (i.e., females with a female co-twin), independent of several "third variables" (e.g., body mass index [BMI], anxiety). Findings suggest that prenatal testosterone exposure may decrease DE attitudes and at least partially underlie sex differences in risk for DE attitudes after mid-puberty.
ABSTRACT: Within-person changes in estradiol and progesterone predict changes in binge eating tendencies across the menstrual cycle. However, all women have menstrual-cycle fluctuations in hormones, but few experience binge eating. Personality traits may be critical individual difference factors that influence who will engage in emotional eating in the presence of a vulnerable hormonal environment. Women (N=239) provided self-reports of emotional eating and saliva samples for hormone measurement for 45 consecutive days. Negative urgency and negative emotionality were measured once and were examined as moderators of hormone-emotional eating associations. Consistent with prior research, within-person changes in the interaction between estradiol and progesterone predicted emotional eating. Neither negative urgency nor negative emotionality interacted with changes in estradiol and progesterone to predict changes in emotional eating. Additional factors, other than the two personality traits examined, may account for individual differences in within-person associations between hormones and emotional eating.