
Category-Sensitive Age-Related Shifts Between Prosodic and Semantic Dominance in Emotion Perception Linked to Cognitive Capacities

American Speech-Language-Hearing Association
Journal of Speech, Language, and Hearing Research

Abstract

Purpose: Prior research has extensively documented challenges in recognizing verbal and nonverbal emotion among older individuals compared with younger counterparts. However, the nature of these age-related changes remains unclear. The present study investigated how older and younger adults comprehend four basic emotions (i.e., anger, happiness, neutrality, and sadness) conveyed through verbal (semantic) and nonverbal (facial and prosodic) channels.

Method: A total of 73 older adults (43 women, Mage = 70.18 years) and 74 younger adults (37 women, Mage = 22.01 years) took part in a fixed-choice test for recognizing emotions presented visually via facial expressions or auditorily through prosody or semantics.

Results: The results confirmed an age-related decline in recognizing emotions across all channels except for identifying happy facial expressions. Furthermore, the two age groups demonstrated both commonalities and disparities in their inclinations toward specific channels. While both groups displayed a shared dominance of visual facial cues over auditory emotional signals, older adults showed a preference for semantics, whereas younger adults preferred prosody in auditory emotion perception. Notably, the dominance effects observed in older adults for visual and semantic cues were less pronounced for sadness and anger than for other emotions. These challenges in emotion recognition and the shifts in channel preferences among older adults were correlated with their general cognitive capabilities.

Conclusion: Together, the findings underscore that age-related obstacles in perceiving emotions and alterations in channel dominance, which vary by emotional category, are significantly intertwined with overall cognitive functioning.

Supplemental Material: https://doi.org/10.23641/asha.27307251
Research Article
Yi Lin,a,b Xiaoqing Ye,c Huaiyi Zhang,a,b Fei Xu,a,b Jingyu Zhang,a,b Hongwei Ding,a,b and Yang Zhangd

aSpeech-Language-Hearing Center, School of Foreign Languages, Shanghai Jiao Tong University, China
bNational Research Centre for Language and Well-Being, Shanghai, China
cShanghai Jiao Tong University School of Medicine, China
dDepartment of Speech-Language-Hearing Sciences and Masonic Institute for the Developing Brain, University of Minnesota, Minneapolis
ARTICLE INFO
Article History:
Received December 30, 2023
Revision received June 20, 2024
Accepted August 21, 2024
Editor-in-Chief: Rachael Frush Holt
Editor: Michelle R. Molis
https://doi.org/10.1044/2024_JSLHR-23-00817
Correspondence to Hongwei Ding: hwding@sjtu.edu.cn and Yang Zhang: zhanglab@umn.edu.
Disclosure: The authors have declared that no competing financial or nonfinancial interests existed at the time of publication.
Social cognitive and communicative skills play an essential role in navigating interpersonal conduct and relationships in older age. One key domain of sociocognitive functioning is understanding the communicative intentions and emotional states of others (Arioli et al., 2018; Kalokerinos et al., 2015). In previous studies, older adults have consistently demonstrated perceptual difficulties in decoding emotions from various verbal and nonverbal channels (Lambrecht et al., 2012; Ruffman et al., 2008), which may lead to the interpersonal dysfunction and social isolation commonly observed in psychopathological aging conditions (e.g., mild cognitive impairment, Alzheimer's disease; Cardenas et al., 2021). However, there are still many unknowns about how these age-related emotional deficits are differentially represented across communication channels (Schirmer & Adolphs, 2017). It also remains

Journal of Speech, Language, and Hearing Research, Vol. 67, 4829–4849, December 2024. Copyright © 2024 American Speech-Language-Hearing Association.
... fundamental frequency (f0), duration, intensity], which is known as prosody. In many languages and cultures, prosody can be more perceptually salient than the literal meaning of an utterance (Filippi et al., 2017; Kim & Sumner, 2017; Lin et al., 2020, 2021a, 2021b, 2024). In addition to speech communication, it plays a crucial role in distinguishing and interpreting the social significance of some nonspeech sounds (e.g. ...
Article
This study investigated the neural mechanisms underlying the pre-attentive processing of emotional prosody in speech and nonspeech contexts. Participants completed a double-oddball electroencephalography experiment by passively listening to speech sounds spoken with neutral or two alternating emotional (i.e., angry and happy) prosodies and to acoustically matched nonspeech hums. Mismatch negativity, P3a, and their associated neural oscillatory activities were analysed. Results showed that speech (relative to nonspeech) sounds produced greater and faster brain responses, especially for anger compared to happiness. Anger also triggered greater brain responses than happiness for both speech and nonspeech during auditory change detection, but this emotion-specific effect was preserved only for speech and reversed for nonspeech during involuntary attention orientation. Together, these findings showcase highly complex interactions between speech dominance and emotion specificity effects during different stages of pre-attentive processing. Our study provides evidence for the specialised mechanisms involved in emotional speech perception, which may have practical implications for understanding and treating clinical populations with altered speech sensitivity and emotion processing.
Article
Purpose: Understanding how older adults perceive and interpret emotional cues in speech prosody contributes to our knowledge of cognitive aging. This study provides a systematic review with meta-analysis to investigate the extent of the decline in affective prosody recognition (APR) among older adults in terms of overall and emotion-specific performance and explore potential moderators that may cause between-study heterogeneity.

Method: The literature search encompassed five electronic databases, with a specific emphasis on studies comparing the APR performance of older adults with that of younger adults. This comparison was focused on basic emotions. Meta-regression analyses were executed to pinpoint potential moderators related to demographic and methodological characteristics.

Results: A total of 19 studies were included in the meta-analysis, involving 560 older adults with a mean age of 69.15 years and 751 younger adults with a mean age of 23.02 years. The findings indicated a substantial negative effect size (g = −1.21). Furthermore, the magnitude of aggregated effect sizes showed a distinct valence-related recognition pattern with positive prosody exhibiting smaller effect sizes. Language background and years of education were found to moderate the overall and emotion-specific (i.e., disgust and surprise) performance effect estimate, and age and gender significantly influenced the effect estimate of happiness.

Conclusions: The results confirmed a significant decline in APR ability among older adults compared to younger adults, but this decline was unbalanced across basic emotions. Language background and educational level emerged as significant factors influencing older adults' APR ability. Moreover, participants with a higher mean age exhibited notably poorer performance in recognizing happy prosody. These findings underscore the need to further investigate the neurobiological mechanisms for APR decline associated with aging.
Supplemental Material https://doi.org/10.23641/asha.26407888
Article
Sex differences in verbal and nonverbal emotion processing in older individuals are underexplored despite declining emotional performance with age. This study aimed to investigate the nature of sex differences in age-related decline in emotion perception, exploring modulatory effects on communication channels and emotion categories. Seventy-three older adults (43 female participants, aged 60–89 years) and 74 younger adults (37 female participants, aged 18–30 years) completed a task to recognize basic emotions (i.e., anger, happiness, neutrality, sadness) expressed by female or male encoders through verbal (i.e., semantic) and nonverbal (i.e., facial and prosodic) channels. Female participants consistently demonstrated an overall advantage in emotion perception and expression across both age cohorts. In older adults, this superiority was heightened in decoding angry and sad faces, as well as angry prosody and happy and sad semantics. However, older individuals exhibited decreased sensitivities to angry semantics, sad prosody, and neutral prosody from female encoders, whereas they showed heightened sensitivities to happy faces from female encoders and angry faces from male encoders. Both older and younger adults displayed age-related changes in sex interactions specific to emotional categories and channels. But neither own-sex nor opposite-sex bias was systematically observed across the two age groups. These results suggest that explicit emotion processing involves an intricate integration of individual and contextual differences, with significant age and sex interplay linked to specific emotions and channels.
Article
Computational research suggests that semantic memory, operationalized as semantic memory networks, undergoes age-related changes. Previous work suggests that concepts in older adults' semantic memory networks are more separated, more segregated, and less connected to each other. However, cognitive network research often relies on group averages (e.g., young vs. older adults), and it remains unclear if individual differences influence age-related disparities in language production abilities. Here, we analyze the properties of younger and older participants' individual-based semantic memory networks based on their semantic relatedness judgments. We related individual-based network measures (clustering coefficient [CC; connectivity], global efficiency, and modularity [structure]) to language production (verbal fluency) and vocabulary knowledge. Similar to previous findings, we found significant age effects: CC and global efficiency were lower, and modularity was higher, for older adults. Furthermore, vocabulary knowledge was significantly related to the semantic memory network measures: corresponding with the age effects, CC and global efficiency had a negative relationship, while modularity had a positive relationship with vocabulary knowledge. More generally, vocabulary knowledge significantly increased with age, which may reflect the critical role that the accumulation of knowledge within semantic memory has on its structure. These results highlight the impact of diverse life experiences on older adults' semantic memory and demonstrate the importance of accounting for individual differences in the aging mental lexicon.
Article
The ability to recognise emotion from faces or voices appears to decline with advancing age. However, some studies have shown that emotion recognition of auditory-visual (AV) expressions is largely unaffected by age, i.e., older adults get a larger benefit from AV presentation than younger adults, resulting in similar AV recognition levels. An issue with these studies is that they used well-recognised emotional expressions that are unlikely to generalise to real-life settings. To examine if an AV emotion recognition benefit generalises across well and less well recognised stimuli, we conducted an emotion recognition study using expressions that had clear or unclear emotion information for both modalities, or clear visual, but unclear auditory information. Older (n = 30) and younger (n = 30) participants were tested on stimuli of anger, happiness, sadness, surprise, and disgust (expressed in spoken sentences) in auditory-only (AO), visual-only (VO), or AV format. Participants were required to respond by choosing one of 5 emotion options. Younger adults were more accurate in recognising emotions than older adults except for clear VO expressions. Younger adults showed an AV benefit even when unimodal recognition was poor. No such AV benefit was found for older adults; indeed, AV was worse than VO recognition when AO recognition was poor. Analyses of confusion responses indicated that older adults generated more confusion responses that were common between AO and VO conditions than younger adults did. We propose that older adults' poorer AV performance may be due to a combination of weak auditory emotion recognition and response uncertainty that resulted in a higher cognitive load.
Article
How language mediates emotional perception and experience is poorly understood. The present event-related potential (ERP) study examined the explicit and implicit processing of emotional speech to differentiate the relative influences of communication channel, emotion category and task type in the prosodic salience effect. Thirty participants (15 women) were presented with spoken words denoting happiness, sadness and neutrality in either the prosodic or semantic channel. They were asked to judge the emotional content (explicit task) and speakers' gender (implicit task) of the stimuli. Results indicated that emotional prosody (relative to semantics) triggered larger N100, P200 and N400 amplitudes with greater delta, theta and alpha inter-trial phase coherence (ITPC) and event-related spectral perturbation (ERSP) values in the corresponding early time windows, and continued to produce larger LPC amplitudes and faster responses during late stages of higher-order cognitive processing. The relative salience of prosody and semantics was modulated by emotion and task, though such modulatory effects varied across different processing stages. The prosodic salience effect was reduced for sadness processing and in the implicit task during early auditory processing and decision-making but reduced for happiness processing in the explicit task during conscious emotion processing. Additionally, across-trial synchronization of delta, theta and alpha bands predicted the ERP components, with higher ITPC and ERSP values significantly associated with stronger N100, P200, N400 and LPC enhancement. These findings reveal the neurocognitive dynamics of emotional speech processing with prosodic salience tied to stage-dependent emotion- and task-specific effects, which can offer insights into understanding language and emotion processing from cross-linguistic/cultural and clinical perspectives.
Article
Objectives: Older adults are often less accurate than younger counterparts at identifying emotions such as anger, sadness and fear from faces. They also look less at the eyes and more at the mouth during emotion perception. The current studies advance understanding of the nature of these age effects on emotional processing.

Method: Younger and older participants identified emotions from pictures of eyes or mouths (Experiment 1) and incongruent mouth-eyes emotion combinations (Experiment 2). In Experiment 3, participants categorised emotions from pictures in which face masks covered the mouth region.

Results: Older adults were worse than younger adults at identifying anger and sadness from eyes, but better at identifying the same emotions from the mouth region (Experiment 1), and they were more likely than younger adults to use information from the mouth to classify anger, fear and disgust (Experiment 2). In Experiment 3, facemasks impaired perception of anger, sadness and fear more for older compared to younger adults.

Discussion: These studies indicate that older people are more able than younger people to interpret emotional information from the mouth, are more biased toward using information from the mouth, and experience more difficulty in emotion perception when the mouth is covered with a facemask. This has implications for social communication in different age groups.
Article
Recent clinical reports have suggested a possible decline in the ability to understand emotions in speech (affective prosody comprehension) with aging. The present study aims to further examine the differences in performance between older and younger adults in terms of affective prosody comprehension. Following a recent cognitive model dividing affective prosody comprehension into perceptual and lexico-semantic components, a cognitive approach targeting these components was adopted. The influence of emotions' valence and category on aging performance was also investigated. A systematic review of the literature was carried out using six databases. Twenty-one articles, presenting 25 experiments, were included. All experiments analyzed affective prosody comprehension performance of older versus younger adults. The results confirmed that older adults' performance in identifying emotions in speech was reduced compared to younger adults. The results also brought out the fact that affective prosody comprehension abilities could be modulated by the emotion category but not by the emotional valence. Various theories account for this difference in performance, namely auditory perception, brain aging, and socioemotional selectivity theory suggesting that older people tend to neglect negative emotions. However, the explanation of the underlying deficits of the affective prosody decline is still limited.
Article
Individuals with schizophrenia reportedly demonstrate deficits in emotion perception. Relevant studies on the effects of decoder's sex, communication channels and emotion categories have produced mixed findings and seldom explored the interactions among these three key factors. The present pilot study examined how male and female individuals with schizophrenia and healthy controls perceived emotional (e.g., angry, happy, and sad) and neutral expressions from verbal semantic and nonverbal prosodic and facial channels. Twenty-eight (11 females) individuals with schizophrenia and 30 healthy controls (13 females) were asked to recognize emotional facial expressions, emotional prosody, and emotional semantics. Both accuracy and response time showed subpar performance for all communication channels and emotional categories in the schizophrenia group. More severe emotion perception deficits were found with the nonverbal (not the verbal) materials. There was also a reduced level of impairment with anger perception, especially in the female individuals with schizophrenia while biased perception towards emotional semantics was more pronounced in male individuals with schizophrenia. These findings, although preliminary, indicate the channel- and category-specific nature of emotion perception with potential sex differences among people with schizophrenia, which has important theoretical and practical implications.
Article
The present study aimed to investigate how basic emotions are processed in verbal and nonverbal channels by those with schizophrenia and how their emotional deficits are related to other core symptoms and cognitive domains in a Mandarin Chinese context. Fourteen individuals with schizophrenia and sixteen healthy individuals participated in this study. Individuals with schizophrenia demonstrate disrupted nonverbal emotion perception while having preserved verbal processing and spared processing of anger. These channel- and category-specific emotional deficits show significant associations with symptom severity (esp. positive and negative symptoms) and cognitive abilities (esp. memory and attention). The results have implications for developing and optimizing diagnosis and intervention methods to relieve core symptoms of schizophrenia and enhance socio-cognitive functioning via the improvement of multisensory (esp. nonverbal) emotion processing skills.