Research Article
Category-Sensitive Age-Related Shifts Between Prosodic and Semantic Dominance in Emotion Perception Linked to Cognitive Capacities
Yi Lin,a,b Xiaoqing Ye,c Huaiyi Zhang,a,b Fei Xu,a,b Jingyu Zhang,a,b Hongwei Ding,a,b and Yang Zhangd

aSpeech-Language-Hearing Center, School of Foreign Languages, Shanghai Jiao Tong University, China
bNational Research Centre for Language and Well-Being, Shanghai, China
cShanghai Jiao Tong University School of Medicine, China
dDepartment of Speech-Language-Hearing Sciences and Masonic Institute for the Developing Brain, University of Minnesota, Minneapolis
ARTICLE INFO
Article History:
Received December 30, 2023
Revision received June 20, 2024
Accepted August 21, 2024
Editor-in-Chief: Rachael Frush Holt
Editor: Michelle R. Molis
https://doi.org/10.1044/2024_JSLHR-23-00817
Correspondence to Hongwei Ding: hwding@sjtu.edu.cn and Yang
Zhang: zhanglab@umn.edu. Disclosure: The authors have declared
that no competing financial or nonfinancial interests existed at the time
of publication.
ABSTRACT
Purpose: Prior research has extensively documented challenges in recognizing verbal and nonverbal emotion among older individuals compared with younger counterparts. However, the nature of these age-related changes remains unclear. The present study investigated how older and younger adults comprehend four basic emotions (i.e., anger, happiness, neutrality, and sadness) conveyed through verbal (semantic) and nonverbal (facial and prosodic) channels.
Method: A total of 73 older adults (43 women, Mage = 70.18 years) and 74 younger adults (37 women, Mage = 22.01 years) took part in a fixed-choice test for recognizing emotions presented visually via facial expressions or auditorily through prosody or semantics.
Results: The results confirmed age-related decline in recognizing emotions across all channels except for identifying happy facial expressions. Furthermore, the two age groups demonstrated both commonalities and disparities in their inclinations toward specific channels. While both groups displayed a shared dominance of visual facial cues over auditory emotional signals, older adults indicated a preference for semantics, whereas younger adults displayed a preference for prosody in auditory emotion perception. Notably, the dominance effects observed in older adults for visual and semantic cues were less pronounced for sadness and anger compared to other emotions. These challenges in emotion recognition and the shifts in channel preferences among older adults were correlated with their general cognitive capabilities.
Conclusion: Together, the findings underscore that age-related obstacles in perceiving emotions and alterations in channel dominance, which vary by emotional category, are significantly intertwined with overall cognitive functioning.
Supplemental Material: https://doi.org/10.23641/asha.27307251
Social cognitive and communicative skills play an essential role in navigating interpersonal conduct and relationships in older age. One key domain of sociocognitive function is understanding the communicative intentions and emotional states of others (Arioli et al., 2018; Kalokerinos et al., 2015). In previous studies, older adults have consistently demonstrated perceptual difficulties in decoding emotions from various verbal and nonverbal channels (Lambrecht et al., 2012; Ruffman et al., 2008), which may lead to the interpersonal dysfunction and social isolation commonly observed in psychopathological aging conditions (e.g., mild cognitive impairment, Alzheimer's disease; Cardenas et al., 2021). However, much remains unknown about how these age-related emotional deficits are differentially represented across communication channels (Schirmer & Adolphs, 2017). It also remains
Journal of Speech, Language, and Hearing Research, Vol. 67, 4829–4849, December 2024. Copyright © 2024 American Speech-Language-Hearing Association.