
Claudia Männel
Professor (PhD, MSc, MA)
Charité – Universitätsmedizin Berlin; Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany · Audiology and Phoniatrics
About
53 Publications · 9,076 Reads
597 Citations
Publications (53)
Infants rapidly advance in their speech perception, electrophysiologically reflected in the transition from an immature, positive-going to an adult-like, negative-going mismatch response (MMR) to auditory deviancy. Although the MMR is a common tool to study speech perception development, it is not yet completely understood how different speech cont...
Despite humans’ ability to communicate about concepts relating to different senses, word learning research tends to largely focus on labeling visual objects. Although sensory modality is known to influence memory and learning, its specific role for word learning remains largely unclear. We investigated associative word learning in adults, that is t...
Infants show impressive speech decoding abilities and detect acoustic regularities that highlight the syntactic relations of a language, often coded via non-adjacent dependencies (NADs, e.g., is singing). It has been claimed that infants learn NADs implicitly and associatively through passive listening and that there is a shift from effortless asso...
Infants prefer to be addressed with infant-directed speech (IDS). IDS benefits language acquisition through amplified low-frequency amplitude modulations. It has been reported that this amplification increases electrophysiological tracking of IDS compared to adult-directed speech (ADS). It is still unknown which particular frequency band triggers t...
Long before their first words, children communicate using speech-like vocalizations. These protophones might be indicative of infants' later language development. Here, we examined infants' (n = 56) early vocalizations at 6 months (vocal reactivity scale of the IBQ-R) as a predictor of their expressive and receptive language at 12 months (German...
In order to become proficient native speakers, children have to learn the morpho-syntactic relations between distant elements in a sentence, so-called non-adjacent dependencies (NADs). Previous research suggests that NAD learning in children comprises different developmental stages, where until 2 years of age children are able to learn NADs associa...
Grammar is central to any natural language. In the past decades, the artificial grammar of the AⁿBⁿ type, in which a pair of associated elements can be nested within another pair, was considered a desirable model for mimicking human language syntax without semantic interference. However, such a grammar relies on mere associative mechanisms, thus insuffic...
Despite the prominence of non-visual semantic features for some words (e.g., siren or thunder), little is known about when and how the meanings of those words that refer to auditory objects can be acquired in early infancy. With associative learning being an important mechanism of word learning, we ask the question whether associations between soun...
Non-adjacent dependencies (NADs) are important building blocks for language and extracting them from the input is a fundamental part of language acquisition. Prior event-related potential (ERP) studies revealed changes in the neural signature of NAD learning between infancy and adulthood, suggesting a developmental shift in the learning route for N...
Becoming a successful speaker depends on acquiring and learning grammatical dependencies between neighboring and non-neighboring linguistic elements (non-adjacent dependencies; NADs). Previous studies have demonstrated children’s and adults’ ability to distinguish NADs from NAD violations right after familiarization. However, demonstrating NAD reca...
Objectives: Individuals with dyslexia often suffer from deficient segmental phonology, but the status of suprasegmental phonology (prosody) is still debated.
Methods: In three passive-listening event-related brain potential (ERP) studies, we examined prosodic processing in literacy-impaired children for various prosodic units by contrasting th...
Human cognition relies on the ability to encode complex regularities in the input. Regularities above a certain complexity level can involve the feature of embedding, defined by nested relations between sequential elements. While comparative studies suggest the cognitive processing of embedding to be human specific, evidence of its ontogenesis is l...
Our fNIRS study shows a developmental shift in the ability to learn non-adjacent dependencies in the linguistic, but not in the non-linguistic domain. Two-year-old children show learning of non-adjacent dependencies in a novel natural language under passive listening conditions, whereas 3-year-olds do not. fNIRS data show that non-adjacent dependen...
Infants’ ability to learn complex linguistic regularities from early on has been revealed by electrophysiological studies indicating that 3-month-olds, but not adults, can automatically detect non-adjacent dependencies between syllables. While different ERP responses in adults and infants suggest that both linguistic rule learning and its link t...
The ability to process structured sequences of sounds lies at the basis of human language processing. Language is characterized by a high level of structural complexity including non-adjacent dependencies where the relationships between elements can span multiple intervening elements. Understanding how such structures can be learned is of particula...
This chapter reviews electrophysiological studies on early word-form segmentation and word-referent mapping, with a focus on the role of prosody in these early abilities closely related to vocabulary acquisition. First, we will review event-related brain potential (ERP) studies on word segmentation showing the impact of lexical stress cues, infant-...
During information processing, individuals benefit from bimodally presented input, as has been demonstrated for speech perception (i.e., printed letters and speech sounds) or the perception of emotional expressions (i.e., facial expression and voice tuning). While typically developing individuals show this bimodal benefit, school children with dysl...
Objective: Cortical malformations are documented postmortem in speech-processing areas of the dyslexic human brain. The goal of this pilot study was to determine whether such anatomical anomalies can be detected noninvasively and in vivo.
Methods: We developed a reconstruction of left perisylvian cortex profiles at a resolution of 400 μm using T1 data a...
The artificial grammar learning (AGL) paradigm enables systematic investigation of the acquisition of linguistically relevant structures. It is a paradigm of interest for language processing research, interfacing with theoretical linguistics, and for comparative research on language acquisition and evolution. We present a key for understanding majo...
The ability to extract and generalize abstract rules in an unknown language is present very early in life, but less pronounced in adulthood. Our previous EEG studies revealed that 3- to 4-month-old infants, but not adults, can learn nonadjacent dependencies in an unknown language under passive listening conditions. This raises the question whether...
There is considerable interest in understanding the ontogeny and phylogeny of the human language system, yet, neurobiological work at the interface of both fields is absent. Syntactic processes in language build on sensory processing and sequencing capabilities on the side of the receiver. While we better understand language-related ontogenetic cha...
Intact phonological processing is crucial for successful literacy acquisition. While individuals with difficulties in reading and spelling (i.e., developmental dyslexia) are known to experience deficient phoneme discrimination (i.e., segmental phonology), findings concerning their prosodic processing (i.e., suprasegmental phonology) are controversi...
Language, an elaborate system of discrete units and combinatorial rules, builds on complex neurocognitive foundations. Language development results from both maturation and learning. Specifically, learning mechanisms are implemented in brain networks that are still in the process of structurally and functionally maturing during the first year of li...
Successful communication in everyday life crucially involves the processing of auditory and visual components of speech. Viewing our interlocutor and processing visual components of speech facilitates speech processing by triggering auditory processing. Auditory phoneme processing, analyzed by event-related brain potentials (ERP), has been shown to...
Spoken language is hierarchically structured into prosodic units divided by prosodic breaks. The largest prosodic breaks in an utterance are intonational phrase boundaries (IPBs), which are defined by three acoustic cues, namely, pitch change, preboundary lengthening, and pausing. Previous studies have revealed that the electrophysiological marker...
Learning a spoken language presupposes efficient auditory functions. In the present event-related potential study, we tested whether and how basic auditory processes are related to online learning of a linguistic rule in infants and adults. Participants listened to frequent standard stimuli, which were interspersed with infrequent pitch deviants an...
This study explored the electrophysiology underlying intonational phrase processing at different stages of syntax acquisition. Developmental studies suggest that children's syntactic skills advance significantly between 2 and 3 years of age. Here, children of three age groups were tested on phrase-level prosodic processing before and after this dev...
In language learning, infants are faced with the challenge of decomposing continuous speech into relevant units, such as syntactic clauses and words. Within the framework of prosodic bootstrapping, behavioral studies suggest infants approach this segmentation problem by relying on prosodic information, especially on acoustically marked intonational...
Projects (3)
Sub-project: The sensitive period for associative learning of non-adjacent dependencies. Young infants easily learn non-adjacent dependencies from mere exposure, while adults need an explicit task. This suggests that these populations use different learning mechanisms to acquire these dependencies. Using fNIRS, we study when the shift from implicit to explicit learning takes place and which brain regions underlie these different types of grammar learning.
Based on our infant EEG-fNIRS studies on processing complex regularities (nested dependencies), we conducted two adult EEG studies under active and passive listening conditions.
Are infants able to process nested dependencies and if so, up to which level of complexity?