About
86 Publications · 27,270 Reads
4,788 Citations
Additional affiliations
July 2014 - October 2014
October 2011 - June 2013
April 2009 - September 2011
Publications (86)
Listeners implicitly use statistical regularities to segment continuous sound input into meaningful units, e.g., transitional probabilities between syllables to segment a speech stream into separate words. Implicit learning of such statistical regularities in a novel stimulus stream is reflected in a synchronisation of neural responses to the seque...
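The segmentation cue described above, transitional probabilities between adjacent syllables, can be illustrated with a short sketch. The syllable stream and the two invented "words" below are our own toy example, not the study's stimuli; the computation itself is just the standard conditional-frequency definition P(B|A) = count(AB) / count(A):

```python
from collections import Counter

def transitional_probabilities(syllables):
    """Return P(next | current) for each adjacent syllable pair in the stream."""
    pair_counts = Counter(zip(syllables, syllables[1:]))
    first_counts = Counter(syllables[:-1])
    return {(a, b): n / first_counts[a] for (a, b), n in pair_counts.items()}

# Toy stream of two invented trisyllabic "words" (bi-da-ku, pa-do-ti):
stream = "bidaku padoti bidaku bidaku padoti padoti bidaku".split()
syllables = [word[i:i + 2] for word in stream for i in range(0, 6, 2)]

tps = transitional_probabilities(syllables)
# Within-word transitions (bi->da, da->ku, pa->do, do->ti) have TP 1.0;
# transitions spanning a word boundary (e.g. ku->pa, ti->bi) are lower,
# which is the statistical dip listeners can exploit to place word boundaries.
```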
Equitable collaboration between culturally diverse scientists reveals that acoustic fingerprints of human speech and song share parallel relationships across the globe.
The right hemisphere of the human brain plays an important role in music processing with lateralized functions for pitch, meter and melody recognition among other features. However, the relationship between white matter structure and function in music processing is relatively little explored. We report an interesting case study of a 50-year-old mus...
Electroencephalography (EEG) microstates are short successive periods of stable scalp field potentials representing spontaneous activation of brain resting-state networks. EEG microstates are assumed to mediate local activity patterns. To test this hypothesis, we correlated momentary global EEG microstate dynamics with the local temporo-spectral ev...
Considerable debate surrounds syntactic processing similarities in language and music. Yet few studies have investigated how syntax interacts with meter considering that metrical regularity varies across domains. Furthermore, there are reports on individual differences in syntactic and metrical structure processing in music and language. Thus, a di...
Introduction
Electroencephalography (EEG) microstates are successive short time periods of stable scalp field potentials that represent spontaneous activation of brain resting-state networks. EEG microstates are assumed to mediate local activity patterns. To assess this hypothesis, we correlated momentary EEG microstate dynamics with the temporo-sp...
Joint music performance requires flexible sensorimotor coordination between self and other. Cognitive and sensory parameters of joint action—such as shared knowledge or temporal (a)synchrony—influence this coordination by shifting the balance between self-other segregation and integration. To investigate the neural bases of these parameters and the...
During conversations, speech prosody provides important clues about the speaker’s communicative intentions. In many languages, a rising vocal pitch at the end of a sentence typically expresses a question function, whereas a falling pitch suggests a statement. Here, the neurophysiological basis of intonation and speech act understanding were investi...
When people interact with each other, their brains synchronize. However, it remains unclear whether interbrain synchrony (IBS) is functionally relevant for social interaction or stems from exposure of individual brains to identical sensorimotor information. To disentangle these views, the current dual-EEG study investigated amplitude-based IBS in p...
Complex sequential behaviors, such as speaking or playing music, entail flexible rule-based chaining of single acts. However, it remains unclear how the brain translates abstract structural rules into movements. We combined music production with multimodal neuroimaging to dissociate high-level structural and low-level motor planning. Pianists playe...
Background: To determine and compare lesion patterns and structural dysconnectivity underlying post-stroke aprosodia and amusia, using a data-driven multimodal neuroimaging approach.
Methods: Thirty-nine patients with right or left hemisphere stroke were enrolled in a cohort study and tested for linguistic and affective prosody perception and musi...
The frontopolar cortex (FPC) contributes to tracking the reward of alternative choices during decision making, as well as their reliability. Whether this FPC function extends to reward gradients associated with continuous movements during motor learning remains unknown. We used anodal transcranial direct current stimulation (tDCS) over the right FP...
Complex sequential behaviours, such as speaking or playing music, often entail the flexible, rule-based chaining of single acts. However, it remains unclear how the brain translates abstract structural rules into concrete series of movements. Here we demonstrate a multi-level contribution of anatomically distinct cognitive and motor networks to the...
Language comprehension depends on tight functional interactions between distributed brain regions. While these interactions are established for semantic and syntactic processes, the functional network of speech intonation - the linguistic variation of pitch - has been scarcely defined. Particularly little is known about intonation in tonal language...
Neurocomparative music and language research has seen major advances over the past two decades. The goal of this Special Issue “Advances in the Neurocognition of Music and Language” was to showcase the multiple neural analogies between musical and linguistic information processing, their entwined organization in human perception and cognition and t...
Decision-making is increasingly being recognised to play a role in learning motor skills. Understanding the neural processes regulating motor decision-making is therefore essential to identify mechanisms that contribute to motor skill learning. In decision-making tasks, the frontopolar cortex (FPC) is involved in tracking the reward of different al...
Objectives: A major issue in the rehabilitation of children with cochlear implants (CIs) is unexplained variance in their language skills, where many of them lag behind children with normal hearing (NH). Here, we assess links between generative language skills and the perception of prosodic stress, and with musical and parental activities in childr...
Brain asymmetries for words and melodies of songs depend on opposite acoustic cues
Intonation, the modulation of pitch in speech, is a crucial aspect of language that is processed in right‐hemispheric regions, beyond the classical left‐hemispheric language system. Whether or not this notion generalises across languages remains, however, unclear. Particularly, tonal languages are an interesting test case because of the dual lingui...
Relative clauses modify a preceding element, but as this element can be flexibly located, the point of attachment is sometimes ambiguous. Preference for this attachment can vary within languages such as German, yet explanations for differences in attachment preference related to cognitive strategies or constraints have been conflicting in the curre...
Generation of hierarchical structures, such as the embedding of subordinate elements into larger structures, is a core feature of human cognition. Processing of hierarchies is thought to rely on lateral prefrontal cortex (PFC). However, the neural underpinnings supporting active generation of new hierarchical levels remain poorly understood. Here,...
Music played in ensembles is a naturalistic model to study joint action and leader-follower relationships. Recently, the investigation of the brain underpinnings of joint musical actions has gained attention; however, the cerebral correlates underlying the roles of leader and follower in music performance remain elusive. The present study addressed...
Neural activity phase-locks to rhythm in both music and speech. However, the literature currently lacks a direct test of whether cortical tracking of comparable rhythmic structure is comparable across domains. Moreover, although musical training improves multiple aspects of music and speech perception, the relationship between musical training and...
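Cortical tracking of rhythm is often quantified with a phase-based synchrony measure. A minimal sketch of one common metric, the phase-locking value (PLV), is shown below; the function name and the toy signals are our own illustration (real analyses would first extract instantaneous phase from band-passed EEG, e.g. via a Hilbert transform), not the study's actual pipeline:

```python
import numpy as np

def plv(phase_a, phase_b):
    """Phase-locking value between two instantaneous-phase time series:
    1.0 means a perfectly constant phase relation, near 0 means none."""
    return float(np.abs(np.mean(np.exp(1j * (phase_a - phase_b)))))

# Toy phases: a stimulus rhythm and a neural response locked to it with a lag.
t = np.linspace(0, 10, 1000)
stimulus_phase = 2 * np.pi * 2.0 * t             # 2 Hz rhythm
response_phase = stimulus_phase + 0.8            # constant lag -> PLV = 1.0
rng = np.random.default_rng(0)
noise_phase = rng.uniform(0, 2 * np.pi, t.size)  # unrelated -> PLV near 0
```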
Our ability to understand others' communicative intentions in speech is key to successful social interaction. Indeed, misunderstanding an "excuse me" as apology, while meant as criticism, may have important consequences. Recent behavioural studies have provided evidence that prosody, i.e., vocal tone, is an important indicator for speakers' intenti...
The relevance of left dorsal and ventral fiber pathways for syntactic and semantic comprehension is well established, while pathways for prosody are little explored. The present study examined linguistic prosodic structure building in a patient whose right arcuate/superior longitudinal fasci...
Evidence is accumulating that similar cognitive resources are engaged to process syntactic structure in music and language. Congenital amusia – a neurodevelopmental disorder that primarily affects music perception, including musical syntax – provides a special opportunity to understand the nature of this overlap. Using electroencephalography (EEG),...
The objective of our study was to assess alterations in speech as a possible localizing sign in frontal lobe epilepsy. Ictal speech was analyzed in 18 patients with frontal lobe epilepsy (FLE) during seizures and in the interictal period. Matched identical words were analyzed regarding alterations in fundamental frequency (f0) as an approximation o...
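The f0 measurements such an analysis rests on can be approximated with a basic autocorrelation estimator. The function, its parameters, and the synthetic signal below are illustrative assumptions for this sketch, not the study's actual speech-analysis method:

```python
import numpy as np

def estimate_f0(signal, sr, fmin=75.0, fmax=400.0):
    """Crude autocorrelation-based fundamental-frequency estimate (a sketch,
    searching only lags that correspond to plausible voice pitches)."""
    signal = signal - signal.mean()
    ac = np.correlate(signal, signal, mode="full")[signal.size - 1:]
    lo, hi = int(sr / fmax), int(sr / fmin)
    best_lag = lo + int(np.argmax(ac[lo:hi]))
    return sr / best_lag

sr = 8000
t = np.arange(int(0.25 * sr)) / sr
voiced = np.sin(2 * np.pi * 120.0 * t)  # synthetic 120 Hz "voiced" segment
```

The autocorrelation peaks at the lag matching the signal's period, so the estimate for the synthetic segment lands close to 120 Hz.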
Generation of hierarchical structures, such as the embedding of subordinate elements into larger structures, is a core feature of human cognition. Discrimination of well-formed hierarchies is thought to rely on lateral prefrontal cortex (PFC). However, the brain bases underlying the active generation of new hierarchical levels remain poorly underst...
It is well established that musical training induces sensorimotor plasticity. However, there are remarkable differences in how musicians train for proficient stage performance. The present EEG study outlines for the first time clear-cut neurobiological differences between classical and jazz musicians at high and low levels of action planning, revea...
The capacity to represent and generate hierarchical structures is a core feature of human cognition. While discrimination of well-formed hierarchical structures is thought to rely on lateral prefrontal cortex (lPFC) in several domains, the brain bases of active generation of new hierarchical levels remains relatively unexplored. Here we introduce a...
Song and speech represent two auditory categories the brain usually classifies fairly easily. Functionally, this classification ability may depend to a great extent on characteristic features of pitch patterns present in song melody and speech prosody. Anatomically, the temporal lobe (TL) has been discussed as playing a prominent role in the proces...
The ability to predict upcoming structured events based on long-term knowledge and contextual priors is a fundamental principle of human cognition. Tonal music triggers predictive processes based on structural properties of harmony, i.e., regularities defining the arrangement of chords into well-formed musical sequences. While the neural architectu...
Action-theoretic views of language posit that the recognition of others’ intentions is key to successful interpersonal communication. Yet, speakers do not always code their intentions literally, raising the question of which mechanisms enable interlocutors to exchange communicative intents. The present study investigated whether and how prosody—t...
Our vocal tone, the prosody, contributes a lot to the meaning of speech beyond the actual words. Indeed, the hesitant tone of a "yes" may be more telling than its affirmative lexical meaning [1]. The human brain contains dorsal and ventral processing streams in the left hemisphere that underlie core linguistic abilities such as phonology, syntax, and...
Background
Language and music present numerous structural similarities. This suggests that both domains may be built on shared cognitive and neuroanatomical resources.
Objectives
Here, discrepant results that either support or deny neurocognitive links between language and music are discussed. It is argued that the investigation of cognitive subpro...
Complex human behavior is hierarchically organized. Whether or not syntax plays a role in this organization is currently under debate. The present ERP study uses piano performance to isolate syntactic operations in action planning and to demonstrate their priority over nonsyntactic levels of movement selection. Expert pianists were asked to execute...
Sentences, musical phrases and goal-directed actions are composed of elements that are linked by specific rules to form meaningful outcomes. In goal-directed actions including a non-canonical element or scrambling the order of the elements alters the action's content and structure, respectively. In the present study we investigated event-related po...
Recent years have seen a major change in views on language and language use. During the last decades, language use has been more and more recognized as an intentional action (Grice 1957). In the form of speech acts (Austin 1962; Searle 1969), language expresses the speaker’s attitudes and communicative intents to shape the listener’s reaction. No...
Songs constitute a natural combination of lyrics and melodies, but it is unclear whether and how these two song components are integrated during the emergence of a memory trace. Network theories of memory suggest a prominent role of the hippocampus, together with unimodal sensory areas, in the build-up of conjunctive representations. The present st...
Background / Purpose:
There is much more to human communication than the (de)coding of the overt semantic meaning of a vocal speech signal. Often between the lines, speakers use additional prosodic cues to communicate their intentions, beliefs and attitudes. What are the brain bases for decoding the speaker’s intended meaning conveyed through sub...
Despite general agreement on shared syntactic resources in music and language, the neuroanatomical underpinnings of this overlap remain largely unexplored. While previous studies mainly considered frontal areas as supramodal grammar processors, the domain-general syntactic role of temporal areas has been so far neglected. Here we capitalized on the...
This functional magnetic resonance imaging study examines shared and distinct cortical areas involved in the auditory perception of song and speech at the level of their underlying constituents: words and pitch patterns. Univariate and multivariate analyses were performed to isolate the neural correlates of the word- and pitch-based discrimination...
An increasing number of neuroimaging studies in music cognition research suggest that "language areas" are involved in the processing of musical syntax, but none of these studies clarified whether these areas are a prerequisite for normal syntax processing in music. The present electrophysiological experiment tested whether patients with lesions in...
Contemporary neural models of auditory language comprehension proposed that the two hemispheres are differently specialized in the processing of segmental and suprasegmental features of language. While segmental processing of syntactic and lexical semantic information is predominantly assigned to the left hemisphere, the right hemisphere is thought...
The cognitive relationship between lyrics and tunes in song is currently under debate, with some researchers arguing that lyrics and tunes are represented as separate components, while others suggest that they are processed in integration. The present study addressed this issue by means of a functional magnetic resonance adaptation paradigm during...
The present study investigated the co-localization of musical and linguistic syntax processing in the human brain. EEGs were recorded from subdural electrodes placed on the left and right perisylvian cortex. The neural generators of the early potentials elicited by syntactic errors in music and language were localized by means of distributed source...
It has long been debated which aspects of music perception are universal and which are developed only after exposure to a specific musical culture. Here, we report a crosscultural study with participants from a native African population (Mafa) and Western participants, with both groups being naive to the music of the other respective culture. Exper...
This study investigates the functional architecture of working memory (WM) for verbal and tonal information during rehearsal and articulatory suppression. Participants were presented with strings of four sung syllables with the task to remember either the pitches (tonal information) or the syllables (verbal information). Rehearsal of verbal, as wel...
Music-syntactic irregularities often co-occur with the processing of physical irregularities. In this study we constructed chord-sequences such that perceived differences in the cognitive processing between regular and irregular chords could not be due to the sensory processing of acoustic factors like pitch repetition or pitch commonality (the maj...
We investigated electroencephalographic (EEG) correlates of moderate intermittent explosive disorder (mIED), which is characterized by uncontrollable, impulsive attacks that either manifest in aggressive outbursts of temper, or in implosive, auto-aggressive behaviour.
In two experiments, EEG data were recorded during rest conditions, and while subj...
Human personality has brain correlates that exert manifold influences on biological processes. This study investigates relations between emotional personality and heart activity. Our data demonstrate that emotional personality is related to a specific cardiac amplitude signature in the resting electrocardiogram (ECG). Two experiments using function...
The present study investigated music-syntactic processing with chord sequences that ended on either regular or irregular chord functions. Sequences were composed such that perceived differences in the cognitive processing between syntactically regular and irregular chords could not be due to the sensory processing of acoustic factors like pitch rep...
Human emotion and its electrophysiological correlates are still poorly understood. The present study examined whether the valence of perceived emotions would differentially influence EEG power spectra and heart rate (HR). Pleasant and unpleasant emotions were induced by consonant and dissonant music. Unpleasant (compared to pleasant) music evoked a...
Using evoked potentials, this study investigated effects of deep propofol sedation, and effects of recovery from unconsciousness, on the processing of auditory information with stimuli suited to elicit a physical MMN, and a (music-syntactic) ERAN.
Levels of sedation were assessed using the Bispectral Index (BIS) and the Modified Observer's Assessme...
The present study investigated simultaneous processing of language and music using visually presented sentences and auditorily presented chord sequences. Music-syntactically regular and irregular chord functions were presented synchronously with syntactically correct or incorrect words, or with words that had either a high or a low semantic cloze p...
…onsciousness. In contrast, the amplitude of the P1 was unchanged by sedation but markedly decreased during unconsciousness. Conclusion: The results indicate differential effects of propofol sedation on cognitive functions that involve mainly the auditory cortices and cognitive functions that involve the frontal cortices. Recent findings indicate th...
Semantics is a key feature of language, but whether or not music can activate brain mechanisms related to the processing of semantic meaning is not known. We compared processing of semantic meaning in language and music, investigating the semantic priming effect as indexed by behavioral measures and by the N400 component of the event-related brain...
It is an open question whether cognitive processes of auditory perception that are mediated by functionally different cortices exhibit the same sensitivity to sedation. The auditory event-related potentials P1, mismatch negativity (MMN), and early right anterior negativity (ERAN) originate from different cortical areas and reflect different stages...
INTRODUCTION Accurate pitch perception is a prerequisite for the processing of melodic, harmonic, and prosodic aspects of both language and music. Recently, neural dynamics underlying pitch processing within a musical context have been extensively investigated by recording the mismatch negativity (MMN) [1,2], the right anterior-temporal negativity...
In the present study, the early right-anterior negativity (ERAN) elicited by harmonically inappropriate chords during listening to music was compared to the frequency mismatch negativity (MMN) and the abstract-feature MMN. Results revealed that the amplitude of the ERAN, in contrast to the MMN, is specifically dependent on the degree of harmonic ap...