ABSTRACT: Early neuroimaging studies using Cyberball suggested that social rejection activated the pain matrix, as identified in studies of physical pain. However, these early studies were characterized by small sample sizes. Our statistical multi-level kernel density analysis (MKDA) of Cyberball neuroimaging studies with 244 participants fails to support the claim that social rejection operates on the same pain matrix as nociceptive stimuli, questioning whether social pain is more figurative or literal. We also performed an MKDA of the neuroimaging studies of reliving a romantic rejection to test whether the pain matrix was activated if the rejection were more meaningful. Results again failed to support the notion that rejection activates the neural matrix identified in studies of physical pain. Reliving an unwanted rejection by a romantic partner was significantly characterized by activation within and beyond the "Cyberball" brain network, suggesting that the neural correlates of social pain are more complex than previously thought.
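The core of MKDA is simple: each study's reported peak coordinates are convolved with a spherical kernel to build a binary per-study indicator map, and the weighted proportion of studies activating each voxel becomes the density map tested against a Monte Carlo null. The following is a minimal sketch of the density step only, assuming integer voxel coordinates, an arbitrary kernel radius, and equal study weights (real MKDA typically weights studies by sample size and study quality, and adds the null-distribution thresholding):

```python
import numpy as np

def mkda_density(study_peaks, shape=(91, 109, 91), radius=5, weights=None):
    """Weighted proportion of studies reporting a peak within `radius`
    voxels of each voxel (the MKDA density map), minimal sketch."""
    if weights is None:
        weights = np.ones(len(study_peaks))
    weights = np.asarray(weights, dtype=float)
    # Offsets of all voxels inside a sphere of the given radius.
    r = radius
    grid = np.mgrid[-r:r + 1, -r:r + 1, -r:r + 1].reshape(3, -1).T
    sphere = grid[(grid ** 2).sum(axis=1) <= r ** 2]
    density = np.zeros(shape)
    for peaks, w in zip(study_peaks, weights):
        # Binary "comparison indicator map": each study counts at most
        # once per voxel, no matter how many nearby peaks it reports.
        indicator = np.zeros(shape, dtype=bool)
        for p in np.atleast_2d(peaks):
            vox = p + sphere
            ok = np.all((vox >= 0) & (vox < np.array(shape)), axis=1)
            v = vox[ok]
            indicator[v[:, 0], v[:, 1], v[:, 2]] = True
        density += w * indicator
    return density / weights.sum()
```

The per-study binarization is what makes the analysis "multi-level": a study with many clustered peaks cannot dominate the map the way it would in a simple peak-count density.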
ABSTRACT: Although significant advances have been made in our understanding of the neural basis of action observation and intention understanding in the last few decades by studies demonstrating the involvement of a specific brain network (the action observation network; AON), these have been largely based on experimental studies in which people have been considered as strictly isolated entities. However, we, as a social species, spend much of our time performing actions while interacting with others. Research shows that a person's position along the continuum of perceived social isolation/bonding to others is associated with a variety of physical and mental health effects. Thus, there is a crucial need to better understand the neural basis of intention understanding in interpersonal and emotional contexts. To address this issue, we performed a meta-analysis of functional magnetic resonance imaging (fMRI) studies from the past decade that examined brain and cortical network processing associated with understanding the intention of others' actions vs. that associated with passionate love for others. Both overlapping and distinct cortical and subcortical regions were identified for intention and love, respectively. These findings provide scientists and clinicians with a set of brain regions that can be targeted in future neuroscientific studies of intention understanding, and help develop neurocognitive models of pair-bonding.
Frontiers in Human Neuroscience 03/2013; 7:99. DOI:10.3389/fnhum.2013.00099 · 3.63 Impact Factor
ABSTRACT: Neuroimaging studies have found a correlation between activation in the anterior insula and love, and between activation in the posterior insula and lust. The present case-control study describes a male neurological patient with a rare, circumscribed lesion in the anterior insula, whom we tested using a decision task that required him to judge whether each of a series of attractive individuals could be an object of his love or lust. The patient, in contrast with neurologically typical participants matched on age, gender, and ethnicity, performed normally when making decisions about lust but showed a selective deficit when making decisions about love. These results provide the first clinical evidence that the anterior insula may play an instrumental role in love but not in lust. These data support the notion of a posterior-to-anterior insular gradient, from sensorimotor to abstract representations, in the evaluation of anticipatory rewards in interpersonal relationships.
ABSTRACT: Numerous species possess cortical regions that are most sensitive to vocalizations produced by their own kind (conspecifics). In humans, the superior temporal sulci (STSs) putatively represent homologous voice-sensitive areas of cortex. However, superior temporal sulcus (STS) regions have recently been reported to represent auditory experience or "expertise" in general rather than showing exclusive sensitivity to human vocalizations per se. Using functional magnetic resonance imaging and a unique non-stereotypical category of complex human non-verbal vocalizations-human-mimicked versions of animal vocalizations-we found a cortical hierarchy in humans optimized for processing meaningful conspecific utterances. This left-lateralized hierarchy originated near primary auditory cortices and progressed into traditional speech-sensitive areas. Our results suggest that the cortical regions supporting vocalization perception are initially organized by sensitivity to the human vocal tract in stages before the STS. Additionally, these findings have implications for the developmental time course of conspecific vocalization processing in humans as well as its evolutionary origins.
The Journal of Neuroscience : The Official Journal of the Society for Neuroscience 06/2012; 32(23):8084-93. DOI:10.1523/JNEUROSCI.1118-12.2012 · 6.34 Impact Factor
ABSTRACT: Whether viewed or heard, an object in action can be segmented from a background scene based on a number of different sensory cues. In the visual system, salient low-level attributes of an image are processed along parallel hierarchies, and involve intermediate stages, such as the lateral occipital cortices, wherein gross-level object form features are extracted prior to stages that show object specificity (e.g. for faces, buildings, or tools). In the auditory system, though relying on a rather different set of low-level signal attributes, a distinct acoustic event or “auditory object” can also be readily extracted from a background acoustic scene. However, it remains unclear whether cortical processing strategies used by the auditory system similarly extract gross-level aspects of “acoustic object form” that may be inherent to many real-world sounds. Examining mechanical and environmental action sounds, representing two distinct categories of non-biological and non-vocalization sounds, we had participants assess the degree to which each sound was perceived as a distinct object versus an acoustic scene. Using two functional magnetic resonance imaging (fMRI) task paradigms, we revealed bilateral foci along the superior temporal gyri (STG) showing sensitivity to the “object-ness” ratings of action sounds, independent of the category of sound and independent of task demands. Moreover, for both categories of sounds these regions also showed parametric sensitivity to spectral structure variations—a measure of change in entropy in the acoustic signals over time (acoustic form)—while only the environmental sounds showed parametric sensitivity to mean entropy measures.
Thus, similar to the visual system, the auditory system appears to include intermediate feature extraction stages that are sensitive to the acoustic form of action sounds, and may serve as a stage that begins to dissociate different categories of real-world auditory objects.
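The entropy measures described above can be illustrated with a short sketch: Shannon entropy is computed for each short-time power spectrum treated as a probability distribution over frequency bins, with the mean over windows standing in for "mean entropy" and the standard deviation across windows as one plausible reading of "change in entropy over time." The exact measures and window parameters used in the study may differ:

```python
import numpy as np

def spectral_entropy_measures(signal, win=1024, hop=512):
    """Mean spectral entropy and its variation across time windows.

    Each windowed frame's power spectrum is normalized to sum to 1 and
    its Shannon entropy (bits) computed; returns (mean, std) over frames.
    """
    window = np.hanning(win)
    entropies = []
    for start in range(0, len(signal) - win + 1, hop):
        frame = signal[start:start + win] * window
        power = np.abs(np.fft.rfft(frame)) ** 2
        p = power / power.sum()          # treat spectrum as a distribution
        p = p[p > 0]                     # drop zero bins (0*log 0 := 0)
        entropies.append(-(p * np.log2(p)).sum())
    entropies = np.asarray(entropies)
    return entropies.mean(), entropies.std()
```

A flat, noise-like spectrum yields high entropy; a tonal sound concentrates power in few bins and yields low entropy, which is why such measures can serve as a coarse index of "acoustic form."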
Frontiers in Systems Neuroscience 05/2012; 6:27. DOI:10.3389/fnsys.2012.00027
ABSTRACT: One of the most difficult dilemmas in relationship science and couple therapy concerns the interaction between sexual desire and love. As two mental states of intense longing for union with others, sexual desire and love are, in fact, often difficult to disentangle from one another.
The present review aims to clarify the differences and similarities between these two mental states using a comprehensive statistical meta-analysis of all functional magnetic resonance imaging (fMRI) studies of sexual desire and love.
Systematic retrospective review of pertinent neuroimaging literature.
Review of published literature on fMRI studies illustrating brain regions associated with love and sexual desire to date.
Sexual desire and love not only show differences but also recruit a striking common set of brain areas that mediate somatosensory integration, reward expectation, and social cognition. More precisely, a significant posterior-to-anterior insular pattern appears to track sexual desire and love progressively.
This specific pattern of activation suggests that love builds upon a neural circuit for emotions and pleasure, adding regions associated with reward expectancy, habit formation, and feature detection. In particular, the shared activation within the insula, with a posterior-to-anterior pattern, from desire to love, suggests that love grows out of and is a more abstract representation of the pleasant sensorimotor experiences that characterize desire. From these results, one may consider desire and love on a spectrum that evolves from integrative representations of affective visceral sensations to an ultimate representation of feelings incorporating mechanisms of reward expectancy and habit learning.
Journal of Sexual Medicine 02/2012; 9(4):1048-54. DOI:10.1111/j.1743-6109.2012.02651.x · 3.15 Impact Factor
ABSTRACT: Both sighted and blind individuals can readily interpret meaning behind everyday real-world sounds. In sighted listeners, we previously reported that regions along the bilateral posterior superior temporal sulci (pSTS) and middle temporal gyri (pMTG) are preferentially activated when presented with recognizable action sounds. These regions have generally been hypothesized to represent primary loci for complex motion processing, including visual biological motion processing and audio-visual integration. However, it remained unclear whether, or to what degree, life-long visual experience might impact functions related to hearing perception or memory of sound-source actions. Using functional magnetic resonance imaging (fMRI), we compared brain regions activated in congenitally blind versus sighted listeners in response to hearing a wide range of recognizable human-produced action sounds (excluding vocalizations) versus unrecognized, backward-played versions of those sounds. Here, we show that recognized human action sounds commonly evoked activity in both groups along most of the left pSTS/pMTG complex, though with relatively greater activity in the right pSTS/pMTG by the blind group. These results indicate that portions of the postero-lateral temporal cortices contain domain-specific hubs for biological and/or complex motion processing independent of sensory-modality experience. Contrasting the two groups, the sighted listeners preferentially activated bilateral parietal plus medial and lateral frontal networks, whereas the blind listeners preferentially activated left anterior insula plus bilateral anterior calcarine and medial occipital regions, including what would otherwise have been visual-related cortex. These global-level network differences suggest that blind and sighted listeners may preferentially use different memory retrieval strategies when hearing and attempting to recognize action sounds.
Human Brain Mapping 12/2011; 32(12):2241-55. DOI:10.1002/hbm.21185 · 5.97 Impact Factor
ABSTRACT: In contrast to visual object processing, relatively little is known about how the human brain processes everyday real-world sounds, transforming highly complex acoustic signals into representations of meaningful events or auditory objects. We recently reported a fourfold cortical dissociation for representing action (nonvocalization) sounds correctly categorized as having been produced by human, animal, mechanical, or environmental sources. However, it was unclear how consistent those network representations were across individuals, given potential differences between each participant's degree of familiarity with the studied sounds. Moreover, it was unclear what, if any, auditory perceptual attributes might further distinguish the four conceptual sound-source categories, potentially revealing what might drive the cortical network organization for representing acoustic knowledge. Here, we used functional magnetic resonance imaging to test participants before and after extensive listening experience with action sounds, and tested for cortices that might be sensitive to each of three different high-level perceptual attributes relating to how a listener associates or interacts with the sound source. These included the sound's perceived concreteness, effectuality (ability to be affected by the listener), and spatial scale. Despite some variation of networks for environmental sounds, our results verified the stability of a fourfold dissociation of category-specific networks for real-world action sounds both before and after familiarity training. Additionally, we identified cortical regions parametrically modulated by each of the three high-level perceptual sound attributes. We propose that these attributes contribute to the network-level encoding of category-specific acoustic knowledge representations.
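Parametric modulation of this kind is typically tested by adding a modulated regressor to the fMRI design matrix: event sticks scaled by a mean-centered per-trial attribute rating (e.g., concreteness), convolved with a hemodynamic response function. A minimal sketch follows, assuming a simple double-gamma HRF and a regular TR grid; the study's actual analysis pipeline may differ:

```python
import math
import numpy as np

def _gamma_pdf(t, a):
    """Gamma(a, 1) probability density, used to build the HRF."""
    return t ** (a - 1) * np.exp(-t) / math.gamma(a)

def parametric_regressor(onsets, modulator, n_scans, tr=2.0, dt=0.1):
    """Parametrically modulated fMRI regressor (sketch).

    Event sticks at `onsets` (seconds) are scaled by the mean-centered
    attribute values, convolved with a double-gamma HRF on a fine time
    grid, then sampled once per scan (TR)."""
    mod = np.asarray(modulator, dtype=float)
    mod = mod - mod.mean()            # center so it is separable from the main event effect
    n_fine = int(round(n_scans * tr / dt))
    sticks = np.zeros(n_fine)
    for onset, m in zip(onsets, mod):
        sticks[int(round(onset / dt))] += m
    t = np.arange(0, 32, dt)          # 32 s of HRF support
    hrf = _gamma_pdf(t, 6) - _gamma_pdf(t, 16) / 6.0  # peak ~5 s, late undershoot
    return np.convolve(sticks, hrf)[:n_fine][::int(round(tr / dt))]
```

The fitted weight on such a regressor indexes how strongly a voxel's response scales with the attribute, which is what "parametrically modulated" means in the abstract above.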
ABSTRACT: Brain imaging is becoming a powerful tool in the study of human cerebral functions related to close personal relationships. Outside of subcortical structures traditionally thought to be involved in reward-related systems, a wide range of neuroimaging studies in relationship science indicate a prominent role for different cortical networks and cognitive factors. Thus, the field needs a better anatomical/network/whole-brain model to help translate scientific knowledge from lab bench to clinical models and ultimately to the patients suffering from disorders associated with love and couple relationships.
The aim of the present review is to survey a wide range of functional magnetic resonance imaging (fMRI) studies in order to critically identify the cortical networks associated with passionate love, and to compare and contrast them with those associated with other types of love (such as maternal love and unconditional love for persons with intellectual disabilities).
Retrospective review of pertinent neuroimaging literature.
Review of published literature on fMRI studies of love illustrating brain regions associated with different forms of love.
Although all fMRI studies of love point to the subcortical dopaminergic reward-related brain systems (involving dopamine and oxytocin receptors) for motivating individuals in pair-bonding, the present meta-analysis newly demonstrated that different types of love involve distinct cerebral networks, including those for higher cognitive functions such as social cognition and bodily self-representation.
These metaresults provide the first stages of a global neuroanatomical model of cortical networks involved in emotions related to different aspects of love. Developing this model in future studies should be helpful for advancing clinical approaches helpful in sexual medicine and couple therapy.
Journal of Sexual Medicine 11/2010; 7(11):3541-52. DOI:10.1111/j.1743-6109.2010.01999.x · 3.15 Impact Factor
ABSTRACT: With regard to hearing perception, it remains unclear whether, or to what extent, different conceptual categories of real-world sounds and related categorical knowledge are differentially represented in the brain. Semantic knowledge representations are reported to include the major divisions of living versus non-living things, plus more specific categories including animals, tools, biological motion, faces, and places-categories typically defined by their characteristic visual features. Here, we used functional magnetic resonance imaging (fMRI) to identify brain regions showing preferential activity to four categories of action sounds, which included non-vocal human and animal actions (living), plus mechanical and environmental sound-producing actions (non-living). The results showed a striking antero-posterior division in cortical representations for sounds produced by living versus non-living sources. Additionally, there were several significant differences by category, depending on whether the task was category-specific (e.g. human or not) versus non-specific (detect end-of-sound). In general, (1) human-produced sounds yielded robust activation in the bilateral posterior superior temporal sulci independent of task. Task demands modulated activation of left lateralized fronto-parietal regions, bilateral insular cortices, and sub-cortical regions previously implicated in observation-execution matching, consistent with "embodied" and mirror-neuron network representations subserving recognition. (2) Animal action sounds preferentially activated the bilateral posterior insulae. (3) Mechanical sounds activated the anterior superior temporal gyri and parahippocampal cortices. (4) Environmental sounds preferentially activated dorsal occipital and medial parietal cortices.
Overall, this multi-level dissociation of networks for preferentially representing distinct sound-source categories provides novel support for grounded cognition models that may underlie organizational principles for hearing perception.
ABSTRACT: The ability to detect and rapidly process harmonic sounds, which in nature are typical of animal vocalizations and speech, can be critical for communication among conspecifics and for survival. Single-unit studies have reported neurons in auditory cortex sensitive to specific combinations of frequencies (e.g., harmonics), theorized to rapidly abstract or filter for specific structures of incoming sounds, where large ensembles of such neurons may constitute spectral templates. We studied the contribution of harmonic structure to activation of putative spectral templates in human auditory cortex by using a wide variety of animal vocalizations, as well as artificially constructed iterated rippled noises (IRNs). Both the IRNs and vocalization sounds were quantitatively characterized by calculating a global harmonics-to-noise ratio (HNR). Using functional MRI, we identified HNR-sensitive regions when presenting artificial IRNs, recordings of natural animal vocalizations, or both. This activation included regions situated between functionally defined primary auditory cortices and regions preferential for processing human nonverbal vocalizations or speech sounds. These results demonstrate that the HNR of sound reflects an important second-order acoustic signal attribute that parametrically activates distinct pathways of human auditory cortex. Thus, these results provide novel support for the presence of spectral templates, which may subserve a major role in the hierarchical processing of vocalizations as a distinct category of behaviorally relevant sound.
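Both constructs above can be sketched in code: IRN generation via the standard delay-and-add algorithm, and a crude global HNR estimate from the peak of the normalized autocorrelation (one common autocorrelation-based approach; the paper's exact HNR computation may differ). Iterating the delay-and-add step builds spectral ripples at multiples of 1/delay, making the noise progressively more harmonic:

```python
import numpy as np

def iterated_rippled_noise(n_samples, delay, iterations, gain=1.0, seed=0):
    """IRN via delay-and-add: repeatedly sum the signal with a delayed,
    scaled copy of itself, producing ripples at multiples of 1/delay."""
    rng = np.random.default_rng(seed)
    s = rng.standard_normal(n_samples)
    for _ in range(iterations):
        delayed = np.zeros_like(s)
        delayed[delay:] = s[:-delay]   # delay by `delay` samples
        s = s + gain * delayed
    return s / np.abs(s).max()         # normalize to avoid clipping

def estimate_hnr(signal, min_lag=20):
    """Global HNR in dB from the largest normalized autocorrelation
    value r at a nonzero lag: HNR = 10*log10(r / (1 - r))."""
    x = signal - signal.mean()
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]
    ac = ac / ac[0]
    r = ac[min_lag:len(x) // 2].max()
    r = min(max(r, 1e-6), 1 - 1e-6)    # keep the log-ratio well defined
    return 10 * np.log10(r / (1 - r))
```

With more iterations the autocorrelation peak at the delay lag grows, so the estimated HNR rises, which is how a graded harmonic-structure manipulation is obtained from noise.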
The Journal of Neuroscience : The Official Journal of the Society for Neuroscience 03/2009; 29(7):2283-96. DOI:10.1523/JNEUROSCI.4145-08.2009 · 6.34 Impact Factor