Article

Cortical processing of human and emoji faces: an ERP analysis

Abstract

Social networks have become an effective tool for interaction, frequently replacing face-to-face communication and relying on non-verbal emotional cues such as emojis. The aim of the present study was to investigate whether human and emoji faces are processed analogously at the cortical level. We recorded P100, N170, and LPP event-related potentials (ERPs) in response to happy, neutral, and angry expressions of human and emoji faces in 30 participants. The results showed that P100 and LPP amplitudes were larger in response to human faces, whereas emoji faces generated larger N170 amplitudes. Angry faces elicited significantly larger P100 and LPP amplitudes. The neural time course of processing human and emoji faces was similar; however, human faces generated a greater attentional orienting response and greater arousal and attentional engagement, whereas emoji faces increased structural encoding processes. These results suggest that the use of emoji faces in computer-mediated communication produces neural responses similar to those observed in face-to-face communication.
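For readers unfamiliar with how such component amplitudes are quantified, the sketch below shows one conventional way of epoching EEG and extracting mean amplitudes in typical P100, N170, and LPP windows with MNE-Python. The file name, event codes, and time windows are illustrative assumptions, not the authors' actual pipeline.

import mne

# Hypothetical recording of the face/emoji experiment
raw = mne.io.read_raw_fif("face_emoji_eeg_raw.fif", preload=True)
raw.filter(l_freq=0.1, h_freq=30.0)  # typical ERP band-pass

events = mne.find_events(raw)
event_id = {"human/happy": 1, "human/neutral": 2, "human/angry": 3,
            "emoji/happy": 4, "emoji/neutral": 5, "emoji/angry": 6}
epochs = mne.Epochs(raw, events, event_id, tmin=-0.2, tmax=1.0,
                    baseline=(None, 0), preload=True)

# Mean amplitude in a window around each component's typical latency
windows = {"P100": (0.08, 0.12), "N170": (0.15, 0.20), "LPP": (0.40, 0.80)}
for condition in event_id:
    evoked = epochs[condition].average()
    for comp, (t0, t1) in windows.items():
        amp = evoked.copy().crop(t0, t1).data.mean()  # over channels and time
        print(f"{condition} {comp}: {amp * 1e6:.2f} µV")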


... As previously mentioned, regarding the differences between facial expressions and emojis, most studies support the notion that emojis perform functions similar to those of facial expressions in emotional communication (Kaye et al., 2017). Moreover, neuroscientific evidence suggests that the processing of emojis is largely equivalent to that of comparable facial expressions (Chatzichristos et al., 2020; Gantiva, Sotaquirá, Araujo, & Cuervo, 2019). For instance, Gantiva et al. (2019) found that both facial expressions and emojis elicited P100, N170, and LPP components. Neuroimaging evidence has also shown that memory retrieval for emojis significantly activates the inferior frontal gyrus, amygdala, and right temporal pole (Chatzichristos et al., 2020), regions known to be activated during the perception of facial expressions (e.g., Krautheim et al., 2020; Liu, Liu, Zheng, Zhao, & Fu, 2021). ...

... This evidence suggests similarities in the processing of facial expressions and emojis. However, the prior ERP literature comparing emojis and faces has mostly focused on emotion processing (without violations) and exclusively on early ERP components (e.g., Gantiva et al., 2019; Weiß, Mussel, & Hewig, 2020). The present study aimed to determine the similarities and differences in the emotional violation processing of facial expressions, emojis, and emotion words. ...
Article
Full-text available
Human social interactions depend on the construction of emotional meaning. The present study used event-related potentials to investigate the neural features of emotional violation processing in facial expressions, emojis, and emotion words. Behavioral results showed emotion congruency effects among facial expressions, emojis, and emotion words. Emotional violations resulted in longer response times than emotion-congruent conditions in happy contexts. Responses to angry faces were slower in angry sentences than in happy sentences. As expected, the classic N400 effect was obtained for emotional violations among facial expressions, emojis, and emotion words: emotional violations resulted in more negative-going N400 amplitudes. Moreover, the N400 effects elicited by facial expressions and emojis were significantly smaller than those elicited by emotion words, and there were no significant differences in N400 effects between facial expressions and emojis. The findings suggest that the emotional violation processing of facial expressions, emojis, and emotion words is reflected in an electrophysiological index of semantic processing, and that emotional violations elicit higher levels of semantic retrieval. In addition, there were differences between nonverbal and verbal information processing in emotional violation, with the emotional violation of words inducing greater semantic retrieval demands than that of facial expressions and emojis.
... The human brain shows neural activity in response to facial emojis similar to that elicited by real facial expressions (Gantiva et al., 2020); that is, it processes facial emojis much as it does facial expressions (Gantiva et al., 2020). Thus, emojis can be significant in remote social communication, especially in the absence of facial expressions and verbal cues (Gesselman et al., 2019). ...
Thesis
Full-text available
Abstract: The lack of transmitted emotion and cues in virtual internal business communication can negatively affect trust levels and relationships in organisations. Emojis are used to transmit emotion and non-verbal cues in business communication, yet misinterpretation of emojis, i.e., of the emotions and cues they carry, occurs due to their ambiguity, which can lead to misunderstanding and misalignment in business communication. This research aims to enable accurate transmission of emotions and cues in business communication by suggesting and developing an emoji syntax that sets out rules and guidelines for using emojis in the workplace. The emoji syntax is introduced in a clan-culture company, Qudits, a Swiss start-up chosen for its main characteristics, such as encouraging open communication, flexibility, and the exploitation of technologies for people development. The research adopts pragmatism and interpretivism as its research philosophy and follows the DSR approach. Furthermore, the research conducts a qualitative analysis. The data for this research were collected in three main ways: documents from Qudits, exploratory interviews with the executive and co-founder, and semi-structured interviews with all Qudits team members. The main results of this research show that the clearer the emojis, the more effective the communication. Likewise, the more clan culture an organisation has, the more open communication occurs in it. Therefore, when emojis are used correctly, they can increase the engagement level of communication, reduce misunderstanding, and increase trust in organisations.
... A related line of research has compared the processing of emojis to other kinds of stimuli, such as words or faces. The results have shown a similarity between responses to emojis and facial expressions (e.g., Gantiva et al., 2020; Weiß et al., 2019). It should be noted, however, that other studies have failed to find an exact correspondence between face and emoji processing. ...
... Recent findings support this possibility. For instance, face emojis have been shown to produce affective responses (measured with affective ratings, e.g., Fischer & Herbert, 2021; Gantiva et al., 2021), neural responses (measured with event-related potentials, e.g., Gantiva et al., 2020), and psychophysiological responses (measured with the startle reflex, Aluja et al., 2020, and with electromyographic recordings, Gantiva et al., 2021) that are like those produced by human faces expressing emotions. Furthermore, emojis, like words, have been found to modulate the affective processing of the words with which they co-occur, both at a behavioral and at a neural level (Comesaña et al., 2013; Pfeifer et al., 2022; Yang et al., 2021). ...
Article
Full-text available
This article presents subjective norms for 1031 emojis in six dimensions: visual complexity, familiarity, frequency of use, clarity, emotional valence, and emotional arousal. This is the largest normative study conducted so far that relies on subjective ratings. Unlike the few existing normative studies, which mainly comprise face emojis, here we present a wide range of emoji categories. We also examine the correlations between the dimensions assessed. Our results show that, in terms of their affective properties, emojis are analogous to other stimuli, such as words, showing the expected U-shaped relationship between valence and arousal. The relationship between affective properties and other dimensions (e.g., between valence and familiarity) is also similar to the relationship observed in words, in the sense that positively valenced emojis are more familiar than negative ones. These findings suggest that emojis are suitable stimuli for studying affective processing. Emoji-SP will be highly valuable for researchers of various fields interested in emojis, including computer science, communication, linguistics, and psychology. The full set of norms is available at: https://osf.io/dtfjv/.
... Upright emoticons, text-based representations of facial expressions, elicit holistic configural processing similar to upright faces (Churches, Nicholls, Thiessen, Kohler, & Keage, 2014) rather than feature-based processing. While the processing of emotion expressions (such as happiness and anger) in faces and emoji shares similar neurophysiology, faces elicited more attention and greater arousal than emoji (Gantiva, Sotaquirá, Araujo, & Cuervo, 2020). ...
... Furthermore, in line with prior findings that emoji expressions share caricatured facial features, which are related to enhanced attentional engagement and greater structural encoding (Gantiva et al., 2020), the P1 and N2 amplitudes in the present study show enhanced visual processing of emoji, possibly indicating a feature-based processing strategy for emoji. ...
Article
The facial expression of pain is a universal social language that transcends cultures and societies. Gaining increasing popularity, emoji are now widely used to express emotional states during digital communication. Yet, it remains unclear whether emoji expressions of pain are processed as effectively as facial expressions. This event-related brain potential (ERP) study investigates whether the perception of pain expressions in emoji shares similarities with faces. Participants judged pain or neutral expressions in emoji and faces in a classic empathy paradigm. A dynamic convergent-divergent-convergent process was found during the observation of pain expressions. While recognition of pain was better for faces than for emoji (in both accuracy and reaction time), we found similar P2 amplitudes during the early stage of processing and similar LPP amplitudes during the later stage. Interestingly, the neural response diverges at the P3 and N2 between faces and emoji: the P3 responds more strongly to pain expressions (relative to neutral) in faces, whereas the N2 responds more strongly to pain expressions (relative to neutral) in emoji. Our findings point to the commonality of perceiving pain in faces and in emoji, as well as the distinct salience of detecting pain in faces.
... However, other recent work, in which emojis were presented consciously in semantically richer contexts such as sentences, shows either that emojis are processed similarly to other types of stimuli, such as words or human faces (e.g., [14]; [15]; [16]), or that their use carries some processing cost relative to other types of stimuli such as words or faces (e.g., [17]; [18]; [19]). This inconsistency of results, which may stem from differences in the type of stimuli used (e.g., emojis depicting facial expressions or other types of emoji vs. words vs. human faces), in the time allowed for processing the presented stimuli (conscious vs. non-conscious processing), and/or in the techniques and experimental paradigms employed (e.g., visual recognition of isolated words, self-paced sentence reading, rapid serial visual presentation of words in sentence context), shows, in any case, that the role emojis play in the emotional processing of written information is not clear in the current scientific literature. In this work, we therefore set out to analyze the extent to which the type of emotion conveyed by sentences can also modulate the type of effect (facilitative vs. inhibitory) observed in that research. ...
... In studies involving the detection of real faces and schematic faces (e.g., , , ), this negativity bias has also been found (e.g., [40]; [41]; [42]), which has been interpreted as a kind of "pop-out" effect reflecting an innate propensity to detect negative facial expressions automatically and pre-attentively. The results obtained here with the negative emojis of sadness and disgust seem consistent with this rationale, and are additionally supported by recent studies showing that emojis of facial expressions and images of real faces elicit the same pattern of neural response (e.g., [15]). ...
... As the first component to respond preferentially to faces, the N170 is involved in categorizing a face as such. Thus, ambiguous stimuli such as Arcimboldo's paintings (see Figure 2), face-like objects, or even emoticons will evoke a large N170 if they are perceived as faces by the observer (Churches et al., 2009, 2014; Caharel et al., 2013; Gantiva et al., 2019) (Figure 8B and C). Beyond this face-detection role, other studies also suggest that the N170 is involved in individual discrimination processes. ...
... The amplitude of the N170 component is larger for emojis than for real faces, suggesting greater engagement of perceptual processes in the former case. Adapted from Gantiva et al. (2019). ...
Thesis
Full-text available
The ventral visual pathway, extending from occipital to anterior temporal regions, is specialized in the visual recognition of the objects and people encountered in daily life. Many functional magnetic resonance imaging studies have investigated the cerebral bases of visual recognition. However, this technique's susceptibility to magnetic artifacts in anterior temporal lobe regions has led to an underestimation of the role of these regions within the ventral pathway. The aim of this thesis is to better understand the mechanisms of visual recognition within the ventral occipito-temporal cortex, and in particular to clarify the contribution of posterior and anterior temporal structures to visual recognition and to its linkage with semantic memory. To this end, we rely on a multimodal approach combining neuropsychology, fast periodic visual stimulation (FPVS), and scalp EEG and intracerebral EEG (SEEG) recordings in neurotypical and epileptic participants. We report five empirical studies in which we demonstrate that (1) patients with anterior temporal epilepsy (i.e., the type of focal epilepsy most frequently investigated with SEEG) show typical performance in individual face discrimination, (2) electrical stimulation of the right anterior fusiform gyrus can cause a transient deficit specific to face recognition, even when no naming is required, (3) discriminating familiar faces among unknown faces engages a large network of bilateral ventral structures including anterior and medial temporal regions, (4) certain structures of the left ventral anterior temporal lobe are involved in integrating a familiar face and its name into a unified representation, and (5) bilateral ventral anterior temporal regions are engaged in implementing semantic representations associated with written words. Overall, our work shows that (1) the visual recognition network is organized along the ventral visual pathway following a progressive hierarchy along the posterior-anterior axis, within which a gradual transition occurs between predominantly perceptual representations and increasingly abstract semantic representations, and (2) the regions involved in visual recognition are strongly lateralized in posterior ventral regions and become bilateral in ventral anterior temporal regions.
... The availability of a huge literary record on sentiment analysis in general (Abu-Salih, Bremie et al. 2019; Chan 2018, 2020; Dhir, Chen, and Nieminen 2015; Dhir, Chen, and Nieminen 2017; Dhir, Khalil et al. 2017; Dhir and Torsheim 2016; Dwivedi and Lewis 2020; Gantiva et al. 2020; Labban and Bizzi 2020; Malik, Dhir, and Nieminen 2016), and its absence in the sports science domain, creates an avenue for future work, serving both as motivation and as a gap. The present study conducts an initial sentiment analysis in the sports science domain. ...
Article
Full-text available
Social news sites have enabled news identification and its promotion through crowdsourcing and provided users with the capabilities to discuss topics. Compared to social media, social news sites are information-rich and possess their own social network. Reddit has been classified as a social news website and possesses similar characteristics to other websites in the domain. These include Slashdot, Digg, and StumbleUpon. This paper provides a broad analysis of spectators’ sentimental (or emotional) behaviour surrounding the 2020 Summer Olympics on Reddit. The study measures the impact of the COVID-19 pandemic on spectator sentiments and proposes a scheme for sensitising Redditors’ behaviour by crowdsourcing the influencers present in the Redditor pool.
... Finally, neutral emojis, like neutral facial expressions, tend to be perceived negatively (Lee et al., 2008). Recent neuroimaging evidence suggests that emojis activate neural mechanisms similar to those of facial expressions (Cao & Zhao, 2018; Gantiva et al., 2019; Kim et al., 2016), which may explain why they elicited similar behavioural effects in our study and others. Future research should directly examine the combined neural and behavioural processing of emojis to confirm this relationship. ...
Article
Many emojis symbolize nonverbal cues that are used during face-to-face communication. Despite their popularity, few studies have examined how emojis influence digital interactions. The present study addresses this gap by measuring the impact of emojis on emotion interpretation, social attributions, and information processing. Participants read messages that are typical of social exchanges in instant text messaging (IM) accompanied by emojis that mimic negative, positive and neutral facial expressions. Sentence valence and emoji valence were paired in a fully crossed design such that verbal and nonverbal messages were either congruent or incongruent. Perceived emotional state of the sender, perceived warmth, and patterns of eye movements that reflect information processing were measured. A negativity effect was observed whereby the sender’s mood was perceived as negative when a negative emoji and/or a negative sentence were presented. Moreover, the presence of a negative emoji intensified the perceived negativity of negative sentences. Adding a positive emoji to a message increased the perceived warmth of the sender. Finally, processing speed and understanding of verbal messages was enhanced by the presence of congruent emojis. Our results therefore support the use of emojis, and in particular positive emojis, to improve communication, express feelings, and make a positive impression during socially-driven digital interactions.
... In the era of the global development of Internet communication, emoji (pictures with various kinds of facial expressions) have come to be widely used in virtual space [20], and smileys (from the English "smile": friendly, positive emoji) in virtual marketing campaigns [21]. ...
... People perceive emoji faces similarly to human emotion faces. For example, Gantiva et al. (2019) found that emoji faces produced similar neural responses to real faces observed during face-to-face communication. In another study, Yuasa et al. (2011) found that emojis and human facial expressions elicited similar brain activity in the right and left inferior frontal gyri. ...
Article
Full-text available
Emoji faces, which are ubiquitous in our everyday communication, are thought to resemble human faces and aid emotional communication. Yet, few studies examine whether emojis are perceived as a particular emotion and whether that perception changes based on rendering differences across electronic platforms. The current paper draws upon emotion theory to evaluate whether emoji faces depict anatomical differences that are proposed to differentiate human depictions of emotion (hereafter, “facial expressions”). We modified the existing Facial Action Coding System (FACS) ( Ekman and Rosenberg, 1997 ) to apply to emoji faces. An equivalent “emoji FACS” rubric allowed us to evaluate two important questions: First, Anatomically, does the same emoji face “look” the same across platforms and versions? Second, Do emoji faces perceived as a particular emotion category resemble the proposed human facial expression for that emotion? To answer these questions, we compared the anatomically based codes for 31 emoji faces across three platforms and two version updates. We then compared those codes to the proposed human facial expression prototype for the emotion perceived within the emoji face. Overall, emoji faces across platforms and versions were not anatomically equivalent. Moreover, the majority of emoji faces did not conform to human facial expressions for an emotion, although the basic anatomical codes were shared among human and emoji faces. Some emotion categories were better predicted by the assortment of anatomical codes than others, with some individual differences among platforms. We discuss theories of emotion that help explain how emoji faces are perceived as an emotion, even when anatomical differences are not always consistent or specific to an emotion.
... By contrast, another study compared photographs of faces and graphic emoticons and found no significant difference (Oleszkiewicz, Frackowiak, et al., 2017). Other studies have compared emotional processing for emoticons and human faces using electrophysiological measures (Gantiva et al., 2020). They found that the N170 neural response, reflecting basic face processing involved in recognition of facial emotion, had larger amplitudes for emoticons than for human faces, and anger generated larger amplitudes than happiness. ...
Article
The development of information and communication technologies has provided a new non-verbal channel for conveying emotions: emoticons. Although a great diversity of emoticons is widely used today in text-based communication, little is known about how emotions are recognized when using emoticons compared to other modes of expression. In a pretest and three studies (N = 1,203), "new" emoticons specifically designed to represent the six basic emotions were presented to participants, who had to recognize the emotions conveyed by each. The quality of recognition was compared to other modes of emotional expression, including facial expressions. Using a between-subjects design, the first two studies revealed that the emotions conveyed by "new" emoticons were recognized more effectively than those conveyed by other modes of expression, including facial expressions. Using a within-subject design, a third study confirmed the more successful recognition of "new" emoticons over other modes of expression, and with greater intensity. Across all studies, this effect was mainly due to the negative emotions of disgust (Studies 1, 2, and 3) and sadness (Studies 2 and 3). These findings suggest the need to use specific emoticons to convey easily recognized basic emotions in communication technologies, and to implement them in social media.
... For example, adding a wink emoji to a message with food emojis that are not associated with sexual euphemisms can lead those same food emojis to be interpreted in a sexual way (Weissman, 2019). These findings make all the more sense in light of a recent study by Gantiva et al. (2020), who found that emoji faces elicited similar neural responses to human faces, suggesting again that emojis can provide paralinguistic information that is typically available in face-to-face, but not in text-based, communication. Emoji interpretation has also been studied without surrounding linguistic context. ...
... For example, the "face inversion" effect (Yin, 1969), which occurs specifically with faces (e.g., Farah & Tanaka, 1995), is observed even with emoticons (Churches et al., 2014). Besides, an event-related potential (ERP) study (Gantiva et al., 2019) suggests the similarity in neural processing between emoticons and real faces as indexed by the P100, N170, and late positive potential (LPP), although the effect of emoticons was smaller than that of real faces (see Weiß et al., 2019). ...
Article
Full-text available
This study investigated cultural differences in the perception of emoticons between Japanese and Dutch participants. We manipulated the eyes and mouth of emoticons independently and asked participants to evaluate the emotion of each emoticon. The results show that Japanese participants tended to focus on the emotion expressed with the eyes while Dutch participants put weight on the shape of the mouth when evaluating emoticons. This tendency is consistent with a previous cross-cultural study comparing people from Japan and the United States (Yuki et al., 2007).
... Research on face recognition and processing has been conducted in areas such as schizophrenia (Quintero et al., 2015), autism (Suárez-Pico, Bonelo-Cuellar & Utria, 2019), frontotemporal dementia (Narambuena, Vaiman & Pereno, 2016), and cortical processing (Gantiva, Sotaquirá, Araujo & Cuervo, 2020), among many others. ...
Article
Full-text available
Phenotypic traits affect responses to emotional faces. In Colombia, no image bank of emotional faces had been built and validated. The objective of this study was to build and validate a bank of photographs of Colombian emotional faces (joy, anger, sadness, fear, disgust, surprise, and neutral/resting expression), using a methodology similar to that used in the design of other banks of emotional expressions. The study was carried out with 80 participants (43 women), each of whom was shown the 70 emotional expressions of the Repository of Colombian Emotional Facial Expressions (REFEC). Following previous studies in this field, the percentage of correct identifications and Fleiss' kappa coefficient (degree of agreement among raters) were taken as validity indices. In addition, reaction time was measured as an index of the cognitive demand of recognizing each emotional expression. The results show a higher percentage of correct identification than other banks of emotional expression stimuli and a good/excellent agreement index (similar to that of other instruments). Taken together, the results suggest that the REFEC is a valid bank of emotional facial expression images for use in the Colombian population.
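As a concrete illustration of the agreement index used in this study, the following minimal Python sketch computes Fleiss' kappa from a matrix of rater counts; the toy ratings are invented for illustration and are not data from the REFEC study.

import numpy as np

def fleiss_kappa(counts):
    # counts[i, j] = number of raters assigning item i to category j
    n = counts.sum(axis=1)[0]                      # raters per item (constant)
    p_j = counts.sum(axis=0) / counts.sum()        # overall category proportions
    P_i = ((counts ** 2).sum(axis=1) - n) / (n * (n - 1))  # per-item agreement
    P_bar, P_e = P_i.mean(), (p_j ** 2).sum()      # observed vs. chance agreement
    return (P_bar - P_e) / (1 - P_e)

# Toy example: 5 expressions rated by 10 raters into 3 emotion categories
ratings = np.array([[8, 1, 1], [2, 7, 1], [0, 1, 9], [10, 0, 0], [3, 3, 4]])
print(f"Fleiss' kappa = {fleiss_kappa(ratings):.3f}")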
... I Still Love You, which was released on February 12, used the drooling emoji ( ) to show that @NetflixID laughed at how the audience focused more on the dress of Lara Jean (the movie's main character). Emoji help @NetflixID in conversation management by helping users adjust tone (Kelly and Watts, 2015), because they can add extra emotion or meaning to communication. Emoji elicit neural responses similar to those of face-to-face communication (Gantiva et al., 2019), which means they can make virtual interaction more alive by recreating the feel of everyday conversation. ...
Article
This descriptive qualitative research aims to analyze the rhetoric of @NetflixID, a well-known brand Twitter account, using Selzer's "rhetorical analysis" approach. Data were collected in January 2020 through documentation and analyzed using text analysis techniques. The analysis found that through authorship and personal pronouns, conversational tone, and humor, we can identify some of the canons and elements of rhetoric of @NetflixID. Some of the canons found (style and memory) even help reveal two of the three elements of rhetoric (pathos and logos). @NetflixID was found to embody a human character through a human voice, using humor, emojis and emoticons, authorship and personal pronouns, informal language style, word choice, anecdote, and non-standard vocabulary in giving information and showing emotions. Furthermore, @NetflixID demonstrated the use of rhetoric in stimulating demand, formulating an audience-focused strategy, and interacting with the audience through spontaneous responses. This research also found the potential use of conversational tone, authorship, and personal pronouns as marketing communication tools for initiating social interactions, generating affective responses from the audience, creating emotional bonds with consumers, and attracting consumers' attention, similar to humor.
... Adding a friendly emoji ( ) to a request significantly increased the usefulness of the responses and the level of intimacy with the participant. Gantiva, Sotaquirá, Araujo, and Cuervo (2020) suggest that the use of emoji faces in computer-mediated communication produces neural responses similar to those observed in face-to-face communication. In this way, emojis could be an important tool for influencing and understanding the emotional insights and basic emotions elicited in a survey (Kaye et al., 2017). ...
Article
Full-text available
Emojis are used in digital communication to express feelings and emotions and are commonly used on social media platforms. Research aiming to interpret the meaning of facial emojis is needed and should consider differences in characteristics such as cultural aspects across countries. This study aimed to assess the meaning of emojis for Brazilians and to create two facial scales that can be used in later research applying emojis in consumer studies and in the sensory analysis of food and beverages. The associations between emotions and emojis were evaluated by 132 participants from Brazil using 39 descriptive terms and 33 facial emojis in a modified check-all-that-apply (CATA) questionnaire. The overall average number of emoji associations for each CATA term varied from 1.1 to 4.3, demonstrating that the associations varied considerably depending on the emotion. It was thus possible to obtain one group of emotions with a strong association with a single emoji (n=15), a second group with a strong association with more than one emoji (n=8), and a third group with weak associations (n=16). The emojis from the first and second groups were used to build a chart of 14 emojis for consumer research with Brazilian consumers, as well as a 5-point facial intensity scale for use in the sensory analysis of food and beverages.
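To make the CATA tallying concrete, the short Python sketch below computes the proportion of participants associating each emoji with each emotion term; the column names and data are invented placeholders, not the study's dataset.

import pandas as pd

# One row per participant x emotion term; one 0/1 column per emoji checked
cata = pd.DataFrame({
    "term":        ["happy", "happy", "angry", "angry"],
    "emoji_smile": [1, 1, 0, 0],
    "emoji_angry": [0, 0, 1, 1],
})

# Proportion of participants who checked each emoji for each term; a
# "strong" association might be defined as a proportion above some cutoff
association = cata.groupby("term").mean()
print(association)
print(association[association > 0.8].stack())  # term-emoji pairs above cutoff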
... The psychological evaluation of users' preferences might be biased by users' surroundings, voluntary participation, and false impressions of their inner states (Ariely and Berns 2010). The electroencephalogram (EEG) has been tentatively used to reveal the neural dynamics underlying the cognitive processing of humanoid robot actions (Saygin et al. 2012; Urgen, Kutas, and Saygin 2018), emotional facial expressions (Chaminade et al. 2010) and motion behaviours, or emoji faces (Gantiva et al. 2019). It is generated by neural activity and shows significant validity in reflecting humans' psychological states with high temporal resolution (Luck 2014). ...
Article
Even though humanoid robots are being applied to diverse areas, the formation of users' preference for the appearance of humanoid robots remains unknown. The present study investigated the neural dynamics underlying preference formation to evaluate users' preference for the appearance of humanoid robots. EEG signals were recorded in a preference categorisation task, and neural dynamics were analysed via event-related potentials and time-frequency analysis. The results showed that in the early stage, the preferred humanoid robot appearances elicited enhanced parieto-occipital N1, frontal P2, and early central and parieto-occipital theta rhythm power compared with the non-preferred appearances. In the later stage, the preferred humanoid robot appearances elicited enhanced scalp-distributed LPP and later central and parieto-occipital theta power compared with the non-preferred appearances. The results suggested that the formation of users' preference for the appearance of humanoid robots involves two distinct stages of neural dynamics. The study provides designers with an objective method for evaluating users' preference for the appearance of humanoid robots.
... In a comparison of visual styles of emotional faces (Kendall et al., 2016), cartoony faces evoked an attenuated P1 when compared with photorealistic and rotoscoped faces. This was attributed to the low-level featural differences between cartoony and photorealistic faces, which were posited as feeding forward into the subsequent N170-an ERP component reflecting the neural processing of faces (Bentin et al., 1996) which is also sensitive to emotional expressions (Blau et al., 2007) and is modulated in comparisons of real faces and emoji faces (Gantiva et al., 2020;Weiß et al., 2020). ...
Article
Full-text available
Compositionality is a primary feature of language, but graphics can also create combinatorial meaning, like with items above faces (e.g., lightbulbs to mean inspiration). We posit that these “upfixes” (i.e., upwards affixes) involve a productive schema enabling both stored and novel face–upfix dyads. In two experiments, participants viewed either conventional (e.g., lightbulb) or unconventional (e.g., clover-leaves) upfixes with faces which either matched (e.g., lightbulb/smile) or mismatched (e.g., lightbulb/frown). In Experiment 1, matching dyads sponsored higher comprehensibility ratings and faster response times, modulated by conventionality. In Experiment 2, event-related brain potentials (ERPs) revealed conventional upfixes, regardless of matching, evoked larger N250s, indicating perceptual expertise, but mismatching and unconventional dyads elicited larger semantic processing costs (N400) than conventional-matching dyads. Yet mismatches evoked a late negativity, suggesting congruent novel dyads remained construable compared with violations. These results support that combinatorial graphics involve a constrained productive schema, similar to the lexicon of language.
... In this context, various studies in social neuroscience have examined people's neural reactions to emoji and real facial expressions. The study conducted by Gantiva et al. (2019) revealed that emoji faces used in computer-mediated communication produced neural responses similar to those in face-to-face communication. A gap nevertheless emerged for happy, neutral, and angry facial expressions, with real expressions found to produce a stronger effect than emoji. ...
Article
Full-text available
Emoji play a significant role in mutual communication. Emoji may carry different meanings for different groups or cultures, although they are considered a language valid for everyone. This study investigated how individuals responded to questions directed to them using emoji, how they described the basic emotions, and what individuals who read these messages understood. The study consisted of two stages. In the first stage, participants were asked to respond to 10 questions, for which answers with concrete and abstract content were sought, first using only emoji and then only text. Moreover, they were asked to express eight basic emotions shown to them using only emoji. The participants also rated their satisfaction with their emoji responses. In the second stage, a different group of participants scored the satisfaction levels when shown the questions. It was found that a message consisting of emoji may be interpreted differently by the receiver, even if it was created with a high satisfaction level.
Article
The increasing growth in the use of mobile news apps has raised questions on how their interface design features affect users’ gaze behaviours and behavioural performance. To tackle these issues, two experiments (visual browse and search tasks) were designed to investigate the impact of interface design features (colour and layout) on users’ gaze behaviours (fixation count, fixation time ratio and first fixation duration) and behavioural performance (task completion time and search accuracy) with a portable eye tracker. Twenty-four participants were recruited to browse and search for news in different mobile news interfaces with Chinese language. The results showed that mobile news apps with white interfaces attracted more attention and participants needed more time to comprehend the information of white interfaces in the visual browse task. Furthermore, participants achieved higher search efficiency in the visual search task by using interfaces with red keyword and LT-RP (Left Text-Right Picture). In addition, it is noticeable that participants paid more attention to the text than the pictures of mobile news interfaces, and most participants first looked at text and later observed pictures. The findings provide valuable and interesting insights for better understanding users’ gaze behaviours and behavioural performance of mobile news apps.
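The three gaze measures named in this abstract are straightforward to derive from a fixation table. The Python sketch below shows one way, using hypothetical column names and data rather than the study's recordings.

import pandas as pd

# Hypothetical fixation log for one trial: area of interest, onset, duration
fixations = pd.DataFrame({
    "aoi":      ["text", "picture", "text", "text", "picture"],
    "onset_ms": [100, 450, 700, 1200, 1600],
    "dur_ms":   [250, 180, 300, 220, 200],
})
trial_dur_ms = 2000  # assumed trial length

for aoi, grp in fixations.groupby("aoi"):
    count = len(grp)                                           # fixation count
    time_ratio = grp["dur_ms"].sum() / trial_dur_ms            # fixation time ratio
    first_dur = grp.sort_values("onset_ms")["dur_ms"].iloc[0]  # first fixation duration
    print(f"{aoi}: count={count}, time ratio={time_ratio:.2f}, first fixation={first_dur} ms")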
... While emotions are expressed linguistically and paralinguistically through non-verbal signs and gestures in face-to-face communication, emotional expressiveness online may include a wide range of visual features. These include, for example, emoji and stickers, through which emotions and feelings can be expressed, making the sentiment of a message more salient (Zhao, 2019; Gantiva et al., 2020). It might be inferred that those who can use this visual language to communicate their emotions online effectively might also acknowledge the benefits of these practices in online counseling, thus feeling more capable of providing online sessions that often involve emotional communication. ...
Article
Full-text available
The outbreak of the COVID-19 pandemic has accelerated the need for online counseling to preserve therapeutic processes that have begun face to face and to provide service to others in need during lockdowns. Previous studies underscored the importance of providing updated training as counselors frequently hesitate to use technological advances in therapeutic sessions. This study aims at reducing such barriers by revealing personal characteristics of future professionals that might inhibit or encourage their openness toward providing online counseling. To this end, this study is focused on several precursors of openness to provide online counseling: preference to communicate emotions online, identification of emotional expressiveness advantages in providing online counseling, innovative behavior, creativity, and future problem-solving thinking skills. The question at focus is which constructs would be found contributive to students’ openness to provide online counseling. The sample included 277 undergraduate students (future counselors) who filled out questionnaires. Data were analyzed using Partial Least Squares Structural Equation Modeling. Our findings pointed to the centrality of students’ preference to communicate their emotions online in explaining their openness to conducting online counseling. This study might help pinpointing the adjustments curriculum designers should address to better reflect the intensive changes within the counseling field that necessitate transferring face-to-face skills to online settings.
... Fane et al. (2018) showed that five graphic emoticons were associated with several emotions when participants freely expressed them. Moreover, the results of Gantiva et al. (2020) led to 24 of 36 graphic emoticons being retained with a recognition rate greater than 80%, eight each expressing happiness, anger, and a neutral emotion. In another study (Weiß et al., 2020), each of the 13 graphic emoticons, selected among 78, had to be evaluated with 18 emotion terms such as amused, awed, fearful, etc. Results showed that age had an effect on the evaluation of some emoticons: the disgust emoticon ( ) was evaluated as "frustration" by older adults and the fearful emoticon ( ) was evaluated as "surprise" by younger adults. ...
Article
The growing interest in emoticons and emojis has recently led to research examining their use and impact on various behaviors. As emoticons and emojis may lead to misinterpretations and misunderstandings between senders and recipients in online communication, it is necessary to examine whether the emotions conveyed by these symbols are well recognized by individuals. From this perspective, a systematic review from 2001 to 2021 using the Preferred Reporting of Items for Systematic reviews and Meta-Analyses (PRISMA) method was conducted to determine which emoticons and emojis can help individuals recognize emotions, and how the recognition of emotions based on emoticons and emojis is studied. A total of 501 articles were screened from three major databases in psychology, and 23 articles met the predefined inclusion criteria. The results suggest that the recognition of emotions should be examined before using emoticons in larger studies. They also revealed that recognition varied according to the methods used to assess the valence of emoticons or to attribute a specific emotion to them (self-report, free expression, or categorization). Finally, a summary table of the emotions conveyed by emoticons and emojis is proposed in this review.
... The LPP was averaged in three regions: posterior (PO4, POz, PO3, P3, P4, and Pz), central (Cz, CPz, C4, C3, CP4, and CP3), and anterior (Fz, AFz, FC3, F3, FC4, and F4). Noisy epochs were detected and rejected when one or more channels exceeded a voltage threshold of ±75 µV [32, 46]. A total of 11.27% of the epochs were removed. ...
Article
Full-text available
Bullying is a subtype of violence that leads to maladaptive behaviors and emotional responses, with implications for social competence, emotions, and empathy. The present study compared the time course of emotional processing in children who were involved in the dynamics of bullying (i.e., as victims, bullies, and observers) by evaluating event-related potentials [early posterior negativity and late positive potential (LPP)] in different brain regions during a passive visualization task that involved positive, neutral, and negative social pictures. High-density electroencephalograms were recorded in 45 children, 8–12 years old (M = 9.5 years, SD = 1.3), while they observed emotional and neutral social pictures that we selected from the International Affective Picture System. Late positive potential had higher amplitudes in the victim group, especially in posterior and anterior regions. In the central region, LPP was greater toward neutral social pictures in bullying victims. The greater amplitude of LPP in victims was observed during and after the stimulus. The results showed a consistent response with a higher intensity in response to emotional stimuli in the victim group, suggesting a tendency toward hypervigilance that could interfere with emotional regulation.
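The rejection criterion and region averaging described in the excerpt above can be expressed compactly with MNE-Python. The sketch below is a minimal illustration under an assumed file name and an assumed LPP window, not the authors' code; note that MNE's built-in reject dictionary uses peak-to-peak amplitudes, so the absolute ±75 µV threshold is applied here with an explicit mask.

import mne

epochs = mne.read_epochs("social_pictures-epo.fif")  # hypothetical file

# Drop any epoch in which one or more channels exceed ±75 µV
data = epochs.get_data()                      # (n_epochs, n_channels, n_times)
keep = (abs(data) <= 75e-6).all(axis=(1, 2))
epochs = epochs[keep]
print(f"{100 * (1 - keep.mean()):.2f}% of epochs removed")

# Average the LPP over the three electrode regions
regions = {
    "posterior": ["PO4", "POz", "PO3", "P3", "P4", "Pz"],
    "central":   ["Cz", "CPz", "C4", "C3", "CP4", "CP3"],
    "anterior":  ["Fz", "AFz", "FC3", "F3", "FC4", "F4"],
}
evoked = epochs.average()
for name, chans in regions.items():
    roi = evoked.copy().pick(chans).crop(0.4, 0.8)  # assumed LPP window
    print(name, f"{roi.data.mean() * 1e6:.2f} µV")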
... Emojis are an example of a cue that can help users convey and interpret information that may be otherwise difficult to express using purely text-based communication (Bai et al., 2019). In fact, recent EEG research has found preliminary evidence that emojis displaying facial expressions may be able to elicit similar neural responses as their corresponding "real-life" facial expressions (Gantiva et al., 2019). ...
Article
Full-text available
Emerging studies suggest that emojis can make important contributions to the emotional content and meaning of digital messages. Yet researchers currently lack adequate tools to incorporate emojis into their analyses. To address this gap, we used over 3 million Twitter posts from a period of 17 months and emotion ratings provided by 2,230 human raters to develop and validate the Multidimensional Lexicon of Emojis (MLE). This new lexicon consists of 359 common emojis rated on ten emotion and sentiment dimensions. The MLE is an open-access tool that holds promise for those interested in conducting a comprehensive analysis of the emotional content of digital communication that incorporates emojis and goes beyond the dimensions of negativity and positivity.
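To make the intended use of such a lexicon concrete, the Python sketch below scores a message by averaging lexicon ratings over the emojis it contains. The dimension names and values here are invented placeholders; the actual MLE norms are distributed with the article.

# Invented placeholder entries; the real MLE has 359 emojis on ten dimensions
emoji_lexicon = {
    "😀": {"joy": 0.90, "negativity": 0.05},
    "😡": {"joy": 0.03, "negativity": 0.85},
}

def score_message(text):
    # Average each rated dimension over the emojis found in the text
    hits = [emoji_lexicon[ch] for ch in text if ch in emoji_lexicon]
    if not hits:
        return {}
    return {dim: sum(h[dim] for h in hits) / len(hits) for dim in hits[0]}

print(score_message("Great match today 😀😀"))  # {'joy': 0.9, 'negativity': 0.05}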
... In the era of Web 2.0, emojis are widely used in computer-mediated communication (CMC) as a substitute for real facial expressions in virtual environments (Hof, 2016; Marengo et al., 2017; Walther and D'Addario, 2001; Wang et al., 2014), and emojis are prevalent in marketing campaigns (Gantiva et al., 2019; Leung and Chan, 2017). Thus, it is necessary to determine whether consumers process emojis as subliminal stimuli, as they do human facial expressions; this study therefore addresses the following research question: ...
Article
Full-text available
This study aims to understand how hotel videos embedded with a smiling face emoji as a subliminal message affect consumers' selection of hotels, with their brain activities measured and collected while they watched the videos. Data were collected from sixteen participants who completed two rounds of experiments. A chi-square test of homogeneity, paired-sample t-test, and Bayes factor were used to address the two proposed research questions. The results reveal that participants' selection of hotels was significantly affected by the subliminal stimulus of a smiling face emoji. Meanwhile, the neuroscientific data identify significant differences in two (theta and beta) of the five brainwave bands while participants were viewing hotel videos with and without the subliminal message. Suggestions for future studies and practical operations are also discussed.
... They are graphic symbols with predefined names and codes (Unicode), which can represent not only facial expressions, abstract concepts, and emotions/feelings, but also animals, plants, activities, gestures/body parts, and objects (Rodrigues et al., 2017). Eliciting neural responses similar to those of face-to-face communication (Gantiva et al., 2019), emoji can add extra emotional or contextual meaning to communication, enhance the attractiveness of a message to receivers (Cramer et al., 2016), help users with tone adjustment and conversation management, and play a role in managing and maintaining interpersonal relationships (Kelly and Watts, 2015; Chairunnisa and Benedictus, 2017; Albawardi, 2018). On a social level, emoji, as a visual language, make it easier for non-English-speaking nations to use English-dominated social media such as Twitter, Instagram and Facebook (Boothe and Wickstrom, 2017). ...
Article
Full-text available
A growing body of research explores emoji, which are visual symbols in computer-mediated communication (CMC). In the 20 years since the first set of emoji was released, research on them has been increasing, albeit in a variety of directions. In this review article, we provide a systematic review of the extant body of work on emoji: how they have developed, how they are used differently, what functions they serve, and what research has been conducted on them in different domains. Furthermore, we summarize directions for future research on this topic.
Article
Emoji are fast becoming an integral part of everyday communication, yet little is understood about the extent to which they are processed emotionally. Previous research shows a processing advantage for emotionally valenced words over neutral ones; therefore, if emoji are indeed emotional, one could expect an equivalent processing advantage. In the Pilot Study, participants (N = 44) completed a lexical decision task to explore the accuracy and response latency of word, face, and emoji stimuli. These stimuli varied in emotional valence (positive vs. neutral). Main effects were found for stimulus type and valence on both accuracy and latency, although the interaction for accuracy was not significant. That is, there were processing advantages for positively valenced stimuli over neutral ones across all stimulus types. Also, faces and emoji were processed significantly more quickly than words, and latencies for face and emoji stimuli, irrespective of valence, were largely equivalent. The Main Study recruited 33 participants to undertake a modified and extended version of the lexical decision task, which included three valence conditions (positive, negative, and neutral) per stimulus type. Although no main effects were found for accuracy, there was a significant main effect of stimulus type, but not of valence, on latency: word stimuli, irrespective of valence, were processed significantly more slowly than face or emoji stimuli. There was no significant interaction between stimulus type and valence, however. Overall, although there was partial support for a processing advantage for emoji stimuli, this was not replicated across the studies reported here, suggesting additional work is needed to corroborate the evidence.
It is well known that teachers’ feedback plays an important role in students’ learning, as it enhances learners’ cognitive development; yet there has been little research on how positive feedback given in the form of emojis works in computer-assisted language learning (CALL) courses. In this study, an experiment was designed to clarify how English as a Foreign Language (EFL) learners’ emotions are affected when such feedback is presented (no feedback vs. supraliminal feedback vs. subliminal feedback) and to examine how the feedback correlates with their learning effectiveness. A within-subject experiment with 33 participants was designed and administered to examine three proposed research hypotheses. Participants’ frontal asymmetry alpha (FAA) and their recollection of the learning contents were used as the index of their emotional valence and learning effectiveness. The results revealed that positive feedback given in the form of emojis generated a positive/approach emotion when it was shown subliminally. Furthermore, a significant relationship was seen between EFL learners’ positive emotion and their learning effectiveness; such a finding was supported by the significant relationship between the modes of emoji presentation and the participants’ learning effectiveness. This study’s originality and value lies in the innovative research method that was adopted and the interesting findings that it yielded. The limitations of this study such as research design and sampling are reported. The study also has practical and theoretical implications for practitioners and scholars of CALL for praxis and future research.
Article
The objective of the present study was to evaluate emotional responses to emoji faces through physiological and self-report measures, and evaluate possible differences between men and women. One hundred participants (50 women) observed pictures of happy, neutral, and angry emoji faces, while activity of the zygomatic and corrugator muscles, skin conductance, and heart rate were measured. Self-report measures of emotional experience were also recorded. The results showed an increase in zygomatic muscle activity toward happy emoji faces. An increasing trend in corrugator muscle activity toward angry emoji faces was observed; however, this trend was only marginally significant. Happy emoji faces generated an increase in the skin conductance response. The emotional experience of the participants was also consistent with the emotions that were expressed by the emoji faces. No differences were found between sexes. Overall, the results suggest that emoji faces can especially induce pleasant affective states.
Article
Emojis play an increasingly important role as nonverbal symbols in online communication, and this study investigates their effect on the processing of subsequent words. We suppose that emojis, as an aid to emotional expression and semantic comprehension, prime the cognitive processing of the textual stimuli that follow. An affective priming experiment using ERPs was conducted, in which the semantic relationship between the emoji and the word (congruent, incongruent, non-related) was manipulated to assess its impact in an affective word categorization task. Behavioral results revealed no significant differences, while electrophysiological data showed otherwise. The N1 component elicited by semantically congruent and incongruent emoji-word pairs showed greater amplitudes than that elicited by non-related pairs. The N400 elicited by incongruent pairs showed greater amplitudes than the N400 elicited by congruent pairs. These data indicate that while emojis do not influence the outcomes of linguistic processing, emoji primes enhance attention, and semantically congruent content demands fewer cognitive resources and is processed more easily than incongruent content. The findings point to emojis' role not only as affective primes, but also as paralanguage that is semantically processed. Implications for computer-mediated communication are discussed.
Article
Full-text available
Background: In this study we measured the affective appraisal of sounds and video clips using a newly developed graphical self-report tool: the EmojiGrid. The EmojiGrid is a square grid, labeled with emoji that express different degrees of valence and arousal. Users rate the valence and arousal of a given stimulus by simply clicking on the grid. Methods: In Experiment I, observers (N=150, 74 males, mean age=25.2±3.5) used the EmojiGrid to rate their affective appraisal of 77 validated sound clips from nine different semantic categories, covering a large area of the affective space. In Experiment II, observers (N=60, 32 males, mean age=24.5±3.3) used the EmojiGrid to rate their affective appraisal of 50 validated film fragments varying in positive and negative affect (20 positive, 20 negative, 10 neutral). Results: The results of this study show that for both sound and video, the agreement between the mean ratings obtained with the EmojiGrid and those obtained with an alternative and validated affective rating tool in previous studies in the literature, is excellent for valence and good for arousal. Our results also show the typical universal U-shaped relation between mean valence and arousal that is commonly observed for affective sensory stimuli, both for sound and video. Conclusions: We conclude that the EmojiGrid can be used as an affective self-report tool for the assessment of sound and video-evoked emotions.
Thesis
Emoticons are often used in digital environments to convey emotions. Although a wide variety of emoticons exists, little is known about how they convey emotions compared with other modes of expression, and few studies have examined their use in a school context. In this thesis, four studies (N = 291) were carried out to design "new" emoticons that unambiguously represent the six basic emotions, and three studies (N = 957) compared the quality of emotion recognition from these "new" emoticons with other modes of expression, notably facial expressions. A final study examined how these emoticons are used on a web radio by middle-school students (N = 204). The results showed that the "new" emoticons convey emotions more effectively and more intensely than facial expressions and the Facebook and iOS emoticons. This better recognition is mainly due to the negative emotions of disgust and sadness. Implementing these "new" emoticons on the Wikiradio© Saooti made it possible to study their use in a school context. The results showed that, regardless of the students' gender, the emoticon conveying joy was preferred for expressing emotions about broadcasts produced by peers. These results suggest the need to design specific emoticons to convey emotions unambiguously in digital environments, and to study their effects on behavior.
Article
A cursor control system based on a brain-computer interface (BCI) provides efficient computer access and operates without any muscular activity from the user. Conventional BCI-based cursor control systems have several limitations, which a hybrid SSVEP + P300 visual BCI (VBCI) can overcome. This paper explores the feasibility of using a noninvasive hybrid SSVEP + P300 VBCI for cursor control as a universal form of computer access. The proposed cursor control system has a graphical user interface (GUI) that simultaneously evokes both SSVEP and P300 signals in the human cortex. Its performance metrics are compared with those of conventional SSVEP VBCI and P300 VBCI cursor control systems. The proposed hybrid SSVEP + P300 system achieves a maximum accuracy of 97.51% with a 27.15 bit/min information transfer rate (ITR), outperforming the other systems. It was also tested in a noisy environment and found suitable for real-world applications.
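The abstract reports accuracy alongside an ITR in bit/min. In the BCI literature, ITR is commonly computed with Wolpaw's formula; the sketch below assumes that standard definition. The number of targets and selection time used in the example are illustrative assumptions, since the paper's exact parameters are not given in the abstract.

```python
import math

def wolpaw_itr(accuracy: float, n_classes: int, trial_secs: float) -> float:
    """Information transfer rate in bit/min using Wolpaw's formula:
    bits/selection = log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1)).
    """
    p, n = accuracy, n_classes
    bits = math.log2(n)
    if 0 < p < 1:  # the entropy terms vanish at p == 1
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * (60.0 / trial_secs)

# Example with assumed parameters (not from the paper): 4 cursor directions,
# 97.51% accuracy, one selection every 4 seconds.
print(f"{wolpaw_itr(0.9751, 4, 4.0):.2f} bit/min")  # ~26.9 bit/min
```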
Article
There is evidence for an age-related decline in the ability to understand non-literal language such as sarcasm. There is also evidence to suggest that devices such as emoticons/emojis may influence sarcasm comprehension in younger adults. However, research examining whether such devices may improve written sarcasm comprehension in older adults is scarce. The present study used an online rating task to investigate the influence of the winking face emoji on both the interpretation and perception of message intent for sarcastic or literal criticism or praise. Results revealed that older adults, in comparison to their younger counterparts, demonstrated deficient ability in interpreting and perceiving sarcastic intent. However, older adults’ interpretation and perception of sarcastic intent were significantly improved when the messages were accompanied by the winking face emoji. This would suggest that the winking face emoji is a clear indicator of sarcastic intent, compensating for the absence of non-verbal cues in written communication, and may play a useful role in successful intergenerational communication.
Article
Facial emojis can express a variety of positive and negative emotions and are commonly used in digital, written communication. However, little is known about how emojis impact text processing and how different emoji-text combinations convey a sender's mental state. In this study, we investigated how facial emojis with positive valence (happy emojis) and facial emojis with negative valence (upset emojis) embedded in emotionally ambiguous/neutral text affect the perceived mental state of the sender, using ratings (Experiment 1), and the processing of the text messages, using event-related potentials (Experiment 2). We predicted that (1) the same text message with happy versus upset emojis would convey different sender mental states, and (2) emoji valence would affect the processing of subsequent text in valence-specific ways. Experiment 1 showed that while texts with upset emojis convey specific sender mental states, texts with happy emojis convey positive emotion more generally, with no further differentiation between emojis. In the ERPs (Experiment 2), emojis affected subsequent text processing at the N400, and emoji valence affected processing downstream at the second word. We conclude that all facial emojis impact text processing, but happy and upset emojis carry differential social-emotional salience and impact text processing differently as content becomes available.
Article
To identify the types of emoji most effective for inducing empathy, the nonverbal expression factors of emojis that generate differences in empathy were categorized into body language type (the presence of movement and contextual information), emotion type (joy and sadness), and degree of bodily expression (upper body and whole body). After dividing the data into joyful and sad emotion groups, differences in empathy according to body language type and degree of bodily expression were examined. In the sad emotion group, empathy was higher for the movement type and for the type combining movement and contextual information than for the static body language type and the contextual-information-only type without movement. However, the difference in empathy according to degree of bodily expression, and the interaction between body language type and degree of bodily expression, were not significant. In the joyful emotion group, neither the main effect nor the interaction was significant. These results indicate that the emoji types most effective for inducing empathy are the upper body of the movement type and the upper body combined with movement and contextual information. These types are also expected to work more effectively when applied to emotions with low mirroring and emotion recognition rates, such as sadness.
Article
The prevalence of a negative body image among women is high. Because of its serious consequences for individuals' mental health, there is an urgent need to improve current body image interventions. Recent studies using evaluative conditioning to strengthen the association between women's bodies and positive (social) stimuli have shown promising results. In two experimental studies, we tested whether incorporating more age-appropriate positive social stimuli as unconditioned stimuli (USs) can strengthen the conditioning procedure as a means to enhance women's body satisfaction. In the experimental condition, participants' body pictures were systematically followed by the Facebook like-button and youthful smiling faces (study 1, experimental condition: n = 68; control condition: n = 67) or positive emojis (study 2, experimental condition: n = 64; control condition: n = 67). The results indicated that neither conditioning procedure enhanced participants' body satisfaction more than a control procedure, and in both studies there was no valence transfer from the positive USs to the body pictures. Thus, incorporating age-appropriate USs did not produce the anticipated conditioning effects. These findings challenge the utility of current evaluative conditioning procedures as an intervention technique for addressing negative body image.
Article
Full-text available
The discrimination of emotions expressed facially is important for social relationships, empathy, and social interaction. The aim of this study was to determine whether there are differences in cortical processing of two basic emotions, anger and fear, and to establish whether the perception of intense anger produces greater modulation of the N170 component in amplitude and latency compared with images of faces expressing intense fear. Event-related potentials were recorded with a 32-channel montage. Significant latency differences were found for images of faces expressing intense anger compared with images of faces expressing intense fear. Differences in both amplitude and latency were found for images of intensely angry and fearful faces compared with images of neutral faces.
Article
Full-text available
The human face conveys emotional and social information, but it is not well understood how these two aspects influence face perception. To model a group situation, two faces displaying happy, neutral, or angry expressions were presented. Importantly, the faces were either facing the observer or shown in profile view, directed towards or looking away from each other. In Experiment 1 (N = 64), face pairs were rated for perceived relevance, wish to interact, and displayed interactivity, as well as valence and arousal. All variables revealed main effects of facial expression (emotional > neutral) and face orientation (facing observer > towards > away), and interactions showed that the evaluation of emotional faces varies strongly with their orientation. Experiment 2 (N = 33) examined the temporal dynamics of perceptual-attentional processing of these face constellations with event-related potentials. Processing of emotional and neutral faces differed significantly in N170 amplitudes, the early posterior negativity (EPN), and sustained positive potentials. Importantly, selective emotional face processing varied as a function of face orientation, indicating early emotion-specific (N170, EPN) and late threat-specific effects (LPP, sustained positivity). Taken together, perceived personal relevance to the observer, conveyed by facial expression and face direction, amplifies emotional face processing within triadic group situations.
Article
Full-text available
Emoticons are graphic representations of facial expressions that many e-mail users embed in their messages. These symbols are widely known and commonly recognized among computer-mediated communication (CMC) users, and they are described by most observers as substituting for the nonverbal cues that are missing from CMC in comparison to face-to-face communication. Their empirical impacts, however, are undocumented. An experiment sought to determine the effects of three common emoticons on message interpretations. Hypotheses drawn from literature on nonverbal communication reflect several plausible relationships between emoticons and verbal messages. The results indicate that emoticons' contributions were outweighed by verbal content, but a negativity effect appeared such that any negative message aspect, verbal or graphic, shifts message interpretation in the direction of the negative element.
Article
Full-text available
The current study examined the time course of implicit processing of distinct facial features and the associated event-related potential (ERP) components. To this end, we used a masked priming paradigm to investigate implicit processing of the eyes and mouth in upright and inverted faces, using a prime duration of 33 ms. Two types of prime-target pairs were used: (1) congruent (e.g., open eyes only in both prime and target, or open mouth only in both prime and target); (2) incongruent (e.g., open mouth only in prime and open eyes only in target, or vice versa). The identity of the faces changed between prime and target. Participants pressed one button when the target face had the eyes open and another when it had the mouth open. The behavioral results showed faster RTs for the eyes in upright faces than for the eyes in inverted faces and for the mouth in upright or inverted faces. They also revealed a congruent priming effect for the mouth in upright faces. The ERP findings showed a face orientation effect across all ERP components studied (P1, N1, N170, P2, N2, P3) starting at about 80 ms, and a congruency/priming effect on late components (P2, N2, P3) starting at about 150 ms. Crucially, the results showed that the orientation effect was driven by the eye region (N170, P2) and that the congruency effect started earlier (P2) for the eyes than for the mouth (N2). These findings mark the time course of the processing of internal facial features and provide further evidence that the eyes are automatically processed and are very salient facial features that strongly affect the amplitude, latency, and distribution of neural responses to faces.
Article
Full-text available
There is a new generation of emoticons, called emojis, increasingly used in mobile communications and social media. In the last two years, over ten billion emojis were used on Twitter. Emojis are Unicode graphic symbols, used as a shorthand to express concepts and ideas. In contrast to the small number of well-known emoticons that carry clear emotional content, there are hundreds of emojis. But what is their emotional content? We provide the first emoji sentiment lexicon, called Emoji Sentiment Ranking, and draw a sentiment map of the 751 most frequently used emojis. The sentiment of each emoji is computed from the sentiment of the tweets in which it occurs. We engaged 83 human annotators to label over 1.6 million tweets in 13 European languages by sentiment polarity (negative, neutral, or positive). About 4% of the annotated tweets contain emojis. The sentiment analysis of emojis yields several interesting conclusions. Most emojis are positive, especially the most popular ones. The sentiment distributions of tweets with and without emojis differ significantly. The inter-annotator agreement on tweets with emojis is higher. Emojis tend to occur at the end of tweets, and their sentiment polarity increases with distance from the beginning. We observe no significant differences in emoji rankings between the 13 languages, and propose our Emoji Sentiment Ranking as a European language-independent resource for automated sentiment analysis. Finally, the paper provides a formalization of sentiment and a novel visualization in the form of a sentiment bar.
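The core aggregation idea, scoring each emoji from the sentiment labels of the tweets it occurs in, can be sketched in a few lines. The following is a simplified illustration assuming polarity labels of -1, 0, and +1 and a plain mean; the paper's actual lexicon is built from fuller sentiment distributions with confidence estimates.

```python
from collections import defaultdict

# Minimal sketch of the aggregation described in the abstract: score each
# emoji by the mean polarity of the labeled tweets it occurs in. Function
# and variable names are illustrative, not from the paper.

def emoji_sentiment_scores(labeled_tweets: list[tuple[str, int]],
                           emoji_set: set[str]) -> dict[str, float]:
    totals: dict[str, list[int]] = defaultdict(lambda: [0, 0])  # [sum, count]
    for text, polarity in labeled_tweets:  # polarity in {-1, 0, +1}
        for ch in set(text):  # count each emoji once per tweet
            if ch in emoji_set:
                totals[ch][0] += polarity
                totals[ch][1] += 1
    return {e: s / n for e, (s, n) in totals.items()}

tweets = [("great day 😂", 1), ("so tired 😂", 0), ("love this 😍", 1)]
print(emoji_sentiment_scores(tweets, {"😂", "😍"}))  # {'😂': 0.5, '😍': 1.0}
```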
Article
Full-text available
It is now common practice, in digital communication, to use the character combination ":-)", known as an emoticon, to indicate a smiling face. Although emoticons are readily interpreted as smiling faces, it is unclear whether they trigger face-specific mechanisms or whether separate systems are utilized. A hallmark of face perception is the engagement of regions in the occipitotemporal cortex that are sensitive to configural processing. We recorded the N170 event-related potential to investigate the way in which emoticons are perceived. Inverting faces produces a larger and later N170, while inverting objects that are perceived featurally rather than configurally reduces the amplitude of the N170. We presented 20 participants with images of upright and inverted faces, emoticons, and meaningless strings of characters. Emoticons showed a large-amplitude N170 when upright and a decrease in amplitude when inverted, the opposite pattern to that shown by faces. This indicates that upright emoticons are processed at occipitotemporal sites similarly to faces, owing to their familiar configuration. However, the characters that indicate the physiognomic features of emoticons are not recognized by the more laterally placed facial feature detection systems used in processing inverted faces.
Article
Full-text available
The notion of a neuron that responds selectively to the image of a particular complex object has been controversial ever since Gross and his colleagues reported neurons in the temporal cortex of monkeys that were selective for the sight of a monkey's hand (Gross, Rocha-Miranda, & Bender, 1972). Since that time, evidence has mounted for neurons in the temporal lobe that respond selectively to faces. The present paper presents a critical analysis of the evidence for face neurons and discusses the implications of these neurons for models of object recognition. The paper also presents some possible reasons for the evolution of face neurons and suggests some analogies with the development of language in humans.
Article
Full-text available
Emoticons seem to enrich computer-mediated communication by improving enjoyment, perceived richness and usefulness of information (Huang, Yen, & Zhang, 2008). Despite their extensive use over the last decades, the way emoticons affect subsequent emotional/cognitive processing is not fully understood. Here we conducted a masked priming experiment that explored the time course of the masked affective priming effect while recording event-related potentials. Type of prime (emoticon vs. word) and prime valence (positive vs. negative) were manipulated to assess their influence on the processing of positive/negative target words. Results showed a masked affective priming effect in early (N2) and late temporal windows (LPC). This effect was observed in early components for negative target words and in later components for positive target words. Furthermore, it was restricted to those targets preceded by emoticon primes. Thus, the processing of emoticons seems to be privileged when compared with the words to which they refer.
Article
Full-text available
Event-related potentials (ERPs) associated with face perception were recorded with scalp electrodes from normal volunteers. Subjects performed a visual target detection task in which they mentally counted the number of occurrences of pictorial stimuli from a designated category such as butterflies. In separate experiments, target stimuli were embedded within a series of other stimuli including unfamiliar human faces and isolated face components, inverted faces, distorted faces, animal faces, and other nonface stimuli. Human faces evoked a negative potential at 172 msec (N170), which was absent from the ERPs elicited by other animate and inanimate nonface stimuli. N170 was largest over the posterior temporal scalp and was larger over the right than the left hemisphere. N170 was delayed when faces were presented upside-down, but its amplitude did not change. When presented in isolation, eyes elicited an N170 that was significantly larger than that elicited by whole faces, while noses and lips elicited small negative ERPs about 50 msec later than N170. Distorted human faces, in which the locations of inner face components were altered, elicited an N170 similar in amplitude to that elicited by normal faces. However, faces of animals, human hands, cars, and items of furniture did not evoke N170. N170 may reflect the operation of a neural mechanism tuned to detect (as opposed to identify) human faces, similar to the "structural encoder" suggested by Bruce and Young (1986). A similar function has been proposed for the face-selective N200 ERP recorded from the middle fusiform and posterior inferior temporal gyri using subdural electrodes in humans (Allison, McCarthy, Nobre, Puce, & Belger, 1994c). However, the differential sensitivity of N170 to eyes in isolation suggests that N170 may reflect the activation of an eye-sensitive region of cortex. The voltage distribution of N170 over the scalp is consistent with a neural generator located in the occipitotemporal sulcus lateral to the fusiform/inferior temporal region that generates N200.
Article
Full-text available
Previous studies that have found attentional capture effects for stimuli of motivational significance do not directly measure initial attentional deployment, leaving it unclear to what extent these items produce attentional capture. Visual prior entry, as measured by temporal order judgments (TOJs), rests on the premise that allocated attention accelerates perception, that is, stimuli that receive attention first are perceived first; thus, this method is a sensitive and direct measure of the time course of initial attentional deployment. The authors demonstrate, using a novel TOJ paradigm without cues, that displays of faces and facial threat show visual prior entry effects, that is, these stimuli are prioritized by the perceptual-attentional system over other stimuli competing for awareness. This study provides direct evidence of the extent to which motivationally significant stimuli capture attention over other concurrently displayed items in the visual array.
Article
Full-text available
A set of face stimuli called the NimStim Set of Facial Expressions is described. The goal in creating this set was to provide facial expressions that untrained individuals, characteristic of research participants, would recognize. This set is large in number, multiracial, and available to the scientific community online. The results of psychometric evaluations of these stimuli are presented. The results lend empirical support for the validity and reliability of this set of facial expressions as determined by accurate identification of expressions and high intra-participant agreement across two testing sessions, respectively.
Article
Full-text available
Emotional facial expressions provide critical information for social interactions. Above all, angry faces are assumed to reflect potential social threat. We investigated event-related potentials (ERPs) triggered by natural and artificial faces expressing fear, anger, happiness or no emotion in participants with low and high levels of social anxiety. Overall, artificial faces elicited stronger P100 and N170 responses than natural faces. Additionally, the N170 component was larger for emotional compared to neutral facial expressions. Social anxiety was associated with an enhanced emotional modulation of the early posterior negativity (EPN) in response to fearful and angry facial expressions. Additionally, while the late positive potential (LPP) was larger for emotional than for neutral faces in low socially anxious participants, LPPs of higher socially anxious participants did not differ. LPPs might therefore be enhanced in higher socially anxious participants for both emotional and neutral faces. Furthermore, the modulations of the EPN and LPP were comparable between natural and artificial faces. These results indicate that social anxiety influences early perceptual processing of faces and that artificial faces are suitable for psychophysiological emotion research.
Article
Full-text available
We have developed a toolbox and graphic user interface, EEGLAB, running under the cross-platform MATLAB environment (The Mathworks, Inc.) for processing collections of single-trial and/or averaged EEG data of any number of channels. Available functions include EEG data, channel and event information importing, data visualization (scrolling, scalp map and dipole model plotting, plus multi-trial ERP-image plots), preprocessing (including artifact rejection, filtering, epoch selection, and averaging), independent component analysis (ICA) and time/frequency decompositions including channel and component cross-coherence supported by bootstrap statistical methods based on data resampling. EEGLAB functions are organized into three layers. Top-layer functions allow users to interact with the data through the graphic interface without needing to use MATLAB syntax. Menu options allow users to tune the behavior of EEGLAB to available memory. Middle-layer functions allow users to customize data processing using command history and interactive 'pop' functions. Experienced MATLAB users can use EEGLAB data structures and stand-alone signal processing functions to write custom and/or batch analysis scripts. Extensive function help and tutorial information are included. A 'plug-in' facility allows easy incorporation of new EEG modules into the main menu. EEGLAB is freely available (http://www.sccn.ucsd.edu/eeglab/) under the GNU public license for noncommercial use and open source development, together with sample data, user tutorial and extensive documentation.
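EEGLAB itself is a MATLAB toolbox, so the snippet below is not its API; it is a language-neutral NumPy sketch of the central epoch-extraction and averaging step that such ERP pipelines perform. Array shapes, function names, and parameters are illustrative assumptions.

```python
import numpy as np

def average_erp(eeg: np.ndarray, event_samples: np.ndarray,
                sfreq: float, tmin: float, tmax: float) -> np.ndarray:
    """Average event-locked epochs.

    eeg           : (n_channels, n_samples) continuous recording
    event_samples : sample indices of stimulus onsets
    tmin, tmax    : epoch window in seconds relative to each onset
    returns       : (n_channels, n_epoch_samples) averaged ERP
    """
    start = int(round(tmin * sfreq))
    stop = int(round(tmax * sfreq))
    # Keep only epochs that fall entirely inside the recording.
    epochs = [eeg[:, s + start: s + stop] for s in event_samples
              if 0 <= s + start and s + stop <= eeg.shape[1]]
    return np.mean(epochs, axis=0)

# Example: 32 channels, 10 s of data at 500 Hz, events every second.
rng = np.random.default_rng(0)
data = rng.standard_normal((32, 5000))
events = np.arange(500, 4500, 500)
erp = average_erp(data, events, sfreq=500.0, tmin=-0.1, tmax=0.5)
print(erp.shape)  # (32, 300)
```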
Article
Full-text available
Threatening, friendly, and neutral faces were presented to test the hypothesis of the facilitated perceptual processing of threatening faces. Dense sensor event-related brain potentials were measured while subjects viewed facial stimuli. Subjects had no explicit task for emotional categorization of the faces. Assessing early perceptual stimulus processing, threatening faces elicited an early posterior negativity compared with nonthreatening neutral or friendly expressions. Moreover, at later stages of stimulus processing, facial threat also elicited augmented late positive potentials relative to the other facial expressions, indicating the more elaborate perceptual analysis of these stimuli. Taken together, these data demonstrate the facilitated perceptual processing of threatening faces. Results are discussed within the context of an evolved module of fear (A. Ohman & S. Mineka, 2001).
Article
Full-text available
According to current ERP literature, face-specific activity is reflected by a negative component over the inferior occipito-temporal cortex between 140 and 180 ms after stimulus onset (N170). A recently published study (Liu et al., 2002) using magnetoencephalography (MEG) clearly indicated that a face-selective component can be observed at 100 ms (M100), which is about 70 ms earlier than reported in most previous studies. Here we report early differences at 107 ms between the ERPs to faces and buildings over the occipito-temporal cortex using electroencephalography. To exclude contrast differences as the main factor underlying these P100 differences, we replicated this study using pictures of faces and scrambled faces. Both studies indicated that face processing starts as early as approximately 100 ms, with an initial stage that can be measured not only with MEG but also with ERPs.
Article
Full-text available
According to the traditional two-stage model of face processing, the face-specific N170 event-related potential (ERP) is linked to structural encoding of face stimuli, whereas later ERP components are thought to reflect processing of facial affect. This view has recently been challenged by reports of N170 modulations by emotional facial expression. This study examines the time course and topography of the influence of emotional expression on the N170 response to faces. Dense-array ERPs were recorded in response to a set (n = 16) of fear and neutral faces. Stimuli were normalized on dimensions of shape, size, and luminance contrast distribution. To minimize task effects related to facial or emotional processing, facial stimuli were irrelevant to a primary task of learning associative pairings between a subsequently presented visual character and a spoken word. The N170 to faces showed a strong modulation by emotional facial expression. A split-half analysis demonstrated that this effect was significant both early and late in the experiment and was therefore not associated with only the initial exposures of these stimuli, demonstrating robustness against habituation. The emotional modulation of the N170 to faces did not interact significantly with the gender of the face stimulus or the hemisphere of the recording sites. Subtracting the fear versus neutral topographies yielded a topography that was itself highly similar to the face N170. The face N170 response can thus be influenced by emotional expressions contained within facial stimuli. The topography of this effect is consistent with the notion that fear stimuli exaggerate the N170 response itself. This finding stands in contrast to previous models suggesting that N170 processes linked to structural analysis of faces precede analysis of emotional expression, and may instead reflect early top-down modulation from neural systems involved in rapid emotional processing.
Article
Full-text available
To investigate the time course of emotional expression processing, we recorded ERP responses to stimulus arrays containing neutral versus angry, disgusted, fearful, happy, sad, or surprised faces. In one half of the experiment, the task was to discriminate emotional and neutral facial expressions. Here, an enhanced early frontocentral positivity was elicited in response to emotional as opposed to neutral faces, followed by a broadly distributed positivity and an enhanced negativity at lateral posterior sites. These emotional expression effects were very similar for all six basic emotional expressions. In the other half of the experiment, attention was directed away from the faces toward a demanding perceptual discrimination task. Under these conditions, emotional expression effects were completely eliminated, demonstrating that brain processes involved in the detection and analysis of facial expression require focal attention. The face-specific N170 component was unaffected by any emotional expression, supporting the hypothesis that structural encoding and expression analysis are independent processes.
Article
The human face is an evolved adaptation for social communication. This implies that humans are genetically prepared to produce facial gestures that are automatically decoded by observers. Psychophysiological data demonstrate that humans respond automatically with their facial muscles, with autonomic responses, and with specific regional brain activation of the amygdala when exposed to emotionally expressive faces. Attention is preferentially and automatically oriented toward facial threat. Neuropsychological data, as well as a rapidly expanding brain-imaging literature, implicate the amygdala as a central structure for responding to negative emotional faces, and particularly to fearful ones. However, the amygdala may not be specialized for processing emotional faces, but may instead respond to faces because they provide important information for the defense appraisal that is its primary responsibility.
Article
Very little is known about how faces or other objects are analysed by the human brain. In this paper I try to show that valuable information about some of the physiological processes involved can be derived from scalp-recorded, visual evoked potentials. The response properties of a distinct scalp potential, the "vertex positive peak" (VPP), are described, which suggest that there is a low-level, stimulus-related stage of processing, probably located in the inferotemporal cortex, which is designed to detect very rapidly the suddenly fixated images of single faces or objects. The underlying cortical mechanisms: (1) respond to any two-dimensional patterned image consistent with the basic structure of a face; (2) respond more strongly and more quickly to faces than to objects; and (3) respond fastest of all to the most commonly experienced views of complete faces. Preliminary evidence further suggests that different neuronal populations in the same cortical areas are involved in the generation of face- and object-evoked VPP responses. The functional implications of these evoked potential findings are discussed, as well as their relationship to the results of other electrophysiological and psychological studies in humans and of neurobiological studies in monkeys. Some ways of further enhancing the information derived from such experiments are also suggested.
Article
Emotional faces are motivationally salient stimuli that automatically capture attention and rapidly potentiate neural processing. Because of their superior temporal resolution, scalp-recorded event-related potentials (ERPs) are ideal for examining rapid changes in neural activity. Some reports have found larger ERPs for fearful and angry faces compared to both neutral and other emotional faces, and a key aim of the present study was to assess neural response to multiple emotional expressions using the NIMSTIM. Importantly, no study has yet systematically evaluated neural activity and self-report ratings for multiple NIMSTIM expressions. Study 1 examined the time course of electrocortical activity in response to fearful, angry, sad, happy, and neutral NIMSTIM faces. In Study 2, valence and arousal ratings were collected for the same faces in a separate sample. In line with previous findings, the early P1 was larger for fearful compared to neutral faces. The vertex positivity (VPP) was enhanced for fearful, angry, and happy expressions compared to neutral. There was no effect of expression on the N170. Marginally significant enhancements were observed for all expressions during the early posterior negativity (EPN). The late positive potential (LPP) was enhanced only for fearful and angry faces. All emotional expressions were rated as more arousing and more pleasant/unpleasant than neutral expressions. Overall, findings suggest that angry and fearful faces might be especially potent in terms of eliciting ERP responses and ideal for emotion research when more evocative images cannot be used.
Article
The nature and early time course of the initial processing differences between visually matched linguistic and nonlinguistic images were studied with event-related potentials (ERPs). The first effect began at 90 ms when ERPs to written words diverged from other objects, including faces. By 125 ms, ERPs to words and faces were more positive than those to other objects, effects identified with the P150. The amplitude and scalp distribution of P150s to words and faces were similar. The P150 seemed to be elicited selectively by images resembling any well-learned category of visual patterns. We propose that (a) visual perceptual categorization based on long-term experience begins by 125 ms, (b) P150 amplitude varies with the cumulative experience people have discriminating among instances of specific categories of visual objects (e.g., words, faces), and (c) the P150 is a scalp reflection of letterstring and face intracranial ERPs in posterior fusiform gyrus.
Article
Recent studies have shown that the late positive component of the event-related potential (ERP) is enhanced for emotional pictures, presented in an oddball paradigm, evaluated as distant from an established affective context. In other research, with context-free, random presentation, affectively intense pictures (pleasant and unpleasant) prompted similar enhanced ERP late positivity (compared with the neutral picture response). In an effort to reconcile interpretations of the late positive potential (LPP), ERPs to randomly ordered pictures were assessed, but using the faster presentation rate, brief exposure (1.5 s), and distinct sequences of six pictures, as in studies using an oddball based on evaluative distance. Again, results showed larger LPPs to pleasant and unpleasant pictures, compared with neutral pictures. Furthermore, affective pictures of high arousal elicited larger LPPs than less affectively intense pictures. The data support the view that late positivity to affective pictures is modulated both by their intrinsic motivational significance and the evaluative context of picture presentation.
Article
The complex musculature of the human face has been shaped by natural selection to produce gestures that communicate information about intentions and emotional states between senders and receivers. According to the preparedness hypothesis, different facial gestures are differentially prepared by evolution to become associated with different outcomes. As attested by psychophysiological responses in Pavlovian conditioning experiments, expressions of anger and fear more easily become signals for aversive stimuli than do expression of happiness. Consistent with the evolutionary perspective, the superior conditioning to angry faces is stronger for male than for female faces, for adult than for child faces, and for faces directed toward the receiver rather than directed away. Furthermore, it appears to be primarily located in the right cerebral hemisphere. The enhanced autonomic activity to angry faces signaling electric shock is not mediated by conscious cognitive activity, but is evident also when recognition of the facial stimulus is blocked by backward masking procedures. Similarly, conditioned responses can be established to masked angry, but not to masked happy faces. Electromyographic measurement of facial muscle activity reveals a tendency for emotional facial expression to rapidly and automatically elicit its mirror image in the face of the receiver, typically accompanied by the appropriate emotional experience. The research reviewed in this paper supports the proposition that humans have been evolutionarily tuned to respond automatically to facial stimuli, and it is suggested that such early automatic reactions shape the subsequent conscious emotional processing of the stimulus.
Article
Populations of visual neurones have been discovered in one area of the temporal association cortex that respond to different aspects of facial information. The responses of these cells have many of the properties hypothesized for ‘gnostic units’ and provide insight into the final stages of visual processing leading to recognition of an object as a face and more specifically the identity of the face.
Article
Research on the neural mechanisms of face identity constitutes a fruitful method to explore the affective contributions to face processing. Here, we investigated central and peripheral electrophysiological indices associated with the perception of loved faces. Subjects viewed black-and-white photographs of faces that belonged to one of five categories: loved ones, famous people, unknown people, babies, and neutral faces from the Eckman and Friesen system. Subcategories of loved faces included romantic partner, parents, siblings, second-degree relatives, and friends. Pictures were presented in two separate blocks, differing in viewing time (0.5s vs. 4s), inter-stimulus interval (1.2s vs. 18s), and number of face presentations (200 vs. 50). Heart rate, skin conductance, electromyography of the zygomatic muscle, and event-related potentials (ERPs) were obtained while participants passively viewed the pictures. Subjective picture ratings of valence, arousal, and dominance were obtained at the end of the experiment. Both central and peripheral electrophysiological measures differentiated faces of loved ones from all other categories by eliciting higher heart rate, skin conductance, and zygomatic activity, as well as larger amplitudes of the late ERP components P3 and LPP. Loved faces also resulted in higher valence and arousal, but lower dominance ratings. Additional differences were found among subcategories of loved faces. Faces of romantic partners elicited higher physiological (skin conductance and zygomatic activity) and subjective (emotional arousal) responses than parents, siblings, or friends, suggesting that looking at the image of someone we love evokes strong positive affect and emotional/cognitive arousal that go beyond a feeling of familiarity or simple recognition.
Article
The foundations of orienting and attention are hypothesized to stem from activation of defensive and appetitive motivational systems that evolved to protect and sustain the life of the individual. Motivational activation initiates a cascade of perceptual and motor processes that facilitate the selection of appropriate behavior. Among these are detection of significance, indexed by a late centro-parietal positivity in the event-related potential, enhanced perceptual processing, indexed by an initial cardiac deceleration, and preparation for action, indexed by electrodermal changes. Data exploring the role of stimulus novelty and significance in orienting are presented that indicate different components of the orienting response habituate at different rates. Taken together, it is suggested that orienting is mediated by activation of fundamental motivational systems that have evolved to support survival.
Article
A new off-line procedure for dealing with ocular artifacts in ERP recording is described. The procedure (EMCP) uses EOG and EEG records for individual trials in an experimental session to estimate a propagation factor which describes the relationship between the EOG and EEG traces. The propagation factor is computed after stimulus-linked variability in both traces has been removed. Different propagation factors are computed for blinks and eye movements. Tests are presented which demonstrate the validity and reliability of the procedure. ERPs derived from trials corrected by EMCP are more similar to a 'true' ERP than are ERPs derived from either uncorrected or randomly corrected trials. The procedure also reduces the difference between ERPs which are based on trials with different degrees of EOG variance. Furthermore, variability at each time point, across trials, is reduced following correction. The propagation factor decreases from frontal to parietal electrodes, and is larger for saccades than blinks. It is more consistent within experimental sessions than between sessions. The major advantage of the procedure is that it permits retention of all trials in an ERP experiment, irrespective of ocular artifact. Thus, studies of populations characterized by a high degree of artifact, and those requiring eye movements as part of the experimental task, are made possible. Furthermore, there is no need to require subjects to restrict eye movement activity. In comparison to procedures suggested by others, EMCP also has the advantage that separate correction factors are computed for blinks and movements and that these factors are based on data from the experimental session itself rather than from a separate calibration session.
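The regression logic of this procedure can be sketched compactly: estimate the EOG-to-EEG propagation factor after removing stimulus-locked variability, then subtract the scaled EOG from each trial. The sketch below is a simplified single-channel, single-factor version; the actual procedure computes separate factors for blinks and eye movements, which is omitted here.

```python
import numpy as np

def correct_ocular(eeg_trials: np.ndarray, eog_trials: np.ndarray) -> np.ndarray:
    """Regression-based ocular correction in the spirit of EMCP.

    eeg_trials, eog_trials: (n_trials, n_samples) single-channel epochs.
    """
    # Remove the average (stimulus-linked) waveform from both traces so the
    # propagation factor reflects artifact covariance, not the ERP itself.
    eeg_res = eeg_trials - eeg_trials.mean(axis=0, keepdims=True)
    eog_res = eog_trials - eog_trials.mean(axis=0, keepdims=True)
    # Least-squares propagation factor: cov(EOG, EEG) / var(EOG).
    b = np.sum(eog_res * eeg_res) / np.sum(eog_res ** 2)
    return eeg_trials - b * eog_trials

# Example: synthetic EEG contaminated by 40% of a blink-like EOG signal.
rng = np.random.default_rng(1)
eog = rng.standard_normal((100, 200))
eeg = rng.standard_normal((100, 200)) + 0.4 * eog
cleaned = correct_ocular(eeg, eog)
print(round(np.corrcoef(cleaned.ravel(), eog.ravel())[0, 1], 3))  # ~0.0
```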
Article
The Self-Assessment Manikin (SAM) is a non-verbal pictorial assessment technique that directly measures the pleasure, arousal, and dominance associated with a person's affective reaction to a wide variety of stimuli. In this experiment, we compare reports of affective experience obtained using SAM, which requires only three simple judgments, to the Semantic Differential scale devised by Mehrabian and Russell (An approach to environmental psychology, 1974) which requires 18 different ratings. Subjective reports were measured to a series of pictures that varied in both affective valence and intensity. Correlations across the two rating methods were high both for reports of experienced pleasure and felt arousal. Differences obtained in the dominance dimension of the two instruments suggest that SAM may better track the personal response to an affective stimulus. SAM is an inexpensive, easy method for quickly assessing reports of affective response in many contexts.
Article
Natural visual scenes contain vast quantities of information, far more than the visual system can process in a short period of time, and spatial attention is therefore used to focus the visual system's processing resources onto a subset of the incoming visual information. Most psychological theories of attention posit a single mechanism for this focusing of attention, but recent electrophysiological studies have provided evidence that the visual system employs several separable neural mechanisms of spatial attention. This paper describes the evidence for multiple attentional mechanisms and suggests links between these neurophysiologically defined mechanisms and specific functional processes that have been proposed in psychological theories of attention.
Article
A central issue in cognitive neuroscience concerns the functional architecture of the prefrontal cortex and the degree to which it is organized by sensory domain. To examine this issue, multiple areas of the macaque monkey prefrontal cortex were mapped for selective responses to visual stimuli that are prototypical of the brain's object vision pathway—pictorial representations of faces. Prefrontal neurons not only selectively process information related to the identity of faces but, importantly, such neurons are localized to a remarkably restricted area. These findings suggest that the prefrontal cortex is functionally compartmentalized with respect to the nature of its inputs.
Article
Faces and eyes are critical social stimuli which adults process with ease, but how this expertise develops is not yet understood. Neural changes associated with face and eye processing were investigated developmentally using ERPs (N170), in 128 subjects (4-15 year olds and adults). Stimuli included upright faces to assess configural processing, eyes and inverted faces to assess feature-based processing. N170 was present in the youngest children with similar patterns of face sensitivity seen in adults. Development of N170 to upright faces continued until adulthood, suggesting slow maturation of configural processing. In contrast, N170 was shorter latency and much larger to eyes than faces in children and was mature by 11 years, suggesting the early presence of an eye detector, with a rapid maturational course.
Article
Electrophysiological and hemodynamic responses of the brain allow investigation of the neural origins of human attention. We review attention-related brain responses from auditory and visual tasks employing oddball and novelty paradigms. Dipole localization and intracranial recordings, as well as functional magnetic resonance imaging, reveal multiple areas involved in generating and modulating attentional brain responses. In addition, the influence of lesions of circumscribed areas of the human cortex on attentional mechanisms is reviewed. While it is obvious that damaged brain tissue no longer functions properly, it has also been shown that functions of non-lesioned brain areas are impaired due to loss of modulatory influence from the lesioned area. Both early (P1 and N1) and late (P3) event-related potentials are modulated by excitatory and inhibitory mechanisms. Oscillatory EEG correlates of attention in the alpha and gamma frequency ranges also show attentional modulation.
Article
Here we used magnetoencephalography (MEG) to investigate stages of processing in face perception in humans. We found a face-selective MEG response occurring only 100 ms after stimulus onset (the 'M100'), 70 ms earlier than previously reported. Further, the amplitude of this M100 response was correlated with successful categorization of stimuli as faces, but not with successful recognition of individual faces, whereas the previously-described face-selective 'M170' response was correlated with both processes. These data suggest that face processing proceeds through two stages: an initial stage of face categorization, and a later stage at which the identity of the individual face is extracted.
Article
A key function of emotion is the preparation for action. However, organization of successful behavioral strategies depends on efficient stimulus encoding. The present study tested the hypothesis that perceptual encoding in the visual cortex is modulated by the emotional significance of visual stimuli. Event-related brain potentials were measured while subjects viewed pleasant, neutral, and unpleasant pictures. Early selective encoding of pleasant and unpleasant images was associated with a posterior negativity, indicating primary sources of activation in the visual cortex. The study also replicated previous findings in that affective cues also elicited enlarged late positive potentials, indexing increased stimulus relevance at higher-order stages of stimulus processing. These results support the hypothesis that sensory encoding of affective stimuli is facilitated implicitly by natural selective attention. Thus, the affect system not only modulates motor output (i.e., favoring approach or avoidance dispositions), but already operates at an early level of sensory encoding.
Article
We are an intensely social species--it has been argued that our social nature defines what makes us human, what makes us conscious or what gave us our large brains. As a new field, the social brain sciences are probing the neural underpinnings of social behaviour and have produced a banquet of data that are both tantalizing and deeply puzzling. We are finding new links between emotion and reason, between action and perception, and between representations of other people and ourselves. No less important are the links that are also being established across disciplines to understand social behaviour, as neuroscientists, social psychologists, anthropologists, ethologists and philosophers forge new collaborations.
Article
Facial emotions represent an important part of non-verbal communication used in everyday life. Recent studies on emotional processing have implicated differing brain regions for different emotions, but little has been determined on the timing of this processing. Here we presented a large number of unfamiliar faces expressing the six basic emotions, plus neutral faces, to 26 young adults while recording event-related potentials (ERPs). Subjects were naive with respect to the specific questions investigated; it was an implicit emotional task. ERPs showed global effects of emotion from 90 ms (P1), while latency and amplitude differences among emotional expressions were seen from 140 ms (N170 component). Positive emotions evoked N170 significantly earlier than negative emotions and the amplitude of N170 evoked by fearful faces was larger than neutral or surprised faces. At longer latencies (330-420 ms) at fronto-central sites, we also found a different pattern of effects among emotions. Localization analyses confirmed the superior and middle-temporal regions for early processing of facial expressions; the negative emotions elicited later, distinctive activations. The data support a model of automatic, rapid processing of emotional expressions.
MacRumors. 2017. Apple Says 'Face With Tears of Joy' is Most Popular Emoji in United States Among English Speakers. https://www.macrumors.com/2017/11/03/face-with-tears-of-joy-most-popular-emoji/