Figure - available from: Current Psychology
The canthal tilt (black line) is the line connecting the medial canthus of the eye (1) and the lateral canthus (2). The ratio of the horizontal diameter of the pupil (white line) to the length of the canthal tilt line can be expressed as the pupil/tilt index. The pupil/tilt index of the presented left eye is 0.21 (anger), independent of the zoom and the size of the picture.
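A minimal sketch of how such an index could be computed from eye landmarks, assuming 2-D pixel coordinates for the two canthi and the left/right edges of the pupil; the function name and coordinate values below are hypothetical and were merely chosen so that the example reproduces the illustrative 0.21 value:

import math

def pupil_tilt_index(medial_canthus, lateral_canthus, pupil_left, pupil_right):
    # Ratio of the horizontal pupil diameter to the length of the canthal
    # tilt line; both distances are measured in the same image, so the
    # ratio does not depend on picture zoom or size.
    tilt_length = math.dist(medial_canthus, lateral_canthus)
    pupil_diameter = abs(pupil_right[0] - pupil_left[0])
    return pupil_diameter / tilt_length

# Hypothetical landmark coordinates (x, y) in pixels:
print(round(pupil_tilt_index((10, 52), (58, 45), (29, 48), (39, 48)), 2))  # 0.21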
Source publication
We tested the hypothesis that a still-face person keeping direct gaze can express an emotion. The prospective study involved healthy volunteers as senders (n = 100). Four pictures of the face of each participant, who tried to express a certain emotion by gaze, were obtained (Neutral – Angry – Sad – Happy). Twenty blinded judges judged the ability to expre...
Similar publications
Viewing faces that are perceived as emotionally expressive evokes enhanced neural responses in multiple brain regions, a phenomenon thought to depend critically on the amygdala. This emotion-related modulation is evident even in primary visual cortex (V1), providing a potential neural substrate by which emotionally salient stimuli can affect percep...
Key-frame selection plays an important role in facial expression recognition systems. It helps in selecting the most representative frames that capture the different poses of the face. This paper studies the effect of the number of selected key-frames on the final accuracy of the emotion recognition system. Dynamic and...
Emotional expressions are the behaviors that communicate our emotional state or attitude to others. They are expressed through verbal and non-verbal communication. Complex human behavior can be understood by studying physical features from multiple modalities; mainly facial, vocal and physical gestures. Recently, spontaneous multi-modal emotion rec...
Emotion expressions convey information. One important kind of information, from a communicative perspective, is information about what is demanded of the recipients of the expression. Compared to the vast body of research that focuses on whether, and to what degree, emotion expressions convey information about inner states, there is surprisingly li...
Citations
... Such a gamer can no longer control his or her emotions and actions should he or she choose to continue the game [10]. In this case the game itself becomes an irritating factor, as the gamer cannot succeed or enter a flow state in the process of gaming [11]. Therefore, it is better to finish the game, because the probability of further success is extremely low. ...
Farcana has developed a smart gaming input device that, apart from being a tool for the gamer to use during gameplay, is also a suitable tool for collecting biomedical information about the gamer, which, after analysis by an artificial intelligence (AI) system, allows informing the gamer about whether he or she is in a state of tilt. Tilt is a poor emotional state that arises from the gamer's inability to control his or her emotions during gameplay. The gamer can be either winning or losing; the fact that he or she can neither control nor even acknowledge the emotional state is tilt. Tilt is a major factor affecting the overall success of the individual in gaming and his or her rating in cybersport. This paper analyzes numerous studies and patents on the topic at hand. The available literature has provided the necessary insight into tilt and why it is important to help gamers acknowledge their state, especially given deteriorating results. We have also proposed a framework for an AI system for tilt recognition.
... For emotion recognition methods, there are mainly scale methods [13,14], machine learning [15][16][17], and deep learning algorithms [18][19][20]. The data used for recognition include electroencephalogram (EEG) [21], electrocardiogram (ECG) [22], voice [23], video [24], expression [25], action [26], and so on. For the application field of this paper, audio, video, EEG, ECG, and other data can be collected from athletes when exercising. ...
Athletes participate in competitions with the ultimate goal of displaying their personal competitive level as well as possible, defeating their opponents, and winning. In all types of competitions, most matches are decided in an instant, and opportunities are fleeting. The instantaneous nature and fierceness of sports competition require participating athletes to have strong psychological qualities. The quality of the mental state therefore directly determines the performance of athletes in both training and competition. If athletes can obtain real-time information on changes in their mental state as they encounter various situations during sport, more targeted and effective training or competition strategies can be formulated according to those states. Likewise, by analyzing an opponent's psychological state during exercise, the game strategy can be adjusted in real time in a targeted manner, and the probability of winning the game can be improved. Against this background, this paper proposes using a support vector machine (SVM) to identify the mental state of athletes during exercise. The paper first collects data on the body movements and facial expressions of athletes during training or competition, uses the multimodal data to train an SVM model, and outputs the emotional state of athletes at different stages based on test data. To verify the applicability of the method to athlete subjects, several comparative models were used in the experiments to assess the performance of the models. The experimental results show that the emotion recognition accuracy obtained by this method is more than 80%, indicating that the research has practical application value.
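For orientation, a minimal sketch of the kind of pipeline this abstract describes is given below, assuming pre-extracted body-movement and facial-expression feature vectors that are concatenated per sample. The feature dimensions, emotion labels, and synthetic data are hypothetical placeholders, and the scikit-learn pipeline is a generic illustration rather than the model configuration used in the cited study.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_samples = 200
body_feats = rng.normal(size=(n_samples, 30))   # e.g. pose/motion descriptors (hypothetical)
face_feats = rng.normal(size=(n_samples, 20))   # e.g. expression descriptors (hypothetical)
X = np.hstack([body_feats, face_feats])         # simple feature-level multimodal fusion
y = rng.integers(0, 3, size=n_samples)          # hypothetical emotion classes 0/1/2

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# RBF-kernel SVM on standardized features; kernel and C are common defaults,
# not parameters reported in the cited paper.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
model.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))

On real, labelled multimodal data the same pipeline would be trained on features extracted per stage of training or competition and its predictions compared against the reported 80%+ accuracy.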
... The ANS-controlled cues include facial reddening, tears, and pupil dilatation. Following this approach, the emotional direct gaze is a predominantly ANS-controlled phenomenon (Roitblat et al. 2019). ...
... It was demonstrated in our previous study (Report 1; Roitblat et al. 2019) that the facial expression of emotions can be adequately achieved with minimal muscular involvement, when the gaze mainly expresses an emotion through combined slight activity of small intraorbital muscles and broader involvement of ANS-controlled reactions such as pupil dilatation, aqueous humor formation ("moist eyes") and aqueous humor outflow, contractions of the ciliary muscle, and actions of Müller's muscle (sympathetic fibers). In the current Report 2 study, we planned to test the hypothesis that emotions expressed by a still-face person keeping direct gaze can be adequately recognized. ...
... The rationale for choosing such age limits was based on numerous studies of younger children and older adults that revealed some age-related peculiarities in emotion recognition; an "age-related decline in facial emotion identification" concept was even suggested (Noh et al. 2011; Noh and Isaacowitz 2015; Naruse et al. 2013; Chaby et al. 2017). At the same time, numerous studies indicate adolescents, younger adults or, more broadly, "early adulthood" persons as ideal participants for emotion recognition studies (Noh et al. 2011; Noh and Isaacowitz 2015; Widen et al. 2015; Roitblat et al. 2019; Novello et al. 2018). ...
We tested the hypothesis that an emotion expressed by a still-face person keeping direct gaze can be adequately recognized. The prospective study involved 800 volunteers as judges (age 16–25). All the judges were presented with three 10-picture sets of still faces expressing angry, sad, and happy emotions by vivid gaze only. The first 10-picture set consisted of full faces, the second set consisted of pictures of the middle part of the face, and the third set presented the eyes only. Half of the judges (Group 1, n = 400) were asked to label the pictures as Angry, Sad, or Happy. The other 400 judges (Group 2) were asked to label the same pictures as Angry, Sad, Surprised, Frightened, or Happy to validate the selected sets of pictures. Group 1 judges showed correct recognition of the emotion in 72% of cases for full-face pictures, 73% agreement for middle parts, and 68% agreement for the eyes only, while in Group 2 the agreements were 63%, 68%, and 48.5%, respectively. After Bonferroni correction for multiple comparisons, our results show that the gaze was the only emotional cue in all tests and that the presented emotions were recognized with a relatively equal degree of success. We confirm our initial statement (Report 1) that the expression of an emotion can be adequately achieved with minimal facial muscular involvement when predominantly the gaze expresses emotion. In most cases, an "emotional gaze" during a still-face or static-face situation is adequately recognized by people, but emotion recognition skill shows individual variation.
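For reference, the Bonferroni correction mentioned above divides the per-test significance threshold by the number of comparisons performed; the numbers below are illustrative assumptions, not values reported in the study:

\alpha_{\text{adj}} = \frac{\alpha}{m}, \qquad \text{e.g. } \alpha = 0.05,\ m = 6 \ \Rightarrow\ \alpha_{\text{adj}} \approx 0.0083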
... Recognition of facial expressions has been one of the most discussed topics (Tcherkassof, Bollon, Dubois, Pansu, & Adam, 2007). The face as a whole is a fundamental element for expressing and/or recognizing emotions, rather than any single expression of the facial muscles per se (Roitblat, Cohensedgh, Frig-Levinson, Suman, & Shterenshis, 2019). Consequently, facial expressions play an essential role in social interactions and communication, and their coherence with subjective emotional states is a more appropriate approach to the problem (Ekman & Rosenberg, 1997; Roitblat et al., 2020; Souto et al., 2020). ...
... As far as research on schizophrenia goes, studies on emotional deficits have focused mainly on three dimensions of emotion: the expression of emotions, the experience of emotions and the recognition of emotions (Trémeau, 2006). Specifically, the facial component of emotion has been widely studied, identifying two distinct focuses in the investigation: the ability to express an emotion (coding studies) and the ability to recognize the expressed emotion (decoding studies) (Roitblat et al., 2019). ...
Emotional recognition is part of a more general behaviour that promotes adequate inter-relationship patterns and social functionality. Previous results on emotional recognition in individuals with a diagnosis of schizophrenia and other psychotic disorders suggest an impairment of this ability resulting in social inability. The purpose of the present pilot study was to investigate whether the sensory modality of the stimuli (visual or auditory) interferes with the evaluation of emotional recognition, and to examine the relation between time since diagnosis, time of permanence in the rehabilitation centre, and emotional recognition. The study involved a sample of thirty individuals of both genders (70% men and 30% women), aged between 26 and 58 years, with a clinical diagnosis of schizophrenia or other psychotic disorders, who attended two psychosocial rehabilitation centres in Oporto. They were presented with sets of visual (faces) and auditory (phrases) stimuli in order to evaluate their ability to recognize Ekman's universal emotions. The results point to better recognition of visual stimuli when compared to auditory stimuli. In both sensory modalities, joy and surprise were the emotions most correctly recognized, with a statistically significant difference between modalities. Sadness, however, was better recognized through auditory stimuli. Time since diagnosis was positively and significantly associated with the recognition of the emotions of disgust and fear when using prosody. As to the impact of time of permanence in the institution on the ability to recognize emotions, the results suggest that a longer permanence is related to a more impaired recognition ability. Evaluation and training of emotional recognition should therefore be rethought.