Similar Literature
20 similar articles found (search time: 847 ms)
1.
When we perceive the emotions of other people, we extract much information from the face. The present experiment used FACS (Facial Action Coding System), which is an instrument that measures the magnitude of facial action from a neutral face to a changed, emotional face. Japanese undergraduates judged the emotion in pictures of 66 static Japanese male faces (11 static pictures for each of six basic expressions: happiness, surprise, fear, anger, sadness, and disgust), ranging from neutral faces to maximally expressed emotions. The stimuli had previously been scored with FACS and were presented in random order. A high correlation between the subjects' judgments of facial expressions and the FACS scores was found.
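The key analysis in the abstract above is a correlation between measured facial-action magnitude and observers' judgments. As a minimal illustrative sketch (the data values below are invented, not the study's), the Pearson correlation can be computed as follows:

```python
# Illustrative sketch with hypothetical data: Pearson correlation between
# FACS-scored expression intensities and observers' mean intensity judgments.

def pearson_r(xs, ys):
    """Pearson product-moment correlation of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical values for five stimuli along one expression dimension.
facs_scores = [0.0, 1.0, 2.0, 3.0, 4.0]
judged_intensity = [1.1, 2.0, 2.8, 4.2, 4.9]
r = pearson_r(facs_scores, judged_intensity)
```

A value of `r` near 1 would correspond to the "high correlation" the abstract reports.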

2.
Categorical perception, indicated by better discrimination between stimuli that cross category boundaries than between stimuli within a category, is an efficient manner of classification. The current study examined the development of categorical perception of emotional stimuli in infancy. We used morphed facial images to investigate whether infants find contrasts between emotional facial images that cross categorical boundaries to be more salient than those that do not, while matching the degree of differences in the two contrasts. Five‐month‐olds exhibited sensitivity to the categorical boundary between sadness and disgust, between happiness and surprise, as well as between sadness and anger, but not between anger and disgust. Even 9‐month‐olds failed to exhibit evidence of a definitive category boundary between anger and disgust. These findings indicate the presence of discrete boundaries between some, but not all, of the basic emotions early in life. Implications of these findings for the major theories of emotion representation are discussed.

3.
The Intensity of Emotional Facial Expressions and Decoding Accuracy
The influence of the physical intensity of emotional facial expressions on perceived intensity and emotion category decoding accuracy was assessed for expressions of anger, disgust, sadness, and happiness. The facial expressions of two men and two women posing each of the four emotions were used as stimuli. Six different levels of intensity of expression were created for each pose using a graphics morphing program. Twelve men and 12 women rated each of the 96 stimuli for perceived intensity of the underlying emotion and for the qualitative nature of the emotion expressed. The results revealed that perceived intensity varied linearly with the manipulated physical intensity of the expression. Emotion category decoding accuracy varied largely linearly with the manipulated physical intensity of the expression for expressions of anger, disgust, and sadness. For the happiness expressions only, the findings were consistent with a categorical judgment process. Sex of encoder produced significant effects for both dependent measures. These effects remained even after possible gender differences in encoding were controlled for, suggesting a perceptual bias on the part of the decoders.
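Graded-intensity stimuli like those described above are commonly produced by interpolating between a neutral image and a full-intensity expression. The sketch below illustrates that general idea only; the study used a graphics morphing program, and the short pixel vectors here merely stand in for image data:

```python
# Hypothetical sketch: linear interpolation between a neutral face and a
# full-intensity expression to produce graded-intensity stimuli.

def morph(neutral, full, alpha):
    """Blend two equal-length pixel vectors; alpha=0 -> neutral, alpha=1 -> full."""
    return [(1 - alpha) * n + alpha * f for n, f in zip(neutral, full)]

neutral = [100.0, 100.0, 100.0]          # stand-in pixel values
full_expression = [160.0, 40.0, 130.0]   # stand-in pixel values
# Six intensity levels in equal steps up to the full expression.
levels = [morph(neutral, full_expression, (i + 1) / 6) for i in range(6)]
```

Linear blending of this kind is what makes the "physical intensity" manipulation a linear variable, which the study then relates to perceived intensity.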

4.
A method for studying emotional expression using posthypnotic suggestion to induce emotional states is presented. Subjects were videotaped while role-playing happiness, sadness, or anger, or after hypnotic induction of one of these three emotions. Judges then rated these videotapes for emotional tone. Results indicated a main effect for emotion expressed, with happiness and sadness more easily identified by judges than anger. Accuracy was also greater for happiness and sadness in the hypnotically induced condition. However, role-played anger was more easily identified than hypnotically induced anger. An interaction of channel (body/face) and emotion indicated that identification of sadness and anger was easier for judges when the body alone was shown. Findings are discussed in terms of display rules for the expression of emotion. We gratefully acknowledge Irving Kirsch, Ross Buck, and Paul Ekman for their helpful comments on a draft of this article. Special thanks to Reuben Baron, without whose support neither this article nor our careers in psychology would have been possible. A preliminary report of this study was presented at the meeting of the American Psychological Association in Toronto, August 1984.

5.
While numerous studies have investigated children’s recognition of facial emotional expressions, little evidence has been gathered concerning their explicit knowledge of the components included in such expressions. Thus, we investigated children’s knowledge of the facial components involved in the expressions of happiness, sadness, anger, and surprise. Four- and 5-year-old Japanese children were presented with the blank face of a young character, and asked to select facial components in order to depict the emotions he felt. Children’s overall performance in the task increased as a function of age, and was above chance level for each emotion in both age groups. Children were likely to select the Cheek raiser and Lip corner puller to depict happiness, the Inner brow raiser, Brow lowerer, and Lid droop to depict sadness, the Brow lowerer and Upper lid raiser to depict anger, and the Upper lid raiser and Jaw drop to depict surprise. Furthermore, older children demonstrated a better knowledge of the involvement of the Upper lid raiser in surprise expressions.

6.
Facial expressions of fear and disgust have repeatedly been found to be less well recognized than those of other basic emotions by children. We undertook two studies in which we investigated the recognition and visual discrimination of these expressions in school-age children. In Study 1, children (5, 6, 9, and 10 years of age) were shown pairs of facial expressions, and asked to tell which one depicted a target emotion. The results indicated that accuracy in 9- and 10-year-olds was higher than in 5- and 6-year-olds for three contrasts: disgust–anger, fear–surprise, and fear–sadness. Younger children had more difficulty recognizing disgust when it was presented along with anger, and in recognizing fear when it was presented along with surprise. In Study 2, children (5, 6, 9, and 10 years of age) were shown a target expression along with two other expressions, and were asked to point to the expression that was the most similar to the target. Contrary to our expectations, even 5- and 6-year-olds were very accurate in discriminating fear and disgust from the other emotions, suggesting that visual perception was not the main limiting factor for the recognition of these emotions in school-age children.

7.
The facial expressions of emotion and the circumstances under which the expressions occurred in a sample of the most popular children's television programs were investigated in this study. Fifteen-second randomly selected intervals from episodes of five television programs were analyzed for displays of happiness, sadness, anger, fear, disgust, and surprise. In addition, the contexts in which the emotions occurred were examined. Results indicated that particular emotional expressions occurred at significantly different frequencies and that there was an association between emotional displays and emotion-contexts. The high rate of emotional displays found in television shows has implications for the development of knowledge regarding emotional display rules in viewers. We are grateful to Sharon Galligan for assistance in coding part of the data and to Carolyn Saarni and Amy Halberstadt for helpful comments on an earlier draft of this paper. This research was supported in part by a grant from the National Institute of Disabilities and Rehabilitation Research, #GOO85351. The opinions expressed herein do not necessarily reflect the position or policy of the U.S. Department of Education.

8.
In Suicide, Durkheim described two qualitatively different experiences of normative anomie, each with a distinct affective basis: an intentional, if not ruthless, disdain for society's normative order; and an unintentional disregard for, or confusion about, norms or rules of conduct. We generalize Durkheim's classification of the socioaffective aspects of anomic suicide, and present two theoretical models of normlessness‐anomie and the emotions. These models posit that intentional anomie involves the primary emotions anger, disgust, and joy‐happiness; these emotions can combine to form the secondary emotions contempt, pride, and derisiveness. Unintentional, passive anomie rather involves the emotions surprise, fear, and sadness; these can combine to form the secondary emotions disappointment, shame, and alarm. We additionally hypothesize that each kind of anomie has distinct potential behavioral consequences: intentional anomie can result in immorality, shamelessness, acquisitiveness, and premeditated homicidality; unintentional anomie, in depression, confusion, uncertainty, unpremeditated homicidality, and suicidality.

9.
Human body postures provide perceptual cues that can be used to discriminate and recognize emotions. It was previously found that 7-month-olds’ fixation patterns discriminated fear from other emotion body expressions, but it is not clear whether they also process the emotional content of those expressions. The emotional content of visual stimuli can increase arousal level, resulting in pupil dilations. To provide evidence that infants also process the emotional content of expressions, we analyzed variations in pupil size in response to emotion stimuli. Forty-eight 7-month-old infants viewed adult body postures expressing anger, fear, happiness and neutral expressions, while their pupil size was measured. There was a significant emotion effect between 1040 and 1640 ms after image onset, when fear elicited larger pupil dilations than neutral expressions. A similar trend was found for anger expressions. Our results suggest that infants have increased arousal to negative-valence body expressions. Thus, in combination with previous fixation results, the pupil data show that infants as young as 7 months can perceptually discriminate static body expressions and process the emotional content of those expressions. The results extend information about infant processing of emotion expressions conveyed through other means (e.g., faces).

10.
Inconsistencies in previous findings concerning the relationship between emotion and social context are likely to reflect the multi-dimensionality of the sociality construct. In the present study we focused on the role of the other person by manipulating two aspects of this role: co-experience of the event and expression of emotion. We predicted that another's co-experience and expression would affect emotional responses and that the direction of these effects would depend upon the manipulated emotion and how the role of the other person is appraised. Participants read vignettes eliciting four different emotions: happiness, sadness, anxiety, and anger. As well as an alone condition, there were four conditions in which a friend was present, either co-experiencing the event or merely observing it, and either expressing emotions consistent with the event or not showing any emotion. There were significant effects of co-experience in the case of anger situations, and of expression in the case of happiness and sadness situations. Social appraisal also appeared to influence emotional response. We discuss different processes that might be responsible for these results.

11.
Adult judges were presented with videotape segments showing an infant displaying facial configurations hypothesized to express discomfort/pain, anger, or sadness according to differential emotions theory (Izard, Dougherty, & Hembree, 1983). The segments also included the infant's nonfacial behavior and aspects of the situational context. Judges rated the segments using a set of emotion terms or a set of activity terms. Results showed that judges perceived the discomfort/pain and anger segments as involving one or more negative emotions not predicted by differential emotions theory. The sadness segments were perceived as involving relatively little emotion overall. Body activity accompanying the discomfort/pain and anger configurations was judged to be more jerky and active than body activity accompanying the sadness configurations. The sadness segments were accompanied by relatively little body movement overall. The results thus fail to conform to the predictions of differential emotions theory but provide information that may contribute to the development of a theory of infant expressive behavior. This article is based on the second author's master's thesis. The authors thank Dennis Ross for his expert assistance in the data analyses.

12.
The relation between knowledge of American Sign Language (ASL) and the ability to decode facial expressions of emotion was explored in this study. Subjects were 60 college students, half of whom were intermediate-level students of ASL and half of whom had no exposure to a signed language. Subjects viewed and judged silent video segments of stimulus persons experiencing spontaneous emotional reactions representing either happiness, sadness, anger, disgust, or fear/surprise. Results indicated that hearing subjects knowledgeable in ASL were generally better than hearing non-signers at identifying facial expressions of emotion, although there were variations in decoding accuracy regarding the specific emotion being judged. In addition, females were more successful decoders than males. Results have implications for better understanding the nature of nonverbal communication in deaf and hearing individuals. We are grateful to Karl Scheibe for comments on an earlier version of this paper and to Erik Coats for statistical analysis. This study was conducted as part of a Senior Honors thesis at Wesleyan University.

13.
This study focuses on understanding how words and discrete facial emotions influence credibility perceptions of both prepared statements and spontaneous question and answer sessions. We build on and extend existing theoretical work concerning crisis communication and discrete emotions. Using a press conference simulation, spokesperson video recordings were analyzed using automated face-emotion recognition software (FaceReader™) to characterize discrete emotions. A crisis-message-strategy trained dictionary for Linguistic Inquiry and Word Count (LIWC) was used to characterize message content. Our results indicate that spokespeople can control their verbal messages better in prepared statements than in more spontaneous settings, but their facial emotions are quite similar in both settings. Only three discrete emotions are related to credibility perceptions: anger, sadness, and surprise, but sadness and surprise are not universally viewed positively or negatively. Expressing too much emotion, or over-emoting, is problematic. Expressing more anger in the Q&A, which we refer to as reactive anger, is perceived negatively, and when spokespeople emote a low amount of sadness and use a high amount of words expressing sincerity they are viewed as having the most credible messages.
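LIWC itself is proprietary software, but the general dictionary-based word-count approach it represents can be sketched simply: compute the share of words in a message that match a category word list. The category name and word list below are invented for illustration, not the study's actual dictionary:

```python
# Hypothetical sketch of a dictionary-based word-count measure: the
# proportion of words in a message matching an invented "sincerity" list.

SINCERITY_WORDS = {"sorry", "honestly", "truly", "sincerely", "regret"}

def category_rate(text, dictionary):
    """Fraction of tokens in `text` that appear in `dictionary`."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    hits = sum(w in dictionary for w in words)
    return hits / len(words) if words else 0.0

statement = "We sincerely regret this incident and will honestly report our findings."
rate = category_rate(statement, SINCERITY_WORDS)
```

A per-category rate of this kind is the sort of message-content variable that can then be related to credibility ratings.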

14.
Despite known differences in the acoustic properties of children’s and adults’ voices, no work to date has examined the vocal cues associated with emotional prosody in youth. The current study investigated whether child (n = 24, 17 female, aged 9–15) and adult (n = 30, 15 female, aged 18–63) actors differed in the vocal cues underlying their portrayals of basic emotions (anger, disgust, fear, happiness, sadness) and social expressions (meanness, friendliness). We also compared the acoustic characteristics of meanness and friendliness to comparable basic emotions. The pattern of distinctions between expressions varied as a function of age for voice quality and mean pitch. Specifically, adults’ portrayals of the various expressions were more distinct in mean pitch than children’s, whereas children’s representations differed more in voice quality than adults’. Given the importance of pitch variables for the interpretation of a speaker’s intended emotion, expressions generated by adults may thus be easier for listeners to decode than those of children. Moreover, the vocal cues associated with the social expressions of meanness and friendliness were distinct from those of basic emotions like anger and happiness respectively. Overall, our findings highlight marked differences in the ways in which adults and children convey socio-emotional expressions vocally, and expand our understanding of the communication of paralanguage in social contexts. Implications for the literature on emotion recognition are discussed.

15.
Gender roles in mainstream US culture suggest that girls express more happiness, sadness, anxiety, and shame/embarrassment than boys, while boys express more anger and externalizing emotions, such as contempt. However, gender roles and emotion expression may be different in low-income and ethnically diverse families, as children and parents are often faced with greater environmental stressors and may have different gender expectations. This study examined gender differences in emotion expression in low-income adolescents, an understudied population. One hundred and seventy-nine adolescents (aged 14–17) participated in the Trier Social Stress Test (TSST). Trained coders rated adolescents’ expressions of happiness, sadness, anxiety, shame/embarrassment, anger, and contempt during the TSST using a micro-analytic coding system. Analyses showed that, consistent with gender roles, girls expressed higher levels of happiness and shame than boys; however, contrary to traditional gender roles, girls showed higher levels of contempt than boys. Also, in contrast to cultural stereotypes, there were no differences in anger between boys and girls. Findings suggest gender-role inconsistent displays of externalizing emotions in low-income adolescents under acute stress, and may reflect different emotion socialization experiences in this group.

16.
17.
The relationship between an individual's habitual, emotional dispositions or tendencies — an aspect of personality — and his ability to recognize facially expressed emotions was explored. Previous studies have used global scores of recognition accuracy across several emotions, but failed to examine the relationship between emotion traits and recognition of specific emotion expressions. In the present study, these more specific relationships were examined. Results indicated that females with an inhibited-non-assertive personality style tended to have poorer emotion recognition scores than more socially oriented females. In contrast, for males, the relationship between traits and recognition scores was much more emotion specific: Emotion traits were not significantly related to a global measure of recognition accuracy but were related to recognition rates of certain emotion expressions — sadness, anger, fear, surprise, and disgust. For most of the emotions, males appeared to be more likely to see what they feel. Possible explanations of the findings in terms of perceptual set and other mechanisms are discussed. Implications for clinical studies and research are noted. The study also highlights the importance of separate analysis of data for specific emotions, as well as for sex.

18.
To study the effects of gender on ability to recognize facial expressions of emotion, two separate samples of male and female undergraduates (727 in Study 1, 399 in Study 2) judged 120 color photographs of people posing one of four negative emotions: anger, disgust, fear, and sadness. Overall, females exceeded males in their ability to recognize emotions whether expressed by males or by females. As an exception, males were superior to females in recognizing male anger. The findings are discussed in terms of social sex-roles. Portions of this paper were presented at the Annual Convention of the American Psychological Association, New York, August 1987.

19.
Physical attractiveness is suggested to be an indicator of biological quality and therefore should be stable. However, transient factors such as gaze direction and facial expression affect facial attractiveness, suggesting it is not. We compared the relative importance of variation between faces with variation within faces due to facial expressions. A total of 128 participants viewed photographs of 14 men and 16 women displaying the six basic facial expressions (anger, disgust, fear, happiness, sadness, surprise) and a neutral expression. Each rater saw each model only once with a randomly chosen expression. The effect of expressions on attractiveness was similar in male and female faces, although several expressions were not significantly different from each other. Identity was 2.2 times as important as emotion in attractiveness for both male and female pictures, suggesting that attractiveness is stable. Since the hard tissues of the face are unchangeable, people may still be able to perceive facial structure whatever expression the face is displaying, and still make attractiveness judgements based on structural cues.

20.
Substantial research has documented the universality of several emotional expressions. However, recent findings have demonstrated cultural differences in level of recognition and ratings of intensity. When testing cultural differences, stimulus sets must meet certain requirements. Matsumoto and Ekman's Japanese and Caucasian Facial Expressions of Emotion (JACFEE) is the only set that meets these requirements. The purpose of this study was to obtain judgment reliability data on the JACFEE, and to test for possible cross-national differences in judgments as well. Subjects from Hungary, Japan, Poland, Sumatra, United States, and Vietnam viewed the complete JACFEE photo set and judged which emotions were portrayed in the photos and rated the intensity of those expressions. Results revealed high agreement across countries in identifying the emotions portrayed in the photos, demonstrating the reliability of the JACFEE. Despite high agreement, cross-national differences were found in the exact level of agreement for photos of anger, contempt, disgust, fear, sadness, and surprise. Cross-national differences were also found in the level of intensity attributed to the photos. No systematic variation due to either preceding emotion or presentation order of the JACFEE was found. Also, we found that grouping the countries into a Western/Non-Western dichotomy was not justified according to the data. Instead, the cross-national differences are discussed in terms of possible sociopsychological variables that influence emotion judgments.
