Similar Articles
Found 20 similar articles (search time: 31 ms)
1.
Younger adults (YA) attribute emotion-related traits to people whose neutral facial structure resembles an emotion (emotion overgeneralization). The fact that older adults (OA) show deficits in accurately labeling basic emotions suggests that they may be relatively insensitive to variations in the emotion resemblance of neutral expression faces that underlie emotion overgeneralization effects. On the other hand, the fact that OA, like YA, show a ‘pop-out’ effect for anger, more quickly locating an angry than a happy face in a neutral array, suggests that both age groups may be equally sensitive to emotion resemblance. We used computer modeling to assess the degree to which neutral faces objectively resembled emotions and assessed whether that resemblance predicted trait impressions. We found that both OA and YA showed anger and surprise overgeneralization in ratings of danger and naiveté, respectively, with no significant differences in the strength of the effects for the two age groups. These findings suggest that well-documented OA deficits on emotion recognition tasks may be more due to processing demands than to an insensitivity to the social affordances of emotion expressions.

2.
Cross-cultural and laboratory research indicates that some facial expressions of emotion are recognized more accurately and faster than others. We assessed the hypothesis that such differences depend on the frequency with which each expression occurs in social encounters. Thirty observers recorded how often they saw different facial expressions during natural conditions in their daily life. For a total of 90 days (3 days per observer), 2,462 samples of seen expressions were collected. Among the basic expressions, happy faces were observed most frequently (31 %), followed by surprised (11.3 %), sad (9.3 %), angry (8.7 %), disgusted (7.2 %), and fearful faces, which were the least frequent (3.4 %). A significant amount (29 %) of non-basic emotional expressions (e.g., pride or shame) were also observed. We correlated our frequency data with recognition accuracy and response latency data from prior studies. In support of the hypothesis, significant correlations (generally, above .70) emerged, with recognition accuracy increasing and latency decreasing as a function of frequency. We conclude that the efficiency of facial emotion recognition is modulated by familiarity of the expressions.
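The frequency-accuracy correlation described in this abstract can be illustrated with a minimal sketch. The frequency percentages below are the ones reported in the abstract; the recognition-accuracy values are hypothetical placeholders standing in for the prior-study data, not the actual figures the authors used.

```python
# Illustrative sketch of a Pearson correlation between how often each basic
# expression is seen and how accurately it is recognized. Frequencies are
# from the abstract; accuracy values are hypothetical placeholders.

def pearson(xs, ys):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Observed frequency (%) of each basic expression, from the abstract.
freq = {"happy": 31.0, "surprised": 11.3, "sad": 9.3,
        "angry": 8.7, "disgusted": 7.2, "fearful": 3.4}

# Hypothetical recognition-accuracy proportions (placeholders only).
acc = {"happy": 0.95, "surprised": 0.80, "sad": 0.74,
       "angry": 0.76, "disgusted": 0.70, "fearful": 0.62}

labels = list(freq)
r = pearson([freq[k] for k in labels], [acc[k] for k in labels])
print(round(r, 2))
```

With placeholder data ordered roughly by frequency, the correlation comes out well above the .70 threshold the abstract mentions, which is the pattern the authors report.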

3.
Physical attractiveness is suggested to be an indicator of biological quality and therefore should be stable. However, transient factors such as gaze direction and facial expression affect facial attractiveness, suggesting it is not. We compared the relative importance of variation between faces with variation within faces due to facial expressions. 128 participants viewed photographs of 14 men and 16 women displaying the six basic facial expressions (anger, disgust, fear, happiness, sadness, surprise) and a neutral expression. Each rater saw each model only once with a randomly chosen expression. The effect of expressions on attractiveness was similar in male and female faces, although several expressions were not significantly different from each other. Identity was 2.2 times as important as emotion in attractiveness for both male and female pictures, suggesting that attractiveness is stable. Since the hard tissues of the face are unchangeable, people may still be able to perceive facial structure whatever expression the face is displaying, and still make attractiveness judgements based on structural cues.

4.
Lipps (1907) presented a model of empathy which had an important influence on later formulations. According to Lipps, individuals tend to mimic an interaction partner's behavior, and this nonverbal mimicry induces, via a feedback process, the corresponding affective state in the observer. The resulting shared affect is believed to foster the understanding of the observed person's self. The present study tested this model in the context of judgments of emotional facial expressions. The results confirm that individuals mimic emotional facial expressions, and that the decoding of facial expressions is accompanied by shared affect. However, no evidence was found that emotion recognition accuracy or shared affect is mediated by mimicry. Yet, voluntary mimicry was found to have some limited influence on the observer's assessment of the observed person's personality. The implications of these results with regard to Lipps' original hypothesis are discussed.

5.
Twenty-five high-functioning, verbal children and adolescents with autism spectrum disorders (ASD; age range 8–15 years) who demonstrated a facial emotion recognition deficit were block randomized to an active intervention (n = 12) or waitlist control (n = 13) group. The intervention was a modification of a commercially-available, computerized, dynamic facial emotion training tool, the MiX by Humintell©. Modifications were introduced to address the special learning needs of individuals with ASD and to address limitations in current emotion recognition programs. Modifications included: coach-assistance, a combination of didactic instruction for seven basic emotions, scaffold instruction which included repeated practice with increased presentation speeds, guided attention to relevant facial cues, and imitation of expressions. Training occurred twice each week for 45–60 min across an average of six sessions. Outcome measures were administered prior to and immediately after treatment, as well as after a delay period of 4–6 weeks. Outcome measures included (a) direct assessment of facial emotion recognition, (b) emotion self-expression, and (c) generalization through emotion awareness in videos and stories, use of emotion words, and self-, parent-, and teacher-report on social functioning questionnaires. The facial emotion training program enabled children and adolescents with ASD to more accurately and quickly identify feelings in facial expressions with stimuli from both the training tool and generalization measures and demonstrate improved self-expression of facial emotion.

6.
This study examined age and gender differences in decoding nonverbal cues in a school population of 606 (pre)adolescents (9–15 years). The focus was on differences in the perceived intensity of several emotions in both basic and non-basic facial expressions. Age differences were found in decoding low intensity and ambiguous faces, but not in basic expressions. Older adolescents indicated more negative meaning in these more subtle and complex facial cues. Girls attributed more anger to both basic and non-basic facial expressions and showed a general negative bias in decoding non-basic facial expressions compared to boys. Findings are interpreted in the light of the development of emotion regulation and the importance for developing relationships.
Yolanda van Beek

7.
This article introduces the Children’s Scales of Pleasure and Arousal as instruments to enable children to provide judgments of emotions they witness or experience along the major dimensions of affect. In two studies (Study 1: N = 160, 3–11 years and adults; Study 2: N = 280, 3–5 years and adults), participants used the scales to indicate the levels of pleasure or arousal they perceived in stylized drawings of facial expressions, in photographs of facial expressions, or in emotion labels. All age groups used the Pleasure Scale reliably and accurately with all three types of stimuli. All used the Arousal Scale with stylized faces and with facial expressions, but only 5-year-olds did so for emotion labels.

8.
Studies with socially anxious adults suggest that social anxiety is associated with problems in decoding other persons' facial expressions of emotions. Corresponding studies with socially anxious children are lacking. The aim of the present study was to test whether socially phobic children show deficits in classifying facial expressions of emotions or show a response bias for negative facial expressions. Fifty socially anxious and 25 socially non-anxious children (age 8 to 12) participated in the study. Pictures of faces with either neutral, positive (joyful) or negative (angry, disgusted, sad) facial expressions (24 per category) were presented for 60 ms on a monitor screen in random order. The children were asked to indicate by pressing a key whether the facial expression was neutral, positive, or negative, and to rate how confident they were about their classification. With regard to frequency of errors, the socially anxious children reported significantly more often that they saw emotions when neutral faces were presented. Moreover, their reaction times were longer. However, they did not feel less certain about their performance. There was neither an indication of an enhanced ability to decode negative facial expressions in socially anxious children, nor a specific tendency to interpret neutral or positive faces as negative.

9.
When we perceive the emotions of other people, we extract much information from the face. The present experiment used FACS (Facial Action Coding System), which is an instrument that measures the magnitude of facial action from a neutral face to a changed, emotional face. Japanese undergraduates judged the emotion in pictures of 66 static Japanese male faces (11 static pictures for each of six basic expressions: happiness, surprise, fear, anger, sadness, and disgust), ranging from neutral faces to maximally expressed emotions. The stimuli had previously been scored with FACS and were presented in random order. A high correlation between the subjects' judgments of facial expressions and the FACS scores was found.

10.
The present study examined effects of temporarily salient and chronic self-construal on decoding accuracy for positive and negative facial expressions of emotion. We primed independent and interdependent self-construal in a sample of participants who then rated the emotion expressions of a central character (target) in a cartoon showing a happy, sad, angry, or neutral facial expression in a group setting. Primed interdependence was associated with lower recognition accuracy for negative emotion expressions. Primed and chronic self-construal interacted such that for interdependence primed participants, higher chronic interdependence was associated with lower decoding accuracy for negative emotion expressions. Chronic independent self-construal was associated with higher decoding accuracy for negative emotion. These findings add to an increasing literature that highlights the significance of perceivers’ socio-cultural factors, self-construal in particular, for emotion perception.

11.
We report two studies which attempt to explain why some researchers found that neutral faces determine judgments of recognition as strongly as expressions of basic emotion, even in the presence of discrepant contextual information. In the first study we discarded the possibility that neutral faces could have an intense but undetected emotional content: 60 students' dimensional ratings showed that 10 neutral faces were perceived as less emotional than 10 emotional expressions. In Study 2 we tested whether neutral faces can convey strong emotional messages in some contexts: 128 students' dimensional ratings on 36 discrepant combinations of neutral faces or expressions with contextual information were more predictable from expressions when the contextual information consisted of common, everyday situations, but were more predictable from neutral faces when the context was an uncommon, extreme situation. In line with our hypothesis, we discuss these paradoxical findings as being caused by the salience of neutral faces in some particular contexts. This research was conducted as a part of the first author's doctoral dissertation, and was supported by a grant (PS89-022) of the Spanish DGICyT. We thank David Weston for his help in preparing the text. We also thank two anonymous reviewers for their valuable comments on a previous draft of this article.

12.
Recent studies demonstrated that in adults and children recognition of face identity and facial expression mutually interact (Bate, Haslam, & Hodgson, 2009; Spangler, Schwarzer, Korell, & Maier-Karius, 2010). Here, using a familiarization paradigm, we explored the relation between these processes in early infancy, investigating whether 3-month-old infants’ ability to recognize an individual face is affected by the positive (happiness) or neutral emotional expression displayed. Results indicated that infants’ face recognition appears enhanced when faces display a happy emotional expression, suggesting the presence of a mutual interaction between face identity and emotion recognition as early as 3 months of age.

13.
Cognitive models of social anxiety provide a basis for predicting that the ability to process nonverbal information accurately and quickly should be impaired during the experience of state anxiety. To test this hypothesis, we assigned participants to threatening and non-threatening conditions and asked them to label the emotions expressed in a series of faces. It was predicted that social anxiety would be positively associated with errors and response times in threatening conditions, but not in a non-threatening condition. It was also predicted that high social anxiety would be associated with more errors and longer response times when identifying similar expressions such as sadness, anger, and fear. The results indicate that social anxiety was not associated with errors in identifying facial expressions of emotion, regardless of the level of state anxiety experienced. However, social anxiety scores were found to be significantly related to response times to identify facial expressions, but the relationship varied depending on the level of state anxiety experienced. Methodological and theoretical implications of using response time data when assessing nonverbal ability are discussed.

14.
Adults' perceptions provide information about the emotional meaning of infant facial expressions. This study asks whether similar facial movements influence adult perceptions of emotional intensity in both infant positive (smile) and negative (cry face) facial expressions. Ninety-five college students rated a series of naturally occurring and digitally edited images of infant facial expressions. Naturally occurring smiles and cry faces involving the co-occurrence of greater lip movement, mouth opening, and eye constriction, were rated as expressing stronger positive and negative emotion, respectively, than expressions without these 3 features. Ratings of digitally edited expressions indicated that eye constriction contributed to higher ratings of positive emotion in smiles (i.e., in Duchenne smiles) and greater eye constriction contributed to higher ratings of negative emotion in cry faces. Stronger mouth opening contributed to higher ratings of arousal in both smiles and cry faces. These findings indicate a set of similar facial movements are linked to perceptions of greater emotional intensity, whether the movements occur in positive or negative infant emotional expressions. This proposal is discussed with reference to discrete, componential, and dynamic systems theories of emotion.

15.
Human body postures provide perceptual cues that can be used to discriminate and recognize emotions. It was previously found that 7-month-olds’ fixation patterns discriminated fear from other emotion body expressions, but it is not clear whether they also process the emotional content of those expressions. The emotional content of visual stimuli can increase arousal level, resulting in pupil dilations. To provide evidence that infants also process the emotional content of expressions, we analyzed variations in pupil size in response to emotion stimuli. Forty-eight 7-month-old infants viewed adult body postures expressing anger, fear, happiness and neutral expressions, while their pupil size was measured. There was a significant emotion effect between 1040 and 1640 ms after image onset, when fear elicited larger pupil dilations than neutral expressions. A similar trend was found for anger expressions. Our results suggest that infants have increased arousal to negative-valence body expressions. Thus, in combination with previous fixation results, the pupil data show that infants as young as 7 months can perceptually discriminate static body expressions and process the emotional content of those expressions. The results extend information about infant processing of emotion expressions conveyed through other means (e.g., faces).

16.
There is consistent evidence that older adults have difficulties in perceiving emotions. However, emotion perception measures to date have focused on one particular type of assessment: using standard photographs of facial expressions posing six basic emotions. We argue that it is important in future research to explore adult age differences in understanding more complex, social and blended emotions. Using stimuli which are dynamic records of the emotions expressed by people of all ages, and the use of genuine rather than posed emotions, would also improve the ecological validity of future research into age differences in emotion perception. Important questions remain about possible links between difficulties in perceiving emotional signals and the implications that this has for the everyday interpersonal functioning of older adults.

17.
Physical abuse history has been demonstrated to have an effect upon accuracy of interpretation of facial expressions, but the effects of sexual abuse have not been explored. Thus, the accuracy of interpretation and the role of different facial components in the interpretation of facial expressions were studied in sexually abused and non-abused girls. Twenty-nine sexually abused and 29 non-abused females, ranging in age from 5 to 9 years, chose schematic faces which best represented various emotional scenarios. Accuracy of interpretation of facial expression differed between sexually abused and non-abused girls only when emotion portrayed was considered. A history of sexual abuse alone had no effect upon overall accuracy, but did influence performance on specific emotions, particularly at certain ages. In this investigation, specific facial component had no effect on interpretation of facial expressions. Rather than exhibiting patterns of overall arrested development, these sexually abused girls seemed to focus upon selected emotions when interpreting facial expressions. Findings regarding this selectivity of emotions or heightened awareness of particular emotions (e.g., anger) may be quite useful in understanding the effects of sexual abuse and in the advancement of treatment for sexual abuse victims.

18.
We investigated how power priming affects facial emotion recognition in the context of body postures conveying the same or different emotion. Facial emotions are usually recognized better when the face is presented with a congruent body posture, and recognized worse when the body posture is incongruent. In our study, we primed participants to either low, high, or neutral power prior to their performance in a facial-emotion categorization task in which faces were presented together with a congruent or incongruent body posture. Facial emotion recognition in high-power participants was not affected by body posture. In contrast, low-power and neutral-power participants were significantly affected by the congruence of facial and body emotions. Specifically, these participants displayed better facial emotion recognition when the body posture was congruent, and worse performance when the body posture was incongruent. In a following task, we trained the same participants to categorize two sets of novel checkerboard stimuli and then engaged them in a recognition test involving compounds of these stimuli. High, low, and neutral-power participants all showed a strong congruence effect for compound checkerboard stimuli. We discuss our results with reference to the literature on power and social perception.

19.
The relation between knowledge of American Sign Language (ASL) and the ability to decode facial expressions of emotion was explored in this study. Subjects were 60 college students, half of whom were intermediate level students of ASL and half of whom had no exposure to a signed language. Subjects viewed and judged silent video segments of stimulus persons experiencing spontaneous emotional reactions representing either happiness, sadness, anger, disgust, or fear/surprise. Results indicated that hearing subjects knowledgeable in ASL were generally better than hearing non-signers at identifying facial expressions of emotion, although there were variations in decoding accuracy regarding the specific emotion being judged. In addition, females were more successful decoders than males. Results have implications for better understanding the nature of nonverbal communication in deaf and hearing individuals. We are grateful to Karl Scheibe for comments on an earlier version of this paper and to Erik Coats for statistical analysis. This study was conducted as part of a Senior Honors thesis at Wesleyan University.

20.
We investigated whether the well-documented babyface stereotype is moderated by facial movement or expression. Impressions of more babyfaced women as warmer and less dominant were weaker when faces were moving than when they were static. These moderating effects of facial movement were consistent with its tendency to reduce the perceived anger of low babyfaced women. Impressions of more babyfaced women as less dominant were equally strong whether faces showed a neutral or surprised expression, but impressions of them as warmer were significant only for neutral expressions. The moderating effect of facial expression on impressions of warmth was consistent with the tendency for surprise expressions to attenuate differences in the perceived babyfaceness of high and low babyfaced people. Theoretical interpretations and practical implications are discussed.
