Similar Articles
20 similar articles found (search time: 15 ms)
1.
2.
The Intensity of Emotional Facial Expressions and Decoding Accuracy (total citations: 2; self-citations: 0; citations by others: 2)
The influence of the physical intensity of emotional facial expressions on perceived intensity and emotion category decoding accuracy was assessed for expressions of anger, disgust, sadness, and happiness. The facial expressions of two men and two women posing each of the four emotions were used as stimuli. Six different levels of intensity of expression were created for each pose using a graphics morphing program. Twelve men and 12 women rated each of the 96 stimuli for perceived intensity of the underlying emotion and for the qualitative nature of the emotion expressed. The results revealed that perceived intensity varied linearly with the manipulated physical intensity of the expression. Emotion category decoding accuracy varied largely linearly with the manipulated physical intensity of the expression for expressions of anger, disgust, and sadness. For the happiness expressions only, the findings were consistent with a categorical judgment process. Sex of encoder produced significant effects for both dependent measures. These effects remained even after possible gender differences in encoding were controlled for, suggesting a perceptual bias on the part of the decoders.
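The graded-intensity stimuli described above were produced with a graphics morphing program that the abstract does not name. As an informal illustration only, the sketch below approximates intensity levels by linearly blending an aligned neutral photograph with the apex expression of the same poser; true morphing additionally warps facial geometry, and the file names and blend weights here are assumptions, not details from the study.

```python
import numpy as np
from PIL import Image

def make_intensity_levels(neutral_path, apex_path, weights):
    """Blend a neutral face with an apex expression at several weights.

    weights: floats in [0, 1]; 0 = fully neutral, 1 = full expression.
    Returns one PIL image per weight. Images must be pre-aligned.
    """
    neutral = np.asarray(Image.open(neutral_path).convert("RGB"), dtype=float)
    apex = np.asarray(Image.open(apex_path).convert("RGB"), dtype=float)
    if neutral.shape != apex.shape:
        raise ValueError("images must be aligned and equally sized")
    levels = []
    for w in weights:
        blend = (1.0 - w) * neutral + w * apex  # pixel-wise linear mix
        levels.append(Image.fromarray(blend.astype(np.uint8)))
    return levels

# Hypothetical usage: six equally spaced intensity steps for one poser/emotion.
if __name__ == "__main__":
    steps = np.linspace(1 / 6, 1.0, 6)  # roughly 17% ... 100% intensity
    images = make_intensity_levels("poser1_neutral.png", "poser1_anger.png", steps)
    for w, img in zip(steps, images):
        img.save(f"poser1_anger_{int(round(w * 100)):03d}.png")
```

The blend weight plays the role of the "manipulated physical intensity" whose linear relation to perceived intensity the study examines.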

3.
Lipps (1907) presented a model of empathy which had an important influence on later formulations. According to Lipps, individuals tend to mimic an interaction partner's behavior, and this nonverbal mimicry induces—via a feedback process—the corresponding affective state in the observer. The resulting shared affect is believed to foster the understanding of the observed person's self. The present study tested this model in the context of judgments of emotional facial expressions. The results confirm that individuals mimic emotional facial expressions, and that the decoding of facial expressions is accompanied by shared affect. However, no evidence was found that emotion recognition accuracy or shared affect is mediated by mimicry. Yet voluntary mimicry was found to have some limited influence on observers' assessments of the observed person's personality. The implications of these results with regard to Lipps' original hypothesis are discussed.

4.
The purpose of this study was to determine whether it is more difficult to decode facial expressions of pain in older than in younger adults. The facial expressions of 10 younger and 10 older chronic pain patients undergoing a painful diagnostic test were viewed on videotape by untrained judges. Judges estimated the severity of pain being experienced by the patients. Ratings made of the older faces during painful moments described more pain, and appeared more accurate, than those made of younger faces. Judges also reported seeing more pain in posed, masked, and baseline facial expressions in the older adults. Age-related structural changes to the face were not responsible for this bias. This suggests that judges were predisposed to see pain in the faces of the older patients, and undermines the assumption that their ratings of pain in the painful moment segments were accurate.

5.
The impact of singular (e.g. sadness alone) and compound (e.g. sadness and anger together) facial expressions on individuals' recognition of faces was investigated. In three studies, a face recognition paradigm was used as a measure of the proficiency with which participants processed compound and singular facial expressions. For both positive and negative facial expressions, participants displayed greater proficiency in processing compound expressions relative to singular expressions. Specifically, the accuracy with which faces displaying compound expressions were recognized was significantly higher than the accuracy with which faces displaying singular expressions were recognized. Possible explanations involving the familiarity, distinctiveness, and salience of the facial expressions are discussed.

6.
Izard (2004/this issue) clarifies the position of differential emotions theory by proposing a distinction between hard and soft versions of event-emotion expression relations. We concur that the best design to examine situational specificity in facial expressions is one that utilizes multiple stimulus situations assessed over multiple occasions and ages. However, the problem of how to identify, a priori, a family of stimulus situations remains. We offer an example from our own recent work demonstrating how facial expressions and physiological indexes may converge to indicate the presence of a meaningful family of stimulus situations. Specifically, we found evidence for a family of frustrating, goal-blocking events that elicited expressions and cortisol responses indicative of anger at 4 months. Yet, individual differences exist in that these situations also elicited expressions and cortisol changes indicative of sadness. Identification of a more comprehensive set of such situations throughout infancy will allow researchers to more systematically examine the degree to which situational specificity of emotions is present.

7.
Categorical perception, demonstrated as reduced discrimination of within-category relative to between-category differences in stimuli, has been found in a variety of perceptual domains in adults. To examine the development of categorical perception in the domain of facial expression processing, we used behavioral and event-related potential (ERP) methods to assess discrimination of within-category (happy-happy) and between-category (happy-sad) differences in facial expressions in 7-month-old infants. Data from a visual paired-comparison test and recordings of attention-sensitive ERPs showed no discrimination of facial expressions in the within-category condition, whereas reliable discrimination was observed in the between-category condition. The results also showed that face-sensitive ERPs over occipital-temporal scalp (P400) were attenuated in the within-category condition relative to the between-category condition, suggesting a potential neural basis for the reduced within-category sensitivity. Together, these results suggest that the neural systems underlying categorical representation of facial expressions emerge during the early stages of postnatal development, before acquisition of language.

8.
Does mood influence people’s tendency to accept observed facial expressions as genuine? Based on recent theories of affect and cognition, two experiments predicted and found that negative mood increased and positive mood decreased people’s skepticism about the genuineness of facial expressions. After a mood induction, participants viewed images of faces displaying (a) positive, neutral, and negative expressions (Exp. 1), or (b) displays of six specific emotions (Exp. 2). Judgments of genuineness, valence, and confidence ratings were collected. As predicted, positive affect increased, and negative affect decreased, the perceived genuineness of facial expressions, and there was some evidence for affect-congruence in judgments. The relevance of these findings for everyday nonverbal communication and strategic interpersonal behavior is considered, and their implications for recent affect-cognition theories are discussed.

9.
We assessed the impact of social context on the judgment of emotional facial expressions as a function of self-construal and decoding rules. German and Greek participants rated spontaneous emotional faces shown either alone or surrounded by other faces with congruent or incongruent facial expressions. Greek participants were higher in interdependence than German participants. In line with cultural decoding rules, Greek participants rated anger expressions less intensely and sad and disgust expressions more intensely. Social context affected the ratings by both groups in different ways. In the more interdependent culture (Greece) participants perceived anger least intensely when the group showed neutral expressions, whereas sadness expressions were rated as most intense in the absence of social context. In the independent culture (Germany) a group context (others expressing anger or happiness) additionally amplified the perception of angry and happy expressions. In line with the notion that these effects are mediated by more holistic processing linked to higher interdependence, this difference disappeared when we controlled for interdependence on the individual level. The findings confirm the usefulness of considering both country level and individual level factors when studying cultural differences.

10.
Previous studies have reported that the expression of smiles is facilitated by social interaction between partners. We examined the effects of social interaction and personal relationships on facial expressions in Japan. Pairs of friends and strangers seated next to each other (the no partition condition) or separated by a partition (the partition condition) were shown film clips intended to elicit either positive or negative affect. Smiles were facilitated in the no partition condition, in which participants interacted with each other. Further, the effect of social interaction on frowns differed depending on whether pairs were friends or strangers.

11.
To investigate the perception of emotional facial expressions, researchers rely on shared sets of photos or videos, most often generated by actor portrayals. The drawback of such standardized material is a lack of flexibility and controllability, as it does not allow the systematic parametric manipulation of specific features of facial expressions on the one hand, and of more general properties of the facial identity (age, ethnicity, gender) on the other. To remedy this problem, we developed FACSGen: a novel tool that allows the creation of realistic synthetic 3D facial stimuli, both static and dynamic, based on the Facial Action Coding System. FACSGen provides researchers with total control over facial action units, and corresponding informational cues in 3D synthetic faces. We present four studies validating both the software and the general methodology of systematically generating controlled facial expression patterns for stimulus presentation.

12.
We report data concerning cross-cultural judgments of emotion in spontaneously produced facial expressions. Americans, Japanese, British, and International Students in the US reliably attributed emotions to the expressions of Olympic judo athletes at the end of a match for a medal, and at two times during the subsequent medal ceremonies. There were some observer culture differences in absolute attribution agreement rates, but high cross-cultural agreement in differences in attribution rates across expressions (relative agreement rates). Moreover, we operationalized signal clarity and demonstrated that it was associated with agreement rates similarly in all cultures. Finally, we obtained judgments of won-lost match outcomes and medal finish, and demonstrated that the emotion judgments were associated with accuracy in judgments of outcomes. These findings demonstrated that members of different cultures reliably judge spontaneously expressed emotions, and that across observer cultures, lower absolute agreement rates are related to noise produced by non-emotional facial behaviors. Also, the findings suggested that observers of different cultures utilize the same facial cues when judging emotions, and that the signal value of facial expressions is similar across cultures.
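The distinction above between absolute and relative agreement rates can be made concrete: absolute agreement is the proportion of observers within a culture who choose the modal emotion label for a given expression, and relative agreement is how similarly those per-expression rates pattern across cultures. The sketch below is a generic illustration with invented judgment data, not the study's analysis.

```python
from collections import Counter
from statistics import correlation  # Pearson r; requires Python 3.10+

def absolute_agreement(labels):
    """Proportion of observers choosing the modal label for one expression."""
    counts = Counter(labels)
    return counts.most_common(1)[0][1] / len(labels)

def per_expression_rates(judgments):
    """judgments: dict mapping expression id -> list of labels from one culture."""
    return {expr: absolute_agreement(labels) for expr, labels in judgments.items()}

# Invented example: two observer cultures judging three expressions.
culture_a = {"e1": ["joy"] * 9 + ["surprise"],
             "e2": ["sadness"] * 6 + ["anger"] * 4,
             "e3": ["anger"] * 8 + ["disgust"] * 2}
culture_b = {"e1": ["joy"] * 7 + ["surprise"] * 3,
             "e2": ["sadness"] * 5 + ["anger"] * 5,
             "e3": ["anger"] * 6 + ["disgust"] * 4}

rates_a = per_expression_rates(culture_a)
rates_b = per_expression_rates(culture_b)

# Relative agreement: do the two cultures order the expressions similarly?
exprs = sorted(rates_a)
r = correlation([rates_a[e] for e in exprs], [rates_b[e] for e in exprs])
print(rates_a, rates_b, round(r, 2))
```

In this toy example the absolute rates differ between cultures, but the correlation of per-expression rates is high, mirroring the pattern the abstract reports.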

13.
This study replicated and extended previously reported sex differences involving both viewer and target in the recognition of threatening facial expressions. Based on the assumption that the evolved cognitive mechanisms mediating anger recognition would have been designed by natural selection to operate quickly in the interests of survival, brief tachistoscopic presentation of stimulus photographs was used. Additionally, in contrast to prior published studies, the statistical methods of signal detection research were used to control for the confounding effects of non-random guessing. The main hypothesis, that anger posed by males would be more accurately perceived than anger posed by females, was supported. A secondary hypothesis, that female-posed anger would be more accurately perceived by women than by men, received partial support. Testosterone levels, measured inferentially in terms of diurnal cycles, failed to show the hypothesized positive relationship to accuracy of anger perception.
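The "statistical methods of signal detection research" mentioned above typically mean separating sensitivity from guessing with the index d′ = z(hit rate) − z(false-alarm rate). The sketch below is a minimal, generic implementation with a log-linear correction for extreme rates; the example counts are invented, not taken from the study.

```python
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity index d' from a yes/no detection table.

    Applies the log-linear correction (add 0.5 to each cell) so that
    hit or false-alarm rates of exactly 0 or 1 do not yield infinite z-scores.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Invented example: one judge classifying briefly presented faces as "angry" or not.
print(round(d_prime(hits=18, misses=2, false_alarms=5, correct_rejections=15), 2))
```

Unlike raw percent correct, d′ stays comparable across judges who differ only in how liberally they say "angry".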

14.
The present study examined the impact of conflict over emotional expression on the nonverbal communication process between romantic partners. Fifty-four romantically involved female undergraduate students who scored within the upper or lower 30th percentile range on the Ambivalence over the Expression of Emotion Questionnaire (AEQ; King & Emmons, 1990) were recruited along with their romantic partners. The facial expressions of these women were examined during a conflict resolution task. Analyses indicated that highly ambivalent women expressed a greater number of negative facial expressions and shorter lasting positive facial expressions (measured with FACES; Kring & Sloan, 1992) than less ambivalent women. These expressions were not entirely explained by current mood, as ambivalence predicted a greater number of negative facial expressions, and a briefer display of positive facial expressions, above and beyond current levels of negative and positive affect. Furthermore, analyses indicated that the number of women's negative expressions predicted significant increases in men's dysphoria and marginal increases in men's anxiety, suggesting potential negative interactional patterns between ambivalent women and their partners.

15.
The current study examined the effects of institutionalization on the discrimination of facial expressions of emotion in three groups of 42-month-old children. One group consisted of children abandoned at birth who were randomly assigned to Care-as-Usual (institutional care) following a baseline assessment. Another group consisted of children abandoned at birth who were randomly assigned to high-quality foster care following a baseline assessment. A third group consisted of never-institutionalized children who were reared by their biological parents. All children were familiarized to happy, sad, fearful, and neutral facial expressions and tested on their ability to discriminate familiar versus novel facial expressions. Contrary to our prediction, all three groups of children were equally capable of discriminating among the different expressions. Furthermore, in contrast to findings at 13–30 months of age, these same children showed familiarity rather than novelty preferences toward different expressions. There were also asymmetries in children’s discrimination of facial expressions depending on which facial expression served as the familiar versus novel stimulus. Collectively, early institutionalization appears not to impact the development of the ability to discriminate facial expressions of emotion, at least when preferential looking serves as the dependent measure. These findings are discussed in the context of the myriad domains that are affected by early institutionalization.

16.
Facial expressions related to sadness are a universal signal of nonverbal communication. Although results of many psychology studies have shown that drooping of the lip corners, raising of the chin, and oblique eyebrow movements (a combination of inner brow raising and brow lowering) express sadness, no report has described a study elucidating facial expression characteristics under well-controlled circumstances with people actually experiencing the emotion of sadness itself. Therefore, spontaneous facial expressions associated with sadness remain unclear. We conducted this study to accumulate important findings related to spontaneous facial expressions of sadness. We recorded the spontaneous facial expressions of a group of participants as they experienced sadness during an emotion-elicitation task. This task required a participant to recall neutral and sad memories while listening to music. We subsequently conducted a detailed analysis of their sad and neutral expressions using the Facial Action Coding System. The prototypical facial expressions of sadness in earlier studies were not observed when people experienced sadness as an internal state under non-social circumstances. By contrast, they expressed tension around the mouth, which might function as a form of suppression. Furthermore, results show that parts of these facial actions are not only related to sad experiences but also to other emotional experiences such as disgust, fear, anger, and happiness. This study revealed the possibility that new facial expressions contribute to the experience of sadness as an internal state.

17.
Previous research has demonstrated that individuals who were accurate at recognizing facial expressions of emotions reported better relationships with family and friends. The purpose of the present study was to test whether the ability to recognize facial expressions of negative emotions predicted greater satisfaction with one's romantic relationship and whether this link was mediated by constructive responses to conflict. Participants currently involved in a romantic relationship completed a validated performance measure of recognition of facial expressions and afterwards reported on the responses they engaged in during conflict with their romantic partner and rated their romantic relationship satisfaction. Results showed that accurate recognition of facial expressions of negative emotions (anger, contempt, disgust, fear, and sadness) predicted fewer conflict-engaging behaviors during conflict with romantic partners (but not positive problem solving or withdrawal), which in turn predicted greater relationship satisfaction. The present study is the first to show that the ability to recognize facial expressions of negative emotions is related to romantic relationship satisfaction and that constructive responses to conflict, such as fewer conflict-engaging behaviors, mediate this process.
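Mediation of the kind described above (recognition accuracy → fewer conflict-engaging behaviors → greater satisfaction) is commonly tested by estimating the indirect effect a·b and bootstrapping its confidence interval. The sketch below shows that generic procedure with ordinary least squares on simulated data; the variable names, effect sizes, and data are assumptions for illustration, not the study's.

```python
import numpy as np

def ols_slope(x, y):
    """Slope of y regressed on x (with an intercept), via least squares."""
    X = np.column_stack([np.ones_like(x), x])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef[1]

def indirect_effect(x, m, y):
    """a*b: path a is x -> m; path b is m -> y controlling for x."""
    a = ols_slope(x, m)
    X = np.column_stack([np.ones_like(x), x, m])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    b = coef[2]
    return a * b

def bootstrap_ci(x, m, y, n_boot=2000, seed=0):
    """Percentile bootstrap 95% CI for the indirect effect."""
    rng = np.random.default_rng(seed)
    n = len(x)
    effects = [indirect_effect(x[idx], m[idx], y[idx])
               for idx in (rng.integers(0, n, n) for _ in range(n_boot))]
    return np.percentile(effects, [2.5, 97.5])

# Simulated data with the hypothesized structure (illustration only).
rng = np.random.default_rng(1)
recognition = rng.normal(size=200)                     # accuracy for negative emotions
conflict = -0.5 * recognition + rng.normal(size=200)   # fewer conflict-engaging behaviors
satisfaction = -0.6 * conflict + rng.normal(size=200)  # higher relationship satisfaction

print("indirect effect:", round(indirect_effect(recognition, conflict, satisfaction), 3))
print("95% bootstrap CI:", bootstrap_ci(recognition, conflict, satisfaction))
```

A confidence interval for a·b that excludes zero is the usual evidence that the mediator carries part of the predictor-outcome association.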

18.
Participants’ faces were covertly recorded while they rated the attractiveness of people, the decorative appeal of paintings, and the cuteness of animals. Ratings employed a continuous scale. The same participants then returned and tried to guess ratings from 3-s videotapes of themselves and other targets. Performance was above chance in all three stimulus categories, thereby replicating the results of an earlier study (North et al. in J Exp Soc Psychol 46(6):1109–1113, 2010) but this time using a more sensitive rating procedure. Across conditions, accuracy in reading one’s own face was not reliably better than other-accuracy. We discuss our findings in the context of “simulation” theories of face-based emotion recognition (Goldman in The philosophy, psychology, and neuroscience of mindreading. Oxford University Press, Oxford, 2006) and the larger body of accuracy research.

19.
Journal of Nonverbal Behavior - Although emotion expressions are typically dynamic and include the whole person, much emotion recognition research uses static, posed facial expressions. In this...

20.
This work constitutes a systematic review of the empirical literature on emotional facial expressions displayed in the context of family interaction. Searches of electronic databases from January 1990 until December 2016 generated close to 4400 articles, of which only 26 met the inclusion criteria. The evidence indicates that affective expressions were mostly examined through laboratory and naturalistic observations, within a wide range of interactive contexts in which mother–child dyads significantly outnumbered father–child dyads. Moreover, dyadic partners were found to match each other's displays, with positive and neutral facial expressions proving more frequent than negative facial expressions. Finally, researchers observed some developmental and gender differences regarding the frequency, intensity, and category of emotional displays and identified certain links among facial expression behavior, family relations, personal adjustment, and peer-related social competence.
