Similar Documents (20 results)
1.
Because of the close connection between culture and language, a number of writers have suggested that bilinguals will differ in their behavior because of differences in the degree of assimilation of different cultures in the same individual. We tested this notion by obtaining data from bilingual (English and Hindi) college students in India using a well-studied cross-cultural research paradigm involving emotional perception. Subjects judged universal facial expressions of emotion in two separate sessions, one conducted entirely in English, the other in Hindi. In each session, they judged which emotion was being portrayed, and how intensely. Subjects recognized anger, fear, and sadness more accurately in English than in Hindi. They also attributed greater intensity to female photos of anger when rating in Hindi, but attributed greater intensity to female photos of sadness when rating in English. These findings were discussed in relation to the theoretical connection between culture and language.

2.
A procedure for improving children's skill in decoding facial expressions of emotion was studied in this experiment. In the first phase of the study, thirty-six fifth and sixth grade children watched video segments showing facial expressions of stimulus persons experiencing happiness, sadness, or fear and tried to identify each stimulus person's emotion. Subjects assigned to the feedback condition were given the correct answer for each segment, and subjects assigned to the no feedback condition received no information. Results for the second phase of the experiment, in which subjects' decoding skills were assessed, showed that the feedback method was effective in improving general decoding abilities. Furthermore, differences between subjects in the feedback and no feedback conditions were affected by subjects' sex and the specific emotion being decoded.

Portions of this study were presented at the annual meeting of the Eastern Psychological Association, Boston, April, 1989. This study was funded by a grant from the Marks Meadow Research Foundation, as well as through ongoing support from the National Institute of Disability and Rehabilitation Research, U.S. Department of Education, to the second author.

3.
Ethnic bias in the recognition of facial expressions
Ethnic bias in the recognition of facial expressions was assessed by having college students from the United States and Zambia assign emotion labels to facial expressions produced by imitation by United States and Zambian students. Bidirectional ethnic bias was revealed by the fact that Zambian raters labeled the Zambian facial expressions with less uncertainty than the U.S. facial expressions, and that U.S. raters labeled the U.S. facial expressions with less uncertainty than the Zambian facial expressions. In addition, the Facial Action Coding System was used to assess accuracy in the imitation of facial expressions. These results and the results of other analyses of recognition accuracy are reported.

Portions of this paper were presented at the annual meeting of the Society for Cross-Cultural Research, Philadelphia, Pennsylvania, February 1980 (Note 1).

4.
The goal of this study was to examine whether individual differences in the intensity of facial expressions of emotion are associated with individual differences in the voluntary control of facial muscles. Fifty college students completed a facial mimicry task, and were judged on the accuracy and intensity of their facial movements. Self-reported emotional experience was measured after subjects viewed positive and negative affect-eliciting filmclips, and intensity of facial expressiveness was measured from videotapes recorded while the subjects viewed the filmclips. There were significant sex differences in both facial mimicry task performance and responses to the filmclips. Accuracy and intensity scores on the mimicry task, which were not significantly correlated with one another, were both positively correlated with the intensity of facial expressiveness in response to the filmclips, but were not associated with reported experiences.

We wish to thank the Editor and two anonymous reviewers for their helpful comments on an earlier draft of this paper.

5.
American Sign Language (ASL) uses the face to express grammar and inflection, in addition to emotion. Research in this area has mostly used photographic stimuli. The purpose of this paper is to present data on how deaf signers and hearing non-signers recognize and categorize a variety of communicative facial expressions in ASL using dynamic stimuli rather than static pictures. Stimuli included six expression types chosen because they share overt similarities but express different content. Hearing participants were more accurate in their categorizations but expressed overall lower confidence regarding their performance.

6.
Adults' perceptions provide information about the emotional meaning of infant facial expressions. This study asks whether similar facial movements influence adult perceptions of emotional intensity in both infant positive (smile) and negative (cry face) facial expressions. Ninety‐five college students rated a series of naturally occurring and digitally edited images of infant facial expressions. Naturally occurring smiles and cry faces involving the co‐occurrence of greater lip movement, mouth opening, and eye constriction were rated as expressing stronger positive and negative emotion, respectively, than expressions without these three features. Ratings of digitally edited expressions indicated that eye constriction contributed to higher ratings of positive emotion in smiles (i.e., in Duchenne smiles) and greater eye constriction contributed to higher ratings of negative emotion in cry faces. Stronger mouth opening contributed to higher ratings of arousal in both smiles and cry faces. These findings indicate that a set of similar facial movements is linked to perceptions of greater emotional intensity, whether the movements occur in positive or negative infant emotional expressions. This proposal is discussed with reference to discrete, componential, and dynamic systems theories of emotion.

7.
Facial expressions of emotion influence interpersonal trait inferences
Theorists have argued that facial expressions of emotion serve the interpersonal function of allowing one animal to predict another's behavior. Humans may extend these predictions into the indefinite future, as in the case of trait inference. The hypothesis that facial expressions of emotion (e.g., anger, disgust, fear, happiness, and sadness) affect subjects' interpersonal trait inferences (e.g., dominance and affiliation) was tested in two experiments. Subjects rated the dispositional affiliation and dominance of target faces with either static or apparently moving expressions. They inferred high dominance and affiliation from happy expressions, high dominance and low affiliation from angry and disgusted expressions, and low dominance from fearful and sad expressions. The findings suggest that facial expressions of emotion convey not only a target's internal state, but also differentially convey interpersonal information, which could potentially seed trait inference.

This article constitutes a portion of my dissertation research at Stanford University, which was supported by a National Science Foundation Fellowship and an American Psychological Association Dissertation Award. Thanks to Nancy Alvarado, Chris Dryer, Paul Ekman, Bertram Malle, Susan Nolen-Hoeksema, Steven Sutton, Robert Zajonc, and more anonymous reviewers than I can count on one hand for their comments.

8.
To examine individual differences in decoding facial expressions, college students judged type and emotional intensity of emotional faces at five intensity levels and completed questionnaires on family expressiveness, emotionality, and self-expressiveness. For decoding accuracy, family expressiveness was negatively related, with strongest effects for more prototypical faces, and self-expressiveness was positively related. For perceptions of emotional intensity, family expressiveness was positively related, emotionality tended to be positively related, and self-expressiveness tended to be negatively related; these findings were all qualified by level of ambiguity/clarity of the facial expressions. Decoding accuracy and perceived emotional intensity also related positively with each other. Results suggest that even simple facial judgments are made through an interpretive lens partially created by family expressiveness, individuals’ own emotionality, and self-expressiveness.

9.
The current study examined the effects of institutionalization on the discrimination of facial expressions of emotion in three groups of 42‐month‐old children. One group consisted of children abandoned at birth who were randomly assigned to Care‐as‐Usual (institutional care) following a baseline assessment. Another group consisted of children abandoned at birth who were randomly assigned to high‐quality foster care following a baseline assessment. A third group consisted of never‐institutionalized children who were reared by their biological parents. All children were familiarized to happy, sad, fearful, and neutral facial expressions and tested on their ability to discriminate familiar versus novel facial expressions. Contrary to our prediction, all three groups of children were equally capable of discriminating among the different expressions. Furthermore, in contrast to findings at 13–30 months of age, these same children showed familiarity rather than novelty preferences toward different expressions. There were also asymmetries in children’s discrimination of facial expressions depending on which facial expression served as the familiar versus novel stimulus. Collectively, early institutionalization appears not to impact the development of the ability to discriminate facial expressions of emotion, at least when preferential looking serves as the dependent measure. These findings are discussed in the context of the myriad domains that are affected by early institutionalization.

10.
Darwin (1872) hypothesized that some facial muscle actions associated with emotion cannot be consciously inhibited, particularly when the to-be-concealed emotion is strong. The present study investigated emotional “leakage” in deceptive facial expressions as a function of emotional intensity. Participants viewed low or high intensity disgusting, sad, frightening, and happy images, responding to each with a 5 s videotaped genuine or deceptive expression. Each 1/30 s frame of the 1,711 expressions (256,650 frames in total) was analyzed for the presence and duration of universal expressions. Results strongly supported the inhibition hypothesis. In general, emotional leakage lasted longer in both the upper and lower face during high-intensity masked, relative to low-intensity, masked expressions. High intensity emotion was more difficult to conceal than low intensity emotion during emotional neutralization, leading to a greater likelihood of emotional leakage in the upper face. The greatest and least amount of emotional leakage occurred during fearful and happiness expressions, respectively. Untrained observers were unable to discriminate genuine and deceptive expressions above the level of chance.
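The frame totals reported in this abstract are internally consistent, as a quick back-of-the-envelope check shows (a sketch; the 30 frames-per-second rate is inferred from the stated 1/30 s frame interval):

```python
# Sanity check on the frame counts reported in the deception study above:
# 5 s clips sampled at 1/30 s per frame, 1,711 expressions in total.
FRAME_INTERVAL_S = 1 / 30   # each analyzed frame spans 1/30 s (i.e., 30 fps)
CLIP_DURATION_S = 5         # each videotaped expression lasted 5 s
N_EXPRESSIONS = 1711        # total genuine and deceptive expressions analyzed

frames_per_clip = round(CLIP_DURATION_S / FRAME_INTERVAL_S)
total_frames = N_EXPRESSIONS * frames_per_clip

print(frames_per_clip)  # 150 frames per clip
print(total_frames)     # 256650, matching the reported total
```

Each 5 s clip thus contributes 150 analyzed frames, and 1,711 × 150 reproduces the 256,650 frames the authors report.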

11.
To study the effects of gender on ability to recognize facial expressions of emotion, two separate samples of male and female undergraduates (727 in Study 1, 399 in Study 2) judged 120 color photographs of people posing one of four negative emotions: anger, disgust, fear, and sadness. Overall, females exceeded males in their ability to recognize emotions whether expressed by males or by females. As an exception, males were superior to females in recognizing male anger. The findings are discussed in terms of social sex-roles.

Portions of this paper were presented at the Annual Convention of the American Psychological Association, New York, August 1987.

12.
One of the most prevalent problems in face transplant patients is an inability to generate facial expression of emotions. The purpose of this study was to measure the subjective recognition of patients’ emotional expressions by other people. We examined facial expression of six emotions in two facial transplant patients (patient A = partial, patient B = full) and one healthy control using video clips to evoke emotions. We recorded target subjects’ facial expressions with a video camera while they were watching the clips. These were then shown to a panel of 130 viewers and rated in terms of degree of emotional expressiveness on a 7-point Likert scale. The scores for emotional expressiveness were higher for the healthy control than they were for patients A and B, and these varied as a function of emotion. The most recognizable emotion was happiness. The least recognizable emotions in Patient A were fear, surprise, and anger. The expressions of Patient B scored lower than those of Patient A and the healthy control. The findings show that partial and full-face transplant patients may have difficulties in generating facial expression of emotions even if they can feel those emotions, and different parts of the face seem to play critical roles in different emotional expressions.

13.
Sex, age and education differences in facial affect recognition were assessed within a large sample (n = 7,320). Results indicate superior performance by females and younger individuals in the correct identification of facial emotion, with the largest advantage for low intensity expressions. Though there were no demographic differences for identification accuracy on neutral faces, controlling for response biases by males and older individuals to label faces as neutral revealed sex and age differences for these items as well. This finding suggests that inferior facial affect recognition performance by males and older individuals may be driven primarily by instances in which they fail to detect the presence of emotion in facial expressions. Older individuals also demonstrated a greater tendency to label faces with negative emotion choices, while females exhibited a response bias toward sad and fearful labels. These response biases have implications for understanding demographic differences in facial affect recognition.

14.
Subjects were exposed to a sequence of facial expression photographs to determine whether viewing earlier expressions in a sequence would alter intensity judgments of a final expression in the sequence. Results showed that whether the preceding expressions were shown by the same person who displayed the final expression, or different people, intensity ratings of both sad and happy final expressions were enhanced when preceded by a sequence of contrasting as opposed to similar or identical facial expressions. Results are discussed from the perspective of adaptation-level theory.

15.
When we perceive the emotions of other people, we extract much information from the face. The present experiment used FACS (Facial Action Coding System), which is an instrument that measures the magnitude of facial action from a neutral face to a changed, emotional face. Japanese undergraduates judged the emotion in pictures of 66 static Japanese male faces (11 static pictures for each of six basic expressions: happiness, surprise, fear, anger, sadness, and disgust), ranging from neutral faces to maximally expressed emotions. The stimuli had previously been scored with FACS and were presented in random order. A high correlation between the subjects' judgments of facial expressions and the FACS scores was found.

16.
The present study examined effects of temporarily salient and chronic self-construal on decoding accuracy for positive and negative facial expressions of emotion. We primed independent and interdependent self-construal in a sample of participants who then rated the emotion expressions of a central character (target) in a cartoon showing a happy, sad, angry, or neutral facial expression in a group setting. Primed interdependence was associated with lower recognition accuracy for negative emotion expressions. Primed and chronic self-construal interacted such that for interdependence primed participants, higher chronic interdependence was associated with lower decoding accuracy for negative emotion expressions. Chronic independent self-construal was associated with higher decoding accuracy for negative emotion. These findings add to an increasing literature that highlights the significance of perceivers’ socio-cultural factors, self-construal in particular, for emotion perception.

17.
Substantial research has documented the universality of several emotional expressions. However, recent findings have demonstrated cultural differences in level of recognition and ratings of intensity. When testing cultural differences, stimulus sets must meet certain requirements. Matsumoto and Ekman's Japanese and Caucasian Facial Expressions of Emotion (JACFEE) is the only set that meets these requirements. The purpose of this study was to obtain judgment reliability data on the JACFEE, and to test for possible cross-national differences in judgments as well. Subjects from Hungary, Japan, Poland, Sumatra, United States, and Vietnam viewed the complete JACFEE photo set and judged which emotions were portrayed in the photos and rated the intensity of those expressions. Results revealed high agreement across countries in identifying the emotions portrayed in the photos, demonstrating the reliability of the JACFEE. Despite high agreement, cross-national differences were found in the exact level of agreement for photos of anger, contempt, disgust, fear, sadness, and surprise. Cross-national differences were also found in the level of intensity attributed to the photos. No systematic variation due to either preceding emotion or presentation order of the JACFEE was found. Also, we found that grouping the countries into a Western/Non-Western dichotomy was not justified according to the data. Instead, the cross-national differences are discussed in terms of possible sociopsychological variables that influence emotion judgments.

18.
Do infants show distinct negative facial expressions for different negative emotions? To address this question, European American, Chinese, and Japanese 11‐month‐olds were videotaped during procedures designed to elicit mild anger or frustration and fear. Facial behavior was coded using Baby FACS, an anatomically based scoring system. Infants' nonfacial behavior differed across procedures, suggesting that the target emotions were successfully elicited. However, evidence for distinct emotion‐specific facial configurations corresponding to fear versus anger was not obtained. Although facial responses were largely similar across cultures, some differences also were observed. Results are discussed in terms of functionalist and dynamical systems approaches to emotion and emotional expression.

19.
It was tested whether professional actors would be able to communicate emotional meaning via facial expression when this is presented to judges without context information. Forty takes were selected from movies, depicting a female or a male actor in close-up showing facial expression. These close-ups were selected by two expert judges, who knew the complete movie and had to agree on the emotion expressed or expected to be expressed by the female or male actor in the respective take. Five takes each were selected to represent the basic emotions of joy, sadness, fear, and anger. Twenty takes each were selected showing female and male actors. These 40 takes (edited in random order onto video tape) were presented to 90 judges (about half female and half male; about half younger pupils and about half older ones), whose task it was to judge the emotion(s) expressed on nine 5-point emotion scales. Results indicated that female actors are somewhat better (though not statistically significant) in communicating emotion via facial expression without context than male actors are. Furthermore, significant interactions between portrayed emotion and gender of actors were found: While recognition rate for joy is very high and identical for both genders, female actors seem to be superior in communicating fear and sadness via facial expression, while male actors are more successful in communicating anger. A display rule approach seems to be appropriate for the interpretation of these results, indicating that gender-specific attempts to control/inhibit certain emotions do not only exist in real life, but also in movies.

The research reported here was supported by a grant of the Deutsche Forschungsgemeinschaft (WA 519/2-2). I thank Uwe Balser and Tina Mertesacker for their help in selecting the stimuli, preparing the judgment tapes, recruiting subjects and conducting the study.

20.
To examine children's ability to control their affective expression facially, 68 second- and fourth-grade boys and girls were unobtrusively videotaped while discussing six self-chosen activities about which they felt positively, neutrally, or negatively. Children then performed three facial management tasks: (a) inhibition (showing no emotion instead of a felt emotion); (b) simulation (showing an emotion when not feeling anything); and (c) masking (showing an emotion that is different than what is felt). Twelve raters judged the edited tapes of the children's performance for affect positivity and deceptiveness. Repeated measures ANOVAs indicated, in contrast to previous research, that children were highly competent in managing facial displays. To understand children's techniques for managing affective displays, two raters categorized the primary cognitive strategies children used. Results showed that fourth graders used more complex strategies more often than second graders. These results highlight children's skill and strategies in affect management.

Funding for this project was provided by a NICHHD grant (#HD22367) to the first author. We gratefully acknowledge the assistance of Nancy A. Ballard and Michael G. Rakouskas in data collection and preparation. We also thank the children whose participation and cooperation made this research possible.
