Similar literature
20 similar documents found.
1.
Few attempts have been made since the pioneering work of Ekman et al. (1980) to examine the development of the deliberate control of facial action units in children. We report here two studies concerned with this issue. In Study 1, we investigated children’s ability to activate facial action units involved in sadness and happiness expressions as well as combinations of these action units. In Study 2, we examined children’s ability to pose happiness and sadness with their face, without telling them which action unit to activate. The children who took part in this study were simply asked to portray happiness and sadness as convincingly as possible. The results of Study 1 indicate a strong developmental progression in children’s ability to produce elementary facial components of both emotions as well as in their ability to produce a combination of the elements in the case of happiness. In agreement with prior research in motor development, several non-target action units were also activated when children performed the task. Their occurrence persisted throughout childhood, indicating limitations in the finer motor control achieved by children across ages. The results obtained in Study 2 paralleled those obtained in Study 1 in many respects, providing evidence that the children used the technique of deliberate action to pose the two target emotions.

2.
Darwin (1872) hypothesized that some facial muscle actions associated with emotion cannot be consciously inhibited, particularly when the to-be-concealed emotion is strong. The present study investigated emotional “leakage” in deceptive facial expressions as a function of emotional intensity. Participants viewed low- or high-intensity disgusting, sad, frightening, and happy images, responding to each with a 5 s videotaped genuine or deceptive expression. Each 1/30 s frame of the 1,711 expressions (256,650 frames in total) was analyzed for the presence and duration of universal expressions. Results strongly supported the inhibition hypothesis. In general, emotional leakage lasted longer in both the upper and lower face during high-intensity masked expressions than during low-intensity masked expressions. High-intensity emotion was more difficult to conceal than low-intensity emotion during emotional neutralization, leading to a greater likelihood of emotional leakage in the upper face. The greatest and least amounts of emotional leakage occurred during fearful and happiness expressions, respectively. Untrained observers were unable to discriminate real and false expressions above the level of chance.
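As a rough illustration of the frame-based coding described in this abstract, the Python sketch below (not from the study itself) shows how per-frame leakage codes at 30 fps translate into leakage durations, and how the reported frame total follows from 1,711 five-second expressions; the function name and data layout are assumptions made for illustration only.

FRAME_RATE = 30  # frames per second; each coded frame spans 1/30 s, as in the study

def leakage_duration_ms(leak_flags):
    # leak_flags: per-frame booleans marking frames in which a non-target
    # (leaked) universal expression was coded as present
    return sum(leak_flags) * (1000.0 / FRAME_RATE)

# A 5 s expression yields 150 frames; 1,711 expressions give 256,650 frames in total.
frames_per_expression = 5 * FRAME_RATE          # 150
total_frames = 1711 * frames_per_expression     # 256,650
print(leakage_duration_ms([False] * 140 + [True] * 10))  # 10 leaked frames ≈ 333 ms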

3.
4.
The aim of the current study was to investigate the influence of happy and sad mood on facial muscular reactions to emotional facial expressions. Following film clips intended to induce happy and sad mood states, participants observed faces with happy, sad, angry, and neutral expressions while their facial muscular reactions were recorded electromyographically. Results revealed that after watching the happy clip participants showed congruent facial reactions to all emotional expressions, whereas watching the sad clip led to a general reduction of facial muscular reactions. Results are discussed with respect to the information processing style underlying the lack of mimicry in a sad mood state and also with respect to the consequences for social interactions and for embodiment theories.

5.
One of the most prevalent problems in face transplant patients is an inability to generate facial expressions of emotions. The purpose of this study was to measure the subjective recognition of patients’ emotional expressions by other people. We examined facial expressions of six emotions in two facial transplant patients (patient A = partial, patient B = full) and one healthy control using video clips to evoke emotions. We recorded target subjects’ facial expressions with a video camera while they were watching the clips. These were then shown to a panel of 130 viewers and rated in terms of degree of emotional expressiveness on a 7-point Likert scale. The scores for emotional expressiveness were higher for the healthy control than they were for patients A and B, and these varied as a function of emotion. The most recognizable emotion was happiness. The least recognizable emotions in Patient A were fear, surprise, and anger. The expressions of Patient B scored lower than those of Patient A and the healthy control. The findings show that partial and full-face transplant patients may have difficulties in generating facial expressions of emotions even if they can feel those emotions, and different parts of the face seem to play critical roles in different emotional expressions.

6.
A comparative perspective has remained central to the study of human facial expressions since Darwin’s [(1872/1998). The expression of the emotions in man and animals (3rd ed.). New York: Oxford University Press] insightful observations on the presence and significance of cross-species continuities and species-unique phenomena. However, cross-species comparisons are often difficult to draw due to methodological limitations. We report the application of a common methodology, the Facial Action Coding System (FACS), to examine facial movement across two species of hominoids, namely humans and chimpanzees. FACS [Ekman & Friesen (1978). Facial action coding system. CA: Consulting Psychologists Press] has been employed to identify the repertoire of human facial movements. We demonstrate that FACS can be applied to other species, but highlight that any modifications must be based on both underlying anatomy and detailed observational analysis of movements. Here we describe the ChimpFACS and use it to compare the repertoire of facial movement in chimpanzees and humans. While the underlying mimetic musculature shows minimal differences, important differences in facial morphology impact upon the identification and detection of related surface appearance changes across these two species.
Correspondence: Sarah-Jane Vick

7.
Micro-expressions have gained a lot of attention because of their potential applications (e.g., transportation security) and theoretical implications (e.g., expression of emotions). However, the duration of micro-expressions, which is considered their most important characteristic, has not been firmly established. The present study provides evidence to define the duration of micro-expressions by collecting and analyzing fast facial expressions that leak genuine emotions. Participants were asked to neutralize their faces while watching emotional video episodes. Among the more than 1,000 elicited facial expressions, 109 leaked fast expressions (less than 500 ms) were selected and analyzed. The distribution curves of total duration and onset duration for the micro-expressions were presented. Based on the distribution and estimation, it seems suitable to define a micro-expression by a total duration of less than 500 ms or an onset duration of less than 260 ms. These findings may facilitate further studies of micro-expressions in the future.
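The duration criterion proposed in this abstract can be summarized as a simple rule. The Python sketch below is purely illustrative and not taken from the cited study; the function name and example values are assumptions.

def is_micro_expression(total_ms, onset_ms):
    # Criterion from the abstract: total duration under 500 ms,
    # or onset duration under 260 ms
    return total_ms < 500 or onset_ms < 260

print(is_micro_expression(total_ms=420, onset_ms=300))  # True: total under 500 ms
print(is_micro_expression(total_ms=650, onset_ms=200))  # True: onset under 260 ms
print(is_micro_expression(total_ms=800, onset_ms=350))  # False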

8.
We examined the effects of the temporal quality of smile displays on impressions and decisions made in a simulated job interview. We also investigated whether similar judgments were made in response to synthetic (Study 1) and human facial stimuli (Study 2). Participants viewed short video excerpts of female interviewees exhibiting dynamic authentic smiles, dynamic fake smiles, or neutral expressions, and rated them with respect to a number of attributes. In both studies, perceivers’ judgments and employment decisions were significantly shaped by the temporal quality of smiles, with dynamic authentic smiles generally leading to more favorable job, person, and expression ratings than dynamic fake smiles or neutral expressions. Furthermore, authentically smiling interviewees were judged to be more suitable and were more likely to be short-listed and selected for the job. The findings show a high degree of correspondence in the effects created by synthetic and human facial stimuli, suggesting that temporal features of smiles similarly influence perceivers’ judgments and decisions across the two types of stimulus.
Correspondence: Eva Krumhuber

9.
Facial expressions related to sadness are a universal signal of nonverbal communication. Although results of many psychology studies have shown that drooping of the lip corners, raising of the chin, and oblique eyebrow movements (a combination of inner brow raising and brow lowering) express sadness, no report has described a study elucidating facial expression characteristics under well-controlled circumstances with people actually experiencing the emotion of sadness itself. Therefore, spontaneous facial expressions associated with sadness remain unclear. We conducted this study to accumulate important findings related to spontaneous facial expressions of sadness. We recorded the spontaneous facial expressions of a group of participants as they experienced sadness during an emotion-elicitation task. This task required a participant to recall neutral and sad memories while listening to music. We subsequently conducted a detailed analysis of their sad and neutral expressions using the Facial Action Coding System. The prototypical facial expressions of sadness reported in earlier studies were not observed when people experienced sadness as an internal state under non-social circumstances. By contrast, participants expressed tension around the mouth, which might function as a form of suppression. Furthermore, results show that parts of these facial actions are related not only to sad experiences but also to other emotional experiences such as disgust, fear, anger, and happiness. This study revealed the possibility that new facial expressions contribute to the experience of sadness as an internal state.

10.
Facial symmetry is an index of developmental stability and shows a positive correlation with attractiveness assessment. However, the appearance of one’s facial symmetry is not always static and may change when there is facial movement while a person is speaking. This study examined whether viewing a dynamic image of a person speaking (where facial symmetry may alter) would elicit a different perception of attractiveness than viewing a static image of that person as a still photo. We examined changes in both measured and perceived facial symmetry in relation to attractiveness perception. We found that when facial movements created an appearance of overall greater facial symmetry while a person was speaking in a video, the person was rated as more attractive in the video than in a still photo. Likewise, those with facial movements measured and perceived as less symmetrical while speaking were rated as less attractive in a video clip than in a still photo. By examining the perception of faces in motion as we typically encounter others in real life, rather than considering only static photos, we have extended the ecological validity of the study of the perception of bilateral symmetry in humans as it relates to attractiveness.

11.
Funding relationships in nonprofit management are increasingly defined by a philosophy of rational management, characterized by measurement of outputs and benchmarking, which represents an audit culture system (Burnley, Matthews, & McKenzie, 2005). There is concern that these approaches are constantly undermining the mission of community service nonprofit organizations (Darcy, 2002). In this research, we analyzed the management of funding relationships by examining dynamics within a nonprofit funding relationship in New Zealand. Through focus groups we explored the relationship between 17 representatives from nonprofit organizations and four Board members of a funding Trust. The management of this funding relationship was characterized by an appreciation of the diverse nature of nonprofit organizations, a balance between trust and control, and communication. We suggest that elements of these dynamics could be incorporated into nonprofit funding relationships in order to challenge an over-reliance on audit culture systems, and to re-establish relationships characterized by interaction between nonprofit organizations and their funders. Finally, we call for future research in this area.

12.
Adults' perceptions provide information about the emotional meaning of infant facial expressions. This study asks whether similar facial movements influence adult perceptions of emotional intensity in both infant positive (smile) and negative (cry face) facial expressions. Ninety-five college students rated a series of naturally occurring and digitally edited images of infant facial expressions. Naturally occurring smiles and cry faces involving the co-occurrence of greater lip movement, mouth opening, and eye constriction were rated as expressing stronger positive and negative emotion, respectively, than expressions without these 3 features. Ratings of digitally edited expressions indicated that eye constriction contributed to higher ratings of positive emotion in smiles (i.e., in Duchenne smiles) and greater eye constriction contributed to higher ratings of negative emotion in cry faces. Stronger mouth opening contributed to higher ratings of arousal in both smiles and cry faces. These findings indicate that a set of similar facial movements is linked to perceptions of greater emotional intensity, whether the movements occur in positive or negative infant emotional expressions. This proposal is discussed with reference to discrete, componential, and dynamic systems theories of emotion.

13.
Physical abuse history has been demonstrated to have an effect upon the accuracy of interpretation of facial expressions, but the effects of sexual abuse have not been explored. Thus, the accuracy of interpretation and the role of different facial components in the interpretation of facial expressions were studied in sexually abused and non-abused girls. Twenty-nine sexually abused and 29 non-abused females, ranging in age from 5 to 9 years, chose schematic faces which best represented various emotional scenarios. Accuracy of interpretation of facial expression differed between sexually abused and non-abused girls only when the emotion portrayed was considered. A history of sexual abuse alone had no effect upon overall accuracy, but did influence performance on specific emotions, particularly at certain ages. In this investigation, specific facial components had no effect on interpretation of facial expressions. Rather than exhibiting patterns of overall arrested development, these sexually abused girls seemed to focus upon selected emotions when interpreting facial expressions. Findings regarding this selectivity of emotions or heightened awareness of particular emotions (e.g., anger) may be quite useful in understanding the effects of sexual abuse and in the advancement of treatment for sexual abuse victims.

14.
The goal of this study was to examine whether individual differences in the intensity of facial expressions of emotion are associated with individual differences in the voluntary control of facial muscles. Fifty college students completed a facial mimicry task, and were judged on the accuracy and intensity of their facial movements. Self-reported emotional experience was measured after subjects viewed positive and negative affect-eliciting film clips, and intensity of facial expressiveness was measured from videotapes recorded while the subjects viewed the film clips. There were significant sex differences in both facial mimicry task performance and responses to the film clips. Accuracy and intensity scores on the mimicry task, which were not significantly correlated with one another, were both positively correlated with the intensity of facial expressiveness in response to the film clips, but were not associated with reported experiences. We wish to thank the Editor and two anonymous reviewers for their helpful comments on an earlier draft of this paper.

15.
People can discriminate cheaters from cooperators on the basis of negative facial expressions. However, such cheater detection is far from perfect in real-world situations. Therefore, it is possible that cheaters have the ability to disguise negative emotional expressions that signal their uncooperative attitude. To test this possibility, emotional intensity and trustworthiness were evaluated for facial photographs of cheaters and cooperators defined by scores in an economic game. The facial photographs had either posed happy or angry expressions. The angry expressions of cheaters were rated angrier and less trustworthy than those of cooperators. On the other hand, happy expressions of cheaters were higher in emotional intensity but comparable to those of cooperators in trustworthiness. These results suggest that cheater detection based on the processing of negative facial expressions can be thwarted by a posed or fake smile, which cheaters put on with higher intensity than cooperators.

16.
Two studies were conducted using video records of real faces and three-dimensional schematic faces to investigate the perceptual distortions introduced by viewing faces at a vertical angle and their influence on the attribution of emotional expressions and attitudes. The results indicate that faces seen from below were perceived as more positive and less negative, while faces seen from above appeared more negative and less positive. This effect seems to be moderated by interindividual differences in facial morphology, and perhaps by differences in dynamic aspects of expressions. The second study investigated the respective contribution of the upper half and the lower half of the face to the perceptual distortion found. In general, judges based their attributions of emotional state more on cues from the upper half of the face.

17.
In this study, we investigated the emotional effect of dynamic presentation of facial expressions. Dynamic and static facial expressions of negative and positive emotions were presented using computer-morphing (Experiment 1) and videos of natural changes (Experiment 2), as well as other dynamic and static mosaic images. Participants rated the valence and arousal of their emotional response to the stimuli. The participants consistently reported higher arousal responses to dynamic than to static presentation of facial expressions and mosaic images for both valences. Dynamic presentation had no effect on the valence ratings. These results suggest that dynamic presentation of emotional facial expressions enhances the overall emotional experience without a corresponding qualitative change in the experience, although this effect is not specific to facial images.
Correspondence: Wataru Sato

18.
The facial behavior during a marble rolling game was analyzed for two samples of children between the ages of 2 and 5 years using the Facial Action Coding System (FACS). In addition, temperament ratings were available for a subsample of children. Analysis of coding reliability showed that frequency as well as temporal location coding can be performed reliably for preschoolers. The facial movements show a frequency distribution which is highly similar in both samples. Movements of the mouth, especially the components of smiling, and some movements of the eye region, were observed frequently. Most other facial movements were infrequent events. The more frequently shown facial movements were stable over a period of up to 18 months. In addition, sum-scores of emotion-relevant Action Units were meaningfully related to infant temperament characteristics.

19.
The current study examined the effects of institutionalization on the discrimination of facial expressions of emotion in three groups of 42-month-old children. One group consisted of children abandoned at birth who were randomly assigned to Care-as-Usual (institutional care) following a baseline assessment. Another group consisted of children abandoned at birth who were randomly assigned to high-quality foster care following a baseline assessment. A third group consisted of never-institutionalized children who were reared by their biological parents. All children were familiarized to happy, sad, fearful, and neutral facial expressions and tested on their ability to discriminate familiar versus novel facial expressions. Contrary to our prediction, all three groups of children were equally capable of discriminating among the different expressions. Furthermore, in contrast to findings at 13–30 months of age, these same children showed familiarity rather than novelty preferences toward different expressions. There were also asymmetries in children’s discrimination of facial expressions depending on which facial expression served as the familiar versus novel stimulus. Collectively, early institutionalization appears not to impact the development of the ability to discriminate facial expressions of emotion, at least when preferential looking serves as the dependent measure. These findings are discussed in the context of the myriad domains that are affected by early institutionalization.

20.
Cross-cultural and laboratory research indicates that some facial expressions of emotion are recognized more accurately and faster than others. We assessed the hypothesis that such differences depend on the frequency with which each expression occurs in social encounters. Thirty observers recorded how often they saw different facial expressions during natural conditions in their daily life. For a total of 90 days (3 days per observer), 2,462 samples of seen expressions were collected. Among the basic expressions, happy faces were observed most frequently (31 %), followed by surprised (11.3 %), sad (9.3 %), angry (8.7 %), disgusted (7.2 %), and fearful faces, which were the least frequent (3.4 %). A significant proportion (29 %) of non-basic emotional expressions (e.g., pride or shame) was also observed. We correlated our frequency data with recognition accuracy and response latency data from prior studies. In support of the hypothesis, significant correlations (generally above .70) emerged, with recognition accuracy increasing and latency decreasing as a function of frequency. We conclude that the efficiency of facial emotion recognition is modulated by familiarity of the expressions.
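To make the frequency–recognition analysis concrete, the Python sketch below (requires Python 3.10+ for statistics.correlation) correlates the observation frequencies reported in the abstract with a set of recognition-accuracy values; the accuracy numbers are hypothetical placeholders standing in for the prior-study data, not results from the cited work.

from statistics import correlation  # Pearson's r; available in Python 3.10+

# Observation frequencies (%) reported in the abstract
frequency = {"happy": 31.0, "surprised": 11.3, "sad": 9.3,
             "angry": 8.7, "disgusted": 7.2, "fearful": 3.4}

# Hypothetical recognition-accuracy placeholders (proportion correct),
# standing in for the prior-study data the authors correlated against
accuracy = {"happy": 0.95, "surprised": 0.88, "sad": 0.85,
            "angry": 0.83, "disgusted": 0.80, "fearful": 0.74}

labels = list(frequency)
r = correlation([frequency[e] for e in labels], [accuracy[e] for e in labels])
print(f"Pearson r = {r:.2f}")  # a positive r, in the spirit of the reported .70+ correlations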
