Similar Articles (20 results found)
1.
Human social interaction is enriched with synchronous movement, which is said to be essential to establishing interactional flow. One commonly investigated phenomenon in this regard is facial mimicry, the tendency of humans to mirror facial expressions. Because studies investigating facial mimicry in face-to-face interactions are lacking, the temporal dynamics of facial mimicry remain unclear. We therefore developed and tested the suitability of a novel approach to quantifying facial expression synchrony in face-to-face interactions: windowed cross-lagged correlation analysis (WCLC) for electromyography signals. We recorded muscle activations related to smiling (Zygomaticus Major) and frowning (Corrugator Supercilii) of two interaction partners simultaneously in 30 dyadic affiliative interactions. We expected WCLC to reliably detect facial expression synchrony above chance level and, based on previous research, expected rapid synchronization of smiles within 200 ms. Compared to a chance-level control condition, WCLC significantly detected synchrony of smiling but not of frowning in six different interactional phases (smiling: d_z = .85–1.11; frowning: d_z = .01–.30). Synchronizations of smiles between interaction partners predominantly occurred within 1000 ms, with a significant proportion occurring within 200 ms. This rapid synchronization of smiles supports the notion of an anticipated mimicry response for smiles. We conclude that WCLC is suited to quantifying the temporal dynamics of facial expression synchrony in dyadic interactions and discuss implications for different psychological research areas.
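The abstract does not spell out the WCLC computation itself, but the general technique slides a window over both signals and, within each window, finds the lag at which the cross-correlation peaks. Below is a minimal Python sketch under that reading; the window length, maximum lag, and step size are illustrative placeholders, not the study's parameters.

```python
# Minimal sketch of windowed cross-lagged correlation (WCLC) between two
# preprocessed EMG time series x and y sampled at the same rate.
# Window length, maximum lag, and step size are illustrative placeholders.
import numpy as np

def wclc(x, y, win=100, max_lag=20, step=10):
    """For each window position, return the peak correlation across lags
    -max_lag..+max_lag and the lag (in samples) at which it occurs.
    A positive lag means x leads y."""
    peaks, lags = [], []
    for start in range(0, len(x) - win - 2 * max_lag, step):
        best_r, best_lag = -np.inf, 0
        for lag in range(-max_lag, max_lag + 1):
            a = x[start + max_lag : start + max_lag + win]
            b = y[start + max_lag + lag : start + max_lag + lag + win]
            r = np.corrcoef(a, b)[0, 1]
            if r > best_r:
                best_r, best_lag = r, lag
        peaks.append(best_r)
        lags.append(best_lag)
    return np.array(peaks), np.array(lags)

# Synthetic check: y is a noisy copy of x delayed by 5 samples,
# so the recovered lag should cluster around +5.
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
y = np.roll(x, 5) + 0.5 * rng.standard_normal(2000)
peaks, lags = wclc(x, y)
print(int(np.median(lags)), round(float(peaks.mean()), 2))
```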

2.
We examined 6-month-old infants' abilities to discriminate smiling and frowning from neutral stimuli. In addition, we assessed the relationship between infants' preferences for varying intensities of smiling and frowning facial expressions and their mothers' history of depressive symptoms. Forty-six infants were presented pairs of facial expressions, and their preferential looking time was recorded. They also participated in a 3-min interaction with their mothers for which duration of both mother and infant gazing and smiling were coded. Analyses revealed that the infants reliably discriminated between varying intensities of smiling and frowning facial expressions and a paired neutral expression. In addition, infants' preferences for smiling and frowning expressions were related to self-reports of maternal depressive symptoms experienced since the birth of the infant. Potential implications for social cognitive development are discussed.

3.
According to the facial feedback hypothesis, facial muscles not only express emotions but also have the ability to modulate the subjective experience of emotions and to initiate them. This study examined the voluntary facial action technique, in which participants were instructed to react with the Zygomatic major muscle (smile) or the Corrugator supercilii muscle (frown) when exposed to different stimuli. The results demonstrate that the technique effectively induces facial feedback effects. Using this technique, we further addressed three important areas of facial feedback and found, first, that facial feedback did not modulate the experience of positive and negative emotion-evoking stimuli differently. Second, the modulating ability produced significant feedback effects, while the initiating ability did not. Third, an effect of feedback remained and could be detected even some time after the critical manipulation. It is concluded that the present technique can be used in future studies of facial feedback.

4.
Social anxiety may be related to a different pattern of facial mimicry and contagion of others' emotions. We report two studies in which participants with different levels of social anxiety reacted to others' emotional displays, either shown on a computer screen (Study 1) or in an actual social interaction (Study 2). Study 1 examined facial mimicry and emotional contagion in response to displays of happiness, anger, fear, and contempt. Participants mimicked negative and positive emotions to some extent, but we found no relation between mimicry and the social anxiety level of the participants. Furthermore, socially anxious individuals were more prone to experience negative emotions and felt more irritated in response to negative emotion displays. In Study 2, we found that social anxiety was related to enhanced mimicry of smiling, but only for polite smiles and not for enjoyment smiles. These results suggest that socially anxious individuals tend to catch negative emotions from others but suppress their expression by mimicking positive displays. This may be explained by the tendency of socially anxious individuals to avoid conflict or rejection.

5.
This article introduces the Children's Scales of Pleasure and Arousal as instruments to enable children to provide judgments of emotions they witness or experience along the major dimensions of affect. In two studies (Study 1: N = 160, 3–11 years and adults; Study 2: N = 280, 3–5 years and adults), participants used the scales to indicate the levels of pleasure or arousal they perceived in stylized drawings of facial expressions, in photographs of facial expressions, or in emotion labels. All age groups used the Pleasure Scale reliably and accurately with all three types of stimuli. All used the Arousal Scale with stylized faces and with facial expressions, but only 5-year-olds did so for emotion labels.

6.
In this study, we investigated the emotional effect of dynamic presentation of facial expressions. Dynamic and static facial expressions of negative and positive emotions were presented using computer-morphing (Experiment 1) and videos of natural changes (Experiment 2), as well as other dynamic and static mosaic images. Participants rated the valence and arousal of their emotional response to the stimuli. The participants consistently reported higher arousal responses to dynamic than to static presentation of facial expressions and mosaic images for both valences. Dynamic presentation had no effect on the valence ratings. These results suggest that dynamic presentation of emotional facial expressions enhances the overall emotional experience without a corresponding qualitative change in the experience, although this effect is not specific to facial images.

7.
To better understand early positive emotional expression, automated software measurements of facial action were supplemented with anatomically based manual coding. These convergent measurements were used to describe the dynamics of infant smiling and predict perceived positive emotional intensity. Over the course of infant smiles, degree of smile strength varied with degree of eye constriction (cheek raising, the Duchenne marker), which varied with degree of mouth opening. In a series of three rating studies, automated measurements of smile strength and mouth opening predicted naïve (undergraduate) observers' continuous ratings of video clips of smile sequences, as well as naïve and experienced (parent) ratings of positive emotion in still images from the sequences. An a priori measure of smile intensity combining anatomically based manual coding of both smile strength and mouth opening predicted positive emotion ratings of the still images. The findings indicate the potential of automated and fine-grained manual measurements of facial actions to describe the course of emotional expressions over time and to predict perceptions of emotional intensity.

8.
The study investigates effects of emotional pictures and words on Center of Pressure (CoP) whole-body reactions, based on theories of emotional valence and arousal, approach–avoidance theory, freezing in humans, and stimulus type (pictures vs. words). For freezing, the study differentiated between rambling and trembling components of the CoP reaction. We hypothesized that negative versus positive emotional valence caused stronger CoP avoidance, for both emotional pictures and words. In addition, freezing was hypothesized to be evident in the CoP trembling component caused by high emotional arousal. Forty-five students enrolled in a teacher program completed a bipedal assessment on a force plate while watching positive versus negative and high- versus low-arousal pictures and words that had been selected from stimulus lists in a pretest. Participants rated the valence and arousal of all stimuli in a questionnaire, the results of which indicated a relationship between negative valence and high arousal. The force plate data confirm the hypotheses. First, negative stimuli elicited significant avoidance CoP shifts, independent of their arousal, as indicated by t tests. This effect was found for both emotional pictures and words. CoP for positive stimuli did not differ from zero. Second, indicating freezing, the CoP trembling component was increased by high arousal, independent of valence. Freezing was only found for emotional pictures. The study discusses both the CoP avoidance effect with respect to valence and stimulus type, and the value of the trembling analysis for freezing. It closes with an analysis of the methodological limitations and with recommendations for future studies.
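The rambling–trembling decomposition mentioned above is, in its commonly used formulation, an interpolation of the CoP through its values at the instants where the horizontal shear force crosses zero (rambling), with the residual taken as trembling. The sketch below only illustrates that logic on synthetic placeholder signals; it is not the authors' pipeline, and the sampling rate and signals are assumptions.

```python
# Minimal sketch of the rambling-trembling decomposition of a CoP signal:
# the CoP values at zero crossings of the horizontal shear force are
# interpolated to give the rambling trajectory; the residual is trembling.
# All signals here are synthetic placeholders, not force-plate data.
import numpy as np
from scipy.interpolate import CubicSpline

fs = 100.0                                   # assumed sampling rate (Hz)
t = np.arange(0.0, 30.0, 1.0 / fs)
rng = np.random.default_rng(0)
cop = np.cumsum(rng.standard_normal(t.size)) * 0.01          # CoP (a.u.)
f_horiz = np.sin(2 * np.pi * 0.7 * t) + 0.3 * rng.standard_normal(t.size)

# Indices just before the horizontal force changes sign (zero crossings)
zc = np.where(np.signbit(f_horiz[:-1]) != np.signbit(f_horiz[1:]))[0]

rambling = CubicSpline(t[zc], cop[zc])(t)    # interpolated equilibrium path
trembling = cop - rambling                   # oscillation around that path

print(f"rambling SD = {rambling.std():.3f}, trembling SD = {trembling.std():.3f}")
```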

9.
Twenty-five high-functioning, verbal children and adolescents with autism spectrum disorders (ASD; age range 8–15 years) who demonstrated a facial emotion recognition deficit were block randomized to an active intervention (n = 12) or waitlist control (n = 13) group. The intervention was a modification of a commercially available, computerized, dynamic facial emotion training tool, the MiX by Humintell©. Modifications were introduced to address the special learning needs of individuals with ASD and to address limitations in current emotion recognition programs. Modifications included coach assistance, a combination of didactic instruction for seven basic emotions, scaffolded instruction that included repeated practice with increased presentation speeds, guided attention to relevant facial cues, and imitation of expressions. Training occurred twice each week for 45–60 min across an average of six sessions. Outcome measures were administered prior to and immediately after treatment, as well as after a delay period of 4–6 weeks. Outcome measures included (a) direct assessment of facial emotion recognition, (b) emotion self-expression, and (c) generalization through emotion awareness in videos and stories, use of emotion words, and self-, parent-, and teacher-report on social functioning questionnaires. The facial emotion training program enabled children and adolescents with ASD to identify feelings in facial expressions more accurately and quickly, with stimuli from both the training tool and the generalization measures, and to demonstrate improved self-expression of facial emotion.

10.
Three studies asked whether reported emotional responses interfere with magnitude sensitivity, defined as a difference in subjective evaluation between a high-magnitude outcome and a low one. Previous research has reported that emotion reduces magnitude sensitivity under separate evaluation in a gain domain (Hsee & Rottenstreich, 2004), a negative effect. We test the generality of this emotion effect in gain and loss domains, and under separate or joint evaluation modes, using a variety of stimuli. We found an opposite, positive, effect in Experiment 1 (in willingness to pay to save species or prevent health impairments) and Experiment 3 (in willingness to pay to prevent bad outcomes in news stories) but replicated the original negative effect in Experiment 2 (compensation for losses). Further research is needed to disentangle the possible causes of these effects and to explore how these findings may be applied to the measurement of values for non-market goods.

11.
Women were videotaped while they spoke about a positive and a negative experience, either in the presence of an experimenter or alone. They gave self-reports of their emotional experience, and the videotapes were rated for facial and verbal expression of emotion. Participants spoke less about their emotions when the experimenter (E) was present. When E was present, they smiled more during positive disclosures, whereas in negative disclosures they showed less negative and more positive expression. Facial behavior was only related to experienced emotion during positive disclosure when alone. Verbal behavior was related to experienced emotion for positive and negative disclosures when alone. These results show that verbal and nonverbal behaviors, and their relationship with emotional experience, depend on the type of emotion, the nature of the emotional event, and the social context.

12.
Adults' perceptions provide information about the emotional meaning of infant facial expressions. This study asks whether similar facial movements influence adult perceptions of emotional intensity in both infant positive (smile) and negative (cry face) facial expressions. Ninety-five college students rated a series of naturally occurring and digitally edited images of infant facial expressions. Naturally occurring smiles and cry faces involving the co-occurrence of greater lip movement, mouth opening, and eye constriction were rated as expressing stronger positive and negative emotion, respectively, than expressions without these three features. Ratings of digitally edited expressions indicated that eye constriction contributed to higher ratings of positive emotion in smiles (i.e., in Duchenne smiles) and greater eye constriction contributed to higher ratings of negative emotion in cry faces. Stronger mouth opening contributed to higher ratings of arousal in both smiles and cry faces. These findings indicate that a set of similar facial movements is linked to perceptions of greater emotional intensity, whether the movements occur in positive or negative infant emotional expressions. This proposal is discussed with reference to discrete, componential, and dynamic systems theories of emotion.

13.
This preliminary study presents data on training to improve the accuracy of judging facial expressions of emotion, a core component of emotional intelligence. Feedback following judgments of angry, fearful, sad, and surprised states indicated the correct answers as well as difficulty level of stimuli. Improvement was greater for emotional expressions originating from a cultural group more distant from participants' own family background, for which feedback likely provides greater novel information. These results suggest that training via feedback can improve emotion perception skill. Thus, the current study also provides suggestive evidence for cultural learning in emotion, for which previous research has been cross-sectional and subject to selection biases.

14.
We investigated how power priming affects facial emotion recognition in the context of body postures conveying the same or a different emotion. Facial emotions are usually recognized better when the face is presented with a congruent body posture, and recognized worse when the body posture is incongruent. In our study, we primed participants to either low, high, or neutral power prior to their performance in a facial-emotion categorization task in which faces were presented together with a congruent or incongruent body posture. Facial emotion recognition in high-power participants was not affected by body posture. In contrast, low-power and neutral-power participants were significantly affected by the congruence of facial and body emotions. Specifically, these participants displayed better facial emotion recognition when the body posture was congruent and worse performance when it was incongruent. In a subsequent task, we trained the same participants to categorize two sets of novel checkerboard stimuli and then engaged them in a recognition test involving compounds of these stimuli. High-, low-, and neutral-power participants all showed a strong congruence effect for compound checkerboard stimuli. We discuss our results with reference to the literature on power and social perception.

15.
Unclaimed prize information (i.e., the number of prizes still available to be won) is information commonly provided to scratch card gamblers. However, unless the number of tickets remaining to be purchased is also provided, this information is uninformative. Despite its lack of utility in assisting gamblers in choosing the most favourable type of scratch card to play, we hypothesized that unclaimed prize information would bias participants' judgments within a scratch card gambling context. In Experiment 1 (N = 201), we showed that participants are influenced by this information such that they felt more likely to win, were more excited to play, and preferred to hypothetically purchase more of the scratch card with the greatest number of unclaimed prizes. In Experiment 2 (N = 201), we attempted to ameliorate this bias by providing participants with the number of tickets remaining to be purchased and equating the payback percentages of all three games. The bias, although attenuated, still persisted in these conditions. Finally, in Experiment 3 (N = 200), we manipulated the hypothetical scratch cards such that games with the highest number of unclaimed prizes were the least favourable, and vice versa. As in Experiment 2, participants still favoured cards with greater numbers of unclaimed prizes. Possible mechanisms underlying this bias are discussed. In conclusion, across three experiments, we demonstrate that salient unclaimed prize information is capable of exerting a strong effect over judgments related to scratch card games.
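To make the uninformativeness point concrete, here is a purely hypothetical comparison (the card names and numbers are invented, not the study's stimuli): the card advertising more unclaimed prizes can still offer worse odds once the number of tickets remaining is taken into account.

```python
# Hypothetical illustration: win probability depends on unclaimed prizes
# AND tickets remaining, so the prize count alone is uninformative.
cards = {
    "Card A": {"unclaimed_prizes": 100, "tickets_remaining": 500_000},
    "Card B": {"unclaimed_prizes": 10, "tickets_remaining": 10_000},
}
for name, c in cards.items():
    p_win = c["unclaimed_prizes"] / c["tickets_remaining"]
    print(f"{name}: {c['unclaimed_prizes']} prizes left, P(win) = {p_win:.4f}")
# Card A wins about 0.02% of the time; Card B about 0.1%,
# despite Card B having far fewer unclaimed prizes.
```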

16.
Nonverbally expressed emotions are not always linked to people's true emotions. We investigated whether observers' ability to distinguish truths from lies differs for positive and negative emotional expressions. Participants judged targets either simulating or truly experiencing positive or negative emotions. Deception detection was measured by participants' inference of the targets' emotions and by their direct judgments of deception. Results of the direct measure showed that participants could not accurately distinguish between truth tellers and liars, regardless of which emotion was expressed. As anticipated, the effects emerged on the indirect emotion measure: participants distinguished liars from truth tellers when inferring experienced emotions from negative emotional expressions, but not from positive emotional expressions.

17.
Automated facial measurement using computer vision has the potential to objectively document continuous changes in behavior. To examine emotional expression and communication, we used automated measurements to quantify smile strength, eye constriction, and mouth opening in two 6-month-old infant-mother dyads who each engaged in a face-to-face interaction. Automated measurements showed high associations with anatomically based manual coding (concurrent validity); measurements of smiling showed high associations with mean ratings of positive emotion made by naive observers (construct validity). For both infants and mothers, smile strength and eye constriction (the Duchenne marker) were correlated over time, creating a continuous index of smile intensity. Infant and mother smile activity exhibited changing (nonstationary) local patterns of association, suggesting the dyadic repair and dissolution of states of affective synchrony. The study provides insights into the potential and limitations of automated measurement of facial action.

18.
Sex Differences in Self-awareness of Smiling During a Mock Job Interview
The present study examined sex differences in awareness of smiling behavior during a job interview, along with intended outcomes of false smiling. Male and female participants were assigned to the interviewee role of a mock job interview and were videotaped. Results indicate that women were more self-aware of false, but not genuine, smiling. In addition, women reported using false smiles to mask negative emotion and to appear enthusiastic more than did men. Naïve judges rated women who smiled in an attempt to mask negative emotion more harshly than men who smiled for this reason. Implications of these findings for the understanding of sex differences in smiling are discussed.

19.
Facial expressions related to sadness are a universal signal of nonverbal communication. Although results of many psychology studies have shown that drooping of the lip corners, raising of the chin, and oblique eyebrow movements (a combination of inner brow raising and brow lowering) express sadness, no report has described a study elucidating facial expression characteristics under well-controlled circumstances with people actually experiencing the emotion of sadness itself. Therefore, spontaneous facial expressions associated with sadness remain unclear. We conducted this study to accumulate important findings related to spontaneous facial expressions of sadness. We recorded the spontaneous facial expressions of a group of participants as they experienced sadness during an emotion-elicitation task. This task required a participant to recall neutral and sad memories while listening to music. We subsequently conducted a detailed analysis of their sad and neutral expressions using the Facial Action Coding System. The prototypical facial expressions of sadness in earlier studies were not observed when people experienced sadness as an internal state under non-social circumstances. By contrast, they expressed tension around the mouth, which might function as a form of suppression. Furthermore, results show that parts of these facial actions are not only related to sad experiences but also to other emotional experiences such as disgust, fear, anger, and happiness. This study revealed the possibility that new facial expressions contribute to the experience of sadness as an internal state.

20.
Background: Companies are increasingly applying both goal- and performance-oriented leadership practices. For employees, such indirect control practices impose higher self-regulatory demands: they become responsible for their work outcomes and have to bear the consequences of failure, just like the self-employed. The current study focuses on the concept of "self-endangering work behaviors" as representing a possible negative effect of indirect control and a possible mediator between work demands and negative outcomes. Method: An online survey was conducted with 607 employees who reported working in an indirect control setting. It assessed extension of working hours, intensification of working hours, sickness presenteeism, and faking as possible self-endangering work behaviors, together with exhaustion as a subjective well-being measure. The lavaan package was used to test the mediation hypothesis with a structural equation model. Results: The results supported the assumption that self-endangering work behaviors might partly explain the association between work demands and exhaustion. A mediation effect was found for extension of working hours, intensification of working hours, and faking. However, sickness presenteeism showed no statistically significant mediation effect in the association between work demands and exhaustion. Discussion: As a mechanism for coping with high work demands, the new concept of self-endangering work behaviors offers one possible explanation for the negative association between high work demands and both subjective well-being and health. The concept needs to be addressed in occupational health prevention initiatives. Such interventions should balance the negative and positive effects of indirect control and take self-endangering work behavior into account.
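The mediation model itself was fitted with R's lavaan package. Purely as an illustration of the underlying logic, here is a minimal Python sketch that tests one indirect path (work demands → extension of working hours → exhaustion) with ordinary regressions and a Sobel test on simulated data; the variable names and coefficients are hypothetical, and this is not the authors' structural equation model.

```python
# Minimal sketch of a simple mediation check using two OLS regressions and a
# Sobel test. Data are simulated; this only illustrates the indirect-effect
# logic, not the lavaan SEM reported in the study.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 607
demands = rng.standard_normal(n)
extension = 0.5 * demands + rng.standard_normal(n)              # mediator
exhaustion = 0.4 * extension + 0.2 * demands + rng.standard_normal(n)

# Path a: work demands -> mediator (extension of working hours)
m_a = sm.OLS(extension, sm.add_constant(demands)).fit()
# Path b (and direct effect c'): mediator and demands -> exhaustion
X = sm.add_constant(np.column_stack([extension, demands]))
m_b = sm.OLS(exhaustion, X).fit()

a, se_a = m_a.params[1], m_a.bse[1]
b, se_b = m_b.params[1], m_b.bse[1]
indirect = a * b
sobel_z = indirect / np.sqrt(b**2 * se_a**2 + a**2 * se_b**2)
print(f"indirect effect = {indirect:.3f}, Sobel z = {sobel_z:.2f}")
```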
