Similar Literature
20 similar documents retrieved.
1.
The aim of this study was to examine the effect of two basic emotions, happiness and sadness, on dance movement. A total of 32 adult participants were induced to feel emotional states of either happiness or sadness and then danced intuitively to an emotionally ‘neutral’ piece of music, composed specifically for the experiment. Based on an Effort-Shape analysis of body movement, full body movement was captured and seven different movement cues were examined in order to explore whether differences in corporeal articulation existed between the happy and sad conditions. Results revealed that in the happy condition, participants moved faster, with more acceleration, and made more expanded and more impulsive movements than in the sad condition. Results are discussed with respect to possible consequences for future research on human movement.

2.
Two studies set out to provide information regarding social environment contexts of trait emotional intelligence. In an initial exploratory study, participants with higher emotional intelligence were associated with close others of high emotional intelligence. A second longitudinal study examined the effect of social environments comprising higher or lower emotional intelligence on changes in emotional intelligence of residents in these environments. This study assessed the emotional intelligence and subjective well-being of residents of colleges at the start of a semester and again three months later. A higher composite college residence emotional intelligence index predicted an increase in participants' emotional intelligence and positive affect. These effects were especially strong for first-semester residents. The results offer new information regarding social environment contexts of trait emotional intelligence and extend findings regarding contagion of emotion and transmission of individual differences in complex emotion processing.

3.
We investigated how power priming affects facial emotion recognition in the context of body postures conveying the same or different emotion. Facial emotions are usually recognized better when the face is presented with a congruent body posture, and recognized worse when the body posture is incongruent. In our study, we primed participants to either low, high, or neutral power prior to their performance in a facial-emotion categorization task in which faces were presented together with a congruent or incongruent body posture. Facial emotion recognition in high-power participants was not affected by body posture. In contrast, low-power and neutral-power participants were significantly affected by the congruence of facial and body emotions. Specifically, these participants displayed better facial emotion recognition when the body posture was congruent, and worse performance when the body posture was incongruent. In a following task, we trained the same participants to categorize two sets of novel checkerboard stimuli and then engaged them in a recognition test involving compounds of these stimuli. High, low, and neutral-power participants all showed a strong congruence effect for compound checkerboard stimuli. We discuss our results with reference to the literature on power and social perception.

4.
The aim of the present study was to investigate developmental differences in reliance on situational versus vocal cues for recognition of emotions. Turkish preschool, second, and fifth grade children participated in the study. Children listened to audiotape recordings of situations between a mother and a child where the emotional cues implied by the context of a vignette and the vocal expression were either consistent or inconsistent. After listening to each vignette, participants were questioned about the content of the incident and were asked to make a judgment about the emotion of the mother referred to in the recording. Angry, happy, and neutral emotions were utilized. Results revealed that 1) recognition of emotions improved with age, and 2) children relied more on the channel depicting either anger or happiness than on the channel depicting neutrality.

5.
The facial feedback hypothesis states that facial actions modulate subjective experiences of emotion. Using the voluntary facial action technique, in which participants react with instruction-induced smiles and frowns when exposed to positive and negative emotional pictures and then rate the pleasantness of these stimuli, four questions were addressed in the present study. The results in Experiment 1 demonstrated a feedback effect because participants experienced the stimuli as more pleasant during smiling as compared to when frowning. However, this effect was present only during the critical actions of smiling and frowning, with no remaining effects after 5 min or after 1 day. In Experiment 2, feedback effects were found only when the facial action (smile/frown) was incongruent with the presented emotion (positive/negative), demonstrating attenuating but not enhancing modulation. Finally, no difference in the intensity of the produced feedback effect was found between smiling and frowning, and no difference in feedback effect was found between positive and negative emotions. In conclusion, facial feedback appears to occur mainly during actual facial actions, and primarily to attenuate ongoing emotional states.

6.
Power is associated with living in reward-rich environments and causes behavioural disinhibition (Keltner, Gruenfeld & Anderson, 2003). Powerful people also have greater freedom of emotional expression (Hecht & La France, 1998). Two studies were conducted with the aims of a) analyzing the effect of dispositional power on emotion suppression, and b) exploring the simple and interaction effects of dispositional and situational power on emotion suppression. In a first correlational study, the power of individuals was found to be negatively correlated with emotion suppression. In a second experimental study, participants were assigned to a powerful or powerless position and negative emotions were induced with pictures. Participants were asked to suppress their emotions during the presentation of the pictures. Participants' emotion suppression was measured using the suppression subscale of the Emotion Regulation Questionnaire (Gross & John, 2003). Results showed that dispositionally powerless participants suppressed their emotions more than dispositionally powerful participants only when they were assigned to a low-power position. These results are discussed.

7.
Cross-sectional studies support negative associations between children’s skills in recognizing emotional expressions and their problem behaviors. Few studies have examined these associations over time, however, precluding our understanding of the direction of effects. Emotion recognition difficulties may contribute to the development of problem behaviors; additionally, problem behaviors may constrain the development of emotion recognition skill. The present study tested the bidirectional linkages between children’s emotion recognition and teacher-reported problem behaviors in 1st and 3rd grade. Specifically, emotion recognition, hyperactivity, internalizing behaviors, and externalizing behaviors were assessed in 117 children in 1st grade and in 3rd grade. Results from fully cross-lagged path models revealed divergent developmental patterns: Controlling for concurrent levels of problem behaviors and first-grade receptive language skills, lower emotion recognition in 1st grade significantly predicted greater internalizing behaviors, but not hyperactivity or externalizing behaviors, in 3rd grade. Moreover, greater hyperactivity in 1st grade marginally predicted lower emotion recognition in 3rd grade, but internalizing and externalizing behaviors were not predictive of emotion recognition over time. Together, these findings extend previous research to identify specific developmental pathways, whereby emotion recognition difficulties contribute to the development of internalizing behaviors, and early hyperactivity may contribute to the development of emotion recognition difficulties, thus highlighting the importance of examining these processes and their mutual development over time.
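A cross-lagged design of this kind can be approximated with a pair of lagged regressions, one per direction of effect. The sketch below is a hedged illustration only: the CSV file, DataFrame, and column names are hypothetical stand-ins for the study's first-grade (t1) and third-grade (t3) measures, and it uses two separate OLS regressions rather than the full path model reported in the paper.

# Minimal Python sketch of a cross-lagged analysis (hypothetical data).
# Assumed columns: emotion_recog_t1/_t3, internalizing_t1/_t3, language_t1.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("children_t1_t3.csv")  # hypothetical file, one row per child

# Path 1: does first-grade emotion recognition predict third-grade
# internalizing behavior, controlling for first-grade internalizing
# behavior and receptive language?
m1 = smf.ols(
    "internalizing_t3 ~ emotion_recog_t1 + internalizing_t1 + language_t1",
    data=df,
).fit()

# Path 2: the reverse direction, with the matching controls.
m2 = smf.ols(
    "emotion_recog_t3 ~ internalizing_t1 + emotion_recog_t1 + language_t1",
    data=df,
).fit()

print(m1.summary())
print(m2.summary())

A full structural equation model would estimate both directions simultaneously; the two regressions above only approximate that approach.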

8.
Lai CJ, Chiang CC. Work (Reading, Mass.), 2012, 41(Z1): 5419-5421.
Consumer on-line behaviors are more important than ever due to the rapid growth of on-line shopping. The purposes of this study were to design placement methods for background music on a shopping website and to examine their effect on browsers' emotional and cognitive responses. Three placement points for background music during browsing were considered as entry points: 2 min, 4 min, and 6 min from the start of browsing. Browsing without music (no music) and browsing with constant music volume (full music) served as control groups. Participants' emotional state, approach-avoidance behavior intention, and actions to adjust the music volume were collected. Results showed that participants had higher levels of pleasure, arousal, and approach behavior intention for the three placement points than for no music and full music. Most of the participants in the full-music condition (5/6) adjusted the background music, whereas only 16.7% (3/18) of participants in the other conditions turned off the background music. The results indicate that playing background music some time after the start of browsing benefits the on-line shopping atmosphere, and that it is inappropriate to place background music at the very start of browsing a shopping website. Marketers must therefore manipulate the placement of background music for a web store carefully.

9.
Recent research on human nonverbal vocalizations has led to considerable progress in our understanding of vocal communication of emotion. However, in contrast to studies of animal vocalizations, this research has focused mainly on the emotional interpretation of such signals. The repertoire of human nonverbal vocalizations as acoustic types, and the mapping between acoustic and emotional categories, thus remain underexplored. In a cross-linguistic naming task (Experiment 1), verbal categorization of 132 authentic (non-acted) human vocalizations by English-, Swedish- and Russian-speaking participants revealed the same major acoustic types: laugh, cry, scream, moan, and possibly roar and sigh. The association between call type and perceived emotion was systematic but non-redundant: listeners associated every call type with a limited, but in some cases relatively wide, range of emotions. The speed and consistency of naming the call type predicted the speed and consistency of inferring the caller’s emotion, suggesting that acoustic and emotional categorizations are closely related. However, participants preferred to name the call type before naming the emotion. Furthermore, nonverbal categorization of the same stimuli in a triad classification task (Experiment 2) was more compatible with classification by call type than by emotion, indicating the former’s greater perceptual salience. These results suggest that acoustic categorization may precede attribution of emotion, highlighting the need to distinguish between the overt form of nonverbal signals and their interpretation by the perceiver. Both within- and between-call acoustic variation can then be modeled explicitly, bringing research on human nonverbal vocalizations more in line with the work on animal communication.

10.
11.
Human body postures provide perceptual cues that can be used to discriminate and recognize emotions. It was previously found that 7-month-olds’ fixation patterns discriminated fear from other emotion body expressions, but it is not clear whether they also process the emotional content of those expressions. The emotional content of visual stimuli can increase arousal level, resulting in pupil dilations. To provide evidence that infants also process the emotional content of expressions, we analyzed variations in pupil size in response to emotion stimuli. Forty-eight 7-month-old infants viewed adult body postures expressing anger, fear, happiness and neutral expressions, while their pupil size was measured. There was a significant emotion effect between 1040 and 1640 ms after image onset, when fear elicited larger pupil dilations than neutral expressions. A similar trend was found for anger expressions. Our results suggest that infants have increased arousal to negative-valence body expressions. Thus, in combination with previous fixation results, the pupil data show that infants as young as 7 months can perceptually discriminate static body expressions and process the emotional content of those expressions. The results extend information about infant processing of emotion expressions conveyed through other means (e.g., faces).

12.
The purpose of this study was to investigate contributions of maternal emotional resources to individual differences in adolescents’ functional connectivity during emotion regulation. Participants included 35 adolescent girls who completed an implicit emotion regulation task during fMRI. Mothers reported on the quality of their adult attachment and emotional awareness when youth were in elementary school. Higher anxious attachment and lower emotional awareness were significantly correlated with more positive amygdala–right ventrolateral prefrontal cortex connectivity, a pattern linked in prior research with ineffective emotion regulation and emotional difficulties. Further, there was an indirect effect of anxious attachment on adolescent connectivity through emotional awareness. These results suggest that compromised maternal emotional resources in childhood may be linked to atypical neural processing of emotions.

13.
Recent research has demonstrated that preschool children can decode emotional meaning in expressive body movement; however, to date, no research has considered preschool children's ability to encode emotional meaning in this medium. The current study investigated 4- (N = 23) and 5- (N = 24) year-old children's ability to encode the emotional meaning of an accompanying music segment by moving a teddy bear using previously modeled expressive movements to indicate one of four target emotions (happiness, sadness, anger, or fear). Adult judges visually categorized the silent videotaped expressive movement performances by children of both ages with greater-than-chance accuracy. In addition, accuracy in categorizing the emotion being expressed varied as a function of the age of the child and the emotion. A subsequent cue analysis revealed that children as young as 4 years old were systematically varying their expressive movements with respect to force, rotation, shifts in movement pattern, tempo, and upward movement in the process of emotional communication. The theoretical significance of such encoding ability is discussed with respect to children's nonverbal skills and the communication of emotion.

14.
The present research explores the link between the personality trait exploitativeness, a component of narcissism, and emotion recognition abilities. Prior research on this topic has produced inconsistent findings. We attempt to resolve these inconsistencies by testing the hypothesis that narcissistic exploitativeness, in particular, should be associated with emotion-reading abilities because it specifically taps into the motivation to manipulate others. Across two studies we find that narcissistic exploitativeness is indeed associated with increased emotion recognition, but in some cases the confounding effects of mood need to be considered (Study 1). Importantly, effect sizes of narcissistic exploitativeness were similar in magnitude to those of two different measures of dispositional empathy, which is an established correlate of emotion recognition. These studies suggest that emotion recognition abilities are associated with both desirable and undesirable traits.

15.
Recent research on unemployment has not sufficiently acknowledged how unemployment reverberates within families, particularly emotionally. This article uses data from more than 50 in-depth interviews to illuminate the emotional demands that men's unemployment makes beyond the unemployed individual. It shows that wives of unemployed men do two types of emotion work—self-focused and other-focused—and both are aimed toward facilitating husbands' success in the emotionally arduous white-collar job-search process. This article extends research on emotion work by suggesting that participants perceive wives' emotion work as a resource with potential economic benefits in the form of unemployed men's reemployment. The findings furthermore suggest that as a resource, wives' emotion work is shaped by the demands of the labor market that their husbands encounter.

16.
Human–human communication studies have suggested that, within communicative interactions, individuals acknowledge each other as intentional agents and adjust their nonverbal emotional behavior according to the other. This process has been defined as emotional attunement. In this study, we examine the emotional attunement process in the context of affective human–computer interactions. To this purpose, participants were exposed to one of two conditions. In one case, they played with a computer that simulated understanding of their emotional reactions while guiding them across four different game-like activities; in the other, the computer guided participants across the activities without mentioning any ability to understand emotional responses. Face movements, gaze direction, posture, vocal behavior, electrocardiogram and electrodermal activity were simultaneously recorded during the experimental sessions. Results showed that if participants were aware of interacting with an agent able to recognize their emotions, they reported that the computer was able to “understand” them and showed a higher number of nonverbal behaviors during the most interactive activity. The implications are discussed.

17.
The relationship between an individual's habitual, emotional dispositions or tendencies — an aspect of personality — and his ability to recognize facially expressed emotions was explored. Previous studies have used global scores of recognition accuracy across several emotions, but failed to examine the relationship between emotion traits and recognition of specific emotion expressions. In the present study, these more specific relationships were examined. Results indicated that females with an inhibited-non-assertive personality style tended to have poorer emotion recognition scores than more socially oriented females. In contrast, for males, the relationship between traits and recognition scores was much more emotion specific: Emotion traits were not significantly related to a global measure of recognition accuracy but were related to recognition rates of certain emotion expressions — sadness, anger, fear, surprise, and disgust. For most of the emotions, males appeared to be more likely to see what they feel. Possible explanations of the findings in terms of perceptual set and other mechanisms are discussed. Implications for clinical studies and research are noted. The study also highlights the importance of separate analysis of data for specific emotions, as well as for sex.

18.
Cross-cultural and laboratory research indicates that some facial expressions of emotion are recognized more accurately and faster than others. We assessed the hypothesis that such differences depend on the frequency with which each expression occurs in social encounters. Thirty observers recorded how often they saw different facial expressions during natural conditions in their daily life. For a total of 90 days (3 days per observer), 2,462 samples of seen expressions were collected. Among the basic expressions, happy faces were observed most frequently (31 %), followed by surprised (11.3 %), sad (9.3 %), angry (8.7 %), disgusted (7.2 %), and fearful faces, which were the least frequent (3.4 %). A significant amount (29 %) of non-basic emotional expressions (e.g., pride or shame) were also observed. We correlated our frequency data with recognition accuracy and response latency data from prior studies. In support of the hypothesis, significant correlations (generally, above .70) emerged, with recognition accuracy increasing and latency decreasing as a function of frequency. We conclude that the efficiency of facial emotion recognition is modulated by familiarity of the expressions.
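The frequency hypothesis here reduces to correlating each expression's observed frequency with accuracy and latency scores taken from earlier studies. A minimal Python sketch follows; the frequency percentages come from the abstract, while the accuracy and latency values are hypothetical placeholders rather than the study's data.

# Hedged sketch: correlate expression frequency with recognition measures.
from scipy.stats import pearsonr

expressions   = ["happy", "surprised", "sad", "angry", "disgusted", "fearful"]
frequency_pct = [31.0, 11.3, 9.3, 8.7, 7.2, 3.4]       # from the abstract
accuracy_pct  = [95.0, 85.0, 80.0, 82.0, 78.0, 70.0]   # hypothetical values
latency_ms    = [700, 900, 950, 930, 980, 1100]        # hypothetical values

r_acc, p_acc = pearsonr(frequency_pct, accuracy_pct)
r_lat, p_lat = pearsonr(frequency_pct, latency_ms)
print(f"frequency vs. accuracy: r = {r_acc:.2f}")  # expected to be positive
print(f"frequency vs. latency:  r = {r_lat:.2f}")  # expected to be negative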

19.
Surveying recent developments in management and work culture, computing and social media, and science and psychology, this article speculates on the concept of emotional extraction. Emotional extraction is defined in two ways. One iteration involves the transfer of emotional resources from one individual or group to another, such as that which occurs in the work of caring for others, but which also increasingly occurs in the work of producing new technology, such as emotionally aware computers. A second instance of emotional extraction entails the use of emotion knowledge – or theories about emotions, such as emotional intelligence – to generate conclusions or predictions about human behaviour. Emotional extraction in service work, management, marketing, social media, artificial intelligence, and neuroscience is discussed. ‘Mining the mind’ focuses in particular on emotional extraction that enhances both productivity and predictability, in turn tracing how emotionally extractive sites are implicated within the production and hierarchical valuation of difference – especially racial and gendered, but also neural difference – in everyday life. The article aims to offer scholars in cultural studies, as well as critical race theory, feminist theory, and critical disability studies, ways to think about this newly intensifying resource extraction and the intersections of culture, capital, and human experience that such extraction indexes and makes possible.

20.
This study explores the cognitive effects of emotional visuals and company–cause fit in CSR communication. We conducted a 2 (valence of visuals: positive vs. negative) × 2 (company–cause fit: high vs. low) within-subjects experiment to examine the effects of valence and company–cause fit on participants’ memory of CSR information, measured by recognition sensitivity and cued recall of company information in CSR messages addressing three different CSR issues. Results showed that negative emotional visuals were more effective than positive emotional visuals. Company–cause fit also played a significant role, but its effect depended on the level of cognitive effects aimed for. We discuss the theoretical and practical implications.
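Recognition sensitivity in memory designs like this is commonly indexed with signal-detection d', computed as z(hit rate) minus z(false-alarm rate). The abstract does not specify the exact index used, so the sketch below is a generic, hedged illustration with hypothetical counts rather than the study's analysis.

# Hedged sketch of recognition sensitivity (d') from hypothetical counts.
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """d' = z(hit rate) - z(false-alarm rate), with a log-linear
    correction so neither rate is exactly 0 or 1."""
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Example: one participant's memory for details of a CSR message
# (all counts are made up for illustration).
print(round(d_prime(hits=18, misses=6, false_alarms=4, correct_rejections=20), 2))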
