Similar Articles
20 similar articles found (search time: 15 ms)
1.
Recent research has demonstrated that preschool children can decode emotional meaning in expressive body movement; however, to date, no research has considered preschool children's ability to encode emotional meaning in this medium. The current study investigated 4- (N = 23) and 5- (N = 24) year-old children's ability to encode the emotional meaning of an accompanying music segment by moving a teddy bear, using previously modeled expressive movements, to indicate one of four target emotions (happiness, sadness, anger, or fear). Adult judges visually categorized the silent videotaped expressive movement performances of children of both ages with greater than chance accuracy. In addition, accuracy in categorizing the emotion being expressed varied as a function of the child's age and the emotion. A subsequent cue analysis revealed that children as young as 4 years old systematically varied their expressive movements with respect to force, rotation, shifts in movement pattern, tempo, and upward movement in the process of emotional communication. The theoretical significance of such encoding ability is discussed with respect to children's nonverbal skills and the communication of emotion.

2.
The relation between knowledge of American Sign Language (ASL) and the ability to decode facial expressions of emotion was explored in this study. Subjects were 60 college students, half of whom were intermediate-level students of ASL and half of whom had no exposure to a signed language. Subjects viewed and judged silent video segments of stimulus persons experiencing spontaneous emotional reactions representing happiness, sadness, anger, disgust, or fear/surprise. Results indicated that hearing subjects knowledgeable in ASL were generally better than hearing non-signers at identifying facial expressions of emotion, although there were variations in decoding accuracy regarding the specific emotion being judged. In addition, females were more successful decoders than males. Results have implications for better understanding the nature of nonverbal communication in deaf and hearing individuals. We are grateful to Karl Scheibe for comments on an earlier version of this paper and to Erik Coats for statistical analysis. This study was conducted as part of a Senior Honors thesis at Wesleyan University.

3.
Despite known differences in the acoustic properties of children’s and adults’ voices, no work to date has examined the vocal cues associated with emotional prosody in youth. The current study investigated whether child (n = 24, 17 female, aged 9–15) and adult (n = 30, 15 female, aged 18–63) actors differed in the vocal cues underlying their portrayals of basic emotions (anger, disgust, fear, happiness, sadness) and social expressions (meanness, friendliness). We also compared the acoustic characteristics of meanness and friendliness to comparable basic emotions. The pattern of distinctions between expressions varied as a function of age for voice quality and mean pitch. Specifically, adults’ portrayals of the various expressions were more distinct in mean pitch than children’s, whereas children’s representations differed more in voice quality than adults’. Given the importance of pitch variables for the interpretation of a speaker’s intended emotion, expressions generated by adults may thus be easier for listeners to decode than those of children. Moreover, the vocal cues associated with the social expressions of meanness and friendliness were distinct from those of basic emotions like anger and happiness respectively. Overall, our findings highlight marked differences in the ways in which adults and children convey socio-emotional expressions vocally, and expand our understanding of the communication of paralanguage in social contexts. Implications for the literature on emotion recognition are discussed.

4.
A method for studying emotional expression using posthypnotic suggestion to induce emotional states is presented. Subjects were videotaped while role-playing happiness, sadness, or anger, or after hypnotic induction of one of these three emotions. Judges then rated these videotapes for emotional tone. Results indicated a main effect for emotion expressed, with happiness and sadness more easily identified by judges than anger. Accuracy was also greater for happiness and sadness in the hypnotically induced condition. However, role-played anger was more easily identified than hypnotically induced anger. An interaction of channel (body/face) and emotion indicated that identification of sadness and anger was easier for judges when the body alone was shown. Findings are discussed in terms of display rules for the expression of emotion. We gratefully acknowledge Irving Kirsch, Ross Buck, and Paul Ekman for their helpful comments on a draft of this article. Special thanks to Reuben Baron, without whose support neither this article nor our careers in psychology would have been possible. A preliminary report of this study was presented at the meeting of the American Psychological Association in Toronto, August 1984.

5.
The aim of the present study was to investigate developmental differences in reliance on situational versus vocal cues for recognition of emotions. Turkish preschool, second, and fifth grade children participated in the study. Children listened to audiotape recordings of situations between a mother and a child where the emotional cues implied by the context of a vignette and the vocal expression were either consistent or inconsistent. After listening to each vignette, participants were questioned about the content of the incident and were asked to make a judgment about the emotion of the mother referred to in the recording. Angry, happy, and neutral emotions were utilized. Results revealed that 1) recognition of emotions improved with age, and 2) children relied more on the channel depicting either anger or happiness than on the channel depicting neutrality.

6.
Gender roles in mainstream US culture suggest that girls express more happiness, sadness, anxiety, and shame/embarrassment than boys, while boys express more anger and externalizing emotions, such as contempt. However, gender roles and emotion expression may be different in low-income and ethnically diverse families, as children and parents are often faced with greater environmental stressors and may have different gender expectations. This study examined gender differences in emotion expression in low-income adolescents, an understudied population. One hundred seventy-nine adolescents (aged 14–17) participated in the Trier Social Stress Test (TSST). Trained coders rated adolescents’ expressions of happiness, sadness, anxiety, shame/embarrassment, anger, and contempt during the TSST using a micro-analytic coding system. Analyses showed that, consistent with gender roles, girls expressed higher levels of happiness and shame than boys; however, contrary to traditional gender roles, girls showed higher levels of contempt than boys. Also, in contrast to cultural stereotypes, there were no differences in anger between boys and girls. Findings suggest gender-role-inconsistent displays of externalizing emotions in low-income adolescents under acute stress, and may reflect different emotion socialization experiences in this group.

7.
Inconsistencies in previous findings concerning the relationship between emotion and social context are likely to reflect the multi-dimensionality of the sociality construct. In the present study we focused on the role of the other person by manipulating two aspects of this role: co-experience of the event and expression of emotion. We predicted that another's co-experience and expression would affect emotional responses and that the direction of these effects would depend upon the manipulated emotion and how the role of the other person is appraised. Participants read vignettes eliciting four different emotions: happiness, sadness, anxiety, and anger. As well as an alone condition, there were four conditions in which a friend was present, either co-experiencing the event or merely observing it, and either expressing emotions consistent with the event or not showing any emotion. There were significant effects of co-experience in the case of anger situations, and of expression in the case of happiness and sadness situations. Social appraisal also appeared to influence emotional response. We discuss different processes that might be responsible for these results.

8.
Young (M = 23 years) and older (M = 77 years) adults' interpretation of and memory for the emotional content of spoken discourse were examined in an experiment using short, videotaped scenes of two young actresses talking to each other about emotionally laden events. Emotional nonverbal information (prosody or facial expressions) was conveyed at the end of each scene at low, medium, and high intensities. Nonverbal information indicating anger, happiness, or fear conflicted with the verbal information. Older adults' ability to differentiate levels of emotional intensity was not as strong (for happiness and anger) as younger adults'. An incidental memory task revealed that older adults, more often than younger adults, reconstruct what people state verbally to coincide with the meaning of the nonverbal content, if the nonverbal content is conveyed through facial expressions. A second experiment with older participants showed that the high level of memory reconstructions favoring the nonverbal interpretation was maintained when the ages of the participants and actresses were matched, and when the nonverbal content was conveyed through both prosody and facial expressions.

9.
10.
The present study examined the potential for information provided in a person's style of walking to reveal certain emotions. Ten subjects observed five walkers expressing four different emotions and made emotion identifications as well as judgments about specific gait characteristics. Results revealed that subjects were able to identify sadness, anger, happiness, and pride from gait information at better than chance levels; however, identifications of pride were significantly less accurate than were identifications of sadness and anger. In addition, subjects' accuracy varied across the five walkers. Results also revealed that gait characteristics such as the amount of arm swing, stride length, heavy-footedness, and walking speed differentiated the emotions expressed by walkers. Portions of this paper were presented at the 26th meeting of the New England Psychological Association, Boston, MA, November 1986. Joann M. Montepare received a Ph.D. in Social-Developmental Psychology from Brandeis University. She is presently a postdoctoral research fellow at the Center for Research On Women, Wellesley College, Wellesley, MA 02181. Her research interests include the development of subjective perceptions of age and the impact of nonverbal information on social stereotypes of age. Sabra Goldstein and Annmarie Clausen hold B.A. degrees in Psychology from Wellesley College. Please address reprint requests to the first author.

11.
Many children with autism communicate through the use of alternative communication systems, such as sign language. Limited research has been conducted on the situations under which sign language will be acquired across verbal operants without direct teaching. The purpose of the current study was to evaluate the effect of exposure to sign language on the acquisition of signed mands, tacts, and intraverbals in a male child with autism. Results indicated rapid acquisition of mands, tacts, and intraverbals without direct teaching. Results are discussed in the context of future research investigating exposure without direct teaching in individuals who communicate with alternative communication systems.

12.
The sudden outbreak of the COVID-19 epidemic highlighted the urgency of further professionalizing sign language interpreting and building a reserve of sign language interpreters for emergency services. During this period, information accessibility received heightened attention, technology supported the development of the sign language interpreting profession, and volunteer sign language interpreters spontaneously joined the epidemic response, achieving notable results while also exposing some problems. This paper reviews the professionalization of sign language interpreting in China, summarizes the current state of interpreter training, and, in light of the situation and problems revealed by the epidemic, proposes policy recommendations including strengthening laws and regulations, reforming interpreter education, and developing a Chinese model of sign language interpreting.

13.
14.

Previous research has recognized the role of emotions in protests and social movements in the offline world. Despite the current ubiquity of social media and ‘Twitter revolutions,’ our knowledge about the connections between emotions and online protests remains limited. In this study, we examine whether online protest actions follow the same emotional groundwork for supporting and nurturing a social movement as in the offline world, and how these emotions vary across the stages of the social movement. Through a computer-assisted emotion analysis of 65,613 Twitter posts (tweets), posted during the Nirbhaya social movement (the movement against the Delhi gang-rape incident) in India, we identified a strong resemblance between online emotional patterns and the offline protest emotions discussed in the literature. Formal statistical testing of a range of emotions (negativity, positivity, anger, sadness, anxiety, certainty, individualism, collectivism, and achievement) demonstrates that they differed significantly across stages of the social movement; as such, they influenced the course of the online protest, paralleling offline events. The findings highlight the importance of anger and anxiety in stirring the collective conscience, and identify that positive emotion was pervasive during the protest event. Implications of these findings are discussed.
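The computer-assisted emotion analysis this abstract describes is typically lexicon-based: each tweet is scored by counting matches against word lists for each emotion category, and the counts are then compared across movement stages. The sketch below is a minimal illustration of that general approach, not the study's actual pipeline; the mini-lexicon, stage labels, and sample tweets are all hypothetical placeholders (real analyses use validated dictionaries with thousands of entries).

```python
from collections import Counter

# Illustrative mini-lexicon; real analyses use validated dictionaries
# (e.g., LIWC-style category word lists), not a handful of terms.
EMOTION_LEXICON = {
    "anger": {"outrage", "furious", "angry", "rage"},
    "anxiety": {"afraid", "worried", "fear", "unsafe"},
    "positivity": {"hope", "justice", "support", "together"},
}

def emotion_counts(tweets):
    """Count lexicon hits per emotion category across a list of tweets."""
    counts = Counter()
    for tweet in tweets:
        words = set(tweet.lower().split())
        for emotion, lexicon in EMOTION_LEXICON.items():
            counts[emotion] += len(words & lexicon)
    return counts

# Hypothetical tweets grouped by movement stage.
stages = {
    "outbreak": [
        "Furious and afraid after this outrage",
        "We are worried and angry",
    ],
    "mobilization": [
        "Together we demand justice",
        "Hope and support for the movement",
    ],
}

by_stage = {stage: emotion_counts(tweets) for stage, tweets in stages.items()}
print(by_stage)
```

Per-stage counts like these would then feed the kind of formal statistical tests the abstract mentions (e.g., comparing category frequencies across stages) to show how the emotional profile of the protest shifts over time.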

15.
Human body postures provide perceptual cues that can be used to discriminate and recognize emotions. It was previously found that 7-month-olds’ fixation patterns discriminated fear from other emotional body expressions, but it is not clear whether they also process the emotional content of those expressions. The emotional content of visual stimuli can increase arousal level, resulting in pupil dilation. To provide evidence that infants also process the emotional content of expressions, we analyzed variations in pupil size in response to emotion stimuli. Forty-eight 7-month-old infants viewed adult body postures expressing anger, fear, happiness, and neutral expressions, while their pupil size was measured. There was a significant emotion effect between 1040 and 1640 ms after image onset, when fear elicited larger pupil dilations than neutral expressions. A similar trend was found for anger expressions. Our results suggest that infants have increased arousal to negative-valence body expressions. Thus, in combination with previous fixation results, the pupil data show that infants as young as 7 months can perceptually discriminate static body expressions and process the emotional content of those expressions. The results extend information about infant processing of emotion expressions conveyed through other means (e.g., faces).

16.
Categorical perception, indicated by better discrimination between stimuli that cross a category boundary than between stimuli within a category, is an efficient manner of classification. The current study examined the development of categorical perception of emotional stimuli in infancy. We used morphed facial images to investigate whether infants find contrasts between emotional facial images that cross categorical boundaries more salient than those that do not, while matching the degree of difference in the two contrasts. Five‐month‐olds exhibited sensitivity to the categorical boundary between sadness and disgust, between happiness and surprise, as well as between sadness and anger, but not between anger and disgust. Even 9‐month‐olds failed to exhibit evidence of a definitive category boundary between anger and disgust. These findings indicate the presence of discrete boundaries between some, but not all, of the basic emotions early in life. Implications of these findings for the major theories of emotion representation are discussed.

17.
This article examines gender differences in the emotion management of men and women in the workplace. The belief in American culture that women are more emotional than men has limited women's opportunities in many types of work. Because emotional expression is often tightly controlled in the workplace, examining emotion management performed at work presents an opportunity to evaluate gender differences in response to similar working conditions. Previous research suggests that men and women do not differ in their experiences of emotion and that the expression of emotion is linked to status positions. An analysis of survey data collected from workers in a diverse group of occupations illustrates that women express anger less and happiness more than men in the workplace. Job and status characteristics explained the association between gender and anger management at work but were unrelated to the management of happiness expressions in the workplace.

18.
Facial expressions of emotion influence interpersonal trait inferences
Theorists have argued that facial expressions of emotion serve the interpersonal function of allowing one animal to predict another's behavior. Humans may extend these predictions into the indefinite future, as in the case of trait inference. The hypothesis that facial expressions of emotion (e.g., anger, disgust, fear, happiness, and sadness) affect subjects' interpersonal trait inferences (e.g., dominance and affiliation) was tested in two experiments. Subjects rated the dispositional affiliation and dominance of target faces with either static or apparently moving expressions. They inferred high dominance and affiliation from happy expressions, high dominance and low affiliation from angry and disgusted expressions, and low dominance from fearful and sad expressions. The findings suggest that facial expressions of emotion convey not only a target's internal state, but also differentially convey interpersonal information, which could potentially seed trait inference. This article constitutes a portion of my dissertation research at Stanford University, which was supported by a National Science Foundation Fellowship and an American Psychological Association Dissertation Award. Thanks to Nancy Alvarado, Chris Dryer, Paul Ekman, Bertram Malle, Susan Nolen-Hoeksema, Steven Sutton, Robert Zajonc, and more anonymous reviewers than I can count on one hand for their comments.

19.
The purpose of this study is to specify the characteristics that contribute to the perception of emotions expressed through dance movements and to develop an emotional model to show the relationships between impressions and the characteristics of expressive body movements. Six dancers expressed three different emotions through dance: joy, sadness, and anger. Observers (N = 192) rated both their impressions (33 dimensions) and the dance movements (26 characteristics) of 18 dance performances. The results showed that the observers could accurately perceive the emotional meanings expressed in the dances. The impressions of Dynamics, Expansion, and Stability—and the evaluated movements of Frequency and Velocity of Upward Extension, Frequency and Velocity of Downward Movements, Turning or Jumping, and Body Closing—were extracted via factor analysis as determinants of observers’ impressions of emotional expressions in dance. Additionally, covariance structure analysis and discriminant function analysis indicated that the emotional expressions of the dances expressing joy, sadness, and anger are each associated with particular factors. Through these analyses, we developed the Movements Impressions Emotions Model for dance.

20.
Humans perceive emotions in terms of categories, such as “happiness,” “sadness,” and “anger.” To learn these complex conceptual emotion categories, humans must first be able to perceive regularities in expressive behaviors (e.g., facial configurations) across individuals. Recent research suggests that infants spontaneously form “basic-level” categories of facial configurations (e.g., happy vs. fear), but not “superordinate” categories of facial configurations (e.g., positive vs. negative). The current studies further explore how infant age and language impact superordinate categorization of facial configurations associated with different negative emotions. Across all experiments, infants were habituated to one person displaying facial configurations associated with anger and disgust. While 10-month-olds formed a category of person identity (Experiment 1), 14-month-olds formed a category that included negative facial configurations displayed by the same person (Experiment 2). However, neither age formed the hypothesized superordinate category of negative valence. When a verbal label (“toma”) was added to each of the habituation events (Experiment 3), 10-month-olds formed a category similar to 14-month-olds in Experiment 2. These findings intersect a larger conversation about the nature and development of children's emotion categories and highlight the importance of considering developmental processes, such as language learning and attentional/memory development, in the design and interpretation of infant categorization studies.


Copyright © 北京勤云科技发展有限公司  京ICP备09084417号