Similar Documents
20 similar documents found (search time: 15 ms)
1.
The Intensity of Emotional Facial Expressions and Decoding Accuracy (total citations: 2; self-citations: 0; citations by others: 2)
The influence of the physical intensity of emotional facial expressions on perceived intensity and emotion category decoding accuracy was assessed for expressions of anger, disgust, sadness, and happiness. The facial expressions of two men and two women posing each of the four emotions were used as stimuli. Six different levels of intensity of expression were created for each pose using a graphics morphing program. Twelve men and 12 women rated each of the 96 stimuli for perceived intensity of the underlying emotion and for the qualitative nature of the emotion expressed. The results revealed that perceived intensity varied linearly with the manipulated physical intensity of the expression. Emotion category decoding accuracy varied largely linearly with the manipulated physical intensity of the expression for expressions of anger, disgust, and sadness. For the happiness expressions only, the findings were consistent with a categorical judgment process. Sex of encoder produced significant effects for both dependent measures. These effects remained even after possible gender differences in encoding were controlled for, suggesting a perceptual bias on the part of the decoders.
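The graded stimuli described above were produced with a graphics morphing program. As a rough illustration of the underlying idea only, the sketch below generates six intensity levels by cross-fading pixel values between a neutral photograph and the same poser's full expression; real morphing software additionally warps facial geometry between the two images, and the file names here are hypothetical.

```python
# Sketch: graded-intensity expression stimuli via linear cross-fading.
# Assumes two aligned, same-sized photos of one poser; file names are
# hypothetical placeholders. Real morphing also warps geometry.
import numpy as np
from PIL import Image

neutral = np.asarray(Image.open("poser1_neutral.png"), dtype=np.float64)
apex = np.asarray(Image.open("poser1_anger_full.png"), dtype=np.float64)

levels = 6  # six intensity levels per pose, as in the study
for i in range(1, levels + 1):
    alpha = i / levels                        # fraction of the full expression
    blend = (1 - alpha) * neutral + alpha * apex
    Image.fromarray(blend.astype(np.uint8)).save(f"poser1_anger_level{i}.png")
```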

2.
Categorical perception, demonstrated as reduced discrimination of within‐category relative to between‐category differences in stimuli, has been found in a variety of perceptual domains in adults. To examine the development of categorical perception in the domain of facial expression processing, we used behavioral and event‐related potential (ERP) methods to assess discrimination of within‐category (happy‐happy) and between‐category (happy‐sad) differences in facial expressions in 7‐month‐old infants. Data from a visual paired‐comparison test and recordings of attention‐sensitive ERPs showed no discrimination of facial expressions in the within‐category condition, whereas reliable discrimination was observed in the between‐category condition. The results also showed that face‐sensitive ERPs over occipital‐temporal scalp (P400) were attenuated in the within‐category condition relative to the between‐category condition, suggesting a potential neural basis for the reduced within‐category sensitivity. Together, these results suggest that the neural systems underlying categorical representation of facial expressions emerge during the early stages of postnatal development, before acquisition of language.

3.
The specificity predicted by differential emotions theory (DET) for early facial expressions in response to 5 different eliciting situations was studied in a sample of 4‐month‐old infants (n = 150). Infants were videotaped during tickle, sour taste, jack‐in‐the‐box, arm restraint, and masked‐stranger situations and their expressions were coded second by second. Infants showed a variety of facial expressions in each situation; however, more infants exhibited positive (joy and surprise) than negative expressions (anger, disgust, fear, and sadness) across all situations except sour taste. Consistent with DET‐predicted specificity, joy expressions were the most common in response to tickling, and were less common in response to other situations. Surprise expressions were the most common in response to the jack‐in‐the‐box, as predicted, but also were the most common in response to the arm restraint and masked‐stranger situations, indicating a lack of specificity. No evidence of predicted specificity was found for anger, disgust, fear, and sadness expressions. Evidence of individual differences in expressivity within situations, as well as stability in the pattern across situations, underscores the need to examine both child and contextual factors in studying emotional development. The results provide little support for the DET postulate of situational specificity and suggest that a synthesis of differential emotions and dynamic systems theories of emotional expression should be considered.

4.
We assessed the impact of social context on the judgment of emotional facial expressions as a function of self-construal and decoding rules. German and Greek participants rated spontaneous emotional faces shown either alone or surrounded by other faces with congruent or incongruent facial expressions. Greek participants were higher in interdependence than German participants. In line with cultural decoding rules, Greek participants rated anger expressions as less intense and sadness and disgust expressions as more intense. Social context affected the ratings of the two groups in different ways. In the more interdependent culture (Greece), participants perceived anger least intensely when the group showed neutral expressions, whereas sadness expressions were rated as most intense in the absence of social context. In the independent culture (Germany), a group context (others expressing anger or happiness) additionally amplified the perception of angry and happy expressions. In line with the notion that these effects are mediated by more holistic processing linked to higher interdependence, this difference disappeared when we controlled for interdependence at the individual level. The findings confirm the usefulness of considering both country-level and individual-level factors when studying cultural differences.

5.
This study examined the emergence of affect specificity in infancy. Infants received verbal and facial signals of two different, negatively valenced emotions (fear and sadness), as well as neutral affect, via a television monitor to determine if they could make qualitative distinctions among emotions of the same valence. Twenty 12‐ to 14‐month‐olds and twenty 16‐ to 18‐month‐olds were examined. Results suggested that younger infants showed no evidence of referential specificity, as they responded similarly to both the target and distracter toys, and no evidence of affect specificity, as their play did not differ between affect conditions. Older infants, in contrast, showed evidence of both referential and affect specificity. With respect to affect specificity, 16‐ to 18‐month‐olds touched the target toy less in the fear condition than in the sad condition and showed a larger proportion of negative facial expressions in the sad condition than in the fear condition. These findings suggest that affect specificity in relating emotional messages to objects emerges developmentally after 15 months of age.

6.
When we perceive the emotions of other people, we extract much information from the face. The present experiment used the Facial Action Coding System (FACS), an instrument that measures the magnitude of facial actions as a face changes from neutral to an emotional expression. Japanese undergraduates judged the emotion in pictures of 66 static Japanese male faces (11 static pictures for each of six basic expressions: happiness, surprise, fear, anger, sadness, and disgust), ranging from neutral faces to maximally expressed emotions. The stimuli had previously been scored with FACS and were presented in random order. A high correlation was found between the subjects' judgments of the facial expressions and the FACS scores.
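The reported agreement is a correlation computed across the 66 stimuli between the FACS-derived magnitude scores and the observers' judgments. A minimal sketch of that analysis, using made-up stand-in numbers rather than the study's data:

```python
# Sketch: Pearson correlation between FACS magnitude scores and mean
# emotion-intensity judgments across 66 stimuli. All values below are
# synthetic stand-ins, not the study's data.
import numpy as np

rng = np.random.default_rng(0)
facs_scores = rng.uniform(0.0, 5.0, size=66)               # hypothetical FACS magnitudes
judgments = 0.8 * facs_scores + rng.normal(0.0, 0.5, 66)   # hypothetical mean ratings

r = np.corrcoef(facs_scores, judgments)[0, 1]
print(f"Pearson r = {r:.2f}")
```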

7.
We examined whether 18‐month‐olds understand how the emotional valence of people's experiences predicts their subsequent emotional reactions, as well as how their behaviors are influenced by the reliability of the emoter. Infants watched a person express sadness after receiving an object that was either inappropriate (conventional emoter) or appropriate (unconventional emoter) to perform an action. Then, infants’ imitation, social referencing, and prosocial behaviors (helping) were examined when interacting with the person. Results showed that during the exposure phase, the unconventional group showed visual search patterns suggesting hypothesis testing and expressed less concern toward the person than the conventional group. In the social referencing task, the conventional group preferred to search for the target of a positive expression as opposed to the disgust object. In contrast, the unconventional group was more likely to trust the person's negative expression. As expected, no differences were found between the groups on the instrumental helping tasks. However, during the empathic helping tasks, the conventional group needed fewer prompts to help than the unconventional group. These findings provide the first evidence that the congruence between a person's emotional responses and her experiences impacts 18‐month‐olds’ subsequent behaviors toward that person.

8.
Facial expressions related to sadness are a universal signal of nonverbal communication. Although results of many psychology studies have shown that drooping of the lip corners, raising of the chin, and oblique eyebrow movements (a combination of inner brow raising and brow lowering) express sadness, no study has characterized facial behavior under well-controlled circumstances in people actually experiencing sadness itself. Spontaneous facial expressions associated with sadness therefore remain unclear. We conducted this study to document the spontaneous facial expressions of sadness. We recorded the spontaneous facial expressions of a group of participants as they experienced sadness during an emotion-elicitation task, which required a participant to recall neutral and sad memories while listening to music. We subsequently conducted a detailed analysis of their sad and neutral expressions using the Facial Action Coding System. The prototypical sadness expressions identified in earlier studies were not observed when people experienced sadness as an internal state under non-social circumstances. By contrast, participants expressed tension around the mouth, which might function as a form of suppression. Furthermore, some of these facial actions were related not only to sad experiences but also to other emotional experiences such as disgust, fear, anger, and happiness. This study revealed the possibility that previously undescribed facial expressions contribute to the experience of sadness as an internal state.

9.
Do infants show distinct negative facial expressions for different negative emotions? To address this question, European American, Chinese, and Japanese 11‐month‐olds were videotaped during procedures designed to elicit mild anger or frustration and fear. Facial behavior was coded using Baby FACS, an anatomically based scoring system. Infants' nonfacial behavior differed across procedures, suggesting that the target emotions were successfully elicited. However, evidence for distinct emotion‐specific facial configurations corresponding to fear versus anger was not obtained. Although facial responses were largely similar across cultures, some differences also were observed. Results are discussed in terms of functionalist and dynamical systems approaches to emotion and emotional expression.

10.
Research has demonstrated that infants recognize emotional expressions of adults in the first half year of life. We extended this research to a new domain, infant perception of the expressions of other infants. In an intermodal matching procedure, 3.5‐ and 5‐month‐old infants heard a series of infant vocal expressions (positive and negative affect) along with side‐by‐side dynamic videos in which one infant conveyed positive facial affect and another infant conveyed negative facial affect. Results demonstrated that 5‐month‐olds matched the vocal expressions with the affectively congruent facial expressions, whereas 3.5‐month‐olds showed no evidence of matching. These findings indicate that by 5 months of age, infants detect, discriminate, and match the facial and vocal affective displays of other infants. Further, because the facial and vocal expressions were portrayed by different infants and shared no face–voice synchrony, temporal, or intensity patterning, matching was likely based on detection of a more general affective valence common to the face and voice.

11.
Children have repeatedly been found to recognize facial expressions of fear and disgust less well than those of other basic emotions. We undertook two studies investigating the recognition and visual discrimination of these expressions in school-age children. In Study 1, children (5, 6, 9, and 10 years of age) were shown pairs of facial expressions and asked to tell which one depicted a target emotion. The results indicated that accuracy in 9- and 10-year-olds was higher than in 5- and 6-year-olds for three contrasts: disgust–anger, fear–surprise, and fear–sadness. Younger children had more difficulty recognizing disgust when it was presented along with anger, and recognizing fear when it was presented along with surprise. In Study 2, children (5, 6, 9, and 10 years of age) were shown a target expression along with two other expressions and asked to point to the expression most similar to the target. Contrary to our expectations, even 5- and 6-year-olds were very accurate in discriminating fear and disgust from the other emotions, suggesting that visual perception is not the main limiting factor for the recognition of these emotions in school-age children.

12.
Several studies have shown that at 7 months of age, infants display an attentional bias toward fearful facial expressions. In this study, we analyzed visual attention and heart rate data from a cross‐sectional study with 5‐, 7‐, 9‐, and 11‐month‐old infants (Experiment 1) and visual attention from a longitudinal study with 5‐ and 7‐month‐old infants (Experiment 2) to examine the emergence and stability of the attentional bias to fearful facial expressions. In both experiments, the attentional bias to fearful faces appeared to emerge between 5 and 7 months of age: 5‐month‐olds did not show a difference in disengaging attention from fearful and nonfearful faces, whereas 7‐ and 9‐month‐old infants had a lower probability of disengaging attention from fearful than nonfearful faces. Across the age groups, heart rate (HR) data (Experiment 1) showed a more pronounced and longer‐lasting HR deceleration to fearful than nonfearful expressions. The results are discussed in relation to the development of the perception and experience of fear and the interaction between emotional and attentional processes.
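The bias measure here is the probability of disengaging attention from a centrally presented face. In paradigms of this kind, each trial is typically coded as a gaze shift to a peripheral target (disengaged) or not; a minimal sketch of the per-condition probability, with hypothetical trial codes:

```python
# Sketch: probability of disengaging attention per expression condition.
# Each trial is coded 1 (gaze shifted to the peripheral target) or
# 0 (gaze stayed on the face). Trial outcomes below are hypothetical.
trials = {
    "fearful":    [0, 0, 1, 0, 0, 1, 0, 0],
    "nonfearful": [1, 1, 0, 1, 1, 0, 1, 1],
}

for condition, outcomes in trials.items():
    p_disengage = sum(outcomes) / len(outcomes)
    print(f"P(disengage | {condition}) = {p_disengage:.2f}")
```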

13.
Humans perceive emotions in terms of categories, such as “happiness,” “sadness,” and “anger.” To learn these complex conceptual emotion categories, humans must first be able to perceive regularities in expressive behaviors (e.g., facial configurations) across individuals. Recent research suggests that infants spontaneously form “basic-level” categories of facial configurations (e.g., happy vs. fear), but not “superordinate” categories of facial configurations (e.g., positive vs. negative). The current studies further explore how infant age and language impact superordinate categorization of facial configurations associated with different negative emotions. Across all experiments, infants were habituated to one person displaying facial configurations associated with anger and disgust. While 10-month-olds formed a category of person identity (Experiment 1), 14-month-olds formed a category that included negative facial configurations displayed by the same person (Experiment 2). However, neither age formed the hypothesized superordinate category of negative valence. When a verbal label (“toma”) was added to each of the habituation events (Experiment 3), 10-month-olds formed a category similar to 14-month-olds in Experiment 2. These findings connect to a larger conversation about the nature and development of children's emotion categories and highlight the importance of considering developmental processes, such as language learning and attentional/memory development, in the design and interpretation of infant categorization studies.
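Habituation in studies like this is usually infant-controlled: trials repeat until looking time drops below some fraction of its initial level. A minimal sketch of one common criterion (mean of the last three trials below 50% of the mean of the first three); the 50% threshold and three-trial window are general conventions, not necessarily this study's settings, and the function name is hypothetical:

```python
# Sketch: a common infant-controlled habituation criterion. Habituation
# is declared when mean looking time over the last `window` trials falls
# below `ratio` times the mean of the first `window` trials.
# The 50%/3-trial convention is an assumption, not this study's spec.
def habituated(looking_times_s, window=3, ratio=0.5):
    if len(looking_times_s) < 2 * window:
        return False
    baseline = sum(looking_times_s[:window]) / window
    recent = sum(looking_times_s[-window:]) / window
    return recent < ratio * baseline

looks = [12.4, 10.8, 11.5, 7.2, 5.1, 4.3]  # hypothetical looking times (s)
print(habituated(looks))  # True: 5.53 s < 0.5 * 11.57 s
```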

14.
Because of the close connection between culture and language, a number of writers have suggested that bilinguals will differ in their behavior because of differences in the degree of assimilation of different cultures in the same individual. We tested this notion by obtaining data from bilingual (English and Hindi) college students in India using a well-studied cross-cultural research paradigm involving emotional perception. Subjects judged universal facial expressions of emotion in two separate sessions, one conducted entirely in English, the other in Hindi. In each session, they judged which emotion was being portrayed, and how intensely. Subjects recognized anger, fear, and sadness more accurately in English than in Hindi. They also attributed greater intensity to female photos of anger when rating in Hindi, but attributed greater intensity to female photos of sadness when rating in English. These findings were discussed in relation to the theoretical connection between culture and language.

15.
The perception of emotional facial expressions may activate corresponding facial muscles in the receiver, a phenomenon referred to as facial mimicry. Facial mimicry is highly dependent on the context and type of facial expressions. Whereas previous research almost exclusively investigated mimicry in response to pictures or videos of emotional expressions, studies with a real, face-to-face partner are still rare. Here we compared facial mimicry of angry, happy, and sad expressions and emotion recognition in a dyadic face-to-face setting. In sender-receiver dyads, we recorded facial electromyograms in parallel. Senders communicated to the receivers, with facial expressions only, the emotions felt during specific personal situations in the past, eliciting anger, happiness, or sadness. Receivers mimicked happiness most, sadness to a lesser degree, and anger least. In actor-partner interdependence models we showed that the receivers' own facial activity influenced their ratings, which increased the agreement between the senders' and receivers' ratings for happiness, but not for angry and sad expressions. These results are in line with the Emotion Mimicry in Context view, which holds that humans mimic happy expressions according to affiliative intentions. The mimicry of sad expressions is less intense, presumably because it signals empathy and might imply personal costs. Direct anger expressions are mimicked the least, possibly because anger communicates threat and aggression. Taken together, we show that incidental facial mimicry in a face-to-face setting is positively related to recognition accuracy for non-stereotypical happy expressions, supporting the functionality of facial mimicry.

16.
Although facial expressions are inherently dynamic phenomena, much of our understanding of how infants attend to and scan facial expressions is based on static face stimuli. Here we investigate how six-, nine-, and twelve-month-old infants allocate their visual attention toward dynamic-interactive videos of the six basic emotional expressions, and compare their responses with static images of the same stimuli. We find that infants show clear differences in how they attend to and scan dynamic and static expressions, looking longer toward the dynamic-face and lower-face regions. Infants across all age groups show differential interest in expressions and precise scanning of regions “diagnostic” for emotion recognition. These data also indicate that infants' attention toward dynamic expressions develops over the first year of life, including relative increases in interest and scanning precision toward some negative facial expressions (e.g., anger, fear, and disgust).
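Findings such as "looking longer toward the dynamic-face and lower-face regions" come from an areas-of-interest (AOI) analysis: fixation durations are summed per region and normalized by total looking time per stimulus type. A minimal sketch with hypothetical fixation records:

```python
# Sketch: proportion of looking time per AOI (upper vs. lower face),
# split by stimulus type. Fixation records are hypothetical; real data
# would come from an eye tracker.
from collections import defaultdict

# (stimulus_type, aoi, fixation_duration_ms)
fixations = [
    ("dynamic", "lower_face", 420), ("dynamic", "upper_face", 310),
    ("dynamic", "lower_face", 505), ("static", "upper_face", 380),
    ("static", "lower_face", 290), ("static", "upper_face", 450),
]

total_by_stim = defaultdict(float)
dur_by_stim_aoi = defaultdict(float)
for stim, aoi, dur in fixations:
    total_by_stim[stim] += dur
    dur_by_stim_aoi[(stim, aoi)] += dur

for (stim, aoi), dur in sorted(dur_by_stim_aoi.items()):
    print(f"{stim:8s} {aoi:11s} {dur / total_by_stim[stim]:.2f}")
```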

17.
Like faces, bodies are significant sources of social information. However, research suggests that infants do not develop body representation (i.e., knowledge about typical human bodies) until the second year of life, although they are sensitive to facial information much earlier. Yet, previous research only examined whether infants are sensitive to the typical arrangement of body parts. We examined whether younger infants have body knowledge of a different kind, namely the relative size of body parts. Five‐ and 9‐month‐old infants were tested for their preference between a normal versus a proportionally distorted body. Nine‐month‐olds exhibited a preference for the normal body when images were presented upright but not when they were inverted. Five‐month‐olds failed to exhibit a preference in either condition. These results indicate that infants have knowledge about human bodies by the second half of the first year of life. Moreover, given that better performance on upright than on inverted stimuli has been tied to expertise, the fact that older infants exhibited an inversion effect with body images indicates that at least some level of expertise in body processing develops by 9 months of age.

18.
The facial expressions of emotion and the circumstances under which the expressions occurred in a sample of the most popular children's television programs were investigated in this study. Fifteen-second randomly selected intervals from episodes of five television programs were analyzed for displays of happiness, sadness, anger, fear, disgust, and surprise. In addition, the contexts in which the emotions occurred were examined. Results indicated that particular emotional expressions occurred at significantly different frequencies and that there was an association between emotional displays and emotion-contexts. The high rate of emotional displays found in television shows has implications for the development of knowledge regarding emotional display rules in viewers.

19.
Accurate assessment of emotion requires the coordination of information from different sources such as faces, bodies, and voices. Adults readily integrate facial and bodily emotions. However, not much is known about the developmental origin of this capacity. Using a familiarization paired‐comparison procedure, 6.5‐month‐olds in the current experiments were familiarized to happy, angry, or sad emotions in faces or bodies and tested with the opposite image type portraying the familiar emotion paired with a novel emotion. Infants looked longer at the familiar emotion across faces and bodies (except when familiarized to angry images and tested on the happy/angry contrast). This matching occurred not only for emotions from different affective categories (happy, angry) but also within the negative affective category (angry, sad). Thus, 6.5‐month‐olds, like adults, integrate emotions from bodies and faces in a fairly sophisticated manner, suggesting rapid development of emotion processing early in life.

20.
A series of 3 experiments is reviewed in which infants between 4 and 10 months of age were familiarized with members of 2 basic‐level object categories. The degree of distinctiveness between categories was varied. Preference tests were intended to determine whether infants formed a single category representation (at a more global level) or 2 basic‐level representations. Across the 3 experiments, 10‐month‐old infants appeared to have formed multiple basic‐level categories, whereas younger infants tended to form broader, more inclusive representations. The tendency to form multiple categories was influenced to some extent by category distinctiveness. Whereas 10‐month‐olds formed separate categories for all contrasts, 7‐month‐olds did so only when the 2 familiarized categories were from separate global domains. A perceptual account of the global‐to‐basic shift in early categorization is offered. Task dependencies in early categorization are also discussed.
