Similar Articles
20 similar articles were retrieved.
1.
Darwin (1872) hypothesized that some facial muscle actions associated with emotion cannot be consciously inhibited, particularly when the to-be-concealed emotion is strong. The present study investigated emotional “leakage” in deceptive facial expressions as a function of emotional intensity. Participants viewed low- or high-intensity disgusting, sad, frightening, and happy images, responding to each with a 5 s videotaped genuine or deceptive expression. Each 1/30 s frame of the 1,711 expressions (256,650 frames in total) was analyzed for the presence and duration of universal expressions. Results strongly supported the inhibition hypothesis. In general, emotional leakage lasted longer in both the upper and lower face during high-intensity masked expressions than during low-intensity masked expressions. High-intensity emotion was more difficult to conceal than low-intensity emotion during emotional neutralization, leading to a greater likelihood of emotional leakage in the upper face. The greatest and least amounts of emotional leakage occurred during fearful and happy expressions, respectively. Untrained observers were unable to discriminate genuine and deceptive expressions above chance.
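The frame figures above follow directly from the coding parameters (5 s clips coded at one frame per 1/30 s). A minimal arithmetic check in Python, with variable names of our own choosing rather than the authors':

n_expressions = 1711
clip_duration_s = 5
frames_per_second = 30                                   # one coded frame every 1/30 s

frames_per_clip = clip_duration_s * frames_per_second    # 150 frames per expression
total_frames = n_expressions * frames_per_clip
print(frames_per_clip, total_frames)                     # 150 256650, matching the abstract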

2.
Cross-cultural and laboratory research indicates that some facial expressions of emotion are recognized more accurately and faster than others. We assessed the hypothesis that such differences depend on the frequency with which each expression occurs in social encounters. Thirty observers recorded how often they saw different facial expressions during natural conditions in their daily life. For a total of 90 days (3 days per observer), 2,462 samples of seen expressions were collected. Among the basic expressions, happy faces were observed most frequently (31%), followed by surprised (11.3%), sad (9.3%), angry (8.7%), disgusted (7.2%), and fearful faces, which were the least frequent (3.4%). A substantial proportion (29%) of non-basic emotional expressions (e.g., pride or shame) was also observed. We correlated our frequency data with recognition accuracy and response latency data from prior studies. In support of the hypothesis, significant correlations (generally above .70) emerged, with recognition accuracy increasing and latency decreasing as a function of frequency. We conclude that the efficiency of facial emotion recognition is modulated by the familiarity of the expressions.
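The analysis described above is a simple correlation between how often an expression is seen and how well it is recognized. The Python sketch below illustrates the idea; the frequency percentages come from the abstract, while the accuracy and latency values are invented placeholders, not data from the cited studies.

import numpy as np

emotions  = ["happy", "surprised", "sad", "angry", "disgusted", "fearful"]
frequency = np.array([31.0, 11.3, 9.3, 8.7, 7.2, 3.4])      # % observed (from the abstract)
accuracy  = np.array([0.95, 0.85, 0.80, 0.78, 0.75, 0.65])  # hypothetical recognition accuracy
latency   = np.array([650, 900, 950, 980, 1000, 1200])      # hypothetical response latency, ms

r_acc = np.corrcoef(frequency, accuracy)[0, 1]   # expected to be strongly positive
r_lat = np.corrcoef(frequency, latency)[0, 1]    # expected to be strongly negative
print(f"r(frequency, accuracy) = {r_acc:.2f}")
print(f"r(frequency, latency)  = {r_lat:.2f}")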

3.
Human body postures provide perceptual cues that can be used to discriminate and recognize emotions. It was previously found that 7-month-olds' fixation patterns discriminated fear from other emotion body expressions, but it is not clear whether they also process the emotional content of those expressions. The emotional content of visual stimuli can increase arousal level, resulting in pupil dilation. To provide evidence that infants also process the emotional content of expressions, we analyzed variations in pupil size in response to emotion stimuli. Forty-eight 7-month-old infants viewed adult body postures expressing anger, fear, happiness, and neutral expressions while their pupil size was measured. There was a significant emotion effect between 1040 and 1640 ms after image onset, when fear elicited larger pupil dilations than neutral expressions. A similar trend was found for anger expressions. Our results suggest that infants have increased arousal to negative-valence body expressions. Thus, in combination with previous fixation results, the pupil data show that infants as young as 7 months can perceptually discriminate static body expressions and process the emotional content of those expressions. The results extend information about infant processing of emotion expressions conveyed through other means (e.g., faces).
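A time-window pupil analysis of the kind reported above can be sketched as follows. The 1040-1640 ms window matches the abstract; the column names, 20 ms sampling step, and toy data are illustrative assumptions only, not the authors' pipeline.

import numpy as np
import pandas as pd

def window_mean(df, start_ms=1040, end_ms=1640):
    """Mean pupil size per emotion and trial within the analysis window."""
    in_window = df[(df["time_ms"] >= start_ms) & (df["time_ms"] <= end_ms)]
    return in_window.groupby(["emotion", "trial"])["pupil_mm"].mean()

# Toy data: two emotions x two trials, one sample every 20 ms, with a small
# simulated dilation for fear trials.
rng = np.random.default_rng(0)
rows = []
for emotion, dilation in [("fear", 0.2), ("neutral", 0.0)]:
    for trial in range(2):
        for t in range(0, 2000, 20):
            rows.append({"emotion": emotion, "trial": trial, "time_ms": t,
                         "pupil_mm": 3.0 + dilation + rng.normal(0, 0.05)})
df = pd.DataFrame(rows)

print(window_mean(df).groupby("emotion").mean())   # larger window mean for fear trials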

4.
Adults' perceptions provide information about the emotional meaning of infant facial expressions. This study asks whether similar facial movements influence adult perceptions of emotional intensity in both infant positive (smile) and negative (cry face) facial expressions. Ninety-five college students rated a series of naturally occurring and digitally edited images of infant facial expressions. Naturally occurring smiles and cry faces involving the co-occurrence of greater lip movement, mouth opening, and eye constriction were rated as expressing stronger positive and negative emotion, respectively, than expressions without these three features. Ratings of digitally edited expressions indicated that eye constriction contributed to higher ratings of positive emotion in smiles (i.e., in Duchenne smiles) and greater eye constriction contributed to higher ratings of negative emotion in cry faces. Stronger mouth opening contributed to higher ratings of arousal in both smiles and cry faces. These findings indicate that a similar set of facial movements is linked to perceptions of greater emotional intensity, whether the movements occur in positive or negative infant emotional expressions. This proposal is discussed with reference to discrete, componential, and dynamic systems theories of emotion.

5.
When we perceive the emotions of other people, we extract much information from the face. The present experiment used FACS (Facial Action Coding System), an instrument that measures the magnitude of facial action from a neutral face to a changed, emotional face. Japanese undergraduates judged the emotion in pictures of 66 static Japanese male faces (11 static pictures for each of six basic expressions: happiness, surprise, fear, anger, sadness, and disgust), ranging from neutral faces to maximally expressed emotions. The stimuli had previously been scored with FACS and were presented in random order. A high correlation was found between the subjects' judgments of facial expressions and the FACS scores.

6.
Research has demonstrated that humans detect threatening stimuli more rapidly than nonthreatening stimuli. Although the literature presumes that biases for threat should be normative, present early in development, evident across multiple forms of threat, and stable across individuals, developmental work in this area is limited. Here, we examine developmental differences in infants' (4- to 24-month-olds') attention to social (angry faces) and nonsocial (snakes) threats using a new age-appropriate dot-probe task. In Experiment 1, infants' first fixations were more often to snakes than to frogs, and they were faster to fixate probes that appeared in place of snakes vs. frogs. There were no significant age differences, suggesting that a perceptual bias for snakes is present early in life and stable across infancy. In Experiment 2, infants fixated probes more quickly after viewing any trials that contained an angry face compared to trials that contained a happy face. Further, there were age-related changes in infants' responses to face stimuli, with a general increase in looking time to faces before the probe and an increase in latency to fixate the probe after seeing angry faces. Together, this work suggests that different developmental mechanisms may be responsible for attentional biases for social vs. nonsocial threats.

7.
We report two studies which attempt to explain why some researchers found that neutral faces determine judgments of recognition as strongly as expressions of basic emotion, even in the presence of discrepant contextual information. In the first study we ruled out the possibility that neutral faces could have an intense but undetected emotional content: 60 students' dimensional ratings showed that 10 neutral faces were perceived as less emotional than 10 emotional expressions. In Study 2 we tested whether neutral faces can convey strong emotional messages in some contexts: 128 students' dimensional ratings of 36 discrepant combinations of neutral faces or expressions with contextual information were more predictable from the expressions when the contextual information consisted of common, everyday situations, but were more predictable from the neutral faces when the context was an uncommon, extreme situation. In line with our hypothesis, we discuss these paradoxical findings as being caused by the salience of neutral faces in some particular contexts. This research was conducted as part of the first author's doctoral dissertation and was supported by a grant (PS89-022) from the Spanish DGICyT. We thank David Weston for his help in preparing the text. We also thank two anonymous reviewers for their valuable comments on a previous draft of this article.

8.
To examine individual differences in decoding facial expressions, college students judged the type and emotional intensity of emotional faces at five intensity levels and completed questionnaires on family expressiveness, emotionality, and self-expressiveness. For decoding accuracy, family expressiveness was negatively related, with the strongest effects for more prototypical faces, and self-expressiveness was positively related. For perceptions of emotional intensity, family expressiveness was positively related, emotionality tended to be positively related, and self-expressiveness tended to be negatively related; these findings were all qualified by the level of ambiguity/clarity of the facial expressions. Decoding accuracy and perceived emotional intensity were also positively related to each other. Results suggest that even simple facial judgments are made through an interpretive lens partially created by family expressiveness, individuals' own emotionality, and self-expressiveness.

9.
Memory for in-group faces tends to be better than memory for out-group faces. Ackerman et al. (Psychological Science 17:836–840, 2006) found that this effect reverses when male faces display anger, supposedly due to their functional value in signaling intergroup threat. We explored the generalizability of this reverse effect. White participants viewed Black and White male or female faces displaying angry, fearful, or neutral expressions. Recognition accuracy for White male faces was better than for Black male faces when faces were neutral, but this reversed when the faces displayed anger or fear. For female targets, Black faces were generally better recognized than White faces, and female faces were better remembered when they displayed anger rather than fear, whereas male faces were better remembered when they displayed fear rather than anger. These findings are difficult to reconcile with a functional account and suggest (a) that the processing of male out-group faces is influenced by negative emotional expressions in general; and (b) that gender role expectations lead to differential remembering of male and female faces as a function of emotional expression.

10.
The bad genes and anomalous face overgeneralization accounts of facial preferences were tested by examining cue validity, cue utilization, and accuracy in judging health and intelligence from faces in the upper and lower halves of the distributions of attractiveness and its components: averageness, symmetry, and masculinity. Consistent with the bad genes hypothesis, facial attractiveness, averageness, symmetry, and male face masculinity each provided valid cues to intelligence and/or health for faces in the lower but not the upper halves of the distributions of these facial qualities. Consistent with the anomalous face overgeneralization hypothesis, attractiveness and its components were utilized as cues not only for faces in the lower halves of the distributions, but also for those in the upper halves. Intelligence and health were judged accurately for faces in the lower but not the upper half of the attractiveness distribution, and attractiveness mediated this accuracy at all ages except adolescence. Since adolescence is the prime mating age, the latter finding raises questions about the utility of attractiveness as an evolved mechanism to ensure the selection of high quality mates.

11.
It is unclear whether infants differentially process emotional faces in the brain at 5 months of age. Contradictory findings of previous research indicate that additional factors play a role in this process. The current study investigated whether five-month-old infants show differential brain activity between emotional faces. Furthermore, we explored the relation between emotional face processing and (I) stimulus characteristics, specifically spatial frequency content, and (II) parent, child, and dyadic qualities of interaction. Face-sensitive components (i.e., N290, P400, Nc) in response to neutral and fearful faces that contained only lower or higher spatial frequencies were assessed. Quality of parent–child interaction was assessed with the Manchester Assessment of Caregiver Infant Interaction (MACI). The results show that, for the full group, none of the components differed between emotional expressions. However, when splitting the group based on median MACI scores, infants who showed high quality of interaction (i.e., more attentiveness to the caregiver, positive and negative affect, and liveliness) processed emotions differently, whereas infants who showed low quality did not. These results indicate that a sub-group of infants shows differential emotional face processing at 5 months of age, which seems to relate to the quality of their behavior during the parent–child interaction.

12.
This preregistered study examined how face masks influenced face memory in a North American sample of 6- to 9-month-old infants (N = 58) born during the COVID-19 pandemic. Infants' memory was tested using a standard visual paired comparison (VPC) task. We crossed whether or not the faces were masked during familiarization and test, yielding four trial types (masked-familiarization/masked-test, unmasked-familiarization/masked-test, masked-familiarization/unmasked-test, and unmasked-familiarization/unmasked-test). Infants showed memory for the faces if the faces were unmasked at test, regardless of whether or not the face was masked during familiarization. However, infants did not show robust evidence of memory when test faces were masked, regardless of the familiarization condition. In addition, infants' bias for looking at the upper (eye) region was greater for masked than unmasked faces, although this difference was unrelated to memory performance. In summary, although the presence of face masks does appear to influence infants' processing of and memory for faces, they can form memories of masked faces and recognize those familiar faces even when unmasked.
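Several infant studies in this list (items 12, 15, and 16) rely on the visual paired comparison (VPC) logic: memory is inferred from a preference for the novel face at test. A minimal Python sketch of the usual novelty-preference score; the function name and the example looking times are hypothetical, not values from any of the studies.

def novelty_preference(novel_look_s: float, familiar_look_s: float) -> float:
    """Proportion of total test looking directed at the novel face."""
    total = novel_look_s + familiar_look_s
    return novel_look_s / total if total > 0 else float("nan")

# Scores reliably above 0.5 (chance) across infants are taken as evidence that
# the familiarized face was remembered.
print(novelty_preference(6.2, 4.1))   # ~0.60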

13.
14.
To investigate whether infants show neural signatures of recognizing unfamiliar human faces, we tested 9-month-olds (N = 31) in a rapid repetition ERP paradigm. Pictures of unfamiliar male and female faces (targets) were preceded either by a central attractor (Unprimed) or by a face (Primed). In the latter case, the prime faces were either identical to the target (Repeated) or not (Unrepeated). We compared processing of primed versus unprimed faces as well as processing of repeated versus unrepeated faces. Primed stimuli elicited decreased P1 amplitude, P1 latency, and N290 amplitude, indicating categorical repetition effects very early during the stream of processing. For repeated relative to unrepeated faces, N290 latency was reduced. In addition, we observed an enhanced late positivity at occipital channels for unrepeated compared to repeated male faces, but no difference for female faces. Taken together, these results suggest that 9-month-olds categorize faces before discriminating them individually. Furthermore, infants' ability to recognize face identity seems to depend on familiarity with the given face category, as indicated by differences in brain responses to male and female faces.
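Condition differences in component amplitude, as described above, are typically tested with a paired comparison across infants. A hedged sketch using simulated amplitudes; none of these numbers are the reported data, and only the sample size of 31 comes from the abstract.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_infants = 31

# Hypothetical per-infant mean P1 amplitudes (microvolts) in each condition,
# with a simulated priming-related decrease.
p1_unprimed = rng.normal(6.0, 1.5, n_infants)
p1_primed = p1_unprimed - rng.normal(0.8, 0.5, n_infants)

t, p = stats.ttest_rel(p1_primed, p1_unprimed)
print(f"paired t({n_infants - 1}) = {t:.2f}, p = {p:.3f}")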

15.
Anzures G, Pascalis O, Quinn PC, Slater AM, Lee K. Infancy, 2011, 16(6): 640-654.
An abundance of experience with own-race faces and limited to no experience with other-race faces has been associated with better recognition memory for own-race faces in infants, children, and adults. This study investigated the developmental origins of this other-race effect (ORE) by examining the role of a salient perceptual property of faces: skin color. Six- and 9-month-olds' recognition memory for own- and other-race faces was examined using infant-controlled habituation and visual paired comparison at test. Infants were shown own- or other-race faces in color or with skin color cues minimized in grayscale images. Results for the color stimuli replicated previous findings that infants show an ORE in face recognition memory. Results for the grayscale stimuli showed that even when a salient perceptual cue to race, such as skin color information, is minimized, 6- to 9-month-olds nonetheless show an ORE in their face recognition memory. Infants' use of shape-based and configural cues for face recognition is discussed.

16.
Perceptual narrowing, a phenomenon in which perception is broad from birth but narrows as a function of experience, has previously been tested with primate faces. In the first 6 months of life, infants can discriminate among individual human and monkey faces. Though the ability to discriminate monkey faces is lost after about 9 months, infants retain human face discrimination, presumably because of their experience with human faces. The current study demonstrates that 4- to 6-month-old infants are able to discriminate nonprimate faces as well. In a visual paired comparison test, 4- to 6-month-old infants (n = 26) looked significantly longer at novel sheep (Ovis aries) faces compared to a familiar sheep face (p = .017), while 9- to 11-month-olds (n = 26) showed no visual preference, and adults (n = 27) had a familiarity preference (p < .001). Infants' face recognition systems are broadly tuned at birth, not just for primate faces but for nonprimate faces as well, allowing infants to become specialists in recognizing the types of faces encountered in their first year of life.

17.
Sex, age, and education differences in facial affect recognition were assessed within a large sample (n = 7,320). Results indicate superior performance by females and younger individuals in the correct identification of facial emotion, with the largest advantage for low-intensity expressions. Though there were no demographic differences in identification accuracy for neutral faces, controlling for response biases by males and older individuals to label faces as neutral revealed sex and age differences for these items as well. This finding suggests that inferior facial affect recognition performance by males and older individuals may be driven primarily by instances in which they fail to detect the presence of emotion in facial expressions. Older individuals also demonstrated a greater tendency to label faces with negative emotion choices, while females exhibited a response bias toward sad and fear labels. These response biases have implications for understanding demographic differences in facial affect recognition.

18.
Experiment 1 investigated whether infants process facial identity and emotional expression independently or in conjunction with one another. Eight-month-old infants were habituated to two upright or two inverted faces varying in facial identity and emotional expression. Infants were tested with a habituation face, a switch face, and a novel face. In the switch faces, a new combination of identity and emotional expression was presented. The results show that infants differentiated between switch and habituation faces only in the upright condition but not in the inverted condition. Experiment 2 provides evidence that infants' nonresponse in the inverted condition can be attributed to their independent processing of facial identity and emotional expression. This suggests that infants in the upright condition processed facial identity and emotional expression in conjunction with one another.

19.
Emotion regulation is an important developmental task of the early years of life. However, situational effects are rarely examined. In this study, we evaluated situational effects on 7-month-olds' and their mothers' emotional expression and interactive regulation behavior, individual differences across situations, and intercorrelations within situations. Mother-infant dyads (N = 225) were observed interacting during episodes from play, teaching, and still-face situations that varied along 2 developmentally salient dimensions: emotional challenge (low vs. high) and attentional focus (face-to-face vs. object). Attentional focus affected mothers' behavior, whereas both challenge and attentional focus affected infants. Associations between mother and infant behaviors varied in each situation. High-challenge situations provided more consistent individual differences in infants and more negative behavior from mothers. Findings have implications for appropriate assessment of emotion regulation in infancy.

20.
The effect of prior otitis media with effusion (OME) or current middle ear effusion (MEE) on phonetic perception was examined by testing infants' discrimination of boo and goo syllables in 2 test sessions. Middle ear function was assessed following each perception test using tympanometry. Perceptual performance was compared across 3 infant groups: (a) history-negative, infants with normal middle ear function who had never received medical treatment for OME; (b) history-positive, infants with normal middle ear function who received medical treatment for prior episodes of OME; and (c) MEE, infants presenting tympanograms indicating middle ear effusion on the day of testing. History-negative infants performed significantly better than MEE infants in both test sessions. History-negative infants also performed significantly better than history-positive infants in the 2nd test session. Findings suggest that OME has a negative impact on infant phonetic discrimination that may persist even after middle ear function has returned to normal.

