Similar documents
20 similar documents retrieved (search time: 140 ms)
1.
The current study examined the effects of institutionalization on the discrimination of facial expressions of emotion in three groups of 42‐month‐old children. One group consisted of children abandoned at birth who were randomly assigned to Care‐as‐Usual (institutional care) following a baseline assessment. Another group consisted of children abandoned at birth who were randomly assigned to high‐quality foster care following a baseline assessment. A third group consisted of never‐institutionalized children who were reared by their biological parents. All children were familiarized to happy, sad, fearful, and neutral facial expressions and tested on their ability to discriminate familiar versus novel facial expressions. Contrary to our prediction, all three groups of children were equally capable of discriminating among the different expressions. Furthermore, in contrast to findings at 13–30 months of age, these same children showed familiarity rather than novelty preferences toward different expressions. There were also asymmetries in children’s discrimination of facial expressions depending on which facial expression served as the familiar versus novel stimulus. Collectively, these findings suggest that early institutionalization does not affect the development of the ability to discriminate facial expressions of emotion, at least when preferential looking serves as the dependent measure. These findings are discussed in the context of the myriad domains that are affected by early institutionalization.

2.
Quinn and Liben (2008) reported a sex difference on a mental rotation task in which 3‐ to 4‐month‐olds were familiarized with a shape in different rotations and then tested with a novel rotation of the familiar shape and its mirror image. As a group, males but not females showed a significant preference for the mirror image, a pattern paralleled at the individual level (with most males but fewer than half of the females showing the preference). Experiment 1 examined a possible explanation for this performance difference, namely, that females were more sensitive to the angular differences in the familiarized shape. Three‐ to 4‐month‐olds were given a discrimination task involving familiarization with a shape at a given rotation and preference testing with the shape in the familiarized versus a novel rotation. Females and males preferred the novel rotation, with no sex difference observed. This finding did not provide support for the suggestion that the sex difference in mental rotation is explained by differential sensitivity to angular rotation. Experiment 2 revealed that the sex difference in mental rotation is observed in 6‐ to 7‐month‐olds and 9‐ to 10‐month‐olds, suggesting that a sex difference in mental rotation is present at multiple ages during infancy.

3.
Research examining infants’ discrimination of affect often uses unfamiliar faces and voices of adults. Recently, research has examined infant discrimination of affect in familiar faces and voices. In much of this research, infants were habituated to the affective expressions using a “standard” 50% habituation criterion. We extend this line of research by examining infants’ discrimination of unfamiliar peers’ (i.e., 4‐month‐olds’) dynamic facial and vocal affective expressions and assessing how discrimination is affected by changing the habituation criterion. In two experiments, using an infant‐controlled habituation design, we explored 3‐ and 5‐month‐olds’ discrimination of their peers’ dynamic audiovisual displays of positive and negative expressions of affect. Results of Experiment 1, using a 50% habituation criterion, revealed that 5‐month‐olds, but not 3‐month‐olds, discriminated the affective expressions of their peers. In Experiment 2, we examined whether 3‐month‐olds’ lack of discrimination in Experiment 1 was a result of insufficient habituation (i.e., familiarization). Specifically, 3‐month‐olds were habituated using a 70% habituation criterion, providing them with longer familiarization time. Results revealed that using the more stringent habituation criterion, 3‐month‐olds showed longer habituation times, that is, increased familiarization, and discriminated their peers’ affective expressions. Results are discussed in terms of infants’ discrimination of affect, the role of familiarization time, and limitations of the 50% habituation criterion.
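The habituation criteria discussed above can be made concrete. Below is a minimal sketch of an infant‐controlled habituation rule; the three‐trial window, the baseline definition, and the function name are illustrative assumptions, not the exact procedure reported in the study.

```python
def habituated(look_times, decrement=0.5, window=3):
    """Sketch of an infant-controlled habituation rule (assumed, not the
    study's exact parameters): habituation is declared once mean looking
    time over the last `window` trials has declined by at least
    `decrement` relative to the mean of the first `window` trials."""
    if len(look_times) < 2 * window:
        return False  # not enough trials to compare baseline vs. recent
    baseline = sum(look_times[:window]) / window
    recent = sum(look_times[-window:]) / window
    return recent <= (1 - decrement) * baseline
```

Under this sketch, a 70% decrement criterion requires looking to fall further than a 50% criterion does, which is one way to see why the stricter criterion yields longer familiarization times before testing.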

4.
Given the importance of infants' perception of bimodal speech for emerging language and emotion development, this study used eye‐tracking technology to examine infants' attention to face+voice displays differing by emotion (fear, sad, happy) and visual stimulus (dynamic versus static). Peripheral distracters were presented to measure attention disengagement. It was predicted that infants would look longer at and disengage more slowly from dynamic bimodal emotion displays, especially when viewing dynamic fear. However, results from twenty‐two 10‐month‐olds showed significantly greater attention on dynamic versus static trials, independent of emotion. Interestingly, infants looked equally to mouth and eye regions of speakers' faces except when viewing/hearing dynamic fear; in this case, they fixated more on the speakers' mouth region. Average latencies to distracters were longer for dynamic compared to static bimodal stimuli, but not differentiated by emotion. Thus, infants' attention was enhanced (in terms of both elicitation and maintenance) by dynamic, bimodal emotion displays. Results are compared to conflicting findings using static emotion displays, with suggestions for future research using more ecologically relevant dynamic, multimodal displays to gain a richer understanding of infants' processing of emotion.

5.
This study examined the emergence of affect specificity in infancy. In this study, infants received verbal and facial signals of 2 different, negatively valenced emotions (fear and sadness) as well as neutral affect via a television monitor to determine if they could make qualitative distinctions among emotions of the same valence. Twenty 12‐ to 14‐month‐olds and 20 16‐ to 18‐month‐olds were examined. Results suggested that younger infants showed no evidence of referential specificity, as they responded similarly to both the target and distracter toys, and showed no evidence of affect specificity, showing no difference in play between affect conditions. Older infants, in contrast, showed evidence both of referential and affect specificity. With respect to affect specificity, 16‐ to 18‐month‐olds touched the target toy less in the fear condition than in the sad condition and showed a larger proportion of negative facial expressions in the sad condition versus the fear condition. These findings suggest a developmental emergence after 15 months of age for affect specificity in relating emotional messages to objects.

6.
Behavioral and electrophysiological evidence suggests a gradual, experience‐dependent specialization of cortical face processing systems that takes place largely in the 1st year of life. To further investigate these findings, event‐related potentials (ERPs) were collected from typically developing 9‐month‐old infants presented with pictures of familiar and unfamiliar monkey or human faces in 2 different orientations. Analyses revealed differential processing across changes in monkey and human faces. The N290 was greater for familiar compared to unfamiliar faces, regardless of species or orientation. In contrast, the P400 to unfamiliar faces was greater than to familiar faces, but only for the monkey condition. The P400 to human faces differentiated the orientation of both familiar and unfamiliar faces. These results suggest more specific processing of human compared to monkey faces in 9‐month‐olds.

7.
Research has demonstrated that humans detect threatening stimuli more rapidly than nonthreatening stimuli. Although the literature presumes that biases for threat should be normative, present early in development, evident across multiple forms of threat, and stable across individuals, developmental work in this area is limited. Here, we examine the developmental differences in infants' (4‐ to 24‐month‐olds) attention to social (angry faces) and nonsocial (snakes) threats using a new age‐appropriate dot‐probe task. In Experiment 1, infants' first fixations were more often to snakes than to frogs, and they were faster to fixate probes that appeared in place of snakes vs. frogs. There were no significant age differences, suggesting that a perceptual bias for snakes is present early in life and stable across infancy. In Experiment 2, infants fixated probes more quickly after viewing any trials that contained an angry face compared to trials that contained a happy face. Further, there were age‐related changes in infants' responses to face stimuli, with a general increase in looking time to faces before the probe and an increase in latency to fixate the probe after seeing angry faces. Together, this work suggests that different developmental mechanisms may be responsible for attentional biases for social vs. nonsocial threats.
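The dot‐probe logic described above compares latencies to fixate probes that appear at threat versus non‐threat locations. The following sketch computes a simple difference‐score bias index; the function name and the scoring convention are illustrative assumptions, not the authors' exact analysis.

```python
def attention_bias(threat_latencies, neutral_latencies):
    """Assumed dot-probe bias score (ms): mean latency to fixate probes
    replacing non-threat stimuli minus mean latency to fixate probes
    replacing threat stimuli. A positive score indicates faster
    orienting to locations previously occupied by threat."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(neutral_latencies) - mean(threat_latencies)
```

With this convention, the Experiment 1 finding (faster fixation of probes in place of snakes than frogs) would correspond to a positive bias score.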

8.
The perception of emotional facial expressions may activate corresponding facial muscles in the receiver, also referred to as facial mimicry. Facial mimicry is highly dependent on the context and type of facial expressions. While previous research almost exclusively investigated mimicry in response to pictures or videos of emotional expressions, studies with a real, face-to-face partner are still rare. Here we compared facial mimicry of angry, happy, and sad expressions and emotion recognition in a dyadic face-to-face setting. In sender-receiver dyads, we recorded facial electromyograms in parallel. Senders communicated to the receivers—with facial expressions only—the emotions felt during specific personal situations in the past, eliciting anger, happiness, or sadness. Receivers mimicked happiness most, sadness to a lesser degree, and anger least. In actor-partner interdependence models we showed that the receivers’ own facial activity influenced their ratings, which increased the agreement between the senders’ and receivers’ ratings for happiness, but not for angry and sad expressions. These results are in line with the Emotion Mimicry in Context View, holding that humans mimic happy expressions according to affiliative intentions. The mimicry of sad expressions is less intense, presumably because it signals empathy and might imply personal costs. Direct anger expressions are mimicked the least, possibly because anger communicates threat and aggression. Taken together, we show that incidental facial mimicry in a face-to-face setting is positively related to the recognition accuracy for non-stereotype happy expressions, supporting the functionality of facial mimicry.

9.
This study investigates the role of dynamic information in identity face recognition at birth. More specifically, we tested whether semi‐rigid motion, conveyed by a change in facial expression, facilitates identity recognition. In Experiment 1, two‐day‐old newborns habituated to a static happy or fearful face (static condition) were tested with a pair of static faces of novel and familiar identity, both bearing a neutral expression. Results indicated that newborns manifested a preference for the familiar stimulus, independently of the facial expression. In contrast, in Experiment 2 newborns habituated to a semi‐rigid moving face (dynamic condition) manifested a preference for the novel face, independently of the facial expression. In Experiment 3, a multistatic image of a happy face with different degrees of intensity was used to test the role of different amounts of static pictorial information in identity recognition (multistatic image condition). Results indicated that newborns were not able to recognize the face to which they had been familiarized. Overall, the results demonstrate that newborns' face recognition is strictly dependent on the learning conditions and that the semi‐rigid motion conveyed by facial expressions facilitates identity recognition from birth.

10.
The aim of the current study was to investigate the influence of happy and sad mood on facial muscular reactions to emotional facial expressions. Following film clips intended to induce happy and sad mood states, participants observed faces with happy, sad, angry, and neutral expressions while their facial muscular reactions were recorded electromyographically. Results revealed that after watching the happy clip participants showed congruent facial reactions to all emotional expressions, whereas watching the sad clip led to a general reduction of facial muscular reactions. Results are discussed with respect to the information processing style underlying the lack of mimicry in a sad mood state and also with respect to the consequences for social interactions and for embodiment theories.

11.
Using the eye gaze of others to direct one's own attention develops during the first year of life and is thought to be an important skill for learning and social communication. However, it is currently unclear whether infants differentially attend to and encode objects cued by the eye gaze of individuals within familiar groups (e.g., own race, more familiar sex) relative to unfamiliar groups (e.g., other race, less familiar sex). During gaze cueing, but prior to the presentation of objects, 10‐month‐olds looked longer at the eyes of own‐race faces relative to 5‐month‐olds and relative to the eyes of other‐race faces. After gaze cueing, two objects were presented alongside the face and, at both ages, infants looked longer at the uncued objects for more familiar‐sex faces and longer at the cued objects for less familiar‐sex faces. Finally, during the test phase, both 5‐ and 10‐month‐old infants looked longer at uncued objects relative to cued objects, but only when the objects were cued by an own‐race and familiar‐sex individual. Results demonstrate that infants use face eye gaze differently when the cue comes from someone within a highly experienced group.

12.
Ross Flom & Anne D. Pick, Infancy, 2005, 7(2), 207–218
The study of gaze following in infants younger than 12 months of age has emphasized the effects of gesture, type of target, and its position or placement. This experiment extends this literature by examining the effects of adults' affective expression on 7‐month‐olds' gaze following. The effects of 3 affective expressions—happy, sad, and neutral—on 7‐month‐olds' frequency of gaze following were examined. The results indicated that infants more frequently followed the gaze of an adult posing a neutral expression than that of an adult posing either a happy or a sad expression. The infants also looked proportionately longer toward the indicated target when the adult's expression was neutral. The results are interpreted in terms of infants' flexibility of attention.

13.
Across three experiments, we examined 9‐ and 11‐month‐olds' mappings of novel sound properties to novel animal categories. Infants were familiarized with novel animal–novel sound pairings (e.g., Animal A [red]–Sound 1) and then tested on: (1) their acquisition of the original pairing and (2) their generalization of the sound property to a new member of a familiarized category (e.g., Animal A [blue]–Sound 1). When familiarized with a single exemplar of a category, 11‐month‐olds showed no evidence of acquiring or generalizing the animal–sound pairings. In contrast, 11‐month‐olds learnt the original animal–sound mappings and generalized the sound property to a novel member of that category when familiarized with multiple exemplars of a category. Finally, when familiarized with multiple exemplars, 9‐month‐old infants learnt the original animal–sound pairing, but did not extend the novel sound property. The results of these experiments provide evidence for developmental differences in the facilitative role of multiple exemplars in promoting the learning and generalization of information.

14.
Perceptual narrowing—a phenomenon in which perception is broad from birth, but narrows as a function of experience—has previously been tested with primate faces. In the first 6 months of life, infants can discriminate among individual human and monkey faces. Though the ability to discriminate monkey faces is lost after about 9 months, infants retain human face discrimination, presumably because of their experience with human faces. The current study demonstrates that 4‐ to 6‐month‐old infants are able to discriminate nonprimate faces as well. In a visual paired comparison test, 4‐ to 6‐month‐old infants (n = 26) looked significantly longer at novel sheep (Ovis aries) faces, compared to a familiar sheep face (p = .017), while 9‐ to 11‐month‐olds (n = 26) showed no visual preference, and adults (n = 27) had a familiarity preference (p < .001). Infants’ face recognition systems are broadly tuned at birth—not just for primate faces, but for nonprimate faces as well—allowing infants to become specialists in recognizing the types of faces encountered in their first year of life.
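The visual paired comparison measure used above is typically summarized as a novelty preference score: the proportion of total looking time directed at the novel face, with 0.5 indicating no preference. A minimal sketch follows; the function name and scoring convention are assumptions for illustration, not taken from the paper.

```python
def novelty_preference(novel_look, familiar_look):
    """Assumed visual paired-comparison score: proportion of total
    looking time spent on the novel stimulus. Values > 0.5 indicate a
    novelty preference (evidence of discrimination); values < 0.5
    indicate a familiarity preference, like the adults in the study."""
    total = novel_look + familiar_look
    return novel_look / total
```

On this scale, the 4‐ to 6‐month‐olds' discrimination of sheep faces would appear as scores reliably above 0.5, the 9‐ to 11‐month‐olds' null result as scores near 0.5, and the adults' familiarity preference as scores below 0.5.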

15.
Younger adults (YA) attribute emotion-related traits to people whose neutral facial structure resembles an emotion (emotion overgeneralization). The fact that older adults (OA) show deficits in accurately labeling basic emotions suggests that they may be relatively insensitive to variations in the emotion resemblance of neutral expression faces that underlie emotion overgeneralization effects. On the other hand, the fact that OA, like YA, show a ‘pop-out’ effect for anger, more quickly locating an angry than a happy face in a neutral array, suggests that both age groups may be equally sensitive to emotion resemblance. We used computer modeling to assess the degree to which neutral faces objectively resembled emotions and assessed whether that resemblance predicted trait impressions. We found that both OA and YA showed anger and surprise overgeneralization in ratings of danger and naiveté, respectively, with no significant differences in the strength of the effects for the two age groups. These findings suggest that well-documented OA deficits on emotion recognition tasks may be more due to processing demands than to an insensitivity to the social affordances of emotion expressions.

16.
This research examined developmental and individual differences in infants' speed of processing faces and the relation of processing speed to the type of information encoded. To gauge processing speed, 7‐ and 12‐month‐olds were repeatedly presented with the same face (frontal view), each time paired with a new one, until they showed a consistent preference for the new one. Subsequent probe trials assessed recognition of targets that either preserved configural integrity (Study 1: 3/4 profile and full profile poses) or disrupted it while preserving featural information (Study 2: rotations of 160° or 200° and fracturings). There were developmental differences in both speed and in infants' appreciation of information about faces. Older infants took about 60% fewer trials to reach criterion and had more mature patterns of attention (i.e., looks of shorter duration and more shifts of gaze). Whereas infants of both ages recognized the familiar face in a 3/4 pose, the 12‐month‐olds also recognized it in profile and when rotated. Twelve‐month‐olds who were fast processors additionally recognized the fractured faces; otherwise, processing speed was unrelated to the type of information extracted. At 7 months then, infants made use of some configural information in processing faces; at 12 months, they made use of even more of the configural information, along with part‐based or featural information.

17.
Facial expressions of emotion influence interpersonal trait inferences
Theorists have argued that facial expressions of emotion serve the interpersonal function of allowing one animal to predict another's behavior. Humans may extend these predictions into the indefinite future, as in the case of trait inference. The hypothesis that facial expressions of emotion (e.g., anger, disgust, fear, happiness, and sadness) affect subjects' interpersonal trait inferences (e.g., dominance and affiliation) was tested in two experiments. Subjects rated the dispositional affiliation and dominance of target faces with either static or apparently moving expressions. They inferred high dominance and affiliation from happy expressions, high dominance and low affiliation from angry and disgusted expressions, and low dominance from fearful and sad expressions. The findings suggest that facial expressions of emotion convey not only a target's internal state, but also differentially convey interpersonal information, which could potentially seed trait inference.

This article constitutes a portion of my dissertation research at Stanford University, which was supported by a National Science Foundation Fellowship and an American Psychological Association Dissertation Award. Thanks to Nancy Alvarado, Chris Dryer, Paul Ekman, Bertram Malle, Susan Nolen-Hoeksema, Steven Sutton, Robert Zajonc, and more anonymous reviewers than I can count on one hand for their comments.

18.
This study tested the hypothesis derived from ecological theory that adaptive social perceptions of emotion expressions fuel trait impressions. Moreover, it was predicted that these impressions would be overgeneralized and perceived in faces that were not intentionally posing expressions but nevertheless varied in emotional demeanor. To test these predictions, perceivers viewed 32 untrained targets posing happy, surprised, angry, sad, and fearful expressions and formed impressions of their dominance and affiliation. When targets posed happiness and surprise they were perceived as high in dominance and affiliation whereas when they posed anger they were perceived as high in dominance and low in affiliation. When targets posed sadness and fear they were perceived as low in dominance. As predicted, many of these impressions were overgeneralized and attributed to targets who were not posing expressions. The observed effects were generally independent of the impact of other facial cues (i.e., attractiveness and babyishness).

19.
In this study, we examined developmental changes in infants' processing of own‐ versus other‐race faces. Caucasian American 8‐month‐olds (Experiment 1) and 4‐month‐olds (Experiment 2) were tested in a habituation‐switch procedure designed to assess holistic (attending to the relationship between internal and external features of the face) versus featural (attending to individual features of the face) processing of faces. Eight‐month‐olds demonstrated holistic processing of upright own‐race (Caucasian) faces, but featural processing of upright other‐race (African) faces. Inverted faces were processed featurally, regardless of ethnicity. Four‐month‐olds, however, demonstrated holistic processing of both Caucasian and African upright faces. These results demonstrate that infants' processing of own‐ versus other‐race faces becomes specialized between 4 and 8 months.

20.
Retaining detailed representations of unstressed syllables is a logical prerequisite for infants' use of probabilistic phonotactics to segment iambic words from fluent speech. The head‐turn preference procedure was used to investigate the nature of English‐learners' representations of iambic word onsets. Fifty‐four 10.5‐month‐olds were familiarized to passages containing the nonsense iambic word forms ginome and tupong. Following familiarization, infants were either tested on familiar (ginome and tupong) or near‐familiar (pinome and bupong) versus unfamiliar (kidar and mafoos) words. Infants in the familiar test group (familiar vs. unfamiliar) oriented significantly longer to familiar than unfamiliar test items, whereas infants in the near‐familiar test group (near‐familiar vs. unfamiliar) oriented equally long to near‐familiar and unfamiliar test items. Our results provide evidence that infants retain fairly detailed representations of unstressed syllables and therefore support the hypothesis that infants use phonotactic cues to find words in fluent speech.


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.) · 京ICP备09084417号