Abstract: Given the importance of infants' perception of bimodal speech for emerging language and emotion development, this study used eye‐tracking technology to examine infants' attention to face+voice displays differing by emotion (fear, sad, happy) and visual stimulus type (dynamic versus static). Peripheral distracters were presented to measure attention disengagement. It was predicted that infants would look longer at, and disengage more slowly from, dynamic bimodal emotion displays, especially dynamic fear. However, results from twenty‐two 10‐month‐olds revealed significantly greater attention on dynamic than static trials, independent of emotion. Interestingly, infants looked equally at the mouth and eye regions of speakers' faces except when viewing/hearing dynamic fear, in which case they fixated more on the speakers' mouth region. Average latencies to distracters were longer for dynamic than for static bimodal stimuli but did not differ by emotion. Thus, infants' attention was enhanced, in terms of both elicitation and maintenance, by dynamic bimodal emotion displays. Results are compared with conflicting findings from studies using static emotion displays, with suggestions for future research employing more ecologically relevant dynamic, multimodal displays to gain a richer understanding of infants' processing of emotion.