Functional magnetic resonance imaging (fMRI) was used to compare brain activation to static facial displays versus dynamic changes in facial identity or emotional expression. Static images depicted prototypical fearful, angry, and neutral expressions. Identity morphs depicted identity changes from one person to another, always with neutral expressions. Emotion morphs depicted expression changes from neutral to fear or anger, creating the illusion that the actor was 'getting scared' or 'getting angry' in real time. Brain regions implicated in processing facial affect, including the amygdala and fusiform gyrus, showed greater responses to dynamic than to static emotional expressions, especially for fear. Identity morphs activated a dorsal fronto-cingulo-parietal circuit as well as ventral areas, including the amygdala, that also responded to the emotion morphs. Activity in the superior temporal sulcus discriminated emotion morphs from identity morphs, extending its known role in processing biologically relevant motion. The results highlight the importance of temporal cues in the neural coding of facial displays.