PubMed for Handhelds
Search MEDLINE/PubMed
Title: Decoding of emotional information in voice-sensitive cortices.
Authors: Ethofer T, Van De Ville D, Scherer K, Vuilleumier P.
Journal: Curr Biol. 2009 Jun 23;19(12):1028-33.
PubMed ID: 19446457.
Abstract: The ability to correctly interpret emotional signals from others is crucial for successful social interaction. Previous neuroimaging studies showed that voice-sensitive auditory areas activate to a broad spectrum of vocally expressed emotions more than to neutral speech melody (prosody). However, this enhanced response occurs irrespective of the specific emotion category, making it impossible to distinguish different vocal emotions with conventional analyses. Here, we presented pseudowords spoken in five prosodic categories (anger, sadness, neutral, relief, joy) during event-related functional magnetic resonance imaging (fMRI), then employed multivariate pattern analysis to discriminate between these categories on the basis of the spatial response pattern within the auditory cortex. Our results demonstrate successful decoding of vocal emotions from fMRI responses in bilateral voice-sensitive areas, which could not be obtained by using averaged response amplitudes only. Pairwise comparisons showed that each category could be classified against all other alternatives, indicating for each emotion a specific spatial signature that generalized across speakers. These results demonstrate for the first time that emotional information is represented by distinct spatial patterns that can be decoded from brain activity in modality-specific cortical areas.
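The abstract describes pairwise multivariate pattern analysis (MVPA): for each pair of emotion categories, a classifier is trained and tested on the spatial pattern of voxel responses. The sketch below illustrates that general idea only; the synthetic data, linear support-vector classifier, and five-fold cross-validation are assumptions for demonstration and are not the authors' actual analysis pipeline.

# Minimal sketch of pairwise MVPA, assuming a trials-by-voxels matrix X and
# per-trial emotion labels y. Synthetic data stands in for real fMRI response
# patterns; the classifier and cross-validation scheme are illustrative choices.
from itertools import combinations

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
categories = ["anger", "sadness", "neutral", "relief", "joy"]
n_trials_per_cat, n_voxels = 40, 200

# Synthetic voxel patterns: each category gets noise plus a weak,
# category-specific spatial signature added to all of its trials.
X = np.vstack([
    rng.normal(size=(n_trials_per_cat, n_voxels))
    + 0.3 * rng.normal(size=n_voxels)
    for _ in categories
])
y = np.repeat(categories, n_trials_per_cat)

# Pairwise classification: one classifier per pair of emotion categories,
# mirroring the pairwise comparisons reported in the abstract.
for cat_a, cat_b in combinations(categories, 2):
    mask = np.isin(y, [cat_a, cat_b])
    clf = SVC(kernel="linear")
    scores = cross_val_score(clf, X[mask], y[mask], cv=5)
    print(f"{cat_a} vs {cat_b}: mean accuracy = {scores.mean():.2f}")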