PubMed for Handhelds
Title: Dynamic intersubject neural synchronization reflects affective responses to sad music.
Authors: Sachs ME, Habibi A, Damasio A, Kaplan JT.
Journal: NeuroImage; 2020 Sep; 218:116512.
PubMed ID: 31901418.
Abstract: Psychological theories of emotion often highlight the dynamic quality of the affective experience, yet neuroimaging studies of affect have traditionally relied on static stimuli that lack ecological validity. Consequently, the brain regions that represent emotions and feelings as they unfold remain unclear. Recently, dynamic, model-free analytical techniques have been employed with naturalistic stimuli to better capture time-varying patterns of activity in the brain; yet, few studies have focused on relating these patterns to changes in subjective feelings. Here, we address this gap, using intersubject correlation and phase synchronization to assess how stimulus-driven changes in brain activity and connectivity are related to two aspects of emotional experience: emotional intensity and enjoyment. During fMRI scanning, healthy volunteers listened to a full-length piece of music selected to induce sadness. After scanning, participants listened to the piece twice while simultaneously rating the intensity of felt sadness or felt enjoyment. Activity in the auditory cortex, insula, and inferior frontal gyrus was significantly synchronized across participants. Synchronization in auditory, visual, and prefrontal regions was significantly greater in participants with higher measures of a subscale of trait empathy related to feeling emotions in response to music. When assessed dynamically, continuous enjoyment ratings positively predicted a moment-to-moment measure of intersubject synchronization in auditory, default mode, and striatal networks, as well as the orbitofrontal cortex, whereas sadness predicted intersubject synchronization in limbic and striatal networks. The results suggest that stimulus-driven patterns of neural communication in emotional processing and high-level cortical regions carry meaningful information with regard to our feelings in response to a naturalistic stimulus.
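The two intersubject measures named in the abstract can be illustrated with a minimal sketch. The Python snippet below computes leave-one-out intersubject correlation and Hilbert-based intersubject phase synchronization on synthetic time series; the array shapes, noise level, and variable names are assumptions for illustration and do not reproduce the authors' fMRI analysis pipeline.

# Minimal sketch of leave-one-out intersubject correlation (ISC) and
# intersubject phase synchronization (ISPS) for one brain region.
# The synthetic (n_subjects, n_timepoints) data and parameters are
# illustrative assumptions, not the study's actual preprocessing.
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(0)
n_subjects, n_timepoints = 20, 300
shared = rng.standard_normal(n_timepoints)            # stimulus-driven component
data = shared + 0.8 * rng.standard_normal((n_subjects, n_timepoints))

# Static leave-one-out ISC: correlate each subject with the mean of the others.
isc = np.empty(n_subjects)
for s in range(n_subjects):
    others = np.delete(data, s, axis=0).mean(axis=0)
    isc[s] = np.corrcoef(data[s], others)[0, 1]
print("mean leave-one-out ISC:", isc.mean())

# Dynamic intersubject phase synchronization: instantaneous phases from the
# Hilbert transform, then across-subject phase coherence at each timepoint.
phases = np.angle(hilbert(data, axis=1))
isps = np.abs(np.exp(1j * phases).mean(axis=0))       # 1.0 = perfect synchrony
print("time-averaged ISPS:", isps.mean())

In a study like the one above, a moment-to-moment synchronization trace of this kind could then be related to continuous behavioral ratings (e.g., felt sadness or enjoyment), which is the general logic the abstract describes.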