PUBMED FOR HANDHELDS

  • Title: The role of the left and right inferior frontal gyrus in processing metaphoric and unrelated co-speech gestures.
    Author: Steines M, Nagels A, Kircher T, Straube B.
    Journal: NeuroImage; 2021 Aug 15; 237:118182. PubMed ID: 34020020.
    Abstract:
    Gestures are an integral part of in-person conversations and complement the meaning of the speech they accompany. The neural processing of co-speech gestures is supported by a mostly left-lateralized network of fronto-temporal regions. However, in contrast to iconic gestures, metaphoric as well as unrelated gestures have been found to more strongly engage the left and right inferior frontal gyrus (IFG), respectively. With this study, we conducted the first systematic comparison of all three types of gestures and resulting potential laterality effects. During collection of functional imaging data, 74 subjects were presented with 5 s videos of abstract speech with related metaphoric gestures, concrete speech with related iconic gestures and concrete speech with unrelated gestures. They were asked to judge whether the content of the speech and gesture matched or not. Differential contrasts revealed that both abstract related and concrete unrelated compared to concrete related stimuli elicited stronger activation of the bilateral IFG. Analyses of lateralization indices for IFG activation further showed a left hemispheric dominance for metaphoric gestures and a right hemispheric dominance for unrelated gestures. Our results give support to the hypothesis that the bilateral IFG is activated specifically when processing load for speech-gesture combinations is high. In addition, laterality effects indicate a stronger involvement of the right IFG in mismatch detection and conflict processing, whereas the left IFG performs the actual integration of information from speech and gesture.
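    Note: The abstract reports lateralization indices (LIs) for IFG activation but does not state the formula used. A common convention in fMRI laterality studies, given here only as an illustrative assumption and not taken from this abstract, is

        LI = (L - R) / (L + R)

    where L and R are the summed suprathreshold activations (or activated voxel counts) in the left and right IFG; LI > 0 indicates left-hemispheric dominance and LI < 0 right-hemispheric dominance.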