Pubmed for Handhelds
Title: Language and gesture neural correlates: A meta-analysis of functional magnetic resonance imaging studies.
Author: Cacciante L, Pregnolato G, Salvalaggio S, Federico S, Kiper P, Smania N, Turolla A.
Journal: Int J Lang Commun Disord; 2024; 59(3):902-912. PubMed ID: 37971416.
Abstract:
BACKGROUND: Humans often use co-speech gestures to promote effective communication. Attention has been paid to the cortical areas engaged in the processing of co-speech gestures.
AIMS: To investigate the neural network underpinning the processing of co-speech gestures and to examine whether there is a relationship between the areas involved in language and gesture processing.
METHODS & PROCEDURES: We planned to include studies with neurotypical and/or stroke participants who underwent a bimodal task (i.e., processing of co-speech gestures with the accompanying speech) and a unimodal task (i.e., speech or gesture alone) during a functional magnetic resonance imaging (fMRI) session. After a database search, abstract and full-text screening were conducted. Qualitative and quantitative data were extracted, and a meta-analysis was performed with the software GingerALE 3.0.2, using contrast analyses of uni- and bimodal tasks.
MAIN CONTRIBUTION: The database search produced 1024 records. After the screening process, 27 studies were included in the review. Data from 15 studies were quantitatively analysed through meta-analysis. The meta-analysis identified three clusters of significant activation: the left middle frontal gyrus and inferior frontal gyrus, and the bilateral middle occipital gyrus and inferior temporal gyrus.
CONCLUSIONS: There is a close link at the neural level between the semantic processing of auditory and visual information during communication. These findings support integrating co-speech gestures into aphasia treatment as a strategy to help people with aphasia communicate effectively.
WHAT THIS PAPER ADDS:
What is already known on this subject: Gestures are an integral part of human communication, and they may be related at the neural level to speech processing.
What this paper adds to the existing knowledge: During the processing of bi- and unimodal communication, areas related to semantic and multimodal processing are activated, suggesting a close link between co-speech gestures and spoken language at the neural level.
What are the potential or actual clinical implications of this work? Knowledge of the neural networks underlying gesture and speech processing will allow the adoption of model-based neurorehabilitation programs to foster recovery from aphasia by strengthening the specific functions of these brain networks.