PubMed for Handhelds
Title: Spoken language and arm gestures are controlled by the same motor control system.
Authors: Gentilucci M, Dalla Volta R.
Journal: Q J Exp Psychol (Hove); 2008 Jun; 61(6):944-57.
PubMed ID: 18470824.
Abstract: Arm movements can influence language comprehension, much as semantics can influence arm movement planning, and arm movement itself can serve as a linguistic signal. We reviewed neurophysiological and behavioural evidence that manual gestures and vocal language share the same control system. Studies of the primate premotor cortex, and in particular of the so-called "mirror system" (including its human counterpart), suggest the existence of a dual hand/mouth motor command system involved in ingestion activities. This may be the platform on which a combined manual and vocal communication system was constructed. In humans, speech is typically accompanied by manual gesture, speech production itself is influenced by executing or observing transitive hand actions, and manual actions play an important role in the development of speech from the babbling stage onwards. Behavioural data also show a reciprocal influence between words and symbolic gestures. Neuroimaging and repetitive transcranial magnetic stimulation (rTMS) data suggest that the system governing both speech and gesture is located in Broca's area. Overall, the data reviewed support the hypothesis that the hand motor-control system is involved in higher-order cognition.