  • Title: Speech-and-gesture integration in high functioning autism.
    Author: Silverman LB, Bennetto L, Campana E, Tanenhaus MK.
    Journal: Cognition; 2010 Jun; 115(3):380-93. PubMed ID: 20356575.
    Abstract:
    This study examined iconic gesture comprehension in autism, with the goal of assessing whether cross-modal processing difficulties impede speech-and-gesture integration. Participants were 19 adolescents with high functioning autism (HFA) and 20 typical controls matched on age, gender, verbal IQ, and socio-economic status (SES). Gesture comprehension was assessed via quantitative analyses of visual fixations during a video-based task, using the visual world paradigm. Participants' eye movements were recorded while they watched videos of a person describing one of four shapes shown on a computer screen, using speech-and-gesture or speech-only descriptions. Participants clicked on the shape that the speaker described. Because gesture naturally precedes speech, earlier visual fixations to the target shape during speech-and-gesture trials compared to speech-only trials would suggest immediate integration of auditory and visual information. Analyses of eye movements supported this pattern in control participants but not in individuals with autism: iconic gestures facilitated comprehension in typical individuals, while they hindered comprehension in those with autism. Cross-modal processing difficulties in autism were not accounted for by impaired unimodal speech or gesture processing. The results have important implications for the treatment of children and adults with this disorder.