These tools are no longer maintained as of December 31, 2024. An archived website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

195 related articles for article (PubMed ID: 32150573)

  • 21. Audiovisual sentence recognition not predicted by susceptibility to the McGurk effect.
    Van Engen KJ; Xie Z; Chandrasekaran B
    Atten Percept Psychophys; 2017 Feb; 79(2):396-403. PubMed ID: 27921268

  • 22. I see what you meant to say: Anticipating speech errors during online sentence processing.
    Lowder MW; Ferreira F
    J Exp Psychol Gen; 2019 Oct; 148(10):1849-1858. PubMed ID: 30556724

  • 23. Audiovisual alignment of co-speech gestures to speech supports word learning in 2-year-olds.
    Jesse A; Johnson EK
    J Exp Child Psychol; 2016 May; 145():1-10. PubMed ID: 26765249

  • 24. Direct Speaker Gaze Promotes Trust in Truth-Ambiguous Statements.
    Kreysa H; Kessler L; Schweinberger SR
    PLoS One; 2016; 11(9):e0162291. PubMed ID: 27643789

  • 25. Gesturing during disfluent speech: A pragmatic account.
    Kısa YD; Goldin-Meadow S; Casasanto D
    Cognition; 2024 Sep; 250():105855. PubMed ID: 38865912

  • 26. I can't keep your face and voice out of my head: neural correlates of an attentional bias toward nonverbal emotional cues.
    Jacob H; Brück C; Domin M; Lotze M; Wildgruber D
    Cereb Cortex; 2014 Jun; 24(6):1460-73. PubMed ID: 23382516

  • 27. Verbal and nonverbal predictors of language-mediated anticipatory eye movements.
    Rommers J; Meyer AS; Huettig F
    Atten Percept Psychophys; 2015 Apr; 77(3):720-30. PubMed ID: 25795276

  • 28. Modulation of scene consistency and task demand on language-driven eye movements for audio-visual integration.
    Yu WY; Tsai JL
    Acta Psychol (Amst); 2016 Nov; 171():1-16. PubMed ID: 27640139

  • 29. Speaker's hand gestures modulate speech perception through phase resetting of ongoing neural oscillations.
    Biau E; Torralba M; Fuentemilla L; de Diego Balaguer R; Soto-Faraco S
    Cortex; 2015 Jul; 68():76-85. PubMed ID: 25595613

  • 30. The influence of speaker gaze on listener comprehension: contrasting visual versus intentional accounts.
    Staudte M; Crocker MW; Heloir A; Kipp M
    Cognition; 2014 Oct; 133(1):317-28. PubMed ID: 25079951

  • 31. Activation of auditory cortex during silent lipreading.
    Calvert GA; Bullmore ET; Brammer MJ; Campbell R; Williams SC; McGuire PK; Woodruff PW; Iversen SD; David AS
    Science; 1997 Apr; 276(5312):593-6. PubMed ID: 9110978

  • 32. When it is apt to adapt: Flexible reasoning guides children's use of talker identity and disfluency cues.
    Thacker JM; Chambers CG; Graham SA
    J Exp Child Psychol; 2018 Mar; 167():314-327. PubMed ID: 29223857

  • 33. "Look who's talking!" Gaze Patterns for Implicit and Explicit Audio-Visual Speech Synchrony Detection in Children With High-Functioning Autism.
    Grossman RB; Steinhart E; Mitchell T; McIlvane W
    Autism Res; 2015 Jun; 8(3):307-16. PubMed ID: 25620208

  • 34. Fixating the eyes of a speaker provides sufficient visual information to modulate early auditory processing.
    Kaplan E; Jesse A
    Biol Psychol; 2019 Sep; 146():107724. PubMed ID: 31323242

  • 35. Some limits on encoding visible speech and gestures using a dichotic shadowing task.
    Thompson LA; Guzman FA
    J Gerontol B Psychol Sci Soc Sci; 1999 Nov; 54(6):P347-9. PubMed ID: 10625962

  • 36. Looking to understand: the coupling between speakers' and listeners' eye movements and its relationship to discourse comprehension.
    Richardson DC; Dale R
    Cogn Sci; 2005 Nov; 29(6):1045-60. PubMed ID: 21702802

  • 37. The listener automatically uses spatial story representations from the speaker's cohesive gestures when processing subsequent sentences without gestures.
    Sekine K; Kita S
    Acta Psychol (Amst); 2017 Sep; 179():89-95. PubMed ID: 28750209

  • 38. The impact of when, what and how predictions on auditory speech perception.
    Pinto S; Tremblay P; Basirat A; Sato M
    Exp Brain Res; 2019 Dec; 237(12):3143-3153. PubMed ID: 31576421

  • 39. Hand gestures as visual prosody: BOLD responses to audio-visual alignment are modulated by the communicative nature of the stimuli.
    Biau E; Morís Fernández L; Holle H; Avila C; Soto-Faraco S
    Neuroimage; 2016 May; 132():129-137. PubMed ID: 26892858

  • 40. On the temporal dynamics of language-mediated vision and vision-mediated language.
    Anderson SE; Chiu E; Huette S; Spivey MJ
    Acta Psychol (Amst); 2011 Jun; 137(2):181-9. PubMed ID: 20961519
