BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

217 related articles for article (PubMed ID: 31323242)

  • 41. Perception of co-speech gestures in aphasic patients: a visual exploration study during the observation of dyadic conversations.
    Preisig BC; Eggenberger N; Zito G; Vanbellingen T; Schumacher R; Hopfner S; Nyffeler T; Gutbrod K; Annoni JM; Bohlhalter S; Müri RM
    Cortex; 2015 Mar; 64():157-68. PubMed ID: 25461716

  • 42. Demystifying infant vocal imitation: The roles of mouth looking and speaker's gaze.
    Imafuku M; Kanakogi Y; Butler D; Myowa M
    Dev Sci; 2019 Nov; 22(6):e12825. PubMed ID: 30980494

  • 43. Neural responses towards a speaker's feeling of (un)knowing.
    Jiang X; Pell MD
    Neuropsychologia; 2016 Jan; 81():79-93. PubMed ID: 26700458

  • 44. Utilization of visual information and listener strategies in intelligibility impairment related to bilateral facial paresis.
    Keintz C
    Int J Speech Lang Pathol; 2011 Dec; 13(6):510-7. PubMed ID: 21682545

  • 45. The sound of your lips: electrophysiological cross-modal interactions during hand-to-face and face-to-face speech perception.
    Treille A; Vilain C; Sato M
    Front Psychol; 2014; 5():420. PubMed ID: 24860533

  • 46. Effects of eye position on event-related potentials during auditory selective attention.
    Okita T; Wei JH
    Psychophysiology; 1993 Jul; 30(4):359-65. PubMed ID: 8327621

  • 47. Seeing a talking face matters: Infants' segmentation of continuous auditory-visual speech.
    Tan SHJ; Kalashnikova M; Burnham D
    Infancy; 2023 Mar; 28(2):277-300. PubMed ID: 36217702

  • 48. Visual speech speeds up the neural processing of auditory speech.
    van Wassenhove V; Grant KW; Poeppel D
    Proc Natl Acad Sci U S A; 2005 Jan; 102(4):1181-6. PubMed ID: 15647358

  • 49. Hearing lips and seeing voices: how cortical areas supporting speech production mediate audiovisual speech perception.
    Skipper JI; van Wassenhove V; Nusbaum HC; Small SL
    Cereb Cortex; 2007 Oct; 17(10):2387-99. PubMed ID: 17218482

  • 50. Auditory and visual cortical activity during selective attention in fragile X syndrome: a cascade of processing deficiencies.
    Van der Molen MJ; Van der Molen MW; Ridderinkhof KR; Hamel BC; Curfs LM; Ramakers GJ
    Clin Neurophysiol; 2012 Apr; 123(4):720-9. PubMed ID: 21958658

  • 51. Objective eye-gaze behaviour during face-to-face communication with proficient alaryngeal speakers: a preliminary study.
    Evitts P; Gallop R
    Int J Lang Commun Disord; 2011; 46(5):535-49. PubMed ID: 21899671

  • 52. A link between individual differences in multisensory speech perception and eye movements.
    Gurler D; Doyle N; Walker E; Magnotti J; Beauchamp M
    Atten Percept Psychophys; 2015 May; 77(4):1333-41. PubMed ID: 25810157

  • 53. Brain responds to another person's eye blinks in a natural setting-the more empathetic the viewer the stronger the responses.
    Mandel A; Helokunnas S; Pihko E; Hari R
    Eur J Neurosci; 2015 Oct; 42(8):2508-14. PubMed ID: 26132210

  • 54. Visual form predictions facilitate auditory processing at the N1.
    Paris T; Kim J; Davis C
    Neuroscience; 2017 Feb; 343():157-164. PubMed ID: 27646290

  • 55. Autistic adults anticipate and integrate meaning based on the speaker's voice: Evidence from eye-tracking and event-related potentials.
    Barzy M; Black J; Williams D; Ferguson HJ
    J Exp Psychol Gen; 2020 Jun; 149(6):1097-1115. PubMed ID: 31714095

  • 56. Using EEG and stimulus context to probe the modelling of auditory-visual speech.
    Paris T; Kim J; Davis C
    Cortex; 2016 Feb; 75():220-230. PubMed ID: 26045213

  • 57. Seeing a Talking Face Matters: Gaze Behavior and the Auditory-Visual Speech Benefit in Adults' Cortical Tracking of Infant-directed Speech.
    Tan SHJ; Kalashnikova M; Di Liberto GM; Crosse MJ; Burnham D
    J Cogn Neurosci; 2023 Nov; 35(11):1741-1759. PubMed ID: 37677057

  • 58. Visual anticipatory information modulates multisensory interactions of artificial audiovisual stimuli.
    Vroomen J; Stekelenburg JJ
    J Cogn Neurosci; 2010 Jul; 22(7):1583-96. PubMed ID: 19583474

  • 59. Auditory-visual processing represented in the human superior temporal gyrus.
    Reale RA; Calvert GA; Thesen T; Jenison RL; Kawasaki H; Oya H; Howard MA; Brugge JF
    Neuroscience; 2007 Mar; 145(1):162-84. PubMed ID: 17241747

  • 60. Situating language in a minimal social context: how seeing a picture of the speaker's face affects language comprehension.
    Hernández-Gutiérrez D; Muñoz F; Sánchez-García J; Sommer W; Abdel Rahman R; Casado P; Jiménez-Ortega L; Espuny J; Fondevila S; Martín-Loeches M
    Soc Cogn Affect Neurosci; 2021 May; 16(5):502-511. PubMed ID: 33470410
