

BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

275 related articles for article (PubMed ID: 29248497)

  • 1. Electrophysiological evidence for Audio-visuo-lingual speech integration.
    Treille A; Vilain C; Schwartz JL; Hueber T; Sato M
    Neuropsychologia; 2018 Jan; 109():126-133. PubMed ID: 29248497

  • 2. Inside Speech: Multisensory and Modality-specific Processing of Tongue and Lip Speech Actions.
    Treille A; Vilain C; Hueber T; Lamalle L; Sato M
    J Cogn Neurosci; 2017 Mar; 29(3):448-466. PubMed ID: 28139959

  • 3. Electrophysiological evidence for a self-processing advantage during audiovisual speech integration.
    Treille A; Vilain C; Kandel S; Sato M
    Exp Brain Res; 2017 Sep; 235(9):2867-2876. PubMed ID: 28676921

  • 4. The impact of when, what and how predictions on auditory speech perception.
    Pinto S; Tremblay P; Basirat A; Sato M
    Exp Brain Res; 2019 Dec; 237(12):3143-3153. PubMed ID: 31576421

  • 5. Lip-Reading Enables the Brain to Synthesize Auditory Features of Unknown Silent Speech.
    Bourguignon M; Baart M; Kapnoula EC; Molinaro N
    J Neurosci; 2020 Jan; 40(5):1053-1065. PubMed ID: 31889007

  • 6. Differential Auditory and Visual Phase-Locking Are Observed during Audio-Visual Benefit and Silent Lip-Reading for Speech Perception.
    Aller M; Økland HS; MacGregor LJ; Blank H; Davis MH
    J Neurosci; 2022 Aug; 42(31):6108-6120. PubMed ID: 35760528

  • 7. The sound of your lips: electrophysiological cross-modal interactions during hand-to-face and face-to-face speech perception.
    Treille A; Vilain C; Sato M
    Front Psychol; 2014; 5():420. PubMed ID: 24860533

  • 8. Haptic and visual information speed up the neural processing of auditory speech in live dyadic interactions.
    Treille A; Cordeboeuf C; Vilain C; Sato M
    Neuropsychologia; 2014 May; 57():71-7. PubMed ID: 24530236

  • 9. Electrophysiological evidence for speech-specific audiovisual integration.
    Baart M; Stekelenburg JJ; Vroomen J
    Neuropsychologia; 2014 Jan; 53():115-21. PubMed ID: 24291340

  • 10. Audio-visual matching of speech and non-speech oral gestures in patients with aphasia and apraxia of speech.
    Schmid G; Ziegler W
    Neuropsychologia; 2006; 44(4):546-55. PubMed ID: 16129459

  • 11. Neurophysiological Indices of Audiovisual Speech Processing Reveal a Hierarchy of Multisensory Integration Effects.
    O'Sullivan AE; Crosse MJ; Di Liberto GM; de Cheveigné A; Lalor EC
    J Neurosci; 2021 Jun; 41(23):4991-5003. PubMed ID: 33824190

  • 12. A representation of abstract linguistic categories in the visual system underlies successful lipreading.
    Nidiffer AR; Cao CZ; O'Sullivan A; Lalor EC
    Neuroimage; 2023 Nov; 282():120391. PubMed ID: 37757989

  • 13. Phonetic versus spatial processes during motor-oriented imitations of visuo-labial and visuo-lingual speech: A functional near-infrared spectroscopy study.
    Zhao T; Hu A; Su R; Lyu C; Wang L; Yan N
    Eur J Neurosci; 2022 Jan; 55(1):154-174. PubMed ID: 34854143

  • 14. Effects of audio-visual integration on the detection of masked speech and non-speech sounds.
    Eramudugolla R; Henderson R; Mattingley JB
    Brain Cogn; 2011 Feb; 75(1):60-6. PubMed ID: 21067852

  • 15. Listening to talking faces: motor cortical activation during speech perception.
    Skipper JI; Nusbaum HC; Small SL
    Neuroimage; 2005 Mar; 25(1):76-89. PubMed ID: 15734345

  • 16. Correlation between audio-visual enhancement of speech in different noise environments and SNR: a combined behavioral and electrophysiological study.
    Liu B; Lin Y; Gao X; Dang J
    Neuroscience; 2013 Sep; 247():145-51. PubMed ID: 23673276

  • 17. Neural indicators of articulator-specific sensorimotor influences on infant speech perception.
    Choi D; Dehaene-Lambertz G; Peña M; Werker JF
    Proc Natl Acad Sci U S A; 2021 May; 118(20):. PubMed ID: 33980713

  • 18. Degradation of labial information modifies audiovisual speech perception in cochlear-implanted children.
    Huyse A; Berthommier F; Leybaert J
    Ear Hear; 2013; 34(1):110-21. PubMed ID: 23059850

  • 19. [Intermodal timing cues for audio-visual speech recognition].
    Hashimoto M; Kumashiro M
    J UOEH; 2004 Jun; 26(2):215-25. PubMed ID: 15244074

  • 20. Read My Lips: Brain Dynamics Associated with Audiovisual Integration and Deviance Detection.
    Tse CY; Gratton G; Garnsey SM; Novak MA; Fabiani M
    J Cogn Neurosci; 2015 Sep; 27(9):1723-37. PubMed ID: 25848682
