These tools are no longer maintained as of December 31, 2024. The archived website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service if you have questions.
191 related articles for article (PubMed ID: 35728955)
1. MEG Activity in Visual and Auditory Cortices Represents Acoustic Speech-Related Information during Silent Lip Reading. Bröhl F; Keitel A; Kayser C. eNeuro; 2022; 9(3). PubMed ID: 35728955
2. Lip-Reading Enables the Brain to Synthesize Auditory Features of Unknown Silent Speech. Bourguignon M; Baart M; Kapnoula EC; Molinaro N. J Neurosci; 2020 Jan; 40(5):1053-1065. PubMed ID: 31889007
3. A Visual Cortical Network for Deriving Phonological Information from Intelligible Lip Movements. Hauswald A; Lithari C; Collignon O; Leonardelli E; Weisz N. Curr Biol; 2018 May; 28(9):1453-1459.e3. PubMed ID: 29681475
4. Differential Auditory and Visual Phase-Locking Are Observed during Audio-Visual Benefit and Silent Lip-Reading for Speech Perception. Aller M; Økland HS; MacGregor LJ; Blank H; Davis MH. J Neurosci; 2022 Aug; 42(31):6108-6120. PubMed ID: 35760528
5. Cortical tracking of formant modulations derived from silently presented lip movements and its decline with age. Suess N; Hauswald A; Reisinger P; Rösch S; Keitel A; Weisz N. Cereb Cortex; 2022 Oct; 32(21):4818-4833. PubMed ID: 35062025
6. Seeing to hear better: evidence for early audio-visual interactions in speech identification. Schwartz JL; Berthommier F; Savariaux C. Cognition; 2004 Sep; 93(2):B69-78. PubMed ID: 15147940
8. Auditory detection is modulated by theta phase of silent lip movements. Biau E; Wang D; Park H; Jensen O; Hanslmayr S. Curr Res Neurobiol; 2021; 2:100014. PubMed ID: 36246505
9. Left Motor δ Oscillations Reflect Asynchrony Detection in Multisensory Speech Perception. Biau E; Schultz BG; Gunter TC; Kotz SA. J Neurosci; 2022 Mar; 42(11):2313-2326. PubMed ID: 35086905
10. Neurophysiological Indices of Audiovisual Speech Processing Reveal a Hierarchy of Multisensory Integration Effects. O'Sullivan AE; Crosse MJ; Di Liberto GM; de Cheveigné A; Lalor EC. J Neurosci; 2021 Jun; 41(23):4991-5003. PubMed ID: 33824190
11. Shared and modality-specific brain regions that mediate auditory and visual word comprehension. Keitel A; Gross J; Kayser C. Elife; 2020 Aug; 9. PubMed ID: 32831168
12. Inside Speech: Multisensory and Modality-specific Processing of Tongue and Lip Speech Actions. Treille A; Vilain C; Hueber T; Lamalle L; Sato M. J Cogn Neurosci; 2017 Mar; 29(3):448-466. PubMed ID: 28139959
13. Lipreading and covert speech production similarly modulate human auditory-cortex responses to pure tones. Kauramäki J; Jääskeläinen IP; Hari R; Möttönen R; Rauschecker JP; Sams M. J Neurosci; 2010 Jan; 30(4):1314-1321. PubMed ID: 20107058
14. The contribution of dynamic visual cues to audiovisual speech perception. Jaekl P; Pesquita A; Alsius A; Munhall K; Soto-Faraco S. Neuropsychologia; 2015 Aug; 75:402-410. PubMed ID: 26100561