These tools are no longer maintained as of December 31, 2024. The archived website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors

- a resource for Precision Medicine

191 related articles for article (PubMed ID: 35728955)

  • 1. MEG Activity in Visual and Auditory Cortices Represents Acoustic Speech-Related Information during Silent Lip Reading.
    Bröhl F; Keitel A; Kayser C
    eNeuro; 2022; 9(3):. PubMed ID: 35728955

  • 2. Lip-Reading Enables the Brain to Synthesize Auditory Features of Unknown Silent Speech.
    Bourguignon M; Baart M; Kapnoula EC; Molinaro N
    J Neurosci; 2020 Jan; 40(5):1053-1065. PubMed ID: 31889007

  • 3. A Visual Cortical Network for Deriving Phonological Information from Intelligible Lip Movements.
    Hauswald A; Lithari C; Collignon O; Leonardelli E; Weisz N
    Curr Biol; 2018 May; 28(9):1453-1459.e3. PubMed ID: 29681475

  • 4. Differential Auditory and Visual Phase-Locking Are Observed during Audio-Visual Benefit and Silent Lip-Reading for Speech Perception.
    Aller M; Økland HS; MacGregor LJ; Blank H; Davis MH
    J Neurosci; 2022 Aug; 42(31):6108-6120. PubMed ID: 35760528

  • 5. Cortical tracking of formant modulations derived from silently presented lip movements and its decline with age.
    Suess N; Hauswald A; Reisinger P; Rösch S; Keitel A; Weisz N
    Cereb Cortex; 2022 Oct; 32(21):4818-4833. PubMed ID: 35062025

  • 6. Seeing to hear better: evidence for early audio-visual interactions in speech identification.
    Schwartz JL; Berthommier F; Savariaux C
    Cognition; 2004 Sep; 93(2):B69-78. PubMed ID: 15147940

  • 7. Lip movements entrain the observers' low-frequency brain oscillations to facilitate speech intelligibility.
    Park H; Kayser C; Thut G; Gross J
eLife; 2016 May; 5():. PubMed ID: 27146891

  • 8. Auditory detection is modulated by theta phase of silent lip movements.
    Biau E; Wang D; Park H; Jensen O; Hanslmayr S
    Curr Res Neurobiol; 2021; 2():100014. PubMed ID: 36246505

  • 9. Left Motor δ Oscillations Reflect Asynchrony Detection in Multisensory Speech Perception.
    Biau E; Schultz BG; Gunter TC; Kotz SA
    J Neurosci; 2022 Mar; 42(11):2313-2326. PubMed ID: 35086905

  • 10. Neurophysiological Indices of Audiovisual Speech Processing Reveal a Hierarchy of Multisensory Integration Effects.
    O'Sullivan AE; Crosse MJ; Di Liberto GM; de Cheveigné A; Lalor EC
    J Neurosci; 2021 Jun; 41(23):4991-5003. PubMed ID: 33824190

  • 11. Shared and modality-specific brain regions that mediate auditory and visual word comprehension.
    Keitel A; Gross J; Kayser C
    eLife; 2020 Aug; 9():. PubMed ID: 32831168

  • 12. Inside Speech: Multisensory and Modality-specific Processing of Tongue and Lip Speech Actions.
    Treille A; Vilain C; Hueber T; Lamalle L; Sato M
    J Cogn Neurosci; 2017 Mar; 29(3):448-466. PubMed ID: 28139959

  • 13. Lipreading and covert speech production similarly modulate human auditory-cortex responses to pure tones.
    Kauramäki J; Jääskeläinen IP; Hari R; Möttönen R; Rauschecker JP; Sams M
    J Neurosci; 2010 Jan; 30(4):1314-21. PubMed ID: 20107058

  • 14. The contribution of dynamic visual cues to audiovisual speech perception.
    Jaekl P; Pesquita A; Alsius A; Munhall K; Soto-Faraco S
    Neuropsychologia; 2015 Aug; 75():402-10. PubMed ID: 26100561

  • 15. Congruent Visual Speech Enhances Cortical Entrainment to Continuous Auditory Speech in Noise-Free Conditions.
    Crosse MJ; Butler JS; Lalor EC
    J Neurosci; 2015 Oct; 35(42):14195-204. PubMed ID: 26490860

  • 16. When eyes beat lips: speaker gaze affects audiovisual integration in the McGurk illusion.
    Wahn B; Schmitz L; Kingstone A; Böckler-Raettig A
    Psychol Res; 2022 Sep; 86(6):1930-1943. PubMed ID: 34854983

  • 17. Changes in visually and auditory attended audiovisual speech processing in cochlear implant users: A longitudinal ERP study.
    Weglage A; Layer N; Meister H; Müller V; Lang-Roth R; Walger M; Sandmann P
    Hear Res; 2024 Jun; 447():109023. PubMed ID: 38733710

  • 18. Contributions of local speech encoding and functional connectivity to audio-visual speech perception.
    Giordano BL; Ince RAA; Gross J; Schyns PG; Panzeri S; Kayser C
    eLife; 2017 Jun; 6():. PubMed ID: 28590903

  • 19. Lip movements enhance speech representations and effective connectivity in auditory dorsal stream.
    Zhang L; Du Y
    Neuroimage; 2022 Aug; 257():119311. PubMed ID: 35589000

  • 20. Neural Mechanisms Underlying Cross-Modal Phonetic Encoding.
    Shahin AJ; Backer KC; Rosenblum LD; Kerlin JR
    J Neurosci; 2018 Feb; 38(7):1835-1849. PubMed ID: 29263241
