

BIOMARKERS

Molecular Biopsy of Human Tumors

- a resource for Precision Medicine

175 related articles for article (PubMed ID: 35062025)

  • 1. Cortical tracking of formant modulations derived from silently presented lip movements and its decline with age.
    Suess N; Hauswald A; Reisinger P; Rösch S; Keitel A; Weisz N
    Cereb Cortex; 2022 Oct; 32(21):4818-4833. PubMed ID: 35062025

  • 2. A Visual Cortical Network for Deriving Phonological Information from Intelligible Lip Movements.
    Hauswald A; Lithari C; Collignon O; Leonardelli E; Weisz N
    Curr Biol; 2018 May; 28(9):1453-1459.e3. PubMed ID: 29681475

  • 3. MEG Activity in Visual and Auditory Cortices Represents Acoustic Speech-Related Information during Silent Lip Reading.
    Bröhl F; Keitel A; Kayser C
    eNeuro; 2022; 9(3):. PubMed ID: 35728955

  • 4. Lip-Reading Enables the Brain to Synthesize Auditory Features of Unknown Silent Speech.
    Bourguignon M; Baart M; Kapnoula EC; Molinaro N
    J Neurosci; 2020 Jan; 40(5):1053-1065. PubMed ID: 31889007

  • 5. Differential Auditory and Visual Phase-Locking Are Observed during Audio-Visual Benefit and Silent Lip-Reading for Speech Perception.
    Aller M; Økland HS; MacGregor LJ; Blank H; Davis MH
    J Neurosci; 2022 Aug; 42(31):6108-6120. PubMed ID: 35760528

  • 6. Human Frequency Following Responses to Vocoded Speech.
    Ananthakrishnan S; Luo X; Krishnan A
    Ear Hear; 2017; 38(5):e256-e267. PubMed ID: 28362674

  • 7. Do congruent lip movements facilitate speech processing in a dynamic audiovisual multi-talker scenario? An ERP study with older and younger adults.
    Begau A; Klatt LI; Wascher E; Schneider D; Getzmann S
    Behav Brain Res; 2021 Aug; 412():113436. PubMed ID: 34175355

  • 8. Masking of the mouth area impairs reconstruction of acoustic speech features and higher-level segmentational features in the presence of a distractor speaker.
    Haider CL; Suess N; Hauswald A; Park H; Weisz N
    Neuroimage; 2022 May; 252():119044. PubMed ID: 35240298

  • 9. Lip movements entrain the observers' low-frequency brain oscillations to facilitate speech intelligibility.
    Park H; Kayser C; Thut G; Gross J
    Elife; 2016 May; 5():. PubMed ID: 27146891

  • 10. Sensorimotor control of vocal pitch and formant frequencies in Parkinson's disease.
    Mollaei F; Shiller DM; Baum SR; Gracco VL
    Brain Res; 2016 Sep; 1646():269-277. PubMed ID: 27288701

  • 11. Eye Can Hear Clearly Now: Inverse Effectiveness in Natural Audiovisual Speech Processing Relies on Long-Term Crossmodal Temporal Integration.
    Crosse MJ; Di Liberto GM; Lalor EC
    J Neurosci; 2016 Sep; 36(38):9888-95. PubMed ID: 27656026

  • 12. Rapid change in articulatory lip movement induced by preceding auditory feedback during production of bilabial plosives.
    Mochida T; Gomi H; Kashino M
    PLoS One; 2010 Nov; 5(11):e13866. PubMed ID: 21079783

  • 13. Inside Speech: Multisensory and Modality-specific Processing of Tongue and Lip Speech Actions.
    Treille A; Vilain C; Hueber T; Lamalle L; Sato M
    J Cogn Neurosci; 2017 Mar; 29(3):448-466. PubMed ID: 28139959

  • 14. Neural Speech Tracking Highlights the Importance of Visual Speech in Multi-speaker Situations.
    Haider CL; Park H; Hauswald A; Weisz N
    J Cogn Neurosci; 2024 Jan; 36(1):128-142. PubMed ID: 37977156

  • 15. General Auditory and Speech-Specific Contributions to Cortical Envelope Tracking Revealed Using Auditory Chimeras.
    Prinsloo KD; Lalor EC
    J Neurosci; 2022 Oct; 42(41):7782-7798. PubMed ID: 36041853

  • 16. Auditory detection is modulated by theta phase of silent lip movements.
    Biau E; Wang D; Park H; Jensen O; Hanslmayr S
    Curr Res Neurobiol; 2021; 2():100014. PubMed ID: 36246505

  • 17. Attention fine-tunes auditory-motor processing of speech sounds.
    Möttönen R; van de Ven GM; Watkins KE
    J Neurosci; 2014 Mar; 34(11):4064-9. PubMed ID: 24623783

  • 18. Generalizable EEG Encoding Models with Naturalistic Audiovisual Stimuli.
    Desai M; Holder J; Villarreal C; Clark N; Hoang B; Hamilton LS
    J Neurosci; 2021 Oct; 41(43):8946-8962. PubMed ID: 34503996

  • 19. Neurophysiological Indices of Audiovisual Speech Processing Reveal a Hierarchy of Multisensory Integration Effects.
    O'Sullivan AE; Crosse MJ; Di Liberto GM; de Cheveigné A; Lalor EC
    J Neurosci; 2021 Jun; 41(23):4991-5003. PubMed ID: 33824190

  • 20. Children With Normal Hearing Are Efficient Users of Fundamental Frequency and Vocal Tract Length Cues for Voice Discrimination.
    Zaltz Y; Goldsworthy RL; Eisenberg LS; Kishon-Rabin L
    Ear Hear; 2020; 41(1):182-193. PubMed ID: 31107364
