280 related articles for the article with PubMed ID 29921937

  • 1. Neural mechanisms for selectively tuning in to the target speaker in a naturalistic noisy situation.
    Dai B; Chen C; Long Y; Zheng L; Zhao H; Bai X; Liu W; Zhang Y; Liu L; Guo T; Ding G; Lu C
    Nat Commun; 2018 Jun; 9(1):2405. PubMed ID: 29921937

  • 2. Speaker-Listener Neural Coupling Reveals an Adaptive Mechanism for Speech Comprehension in a Noisy Environment.
    Li Z; Li J; Hong B; Nolte G; Engel AK; Zhang D
    Cereb Cortex; 2021 Aug; 31(10):4719-4729. PubMed ID: 33969389

  • 3. Auditory-Articulatory Neural Alignment between Listener and Speaker during Verbal Communication.
    Liu L; Zhang Y; Zhou Q; Garrett DD; Lu C; Chen A; Qiu J; Ding G
    Cereb Cortex; 2020 Mar; 30(3):942-951. PubMed ID: 31318013

  • 4. Neural decoding of attentional selection in multi-speaker environments without access to clean sources.
    O'Sullivan J; Chen Z; Herrero J; McKhann GM; Sheth SA; Mehta AD; Mesgarani N
    J Neural Eng; 2017 Oct; 14(5):056001. PubMed ID: 28776506

  • 5. Noise-robust cortical tracking of attended speech in real-world acoustic scenes.
    Fuglsang SA; Dau T; Hjortkjær J
    Neuroimage; 2017 Aug; 156():435-444. PubMed ID: 28412441

  • 6. How the Listener's Attention Dynamically Switches Between Different Speakers During a Natural Conversation.
    Dai B; Zhai Y; Long Y; Lu C
    Psychol Sci; 2024 Jun; 35(6):635-652. PubMed ID: 38657276

  • 7. Left Superior Temporal Gyrus Is Coupled to Attended Speech in a Cocktail-Party Auditory Scene.
    Vander Ghinst M; Bourguignon M; Op de Beeck M; Wens V; Marty B; Hassid S; Choufani G; Jousmäki V; Hari R; Van Bogaert P; Goldman S; De Tiège X
    J Neurosci; 2016 Feb; 36(5):1596-606. PubMed ID: 26843641

  • 8. Hierarchical Encoding of Attended Auditory Objects in Multi-talker Speech Perception.
    O'Sullivan J; Herrero J; Smith E; Schevon C; McKhann GM; Sheth SA; Mehta AD; Mesgarani N
    Neuron; 2019 Dec; 104(6):1195-1209.e3. PubMed ID: 31648900

  • 9. Speaker-listener neural coupling correlates with semantic and acoustic features of naturalistic speech.
    Li Z; Hong B; Nolte G; Engel AK; Zhang D
    Soc Cogn Affect Neurosci; 2024 Jul; ():. PubMed ID: 39012092

  • 10. Attentional Modulation of Hierarchical Speech Representations in a Multitalker Environment.
    Kiremitçi I; Yilmaz Ö; Çelik E; Shahdloo M; Huth AG; Çukur T
    Cereb Cortex; 2021 Oct; 31(11):4986-5005. PubMed ID: 34115102

  • 11. Selective cortical representation of attended speaker in multi-talker speech perception.
    Mesgarani N; Chang EF
    Nature; 2012 May; 485(7397):233-6. PubMed ID: 22522927

  • 12. Spatiotemporal dynamics of auditory attention synchronize with speech.
    Wöstmann M; Herrmann B; Maess B; Obleser J
    Proc Natl Acad Sci U S A; 2016 Apr; 113(14):3873-8. PubMed ID: 27001861

  • 13. Emotions amplify speaker-listener neural alignment.
    Smirnov D; Saarimäki H; Glerean E; Hari R; Sams M; Nummenmaa L
    Hum Brain Mapp; 2019 Nov; 40(16):4777-4788. PubMed ID: 31400052

  • 14. EEG-based auditory attention detection: boundary conditions for background noise and speaker positions.
    Das N; Bertrand A; Francart T
    J Neural Eng; 2018 Dec; 15(6):066017. PubMed ID: 30207293

  • 15. Evaluating Speaker-Listener Cognitive Effort in Speech Communication Through Brain-to-Brain Synchrony: A Pilot Functional Near-Infrared Spectroscopy Investigation.
    Green GD; Jacewicz E; Santosa H; Arzbecker LJ; Fox RA
    J Speech Lang Hear Res; 2024 May; 67(5):1339-1359. PubMed ID: 38535722

  • 16. Cortical Representations of Speech in a Multitalker Auditory Scene.
    Puvvada KC; Simon JZ
    J Neurosci; 2017 Sep; 37(38):9189-9196. PubMed ID: 28821680

  • 17. Breaking down the cocktail party: Attentional modulation of cerebral audiovisual speech processing.
    Wikman P; Sahari E; Salmela V; Leminen A; Leminen M; Laine M; Alho K
    Neuroimage; 2021 Jan; 224():117365. PubMed ID: 32941985

  • 18. EEG-based speaker-listener neural coupling reflects speech-selective attentional mechanisms beyond the speech stimulus.
    Li J; Hong B; Nolte G; Engel AK; Zhang D
    Cereb Cortex; 2023 Nov; 33(22):11080-11091. PubMed ID: 37814353

  • 19. Cortical tracking of multiple streams outside the focus of attention in naturalistic auditory scenes.
    Hausfeld L; Riecke L; Valente G; Formisano E
    Neuroimage; 2018 Nov; 181():617-626. PubMed ID: 30048749

  • 20. Visual input enhances selective speech envelope tracking in auditory cortex at a "cocktail party".
    Zion Golumbic E; Cogan GB; Schroeder CE; Poeppel D
    J Neurosci; 2013 Jan; 33(4):1417-26. PubMed ID: 23345218
