These tools are no longer maintained as of December 31, 2024. An archived version of the website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service with any questions.


139 related articles for article (PubMed ID: 37604959)

  • 1. Mouth and facial informativeness norms for 2276 English words.
    Krason A; Zhang Y; Man H; Vigliocco G
    Behav Res Methods; 2024 Aug; 56(5):4786-4801. PubMed ID: 37604959

  • 2. The role of iconic gestures and mouth movements in face-to-face communication.
    Krason A; Fenton R; Varley R; Vigliocco G
    Psychon Bull Rev; 2022 Apr; 29(2):600-612. PubMed ID: 34671936

  • 3. Increasing audiovisual speech integration in autism through enhanced attention to mouth.
    Feng S; Wang Q; Hu Y; Lu H; Li T; Song C; Fang J; Chen L; Yi L
    Dev Sci; 2023 Jul; 26(4):e13348. PubMed ID: 36394129

  • 4. Face-viewing patterns predict audiovisual speech integration in autistic children.
    Feng S; Lu H; Wang Q; Li T; Fang J; Chen L; Yi L
    Autism Res; 2021 Dec; 14(12):2592-2602. PubMed ID: 34415113

  • 5. [Slowing down the flow of facial information enhances facial scanning in children with autism spectrum disorders: A pilot eye tracking study].
    Charrier A; Tardif C; Gepner B
    Encephale; 2017 Feb; 43(1):32-40. PubMed ID: 26995150

  • 6. Eye Gaze and Perceptual Adaptation to Audiovisual Degraded Speech.
    Banks B; Gowen E; Munro KJ; Adank P
    J Speech Lang Hear Res; 2021 Sep; 64(9):3432-3445. PubMed ID: 34463528

  • 7. Does dynamic information about the speaker's face contribute to semantic speech processing? ERP evidence.
    Hernández-Gutiérrez D; Abdel Rahman R; Martín-Loeches M; Muñoz F; Schacht A; Sommer W
    Cortex; 2018 Jul; 104():12-25. PubMed ID: 29715582

  • 8. Automatic audiovisual integration in speech perception.
    Gentilucci M; Cattaneo L
    Exp Brain Res; 2005 Nov; 167(1):66-75. PubMed ID: 16034571

  • 9. Syllable or phoneme? A mouse-tracking investigation of phonological units in Mandarin Chinese and English spoken word recognition.
    Lin YC; Lin PY; Yeh LH
    J Exp Psychol Learn Mem Cogn; 2023 Jan; 49(1):130-176. PubMed ID: 35679219

  • 10. [Development and evaluation of a deep learning algorithm for German word recognition from lip movements].
    Pham DN; Rahne T
    HNO; 2022 Jun; 70(6):456-465. PubMed ID: 35024877

  • 11. Comparing the Informativeness of Single-Word Samples and Connected Speech Samples in Assessing Speech Sound Disorders.
    Yeh LL; Liu CC
    J Speech Lang Hear Res; 2021 Nov; 64(11):4071-4084. PubMed ID: 34618552

  • 12. Lip-Reading Enables the Brain to Synthesize Auditory Features of Unknown Silent Speech.
    Bourguignon M; Baart M; Kapnoula EC; Molinaro N
    J Neurosci; 2020 Jan; 40(5):1053-1065. PubMed ID: 31889007

  • 13. Inferring word meanings by assuming that speakers are informative.
    Frank MC; Goodman ND
    Cogn Psychol; 2014 Dec; 75():80-96. PubMed ID: 25238461

  • 14. Tolerance for audiovisual asynchrony is enhanced by the spectrotemporal fidelity of the speaker's mouth movements and speech.
    Shahin AJ; Shen S; Kerlin JR
    Lang Cogn Neurosci; 2017; 32(9):1102-1118. PubMed ID: 28966930

  • 15. When eyes beat lips: speaker gaze affects audiovisual integration in the McGurk illusion.
    Wahn B; Schmitz L; Kingstone A; Böckler-Raettig A
    Psychol Res; 2022 Sep; 86(6):1930-1943. PubMed ID: 34854983

  • 16. The Relevance of the Availability of Visual Speech Cues During Adaptation to Noise-Vocoded Speech.
    Trotter AS; Banks B; Adank P
    J Speech Lang Hear Res; 2021 Jul; 64(7):2513-2528. PubMed ID: 34161748

  • 17. Computerised speech and language therapy or attention control added to usual care for people with long-term post-stroke aphasia: the Big CACTUS three-arm RCT.
    Palmer R; Dimairo M; Latimer N; Cross E; Brady M; Enderby P; Bowen A; Julious S; Harrison M; Alshreef A; Bradley E; Bhadhuri A; Chater T; Hughes H; Witts H; Herbert E; Cooper C
    Health Technol Assess; 2020 Apr; 24(19):1-176. PubMed ID: 32369007

  • 18. The Lancaster Sensorimotor Norms: multidimensional measures of perceptual and action strength for 40,000 English words.
    Lynott D; Connell L; Brysbaert M; Brand J; Carney J
    Behav Res Methods; 2020 Jun; 52(3):1271-1291. PubMed ID: 31832879

  • 19. The contribution of dynamic visual cues to audiovisual speech perception.
    Jaekl P; Pesquita A; Alsius A; Munhall K; Soto-Faraco S
    Neuropsychologia; 2015 Aug; 75():402-10. PubMed ID: 26100561

  • 20. Mouth and Voice: A Relationship between Visual and Auditory Preference in the Human Superior Temporal Sulcus.
    Zhu LL; Beauchamp MS
    J Neurosci; 2017 Mar; 37(10):2697-2708. PubMed ID: 28179553
