

BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

198 related articles for article (PubMed ID: 37841080)

  • 21. Benefits of Music Training for Perception of Emotional Speech Prosody in Deaf Children With Cochlear Implants.
    Good A; Gordon KA; Papsin BC; Nespoli G; Hopyan T; Peretz I; Russo FA
    Ear Hear; 2017; 38(4):455-464. PubMed ID: 28085739

  • 22. Robust Multimodal Emotion Recognition from Conversation with Transformer-Based Crossmodality Fusion.
    Xie B; Sidulova M; Park CH
    Sensors (Basel); 2021 Jul; 21(14):. PubMed ID: 34300651

  • 23. Res-FLNet: human-robot interaction and collaboration for multi-modal sensing robot autonomous driving tasks based on learning control algorithm.
    Wang S
    Front Neurorobot; 2023; 17():1269105. PubMed ID: 37850153

  • 24. Audiovisual integration of emotional signals from music improvisation does not depend on temporal correspondence.
    Petrini K; McAleer P; Pollick F
    Brain Res; 2010 Apr; 1323():139-48. PubMed ID: 20153297

  • 25. Recognition of musical beat and style and applications in interactive humanoid robot.
    Chu Y
    Front Neurorobot; 2022; 16():875058. PubMed ID: 35990882

  • 26. Multi-Modal Adaptive Fusion Transformer Network for the Estimation of Depression Level.
    Sun H; Liu J; Chai S; Qiu Z; Lin L; Huang X; Chen Y
    Sensors (Basel); 2021 Jul; 21(14):. PubMed ID: 34300504

  • 27. Arousal Rules: An Empirical Investigation into the Aesthetic Experience of Cross-Modal Perception with Emotional Visual Music.
    Lee IE; Latchoumane CV; Jeong J
    Front Psychol; 2017; 8():440. PubMed ID: 28421007

  • 28. The unity assumption facilitates cross-modal binding of musical, non-speech stimuli: The role of spectral and amplitude envelope cues.
    Chuen L; Schutz M
    Atten Percept Psychophys; 2016 Jul; 78(5):1512-28. PubMed ID: 27084701

  • 29. Reciprocal modulation of cognitive and emotional aspects in pianistic performances.
    Higuchi MK; Fornari J; Del Ben CM; Graeff FG; Leite JP
    PLoS One; 2011; 6(9):e24437. PubMed ID: 21931716

  • 30. Deep Q-network for social robotics using emotional social signals.
    Belo JPR; Azevedo H; Ramos JJG; Romero RAF
    Front Robot AI; 2022; 9():880547. PubMed ID: 36226257

  • 31. Dynamic emotional and neural responses to music depend on performance expression and listener experience.
    Chapin H; Jantzen K; Kelso JA; Steinberg F; Large E
    PLoS One; 2010 Dec; 5(12):e13812. PubMed ID: 21179549

  • 32. Transformative skeletal motion analysis: optimization of exercise training and injury prevention through graph neural networks.
    Zhu J; Ye Z; Ren M; Ma G
    Front Neurosci; 2024; 18():1353257. PubMed ID: 38606310

  • 33. Dance Movements Enhance Song Learning in Deaf Children with Cochlear Implants.
    Vongpaisal T; Caruso D; Yuan Z
    Front Psychol; 2016; 7():835. PubMed ID: 27378964

  • 34. Cross-modal interactions in the experience of musical performances: physiological correlates.
    Chapados C; Levitin DJ
    Cognition; 2008 Sep; 108(3):639-51. PubMed ID: 18603233

  • 35. Music video emotion classification using slow-fast audio-video network and unsupervised feature representation.
    Pandeya YR; Bhattarai B; Lee J
    Sci Rep; 2021 Oct; 11(1):19834. PubMed ID: 34615904

  • 36. Online teaching emotion analysis based on GRU and nonlinear transformer algorithm.
    Ding L
    PeerJ Comput Sci; 2023; 9():e1696. PubMed ID: 38077587

  • 37. People's dispositional cooperative tendencies towards robots are unaffected by robots' negative emotional displays in prisoner's dilemma games.
    Hsieh TY; Cross ES
    Cogn Emot; 2022 Aug; 36(5):995-1019. PubMed ID: 35389323

  • 38. A Multi-Modal Convolutional Neural Network Model for Intelligent Analysis of the Influence of Music Genres on Children's Emotions.
    Qian Q; Chen X
    Comput Intell Neurosci; 2022; 2022():4957085. PubMed ID: 35909819

  • 39. Learning Adversarial Transformer for Symbolic Music Generation.
    Zhang N
    IEEE Trans Neural Netw Learn Syst; 2023 Apr; 34(4):1754-1763. PubMed ID: 32614773

  • 40. The effect of context and audio-visual modality on emotions elicited by a musical performance.
    Coutinho E; Scherer KR
    Psychol Music; 2017 Jul; 45(4):550-569. PubMed ID: 28781419
