These tools will no longer be maintained as of December 31, 2024. An archived version of the website and the PubMed4Hh GitHub repository are available. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

112 related articles for article (PubMed ID: 39024092)

  • 1. End-to-End Multimodal Emotion Recognition Based on Facial Expressions and Remote Photoplethysmography Signals.
    Li J; Peng J
    IEEE J Biomed Health Inform; 2024 Oct; 28(10):6054-6063. PubMed ID: 39024092

  • 2. Using Facial Micro-Expressions in Combination With EEG and Physiological Signals for Emotion Recognition.
    Saffaryazdi N; Wasim ST; Dileep K; Nia AF; Nanayakkara S; Broadbent E; Billinghurst M
    Front Psychol; 2022; 13():864047. PubMed ID: 35837650

  • 3. Multimodal emotion recognition by combining physiological signals and facial expressions: a preliminary study.
    Kortelainen J; Tiinanen S; Huang X; Li X; Laukka S; Pietikäinen M; Seppänen T
    Annu Int Conf IEEE Eng Med Biol Soc; 2012; 2012():5238-41. PubMed ID: 23367110

  • 4. Emotion Classification Based on Pulsatile Images Extracted from Short Facial Videos via Deep Learning.
    Talala S; Shvimmer S; Simhon R; Gilead M; Yitzhaky Y
    Sensors (Basel); 2024 Apr; 24(8):. PubMed ID: 38676235

  • 5. Feature selection for multimodal emotion recognition in the arousal-valence space.
    Torres CA; Orozco ÁA; Álvarez MA
    Annu Int Conf IEEE Eng Med Biol Soc; 2013; 2013():4330-3. PubMed ID: 24110691

  • 6. Fusion of Facial Expressions and EEG for Multimodal Emotion Recognition.
    Huang Y; Yang J; Liao P; Pan J
    Comput Intell Neurosci; 2017; 2017():2107451. PubMed ID: 29056963

  • 7. Joint low-rank tensor fusion and cross-modal attention for multimodal physiological signals based emotion recognition.
    Wan X; Wang Y; Wang Z; Tang Y; Liu B
    Physiol Meas; 2024 Jul; 45(7):. PubMed ID: 38917842

  • 8. [Emotion Recognition Based on Multiple Physiological Signals].
    Chen S; Zhang L; Jiang F; Chen W; Miao J; Chen H
    Zhongguo Yi Liao Qi Xie Za Zhi; 2020 Apr; 44(4):283-287. PubMed ID: 32762198

  • 9. Dimensional emotion recognition from camera-based PRV features.
    Zhou K; Schinle M; Stork W
    Methods; 2023 Oct; 218():224-232. PubMed ID: 37678514

  • 10. PulseGAN: Learning to Generate Realistic Pulse Waveforms in Remote Photoplethysmography.
    Song R; Chen H; Cheng J; Li C; Liu Y; Chen X
    IEEE J Biomed Health Inform; 2021 May; 25(5):1373-1384. PubMed ID: 33434140

  • 11. An effective cross-scenario remote heart rate estimation network based on global-local information and video transformer.
    Xiang G; Yao S; Peng Y; Deng H; Wu X; Wang K; Li Y; Wu F
    Phys Eng Sci Med; 2024 Jun; 47(2):729-739. PubMed ID: 38504066

  • 12. Deep learning-based remote-photoplethysmography measurement from short-time facial video.
    Li B; Jiang W; Peng J; Li X
    Physiol Meas; 2022 Nov; 43(11):. PubMed ID: 36215976

  • 13. CATM: A Multi-Feature-Based Cross-Scale Attentional Convolutional EEG Emotion Recognition Model.
    Yu H; Xiong X; Zhou J; Qian R; Sha K
    Sensors (Basel); 2024 Jul; 24(15):. PubMed ID: 39123882

  • 14. Multimodal Emotion Detection via Attention-Based Fusion of Extracted Facial and Speech Features.
    Mamieva D; Abdusalomov AB; Kutlimuratov A; Muminov B; Whangbo TK
    Sensors (Basel); 2023 Jun; 23(12):. PubMed ID: 37420642

  • 15. AND-rPPG: A novel denoising-rPPG network for improving remote heart rate estimation.
    Lokendra B; Puneet G
    Comput Biol Med; 2022 Feb; 141():105146. PubMed ID: 34942393

  • 16. Recognition of Intensive Valence and Arousal Affective States via Facial Electromyographic Activity in Young and Senior Adults.
    Tan JW; Andrade AO; Li H; Walter S; Hrabal D; Rukavina S; Limbrecht-Ecklundt K; Hoffman H; Traue HC
    PLoS One; 2016; 11(1):e0146691. PubMed ID: 26761427

  • 17. Multilevel analysis of facial expressions of emotion and script: self-report (arousal and valence) and psychophysiological correlates.
    Balconi M; Vanutelli ME; Finocchiaro R
    Behav Brain Funct; 2014 Sep; 10(1):32. PubMed ID: 25261242

  • 18. FusionSense: Emotion Classification Using Feature Fusion of Multimodal Data and Deep Learning in a Brain-Inspired Spiking Neural Network.
    Tan C; Ceballos G; Kasabov N; Puthanmadam Subramaniyam N
    Sensors (Basel); 2020 Sep; 20(18):. PubMed ID: 32957655

  • 19. Emotion recognition from single-channel EEG signals using a two-stage correlation and instantaneous frequency-based filtering method.
    Taran S; Bajaj V
    Comput Methods Programs Biomed; 2019 May; 173():157-165. PubMed ID: 31046991

  • 20. Emotion Recognition of Subjects With Hearing Impairment Based on Fusion of Facial Expression and EEG Topographic Map.
    Li D; Liu J; Yang Y; Hou F; Song H; Song Y; Gao Q; Mao Z
    IEEE Trans Neural Syst Rehabil Eng; 2023; 31():437-445. PubMed ID: 36455076
