These tools will no longer be maintained as of December 31, 2024. The archived website and the PubMed4Hh GitHub repository remain available online. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors

- a resource for Precision Medicine

112 related articles for article (PubMed ID: 32369980)

  • 21. Orthogonal-blendshape-based editing system for facial motion capture data.
    Li Q; Deng Z
    IEEE Comput Graph Appl; 2008; 28(6):76-82. PubMed ID: 19004687

  • 22. Accurate landmarking of three-dimensional facial data in the presence of facial expressions and occlusions using a three-dimensional statistical facial feature model.
    Zhao X; Dellandréa E; Chen L; Kakadiaris IA
    IEEE Trans Syst Man Cybern B Cybern; 2011 Oct; 41(5):1417-28. PubMed ID: 21622076

  • 23. Look More Into Occlusion: Realistic Face Frontalization and Recognition With BoostGAN.
    Duan Q; Zhang L
    IEEE Trans Neural Netw Learn Syst; 2021 Jan; 32(1):214-228. PubMed ID: 32203031

  • 24. EAC-Net: Deep Nets with Enhancing and Cropping for Facial Action Unit Detection.
    Li W; Abtahi F; Zhu Z; Yin L
    IEEE Trans Pattern Anal Mach Intell; 2018 Nov; 40(11):2583-2596. PubMed ID: 29994168

  • 25. Less differentiated facial responses to naturalistic films of another person's emotional expressions in adolescents and adults with High-Functioning Autism Spectrum Disorder.
    Weiss EM; Rominger C; Hofer E; Fink A; Papousek I
    Prog Neuropsychopharmacol Biol Psychiatry; 2019 Mar; 89():341-346. PubMed ID: 30336172

  • 26. Quantitative anatomical analysis of facial expression using a 3D motion capture system: Application to cosmetic surgery and facial recognition technology.
    Lee JG; Jung SJ; Lee HJ; Seo JH; Choi YJ; Bae HS; Park JT; Kim HJ
    Clin Anat; 2015 Sep; 28(6):735-44. PubMed ID: 25872024

  • 27. Using FACS to trace the neural specializations underlying the recognition of facial expressions: A commentary on Waller et al. (2020).
    Taubert J; Japee S
    Neurosci Biobehav Rev; 2021 Jan; 120():75-77. PubMed ID: 33227326

  • 28. Reconstructing 3D Face Model with Associated Expression Deformation from a Single Face Image via Constructing a Low-Dimensional Expression Deformation Manifold.
    Wang SF; Lai SH
    IEEE Trans Pattern Anal Mach Intell; 2011 Oct; 33(10):2115-21. PubMed ID: 21576739

  • 29. Optimal Facial Feature Based Emotional Recognition Using Deep Learning Algorithm.
    Kumar Arora T; Kumar Chaubey P; Shree Raman M; Kumar B; Nagesh Y; Anjani PK; Ahmed HMS; Hashmi A; Balamuralitharan S; Debtera B
    Comput Intell Neurosci; 2022; 2022():8379202. PubMed ID: 36177319

  • 30. Using Facial Symmetry to Handle Pose Variations in Real-World 3D Face Recognition.
    Passalis G; Perakis P; Theoharis T; Kakadiaris IA
    IEEE Trans Pattern Anal Mach Intell; 2011 Oct; 33(10):1938-51. PubMed ID: 21383396

  • 31. EquiFACS: The Equine Facial Action Coding System.
    Wathan J; Burrows AM; Waller BM; McComb K
    PLoS One; 2015; 10(8):e0131738. PubMed ID: 26244573

  • 32. Pose-Invariant Face Recognition via RGB-D Images.
    Sang G; Li J; Zhao Q
    Comput Intell Neurosci; 2016; 2016():3563758. PubMed ID: 26819581

  • 33. Extending the MaqFACS to measure facial movement in Japanese macaques (Macaca fuscata) reveals a wide repertoire potential.
    Correia-Caeiro C; Holmes K; Miyabe-Nishiwaki T
    PLoS One; 2021; 16(1):e0245117. PubMed ID: 33411716

  • 34. Use of 3D faces facilitates facial expression recognition in children.
    Wang L; Chen W; Li H
    Sci Rep; 2017 Apr; 7():45464. PubMed ID: 28368008

  • 35. The digital Emily project: achieving a photorealistic digital actor.
    Alexander O; Rogers M; Lambeth W; Chiang JY; Ma WC; Wang CC; Debevec P
    IEEE Comput Graph Appl; 2010; 30(4):20-31. PubMed ID: 20650725

  • 36. Similar facial electromyographic responses to faces, voices, and body expressions.
    Magnée MJ; Stekelenburg JJ; Kemner C; de Gelder B
    Neuroreport; 2007 Mar; 18(4):369-72. PubMed ID: 17435605

  • 37. CalliFACS: The common marmoset Facial Action Coding System.
    Correia-Caeiro C; Burrows A; Wilson DA; Abdelrahman A; Miyabe-Nishiwaki T
    PLoS One; 2022; 17(5):e0266442. PubMed ID: 35580128

  • 38. Differences in Facial Expressions between Spontaneous and Posed Smiles: Automated Method by Action Units and Three-Dimensional Facial Landmarks.
    Park S; Lee K; Lim JA; Ko H; Kim T; Lee JI; Kim H; Han SJ; Kim JS; Park S; Lee JY; Lee EC
    Sensors (Basel); 2020 Feb; 20(4):. PubMed ID: 32098261

  • 39. Can an android's posture and movement discriminate against the ambiguous emotion perceived from its facial expressions?
    Yagi S; Nakata Y; Nakamura Y; Ishiguro H
    PLoS One; 2021; 16(8):e0254905. PubMed ID: 34375327
