These tools are no longer maintained as of December 31, 2024. An archived version of the website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors

- a resource for Precision Medicine

87 related articles for article (PubMed ID: 25797829)

  • 1. The development of the Athens Emotional States Inventory (AESI): collection, validation and automatic processing of emotionally loaded sentences.
    Chaspari T; Soldatos C; Maragos P
    World J Biol Psychiatry; 2015; 16(5):312-22. PubMed ID: 25797829

  • 2. Recognizing emotional speech in Persian: a validated database of Persian emotional speech (Persian ESD).
    Keshtiari N; Kuhlmann M; Eslami M; Klann-Delius G
    Behav Res Methods; 2015 Mar; 47(1):275-94. PubMed ID: 24853832

  • 3. Recognizing emotions in spoken language: a validated set of Portuguese sentences and pseudosentences for research on emotional prosody.
    Castro SL; Lima CF
    Behav Res Methods; 2010 Feb; 42(1):74-81. PubMed ID: 20160287

  • 4. The Mandarin Chinese auditory emotions stimulus database: A validated set of Chinese pseudo-sentences.
    Gong B; Li N; Li Q; Yan X; Chen J; Li L; Wu X; Wu C
    Behav Res Methods; 2023 Apr; 55(3):1441-1459. PubMed ID: 35641682

  • 5. A resource of validated affective and neutral sentences to assess identification of emotion in spoken language after a brain injury.
    Ben-David BM; van Lieshout PH; Leszcz T
    Brain Inj; 2011; 25(2):206-20. PubMed ID: 21117915

  • 6. Speaking to the trained ear: musical expertise enhances the recognition of emotions in speech prosody.
    Lima CF; Castro SL
    Emotion; 2011 Oct; 11(5):1021-31. PubMed ID: 21942696

  • 7. Minho Affective Sentences (MAS): Probing the roles of sex, mood, and empathy in affective ratings of verbal stimuli.
    Pinheiro AP; Dias M; Pedrosa J; Soares AP
    Behav Res Methods; 2017 Apr; 49(2):698-716. PubMed ID: 27004484

  • 8. ASR for emotional speech: clarifying the issues and enhancing performance.
    Athanaselis T; Bakamidis S; Dologlou I; Cowie R; Douglas-Cowie E; Cox C
    Neural Netw; 2005 May; 18(4):437-44. PubMed ID: 15946824

  • 9. Intelligibility of emotional speech in younger and older adults.
    Dupuis K; Pichora-Fuller MK
    Ear Hear; 2014; 35(6):695-707. PubMed ID: 25127327

  • 10. The Chinese Facial Emotion Recognition Database (CFERD): a computer-generated 3-D paradigm to measure the recognition of facial emotional expressions at different intensities.
    Huang CL; Hsiao S; Hwu HG; Howng SL
    Psychiatry Res; 2012 Dec; 200(2-3):928-32. PubMed ID: 22503384

  • 11. The EU-Emotion Voice Database.
    Lassalle A; Pigat D; O'Reilly H; Berggen S; Fridenson-Hayo S; Tal S; Elfström S; Råde A; Golan O; Bölte S; Baron-Cohen S; Lundqvist D
    Behav Res Methods; 2019 Apr; 51(2):493-506. PubMed ID: 29713953

  • 12. Vocal emotion processing in Parkinson's disease: reduced sensitivity to negative emotions.
    Dara C; Monetta L; Pell MD
Brain Res; 2008 Jan; 1188:100-11. PubMed ID: 18022608

  • 13. Nonlinguistic vocalizations from online amateur videos for emotion research: A validated corpus.
    Anikin A; Persson T
    Behav Res Methods; 2017 Apr; 49(2):758-771. PubMed ID: 27130172

  • 14. What do your eyes reveal about your foreign language? Reading emotional sentences in a native and foreign language.
    Iacozza S; Costa A; Duñabeitia JA
    PLoS One; 2017; 12(10):e0186027. PubMed ID: 28973016

  • 15. Emotional intelligence, not music training, predicts recognition of emotional speech prosody.
    Trimmer CG; Cuddy LL
    Emotion; 2008 Dec; 8(6):838-49. PubMed ID: 19102595

  • 16. Disentangling the brain networks supporting affective speech comprehension.
    Hervé PY; Razafimandimby A; Vigneau M; Mazoyer B; Tzourio-Mazoyer N
    Neuroimage; 2012 Jul; 61(4):1255-67. PubMed ID: 22507230

  • 17. The Dysarthric Expressed Emotional Database (DEED): An audio-visual database in British English.
    Alhinti L; Cunningham S; Christensen H
    PLoS One; 2023; 18(8):e0287971. PubMed ID: 37549162

  • 18. Detection of affective states from text and speech for real-time human-computer interaction.
    Calix RA; Javadpour L; Knapp GM
    Hum Factors; 2012 Aug; 54(4):530-45. PubMed ID: 22908677

  • 19. Emotions in [a]: a perceptual and acoustic study.
    Toivanen J; Waaramaa T; Alku P; Laukkanen AM; Seppänen T; Väyrynen E; Airas M
    Logoped Phoniatr Vocol; 2006; 31(1):43-8. PubMed ID: 16517522

  • 20. A standardised database of Chinese emotional film clips.
    Ge Y; Zhao G; Zhang Y; Houston RJ; Song J
    Cogn Emot; 2019 Aug; 33(5):976-990. PubMed ID: 30293475
