

BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

125 related articles for article (PubMed ID: 33714544)

  • 1. Finding event structure in time: What recurrent neural networks can tell us about event structure in mind.
    Davis F; Altmann GTM
    Cognition; 2021 Aug; 213():104651. PubMed ID: 33714544

  • 2. Events as intersecting object histories: A new theory of event representation.
    Altmann GTM; Ekves Z
    Psychol Rev; 2019 Nov; 126(6):817-840. PubMed ID: 31144837

  • 3. Reading visually embodied meaning from the brain: Visually grounded computational models decode visual-object mental imagery induced by written text.
    Anderson AJ; Bruni E; Lopopolo A; Poesio M; Baroni M
    Neuroimage; 2015 Oct; 120():309-22. PubMed ID: 26188260

  • 4. Sense classification of shallow discourse relations with focused RNNs.
    Weiss G; Bajec M
    PLoS One; 2018; 13(10):e0206057. PubMed ID: 30376557

  • 5. Applications of Recurrent Neural Networks in Environmental Factor Forecasting: A Review.
    Chen Y; Cheng Q; Cheng Y; Yang H; Yu H
    Neural Comput; 2018 Nov; 30(11):2855-2881. PubMed ID: 30216144

  • 6. The state of the onion: Grammatical aspect modulates object representation during event comprehension.
    Misersky J; Slivac K; Hagoort P; Flecken M
    Cognition; 2021 Sep; 214():104744. PubMed ID: 33962314

  • 7. Comparing SNNs and RNNs on neuromorphic vision datasets: Similarities and differences.
    He W; Wu Y; Deng L; Li G; Wang H; Tian Y; Ding W; Wang W; Xie Y
    Neural Netw; 2020 Dec; 132():108-120. PubMed ID: 32866745

  • 8. Considerations in using recurrent neural networks to probe neural dynamics.
    Kao JC
    J Neurophysiol; 2019 Dec; 122(6):2504-2521. PubMed ID: 31619125

  • 9. Individual differences in encoded neural representations within cortical speech production network.
    Alfred KL; Hayes JC; Pizzie RG; Cetron JS; Kraemer DJM
    Brain Res; 2020 Jan; 1726():146483. PubMed ID: 31585067

  • 10. A hybrid model based on neural networks for biomedical relation extraction.
    Zhang Y; Lin H; Yang Z; Wang J; Zhang S; Sun Y; Yang L
    J Biomed Inform; 2018 May; 81():83-92. PubMed ID: 29601989

  • 11. Semantic Representations for NLP Using VerbNet and the Generative Lexicon.
    Brown SW; Bonn J; Kazeminejad G; Zaenen A; Pustejovsky J; Palmer M
    Front Artif Intell; 2022; 5():821697. PubMed ID: 35493615

  • 12. Neural mechanisms of language comprehension: challenges to syntax.
    Kuperberg GR
    Brain Res; 2007 May; 1146():23-49. PubMed ID: 17400197

  • 13. Effect of recurrent infomax on the information processing capability of input-driven recurrent neural networks.
    Tanaka T; Nakajima K; Aoyagi T
    Neurosci Res; 2020 Jul; 156():225-233. PubMed ID: 32068068

  • 14. The activation of object-state representations during online language comprehension.
    Kang X; Joergensen GH; Altmann GTM
    Acta Psychol (Amst); 2020 Oct; 210():103162. PubMed ID: 32818688

  • 15. Molecular language models: RNNs or transformer?
    Chen Y; Wang Z; Zeng X; Li Y; Li P; Ye X; Sakurai T
    Brief Funct Genomics; 2023 Jul; 22(4):392-400. PubMed ID: 37078726

  • 16. Universality and individuality in neural dynamics across large populations of recurrent networks.
    Maheswaranathan N; Williams AH; Golub MD; Ganguli S; Sussillo D
    Adv Neural Inf Process Syst; 2019 Dec; 2019():15629-15641. PubMed ID: 32782422

  • 17. Visual and Affective Multimodal Models of Word Meaning in Language and Mind.
    De Deyne S; Navarro DJ; Collell G; Perfors A
    Cogn Sci; 2021 Jan; 45(1):e12922. PubMed ID: 33432630

  • 18. Representations of continuous attractors of recurrent neural networks.
    Yu J; Yi Z; Zhang L
    IEEE Trans Neural Netw; 2009 Feb; 20(2):368-72. PubMed ID: 19150791

  • 19. Distinct fronto-temporal substrates of distributional and taxonomic similarity among words: evidence from RSA of BOLD signals.
    Carota F; Nili H; Pulvermüller F; Kriegeskorte N
    Neuroimage; 2021 Jan; 224():117408. PubMed ID: 33049407

  • 20. Bio-instantiated recurrent neural networks: Integrating neurobiology-based network topology in artificial networks.
    Goulas A; Damicelli F; Hilgetag CC
    Neural Netw; 2021 Oct; 142():608-618. PubMed ID: 34391175
