

BIOMARKERS

Molecular Biopsy of Human Tumors

- a resource for Precision Medicine

144 related articles for article (PubMed ID: 35773285)

  • 1. Optimised weight programming for analogue memory-based deep neural networks.
    Mackin C; Rasch MJ; Chen A; Timcheck J; Bruce RL; Li N; Narayanan P; Ambrogio S; Le Gallo M; Nandakumar SR; Fasoli A; Luquin J; Friz A; Sebastian A; Tsai H; Burr GW
    Nat Commun; 2022 Jun; 13(1):3765. PubMed ID: 35773285

  • 2. Quantization-aware training for low precision photonic neural networks.
    Kirtas M; Oikonomou A; Passalis N; Mourgias-Alexandris G; Moralis-Pegios M; Pleros N; Tefas A
    Neural Netw; 2022 Nov; 155():561-573. PubMed ID: 36191452

  • 3. Toward Software-Equivalent Accuracy on Transformer-Based Deep Neural Networks With Analog Memory Devices.
    Spoon K; Tsai H; Chen A; Rasch MJ; Ambrogio S; Mackin C; Fasoli A; Friz AM; Narayanan P; Stanisavljevic M; Burr GW
    Front Comput Neurosci; 2021; 15():675741. PubMed ID: 34290595

  • 4. Lean Neural Networks for Autonomous Radar Waveform Design.
    Baietto A; Boubin J; Farr P; Bihl TJ; Jones AM; Stewart C
    Sensors (Basel); 2022 Feb; 22(4):. PubMed ID: 35214218

  • 5. Equivalent-accuracy accelerated neural-network training using analogue memory.
    Ambrogio S; Narayanan P; Tsai H; Shelby RM; Boybat I; di Nolfo C; Sidler S; Giordano M; Bodini M; Farinha NCP; Killeen B; Cheng C; Jaoudi Y; Burr GW
    Nature; 2018 Jun; 558(7708):60-67. PubMed ID: 29875487

  • 6. Toward Full-Stack Acceleration of Deep Convolutional Neural Networks on FPGAs.
    Liu S; Fan H; Ferianc M; Niu X; Shi H; Luk W
    IEEE Trans Neural Netw Learn Syst; 2022 Aug; 33(8):3974-3987. PubMed ID: 33577458

  • 7. Accelerating Inference of Convolutional Neural Networks Using In-memory Computing.
    Dazzi M; Sebastian A; Benini L; Eleftheriou E
    Front Comput Neurosci; 2021; 15():674154. PubMed ID: 34413731

  • 8. Energy-efficient Mott activation neuron for full-hardware implementation of neural networks.
    Oh S; Shi Y; Del Valle J; Salev P; Lu Y; Huang Z; Kalcheim Y; Schuller IK; Kuzum D
    Nat Nanotechnol; 2021 Jun; 16(6):680-687. PubMed ID: 33737724

  • 9. Mixed-Precision Deep Learning Based on Computational Memory.
    Nandakumar SR; Le Gallo M; Piveteau C; Joshi V; Mariani G; Boybat I; Karunaratne G; Khaddam-Aljameh R; Egger U; Petropoulos A; Antonakopoulos T; Rajendran B; Sebastian A; Eleftheriou E
    Front Neurosci; 2020; 14():406. PubMed ID: 32477047

  • 10. Training Deep Convolutional Neural Networks with Resistive Cross-Point Devices.
    Gokmen T; Onen M; Haensch W
    Front Neurosci; 2017; 11():538. PubMed ID: 29066942

  • 11. Hardware-aware training for large-scale and diverse deep learning inference workloads using in-memory computing-based accelerators.
    Rasch MJ; Mackin C; Le Gallo M; Chen A; Fasoli A; Odermatt F; Li N; Nandakumar SR; Narayanan P; Tsai H; Burr GW; Sebastian A; Narayanan V
    Nat Commun; 2023 Aug; 14(1):5282. PubMed ID: 37648721

  • 12. Understanding and mitigating noise in trained deep neural networks.
    Semenova N; Larger L; Brunner D
    Neural Netw; 2022 Feb; 146():151-160. PubMed ID: 34864223

  • 13. Evolution of Deep Convolutional Neural Networks Using Cartesian Genetic Programming.
    Suganuma M; Kobayashi M; Shirakawa S; Nagao T
    Evol Comput; 2020; 28(1):141-163. PubMed ID: 30900927

  • 14. Accurate deep neural network inference using computational phase-change memory.
    Joshi V; Le Gallo M; Haefeli S; Boybat I; Nandakumar SR; Piveteau C; Dazzi M; Rajendran B; Sebastian A; Eleftheriou E
    Nat Commun; 2020 May; 11(1):2473. PubMed ID: 32424184

  • 15. Memristors for Neuromorphic Circuits and Artificial Intelligence Applications.
    Miranda E; Suñé J
    Materials (Basel); 2020 Feb; 13(4):. PubMed ID: 32093164

  • 16. CHARLES: A C++ fixed-point library for Photonic-Aware Neural Networks.
    Paolini E; De Marinis L; Maggiani L; Cococcioni M; Andriolli N
    Neural Netw; 2023 May; 162():531-540. PubMed ID: 36990002

  • 17. Efficient Computation Reduction in Bayesian Neural Networks Through Feature Decomposition and Memorization.
    Jia X; Yang J; Liu R; Wang X; Cotofana SD; Zhao W
    IEEE Trans Neural Netw Learn Syst; 2021 Apr; 32(4):1703-1712. PubMed ID: 32386165

  • 18. Event-driven implementation of deep spiking convolutional neural networks for supervised classification using the SpiNNaker neuromorphic platform.
    Patiño-Saucedo A; Rostro-Gonzalez H; Serrano-Gotarredona T; Linares-Barranco B
    Neural Netw; 2020 Jan; 121():319-328. PubMed ID: 31590013

  • 19. MorphIC: A 65-nm 738k-Synapse/mm² Quad-Core Binary-Weight Digital Neuromorphic Processor With Stochastic Spike-Driven Online Learning.
    Frenkel C; Legat JD; Bol D
    IEEE Trans Biomed Circuits Syst; 2019 Oct; 13(5):999-1010. PubMed ID: 31329562

  • 20. SRAM-Based CIM Architecture Design for Event Detection.
    Sulaiman MBG; Lin JY; Li JB; Shih CM; Juang KC; Lu CC
    Sensors (Basel); 2022 Oct; 22(20):. PubMed ID: 36298205
