These tools will no longer be maintained as of December 31, 2024. The archived website can be found here. The PubMed4Hh GitHub repository can be found here. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors

- a resource for Precision Medicine *

112 related articles for article (PubMed ID: 32286978)

  • 1. Densely Distilled Flow-Based Knowledge Transfer in Teacher-Student Framework for Image Classification.
    Bae JH; Yeo D; Yim J; Kim NS; Pyo CS; Kim J
    IEEE Trans Image Process; 2020 Apr; ():. PubMed ID: 32286978
    [TBL] [Abstract][Full Text] [Related]  

  • 2. Highlight Every Step: Knowledge Distillation via Collaborative Teaching.
    Zhao H; Sun X; Dong J; Chen C; Dong Z
    IEEE Trans Cybern; 2022 Apr; 52(4):2070-2081. PubMed ID: 32721909
    [TBL] [Abstract][Full Text] [Related]  

  • 3. Representational Distance Learning for Deep Neural Networks.
    McClure P; Kriegeskorte N
    Front Comput Neurosci; 2016; 10():131. PubMed ID: 28082889
    [TBL] [Abstract][Full Text] [Related]  

  • 4. Complementary label learning based on knowledge distillation.
    Ying P; Li Z; Sun R; Xu X
    Math Biosci Eng; 2023 Sep; 20(10):17905-17918. PubMed ID: 38052542
    [TBL] [Abstract][Full Text] [Related]  

  • 5. Memory-Replay Knowledge Distillation.
    Wang J; Zhang P; Li Y
    Sensors (Basel); 2021 Apr; 21(8):. PubMed ID: 33921068
    [TBL] [Abstract][Full Text] [Related]  

  • 6. A deep learning knowledge distillation framework using knee MRI and arthroscopy data for meniscus tear detection.
    Ying M; Wang Y; Yang K; Wang H; Liu X
    Front Bioeng Biotechnol; 2023; 11():1326706. PubMed ID: 38292305
    [No Abstract]   [Full Text] [Related]  

  • 7. DCCD: Reducing Neural Network Redundancy via Distillation.
    Liu Y; Chen J; Liu Y
    IEEE Trans Neural Netw Learn Syst; 2024 Jul; 35(7):10006-10017. PubMed ID: 37022254
    [TBL] [Abstract][Full Text] [Related]  

  • 8. Knowledge distillation based on multi-layer fusion features.
    Tan S; Guo R; Tang J; Jiang N; Zou J
    PLoS One; 2023; 18(8):e0285901. PubMed ID: 37639443
    [TBL] [Abstract][Full Text] [Related]  

  • 9. Autoencoder and restricted Boltzmann machine for transfer learning in functional magnetic resonance imaging task classification.
    Hwang J; Lustig N; Jung M; Lee JH
    Heliyon; 2023 Jul; 9(7):e18086. PubMed ID: 37519689
    [TBL] [Abstract][Full Text] [Related]  

  • 10. Compression Helps Deep Learning in Image Classification.
    Yang EH; Amer H; Jiang Y
    Entropy (Basel); 2021 Jul; 23(7):. PubMed ID: 34356422
    [TBL] [Abstract][Full Text] [Related]  

  • 11. Estimation of Pedestrian Pose Orientation Using Soft Target Training Based on Teacher-Student Framework.
    Heo D; Nam JY; Ko BC
    Sensors (Basel); 2019 Mar; 19(5):. PubMed ID: 30845772
    [TBL] [Abstract][Full Text] [Related]  

  • 12. STKD: Distilling Knowledge From Synchronous Teaching for Efficient Model Compression.
    Su T; Zhang J; Yu Z; Wang G; Liu X
    IEEE Trans Neural Netw Learn Syst; 2023 Dec; 34(12):10051-10064. PubMed ID: 35420989
    [TBL] [Abstract][Full Text] [Related]  

  • 13. A 3DCNN-Based Knowledge Distillation Framework for Human Activity Recognition.
    Ullah H; Munir A
    J Imaging; 2023 Apr; 9(4):. PubMed ID: 37103233
    [TBL] [Abstract][Full Text] [Related]  

  • 14. Multi-view Teacher-Student Network.
    Tian Y; Sun S; Tang J
    Neural Netw; 2022 Feb; 146():69-84. PubMed ID: 34839092
    [TBL] [Abstract][Full Text] [Related]  

  • 15. Learning Student Networks via Feature Embedding.
    Chen H; Wang Y; Xu C; Xu C; Tao D
    IEEE Trans Neural Netw Learn Syst; 2021 Jan; 32(1):25-35. PubMed ID: 32092018
    [TBL] [Abstract][Full Text] [Related]  

  • 16. Efficient Combination of CNN and Transformer for Dual-Teacher Uncertainty-guided Semi-supervised Medical Image Segmentation.
    Xiao Z; Su Y; Deng Z; Zhang W
    Comput Methods Programs Biomed; 2022 Nov; 226():107099. PubMed ID: 36116398
    [TBL] [Abstract][Full Text] [Related]  

  • 17. Joint learning method with teacher-student knowledge distillation for on-device breast cancer image classification.
    Sepahvand M; Abdali-Mohammadi F
    Comput Biol Med; 2023 Mar; 155():106476. PubMed ID: 36841060
    [TBL] [Abstract][Full Text] [Related]  

  • 18. An ultra-fast deep-learning-based dose engine for prostate VMAT via knowledge distillation framework with limited patient data.
    Tseng W; Liu H; Yang Y; Liu C; Lu B
    Phys Med Biol; 2022 Dec; 68(1):. PubMed ID: 36533689
    [No Abstract]   [Full Text] [Related]  

  • 19. Resolution-based distillation for efficient histology image classification.
    DiPalma J; Suriawinata AA; Tafe LJ; Torresani L; Hassanpour S
    Artif Intell Med; 2021 Sep; 119():102136. PubMed ID: 34531005
    [TBL] [Abstract][Full Text] [Related]  

  • 20. Enabling data-limited chemical bioactivity predictions through deep neural network transfer learning.
    Liu R; Laxminarayan S; Reifman J; Wallqvist A
    J Comput Aided Mol Des; 2022 Dec; 36(12):867-878. PubMed ID: 36272041
    [TBL] [Abstract][Full Text] [Related]  
