These tools will no longer be maintained as of December 31, 2024. The archived website can be found here. The PubMed4Hh GitHub repository can be found here. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors

- a resource for Precision Medicine

350 related articles for article (PubMed ID: 37186614)

  • 1. Mitigating carbon footprint for knowledge distillation based deep learning model compression.
    Rafat K; Islam S; Mahfug AA; Hossain MI; Rahman F; Momen S; Rahman S; Mohammed N
    PLoS One; 2023; 18(5):e0285668. PubMed ID: 37186614

  • 2. DCCD: Reducing Neural Network Redundancy via Distillation.
    Liu Y; Chen J; Liu Y
    IEEE Trans Neural Netw Learn Syst; 2024 Jul; 35(7):10006-10017. PubMed ID: 37022254

  • 3. Highlight Every Step: Knowledge Distillation via Collaborative Teaching.
    Zhao H; Sun X; Dong J; Chen C; Dong Z
    IEEE Trans Cybern; 2022 Apr; 52(4):2070-2081. PubMed ID: 32721909

  • 4. Resolution-based distillation for efficient histology image classification.
    DiPalma J; Suriawinata AA; Tafe LJ; Torresani L; Hassanpour S
    Artif Intell Med; 2021 Sep; 119():102136. PubMed ID: 34531005

  • 5. Memory-Replay Knowledge Distillation.
    Wang J; Zhang P; Li Y
    Sensors (Basel); 2021 Apr; 21(8):. PubMed ID: 33921068

  • 6. A novel adaptive cubic quasi-Newton optimizer for deep learning based medical image analysis tasks, validated on detection of COVID-19 and segmentation for COVID-19 lung infection, liver tumor, and optic disc/cup.
    Liu Y; Zhang M; Zhong Z; Zeng X
    Med Phys; 2023 Mar; 50(3):1528-1538. PubMed ID: 36057788

  • 7. ResKD: Residual-Guided Knowledge Distillation.
    Li X; Li S; Omar B; Wu F; Li X
    IEEE Trans Image Process; 2021; 30():4735-4746. PubMed ID: 33739924

  • 8. Deep Learning Model Compression With Rank Reduction in Tensor Decomposition.
    Dai W; Fan J; Miao Y; Hwang K
    IEEE Trans Neural Netw Learn Syst; 2023 Nov; PP():. PubMed ID: 37976188

  • 9. Continual Learning With Knowledge Distillation: A Survey.
    Li S; Su T; Zhang XY; Wang Z
    IEEE Trans Neural Netw Learn Syst; 2024 Oct; PP():. PubMed ID: 39423075

  • 10. An ultra-fast deep-learning-based dose engine for prostate VMAT via knowledge distillation framework with limited patient data.
    Tseng W; Liu H; Yang Y; Liu C; Lu B
    Phys Med Biol; 2022 Dec; 68(1):. PubMed ID: 36533689

  • 11. Method and evaluations of the effective gain of artificial intelligence models for reducing CO2 emissions.
    Delanoë P; Tchuente D; Colin G
    J Environ Manage; 2023 Apr; 331():117261. PubMed ID: 36642044

  • 12. SSD-KD: A self-supervised diverse knowledge distillation method for lightweight skin lesion classification using dermoscopic images.
    Wang Y; Wang Y; Cai J; Lee TK; Miao C; Wang ZJ
    Med Image Anal; 2023 Feb; 84():102693. PubMed ID: 36462373

  • 13. Learning lightweight tea detector with reconstructed feature and dual distillation.
    Zheng Z; Zuo G; Zhang W; Zhang C; Zhang J; Rao Y; Jiang Z
    Sci Rep; 2024 Oct; 14(1):23669. PubMed ID: 39390063

  • 14. Data-Efficient Sensor Upgrade Path Using Knowledge Distillation.
    Van Molle P; De Boom C; Verbelen T; Vankeirsbilck B; De Vylder J; Diricx B; Simoens P; Dhoedt B
    Sensors (Basel); 2021 Sep; 21(19):. PubMed ID: 34640843

  • 15. Multistage feature fusion knowledge distillation.
    Li G; Wang K; Lv P; He P; Zhou Z; Xu C
    Sci Rep; 2024 Jun; 14(1):13373. PubMed ID: 38862547

  • 16. Restructuring the Teacher and Student in Self-Distillation.
    Zheng Y; Wang C; Tao C; Lin S; Qian J; Wu J
    IEEE Trans Image Process; 2024; 33():5551-5563. PubMed ID: 39316482

  • 17. NTCE-KD: Non-Target-Class-Enhanced Knowledge Distillation.
    Li C; Teng X; Ding Y; Lan L
    Sensors (Basel); 2024 Jun; 24(11):. PubMed ID: 38894408

  • 18. A Method of Deep Learning Model Optimization for Image Classification on Edge Device.
    Lee H; Lee N; Lee S
    Sensors (Basel); 2022 Sep; 22(19):. PubMed ID: 36236445

  • 19. Research on a lightweight electronic component detection method based on knowledge distillation.
    Xia Z; Gu J; Wang W; Huang Z
    Math Biosci Eng; 2023 Nov; 20(12):20971-20994. PubMed ID: 38124584

  • 20. Knowledge distillation in deep learning and its applications.
    Alkhulaifi A; Alsahli F; Ahmad I
    PeerJ Comput Sci; 2021; 7():e474. PubMed ID: 33954248
