BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

190 related articles for article (PubMed ID: 35180258)

  • 1. Pea-KD: Parameter-efficient and accurate Knowledge Distillation on BERT.
    Cho I; Kang U
    PLoS One; 2022; 17(2):e0263592. PubMed ID: 35180258

  • 2. Teacher-student complementary sample contrastive distillation.
    Bao Z; Huang Z; Gou J; Du L; Liu K; Zhou J; Chen Y
    Neural Netw; 2024 Feb; 170:176-189. PubMed ID: 37989039

  • 3. Memory-Replay Knowledge Distillation.
    Wang J; Zhang P; Li Y
    Sensors (Basel); 2021 Apr; 21(8). PubMed ID: 33921068

  • 4. STKD: Distilling Knowledge From Synchronous Teaching for Efficient Model Compression.
    Su T; Zhang J; Yu Z; Wang G; Liu X
    IEEE Trans Neural Netw Learn Syst; 2023 Dec; 34(12):10051-10064. PubMed ID: 35420989

  • 5. Leveraging different learning styles for improved knowledge distillation in biomedical imaging.
    Niyaz U; Sambyal AS; Bathula DR
    Comput Biol Med; 2024 Jan; 168:107764. PubMed ID: 38056210

  • 6. Frameless Graph Knowledge Distillation.
    Shi D; Shao Z; Gao J; Wang Z; Guo Y
    IEEE Trans Neural Netw Learn Syst; 2024 Sep; PP. PubMed ID: 39231057

  • 7. LAD: Layer-Wise Adaptive Distillation for BERT Model Compression.
    Lin YJ; Chen KY; Kao HY
    Sensors (Basel); 2023 Jan; 23(3). PubMed ID: 36772523

  • 8. Resolution-based distillation for efficient histology image classification.
    DiPalma J; Suriawinata AA; Tafe LJ; Torresani L; Hassanpour S
    Artif Intell Med; 2021 Sep; 119:102136. PubMed ID: 34531005

  • 9. Template-Driven Knowledge Distillation for Compact and Accurate Periocular Biometrics Deep-Learning Models.
    Boutros F; Damer N; Raja K; Kirchbuchner F; Kuijper A
    Sensors (Basel); 2022 Mar; 22(5). PubMed ID: 35271074

  • 10. Mitigating carbon footprint for knowledge distillation based deep learning model compression.
    Rafat K; Islam S; Mahfug AA; Hossain MI; Rahman F; Momen S; Rahman S; Mohammed N
    PLoS One; 2023; 18(5):e0285668. PubMed ID: 37186614

  • 11. Highlight Every Step: Knowledge Distillation via Collaborative Teaching.
    Zhao H; Sun X; Dong J; Chen C; Dong Z
    IEEE Trans Cybern; 2022 Apr; 52(4):2070-2081. PubMed ID: 32721909

  • 12. Fine-Grained Learning Behavior-Oriented Knowledge Distillation for Graph Neural Networks.
    Liu K; Huang Z; Wang CD; Gao B; Chen Y
    IEEE Trans Neural Netw Learn Syst; 2024 Jul; PP. PubMed ID: 39012738

  • 13. Research on a lightweight electronic component detection method based on knowledge distillation.
    Xia Z; Gu J; Wang W; Huang Z
    Math Biosci Eng; 2023 Nov; 20(12):20971-20994. PubMed ID: 38124584

  • 14. Multi-teacher knowledge distillation based on joint Guidance of Probe and Adaptive Corrector.
    Shang R; Li W; Zhu S; Jiao L; Li Y
    Neural Netw; 2023 Jul; 164:345-356. PubMed ID: 37163850

  • 15. Multi-view Teacher-Student Network.
    Tian Y; Sun S; Tang J
    Neural Netw; 2022 Feb; 146:69-84. PubMed ID: 34839092

  • 16. Collaborative Knowledge Distillation via Multiknowledge Transfer.
    Gou J; Sun L; Yu B; Du L; Ramamohanarao K; Tao D
    IEEE Trans Neural Netw Learn Syst; 2024 May; 35(5):6718-6730. PubMed ID: 36264723

  • 17. Cosine similarity-guided knowledge distillation for robust object detectors.
    Park S; Kang D; Paik J
    Sci Rep; 2024 Aug; 14(1):18888. PubMed ID: 39143179

  • 18. Study of Deep Learning-Based Legal Judgment Prediction in Internet of Things Era.
    Zheng M; Liu B; Sun L
    Comput Intell Neurosci; 2022; 2022:8490760. PubMed ID: 35978889

  • 19. On Representation Knowledge Distillation for Graph Neural Networks.
    Joshi CK; Liu F; Xun X; Lin J; Foo CS
    IEEE Trans Neural Netw Learn Syst; 2024 Apr; 35(4):4656-4667. PubMed ID: 36459610

  • 20. A General Dynamic Knowledge Distillation Method for Visual Analytics.
    Tu Z; Liu X; Xiao X
    IEEE Trans Image Process; 2022 Oct; PP. PubMed ID: 36227819
