BIOMARKERS

Molecular Biopsy of Human Tumors: a resource for Precision Medicine

113 related articles for article (PubMed ID: 38507388)

  • 1. Unpacking the Gap Box Against Data-Free Knowledge Distillation.
    Wang Y; Qian B; Liu H; Rui Y; Wang M
    IEEE Trans Pattern Anal Mach Intell; 2024 Sep; 46(9):6280-6291. PubMed ID: 38507388

  • 2. AdaDFKD: Exploring adaptive inter-sample relationship in data-free knowledge distillation.
    Li J; Zhou S; Li L; Wang H; Bu J; Yu Z
    Neural Netw; 2024 Sep; 177():106386. PubMed ID: 38776761

  • 3. A deep learning knowledge distillation framework using knee MRI and arthroscopy data for meniscus tear detection.
    Ying M; Wang Y; Yang K; Wang H; Liu X
    Front Bioeng Biotechnol; 2023; 11():1326706. PubMed ID: 38292305

  • 4. Adaptive Perspective Distillation for Semantic Segmentation.
    Tian Z; Chen P; Lai X; Jiang L; Liu S; Zhao H; Yu B; Yang MC; Jia J
    IEEE Trans Pattern Anal Mach Intell; 2023 Feb; 45(2):1372-1387. PubMed ID: 35294341

  • 5. Multi-view Teacher-Student Network.
    Tian Y; Sun S; Tang J
    Neural Netw; 2022 Feb; 146():69-84. PubMed ID: 34839092

  • 6. Spot-Adaptive Knowledge Distillation.
    Song J; Chen Y; Ye J; Song M
    IEEE Trans Image Process; 2022; 31():3359-3370. PubMed ID: 35503832

  • 7. Tolerant Self-Distillation for image classification.
    Liu M; Yu Y; Ji Z; Han J; Zhang Z
    Neural Netw; 2024 Jun; 174():106215. PubMed ID: 38471261

  • 8. Generalized Knowledge Distillation via Relationship Matching.
    Ye HJ; Lu S; Zhan DC
    IEEE Trans Pattern Anal Mach Intell; 2023 Feb; 45(2):1817-1834. PubMed ID: 35298374

  • 9. Few-Shot Face Stylization via GAN Prior Distillation.
    Zhao R; Zhu M; Wang N; Gao X
    IEEE Trans Neural Netw Learn Syst; 2024 Mar; PP():. PubMed ID: 38536698

  • 10. Bridging the gap between patient-specific and patient-independent seizure prediction via knowledge distillation.
    Wu D; Yang J; Sawan M
    J Neural Eng; 2022 Jun; 19(3):. PubMed ID: 35617933

  • 11. Adversarial learning-based multi-level dense-transmission knowledge distillation for AP-ROP detection.
    Xie H; Liu Y; Lei H; Song T; Yue G; Du Y; Wang T; Zhang G; Lei B
    Med Image Anal; 2023 Feb; 84():102725. PubMed ID: 36527770

  • 12. Selective knowledge sharing for privacy-preserving federated distillation without a good teacher.
    Shao J; Wu F; Zhang J
    Nat Commun; 2024 Jan; 15(1):349. PubMed ID: 38191466

  • 13. A 3DCNN-Based Knowledge Distillation Framework for Human Activity Recognition.
    Ullah H; Munir A
    J Imaging; 2023 Apr; 9(4):. PubMed ID: 37103233

  • 14. Improving Differentiable Architecture Search via self-distillation.
    Zhu X; Li J; Liu Y; Wang W
    Neural Netw; 2023 Oct; 167():656-667. PubMed ID: 37717323

  • 15. Adversarial Distillation for Learning with Privileged Provisions.
    Wang X; Zhang R; Sun Y; Qi J
    IEEE Trans Pattern Anal Mach Intell; 2021 Mar; 43(3):786-797. PubMed ID: 31545712

  • 16. Folic acid supplementation and malaria susceptibility and severity among people taking antifolate antimalarial drugs in endemic areas.
    Crider K; Williams J; Qi YP; Gutman J; Yeung L; Mai C; Finkelstein J; Mehta S; Pons-Duran C; Menéndez C; Moraleda C; Rogers L; Daniels K; Green P
    Cochrane Database Syst Rev; 2022 Feb; 2(2022):. PubMed ID: 36321557

  • 17. SPD: Semi-Supervised Learning and Progressive Distillation for 3-D Detection.
    Xie B; Yang Z; Yang L; Luo R; Lu J; Wei A; Weng X; Li B
    IEEE Trans Neural Netw Learn Syst; 2022 Jul; PP():. PubMed ID: 35905067

  • 18. SSD-KD: A self-supervised diverse knowledge distillation method for lightweight skin lesion classification using dermoscopic images.
    Wang Y; Wang Y; Cai J; Lee TK; Miao C; Wang ZJ
    Med Image Anal; 2023 Feb; 84():102693. PubMed ID: 36462373

  • 19. Topological Persistence Guided Knowledge Distillation for Wearable Sensor Data.
    Jeon ES; Choi H; Shukla A; Wang Y; Lee H; Buman MP; Turaga P
    Eng Appl Artif Intell; 2024 Apr; 130():. PubMed ID: 38282698

  • 20. Planning Implications Related to Sterilization-Sensitive Science Investigations Associated with Mars Sample Return (MSR).
    Velbel MA; Cockell CS; Glavin DP; Marty B; Regberg AB; Smith AL; Tosca NJ; Wadhwa M; Kminek G; Meyer MA; Beaty DW; Carrier BL; Haltigin T; Hays LE; Agee CB; Busemann H; Cavalazzi B; Debaille V; Grady MM; Hauber E; Hutzler A; McCubbin FM; Pratt LM; Smith CL; Summons RE; Swindle TD; Tait KT; Udry A; Usui T; Westall F; Zorzano MP
    Astrobiology; 2022 Jun; 22(S1):S112-S164. PubMed ID: 34904892
