BIOMARKERS

Molecular Biopsy of Human Tumors

- a resource for Precision Medicine

26 related articles for article (PubMed ID: 37030799)

  • 1. GANsDTA: Predicting Drug-Target Binding Affinity Using GANs.
    Zhao L; Wang J; Pang L; Liu Y; Zhang J
    Front Genet; 2019; 10():1243. PubMed ID: 31993067

  • 2. Tolerant Self-Distillation for image classification.
    Liu M; Yu Y; Ji Z; Han J; Zhang Z
    Neural Netw; 2024 Jun; 174():106215. PubMed ID: 38471261

  • 3. MT4MTL-KD: A Multi-Teacher Knowledge Distillation Framework for Triplet Recognition.
    Gui S; Wang Z; Chen J; Zhou X; Zhang C; Cao Y
    IEEE Trans Med Imaging; 2024 Apr; 43(4):1628-1639. PubMed ID: 38127608

  • 4. NTCE-KD: Non-Target-Class-Enhanced Knowledge Distillation.
    Li C; Teng X; Ding Y; Lan L
    Sensors (Basel); 2024 Jun; 24(11):. PubMed ID: 38894408

  • 5. When Object Detection Meets Knowledge Distillation: A Survey.
    Li Z; Xu P; Chang X; Yang L; Zhang Y; Yao L; Chen X
    IEEE Trans Pattern Anal Mach Intell; 2023 Aug; 45(8):10555-10579. PubMed ID: 37028387

  • 6. AdaDFKD: Exploring adaptive inter-sample relationship in data-free knowledge distillation.
    Li J; Zhou S; Li L; Wang H; Bu J; Yu Z
    Neural Netw; 2024 Sep; 177():106386. PubMed ID: 38776761

  • 7. DSP-KD: Dual-Stage Progressive Knowledge Distillation for Skin Disease Classification.
    Zeng X; Ji Z; Zhang H; Chen R; Liao Q; Wang J; Lyu T; Zhao L
    Bioengineering (Basel); 2024 Jan; 11(1):. PubMed ID: 38247947

  • 8. Enhanced Knowledge Distillation for Advanced Recognition of Chinese Herbal Medicine.
    Zheng L; Long W; Yi J; Liu L; Xu K
    Sensors (Basel); 2024 Feb; 24(5):. PubMed ID: 38475094

  • 9. Self-distillation framework for document-level relation extraction in low-resource environments.
    Wu H; Zhou G; Xia Y; Liu H; Zhang T
    PeerJ Comput Sci; 2024; 10():e1930. PubMed ID: 38660168

  • 10. Unpacking the Gap Box Against Data-Free Knowledge Distillation.
    Wang Y; Qian B; Liu H; Rui Y; Wang M
    IEEE Trans Pattern Anal Mach Intell; 2024 Mar; PP():. PubMed ID: 38507388

  • 11. Enhancing Offensive Language Detection with Data Augmentation and Knowledge Distillation.
    Deng J; Chen Z; Sun H; Zhang Z; Wu J; Nakagawa S; Ren F; Huang M
    Research (Wash D C); 2023; 6():0189. PubMed ID: 37727321

  • 12. Adversarial Distillation for Learning with Privileged Provisions.
    Wang X; Zhang R; Sun Y; Qi J
    IEEE Trans Pattern Anal Mach Intell; 2021 Mar; 43(3):786-797. PubMed ID: 31545712

  • 13. Leveraging Symbolic Knowledge Bases for Commonsense Natural Language Inference Using Pattern Theory.
    Aakur SN; Sarkar S
    IEEE Trans Pattern Anal Mach Intell; 2023 Nov; 45(11):13185-13202. PubMed ID: 37339033

  • 14. MTANS: Multi-Scale Mean Teacher Combined Adversarial Network with Shape-Aware Embedding for Semi-Supervised Brain Lesion Segmentation.
    Chen G; Ru J; Zhou Y; Rekik I; Pan Z; Liu X; Lin Y; Lu B; Shi J
    Neuroimage; 2021 Dec; 244():118568. PubMed ID: 34508895

  • 15. Mitigating Accuracy-Robustness Trade-Off Via Balanced Multi-Teacher Adversarial Distillation.
    Zhao S; Wang X; Wei X
    IEEE Trans Pattern Anal Mach Intell; 2024 Jun; PP():. PubMed ID: 38889035

  • 16. Adversarial co-training for semantic segmentation over medical images.
    Xie H; Fu C; Zheng X; Zheng Y; Sham CW; Wang X
    Comput Biol Med; 2023 May; 157():106736. PubMed ID: 36958238

  • 17. Adversarial Multi-Teacher Distillation for Semi-Supervised Relation Extraction.
    Li W; Qian T; Li X; Zou L
    IEEE Trans Neural Netw Learn Syst; 2023 Mar; PP():. PubMed ID: 37030799
