

BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

118 related articles for article (PubMed ID: 38683707)

  • 1. ADPS: Asymmetric Distillation Postsegmentation for Image Anomaly Detection.
    Xing P; Tang H; Tang J; Li Z
    IEEE Trans Neural Netw Learn Syst; 2024 Apr; PP():. PubMed ID: 38683707

  • 2. Cosine similarity knowledge distillation for surface anomaly detection.
    Sheng S; Jing J; Wang Z; Zhang H
    Sci Rep; 2024 Apr; 14(1):8150. PubMed ID: 38589492

  • 3. Unsupervised Anomaly Detection with Distillated Teacher-Student Network Ensemble.
    Xiao Q; Wang J; Lin Y; Gongsa W; Hu G; Li M; Wang F
    Entropy (Basel); 2021 Feb; 23(2):. PubMed ID: 33561954

  • 4. Light-M: An efficient lightweight medical image segmentation framework for resource-constrained IoMT.
    Zhang Y; Chen Z; Yang X
    Comput Biol Med; 2024 Mar; 170():108088. PubMed ID: 38320339

  • 5. A Method for Image Anomaly Detection Based on Distillation and Reconstruction.
    Luo J; Zhang J
    Sensors (Basel); 2023 Nov; 23(22):. PubMed ID: 38005667

  • 6. Using teacher-student neural networks based on knowledge distillation to detect anomalous samples in the otolith images.
    Chen Y; Zhu G
    Zoology (Jena); 2023 Dec; 161():126133. PubMed ID: 37979211

  • 7. Adversarial learning-based multi-level dense-transmission knowledge distillation for AP-ROP detection.
    Xie H; Liu Y; Lei H; Song T; Yue G; Du Y; Wang T; Zhang G; Lei B
    Med Image Anal; 2023 Feb; 84():102725. PubMed ID: 36527770

  • 8. Learning From Human Educational Wisdom: A Student-Centered Knowledge Distillation Method.
    Yang S; Yang J; Zhou M; Huang Z; Zheng WS; Yang X; Ren J
    IEEE Trans Pattern Anal Mach Intell; 2024 Jun; 46(6):4188-4205. PubMed ID: 38227419

  • 9. A deep learning knowledge distillation framework using knee MRI and arthroscopy data for meniscus tear detection.
    Ying M; Wang Y; Yang K; Wang H; Liu X
    Front Bioeng Biotechnol; 2023; 11():1326706. PubMed ID: 38292305

  • 10. TSSK-Net: Weakly supervised biomarker localization and segmentation with image-level annotation in retinal OCT images.
    Liu X; Liu Q; Zhang Y; Wang M; Tang J
    Comput Biol Med; 2023 Feb; 153():106467. PubMed ID: 36584602

  • 11. Leveraging different learning styles for improved knowledge distillation in biomedical imaging.
    Niyaz U; Sambyal AS; Bathula DR
    Comput Biol Med; 2024 Jan; 168():107764. PubMed ID: 38056210

  • 12. A General Dynamic Knowledge Distillation Method for Visual Analytics.
    Tu Z; Liu X; Xiao X
    IEEE Trans Image Process; 2022 Oct; PP():. PubMed ID: 36227819

  • 13. FCKDNet: A Feature Condensation Knowledge Distillation Network for Semantic Segmentation.
    Yuan W; Lu X; Zhang R; Liu Y
    Entropy (Basel); 2023 Jan; 25(1):. PubMed ID: 36673266

  • 14. EPANet-KD: Efficient progressive attention network for fine-grained provincial village classification via knowledge distillation.
    Zhang C; Liu C; Gong H; Teng J
    PLoS One; 2024; 19(2):e0298452. PubMed ID: 38359020

  • 15. Knowledge distillation based on multi-layer fusion features.
    Tan S; Guo R; Tang J; Jiang N; Zou J
    PLoS One; 2023; 18(8):e0285901. PubMed ID: 37639443

  • 16. Multi-view Teacher-Student Network.
    Tian Y; Sun S; Tang J
    Neural Netw; 2022 Feb; 146():69-84. PubMed ID: 34839092

  • 17. Self-Parameter Distillation Dehazing.
    Kim G; Kwon J
    IEEE Trans Image Process; 2022 Dec; PP():. PubMed ID: 37015501

  • 18. CReg-KD: Model refinement via confidence regularized knowledge distillation for brain imaging.
    Yang Y; Guo X; Ye C; Xiang Y; Ma T
    Med Image Anal; 2023 Oct; 89():102916. PubMed ID: 37549611

  • 19. Learning Student Networks via Feature Embedding.
    Chen H; Wang Y; Xu C; Xu C; Tao D
    IEEE Trans Neural Netw Learn Syst; 2021 Jan; 32(1):25-35. PubMed ID: 32092018

  • 20. MSKD: Structured knowledge distillation for efficient medical image segmentation.
    Zhao L; Qian X; Guo Y; Song J; Hou J; Gong J
    Comput Biol Med; 2023 Sep; 164():107284. PubMed ID: 37572439
