BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

134 related articles for article (PubMed ID: 38282698)

  • 1. Topological Persistence Guided Knowledge Distillation for Wearable Sensor Data.
    Jeon ES; Choi H; Shukla A; Wang Y; Lee H; Buman MP; Turaga P
    Eng Appl Artif Intell; 2024 Apr; 130():. PubMed ID: 38282698

  • 2. Topological Knowledge Distillation for Wearable Sensor Data.
    Jeon ES; Choi H; Shukla A; Wang Y; Buman MP; Turaga P
    Conf Rec Asilomar Conf Signals Syst Comput; 2022; 2022():837-842. PubMed ID: 37583442

  • 3. Constrained Adaptive Distillation Based on Topological Persistence for Wearable Sensor Data.
    Jeon ES; Choi H; Shukla A; Wang Y; Buman MP; Turaga P
    IEEE Trans Instrum Meas; 2023; 72():. PubMed ID: 38818128

  • 4. Semantics-Aware Adaptive Knowledge Distillation for Sensor-to-Vision Action Recognition.
    Liu Y; Wang K; Li G; Lin L
    IEEE Trans Image Process; 2021; 30():5573-5588. PubMed ID: 34110991

  • 5. TDA-Net: Fusion of Persistent Homology and Deep Learning Features for COVID-19 Detection From Chest X-Ray Images.
    Hajij M; Zamzmi G; Batayneh F
    Annu Int Conf IEEE Eng Med Biol Soc; 2021 Nov; 2021():4115-4119. PubMed ID: 34892132

  • 6. PI-Net: A Deep Learning Approach to Extract Topological Persistence Images.
    Som A; Choi H; Ramamurthy KN; Buman MP; Turaga P
    Conf Comput Vis Pattern Recognit Workshops; 2020 Jun; 2020():3639-3648. PubMed ID: 32995068

  • 7. An ultra-fast deep-learning-based dose engine for prostate VMAT via knowledge distillation framework with limited patient data.
    Tseng W; Liu H; Yang Y; Liu C; Lu B
    Phys Med Biol; 2022 Dec; 68(1):. PubMed ID: 36533689

  • 8. Teacher-student complementary sample contrastive distillation.
    Bao Z; Huang Z; Gou J; Du L; Liu K; Zhou J; Chen Y
    Neural Netw; 2024 Feb; 170():176-189. PubMed ID: 37989039

  • 9. Data-Efficient Sensor Upgrade Path Using Knowledge Distillation.
    Van Molle P; De Boom C; Verbelen T; Vankeirsbilck B; De Vylder J; Diricx B; Simoens P; Dhoedt B
    Sensors (Basel); 2021 Sep; 21(19):. PubMed ID: 34640843

  • 10. Hybrid Topological Data Analysis and Deep Learning for Basal Cell Carcinoma Diagnosis.
    Maurya A; Stanley RJ; Lama N; Nambisan AK; Patel G; Saeed D; Swinfard S; Smith C; Jagannathan S; Hagerty JR; Stoecker WV
    J Imaging Inform Med; 2024 Feb; 37(1):92-106. PubMed ID: 38343238

  • 11. Knowledge distillation based on multi-layer fusion features.
    Tan S; Guo R; Tang J; Jiang N; Zou J
    PLoS One; 2023; 18(8):e0285901. PubMed ID: 37639443

  • 12. Complementary label learning based on knowledge distillation.
    Ying P; Li Z; Sun R; Xu X
    Math Biosci Eng; 2023 Sep; 20(10):17905-17918. PubMed ID: 38052542

  • 13. RoS-KD: A Robust Stochastic Knowledge Distillation Approach for Noisy Medical Imaging.
    Jaiswal A; Ashutosh K; Rousseau JF; Peng Y; Wang Z; Ding Y
    Proc IEEE Int Conf Data Min; 2022; 2022():981-986. PubMed ID: 37038389

  • 14. Learning Privacy-Preserving Student Networks via Discriminative-Generative Distillation.
    Ge S; Liu B; Wang P; Li Y; Zeng D
    IEEE Trans Image Process; 2022 Dec; PP():. PubMed ID: 37015525

  • 15. DCCD: Reducing Neural Network Redundancy via Distillation.
    Liu Y; Chen J; Liu Y
    IEEE Trans Neural Netw Learn Syst; 2024 Jul; 35(7):10006-10017. PubMed ID: 37022254

  • 16. Leveraging different learning styles for improved knowledge distillation in biomedical imaging.
    Niyaz U; Sambyal AS; Bathula DR
    Comput Biol Med; 2024 Jan; 168():107764. PubMed ID: 38056210

  • 17. SSD-KD: A self-supervised diverse knowledge distillation method for lightweight skin lesion classification using dermoscopic images.
    Wang Y; Wang Y; Cai J; Lee TK; Miao C; Wang ZJ
    Med Image Anal; 2023 Feb; 84():102693. PubMed ID: 36462373

  • 18. Efficient Combination of CNN and Transformer for Dual-Teacher Uncertainty-guided Semi-supervised Medical Image Segmentation.
    Xiao Z; Su Y; Deng Z; Zhang W
    Comput Methods Programs Biomed; 2022 Nov; 226():107099. PubMed ID: 36116398

  • 19. FCKDNet: A Feature Condensation Knowledge Distillation Network for Semantic Segmentation.
    Yuan W; Lu X; Zhang R; Liu Y
    Entropy (Basel); 2023 Jan; 25(1):. PubMed ID: 36673266

  • 20. Template-Driven Knowledge Distillation for Compact and Accurate Periocular Biometrics Deep-Learning Models.
    Boutros F; Damer N; Raja K; Kirchbuchner F; Kuijper A
    Sensors (Basel); 2022 Mar; 22(5):. PubMed ID: 35271074
