
127 related articles for the article with PubMed ID 38818128

  • 1. Constrained Adaptive Distillation Based on Topological Persistence for Wearable Sensor Data.
    Jeon ES; Choi H; Shukla A; Wang Y; Buman MP; Turaga P
    IEEE Trans Instrum Meas; 2023; 72():. PubMed ID: 38818128

  • 2. Topological Persistence Guided Knowledge Distillation for Wearable Sensor Data.
    Jeon ES; Choi H; Shukla A; Wang Y; Lee H; Buman MP; Turaga P
    Eng Appl Artif Intell; 2024 Apr; 130():. PubMed ID: 38282698

  • 3. Topological Knowledge Distillation for Wearable Sensor Data.
    Jeon ES; Choi H; Shukla A; Wang Y; Buman MP; Turaga P
    Conf Rec Asilomar Conf Signals Syst Comput; 2022; 2022():837-842. PubMed ID: 37583442

  • 4. Semantics-Aware Adaptive Knowledge Distillation for Sensor-to-Vision Action Recognition.
    Liu Y; Wang K; Li G; Lin L
    IEEE Trans Image Process; 2021; 30():5573-5588. PubMed ID: 34110991

  • 5. Layerwised multimodal knowledge distillation for vision-language pretrained model.
    Wang J; Liao D; Zhang Y; Xu D; Zhang X
    Neural Netw; 2024 Jul; 175():106272. PubMed ID: 38569460

  • 6. Knowledge Distillation for Semantic Segmentation Using Channel and Spatial Correlations and Adaptive Cross Entropy.
    Park S; Heo YS
    Sensors (Basel); 2020 Aug; 20(16):. PubMed ID: 32824456

  • 7. Collaborative Knowledge Distillation via Multiknowledge Transfer.
    Gou J; Sun L; Yu B; Du L; Ramamohanarao K; Tao D
    IEEE Trans Neural Netw Learn Syst; 2024 May; 35(5):6718-6730. PubMed ID: 36264723

  • 8. Data-Efficient Sensor Upgrade Path Using Knowledge Distillation.
    Van Molle P; De Boom C; Verbelen T; Vankeirsbilck B; De Vylder J; Diricx B; Simoens P; Dhoedt B
    Sensors (Basel); 2021 Sep; 21(19):. PubMed ID: 34640843

  • 9. A deep learning knowledge distillation framework using knee MRI and arthroscopy data for meniscus tear detection.
    Ying M; Wang Y; Yang K; Wang H; Liu X
    Front Bioeng Biotechnol; 2023; 11():1326706. PubMed ID: 38292305

  • 10. DCCD: Reducing Neural Network Redundancy via Distillation.
    Liu Y; Chen J; Liu Y
    IEEE Trans Neural Netw Learn Syst; 2023 Jan; PP():. PubMed ID: 37022254

  • 11. Multi-teacher knowledge distillation based on joint Guidance of Probe and Adaptive Corrector.
    Shang R; Li W; Zhu S; Jiao L; Li Y
    Neural Netw; 2023 Jul; 164():345-356. PubMed ID: 37163850

  • 12. Light-M: An efficient lightweight medical image segmentation framework for resource-constrained IoMT.
    Zhang Y; Chen Z; Yang X
    Comput Biol Med; 2024 Mar; 170():108088. PubMed ID: 38320339

  • 13. Multilayer Semantic Features Adaptive Distillation for Object Detectors.
    Zhang Z; Liu J; Chen Y; Mei W; Huang F; Chen L
    Sensors (Basel); 2023 Sep; 23(17):. PubMed ID: 37688070

  • 14. Highlight Every Step: Knowledge Distillation via Collaborative Teaching.
    Zhao H; Sun X; Dong J; Chen C; Dong Z
    IEEE Trans Cybern; 2022 Apr; 52(4):2070-2081. PubMed ID: 32721909

  • 15. FCKDNet: A Feature Condensation Knowledge Distillation Network for Semantic Segmentation.
    Yuan W; Lu X; Zhang R; Liu Y
    Entropy (Basel); 2023 Jan; 25(1):. PubMed ID: 36673266

  • 16. TCN-attention-HAR: human activity recognition based on attention mechanism time convolutional network.
    Wei X; Wang Z
    Sci Rep; 2024 Mar; 14(1):7414. PubMed ID: 38548859

  • 17. Teacher-student complementary sample contrastive distillation.
    Bao Z; Huang Z; Gou J; Du L; Liu K; Zhou J; Chen Y
    Neural Netw; 2024 Feb; 170():176-189. PubMed ID: 37989039

  • 18. Resolution-based distillation for efficient histology image classification.
    DiPalma J; Suriawinata AA; Tafe LJ; Torresani L; Hassanpour S
    Artif Intell Med; 2021 Sep; 119():102136. PubMed ID: 34531005

  • 19. Role of Data Augmentation Strategies in Knowledge Distillation for Wearable Sensor Data.
    Jeon ES; Som A; Shukla A; Hasanaj K; Buman MP; Turaga P
    IEEE Internet Things J; 2022 Jul; 9(14):12848-12860. PubMed ID: 35813017

  • 20. Multistage feature fusion knowledge distillation.
    Li G; Wang K; Lv P; He P; Zhou Z; Xu C
    Sci Rep; 2024 Jun; 14(1):13373. PubMed ID: 38862547
