BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

157 articles related to the article with PubMed ID 38005675

  • 41. Efficient Crowd Counting via Dual Knowledge Distillation.
    Wang R; Hao Y; Hu L; Li X; Chen M; Miao Y; Humar I
    IEEE Trans Image Process; 2023 Dec; PP():. PubMed ID: 38127611

  • 42. Regional Time-Series Coding Network and Multi-View Image Generation Network for Short-Time Gait Recognition.
    Sun W; Lu G; Zhao Z; Guo T; Qin Z; Han Y
    Entropy (Basel); 2023 May; 25(6):. PubMed ID: 37372181

  • 43. Enhanced Knowledge Distillation for Advanced Recognition of Chinese Herbal Medicine.
    Zheng L; Long W; Yi J; Liu L; Xu K
    Sensors (Basel); 2024 Feb; 24(5):. PubMed ID: 38475094

  • 44. MT4MTL-KD: A Multi-Teacher Knowledge Distillation Framework for Triplet Recognition.
    Gui S; Wang Z; Chen J; Zhou X; Zhang C; Cao Y
    IEEE Trans Med Imaging; 2024 Apr; 43(4):1628-1639. PubMed ID: 38127608

  • 45. Generalized Knowledge Distillation via Relationship Matching.
    Ye HJ; Lu S; Zhan DC
    IEEE Trans Pattern Anal Mach Intell; 2023 Feb; 45(2):1817-1834. PubMed ID: 35298374

  • 46. ResKD: Residual-Guided Knowledge Distillation.
    Li X; Li S; Omar B; Wu F; Li X
    IEEE Trans Image Process; 2021; 30():4735-4746. PubMed ID: 33739924

  • 47. Feature Map Distillation of Thin Nets for Low-Resolution Object Recognition.
    Huang Z; Yang S; Zhou M; Li Z; Gong Z; Chen Y
    IEEE Trans Image Process; 2022; 31():1364-1379. PubMed ID: 35025743

  • 48. TCN-attention-HAR: human activity recognition based on attention mechanism time convolutional network.
    Wei X; Wang Z
    Sci Rep; 2024 Mar; 14(1):7414. PubMed ID: 38548859

  • 49. Complementary label learning based on knowledge distillation.
    Ying P; Li Z; Sun R; Xu X
    Math Biosci Eng; 2023 Sep; 20(10):17905-17918. PubMed ID: 38052542

  • 50. A Comprehensive Study on Cross-View Gait Based Human Identification with Deep CNNs.
    Wu Z; Huang Y; Wang L; Wang X; Tan T
    IEEE Trans Pattern Anal Mach Intell; 2017 Feb; 39(2):209-226. PubMed ID: 27019478

  • 51. Efficient Image and Sentence Matching.
    Huang Y; Wang Y; Wang L
    IEEE Trans Pattern Anal Mach Intell; 2023 Mar; 45(3):2970-2983. PubMed ID: 35622793

  • 52. Multiscale knowledge distillation with attention based fusion for robust human activity recognition.
    Yuan Z; Yang Z; Ning H; Tang X
    Sci Rep; 2024 May; 14(1):12411. PubMed ID: 38816446

  • 53. A Novel Approach to Classifying Breast Cancer Histopathology Biopsy Images Using Bilateral Knowledge Distillation and Label Smoothing Regularization.
    Chaudhury S; Shelke N; Sau K; Prasanalakshmi B; Shabaz M
    Comput Math Methods Med; 2021; 2021():4019358. PubMed ID: 34721657

  • 54. Data-Efficient Sensor Upgrade Path Using Knowledge Distillation.
    Van Molle P; De Boom C; Verbelen T; Vankeirsbilck B; De Vylder J; Diricx B; Simoens P; Dhoedt B
    Sensors (Basel); 2021 Sep; 21(19):. PubMed ID: 34640843

  • 55. Efficient image classification through collaborative knowledge distillation: A novel AlexNet modification approach.
    Kuldashboy A; Umirzakova S; Allaberdiev S; Nasimov R; Abdusalomov A; Cho YI
    Heliyon; 2024 Jul; 10(14):e34376. PubMed ID: 39113984

  • 56. MNGNAS: Distilling Adaptive Combination of Multiple Searched Networks for One-Shot Neural Architecture Search.
    Chen Z; Qiu G; Li P; Zhu L; Yang X; Sheng B
    IEEE Trans Pattern Anal Mach Intell; 2023 Nov; 45(11):13489-13508. PubMed ID: 37432801

  • 57. Knowledge distillation approach towards melanoma detection.
    Khan MS; Alam KN; Dhruba AR; Zunair H; Mohammed N
    Comput Biol Med; 2022 Jul; 146():105581. PubMed ID: 35594685

  • 58. Distilling Knowledge by Mimicking Features.
    Wang GH; Ge Y; Wu J
    IEEE Trans Pattern Anal Mach Intell; 2022 Nov; 44(11):8183-8195. PubMed ID: 34379588

  • 59. Deep Neural Network Self-Distillation Exploiting Data Representation Invariance.
    Xu TB; Liu CL
    IEEE Trans Neural Netw Learn Syst; 2022 Jan; 33(1):257-269. PubMed ID: 33074828

  • 60. An ultra-fast deep-learning-based dose engine for prostate VMAT via knowledge distillation framework with limited patient data.
    Tseng W; Liu H; Yang Y; Liu C; Lu B
    Phys Med Biol; 2022 Dec; 68(1):. PubMed ID: 36533689
