These tools will no longer be maintained as of December 31, 2024. Archived website can be found here. PubMed4Hh GitHub repository can be found here. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

123 related articles for article (PubMed ID: 35877790)

  • 21. Leveraging different learning styles for improved knowledge distillation in biomedical imaging.
    Niyaz U; Sambyal AS; Bathula DR
    Comput Biol Med; 2024 Jan; 168():107764. PubMed ID: 38056210

  • 22. SSD-KD: A self-supervised diverse knowledge distillation method for lightweight skin lesion classification using dermoscopic images.
    Wang Y; Wang Y; Cai J; Lee TK; Miao C; Wang ZJ
    Med Image Anal; 2023 Feb; 84():102693. PubMed ID: 36462373

  • 23. Teachers' Relationship Closeness with Students as a Resource for Teacher Wellbeing: A Response Surface Analytical Approach.
    Milatz A; Lüftenegger M; Schober B
    Front Psychol; 2015; 6():1949. PubMed ID: 26779045

  • 24. Spot-Adaptive Knowledge Distillation.
    Song J; Chen Y; Ye J; Song M
    IEEE Trans Image Process; 2022; 31():3359-3370. PubMed ID: 35503832

  • 25. A Novel Approach to Classifying Breast Cancer Histopathology Biopsy Images Using Bilateral Knowledge Distillation and Label Smoothing Regularization.
    Chaudhury S; Shelke N; Sau K; Prasanalakshmi B; Shabaz M
    Comput Math Methods Med; 2021; 2021():4019358. PubMed ID: 34721657

  • 26. Gender differences in teachers' perceptions of students' temperament, educational competence, and teachability.
    Mullola S; Ravaja N; Lipsanen J; Alatupa S; Hintsanen M; Jokela M; Keltikangas-Järvinen L
    Br J Educ Psychol; 2012 Jun; 82(Pt 2):185-206. PubMed ID: 22583086

  • 27. The nurse teacher's pedagogical cooperation with students, the clinical learning environment and supervision in clinical practicum: a European cross-sectional study of graduating nursing students.
    Strandell-Laine C; Salminen L; Blöndal K; Fuster P; Hourican S; Koskinen S; Leino-Kilpi H; Löyttyniemi E; Stubner J; Truš M; Suikkala A
    BMC Med Educ; 2022 Jun; 22(1):509. PubMed ID: 35765065

  • 28. Efficient knowledge distillation for liver CT segmentation using growing assistant network.
    Xu P; Kim K; Koh J; Wu D; Rim Lee Y; Young Park S; Young Tak W; Liu H; Li Q
    Phys Med Biol; 2021 Nov; 66(23):. PubMed ID: 34768246

  • 29. Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution.
    Yang C; An Z; Cai L; Xu Y
    IEEE Trans Neural Netw Learn Syst; 2024 Feb; 35(2):2094-2108. PubMed ID: 35820013

  • 30. Dual-Level Knowledge Distillation via Knowledge Alignment and Correlation.
    Ding F; Yang Y; Hu H; Krovi V; Luo F
    IEEE Trans Neural Netw Learn Syst; 2024 Feb; 35(2):2425-2435. PubMed ID: 35834455

  • 31. Structured Knowledge Distillation for Accurate and Efficient Object Detection.
    Zhang L; Ma K
    IEEE Trans Pattern Anal Mach Intell; 2023 Dec; 45(12):15706-15724. PubMed ID: 37527292

  • 32. Inside Out: A Scoping Review on the Physical Education Teacher's Personality.
    Schnitzius M; Kirch A; Mess F; Spengler S
    Front Psychol; 2019; 10():2510. PubMed ID: 31781005

  • 33. Efficient Combination of CNN and Transformer for Dual-Teacher Uncertainty-guided Semi-supervised Medical Image Segmentation.
    Xiao Z; Su Y; Deng Z; Zhang W
    Comput Methods Programs Biomed; 2022 Nov; 226():107099. PubMed ID: 36116398

  • 34. Multilayer Semantic Features Adaptive Distillation for Object Detectors.
    Zhang Z; Liu J; Chen Y; Mei W; Huang F; Chen L
    Sensors (Basel); 2023 Sep; 23(17):. PubMed ID: 37688070

  • 35. STKD: Distilling Knowledge From Synchronous Teaching for Efficient Model Compression.
    Su T; Zhang J; Yu Z; Wang G; Liu X
    IEEE Trans Neural Netw Learn Syst; 2023 Dec; 34(12):10051-10064. PubMed ID: 35420989

  • 36. A single stage knowledge distillation network for brain tumor segmentation on limited MR image modalities.
    Choi Y; Al-Masni MA; Jung KJ; Yoo RE; Lee SY; Kim DH
    Comput Methods Programs Biomed; 2023 Oct; 240():107644. PubMed ID: 37307766

  • 37. Expanding and Refining Hybrid Compressors for Efficient Object Re-Identification.
    Xie Y; Wu H; Zhu J; Zeng H; Zhang J
    IEEE Trans Image Process; 2024; 33():3793-3808. PubMed ID: 38865219

  • 38. ResKD: Residual-Guided Knowledge Distillation.
    Li X; Li S; Omar B; Wu F; Li X
    IEEE Trans Image Process; 2021; 30():4735-4746. PubMed ID: 33739924

  • 39. DSP-KD: Dual-Stage Progressive Knowledge Distillation for Skin Disease Classification.
    Zeng X; Ji Z; Zhang H; Chen R; Liao Q; Wang J; Lyu T; Zhao L
    Bioengineering (Basel); 2024 Jan; 11(1):. PubMed ID: 38247947

  • 40. Efficient Crowd Counting via Dual Knowledge Distillation.
    Wang R; Hao Y; Hu L; Li X; Chen M; Miao Y; Humar I
    IEEE Trans Image Process; 2023 Dec; PP():. PubMed ID: 38127611
