214 related articles for article (PubMed ID: 34531005)
1. Resolution-based distillation for efficient histology image classification.
DiPalma J; Suriawinata AA; Tafe LJ; Torresani L; Hassanpour S
Artif Intell Med; 2021 Sep; 119():102136. PubMed ID: 34531005
2. Semi-supervised training of deep convolutional neural networks with heterogeneous data and few local annotations: An experiment on prostate histopathology image classification.
Marini N; Otálora S; Müller H; Atzori M
Med Image Anal; 2021 Oct; 73():102165. PubMed ID: 34303169
3. Classification of diabetic retinopathy using unlabeled data and knowledge distillation.
Abbasi S; Hajabdollahi M; Khadivi P; Karimi N; Roshandel R; Shirani S; Samavi S
Artif Intell Med; 2021 Nov; 121():102176. PubMed ID: 34763798
4. Knowledge Distillation in Histology Landscape by Multi-Layer Features Supervision.
Javed S; Mahmood A; Qaiser T; Werghi N
IEEE J Biomed Health Inform; 2023 Jan; PP():. PubMed ID: 37021915
5. Uninformed Teacher-Student for hard-samples distillation in weakly supervised mitosis localization.
Fernandez-Martín C; Silva-Rodriguez J; Kiraz U; Morales S; Janssen EAM; Naranjo V
Comput Med Imaging Graph; 2024 Mar; 112():102328. PubMed ID: 38244279
6. Mitigating carbon footprint for knowledge distillation based deep learning model compression.
Rafat K; Islam S; Mahfug AA; Hossain MI; Rahman F; Momen S; Rahman S; Mohammed N
PLoS One; 2023; 18(5):e0285668. PubMed ID: 37186614
7. Efficient Combination of CNN and Transformer for Dual-Teacher Uncertainty-guided Semi-supervised Medical Image Segmentation.
Xiao Z; Su Y; Deng Z; Zhang W
Comput Methods Programs Biomed; 2022 Nov; 226():107099. PubMed ID: 36116398
8. Weakly Supervised Deep Learning for Whole Slide Lung Cancer Image Analysis.
Wang X; Chen H; Gan C; Lin H; Dou Q; Tsougenis E; Huang Q; Cai M; Heng PA
IEEE Trans Cybern; 2020 Sep; 50(9):3950-3962. PubMed ID: 31484154
9. HistoPerm: A permutation-based view generation approach for improving histopathologic feature representation learning.
DiPalma J; Torresani L; Hassanpour S
J Pathol Inform; 2023; 14():100320. PubMed ID: 37457594
10. An ultra-fast deep-learning-based dose engine for prostate VMAT via knowledge distillation framework with limited patient data.
Tseng W; Liu H; Yang Y; Liu C; Lu B
Phys Med Biol; 2022 Dec; 68(1):. PubMed ID: 36533689
11. Accurate deep learning model using semi-supervised learning and Noisy Student for cervical cancer screening in low magnification images.
Kurita Y; Meguro S; Tsuyama N; Kosugi I; Enomoto Y; Kawasaki H; Uemura T; Kimura M; Iwashita T
PLoS One; 2023; 18(5):e0285996. PubMed ID: 37200281
12. A semi-supervised learning framework for micropapillary adenocarcinoma detection.
Gao Y; Ding Y; Xiao W; Yao Z; Zhou X; Sui X; Zhao Y; Zheng Y
Int J Comput Assist Radiol Surg; 2022 Apr; 17(4):639-648. PubMed ID: 35149953
13. Deep semi-supervised multiple instance learning with self-correction for DME classification from OCT images.
Wang X; Tang F; Chen H; Cheung CY; Heng PA
Med Image Anal; 2023 Jan; 83():102673. PubMed ID: 36403310
14. Sample self-selection using dual teacher networks for pathological image classification with noisy labels.
Han G; Guo W; Zhang H; Jin J; Gan X; Zhao X
Comput Biol Med; 2024 May; 174():108489. PubMed ID: 38640633
15. RCKD: Response-Based Cross-Task Knowledge Distillation for Pathological Image Analysis.
Kim H; Kwak TY; Chang H; Kim SW; Kim I
Bioengineering (Basel); 2023 Nov; 10(11):. PubMed ID: 38002403
16. MTAN: A semi-supervised learning model for kidney tumor segmentation.
Sun P; Yang S; Guan H; Mo T; Yu B; Chen Z
J Xray Sci Technol; 2023; 31(6):1295-1313. PubMed ID: 37718833
17. MSKD: Structured knowledge distillation for efficient medical image segmentation.
Zhao L; Qian X; Guo Y; Song J; Hou J; Gong J
Comput Biol Med; 2023 Sep; 164():107284. PubMed ID: 37572439
18. Overcoming limitation of dissociation between MD and MI classifications of breast cancer histopathological images through a novel decomposed feature-based knowledge distillation method.
Sepahvand M; Abdali-Mohammadi F
Comput Biol Med; 2022 Jun; 145():105413. PubMed ID: 35325731
19. TGMIL: A hybrid multi-instance learning model based on the Transformer and the Graph Attention Network for whole-slide images classification of renal cell carcinoma.
Sun X; Li W; Fu B; Peng Y; He J; Wang L; Yang T; Meng X; Li J; Wang J; Huang P; Wang R
Comput Methods Programs Biomed; 2023 Dec; 242():107789. PubMed ID: 37722310
20. DeepHistoNet: A robust deep-learning model for the classification of hepatocellular, lung, and colon carcinoma.
Kadirappa R; S D; R P; Ko SB
Microsc Res Tech; 2024 Feb; 87(2):229-256. PubMed ID: 37750465
[TBL] [Abstract][Full Text] [Related]