176 related articles for article (PubMed ID: 32824456)
1. Knowledge Distillation for Semantic Segmentation Using Channel and Spatial Correlations and Adaptive Cross Entropy.
Park S; Heo YS
Sensors (Basel); 2020 Aug; 20(16):. PubMed ID: 32824456
2. FCKDNet: A Feature Condensation Knowledge Distillation Network for Semantic Segmentation.
Yuan W; Lu X; Zhang R; Liu Y
Entropy (Basel); 2023 Jan; 25(1):. PubMed ID: 36673266
3. Semantic Segmentation Using Pixel-Wise Adaptive Label Smoothing via Self-Knowledge Distillation for Limited Labeling Data.
Park S; Kim J; Heo YS
Sensors (Basel); 2022 Mar; 22(7):. PubMed ID: 35408237
4. MSKD: Structured knowledge distillation for efficient medical image segmentation.
Zhao L; Qian X; Guo Y; Song J; Hou J; Gong J
Comput Biol Med; 2023 Sep; 164():107284. PubMed ID: 37572439
5. Double Similarity Distillation for Semantic Image Segmentation.
Feng Y; Sun X; Diao W; Li J; Gao X
IEEE Trans Image Process; 2021; 30():5363-5376. PubMed ID: 34048345
6. DCCD: Reducing Neural Network Redundancy via Distillation.
Liu Y; Chen J; Liu Y
IEEE Trans Neural Netw Learn Syst; 2023 Jan; PP():. PubMed ID: 37022254
7. MFAFNet: A Lightweight and Efficient Network with Multi-Level Feature Adaptive Fusion for Real-Time Semantic Segmentation.
Lu K; Cheng J; Li H; Ouyang T
Sensors (Basel); 2023 Jul; 23(14):. PubMed ID: 37514676
8. Bilateral attention decoder: A lightweight decoder for real-time semantic segmentation.
Peng C; Tian T; Chen C; Guo X; Ma J
Neural Netw; 2021 May; 137():188-199. PubMed ID: 33647536
9. Scale-Hybrid Group Distillation with Knowledge Disentangling for Continual Semantic Segmentation.
Song Z; Zhang X; Shi Z
Sensors (Basel); 2023 Sep; 23(18):. PubMed ID: 37765877
10. Adaptive Perspective Distillation for Semantic Segmentation.
Tian Z; Chen P; Lai X; Jiang L; Liu S; Zhao H; Yu B; Yang MC; Jia J
IEEE Trans Pattern Anal Mach Intell; 2023 Feb; 45(2):1372-1387. PubMed ID: 35294341
11. ResKD: Residual-Guided Knowledge Distillation.
Li X; Li S; Omar B; Wu F; Li X
IEEE Trans Image Process; 2021; 30():4735-4746. PubMed ID: 33739924
12. CGNet: A Light-Weight Context Guided Network for Semantic Segmentation.
Wu T; Tang S; Zhang R; Cao J; Zhang Y
IEEE Trans Image Process; 2021; 30():1169-1179. PubMed ID: 33306466
13. Semi-Supervised Semantic Segmentation of Remote Sensing Images Based on Dual Cross-Entropy Consistency.
Cui M; Li K; Li Y; Kamuhanda D; Tessone CJ
Entropy (Basel); 2023 Apr; 25(4):. PubMed ID: 37190469
14. Efficient Multi-Scale Stereo-Matching Network Using Adaptive Cost Volume Filtering.
Jeon S; Heo YS
Sensors (Basel); 2022 Jul; 22(15):. PubMed ID: 35898003
15. Feature Map Distillation of Thin Nets for Low-Resolution Object Recognition.
Huang Z; Yang S; Zhou M; Li Z; Gong Z; Chen Y
IEEE Trans Image Process; 2022; 31():1364-1379. PubMed ID: 35025743
16. STC-GAN: Spatio-Temporally Coupled Generative Adversarial Networks for Predictive Scene Parsing.
Qi M; Wang Y; Li A; Luo J
IEEE Trans Image Process; 2020 Apr; ():. PubMed ID: 32248106
17. Segmentation with mixed supervision: Confidence maximization helps knowledge distillation.
Liu B; Desrosiers C; Ben Ayed I; Dolz J
Med Image Anal; 2023 Jan; 83():102670. PubMed ID: 36413905
18. Knowledge distillation with ensembles of convolutional neural networks for medical image segmentation.
Noothout JMH; Lessmann N; van Eede MC; van Harten LD; Sogancioglu E; Heslinga FG; Veta M; van Ginneken B; Išgum I
J Med Imaging (Bellingham); 2022 Sep; 9(5):052407. PubMed ID: 35692896
19. Multi-teacher knowledge distillation based on joint Guidance of Probe and Adaptive Corrector.
Shang R; Li W; Zhu S; Jiao L; Li Y
Neural Netw; 2023 Jul; 164():345-356. PubMed ID: 37163850
20. Resolution-Aware Knowledge Distillation for Efficient Inference.
Feng Z; Lai J; Xie X
IEEE Trans Image Process; 2021; 30():6985-6996. PubMed ID: 34347598