These tools will no longer be maintained as of December 31, 2024. The archived website can be found here. The PubMed4Hh GitHub repository can be found here. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors

- A Resource for Precision Medicine

109 related articles for article (PubMed ID: 36366697)

  • 1. OptiDistillNet: Learning nonlinear pulse propagation using the student-teacher model.
    Gautam N; Kaushik V; Choudhary A; Lall B
    Opt Express; 2022 Nov; 30(23):42430-42439. PubMed ID: 36366697

  • 2. Knowledge distillation circumvents nonlinearity for optical convolutional neural networks.
    Xiang J; Colburn S; Majumdar A; Shlizerman E
    Appl Opt; 2022 Mar; 61(9):2173-2183. PubMed ID: 35333231

  • 3. Patient-specific uncertainty and bias quantification of non-transparent convolutional neural network model through knowledge distillation and Bayesian deep learning.
    Gong H; Yu L; Leng S; Hsieh SS; Fletcher JG; McCollough CH
    Proc SPIE Int Soc Opt Eng; 2023 Feb; 12463():. PubMed ID: 37063493

  • 4. Low-complexity full-field ultrafast nonlinear dynamics prediction by a convolutional feature separation modeling method.
    Yang H; Zhao H; Niu Z; Pu G; Xiao S; Hu W; Yi L
    Opt Express; 2022 Nov; 30(24):43691-43705. PubMed ID: 36523062

  • 5. Multi-view Teacher-Student Network.
    Tian Y; Sun S; Tang J
    Neural Netw; 2022 Feb; 146():69-84. PubMed ID: 34839092

  • 6. Efficient Combination of CNN and Transformer for Dual-Teacher Uncertainty-guided Semi-supervised Medical Image Segmentation.
    Xiao Z; Su Y; Deng Z; Zhang W
    Comput Methods Programs Biomed; 2022 Nov; 226():107099. PubMed ID: 36116398

  • 7. Knowledge Distillation and Student-Teacher Learning for Visual Intelligence: A Review and New Outlooks.
    Wang L; Yoon KJ
    IEEE Trans Pattern Anal Mach Intell; 2022 Jun; 44(6):3048-3068. PubMed ID: 33513099

  • 8. Reducing the U-Net size for practical scenarios: Virus recognition in electron microscopy images.
    Matuszewski DJ; Sintorn IM
    Comput Methods Programs Biomed; 2019 Sep; 178():31-39. PubMed ID: 31416558

  • 9. Memory-Replay Knowledge Distillation.
    Wang J; Zhang P; Li Y
    Sensors (Basel); 2021 Apr; 21(8):. PubMed ID: 33921068

  • 10. Efficient knowledge distillation for liver CT segmentation using growing assistant network.
    Xu P; Kim K; Koh J; Wu D; Rim Lee Y; Young Park S; Young Tak W; Liu H; Li Q
    Phys Med Biol; 2021 Nov; 66(23):. PubMed ID: 34768246

  • 11. Learning to Explore Distillability and Sparsability: A Joint Framework for Model Compression.
    Liu Y; Cao J; Li B; Hu W; Maybank S
    IEEE Trans Pattern Anal Mach Intell; 2023 Mar; 45(3):3378-3395. PubMed ID: 35731774

  • 12. Template-Driven Knowledge Distillation for Compact and Accurate Periocular Biometrics Deep-Learning Models.
    Boutros F; Damer N; Raja K; Kirchbuchner F; Kuijper A
    Sensors (Basel); 2022 Mar; 22(5):. PubMed ID: 35271074

  • 13. STKD: Distilling Knowledge From Synchronous Teaching for Efficient Model Compression.
    Su T; Zhang J; Yu Z; Wang G; Liu X
    IEEE Trans Neural Netw Learn Syst; 2023 Dec; 34(12):10051-10064. PubMed ID: 35420989

  • 14. Building a Compact Convolutional Neural Network for Embedded Intelligent Sensor Systems Using Group Sparsity and Knowledge Distillation.
    Cho J; Lee M
    Sensors (Basel); 2019 Oct; 19(19):. PubMed ID: 31590266

  • 15. Student becomes teacher: training faster deep learning lightweight networks for automated identification of optical coherence tomography B-scans of interest using a student-teacher framework.
    Owen JP; Blazes M; Manivannan N; Lee GC; Yu S; Durbin MK; Nair A; Singh RP; Talcott KE; Melo AG; Greenlee T; Chen ER; Conti TF; Lee CS; Lee AY
    Biomed Opt Express; 2021 Sep; 12(9):5387-5399. PubMed ID: 34692189

  • 16. Low-resolution Face Recognition in the Wild via Selective Knowledge Distillation.
    Ge S; Zhao S; Li C; Li J
    IEEE Trans Image Process; 2018 Nov; ():. PubMed ID: 30507531

  • 17. Real-Time Correlation Tracking via Joint Model Compression and Transfer.
    Wang N; Zhou W; Song Y; Ma C; Li H
    IEEE Trans Image Process; 2020 Apr; ():. PubMed ID: 32356748

  • 18. Estimation of Pedestrian Pose Orientation Using Soft Target Training Based on Teacher-Student Framework.
    Heo D; Nam JY; Ko BC
    Sensors (Basel); 2019 Mar; 19(5):. PubMed ID: 30845772

  • 19. Model Compression for Faster Structural Separation of Macromolecules Captured by Cellular Electron Cryo-Tomography.
    Guo J; Zhou B; Zeng X; Freyberg Z; Xu M
    Image Anal Recognit; 2018 Jun; 10882():144-152. PubMed ID: 31231722

  • 20. Knowledge distillation with ensembles of convolutional neural networks for medical image segmentation.
    Noothout JMH; Lessmann N; van Eede MC; van Harten LD; Sogancioglu E; Heslinga FG; Veta M; van Ginneken B; Išgum I
    J Med Imaging (Bellingham); 2022 Sep; 9(5):052407. PubMed ID: 35692896
    [No Abstract]
