268 related articles for the article with PubMed ID 32886540:

  • 1. Image-based laparoscopic tool detection and tracking using convolutional neural networks: a review of the literature.
    Yang C; Zhao Z; Hu S
    Comput Assist Surg (Abingdon); 2020 Dec; 25(1):15-28. PubMed ID: 32886540

  • 2. Real-time tracking of surgical instruments based on spatio-temporal context and deep learning.
    Zhao Z; Chen Z; Voros S; Cheng X
    Comput Assist Surg (Abingdon); 2019 Oct; 24(sup1):20-29. PubMed ID: 30760050

  • 3. [Review of research on detection and tracking of minimally invasive surgical tools based on deep learning].
    Liu Y; Zhao Z
    Sheng Wu Yi Xue Gong Cheng Xue Za Zhi; 2019 Oct; 36(5):870-878. PubMed ID: 31631638

  • 4. A CNN-based prototype method of unstructured surgical state perception and navigation for an endovascular surgery robot.
    Zhao Y; Guo S; Wang Y; Cui J; Ma Y; Zeng Y; Liu X; Jiang Y; Li Y; Shi L; Xiao N
    Med Biol Eng Comput; 2019 Sep; 57(9):1875-1887. PubMed ID: 31222531

  • 5. White blood cells detection and classification based on regional convolutional neural networks.
    Kutlu H; Avci E; Özyurt F
    Med Hypotheses; 2020 Feb; 135():109472. PubMed ID: 31760248

  • 6. Tracking-by-detection of surgical instruments in minimally invasive surgery via the convolutional neural network deep learning-based method.
    Zhao Z; Voros S; Weng Y; Chang F; Li R
    Comput Assist Surg (Abingdon); 2017 Dec; 22(sup1):26-35. PubMed ID: 28937281

  • 7. Surgical-tools detection based on Convolutional Neural Network in laparoscopic robot-assisted surgery.
    Choi B; Jo K; Choi S; Choi J
    Annu Int Conf IEEE Eng Med Biol Soc; 2017 Jul; 2017():1756-1759. PubMed ID: 29060227

  • 8. A deep learning framework for automatic detection of arbitrarily shaped fiducial markers in intrafraction fluoroscopic images.
    Mylonas A; Keall PJ; Booth JT; Shieh CC; Eade T; Poulsen PR; Nguyen DT
    Med Phys; 2019 May; 46(5):2286-2297. PubMed ID: 30929254

  • 9. Convolutional neural network-based surgical instrument detection.
    Cai T; Zhao Z
    Technol Health Care; 2020; 28(S1):81-88. PubMed ID: 32333566

  • 10. Articulated Multi-Instrument 2-D Pose Estimation Using Fully Convolutional Networks.
    Du X; Kurmann T; Chang PL; Allan M; Ourselin S; Sznitman R; Kelly JD; Stoyanov D
    IEEE Trans Med Imaging; 2018 May; 37(5):1276-1287. PubMed ID: 29727290

  • 11. Three-dimensional posture estimation of robot forceps using endoscope with convolutional neural network.
    Mikada T; Kanno T; Kawase T; Miyazaki T; Kawashima K
    Int J Med Robot; 2020 Apr; 16(2):e2062. PubMed ID: 31913577

  • 12. Facial Expressions Recognition for Human-Robot Interaction Using Deep Convolutional Neural Networks with Rectified Adam Optimizer.
    Melinte DO; Vladareanu L
    Sensors (Basel); 2020 Apr; 20(8):. PubMed ID: 32340140

  • 13. Combined 2D and 3D tracking of surgical instruments for minimally invasive and robotic-assisted surgery.
    Du X; Allan M; Dore A; Ourselin S; Hawkes D; Kelly JD; Stoyanov D
    Int J Comput Assist Radiol Surg; 2016 Jun; 11(6):1109-19. PubMed ID: 27038963

  • 14. Comparison of Graph Fitting and Sparse Deep Learning Model for Robot Pose Estimation.
    Rodziewicz-Bielewicz J; Korzeń M
    Sensors (Basel); 2022 Aug; 22(17):. PubMed ID: 36080976

  • 15. Long Term Safety Area Tracking (LT-SAT) with online failure detection and recovery for robotic minimally invasive surgery.
    Penza V; Du X; Stoyanov D; Forgione A; Mattos LS; De Momi E
    Med Image Anal; 2018 Apr; 45():13-23. PubMed ID: 29329053

  • 16. Detection, segmentation, and 3D pose estimation of surgical tools using convolutional neural networks and algebraic geometry.
    Hasan MK; Calvet L; Rabbani N; Bartoli A
    Med Image Anal; 2021 May; 70():101994. PubMed ID: 33611053

  • 17. Automatically Designing CNN Architectures Using the Genetic Algorithm for Image Classification.
    Sun Y; Xue B; Zhang M; Yen GG; Lv J
    IEEE Trans Cybern; 2020 Sep; 50(9):3840-3854. PubMed ID: 32324588

  • 18. Vision-based and marker-less surgical tool detection and tracking: a review of the literature.
    Bouget D; Allan M; Stoyanov D; Jannin P
    Med Image Anal; 2017 Jan; 35():633-654. PubMed ID: 27744253

  • 19. Patch-based adaptive weighting with segmentation and scale (PAWSS) for visual tracking in surgical video.
    Du X; Allan M; Bodenstedt S; Maier-Hein L; Speidel S; Dore A; Stoyanov D
    Med Image Anal; 2019 Oct; 57():120-135. PubMed ID: 31299494

  • 20. Application and evaluation of surgical tool and tool tip recognition based on Convolutional Neural Network in multiple endoscopic surgical scenarios.
    Ping L; Wang Z; Yao J; Gao J; Yang S; Li J; Shi J; Wu W; Hua S; Wang H
    Surg Endosc; 2023 Sep; 37(9):7376-7384. PubMed ID: 37580576
