281 related articles for article (PubMed ID: 32577985)
1. Object extraction via deep learning-based marker-free tracking framework of surgical instruments for laparoscope-holder robots. Zhang J; Gao X. Int J Comput Assist Radiol Surg; 2020 Aug; 15(8):1335-1345. PubMed ID: 32577985.
2. Development and validation of a deep-learning based assistance system for enhancing laparoscopic control level. Zheng Q; Yang R; Yang S; Ni X; Li Y; Jiang Z; Wang X; Wang L; Chen Z; Liu X. Int J Med Robot; 2023 Feb; 19(1):e2449. PubMed ID: 35922092.
3. Tracking-by-detection of surgical instruments in minimally invasive surgery via the convolutional neural network deep learning-based method. Zhao Z; Voros S; Weng Y; Chang F; Li R. Comput Assist Surg (Abingdon); 2017 Dec; 22(sup1):26-35. PubMed ID: 28937281.
4. U-NetPlus: A Modified Encoder-Decoder U-Net Architecture for Semantic and Instance Segmentation of Surgical Instruments from Laparoscopic Images. Kamrul Hasan SM; Linte CA. Annu Int Conf IEEE Eng Med Biol Soc; 2019 Jul; 2019:7205-7211. PubMed ID: 31947497.
5. The role of insertion points in the detection and positioning of instruments in laparoscopy for robotic tasks. Doignon C; Nageotte F; de Mathelin M. Med Image Comput Comput Assist Interv; 2006; 9(Pt 1):527-34. PubMed ID: 17354931.
6. Development and Validation of a Model for Laparoscopic Colorectal Surgical Instrument Recognition Using Convolutional Neural Network-Based Instance Segmentation and Videos of Laparoscopic Procedures. Kitaguchi D; Lee Y; Hayashi K; Nakajima K; Kojima S; Hasegawa H; Takeshita N; Mori K; Ito M. JAMA Netw Open; 2022 Aug; 5(8):e2226265. PubMed ID: 35984660.
7. Image recognition of triangular tissue of an organ pulled by forceps in surgical working area for laparoscope robot. Nakasuji H; Naruki K; Kawai T; Nishikawa A; Nishizawa Y; Nakamura T. Annu Int Conf IEEE Eng Med Biol Soc; 2017 Jul; 2017:3708-3711. PubMed ID: 29060704.
8. A Kalman-Filter-Based Common Algorithm Approach for Object Detection in Surgery Scene to Assist Surgeon's Situation Awareness in Robot-Assisted Laparoscopic Surgery. Ryu J; Moon Y; Choi J; Kim HC. J Healthc Eng; 2018; 2018:8079713. PubMed ID: 29854366.
9. Rate of skill acquisition in the use of a robotic laparoscope holder (FreeHand(®)). Sbaih M; Arulampalam TH; Motson RW. Minim Invasive Ther Allied Technol; 2016 Aug; 25(4):196-202. PubMed ID: 27270102.
10. Instrumentation for laparoscopic renal surgery--Padron Endoscopic Exposing Retractor (PEER) and Endoholder: point of technique. Rehman J; Sundaram CP; Khan SA; Venkatesh R; Waltzer WC. Surg Laparosc Endosc Percutan Tech; 2005 Feb; 15(1):18-21. PubMed ID: 15714150.
11. A deep learning framework for automatic detection of arbitrarily shaped fiducial markers in intrafraction fluoroscopic images. Mylonas A; Keall PJ; Booth JT; Shieh CC; Eade T; Poulsen PR; Nguyen DT. Med Phys; 2019 May; 46(5):2286-2297. PubMed ID: 30929254.
12. Electromagnetic tracking in image-guided laparoscopic surgery: Comparison with optical tracking and feasibility study of a combined laparoscope and laparoscopic ultrasound system. Xiao G; Bonmati E; Thompson S; Evans J; Hipwell J; Nikitichev D; Gurusamy K; Ourselin S; Hawkes DJ; Davidson B; Clarkson MJ. Med Phys; 2018 Nov; 45(11):5094-5104. PubMed ID: 30247765.
13. An improved camera model for oblique-viewing laparoscopes: high reprojection accuracy independent of telescope rotation. Eppenga R; Snaauw G; Kuhlmann K; van der Heijden F; Ruers T; Nijkamp J. Phys Med Biol; 2023 Sep; 68(18). PubMed ID: 37582390.
14. Real-time 3D visual tracking of laparoscopic instruments for robotized endoscope holder. Zhao Z. Biomed Mater Eng; 2014; 24(6):2665-72. PubMed ID: 25226970.
15. Development of a novel intelligent laparoscope system for semi-automatic minimally invasive surgery. Sun Y; Pan B; Fu Y; Cao F. Int J Med Robot; 2020 Feb; 16(1):e2049. PubMed ID: 31677231.
16. ST-ITEF: Spatio-Temporal Intraoperative Task Estimating Framework to recognize surgical phase and predict instrument path based on multi-object tracking in keratoplasty. Feng X; Zhang X; Shi X; Li L; Wang S. Med Image Anal; 2024 Jan; 91:103026. PubMed ID: 37976868.
17. Articulated Multi-Instrument 2-D Pose Estimation Using Fully Convolutional Networks. Du X; Kurmann T; Chang PL; Allan M; Ourselin S; Sznitman R; Kelly JD; Stoyanov D. IEEE Trans Med Imaging; 2018 May; 37(5):1276-1287. PubMed ID: 29727290.
18. Automatic localization of laparoscopic instruments for the visual servoing of an endoscopic camera holder. Voros S; Long JA; Cinquin P. Med Image Comput Comput Assist Interv; 2006; 9(Pt 1):535-42. PubMed ID: 17354932.
19. Unpaired deep adversarial learning for multi-class segmentation of instruments in robot-assisted surgical videos. Nema S; Vachhani L. Int J Med Robot; 2023 Aug; 19(4):e2514. PubMed ID: 36987579.
20. Stereo Dense Scene Reconstruction and Accurate Localization for Learning-Based Navigation of Laparoscope in Minimally Invasive Surgery. Wei R; Li B; Mo H; Lu B; Long Y; Yang B; Dou Q; Liu Y; Sun D. IEEE Trans Biomed Eng; 2023 Feb; 70(2):488-500. PubMed ID: 35905063.