Pubmed for Handhelds
Title: Machine Segmentation of Pelvic Anatomy in MRI-Assisted Radiosurgery (MARS) for Prostate Cancer Brachytherapy.
Authors: Sanders JW, Lewis GD, Thames HD, Kudchadker RJ, Venkatesan AM, Bruno TL, Ma J, Pagel MD, Frank SJ.
Journal: Int J Radiat Oncol Biol Phys; 2020 Dec 01; 108(5):1292-1303.
PubMed ID: 32634543.

Abstract:

PURPOSE: To investigate machine segmentation of pelvic anatomy in magnetic resonance imaging (MRI)-assisted radiosurgery (MARS) for prostate cancer using prostate brachytherapy MRIs acquired with different pulse sequences and image contrasts.

METHODS AND MATERIALS: Two hundred 3-dimensional (3D) preimplant and postimplant prostate brachytherapy MRI scans were acquired with a T2-weighted sequence, a T2/T1-weighted sequence, or a T1-weighted sequence. One hundred twenty deep machine learning models were trained to segment the prostate, seminal vesicles, external urinary sphincter, rectum, and bladder using the MRI scans acquired with T2-weighted and T2/T1-weighted image contrast. The deep machine learning models consisted of 18 fully convolutional networks (FCNs) with different convolutional encoders. Both 2-dimensional and 3D U-Net FCNs were constructed for comparison. Six objective functions were investigated: cross-entropy, Jaccard distance, focal loss, and 3 variations of Tversky distance. The performance of the models was compared using similarity metrics, including pixel accuracy, Jaccard index, Dice similarity coefficient (DSC), 95% Hausdorff distance, relative volume difference, Matthews correlation coefficient, precision, recall, and average symmetrical surface distance. We selected the highest-performing architecture and investigated how the amount of training data, use of skip connections, and data augmentation affected segmentation performance. In addition, we investigated whether segmentation on T1-weighted MRI was possible with FCNs trained on only T2-weighted and T2/T1-weighted image contrast.

RESULTS: Overall, an FCN with a DenseNet201 encoder trained via cross-entropy minimization yielded the highest combined segmentation performance. For the 53 3D test MRI scans acquired with T2-weighted or T2/T1-weighted image contrast, the DSCs of the prostate, external urinary sphincter, seminal vesicles, rectum, and bladder were 0.90 ± 0.04, 0.70 ± 0.15, 0.80 ± 0.12, 0.91 ± 0.06, and 0.96 ± 0.04, respectively, after model fine-tuning. For the 5 T1-weighted images, the DSCs of these organs were 0.82 ± 0.07, 0.17 ± 0.15, 0.46 ± 0.21, 0.87 ± 0.06, and 0.88 ± 0.05, respectively.

CONCLUSIONS: Machine segmentation of the prostate and surrounding anatomy on 3D MRIs acquired with different pulse sequences for MARS low-dose-rate prostate brachytherapy is possible with a single FCN.
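
Note: The abstract compares objective functions (cross-entropy, Jaccard distance, focal loss, and Tversky distance variants) and reports segmentation quality as Dice similarity coefficients (DSC). The short Python/NumPy sketch below illustrates how a soft Tversky index and its loss can be computed for a predicted probability map against a binary mask; it is not the authors' implementation, and the function names, alpha/beta weights, and toy data are illustrative assumptions only. With alpha = beta = 0.5 the index reduces to the DSC, and with alpha = beta = 1.0 it reduces to the Jaccard index.

    import numpy as np

    def tversky_index(pred, target, alpha=0.5, beta=0.5, eps=1e-7):
        """Soft Tversky index between a predicted probability map and a binary mask.

        alpha weights false positives, beta weights false negatives.
        alpha = beta = 0.5 gives the Dice similarity coefficient (DSC);
        alpha = beta = 1.0 gives the Jaccard index.
        """
        pred = pred.ravel().astype(np.float64)
        target = target.ravel().astype(np.float64)
        tp = np.sum(pred * target)            # soft true positives
        fp = np.sum(pred * (1.0 - target))    # soft false positives
        fn = np.sum((1.0 - pred) * target)    # soft false negatives
        return (tp + eps) / (tp + alpha * fp + beta * fn + eps)

    def tversky_loss(pred, target, alpha=0.5, beta=0.5):
        """Loss = 1 - Tversky index, minimized during training."""
        return 1.0 - tversky_index(pred, target, alpha=alpha, beta=beta)

    # Toy example (hypothetical data): a 4x4 probability map vs. a binary mask.
    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        target = (rng.random((4, 4)) > 0.5).astype(np.float64)
        pred = np.clip(target + rng.normal(0.0, 0.2, (4, 4)), 0.0, 1.0)
        print("DSC (alpha=beta=0.5):", tversky_index(pred, target))
        print("Tversky loss:        ", tversky_loss(pred, target, alpha=0.7, beta=0.3))

In practice such a loss would be applied per organ and averaged over the 3D volume during FCN training, but those details are specific to the paper and are not reproduced here.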