PubMed for Handhelds
Title: Measuring mammographic density: comparing a fully automated volumetric assessment versus European radiologists' qualitative classification.
Author: Sartor H, Lång K, Rosso A, Borgquist S, Zackrisson S, Timberg P.
Journal: Eur Radiol; 2016 Dec; 26(12):4354-4360.
PubMed ID: 27011371.

Abstract:
OBJECTIVES: Breast Imaging-Reporting and Data System (BI-RADS) mammographic density categories are associated with considerable interobserver variability. Automated methods of measuring volumetric breast density may reduce this variability and be valuable in risk stratification and mammographic screening stratification. Our objective was to assess agreement between mammographic density measured by a volumetric method and the radiologists' classification.

METHODS: Eight thousand seven hundred and eighty-two examinations from the Malmö Breast Tomosynthesis Screening Trial were classified according to BI-RADS, 4th Edition. Volumetric breast density was assessed using automated software for 8433 examinations. Agreement between volumetric breast density and BI-RADS was analyzed descriptively. Agreement between radiologists, and between categorical volumetric density and BI-RADS, was calculated, yielding kappa values.

RESULTS: The observed agreement between BI-RADS scores of different radiologists was 80.9% [kappa 0.77 (0.76-0.79)]. A spread of volumetric breast density within each BI-RADS category was seen. The observed agreement between categorical volumetric density and BI-RADS scores was 57.1% [kappa 0.55 (0.53-0.56)].

CONCLUSIONS: There was moderate agreement between volumetric density and BI-RADS scores from European radiologists, indicating that radiologists evaluate mammographic density differently than the software does. The automated method may be a robust and valuable tool; however, the differences in interpretation between radiologists and software require further investigation.

KEY POINTS:
• Agreement between qualitative and software density measurements has not been frequently studied.
• There was substantial agreement between different radiologists' qualitative density assessments.
• There was moderate agreement between software and radiologists' density assessments.
• Differences in interpretation between software and radiologists require further investigation.
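A note on the statistic: kappa corrects the observed percent agreement for agreement expected by chance, kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and p_e is the agreement expected by chance. The abstract does not say how its kappa values were computed; as a minimal sketch, assuming unweighted Cohen's kappa over two readers' BI-RADS density categories (the readings below are illustrative, not study data), the statistic can be computed in Python with scikit-learn:

    from sklearn.metrics import cohen_kappa_score

    # Hypothetical BI-RADS density categories (1-4) assigned by two readers
    # to the same ten examinations; illustrative values only, not study data.
    reader_1 = [1, 2, 2, 3, 4, 3, 2, 1, 4, 3]
    reader_2 = [1, 2, 3, 3, 4, 3, 2, 2, 4, 3]

    # Unweighted Cohen's kappa: chance-corrected agreement between the readers.
    kappa = cohen_kappa_score(reader_1, reader_2)
    print(f"Cohen's kappa: {kappa:.2f}")

For orientation, if the reported figures came from unweighted kappa, the radiologist-to-radiologist result (observed agreement 0.809, kappa 0.77) would imply a chance agreement of p_e = (0.809 - 0.77) / (1 - 0.77) ≈ 0.17.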