TY - JOUR
T1 - Breast ultrasound lesions recognition
T2 - End-to-end deep learning approaches
AU - Yap, Moi Hoon
AU - Goyal, Manu
AU - Osman, Fatima
AU - Martí, Robert
AU - Denton, Erika
AU - Juette, Arne
AU - Zwiggelaar, Reyer
N1 - Publisher Copyright:
© 2018 Society of Photo-Optical Instrumentation Engineers (SPIE).
PY - 2018/10/10
Y1 - 2018/10/10
N2 - Multistage processing of automated breast ultrasound lesions recognition is dependent on the performance of prior stages. To improve the current state of the art, we propose the use of end-to-end deep learning approaches using fully convolutional networks (FCNs), namely FCN-AlexNet, FCN-32s, FCN-16s, and FCN-8s for semantic segmentation of breast lesions. We use pretrained models based on ImageNet and transfer learning to overcome the issue of data deficiency. We evaluate our results on two datasets, which consist of a total of 113 malignant and 356 benign lesions. To assess the performance, we conduct fivefold cross validation using the following split: 70% for training data, 10% for validation data, and 20% testing data. The results showed that our proposed method performed better on benign lesions, with a top “mean Dice” score of 0.7626 with FCN-16s, when compared with the malignant lesions with a top mean Dice score of 0.5484 with FCN-8s. When considering the number of images with Dice score >0.5, 89.6% of the benign lesions were successfully segmented and correctly recognized, whereas 60.6% of the malignant lesions were successfully segmented and correctly recognized. We conclude the paper by addressing the future challenges of the work.
AB - Multistage processing of automated breast ultrasound lesions recognition is dependent on the performance of prior stages. To improve the current state of the art, we propose the use of end-to-end deep learning approaches using fully convolutional networks (FCNs), namely FCN-AlexNet, FCN-32s, FCN-16s, and FCN-8s for semantic segmentation of breast lesions. We use pretrained models based on ImageNet and transfer learning to overcome the issue of data deficiency. We evaluate our results on two datasets, which consist of a total of 113 malignant and 356 benign lesions. To assess the performance, we conduct fivefold cross validation using the following split: 70% for training data, 10% for validation data, and 20% testing data. The results showed that our proposed method performed better on benign lesions, with a top “mean Dice” score of 0.7626 with FCN-16s, when compared with the malignant lesions with a top mean Dice score of 0.5484 with FCN-8s. When considering the number of images with Dice score >0.5, 89.6% of the benign lesions were successfully segmented and correctly recognized, whereas 60.6% of the malignant lesions were successfully segmented and correctly recognized. We conclude the paper by addressing the future challenges of the work.
KW - image segmentation
KW - breast
KW - ultrasonography
KW - data modelling
KW - image classification
KW - tumor growth modelling
KW - RGB color model
KW - breast lesions recognition
KW - breast ultrasound
KW - fully convolutional network
KW - semantic segmentation
UR - http://www.scopus.com/inward/record.url?scp=85062967754&partnerID=8YFLogxK
U2 - 10.1117/1.JMI.6.1.011007
DO - 10.1117/1.JMI.6.1.011007
M3 - Article
SN - 2329-4310
VL - 6
JO - Journal of Medical Imaging
JF - Journal of Medical Imaging
IS - 1
M1 - 011007
ER -