Breast ultrasound lesions recognition: End-to-end deep learning approaches

Moi Hoon Yap, Manu Goyal, Fatima Osman, Robert Martí, Erika Denton, Arne Juette, Reyer Zwiggelaar

Research output: Contribution to journal › Article › peer-review

82 Citations (SciVal)
453 Downloads (Pure)


Multistage processing of automated breast ultrasound lesions recognition is dependent on the performance of prior stages. To improve the current state of the art, we propose the use of end-to-end deep learning approaches using fully convolutional networks (FCNs), namely FCN-AlexNet, FCN-32s, FCN-16s, and FCN-8s for semantic segmentation of breast lesions. We use pretrained models based on ImageNet and transfer learning to overcome the issue of data deficiency. We evaluate our results on two datasets, which consist of a total of 113 malignant and 356 benign lesions. To assess the performance, we conduct fivefold cross validation using the following split: 70% for training data, 10% for validation data, and 20% for testing data. The results showed that our proposed method performed better on benign lesions, with a top mean Dice score of 0.7626 with FCN-16s, when compared with the malignant lesions with a top mean Dice score of 0.5484 with FCN-8s. When considering the number of images with Dice score >0.5, 89.6% of the benign lesions were successfully segmented and correctly recognized, whereas 60.6% of the malignant lesions were successfully segmented and correctly recognized. We conclude the paper by addressing the future challenges of the work.
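As context for the evaluation metric reported above, the Dice score between a predicted and a ground-truth binary segmentation mask is 2|A∩B| / (|A| + |B|). The following is a minimal NumPy sketch of that computation, not the authors' implementation; the >0.5 threshold mirrors the one used in the abstract to count a lesion as correctly recognized.

```python
import numpy as np

def dice_score(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice coefficient between two binary masks: 2|A∩B| / (|A| + |B|)."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    denom = pred.sum() + truth.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(pred, truth).sum() / denom

def correctly_recognized(pred: np.ndarray, truth: np.ndarray) -> bool:
    """A segmentation counts as a correct recognition when Dice > 0.5."""
    return dice_score(pred, truth) > 0.5
```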
Original language: English
Article number: 011007
Number of pages: 8
Journal: Journal of Medical Imaging
Issue number: 1
Publication status: Published - 10 Oct 2018


  • image segmentation
  • breast
  • ultrasonography
  • data modelling
  • image classification
  • tumor growth modelling
  • RGB color model
  • breast lesions recognition
  • breast ultrasound
  • fully convolutional network
  • semantic segmentation


