BCT Boost Segmentation with U-net in TensorFlow


Grzegorz Wieczorek
Izabella Antoniuk
Michał Kruk
Jarosław Kurek
Arkadiusz Orłowski
Jakub Pach
Bartosz Świderski


Keywords: breast cancer, breast conserving therapy, image segmentation, U-net, Keras, TensorFlow
Abstract
In this paper we present a new segmentation method for the boost area that remains after removing a tumour with breast conserving therapy (BCT). The selected area is the region that will later receive radiation treatment, so an inaccurate designation of this region can result in treatment missing its target or irradiating healthy breast tissue that could otherwise be spared. Exact indication of the boost area is therefore an essential part of the entire medical procedure: a better definition optimizes the coverage of the target volume and, as a result, spares normal breast tissue. A precise definition of this area has the potential both to improve local control of the disease and to ensure a better cosmetic outcome for the patient. In our approach we use U-net, implemented with Keras and TensorFlow, to provide a precise solution for indicating the boost area. During training we use a set of CT images, each with a contour assigned by an expert, and aim for a segmentation result as close to the given contour as possible. Since the initial data set was rather small, we applied data augmentation techniques to increase the number of training examples, and the final outcomes were evaluated for their similarity to the expert contours by calculating the mean square error (MSE) and the structural similarity index (SSIM).
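As a rough illustration of the approach described in the abstract, the sketch below builds a small U-net-style encoder-decoder in tf.keras and compares a predicted mask with an expert contour using MSE and SSIM. The network depth, filter counts, the 256x256 single-channel input, the loss function and the helper names (conv_block, build_unet, evaluate) are assumptions made for this example only; the paper does not publish its exact configuration here.

# Minimal sketch of a U-net-style segmentation model in tf.keras.
# All sizes and hyperparameters below are illustrative assumptions,
# not the authors' exact setup.
import tensorflow as tf
from tensorflow.keras import layers, Model

def conv_block(x, filters):
    # Two 3x3 convolutions with ReLU, as in the original U-net design.
    x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    return x

def build_unet(input_shape=(256, 256, 1)):
    inputs = layers.Input(shape=input_shape)

    # Contracting path
    c1 = conv_block(inputs, 32)
    p1 = layers.MaxPooling2D(2)(c1)
    c2 = conv_block(p1, 64)
    p2 = layers.MaxPooling2D(2)(c2)

    # Bottleneck
    b = conv_block(p2, 128)

    # Expanding path with skip connections to the contracting path
    u2 = layers.Conv2DTranspose(64, 2, strides=2, padding="same")(b)
    c3 = conv_block(layers.concatenate([u2, c2]), 64)
    u1 = layers.Conv2DTranspose(32, 2, strides=2, padding="same")(c3)
    c4 = conv_block(layers.concatenate([u1, c1]), 32)

    # Single-channel sigmoid output: per-pixel probability of the boost area
    outputs = layers.Conv2D(1, 1, activation="sigmoid")(c4)
    return Model(inputs, outputs)

model = build_unet()
model.compile(optimizer="adam", loss="binary_crossentropy")

# Evaluation in the spirit of the paper: MSE and SSIM between the
# predicted mask and the expert contour, both scaled to [0, 1].
def evaluate(y_true, y_pred):
    mse = tf.reduce_mean(tf.square(y_true - y_pred))
    ssim = tf.reduce_mean(tf.image.ssim(y_true, y_pred, max_val=1.0))
    return mse, ssim

For the data augmentation mentioned in the abstract, a typical choice in Keras at the time would be tf.keras.preprocessing.image.ImageDataGenerator with small rotations, shifts and flips applied identically to images and masks; the exact transforms used by the authors are not specified on this page.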

Article Details

How to Cite
Wieczorek, G., Antoniuk, I., Kruk, M., Kurek, J., Orłowski, A., Pach, J., & Świderski, B. (2019). BCT Boost Segmentation with U-net in TensorFlow. Machine Graphics and Vision, 28(1/4), 25–34. https://doi.org/10.22630/MGV.2019.28.1.3
References

M. Kaufmann, G. von Minckwitz, J. Bergh, P. F. Conte, S. Darby, et al. Breakthroughs in research and treatment of early breast cancer: an overview of the last three decades. Archives of Gynecology and Obstetrics, 288(6), 1203-1212, 2013. (Crossref)

O. Ronneberger, P. Fischer, T. Brox. U-net: Convolutional networks for biomedical image segmentation. In: International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI), pp. 234-241. Springer, Cham, 2015. (Crossref)

C. J. Hansen, E. de Winton, S. Guglani, E. Vamvakas, D. Willis, B. H. Chua. Target localisation for tumour bed radiotherapy in early breast cancer. Journal of Medical Imaging and Radiation Oncology, 56(4), 452-457, 2012. (Crossref)

Z. Wang, A. C. Bovik, H. R. Sheikh, E. P. Simoncelli. Image quality assessment: from error visibility to structural similarity. IEEE Transactions on Image Processing, 13(4), 600-612, 2004. (Crossref)

R. Girshick, J. Donahue, T. Darrell, J. Malik. Rich feature hierarchies for accurate object detection and semantic segmentation. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2014. (Crossref)

A. Krizhevsky, I. Sutskever, G. E. Hinton. ImageNet classification with deep convolutional neural networks. In: Advances in Neural Information Processing Systems (NIPS), pp. 1106-1114, 2012.

J. Kurek, G. Wieczorek, M. Kruk, A. Jegorowa, S. Osowski. Transfer learning in recognition of drill wear using convolutional neural network. In: 18th International Conference on Computational Problems of Electrical Engineering (CPEE), pp. 1-4. IEEE, September 2017. (Crossref)

Keras deep learning library for Python. Online: https://keras.io/

TensorFlow machine learning platform. Online: https://www.tensorflow.org/

Python main web page. Online: https://www.python.org/
