
3D End-to-End Boundary-Aware Networks for Pancreas Segmentation


Abstract:

Accurate pancreas segmentation is crucial for computer-aided pancreas diagnosis and surgery. Precisely extracting the pancreas from CT images remains challenging because of its small size, unclear boundary, and shape variations. This work proposes a new 3D end-to-end boundary-aware network architecture for automatic, accurate pancreas segmentation from CT images. Specifically, the architecture introduces four hybrid blocks for feature extraction within a 3D fully convolutional network, allowing it to capture both spatial and contextual information from 3D CT data. In addition, a reverse attention block and a boundary enhancement block are embedded in the architecture to strengthen its ability to learn feature maps that carry richer context and boundary information. We evaluate the proposed method on publicly available pancreas data using 4-fold cross-validation; the experimental results show that our network achieves segmentation accuracy better than or comparable to existing methods.
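This excerpt does not include the authors' code, but the reverse attention idea the abstract builds on (Chen et al. [16]) can be sketched for 3D feature maps. The snippet below is a minimal, illustrative PyTorch sketch, not the paper's implementation: the module name ReverseAttention3D, the channel sizes, the tensor shapes, and the residual refinement of the coarse logits are assumptions about how such a block is commonly wired.

```python
# Minimal sketch (assumed design, not the authors' code) of a 3D reverse
# attention block in the spirit of Chen et al. [16], applied to volumetric
# CT feature maps.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ReverseAttention3D(nn.Module):
    """Re-weight features by 1 - sigmoid(coarse prediction) so the block
    focuses on uncertain (typically boundary) regions of the pancreas."""

    def __init__(self, in_channels, mid_channels):
        super().__init__()
        self.conv = nn.Conv3d(in_channels, mid_channels, kernel_size=3, padding=1)
        self.pred = nn.Conv3d(mid_channels, 1, kernel_size=1)  # 1-channel logits

    def forward(self, feat, coarse_logits):
        # Upsample the coarse prediction from a deeper stage to this resolution.
        coarse = F.interpolate(coarse_logits, size=feat.shape[2:],
                               mode='trilinear', align_corners=False)
        # Reverse attention: suppress voxels that are already confidently foreground.
        rev = 1.0 - torch.sigmoid(coarse)
        refined = F.relu(self.conv(feat * rev))
        # Residual refinement of the coarse logits.
        return self.pred(refined) + coarse


# Example with hypothetical shapes: (N, C, D, H, W) features and deeper-stage logits.
feat = torch.randn(1, 32, 16, 64, 64)
coarse = torch.randn(1, 1, 8, 32, 32)
out = ReverseAttention3D(32, 16)(feat, coarse)
print(out.shape)  # torch.Size([1, 1, 16, 64, 64])
```

The `1 - sigmoid(coarse)` term down-weights voxels the deeper stage already predicts confidently, steering the refinement convolution toward the ambiguous boundary region, which is the behaviour the boundary-aware design aims for.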
Date of Conference: 16-19 October 2022
Date Added to IEEE Xplore: 18 October 2022
Conference Location: Bordeaux, France


1. INTRODUCTION

Pancreatic cancer is a highly fatal malignancy and the second leading cause of cancer death worldwide [1]. Many pancreatic cancer cases are diagnosed only at an advanced stage, resulting in a low survival rate [2]. As the incidence of pancreatic cancer continues to rise worldwide, early diagnosis and treatment are essential to reducing its mortality.

REFERENCES

[1] Rebecca L. Siegel, Kimberly D. Miller, Hannah E. Fuchs and Ahmedin Jemal, "Cancer statistics, 2022", CA: A Cancer Journal for Clinicians, vol. 72, no. 1, pp. 7-33, 2022.
[2] Terumi Kamisawa, Laura D. Wood, Takao Itoi and Kyoichi Takaori, "Pancreatic cancer", The Lancet, vol. 388, no. 10039, pp. 73-85, 2016.
[3] Ken’ichi Karasawa, Masahiro Oda, Takayuki Kitasaka, Kazunari Misawa, Michitaka Fujiwara, Chengwen Chu, et al., "Multi-atlas pancreas segmentation: atlas selection based on vessel structure", Medical Image Analysis, vol. 39, pp. 18-28, 2017.
[4] Ian Goodfellow, Yoshua Bengio and Aaron Courville, Deep Learning, MIT Press, 2016.
[5] Geert Litjens, Thijs Kooi, Babak Ehteshami Bejnordi, Arnaud Arindra Adiyoso Setio, Francesco Ciompi, Mohsen Ghafoorian, et al., "A survey on deep learning in medical image analysis", Medical Image Analysis, vol. 42, pp. 60-88, 2017.
[6] Olaf Ronneberger, Philipp Fischer and Thomas Brox, "U-Net: Convolutional networks for biomedical image segmentation", International Conference on Medical Image Computing and Computer-Assisted Intervention, pp. 234-241, 2015.
[7] Özgün Çiçek, Ahmed Abdulkadir, Soeren S. Lienkamp, Thomas Brox and Olaf Ronneberger, "3D U-Net: Learning dense volumetric segmentation from sparse annotation", International Conference on Medical Image Computing and Computer-Assisted Intervention, pp. 424-432, 2016.
[8] Vijay Badrinarayanan, Alex Kendall and Roberto Cipolla, "SegNet: A deep convolutional encoder-decoder architecture for image segmentation", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 39, no. 12, pp. 2481-2495, 2017.
[9] Jinzheng Cai, Le Lu, Yuanpu Xie, Fuyong Xing and Lin Yang, "Improving deep pancreas segmentation in CT and MRI images via recurrent neural contextual learning and direct loss function", 2017.
[10] Yuyin Zhou, Lingxi Xie, Wei Shen, Yan Wang, Elliot K. Fishman and Alan L. Yuille, "A fixed-point model for pancreas segmentation in abdominal CT scans", International Conference on Medical Image Computing and Computer-Assisted Intervention, pp. 693-701, 2017.
[11] Holger R. Roth, Le Lu, Nathan Lay, Adam P. Harrison, Amal Farag, Andrew Sohn, et al., "Spatial aggregation of holistically-nested convolutional neural networks for automated pancreas localization and segmentation", Medical Image Analysis, vol. 45, pp. 94-107, 2018.
[12] Qihang Yu, Lingxi Xie, Yan Wang, Yuyin Zhou, Elliot K. Fishman and Alan L. Yuille, "Recurrent saliency transformation network: Incorporating multi-stage visual cues for small organ segmentation", Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 8280-8289, 2018.
[13] Xu Yao, Yuqing Song and Zhe Liu, "Advances on pancreas segmentation: a review", Multimedia Tools and Applications, vol. 79, no. 9, pp. 6799-6821, 2020.
[14] Michal Drozdzal, Eugene Vorontsov, Gabriel Chartrand, Samuel Kadoury and Chris Pal, "The importance of skip connections in biomedical image segmentation", 2016.
[15] Kaiming He, Xiangyu Zhang, Shaoqing Ren and Jian Sun, "Deep residual learning for image recognition", Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 770-778, 2016.
[16] Shuhan Chen, Xiuli Tan, Ben Wang and Xuelong Hu, "Reverse attention for salient object detection", Proceedings of the European Conference on Computer Vision (ECCV), pp. 234-250, 2018.
[17] Mateusz Buda, Atsuto Maki and Maciej A. Mazurowski, "A systematic study of the class imbalance problem in convolutional neural networks", Neural Networks, vol. 106, pp. 249-259, 2018.
[18] Holger R. Roth, Le Lu, Amal Farag, Hoo-Chang Shin, Jiamin Liu, Evrim B. Turkbey, et al., "DeepOrgan: Multi-level deep convolutional networks for automated pancreas segmentation", International Conference on Medical Image Computing and Computer-Assisted Intervention, pp. 556-564, 2015.
[19] Jonathan Long, Evan Shelhamer and Trevor Darrell, "Fully convolutional networks for semantic segmentation", Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 3431-3440, 2015.