Cited 0 times in Scopus

Automatic segmentation of head anatomical structures from sparsely-annotated images

Authors
Sugino, T | Roth, HR | Eshghi, M | Oda, M | Chung, MS  | Mori, K
Citation
2017 IEEE International Conference on Cyborg and Bionic Systems (CBS): 145-149, 2018
Journal Title
2017 IEEE International Conference on Cyborg and Bionic Systems (CBS)
Abstract
Bionic humanoid systems, which are elaborate human models equipped with sensors, have been developed as tools for quantitative evaluation of doctors' psychomotor skills and of medical device performance. To create such elaborate human models, this study presents automated segmentation of head sectioned images from sparsely-annotated data based on deep convolutional neural networks. We applied the following fully convolutional networks (FCNs) to sparse-annotation-based segmentation: a standard FCN and a dilated-convolution-based FCN. To validate the applicability of FCNs to segmentation of head structures from sparse annotation, we performed 8- and 243-label segmentation experiments using two different sets of head sectioned images from the Visible Korean Human project. In these experiments, only 10% of the images in each data set were used as training data. Both FCNs achieved mean segmentation accuracies above 85% in the 8-label segmentation. In the 243-label segmentation, although the mean segmentation accuracy was about 50%, the results suggested that the FCNs, especially the dilated-convolution-based FCN, have the potential to achieve accurate segmentation of anatomical structures, except for small and complex-shaped tissues, even from sparse annotation.
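The dilated convolutions mentioned in the abstract enlarge a filter's receptive field without adding parameters, by sampling the input at spaced-out taps. The sketch below is a minimal pure-Python illustration of that idea (it is not the authors' implementation, and the function name and "valid"-padding choice are assumptions for the example):

```python
def dilated_conv2d(image, kernel, dilation=1):
    """Valid (no-padding) 2D convolution where kernel taps are
    spaced `dilation` pixels apart; dilation=1 is ordinary convolution.
    `image` and `kernel` are lists of lists of floats."""
    kh, kw = len(kernel), len(kernel[0])
    # Effective receptive field grows with dilation, parameter count does not.
    eff_h = (kh - 1) * dilation + 1
    eff_w = (kw - 1) * dilation + 1
    h, w = len(image), len(image[0])
    out = []
    for i in range(h - eff_h + 1):
        row = []
        for j in range(w - eff_w + 1):
            acc = 0.0
            for u in range(kh):
                for v in range(kw):
                    acc += kernel[u][v] * image[i + u * dilation][j + v * dilation]
            row.append(acc)
        out.append(row)
    return out


# A 3x3 kernel with dilation=2 covers a 5x5 region, so on a 5x5 input
# it produces a single output value, while dilation=1 yields a 3x3 map.
img = [[1.0] * 5 for _ in range(5)]
ker = [[1.0] * 3 for _ in range(3)]
print(dilated_conv2d(img, ker, dilation=2))  # [[9.0]]
```

In segmentation networks such as the dilated FCN used here, this spacing lets deep layers aggregate wide spatial context without the resolution loss caused by repeated pooling.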
DOI
10.1109/CBS.2017.8266085
Appears in Collections:
Journal Papers > School of Medicine / Graduate School of Medicine > Anatomy
Ajou Authors
정, 민석
Files in This Item:
There are no files associated with this item.

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
