Reinforced Auto-Zoom Net: Towards Accurate and Fast Breast Cancer Segmentation in Whole-Slide Images
Permanent link
https://hdl.handle.net/10037/14733
Date
2018-09-20
Type
Journal article
Peer reviewed
Author
Dong, Nanqing; Kampffmeyer, Michael C.; Liang, Xiaodan; Wang, Zeya; Dai, Wei; Xing, Eric P.
Abstract
Convolutional neural networks have led to significant breakthroughs in the domain of medical image analysis. However, the task of breast cancer segmentation in whole-slide images (WSIs) is still underexplored. WSIs are large histopathological images with extremely high resolution. Constrained by the hardware and field of view, using high-magnification patches can slow down the inference process, while using low-magnification patches can cause a loss of information. In this paper, we aim to achieve two seemingly conflicting goals for breast cancer segmentation: accurate and fast prediction. We propose a simple yet efficient framework, Reinforced Auto-Zoom Net (RAZN), to tackle this task. Motivated by the zoom-in operation of a pathologist using a digital microscope, RAZN learns a policy network to decide whether zooming is required in a given region of interest. Because the zoom-in action is selective, RAZN is robust to unbalanced and noisy ground truth labels and can efficiently reduce overfitting. We evaluate our method on a public breast cancer dataset. RAZN outperforms both single-scale and multi-scale baseline approaches, achieving better accuracy at low inference cost.
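The selective zoom-in mechanism the abstract describes can be sketched as a simple inference loop: a policy decides, per low-magnification patch, whether the segmentation should be redone at high magnification. The sketch below is a minimal illustration under assumed interfaces; `zoom_policy` and `segment` are hypothetical stand-ins (crude heuristics), not the paper's learned policy and segmentation networks.

```python
import numpy as np

def zoom_policy(patch, threshold=0.5):
    # Hypothetical stand-in for RAZN's learned policy network: here a
    # simple intensity-variance heuristic decides whether to zoom in.
    return bool(np.var(patch) > threshold)

def segment(patch, magnification):
    # Hypothetical stand-in for a segmentation network; the magnification
    # argument is a placeholder for selecting the appropriate model/scale.
    return (patch > patch.mean()).astype(np.uint8)

def razn_inference(wsi, patch_size=4):
    """Sketch of the selective zoom-in loop: segment each low-magnification
    patch, and only re-segment at high magnification when the policy asks."""
    h, w = wsi.shape
    mask = np.zeros_like(wsi, dtype=np.uint8)
    zoom_count = 0
    for i in range(0, h, patch_size):
        for j in range(0, w, patch_size):
            patch = wsi[i:i + patch_size, j:j + patch_size]
            if zoom_policy(patch):
                # Zoom-in action: in RAZN this would fetch the same region
                # at higher magnification before segmenting.
                zoom_count += 1
                mask[i:i + patch_size, j:j + patch_size] = segment(patch, "high")
            else:
                mask[i:i + patch_size, j:j + patch_size] = segment(patch, "low")
    return mask, zoom_count
```

Because zooming is invoked only where the policy requests it, the expensive high-magnification path runs on a subset of patches, which is how the framework trades little accuracy for a large reduction in inference cost.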
Description
Accepted manuscript version. Published version available at https://doi.org/10.1007/978-3-030-00889-5_36.