Multiorientation scene text detection via coarse-to-fine supervision-based convolutional networks

Bibliographic Details
Published in: Journal of Electronic Imaging, 2018-05, Vol. 27 (3), p. 033032
Main Authors: Wang, Xihan; Xia, Zhaoqiang; Peng, Jinye; Feng, Xiaoyi
Format: Article
Language:English
Description
Summary: Text detection in natural scenes has long been an open challenge, and many approaches have been proposed, among which deep learning-based methods have achieved state-of-the-art performance. However, most of them use only coarse-level supervision information, limiting detection effectiveness. We propose a deep method that utilizes coarse-to-fine supervision for multiorientation scene text detection. The coarse-to-fine supervisions are generated at three levels: coarse text region (TR), text central line, and fine character shape. With these multiple supervisions, multiscale feature pyramids and deeply supervised nets are integrated in a unified architecture, and the corresponding convolutional kernels are learned jointly. An effective top-down pipeline is developed to obtain more precise text segmentation regions and their relationships from the coarse TR. In addition, the proposed method can handle text in multiple orientations and languages. Four public datasets, i.e., ICDAR2013, MSRA-TD500, USTB, and the Street View Text dataset, are used to evaluate the performance of the proposed method. The experimental results show that our method achieves state-of-the-art performance.
ISSN:1017-9909
1560-229X
DOI:10.1117/1.JEI.27.3.033032