
Pill Box Text Identification Using DBNet-CRNN

Bibliographic Details
Published in: International Journal of Environmental Research and Public Health, 2023-02, Vol. 20 (5), p. 3881
Main Authors: Xiang, Liuqing; Wen, Hanyun; Zhao, Ming
Format: Article
Language:English
Description
Summary: Text recognition in natural scenes is currently a complicated task, and the images themselves may be complex owing to the special characteristics of such scenes. In this study, we take the detection and recognition of pill box text as an application scenario and design a deep-learning-based text detection algorithm for such natural scenes. We propose an end-to-end graphical text detection and recognition model and implement a browser/server (B/S) detection system for pill box recognition, which uses DBNet as the text detection framework and a convolutional recurrent neural network (CRNN) as the text recognition framework. No prior image preprocessing is required in the detection and recognition processes. The recognition result from the back-end is returned to the front-end for display. Compared with traditional methods, this recognition process reduces the preprocessing complexity prior to image detection and simplifies application of the model. Experiments on the detection and recognition of 100 pill boxes demonstrate that the proposed method achieves better text localization and recognition accuracy than the previous CTPN + CRNN method. The proposed method is also significantly more accurate and easier to use than the traditional approach in terms of both training and recognition.
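
The pipeline summarized above (DBNet text detection feeding a CRNN recognizer, served from a back-end to a browser front-end) can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes the open-source PaddleOCR toolkit, whose default models pair a DB-based detector with a CRNN-family recognizer, plus a hypothetical Flask endpoint named /recognize; the result parsing assumes a recent PaddleOCR 2.x release.

```python
# Illustrative sketch only (not the authors' code): a minimal browser/server
# (B/S) back-end that runs DBNet-style detection followed by CRNN recognition
# on an uploaded pill box photo and returns the result as JSON for the
# front-end to display. PaddleOCR is assumed here because its default models
# pair a DB detector with a CRNN-family recognizer; the /recognize endpoint
# and field names are hypothetical.
import tempfile

from flask import Flask, jsonify, request
from paddleocr import PaddleOCR

app = Flask(__name__)
# Load the detection and recognition models once at start-up.
ocr = PaddleOCR(lang="en")


@app.route("/recognize", methods=["POST"])
def recognize():
    # The front-end uploads the raw pill box image; no manual preprocessing
    # (binarization, deskewing, cropping) is applied before inference.
    upload = request.files["image"]
    with tempfile.NamedTemporaryFile(suffix=".jpg") as tmp:
        upload.save(tmp.name)
        # ocr() runs DB-based detection, then CRNN recognition on each box.
        # The result layout assumed here is recent PaddleOCR 2.x:
        # one list per image of [box, (text, score)] entries.
        result = ocr.ocr(tmp.name)
    lines = [
        {
            "text": text,
            "score": float(score),
            "box": [[float(x), float(y)] for x, y in box],
        }
        for box, (text, score) in (result[0] or [])
    ]
    return jsonify({"lines": lines})


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```

Mirroring the abstract, the uploaded image goes straight into detection and recognition with no manual preprocessing, and the JSON response is what a front-end would render as the recognition result.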
ISSN: 1660-4601, 1661-7827
DOI: 10.3390/ijerph20053881