DeCNT: Deep Deformable CNN for Table Detection
| Published in: | IEEE Access, 2018, Vol. 6, pp. 74151-74161 |
|---|---|
| Main Authors: | |
| Format: | Article |
| Language: | English |
| Summary: | This paper presents a novel approach for the detection of tables in documents, leveraging the potential of deep neural networks. Conventional approaches for table detection rely on heuristics that are error-prone and specific to a dataset. In contrast, the presented approach harnesses the potential of data to recognize tables of arbitrary layout. Most prior approaches for table detection are only applicable to PDFs, whereas the presented approach works directly on images, making it generally applicable to any format. The presented approach is based on a novel combination of a deformable CNN with Faster R-CNN/FPN. A conventional CNN has a fixed receptive field, which is problematic for table detection since tables can appear at arbitrary scales and with arbitrary transformations (orientation). Deformable convolution conditions its receptive field on the input itself, allowing it to mold its receptive field to the content. This adaptation of the receptive field enables the network to cater to tables of arbitrary layout. We evaluated the proposed approach on two major publicly available table detection datasets: ICDAR-2013 and ICDAR-2017 POD. The presented approach surpasses the state-of-the-art performance on both the ICDAR-2013 and ICDAR-2017 POD datasets, with F-measures of 0.994 and 0.968, respectively, indicating its effectiveness and superiority for the task of table detection. |
| ISSN: | 2169-3536 |
| DOI: | 10.1109/ACCESS.2018.2880211 |
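
The mechanism described in the summary, a convolution whose sampling grid is predicted from the input rather than fixed, can be illustrated with standard building blocks. The sketch below uses PyTorch's `torchvision.ops.DeformConv2d` as a stand-in; the channel counts and kernel size are arbitrary placeholders, and this is only an illustration of the idea, not the authors' DeCNT implementation.

```python
import torch
import torch.nn as nn
from torchvision.ops import DeformConv2d

class DeformableBlock(nn.Module):
    """Minimal sketch of a deformable convolution: a small conv predicts
    per-location sampling offsets from the input, so the receptive field
    adapts to the content (e.g., tables at arbitrary scale/orientation)."""

    def __init__(self, channels=64, kernel_size=3):
        super().__init__()
        # 2 offsets (dx, dy) for each of the kernel_size * kernel_size taps
        self.offset_pred = nn.Conv2d(
            channels, 2 * kernel_size * kernel_size, kernel_size,
            padding=kernel_size // 2,
        )
        self.deform_conv = DeformConv2d(
            channels, channels, kernel_size, padding=kernel_size // 2,
        )

    def forward(self, x):
        offsets = self.offset_pred(x)        # offsets conditioned on the input itself
        return self.deform_conv(x, offsets)  # sampling grid deforms accordingly

# Toy usage on a feature map such as one produced by a detector backbone stage
features = torch.randn(1, 64, 100, 75)
out = DeformableBlock(channels=64)(features)
print(out.shape)  # torch.Size([1, 64, 100, 75])
```

Conceptually, blocks like this take the place of standard convolutions in the detection backbone, while region proposal and classification follow the usual Faster R-CNN/FPN pipeline referred to in the summary.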