Exploring Knowledge Distillation of a Deep Neural Network for Multi-Script identification
Multi-lingual script identification is a difficult task involving different languages and complex backgrounds in scene text images. In the current research scenario, deep neural networks are employed as teacher models to train a smaller student network by utilizing the teacher model...
Published in: | arXiv.org 2021-02 |
---|---|
Main Authors: | Shuvayan Ghosh Dastidar, Kalpita Dutta, Nibaran Das, Mahantapas Kundu, Mita Nasipuri |
Format: | Article |
Language: | English |
Subjects: | Artificial neural networks; Distillation; Knowledge management; Neural networks |
Online Access: | Get full text |
container_title | arXiv.org |
---|---|
creator | Shuvayan Ghosh Dastidar; Dutta, Kalpita; Das, Nibaran; Kundu, Mahantapas; Nasipuri, Mita |
description | Multi-lingual script identification is a difficult task involving different languages and complex backgrounds in scene text images. In the current research scenario, deep neural networks are employed as teacher models to train a smaller student network by utilizing the teacher model's predictions. This process is known as dark knowledge transfer. It has been quite successful in many domains where the final result is unattainable by directly training a student network with a simple architecture. In this paper, we explore the dark knowledge transfer approach using a long short-term memory (LSTM) and CNN-based assistant model and various deep neural networks as the teacher model, with a simple CNN-based student network, in the domain of multi-script identification from natural scene text images. We explore the performance of different teacher models and their ability to transfer knowledge to a student network. Despite the student network's limited size, our approach obtains satisfactory results on the well-known script identification dataset CVSI-2015. |
format | article |
fulltext | fulltext |
identifier | EISSN: 2331-8422 |
ispartof | arXiv.org, 2021-02 |
issn | 2331-8422 |
language | eng |
recordid | cdi_proquest_journals_2492477813 |
source | Publicly Available Content Database |
subjects | Artificial neural networks; Distillation; Domains; Knowledge management; Neural networks; Teachers |
title | Exploring Knowledge Distillation of a Deep Neural Network for Multi-Script identification |
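The abstract above describes dark knowledge transfer: a compact student network is trained not only on ground-truth script labels but also on the teacher model's softened output distribution. Below is a minimal, illustrative PyTorch sketch of such a temperature-scaled distillation loss; the `temperature` and `alpha` hyperparameters, and the `teacher`/`student` models in the usage comment, are assumptions for illustration, not the paper's actual settings or architecture.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.7):
    """Blend a soft-target ("dark knowledge") loss with the usual hard-label loss.

    `temperature` and `alpha` are illustrative assumptions, not values
    taken from the paper.
    """
    # KL divergence between temperature-softened distributions; scaling by
    # T^2 keeps gradient magnitudes comparable across temperature choices.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)
    # Standard cross-entropy against the ground-truth script labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Hypothetical usage: `teacher` and `student` are any classifiers mapping
# scene-text images to per-script logits (e.g., the script classes of
# CVSI-2015). The teacher is frozen while the student is trained.
#
#   teacher.eval()
#   with torch.no_grad():
#       teacher_logits = teacher(images)
#   loss = distillation_loss(student(images), teacher_logits, labels)
```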