Fast Training of Object Detection Using Stochastic Gradient Descent
Training datasets for object detection problems are typically very large and Support Vector Machine (SVM) implementations are computationally complex. As opposed to these complex techniques, we use Stochastic Gradient Descent (SGD) algorithms that use only a single new training sample in each iteration and process samples in a stream-like fashion. We have incorporated SGD optimization in an object detection framework. The object detection problem is typically highly asymmetric, because of the limited variation in object appearance, compared to the background. Incorporating SGD speeds up the optimization process significantly, requiring only a single iteration over the training set to obtain results comparable to state-of-the-art SVM techniques. SGD optimization is linearly scalable in time and the obtained speedup in computation time is two to three orders of magnitude. We show that by considering only part of the total training set, SGD converges quickly to the overall optimum.
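The training scheme the abstract describes — a single streamed pass of per-sample updates on a linear SVM objective — can be sketched as below. This is an illustrative Pegasos-style subgradient sketch under assumed choices (function name, decaying step size `1/(lam*t)`, toy data); it is not the authors' implementation.

```python
def sgd_linear_svm(samples, labels, lam=0.01, epochs=1):
    """Train (w, b) by SGD on the L2-regularized hinge loss.

    samples: list of feature vectors (lists of floats)
    labels:  list of +1 / -1 class labels
    lam:     regularization strength (assumed value)
    epochs:  passes over the data; the abstract reports that a
             single pass already gives competitive results
    """
    dim = len(samples[0])
    w = [0.0] * dim
    b = 0.0
    t = 0
    for _ in range(epochs):
        for x, y in zip(samples, labels):   # one sample per iteration
            t += 1
            eta = 1.0 / (lam * t)           # decaying step size
            margin = y * (sum(wi * xi for wi, xi in zip(w, x)) + b)
            # Subgradient step: always shrink w (regularizer), and
            # push toward the sample only if it violates the margin.
            w = [(1 - eta * lam) * wi for wi in w]
            if margin < 1:
                w = [wi + eta * y * xi for wi, xi in zip(w, x)]
                b += eta * y
    return w, b

# Toy asymmetric problem: few "object" samples, more "background" ones.
pos = [[2.0, 2.0], [2.5, 1.8]]
neg = [[-1.0, -1.0], [-2.0, 0.0], [-1.5, -2.0], [0.0, -2.5]]
X = pos + neg
y = [1] * len(pos) + [-1] * len(neg)
w, b = sgd_linear_svm(X, y, lam=0.01, epochs=1)

def predict(x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1
```

Because each iteration touches one sample and the sample count fixes the number of updates, training time grows linearly with the training-set size, matching the linear scalability the abstract claims.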
Saved in:
Main Authors: | Wijnhoven, R G J; de With, P H N |
---|---|
Format: | Conference Proceeding |
Language: | English |
Subjects: | Object detection; stochastic gradient descent; Support vector machines; Computer vision |
Online Access: | Request full text |
cited_by | |
---|---|
cites | |
container_end_page | 427 |
container_issue | |
container_start_page | 424 |
container_title | 2010 20th International Conference on Pattern Recognition |
container_volume | |
creator | Wijnhoven, R G J; de With, P H N |
description | Training datasets for object detection problems are typically very large and Support Vector Machine (SVM) implementations are computationally complex. As opposed to these complex techniques, we use Stochastic Gradient Descent (SGD) algorithms that use only a single new training sample in each iteration and process samples in a stream-like fashion. We have incorporated SGD optimization in an object detection framework. The object detection problem is typically highly asymmetric, because of the limited variation in object appearance, compared to the background. Incorporating SGD speeds up the optimization process significantly, requiring only a single iteration over the training set to obtain results comparable to state-of-the-art SVM techniques. SGD optimization is linearly scalable in time and the obtained speedup in computation time is two to three orders of magnitude. We show that by considering only part of the total training set, SGD converges quickly to the overall optimum. |
doi_str_mv | 10.1109/ICPR.2010.112 |
format | conference_proceeding |
fulltext | fulltext_linktorsrc |
identifier | ISSN: 1051-4651; EISSN: 2831-7475; ISBN: 1424475422; ISBN: 9781424475421; DOI: 10.1109/ICPR.2010.112 |
ispartof | 2010 20th International Conference on Pattern Recognition, 2010, p.424-427 |
issn | 1051-4651; 2831-7475 |
language | eng |
recordid | cdi_ieee_primary_5597822 |
source | IEEE Xplore All Conference Series |
subjects | classification; Computer vision; detection; Feature extraction; histogram of oriented gradients; HOG; Object detection; object recognition; Optimization; Pattern recognition; stochastic gradient descent; Support vector machines; SVM; Training |
title | Fast Training of Object Detection Using Stochastic Gradient Descent |
url | http://sfxeu10.hosted.exlibrisgroup.com/loughborough?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-07T22%3A07%3A12IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-ieee_CHZPO&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=proceeding&rft.atitle=Fast%20Training%20of%20Object%20Detection%20Using%20Stochastic%20Gradient%20Descent&rft.btitle=2010%2020th%20International%20Conference%20on%20Pattern%20Recognition&rft.au=Wijnhoven,%20R%20G%20J&rft.date=2010-08&rft.spage=424&rft.epage=427&rft.pages=424-427&rft.issn=1051-4651&rft.eissn=2831-7475&rft.isbn=1424475422&rft.isbn_list=9781424475421&rft_id=info:doi/10.1109/ICPR.2010.112&rft.eisbn=9781424475414&rft.eisbn_list=9780769541099&rft.eisbn_list=1424475414&rft.eisbn_list=0769541097&rft_dat=%3Cieee_CHZPO%3E5597822%3C/ieee_CHZPO%3E%3Cgrp_id%3Ecdi_FETCH-LOGICAL-i1952-159883e3b0ae5e4a71653890bd3bd412af37a3050b7e25ff865d6106e914e4133%3C/grp_id%3E%3Coa%3E%3C/oa%3E%3Curl%3E%3C/url%3E&rft_id=info:oai/&rft_id=info:pmid/&rft_ieee_id=5597822&rfr_iscdi=true |