
PP-YOLOE: An evolved version of YOLO

In this report, we present PP-YOLOE, an industrial state-of-the-art object detector with high performance and friendly deployment. We build on the previous PP-YOLOv2, using an anchor-free paradigm, a more powerful backbone and neck equipped with CSPRepResStage, an ET-head, and the dynamic label assignment algorithm TAL. We provide s/m/l/x models for different practice scenarios. As a result, PP-YOLOE-l achieves 51.4 mAP on COCO test-dev and 78.1 FPS on a Tesla V100, a remarkable improvement of +1.9 AP and +13.35% in speed over PP-YOLOv2 and +1.3 AP and +24.96% in speed over YOLOX, the previous state-of-the-art industrial models. Furthermore, PP-YOLOE reaches 149.2 FPS with TensorRT and FP16 precision. We also conduct extensive experiments to verify the effectiveness of our designs. Source code and pre-trained models are available at https://github.com/PaddlePaddle/PaddleDetection.
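
The abstract mentions TAL, a dynamic (task-aligned) label assignment algorithm. As a rough, simplified illustration of the idea behind TAL-style assignment (not the paper's exact implementation), the Python sketch below scores each candidate anchor with the alignment metric t = s^alpha * u^beta, combining the predicted classification score s for the ground-truth class with the IoU u against the ground-truth box, and keeps the top-k most aligned anchors as positives. The function name and the alpha, beta, and top-k defaults are illustrative assumptions, not values taken from this report.

    # Simplified sketch of a TAL-style alignment metric (illustrative only).
    import numpy as np

    def tal_assign(cls_scores, ious, alpha=1.0, beta=6.0, topk=13):
        """Pick positive anchors for one ground-truth box.

        cls_scores: (num_anchors,) predicted scores for the GT's class.
        ious:       (num_anchors,) IoU of each predicted box with the GT box.
        Returns the indices of the top-k anchors ranked by the alignment
        metric t = s**alpha * u**beta, plus the metric itself.
        """
        t = (cls_scores ** alpha) * (ious ** beta)   # task-alignment metric
        positives = np.argsort(-t)[:topk]            # most-aligned anchors
        return positives, t

    # Toy usage: 8 candidate anchors for a single ground-truth object.
    rng = np.random.default_rng(0)
    scores = rng.uniform(0.0, 1.0, size=8)
    ious = rng.uniform(0.0, 1.0, size=8)
    pos, t = tal_assign(scores, ious, topk=3)
    print("positive anchor indices:", pos)
    print("alignment metric:", np.round(t, 3))

For the reported throughput, 78.1 FPS on a Tesla V100 corresponds to roughly 12.8 ms per image, and 149.2 FPS with TensorRT and FP16 precision to roughly 6.7 ms.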


Bibliographic Details
Published in: arXiv.org, 2022-12
Main Authors: Xu, Shangliang; Wang, Xinxin; Lv, Wenyu; Chang, Qinyao; Cui, Cheng; Deng, Kaipeng; Wang, Guanzhong; Dang, Qingqing; Wei, Shengyu; Du, Yuning; Lai, Baohua
Format: Article
Language: English
Subjects: Algorithms; Source code
EISSN: 2331-8422
Source: Publicly Available Content Database