Practical Implementation of Real-Time Waste Detection and Recycling based on Deep Learning for Delta Parallel Robot
Intelligent robots play an essential role in waste management and recycling thanks to their high speed and wide range of applications. This paper presents two methods for waste detection and accurate pick-and-place based on computer vision and neural networks. The proposed methods were implemented on a 3-DOF Delta parallel robot to demonstrate their accuracy and speed in a real-world intelligent system. The first method, multi-stage detection, detects waste objects in two stages: object localization and segmentation, followed by classification. The second method uses a one-stage object detector, YOLOv5, which localizes and classifies waste objects simultaneously. The dataset used in this paper builds on the TrashNet dataset. To improve classification in the multi-stage method, a larger dataset was created through data augmentation; for the one-stage method, a new multi-label dataset was constructed from TrashNet. The experimental results were compared in terms of inference time and detection and classification metrics. ResNet50 achieved the highest accuracy in the multi-stage method, at 99.31%. In the one-stage method, YOLOv5x achieved the best mAP (@IoU = 0.75) of 97.4%, outperforming YOLOv5s by 0.8 percentage points; however, YOLOv5x inference was about six times slower than YOLOv5s. Therefore, YOLOv5s was employed for real-time online waste detection, reaching 82.1% mAP (@IoU = 0.5) after being trained on real images from the waste-sorting platform.
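The abstract reports real-time detection with a YOLOv5s model trained on images from the waste-sorting platform. As a hedged illustration only, and not the authors' code, the sketch below shows how a custom YOLOv5s checkpoint is typically loaded through the public Ultralytics torch-hub interface and queried for pick coordinates; the weight file name, camera index, confidence threshold, and the pixel-to-workspace calibration step are assumptions.

```python
# Hedged sketch: querying a custom-trained YOLOv5s detector for waste objects.
# "waste_yolov5s.pt", the camera index, and the 0.5 threshold are illustrative.
import torch
import cv2

# Load a custom YOLOv5s checkpoint via the Ultralytics torch-hub interface.
model = torch.hub.load("ultralytics/yolov5", "custom", path="waste_yolov5s.pt")
model.conf = 0.5  # confidence threshold for reported detections

# Grab one frame from the camera above the sorting conveyor (device index assumed).
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
cap.release()

if ok:
    # YOLOv5 expects RGB input; OpenCV delivers BGR.
    results = model(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    # Each row: xmin, ymin, xmax, ymax, confidence, class index, class name.
    detections = results.pandas().xyxy[0]
    for _, det in detections.iterrows():
        # The box centre would be mapped to robot workspace coordinates
        # (after a pixel-to-workspace calibration) for pick-and-place.
        cx = (det["xmin"] + det["xmax"]) / 2
        cy = (det["ymin"] + det["ymax"]) / 2
        print(f"{det['name']}: ({cx:.0f}, {cy:.0f}) @ {det['confidence']:.2f}")
```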
Main Authors: | Jalali, Hasan; Garjani, Shaya; Kalhor, Ahmad; Masouleh, Mehdi Tale; Yousefi, Parisa |
---|---|
Format: | Conference Proceeding |
Language: | English |
Subjects: | Deep Learning; Delta Parallel Robot; Detectors; Location awareness; Measurement; Neural Networks; Parallel robots; Real-time systems; Reinforcement learning; Waste Classification; Waste Detection; Waste management |
Online Access: | Request full text |
cited_by | |
---|---|
cites | |
container_end_page | 252 |
container_issue | |
container_start_page | 246 |
container_title | |
container_volume | |
creator | Jalali, Hasan; Garjani, Shaya; Kalhor, Ahmad; Masouleh, Mehdi Tale; Yousefi, Parisa |
description | Intelligent robots play an essential role in waste management and recycling thanks to their high speed and wide range of applications. This paper presents two methods for waste detection and accurate pick-and-place based on computer vision and neural networks. The proposed methods were implemented on a 3-DOF Delta parallel robot to demonstrate their accuracy and speed in a real-world intelligent system. The first method, multi-stage detection, detects waste objects in two stages: object localization and segmentation, followed by classification. The second method uses a one-stage object detector, YOLOv5, which localizes and classifies waste objects simultaneously. The dataset used in this paper builds on the TrashNet dataset. To improve classification in the multi-stage method, a larger dataset was created through data augmentation; for the one-stage method, a new multi-label dataset was constructed from TrashNet. The experimental results were compared in terms of inference time and detection and classification metrics. ResNet50 achieved the highest accuracy in the multi-stage method, at 99.31%. In the one-stage method, YOLOv5x achieved the best mAP (@IoU = 0.75) of 97.4%, outperforming YOLOv5s by 0.8 percentage points; however, YOLOv5x inference was about six times slower than YOLOv5s. Therefore, YOLOv5s was employed for real-time online waste detection, reaching 82.1% mAP (@IoU = 0.5) after being trained on real images from the waste-sorting platform. (A hedged sketch of the augmentation and classification stage is given after the record fields below.) |
doi_str_mv | 10.1109/ICCKE60553.2023.10326225 |
format | conference_proceeding |
fulltext | fulltext_linktorsrc |
identifier | EISSN: 2643-279X |
ispartof | 2023 13th International Conference on Computer and Knowledge Engineering (ICCKE), 2023, p.246-252 |
issn | 2643-279X |
language | eng |
recordid | cdi_ieee_primary_10326225 |
source | IEEE Xplore All Conference Series |
subjects | Deep Learning; Delta Parallel Robot; Detectors; Location awareness; Measurement; Neural Networks; Parallel robots; Real-time systems; Reinforcement learning; Waste Classification; Waste Detection; Waste management |
title | Practical Implementation of Real-Time Waste Detection and Recycling based on Deep Learning for Delta Parallel Robot |
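The description field above also mentions enlarging TrashNet through data augmentation and obtaining the best multi-stage accuracy with ResNet50. The following is a minimal, hedged sketch of that kind of pipeline using torchvision; the directory layout, the specific transforms, the six-class assumption, and the optimiser settings are illustrative choices, not values taken from the paper.

```python
# Hedged sketch: augmentation pipeline feeding a ResNet50 classifier, in the
# spirit of the multi-stage method. All hyperparameters here are assumptions.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

NUM_CLASSES = 6  # TrashNet categories: cardboard, glass, metal, paper, plastic, trash

# Augmentations applied while training the classification stage (illustrative).
train_tf = transforms.Compose([
    transforms.RandomResizedCrop(224),
    transforms.RandomHorizontalFlip(),
    transforms.ColorJitter(brightness=0.2, contrast=0.2),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

# ImageNet-pretrained ResNet50 with its final layer replaced for the waste classes.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

# Assumed directory layout: trashnet/<class_name>/<image>.jpg
dataset = datasets.ImageFolder("trashnet", transform=train_tf)
loader = torch.utils.data.DataLoader(dataset, batch_size=32, shuffle=True)

# One illustrative optimisation step.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()
images, labels = next(iter(loader))
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```

Replacing only the final fully connected layer keeps the pretrained ImageNet features, so a comparatively small augmented dataset can fine-tune the classifier.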