
Progressive Growing of Patch Size: Resource-Efficient Curriculum Learning for Dense Prediction Tasks

Bibliographic Details
Published in: arXiv.org, 2024-07
Main Authors: Fischer, Stefan M; Felsner, Lina; Osuala, Richard; Kiechle, Johannes; Lang, Daniel M; Peeken, Jan C; Schnabel, Julia A
Format: Article
Language: English
Description: In this work, we introduce Progressive Growing of Patch Size, a resource-efficient implicit curriculum learning approach for dense prediction tasks. Our curriculum approach is defined by growing the patch size during model training, which gradually increases the task's difficulty. We integrated our curriculum into the nnU-Net framework and evaluated the methodology on all 10 tasks of the Medical Segmentation Decathlon. With our approach, we are able to substantially reduce runtime, computational costs, and CO2 emissions of network training compared to classical constant patch size training. In our experiments, the curriculum approach resulted in improved convergence. We are able to outperform standard nnU-Net training, which is trained with constant patch size, in terms of Dice Score on 7 out of 10 MSD tasks while only spending roughly 50% of the original training runtime. To the best of our knowledge, our Progressive Growing of Patch Size is the first successful employment of a sample-length curriculum in the form of patch size in the field of computer vision. Our code is publicly available at https://github.com/compai-lab/2024-miccai-fischer.
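The core idea in the abstract — growing the training patch size over the course of training — can be sketched as a simple schedule function. This is a minimal illustrative sketch, not the authors' implementation: the function name, the linear growth rule, and the parameter values (`min_size`, `max_size`, `step`) are all assumptions chosen for illustration; see the linked repository for the actual method.

```python
# Hypothetical sketch of a progressive patch-size curriculum: the training
# patch edge length grows stepwise from a small starting size to the full
# size, gradually increasing task difficulty (per the paper's description).

def patch_size_schedule(epoch, num_epochs, min_size=64, max_size=256, step=32):
    """Return the patch edge length for a given epoch.

    The size grows linearly over training and is snapped down to a
    multiple of `step`, keeping the patch compatible with the repeated
    2x downsampling of a typical U-Net encoder.
    """
    if num_epochs <= 1:
        return max_size
    frac = epoch / (num_epochs - 1)           # 0.0 at start, 1.0 at the end
    size = min_size + frac * (max_size - min_size)
    size = int(size // step * step)           # snap to a multiple of `step`
    return max(min_size, min(size, max_size)) # clamp to the valid range

# Example: a 10-epoch run growing patches from 64 up to 256 per edge.
schedule = [patch_size_schedule(e, 10) for e in range(10)]
print(schedule)  # monotonically non-decreasing, multiples of 32
```

A data loader would then crop patches of `schedule[epoch]` voxels per edge each epoch; early epochs train on small, cheap patches, and only the final epochs pay the full-size cost, which is where the reported runtime savings come from.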
EISSN: 2331-8422
Subjects: Computer vision; Curricula; Learning; Run time (computers)