Better Batch for Deep Probabilistic Time Series Forecasting
Deep probabilistic time series forecasting has gained attention for its ability to provide nonlinear approximation and valuable uncertainty quantification for decision-making. However, existing models often oversimplify the problem by assuming a time-independent error process and overlooking serial correlation.
Published in: | arXiv.org, 2024-10 |
Main Authors: | Vincent Zhihao Zheng; Choi, Seongjin; Sun, Lijun |
Format: | Article |
Language: | English |
Subjects: | Accuracy; Autocorrelation; Covariance matrix; Decision making; Errors; Forecasting; Mathematical models; Time series; Training; Uncertainty |
Online Access: | Get full text |
container_title | arXiv.org |
creator | Vincent Zhihao Zheng; Choi, Seongjin; Sun, Lijun |
description | Deep probabilistic time series forecasting has gained attention for its ability to provide nonlinear approximation and valuable uncertainty quantification for decision-making. However, existing models often oversimplify the problem by assuming a time-independent error process and overlooking serial correlation. To overcome this limitation, we propose an innovative training method that incorporates error autocorrelation to enhance probabilistic forecasting accuracy. Our method constructs a mini-batch as a collection of \(D\) consecutive time series segments for model training. It explicitly learns a time-varying covariance matrix over each mini-batch, encoding error correlation among adjacent time steps. The learned covariance matrix can be used to improve prediction accuracy and enhance uncertainty quantification. We evaluate our method on two different neural forecasting models and multiple public datasets. Experimental results confirm the effectiveness of the proposed approach in improving the performance of both models across a range of datasets, resulting in notable improvements in predictive accuracy. |
format | article |
fulltext | fulltext |
identifier | EISSN: 2331-8422 |
ispartof | arXiv.org, 2024-10 |
issn | 2331-8422 |
language | eng |
recordid | cdi_proquest_journals_2820199140 |
source | Publicly Available Content (ProQuest) |
subjects | Accuracy; Autocorrelation; Covariance matrix; Decision making; Errors; Forecasting; Mathematical models; Time series; Training; Uncertainty |
title | Better Batch for Deep Probabilistic Time Series Forecasting |
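The description field carries the record's only technical substance: the proposed training method builds each mini-batch from \(D\) consecutive time series segments, so that the corresponding forecast errors are adjacent in time, and learns a covariance matrix over those errors to capture their autocorrelation. The PyTorch sketch below illustrates that idea in outline only; it is not the authors' implementation, every name in it (make_minibatch, TinyForecaster, correlated_nll) is hypothetical, and for brevity the error covariance is a single learned parameter rather than the time-varying covariance the abstract describes.

```python
# Illustrative sketch only. All names here (make_minibatch, TinyForecaster,
# correlated_nll) are hypothetical and do not come from the paper's code.
import torch
from torch.distributions import MultivariateNormal


def make_minibatch(series, context, horizon, D, start):
    """Collect D *consecutive* (context, horizon) segments starting at `start`,
    so the D one-step forecast errors in the batch are adjacent in time."""
    xs, ys = [], []
    for d in range(D):
        s = start + d
        xs.append(series[s: s + context])
        ys.append(series[s + context: s + context + horizon])
    return torch.stack(xs), torch.stack(ys)  # shapes: (D, context), (D, horizon)


class TinyForecaster(torch.nn.Module):
    """Toy point forecaster plus parameters for a D x D error covariance."""

    def __init__(self, context, D):
        super().__init__()
        self.mean_head = torch.nn.Linear(context, 1)            # one-step-ahead mean
        self.chol_raw = torch.nn.Parameter(torch.zeros(D, D))   # unconstrained Cholesky

    def covariance_cholesky(self):
        # Lower-triangular factor with a strictly positive diagonal.
        L = torch.tril(self.chol_raw)
        diag = torch.nn.functional.softplus(torch.diagonal(L)) + 1e-4
        return L - torch.diag_embed(torch.diagonal(L)) + torch.diag_embed(diag)

    def forward(self, x):
        return self.mean_head(x).squeeze(-1)                    # (D,)


def correlated_nll(model, x, y):
    """Negative log-likelihood of the D adjacent errors under a full covariance,
    instead of treating each error as independent."""
    mu = model(x)
    dist = MultivariateNormal(loc=mu, scale_tril=model.covariance_cholesky())
    return -dist.log_prob(y.squeeze(-1))


# Toy usage on a synthetic series.
torch.manual_seed(0)
series = torch.sin(0.1 * torch.arange(200.0)) + 0.1 * torch.randn(200)
D, context, horizon = 8, 24, 1
model = TinyForecaster(context, D)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for step in range(50):
    start = int(torch.randint(0, len(series) - context - horizon - D, (1,)))
    x, y = make_minibatch(series, context, horizon, D, start)
    loss = correlated_nll(model, x, y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Replacing an independent-error Gaussian loss with a correlated multivariate Gaussian likelihood over the batch is the design choice the abstract highlights; a faithful implementation would also predict the covariance (or its Cholesky factor) from the model state for each mini-batch rather than holding it fixed.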