Ensemble of temporal Transformers for financial time series
The accuracy of price forecasts is important for financial market trading strategies and portfolio management. Compared to traditional models such as ARIMA and other state-of-the-art deep learning techniques, temporal Transformers with similarity embedding perform better for multi-horizon forecasts in financial time series, as they account for the conditional heteroscedasticity inherent in financial data. Despite this, the methods employed in generating these forecasts must be optimized to achieve the highest possible level of precision. One approach that has been shown to improve the accuracy of machine learning models is ensemble techniques. To this end, we present an ensemble approach that efficiently utilizes the available data over an extended timeframe. Our ensemble combines multiple temporal Transformer models learned within sliding windows, thereby making optimal use of the data. As combination methods, along with an averaging approach, we also introduce a stacking meta-learner that leverages a quantile estimator to determine the optimal weights for combining the base models of smaller windows. By decomposing the constituent time series of an extended timeframe, we optimize the utilization of the series for financial deep learning. This simplifies the training process of a temporal Transformer model over an extended time series while achieving better performance, particularly when accounting for the non-constant variance of financial time series. Our experiments, conducted across volatile and non-volatile extrapolation periods using 20 companies from the Dow Jones Industrial Average, show more than 40% and 60% improvement in predictive performance compared to the baseline temporal Transformer.
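The abstract describes the mechanics at a high level: train one base forecaster per sliding window of a long series, then combine their forecasts either by simple averaging or by a stacking meta-learner fitted with a quantile estimator. The following is a minimal, self-contained sketch of that idea, assuming plain least-squares autoregressors as stand-ins for the paper's temporal Transformer base learners and a projected-gradient fit of the pinball loss for the stacking weights; all function names, window sizes, and hyperparameters are illustrative assumptions, not the authors' implementation.

```python
# Sliding-window forecast ensemble with an averaging combiner and a quantile-loss
# stacking combiner. Base learners are simple AR models standing in for temporal
# Transformers; everything below is an illustrative sketch, not the paper's code.
import numpy as np


def make_windows(series, window, stride):
    """Split a 1-D array into overlapping training windows."""
    return [series[s:s + window] for s in range(0, len(series) - window + 1, stride)]


def fit_ar(window, lags=5):
    """Least-squares AR(lags) fit; returns coefficients including an intercept."""
    X = np.column_stack([window[i:len(window) - lags + i] for i in range(lags)])
    X = np.column_stack([np.ones(len(X)), X])
    y = window[lags:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef


def predict_ar(coef, history, lags=5):
    """One-step-ahead prediction from the last `lags` observations."""
    x = np.concatenate([[1.0], np.asarray(history)[-lags:]])
    return float(x @ coef)


def pinball_loss(y_true, y_pred, q=0.5):
    """Quantile (pinball) loss, used both to fit and to score the stacking weights."""
    diff = y_true - y_pred
    return float(np.mean(np.maximum(q * diff, (q - 1) * diff)))


def fit_stacking_weights(base_preds, y_true, q=0.5, iters=500, lr=0.02):
    """Convex combination weights for the base models, found by projected gradient
    descent on the pinball loss (weights kept non-negative and summing to one)."""
    n, k = base_preds.shape
    w = np.full(k, 1.0 / k)
    for _ in range(iters):
        resid = y_true - base_preds @ w
        grad = -base_preds.T @ np.where(resid > 0, q, q - 1) / n
        grad /= np.linalg.norm(grad) + 1e-12          # keep step sizes bounded
        w = np.clip(w - lr * grad, 0.0, None)
        w = w / w.sum() if w.sum() > 0 else np.full(k, 1.0 / k)
    return w


# Toy usage on a synthetic random-walk "price" series.
rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(0.0, 1.0, 600))
train, valid = series[:500], series[500:]

windows = make_windows(train, window=200, stride=100)   # overlapping sub-series
models = [fit_ar(w) for w in windows]                    # one base model per window

history, preds, targets = list(train), [], []
for y_next in valid:                                     # walk-forward evaluation
    preds.append([predict_ar(c, history) for c in models])
    targets.append(y_next)
    history.append(y_next)
preds, targets = np.asarray(preds), np.asarray(targets)

avg_pred = preds.mean(axis=1)                            # averaging combiner
w = fit_stacking_weights(preds, targets)                 # quantile stacking combiner
stack_pred = preds @ w
print("averaging pinball loss:", pinball_loss(targets, avg_pred))
print("stacking  pinball loss:", pinball_loss(targets, stack_pred))
```

In this toy the stacking weights are fitted on the same span they are scored on; in practice the meta-learner would be fitted on a held-out validation segment and evaluated on a later extrapolation period, as the paper's volatile and non-volatile test setup suggests.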
Published in: | Journal of intelligent information systems, 2024-08, Vol.62 (4), p.1087-1111 |
---|---|
Main Authors: | Olorunnimbe, Kenniy; Viktor, Herna |
Format: | Article |
Language: | English |
Publisher: | New York: Springer US |
Source: | Springer Link |
ISSN: | 0925-9902 |
EISSN: | 1573-7675 |
DOI: | 10.1007/s10844-024-00851-2 |
Subjects: | Accuracy; Artificial Intelligence; Autoregressive models; Computer Science; Data Structures and Information Theory; Deep learning; Information Storage and Retrieval; IT in Business; Machine learning; Natural Language Processing (NLP); Optimization; Performance prediction; Portfolio management; Time series; Transformers; Windows (intervals) |