Towards efficient similarity embedded temporal Transformers via extended timeframe analysis
Price prediction remains a crucial aspect of financial market research as it forms the basis for various trading strategies and portfolio management techniques. However, traditional models such as ARIMA are not effective for multi-horizon forecasting, and current deep learning approaches do not take into account the conditional heteroscedasticity of financial market time series. In this work, we introduce the similarity embedded temporal Transformer (SeTT) algorithms, which extend the state-of-the-art temporal Transformer architecture. These algorithms utilise historical trends in financial time series, as well as statistical principles, to enhance forecasting performance. We conducted a thorough analysis of various hyperparameters, including learning rate, local window size, and the choice of similarity function, in this extension of the study in a bid to get optimal model performance. We also experimented over an extended timeframe, which allowed us to more accurately assess the performance of the models in different market conditions and across different lengths of time. Overall, our results show that SeTT provides improved performance for financial market prediction, as it outperforms both classical financial models and state-of-the-art deep learning methods, across volatile and non-volatile extrapolation periods, with varying effects of historical volatility on the extrapolation. Despite the availability of a substantial amount of data spanning up to 13 years, optimal results were primarily attained through a historical window of 1–3 years for the extrapolation period under examination.
Published in: Complex & intelligent systems, 2024-08, Vol. 10 (4), p. 4793-4815
Main Authors: Olorunnimbe, Kenniy; Viktor, Herna
Format: Article
Language: English
Subjects: Algorithms; Autoregressive models; Complexity; Computational Intelligence; Data Structures and Information Theory; Deep learning; Engineering; Extrapolation; Financial price prediction; Forecasting; Hyperparameter optimisation; Machine learning; Multi-horizon forecast; Performance prediction; Portfolio management; Securities markets; Similarity; Statistical analysis; Stock market forecast; Temporal Transformer; Time series
DOI: 10.1007/s40747-024-01400-8
ISSN: 2199-4536
EISSN: 2198-6053
Publisher: Springer International Publishing, Cham
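The record does not spell out how SeTT embeds similarity, so the following is only a rough sketch of the general idea the abstract gestures at: score historical local windows of a price series by their similarity to the most recent window, so that similar past regimes can inform the forecast. The window length, the use of cosine similarity over log returns, and every function name here are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: rank historical local windows of a price series by
# similarity to the most recent window. Window size and cosine similarity
# over log returns are illustrative assumptions only.
import numpy as np

def local_windows(returns: np.ndarray, size: int) -> np.ndarray:
    """All contiguous windows of `size` consecutive returns, one per row."""
    return np.lib.stride_tricks.sliding_window_view(returns, size)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Cosine similarity between each row of `a` and the vector `b`."""
    a_norm = a / (np.linalg.norm(a, axis=1, keepdims=True) + 1e-12)
    b_norm = b / (np.linalg.norm(b) + 1e-12)
    return a_norm @ b_norm

def most_similar_windows(prices: np.ndarray, size: int = 60, top_k: int = 5):
    """Indices and scores of historical windows most similar to the latest one."""
    returns = np.diff(np.log(prices))           # daily log returns
    windows = local_windows(returns, size)
    query, history = windows[-1], windows[:-1]  # latest window vs. the past
    scores = cosine_similarity(history, query)
    top = np.argsort(scores)[::-1][:top_k]
    return top, scores[top]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    prices = 100.0 * np.exp(np.cumsum(rng.normal(0, 0.01, 2500)))  # toy series
    idx, sims = most_similar_windows(prices)
    print(idx, np.round(sims, 3))
```

In the paper, such similarity scores would presumably feed into the temporal Transformer's training or attention over historical segments; here they are only computed and ranked, purely to illustrate the notion of a "local window" and "similarity function" named among the hyperparameters.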
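Separately, the abstract motivates SeTT by noting that many deep learning forecasters ignore the conditional heteroscedasticity of market returns. As a hedged aside unrelated to SeTT's own implementation, the snippet below shows how that property can be checked with Engle's ARCH-LM test; the simulated GARCH-style series, its coefficients, and the lag count are all assumptions for illustration.

```python
# Illustrative only: Engle's ARCH-LM test for conditional heteroscedasticity
# in a return series. The simulated series and lag choice are assumptions,
# not taken from the SeTT paper.
import numpy as np
from statsmodels.stats.diagnostic import het_arch

rng = np.random.default_rng(0)

# Simulate a toy GARCH(1,1)-style return series so the effect is visible.
n = 2000
returns = np.empty(n)
sigma2 = 1e-4  # initial conditional variance
for t in range(n):
    returns[t] = rng.normal(0.0, np.sqrt(sigma2))
    sigma2 = 1e-6 + 0.1 * returns[t] ** 2 + 0.85 * sigma2

# Null hypothesis: no ARCH effects (constant conditional variance).
lm_stat, lm_pvalue, f_stat, f_pvalue = het_arch(returns, nlags=10)
print(f"ARCH-LM statistic={lm_stat:.1f}, p-value={lm_pvalue:.3g}")
# A small p-value indicates conditional heteroscedasticity, the property the
# abstract says plain deep-learning forecasters tend to overlook.
```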