Predicting earnings per share using feature-engineered extreme gradient boosting models and constructing alpha trading strategies
| Published in: | International Journal of Information Technology (Singapore. Online), 2023-12, Vol. 15 (8), pp. 3999-4012 |
|---|---|
| Main Authors: | , |
| Format: | Article |
| Language: | English |
| Summary: | This study explores the effectiveness of Extreme Gradient Boosting (XGBoost) models in predicting a stock's future Earnings Per Share (EPS). It utilizes preprocessed technical, fundamental, and analyst forecast data, along with feature engineering techniques to construct indicators. The study achieves good results in EPS prediction using these feature-engineered datasets and varied hyperparameter settings. The predicted EPS aligns with the actual EPS, demonstrating the model's efficacy. The study proposes using the predicted EPS for developing trading strategies and demonstrates the model's robustness in the Financial and Technological sectors. It also investigates the feature dependency of various technical and fundamental parameters for estimating EPS, highlighting the importance of feature selection to enhance prediction accuracy. Additionally, the study employs these predictions to build alpha trading strategies and portfolios, resulting in anomalous returns with an average Information Ratio of 6.14. The research suggests expanding the use of XGBoost models to anticipate additional fundamental parameters, enabling the construction of more robust portfolios and alpha trading approaches. |
| ISSN: | 2511-2104; 2511-2112 |
| DOI: | 10.1007/s41870-023-01450-0 |
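
A minimal, hypothetical sketch of the kind of pipeline the summary above describes: an XGBoost regressor fit on feature-engineered technical, fundamental, and analyst-forecast inputs to predict EPS, followed by an Information Ratio (mean active return divided by its standard deviation, annualized) for a strategy built from the predictions. The record does not include the paper's actual features, hyperparameters, or data, so every column name, parameter value, and return series below is an illustrative assumption, not the authors' implementation.

```python
# Sketch only: XGBoost EPS regression on hypothetical engineered features,
# plus an Information Ratio for a placeholder strategy vs. benchmark.
import numpy as np
import pandas as pd
from xgboost import XGBRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000

# Synthetic stand-ins for the three feature groups named in the abstract:
# technical indicators, fundamentals, and analyst forecasts (all hypothetical).
X = pd.DataFrame({
    "rsi_14": rng.uniform(10, 90, n),            # technical
    "momentum_3m": rng.normal(0, 0.1, n),        # technical
    "revenue_growth": rng.normal(0.05, 0.1, n),  # fundamental
    "debt_to_equity": rng.uniform(0, 2, n),      # fundamental
    "consensus_eps": rng.normal(1.5, 0.5, n),    # analyst forecast
})
# Toy target: next-quarter EPS loosely tied to the consensus estimate plus noise.
y = X["consensus_eps"] + 0.2 * X["revenue_growth"] + rng.normal(0, 0.1, n)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Illustrative hyperparameters only; the study reports varied settings.
model = XGBRegressor(
    n_estimators=300, max_depth=4, learning_rate=0.05,
    subsample=0.8, objective="reg:squarederror",
)
model.fit(X_train, y_train)
eps_pred = model.predict(X_test)

# Gain-based importances, analogous to the feature-dependency analysis
# the abstract mentions.
importances = dict(zip(X.columns, model.feature_importances_))
print("EPS prediction RMSE:", float(np.sqrt(np.mean((eps_pred - y_test) ** 2))))
print("Feature importances:", importances)

# Information Ratio = mean(active return) / std(active return), annualized
# here for daily data; both return series are placeholders.
strategy_returns = rng.normal(0.0008, 0.01, 252)
benchmark_returns = rng.normal(0.0004, 0.01, 252)
active = strategy_returns - benchmark_returns
information_ratio = np.sqrt(252) * active.mean() / active.std(ddof=1)
print("Information Ratio:", float(information_ratio))
```

The random train/test split and gain-based importances are simplifications; a faithful replication would use walk-forward splits on dated filings and the paper's own feature-selection procedure.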