Abstract
Estimating and predicting volatility in time series is of great importance in areas where risk must be quantified in terms of variability and uncertainty. This work proposes a new methodology for predicting time series volatility that combines Generalized AutoRegressive Conditional Heteroscedasticity (GARCH) methods with deep neural networks. The proposal also incorporates a mechanism to determine the optimal size of the sliding window used to estimate volatility. The recurrent neural networks Gated Recurrent Unit (GRU), Long Short-Term Memory (LSTM), and Bidirectional Long Short-Term Memory (BiLSTM) are evaluated together with methods from the GARCH family (fGARCH). We conducted Monte Carlo simulation studies with heteroscedastic time series to validate the proposed methodology. Moreover, we applied the proposed method to real financial data from the stock market, namely the Chilean Selective Stock Price Index, the Standard & Poor's 500 Index (S&P 500), and the Australian Stock Exchange index (ASX200). The proposed methodology performs well in predicting the volatility of stock options returns one week ahead.
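The abstract describes a two-stage pipeline: a GARCH-type model supplies conditional volatility estimates over a sliding window, and a recurrent network (GRU, LSTM, or BiLSTM) learns to forecast volatility one week ahead. The snippet below is a minimal sketch of that general idea, not the authors' exact method: it assumes the `arch` and PyTorch packages, and the window size, network width, training schedule, and 5-step horizon are illustrative choices rather than values from the paper.

```python
# Hedged sketch of a hybrid GARCH -> LSTM volatility forecaster.
import numpy as np
import torch
import torch.nn as nn
from arch import arch_model

rng = np.random.default_rng(0)

# Simulate a heteroscedastic (GARCH(1,1)-like) return series for the demo.
n = 1500
omega, alpha, beta = 0.05, 0.08, 0.90
sigma2, r = np.empty(n), np.empty(n)
sigma2[0] = omega / (1 - alpha - beta)
r[0] = np.sqrt(sigma2[0]) * rng.standard_normal()
for t in range(1, n):
    sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    r[t] = np.sqrt(sigma2[t]) * rng.standard_normal()

# Stage 1: fit a GARCH(1,1) model and extract its conditional volatility path.
garch_fit = arch_model(r, vol="GARCH", p=1, q=1).fit(disp="off")
vol = np.asarray(garch_fit.conditional_volatility)

# Stage 2: sliding windows of past volatility -> volatility 5 steps ahead.
window, horizon = 20, 5  # hypothetical values; the paper selects the window size
X, y = [], []
for t in range(window, len(vol) - horizon):
    X.append(vol[t - window:t])
    y.append(vol[t + horizon - 1])
X = torch.tensor(np.array(X), dtype=torch.float32).unsqueeze(-1)  # (N, window, 1)
y = torch.tensor(np.array(y), dtype=torch.float32).unsqueeze(-1)  # (N, 1)

class VolLSTM(nn.Module):
    """Small LSTM regressor mapping a volatility window to a point forecast."""
    def __init__(self, hidden=16):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])  # forecast from the last hidden state

model = VolLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for epoch in range(200):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

print(f"final training MSE: {loss.item():.6f}")
```

Swapping `nn.LSTM` for `nn.GRU`, or passing `bidirectional=True` (and doubling the linear head's input size), would give the GRU and BiLSTM variants the abstract compares.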
| Original language | English |
| --- | --- |
| Journal | Computational Statistics |
| DOIs | |
| State | Accepted/In press - 2023 |
| Externally published | Yes |
Keywords
- BiLSTM
- Deep learning
- GRU
- LSTM
- Stochastic volatility
- Stock options return