
Quantifying neural network uncertainty under volatility clustering

Published 22 Feb 2024 in q-fin.ST | (2402.14476v2)

Abstract: Time series with volatility clustering pose a unique challenge to uncertainty quantification (UQ) for returns forecasts. UQ methods such as Deep Evidential Regression offer a simple way of quantifying return forecast uncertainty without the costs of a full Bayesian treatment. However, the Normal-Inverse-Gamma (NIG) prior adopted by Deep Evidential Regression is prone to miscalibration because the prior is assigned to latent mean and variance parameters in a hierarchical structure. Moreover, it overparameterizes the marginal data distribution. These limitations may hinder the accurate delineation of epistemic (model) and aleatoric (data) uncertainties. We propose a Scale Mixture Distribution as a simpler alternative that provides a favorable complexity-accuracy trade-off and assigns a separate subnetwork to each model parameter. To illustrate the performance of the proposed method, we apply it to two sets of financial time series exhibiting volatility clustering, cryptocurrencies and U.S. equities, and evaluate it further through ablation studies.
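The overparameterization claim can be made concrete. Under an NIG(γ, ν, α, β) prior on the latent mean and variance, the marginal data distribution is commonly stated in the evidential-regression literature to be a Student-t with location γ, 2α degrees of freedom, and squared scale β(1+ν)/(αν) — four prior parameters mapping onto a three-parameter marginal, so distinct (ν, β) pairs yield identical data distributions. The sketch below (pure Python; the marginal formula is an assumption drawn from that literature, not from this page's text) illustrates the redundancy:

```python
import math

def student_t_logpdf(y, loc, scale2, df):
    # Log density of a Student-t with location `loc`, squared scale
    # `scale2`, and `df` degrees of freedom.
    return (math.lgamma((df + 1) / 2) - math.lgamma(df / 2)
            - 0.5 * math.log(df * math.pi * scale2)
            - (df + 1) / 2 * math.log(1 + (y - loc) ** 2 / (df * scale2)))

def nig_marginal_logpdf(y, gamma, nu, alpha, beta):
    # Assumed marginal of y under an NIG(gamma, nu, alpha, beta) prior on
    # (mu, sigma^2): Student-t with df = 2*alpha and squared scale
    # beta * (1 + nu) / (alpha * nu).
    return student_t_logpdf(y, gamma, beta * (1 + nu) / (alpha * nu), 2 * alpha)

# Two distinct NIG parameterizations with the same induced marginal:
# beta * (1 + nu) / nu is 2 in both cases, so the Student-t is identical.
a = nig_marginal_logpdf(0.7, 0.0, nu=1.0, alpha=1.5, beta=1.0)
b = nig_marginal_logpdf(0.7, 0.0, nu=2.0, alpha=1.5, beta=4.0 / 3.0)
print(abs(a - b) < 1e-12)  # the two marginal log-densities coincide
```

Because the data likelihood cannot distinguish such parameter settings, the extra degree of freedom must be resolved by regularization rather than by the data — one motivation for replacing the NIG with a directly parameterized scale mixture.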

