Quantifying neural network uncertainty under volatility clustering
【Author】 Wong, Steven Y. K.; Chan, Jennifer S. K.; Azizi, Lamiae
【Source】NEUROCOMPUTING
【Impact Factor】5.779
【Abstract】Time-series with volatility clustering pose a unique challenge to uncertainty quantification (UQ) for returns forecasts. UQ methods such as Deep Evidential regression offer a simple way of quantifying return forecast uncertainty without the cost of a full Bayesian treatment. However, the Normal-Inverse-Gamma (NIG) prior adopted by Deep Evidential regression is prone to miscalibration, as the NIG prior is assigned to latent mean and variance parameters in a hierarchical structure. Moreover, it overparameterizes the marginal data distribution. These limitations may affect the accurate delineation of epistemic (model) and aleatoric (data) uncertainties. We propose a Scale Mixture Distribution as a simpler alternative that offers a favourable complexity-accuracy trade-off and assigns a separate subnetwork to each model parameter. To illustrate the performance of the proposed method, we apply it to two sets of financial time-series exhibiting volatility clustering, cryptocurrencies and U.S. equities, and assess its performance in ablation studies.
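A minimal sketch of the architectural idea named in the abstract (a separate subnetwork per distribution parameter, with a heavy-tailed marginal arising from a normal scale mixture) is given below. This is an illustrative assumption, not the authors' implementation: the class and function names, the PyTorch heads, and the choice of a three-parameter Student-t marginal (location, scale, degrees of freedom) are hypothetical, and the paper's exact parameterization may differ.

```python
# Hypothetical sketch: separate subnetworks ("heads") for each parameter of a
# Student-t predictive distribution, i.e. the marginal of a normal scale
# mixture whose variance has an inverse-gamma mixing distribution.
import torch
import torch.nn as nn


def make_head(in_dim: int, hidden: int = 32) -> nn.Sequential:
    """Small subnetwork mapping shared features to one scalar output."""
    return nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1))


class ScaleMixtureNet(nn.Module):
    def __init__(self, n_features: int, hidden: int = 64):
        super().__init__()
        # Shared trunk extracts features from a window of lagged returns.
        self.trunk = nn.Sequential(nn.Linear(n_features, hidden), nn.ReLU())
        # One subnetwork per distribution parameter: location, scale, dof.
        self.mu_head = make_head(hidden)
        self.scale_head = make_head(hidden)
        self.dof_head = make_head(hidden)
        self.softplus = nn.Softplus()

    def forward(self, x: torch.Tensor):
        h = self.trunk(x)
        mu = self.mu_head(h).squeeze(-1)
        scale = self.softplus(self.scale_head(h)).squeeze(-1) + 1e-6  # keep > 0
        dof = self.softplus(self.dof_head(h)).squeeze(-1) + 2.0       # > 2 so variance exists
        return mu, scale, dof


def student_t_nll(y, mu, scale, dof):
    """Negative log-likelihood of the Student-t marginal, used as training loss."""
    dist = torch.distributions.StudentT(df=dof, loc=mu, scale=scale)
    return -dist.log_prob(y).mean()


# Toy usage: a batch of 16 return windows with 20 lagged features each.
net = ScaleMixtureNet(n_features=20)
x, y = torch.randn(16, 20), torch.randn(16)
mu, scale, dof = net(x)
loss = student_t_nll(y, mu, scale, dof)
loss.backward()
```

Under this kind of parameterization, the fitted scale and degrees-of-freedom heads would carry the data (aleatoric) uncertainty, while model (epistemic) uncertainty would still need to be obtained by whatever mechanism the paper specifies; the sketch only illustrates the per-parameter subnetwork layout.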
【Keywords】Neural network; Uncertainty quantification; Time-series; Volatility clustering
【Publication Date】2025 JAN 21
【Indexed Date】2024-11-27
【Document Type】
【Subject Category】
--
Comments