Learning quantile functions without quantile crossing for distribution-free time series forecasting
2021
Quantile regression is an effective technique for quantifying uncertainty and fitting challenging underlying distributions. Generating full probabilistic predictions requires running multiple quantile regressions over multiple quantile levels. As a result, quantile crossing is a common drawback of these approaches, since it violates the desirable monotonicity of the conditional quantile function. In this work, we propose the incremental quantile function (IQF), a general distribution-free quantile estimation framework that resolves quantile crossing with a simple neural network layer. Moreover, IQF can extrapolate to other quantile levels, yielding predictions at arbitrary quantile levels that differ from the underlying training quantiles. We apply IQF to NN-based time series forecasting, where this is particularly appealing because it avoids the expensive cost of re-training for non-trained quantile levels. We provide a generalization error analysis of our proposed approach under the sequence-to-sequence forecasting setting. Experimental results demonstrate the effectiveness and resulting accuracy improvements of our method.
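To make the idea concrete, below is a minimal PyTorch sketch of one way a monotone quantile layer in the spirit of IQF could look. It is an illustration under our own assumptions, not the paper's implementation: the layer, its name `IncrementalQuantileLayer`, and the softplus-plus-cumulative-sum parameterization are hypothetical choices for enforcing the monotone property. The network predicts a base quantile plus non-negative increments, so the resulting quantiles can never cross by construction.

```python
import torch
import torch.nn as nn

class IncrementalQuantileLayer(nn.Module):
    """Hypothetical sketch of a crossing-free quantile output layer.

    Rather than predicting each quantile independently (which can cross),
    the layer predicts the lowest quantile plus non-negative increments;
    a cumulative sum then guarantees monotonicity in the quantile level.
    """

    def __init__(self, in_features: int, num_quantiles: int):
        super().__init__()
        self.base = nn.Linear(in_features, 1)                     # lowest quantile
        self.deltas = nn.Linear(in_features, num_quantiles - 1)   # raw increments

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # Softplus makes each increment non-negative; the cumulative sum
        # enforces q_1 <= q_2 <= ... <= q_K, so quantiles never cross.
        increments = nn.functional.softplus(self.deltas(h))
        base = self.base(h)
        return torch.cat([base, base + torch.cumsum(increments, dim=-1)], dim=-1)


# Usage: given encoder features h for a batch, produce 9 monotone quantiles.
h = torch.randn(32, 64)
layer = IncrementalQuantileLayer(in_features=64, num_quantiles=9)
quantiles = layer(h)  # shape (32, 9), non-decreasing along the last axis
assert (quantiles.diff(dim=-1) >= 0).all()
```

Because the predicted quantiles are monotone by construction, values at untrained quantile levels can then be obtained by interpolating between the trained levels, which is what makes prediction at arbitrary quantile levels possible without re-training.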