End-to-end learning of coherent probabilistic forecasts for hierarchical time series
This paper presents a novel approach to forecasting hierarchical time series that produces coherent, probabilistic forecasts without requiring any explicit post-processing step. Unlike state-of-the-art methods, the proposed approach simultaneously learns from all time series in the hierarchy and incorporates the reconciliation step into a single trainable model. This is achieved by applying the reparameterization trick and by exploiting the observation that reconciliation can be cast as an optimization problem with a closed-form solution. These model features make end-to-end learning of hierarchical forecasts possible, while accomplishing the challenging task of generating forecasts that are both probabilistic and coherent. Importantly, our approach also accommodates general aggregation constraints, including grouped, temporal, and cross-temporal hierarchies. An extensive empirical evaluation on real-world hierarchical datasets demonstrates the advantages of the proposed approach over the state-of-the-art.
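To make the closed-form reconciliation idea concrete, the following is a minimal NumPy sketch, not the paper's implementation: it assumes a toy 3-node hierarchy (total = child1 + child2), encodes coherence as a linear constraint `A @ y = 0`, and reconciles sampled base forecasts by orthogonally projecting them onto the null space of `A`. The specific matrix names and the Gaussian sampling are illustrative assumptions only.

```python
import numpy as np

# Hypothetical hierarchy y = [total, child1, child2];
# y is coherent iff total = child1 + child2, i.e. A @ y = 0.
A = np.array([[1.0, -1.0, -1.0]])

# Closed-form reconciliation via orthogonal projection onto the
# coherent subspace (null space of A): M = I - A^T (A A^T)^{-1} A.
M = np.eye(3) - A.T @ np.linalg.solve(A @ A.T, A)

# Stand-in for probabilistic base forecasts (in the paper these are
# samples drawn through the reparameterization trick; here, plain
# Gaussian draws that are generally incoherent).
rng = np.random.default_rng(0)
samples = rng.normal(loc=[10.0, 6.0, 5.0], scale=1.0, size=(1000, 3))

# Project every sample; the result is a coherent probabilistic forecast.
coherent = samples @ M.T

# Every reconciled sample satisfies the aggregation constraint.
residual = np.abs(coherent[:, 0] - coherent[:, 1] - coherent[:, 2])
print(residual.max())  # numerically zero
```

Because the projection is a fixed differentiable linear map, gradients can flow through it, which is what allows the reconciliation step to sit inside a single end-to-end trainable model.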