Cross-sectional state-space forecasting with partial pooling
We propose a novel architecture for time series models built on state-space methods. We jointly estimate many, potentially multivariate, state-space models by partially pooling their parameters across the cross-section; the resulting joint distribution defines a recurrent neural network. By combining state-space methods and neural networks, we gain the interpretability of the former and the scalability and flexibility of the latter, yielding an accurate, flexible, and scalable forecast that is not a black box. We implement this architecture as a library built on the deep learning framework MXNet, leveraging state-of-the-art scalable optimization techniques, including automatic differentiation and computation graphs, to estimate the parameters governing the state-space models. The library abstracts over a large class of state-space models, allowing users to estimate almost arbitrary model specifications without code changes. We forecast weekly business formation by state, obtained from the FRED economic database maintained by the Federal Reserve Bank of St. Louis. We show that this forecast is more accurate than a state-of-the-art neural network approach (DeepState) and, when the data are particularly volatile, more accurate than a state-of-the-art univariate generalized linear model (Prophet).
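To make the partial-pooling idea concrete, the following is a deliberately simplified sketch: several series each follow a local-level state-space model, and their noise variances are tied to shared (pooled) values through per-series scale factors that a penalty shrinks toward one. This is not the paper's implementation — the paper estimates parameters with MXNet's automatic differentiation over a much larger model class — and all names here (`kalman_nll`, `pooled_objective`, the penalty weight `lam`, the simulated data) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a small cross-section of local-level series whose noise variances
# come from shared "pooled" values. Purely synthetic, for illustration only.
N, T = 5, 60
pooled_obs_var, pooled_state_var = 1.0, 0.1
series = []
for _ in range(N):
    level = 0.0
    y = np.empty(T)
    for t in range(T):
        level += rng.normal(0.0, np.sqrt(pooled_state_var))  # latent state step
        y[t] = level + rng.normal(0.0, np.sqrt(pooled_obs_var))  # observation
    series.append(y)

def kalman_nll(y, obs_var, state_var):
    """Negative log-likelihood of a local-level model via the Kalman filter."""
    mu, P = 0.0, 10.0                     # diffuse-ish prior on the level
    nll = 0.0
    for obs in y:
        P += state_var                    # predict: state variance grows
        S = P + obs_var                   # innovation variance
        e = obs - mu                      # innovation
        nll += 0.5 * (np.log(2 * np.pi * S) + e * e / S)
        K = P / S                         # Kalman gain
        mu += K * e                       # update: filtered mean
        P *= (1.0 - K)                    # update: filtered variance
    return nll

def pooled_objective(params, lam=5.0):
    """Sum of per-series NLLs plus a shrinkage penalty.

    params[:2] are log pooled variances; params[2:] are per-series log scale
    factors, shrunk toward 0 (i.e. scales toward 1) by the penalty, which is
    what "partially pools" the parameters across the cross-section.
    """
    obs_var, state_var = np.exp(params[0]), np.exp(params[1])
    scales = np.exp(params[2:])
    nll = sum(kalman_nll(y, obs_var * s, state_var * s)
              for y, s in zip(series, scales))
    return nll + lam * np.sum(np.log(scales) ** 2)

base = pooled_objective(np.zeros(2 + N))
```

In this toy version `pooled_objective` could be minimized with any gradient-free or finite-difference optimizer; the point of building the real system on a framework like MXNet is that the same objective, expressed as a computation graph, gets exact gradients by automatic differentiation and scales to many series and richer state-space specifications.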