Inference calibration of neural network models
2022
The development of both the theory and the practice of neural network models has been significant in the last decade. Novel solutions including Dropout and Batch Normalization (BN) have been proposed and pervasively adopted to overcome issues like over-fitting and to improve the convergence of models. Despite their remarkable success, in this paper we show that Dropout and BN can make a model biased and suboptimal at inference time, because their behaviour shifts between training and inference. We propose a simple method, called Inference Calibration, to reduce this bias and improve the inference-time performance of neural network models. Our experiments show that the Inference Calibration algorithm can effectively reduce prediction error and improve model accuracy, while also reducing the calibration error of both regression and classification models.
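To make the train-to-inference behaviour shift concrete, here is a minimal NumPy sketch (not the paper's method) of standard inverted Dropout. The dropped pre-activation is unbiased by construction, but once a nonlinearity follows a mixing layer, the average training-time output no longer matches the deterministic inference-time output — the kind of mismatch the abstract refers to. The toy inputs `x` and weights `w` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.5                        # drop probability (assumed for illustration)
x = np.array([1.0, -2.0, 3.0])  # toy layer input
w = np.array([1.0, 1.0, 1.0])   # toy next-layer weights
relu = lambda z: np.maximum(z, 0.0)

def dropout_train(v):
    # Inverted dropout: E[mask * v / (1 - p)] == v, so the pre-activation
    # is unbiased -- but a nonlinearity applied downstream need not be.
    mask = rng.random(v.shape) >= p
    return mask * v / (1.0 - p)

# Training-time output, averaged over many random dropout masks.
train_mean = np.mean([relu(w @ dropout_train(x)) for _ in range(200_000)])

# Inference-time output: dropout is replaced by the identity.
infer_out = relu(w @ x)

print(train_mean, infer_out)  # approx. 2.75 vs. 2.0: E[f(Dx)] != f(E[Dx])
```

The gap (here about 2.75 vs. 2.0) is exactly the bias that a post-hoc correction such as the paper's Inference Calibration aims to remove.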