Dropout Training is Distributionally Robust Optimal

Speaker

José Luis Montiel Olea (Columbia)

Date & Time

26 April 2022

Type

Seminar

Venue

Online only

Abstract

This paper shows that dropout training in generalized linear models is the minimax solution of a two-player, zero-sum game in which an adversarial nature corrupts a statistician's covariates using a multiplicative nonparametric errors-in-variables model. In this game, nature's least favorable distribution is dropout noise, under which nature independently deletes entries of the covariate vector with some fixed probability δ. This result implies that dropout training indeed provides out-of-sample expected loss guarantees for distributions that arise from multiplicative perturbations of in-sample data. In addition to the decision-theoretic analysis, the paper makes two further contributions. First, it gives a concrete recommendation on how to select the tuning parameter δ to guarantee that, as the sample size grows large, the in-sample loss after dropout training exceeds the true population loss with some pre-specified probability. Second, it provides a novel, parallelizable, unbiased multi-level Monte Carlo algorithm to speed up the implementation of dropout training. This algorithm has a much smaller computational cost than the naive implementation of dropout, provided the number of data points is much smaller than the dimension of the covariate vector.
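To fix ideas, the sketch below illustrates in plain NumPy what a dropout-training objective can look like for logistic regression: the expected in-sample loss when each covariate entry is independently deleted with probability δ, approximated naively by averaging over sampled dropout masks. The function name, the mean-one rescaling of the kept entries, and the choice of logistic loss are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def dropout_objective(beta, X, y, delta, rng, n_draws=200):
    """Naive Monte Carlo approximation of a dropout-training objective for
    logistic regression: the expected negative log-likelihood when each
    covariate entry is independently zeroed out with probability delta.

    Illustrative sketch only; the mean-one rescaling below is one common
    convention and may differ from the paper's setup.
    """
    n, d = X.shape
    signs = 2 * y - 1  # map labels {0, 1} to {-1, +1}
    total = 0.0
    for _ in range(n_draws):
        # Nature's dropout noise: keep each entry with probability 1 - delta,
        # rescaled by 1 / (1 - delta) so the corruption has mean one.
        mask = (rng.random((n, d)) > delta) / (1.0 - delta)
        logits = (X * mask) @ beta  # multiplicative corruption of covariates
        # Numerically stable logistic negative log-likelihood.
        total += np.mean(np.logaddexp(0.0, -signs * logits))
    return total / n_draws

# Example in the n << d regime the paper targets: 50 observations, 500 covariates.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 500))
y = rng.integers(0, 2, size=50)
print(dropout_objective(np.zeros(500), X, y, delta=0.3, rng=rng))
```

Note that the cost of this naive approach scales with the number of sampled masks times n × d, which is exactly the overhead the paper's unbiased multi-level Monte Carlo algorithm is designed to reduce.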
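The paper's algorithm itself is not reproduced in the abstract. As background, the following is a minimal sketch of the generic single-term construction (in the style of Rhee and Glynn) on which unbiased multi-level Monte Carlo schemes are typically built; the function name, the geometric level distribution, and the toy increment are illustrative assumptions, not details from the paper.

```python
import numpy as np

def single_term_umlmc(increment_fn, rng, p=0.5):
    """Single-term unbiased multi-level Monte Carlo estimator.

    increment_fn(l) must return a random variable whose mean is the level-l
    increment D_l = Y_l - Y_{l-1} (with D_1 = Y_1). Drawing a geometric level
    L and reweighting by 1 / P(L = l) makes the output unbiased for the
    telescoping sum D_1 + D_2 + ... = lim_l Y_l, even though every fixed
    level is biased for the limit. Generic textbook construction, not the
    paper's specific algorithm; p trades variance against expected cost.
    """
    L = int(rng.geometric(p))           # P(L = l) = p * (1 - p)**(l - 1)
    prob = p * (1.0 - p) ** (L - 1)
    return increment_fn(L) / prob

# Toy check with a known limit: Y_l = 1 - 2**-l, so D_l = 2**-l and lim Y_l = 1.
rng = np.random.default_rng(1)
increment = lambda l: 2.0 ** -l * (1.0 + 0.5 * rng.normal())  # noisy level-l increment
draws = [single_term_umlmc(increment, rng) for _ in range(4000)]
print(np.mean(draws))  # close to 1.0: unbiased despite every level being truncated
```

Because independent copies of such an estimator can be averaged across workers, schemes of this kind are naturally parallelizable, and deep (expensive) levels are sampled only rarely, which keeps the expected cost bounded. In the paper's setting one can imagine the increments coming from refitting the dropout problem with increasingly many sampled masks, though that correspondence is a hypothetical reading of the abstract rather than a description of the algorithm.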