Dmitri Vetrov
National Research University Higher School of Economics and Yandex

Bayesian framework in Deep Learning Problems

We are witnessing a rapid convergence of two approaches to Machine Learning problems: the Bayesian framework and deep neural networks. On the one hand, neural networks make it possible to efficiently approximate the nontrivial posterior distributions that arise from Bayesian inference. On the other hand, a neural network itself can be interpreted as a Bayesian model, which endows it with interesting properties. In the talk we will show how to crossbreed neural networks with Bayesian inference, and in particular will present a Variational Bayesian Dropout model as a new approach to regularizing neural networks, one that makes it possible to compress popular NN architectures by a factor of hundreds without quality deterioration.
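To make the compression mechanism concrete, here is a minimal, hypothetical NumPy sketch of a variational-dropout linear layer in the spirit of sparse variational dropout (not the speaker's exact model; the class name `VDLinear`, the thresholds, and the initialization values are illustrative assumptions). Each weight carries its own learned noise level alpha = sigma^2 / theta^2; weights whose noise dominates the signal (large log-alpha) can be pruned at test time, which is what yields the large compression factors.

```python
import numpy as np

rng = np.random.default_rng(0)

class VDLinear:
    """Illustrative variational-dropout linear layer (hypothetical sketch).

    Each weight has a mean theta_ij and a log-variance log_sigma2_ij,
    both of which would be trained by maximizing the variational lower
    bound (training loop omitted here).
    """

    def __init__(self, n_in, n_out):
        self.theta = rng.normal(0.0, 0.1, size=(n_in, n_out))  # weight means
        self.log_sigma2 = np.full((n_in, n_out), -8.0)         # log weight variances

    def log_alpha(self):
        # alpha = sigma^2 / theta^2 is the per-weight dropout strength
        return self.log_sigma2 - np.log(self.theta ** 2 + 1e-8)

    def forward(self, x, train=True):
        if not train:
            # Test time: prune weights whose noise swamps the signal.
            # The threshold log_alpha > 3 is a commonly used heuristic.
            mask = self.log_alpha() < 3.0
            return x @ (self.theta * mask)
        # Training: local reparameterization trick — sample the
        # pre-activations rather than the weights themselves.
        mean = x @ self.theta
        var = (x ** 2) @ np.exp(self.log_sigma2)
        return mean + np.sqrt(var + 1e-8) * rng.normal(size=mean.shape)

layer = VDLinear(4, 2)
x = rng.normal(size=(3, 4))
y_train = layer.forward(x, train=True)   # stochastic forward pass
y_test = layer.forward(x, train=False)   # deterministic, pruned forward pass
```

After training, the fraction of weights with log-alpha above the threshold is the sparsity of the compressed network; storing only the surviving weights is what shrinks the architecture.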