Speaker: Miika Nikula

Title: Probabilistic Machine Learning for Probabilists

Abstract: We give an overview of probabilistic methods in machine learning, with a special focus on inference for generative models. Probabilistic estimates are important for several reasons. In applications such as medicine, it is essential that estimates come with well-calibrated uncertainty bounds. More intrinsically, probabilistic estimation provides a principled way of regularizing models and guarding against overfitting. The main challenge in formulating and obtaining probabilistic estimates in machine learning is computational cost: while a point estimate of parameters can often be obtained in time O(N) for a dataset of N samples, a probabilistic estimate such as Gaussian process regression can cost as much as O(N^3). Various approximate methods have been developed to deal with this unfavorable scaling of time complexity; two of the most influential general-purpose approaches are variational Bayesian inference and the family of stochastic gradient MCMC methods. We present an introduction to these methods and examples of their application in generative modeling.
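To make the O(N^3) claim concrete, here is a minimal sketch of exact Gaussian process regression on synthetic data (all data, kernel hyperparameters, and the noise variance below are assumed for illustration, not taken from the talk). The cubic cost comes from factorizing the N x N kernel matrix.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0):
    # Squared-exponential kernel k(x, x') = exp(-(x - x')^2 / (2 l^2)).
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return np.exp(-d2 / (2.0 * lengthscale ** 2))

rng = np.random.default_rng(0)
N = 200
X = np.sort(rng.uniform(-3, 3, N))           # training inputs (synthetic)
y = np.sin(X) + 0.1 * rng.normal(size=N)     # noisy observations of sin(x)
Xs = np.linspace(-3, 3, 50)                  # test inputs

# Kernel matrix with assumed noise variance 0.01 on the diagonal.
K = rbf_kernel(X, X) + 0.01 * np.eye(N)
L = np.linalg.cholesky(K)                    # O(N^3): the dominant cost
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))

Ks = rbf_kernel(Xs, X)
mean = Ks @ alpha                            # posterior mean of the latent f
v = np.linalg.solve(L, Ks.T)
var = np.diag(rbf_kernel(Xs, Xs)) - np.sum(v ** 2, axis=0)  # posterior variance
```

Every new observation enlarges K, so the exact posterior becomes quickly infeasible for large N; this is the scaling problem that variational inference and stochastic gradient MCMC are designed to sidestep.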