27. EM Algorithm for Latent Variable Models


It turns out that fitting a Gaussian mixture model by maximum likelihood is easier said than done: there is no closed-form solution, and our usual gradient methods do not work well. The standard approach to maximum likelihood estimation in a Gaussian mixture model is the expectation maximization (EM) algorithm. In this lecture, we present the EM algorithm in the general setting of latent variable models, of which the GMM is a special case. We present the EM algorithm as a very basic "variational method" and indicate a few generalizations.
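
To make the alternation between the two steps concrete, here is a minimal sketch of EM for a one-dimensional Gaussian mixture. This is not the course's reference implementation; the function name, initialization, and defaults are illustrative assumptions. The E-step computes responsibilities (posterior probabilities of the latent component), and the M-step applies the closed-form parameter updates given those responsibilities.

```python
# Minimal EM sketch for a K-component 1-D Gaussian mixture (illustrative only).
import numpy as np

def em_gmm_1d(x, K, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    n = len(x)
    # Initialize mixing weights, means, and variances.
    pis = np.full(K, 1.0 / K)
    mu = rng.choice(x, size=K, replace=False)
    var = np.full(K, np.var(x))

    for _ in range(n_iter):
        # E-step: responsibilities r[i, k] = p(z_i = k | x_i, current params).
        log_p = (-0.5 * np.log(2 * np.pi * var)
                 - 0.5 * (x[:, None] - mu) ** 2 / var
                 + np.log(pis))
        log_p -= log_p.max(axis=1, keepdims=True)  # subtract row max for stability
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)

        # M-step: closed-form updates of weights, means, variances.
        nk = r.sum(axis=0)
        pis = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pis, mu, var

# Usage: data drawn from two well-separated Gaussians.
x = np.concatenate([np.random.normal(-3, 1.0, 500),
                    np.random.normal(4, 1.5, 500)])
pis, mu, var = em_gmm_1d(x, K=2)
print(pis, mu, var)
```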

Access the full course at https://bloom.bg/2ui2T4q
