07L – PCA, AE, K-means, Gaussian mixture model, sparse coding, and intuitive VAE

Course website: http://bit.ly/DLSP21-web
Playlist: http://bit.ly/DLSP21-YouTube
Speaker: Yann LeCun

Chapters
00:00:00 – Welcome to class
00:06:55 – Training methods revisited
00:08:03 – Architectural methods
00:12:00 – 1. PCA
00:18:04 – Q&A on Definitions: Labels, (un)conditional, and (un, self)supervised learning
00:25:31 – 2. Auto-encoder with Bottleneck
00:27:40 – 3. K-Means
00:34:40 – 4. Gaussian mixture model
00:41:37 – Regularized EBM
00:52:08 – Yann out of context
00:53:24 – Q&A on Norms and Posterior: when the student is thinking too far ahead
00:53:58 – 1. Unconditional regularized latent variable EBM: Sparse coding
01:06:10 – Sparse modeling on MNIST & natural patches
01:12:18 – 2. Amortized inference
01:17:02 – ISTA algorithm & RNN Encoder
01:26:56 – 3. Convolutional sparse coding
01:36:37 – 4. Video prediction: very briefly
01:39:22 – 5. VAE: an intuitive interpretation
01:48:34 – Helpful whiteboard stuff
01:52:35 – Another interpretation
