Lecture 2 - ML Refresher / Softmax Regression

Lecture 2 of the online course Deep Learning Systems: Algorithms and Implementation.

This lecture gives a refresher on the basic principles of (supervised) machine learning, as exemplified by the softmax regression algorithm. We walk through the derivation of softmax regression and of stochastic gradient descent as applied to training this class of model.
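
As a rough preview of what the lecture derives, here is a minimal NumPy sketch of minibatch SGD for softmax regression; the function name, hyperparameters, and variable names are illustrative, not taken from the lecture code.

import numpy as np

def softmax_regression_sgd(X, y, k, lr=0.5, batch=100, epochs=10):
    """Train softmax regression by minibatch SGD (illustrative sketch).

    X: (m, n) float array of examples; y: (m,) int labels in {0, ..., k-1}.
    Returns Theta: (n, k) weight matrix.
    """
    m, n = X.shape
    Theta = np.zeros((n, k))
    for _ in range(epochs):
        for i in range(0, m, batch):
            Xb, yb = X[i:i + batch], y[i:i + batch]
            logits = Xb @ Theta                        # (b, k) linear hypothesis
            Z = np.exp(logits - logits.max(axis=1, keepdims=True))
            Z /= Z.sum(axis=1, keepdims=True)          # softmax probabilities
            Iy = np.zeros_like(Z)
            Iy[np.arange(Xb.shape[0]), yb] = 1         # one-hot labels
            grad = Xb.T @ (Z - Iy) / Xb.shape[0]       # (n, k) gradient
            Theta -= lr * grad                         # SGD step
    return Theta

Predictions are then np.argmax(X @ Theta, axis=1).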

Sign up for the course for free at http://dlsyscourse.org.

Errata:
1:14:24 - X should be m x n, not m x k. The final gradient expression is correct (it uses the right size); the annotation contained a typo.
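
For reference, the final expression with the corrected shapes, assuming the batch notation of the slides with $X \in \mathbb{R}^{m \times n}$ and $\Theta \in \mathbb{R}^{n \times k}$:

\nabla_\Theta \ell_{\mathrm{ce}}(X\Theta, y) = \frac{1}{m} X^T (Z - I_y), \qquad Z = \mathrm{normalize}(\exp(X\Theta)) \in \mathbb{R}^{m \times k},

where $I_y \in \mathbb{R}^{m \times k}$ is the one-hot encoding of the labels, so $X^T (Z - I_y)$ has the $n \times k$ shape of $\Theta$.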

Contents:
00:00 - Introduction
01:08 - Machine learning and data-driven programming
05:34 - Three ingredients of a machine learning algorithm
08:40 - Multi-class classification setting
12:04 - Linear hypothesis function
16:52 - Matrix batch notation
22:34 - Loss function #1: classification error
26:44 - Loss function #2: softmax / cross-entropy loss
35:28 - The softmax regression optimization problem
39:16 - Optimization: gradient descent
50:35 - Stochastic gradient descent
55:26 - The gradient of the softmax objective
1:08:16 - The slide I'm embarrassed to include...
1:16:49 - Putting it all together
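
For quick reference, the softmax / cross-entropy loss introduced at 26:44, written for a hypothesis h(x) with k outputs (notation assumed to match the slides):

\ell_{\mathrm{ce}}(h(x), y) = -\log \frac{\exp(h_y(x))}{\sum_{j=1}^{k} \exp(h_j(x))} = -h_y(x) + \log \sum_{j=1}^{k} \exp(h_j(x))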
