11. Subgradient Descent


Neither the lasso nor the SVM objective function is differentiable, and in each case we had to do extra work to optimize with gradient-based methods. It turns out, however, that gradient descent essentially still works in these situations, so long as you handle the non-differentiable points carefully: at such a point we replace the gradient with a subgradient, any vector that defines a linear lower bound on the objective. To this end, we introduce "subgradient descent", and we show the surprising result that, even though the objective value may not decrease with each step, every step (with a suitably small step size) brings us closer to the minimizer.
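As a concrete illustration (not taken from the lecture itself), here is a minimal Python sketch of subgradient descent on the lasso objective f(w) = (1/2n)||Xw - y||^2 + lam*||w||_1, whose L1 term is non-differentiable at zero. The synthetic data, the regularization weight lam, and the 1/sqrt(k) step-size schedule are all illustrative assumptions.

import numpy as np

def lasso_objective(w, X, y, lam):
    n = len(y)
    residual = X @ w - y
    return residual @ residual / (2 * n) + lam * np.sum(np.abs(w))

def lasso_subgradient(w, X, y, lam):
    n = len(y)
    # Gradient of the smooth least-squares term.
    grad_smooth = X.T @ (X @ w - y) / n
    # np.sign(0) == 0, which is a valid element of the subdifferential
    # of |w_j| at w_j = 0 (any value in [-1, 1] would do).
    return grad_smooth + lam * np.sign(w)

def subgradient_descent(X, y, lam=0.1, steps=5000, c=1.0):
    w = np.zeros(X.shape[1])
    best_w, best_f = w.copy(), lasso_objective(w, X, y, lam)
    for k in range(steps):
        g = lasso_subgradient(w, X, y, lam)
        # Diminishing step size c / sqrt(k+1): a standard choice for
        # subgradient methods, which need shrinking steps to converge.
        w = w - (c / np.sqrt(k + 1)) * g
        f = lasso_objective(w, X, y, lam)
        # Keep the best iterate seen so far, since f(w_k) is not
        # guaranteed to decrease monotonically.
        if f < best_f:
            best_w, best_f = w.copy(), f
    return best_w, best_f

# Tiny usage example with synthetic sparse-regression data.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
w_true = rng.normal(size=10) * (rng.random(10) > 0.5)
y = X @ w_true + 0.1 * rng.normal(size=100)
w_hat, f_hat = subgradient_descent(X, y)

Tracking the best iterate is the key design choice here: because individual subgradient steps can increase the objective, the standard practice is to report the best point visited rather than the final one.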

This mathematically intense lecture may be safely skipped.

Access the full course at https://bloom.bg/2ui2T4q
