Lecture 3 (Part I) - "Manual" Neural Networks

Lecture 3 (Part I) of the online course Deep Learning Systems: Algorithms and Implementation.

This lecture discusses the nature of simple networks, such as two-layer fully-connected networks and more general "multi-layer perceptrons." The lecture explains the motivation behind and nature of this particular form of hypothesis class, then derives the backpropagation algorithm in its "manual" form (i.e., without using automatic differentiation).
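For a concrete picture of what "manual" backpropagation means here, the following is a minimal sketch (not the course's own code) for a two-layer ReLU network with softmax loss; the names X, y, W1, W2 and manual_backprop are illustrative assumptions, not identifiers from the lecture:

import numpy as np

def manual_backprop(X, y, W1, W2):
    # X: (n, d) inputs; y: (n,) integer labels; W1: (d, k); W2: (k, m).
    # Forward pass: hidden activations and class logits.
    Z1 = np.maximum(X @ W1, 0)   # ReLU(X W1)
    logits = Z1 @ W2

    # Softmax probabilities (logits shifted for numerical stability).
    S = np.exp(logits - logits.max(axis=1, keepdims=True))
    S /= S.sum(axis=1, keepdims=True)

    # Gradient of the average softmax loss w.r.t. the logits: S - I_y.
    n = X.shape[0]
    G2 = S
    G2[np.arange(n), y] -= 1.0

    # Backpropagate through the second layer, gating by the ReLU mask.
    G1 = (G2 @ W2.T) * (Z1 > 0)

    # Gradients with respect to each weight matrix.
    grad_W1 = X.T @ G1 / n
    grad_W2 = Z1.T @ G2 / n
    return grad_W1, grad_W2

Every gradient above is written out by hand from the chain rule; an automatic differentiation framework (the subject of later lectures) would produce them for you.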

Sign up for the course for free at http://dlsyscourse.org.

Contents:
00:00 - Introduction
02:32 - The trouble with linear hypothesis classes
04:13 - What about nonlinear classification boundaries?
09:31 - How do we create features?
12:37 - Nonlinear features
18:28 - Neural networks / deep learning
22:45 - The "two layer" neural network
27:36 - Universal function approximation
42:26 - Fully-connected deep networks
49:06 - Why deep networks?
