Back Propagation in training neural networks step by step

This video follows on from the previous video Neural Networks: Part 1 - Forward Propagation.

I present a simple numerical example of how backpropagation works.

0:00 Introduction
0:35 Our silly dataset
0:55 Recap of forward propagation
2:00 Backpropagation beginning
3:00 Intuition behind backpropagation
4:45 Backprop is carried out using gradient descent
4:50 What is gradient descent?
7:00 What is a partial derivative?
7:30 What is a cost function?
8:05 Partial derivative formula using the chain rule
13:35 Update the weights and biases using gradient descent
14:00 What is a learning rate?
14:10 Gradient descent formula and full examples
24:26 Updated weights
25:00 Stochastic gradient descent
26:30 What is an epoch?
27:10 Unresolved questions: learning rate, stochastic gradient descent, activation function
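The steps in the chapters above (forward propagation, the chain-rule partial derivatives of the cost, and the gradient descent weight update) can be sketched for a single neuron. The specific numbers here (input, target, initial weight and bias, learning rate) are illustrative assumptions, not the exact values used in the video:

```python
import math

def sigmoid(z):
    # The activation function assumed here; the derivative below matches it.
    return 1.0 / (1.0 + math.exp(-z))

x, y = 1.5, 0.0          # one training example: input and target (assumed values)
w, b = 0.8, 0.2          # initial weight and bias (assumed values)
lr = 0.1                 # learning rate (assumed value)

# Forward propagation
z = w * x + b
a = sigmoid(z)
cost = (a - y) ** 2      # squared-error cost for this single example

# Backpropagation: chain rule, dC/dw = dC/da * da/dz * dz/dw
dC_da = 2 * (a - y)
da_dz = a * (1 - a)      # derivative of the sigmoid
dC_dw = dC_da * da_dz * x
dC_db = dC_da * da_dz * 1

# Gradient descent update
w = w - lr * dC_dw
b = b - lr * dC_db
```

Because the prediction `a` overshoots the target `y = 0`, both gradients are positive and the update nudges `w` and `b` downward; repeating this loop over the dataset (one full pass is an epoch) drives the cost toward a minimum.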
