Batch Normalization - Part 2: How it works & Essence of Beta & Gamma

We have been discussing Batch Normalization in detail. In the previous video, we saw why we need Batch Normalization. In this video, we will dig deeper into how Batch Normalization works and also understand the significance of the learnable parameters Gamma and Beta, which perform scaling and shifting.
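
As a quick illustration of the mechanics covered in the video, here is a minimal NumPy sketch of the batch-norm forward pass; the function name batch_norm_forward and the toy mini-batch are illustrative assumptions, not taken from the video:

import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    # Per-feature mean and variance computed over the mini-batch (axis 0)
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    # Standardize each feature: zero mean, unit variance
    x_hat = (x - mu) / np.sqrt(var + eps)
    # Scale and shift with the learnable parameters Gamma and Beta
    return gamma * x_hat + beta

# Toy mini-batch: 4 examples, 3 features (illustrative values)
x = np.array([[1.0, 50.0, -2.0],
              [2.0, 60.0, -1.0],
              [3.0, 70.0,  0.0],
              [4.0, 80.0,  1.0]])

# gamma = 1, beta = 0 reproduces plain standardization; training can move
# them away from these values, letting the network undo the normalization
# wherever that helps learning
gamma = np.ones(3)
beta = np.zeros(3)
print(batch_norm_forward(x, gamma, beta))

With gamma fixed at 1 and beta at 0, the output is just the standardized batch; because both are learnable, the network can even recover the original activations (roughly gamma = sqrt(var), beta = mu), which is the point of the "Beauty of Gamma and Beta parameters" chapter below.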

Deep Learning Projects playlist:
   • Deep Learning Projects  

Neural Networks From Scratch in Python:
   • Neural Networks From Scratch in Python  


Chapters:
00:00 Introduction
00:27 Normalization Vs Standardization
01:26 Recap of Why we need BN
05:40 How Batch Normalization works
07:15 Batch Vs Mini-batch
10:43 Visual Explanation of BN
13:52 Why we need Beta and Gamma in BN
17:25 Common Misconception
19:42 Beauty of Gamma and Beta parameters
23:18 Summary & Next video topics

#batchnormalization #normalization #neuralnetworks #internalcovariateshift #covariance #normaldistribution
