PYTORCH COMMON MISTAKES - How To Save Time 🕒
In this video I show you 10 common PyTorch mistakes; avoiding these will save you a lot of time debugging models. The list was inspired by a tweet by Andrej Karpathy, which is why I said it was approved by him :)

Andrej Karpathy Tweet:
  / 1013244313327681536  

❤️ Support the channel ❤️
   / @aladdinpersson  

Paid Courses I recommend for learning (affiliate links, no extra cost for you):
⭐ Machine Learning Specialization https://bit.ly/3hjTBBt
⭐ Deep Learning Specialization https://bit.ly/3YcUkoI
📘 MLOps Specialization http://bit.ly/3wibaWy
📘 GAN Specialization https://bit.ly/3FmnZDl
📘 NLP Specialization http://bit.ly/3GXoQuP

✨ Free Resources that are great:
NLP: https://web.stanford.edu/class/cs224n/
CV: http://cs231n.stanford.edu/
Deployment: https://fullstackdeeplearning.com/
FastAI: https://www.fast.ai/

💻 My Deep Learning Setup and Recording Setup:
https://www.amazon.com/shop/aladdinpe...

GitHub Repository:
https://github.com/aladdinpersson/Mac...

✅ One-Time Donations:
Paypal: https://bit.ly/3buoRYH

▶️ You Can Connect with me on:
Twitter -   / aladdinpersson  
LinkedIn -   / aladdin-persson-a95384153  
Github - https://github.com/aladdinpersson

OUTLINE:
0:00 - Introduction
0:21 - 1. Didn't overfit a single batch
2:45 - 2. Forgot to toggle train/eval
4:47 - 3. Forgot .zero_grad()
6:15 - 4. Softmax when using CrossEntropy
8:09 - 5. Bias term with BatchNorm
9:54 - 6. Using view as permute
12:10 - 7. Incorrect Data Augmentation
14:19 - 8. Not Shuffling Data
15:28 - 9. Not Normalizing Data
17:28 - 10. Not Clipping Gradients
18:40 - Which ones did I miss?
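
For reference, below is a minimal, self-contained sketch (not the code from the video; model, data shapes, and hyperparameters are made up for illustration) that touches several of the points above: overfitting a single batch first, toggling train/eval, calling optimizer.zero_grad(), feeding raw logits to CrossEntropyLoss (no softmax), shuffling, gradient clipping, view vs. permute, and dropping the bias before BatchNorm.

import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Toy data: 64 samples of 784 features, 10 classes (hypothetical shapes).
x = torch.randn(64, 784)
y = torch.randint(0, 10, (64,))
loader = DataLoader(TensorDataset(x, y), batch_size=32, shuffle=True)  # 8. shuffle the training data

model = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))
criterion = nn.CrossEntropyLoss()  # 4. expects raw logits, so no softmax at the end of the model
optimizer = torch.optim.Adam(model.parameters(), lr=3e-4)

# 1. Sanity check: overfit a single batch before training on the full dataset.
xb, yb = next(iter(loader))
for _ in range(100):
    optimizer.zero_grad()                                        # 3. reset gradients every step
    loss = criterion(model(xb), yb)
    loss.backward()
    nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)   # 10. clip gradients
    optimizer.step()
print(f"single-batch loss after overfitting: {loss.item():.4f}")  # should be close to 0

# 2. Toggle modes: train() while training, eval() (plus no_grad) while evaluating.
model.eval()
with torch.no_grad():
    preds = model(x).argmax(dim=1)
    print(f"accuracy on the toy data: {(preds == y).float().mean().item():.2%}")
model.train()

# 6. view is not permute: view/reshape keeps the storage order, permute reorders axes.
t = torch.arange(6).reshape(2, 3)
print(t.permute(1, 0))  # an actual transpose
print(t.view(3, 2))     # same element order, different shape -- not a transpose

# 5. A layer followed by BatchNorm does not need its own bias (BatchNorm has a shift term).
conv_block = nn.Sequential(nn.Conv2d(3, 16, 3, bias=False), nn.BatchNorm2d(16), nn.ReLU())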
