Hyperparameter Optimization - The Math of Intelligence #7

Hyperparameters are the magic numbers of machine learning. We're going to learn how to find them in a more intelligent way than just trial and error. We'll go over grid search, random search, and Bayesian optimization. I'll also cover the difference between Bayesian and frequentist probability.
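To make the first two strategies concrete, here's a minimal sketch comparing grid search and random search on a toy objective. The `loss` function is a hypothetical stand-in for training a model and measuring its validation error; the hyperparameter names (`lr`, `reg`) and their ranges are illustrative assumptions, not taken from the video's code.

```python
import itertools
import random

# Toy objective: pretend validation loss as a function of two
# hyperparameters (learning rate and regularization strength).
# In practice this would train a model and return validation error.
def loss(lr, reg):
    return (lr - 0.1) ** 2 + (reg - 0.01) ** 2

# Grid search: exhaustively evaluate every combination of a fixed
# set of candidate values.
lrs = [0.001, 0.01, 0.1, 1.0]
regs = [0.0001, 0.001, 0.01, 0.1]
grid_best = min(itertools.product(lrs, regs), key=lambda p: loss(*p))

# Random search: draw the same number of points uniformly from
# continuous ranges instead of a fixed grid.
random.seed(0)
samples = [(random.uniform(0.001, 1.0), random.uniform(0.0001, 0.1))
           for _ in range(len(lrs) * len(regs))]
random_best = min(samples, key=lambda p: loss(*p))

print("grid search best:  ", grid_best)
print("random search best:", random_best)
```

Both use the same evaluation budget (16 runs), but random search isn't locked to the grid's candidate values, which is why it often wins when only a few hyperparameters really matter. Bayesian optimization goes one step further by fitting a probabilistic model to past evaluations and choosing the next point to try, rather than picking it blindly.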

Code for this video: https://github.com/llSourcell/hyperpa...

Noah's Winning Code:
https://github.com/NoahLidell/math-of...

Hammad's Runner-up Code:
https://github.com/hammadshaikhha/Mat...

More learning resources:
https://www.iro.umontreal.ca/~bengioy...
https://thuijskens.github.io/2016/12/...
https://jmhessel.github.io/Bayesian-O...
https://arimo.com/data-science/2016/b...
https://dhnzl.files.wordpress.com/201...
http://blog.revolutionanalytics.com/2...
https://nlpers.blogspot.nl/2014/10/hy...
http://neupy.com/2016/12/17/hyperpara...

Join us in the Wizards Slack channel:
http://wizards.herokuapp.com/

And please support me on Patreon:
https://www.patreon.com/user?u=3191693

Thanks to Veritasium (bayesian animation) & Angela Schoellig (drone clip)
Follow me:
Twitter: / sirajraval
Facebook: / sirajology
Instagram: / sirajraval
Signup for my newsletter for exciting updates in the field of AI:
https://goo.gl/FZzJ5w
Hit the Join button above to sign up to become a member of my channel for access to exclusive content!
Join my AI community: http://chatgptschool.io/
Sign up for my AI sports betting bot, WagerGPT (500 spots available):
https://www.wagergpt.co
