Deep Neural Networks in Julia With Flux.jl | Talk Julia #10

David and Randy explore deep neural networks in Julia using Flux.jl by recreating Grant Sanderson's model for classifying handwritten digits from the MNIST data set. We also show how to visualize model results and training performance in TensorBoard using the TensorBoardLogger.jl package.
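For anyone who wants to follow along, below is a minimal sketch of the kind of setup discussed in the episode: a 784-16-16-10 network like the one in Grant Sanderson's videos, trained on MNIST with Flux.jl and logging scalars to TensorBoard through TensorBoardLogger.jl. This is not the code from the episode; the hyperparameters, sigmoid activations, log directory, and the Flux 0.13-era implicit-parameter training loop are assumptions for illustration.

using Flux, MLDatasets, TensorBoardLogger, Logging
using Flux: onehotbatch, onecold, DataLoader
using Flux.Losses: crossentropy
using Statistics: mean

# Load MNIST (MLDatasets >= 0.7 API) and flatten each 28x28 image into a 784-vector
train = MLDatasets.MNIST(split=:train)
test  = MLDatasets.MNIST(split=:test)
xtrain = Flux.flatten(Float32.(train.features))    # 784 x 60000
ytrain = onehotbatch(train.targets, 0:9)           # 10 x 60000
xtest  = Flux.flatten(Float32.(test.features))
ytest  = onehotbatch(test.targets, 0:9)
loader = DataLoader((xtrain, ytrain), batchsize=128, shuffle=true)

# The 784 -> 16 -> 16 -> 10 network from the 3blue1brown videos
model = Chain(
    Dense(784 => 16, sigmoid),
    Dense(16 => 16, sigmoid),
    Dense(16 => 10),
    softmax,
)

loss(x, y) = crossentropy(model(x), y)
accuracy(x, y) = mean(onecold(model(x)) .== onecold(y))

# TensorBoardLogger.jl writes standard Julia log records as TensorBoard summaries
tblogger = TBLogger("tensorboard_logs/mnist", min_level=Logging.Info)

opt = ADAM(3e-4)
ps  = Flux.params(model)

for epoch in 1:10
    for (x, y) in loader
        gs = gradient(() -> loss(x, y), ps)
        Flux.Optimise.update!(opt, ps, gs)
    end
    train_loss = loss(xtrain, ytrain)
    test_acc   = accuracy(xtest, ytest)
    # Each @info call inside with_logger records one scalar data point per variable
    with_logger(tblogger) do
        @info "metrics" train_loss test_acc
    end
    println("epoch $epoch: train loss = $train_loss, test accuracy = $test_acc")
end

After training, point TensorBoard at the log directory (tensorboard --logdir tensorboard_logs) and open the Scalars tab to see the curves for train_loss and test_acc.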

Episode links are available in the show notes on our website ➡ https://www.talkjulia.com/10


EPISODE CHAPTERS
00:00 — Hello
00:27 — 3blue1brown videos on neural networks
03:45 — Training a neural network on the MNIST data set
27:17 — A quick introduction to TensorBoard
28:48 — Setting up TensorBoardLogger.jl
33:33 — Viewing results in TensorBoard
36:47 — Goodbye


ABOUT THE SHOW
Talk Julia is a weekly podcast devoted to the Julia programming language. Join hosts David Amos and Randy Davila as we explore Julia news and resources, learn Julia for ourselves, and share everything we've learned along the way.

Check out our website ➡ https://www.talkjulia.com
Connect with us on Twitter ➡ https://twitter.com/talkjuliapod


✨ HELP SUPPORT THE SHOW ✨
If you get value from listening to our podcast each week and would like to support us, you can help us out for free by simply telling someone you know to check out our show.

But if you'd like to support us financially then... Wow! Thank you so much 🙏🏼 Here are two ways to do that:

Tip us on Ko-Fi ➡ https://ko-fi.com/talkjuliapodcast
Set up a recurring donation on Liberapay ➡ https://liberapay.com/TalkJulia/

For other ways to support the show, see https://www.talkjulia.com/donate


OTHER LINKS
David's YouTube channel ➡ /davidamos
Randy's YouTube channel ➡ /randydavila1

#JuliaLang #MachineLearning #FluxML
