Synthetic Gradients Tutorial - How to Speed Up Deep Learning Training


Synthetic Gradients were introduced in 2016 by Max Jaderberg and other researchers at DeepMind. Instead of waiting for the true gradients to be backpropagated from the top of the network, each layer uses a small auxiliary model that predicts them. This decouples the layers so they can update their weights without waiting for the full backward pass, which can make all sorts of deep neural networks much faster to train, and sometimes even improve performance. Synthetic gradients can also help Recurrent Neural Networks learn long-term patterns in the data.
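
To make the core idea concrete, here is a minimal PyTorch sketch (not the paper's code: the toy two-layer regression setup, the grad_model name, and the training loop are illustrative assumptions). A small "synthetic gradient" model learns to predict the gradient of the loss with respect to a layer's activations; that layer then updates immediately from the prediction, while the predictor itself is trained to match the true gradient once it becomes available:

import torch
import torch.nn as nn

# Hypothetical toy setup: layer1 is updated with a *predicted* gradient
# instead of waiting for the true gradient from layer2's backward pass.
torch.manual_seed(0)
layer1 = nn.Sequential(nn.Linear(10, 32), nn.ReLU())
layer2 = nn.Linear(32, 1)
grad_model = nn.Linear(32, 32)  # M(h): predicts dLoss/dh from the activation h

opt1 = torch.optim.SGD(layer1.parameters(), lr=0.01)
opt2 = torch.optim.SGD(list(layer2.parameters()) + list(grad_model.parameters()), lr=0.01)
loss_fn = nn.MSELoss()

x = torch.randn(64, 10)  # dummy data for the sketch
y = torch.randn(64, 1)

for step in range(100):
    # --- Update layer1 right away, using the synthetic gradient ---
    h = layer1(x)
    synthetic_grad = grad_model(h.detach())   # predicted dLoss/dh
    opt1.zero_grad()
    h.backward(synthetic_grad.detach())       # backprop the *predicted* gradient
    opt1.step()

    # --- Later (conceptually in parallel), compute the true gradient ---
    h2 = h.detach().requires_grad_(True)
    loss = loss_fn(layer2(h2), y)
    opt2.zero_grad()
    loss.backward()                           # h2.grad now holds the true dLoss/dh
    # Train M to regress toward the true gradient.
    grad_loss = nn.functional.mse_loss(grad_model(h2.detach()), h2.grad.detach())
    grad_loss.backward()
    opt2.step()

In the real DNI setup the two halves run asynchronously, which is where the speedup comes from; this sketch runs them sequentially only to keep the example self-contained.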

Papers:
Decoupled Neural Interfaces using Synthetic Gradients, Max Jaderberg et al., 2016: https://arxiv.org/abs/1608.05343
Understanding Synthetic Gradients and Decoupled Neural Interfaces, Wojciech Marian Czarnecki et al., 2017: https://arxiv.org/abs/1703.00522

Blog posts:
By Max Jaderberg, DeepMind: https://deepmind.com/blog/decoupled-n...
By iamtrask: https://iamtrask.github.io/2017/03/21...

Implementations:
DNI-TensorFlow by Andrew Liao: https://github.com/vyraun/DNI-tensorflow
Jupyter Notebook with TensorFlow by Nitarshan Rajkumar: https://github.com/nitarshan/decouple...
DNI.PyTorch by Andrew Liao: https://github.com/andrewliao11/dni.p...

Slides:
https://www.slideshare.net/aurelienge...

The painting on the first slide is by Annie Clavel, a great French artist currently living in Los Angeles. Visit her website: http://www.annieclavel.com/.
