What is Automatic Differentiation?


This short tutorial covers the basics of automatic differentiation, a set of techniques that allow us to efficiently compute derivatives of functions implemented as programs. It is based in part on Baydin et al., 2018: Automatic Differentiation in Machine Learning: A Survey (https://arxiv.org/abs/1502.05767).
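
For a concrete picture, here is a minimal sketch (not the implementation from the video) of forward-mode AD via dual numbers in Python: each value carries a tangent ("dot") component that the chain rule propagates alongside the primal value.

import math

class Dual:
    """A value paired with its tangent (directional derivative)."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)

def sin(x):
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

# d/dx1 of f(x1, x2) = x1*x2 + sin(x1) at (x1, x2) = (2, 3):
x1, x2 = Dual(2.0, 1.0), Dual(3.0, 0.0)  # seed dx1 = 1, dx2 = 0
y = x1 * x2 + sin(x1)
print(y.val, y.dot)  # y.dot equals x2 + cos(x1)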

Errata:
At 6:23 (bottom right), the tangent update should read v̇6 = v̇5*v4 + v̇4*v5 (with a "+" rather than a "-").
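
The corrected line is just the product rule applied to tangents. A tiny check, with made-up primal and tangent values (not the ones from the video):

# If v6 = v5 * v4, the product rule gives v̇6 = v̇5*v4 + v̇4*v5 (a "+", not a "-").
v4, v5 = 0.284, 1.716        # illustrative primal values
v4_dot, v5_dot = 0.5, 1.0    # illustrative tangent values
v6 = v5 * v4
v6_dot = v5_dot * v4 + v4_dot * v5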

Additional references:
Griewank & Walther, 2008: Evaluating Derivatives: Principles and Techniques of Algorithmic Differentiation (https://dl.acm.org/doi/book/10.5555/1...)
Adams, 2018: COS 324 – Computing Gradients with Backpropagation (https://www.cs.princeton.edu/courses/...)
Grosse, 2018: CSC 321 – Lecture 10: Automatic Differentiation (https://www.cs.toronto.edu/~rgrosse/c...)
Pearlmutter, 1994: Fast exact multiplication by the Hessian (http://www.bcl.hamilton.ie/~barak/pap...)

Alleviating memory requirements of reverse mode:
Griewank & Walther, 2000: Algorithm 799: revolve: an implementation of checkpointing for the reverse or adjoint mode of computational differentiation (https://dl.acm.org/doi/10.1145/347837...)
Dauvergne & Hascoët, 2006: The data-flow equations of checkpointing in reverse automatic differentiation (https://link.springer.com/chapter/10....)
Chen, T et al., 2016: Training Deep Nets with Sublinear Memory Cost (https://arxiv.org/abs/1604.06174)
Gruslys et al., 2016: Memory-efficient Backpropagation Through Time (https://arxiv.org/abs/1606.03401)
Siskind & Pearlmutter. Divide-and-conquer checkpointing for arbitrary programs with no user annotation (https://arxiv.org/abs/1708.06799)
Oktay et al., 2020: Randomized Automatic Differentiation (https://arxiv.org/abs/2007.10412)
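
A recurring idea in the checkpointing references above is to store only a subset of intermediate values and recompute the rest during the backward pass. A hedged sketch of that trade-off using PyTorch's torch.utils.checkpoint (the layer sizes and data here are arbitrary):

import torch
from torch import nn
from torch.utils.checkpoint import checkpoint

block = nn.Sequential(nn.Linear(128, 128), nn.ReLU(), nn.Linear(128, 128))
x = torch.randn(32, 128, requires_grad=True)

# Activations inside `block` are not saved during the forward pass;
# they are recomputed when backward() runs, trading compute for memory.
y = checkpoint(block, x, use_reentrant=False)
loss = y.sum()
loss.backward()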

Example software libraries using various implementation routes:

Source code transformation:
Tangent – https://github.com/google/tangent
Zygote – https://github.com/FluxML/Zygote.jl

Operator overloading:
Autograd – https://github.com/HIPS/autograd
Jax – https://github.com/google/jax
PyTorch – https://pytorch.org/

Graph-based with embedded mini-language:
TensorFlow – https://www.tensorflow.org
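
As a small illustration of the operator-overloading route listed above, here is a sketch using Autograd's public grad function (JAX and PyTorch expose analogous APIs); the function f is just an arbitrary example:

import autograd.numpy as np   # numpy wrapper that records operations as they run
from autograd import grad

def f(x):
    return np.sin(x) * x + x**2

df = grad(f)       # reverse-mode derivative of the scalar function f
print(df(1.5))     # equals cos(1.5)*1.5 + sin(1.5) + 2*1.5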


Special thanks to Ryan Adams, Alex Beatson, Geoffrey Roeder, Greg Gundersen, and Deniz Oktay for feedback on this video.

Some of the animations in this video were created with 3Blue1Brown's manim library (https://github.com/3b1b/manim).

Music: Trinkets by Vincent Rubinetti

Links:
YouTube: /ariseffai
Twitter: /ari_seff
Homepage: https://www.ariseff.com

If you'd like to help support the channel (completely optional), you can donate a cup of coffee via the following:
Venmo: https://venmo.com/ariseff
PayPal: https://www.paypal.me/ariseff
