What is a vector-Jacobian product (vjp) in JAX?

Reverse-mode automatic differentiation is the essential ingredient in training artificial neural networks. This video looks at its primitive operation, the pullback, also called the vjp in the JAX deep learning framework.

The vjp intrinsic of JAX (or similar functionality in TensorFlow, PyTorch, or Julia) allows efficiently evaluating the left-multiplication of a vector with the Jacobian matrix of a vector-valued function, without ever materializing the full Jacobian. This is particularly valuable for functions with sparse Jacobians, i.e., functions with small locality.
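As a minimal sketch of the idea (the function f and the vectors below are made up purely for illustration), jax.vjp returns the primal output together with a pullback closure that computes v^T J for any cotangent vector v:

```python
import jax
import jax.numpy as jnp

# An example vector-valued function f: R^3 -> R^2
# (chosen purely for illustration)
def f(x):
    return jnp.array([x[0] * x[1], jnp.sin(x[2])])

x = jnp.array([1.0, 2.0, 3.0])

# jax.vjp evaluates f at x and returns a pullback closure
y, f_vjp = jax.vjp(f, x)

# The pullback left-multiplies a cotangent vector v with the
# Jacobian of f at x, i.e., it computes v^T @ J, without ever
# building the full Jacobian matrix.
v = jnp.array([1.0, 0.0])
(v_times_J,) = f_vjp(v)

# Sanity check against the explicitly assembled Jacobian
J = jax.jacrev(f)(x)
print(jnp.allclose(v_times_J, v @ J))  # True
```

Each call to the pullback costs roughly one reverse pass through f, which is why reverse-mode autodiff (and hence backpropagation) is built on top of this primitive rather than on full Jacobians.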

Find more details about vjp in the JAX documentation: https://jax.readthedocs.io/en/latest/...

-------

📝 : Check out the GitHub Repository of the channel, where I upload all the handwritten notes and source-code files (contributions are very welcome): https://github.com/Ceyron/machine-lea...

📢 : Follow me on LinkedIn or Twitter for updates on the channel and other cool Machine Learning & Simulation stuff:   / felix-koehler   and   / felix_m_koehler  

💸 : If you want to support my work on the channel, you can become a Patron here:   / mlsim  

🪙: Or you can make a one-time donation via PayPal: https://www.paypal.com/paypalme/Felix...

-------

Timestamps:
00:00 Intro
00:25 Vector-Valued Function
00:58 Full Jacobian by reverse-mode autodiff
02:26 Concept of vector-Jacobian product
04:04 Motivation for the vjp intrinsic
04:59 Using vjp from JAX
09:07 What is happening under the hood?
09:33 Outro
