What is a Pullback in Zygote.jl? | vector-Jacobian products in Julia


There are many great packages for reverse-mode Automatic Differentiation in the Julia language. Most of them provide the core primitive for adjoint sensitivity analysis: the vector-Jacobian product (vJp). This video covers Zygote.jl's implementation, called a "pullback".

Adjoint sensitivity methods are the backbone of modern optimization and machine learning. When applied to explicit forward computations (like classical Neural Networks), they are often referred to as Automatic Differentiation, or in a special case, "backpropagation". An essential building block of reverse-mode AD is the vJp, which is implemented by the "pullback" function in Zygote.jl.
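
To make this concrete, here is a minimal sketch of computing a vJp via Zygote.pullback (the function f, the input x, and the cotangent vector v below are made-up examples for illustration, not the code from the video):

using Zygote

# A vector-valued function R^3 -> R^2 (made up for illustration)
f(x) = [x[1]^2 + x[2], x[2] * x[3]]

x = [1.0, 2.0, 3.0]

# pullback runs the forward pass and returns the primal value y
# together with a closure that computes vector-Jacobian products
y, back = Zygote.pullback(f, x)

# For a cotangent vector v (same length as y), back(v) returns the vJp vᵀJ
# without ever building the full Jacobian J
v = [1.0, 0.0]
vjp, = back(v)   # one entry per argument of f, here just the one for x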

-------

📝 : Check out the GitHub Repository of the channel, where I upload all the handwritten notes and source-code files (contributions are very welcome): https://github.com/Ceyron/machine-lea...

📢 : Follow me on LinkedIn or Twitter for updates on the channel and other cool Machine Learning & Simulation stuff:   / felix-koehler   and   / felix_m_koehler  

💸 : If you want to support my work on the channel, you can become a Patreon here:   / mlsim  

🪙: Or you can make a one-time donation via PayPal: https://www.paypal.com/paypalme/Felix...

-------

Timestamps:
00:00 Intro
00:24 A vector-valued function
00:51 Reverse-mode AD with Zygote.jl
01:30 Obtaining full Jacobian
03:06 Concept of a vector-Jacobian product
04:29 Using Zygote.pullback
06:50 Clever vjp with pullback
09:24 Naive vjp with full & dense Jacobian
10:30 Benchmark both approaches (a code sketch of this comparison follows below)
11:25 Discussion
12:40 Outro
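
-------

The chapters from 06:50 onward contrast the two vJp strategies. Below is a minimal sketch of that comparison; the function, the problem size, and the use of BenchmarkTools are assumptions for illustration, not the video's actual code:

using Zygote, BenchmarkTools, LinearAlgebra

f(x) = tanh.(x) .* x        # some vector-valued function R^n -> R^n (made up)
n = 1_000
x = randn(n)
v = randn(n)

# Naive: materialize the full, dense n-by-n Jacobian, then multiply by v
naive_vjp(f, x, v) = Zygote.jacobian(f, x)[1]' * v

# Clever: let the pullback compute the vJp directly, no dense Jacobian needed
function pullback_vjp(f, x, v)
    _, back = Zygote.pullback(f, x)
    return back(v)[1]
end

@btime naive_vjp($f, $x, $v)      # one reverse pass per output plus O(n^2) memory
@btime pullback_vjp($f, $x, $v)   # a single reverse pass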
