Jacobian-vector product (Jvp) with ForwardDiff.jl in Julia


Jacobian-vector products (also called pushforwards) are the primitive operations of forward sensitivity analysis. Let's use the forward-mode Automatic Differentiation capabilities of the Julia language to implement them efficiently, and then benchmark their performance.
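
As a minimal sketch of the forward-mode AD building block (the concrete function below is a made-up example, not necessarily the one from the video), ForwardDiff.jl can produce the full Jacobian of a vector-valued function in one call:

```julia
using ForwardDiff  # ] add ForwardDiff

# Made-up example: a vector-valued function f: R^3 -> R^2
f(x) = [x[1] * x[2] + sin(x[3]), x[2]^2 - x[3]]

x = [1.0, 2.0, 3.0]

# Dense 2x3 Jacobian df/dx, computed via forward-mode AD
J = ForwardDiff.jacobian(f, x)
```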

A Jvp computes df/dx * v, where f is a vector-valued function of a multidimensional input and v is a vector from the input space. Instead of naively materializing the full, dense Jacobian df/dx and multiplying it with v, we can take a clever shortcut in ForwardDiff.jl: evaluate the directional derivative d/dt f(x + t*v) at t = 0, which requires only a single forward pass with dual numbers.
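
Here is a hedged sketch of both approaches, reusing the example f from above (the function, inputs, and timing setup are illustrative assumptions, not the video's exact code). The naive route builds the Jacobian and multiplies; the clever route differentiates the scalar-parametrized curve t -> f(x + t*v) with ForwardDiff.derivative:

```julia
using ForwardDiff
using BenchmarkTools  # ] add BenchmarkTools, for the timing comparison

# Naive Jvp: materialize the full Jacobian, then do a matrix-vector product
naive_jvp(f, x, v) = ForwardDiff.jacobian(f, x) * v

# Clever Jvp: directional derivative d/dt f(x + t*v) at t = 0,
# which equals df/dx * v without ever forming the Jacobian
clever_jvp(f, x, v) = ForwardDiff.derivative(t -> f(x + t * v), 0.0)

f(x) = [x[1] * x[2] + sin(x[3]), x[2]^2 - x[3]]  # same example f as above
x = [1.0, 2.0, 3.0]
v = [0.5, -1.0, 2.0]

naive_jvp(f, x, v) ≈ clever_jvp(f, x, v)  # both give df/dx * v

# Rough benchmark (exact numbers depend on machine and problem size);
# the gap widens as the input dimension grows, since the naive version
# pays for all columns of the Jacobian while the clever one pays for one pass
@btime naive_jvp($f, $x, $v)
@btime clever_jvp($f, $x, $v)
```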

-------

📝 : Check out the GitHub Repository of the channel, where I upload all the handwritten notes and source-code files (contributions are very welcome): https://github.com/Ceyron/machine-lea...

📢 : Follow me on LinkedIn or Twitter for updates on the channel and other cool Machine Learning & Simulation stuff: /felix-koehler and /felix_m_koehler

💸 : If you want to support my work on the channel, you can become a Patron here: /mlsim

🪙: Or you can make a one-time donation via PayPal: https://www.paypal.com/paypalme/Felix...

-------

Timestamps:
00:00 Intro
00:19 A vector-valued function
00:55 Forward-mode Automatic Differentiation with ForwardDiff.jl
01:29 Full Jacobian using ForwardDiff.jl
02:17 More interesting: Jacobian-vector product
02:44 Conceptually implementing a Jvp
03:31 More efficient Jvp with ForwardDiff.derivative
06:23 Function for naive Jvp
07:06 Benchmark clever vs. naive approach
07:49 Discussion
08:56 Outro
