Compute the partial derivatives of \(a\) with respect to \(x\) and \(y\), evaluated at the forward-pass values \(x = 1\) and \(y = 2\):

\begin{align*}
\pdv{a}{x} & = \pdv{x} (x + y) = \pdv{x} (x + 2) \eval{}_{y = 2} = 1 \\[1ex]
\pdv{a}{y} & = \pdv{y} (x + y) = \pdv{y} (1 + y) \eval{}_{x = 1} = 1
\end{align*}
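To make the propagation concrete, here is a minimal hand-rolled sketch of this backward pass in Python; the variable names (`da_da`, `da_dx`, `da_dy`) are illustrative, not from the original.

```python
# Minimal sketch of the backward pass for a = x + y (illustrative names).

# Forward pass: record the inputs and compute the output.
x, y = 1.0, 2.0
a = x + y  # a = 3.0

# Backward pass: seed the output gradient da/da = 1 and propagate it.
# Addition has local derivative 1 with respect to each input, so the
# seed flows through unchanged to both x and y.
da_da = 1.0
da_dx = da_da * 1.0  # ∂a/∂x = 1
da_dy = da_da * 1.0  # ∂a/∂y = 1

print(da_dx, da_dy)  # 1.0 1.0
```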
