Rudy’s OBTF Rudolf Adamkovič

Home / Deep learning / Automatic differentiation / Method / Backward pass


Compute the partial derivatives of \(b = y + 3\) with respect to \(x\) and \(y\):

\begin{align*}
\pdv{b}{x} & = \pdv{x} (y + 3) = \pdv{x} (2 + 3) \eval{}_{y = 2} = 0 \\[1ex]
\pdv{b}{y} & = \pdv{y} (y + 3) = \pdv{y} (y + 3) \eval{}_{x = 1} = 1
\end{align*}
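The same two partials fall out of a backward pass mechanically. Here is a minimal reverse-mode sketch of my own (not the page's code), assuming only what the equations above state: \(x = 1\), \(y = 2\), and \(b = y + 3\). The `Var` class and its `_backward` hooks are illustrative names, not an established API.

```python
class Var:
    """A scalar that records how to propagate gradients backward."""

    def __init__(self, value):
        self.value = value
        self.grad = 0.0
        self._backward = lambda: None  # no-op for leaves and constants

    def __add__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        out = Var(self.value + other.value)

        def _backward():
            # For out = self + other, both local derivatives are 1,
            # so the upstream gradient passes through unchanged.
            self.grad += out.grad
            other.grad += out.grad
            self._backward()
            other._backward()

        out._backward = _backward
        return out


x = Var(1.0)
y = Var(2.0)
b = y + 3      # b does not depend on x

b.grad = 1.0   # seed the output with db/db = 1
b._backward()  # run the backward pass

print(x.grad, y.grad)  # → 0.0 1.0
```

Since \(b\) never touches \(x\), no gradient ever flows into it, which is the mechanical counterpart of \(\pdv{b}{x} = 0\); the single addition routes the seed straight to \(y\), giving \(\pdv{b}{y} = 1\).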

© 2025 Rudolf Adamkovič under GNU General Public License version 3.
Made with Emacs and secret alien technologies of yesteryear.