
Backward pass


Use the chain rule to compute the partial derivatives of \(c\) with respect to \(x\) and \(y\):

\begin{align*} \pdv{c}{x} & = \pdv{c}{a} \pdv{a}{x} + \pdv{c}{b} \pdv{b}{x} = 5 \cdot 1 + 3 \cdot 0 = 5 \\[1ex] \pdv{c}{y} & = \pdv{c}{a} \pdv{a}{y} + \pdv{c}{b} \pdv{b}{y} = 5 \cdot 1 + 3 \cdot 1 = 8 \end{align*}
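The same accumulation can be written as a minimal Python sketch. The upstream gradients \(\pdv{c}{a} = 5\) and \(\pdv{c}{b} = 3\) are taken as given from the example above; the forward definitions of \(a\) and \(b\) are assumptions here, chosen only to be consistent with the local partials in the equations (e.g. \(a = x + y\) and \(b\) depending on \(y\) alone).

```python
# Reverse-mode chain rule: each input's gradient is the sum, over the
# intermediates that consume it, of (upstream gradient) * (local partial).

# Upstream gradients reaching a and b during the backward pass (given above).
dc_da, dc_db = 5.0, 3.0

# Local partials of a and b with respect to the inputs x and y.
# Assumption: these match the worked values, as if a = x + y and b
# depended only on y with unit slope.
da_dx, da_dy = 1.0, 1.0
db_dx, db_dy = 0.0, 1.0

# dc/dx = dc/da * da/dx + dc/db * db/dx, and similarly for y.
dc_dx = dc_da * da_dx + dc_db * db_dx  # 5 * 1 + 3 * 0 = 5
dc_dy = dc_da * da_dy + dc_db * db_dy  # 5 * 1 + 3 * 1 = 8

assert (dc_dx, dc_dy) == (5.0, 8.0)
print(f"dc/dx = {dc_dx}, dc/dy = {dc_dy}")
```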
