Compute the partial derivatives of \(a\) with respect to \(x\) and \(y\):
\begin{align*}
\pdv{a}{x} & = \pdv{x} (x + y) \eval{}_{y = 2} = \pdv{x} (x + 2) = 1 \\[1ex]
\pdv{a}{y} & = \pdv{y} (x + y) \eval{}_{x = 1} = \pdv{y} (1 + y) = 1
\end{align*}
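The same result falls out of a reverse-mode sweep: the addition node passes the incoming gradient unchanged to both inputs. Below is a minimal sketch in plain Python; the `Value` class and its method names are illustrative assumptions, not an API from the text.

```python
class Value:
    """Hypothetical scalar node for a minimal reverse-mode AD sketch."""

    def __init__(self, data):
        self.data = data          # forward value
        self.grad = 0.0           # accumulated gradient, filled in backward pass
        self._backward = lambda: None

    def __add__(self, other):
        out = Value(self.data + other.data)

        def _backward():
            # For a = x + y: da/dx = 1 and da/dy = 1,
            # so the upstream gradient flows through unchanged.
            self.grad += out.grad
            other.grad += out.grad

        out._backward = _backward
        return out

    def backward(self):
        # Seed the output gradient and propagate one step
        # (a full implementation would walk the graph in reverse
        # topological order; one op is enough here).
        self.grad = 1.0
        self._backward()


x = Value(1.0)
y = Value(2.0)
a = x + y      # forward pass: a = 3
a.backward()   # backward pass
print(x.grad, y.grad)  # 1.0 1.0
```

Both gradients come out as 1, matching the derivatives above: evaluating at \(x = 1\), \(y = 2\) changes the forward value but not the gradient of a sum.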