
What is the Jacobian matrix of $f(x)= Ax$?

S. M. Saeed Damadi

In this short post we are going to find the Jacobian matrix of $f(x)= Ax$ where $f: \mathbb{R}^n \rightarrow \mathbb{R}^m$, $x \in \mathbb{R}^n$, and $A \in \mathbb{R}^{m \times n}$.

Why is this function so important?

  1. It is a linear function.
  2. It appears in neural networks when we want to compute the loss function.
  3. It generalizes $f(x)= ax$, where $x$ and $a$ are scalars.

As I explained here, in order to find the Jacobian matrix we need a vector-valued function, which we have; but we also need to represent each coordinate of $f$ as a function of $x$, so we do the following:

$$
\begin{align}
f(x)&= Ax =
\begin{bmatrix}
A_{1\bullet}\\
A_{2\bullet}\\
\vdots\\
A_{m\bullet}\\
\end{bmatrix}
x
=
\begin{bmatrix}
A_{1\bullet}x\\
A_{2\bullet}x\\
\vdots\\
A_{m\bullet}x\\
\end{bmatrix}=
\begin{bmatrix}
f_1(x)\\
f_2(x)\\
\vdots\\
f_m(x)\\
\end{bmatrix}\\
&=
\begin{bmatrix}
a_{11}x_1 + a_{12}x_2 + \cdots + a_{1n}x_n\\
a_{21}x_1 + a_{22}x_2 + \cdots + a_{2n}x_n\\
\vdots\\
a_{m1}x_1 + a_{m2}x_2 + \cdots + a_{mn}x_n\\
\end{bmatrix}\\
\end{align}
$$

where $A_{i\bullet}$ is the $i$-th row of $A$ and $x = [x_1, x_2, \cdots, x_n]^{\top}$.
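
As a quick sanity check of this row-wise view, here is a minimal sketch in JAX (the sizes $m = 2$, $n = 3$ and the entries of $A$ and $x$ are arbitrary choices for illustration):

```python
import jax.numpy as jnp

# Arbitrary example with m = 2, n = 3
A = jnp.array([[1.0, 2.0, 3.0],
               [4.0, 5.0, 6.0]])
x = jnp.array([0.5, -1.0, 2.0])

# Stack the coordinate functions f_i(x) = A_{i.} x (row i of A dotted with x)
coordinatewise = jnp.stack([A[i] @ x for i in range(A.shape[0])])

print(jnp.allclose(A @ x, coordinatewise))  # True: (Ax)_i = A_{i.} x
```
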
Since $f_i(x) = A_{i\bullet}x = \sum_{j=1}^{n} a_{ij}x_j$, the partial derivative of $f_i$ with respect to $x_j$ is simply $a_{ij}$. Hence, according to what we discussed in the other post,
$$
J_f(x) = \begin{bmatrix}
a_{11} & a_{12} & \cdots & a_{1n}\\
a_{21} & a_{22} & \cdots & a_{2n}\\
\vdots & \vdots & \ddots & \vdots\\
a_{m1} & a_{m2} & \cdots & a_{mn}\\
\end{bmatrix}
=A
$$
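
This result can also be checked numerically with automatic differentiation. Below is a minimal sketch using JAX's `jax.jacfwd`; the particular $A$ and $x$ are arbitrary, since the identity $J_f(x) = A$ holds for any sizes and at any point:

```python
import jax
import jax.numpy as jnp

# Arbitrary example with m = 2, n = 3
A = jnp.array([[1.0, 2.0, 3.0],
               [4.0, 5.0, 6.0]])

def f(x):
    return A @ x                 # f(x) = Ax

x = jnp.array([0.5, -1.0, 2.0])  # any point works; the Jacobian is constant

J = jax.jacfwd(f)(x)             # (m, n) Jacobian of f evaluated at x
print(jnp.allclose(J, A))        # True: J_f(x) = A
```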

Note: It does not make any difference if we add a constant vector $b$ to $Ax$; that is, if $f(x) = Ax + b$, then $J_f(x) = A$ as well, since the constant $b$ contributes nothing to the partial derivatives.
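
The same kind of check confirms this note: adding a constant vector leaves the Jacobian unchanged (again a minimal sketch; the entries of $A$, $b$, and $x$ are arbitrary):

```python
import jax
import jax.numpy as jnp

A = jnp.array([[1.0, 2.0, 3.0],
               [4.0, 5.0, 6.0]])
b = jnp.array([7.0, -2.0])
x = jnp.array([0.5, -1.0, 2.0])

def g(x):
    return A @ x + b             # g(x) = Ax + b

print(jnp.allclose(jax.jacfwd(g)(x), A))  # True: the constant b contributes nothing
```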