S. M. Saeed Damadi (6)

Creating a nonlinear neural network and finding the Jacobian matrix of its function

The next step towards finding the loss function of a neural network is to extend the results we found here by adding a bias to the linear function, i.e., creating $W^{\top}x+b$ where $x \in \mathbb{R}^n$, $b \in \mathbb{R}^m$, and $W \in \mathbb{R}^{n \times m}$. This can be done easily because…
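The affine map above can be sketched numerically. The dimensions and random values below are illustrative assumptions, not taken from the post:

```python
import numpy as np

# Illustrative dimensions (assumed): n inputs, m outputs.
n, m = 4, 3
rng = np.random.default_rng(0)

W = rng.standard_normal((n, m))  # W in R^{n x m}
b = rng.standard_normal(m)       # bias b in R^m
x = rng.standard_normal(n)       # input x in R^n

# Affine function f(x) = W^T x + b maps R^n into R^m.
y = W.T @ x + b
print(y.shape)  # (3,)
```

Note how the shapes line up: $W^{\top}$ is $m \times n$, so $W^{\top}x$ lands in $\mathbb{R}^m$, the same space as $b$.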

Linear neural network to create $f(x)= W^{\top}x$

Goal: In this post we go through the very first step that helps us simplify the algebraic notation for representing the function computed by a fully connected neural network. Following this step helps you master the shorthand notation used to express the operations happening in…
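The shorthand the post refers to can be illustrated by comparing a coordinate-by-coordinate computation with the compact matrix form $f(x)=W^{\top}x$. The small dimensions below are assumed for illustration:

```python
import numpy as np

# Assumed small dimensions for illustration.
n, m = 4, 3
rng = np.random.default_rng(1)
W = rng.standard_normal((n, m))
x = rng.standard_normal(n)

# Coordinate-wise computation: y_j = sum_i W[i, j] * x[i].
y_loops = np.array([sum(W[i, j] * x[i] for i in range(n)) for j in range(m)])

# The same operation in shorthand matrix notation: f(x) = W^T x.
y_matrix = W.T @ x

print(np.allclose(y_loops, y_matrix))  # True
```

The matrix form says in one expression what the double loop spells out element by element, which is exactly the point of the shorthand.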

What is the Jacobian matrix of $f(x)= Ax$?

In this short post we find the Jacobian matrix of $f(x)= Ax$, where $f: \mathbb{R}^n \rightarrow \mathbb{R}^m$, $x \in \mathbb{R}^n$, and $A \in \mathbb{R}^{m \times n}$. As I explained here, in order to find the Jacobian matrix we need a vector-valued function…
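For a linear map $f(x)=Ax$ the Jacobian is the constant matrix $A$ itself, and this can be checked numerically with finite differences. The dimensions below are illustrative assumptions:

```python
import numpy as np

# For f(x) = A x, the Jacobian is the constant matrix A itself.
# Dimensions below are illustrative assumptions.
m, n = 3, 4
rng = np.random.default_rng(2)
A = rng.standard_normal((m, n))
x = rng.standard_normal(n)

def f(x):
    return A @ x

# Finite-difference approximation of the Jacobian, one column per input coordinate.
eps = 1e-6
J = np.zeros((m, n))
for j in range(n):
    e = np.zeros(n)
    e[j] = eps
    J[:, j] = (f(x + e) - f(x)) / eps

print(np.allclose(J, A, atol=1e-4))  # True
```

Because $f$ is linear, the forward difference recovers $A$ exactly up to floating-point roundoff.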

What is the Jacobian matrix and why do we need it?

Derivative of a univariate function: To understand what the Jacobian is, we need to revisit the derivative of a univariate function, wherein $f$ maps the real line into the real line, that is, $f: \mathbb{R} \rightarrow \mathbb{R}$. The derivative of $f$, denoted by $f'$, measures the sensitivity to change…
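That sensitivity-to-change idea can be made concrete with a forward difference. The function $f(x)=x^2$ and the point $x_0=3$ below are assumed examples, not from the post:

```python
# Sensitivity to change: f'(x) ~ (f(x + h) - f(x)) / h for small h.
# The function and point below are illustrative assumptions.
def f(x):
    return x ** 2

x0, h = 3.0, 1e-6
numeric = (f(x0 + h) - f(x0)) / h  # forward-difference estimate
exact = 2 * x0                     # analytic derivative of x^2 at x0

print(abs(numeric - exact) < 1e-4)  # True
```

Shrinking $h$ drives the estimate toward the analytic value $f'(x_0)=2x_0=6$, which is exactly the limit definition of the derivative.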

Jacobian matrix of a composite function

This post is a continuation of what I discussed here to clarify what the Jacobian matrix is. We are going to see an example in which the Jacobian matrix is applied to a composite function consisting of two functions. This case is very intuitive since it…
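The key fact for composite functions is the chain rule for Jacobians: $J_{g \circ f}(x) = J_g(f(x))\, J_f(x)$. A minimal sketch, using two linear maps as an assumed, easy-to-check example (their Jacobians are just the matrices themselves):

```python
import numpy as np

# Chain rule for Jacobians: J_{g∘f}(x) = J_g(f(x)) @ J_f(x).
# Two linear maps are used as an assumed, easy-to-check example,
# since the Jacobian of a linear map is the matrix itself.
rng = np.random.default_rng(3)
A = rng.standard_normal((3, 4))  # f(x) = A x, so J_f = A
B = rng.standard_normal((2, 3))  # g(y) = B y, so J_g = B
x = rng.standard_normal(4)

def composite(x):
    return B @ (A @ x)

# By the chain rule, the Jacobian of g∘f is B @ A.
J = B @ A

# Finite-difference check of the first column of the Jacobian.
eps = 1e-6
e0 = np.zeros(4); e0[0] = eps
col0 = (composite(x + e0) - composite(x)) / eps
print(np.allclose(col0, J[:, 0], atol=1e-4))  # True
```

The matrix product $BA$ composes the two sensitivities in the same order the functions are applied, which is why the Jacobian of a composition is a product of Jacobians.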