# Inverting A Matrix: Gaussian Elimination & Row Echelon Form

Inverting a matrix is not exactly an easy task if you have not yet been introduced to Gaussian Elimination. The inverse of a matrix is used in a large number of algorithms, one of the simplest being Linear Regression.

There are two steps to inverting a matrix:

- Checking whether the matrix is invertible by computing the determinant
- Inverting the matrix with a variant of Gaussian Elimination known as Gauss-Jordan elimination
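The two steps can be sketched in code. Below is a minimal NumPy implementation (the function name `invert` and the use of partial pivoting are my own choices, not from the article):

```python
import numpy as np

def invert(A):
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    # Step 1: check invertibility via the determinant
    if np.isclose(np.linalg.det(A), 0.0):
        raise ValueError("Matrix is singular (determinant is 0), so it has no inverse")
    # Step 2: Gauss-Jordan elimination on the augmented matrix [A | I]
    M = np.hstack([A, np.eye(n)])
    for i in range(n):
        # Partial pivoting: swap in the row with the largest entry in column i
        p = i + np.argmax(np.abs(M[i:, i]))
        M[[i, p]] = M[[p, i]]
        M[i] /= M[i, i]                 # scale the pivot row so the pivot is 1
        for j in range(n):
            if j != i:
                M[j] -= M[j, i] * M[i]  # eliminate column i from every other row
    return M[:, n:]                     # the right half of [I | A^-1] is the inverse
```

Once the left half of the augmented matrix is reduced to the identity, the right half holds the inverse, e.g. `invert([[2, 1], [1, 1]])` gives `[[1, -1], [-1, 2]]`.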


## Checking If A Matrix Is Invertible

The concept of rank in linear algebra relates to dimensionality. All matrices are of some size $m \times n$, where $m$ is the number of rows and $n$ the number of columns.

### Matrix Rank

- Rank $1$: When the columns of a matrix span only a line, the matrix has rank $1$.
- Rank $2$: When the columns of a matrix span a plane, the matrix has rank $2$.
- A matrix is said to have **Full Rank** if and only if no column is a linear combination of the other columns, because that means the columns are linearly independent.

Any given matrix can only have as high a rank as the smaller of its dimensions, $\min(m, n)$. But a matrix with $n = 3$ might only be of rank $2$, because some transformation might squish the column vectors onto a plane – and the rank could even be decreased to rank $1$, if a transformation squishes the vectors onto a straight line.
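As a quick illustration (the matrix below is my own example, not from the article), here is a $3 \times 3$ matrix whose third column is the sum of the first two, so its columns only span a plane:

```python
import numpy as np

# Third column = first column + second column, so the columns are
# linearly dependent and only span a plane: rank 2, not full rank.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])

print(np.linalg.matrix_rank(A))  # 2 -- one dimension was "squished" away
```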

Let's examine what this means.

### Linear Combinations

We define three vectors $\vec{v}$, $\vec{u}$, and $\vec{w}$, and we combine them as column vectors in a matrix $A$.

If we perform vector addition on $\vec{u}$ and $\vec{w}$, then the result is the vector $\vec{v}$. This means that $\vec{v}$ is a linear combination of $\vec{u}$ and $\vec{w}$, i.e. the matrix $A$ does not have full rank.

Similarly, if you subtract $\vec{u}$ from $\vec{v}$, you find that $\vec{w}$ is a linear combination of $\vec{v}$ and $\vec{u}$.
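We can check this numerically. The specific vectors below are illustrative (the article does not give concrete values), chosen so that $\vec{v} = \vec{u} + \vec{w}$:

```python
import numpy as np

# Illustrative vectors: v is a linear combination of u and w
u = np.array([1.0, 0.0, 1.0])
w = np.array([0.0, 1.0, 1.0])
v = u + w                          # v = [1, 1, 2]

A = np.column_stack([v, u, w])     # use them as the columns of A

print(np.linalg.det(A))            # ~0: A is singular, so it has no inverse
print(np.linalg.matrix_rank(A))    # 2: not full rank
```

Because the determinant is zero, a call like `np.linalg.inv(A)` would fail with a singular-matrix error, which is exactly why we check the determinant before trying to invert.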

The geometrical interpretation of linear combinations is that we *lose information*: when one column is a linear combination of the others, the column vectors are squished into a lower dimension, and the transformation cannot be undone – the matrix has no inverse.

Source - Continue Reading: https://mlfromscratch.com/gaussian-elimination/