Inverting A Matrix: Gaussian Elimination & Row Echelon Form

Inverting a matrix is not exactly an easy task if you have not yet been introduced to Gaussian Elimination. The inverse of a matrix is used in a large number of algorithms, one of the simplest being Linear Regression.
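
To see where the inverse shows up in Linear Regression, consider the closed-form ordinary least squares solution, the normal equation $\hat{\beta} = (X^\top X)^{-1} X^\top y$. Below is a minimal NumPy sketch; the design matrix and targets are made-up toy data, not from this article.

```python
import numpy as np

# Normal equation for ordinary least squares: beta = (X^T X)^(-1) X^T y.
# Toy data: a column of ones for the intercept plus one feature column.
X = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
y = np.array([2.0, 3.0, 4.0])

beta = np.linalg.inv(X.T @ X) @ X.T @ y
print(beta)  # [1. 1.] -> intercept 1, slope 1, since y = 1 + x
```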

There are two steps to inverting a matrix:

  1. Checking if the matrix is invertible by finding the Determinant
  2. Inverting the matrix with Gauss-Jordan elimination, a variant of Gaussian Elimination (see the sketch after this list)
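
As a preview of both steps, here is a minimal NumPy sketch, assuming a square matrix as input. It adds partial pivoting (swapping in the row with the largest pivot) for numerical stability; the function name and structure are my own, not from the article.

```python
import numpy as np

def invert(A):
    """Invert a square matrix with Gauss-Jordan elimination, or return None."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]

    # Step 1: check invertibility via the determinant.
    if np.isclose(np.linalg.det(A), 0.0):
        return None

    # Step 2: reduce the augmented matrix [A | I] to [I | A^-1].
    aug = np.hstack([A, np.eye(n)])
    for col in range(n):
        # Partial pivoting: swap in the row with the largest pivot entry.
        pivot = col + np.argmax(np.abs(aug[col:, col]))
        aug[[col, pivot]] = aug[[pivot, col]]
        aug[col] /= aug[col, col]           # scale so the pivot becomes 1
        for row in range(n):
            if row != col:                  # clear the pivot column elsewhere
                aug[row] -= aug[row, col] * aug[col]
    return aug[:, n:]

print(invert([[4.0, 7.0], [2.0, 6.0]]))    # [[ 0.6 -0.7], [-0.2  0.4]]
```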

Table Of Contents

  1. Checking If A Matrix Is Invertible
  2. Inverting A Matrix
  3. References

Checking If A Matrix Is Invertible

The concept of rank in linear algebra relates to dimensionality. All matrices are of some size $m \times n$, where $m$ is the number of rows and $n$ the number of columns.

Matrix Rank

  1. Rank $1$: When the columns of a matrix span a line, it has rank $1$.
  2. Rank $2$: When the columns of a matrix span a plane, it has rank $2$.
  3. A matrix is said to have Full Rank if and only if no column is a linear combination of the other columns, because such a combination means the columns are linearly dependent and the rank is less than $n$.

Any given matrix can have rank at most $\min(m, n)$; for the square matrices we invert here, that is the number of columns $n$. But a matrix with $n=3$ might only be of rank $2$, because some transformation might squish the column vectors onto a plane, and the rank could even be decreased to rank $1$ if a transformation squishes the vectors onto a straight line.
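
To make this concrete, NumPy's matrix_rank reports exactly this dimension. The $3 \times 3$ matrix below is the $A$ defined in the next section; its first column is the sum of the other two, so its columns only span a plane.

```python
import numpy as np

A = np.array([[2, 1, 1],
              [0, 2, -2],
              [1, 1, 0]])                # first column = sum of the other two

print(np.linalg.matrix_rank(A))          # 2: the columns span only a plane
print(np.linalg.matrix_rank(np.eye(3)))  # 3: full rank, hence invertible
```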

Let's examine what this means.

Linear Combinations

We define three vectors $\vec{v}$, $\vec{u}$, and $\vec{w}$, and we combine them as column vectors in a matrix $A$.

$$ \vec{v} = \begin{bmatrix} 2 \\ 0 \\ 1 \end{bmatrix} ,\quad \vec{u} = \begin{bmatrix} 1 \\ 2 \\ 1 \end{bmatrix} ,\quad \vec{w} = \begin{bmatrix} 1 \\ -2 \\ 0 \end{bmatrix} ,\quad A = \begin{bmatrix} 2 & 1 & 1 \\ 0 & 2 & -2 \\ 1 & 1 & 0 \end{bmatrix} $$

If we perform vector addition on $\vec{u}$ and $\vec{w}$, then the result is the vector $\vec{v}$. This means that $\vec{v}$ is a linear combination of $\vec{u}$ and $\vec{w}$, i.e. the matrix $A$ does not have full rank.

$$ \vec{u} + \vec{w} = \begin{bmatrix} 1+1 \\ 2+(-2) \\ 1+0 \end{bmatrix} = \begin{bmatrix} 2 \\ 0 \\ 1 \end{bmatrix} $$

Similarly, if you subtract $\vec{u}$ from $\vec{v}$, then the result is $\vec{w}$, so $\vec{w}$ is also a linear combination of $\vec{v}$ and $\vec{u}$.
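
Both identities are easy to verify numerically; a short NumPy check:

```python
import numpy as np

v = np.array([2, 0, 1])
u = np.array([1, 2, 1])
w = np.array([1, -2, 0])

print(np.array_equal(u + w, v))  # True: v is a linear combination of u and w
print(np.array_equal(v - u, w))  # True: w is a linear combination of v and u
```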

The geometrical interpretation of linear combinations is that we lose information if we were to transform space by such a matrix: the column vectors get squished onto a plane or a line instead of spanning all of space.

[...]

Source - Continue Reading: https://mlfromscratch.com/gaussian-elimination/
