### Inverse of a matrix

All matrices we talk about here are $n \times n$ square matrices over a numeric field.

In linear algebra, the concept of an invertible matrix is usually defined as follows.

Def.: A matrix A is said to be invertible if there exists a matrix B such that $AB=BA=I$, where I is the identity matrix. In this case B is called the inverse matrix of A.

But I somehow don't like this definition; it seems too strong to me, since it requires B to be both a left inverse (meaning $BA=I$) and a right inverse (meaning $AB=I$) of the matrix A.

So let us try to introduce this concept in a somewhat different way.

Def.1: A matrix A is said to be left invertible if there exists a matrix B such that $BA=I$, where I is the identity matrix. In this case B is called a left inverse of A.

Def.2: A matrix A is said to be right invertible if there exists a matrix B such that $AB=I$, where I is the identity matrix. In this case B is called a right inverse of A.

We still don't know if left/right inverses of a matrix A exist and under what conditions. We also don't know if they are unique (in case they exist).

Now we will prove a few statements to clarify all this.

Th1: If a matrix A is left (or right) invertible, then $\det(A) \ne 0$.

Proof: We know that $\det (XY) = \det(X) \cdot \det(Y)$

If A is left invertible, then there exists a matrix B such that $BA = I$. But then $1 = \det(I) = \det(BA) = \det(B) \cdot \det(A)$, and it follows that $\det(A) \ne 0$.

If A is right invertible, then there exists a matrix B such that $AB = I$. But then $1 = \det(I) = \det(AB) = \det(A) \cdot \det(B)$, and again it follows that $\det(A) \ne 0$.
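The identity $\det(XY) = \det(X) \cdot \det(Y)$ that both halves of the proof rely on can be checked numerically. A minimal sketch with numpy (the example matrices are arbitrary choices, not from the post):

```python
import numpy as np

# Numerical illustration (not a proof) of det(XY) = det(X) * det(Y).
X = np.array([[2.0, 1.0],
              [0.0, 3.0]])   # det(X) = 6
Y = np.array([[1.0, 4.0],
              [2.0, 5.0]])   # det(Y) = -3

lhs = np.linalg.det(X @ Y)
rhs = np.linalg.det(X) * np.linalg.det(Y)
print(lhs, rhs)  # both -18.0, up to floating-point rounding
```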

Th2: If $\det(A) \ne 0$, then A is both left and right invertible.

Proof: The proof is done by construction. If $A=(a_{ij})$, we form the matrix $S=(A_{ji})$, the transpose of the matrix of cofactors of $A$ (here $A_{ij}$ denotes the cofactor of the element $a_{ij}$). Then one easily shows (using the cofactor expansion of the determinant, known from linear algebra) that the matrix $T = \frac{1}{\det(A)} \cdot S$ satisfies both $TA=I$ and $AT=I$.

So far we have proved that a matrix A is left/right invertible if and only if $\det(A) \ne 0$. In the case $\det(A) \ne 0$, we also showed how one left inverse and one right inverse can be constructed, i.e. we showed the existence of the left/right inverses (the matrix T is both a left and a right inverse of A).
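The cofactor construction of $T$ can be sketched in code. A minimal numpy illustration (the function name `cofactor_inverse` and the example matrix are assumptions made for the sketch, not part of the post):

```python
import numpy as np

# A sketch of the cofactor construction: S is the transposed matrix of
# cofactors of A, and T = S / det(A).
def cofactor_inverse(A):
    n = A.shape[0]
    S = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            # the cofactor of a_ij goes to position (j, i), i.e. S is transposed
            S[j, i] = (-1) ** (i + j) * np.linalg.det(minor)
    return S / np.linalg.det(A)

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])   # det(A) = 1
T = cofactor_inverse(A)
print(T @ A)  # identity matrix, up to floating-point rounding
print(A @ T)  # identity matrix as well
```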

This construction of $T$ (from $A$) is important so we will keep denoting this so-constructed matrix as $T$ for the rest of this post.

Th3: For each matrix $A$ with $\det(A) \ne 0$ there is a unique left inverse and a unique right inverse, and they are both equal to the matrix $T$ constructed above.

Proof: Let's assume that $BA=CA=I$ for some matrices $B,C$ - left inverses of $A$.

Then $T=IT=(BA)T=B(AT)=BI = B$

And also $T=IT=(CA)T=C(AT)=CI = C$

So it follows that $B=C$ (and both B and C are equal to the special matrix T). This proves the uniqueness of the left inverse.

Note that to prove the uniqueness of the left inverse we used the existence of the right inverse T of A.

The uniqueness of the right inverse is proved in the same way.

Let's assume that $AB'=AC'=I$ for some matrices $B',C'$ - right inverses of $A$.

Then $T=TI=T(AB')=(TA)B'=IB' = B'$

Also $T=TI=T(AC')=(TA)C'=IC' = C'$

So it follows that $B'=C'$ (and both B' and C' are equal to the special matrix T). This proves the uniqueness of the right inverse.

Note that to prove the uniqueness of the right inverse we used the existence of the left inverse T of A.

We are done; now everything is introduced in a clear way. We proved that A is left/right invertible if and only if its determinant is non-zero, and that in this case the left/right inverses exist (the matrix T), are unique, and coincide (both are equal to T).
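As a quick numerical companion to the "only if" direction: a matrix with zero determinant has no inverse, and numpy reports this as an error. A minimal sketch (the singular example matrix is an arbitrary assumption):

```python
import numpy as np

# A has proportional rows, so det(A) = 0 and no left or right inverse exists.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

print(np.linalg.det(A))  # 0.0
try:
    np.linalg.inv(A)
except np.linalg.LinAlgError:
    print("A is not invertible")
```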

### Elementary row/column operations on matrices

The elementary row/column operations on matrices are:

1) Multiplying the $i$-th row (column) by a number $\lambda \ne 0$.

$R_i := \lambda \cdot R_i$

2) Adding the $j$-th row/column multiplied by a number $\lambda$ to the $i$-th row/column.

$R_i := R_i + \lambda \cdot R_j$

3) Exchanging the rows/columns $i$ and $j$.

$R := R_i$

$R_i := R_j$

$R_j := R$
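The three operations above can be sketched as in-place updates on a numpy array (a minimal illustration with 0-based indices; the example matrix and the chosen values of $\lambda$ are arbitrary):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

A[0] = 2.0 * A[0]         # (1) R_0 := 2 * R_0
A[0] = A[0] + 3.0 * A[1]  # (2) R_0 := R_0 + 3 * R_1
A[[0, 1]] = A[[1, 0]]     # (3) exchange rows 0 and 1 (numpy handles the temporary)
print(A)                  # [[ 3.  4.] [11. 16.]]
```

The column versions work the same way through `A[:, i]`.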

The interesting thing is that for square matrices each of these operations can be accomplished by matrix multiplication.

Let us define:

• $E$ - the identity matrix of order $n \times n$.
• $E_{ij}$ - the square matrix of order $n \times n$ which has an element $1$ at position $(i, j)$ and zeroes at all other positions.
• $A$ - any square matrix of order $n \times n$.

Then one can show that:

(1) Multiplying the i-th row/column of A by the number $\lambda \ne 0$ is accomplished by left/right multiplying A with the matrix $A_i(\lambda) = E + (\lambda-1)E_{ii}$

(2A) Adding the $j$-th row multiplied by a number $\lambda$ to the $i$-th row is accomplished by left multiplying A with the matrix $B_{ij}(\lambda) = E + \lambda E_{ij}$

(2B) Adding the $j$-th column multiplied by a number $\lambda$ to the $i$-th column is accomplished by right multiplying A with the matrix $B_{ji}(\lambda) = E + \lambda E_{ji}$

(3) Exchanging the rows/columns $i$ and $j$ is accomplished by left/right multiplying A with the matrix $C_{ij} = E - E_{ii} - E_{jj} + E_{ij} + E_{ji}$

The matrices $A_i(\lambda), B_{ij}(\lambda), C_{ij}$ are usually called matrices of the elementary transformations.
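These elementary-transformation matrices are easy to build and check numerically. A minimal numpy sketch (the names mirror the post's $A_i(\lambda)$, $B_{ij}(\lambda)$, $C_{ij}$ but with 0-based indices; the test matrix `M` is an arbitrary assumption):

```python
import numpy as np

n = 3
E = np.eye(n)  # identity matrix of order n

def Eij(i, j):
    """Matrix with a 1 at position (i, j) and zeroes at all other positions."""
    M = np.zeros((n, n))
    M[i, j] = 1.0
    return M

lam = 5.0
A_1  = E + (lam - 1) * Eij(1, 1)                          # (1) scale row/column 1
B_02 = E + lam * Eij(0, 2)                                # (2A) add lam * row 2 to row 0
C_01 = E - Eij(0, 0) - Eij(1, 1) + Eij(0, 1) + Eij(1, 0)  # (3) swap rows/columns 0 and 1

M = np.arange(9, dtype=float).reshape(3, 3)

print(A_1 @ M)   # row 1 multiplied by 5
print(B_02 @ M)  # row 0 replaced by row 0 + 5 * row 2
print(C_01 @ M)  # rows 0 and 1 exchanged
```

Right-multiplying `M` by the same matrices (with the index order from (2B)) performs the corresponding column operations.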