
A problem about scalar matrices $A = \lambda \cdot E$

I encountered this problem on MathSE.

Even though this question was heavily downvoted, I think it's quite a nice problem. Here it is. 

We are given that $A$ is a square $n \times n$ matrix which commutes with all invertible matrices of the same size $n$.  

Prove that $A$ is a scalar matrix, i.e. $A = \lambda E$, where $E$ is the identity matrix and $\lambda$ is a scalar/number.

Let's solve this problem for the case $n=3$. The solution for general $n$ is fully analogous.

Let's assume 

$$A = \begin{bmatrix}a_{11}&a_{12}&a_{13}\\a_{21}&a_{22}&a_{23}\\a_{31}&a_{32}&a_{33}\end{bmatrix}$$

and let's assume it commutes with all invertible matrices of size 3.

Let's pick the matrix 

$$B = \begin{bmatrix}1&0&0\\0&2&0\\0&0&3\end{bmatrix}$$

This one is obviously invertible, as its determinant equals $6 = 3! \ne 0$.

Now we use the fact that $AB=BA$.

$$AB = \begin{bmatrix}a_{11}&a_{12}&a_{13}\\a_{21}&a_{22}&a_{23}\\a_{31}&a_{32}&a_{33}\end{bmatrix} \cdot \begin{bmatrix}1&0&0\\0&2&0\\0&0&3\end{bmatrix} = \begin{bmatrix}a_{11}&2a_{12}&3a_{13}\\a_{21}&2a_{22}&3a_{23}\\a_{31}&2a_{32}&3a_{33}\end{bmatrix}$$


$$BA = \begin{bmatrix}1&0&0\\0&2&0\\0&0&3\end{bmatrix} \cdot \begin{bmatrix}a_{11}&a_{12}&a_{13}\\a_{21}&a_{22}&a_{23}\\a_{31}&a_{32}&a_{33}\end{bmatrix} = \begin{bmatrix}a_{11}&a_{12}&a_{13}\\2a_{21}&2a_{22}&2a_{23}\\3a_{31}&3a_{32}&3a_{33}\end{bmatrix}$$

But these two resulting matrices must be equal. Comparing their $(i,j)$ entries gives $j \cdot a_{ij} = i \cdot a_{ij}$, i.e. $(j - i)\,a_{ij} = 0$, and therefore $$a_{ij} = 0 \quad \text{for all } i \ne j \tag{1}$$

OK... So now our matrix $A$ takes the form

$$A = \begin{bmatrix}a&0&0\\0&b&0\\0&0&c\end{bmatrix} \tag{2}$$
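This first step can be sanity-checked numerically. Here is a minimal pure-Python sketch (the sample matrix entries are just illustrative values, not part of the proof): a matrix with a nonzero off-diagonal entry fails to commute with $B = \mathrm{diag}(1,2,3)$, while a diagonal matrix does commute.

```python
# Check of step (1): a matrix with a nonzero off-diagonal entry does
# not commute with B = diag(1, 2, 3); a diagonal matrix does.
# The entries 5, 6, 7, ... are arbitrary illustrative values.

def matmul(X, Y):
    """Multiply two 3x3 matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

B = [[1, 0, 0],
     [0, 2, 0],
     [0, 0, 3]]

A_offdiag = [[5, 7, 0], [0, 5, 0], [0, 0, 5]]   # a12 = 7 != 0
A_diag    = [[5, 0, 0], [0, 6, 0], [0, 0, 7]]

print(matmul(A_offdiag, B) == matmul(B, A_offdiag))  # False
print(matmul(A_diag, B) == matmul(B, A_diag))        # True
```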

Now let's pick another matrix $C$ and use the fact that $A$ commutes with $C$ as well. We pick $C$ to be, e.g., the Vandermonde matrix for the numbers $1, 2, 3$ (the important point is that the nodes of the Vandermonde matrix are pairwise distinct).

$$C = \begin{bmatrix}1^0&1^1&1^2\\2^0&2^1&2^2\\3^0&3^1&3^2\end{bmatrix}$$

We know that the determinant of $C$ equals $(2-1)(3-1)(3-2) = 2 \ne 0$ so $C$ is invertible.
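This is the case $n=3$ of the well-known Vandermonde determinant formula, which also explains why the nodes must be distinct:

$$\det \begin{bmatrix}1&\alpha_1&\alpha_1^2\\1&\alpha_2&\alpha_2^2\\1&\alpha_3&\alpha_3^2\end{bmatrix} = \prod_{1 \le i < j \le 3} (\alpha_j - \alpha_i)$$

This product is nonzero exactly when the $\alpha_i$ are pairwise distinct.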

Now in a similar way we use that $A$ commutes with $C$ i.e. $AC = CA$.

$$AC = \begin{bmatrix}a&0&0\\0&b&0\\0&0&c\end{bmatrix} \cdot \begin{bmatrix}1^0&1^1&1^2\\2^0&2^1&2^2\\3^0&3^1&3^2\end{bmatrix} = \begin{bmatrix}a&a&a\\b&2b&4b\\c&3c&9c\end{bmatrix}$$ 

$$CA =  \begin{bmatrix}1^0&1^1&1^2\\2^0&2^1&2^2\\3^0&3^1&3^2\end{bmatrix} \cdot \begin{bmatrix}a&0&0\\0&b&0\\0&0&c\end{bmatrix} = \begin{bmatrix}a&b&c\\a&2b&4c\\a&3b&9c\end{bmatrix}$$ 

But these two resulting matrices must be equal. Comparing their first columns gives $(a, b, c) = (a, a, a)$, so $a = b = c$.

Thus our matrix $A$ finally takes the form

$$A = \begin{bmatrix}a&0&0\\0&a&0\\0&0&a\end{bmatrix} \tag{3}$$

which is exactly what we wanted to prove.
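The second step admits the same kind of sanity check: a diagonal matrix with distinct entries fails to commute with the Vandermonde matrix $C$, while a scalar matrix $aE$ commutes with everything. A minimal pure-Python sketch (sample values are again illustrative):

```python
# Check of the second step: diag(5, 6, 7) does not commute with the
# Vandermonde matrix C for the nodes 1, 2, 3, but the scalar matrix 5*E does.

def matmul(X, Y):
    """Multiply two 3x3 matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

C = [[1, 1, 1],
     [1, 2, 4],
     [1, 3, 9]]

D_distinct = [[5, 0, 0], [0, 6, 0], [0, 0, 7]]   # distinct diagonal entries
D_scalar   = [[5, 0, 0], [0, 5, 0], [0, 0, 5]]   # the scalar matrix 5*E

print(matmul(D_distinct, C) == matmul(C, D_distinct))  # False
print(matmul(D_scalar, C) == matmul(C, D_scalar))      # True
```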