# Diagonalization of Matrices


## Definition and Theorem of Diagonalizable Matrices

Definition: A matrix $A$ of size $n \times n$ is said to be diagonalizable if there exist an invertible matrix $P$ and a diagonal matrix $D$ such that
$A = P D P^{-1}$
Theorem: An $n \times n$ square matrix $A$ is diagonalizable if and only if it has $n$ linearly independent eigenvectors. The columns of matrix $P$ are the $n$ eigenvectors, and matrix $D$ is a diagonal matrix whose entries are the corresponding eigenvalues of $A$.
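Since both sides of the identity are just matrix products, the theorem can be checked numerically with a few lines of plain Python. The $2 \times 2$ matrix below is a made-up illustration for this sketch (it does not appear in the examples that follow): it has eigenvalues $2$ and $3$ with eigenvectors $(1, 0)$ and $(1, 1)$.

```python
# Check the identity A = P D P^-1 without any libraries,
# using a list-of-rows matrix product.
def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

# Illustrative example (not from the text): A = [[2, 1], [0, 3]]
# has eigenvalues 2 and 3 with eigenvectors (1, 0) and (1, 1).
A = [[2, 1], [0, 3]]
P = [[1, 1], [0, 1]]        # eigenvectors as columns
D = [[2, 0], [0, 3]]        # eigenvalues on the diagonal
P_inv = [[1, -1], [0, 1]]   # inverse of P

assert matmul(matmul(P, D), P_inv) == A   # A = P D P^-1
```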

## Examples with Solutions

Example 1
Let $A = \begin{bmatrix} -1 & 2\\ 0 & 1 \end{bmatrix}$
a) Find the eigenvalues of $A$ and their corresponding eigenvectors.
b) Show that the eigenvectors form a basis for $\mathbb{R}^2$.
c) Diagonalize matrix $A$ if possible.

Solution
Find the eigenvalues using the characteristic polynomial given by
$Det (A - \lambda I)$ , where $I$ is the identity matrix and $Det$ is the determinant.
Substitute $A$ and $I$
$Det (A - \lambda I) = Det \left(\begin{bmatrix} -1 & 2\\ 0 & 1 \end{bmatrix} - \lambda \begin{bmatrix} 1 & 0\\ 0 & 1 \end{bmatrix} \right)$

$= Det \begin{bmatrix} -1-\lambda & 2\\ 0 & 1-\lambda \end{bmatrix}$

Evaluate the determinant
$= (-1-\lambda)(1-\lambda)$
Find the eigenvalues by solving the characteristic equation
$Det (A - \lambda I) = 0$
Hence
$(-1-\lambda)(1-\lambda) = 0$
gives the eigenvalues: $\lambda_1 = -1$   ,   $\lambda_2 = 1$

Find the eigenvector corresponding to each eigenvalue
Eigenvector corresponding to $\lambda = \lambda_1 = -1$
The eigenvector $\textbf x = \begin{bmatrix} x_1\\ x_2 \end{bmatrix}$ corresponding to $\lambda = -1$ is the solution to the system
$(A - \lambda_1 I) \begin{bmatrix} x_1\\ x_2 \end{bmatrix} = 0$

$\begin{bmatrix} 0 & 2\\ 0 & 2 \end{bmatrix} \begin{bmatrix} x_1\\ x_2 \end{bmatrix} = 0$

Write the above system as an augmented matrix
$\begin{bmatrix} 0 & 2 &|& 0\\ 0 & 2&|& 0 \end{bmatrix}$

Row reduce using Gauss-Jordan method
$\begin{bmatrix} 0 & 1 &|& 0\\ 0 & 0 & |& 0 \end{bmatrix}$
$x_1$ is the free variable.
$x_2 = 0$
Vector $\textbf x$ is given by
$\textbf x = x_1 \begin{bmatrix} 1 \\ 0 \end{bmatrix}$

Eigenvector corresponding to $\lambda = \lambda_2 = 1$
The eigenvector $\textbf x = \begin{bmatrix} x_1\\ x_2 \end{bmatrix}$ corresponding to $\lambda = 1$ is the solution to the system
$(A - \lambda_2 I) \begin{bmatrix} x_1\\ x_2 \end{bmatrix} = 0$

$\begin{bmatrix} -2 & 2\\ 0 & 0 \end{bmatrix} \begin{bmatrix} x_1\\ x_2 \end{bmatrix} = 0$

Write the above system as an augmented matrix
$\begin{bmatrix} -2 & 2 &|& 0\\ 0 & 0&|& 0 \end{bmatrix}$

Row reduce using Gauss-Jordan method
$\begin{bmatrix} 1 & -1 &|& 0\\ 0 & 0 & |& 0 \end{bmatrix}$
$x_2$ is the free variable.
$x_1 = x_2$
Vector $\textbf x$ is given by
$\textbf x = x_2 \begin{bmatrix} 1 \\ 1 \end{bmatrix}$

Conclusion: the eigenvalues and their corresponding eigenvectors are given by: $\lambda = -1, 1$ in the corresponding order $\left\{ \begin{bmatrix} 1 \\ 0 \end{bmatrix} , \begin{bmatrix} 1 \\ 1 \end{bmatrix} \right\}$
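The eigenpairs found above can be double-checked by confirming $A \textbf x = \lambda \textbf x$ directly; a minimal plain-Python sketch:

```python
# Check A x = lambda x for the two eigenpairs found above.
def matvec(A, x):
    """Product of a matrix (list of rows) and a column vector."""
    return [sum(a * b for a, b in zip(row, x)) for row in A]

A = [[-1, 2], [0, 1]]
assert matvec(A, [1, 0]) == [-1, 0]   # lambda = -1 times [1, 0]
assert matvec(A, [1, 1]) == [1, 1]    # lambda =  1 times [1, 1]
```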
b)
To prove that the eigenvectors form a basis for $\mathbb{R}^2$, it is enough to show that the two vectors are linearly independent. Let us use the test for linear independence:
form the augmented matrix whose columns are the eigenvectors, with zeros for the last column on the right, as follows:
$\begin{bmatrix} 1 & 1 & | & 0 \\ 0 & 1 & | & 0 \end{bmatrix}$     (I)
Row reduce the above augmented matrix using the Gauss-Jordan method.
$\begin{bmatrix} 1 & 0 & | & 0 \\ 0 & 1 & | & 0 \end{bmatrix}$
The system corresponding to the augmented matrix (I) above has one solution only given by $\begin{bmatrix} 0\\ 0 \end{bmatrix}$ and therefore the eigenvectors are linearly independent and hence the two eigenvectors form a basis for $\mathbb{R}^2$.
c)
Matrix $A$ is diagonalizable; we need to find matrix $P$ and its inverse $P^{-1}$.
We now construct a matrix $P$ whose columns are the eigenvectors as follows
$P = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}$
Find $P^{-1}$ as follows:
Write the augmented matrix $P | I$ , where $I$ is the identity matrix
$\begin{bmatrix} 1 & 1 & | & 1 & 0 \\ 0 & 1 & | & 0 & 1 \end{bmatrix}$
Row reduce the above augmented matrix
$\begin{bmatrix} 1 & 0 & | & 1 & -1 \\ 0 & 1 & | & 0 & 1 \end{bmatrix}$
The above matrix has the form $I | P^{-1}$ where $P^{-1}$ is the inverse of matrix $P$ and is given by
$P^{-1} = \begin{bmatrix} 1 & -1 \\ 0 & 1 \end{bmatrix}$
Let $D$ be the diagonal matrix whose entries in the main diagonal are the eigenvalues and $P^{-1}$ be the inverse of matrix $P$. Matrix A may now be diagonalized as follows
$A = P D P^{-1} = \begin{bmatrix} \color{red}1 & \color{blue}1 \\ \color{red}0 & \color{blue}1 \end{bmatrix} \begin{bmatrix} \color{red}{- 1} & 0 \\ 0 & \color{blue}1 \end{bmatrix} \begin{bmatrix} 1 & -1\\ 0 & 1 \end{bmatrix}$
Use a calculator to check that the above diagonalization is correct.
In fact it is enough to check that $AP = PD$ which does not require the computation of $P^{-1}$.
Note that the order in which the eigenvectors are arranged in matrix $P$ and the eigenvalues in matrix $D$ is important. The eigenvector in column $k$ in matrix $P$ corresponds to the eigenvalue in row $k$ in matrix D. Red and blue colors are used in the above example.
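A short plain-Python sketch of both checks, the full product $PDP^{-1}$ and the cheaper test $AP = PD$:

```python
# Verify the factorization A = P D P^-1 for Example 1,
# and the cheaper check A P = P D.
def matmul(A, B):
    """Product of two matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[-1, 2], [0, 1]]
P = [[1, 1], [0, 1]]
D = [[-1, 0], [0, 1]]
P_inv = [[1, -1], [0, 1]]

assert matmul(A, P) == matmul(P, D)       # A P = P D
assert matmul(matmul(P, D), P_inv) == A   # A = P D P^-1
```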

Example 2
Let $A = \begin{bmatrix} 2 & -3 & 0 \\ 0 & -1 & 0 \\ 1 & 3 & 1 \end{bmatrix}$
a) Find the eigenvalues of $A$ and their corresponding eigenvectors.
b) Show that the eigenvectors form a basis for $\mathbb{R}^3$.
c) Diagonalize matrix $A$ if possible.

Solution
a)
Find the eigenvalues using the characteristic polynomial given by
$Det (A - \lambda I)$ , where $I$ is the identity matrix and $Det$ is the determinant.
Substitute $A$ and $I$
$Det (A - \lambda I) = Det \left(\begin{bmatrix} 2 & -3 & 0 \\ 0 & -1 & 0 \\ 1 & 3 & 1 \end{bmatrix} - \lambda \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \right)$

$= Det \begin{bmatrix} 2-\lambda & -3 & 0\\ 0 & -1-\lambda & 0\\ 1 & 3 & 1-\lambda \end{bmatrix}$
Evaluate the determinant using the second row (it has 2 zeros which makes calculations easy)
$= (-1-\lambda)(2-\lambda)(1-\lambda)$
Find the eigenvalues by solving the characteristic equation
$Det (A - \lambda I) = 0$
Hence
$(-1-\lambda)(2-\lambda)(1-\lambda) = 0$
gives the eigenvalues: $\lambda_1 = -1$   ,   $\lambda_2 = 1$   ,   $\lambda_3 = 2$

Find the eigenvector corresponding to each eigenvalue
Eigenvector corresponding to $\lambda = \lambda_1 = -1$
The eigenvector $\textbf x = \begin{bmatrix} x_1\\ x_2\\ x_3 \end{bmatrix}$ corresponding to $\lambda = -1$ is the solution to the system
$(A - \lambda_1 I) \begin{bmatrix} x_1\\ x_2\\ x_3 \end{bmatrix} = 0$

$\begin{bmatrix} 3 & -3 & 0\\ 0 & 0 & 0\\ 1 & 3 & 2 \end{bmatrix} \begin{bmatrix} x_1\\ x_2\\ x_3 \end{bmatrix} = 0$

Write the above system as an augmented matrix
$\begin{bmatrix} 3 & -3 & 0 &|& 0\\ 0 & 0 & 0 &|& 0\\ 1 & 3 & 2 &|& 0 \end{bmatrix}$

Row reduce using Gauss-Jordan method
$\begin{bmatrix} 1 & 0 & \dfrac{1}{2} &|& 0\\ 0 & 1 & \dfrac{1}{2} &|& 0\\ 0 & 0 & 0 &|& 0 \end{bmatrix}$
$x_3$ is the free variable.
$x_2 = - \dfrac{1}{2} x_3$
$x_1 = - \dfrac{1}{2} x_3$
Vector $\textbf x$ is given by
$\textbf x = x_3 \begin{bmatrix} - \dfrac{1}{2} \\ - \dfrac{1}{2} \\ 1 \end{bmatrix}$
Since $x_3$ is a free variable and can take any real value, we can multiply all components by $2$ and rewrite the eigenvector as
$\textbf x = x_3 \begin{bmatrix} - 1 \\ - 1 \\ 2 \end{bmatrix}$

Eigenvector corresponding to $\lambda = \lambda_2 = 1$
The eigenvector $\textbf x = \begin{bmatrix} x_1\\ x_2\\ x_3 \end{bmatrix}$ corresponding to $\lambda = 1$ is the solution to the system

$(A - \lambda_2 I) \begin{bmatrix} x_1\\ x_2\\ x_3 \end{bmatrix} = 0$

$\begin{bmatrix} 1 & -3 & 0\\ 0 & -2 & 0\\ 1 & 3 & 0 \end{bmatrix} \begin{bmatrix} x_1\\ x_2\\ x_3 \end{bmatrix} = 0$
Write the above system as an augmented matrix
$\begin{bmatrix} 1 & -3 & 0 &|& 0\\ 0 & -2 & 0 &|& 0\\ 1 & 3 & 0 &|& 0 \end{bmatrix}$

Row reduce using the Gauss-Jordan method
$\begin{bmatrix} 1 & 0 & 0 &|& 0\\ 0 & 1 & 0 &|& 0\\ 0 & 0 & 0 &|& 0 \end{bmatrix}$
$x_3$ is the free variable
$x_2 = 0$
$x_1 = 0$
Vector $\textbf x$ is given by $\textbf x = x_3 \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}$

Eigenvector corresponding to $\lambda = \lambda_3 = 2$
The eigenvector $\textbf x = \begin{bmatrix} x_1\\ x_2\\ x_3 \end{bmatrix}$ corresponding to $\lambda =2$ is the solution to the system

$(A - \lambda_3 I) \begin{bmatrix} x_1\\ x_2\\ x_3 \end{bmatrix} = 0$

$\begin{bmatrix} 0 & -3 & 0\\ 0 & -3 & 0\\ 1 & 3 & -1 \end{bmatrix} \begin{bmatrix} x_1\\ x_2\\ x_3 \end{bmatrix} = 0$
Write the above system as an augmented matrix
$\begin{bmatrix} 0 & -3 & 0 &|& 0\\ 0 & -3 & 0 &|& 0\\ 1 & 3 & -1 &|& 0 \end{bmatrix}$

Row reduce using the Gauss-Jordan method
$\begin{bmatrix} 1 & 0 & -1 &|& 0\\ 0 & 1 & 0 &|& 0\\ 0 & 0 & 0 &|& 0 \end{bmatrix}$
$x_3$ is the free variable
$x_2 = 0$
$x_1 = x_3$
Vector $\textbf x$ is given by $\textbf x = x_3 \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix}$

Conclusion: the eigenvalues and their corresponding eigenvectors are given by: $\lambda = -1, 1, 2$ in the corresponding order $\left\{ \begin{bmatrix} - 1 \\ - 1 \\ 2 \end{bmatrix} , \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} , \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix} \right\}$
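As in Example 1, each eigenpair can be double-checked by confirming $A \textbf x = \lambda \textbf x$; a plain-Python sketch:

```python
# Check A x = lambda x for the three eigenpairs found above.
def matvec(A, x):
    """Product of a matrix (list of rows) and a column vector."""
    return [sum(a * b for a, b in zip(row, x)) for row in A]

A = [[2, -3, 0], [0, -1, 0], [1, 3, 1]]
assert matvec(A, [-1, -1, 2]) == [1, 1, -2]   # lambda = -1
assert matvec(A, [0, 0, 1]) == [0, 0, 1]      # lambda =  1
assert matvec(A, [1, 0, 1]) == [2, 0, 2]      # lambda =  2
```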
b)
To prove that the eigenvectors form a basis for $\mathbb{R}^3$, it is enough to show that the three vectors are linearly independent. Let us use the test for linear independence:
form the augmented matrix whose columns are the eigenvectors, with zeros for the last column on the right, as follows:
$\begin{bmatrix} - 1 & 0 & 1 & | & 0 \\ - 1 & 0 & 0 & | & 0 \\ 2 & 1 & 1 & | & 0 \end{bmatrix}$     (I)
Row reduce the above augmented matrix using the Gauss-Jordan method.
$\begin{bmatrix} 1 & 0 & 0 & | & 0 \\ 0 & 1 & 0 & | & 0 \\ 0 & 0 & 1 & | & 0 \end{bmatrix}$
The system corresponding to the augmented matrix (I) above has one solution only given by $\begin{bmatrix} 0\\ 0\\ 0 \end{bmatrix}$ and therefore the eigenvectors are linearly independent, hence the 3 eigenvectors form a basis for $\mathbb{R}^3$.
c)
We now construct a matrix $P$ whose columns are the eigenvectors as follows
$P = \begin{bmatrix} - 1 & 0 & 1 \\ - 1 & 0 & 0 \\ 2 & 1 & 1 \end{bmatrix}$

Find $P^{-1}$ as follows:
Write the augmented matrix $P | I$ , where $I$ is the identity matrix
$\begin{bmatrix} - 1 & 0 & 1 & | & 1 & 0 & 0 \\ - 1 & 0 & 0 & | & 0 & 1 & 0 \\ 2 & 1 & 1 & | & 0 & 0 & 1 \end{bmatrix}$
Row reduce the above augmented matrix
$\begin{bmatrix} 1 & 0 & 0 & | & 0 & -1 & 0\\ 0 & 1 & 0 & | & -1 & 3 & 1\\ 0 & 0 & 1 & | & 1 & -1 & 0 \end{bmatrix}$
The above matrix has the form $I | P^{-1}$ where $P^{-1}$ is the inverse of matrix $P$ and is given by
$P^{-1} = \begin{bmatrix} 0 & -1 & 0\\ -1 & 3 & 1\\ 1 & -1 & 0 \end{bmatrix}$

Let $D$ be the diagonal matrix whose entries in the main diagonal are the eigenvalues and $P^{-1}$ be the inverse of matrix $P$. Matrix A may now be diagonalized as follows
$A = P D P^{-1} = \begin{bmatrix} \color{red}{- 1} & \color{blue}0 & \color{green}1 \\ \color{red}{- 1} & \color{blue}0 & \color{green} 0 \\ \color{red}2 & \color{blue} 1 & \color{green} 1 \end{bmatrix} \begin{bmatrix} \color{red}{- 1} & 0 & 0 \\ 0 & \color{blue} 1 & 0 \\ 0& 0 & \color{green}2 \end{bmatrix} \begin{bmatrix} 0 & -1 & 0\\ -1 & 3 & 1\\ 1 & -1 & 0\end{bmatrix}$
Use a calculator to check that the above diagonalization is correct.
In fact it is enough to check that $AP = PD$ which does not require the computation of $P^{-1}$.
Note that the order in which the eigenvectors are arranged in matrix $P$ and the eigenvalues in matrix $D$ is important. The eigenvector in column $k$ in matrix $P$ corresponds to the eigenvalue in row $k$ in matrix D. Note the colors used in matrix $P$ above.
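The factorization can again be verified in plain Python; all entries are integers, so the comparison is exact:

```python
# Verify A = P D P^-1 for Example 2 with exact integer arithmetic.
def matmul(A, B):
    """Product of two matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[2, -3, 0], [0, -1, 0], [1, 3, 1]]
P = [[-1, 0, 1], [-1, 0, 0], [2, 1, 1]]
D = [[-1, 0, 0], [0, 1, 0], [0, 0, 2]]
P_inv = [[0, -1, 0], [-1, 3, 1], [1, -1, 0]]

assert matmul(A, P) == matmul(P, D)       # the cheap check A P = P D
assert matmul(matmul(P, D), P_inv) == A   # the full factorization
```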

Example 3 (Case of a non-diagonalizable matrix explained)
Find the eigenvalues and corresponding eigenvectors of matrix $A = \begin{bmatrix} 2 & 0 & 4\\ 2 & -1 & 3\\ -2 & 1 & -3 \end{bmatrix}$ and diagonalize it if possible.

Solution

Find the eigenvalues using the characteristic polynomial given by
$Det (A - \lambda I)$ , where $I$ is the identity matrix and $Det$ is the determinant.
Substitute $A$ and $I$
$Det (A - \lambda I) = Det \left(\begin{bmatrix} 2 & 0 & 4\\ 2 & -1 & 3\\ -2 & 1 & -3 \end{bmatrix} - \lambda \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \right)$

$= Det \begin{bmatrix} 2-\lambda & 0 & 4\\ 2 & -1-\lambda & 3\\ -2 & 1 & -3-\lambda \end{bmatrix}$
Evaluate the determinant using the first row (it has 1 zero which makes calculations easy)
$= (2-\lambda)((-1-\lambda)(-3-\lambda)-3)+4(2 + 2 (-1-\lambda))$
Expand and simplify
$= -\lambda^3-2\lambda^2$
Find the eigenvalues by solving the characteristic equation
$Det (A - \lambda I) = 0$
Hence
$-\lambda^3-2\lambda^2 = 0$
gives the eigenvalues: $\lambda_1 = 0$   ,   $\lambda_2 = 0$   ,   $\lambda_3 = - 2$

Eigenvector corresponding to $\lambda = \lambda_1 = 0$
The eigenvector $\textbf x = \begin{bmatrix} x_1\\ x_2\\ x_3 \end{bmatrix}$ corresponding to $\lambda = 0$ is the solution to the system

$(A - \lambda_1 I) \begin{bmatrix} x_1\\ x_2\\ x_3 \end{bmatrix} = 0$

$\begin{bmatrix} 2 & 0 & 4\\ 2 & -1 & 3\\ -2 & 1 & -3 \end{bmatrix} \begin{bmatrix} x_1\\ x_2\\ x_3 \end{bmatrix} = 0$
Write the above system as an augmented matrix
$\begin{bmatrix} 2 & 0 & 4 &|& 0\\ 2 & -1 & 3 &|& 0\\ -2 & 1 & -3 &|& 0 \end{bmatrix}$

Row reduce using the Gauss-Jordan method
$\begin{bmatrix} 1 & 0 & 2 &|& 0\\ 0 & 1 & 1 &|& 0\\ 0 & 0 & 0 &|& 0 \end{bmatrix}$
$x_3$ is the free variable
$x_2 = -x_3$
$x_1 = -2x_3$
Vector $\textbf x$ is given by $\textbf x = x_3 \begin{bmatrix} -2 \\ -1 \\ 1 \end{bmatrix}$
The eigenvalue $\lambda = 0$ has algebraic multiplicity $2$ but only one linearly independent eigenvector (geometric multiplicity $1$), and therefore we may conclude that the eigenvectors of the given matrix cannot form a basis for $\mathbb{R}^3$. The given matrix is not diagonalizable.
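A quick plain-Python check that the vector found above is indeed in the nullspace of $A$ (the eigenspace for $\lambda = 0$). For completeness it also checks a representative eigenvector for $\lambda = -2$, namely $(-1, -1, 1)$, which is not computed in the text:

```python
# Verify the eigenvectors of the non-diagonalizable matrix of Example 3.
def matvec(A, x):
    """Product of a matrix (list of rows) and a column vector."""
    return [sum(a * b for a, b in zip(row, x)) for row in A]

A = [[2, 0, 4], [2, -1, 3], [-2, 1, -3]]
assert matvec(A, [-2, -1, 1]) == [0, 0, 0]    # lambda =  0 eigenvector
assert matvec(A, [-1, -1, 1]) == [2, 2, -2]   # lambda = -2 times (-1, -1, 1)
```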

Example 4
Knowing that $1$ and $3$ are eigenvalues of matrix $A = \begin{bmatrix} 2 & 0 & 0 & 1\\ 0 & 0 & 4 & 0\\ 0 & 3 & 0 & 0\\ 1 & 0 & 0 & 2 \end{bmatrix}$, find all the eigenvalues and corresponding eigenvectors of matrix $A$ and diagonalize it.

Solution
Eigenvalues and eigenvectors:
$\lambda_1 = -2\sqrt 3$ , $\textbf x_1 = \begin{bmatrix} 0 \\ - \dfrac{2}{\sqrt 3}\\ 1\\ 0 \end{bmatrix}$

$\lambda_2 = 1$ , $\textbf x_2 = \begin{bmatrix} -1 \\ 0\\ 0\\ 1 \end{bmatrix}$

$\lambda_3 = 3$ , $\textbf x_3 = \begin{bmatrix} 1 \\ 0\\ 0\\ 1 \end{bmatrix}$

$\lambda_4 = 2\sqrt 3$ , $\textbf x_4 = \begin{bmatrix} 0 \\ \dfrac{2}{\sqrt 3}\\ 1\\ 0 \end{bmatrix}$

$P = \begin{bmatrix} 0 & -1 & 1 & 0\\ - \dfrac{2}{\sqrt 3} & 0 & 0 & \dfrac{2}{\sqrt 3}\\ 1 & 0 & 0 & 1 \\ 0 & 1 & 1 & 0 \end{bmatrix}$ $P^{-1} = \begin{bmatrix} 0&-\frac{\sqrt{3}}{4}&\frac{1}{2}&0\\ -\frac{1}{2} & 0 & 0 & \frac{1}{2}\\ \frac{1}{2} & 0 & 0 & \frac{1}{2}\\ 0 & \frac{\sqrt{3}}{4} & \frac{1}{2}& 0 \end{bmatrix}$

$A = PDP^{-1} = \begin{bmatrix} 0 & -1 & 1 & 0\\ - \dfrac{2}{\sqrt 3} & 0 & 0 & \dfrac{2}{\sqrt 3}\\ 1 & 0 & 0 & 1 \\ 0 & 1 & 1 & 0 \end{bmatrix} \begin{bmatrix} -2\sqrt 3 & 0 & 0 & 0\\ 0 & 1 & 0 & 0 \\ 0 & 0 & 3 & 0 \\ 0 & 0 & 0 & 2\sqrt 3 \end{bmatrix} \begin{bmatrix} 0&-\frac{\sqrt{3}}{4}&\frac{1}{2}&0\\ -\frac{1}{2} & 0 & 0 & \frac{1}{2}\\ \frac{1}{2} & 0 & 0 & \frac{1}{2}\\ 0 & \frac{\sqrt{3}}{4} & \frac{1}{2}& 0 \end{bmatrix}$
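Because the entries of $P$ and $D$ involve $\sqrt 3$, an exact comparison is not appropriate; the sketch below checks $AP = PD$ in plain Python with a floating-point tolerance:

```python
import math

# Check A P = P D for Example 4; entries involve sqrt(3), so compare
# entrywise with a floating-point tolerance instead of exact equality.
def matmul(A, B):
    """Product of two matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

s = math.sqrt(3)
A = [[2, 0, 0, 1], [0, 0, 4, 0], [0, 3, 0, 0], [1, 0, 0, 2]]
P = [[0, -1, 1, 0], [-2 / s, 0, 0, 2 / s], [1, 0, 0, 1], [0, 1, 1, 0]]
D = [[-2 * s, 0, 0, 0], [0, 1, 0, 0], [0, 0, 3, 0], [0, 0, 0, 2 * s]]

AP, PD = matmul(A, P), matmul(P, D)
assert all(math.isclose(x, y, abs_tol=1e-12)
           for row_ap, row_pd in zip(AP, PD) for x, y in zip(row_ap, row_pd))
```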

Example 5
Calculate $A^{10}$ where $A$ is the matrix in example 4.

Solution
$A = P D P^{-1}$
$A^2 = (P D P^{-1})(P D P^{-1}) = P D^2 P^{-1}$
$A^3 = A^2 A = P D^2 P^{-1} (P D P^{-1}) = P D^3 P^{-1}$
. . .
etc ..
. . .
$A^n = P D^n P^{-1}$ for any positive integer $n$.
Hence
$A^{10} = P D^{10} P^{-1}$
Matrix $A$ was diagonalized in example 4 and is given by
$A = PDP^{-1} = \begin{bmatrix} 0 & -1 & 1 & 0\\ - \dfrac{2}{\sqrt 3} & 0 & 0 & \dfrac{2}{\sqrt 3}\\ 1 & 0 & 0 & 1 \\ 0 & 1 & 1 & 0 \end{bmatrix} \begin{bmatrix} -2\sqrt 3 & 0 & 0 & 0\\ 0 & 1 & 0 & 0 \\ 0 & 0 & 3 & 0 \\ 0 & 0 & 0 & 2\sqrt 3 \end{bmatrix} \begin{bmatrix} 0&-\frac{\sqrt{3}}{4}&\frac{1}{2}&0\\ -\frac{1}{2} & 0 & 0 & \frac{1}{2}\\ \frac{1}{2} & 0 & 0 & \frac{1}{2}\\ 0 & \frac{\sqrt{3}}{4} & \frac{1}{2}& 0 \end{bmatrix}$

Use the formula $A^{10} = P D^{10} P^{-1}$
$A^{10} = \begin{bmatrix} 0 & -1 & 1 & 0\\ - \dfrac{2}{\sqrt 3} & 0 & 0 & \dfrac{2}{\sqrt 3}\\ 1 & 0 & 0 & 1 \\ 0 & 1 & 1 & 0 \end{bmatrix} \begin{bmatrix} -2\sqrt 3 & 0 & 0 & 0\\ 0 & 1 & 0 & 0 \\ 0 & 0 & 3 & 0 \\ 0 & 0 & 0 & 2\sqrt 3 \end{bmatrix} ^{10} \begin{bmatrix} 0&-\frac{\sqrt{3}}{4}&\frac{1}{2}&0\\ -\frac{1}{2} & 0 & 0 & \frac{1}{2}\\ \frac{1}{2} & 0 & 0 & \frac{1}{2}\\ 0 & \frac{\sqrt{3}}{4} & \frac{1}{2}& 0 \end {bmatrix}$

Note that it is much more efficient to calculate the power of a diagonal matrix: simply raise each diagonal entry to that power.
$A^{10} = \begin{bmatrix} 0 & -1 & 1 & 0\\ - \dfrac{2}{\sqrt 3} & 0 & 0 & \dfrac{2}{\sqrt 3}\\ 1 & 0 & 0 & 1 \\ 0 & 1 & 1 & 0 \end{bmatrix} \begin{bmatrix} 248832 & 0 & 0 & 0\\ 0 & 1 & 0 & 0 \\ 0 & 0 & 59049 & 0 \\ 0 & 0 & 0 & 248832 \end{bmatrix} \begin{bmatrix} 0&-\frac{\sqrt{3}}{4}&\frac{1}{2} & 0 \\ -\frac{1}{2} & 0 & 0 & \frac{1}{2}\\ \frac{1}{2} & 0 & 0 & \frac{1}{2}\\ 0 & \frac{\sqrt{3}}{4} & \frac{1}{2}& 0 \end {bmatrix}$

Multiply and simplify

$A^{10} = \begin{bmatrix} 29525 & 0 & 0 & 29524\\ 0 & 248832 & 0 & 0\\ 0 & 0 & 248832 & 0\\ 29524 & 0 & 0 & 29525 \end{bmatrix}$
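The result can be cross-checked by brute force: multiplying $A$ by itself ten times with exact integer arithmetic must give the same matrix.

```python
from functools import reduce

# Cross-check A^10 by repeated multiplication (exact integers),
# independently of the diagonalization.
def matmul(A, B):
    """Product of two matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[2, 0, 0, 1], [0, 0, 4, 0], [0, 3, 0, 0], [1, 0, 0, 2]]
A10 = reduce(matmul, [A] * 10)
assert A10 == [[29525, 0, 0, 29524],
               [0, 248832, 0, 0],
               [0, 0, 248832, 0],
               [29524, 0, 0, 29525]]
```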

## Questions (with solutions given below)

The following questions show that invertibility and diagonalizability of matrices are not related.
• Part 1
Show that matrix $A = \begin{bmatrix} 4 & - 1 \\ 1 & 2 \end{bmatrix}$ is invertible but not diagonalizable.
• Part 2
Show that matrix $A = \begin{bmatrix} -1 & -2 \\ 0 & 0 \end{bmatrix}$ is not invertible (singular) but diagonalizable.
• Part 3
Show that matrix $A = \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}$ is neither invertible nor diagonalizable.
• Part 4
Show that matrix $A = \begin{bmatrix} -1 & 1 \\ 0 & 1 \end{bmatrix}$ is both invertible and diagonalizable.

### Solutions to the Above Questions

• Part 1
Given $A = \begin{bmatrix} 4 & - 1 \\ 1 & 2 \end{bmatrix}$
The determinant of matrix $A$ is not equal to zero and therefore $A$ is invertible.
Eigenvalues: $\lambda = 3$ of multiplicity $2$, with only one linearly independent eigenvector, given by $\begin{bmatrix} 1\\ 1 \end{bmatrix}$
A single eigenvector cannot form a basis for $\mathbb{R}^2$ and therefore matrix $A$ is not diagonalizable.

• Part 2
Given $A = \begin{bmatrix} -1 & -2 \\ 0 & 0 \end{bmatrix}$
The determinant of matrix $A$ is equal to zero and therefore this matrix is not invertible.
Eigenvalue: $\lambda = -1$ , eigenvector $\begin{bmatrix} 1\\ 0 \end{bmatrix}$
Eigenvalue: $\lambda = 0$ , eigenvector $\begin{bmatrix} -2\\ 1 \end{bmatrix}$
$P = \begin{bmatrix} 1 & -2\\ 0 & 1 \end{bmatrix}$
Matrix $A$ is diagonalizable.
$A = PDP^{-1} = \begin{bmatrix} 1 & -2\\ 0 & 1 \end{bmatrix} \begin{bmatrix} -1 & 0\\ 0 & 0 \end{bmatrix} \begin{bmatrix} 1 & 2\\ 0 & 1 \end{bmatrix}$

• Part 3
$A = \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}$
Matrix $A$ has a determinant equal to zero and therefore is not invertible.
Eigenvalue: $\lambda = 0$ of multiplicity $2$, with only one linearly independent eigenvector, given by $\begin{bmatrix} 1 \\ 0 \end{bmatrix}$
A single eigenvector cannot form a basis for $\mathbb{R}^2$ and therefore matrix $A$ is not diagonalizable.

• Part 4
Given $A = \begin{bmatrix} -1 & 1 \\ 0 & 1 \end{bmatrix}$
The determinant of matrix $A$ is not equal to zero and therefore matrix $A$ is invertible.
Eigenvalue: $\lambda = -1$ , eigenvector $\begin{bmatrix} 1\\ 0 \end{bmatrix}$
Eigenvalue: $\lambda = 1$ , eigenvector $\begin{bmatrix} 1\\ 2 \end{bmatrix}$
$P = \begin{bmatrix} 1 & 1\\ 0 & 2 \end{bmatrix}$
Matrix $A$ is diagonalizable.
$A = PDP^{-1} = \begin{bmatrix} 1 & 1\\ 0 & 2 \end{bmatrix} \begin{bmatrix} -1 & 0\\ 0 & 1 \end{bmatrix} \begin{bmatrix} 1 & -\frac{1}{2}\\ 0 & \frac{1}{2} \end{bmatrix}$
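The diagonalizations claimed in Parts 2 and 4 can be verified in plain Python (the entries $\pm\frac{1}{2}$ are represented exactly in binary floating point, so plain equality works):

```python
# Verify the factorizations A = P D P^-1 for Parts 2 and 4.
def matmul(A, B):
    """Product of two matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

# Part 2: singular but diagonalizable
A2 = [[-1, -2], [0, 0]]
P2, D2, P2_inv = [[1, -2], [0, 1]], [[-1, 0], [0, 0]], [[1, 2], [0, 1]]
assert matmul(matmul(P2, D2), P2_inv) == A2

# Part 4: invertible and diagonalizable
A4 = [[-1, 1], [0, 1]]
P4, D4, P4_inv = [[1, 1], [0, 2]], [[-1, 0], [0, 1]], [[1, -0.5], [0, 0.5]]
assert matmul(matmul(P4, D4), P4_inv) == A4
```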