In the beginning sections of this chapter, we outlined procedures for solving systems of linear differential equations of the form \begin{equation*} \begin{pmatrix} dx/dt \\ dy/dt \end{pmatrix} = \begin{pmatrix} a & b \\ c & d \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = A \begin{pmatrix} x \\ y \end{pmatrix} \end{equation*} by determining the eigenvalues of $A\text{,}$ but we have only justified the case where $A$ has distinct real eigenvalues. However, we have considered the following special cases for $A$ \begin{equation*} \begin{pmatrix} \lambda & 0 \\ 0 & \mu \end{pmatrix}, \begin{pmatrix} \alpha & \beta \\ -\beta & \alpha \end{pmatrix}, \begin{pmatrix} \lambda & 0 \\ 0 & \lambda \end{pmatrix}, \begin{pmatrix} \lambda & 1 \\ 0 & \lambda \end{pmatrix}. \end{equation*} Although it may seem that we have limited ourselves by attacking only a very small part of the problem of finding solutions for $\mathbf x' = A \mathbf x\text{,}$ we are actually very close to providing a complete classification of all solutions. We will now show that we can transform any $2 \times 2$ system of first-order linear differential equations with constant coefficients into one of these special systems by using a change of coordinates.

A linear map or linear transformation on ${\mathbb R}^2$ is a function $T: {\mathbb R}^2 \to {\mathbb R}^2$ that is defined by a matrix. That is, \begin{equation*} T \begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} a & b \\ c & d \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix}. \end{equation*} When there is no confusion, we will think of the linear map $T: {\mathbb R}^2 \to {\mathbb R}^2$ and the matrix \begin{equation*} \begin{pmatrix} a & b \\ c & d \end{pmatrix} \end{equation*} as interchangeable.

We will say that $T: {\mathbb R}^2 \to {\mathbb R}^2$ is an invertible linear map if we can find a second linear map $S$ such that $T \circ S = S \circ T = I\text{,}$ where $I$ is the identity transformation. In terms of matrices, this means that we can find a matrix $S$ such that \begin{equation*} TS = ST = I, \end{equation*} where \begin{equation*} I = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} \end{equation*} is the $2 \times 2$ identity matrix. We write $T^{-1}$ for the inverse matrix of $T\text{.}$ It is easy to check that if $\det T \neq 0\text{,}$ the inverse of \begin{equation*} T = \begin{pmatrix} a & b \\ c & d \end{pmatrix} \end{equation*} is \begin{equation*} T^{-1} = \frac{1}{\det T} \begin{pmatrix} d & - b \\ -c & a \end{pmatrix}. \end{equation*}
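As a quick check (not part of the text), the inverse formula can be verified with exact rational arithmetic; the helper names `inverse_2x2` and `matmul_2x2` below are illustrative.

```python
from fractions import Fraction

def inverse_2x2(T):
    """Invert [[a, b], [c, d]] using the formula (1/det) [[d, -b], [-c, a]]."""
    (a, b), (c, d) = T
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is not invertible")
    return [[d / det, -b / det],
            [-c / det, a / det]]

def matmul_2x2(S, T):
    """Multiply two 2x2 matrices."""
    return [[sum(S[i][k] * T[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Using Fraction keeps every entry exact, so T T^{-1} = I holds on the nose.
T = [[Fraction(1), Fraction(1)], [Fraction(2), Fraction(-1)]]
Tinv = inverse_2x2(T)
I = [[Fraction(1), Fraction(0)], [Fraction(0), Fraction(1)]]
assert matmul_2x2(T, Tinv) == I and matmul_2x2(Tinv, T) == I
```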

##### Proof

Suppose that we consider a linear system $${\mathbf y}' = (T^{-1} A T) {\mathbf y}\label{equation-linear06-change-of-coordinates}\tag{3.13}$$ where $T$ is an invertible matrix. If ${\mathbf y}(t)$ is a solution of (3.13), we claim that ${\mathbf x}(t) = T {\mathbf y}(t)$ solves the equation ${\mathbf x}' = A {\mathbf x}\text{.}$ Indeed, \begin{align*} {\mathbf x}'(t) & = (T {\mathbf y})'(t)\\ & = T {\mathbf y}'(t)\\ & = T( (T^{-1} A T) {\mathbf y}(t))\\ & = A (T {\mathbf y}(t))\\ & = A {\mathbf x}(t). \end{align*} We can think of this in two ways.

1. A linear map $T$ converts solutions of ${\mathbf y}' = (T^{-1} A T) {\mathbf y}$ to solutions of ${\mathbf x}' = A {\mathbf x}\text{.}$
2. The inverse of a linear map $T$ takes solutions of ${\mathbf x}' = A {\mathbf x}$ to solutions of ${\mathbf y}' = (T^{-1} A T) {\mathbf y}\text{.}$
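Both statements can be sanity-checked numerically. The sketch below (an illustration, not part of the original text) integrates ${\mathbf y}' = (T^{-1}AT){\mathbf y}$ and ${\mathbf x}' = A {\mathbf x}$ with Euler's method for a sample $A$ and $T$ and confirms that $T{\mathbf y}(t)$ tracks ${\mathbf x}(t)$.

```python
def matmul(X, Y):
    """Multiply two 2x2 matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def matvec(M, v):
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

def euler(M, v, h, steps):
    """Integrate v' = M v with Euler's method from the initial value v."""
    for _ in range(steps):
        dv = matvec(M, v)
        v = [v[0] + h * dv[0], v[1] + h * dv[1]]
    return v

A = [[1.0, 2.0], [4.0, 3.0]]            # sample coefficient matrix
T = [[1.0, 1.0], [2.0, -1.0]]           # columns are eigenvectors of A
Tinv = [[1/3, 1/3], [2/3, -1/3]]
B = matmul(Tinv, matmul(A, T))          # T^-1 A T (approximately diag(5, -1))

y0 = [1.0, 1.0]
h, steps = 1e-4, 10_000                 # integrate out to t = 1
y1 = euler(B, y0, h, steps)             # solve y' = (T^-1 A T) y
x1 = euler(A, matvec(T, y0), h, steps)  # solve x' = A x with x(0) = T y(0)

# Mapping the y-solution through T reproduces the x-solution.
Ty1 = matvec(T, y1)
assert all(abs(Ty1[i] - x1[i]) < 1e-6 * (1 + abs(x1[i])) for i in range(2))
```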

We are now in a position to solve our problem of finding solutions of an arbitrary linear system \begin{equation*} \begin{pmatrix} x' \\ y' \end{pmatrix} = \begin{pmatrix} a & b \\ c & d \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix}. \end{equation*}

Consider the system ${\mathbf x}' = A {\mathbf x}\text{,}$ where $A$ has two real, distinct eigenvalues $\lambda_1$ and $\lambda_2$ with eigenvectors ${\mathbf v}_1$ and ${\mathbf v}_2\text{,}$ respectively. Let $T$ be the matrix with columns ${\mathbf v}_1$ and ${\mathbf v}_2\text{.}$ Since the eigenvectors ${\mathbf v}_1$ and ${\mathbf v}_2$ are linearly independent, $T$ is invertible. If ${\mathbf e}_1 = (1, 0)$ and ${\mathbf e}_2 = (0, 1)\text{,}$ then $T {\mathbf e}_i = {\mathbf v}_i$ for $i = 1, 2\text{.}$ Consequently, $T^{-1} {\mathbf v}_i = {\mathbf e}_i$ for $i = 1, 2\text{.}$ Thus, we have \begin{align*} (T^{-1} A T) {\mathbf e}_i & = T^{-1} A {\mathbf v}_i\\ & = T^{-1} (\lambda_i {\mathbf v}_i)\\ & = \lambda_i T^{-1} {\mathbf v}_i\\ & = \lambda_i {\mathbf e}_i \end{align*} for $i = 1, 2\text{.}$ Therefore, the matrix $T^{-1} A T$ is in canonical form, \begin{equation*} T^{-1} A T = \begin{pmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{pmatrix}. \end{equation*} The eigenvalues of the matrix $T^{-1} A T$ are $\lambda_1$ and $\lambda_2$ with eigenvectors $(1, 0)$ and $(0, 1)\text{,}$ respectively. Thus, the general solution of \begin{equation*} {\mathbf y}' = (T^{-1}AT) {\mathbf y} \end{equation*} is \begin{equation*} {\mathbf y}(t) = \alpha e^{\lambda_1 t} \begin{pmatrix} 1 \\ 0 \end{pmatrix} + \beta e^{\lambda_2 t} \begin{pmatrix} 0\\ 1 \end{pmatrix}. \end{equation*} Hence, the general solution of \begin{equation*} {\mathbf x}' = A {\mathbf x} \end{equation*} is \begin{align*} T {\mathbf y}(t) & = T \left( \alpha e^{\lambda_1 t} \begin{pmatrix} 1 \\ 0 \end{pmatrix} + \beta e^{\lambda_2 t} \begin{pmatrix} 0 \\ 1 \end{pmatrix} \right)\\ & = \alpha e^{\lambda_1 t} T \begin{pmatrix} 1 \\ 0\end{pmatrix} + \beta e^{\lambda_2 t} T \begin{pmatrix} 0 \\ 1 \end{pmatrix}\\ & = \alpha e^{\lambda_1 t} \mathbf v_1 + \beta e^{\lambda_2 t} \mathbf v_2. \end{align*}

##### Example 3.39

Suppose $d{\mathbf x}/dt = A {\mathbf x}\text{,}$ where \begin{equation*} A = \begin{pmatrix} 1 & 2 \\ 4 & 3 \end{pmatrix}. \end{equation*} The eigenvalues of $A$ are $\lambda_1 = 5$ and $\lambda_2 = -1$ and the associated eigenvectors are $(1, 2)$ and $(1, -1)\text{,}$ respectively. In this case, our matrix $T$ is \begin{equation*} \begin{pmatrix} 1 & 1 \\ 2 & -1 \end{pmatrix}. \end{equation*} If ${\mathbf e}_1 = (1, 0)$ and ${\mathbf e}_2 = (0, 1)\text{,}$ then $T {\mathbf e}_i = {\mathbf v}_i$ for $i = 1, 2\text{.}$ Consequently, $T^{-1} {\mathbf v}_i = {\mathbf e}_i$ for $i = 1, 2\text{,}$ where \begin{equation*} T^{-1} = \begin{pmatrix} 1/3 & 1/3 \\ 2/3 & -1/3 \end{pmatrix}. \end{equation*} Thus, \begin{equation*} T^{-1} A T = \begin{pmatrix} 1/3 & 1/3 \\ 2/3 & -1/3 \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 4 & 3 \end{pmatrix} \begin{pmatrix} 1 & 1 \\ 2 & -1 \end{pmatrix} = \begin{pmatrix} 5 & 0 \\ 0 & -1 \end{pmatrix}. \end{equation*} The eigenvalues of the matrix \begin{equation*} \begin{pmatrix} 5 & 0 \\ 0 & -1 \end{pmatrix} \end{equation*} are $\lambda_1 = 5$ and $\lambda_2 = -1$ with eigenvectors $(1, 0)$ and $(0, 1)\text{,}$ respectively. Thus, the general solution of \begin{equation*} {\mathbf y}' = (T^{-1}AT) {\mathbf y} \end{equation*} is \begin{equation*} {\mathbf y}(t) = \alpha e^{5t} \begin{pmatrix} 1 \\ 0 \end{pmatrix} + \beta e^{-t} \begin{pmatrix} 0\\ 1 \end{pmatrix}. 
\end{equation*} Hence, the general solution of \begin{equation*} {\mathbf x}' = A {\mathbf x} \end{equation*} is \begin{align*} T {\mathbf y}(t) & = \begin{pmatrix} 1 & 1 \\ 2 & -1 \end{pmatrix} \left( \alpha e^{5t} \begin{pmatrix} 1 \\ 0 \end{pmatrix} + \beta e^{-t} \begin{pmatrix} 0 \\ 1 \end{pmatrix} \right)\\ & = \alpha e^{5t} \begin{pmatrix} 1 \\ 2 \end{pmatrix} + \beta e^{-t} \begin{pmatrix} 1 \\ -1 \end{pmatrix}. \end{align*} The linear map $T$ converts the phase portrait of the system ${\mathbf y}' = (T^{-1}AT) {\mathbf y}$ (Figure 3.40) to the phase portrait of the system ${\mathbf x}' = A {\mathbf x}$ (Figure 3.41).
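The diagonalization in this example can be double-checked with exact rational arithmetic (a verification sketch, not part of the text):

```python
from fractions import Fraction as F

def matmul(X, Y):
    """Multiply two 2x2 matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A    = [[F(1), F(2)], [F(4), F(3)]]
T    = [[F(1), F(1)], [F(2), F(-1)]]       # columns: eigenvectors (1,2), (1,-1)
Tinv = [[F(1, 3), F(1, 3)], [F(2, 3), F(-1, 3)]]

# T^-1 A T should be the diagonal matrix of eigenvalues.
canonical = matmul(Tinv, matmul(A, T))
assert canonical == [[F(5), F(0)], [F(0), F(-1)]]
```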

Suppose the matrix \begin{equation*} A = \begin{pmatrix} a & b \\ c & d \end{pmatrix} \end{equation*} in the system ${\mathbf x}' = A {\mathbf x}$ has complex eigenvalues. In this case, the characteristic polynomial $p(\lambda) = \lambda^2 - (a + d)\lambda + (ad - bc)$ will have roots $\lambda = \alpha + i \beta$ and $\overline{\lambda} = \alpha - i \beta\text{,}$ where \begin{align*} \alpha & = \frac{a + d}{2}\\ \beta & = \frac{\sqrt{-(a - d)^2 - 4bc}}{2}. \end{align*} The eigenvalues $\lambda$ and $\overline{\lambda}$ are complex conjugates. Now, suppose that the eigenvalue $\lambda = \alpha + i \beta$ has an eigenvector of the form \begin{equation*} \mathbf v = {\mathbf v}_1 + i {\mathbf v}_2, \end{equation*} where $\mathbf v_1$ and $\mathbf v_2$ are real vectors. Then $\overline{\mathbf v} = {\mathbf v}_1 - i {\mathbf v}_2$ is an eigenvector for $\overline{\lambda}\text{,}$ since \begin{equation*} A \overline{\mathbf v} = \overline{A \mathbf v} = \overline{\lambda \mathbf v} = \overline{\lambda} \overline{\mathbf v}. \end{equation*} Consequently, if $A$ is a real matrix with complex eigenvalues, one of the eigenvalues determines the other.
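The conjugation argument is easy to confirm in code. The example below (a sample matrix chosen for illustration, not from the text) checks that conjugating an eigenpair of a real matrix yields another eigenpair.

```python
# For a real matrix A, if A v = λ v, then A v̄ = λ̄ v̄.
def matvec(A, v):
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

A = [[0.0, 1.0], [-1.0, 0.0]]   # real matrix with eigenvalues ±i
lam = 1j
v = [1 + 0j, 1j]                # eigenvector for λ = i

# A v = λ v ...
assert matvec(A, v) == [lam * z for z in v]

# ... and therefore the conjugate pair is also an eigenpair.
vbar = [z.conjugate() for z in v]
assert matvec(A, vbar) == [lam.conjugate() * z for z in vbar]
```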

The system ${\mathbf y}' = (T^{-1} AT ) {\mathbf y}$ is in one of the canonical forms and has a phase portrait that is a spiral sink ($\alpha \lt 0$), a center ($\alpha = 0$), or a spiral source ($\alpha \gt 0$). After a change of coordinates, the phase portrait of ${\mathbf x}' = A {\mathbf x}$ is equivalent to a sink, center, or source.

##### Example 3.44

Suppose that we wish to find the solutions of the second-order equation \begin{equation*} 2x'' + 2x' + x = 0. \end{equation*} This particular equation might model a damped harmonic oscillator. If we rewrite this second-order equation as a first-order system, we have \begin{align*} x' & = y\\ y' & = - \frac{1}{2} x - y, \end{align*} or equivalently $\mathbf x' = A \mathbf x\text{,}$ where \begin{equation*} A = \begin{pmatrix} 0 & 1 \\ - 1/2 & - 1 \end{pmatrix}. \end{equation*} The eigenvalues are \begin{equation*} - \frac{1}{2} \pm i \frac{1}{2}, \end{equation*} with eigenvectors \begin{equation*} \begin{pmatrix} -1 \\ 1 \end{pmatrix} \pm i \begin{pmatrix} -1 \\ 0 \end{pmatrix}, \end{equation*} respectively. Therefore, we can take $T$ to be the matrix whose columns are ${\mathbf v}_1 = (-1, 1)$ and ${\mathbf v}_2 = (-1, 0)\text{,}$ \begin{equation*} T = \begin{pmatrix} -1 & -1 \\ 1 & 0 \end{pmatrix}, \end{equation*} and \begin{equation*} T^{-1} A T = \begin{pmatrix} 0 & 1 \\ -1 & -1 \end{pmatrix} \begin{pmatrix} 0 & 1 \\ -1/2 & -1 \end{pmatrix} \begin{pmatrix} -1 & -1 \\ 1 & 0 \end{pmatrix} = \begin{pmatrix} -1/2 & 1/2 \\ -1/2 & -1/2 \end{pmatrix}, \end{equation*} which is in the canonical form \begin{equation*} \begin{pmatrix} \alpha & \beta \\ - \beta & \alpha \end{pmatrix}. \end{equation*} Since the general solution of ${\mathbf y}' = (T^{-1} A T) {\mathbf y}$ is \begin{equation*} {\mathbf y}(t) = c_1 e^{-t/2} \begin{pmatrix} \cos(t/2) \\ -\sin(t/2) \end{pmatrix} + c_2 e^{-t/2} \begin{pmatrix} \sin(t/2) \\ \cos(t/2) \end{pmatrix}, \end{equation*} the general solution of ${\mathbf x}' = A {\mathbf x}$ is \begin{align*} T {\mathbf y}(t) & = \begin{pmatrix} -1 & -1 \\ 1 & 0 \end{pmatrix} \left[ c_1 e^{-t/2} \begin{pmatrix} \cos(t/2) \\ -\sin(t/2) \end{pmatrix} + c_2 e^{-t/2} \begin{pmatrix} \sin(t/2) \\ \cos(t/2) \end{pmatrix} \right]\\ & = c_1 e^{-t/2} \begin{pmatrix} - \cos(t/2) + \sin(t/2) \\ \cos(t/2) \end{pmatrix} + c_2 e^{-t/2} \begin{pmatrix} - \cos(t/2) - \sin(t/2) \\ \sin(t/2) \end{pmatrix}. 
\end{align*} The phase portrait for this system is given in Figure 3.45.
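As with the previous example, the change of coordinates here can be verified exactly (a check sketch, not part of the text):

```python
from fractions import Fraction as F

def matmul(X, Y):
    """Multiply two 2x2 matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A    = [[F(0), F(1)], [F(-1, 2), F(-1)]]
T    = [[F(-1), F(-1)], [F(1), F(0)]]      # columns: real and imaginary parts
Tinv = [[F(0), F(1)], [F(-1), F(-1)]]

# T^-1 A T should be the canonical form with α = -1/2 and β = 1/2.
canonical = matmul(Tinv, matmul(A, T))
assert canonical == [[F(-1, 2), F(1, 2)], [F(-1, 2), F(-1, 2)]]
```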

Now suppose that $A$ has a single real eigenvalue $\lambda\text{.}$ Since the characteristic polynomial of $A$ is $p(\lambda) = \lambda^2 - (a + d)\lambda + (ad - bc)\text{,}$ this happens exactly when the discriminant $(a + d)^2 - 4(ad - bc)$ vanishes, in which case $A$ has the repeated eigenvalue $\lambda = (a + d)/2\text{.}$
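As a concrete check (not part of the text), the matrix of Example 3.48 below has a vanishing discriminant and the repeated eigenvalue $3$:

```python
# For a repeated eigenvalue, the discriminant (a + d)^2 - 4(ad - bc) is zero
# and the root of the characteristic polynomial is (a + d)/2.
# Entries taken from the matrix in Example 3.48.
a, b, c, d = 5, 1, -4, 1
disc = (a + d) ** 2 - 4 * (a * d - b * c)
assert disc == 0
assert (a + d) / 2 == 3
```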

If $\lambda$ has two linearly independent eigenvectors, then $A = \lambda I\text{,}$ the system is uncoupled, and it is easily solved. That is, we can solve each equation in the system \begin{align*} x' & = \lambda x\\ y' & = \lambda y \end{align*} separately to obtain the general solution \begin{align*} x & = c_1 e^{\lambda t}\\ y & = c_2 e^{\lambda t}. \end{align*}

##### Example 3.48

Consider the system $\mathbf x' = A \mathbf x\text{,}$ where \begin{equation*} A = \begin{pmatrix} 5 & 1 \\ -4 & 1 \end{pmatrix}. \end{equation*} The characteristic polynomial of $A$ is $\lambda^2 - 6 \lambda + 9 = (\lambda - 3)^2\text{,}$ so we have only a single eigenvalue $\lambda = 3$ with eigenvector $\mathbf v = (1, -2)\text{.}$ Any other eigenvector for $\lambda$ is a multiple of $\mathbf v\text{.}$ If we choose $\mathbf w = (1, 0)\text{,}$ then $\mathbf v$ and $\mathbf w$ are linearly independent. Furthermore, \begin{equation*} A \mathbf w = \begin{pmatrix} 5 \\ - 4 \end{pmatrix} = 2 \begin{pmatrix} 1 \\ -2 \end{pmatrix} + \lambda \begin{pmatrix} 1 \\ 0 \end{pmatrix} = 2 \begin{pmatrix} 1 \\ -2 \end{pmatrix} + 3 \begin{pmatrix} 1 \\ 0 \end{pmatrix}. \end{equation*} So we can let $\mathbf u = (1/2) \mathbf w = (1/2, 0)\text{,}$ so that $A \mathbf u = \mathbf v + 3 \mathbf u\text{.}$ Therefore, the matrix that we seek is \begin{equation*} T = \begin{pmatrix} 1 & 1/2 \\ -2 & 0 \end{pmatrix}, \end{equation*} and \begin{equation*} T^{-1} A T = \begin{pmatrix} 0 & -1/2 \\ 2 & 1 \end{pmatrix} \begin{pmatrix} 5 & 1 \\ -4 & 1 \end{pmatrix} \begin{pmatrix} 1 & 1/2 \\ -2 & 0\end{pmatrix} = \begin{pmatrix} 3 & 1 \\ 0 & 3 \end{pmatrix}. \end{equation*} From Section 3.3, we know that the general solution to the system \begin{equation*} \begin{pmatrix} dx/dt \\ dy/dt \end{pmatrix} = \begin{pmatrix} 3 & 1 \\ 0 & 3 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} \end{equation*} is \begin{equation*} \mathbf y(t) = c_1 e^{3t} \begin{pmatrix} 1 \\ 0 \end{pmatrix} + c_2 e^{3t} \begin{pmatrix} t \\ 1 \end{pmatrix}. 
\end{equation*} Therefore, the general solution to \begin{equation*} \begin{pmatrix} dx/dt \\ dy/dt \end{pmatrix} = \begin{pmatrix} 5 & 1 \\ -4 & 1 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} \end{equation*} is \begin{align*} \mathbf x(t) \amp = T \mathbf y(t)\\ \amp = c_1 e^{3t} T \begin{pmatrix} 1 \\ 0 \end{pmatrix} + c_2 e^{3t} T \begin{pmatrix} t \\ 1 \end{pmatrix}\\ \amp = c_1 e^{3t} \begin{pmatrix} 1 \\ -2 \end{pmatrix} + c_2 e^{3t} \begin{pmatrix} 1/2 + t \\ -2t \end{pmatrix}. \end{align*} This solution agrees with the solution that we found in Example 3.37.
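This example, too, can be verified exactly by recomputing $T^{-1}$ from the determinant formula (a check sketch, not part of the text):

```python
from fractions import Fraction as F

def inv2(M):
    """Inverse of a 2x2 matrix via the determinant formula."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def matmul(X, Y):
    """Multiply two 2x2 matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[F(5), F(1)], [F(-4), F(1)]]
T = [[F(1), F(1, 2)], [F(-2), F(0)]]       # columns: v = (1,-2), u = (1/2, 0)

assert inv2(T) == [[F(0), F(-1, 2)], [F(2), F(1)]]
# T^-1 A T is the canonical form with the repeated eigenvalue 3.
assert matmul(inv2(T), matmul(A, T)) == [[F(3), F(1)], [F(0), F(3)]]
```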

In practice, we find solutions to linear systems using the methods that we outlined in Sections 3.2–3.4. What we have demonstrated in this section is that those solutions are exactly the ones that we want.

• A linear map $T$ is invertible if and only if $\det T \neq 0\text{.}$
• A linear map $T$ converts solutions of ${\mathbf y}' = (T^{-1} A T) {\mathbf y}$ to solutions of ${\mathbf x}' = A {\mathbf x}\text{.}$
• The inverse of a linear map $T$ takes solutions of ${\mathbf x}' = A {\mathbf x}$ to solutions of ${\mathbf y}' = (T^{-1} A T) {\mathbf y}\text{.}$
• A change of coordinates converts the system ${\mathbf x}' = A {\mathbf x}$ to one of the following special cases, \begin{equation*} \begin{pmatrix} \lambda & 0 \\ 0 & \mu \end{pmatrix}, \begin{pmatrix} \alpha & \beta \\ -\beta & \alpha \end{pmatrix}, \begin{pmatrix} \lambda & 0 \\ 0 & \lambda \end{pmatrix}, \begin{pmatrix} \lambda & 1 \\ 0 & \lambda \end{pmatrix}. \end{equation*}