
## Section 3.6 Changing Coordinates

In the beginning sections of this chapter, we outlined procedures for solving systems of linear differential equations of the form

\begin{equation*} \begin{pmatrix} dx/dt \\ dy/dt \end{pmatrix} = \begin{pmatrix} a & b \\ c & d \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = A \begin{pmatrix} x \\ y \end{pmatrix} \end{equation*}

by determining the eigenvalues of $A\text{,}$ but we have justified this procedure only in the case where $A$ has distinct real eigenvalues. However, we have considered the following special cases for $A\text{:}$

\begin{equation*} \begin{pmatrix} \lambda & 0 \\ 0 & \mu \end{pmatrix}, \begin{pmatrix} \alpha & \beta \\ -\beta & \alpha \end{pmatrix}, \begin{pmatrix} \lambda & 0 \\ 0 & \lambda \end{pmatrix}, \begin{pmatrix} \lambda & 1 \\ 0 & \lambda \end{pmatrix}. \end{equation*}

Although it may seem that we have limited ourselves by attacking only a very small part of the problem of finding solutions for $\mathbf x' = A \mathbf x\text{,}$ we are actually very close to providing a complete classification of all solutions. We will now show that we can transform any $2 \times 2$ system of first-order linear differential equations with constant coefficients into one of these special systems by using a change of coordinates.

### Subsection 3.6.1 Linear Maps

A linear map or linear transformation on ${\mathbb R}^2$ is a function $T: {\mathbb R}^2 \to {\mathbb R}^2$ that is defined by a matrix. That is,

\begin{equation*} T \begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} a & b \\ c & d \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix}. \end{equation*}

When there is no confusion, we will think of the linear map $T: {\mathbb R}^2 \to {\mathbb R}^2$ and the matrix

\begin{equation*} \begin{pmatrix} a & b \\ c & d \end{pmatrix} \end{equation*}

as interchangeable.

We will say that $T: {\mathbb R}^2 \to {\mathbb R}^2$ is an invertible linear map if we can find a second linear map $S$ such that $T \circ S = S \circ T = I\text{,}$ where $I$ is the identity transformation. In terms of matrices, this means that we can find a matrix $S$ such that

\begin{equation*} TS = ST = I, \end{equation*}

where

\begin{equation*} I = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} \end{equation*}

is the $2 \times 2$ identity matrix. We write $T^{-1}$ for the inverse matrix of $T\text{.}$ It is easy to check that the inverse of

\begin{equation*} T = \begin{pmatrix} a & b \\ c & d \end{pmatrix} \end{equation*}

is

\begin{equation*} T^{-1} = \frac{1}{\det T} \begin{pmatrix} d & - b \\ -c & a \end{pmatrix}. \end{equation*}

If $\det T = 0\text{,}$ then there are infinitely many nonzero vectors ${\mathbf x}$ such that $T {\mathbf x} = {\mathbf 0}\text{,}$ so $T$ cannot be invertible. Indeed, suppose that $T^{-1}$ exists and that ${\mathbf x} \neq {\mathbf 0}$ is a vector such that $T {\mathbf x} = {\mathbf 0}\text{.}$ Then

\begin{equation*} {\mathbf x} = T^{-1} T {\mathbf x} = T^{-1} {\mathbf 0} = {\mathbf 0}, \end{equation*}

which is a contradiction. On the other hand, we can certainly compute $T^{-1}\text{,}$ at least in the $2 \times 2$ case, if the determinant is nonzero.
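The inverse formula above is easy to check numerically. The following sketch uses NumPy; the helper name `inv2` is ours for illustration, not a library function.

```python
import numpy as np

# Illustrative helper implementing the 2x2 inverse formula:
# T^{-1} = (1/det T) * [[d, -b], [-c, a]].
def inv2(T):
    (a, b), (c, d) = T
    det = a * d - b * c
    if det == 0:
        raise ValueError("T is not invertible")
    return np.array([[d, -b], [-c, a]]) / det

T = np.array([[1.0, 2.0], [4.0, 3.0]])
print(inv2(T) @ T)   # the 2x2 identity matrix, up to rounding
```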

### Subsection 3.6.2 Changing Coordinates

Suppose that we consider a linear system

$${\mathbf y}' = (T^{-1} A T) {\mathbf y}\label{equation-linear06-change-of-coordinates}\tag{3.6.1}$$

where $T$ is an invertible matrix. If ${\mathbf y}(t)$ is a solution of (3.6.1), we claim that ${\mathbf x}(t) = T {\mathbf y}(t)$ solves the equation ${\mathbf x}' = A {\mathbf x}\text{.}$ Indeed,

\begin{align*} {\mathbf x}'(t) & = (T {\mathbf y})'(t)\\ & = T {\mathbf y}'(t)\\ & = T( (T^{-1} A T) {\mathbf y}(t))\\ & = A (T {\mathbf y}(t))\\ & = A {\mathbf x}(t). \end{align*}

We can think of this in two ways.

1. A linear map $T$ converts solutions of ${\mathbf y}' = (T^{-1} A T) {\mathbf y}$ to solutions of ${\mathbf x}' = A {\mathbf x}\text{.}$
2. The inverse of a linear map $T$ takes solutions of ${\mathbf x}' = A {\mathbf x}$ to solutions of ${\mathbf y}' = (T^{-1} A T) {\mathbf y}\text{.}$
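Both statements can be checked numerically. The sketch below (NumPy; $A$ and $T$ are arbitrary sample matrices, with $T$ invertible) integrates ${\mathbf y}' = (T^{-1}AT){\mathbf y}$ with Euler steps and verifies that ${\mathbf x} = T{\mathbf y}$ satisfies the corresponding difference version of ${\mathbf x}' = A{\mathbf x}\text{.}$

```python
import numpy as np

# Check: if y' = (T^{-1} A T) y, then x = T y satisfies x' = A x.
# A and T are sample matrices; T just needs det T != 0.
A = np.array([[1.0, 2.0], [4.0, 3.0]])
T = np.array([[1.0, 2.0], [0.0, 1.0]])
B = np.linalg.inv(T) @ A @ T

dt, y = 1e-4, np.array([1.0, 1.0])
ys = [y.copy()]
for _ in range(1000):               # Euler steps for y' = B y
    y = y + dt * (B @ y)
    ys.append(y.copy())
ys = np.array(ys)

xs = ys @ T.T                       # x_n = T y_n at every step
resid = (xs[1:] - xs[:-1]) / dt - xs[:-1] @ A.T
print(np.max(np.abs(resid)))        # essentially zero: x' = A x holds
```

Since $T B = A T\text{,}$ each Euler step for ${\mathbf y}$ maps exactly to an Euler step for ${\mathbf x}\text{,}$ so the residual is zero up to floating-point rounding.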

We are now in a position to solve our problem of finding solutions of an arbitrary linear system

\begin{equation*} \begin{pmatrix} x' \\ y' \end{pmatrix} = \begin{pmatrix} a & b \\ c & d \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix}. \end{equation*}

### Subsection 3.6.3 Distinct Real Eigenvalues

Consider the system ${\mathbf x}' = A {\mathbf x}\text{,}$ where $A$ has two real, distinct eigenvalues $\lambda_1$ and $\lambda_2$ with eigenvectors ${\mathbf v}_1$ and ${\mathbf v}_2\text{,}$ respectively. Let $T$ be the matrix with columns ${\mathbf v}_1$ and ${\mathbf v}_2\text{.}$ If ${\mathbf e}_1 = (1, 0)$ and ${\mathbf e}_2 = (0, 1)\text{,}$ then $T {\mathbf e}_i = {\mathbf v}_i$ for $i = 1, 2\text{.}$ Consequently, $T^{-1} {\mathbf v}_i = {\mathbf e}_i$ for $i = 1, 2\text{.}$ Thus, we have

\begin{align*} (T^{-1} A T) {\mathbf e}_i & = T^{-1} A {\mathbf v}_i\\ & = T^{-1} (\lambda_i {\mathbf v}_i)\\ & = \lambda_i T^{-1} {\mathbf v}_i\\ & = \lambda_i {\mathbf e}_i \end{align*}

for $i = 1, 2\text{.}$ Therefore, the matrix $T^{-1} A T$ is in canonical form,

\begin{equation*} T^{-1} A T = \begin{pmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{pmatrix}. \end{equation*}

The eigenvalues of the matrix $T^{-1} A T$ are $\lambda_1$ and $\lambda_2$ with eigenvectors $(1, 0)$ and $(0, 1)\text{,}$ respectively. Thus, the general solution of

\begin{equation*} {\mathbf y}' = (T^{-1}AT) {\mathbf y} \end{equation*}

is

\begin{equation*} {\mathbf y}(t) = \alpha e^{\lambda_1 t} \begin{pmatrix} 1 \\ 0 \end{pmatrix} + \beta e^{\lambda_2 t} \begin{pmatrix} 0\\ 1 \end{pmatrix}. \end{equation*}

Hence, the general solution of

\begin{equation*} {\mathbf x}' = A {\mathbf x} \end{equation*}

is

\begin{align*} T {\mathbf y}(t) & = T \left( \alpha e^{\lambda_1 t} \begin{pmatrix} 1 \\ 0 \end{pmatrix} + \beta e^{\lambda_2 t} \begin{pmatrix} 0 \\ 1 \end{pmatrix} \right)\\ & = \alpha e^{\lambda_1 t} T \begin{pmatrix} 1 \\ 0\end{pmatrix} + \beta e^{\lambda_2 t} T \begin{pmatrix} 0 \\ 1 \end{pmatrix}\\ & = \alpha e^{\lambda_1 t} \mathbf v_1 + \beta e^{\lambda_2 t} \mathbf v_2. \end{align*}
###### Example 3.6.2

Suppose $d{\mathbf x}/dt = A {\mathbf x}\text{,}$ where

\begin{equation*} A = \begin{pmatrix} 1 & 2 \\ 4 & 3 \end{pmatrix}. \end{equation*}

The eigenvalues of $A$ are $\lambda_1 = 5$ and $\lambda_2 = -1$ and the associated eigenvectors are $(1, 2)$ and $(1, -1)\text{,}$ respectively. In this case, our matrix $T$ is

\begin{equation*} \begin{pmatrix} 1 & 1 \\ 2 & -1 \end{pmatrix}. \end{equation*}

If ${\mathbf e}_1 = (1, 0)$ and ${\mathbf e}_2 = (0, 1)\text{,}$ then $T {\mathbf e}_i = {\mathbf v}_i$ for $i = 1, 2\text{.}$ Consequently, $T^{-1} {\mathbf v}_i = {\mathbf e}_i$ for $i = 1, 2\text{,}$ where

\begin{equation*} T^{-1} = \begin{pmatrix} 1/3 & 1/3 \\ 2/3 & -1/3 \end{pmatrix}. \end{equation*}

Thus,

\begin{equation*} T^{-1} A T = \begin{pmatrix} 1/3 & 1/3 \\ 2/3 & -1/3 \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 4 & 3 \end{pmatrix} \begin{pmatrix} 1 & 1 \\ 2 & -1 \end{pmatrix} = \begin{pmatrix} 5 & 0 \\ 0 & -1 \end{pmatrix}. \end{equation*}

The eigenvalues of the matrix

\begin{equation*} \begin{pmatrix} 5 & 0 \\ 0 & -1 \end{pmatrix} \end{equation*}

are $\lambda_1 = 5$ and $\lambda_2 = -1$ with eigenvectors $(1, 0)$ and $(0, 1)\text{,}$ respectively. Thus, the general solution of

\begin{equation*} {\mathbf y}' = (T^{-1}AT) {\mathbf y} \end{equation*}

is

\begin{equation*} {\mathbf y}(t) = \alpha e^{5t} \begin{pmatrix} 1 \\ 0 \end{pmatrix} + \beta e^{-t} \begin{pmatrix} 0\\ 1 \end{pmatrix}. \end{equation*}

Hence, the general solution of

\begin{equation*} {\mathbf x}' = A {\mathbf x} \end{equation*}

is

\begin{align*} T {\mathbf y}(t) & = \begin{pmatrix} 1 & 1 \\ 2 & -1 \end{pmatrix} \left( \alpha e^{5t} \begin{pmatrix} 1 \\ 0 \end{pmatrix} + \beta e^{-t} \begin{pmatrix} 0 \\ 1 \end{pmatrix} \right)\\ & = \alpha e^{5t} \begin{pmatrix} 1 \\ 2 \end{pmatrix} + \beta e^{-t} \begin{pmatrix} 1 \\ -1 \end{pmatrix}. \end{align*}
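The computations in this example are easy to double-check with a few lines of NumPy:

```python
import numpy as np

# Verify the diagonalization of Example 3.6.2.
A = np.array([[1.0, 2.0], [4.0, 3.0]])
T = np.array([[1.0, 1.0], [2.0, -1.0]])   # columns are the eigenvectors

print(np.linalg.inv(T) @ A @ T)           # diag(5, -1), up to rounding
print(np.linalg.eigvals(A))               # eigenvalues 5 and -1
```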

The linear map $T$ converts the phase portrait of the system ${\mathbf y}' = (T^{-1}AT) {\mathbf y}$ (Figure 3.6.3) to the phase portrait of the system ${\mathbf x}' = A {\mathbf x}$ (Figure 3.6.4).

### Subsection 3.6.4 Complex Eigenvalues

Suppose the matrix

\begin{equation*} A = \begin{pmatrix} a & b \\ c & d \end{pmatrix} \end{equation*}

in system ${\mathbf x}' = A {\mathbf x}$ has complex eigenvalues. In this case, the characteristic polynomial $p(\lambda) = \lambda^2 - (a + d)\lambda + (ad - bc)$ will have roots $\lambda = \alpha + i \beta$ and $\overline{\lambda} = \alpha - i \beta\text{,}$ where

\begin{align*} \alpha & = \frac{a + d}{2}\\ \beta & = \frac{\sqrt{-(a - d)^2 - 4bc}}{2}. \end{align*}
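As a quick numerical cross-check (NumPy, using the entries of the matrix from Example 3.6.7 below): $\beta^2$ is minus the discriminant $(a-d)^2 + 4bc$ of the characteristic polynomial, divided by $4\text{,}$ and the formulas agree with the computed eigenvalues.

```python
import numpy as np

# alpha, beta for a sample matrix with complex eigenvalues
# (the matrix of Example 3.6.7).
a, b, c, d = 0.0, 1.0, -0.5, -1.0
alpha = (a + d) / 2
beta = np.sqrt(-(a - d) ** 2 - 4 * b * c) / 2    # discriminant is negative
print(alpha, beta)                                # -0.5 0.5
print(np.linalg.eigvals(np.array([[a, b], [c, d]])))  # -0.5 +/- 0.5i
```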

The eigenvalues $\lambda$ and $\overline{\lambda}$ are complex conjugates. Now, suppose that the eigenvalue $\lambda = \alpha + i \beta$ has an eigenvector of the form

\begin{equation*} \mathbf v = {\mathbf v}_ 1 + i {\mathbf v}_2, \end{equation*}

where $\mathbf v_1$ and $\mathbf v_2$ are real vectors. Then $\overline{\mathbf v} = {\mathbf v}_ 1 - i {\mathbf v}_2$ is an eigenvector for $\overline{\lambda}\text{,}$ since

\begin{equation*} A \overline{\mathbf v} = \overline{A \mathbf v} = \overline{\lambda \mathbf v} = \overline{\lambda} \overline{\mathbf v}. \end{equation*}

Consequently, if $A$ is a real matrix with complex eigenvalues, one of the eigenvalues determines the other.

If ${\mathbf v}_1$ and ${\mathbf v}_2$ are not linearly independent, then ${\mathbf v}_1 = c {\mathbf v}_2$ for some $c \in \mathbb R\text{.}$ On one hand, we have

\begin{equation*} A ({\mathbf v}_ 1 + i {\mathbf v}_2) = A (c {\mathbf v}_2 + i {\mathbf v}_2) = (c + i) A {\mathbf v}_2. \end{equation*}

However,

\begin{align*} A ({\mathbf v}_ 1 + i {\mathbf v}_2) & = (\alpha + i \beta) ( {\mathbf v}_ 1 + i {\mathbf v}_2)\\ & = (\alpha + i \beta) ( c + i) {\mathbf v}_2\\ & = ( c + i) (\alpha + i \beta) {\mathbf v}_2 \end{align*}

In other words, $A {\mathbf v}_2 = (\alpha + i \beta) {\mathbf v}_2\text{.}$ However, this is a contradiction: the left-hand side of the equation is a real vector, while the right-hand side is complex, since $\beta \neq 0\text{.}$ Thus, ${\mathbf v}_1$ and ${\mathbf v}_2$ are linearly independent.

Since ${\mathbf v}_1 + i {\mathbf v}_2$ is an eigenvector associated to the eigenvalue $\alpha + i \beta\text{,}$ we have

\begin{equation*} A ( {\mathbf v}_1 + i {\mathbf v}_2) = (\alpha + i \beta) ({\mathbf v}_1 + i {\mathbf v}_2). \end{equation*}

Equating the real and imaginary parts, we find that

\begin{align*} A {\mathbf v}_1 & = \alpha {\mathbf v}_1 - \beta {\mathbf v}_2\\ A {\mathbf v}_2 & = \beta {\mathbf v}_1 + \alpha {\mathbf v}_2. \end{align*}

If $T$ is the matrix with columns ${\mathbf v}_1$ and ${\mathbf v}_2\text{,}$ then

\begin{align*} T {\mathbf e}_1 & = {\mathbf v}_1\\ T {\mathbf e}_2 & = {\mathbf v}_2. \end{align*}

Thus, we have

\begin{equation*} (T^{-1} A T) {\mathbf e}_1 = T^{-1} (\alpha {\mathbf v}_1 - \beta {\mathbf v}_2) = \alpha {\mathbf e}_1 - \beta {\mathbf e}_2. \end{equation*}

Similarly,

\begin{equation*} (T^{-1} A T) {\mathbf e}_2 = \beta {\mathbf e}_1 + \alpha {\mathbf e}_2. \end{equation*}

Therefore, we can write the matrix $T^{-1}A T$ as

\begin{equation*} T^{-1} AT = \begin{pmatrix} \alpha & \beta \\ - \beta & \alpha \end{pmatrix}. \end{equation*}

The system ${\mathbf y}' = (T^{-1} AT ) {\mathbf y}$ is in one of the canonical forms and has a phase portrait that is a spiral sink ($\alpha \lt 0$), a center ($\alpha = 0$), or a spiral source ($\alpha \gt 0$). After a change of coordinates, the phase portrait of ${\mathbf x}' = A {\mathbf x}$ is equivalent to a sink, center, or source.
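The construction above can be verified numerically for any matrix with complex eigenvalues by taking $T$ to have columns $\operatorname{Re} \mathbf v$ and $\operatorname{Im} \mathbf v\text{.}$ A NumPy sketch (NumPy normalizes eigenvectors differently than the text does, but the conjugated matrix still has the rotation-scaling form):

```python
import numpy as np

# Build T from the real and imaginary parts of a complex eigenvector
# and check that T^{-1} A T = [[alpha, beta], [-beta, alpha]].
A = np.array([[0.0, 1.0], [-0.5, -1.0]])   # matrix of Example 3.6.7
lams, vecs = np.linalg.eig(A)
lam, v = lams[0], vecs[:, 0]               # one eigenvalue/eigenvector pair
T = np.column_stack((v.real, v.imag))
M = np.linalg.inv(T) @ A @ T
print(M)   # [[alpha, beta], [-beta, alpha]] with lam = alpha + i*beta
```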

###### Example 3.6.7

Suppose that we wish to find the solutions of the second order equation

\begin{equation*} 2x'' + 2x' + x = 0. \end{equation*}

This particular equation might model a damped harmonic oscillator. If we rewrite this second-order equation as a first-order system, we have

\begin{align*} x' & = y\\ y' & = - \frac{1}{2} x - y, \end{align*}

or equivalently $\mathbf x' = A \mathbf x\text{,}$ where

\begin{equation*} A = \begin{pmatrix} 0 & 1 \\ - 1/2 & - 1 \end{pmatrix}. \end{equation*}

The eigenvalues of $A$ are

\begin{equation*} - \frac{1}{2} \pm i \frac{1}{2}. \end{equation*}

The eigenvalue $\lambda = (-1 + i)/2$ has an eigenvector

\begin{equation*} \mathbf v = \begin{pmatrix} 2 \\ -1 + i \end{pmatrix} = \begin{pmatrix} 2 \\ -1 \end{pmatrix} + i \begin{pmatrix} 0 \\ 1 \end{pmatrix}. \end{equation*}

Therefore, we can take $T$ to be the matrix

\begin{equation*} T = \begin{pmatrix} 2 & 0 \\ -1 & 1 \end{pmatrix}. \end{equation*}

Consequently,

\begin{equation*} T^{-1} A T = \begin{pmatrix} 1/2 & 0 \\ 1/2 & 1 \end{pmatrix} \begin{pmatrix} 0 & 1 \\ -1/2 & -1 \end{pmatrix} \begin{pmatrix} 2 & 0 \\ -1 & 1 \end{pmatrix} = \begin{pmatrix} -1/2 & 1/2 \\ -1/2 & -1/2 \end{pmatrix}, \end{equation*}

which is in the canonical form

\begin{equation*} \begin{pmatrix} \alpha & \beta \\ - \beta & \alpha \end{pmatrix}. \end{equation*}

The general solution to ${\mathbf y}' = (T^{-1} A T) {\mathbf y}$ is

\begin{equation*} {\mathbf y}(t) = c_1 e^{-t/2} \begin{pmatrix} \cos(t/2) \\ -\sin(t/2) \end{pmatrix} + c_2 e^{-t/2} \begin{pmatrix} \sin(t/2) \\ \cos(t/2) \end{pmatrix}. \end{equation*}

The phase portrait of ${\mathbf y}' = (T^{-1} A T) {\mathbf y}$ is given in Figure 3.6.8.

The general solution of ${\mathbf x}' = A {\mathbf x}$ is

\begin{align*} T {\mathbf y}(t) & = \begin{pmatrix} 2 & 0 \\ -1 & 1 \end{pmatrix} \left[ c_1 e^{-t/2} \begin{pmatrix} \cos(t/2) \\ -\sin(t/2) \end{pmatrix} + c_2 e^{-t/2} \begin{pmatrix} \sin(t/2) \\ \cos(t/2) \end{pmatrix} \right]\\ & = c_1 e^{-t/2} \begin{pmatrix} 2 & 0 \\ -1 & 1 \end{pmatrix} \begin{pmatrix} \cos(t/2) \\ -\sin(t/2) \end{pmatrix} + c_2 e^{-t/2} \begin{pmatrix} 2 & 0 \\ -1 & 1 \end{pmatrix} \begin{pmatrix} \sin(t/2) \\ \cos(t/2) \end{pmatrix}\\ & = c_1 e^{-t/2} \begin{pmatrix} 2 \cos(t/2) \\ - \cos(t/2) - \sin(t/2) \end{pmatrix} + c_2 e^{-t/2} \begin{pmatrix} 2 \sin(t/2) \\ - \sin(t/2) + \cos(t/2) \end{pmatrix}. \end{align*}
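A finite-difference spot-check (NumPy, with $c_1 = c_2 = 1$, an arbitrary choice for illustration) confirms that this $\mathbf x(t)$ satisfies $\mathbf x' = A \mathbf x\text{:}$

```python
import numpy as np

# General solution of Example 3.6.7 with c1 = c2 = 1.
A = np.array([[0.0, 1.0], [-0.5, -1.0]])

def x(t):
    e = np.exp(-t / 2)
    c, s = np.cos(t / 2), np.sin(t / 2)
    return e * np.array([2 * c + 2 * s, (-c - s) + (-s + c)])

t, h = 0.7, 1e-6
dxdt = (x(t + h) - x(t - h)) / (2 * h)   # central difference
print(dxdt - A @ x(t))                    # approximately zero
```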

The phase portrait for this system is given in Figure 3.6.9.

###### Remark 3.6.10

Of course, we have a much more efficient way of solving the system ${\mathbf x}' = A {\mathbf x}\text{,}$ where

\begin{equation*} A = \begin{pmatrix} 0 & 1 \\ - 1/2 & - 1 \end{pmatrix}. \end{equation*}

Since $A$ has eigenvalue $\lambda = (-1 + i)/2$ with an eigenvector $\mathbf v = (2, -1 + i)\text{,}$ we can apply Euler's formula and write the solution as

\begin{align*} \mathbf x(t) & = e^{(-1 + i)t/2} \mathbf v\\ & = e^{-t/2} e^{it/2} \begin{pmatrix} 2 \\ -1 + i \end{pmatrix}\\ & = e^{-t/2} (\cos(t/2) + i \sin(t/2)) \begin{pmatrix} 2 \\ -1 + i \end{pmatrix}\\ & = e^{-t/2} \begin{pmatrix} 2 \cos(t/2) \\ - \cos(t/2) - \sin(t/2) \end{pmatrix} + i e^{-t/2} \begin{pmatrix} 2 \sin(t/2) \\ -\sin(t/2) + \cos(t/2) \end{pmatrix}. \end{align*}

Taking the real and the imaginary parts of the last expression, the general solution of ${\mathbf x}' = A {\mathbf x}$ is

\begin{equation*} \mathbf x(t) = c_1 e^{-t/2} \begin{pmatrix} 2 \cos(t/2) \\ - \cos(t/2) - \sin(t/2) \end{pmatrix} + c_2 e^{-t/2} \begin{pmatrix} 2 \sin(t/2) \\ - \sin(t/2) + \cos(t/2) \end{pmatrix}, \end{equation*}

which agrees with the solution that we found by transforming coordinates.

### Subsection 3.6.5 Repeated Eigenvalues

Now suppose that $A$ has a single real eigenvalue $\lambda\text{.}$ Since the characteristic polynomial of $A$ is $p(\lambda) = \lambda^2 - (a + d)\lambda + (ad - bc)\text{,}$ the repeated root must be $\lambda = (a + d)/2\text{.}$

Suppose that ${\mathbf u}$ and ${\mathbf v}$ are linearly independent eigenvectors for $A\text{,}$ and let $T$ be the matrix whose first column is ${\mathbf u}$ and second column is ${\mathbf v}\text{.}$ That is, $T {\mathbf e}_1 = {\mathbf u}$ and $T{\mathbf e}_2 = {\mathbf v}\text{.}$ Since ${\mathbf u}$ and ${\mathbf v}$ are linearly independent, $\det(T) \neq 0$ and $T$ is invertible. So it must be the case that

\begin{equation*} AT = (A {\mathbf u}, A {\mathbf v}) = (\lambda {\mathbf u}, \lambda {\mathbf v}) = \lambda ({\mathbf u}, {\mathbf v}) = \lambda IT, \end{equation*}

or

\begin{equation*} A = \begin{pmatrix} \lambda & 0 \\ 0 & \lambda \end{pmatrix}. \end{equation*}

In this case, the system is uncoupled and is easily solved. That is, we can solve each equation in the system

\begin{align*} x' & = \lambda x\\ y' & = \lambda y \end{align*}

separately to obtain the general solution

\begin{align*} x & = c_1 e^{\lambda t}\\ y & = c_2 e^{\lambda t}. \end{align*}

Now suppose that $\lambda$ has only one linearly independent eigenvector ${\mathbf v}\text{.}$ If ${\mathbf w}$ is another vector in ${\mathbb R}^2$ such that ${\mathbf v}$ and ${\mathbf w}$ are linearly independent, then $A \mathbf w$ can be written as a linear combination of $\mathbf v$ and $\mathbf w\text{,}$

\begin{equation*} A {\mathbf w} = \alpha {\mathbf v} + \beta {\mathbf w}. \end{equation*}

We can assume that $\alpha \neq 0\text{;}$ otherwise, we would have a second linearly independent eigenvector. We claim that $\beta = \lambda\text{.}$ If this were not the case, then

\begin{align*} A \left( {\mathbf w} + \left( \frac{\alpha}{\beta - \lambda} \right) {\mathbf v} \right) & = A {\mathbf w} + \left( \frac{\alpha}{\beta - \lambda} \right) A {\mathbf v}\\ & = \alpha {\mathbf v} + \beta {\mathbf w} + \lambda \left( \frac{\alpha}{\beta - \lambda} \right) {\mathbf v}\\ & = \beta {\mathbf w} + \alpha \left(1 + \frac{\lambda}{\beta - \lambda} \right) {\mathbf v}\\ & = \beta {\mathbf w} + \alpha \left( \frac{\beta - \lambda + \lambda}{\beta - \lambda} \right) {\mathbf v}\\ & = \beta \left( {\mathbf w} + \left( \frac{\alpha}{\beta - \lambda} \right) {\mathbf v} \right) \end{align*}

and $\beta$ would be an eigenvalue distinct from $\lambda\text{.}$ Thus, $A {\mathbf w} = \alpha {\mathbf v} + \lambda {\mathbf w}\text{.}$ If we let ${\mathbf u} = (1/ \alpha) {\mathbf w}\text{,}$ then

\begin{equation*} A {\mathbf u} = {\mathbf v} + \frac{\lambda}{\alpha} {\mathbf w} = {\mathbf v} + \lambda {\mathbf u}. \end{equation*}

We now define $T {\mathbf e}_1 = {\mathbf v}$ and $T{\mathbf e}_2 = {\mathbf u}\text{.}$ Since

\begin{align*} AT & = (A \mathbf v, A \mathbf u) = (\lambda \mathbf v, \mathbf v + \lambda \mathbf u)\\ T\begin{pmatrix} \lambda & 1 \\ 0 & \lambda \end{pmatrix} & = (T(\lambda \mathbf e_1), T(\mathbf e_1 + \lambda \mathbf e_2)) = (\lambda \mathbf v, \mathbf v + \lambda \mathbf u), \end{align*}

we have

\begin{equation*} T^{-1} A T = \begin{pmatrix} \lambda & 1 \\ 0 & \lambda \end{pmatrix}. \end{equation*}

Therefore, ${\mathbf x}' = A {\mathbf x}$ is in canonical form after a change of coordinates.

###### Example 3.6.13

Consider the system $\mathbf x' = A \mathbf x\text{,}$ where

\begin{equation*} A = \begin{pmatrix} 5 & 1 \\ -4 & 1 \end{pmatrix}. \end{equation*}

Since the characteristic polynomial of $A$ is $\lambda^2 - 6 \lambda + 9 = (\lambda - 3)^2\text{,}$ we have only a single eigenvalue $\lambda = 3$ with eigenvector $\mathbf v = (1, -2)\text{.}$ Any other eigenvector for $\lambda$ is a multiple of $\mathbf v\text{.}$ If we choose $\mathbf w = (1, 0)\text{,}$ then $\mathbf v$ and $\mathbf w$ are linearly independent. Furthermore,

\begin{equation*} A \mathbf w = \begin{pmatrix} 5 \\ - 4 \end{pmatrix} = 2 \begin{pmatrix} 1 \\ -2 \end{pmatrix} + \lambda \begin{pmatrix} 1 \\ 0 \end{pmatrix} = 2 \begin{pmatrix} 1 \\ -2 \end{pmatrix} + 3 \begin{pmatrix} 1 \\ 0 \end{pmatrix}. \end{equation*}

So we can let $\mathbf u = (1/2) \mathbf w = (1/2, 0)\text{.}$ Therefore, the matrix that we seek is

\begin{equation*} T = \begin{pmatrix} 1 & 1/2 \\ -2 & 0 \end{pmatrix}, \end{equation*}

and

\begin{equation*} T^{-1} A T = \begin{pmatrix} 0 & -1/2 \\ 2 & 1 \end{pmatrix} \begin{pmatrix} 5 & 1 \\ -4 & 1 \end{pmatrix} \begin{pmatrix} 1 & 1/2 \\ -2 & 0\end{pmatrix} = \begin{pmatrix} 3 & 1 \\ 0 & 3 \end{pmatrix}. \end{equation*}
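The matrix arithmetic here is again easy to double-check in NumPy:

```python
import numpy as np

# Verify the conjugation in Example 3.6.13.
A = np.array([[5.0, 1.0], [-4.0, 1.0]])
T = np.array([[1.0, 0.5], [-2.0, 0.0]])   # columns v = (1, -2), u = (1/2, 0)
print(np.linalg.inv(T) @ A @ T)           # [[3, 1], [0, 3]], up to rounding
```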

From Section 3.3, we know that the general solution to the system

\begin{equation*} \begin{pmatrix} dx/dt \\ dy/dt \end{pmatrix} = \begin{pmatrix} 3 & 1 \\ 0 & 3 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} \end{equation*}

is

\begin{equation*} \mathbf y(t) = c_1 e^{3t} \begin{pmatrix} 1 \\ 0 \end{pmatrix} + c_2 e^{3t} \begin{pmatrix} t \\ 1 \end{pmatrix}. \end{equation*}

Therefore, the general solution to

\begin{equation*} \begin{pmatrix} dx/dt \\ dy/dt \end{pmatrix} = \begin{pmatrix} 5 & 1 \\ -4 & 1 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} \end{equation*}

is

\begin{align*} \mathbf x(t) & = T \mathbf y(t)\\ & = c_1 e^{3t} T \begin{pmatrix} 1 \\ 0 \end{pmatrix} + c_2 e^{3t} T \begin{pmatrix} t \\ 1 \end{pmatrix}\\ & = c_1 e^{3t} \begin{pmatrix} 1 \\ -2 \end{pmatrix} + c_2 e^{3t} \begin{pmatrix} 1/2 + t \\ -2t \end{pmatrix}. \end{align*}

This solution agrees with the solution that we found in Example 3.5.5.
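As with the earlier examples, a finite-difference check (NumPy, taking $c_1 = c_2 = 1$ as an arbitrary choice) shows that this solution satisfies the original system:

```python
import numpy as np

# General solution of Example 3.6.13 with c1 = c2 = 1.
A = np.array([[5.0, 1.0], [-4.0, 1.0]])

def x(t):
    return np.exp(3 * t) * (np.array([1.0, -2.0])
                            + np.array([0.5 + t, -2 * t]))

t, h = 0.3, 1e-6
dxdt = (x(t + h) - x(t - h)) / (2 * h)   # central difference
print(dxdt - A @ x(t))                    # approximately zero
```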

In practice, we find solutions to linear systems using the methods that we outlined in Sections 3.2–3.4. What we have demonstrated in this section is that those solutions are exactly the ones that we want.

### Subsection 3.6.6 Important Lessons

• A linear map $T$ is invertible if and only if $\det T \neq 0\text{.}$
• A linear map $T$ converts solutions of ${\mathbf y}' = (T^{-1} A T) {\mathbf y}$ to solutions of ${\mathbf x}' = A {\mathbf x}\text{.}$
• The inverse of a linear map $T$ takes solutions of ${\mathbf x}' = A {\mathbf x}$ to solutions of ${\mathbf y}' = (T^{-1} A T) {\mathbf y}\text{.}$
• A change of coordinates converts the system ${\mathbf x}' = A {\mathbf x}$ to one of the following special cases,
\begin{equation*} \begin{pmatrix} \lambda & 0 \\ 0 & \mu \end{pmatrix}, \begin{pmatrix} \alpha & \beta \\ -\beta & \alpha \end{pmatrix}, \begin{pmatrix} \lambda & 0 \\ 0 & \lambda \end{pmatrix}, \begin{pmatrix} \lambda & 1 \\ 0 & \lambda \end{pmatrix}. \end{equation*}

### Exercises

###### 1

Consider the one-parameter family of linear systems given by

\begin{equation*} \begin{pmatrix} x' \\ y' \end{pmatrix} = \begin{pmatrix} a & \sqrt{2} + a/2 \\ \sqrt{2} - a/2 & 0 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix}. \end{equation*}
1. Sketch the path traced out by this family of linear systems in the trace-determinant plane as $a$ varies.
2. Discuss any bifurcations that occur along this path and compute the corresponding values of $a\text{.}$
###### 2

Consider the two-parameter family of linear systems

\begin{equation*} \begin{pmatrix} x' \\ y' \end{pmatrix} = \begin{pmatrix} a & b \\ b & a \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix}. \end{equation*}

Identify all of the regions in the $ab$-plane where this system possesses a saddle, a sink, a spiral sink, and so on.