# Subsection 1.6.1 The Existence and Uniqueness Theorem

The following theorem tells us that solutions to first-order differential equations exist and are unique under certain reasonable conditions.

##### Theorem 1.40 Existence and Uniqueness Theorem

Let \(x' = f(t, x)\) have the initial condition \(x(t_0) = x_0\text{.}\) If \(f\) and \(\partial f/ \partial x\) are continuous functions on the rectangle
\begin{equation*}
R = \left\{ (t, x) : 0 \leq |t - t_0| \leq a, 0 \leq |x - x_0| \leq b \right\},
\end{equation*}
then there exists a unique solution \(u = u(t)\) for \(x' = f(t, x)\) and \(x(t_0) = x_0\) on some interval \(|t - t_0| \lt h\) contained in the interval \(|t - t_0| \lt a\text{.}\)

Let us examine some consequences of the existence and uniqueness of solutions.

##### Example 1.41

Consider the initial value problem \(y' = y^{1/3}\) with \(y(0) = 0\) and \(t \geq 0\text{.}\) Separating the variables, we obtain
\begin{equation*}
y^{-1/3} \, dy = dt.
\end{equation*}
Thus,
\begin{equation*}
\frac{3}{2} y^{2/3} = t + C
\end{equation*}
or
\begin{equation*}
y = \left( \frac{2}{3} ( t + C) \right)^{3/2}.
\end{equation*}
If \(C = 0\text{,}\) the initial condition is satisfied and
\begin{equation*}
y = \left( \frac{2}{3} t \right)^{3/2}
\end{equation*}
is a solution for \(t \geq 0\text{.}\) However, we can find two additional solutions for \(t \geq 0\text{:}\)
\begin{align*}
y & = - \left( \frac{2}{3} t \right)^{3/2},\\
y & \equiv 0.
\end{align*}
This is especially troubling if we are looking for equilibrium solutions. Although \(y' = y^{1/3}\) is an autonomous differential equation with equilibrium solution \(y \equiv 0\text{,}\) other solutions can start at this equilibrium and leave it. The problem is that
\begin{equation*}
\frac{\partial}{\partial y} y^{1/3} = \frac{1}{3} y^{-2/3}
\end{equation*}
is not defined at \(y = 0\text{.}\)
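The failure of uniqueness here is easy to check numerically. The following sketch (an illustrative Python check, not part of the text; the sample points and tolerance are arbitrary choices) verifies that both \(y = (2t/3)^{3/2}\) and \(y \equiv 0\) satisfy \(y' = y^{1/3}\) with \(y(0) = 0\text{:}\)

```python
import math

# The ODE from Example 1.41: y' = f(y) with f(y) = y^(1/3).
def f(y):
    # Real cube root, valid for negative y as well.
    return math.copysign(abs(y) ** (1.0 / 3.0), y)

# Candidate solutions for t >= 0 and their derivatives.
def y_pos(t):          # y = (2t/3)^(3/2)
    return (2.0 * t / 3.0) ** 1.5

def y_pos_prime(t):    # d/dt (2t/3)^(3/2) = (2t/3)^(1/2)
    return (2.0 * t / 3.0) ** 0.5

def y_zero(t):         # y ≡ 0
    return 0.0

# Both candidates satisfy y' = y^(1/3) and the same initial condition.
assert y_pos(0.0) == 0.0 and y_zero(0.0) == 0.0
for t in [0.5, 1.0, 2.0, 5.0]:
    assert abs(y_pos_prime(t) - f(y_pos(t))) < 1e-12
    assert abs(0.0 - f(y_zero(t))) < 1e-12
print("two distinct solutions satisfy y' = y^(1/3), y(0) = 0")
```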

##### Example 1.42

Suppose that \(y' = y^2\) with \(y(0) = 1\text{.}\) Since \(f(t,y) = y^2\) and \(\partial f/ \partial y = 2y\) are continuous everywhere, a unique solution exists near \(t = 0\text{.}\) Separating the variables,
\begin{equation*}
\frac{1}{y^2} \; dy = dt,
\end{equation*}
we see that
\begin{equation*}
y = - \frac{1}{t + C}.
\end{equation*}
Applying the initial condition \(y(0) = 1\text{,}\) we find \(C = -1\) and
\begin{equation*}
y = \frac{1}{1-t}.
\end{equation*}
Therefore, a solution exists on \((-\infty, 1)\text{.}\) In the case that \(y(0) = -1\text{,}\) the solution is
\begin{equation*}
y = - \frac{1}{t + 1},
\end{equation*}
and a solution exists on \((-1, \infty)\text{.}\) Solutions are only guaranteed to exist on an open interval containing the initial value and are very dependent on the initial condition.
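As a quick sanity check (an illustrative Python sketch; the sample points are arbitrary choices), we can confirm that \(y = 1/(1-t)\) satisfies the initial value problem and grows without bound as \(t \to 1^{-}\text{:}\)

```python
# Check that y(t) = 1/(1 - t) solves y' = y^2 with y(0) = 1,
# and watch the solution blow up as t approaches 1 from the left.
def y(t):
    return 1.0 / (1.0 - t)

def y_prime(t):        # d/dt 1/(1 - t) = 1/(1 - t)^2
    return 1.0 / (1.0 - t) ** 2

assert y(0.0) == 1.0
for t in [-2.0, 0.0, 0.5, 0.9, 0.99]:
    assert abs(y_prime(t) - y(t) ** 2) < 1e-6   # y' = y^2 holds
# Near the edge of the interval of existence the solution explodes:
print(y(0.999))   # roughly 1000
```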

It was Émile Picard (1856–1941) who developed the method of successive approximations to show the existence of solutions of ordinary differential equations. He proved that it is possible to construct a sequence of functions that converges to a solution of the differential equation. One of the first steps towards understanding *Picard iteration* is to realize that an initial value problem can be recast in terms of an integral equation.

##### Theorem 1.44

The function \(u = u(t)\) is a solution to the initial value problem
\begin{align*}
x' & = f(t, x)\\
x(t_0) & = x_0,
\end{align*}
if and only if \(u\) is a solution to the integral equation
\begin{equation*}
x(t) = x_0 + \int_{t_0}^t f(s, x(s)) \, ds.
\end{equation*}

Suppose that \(u = u(t)\) is a solution to
\begin{align*}
x' & = f(t, x)\\
x(t_0) & = x_0,
\end{align*}
on some interval \(I\) containing \(t_0\text{.}\) Since \(u\) is continuous on \(I\) and \(f\) is continuous on \(R\text{,}\) the function \(F(t) = f(t, u(t))\) is also continuous on \(I\text{.}\) Integrating both sides of \(u'(t) = f(t, u(t))\) and applying the Fundamental Theorem of Calculus, we obtain
\begin{equation*}
u(t) - u(t_0) = \int_{t_0}^t u'(s) \, ds = \int_{t_0}^t f(s, u(s)) \, ds.
\end{equation*}
Since \(u(t_0) = x_0\text{,}\) the function \(u\) is a solution of the integral equation.

Conversely, assume that
\begin{equation*}
u(t) = x_0 + \int_{t_0}^t f(s, u(s)) \, ds.
\end{equation*}
If we differentiate both sides of this equation, we obtain \(u'(t) = f(t, u(t))\text{.}\) Since
\begin{equation*}
u(t_0) = x_0 + \int_{t_0}^{t_0} f(s, u(s)) \, ds = x_0,
\end{equation*}
the initial condition is fulfilled.
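To see the equivalence concretely, consider the familiar initial value problem \(x' = x\text{,}\) \(x(0) = 1\) with solution \(e^t\text{.}\) The following sketch (the trapezoid rule and grid size are our own illustrative choices) checks numerically that \(e^t = 1 + \int_0^t e^s \, ds\text{:}\)

```python
import math

# Numerical sanity check of Theorem 1.44 for x' = x, x(0) = 1, whose
# solution is u(t) = e^t.  The theorem says u must also satisfy
# u(t) = 1 + integral of u(s) ds from 0 to t.
def trapezoid(g, a, b, n=20000):
    """Composite trapezoid rule for the integral of g over [a, b]."""
    h = (b - a) / n
    total = 0.5 * (g(a) + g(b)) + sum(g(a + i * h) for i in range(1, n))
    return total * h

for t in [0.5, 1.0, 2.0]:
    lhs = math.exp(t)
    rhs = 1.0 + trapezoid(math.exp, 0.0, t)
    assert abs(lhs - rhs) < 1e-6
print("u(t) = e^t satisfies the integral equation")
```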

To show the existence of a solution to the initial value problem
\begin{align*}
x' & = f(t, x)\\
x(t_0) & = x_0,
\end{align*}
we will construct a sequence of functions, \(\{ u_n(t) \}\text{,}\) that will converge to a function \(u(t)\) that is a solution to the integral equation
\begin{equation*}
x(t) = x_0 + \int_{t_0}^t f(s, x(s)) \, ds.
\end{equation*}
We define the first function of the sequence using the initial condition,
\begin{equation*}
u_0(t) = x_0.
\end{equation*}
We derive the next function in our sequence using the right-hand side of the integral equation,
\begin{equation*}
u_1(t) = x_0 + \int_{t_0}^t f(s, u_0(s)) \, ds.
\end{equation*}
Subsequent terms in the sequence can be defined recursively,
\begin{equation*}
u_{n+1}(t) = x_0 + \int_{t_0}^t f(s, u_n(s)) \, ds.
\end{equation*}
Our goal is to show that \(u_n(t) \rightarrow u(t)\) as \(n \rightarrow \infty\text{.}\) Furthermore, we need to show that \(u\) is the continuous, unique solution to our initial value problem. We will leave the proof of Picard's Theorem to a series of exercises, but let us see how this works by developing an example.
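The recursion above can be carried out numerically by storing each iterate on a grid and approximating the integral with the cumulative trapezoid rule. The sketch below is one such illustration (the test equation \(x' = t + x\text{,}\) the grid size, and the iteration count are our own choices, not part of the text):

```python
import math

# Picard iteration done numerically: each iterate u_n is stored on a
# grid over [t0, T], and the integral is a cumulative trapezoid sum.
def picard(f, t0, x0, T, n_iter=8, n_grid=2000):
    h = (T - t0) / n_grid
    ts = [t0 + i * h for i in range(n_grid + 1)]
    u = [x0] * (n_grid + 1)                    # u_0(t) = x0
    for _ in range(n_iter):
        fs = [f(t, x) for t, x in zip(ts, u)]
        new = [x0]                             # u_{n+1}(t0) = x0
        for i in range(1, n_grid + 1):         # x0 + cumulative integral
            new.append(new[-1] + 0.5 * h * (fs[i - 1] + fs[i]))
        u = new
    return ts, u

# Demo on x' = t + x, x(0) = 1, whose exact solution is 2e^t - t - 1.
ts, u = picard(lambda t, x: t + x, 0.0, 1.0, 1.0)
exact = [2.0 * math.exp(t) - t - 1.0 for t in ts]
assert max(abs(a - b) for a, b in zip(u, exact)) < 1e-3
print("after 8 iterations the iterate matches the exact solution")
```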

##### Example 1.45

Consider the exponential growth equation,
\begin{align*}
\frac{dx}{dt} & = kx\\
x(0) & = 1.
\end{align*}
We already know that the solution is \(x(t) = e^{kt}\text{.}\) We define the first few terms of our sequence \(\{ u_n(t) \}\) as follows:
\begin{align*}
u_0(t) & = 1,\\
u_1(t) & = 1 + \int_0^t ku_0(s) \, ds\\
& = 1 + \int_0^t k \, ds\\
& = 1 + kt,\\
u_2(t) & = 1 + \int_0^t ku_1(s) \, ds\\
& = 1 + \int_0^t k(1 + ks) \, ds\\
& = 1 + kt + \frac{(kt)^2}{2}.
\end{align*}
The next term in the sequence is
\begin{equation*}
u_3(t) = 1 + kt + \frac{(kt)^2}{2} + \frac{(kt)^3}{2\cdot 3},
\end{equation*}
and the \(n\)th term is
\begin{align*}
u_n(t) & = 1 + \int_0^t ku_{n-1}(s) \, ds\\
& = 1 + \int_0^t k\left(1 + ks + \frac{(ks)^2}{2!} + \frac{(ks)^3}{3!} + \cdots +\frac{(ks)^{n-1}}{(n-1)!}\right) \, ds\\
& = 1 + kt + \frac{(kt)^2}{2!} + \frac{(kt)^3}{3!} + \cdots + \frac{(kt)^n}{n!}.
\end{align*}
This is just the \(n\)th partial sum of the power series for \(u(t) = e^{kt}\text{,}\) which is what we expected.
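Since each iterate is a partial sum of the exponential series, we can watch the convergence numerically. A minimal sketch (the values of \(k\text{,}\) \(t\text{,}\) and \(n\) are arbitrary choices):

```python
import math

# The n-th Picard iterate for x' = kx, x(0) = 1 is the n-th partial
# sum of the series for e^{kt}; evaluate it directly.
def u_n(t, k, n):
    """Partial sum 1 + kt + (kt)^2/2! + ... + (kt)^n/n!."""
    return sum((k * t) ** j / math.factorial(j) for j in range(n + 1))

k, t = 2.0, 1.0
for n in [1, 2, 5, 10, 15]:
    print(n, u_n(t, k, n), math.exp(k * t))   # iterate vs. e^{kt}

# By n = 20 the iterate agrees with e^{kt} to high precision.
assert abs(u_n(t, k, 20) - math.exp(k * t)) < 1e-9
```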

##### 1

Which of the following initial value problems are guaranteed to have a unique solution by the Existence and Uniqueness Theorem (Theorem 1.40)? In each case, justify your conclusion.

- \(y' = 4 + y^3\text{,}\) \(y(0) = 1\)
- \(y' = \sqrt{y}\text{,}\) \(y(1) = 0\)
- \(y' = \sqrt{y}\text{,}\) \(y(1) = 1\)
- \(x' = \dfrac{t}{x-2}\text{,}\) \(x(0) = 2\)
- \(x' = \dfrac{t}{x-2}\text{,}\) \(x(2) = 0\)
- \(y' = x \tan y\text{,}\) \(y(0) = 0\)
- \(y' = \dfrac{1}{t} y + 2t\text{,}\) \(y(0) = 1\)

##### 2

Find an explicit solution to the initial value problem
\begin{align*}
y' & = \frac{1}{(t - 1)(y + 1)}\\
y(0) & = 1.
\end{align*}
Use your solution to determine the interval of existence.

##### 3

Consider the initial value problem
\begin{align*}
y' & = 3y^{2/3}\\
y(0) & = 0.
\end{align*}

- Show that the constant function, \(y(t) \equiv 0\text{,}\) is a solution to the initial value problem.
- Show that
\begin{equation*}
y(t) =
\begin{cases}
0, & t \leq t_0 \\
(t - t_0)^3, & t \gt t_0
\end{cases}
\end{equation*}
is a solution for the initial value problem, where \(t_0\) is any nonnegative real number. Hence, there exists an infinite number of solutions to the initial value problem. [*Hint*: Make sure that the derivative of \(y(t)\) exists at \(t = t_0\text{.}\)]
- Explain why this example does not contradict the Existence and Uniqueness Theorem.

##### 4

Let \(\phi_n(x) = x^n\) for \(0 \leq x \leq 1\) and show that
\begin{equation*}
\lim_{n \rightarrow \infty} \phi_n(x)
=
\begin{cases}
0, & 0 \leq x \lt 1 \\
1, & x = 1.
\end{cases}
\end{equation*}
This is an example of a sequence of continuous functions that does not converge to a continuous function, which helps explain the need for *uniform convergence* in the proof of the Existence and Uniqueness Theorem.
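A short numerical illustration of this exercise (the exponent 5000 and the sample points are arbitrary choices; this does not replace the required proof):

```python
# phi_n(x) = x^n tends to 0 for 0 <= x < 1 but stays at 1 when x = 1.
def phi(n, x):
    return x ** n

# For large n, interior points are squashed toward 0 ...
for x in [0.0, 0.5, 0.9, 0.99]:
    assert phi(5000, x) < 1e-6
# ... while the endpoint never moves.
assert phi(5000, 1.0) == 1.0
print("pointwise limit is discontinuous at x = 1")
```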

##### 5

Consider the initial value problem
\begin{align*}
y' & = 2ty + t\\
y(0) & = 1.
\end{align*}

- Use the fact that \(y' = 2ty + t\) is a first-order linear differential equation to find a solution to the initial value problem.
- Let \(\phi_0(t) = 1\) and use Picard iteration to find \(\phi_n(t)\text{.}\)
- Show that the sequence \(\{ \phi_n(t) \}\) converges to the exact solution that you found in part (a) as \(n \to \infty\text{.}\)
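Parts (b) and (c) can be sanity-checked numerically by comparing Picard iterates against an independent Runge-Kutta approximation; the sketch below does this (RK4, the grid, and the iteration count are our own illustrative choices, not part of the exercise):

```python
# Cross-check for y' = 2ty + t, y(0) = 1 on [0, 1]: Picard iterates on
# a grid versus a classical RK4 solution used as a reference.
def f(t, y):
    return 2.0 * t * y + t

def rk4(t_end, n=4000):
    """Classical fourth-order Runge-Kutta approximation of y(t_end)."""
    h, t, y = t_end / n, 0.0, 1.0
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h * k1 / 2)
        k3 = f(t + h / 2, y + h * k2 / 2)
        k4 = f(t + h, y + h * k3)
        y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        t += h
    return y

def picard_at(t_end, n_iter=12, n_grid=4000):
    """Value of the n_iter-th Picard iterate at t_end (trapezoid rule)."""
    h = t_end / n_grid
    ts = [i * h for i in range(n_grid + 1)]
    u = [1.0] * (n_grid + 1)               # phi_0(t) = 1
    for _ in range(n_iter):
        fs = [f(s, x) for s, x in zip(ts, u)]
        new = [1.0]
        for i in range(1, n_grid + 1):
            new.append(new[-1] + 0.5 * h * (fs[i - 1] + fs[i]))
        u = new
    return u[-1]

# After a dozen iterations the Picard iterate agrees with RK4 at t = 1.
assert abs(picard_at(1.0) - rk4(1.0)) < 1e-4
print("Picard iterates converge to the solution of y' = 2ty + t")
```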

##### 6

*The following series of exercises proves the Existence and Uniqueness Theorem for first-order differential equations.*

Use the Fundamental Theorem of Calculus to show that the function \(u = u(t)\) is a solution to the initial value problem
\begin{align*}
x' & = f(t, x)\\
x(t_0) & = x_0,
\end{align*}
if and only if \(u\) is a solution to the integral equation
\begin{equation*}
x(t) = x_0 + \int_{t_0}^t f(s, x(s)) \, ds.
\end{equation*}

##### 7

If \(\partial f/ \partial x\) is continuous on the rectangle
\begin{equation*}
R = \left\{ (t, x) : 0 \leq |t - t_0| \lt a, 0 \leq |x - x_0| \lt b \right\},
\end{equation*}
prove that there exists a \(K \gt 0\) such that
\begin{equation*}
|f(t, x_1) - f(t, x_2) | \leq K |x_1 - x_2|
\end{equation*}
for all \((t, x_1)\) and \((t, x_2)\) in \(R\text{.}\)

##### 8

Define the sequence \(\{ u_n \}\) by
\begin{align*}
u_0(t) & = x_0,\\
u_{n+1}(t) & = x_0 + \int_{t_0}^t f(s, u_n(s)) \, ds, \qquad n = 0, 1, 2, \ldots.
\end{align*}
Use the result of the previous exercise to show that
\begin{equation*}
|f(t, u_n(t)) - f(t, u_{n-1}(t) )| \leq K|u_n(t) - u_{n-1}(t) |.
\end{equation*}

##### 9

Show that there exists an \(M \gt 0\) such that
\begin{equation*}
|u_1(t) - x_0| \leq M | t - t_0|.
\end{equation*}

##### 10

Show that
\begin{equation*}
|u_2(t) - u_1(t)| \leq \frac{KM | t - t_0|^2}{2}.
\end{equation*}

##### 11

Use mathematical induction to show that
\begin{equation*}
|u_n(t) - u_{n -1}(t)| \leq \frac{K^{n-1}M |t - t_0|^n}{n!}.
\end{equation*}

##### 12

Since
\begin{equation*}
u_n(t) = u_1(t) + [u_2(t) - u_1(t)] + \cdots + [u_n(t) - u_{n-1}(t)],
\end{equation*}
we can view \(u_n(t)\) as a partial sum for the series
\begin{equation*}
u_0(t) + \sum_{n=1}^\infty [u_n(t) - u_{n-1}(t)].
\end{equation*}
If we can show that this series converges absolutely, then our sequence will converge to a function \(u(t)\text{.}\) Show that
\begin{equation*}
\sum_{n=1}^\infty |u_n(t) - u_{n-1}(t)| \leq \frac{M}{K} \sum_{n=1}^\infty \frac{(K |t - t_0|)^n}{n!} \leq \frac{M}{K} \left( e^{Kh} - 1 \right),
\end{equation*}
where \(h\) is the maximum distance from \((t_0, x_0)\) to the boundary of the rectangle \(R\text{.}\) Since the series converges absolutely with a bound independent of \(t\text{,}\) the sequence \(u_n(t)\) converges uniformly to a continuous function \(u(t)\) that solves our equation.

##### 13

To show uniqueness, assume that \(u(t)\) and \(v(t)\) are both solutions to
\begin{equation*}
x(t) = x_0 + \int_{t_0}^t f(s, x(s)) \, ds.
\end{equation*}
Show that
\begin{equation*}
|u(t) - v(t)|\leq K \int_{t_0}^t |u(s) - v(s)| \, ds.
\end{equation*}

##### 14

- Define
\begin{equation*}
\phi(t) = \int_{t_0}^t |u(s) - v(s)| \, ds,
\end{equation*}
Then \(\phi(t_0) = 0\) and \(\phi(t) \geq 0\) for \(t \geq t_0\text{.}\) Show that
\begin{equation*}
\phi'(t) = |u(t) - v(t)|.
\end{equation*}
- Since
\begin{equation*}
|u(t) - v(t)| - K \int_{t_0}^t |u(s) - v(s)| \, ds \leq 0,
\end{equation*}
we know that
\begin{equation*}
\phi'(t) - K \phi(t) \leq 0.
\end{equation*}
Use this fact to show that
\begin{equation*}
\frac{d}{dt} \left[ e^{-Kt} \phi(t) \right] \leq 0.
\end{equation*}
Conclude that
\begin{equation*}
\phi(t) = \int_{t_0}^t |u(s) - v(s)| \, ds = 0
\end{equation*}
for all \(t \geq t_0\text{,}\) and hence \(u(t) = v(t)\) for \(t \geq t_0\text{.}\)