MathIsimple
ODE-07
Theoretical Foundations

Existence and Uniqueness Theory

Understand the theoretical foundations of ODEs: when solutions exist, when they are unique, how far they can be extended, and how they depend on initial conditions and parameters.

Learning Objectives

  • State and apply the Picard-Lindelöf theorem
  • Understand the Lipschitz condition and its role
  • Distinguish existence from uniqueness
  • Apply Picard iteration to construct solutions
  • Use Gronwall's inequality in proofs
  • Understand continuation and maximal solutions
  • Analyze continuous dependence on data
  • Recognize when uniqueness fails

1. The Lipschitz Condition

The Lipschitz condition is the key hypothesis that ensures uniqueness of solutions. It controls how fast the right-hand side of the ODE can change with respect to the dependent variable.

Definition 7.1: Lipschitz Condition

A function f: D \to \mathbb{R}^n, where D \subset \mathbb{R} \times \mathbb{R}^n, satisfies a Lipschitz condition in x if there exists a constant L \geq 0 such that:

|f(t, x_1) - f(t, x_2)| \leq L|x_1 - x_2|

for all (t, x_1), (t, x_2) \in D. The constant L is called the Lipschitz constant.

If this holds only on compact subsets of D, we say f is locally Lipschitz.

Theorem 7.1: Sufficient Condition for Lipschitz

If f(t, x) is continuously differentiable with respect to x on a convex domain D, and \left|\frac{\partial f}{\partial x}\right| \leq L on D, then f is Lipschitz with constant L.

Proof of Theorem 7.1:

By the mean value theorem, for fixed t:

|f(t, x_1) - f(t, x_2)| = \left|\frac{\partial f}{\partial x}(t, \xi)\right| |x_1 - x_2| \leq L|x_1 - x_2|

where \xi lies on the line segment between x_1 and x_2. ∎

Example 7.1: Checking Lipschitz Condition

Problem: Determine whether f(x) = x^2 is Lipschitz on [-1, 1].

Solution:

Since f'(x) = 2x, we have |f'(x)| \leq 2 on [-1, 1].

Therefore, f is Lipschitz with L = 2 on this interval.

Alternatively: |x_1^2 - x_2^2| = |x_1 + x_2||x_1 - x_2| \leq 2|x_1 - x_2| for x_1, x_2 \in [-1, 1].

Remark:

The function f(x) = \sqrt{|x|} is not Lipschitz at x = 0: |f(x) - f(0)|/|x - 0| = 1/\sqrt{|x|} \to \infty as x \to 0. This is why the IVP x' = \sqrt{|x|}, x(0) = 0 has multiple solutions.
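The contrast between these two functions can be probed numerically by sampling difference quotients. A minimal sketch (sample points and tolerances are illustrative choices): for x^2 on [-1, 1] the ratio never exceeds L = 2, while for \sqrt{|x|} the ratio near 0 grows without bound.

```python
# Numerically probe the Lipschitz ratio |f(x1) - f(x2)| / |x1 - x2|.
import math

def lipschitz_ratio(f, x1, x2):
    return abs(f(x1) - f(x2)) / abs(x1 - x2)

square = lambda x: x * x
root = lambda x: math.sqrt(abs(x))

# x^2 on [-1, 1]: ratios stay below L = 2
pairs = [(-1.0, 0.5), (0.3, 0.9), (-0.7, 0.7)]
print(max(lipschitz_ratio(square, a, b) for a, b in pairs))

# sqrt(|x|) near 0: the ratio equals 1/sqrt(eps), which is unbounded
for eps in (1e-2, 1e-4, 1e-6):
    print(lipschitz_ratio(root, eps, 0.0))
```

No finite L can dominate the second set of ratios, which is exactly the failure of the Lipschitz condition at the origin.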

2. The Picard-Lindelöf Theorem

Theorem 7.2: Picard-Lindelöf (Existence and Uniqueness)

Consider the initial value problem:

dxdt=f(t,x),x(t0)=x0\frac{dx}{dt} = f(t, x), \quad x(t_0) = x_0

Let f: D \to \mathbb{R}^n be continuous on D = \{(t,x): |t-t_0| \leq a, |x-x_0| \leq b\} and satisfy a Lipschitz condition in x with constant L. Let M = \sup_D |f|.

Then there exists a unique solution x(t) on the interval |t - t_0| \leq h, where h = \min(a, b/M).

Proof of Theorem 7.2:

Step 1: Integral formulation. The IVP is equivalent to:

x(t) = x_0 + \int_{t_0}^{t} f(s, x(s))\,ds

Step 2: Picard iteration. Define the sequence:

\phi_0(t) = x_0, \quad \phi_{n+1}(t) = x_0 + \int_{t_0}^{t} f(s, \phi_n(s))\,ds

Step 3: Show iterates stay in the domain. By induction, |\phi_n(t) - x_0| \leq M|t - t_0| \leq Mh \leq b.

Step 4: Convergence. Let e_n(t) = |\phi_n(t) - \phi_{n-1}(t)|. The Lipschitz condition gives the recursion:

e_{n+1}(t) \leq L \int_{t_0}^{t} e_n(s)\,ds

Keeping the factor |t - t_0| inside the integral at each step, induction yields:

e_{n+1}(t) \leq \frac{(L|t - t_0|)^n}{n!} \sup_{|s - t_0| \leq h} e_1(s) \leq \frac{(Lh)^n}{n!} \sup e_1

Since \sum (Lh)^n/n! converges, \phi_n converges uniformly to some \phi, which satisfies the integral equation.

Step 5: Uniqueness. If \psi is another solution, Gronwall's inequality shows |\phi - \psi| = 0. ∎

Algorithm 7.1: Picard Iteration

Input: IVP x' = f(t, x), x(t_0) = x_0

Output: Approximate solution (or exact via limit)

1. Set \phi_0(t) = x_0

2. For n = 0, 1, 2, \ldots:

Compute \phi_{n+1}(t) = x_0 + \int_{t_0}^{t} f(s, \phi_n(s))\,ds

3. Until convergence or desired accuracy

4. Return \phi_n(t)

Example 7.2: Picard Iteration

Problem: Apply Picard iteration to x' = x, x(0) = 1.

Solution:

\phi_0(t) = 1
\phi_1(t) = 1 + \int_0^t 1\,ds = 1 + t
\phi_2(t) = 1 + \int_0^t (1+s)\,ds = 1 + t + \frac{t^2}{2}
\phi_3(t) = 1 + \int_0^t \left(1 + s + \frac{s^2}{2}\right)ds = 1 + t + \frac{t^2}{2} + \frac{t^3}{6}

In general: \phi_n(t) = \sum_{k=0}^n \frac{t^k}{k!}, which converges to e^t.
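Picard iteration can also be carried out numerically by evaluating each iterate on a grid and approximating the integral with the trapezoid rule. A minimal sketch for x' = x, x(0) = 1 (grid size and iteration count are illustrative choices, not part of the theorem):

```python
# Numeric Picard iteration for x' = f(t, x), x(t0) = x0, on a uniform grid.
import math

def picard(f, t0, x0, h, n_steps, n_iter):
    ts = [t0 + h * i for i in range(n_steps + 1)]
    phi = [x0] * len(ts)              # phi_0(t) = x0
    for _ in range(n_iter):
        # phi_{k+1}(t) = x0 + integral_{t0}^{t} f(s, phi_k(s)) ds (trapezoid rule)
        vals = [f(t, x) for t, x in zip(ts, phi)]
        new, acc = [x0], 0.0
        for i in range(1, len(ts)):
            acc += 0.5 * h * (vals[i - 1] + vals[i])
            new.append(x0 + acc)
        phi = new
    return ts, phi

ts, phi = picard(lambda t, x: x, 0.0, 1.0, 0.01, 100, 25)
print(abs(phi[-1] - math.exp(1.0)))  # small: the iterates approach e^t
```

After enough iterations the grid values settle to a fixed point of the discrete integral map, which approximates e^t up to the quadrature error.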

3. Gronwall's Inequality

Theorem 7.3: Gronwall's Inequality

Let u, \beta: [a, b] \to [0, \infty) be continuous, and let \alpha \geq 0 be a constant. If:

u(t) \leq \alpha + \int_a^t \beta(s) u(s)\,ds \quad \text{for } t \in [a, b]

then:

u(t) \leq \alpha \exp\left(\int_a^t \beta(s)\,ds\right)

Proof of Theorem 7.3:

Let v(t) = \int_a^t \beta(s) u(s)\,ds. Then v' = \beta u \leq \beta(\alpha + v).

This gives v' - \beta v \leq \alpha\beta. Multiplying by e^{-\int_a^t \beta}:

\frac{d}{dt}\left(v \cdot e^{-\int_a^t \beta}\right) \leq \alpha\beta\, e^{-\int_a^t \beta}

Integrating from a to t and using v(a) = 0:

v(t) \leq \alpha\left(e^{\int_a^t \beta} - 1\right)

Therefore u(t) \leq \alpha + v(t) \leq \alpha e^{\int_a^t \beta}. ∎

Remark:

Gronwall's inequality is the key tool for proving: (1) uniqueness of solutions, (2) continuous dependence on initial conditions, and (3) continuous dependence on parameters.
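A concrete instance can be checked numerically. As an illustrative sketch (the choices \beta(t) = 1 + t, \alpha = 2, and u(t) = \alpha e^{B(t)/2} are assumptions for the demo, where B(t) = \int_0^t \beta): this u satisfies the hypothesis, since u' = \tfrac{1}{2}\beta u \leq \beta u implies u(t) = \alpha + \int_0^t \tfrac{1}{2}\beta u \leq \alpha + \int_0^t \beta u, and it stays below the Gronwall bound \alpha e^{B(t)}.

```python
# Numeric sanity check of Gronwall's inequality on [0, 1] with beta(t) = 1 + t.
import math

alpha = 2.0
beta = lambda t: 1.0 + t
B = lambda t: t + 0.5 * t * t           # B(t) = integral of beta from 0 to t

u = lambda t: alpha * math.exp(0.5 * B(t))   # satisfies the hypothesis
bound = lambda t: alpha * math.exp(B(t))     # the Gronwall bound

ts = [i / 100 for i in range(101)]
ok_hyp, ok_bound, acc = True, True, 0.0
for i, t in enumerate(ts):
    if i > 0:   # trapezoid rule for integral of beta*u up to t
        s0 = ts[i - 1]
        acc += 0.5 * (t - s0) * (beta(s0) * u(s0) + beta(t) * u(t))
    ok_hyp = ok_hyp and u(t) <= alpha + acc + 1e-9       # hypothesis holds
    ok_bound = ok_bound and u(t) <= bound(t) + 1e-12     # conclusion holds
print(ok_hyp, ok_bound)
```

The check confirms both the integral hypothesis and the exponential conclusion on the sampled grid.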

4. Peano Existence Theorem

Theorem 7.4: Peano Existence Theorem

If f(t, x) is continuous on D = \{(t,x): |t-t_0| \leq a, |x-x_0| \leq b\}, then the IVP x' = f(t, x), x(t_0) = x_0 has at least one solution on |t - t_0| \leq h = \min(a, b/M), where M = \sup_D |f|.

Remark:

Peano's theorem guarantees existence but not uniqueness. The proof uses Euler's polygonal approximation and the Arzelà-Ascoli theorem. The Lipschitz condition is essential for uniqueness.

Example 7.3: Non-Unique Solutions

Problem: Show that x' = 3x^{2/3}, x(0) = 0 has multiple solutions.

Solution:

Note that f(x) = 3x^{2/3} is continuous but not Lipschitz at x = 0.

Solution 1: x(t) \equiv 0 (verify: x' = 0 = 3 \cdot 0^{2/3} ✓)

Solution 2: Try x(t) = t^3:

x'(t) = 3t^2, \quad 3x^{2/3} = 3(t^3)^{2/3} = 3t^2 \quad \checkmark

Both solutions satisfy the IVP, demonstrating non-uniqueness.

In fact, for any c \geq 0, the function:

x(t) = \begin{cases} 0 & t \leq c \\ (t-c)^3 & t > c \end{cases}

is also a solution, showing that infinitely many solutions exist.
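The whole one-parameter family can be spot-checked numerically. A minimal sketch (sample points and tolerance are illustrative): for several values of c, compare the exact derivative of the piecewise formula against the right-hand side 3x^{2/3}.

```python
# Spot-check Example 7.3: x(t) = 0 for t <= c, (t-c)^3 for t > c
# satisfies x'(t) = 3 * x(t)^(2/3) for every delay parameter c >= 0.
def x(t, c):
    return 0.0 if t <= c else (t - c) ** 3

def x_prime(t, c):          # exact derivative of the piecewise formula
    return 0.0 if t <= c else 3.0 * (t - c) ** 2

def rhs(xval):              # right-hand side of the ODE
    return 3.0 * xval ** (2.0 / 3.0)

ok = all(
    abs(x_prime(t, c) - rhs(x(t, c))) < 1e-9
    for c in (0.0, 0.5, 1.0)
    for t in (0.0, 0.25, 0.75, 1.5, 2.0)
)
print(ok)  # True: every member of the family solves the same IVP
```

Every member of the family passes through (0, 0), so infinitely many solution curves share the same initial point.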

5. Continuation of Solutions

Definition 7.2: Maximal Solution

A solution x(t) defined on an interval (\alpha, \omega) is called a maximal solution if it cannot be extended to any larger interval while remaining a solution of the ODE.

Theorem 7.5: Blow-Up Theorem

Let f be locally Lipschitz on an open domain D \subset \mathbb{R} \times \mathbb{R}^n. If x(t) is a maximal solution on (\alpha, \omega) with \omega < \infty, then for any compact set K \subset D, there exists t_K such that (t, x(t)) \notin K for t > t_K.

In other words: the solution "escapes to infinity" or approaches the boundary of D.

Example 7.4: Finite-Time Blow-Up

Problem: Solve x' = x^2, x(0) = 1 and find the maximal interval.

Solution:

Separating variables: \frac{dx}{x^2} = dt \implies -\frac{1}{x} = t + C

With x(0) = 1: C = -1, so x = \frac{1}{1-t}

The solution exists on (-\infty, 1) and x(t) \to \infty as t \to 1^-.

This is finite-time blow-up: the solution becomes infinite in finite time.
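The blow-up is visible even in a crude numerical integration. A minimal sketch using forward Euler (step size and checkpoints are illustrative choices): the computed solution tracks the exact 1/(1-t) and grows rapidly as t approaches 1.

```python
# Illustrate finite-time blow-up for x' = x^2, x(0) = 1 (Example 7.4).
h = 1e-4
t, x = 0.0, 1.0
checkpoints = {0.5: None, 0.9: None, 0.99: None}
while t < 0.999:
    x += h * x * x          # Euler step for x' = x^2
    t += h
    for tc in checkpoints:  # record x the first time t passes each checkpoint
        if checkpoints[tc] is None and t >= tc:
            checkpoints[tc] = x

for tc in sorted(checkpoints):
    print(tc, checkpoints[tc], 1.0 / (1.0 - tc))  # numeric vs exact 1/(1-t)
```

No fixed-step method can integrate past t = 1 here; the growth of the numerical solution mirrors the escape from every compact set described by the blow-up theorem.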

6. Continuous Dependence

Theorem 7.6: Continuous Dependence on Initial Conditions

Let f be continuous and Lipschitz in x with constant L. Let x(t) and y(t) be solutions of x' = f(t, x) with initial conditions x(t_0) = x_0 and y(t_0) = y_0 respectively. Then:

|x(t) - y(t)| \leq |x_0 - y_0|\, e^{L|t - t_0|}

Proof of Theorem 7.6:

We have:

|x(t) - y(t)| \leq |x_0 - y_0| + \int_{t_0}^{t} |f(s, x(s)) - f(s, y(s))|\,ds
\leq |x_0 - y_0| + L\int_{t_0}^{t} |x(s) - y(s)|\,ds

By Gronwall's inequality with \alpha = |x_0 - y_0| and \beta = L:

|x(t) - y(t)| \leq |x_0 - y_0|\, e^{L|t - t_0|}

Remark:

This theorem shows that solutions depend continuously on initial conditions. The bound grows exponentially in time, which is sharp for some systems (but often pessimistic in practice).
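The exponential envelope can be checked on a concrete system. A minimal sketch (the choice f(x) = sin(x), the initial conditions, the Euler step, and the horizon are all illustrative): sin is Lipschitz with L = 1 since |cos| \leq 1, so two nearby initial conditions must stay within |x_0 - y_0| e^{Lt}.

```python
# Numeric check of Theorem 7.6 for x' = sin(x), Lipschitz with L = 1.
import math

def euler(f, x0, t_end, h):
    """Forward Euler integration of x' = f(x) from t = 0 to t_end."""
    x, t = x0, 0.0
    while t < t_end - 1e-12:
        x += h * f(x)
        t += h
    return x

f = lambda x: math.sin(x)
x0, y0, T, h, L = 1.0, 1.001, 5.0, 1e-3, 1.0
x_T = euler(f, x0, T, h)
y_T = euler(f, y0, T, h)
gap = abs(x_T - y_T)
envelope = abs(x0 - y0) * math.exp(L * T)
print(gap <= envelope)  # True: the Gronwall envelope holds
```

For this particular system the actual gap shrinks (both solutions are attracted toward \pi), which illustrates the remark that the exponential bound, while sharp in the worst case, is often pessimistic.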

Practice Quiz

Existence and Uniqueness Quiz (10 questions)

1. (Easy) The Picard-Lindelöf theorem guarantees:
2. (Easy) A function f(t,x) is Lipschitz continuous in x if:
3. (Easy) The Peano existence theorem requires:
4. (Medium) The IVP x' = x^{2/3}, x(0) = 0 has:
5. (Medium) The Gronwall inequality states that if u(t) \leq \alpha + \int_a^t \beta(s)u(s)\,ds, then:
6. (Medium) In Picard iteration, starting from \phi_0(t) = x_0, the next iterate is:
7. (Medium) A maximal solution is:
8. (Medium) If f is C^1 in a neighborhood of (t_0, x_0), then f is locally Lipschitz because:
9. (Hard) The blow-up theorem states that if a maximal solution exists on (\alpha, \omega) with \omega < \infty, then:
10. (Hard) Continuous dependence on initial conditions means that:

Frequently Asked Questions

What happens when the Lipschitz condition fails?

Without the Lipschitz condition, uniqueness may fail (as in x' = x^{2/3}), but existence is still guaranteed by Peano's theorem if f is continuous. Multiple solution curves can pass through the same point.

How do I know if a solution is maximal?

A solution is maximal if it cannot be extended: as t approaches an endpoint of its interval, the graph (t, x(t)) leaves every compact subset of the domain of f, either because the solution blows up (becomes unbounded) or because it approaches the boundary of the domain. The blow-up theorem characterizes what happens at the endpoints of a maximal solution.

Why is Gronwall's inequality so important?

Gronwall's inequality converts integral inequalities into pointwise bounds. It's used to prove: uniqueness (if two solutions agree at one point, they agree everywhere), continuous dependence (small changes in data produce small changes in solutions), and stability estimates.

Does Picard iteration always converge quickly?

Picard iteration converges, but the rate depends on the Lipschitz constant and interval length. For large LL or long time intervals, convergence may be slow. In practice, numerical methods are often used instead, but Picard iteration remains important theoretically.

What is the difference between local and global existence?

Local existence (guaranteed by Picard-Lindelöf) means a solution exists on some interval around t_0. Global existence means the solution exists for all t. Global existence requires additional conditions: for example, linear growth of f in x prevents finite-time blow-up.

How does this theory apply to systems?

The same theorems apply to systems x' = f(t, x) where x \in \mathbb{R}^n. The Lipschitz condition becomes |f(t, x_1) - f(t, x_2)| \leq L|x_1 - x_2| using vector norms. All results generalize directly.