Master the precise definition of function limits that forms the rigorous foundation of calculus.
The ε-δ definition was developed in the 19th century to put calculus on a rigorous footing. Augustin-Louis Cauchy (1789-1857) first made limits the basis of calculus in his 1821 Cours d'analyse.
Karl Weierstrass (1815-1897) formalized this into the precise ε-δ language we use today, resolving centuries of philosophical debates about infinitesimals.
In Chapter 2, we studied limits of sequences using the ε-N definition. Now we extend this concept to functions. The key idea remains the same: we want to capture what it means for f(x) to get "arbitrarily close" to a value L as x gets "sufficiently close" to x₀.
A punctured neighborhood of x₀ with radius δ is:
U'(x₀, δ) = {x : 0 < |x - x₀| < δ} = (x₀ - δ, x₀) ∪ (x₀, x₀ + δ)
This is an open interval centered at x₀, but with x₀ itself removed.
Let f be defined on some punctured neighborhood U'(x₀, δ₀). We say lim_{x → x₀} f(x) = L if and only if:
∀ε > 0, ∃δ > 0 such that 0 < |x - x₀| < δ ⟹ |f(x) - L| < ε.
Think of ε as a tolerance. Someone says: "I want f(x) within ε of L." No matter how small ε is, you must meet this challenge.
You must produce a δ that works. The δ can depend on ε—smaller ε typically requires smaller δ.
x is within δ of x₀, but NOT equal to x₀. We examine points near x₀, not at x₀ itself.
f(x) must be within ε of L. If you can always achieve this for any ε, the limit is L.
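The quantifier game can be explored numerically. Below is a minimal sketch (not part of any proof, and not from the text): a brute-force checker that samples the punctured interval and tests whether a candidate δ keeps f(x) within ε of L. The function f(x) = 2x + 1 and the rule δ = ε/2 are illustrative choices.

```python
import numpy as np

def delta_works(f, x0, L, eps, delta, samples=100_001):
    """Brute-force check: sample points with 0 < |x - x0| < delta
    and test whether |f(x) - L| < eps at every sampled point."""
    t = np.linspace(-1.0, 1.0, samples)[1:-1]   # stay strictly inside (-1, 1)
    xs = x0 + delta * t
    xs = xs[xs != x0]                           # puncture: remove x0 itself
    return bool(np.all(np.abs(f(xs) - L) < eps))

# Illustrative example: f(x) = 2x + 1 near x0 = 1, claimed limit L = 3,
# candidate rule delta = eps / 2 (so |f(x) - 3| = 2|x - 1| < eps).
f = lambda x: 2 * x + 1
for eps in (1.0, 0.1, 1e-4):
    print(eps, delta_works(f, x0=1.0, L=3.0, eps=eps, delta=eps / 2))
```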
If lim_{x → x₀} f(x) = L₁ and lim_{x → x₀} f(x) = L₂, then L₁ = L₂.
Suppose L₁ ≠ L₂. Let ε = |L₁ - L₂| / 2 > 0.
By the limit definitions, there exist δ₁, δ₂ > 0 such that for 0 < |x - x₀| < δᵢ: |f(x) - Lᵢ| < ε (i = 1, 2).
Let δ = min(δ₁, δ₂). Take any x with 0 < |x - x₀| < δ.
By triangle inequality: |L₁ - L₂| ≤ |L₁ - f(x)| + |f(x) - L₂| < ε + ε = |L₁ - L₂|.
Contradiction! Therefore L₁ = L₂. ∎
Claim: lim_{x → 2} f(x) = L for a linear function f with slope 3, so that |f(x) - L| = 3|x - 2|.
Scratch Work:
Need |f(x) - L| = 3|x - 2| < ε.
This holds when |x - 2| < ε/3. So choose δ = ε/3.
Formal Proof:
Given: ε > 0. Choose: δ = ε/3 > 0.
Verify: If 0 < |x - 2| < δ = ε/3, then |f(x) - L| = 3|x - 2| < 3 · (ε/3) = ε. ∎
Claim: lim_{x → 3} x² = 9
Scratch Work:
Need |x² - 9| = |x - 3| · |x + 3| < ε
Bound |x+3|: If |x - 3| < 1, then 2 < x < 4, so |x+3| < 7.
Then |x² - 9| < 7|x - 3|. Need 7|x - 3| < ε, i.e., |x - 3| < ε/7.
Formal Proof:
Choose: δ = min(1, ε/7)
If 0 < |x - 3| < δ, then |x - 3| < 1 (so |x+3| < 7) and |x - 3| < ε/7. Hence |x² - 9| = |x - 3| · |x + 3| < 7 · (ε/7) = ε. ∎
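As a numerical sanity check (not a substitute for the proof), the sketch below tests the choice δ = min(1, ε/7) for lim_{x → 3} x² = 9 over several tolerances, evaluating the worst case of |x² - 9| on a grid strictly inside the punctured interval.

```python
import numpy as np

# For lim_{x -> 3} x^2 = 9 with delta = min(1, eps/7), the worst case of
# |x^2 - 9| on 0 < |x - 3| < delta occurs as x approaches 3 + delta.
for eps in (10.0, 1.0, 0.05, 1e-4):
    delta = min(1.0, eps / 7)
    xs = 3.0 + delta * np.linspace(-0.9999, 0.9999, 200_001)
    xs = xs[xs != 3.0]                       # puncture: remove x0 = 3
    worst = np.max(np.abs(xs**2 - 9.0))
    print(f"eps={eps:g}  delta={delta:g}  max|x^2-9|={worst:.6g}  < eps: {worst < eps}")
```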
Claim: lim_{x → 1} (√x - 1)/(x - 1) = 1/2
Simplification:
Note: x - 1 = (√x - 1)(√x + 1). For x ≠ 1: (√x - 1)/(x - 1) = 1/(√x + 1).
Analysis:
Need |1/(√x + 1) - 1/2| = |1 - √x| / (2(√x + 1)) < ε
If x > 0, then √x + 1 > 1, and |1 - √x| ≤ |1 - x|.
So the expression is bounded by |x - 1|/2.
Conclusion:
Choose δ = min(1/2, 2ε). Then the limit equals 1/2. ∎
Claim: lim_{x → 0} sin(1/x) does not exist.
Proof by Contradiction:
Suppose lim_{x → 0} sin(1/x) = L. Take ε = 1/2.
For any δ > 0, consider xₙ = 1/(nπ) and yₙ = 1/(π/2 + 2nπ).
For large n, both are in (0, δ). But sin(1/xₙ) = 0 and sin(1/yₙ) = 1.
Both |0 - L| < 1/2 and |1 - L| < 1/2 cannot hold simultaneously.
Contradiction! The limit does not exist. ∎
Heine's theorem bridges function limits (ε-δ) and sequence limits (ε-N). This allows us to use everything we learned about sequences to analyze functions.
Let f be defined on U'(x₀, δ₀). Then: lim_{x → x₀} f(x) = L if and only if f(xₙ) → L for every sequence {xₙ} with xₙ → x₀ and xₙ ≠ x₀.
(⇒) Function limit ⟹ Sequential limit:
Assume lim_{x → x₀} f(x) = L. Let {xₙ} be any sequence with xₙ → x₀, xₙ ≠ x₀.
Given ε > 0, ∃δ > 0: 0 < |x - x₀| < δ ⟹ |f(x) - L| < ε.
Since xₙ → x₀, ∃N: n > N ⟹ |xₙ - x₀| < δ. Since xₙ ≠ x₀, we have 0 < |xₙ - x₀| < δ.
Therefore |f(xₙ) - L| < ε for n > N, proving f(xₙ) → L. ∎
(⇐) By contrapositive:
If the function limit ≠ L, then ∃ε₀ > 0 such that ∀δ > 0, ∃x with 0 < |x - x₀| < δ but |f(x) - L| ≥ ε₀.
Take δ = 1/n. Get xₙ with 0 < |xₙ - x₀| < 1/n but |f(xₙ) - L| ≥ ε₀.
Then xₙ → x₀ (since |xₙ - x₀| < 1/n → 0), xₙ ≠ x₀, but f(xₙ) ↛ L. ∎
If ∃ sequences {xₙ}, {yₙ} → x₀ (with terms ≠ x₀) such that lim f(xₙ) ≠ lim f(yₙ), then lim_{x → x₀} f(x) does not exist.
Problem: Show lim_{x → 0} sin(1/x) does not exist using Heine.
Sequence 1: xₙ = 1/(nπ) → 0. Then sin(nπ) = 0, so limit = 0.
Sequence 2: yₙ = 1/(π/2 + 2nπ) → 0. Then sin(π/2 + 2nπ) = 1, so limit = 1.
Since 0 ≠ 1, by Corollary 3.1, the limit does not exist. ∎
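The corollary can be illustrated numerically: evaluating sin(1/x) along the two sequences shows the function values locking onto 0 and 1 respectively, even though both sequences tend to 0. A small sketch (suggestive only):

```python
import numpy as np

n = np.arange(1, 8)
x_n = 1.0 / (n * np.pi)                    # sin(1/x_n) = sin(n*pi) = 0
y_n = 1.0 / (np.pi / 2 + 2 * n * np.pi)    # sin(1/y_n) = sin(pi/2 + 2n*pi) = 1

print(np.sin(1.0 / x_n))   # ~0 for every n (up to rounding)
print(np.sin(1.0 / y_n))   # ~1 for every n
# Both sequences tend to 0, but the f-values tend to different limits,
# so lim_{x -> 0} sin(1/x) cannot exist (Corollary 3.1).
```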
Sometimes we want to consider the limit from only one direction—from the left or from the right. This is especially useful for piecewise functions or functions with jump discontinuities.
We say lim_{x → x₀⁺} f(x) = L (the right-hand limit) if: ∀ε > 0, ∃δ > 0 such that x₀ < x < x₀ + δ ⟹ |f(x) - L| < ε.
We approach x₀ from the right (x > x₀).
We say lim_{x → x₀⁻} f(x) = L (the left-hand limit) if: ∀ε > 0, ∃δ > 0 such that x₀ - δ < x < x₀ ⟹ |f(x) - L| < ε.
We approach x₀ from the left (x < x₀).
The two-sided limit exists iff both one-sided limits exist and are equal.
For f(x) = {x² if x < 1, 2x if x ≥ 1}, find limits at x = 1.
Left-hand limit: For x < 1, f(x) = x². So lim_{x → 1⁻} f(x) = 1² = 1.
Right-hand limit: For x ≥ 1, f(x) = 2x. So lim_{x → 1⁺} f(x) = 2 · 1 = 2.
Conclusion: Since 1 ≠ 2, lim_{x → 1} f(x) does not exist.
For sgn(x) = {1 if x > 0, 0 if x = 0, -1 if x < 0}, find limits at x = 0.
Right-hand limit: lim_{x → 0⁺} sgn(x) = 1. Left-hand limit: lim_{x → 0⁻} sgn(x) = -1. Since 1 ≠ -1, lim_{x → 0} sgn(x) does not exist.
For f(x) = |x|/x, find limits at x = 0.
For x > 0: |x|/x = x/x = 1. For x < 0: |x|/x = -x/x = -1.
The two-sided limit does not exist (this is the sign function!).
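One-sided limits can be probed numerically by approaching x₀ from each side separately. A small sketch for f(x) = |x|/x (the values only suggest the limits; they do not prove them):

```python
import numpy as np

f = lambda x: np.abs(x) / x
h = 10.0 ** -np.arange(1, 7)         # 0.1, 0.01, ..., 1e-6

print("from the right:", f(0 + h))   # all 1.0  -> right-hand limit 1
print("from the left: ", f(0 - h))   # all -1.0 -> left-hand limit -1
# The one-sided limits differ (1 vs -1), so the two-sided limit at 0 does not exist.
```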
We can also ask: what happens to f(x) as x becomes very large (positive or negative)? This leads to the concept of limits at infinity.
Let f be defined on (a, +∞) for some a ∈ ℝ. We say lim_{x → +∞} f(x) = L if: ∀ε > 0, ∃M > a such that x > M ⟹ |f(x) - L| < ε.
Similarly, lim_{x → -∞} f(x) = L if: ∀ε > 0, ∃M > 0 such that x < -M ⟹ |f(x) - L| < ε.
If lim_{x → +∞} f(x) = L or lim_{x → -∞} f(x) = L, the line y = L is a horizontal asymptote.
For f(x) = (aₙxⁿ + ... + a₁x + a₀) / (bₘxᵐ + ... + b₁x + b₀) with aₙ, bₘ ≠ 0:
If n < m: the limit as x → ±∞ is 0.
If n = m: the limit is aₙ/bₘ.
If n > m: the limit is infinite (±∞, with the sign determined by aₙ/bₘ and the direction of approach).
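The three cases can be checked numerically by evaluating a rational function at increasingly large x. The specific polynomials below are illustrative choices, not taken from the text.

```python
import numpy as np

x = 10.0 ** np.arange(2, 8)   # 1e2, 1e3, ..., 1e7

print((3*x + 1) / (x**2 + 5))          # degree 1 < 2  -> values tend to 0
print((2*x**2 + x) / (5*x**2 - 7))     # degree 2 = 2  -> values tend to 2/5
print((x**3 + 1) / (x + 4))            # degree 3 > 1  -> values grow without bound
```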
Find lim_{x → +∞} (x² + 2x + 1)/(x² + 2).
Method: Divide numerator and denominator by x² (the highest power): (1 + 2/x + 1/x²)/(1 + 2/x²).
As x → +∞: 2/x → 0, 1/x² → 0, 2/x² → 0, so the limit is 1/1 = 1.
Find lim_{x → +∞} eˣ/xⁿ for a fixed positive integer n.
Answer: +∞. Exponentials always dominate polynomials.
Intuition: eˣ grows faster than any polynomial xⁿ as x → +∞.
We say lim_{x → x₀} f(x) = +∞ if: ∀M > 0, ∃δ > 0 such that 0 < |x - x₀| < δ ⟹ f(x) > M.
Similarly for lim_{x → x₀} f(x) = -∞: replace f(x) > M with f(x) < -M.
If a one-sided or two-sided limit of f at x₀ is +∞ or -∞, the line x = x₀ is a vertical asymptote.
Find one-sided limits of f(x) = 1/x at x = 0.
lim_{x → 0⁺} 1/x = +∞ and lim_{x → 0⁻} 1/x = -∞. The line x = 0 is a vertical asymptote of y = 1/x.
Find all asymptotes of f(x) = (x² + 1)/(x - 1).
Vertical: x = 1 (denominator = 0).
Oblique: Perform polynomial division: f(x) = x + 1 + 2/(x-1).
As x → ±∞, 2/(x-1) → 0, so y = x + 1 is an oblique asymptote.
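The conclusions can be confirmed numerically using the division form f(x) = x + 1 + 2/(x - 1) given above: f(x) - (x + 1) shrinks to 0 as |x| grows, while |f(x)| blows up near x = 1. A quick check:

```python
import numpy as np

f = lambda x: (x**2 + 1) / (x - 1)

big = np.array([1e2, 1e4, 1e6, -1e2, -1e4, -1e6])
print(f(big) - (big + 1))          # -> 0: y = x + 1 is an oblique asymptote

near_one = 1 + np.array([1e-2, 1e-4, -1e-2, -1e-4])
print(f(near_one))                 # huge magnitudes: x = 1 is a vertical asymptote
```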
Trigonometric: lim_{x → 0} (sin x)/x = 1, lim_{x → 0} (1 - cos x)/x² = 1/2, lim_{x → 0} (tan x)/x = 1
Exponential/Logarithmic: lim_{x → 0} (eˣ - 1)/x = 1, lim_{x → 0} ln(1 + x)/x = 1, lim_{x → ∞} (1 + 1/x)ˣ = e, lim_{x → 0} (1 + x)^{1/x} = e
Power Functions: lim_{x → 0} ((1 + x)ᵅ - 1)/x = α
Inverse Trig: lim_{x → 0} (arcsin x)/x = 1, lim_{x → 0} (arctan x)/x = 1
Find .
Find lim_{x → 0} (1 - 2x)^{1/x}.
Method: Rewrite using the standard limit form: (1 - 2x)^{1/x} = [(1 + (-2x))^{1/(-2x)}]^{-2}.
As x → 0, -2x → 0, so (1 + (-2x))^{1/(-2x)} → e, and therefore (1 - 2x)^{1/x} → e^{-2}.
Find lim_{x → 0} (e^{2x} - 1)/x.
Using Taylor expansion: e^{2x} ≈ 1 + 2x + 2x² + ... near x = 0, so (e^{2x} - 1)/x ≈ 2 + 2x + ... → 2.
Answer: 2
Find lim_{x → 0} (1 - cos x)/x².
Method 1 (Half-angle identity): 1 - cos x = 2 sin²(x/2), so (1 - cos x)/x² = (1/2) · [sin(x/2)/(x/2)]².
As x → 0, (x/2) → 0, so sin(x/2)/(x/2) → 1, and the limit is 1/2.
Find lim_{x → 0} sin(tan x)/x.
Solution: Write as a product: sin(tan x)/x = [sin(tan x)/tan x] · [tan x/x].
As x → 0: tan x → 0, so sin(tan x)/(tan x) → 1.
Also, tan x / x = (sin x / x) · (1 / cos x) → 1 · 1 = 1.
Find lim_{x → 0} ln(1 + sin x)/x.
Solution: Write as: ln(1 + sin x)/x = [ln(1 + sin x)/sin x] · [sin x/x].
As x → 0: sin x → 0, so ln(1 + sin x)/(sin x) → 1.
And sin x / x → 1. Therefore the limit is 1 · 1 = 1.
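Numerically, both factors in the decomposition approach 1, and so does their product. A short check (suggestive only):

```python
import numpy as np

x = 10.0 ** -np.arange(1, 7)               # 0.1, 0.01, ..., 1e-6
factor1 = np.log1p(np.sin(x)) / np.sin(x)  # ln(1 + sin x) / sin x -> 1
factor2 = np.sin(x) / x                    # sin x / x -> 1

print(factor1)
print(factor2)
print(factor1 * factor2)                   # ln(1 + sin x) / x -> 1
```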
Mastering ε-δ proofs requires practice with various function types. Here we present systematic approaches for different situations.
For f(x) = ax + b with a ≠ 0: lim_{x → x₀} (ax + b) = ax₀ + b.
Choose: δ = ε/|a|, since |f(x) - (ax₀ + b)| = |a| · |x - x₀|.
Linear functions are the simplest—δ is directly proportional to ε.
For polynomials near x₀: factor |p(x) - p(x₀)| = |x - x₀| · |q(x)|, bound |q(x)| on a preliminary interval such as |x - x₀| < 1, then take δ = min(1, ε/C), where C is that bound.
For quotients such as f(x) = 1/g(x) where g(x₀) ≠ 0:
Key insight: Control the denominator before controlling the whole expression.
Claim: lim_{x → 2} x² = 4
Step 1: Factor |x² - 4| = |x - 2| · |x + 2|
Step 2: Bound the second factor
If |x - 2| < 1, then 1 < x < 3, so: |x + 2| < 5.
Step 3: Choose δ
Need 5|x - 2| < ε, so |x - 2| < ε/5.
Choose: δ = min(1, ε/5)
Step 4: Verify
If 0 < |x - 2| < δ, then: |x² - 4| = |x - 2| · |x + 2| < (ε/5) · 5 = ε. ∎
Claim: lim_{x → 1} 1/x = 1
Step 1: Analyze |1/x - 1| = |x - 1| / |x|
Step 2: Bound the denominator
If |x - 1| < 1/2, then 1/2 < x < 3/2, so |x| > 1/2.
Thus 1/|x| < 2.
Step 3: Complete
Need 2|x - 1| < ε, so |x - 1| < ε/2.
Choose: δ = min(1/2, ε/2)
When your proof requires two conditions on δ:
Choose δ = min(δ₁, δ₂). This ensures BOTH conditions hold.
Sometimes the limit doesn't exist. Understanding how to prove non-existence is just as important as proving existence.
lim_{x → x₀} f(x) ≠ L means: ∃ε₀ > 0 such that ∀δ > 0, ∃x with 0 < |x - x₀| < δ but |f(x) - L| ≥ ε₀.
Note how the quantifiers are flipped: ∀ε becomes ∃ε₀, and ∃δ becomes ∀δ.
Find two sequences xₙ → x₀ and yₙ → x₀ such that lim f(xₙ) ≠ lim f(yₙ).
Often the easiest method!
Show: for every candidate limit L, ∃ε₀ > 0 such that ∀δ > 0, ∃x with 0 < |x - x₀| < δ but |f(x) - L| ≥ ε₀.
Exhibit a specific ε₀ for which no δ works.
Claim: lim_{x → 0} cos(1/x) does not exist.
Proof using Heine's theorem:
Sequence 1: xₙ = 1/(2πn) → 0. Then cos(2πn) = 1 → 1.
Sequence 2: yₙ = 1/(πn) → 0. Then cos(πn) = (-1)ⁿ, which doesn't converge.
Since f(yₙ) doesn't converge, the function limit cannot exist. ∎
Claim: lim_{x → 0} ⌊x⌋ does not exist (floor function).
Proof using one-sided limits:
For 0 < x < 1: ⌊x⌋ = 0, so lim_{x → 0⁺} ⌊x⌋ = 0.
For -1 < x < 0: ⌊x⌋ = -1, so lim_{x → 0⁻} ⌊x⌋ = -1.
Since 0 ≠ -1, the two-sided limit doesn't exist. ∎
Claim: lim_{x → 0} (1/x) sin(1/x) does not exist.
Proof: Consider xₙ = 1/(π/2 + 2πn).
Then sin(1/xₙ) = 1, and: f(xₙ) = (1/xₙ) · sin(1/xₙ) = π/2 + 2πn → +∞.
The function is unbounded near 0, so no finite limit exists.
But also consider yₙ = 1/(3π/2 + 2πn), where sin(1/yₙ) = -1, giving values f(yₙ) = -(3π/2 + 2πn) → -∞.
So even +∞ or -∞ is not a valid limit. The limit simply doesn't exist. ∎
Find (without L'Hôpital).
Using Taylor series: sin x = x - x³/6 + x⁵/120 - ...
Find lim_{x → +∞} (sin x)/x.
Solution: Since -1 ≤ sin x ≤ 1 for all x: -1/x ≤ (sin x)/x ≤ 1/x for x > 0.
As x → +∞: -1/x → 0 and 1/x → 0.
By Squeeze Theorem: lim_{x → +∞} (sin x)/x = 0.
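A quick numerical look at the squeeze: the bounds ±1/x pinch (sin x)/x toward 0 as x grows.

```python
import numpy as np

x = 10.0 ** np.arange(1, 7)     # 10, 100, ..., 1e6
vals = np.sin(x) / x
print(np.column_stack([-1/x, vals, 1/x]))   # each row: lower bound, value, upper bound
# The middle column stays between -1/x and 1/x, both of which tend to 0.
```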
Find lim_{x → 0⁺} xˣ.
Solution: Write xˣ = e^{x ln x}.
First find lim_{x → 0⁺} x ln x = lim_{x → 0⁺} (ln x)/(1/x):
As x → 0⁺: ln x → -∞ and 1/x → +∞. This is -∞/∞ form.
Let t = 1/x, so x = 1/t and x → 0⁺ means t → +∞: x ln x = -(ln t)/t → 0.
Therefore: lim_{x → 0⁺} xˣ = e⁰ = 1.
Find .
Solution: Rationalize:
Prove that lim_{x → +∞} (ln x)/xᵅ = 0 for any α > 0.
Proof: Let y = ln x, so x = eʸ and x → +∞ means y → +∞.
Since eᵅʸ grows faster than any polynomial in y: (ln x)/xᵅ = y/eᵅʸ → 0 as y → +∞.
Conclusion: Logarithms grow slower than ANY positive power. ∎
Understanding how different functions grow as x → ∞ is essential for evaluating limits. Here's the complete hierarchy from slowest to fastest growth.
Key insight: When computing limits of ratios, the faster-growing function "wins."
Evaluate these limits at +∞:
lim_{x → +∞} xⁿ/eˣ = 0: Exponential beats any polynomial.
lim_{x → +∞} (ln x)ᵏ/xᵅ = 0 (for any k and any α > 0): Any positive power beats any power of logarithm.
lim_{n → ∞} aⁿ/n! = 0 (for any fixed a): Factorial beats exponential.
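The hierarchy can be seen numerically by comparing ratios at increasingly large arguments (the factorial case is compared at integer points). The specific exponents and bases below are illustrative choices.

```python
import math
import numpy as np

# Powers beat logarithms: (ln x) / sqrt(x) -> 0
x = np.array([1e2, 1e4, 1e6, 1e8])
print(np.log(x) / np.sqrt(x))

# Exponentials beat powers: x^5 / e^x -> 0
x = np.array([10.0, 20.0, 50.0, 100.0])
print(x**5 / np.exp(x))

# Factorials beat exponentials: 10^n / n! -> 0
for n in (10, 30, 50):
    print(n, 10.0**n / math.factorial(n))
```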
Near x = 0, the hierarchy reverses for "smallness":
A function α(x) is an infinitesimal as x → x₀ if: lim_{x → x₀} α(x) = 0.
Let α(x) and β(x) be infinitesimals as x → x₀ with β(x) ≠ 0 near x₀. If lim α(x)/β(x) = 0, then α is of higher order than β; if the limit is a nonzero constant, they are of the same order; if the limit is 1, they are equivalent, written α(x) ~ β(x).
Usage: Replace equivalent infinitesimals in limits for easier computation.
Find lim_{x → 0} tan x · (1 - cos x)/x³.
Solution:
Using equivalences: sin x ~ x, 1 - cos x ~ x²/2, cos x → 1: tan x · (1 - cos x)/x³ = [sin x · (1 - cos x)]/(x³ cos x) ~ [x · x²/2]/(x³ · 1) → 1/2.
Find lim_{x → 0} (x - sin x)/x³.
Warning: Cannot simply replace sin x ~ x here (would give 0/x³)!
Correct approach: Use Taylor series: sin x = x - x³/6 + x⁵/120 - ..., so (x - sin x)/x³ = 1/6 - x²/120 + ... → 1/6.
Lesson: Equivalences only work for products/quotients, not for sums where cancellation occurs.
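The warning can be made concrete numerically. Assuming the example is the classic lim_{x → 0} (x - sin x)/x³ (as reconstructed above), the naive replacement sin x ~ x predicts 0, while the sampled values approach 1/6:

```python
import numpy as np

x = 10.0 ** -np.arange(1, 5)            # 0.1, 0.01, 0.001, 0.0001

naive = (x - x) / x**3                  # after the (invalid) replacement sin x ~ x
true_vals = (x - np.sin(x)) / x**3      # the actual expression

print(naive)        # all 0.0: what the illegal substitution predicts
print(true_vals)    # approaches 1/6 ≈ 0.1667
```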
Claim: lim_{x → x₀} f(x) = L
Proof:
Let ε > 0 be given. [Start of proof]
Choose δ = ... [Your choice, may depend on ε]
Suppose 0 < |x - x₀| < δ. [Assume the hypothesis]
Then ... [Chain of inequalities]
Therefore |f(x) - L| < ε. [Reach the conclusion]
Since ε was arbitrary, lim_{x → x₀} f(x) = L. ∎
Claim: lim_{x → 4} √x = 2
Scratch Work (not part of formal proof):
Need |√x - 2| = |x - 4| / (√x + 2) < ε.
If |x - 4| < 2, then x > 2, so √x > √2, thus √x + 2 > 2.
So |√x - 2| < |x - 4|/2. Need |x - 4|/2 < ε, i.e., |x - 4| < 2ε.
Formal Proof:
Proof: Let ε > 0 be given.
Choose δ = min(2, 2ε) > 0.
Suppose 0 < |x - 4| < δ.
Since δ ≤ 2, we have |x - 4| < 2, so x > 2 > 0, which gives √x + 2 > 2.
Then: |√x - 2| = |x - 4| / (√x + 2) < |x - 4| / 2 < δ/2 ≤ ε.
Since ε was arbitrary, lim_{x → 4} √x = 2. ∎
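The proof's choice can be spot-checked numerically in the same spirit as earlier: for several ε, sample the punctured interval around 4 given by δ = min(2, 2ε) and confirm that |√x - 2| stays below ε.

```python
import numpy as np

for eps in (1.0, 0.25, 1e-3):
    delta = min(2.0, 2.0 * eps)
    xs = 4.0 + delta * np.linspace(-0.9999, 0.9999, 100_001)
    xs = xs[xs != 4.0]                          # puncture: remove x0 = 4
    worst = np.max(np.abs(np.sqrt(xs) - 2.0))
    print(f"eps={eps:g}  delta={delta:g}  max|sqrt(x)-2|={worst:.6g}  < eps: {worst < eps}")
```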
The concept of limits naturally leads to continuity—one of the most important properties in analysis. Here's a preview of how they connect.
A function f is continuous at x₀ if:
This requires three things: (1) f(x₀) exists, (2) the limit exists, (3) they're equal.
Define f(0) to make f(x) = (sin x)/x continuous at x = 0.
Solution: We need f(0) = lim_{x→0} (sin x)/x = 1.
Define: f(x) = {(sin x)/x if x ≠ 0, 1 if x = 0}.
This function is continuous everywhere.
For f(x) = ⌊x⌋ (floor function), find limits at x = 2.
Left-hand limit: For 1 < x < 2, ⌊x⌋ = 1, so lim_{x → 2⁻} ⌊x⌋ = 1.
Right-hand limit: For 2 ≤ x < 3, ⌊x⌋ = 2, so lim_{x → 2⁺} ⌊x⌋ = 2.
Note: f(2) = 2, so f is right-continuous at 2 but not left-continuous.
Find .
Solution: Write as exponential:
Using ln(1 + u) ≈ u - u²/2 + ... for small u = 1/x:
Therefore the limit is .
Now that you've mastered the definition of function limits, the next section covers:
The condition 0 < |x - x₀| excludes x = x₀. The limit describes behavior as x approaches x₀, not the value at x₀. The function may not even be defined at x₀.
For sequences, N is discrete (natural number). For functions, δ measures continuous distance. Heine's theorem bridges them.
Work backwards: start with |f(x) - L| < ε, manipulate to get |x - x₀| < something. That 'something' becomes your δ.
Yes! The limit only concerns behavior approaching x₀. Example: lim(sin x)/x = 1 as x→0.
Two-sided limit exists iff both one-sided limits exist and are equal.
Intuition can be misleading. ε-δ provides a precise criterion for proving theorems and resolving debates.
For all {xₙ} → x₀ (xₙ ≠ x₀): f(xₙ) → L