Iterative Methods for Linear Systems
For large sparse systems, iterative methods are often more efficient than direct methods. Learn Jacobi, Gauss-Seidel, and SOR methods, and understand when they converge.
- Understand matrix splitting and iterative formulations
- Implement Jacobi, Gauss-Seidel, and SOR methods
- Analyze convergence using spectral radius
- Apply sufficient conditions for convergence
- Choose optimal relaxation parameters
- Understand block iterative methods
1. Jacobi Method
The Jacobi method solves $Ax = b$ by isolating each variable and iterating.
Definition 7.1: Matrix Splitting
Write $A = D + L + U$, where:
- $D$ = diagonal of $A$
- $L$ = strict lower triangular part of $A$
- $U$ = strict upper triangular part of $A$
Definition 7.2: Jacobi Iteration
From $Dx = b - (L + U)x$:
$$x^{(k+1)} = D^{-1}\left(b - (L + U)x^{(k)}\right)$$
Component-wise:
$$x_i^{(k+1)} = \frac{1}{a_{ii}}\left(b_i - \sum_{j \neq i} a_{ij} x_j^{(k)}\right), \quad i = 1, \dots, n$$
Algorithm Jacobi Method
Input: $A$, $b$, initial guess $x^{(0)}$, tolerance $\varepsilon$
- For $k = 0, 1, 2, \dots$ until convergence:
- For $i = 1, \dots, n$: compute $x_i^{(k+1)}$ using only $x^{(k)}$
- If $\|x^{(k+1)} - x^{(k)}\| < \varepsilon$, stop
Remark:
Parallelizable: All components of $x^{(k+1)}$ can be computed independently since they depend only on $x^{(k)}$.
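Here is a minimal NumPy sketch of the algorithm above; the function name `jacobi` and the defaults `tol=1e-8`, `max_iter=500` are illustrative choices, not from the text.

```python
import numpy as np

def jacobi(A, b, x0=None, tol=1e-8, max_iter=500):
    """Jacobi iteration for Ax = b (A assumed to have nonzero diagonal)."""
    n = len(b)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float).copy()
    D = np.diag(A)                  # diagonal entries a_ii
    R = A - np.diagflat(D)          # off-diagonal part L + U
    for k in range(max_iter):
        x_new = (b - R @ x) / D     # x_i = (b_i - sum_{j != i} a_ij x_j) / a_ii
        if np.linalg.norm(x_new - x, np.inf) < tol:
            return x_new, k + 1
        x = x_new
    return x, max_iter
```

The whole sweep is one vectorized expression because each component depends only on the previous iterate, which is exactly the property that makes Jacobi easy to parallelize.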
2. Gauss-Seidel Method
Definition 7.3: Gauss-Seidel Iteration
Use updated values immediately as they become available:
$$(D + L)x^{(k+1)} = b - Ux^{(k)}$$
Component-wise:
$$x_i^{(k+1)} = \frac{1}{a_{ii}}\left(b_i - \sum_{j < i} a_{ij} x_j^{(k+1)} - \sum_{j > i} a_{ij} x_j^{(k)}\right)$$
Example: Comparison
For the system $Ax = b$ with $A = \begin{pmatrix} 4 & -1 \\ -1 & 4 \end{pmatrix}$, $b = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$:
| $k$ | Jacobi | Gauss-Seidel |
|---|---|---|
| 0 | (0, 0) | (0, 0) |
| 1 | (0.25, 0.25) | (0.25, 0.3125) |
| 2 | (0.3125, 0.3125) | (0.328, 0.332) |
Exact solution: $x = (1/3, 1/3) \approx (0.333, 0.333)$. Gauss-Seidel converges faster.
Remark:
Gauss-Seidel typically converges faster than Jacobi (roughly twice as fast for many problems), but it cannot be parallelized as easily due to data dependencies.
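A matching Gauss-Seidel sketch under the same assumptions as the Jacobi code above; applied to the $2 \times 2$ system from the example, it reproduces the iterates in the table.

```python
import numpy as np

def gauss_seidel(A, b, x0=None, tol=1e-8, max_iter=500):
    """Gauss-Seidel iteration: each x[i] is updated in place."""
    n = len(b)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float).copy()
    for k in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            # Uses already-updated x[:i] and old values x[i+1:]
            s = A[i, :i] @ x[:i] + A[i, i+1:] @ x[i+1:]
            x[i] = (b[i] - s) / A[i, i]
        if np.linalg.norm(x - x_old, np.inf) < tol:
            return x, k + 1
    return x, max_iter

A = np.array([[4.0, -1.0], [-1.0, 4.0]])
b = np.array([1.0, 1.0])
print(gauss_seidel(A, b))   # converges to (1/3, 1/3)
```

The inner loop's dependence on already-updated entries is what speeds convergence and, at the same time, what obstructs easy parallelization.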
3. Successive Over-Relaxation (SOR)
Definition 7.4: SOR Iteration
Introduce a relaxation parameter $\omega > 0$:
$$x_i^{(k+1)} = (1 - \omega)x_i^{(k)} + \frac{\omega}{a_{ii}}\left(b_i - \sum_{j < i} a_{ij} x_j^{(k+1)} - \sum_{j > i} a_{ij} x_j^{(k)}\right)$$
In matrix form:
$$(D + \omega L)x^{(k+1)} = \left[(1 - \omega)D - \omega U\right]x^{(k)} + \omega b$$
Theorem 7.1: SOR Convergence
SOR can converge only if $0 < \omega < 2$, since the SOR iteration matrix satisfies $\rho(\mathcal{L}_\omega) \geq |\omega - 1|$.
Note:
Special cases:
- $\omega = 1$: Gauss-Seidel
- $\omega < 1$: under-relaxation (more stable, slower)
- $\omega > 1$: over-relaxation (can accelerate convergence)
Theorem 7.2: Optimal SOR for Tridiagonal SPD
For symmetric positive definite tridiagonal matrices with Jacobi iteration matrix $B_J$:
$$\omega_{\text{opt}} = \frac{2}{1 + \sqrt{1 - \rho(B_J)^2}}$$
This choice minimizes $\rho(\mathcal{L}_\omega)$, the spectral radius of the SOR iteration matrix.
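A sketch combining Definition 7.4 and Theorem 7.2; `sor` and `omega_opt` are illustrative names, and `omega_opt` assumes the eigenvalue computation is affordable (realistic only for small demonstration matrices).

```python
import numpy as np

def sor(A, b, omega, x0=None, tol=1e-8, max_iter=500):
    """SOR sweep: omega = 1 reduces to Gauss-Seidel."""
    n = len(b)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float).copy()
    for k in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            s = A[i, :i] @ x[:i] + A[i, i+1:] @ x[i+1:]
            gs = (b[i] - s) / A[i, i]              # Gauss-Seidel value
            x[i] = (1 - omega) * x[i] + omega * gs # relax toward it
        if np.linalg.norm(x - x_old, np.inf) < tol:
            return x, k + 1
    return x, max_iter

def omega_opt(A):
    """Theorem 7.2: optimal omega from rho of the Jacobi iteration matrix."""
    D = np.diag(np.diag(A))
    B_J = np.eye(len(A)) - np.linalg.inv(D) @ A    # = -D^{-1}(L + U)
    rho = max(abs(np.linalg.eigvals(B_J)))
    return 2.0 / (1.0 + np.sqrt(1.0 - rho**2))

A = np.array([[4.0, -1.0], [-1.0, 4.0]])
b = np.array([1.0, 1.0])
print(sor(A, b, omega_opt(A)))   # ~(1/3, 1/3) in a few iterations
```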
4. Convergence Analysis
Definition 7.5: Spectral Radius
The spectral radius of a matrix $T$ is:
$$\rho(T) = \max_i |\lambda_i|$$
where $\lambda_i$ are the eigenvalues of $T$.
Theorem 7.3: Convergence Criterion
The iteration $x^{(k+1)} = Tx^{(k)} + c$ converges to the unique solution for any initial guess $x^{(0)}$ if and only if:
$$\rho(T) < 1$$
Definition 7.6: Convergence Rate
The asymptotic convergence rate is:
$$R = -\log_{10} \rho(T)$$
This measures decimal digits of accuracy gained per iteration.
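To make the criterion and the rate concrete, this sketch computes $\rho$ for the Jacobi and Gauss-Seidel iteration matrices of the $2 \times 2$ example above; for that matrix $\rho_{GS} = \rho_J^2$, consistent with the "roughly twice as fast" remark.

```python
import numpy as np

A = np.array([[4.0, -1.0], [-1.0, 4.0]])
D = np.diag(np.diag(A))
L = np.tril(A, -1)   # strict lower triangular part
U = np.triu(A, 1)    # strict upper triangular part

T_J  = -np.linalg.inv(D) @ (L + U)   # Jacobi iteration matrix
T_GS = -np.linalg.inv(D + L) @ U     # Gauss-Seidel iteration matrix

for name, T in [("Jacobi", T_J), ("Gauss-Seidel", T_GS)]:
    rho = max(abs(np.linalg.eigvals(T)))
    print(f"{name}: rho = {rho:.4f}, rate = {-np.log10(rho):.4f}")
# Jacobi: rho = 0.25; Gauss-Seidel: rho = 0.0625 = 0.25**2
```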
Theorem 7.4: Sufficient Conditions
Each of the following ensures convergence:
- Strict diagonal dominance: $|a_{ii}| > \sum_{j \neq i} |a_{ij}|$ for all $i$ (both Jacobi and Gauss-Seidel converge)
- Symmetric positive definite: $A = A^T$ and $x^T A x > 0$ for all $x \neq 0$ (Gauss-Seidel converges; Jacobi additionally requires $2D - A$ to be positive definite)
Example: Convergence Check
For the matrix $A = \begin{pmatrix} 4 & -1 & 0 \\ -1 & 4 & -1 \\ 0 & -1 & 4 \end{pmatrix}$:
- Row 1: $|4| > |-1| + |0| = 1$ ✓
- Row 2: $|4| > |-1| + |-1| = 2$ ✓
- Row 3: $|4| > |0| + |-1| = 1$ ✓
Strictly diagonally dominant → Both methods converge.
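A small helper that automates this row-by-row check; the function name is an illustrative choice.

```python
import numpy as np

def is_strictly_diagonally_dominant(A):
    """Check |a_ii| > sum_{j != i} |a_ij| for every row."""
    diag = np.abs(np.diag(A))
    off = np.abs(A).sum(axis=1) - diag   # off-diagonal row sums
    return bool(np.all(diag > off))

A = np.array([[4.0, -1.0,  0.0],
              [-1.0, 4.0, -1.0],
              [ 0.0, -1.0, 4.0]])
print(is_strictly_diagonally_dominant(A))   # True
```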
5. Practical Considerations
Definition 7.7: Stopping Criteria
Common stopping criteria (combined in the sketch after this list):
- Residual: $\|b - Ax^{(k)}\| < \varepsilon$
- Relative change: $\|x^{(k+1)} - x^{(k)}\| / \|x^{(k+1)}\| < \varepsilon$
- Maximum iterations: $k \geq k_{\max}$
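A sketch of a combined test, as referenced above; the helper name and the guard against a zero denominator are my own choices.

```python
import numpy as np

def should_stop(A, b, x_new, x_old, k, tol=1e-8, k_max=1000):
    """Stop when any of the three criteria above is met (hypothetical helper)."""
    residual = np.linalg.norm(b - A @ x_new)
    denom = max(np.linalg.norm(x_new), np.finfo(float).tiny)  # avoid 0/0
    rel_change = np.linalg.norm(x_new - x_old) / denom
    return residual < tol or rel_change < tol or k >= k_max
```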
When to Use Iterative Methods
- Large sparse matrices (thousands or millions of unknowns)
- When matrix structure can be exploited (e.g., matrix-free methods)
- When only approximate solutions are needed
- When storage is limited
Block Methods
Partition the unknowns into blocks and solve a small block system for each block at every iteration (see the sketch below). Block methods often converge faster than their point counterparts.
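A sketch of block Jacobi with uniform blocks, assuming each diagonal block is cheap to solve directly; the function name, block size, and demo matrix are illustrative.

```python
import numpy as np

def block_jacobi(A, b, block_size, x0=None, tol=1e-8, max_iter=500):
    """Block Jacobi: solve each diagonal block exactly per sweep."""
    n = len(b)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float).copy()
    for k in range(max_iter):
        x_new = x.copy()
        for s in range(0, n, block_size):
            e = min(s + block_size, n)
            # b_i - sum_{j != i} A_ij x_j, using only the old iterate
            r = b[s:e] - A[s:e, :] @ x + A[s:e, s:e] @ x[s:e]
            x_new[s:e] = np.linalg.solve(A[s:e, s:e], r)
        if np.linalg.norm(x_new - x, np.inf) < tol:
            return x_new, k + 1
        x = x_new
    return x, max_iter

# Demo on a 6x6 strictly diagonally dominant tridiagonal matrix
A = (np.diag(np.full(6, 4.0))
     + np.diag(np.full(5, -1.0), 1)
     + np.diag(np.full(5, -1.0), -1))
b = np.ones(6)
print(block_jacobi(A, b, block_size=2))
```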
Remark:
Modern iterative methods like Conjugate Gradient (for SPD matrices) and GMRES (for general matrices) are often preferred over classical stationary methods. They are covered in advanced courses.
Frequently Asked Questions
When should I use iterative vs. direct methods?
Direct methods: For small to medium dense systems, or when you need to solve with many right-hand sides.
Iterative methods: For large sparse systems where direct methods would be too slow or use too much memory. Also useful when only approximate solutions are needed.
How do I choose between Jacobi, Gauss-Seidel, and SOR?
Jacobi: Simple to implement and parallelize. Often slowest to converge.
Gauss-Seidel: Usually faster than Jacobi (about 2x). Use when parallelization isn't critical.
SOR: Fastest when the optimal $\omega$ is known. Worth the effort for repeated solves of similar systems.
My iteration isn't converging. What should I check?
Check these in order:
- Is $A$ non-singular?
- Is $A$ diagonally dominant or SPD?
- For SOR, is $0 < \omega < 2$?
- Try scaling the system (row equilibration)
- Consider preconditioning or Krylov methods
What is preconditioning?
Preconditioning transforms $Ax = b$ into an equivalent system with better convergence properties. Instead of solving $Ax = b$, solve $M^{-1}Ax = M^{-1}b$, where $M \approx A$ but $M^{-1}$ is easy to apply. Good preconditioners dramatically speed up convergence.
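A minimal illustration with the simplest choice, the Jacobi (diagonal) preconditioner $M = D$. Forming $M^{-1}A$ explicitly is for demonstration only; practical codes apply $M^{-1}$ without forming the product.

```python
import numpy as np

A = np.array([[4.0, -1.0], [-1.0, 4.0]])
b = np.array([1.0, 1.0])

M_inv = np.diag(1.0 / np.diag(A))   # M = diag(A), so M^{-1} is trivial
A_prec = M_inv @ A                  # M^{-1} A has unit diagonal
b_prec = M_inv @ b                  # M^{-1} b
# Same solution as Ax = b, typically a better-behaved iteration
print(np.linalg.solve(A_prec, b_prec))   # [0.3333... 0.3333...]
```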
How do I estimate the spectral radius without computing eigenvalues?
Run a few iterations and observe the ratio $\|x^{(k+1)} - x^{(k)}\| / \|x^{(k)} - x^{(k-1)}\|$. This ratio approaches $\rho(T)$ asymptotically. Alternatively, use the power method on the iteration matrix (covered in the next chapter).
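A quick sketch of this estimate using Jacobi on the $2 \times 2$ example; here the ratio settles at $\rho(T_J) = 0.25$ immediately because the initial difference happens to be an eigenvector of the iteration matrix.

```python
import numpy as np

A = np.array([[4.0, -1.0], [-1.0, 4.0]])
b = np.array([1.0, 1.0])
D = np.diag(A)
R = A - np.diagflat(D)

x_prev = np.zeros(2)
x = (b - R @ x_prev) / D          # one Jacobi step from x^(0)
for k in range(1, 8):
    x_next = (b - R @ x) / D
    ratio = np.linalg.norm(x_next - x) / np.linalg.norm(x - x_prev)
    print(k, ratio)               # tends to rho(T) = 0.25 for this system
    x_prev, x = x, x_next
```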