# Tutorial 4: Root-Finding Methods
In this tutorial, we study classical root-finding algorithms for nonlinear equations. We will:
- Define the root-finding problem mathematically
- Derive several algorithms (bisection, fixed-point, Newton, secant)
- Discuss convergence conditions and error behavior
- Compare methods with worked examples using the `numethods` package
## 1. Problem Setup and Notation
We seek to solve a nonlinear scalar equation \[ f(x) = 0, \quad f: \mathbb{R} \to \mathbb{R}, \] where \(f\) is continuously differentiable.
### Root, residual, and error
- A root \(x^\star\) satisfies \(f(x^\star)=0\).
- Absolute error: \(e_k = |x_k - x^\star|\).
- Residual: \(r_k = |f(x_k)|\). Note that a small residual does not always imply a small error (for instance, when \(f\) is very flat near the root).
### Multiplicity
A root \(x^\star\) has multiplicity \(m\) if \[ f(x^\star) = f'(x^\star) = \dots = f^{(m-1)}(x^\star) = 0, \quad f^{(m)}(x^\star) \neq 0. \] If \(f(x^\star)=0\) and \(f'(x^\star)\ne 0\), the root is simple (multiplicity 1). For example, \(f(x)=(x-1)^2\) has a root of multiplicity 2 at \(x^\star=1\).
- Simple roots (\(m=1\)): most methods converge rapidly.
- Multiple roots (\(m>1\)): convergence often slows.
## 2. Bisection Method
Assumption (Intermediate Value Theorem): if \(f\) is continuous on \([a,b]\) and \(f(a)f(b) < 0\), then there exists \(x^\star \in (a,b)\) with \(f(x^\star)=0\).
- Requires only continuity of \(f\) on \([a,b]\) and a sign change, \(f(a)f(b)<0\).
- Repeatedly bisect the interval and keep the subinterval containing the root.
Iteration: compute the midpoint \[ c_k = \frac{a_k+b_k}{2}, \] evaluate \(f(c_k)\), and keep the half-interval on whose endpoints \(f\) changes sign.
Error bound: the interval length halves each step, so \[ |c_k-x^\star| \le \frac{b-a}{2^k}. \] For example, reducing the error on \([0,2]\) below \(10^{-8}\) takes about \(\log_2(2\cdot 10^{8}) \approx 28\) iterations.
- Convergence: linear, and guaranteed whenever the sign-change assumption holds.
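A minimal sketch of this loop in plain Python (independent of the `numethods` classes used later; the function name `bisection` is ours, not the package's):

```python
def bisection(f, a, b, tol=1e-8, max_iter=100):
    """Halve [a, b] while keeping a sign change of f."""
    fa = f(a)
    if fa * f(b) > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    for _ in range(max_iter):
        c = (a + b) / 2
        fc = f(c)
        if fc == 0 or (b - a) / 2 <= tol:
            return c
        if fa * fc < 0:   # sign change on [a, c]: keep left half
            b = c
        else:             # sign change on [c, b]: keep right half
            a, fa = c, fc
    return (a + b) / 2

print(bisection(lambda x: x**2 - 2, 0, 2))  # ~1.41421356 (sqrt(2))
```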
## 3. Fixed-Point Iteration
- Rewrite the equation \(f(x)=0\) as \(x=g(x)\).
- Iterate: \[ x_{k+1} = g(x_k). \]
Convergence theorem (Banach fixed-point): if \(g\) is continuously differentiable near \(x^\star\) and \[ |g'(x^\star)| < 1, \] then for initial guesses \(x_0\) sufficiently close to \(x^\star\), the iterates converge linearly to \(x^\star\) with asymptotic rate \(|g'(x^\star)|\).
**Choice of \(g\).** Different rearrangements of \(f(x)=0\) yield different functions \(g\) with different convergence properties. A poor choice (with \(|g'|\ge 1\) near the fixed point) can diverge.
- Rate: linear with factor \(|g'(x^\star)|\).
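To see how the choice of \(g\) matters, here is a minimal sketch (plain Python; `fixed_point` is our illustrative helper, not the package class). For \(f(x)=x^2-2\), the rearrangement \(g_1(x)=2/x\) has \(|g_1'(\sqrt{2})|=1\) and merely oscillates, while \(g_2(x)=\tfrac12(x+2/x)\) has \(g_2'(\sqrt{2})=0\) and converges rapidly:

```python
def fixed_point(g, x0, tol=1e-10, max_iter=50):
    """Iterate x_{k+1} = g(x_k) until the relative step is small."""
    x = x0
    for k in range(max_iter):
        x_new = g(x)
        if abs(x_new - x) <= tol * (1 + abs(x_new)):
            return x_new, k + 1
        x = x_new
    raise RuntimeError("fixed-point iteration did not converge")

g1 = lambda x: 2 / x            # |g1'(sqrt(2))| = 1: iterates oscillate between x0 and 2/x0
g2 = lambda x: (x + 2 / x) / 2  # g2'(sqrt(2)) = 0: converges fast (this is the Newton map)

print(fixed_point(g2, 1.0))     # (1.414213562..., small iteration count)
# fixed_point(g1, 1.0) would raise: the sequence 1, 2, 1, 2, ... never settles
```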
## 4. Newton’s Method
From the Taylor expansion \[ f(x) \approx f(x_k) + f'(x_k)(x-x_k), \] set this linearization to zero and solve for \(x\) to obtain the next iterate: \[ x_{k+1} = x_k - \frac{f(x_k)}{f'(x_k)}. \]
- Quadratic convergence for simple roots.
- For multiple roots, convergence drops to linear.
- Requires the derivative \(f'\) and is sensitive to the initial guess.
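A minimal sketch of the update in plain Python (the guard against a vanishing derivative is our own addition):

```python
def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Newton's method: x_{k+1} = x_k - f(x_k) / f'(x_k)."""
    x = x0
    for _ in range(max_iter):
        dfx = df(x)
        if dfx == 0:
            raise ZeroDivisionError("f'(x_k) = 0: Newton step undefined")
        x_new = x - f(x) / dfx
        if abs(x_new - x) <= tol * (1 + abs(x_new)):
            return x_new
        x = x_new
    raise RuntimeError("Newton's method did not converge")

print(newton(lambda x: x**2 - 2, lambda x: 2 * x, 1.0))  # 1.4142135623730951
```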
## 5. Secant Method
Avoids the derivative by approximating the slope with a finite difference, starting from two initial points \(x_0, x_1\): \[ x_{k+1} = x_k - f(x_k) \frac{x_k - x_{k-1}}{f(x_k)-f(x_{k-1})}. \]
Convergence order: \(\approx 1.618\), the golden ratio \((1+\sqrt{5})/2\) (superlinear).
More efficient than Newton when derivative evaluations are expensive, since each step needs only one new evaluation of \(f\).
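A corresponding sketch in plain Python (the small-denominator guard is our own safeguard; it also previews the last exercise below):

```python
def secant(f, x0, x1, tol=1e-12, max_iter=50):
    """Secant method: Newton with f' replaced by a difference quotient."""
    f0, f1 = f(x0), f(x1)
    for _ in range(max_iter):
        denom = f1 - f0
        if abs(denom) < 1e-15:
            raise ZeroDivisionError("denominator f(x_k) - f(x_{k-1}) too small")
        x2 = x1 - f1 * (x1 - x0) / denom
        if abs(x2 - x1) <= tol * (1 + abs(x2)):
            return x2
        x0, f0 = x1, f1        # shift the two-point window
        x1, f1 = x2, f(x2)     # only one new f evaluation per step
    raise RuntimeError("secant method did not converge")

print(secant(lambda x: x**2 - 2, 0.0, 2.0))  # ~1.414213562373095
```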
## 6. Stopping Criteria
We stop the iteration when, for a chosen tolerance \(\varepsilon > 0\):
- \(|f(x_k)| \le \varepsilon\) (residual small), or
- \(|x_{k+1}-x_k| \le \varepsilon(1+|x_{k+1}|)\) (relative step small).
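A sketch of these two tests as a single predicate (a hypothetical helper, not part of `numethods`):

```python
def converged(f, x_prev, x_new, eps=1e-10):
    """Stop when either the residual or the relative step is below eps."""
    small_residual = abs(f(x_new)) <= eps
    small_step = abs(x_new - x_prev) <= eps * (1 + abs(x_new))
    return small_residual or small_step
```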
The following worked example applies all four `numethods` solvers to \(f(x)=x^2-2\), whose positive root is \(\sqrt{2}\approx 1.41421356\):

```python
from numethods import Bisection, FixedPoint, NewtonRoot, Secant

# Example function: f(x) = x^2 - 2, with derivative f'(x) = 2x
f = lambda x: x**2 - 2
df = lambda x: 2*x

# Bisection on [0, 2]
bisect = Bisection(f, 0, 2, tol=1e-8)
root_b = bisect.solve()
print('Bisection root:', root_b)

# Newton from x0 = 1.0
newton = NewtonRoot(f, df, 1.0, tol=1e-12)
root_n = newton.solve()
print('Newton root:', root_n)

# Secant from x0 = 0, x1 = 2
sec = Secant(f, 0, 2, tol=1e-12)
root_s = sec.solve()
print('Secant root:', root_s)

# Fixed point: rewrite x^2 = 2 as x = g(x) = 2/x
g = lambda x: 2/x  # |g'(sqrt(2))| = 1, so the iterates oscillate; not convergent
try:
    fp = FixedPoint(g, 1.0, tol=1e-8)
    root_fp = fp.solve()
    print('Fixed-point root:', root_fp)
except Exception as e:
    print('Fixed-point failed:', e)
```

Output:

```
Bisection root: 1.4142135605216026
Newton root: 1.4142135623730951
Secant root: 1.414213562373095
Fixed-point failed: Fixed-point iteration did not converge
```
## 7. Comparison of Methods
| Method | Requires derivative | Convergence rate | Guarantee? |
|---|---|---|---|
| Bisection | No | Linear | Yes (if sign change) |
| Fixed-Point | No | Linear | Not always |
| Newton | Yes | Quadratic | Locally (good guess) |
| Secant | No | ~1.618 (superlinear) | Locally (good guess) |
## 8. Exercises
- Apply all four methods to \(f(x)=\cos x - x\).
- Try Newton’s method on \(f(x)=(x-1)^2\) and compare its convergence rate with that on \(f(x)=x^2-2\).
- Modify the secant method to stop when the denominator \(f(x_k)-f(x_{k-1})\) becomes too small.