2.1 Error analysis for iterative methods#
Assume we have an iterative method that produces a sequence \(\{p_{n}\}\) converging to the root \(p\) of some function. How can we assess the rate of convergence?
Suppose \(\{p_{n}\}\) converges to \(p\). If there are constants \(C>0\) and \(\alpha\geq 1\) such that

\[
|p_{n+1}-p| \leq C\,|p_{n}-p|^{\alpha} \tag{3}
\]

for \(n\geq 1\), then we say \(\{p_{n}\}\) converges to \(p\) with order \(\alpha\).
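In practice the order can be estimated numerically: taking logarithms of \(|p_{n+1}-p| \approx C\,|p_{n}-p|^{\alpha}\) for three consecutive errors gives \(\alpha \approx \log(e_{n+1}/e_{n})/\log(e_{n}/e_{n-1})\). A minimal sketch in Python, using an illustrative quadratically convergent sequence (not one from the text):

```python
import math

def estimate_order(errors):
    """Estimate alpha from the last three consecutive errors via
    log(e[n+1]/e[n]) / log(e[n]/e[n-1])."""
    return math.log(errors[-1] / errors[-2]) / math.log(errors[-2] / errors[-3])

# Illustrative sequence: p_{n+1} = 0.5 * p_n**2 converges to 0 quadratically.
p = 1.0
errors = []
for _ in range(5):
    p = 0.5 * p**2
    errors.append(abs(p - 0.0))  # the limit is 0, so the error is |p_n|

print(estimate_order(errors))  # prints a value very close to 2
```

The same three-error formula applied to a linearly convergent sequence would return a value near 1.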
Special cases:
If \(\alpha=1\) and \(C<1\), we say the convergence is linear, and the rate of convergence is \(C\). In this case, using induction, we can show

\[
|p_{n}-p| \leq C^{n}\,|p_{0}-p|. \tag{4}
\]
There are some methods for which Equation (4) holds, but Equation (3) does not hold for any \(C<1\). We still say such methods converge linearly. An example is the bisection method.
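This behavior of bisection can be seen numerically: the bracketing interval halves at every step, so the midpoint error satisfies the geometric bound \(|p_{n}-p|\leq (b-a)/2^{n}\), yet the ratio of successive errors can exceed 1. A short sketch, using the illustrative equation \(x^{2}-2=0\) on \([1,2]\):

```python
def bisection_errors(f, a, b, root, steps=10):
    """Run bisection on [a, b] and record |midpoint - root| at each step."""
    errors = []
    for _ in range(steps):
        m = (a + b) / 2
        errors.append(abs(m - root))
        if f(a) * f(m) <= 0:
            b = m  # root lies in the left half
        else:
            a = m  # root lies in the right half
    return errors

# Root of x^2 - 2 in [1, 2] is sqrt(2).  Each error obeys the bound
# (b - a) / 2^n, but successive error ratios fluctuate and can exceed 1.
errs = bisection_errors(lambda x: x * x - 2, 1.0, 2.0, 2 ** 0.5)
ratios = [errs[i + 1] / errs[i] for i in range(len(errs) - 1)]
print(errs[:4])
print(ratios[:4])
```

Running this shows the second error is *larger* than the first (the ratio exceeds 1), even though every error stays under the halving bound, which is exactly the distinction between Equations (3) and (4).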
If \(\alpha>1\), we say the convergence is superlinear. In particular, the case \(\alpha=2\) is called quadratic convergence.
Consider the sequences defined by

\[
p_{n+1} = 0.5\,p_{n} \quad \text{and} \quad p_{n+1} = 0.5\,p_{n}^{2},
\]

both started from \(p_{0}=1\). The first sequence converges to 0 linearly (with rate \(C=0.5\)), and the second quadratically. Here are a few iterations of the sequences:
Observe how much faster quadratic convergence is compared to linear convergence.