2.2: Newton's Method
This is the fastest of the root-finding methods we consider, but it requires analytical computation of the derivative of \(f(x)\). Also, the method may fail to converge to the desired root.
We can derive Newton’s Method graphically, or by a Taylor series. We again want to construct a sequence \(x_{0}, x_{1}, x_{2}, \ldots\) that converges to the root \(x=r\). Consider the \(x_{n+1}\) member of this sequence, and expand \(f\left(x_{n+1}\right)\) in a Taylor series about the point \(x_{n}\). We have
\[f\left(x_{n+1}\right)=f\left(x_{n}\right)+\left(x_{n+1}-x_{n}\right) f^{\prime}\left(x_{n}\right)+\ldots . \nonumber \]
To determine \(x_{n+1}\), we drop the higher-order terms in the Taylor series and assume \(f\left(x_{n+1}\right)=0\). Solving for \(x_{n+1}\), we have
\[x_{n+1}=x_{n}-\frac{f\left(x_{n}\right)}{f^{\prime}\left(x_{n}\right)} \nonumber \]
Starting Newton’s Method requires a guess for \(x_{0}\), hopefully close to the root \(x=r\).
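The iteration above can be sketched in a few lines of Python. This is a minimal illustration, not part of the text: the function name `newton`, the tolerance, and the iteration cap are all assumptions made for the example.

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Iterate x_{n+1} = x_n - f(x_n)/f'(x_n) until |f(x_n)| < tol."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        x = x - fx / fprime(x)  # the Newton update derived above
    raise RuntimeError("Newton's Method did not converge")

# Example: solve f(x) = x^2 - 2 = 0, with f'(x) = 2x, starting from x0 = 1;
# the iterates converge rapidly to the root x = sqrt(2).
root = newton(lambda x: x**2 - 2, lambda x: 2 * x, x0=1.0)
print(root)
```

Note that a poor choice of \(x_{0}\), or an iterate where \(f^{\prime}\left(x_{n}\right)\) is near zero, can cause the iteration to diverge, which is why the sketch caps the number of iterations.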