
Halley's method

From Wikipedia, the free encyclopedia

In numerical analysis, Halley's method is a root-finding algorithm used for functions of one real variable with a continuous second derivative. It is named after Edmond Halley, the English mathematician and astronomer who introduced it.

The algorithm is second in the class of Householder's methods, after Newton's method. Like the latter, it iteratively produces a sequence of approximations to the root; the rate of convergence to the root is cubic. Multivariate versions of this method exist.

Halley's method exactly finds the roots of a linear-over-linear Padé approximation to the function, in contrast to Newton's method or the Secant method which approximate the function linearly, or Muller's method which approximates the function quadratically.[1]

There is also Halley's irrational method, described below.

Method


Halley's method is a numerical algorithm for solving the nonlinear equation f(x) = 0. In this case, the function f has to be a function of one real variable. The method consists of a sequence of iterations:

    x_{n+1} = x_n - \frac{2 f(x_n) f'(x_n)}{2[f'(x_n)]^2 - f(x_n) f''(x_n)}

beginning with an initial guess x_0.[2]

If f is a three times continuously differentiable function and a is a zero of f but not of its derivative, then, in a neighborhood of a, the iterates x_n satisfy:

    |x_{n+1} - a| \le K |x_n - a|^3, \quad \text{for some } K > 0.

This means that the iterates converge to the zero if the initial guess is sufficiently close, and that the convergence is cubic.[3]

The following alternative formulation shows the similarity between Halley's method and Newton's method. The ratio f(x_n)/f'(x_n) only needs to be computed once, and this form is particularly useful when the other ratio, f''(x_n)/f'(x_n), can be reduced to a simpler form:

    x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)} \left[1 - \frac{f(x_n)}{f'(x_n)} \cdot \frac{f''(x_n)}{2 f'(x_n)}\right]^{-1}

When the second derivative f''(x_n) is very close to zero, the Halley's method iteration is almost the same as the Newton's method iteration.
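The iteration translates directly into code. The following is a minimal Python sketch (the function name `halley`, the tolerance, and the iteration cap are our own choices, not from the source), applied here to cos(x) − x = 0:

```python
import math

def halley(f, df, d2f, x0, tol=1e-12, max_iter=50):
    """Find a root of f via Halley's rational method.

    f, df, d2f: the function and its first and second derivatives.
    x0: initial guess, assumed reasonably close to the root.
    """
    x = x0
    for _ in range(max_iter):
        fx, dfx, d2fx = f(x), df(x), d2f(x)
        # Rational Halley step: 2 f f' / (2 f'^2 - f f'')
        step = 2 * fx * dfx / (2 * dfx * dfx - fx * d2fx)
        x -= step
        if abs(step) < tol:
            break
    return x

# Example: the unique real solution of cos(x) = x.
root = halley(lambda x: math.cos(x) - x,
              lambda x: -math.sin(x) - 1,
              lambda x: -math.cos(x),
              x0=1.0)
```

Because of the cubic convergence, only a handful of iterations are needed from a reasonable starting guess.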

Motivation


When deriving Newton's method, a proof starts with the approximation

    0 = f(a) \approx f(x_n) + f'(x_n)(x_{n+1} - x_n)

to compute

    x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}.

Similarly for Halley's method, a proof starts with

    0 = f(a) \approx f(x_n) + f'(x_n)(x_{n+1} - x_n) + \frac{f''(x_n)}{2}(x_{n+1} - x_n)^2.

For Halley's rational method, this is rearranged to give

    x_{n+1} - x_n = \frac{-f(x_n)}{f'(x_n) + \frac{f''(x_n)}{2}(x_{n+1} - x_n)},

where x_{n+1} - x_n appears on both sides of the equation. Substituting the Newton's-method value of x_{n+1} - x_n into the right-hand side of this last formula gives the formula for Halley's method,

    x_{n+1} = x_n - \frac{2 f(x_n) f'(x_n)}{2[f'(x_n)]^2 - f(x_n) f''(x_n)}.

Also see the motivation and proofs for the more general class of Householder's methods.

Cubic convergence


Suppose a is a root of f but not of its derivative, that the third derivative of f exists and is continuous in a neighborhood of a, and that x_n is in that neighborhood. Then Taylor's theorem implies:

    0 = f(a) = f(x_n) + f'(x_n)(a - x_n) + \frac{f''(x_n)}{2}(a - x_n)^2 + \frac{f'''(\xi)}{6}(a - x_n)^3

and also

    0 = f(a) = f(x_n) + f'(x_n)(a - x_n) + \frac{f''(\eta)}{2}(a - x_n)^2,

where ξ and η are numbers lying between a and x_n. Multiply the first equation by 2f'(x_n) and subtract from it the second equation times f''(x_n)(a − x_n) to give:

    0 = 2f(x_n)f'(x_n) + 2[f'(x_n)]^2(a - x_n) + f'(x_n)f''(x_n)(a - x_n)^2 + \frac{f'(x_n)f'''(\xi)}{3}(a - x_n)^3
        - f(x_n)f''(x_n)(a - x_n) - f'(x_n)f''(x_n)(a - x_n)^2 - \frac{f''(x_n)f''(\eta)}{2}(a - x_n)^3

Canceling and re-organizing terms yields:

    0 = 2f(x_n)f'(x_n) + \left(2[f'(x_n)]^2 - f(x_n)f''(x_n)\right)(a - x_n) + \left(\frac{f'(x_n)f'''(\xi)}{3} - \frac{f''(x_n)f''(\eta)}{2}\right)(a - x_n)^3

Put the second term on the left side and divide through by

    2[f'(x_n)]^2 - f(x_n)f''(x_n)

to get:

    a - x_n = \frac{-2f(x_n)f'(x_n)}{2[f'(x_n)]^2 - f(x_n)f''(x_n)} - \frac{2f'(x_n)f'''(\xi) - 3f''(x_n)f''(\eta)}{6\left(2[f'(x_n)]^2 - f(x_n)f''(x_n)\right)}(a - x_n)^3

The first term on the right side is exactly x_{n+1} − x_n. Thus:

    a - x_{n+1} = -\frac{2f'(x_n)f'''(\xi) - 3f''(x_n)f''(\eta)}{6\left(2[f'(x_n)]^2 - f(x_n)f''(x_n)\right)}(a - x_n)^3

The limit of the coefficient on the right side as x_n → a is:

    -\frac{2f'(a)f'''(a) - 3[f''(a)]^2}{12[f'(a)]^2}

If we take K to be a little larger than the absolute value of this, we can take absolute values of both sides of the formula and replace the absolute value of the coefficient by its upper bound near a to get:

    |a - x_{n+1}| \le K|a - x_n|^3,

which holds for x_n sufficiently close to a and is what was to be proved.

To summarize,

    x_{n+1} - a = \frac{3[f''(a)]^2 - 2f'(a)f'''(a)}{12[f'(a)]^2}(x_n - a)^3 + O\left((x_n - a)^4\right). [4]

Relation to Newton's method


Halley's rational method applied to the real-valued function f(x) is the same as applying Newton's method to find the zeros of the function

    g(x) = \frac{k f(x)}{\sqrt{|f'(x)|}},

where k is any non-zero constant. Applying Newton's method to g(x) will have cubic convergence (or better), whereas Newton's method applied directly to f(x) will usually have quadratic convergence.

For example, using Halley's method to find a zero of f(x) = y − e^x, which is useful for efficiently and precisely estimating x = ln(y), is the same as using Newton's method to find a zero of g(x) = y e^{-x/2} - e^{x/2}.
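This logarithm application can be sketched as follows (the helper name `ln_halley` and the iteration count are our own choices). For f(x) = y − e^x one has f'(x) = f''(x) = −e^x, so the rational Halley update collapses to the simple expression x + 2(y − e^x)/(y + e^x):

```python
import math

def ln_halley(y, x0=1.0, iters=6):
    """Estimate ln(y) for y > 0 by Halley's method on f(x) = y - exp(x).

    Since f' = f'' = -exp(x), the Halley update simplifies to
    x + 2*(y - exp(x)) / (y + exp(x)).
    """
    x = x0
    for _ in range(iters):
        e = math.exp(x)
        x += 2 * (y - e) / (y + e)
    return x
```

Each step needs only one exponential evaluation, and cubic convergence makes a fixed small iteration count sufficient for double precision once the guess is in range.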

Example


Use of Halley's method to compute square roots


Just as Newton's method can be used to compute square roots, Halley's method can be specialized for the same purpose with cubic convergence. To find the positive square root of a given number S, consider the function

    f(x) = x^2 - S, \quad f'(x) = 2x, \quad f''(x) = 2.

Substituting these into the general form of Halley's iteration,

    x_{n+1} = x_n - \frac{2 f(x_n) f'(x_n)}{2[f'(x_n)]^2 - f(x_n) f''(x_n)},

gives

    x_{n+1} = x_n \cdot \frac{x_n^2 + 3S}{3x_n^2 + S}.

This is known as the rational form of Halley's method for square roots. It requires only arithmetic operations, no square roots, and converges cubically to \sqrt{S} for any positive initial guess x_0.

Halley's irrational method


Halley actually developed two third-order root-finding methods. The above, using only a division, is referred to as Halley's rational method. A second, "irrational" method uses a square root as well.[5][6] It starts with

    0 = f(x_n) + f'(x_n)(x_{n+1} - x_n) + \frac{f''(x_n)}{2}(x_{n+1} - x_n)^2

and solves for x_{n+1} using the form of the quadratic formula that has the radical in the denominator,

    x_{n+1} = x_n - \frac{2 f(x_n)}{f'(x_n) \pm \sqrt{[f'(x_n)]^2 - 2 f(x_n) f''(x_n)}}.

The sign of the radical is chosen so that x_{n+1} is the root nearer to x_n.

This iteration was "deservedly preferred" to the rational method by Halley[6] on the grounds that it tends to have about half of the error of the rational method, a benefit which multiplies as it is iterated.

This formulation reduces to Halley's rational method under the approximation \sqrt{1 - \epsilon} \approx 1 - \epsilon/2. Muller's method can be considered a modification of this method. Because the iteration involves a square root, the irrational method can also be used to find complex roots.
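A Python sketch of the irrational iteration (the function name `halley_irrational`, the tolerance, and the sign-selection heuristic are our own choices); using `cmath` lets the square root go complex, so the method can converge to complex roots:

```python
import cmath

def halley_irrational(f, df, d2f, x0, tol=1e-12, max_iter=50):
    """Halley's irrational method; with cmath the iterates may be complex."""
    x = x0
    for _ in range(max_iter):
        fx, dfx, d2fx = f(x), df(x), d2f(x)
        rad = cmath.sqrt(dfx * dfx - 2 * fx * d2fx)
        # Pick the sign that makes the denominator larger in magnitude,
        # so x_{n+1} is the root of the quadratic model nearer to x_n.
        denom = dfx + rad if abs(dfx + rad) >= abs(dfx - rad) else dfx - rad
        step = 2 * fx / denom
        x -= step
        if abs(step) < tol:
            break
    return x

# x^2 + 1 has no real roots; a complex starting guess converges to i.
root = halley_irrational(lambda x: x * x + 1,
                         lambda x: 2 * x,
                         lambda x: 2.0,
                         x0=0.5 + 0.5j)
```

For this quadratic the second-order model is exact, so a single step already lands on the root; for general f the convergence is cubic.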

Multiple variables


Halley's method has been adapted to multiple variables,[7] where the goal is to find a root of f(x) = 0 for some function f: R^n → R^n with positive integer n. In this case, Newton's method uses the constant and linear terms of the multivariate Taylor series and solves the equation

    f(x_n) + J(x_n)(x_{n+1} - x_n) = 0

with Jacobian matrix J(x_n) to give

    x_{n+1} = x_n - J(x_n)^{-1} f(x_n)

using matrix inversion. Following the motivation for the one-variable case, Halley's method corrects the Newton step a_n = -J(x_n)^{-1} f(x_n) by the second-order term b_n = J(x_n)^{-1} D^2f(x_n)(a_n, a_n), where D^2f is the array of second partial derivatives, and gives

    (x_{n+1})_k = (x_n)_k + \frac{(a_n)_k^2}{(a_n)_k + \frac{1}{2}(b_n)_k},

where the subscript k indicates the k-th coordinate of the associated vector.
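As a small illustration, the following Python sketch applies this two-solve, componentwise update to a hypothetical 2-variable system (the example system, the helper names `solve2` and `halley_step`, and the iteration count are our own choices, not from the source):

```python
def solve2(J, r):
    """Solve the 2x2 linear system J v = r by Cramer's rule."""
    det = J[0][0] * J[1][1] - J[0][1] * J[1][0]
    return ((r[0] * J[1][1] - r[1] * J[0][1]) / det,
            (J[0][0] * r[1] - J[1][0] * r[0]) / det)

def halley_step(x, y):
    """One multivariate Halley step for the example system
    f1 = x^2 + y^2 - 4,  f2 = x*y - 1."""
    f = (x * x + y * y - 4.0, x * y - 1.0)
    J = ((2 * x, 2 * y), (y, x))
    a = solve2(J, (-f[0], -f[1]))         # Newton correction: J a = -f
    h = (2 * a[0] ** 2 + 2 * a[1] ** 2,   # D^2 f1 applied to (a, a)
         2 * a[0] * a[1])                 # D^2 f2 applied to (a, a)
    b = solve2(J, h)                      # second-order term: J b = D^2 f (a, a)

    def upd(xi, ai, bi):
        d = ai + bi / 2.0
        # Guard: once converged, the correction is zero.
        return xi if d == 0.0 else xi + ai * ai / d

    return upd(x, a[0], b[0]), upd(y, a[1], b[1])

x, y = 2.0, 0.5
for _ in range(5):
    x, y = halley_step(x, y)
```

Each step costs two linear solves with the same Jacobian, so a single factorization can be reused for both.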

References

  1. ^ Boyd, John P. (2013). "Finding the Zeros of a Univariate Equation: Proxy Rootfinders, Chebyshev Interpolation, and the Companion Matrix". SIAM Review. 55 (2): 375–396. doi:10.1137/110838297.
  2. ^ Scavo, T.R.; Thoo, J.B. (1995). "On the geometry of Halley's method". American Mathematical Monthly. 102 (5): 417–426. doi:10.2307/2975033. JSTOR 2975033.
  3. ^ Alefeld, G. (1981). "On the convergence of Halley's method". American Mathematical Monthly. 88 (7): 530–536. doi:10.2307/2321760. JSTOR 2321760.
  4. ^ Proinov, Petko D.; Ivanov, Stoil I. (2015). "On the convergence of Halley's method for simultaneous computation of polynomial zeros". J. Numer. Math. 23 (4): 379–394. doi:10.1515/jnma-2015-0026. S2CID 10356202.
  5. ^ Bateman, Harry (January 1938). "Halley's methods for solving equations". The American Mathematical Monthly. 45 (1): 11–17. doi:10.2307/2303467. JSTOR 2303467.
  6. ^ a b Halley, Edmond (May 1694). "Methodus nova accurata & facilis inveniendi radices æquationum quarumcumque generaliter, sine prævia reductione". Philosophical Transactions of the Royal Society (in Latin). 18 (210): 136–148. doi:10.1098/rstl.1694.0029. An English translation was published as Halley, Edmond (1809) [May 1694]. "A new, exact, and easy Method of finding the Roots of any Equations generally, and that without any previous Reduction". In C. Hutton; G. Shaw; R. Pearson (eds.). The Philosophical Transactions of the Royal Society of London, from their commencement, in 1665, to the year 1800. Vol. III from 1683 to 1694. pp. 640–649.
  7. ^ Cuyt, Annie A. M.; Rall, Louis B. (1985). "Computational implementation of the multivariate Halley method for solving nonlinear systems of equations". ACM Transactions on Mathematical Software. 11 (1): 20–36.