Newton's method
In the mathematical field of numerical analysis, Newton's method (also known as the Newton–Raphson method), named after Isaac Newton and Joseph Raphson, is a method for finding successively better approximations to the roots (or zeroes) of a real-valued function.
Description
The idea of the method is as follows:
- One starts with an initial guess which is reasonably close to the true root.
- The function is then approximated by its tangent line at that guess (which can be computed using the tools of calculus), and one computes the x-intercept of this tangent line (which is easily done with elementary algebra).
- This x-intercept will typically be a better approximation to the function's root than the original guess, and the method can be iterated.
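Setting the tangent line at the current guess x_n to zero, y = f(x_n) + f'(x_n)(x − x_n) = 0, gives the standard update rule x_{n+1} = x_n − f(x_n) / f'(x_n). The sketch below is a minimal illustration of this iteration in Python, assuming the function and its derivative are supplied as callables; the names newton, f, df, x0, tol, and max_iter are illustrative and not taken from the article.

```python
def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Approximate a root of f using Newton's method.

    f        -- real-valued function of one real variable
    df       -- derivative of f
    x0       -- initial guess, assumed reasonably close to the true root
    tol      -- stop when |f(x)| falls below this tolerance
    max_iter -- safety limit on the number of iterations
    """
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        dfx = df(x)
        if dfx == 0:
            raise ZeroDivisionError("Derivative is zero; Newton step is undefined")
        # x-intercept of the tangent line at x: x - f(x)/f'(x)
        x = x - fx / dfx
    raise RuntimeError("Did not converge within max_iter iterations")


# Example: the positive root of f(x) = x**2 - 2 is sqrt(2) ≈ 1.4142135623...
root = newton(lambda x: x**2 - 2, lambda x: 2 * x, x0=1.0)
print(root)
```

Starting from x0 = 1.0, the iterates converge to sqrt(2) in a handful of steps, reflecting the method's typically rapid (quadratic) convergence near a simple root when the initial guess is close enough.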
See also
- Aitken's delta-squared process
- Approximation
- Bisection method
- Euler method
- Fast inverse square root
- Fisher scoring
- Function (mathematics)
- Gradient descent
- Integer square root
- Iterative method
- Laguerre's method
- Leonid Kantorovich, who initiated the convergence analysis of Newton's method in Banach spaces.
- Methods of computing square roots
- Newton, Isaac
- Newton's method in optimization
- Numerical analysis
- Raphson, Joseph
- Real number
- Richardson extrapolation
- Root of a function
- Root-finding algorithm
- Secant method
- Steffensen's method
- Subgradient method
External links
- Newton's method @ Wikipedia