Numerical analysis is the branch of applied mathematics that studies methods and algorithms for finding (approximate) numerical solutions to mathematical problems, using a finite sequence of arithmetic and logical operations. Most solutions of numerical problems build on the theory of linear algebra.
Areas of study
The problems considered by numerical analysis include evaluating and approximating functions, finding the zeros of functions, solving systems of linear equations, computing integrals, optimization, and solving ordinary and partial differential equations.
When approximating solutions to numerical problems, three factors must be weighed: the accuracy of the approximation, the stability of the method, and the computational cost of obtaining the result.
While numerical analysis relies on mathematical axioms, theorems and proofs for its theory, it also uses empirical results of computation runs to probe new methods and to analyze problems. It thus has a character distinct from that of other mathematical sciences.
Computers as Tools for Numerical Analysis
Computers are an essential tool in numerical analysis, but the field predates computers by many centuries; indeed, computers were to a large extent invented in order to solve numerical problems, not the other way around. Taylor approximation, a product of the seventeenth and eighteenth centuries, is still very important. The logarithms of the early seventeenth century are no longer vital to numerical analysis, but the associated, and far more ancient, notion of interpolation continues to solve problems for us.
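As a small illustration of the Taylor-approximation idea mentioned above, here is a minimal sketch in plain Python; the choice of exp and the cutoff of ten terms are purely illustrative.

```python
import math

def exp_taylor(x, n=10):
    """Approximate exp(x) by the first n terms of its Taylor series about 0."""
    term = 1.0   # current term x**k / k!, starting with k = 0
    total = 1.0
    for k in range(1, n):
        term *= x / k
        total += term
    return total

x = 0.5
approx = exp_taylor(x)
print(approx, math.exp(x), abs(approx - math.exp(x)))  # error on the order of 1e-10
```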
Floating-point number representations are used extensively in modern computers. For many problems their behaviour can be unexpected, unless numerical analysis is used to ensure that round-off error does not spoil the result.
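A minimal sketch in plain Python of the kind of surprise meant here; the particular constants are chosen only to make the effect visible.

```python
# Binary floating point cannot represent 0.1 exactly, so identities that hold
# over the real numbers need not hold in floating-point arithmetic.
print(0.1 + 0.2 == 0.3)      # False
print(0.1 + 0.2 - 0.3)       # a small nonzero residual, about 5.6e-17

# Summation order matters too: near 1e16 the gap between adjacent doubles is 2.0,
# so adding 1.0 at a time to 1e16 changes nothing, while adding the small terms
# together first preserves them.
big, small, n = 1e16, 1.0, 10
running = big
for _ in range(n):
    running += small
print(running - big)                 # 0.0
print(sum([small] * n) + big - big)  # 10.0
```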
Problem Taxonomy
A well-conditioned mathematical problem is, roughly speaking, one whose solution changes by only a small amount if the problem data are changed by a small amount. The analogous concept for a numerical algorithm is numerical stability: an algorithm for solving a well-conditioned problem is numerically stable if its result changes only by a small amount when the data change a little.
An algorithm that solves a well-conditioned problem may or may not be numerically stable. An art of numerical analysis is to find a stable algorithm for solving a mathematical problem.
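As a concrete sketch of conditioning, the following Python/NumPy fragment perturbs the data of a deliberately near-singular 2x2 system (the matrix is an arbitrary illustrative choice) and observes the effect on the solution.

```python
import numpy as np

# A nearly singular 2x2 system: the problem "solve Ax = b" is ill-conditioned,
# so a tiny change in the data b produces a large change in the solution x.
A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])
b = np.array([2.0, 2.0001])
x = np.linalg.solve(A, b)                    # solution is [1, 1]

b_perturbed = b + np.array([0.0, 0.0001])    # perturb b by about one part in 20000
x_perturbed = np.linalg.solve(A, b_perturbed)

print(x)                      # approximately [1, 1]
print(x_perturbed)            # approximately [0, 2]: a change of order 1
print(np.linalg.cond(A))      # condition number of A, roughly 4e4
```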
The study of the generation and propagation of round-off errors in the course of a computation is an important part of numerical analysis. Subtraction of two nearly equal numbers is an ill-conditioned operation, producing catastrophic loss of significance.
The effect of round-off error is partly quantified in the condition number of an operator.
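A minimal sketch in plain Python of such a loss of significance (the function 1 - cos(x) is just a convenient example), together with an algebraically equivalent but stable rearrangement.

```python
import math

x = 1e-8

# cos(x) is so close to 1 that the subtraction cancels every significant digit.
naive = 1.0 - math.cos(x)

# The algebraically equivalent form 2*sin(x/2)**2 avoids the subtraction.
stable = 2.0 * math.sin(x / 2.0) ** 2

print(naive)    # 0.0 -- all information lost
print(stable)   # about 5e-17, close to the true value x**2 / 2
```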
Finding Zeros
One fundamental problem is the determination of zeros of a given function. Various algorithms have been developed. If the function is differentiable and the derivative is known, then Newton's method is a popular choice.
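A minimal sketch of Newton's method in Python; the function, derivative, starting point and tolerance below are all illustrative choices.

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Find a zero of f by Newton's method, starting from x0."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        x -= fx / fprime(x)    # follow the tangent line down to its zero
    raise RuntimeError("Newton's method did not converge")

# Example: the positive zero of f(x) = x**2 - 2 is sqrt(2).
print(newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.0))  # 1.4142135623730951
```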
Partial Differential Equations
Numerical analysis is also concerned with computing (in an approximate way) the solution of partial differential equations. This is done by first discretizing the equation, bringing it into a finite-dimensional subspace, and then solving the resulting linear system in this finite-dimensional space. The first stage is carried out by the finite element method, finite difference methods, or (particularly in engineering) finite volume methods. The theoretical justification of these methods often involves theorems from functional analysis.
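As a minimal illustration of the discretization step, the sketch below (Python/NumPy) uses the one-dimensional model problem -u'' = f on (0, 1) with zero boundary values as a stand-in for a genuine PDE, and builds the standard second-order finite difference system.

```python
import numpy as np

def discretize_poisson_1d(f, n):
    """Discretize -u'' = f on (0, 1) with u(0) = u(1) = 0 on n interior points.

    Uses the second-order stencil (-u[i-1] + 2*u[i] - u[i+1]) / h**2 and returns
    the n-by-n system matrix A, the right-hand side b, and the grid points x.
    """
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)
    A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
    return A, f(x), x

# Model problem with known solution u(x) = sin(pi*x), i.e. f(x) = pi**2 * sin(pi*x).
A, b, x = discretize_poisson_1d(lambda x: np.pi**2 * np.sin(np.pi * x), n=50)
u = np.linalg.solve(A, b)                       # solve the finite-dimensional system
print(np.max(np.abs(u - np.sin(np.pi * x))))    # discretization error, of order h**2
```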
The linear systems that come from discretized partial differential equations can then be solved by a variant of Gauss-Jordan elimination, by an iterative method such as the conjugate gradient method, or by a multigrid method.
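The conjugate gradient idea, for the symmetric positive definite systems such discretizations typically produce, can be sketched in a few lines of Python/NumPy; this is a bare-bones textbook version, not a production solver, and the test system below is the same tridiagonal model matrix as above.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Bare-bones conjugate gradient iteration for symmetric positive definite A."""
    x = np.zeros_like(b)
    r = b - A @ x              # residual
    p = r.copy()               # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)  # optimal step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p  # next direction, A-conjugate to the previous ones
        rs = rs_new
    return x

# The same tridiagonal model matrix as in the discretization sketch above.
n = 50
h = 1.0 / (n + 1)
A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
b = np.ones(n)
x = conjugate_gradient(A, b)
print(np.max(np.abs(A @ x - b)))   # residual of the computed solution, near 1e-10
```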
For very large problems, the partial differential equation can be split into smaller subproblems and solved in parallel, as in domain decomposition methods.
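A toy sketch of the overlapping (alternating Schwarz) flavour of domain decomposition on the same one-dimensional model problem: the interval is split into two overlapping pieces, each local problem is solved with boundary values taken from the current iterate, and the sweep is repeated. The sizes, overlap width and sweep count are arbitrary choices; in parallel implementations an additive variant lets the subdomain solves proceed concurrently, while here they run in sequence purely for illustration.

```python
import numpy as np

def local_solve(u, f, lo, hi, h):
    """Solve -u'' = f on grid points lo..hi, using u[lo-1] and u[hi+1] as boundary data."""
    m = hi - lo + 1
    A = (2.0 * np.eye(m) - np.eye(m, k=1) - np.eye(m, k=-1)) / h**2
    b = f[lo:hi + 1].copy()
    b[0] += u[lo - 1] / h**2        # fold the known boundary values into the right-hand side
    b[-1] += u[hi + 1] / h**2
    u[lo:hi + 1] = np.linalg.solve(A, b)

n = 49                               # interior grid points of the global problem
h = 1.0 / (n + 1)
x = np.linspace(0.0, 1.0, n + 2)     # grid including the two boundary points
f = np.ones(n + 2)                   # -u'' = 1, whose exact solution is u = x*(1 - x)/2
u = np.zeros(n + 2)                  # current iterate; the true boundary values stay 0

# Two overlapping subdomains: interior points 1..30 and 20..49.
for _ in range(30):                  # a fixed number of Schwarz sweeps, for simplicity
    local_solve(u, f, 1, 30, h)
    local_solve(u, f, 20, 49, h)

print(np.max(np.abs(u - x * (1.0 - x) / 2.0)))   # tiny: the iterates converge to the solution
```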