Order of accuracy


In numerical analysis, order of accuracy quantifies the rate of convergence of a numerical approximation of a differential equation to the exact solution.
Consider $u$, the exact solution to a differential equation in an appropriate normed space $(V, \|\cdot\|)$. Consider a numerical approximation $u_h$, where $h$ is a parameter characterizing the approximation, such as the step size in a finite difference scheme or the diameter of the cells in a finite element method.

The numerical solution $u_h$ is said to be $n$th-order accurate if the error $E(h) := \|u - u_h\|$ is proportional to the step size $h$ to the $n$th power:

$$E(h) = \|u - u_h\| \le C h^{n},$$

where the constant $C$ is independent of $h$ and usually depends on the solution $u$. Using the big O notation, an $n$th-order accurate numerical method is notated as

$$\|u - u_h\| = O(h^{n}).$$
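In practice, the observed order can be estimated by comparing errors at two step sizes: if $E(h) \approx C h^{n}$, then $E(h)/E(h/2) \approx 2^{n}$, so $n \approx \log_2\!\big(E(h)/E(h/2)\big)$. A minimal Python sketch of this estimate, assuming the centred difference $(f(x+h) - f(x-h))/(2h)$ as the method under test (a second-order accurate approximation to $f'(x)$):

```python
import numpy as np

def central_diff(f, x, h):
    """Centred finite-difference approximation to f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

# Test problem: f(x) = sin(x), whose exact derivative at x0 is cos(x0).
x0 = 1.0
exact = np.cos(x0)

# Errors at successively halved step sizes.
hs = [0.1 / 2**k for k in range(5)]
errors = [abs(central_diff(np.sin, x0, h) - exact) for h in hs]

# Observed order: n ~ log2(E(h) / E(h/2)); the printed values should
# approach 2, since the centred difference is second-order accurate.
for e_coarse, e_fine in zip(errors, errors[1:]):
    print(np.log2(e_coarse / e_fine))
```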
This definition is strictly dependent on the norm used in the space; the choice of such a norm is fundamental to estimating the rate of convergence and, in general, all numerical errors correctly.
The size of the error of a first-order accurate approximation is directly proportional to $h$.
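For instance, a sketch of this behaviour with the forward Euler method, which is first-order accurate, assuming the model problem $y' = -y$, $y(0) = 1$, with exact solution $e^{-t}$; halving $h$ should roughly halve the error:

```python
import math

def euler_error(h, t_end=1.0):
    """Integrate y' = -y, y(0) = 1 with forward Euler and return the
    error against the exact solution exp(-t_end)."""
    y = 1.0
    for _ in range(round(t_end / h)):
        y += h * (-y)          # forward Euler update
    return abs(y - math.exp(-t_end))

# Halving the step size roughly halves the error (first order).
for h in (0.1, 0.05, 0.025, 0.0125):
    print(f"h = {h:<7} error = {euler_error(h):.2e}")
```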
Partial differential equations which vary over both time and space are said to be accurate to order $n$ in time and to order $m$ in space.
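As a sketch of measuring the two orders separately, assume the heat equation $u_t = u_{xx}$ on $[0, \pi]$ with $u(x, 0) = \sin x$, zero boundary values, and exact solution $u(x, t) = e^{-t}\sin x$, discretized with backward Euler in time and centred differences in space (first-order in time, second-order in space). Refining $\Delta t$ with $\Delta x$ held fine should halve the error, while refining $\Delta x$ with $\Delta t$ held fine should quarter it:

```python
import numpy as np

def heat_error(dx, dt, t_end=0.5):
    """Solve u_t = u_xx on [0, pi] with u(x, 0) = sin(x) and zero
    boundary values, using backward Euler in time and centred
    differences in space.  Returns the max-norm error against the
    exact solution exp(-t) * sin(x)."""
    x = np.arange(0.0, np.pi + dx / 2, dx)[1:-1]   # interior grid points
    u = np.sin(x)

    # Backward Euler step: (I - dt * A) u_new = u_old, where A is the
    # standard tridiagonal second-difference matrix.
    m = len(x)
    A = (np.diag(-2.0 * np.ones(m)) + np.diag(np.ones(m - 1), 1)
         + np.diag(np.ones(m - 1), -1)) / dx**2
    lhs = np.eye(m) - dt * A

    for _ in range(round(t_end / dt)):
        u = np.linalg.solve(lhs, u)

    return np.max(np.abs(u - np.exp(-t_end) * np.sin(x)))

# First order in time: halving dt roughly halves the error.
for dt in (0.1, 0.05, 0.025):
    print("dt =", dt, " error =", heat_error(dx=np.pi / 400, dt=dt))

# Second order in space: halving dx roughly quarters the error.
for n in (10, 20, 40):
    print("dx = pi /", n, " error =", heat_error(dx=np.pi / n, dt=1e-4))
```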