Power series
In mathematics, a power series is an infinite series of the form

$$f(x) = \sum_{n=0}^{\infty} a_n (x - c)^n = a_0 + a_1 (x - c) + a_2 (x - c)^2 + \cdots,$$

where a_n represents the coefficient of the nth term and c is a constant. The coefficient a_n is independent of x and may be expressed as a function of n. Power series are useful in analysis since they arise as Taylor series of infinitely differentiable functions. In fact, Borel's theorem implies that every power series is the Taylor series of some smooth function.
In many situations c is equal to zero, for instance when considering a Maclaurin series. In such cases, the power series takes the simpler form

$$f(x) = \sum_{n=0}^{\infty} a_n x^n = a_0 + a_1 x + a_2 x^2 + \cdots.$$
These power series arise primarily in analysis, but also occur in combinatorics as generating functions and in electrical engineering. The familiar decimal notation for real numbers can also be viewed as an example of a power series, with integer coefficients, but with the argument x fixed at 1/10. In number theory, the concept of p-adic numbers is also closely related to that of a power series.
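As a rough illustration of this viewpoint, the following Python sketch evaluates a decimal expansion as a power series in x = 1/10; the function name decimal_as_power_series and the digit list are only illustrative choices, not standard notation.

```python
# A decimal expansion read as a power series evaluated at x = 1/10:
# the digits are integer coefficients a_n in sum(a_n * x**n).

def decimal_as_power_series(digits):
    """Evaluate sum(a_n * (1/10)**n) for a list of digit coefficients a_n."""
    x = 1 / 10
    return sum(a_n * x**n for n, a_n in enumerate(digits))

# 3.1415 corresponds to the coefficients 3, 1, 4, 1, 5, 0, 0, ...
print(decimal_as_power_series([3, 1, 4, 1, 5]))  # 3.1415
```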
Examples
Any polynomial can be easily expressed as a power series around any center c, although all but finitely many of the coefficients will be zero since a power series has infinitely many terms by definition. For instance, the polynomial f(x) = x^2 + 2x + 3 can be written as a power series around the center c = 0 as

$$f(x) = 3 + 2x + 1x^2 + 0x^3 + 0x^4 + \cdots$$

or around the center c = 1 as

$$f(x) = 6 + 4(x - 1) + 1(x - 1)^2 + 0(x - 1)^3 + 0(x - 1)^4 + \cdots$$
or indeed around any other center c. One can view power series as being like "polynomials of infinite degree," although power series are not polynomials.
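To make the re-centering concrete, the sketch below recomputes the coefficients of a polynomial around a chosen center using a_k = f^(k)(c)/k!. It assumes the SymPy library is available, and the particular polynomial and center mirror the example above.

```python
# Re-centering a polynomial as a power series around a chosen center c,
# using the Taylor coefficients a_k = f^(k)(c) / k!  (assumes SymPy).
import sympy as sp

x = sp.symbols('x')
p = x**2 + 2*x + 3           # the example polynomial
c = 1                        # the chosen center

coeffs = [sp.diff(p, x, k).subs(x, c) / sp.factorial(k)
          for k in range(sp.degree(p, x) + 1)]
print(coeffs)                # [6, 4, 1], i.e. 6 + 4*(x - 1) + (x - 1)**2
```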
The geometric series formula

$$\frac{1}{1 - x} = \sum_{n=0}^{\infty} x^n = 1 + x + x^2 + x^3 + \cdots,$$

which is valid for |x| < 1, is one of the most important examples of a power series, as are the exponential function formula

$$e^x = \sum_{n=0}^{\infty} \frac{x^n}{n!} = 1 + x + \frac{x^2}{2!} + \frac{x^3}{3!} + \cdots$$

and the sine formula

$$\sin(x) = \sum_{n=0}^{\infty} \frac{(-1)^n x^{2n+1}}{(2n+1)!} = x - \frac{x^3}{3!} + \frac{x^5}{5!} - \frac{x^7}{7!} + \cdots,$$

valid for all real x.
These power series are also examples of Taylor series.
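A brief numerical check of these expansions follows; it compares truncated partial sums with the corresponding closed-form values at a point where the series are known to converge. The helper partial_sum and the cutoff of 30 terms are arbitrary choices.

```python
# Comparing truncated power series with their closed-form sums
# at a point inside the region of validity.
import math

def partial_sum(coeff, x, terms=30):
    """Sum the first `terms` terms of sum(coeff(n) * x**n)."""
    return sum(coeff(n) * x**n for n in range(terms))

x = 0.5
print(partial_sum(lambda n: 1.0, x), 1 / (1 - x))                    # geometric series, |x| < 1
print(partial_sum(lambda n: 1 / math.factorial(n), x), math.exp(x))  # exponential series
print(partial_sum(lambda n: 0 if n % 2 == 0
                  else (-1)**((n - 1) // 2) / math.factorial(n), x),
      math.sin(x))                                                   # sine series
```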
On the set of exponents
Negative powers are not permitted in a power series; for instance, 1 + x^{-1} + x^{-2} + ⋯ is not considered a power series. Similarly, fractional powers such as x^{1/2} are not permitted. The coefficients are not allowed to depend on x, thus for instance

$$\sin(x)\, x + \sin(2x)\, x^2 + \sin(3x)\, x^3 + \cdots$$

is not a power series.
Radius of convergence
A power series will converge for some values of the variable x and may diverge for others. All power series f in powers of (x − c) will converge at x = c. If c is not the only point at which the series converges, then there is always a number r with 0 < r ≤ ∞ such that the series converges whenever |x − c| < r and diverges whenever |x − c| > r. The number r is called the radius of convergence of the power series; in general it is given as

$$r = \liminf_{n \to \infty} |a_n|^{-\frac{1}{n}}$$

or, equivalently,

$$r^{-1} = \limsup_{n \to \infty} |a_n|^{\frac{1}{n}}.$$

The relation

$$r^{-1} = \lim_{n \to \infty} \left| \frac{a_{n+1}}{a_n} \right|$$

is also satisfied, if this limit exists.
The series converges absolutely for |x − c| < r and converges uniformly on every compact subset of {x : |x − c| < r}. That is, the series is absolutely and compactly convergent on the interior of the disc of convergence.
For |x − c| = r, we cannot make any general statement on whether the series converges or diverges. However, for the case of real variables, Abel's theorem states that the sum of the series is continuous at x if the series converges at x. In the case of complex variables, we can only claim continuity along the line segment starting at c and ending at x.
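The ratio form of the radius can be checked numerically, as in the sketch below; the helper name ratio_estimate and the index 50 are illustrative, and the estimate is only meaningful when the limit of |a_n / a_{n+1}| actually exists.

```python
# Estimating the radius of convergence from the ratio |a_n / a_{n+1}|
# at a large index n, when that limit exists.
import math

def ratio_estimate(a, n):
    """Approximate r by |a(n) / a(n + 1)| for a single large index n."""
    return abs(a(n) / a(n + 1))

print(ratio_estimate(lambda n: 1.0, 50))                    # geometric series: r = 1
print(ratio_estimate(lambda n: 1 / math.factorial(n), 50))  # exponential series: estimate grows, r = infinity
print(ratio_estimate(lambda n: 3.0**n, 50))                 # series of (3x)^n: r = 1/3
```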
Operations on power series
Addition and subtraction
When two functions f and g are decomposed into power series around the same center c, the power series of the sum or difference of the functions can be obtained by termwise addition and subtraction. That is, if

$$f(x) = \sum_{n=0}^{\infty} a_n (x - c)^n \quad \text{and} \quad g(x) = \sum_{n=0}^{\infty} b_n (x - c)^n,$$

then

$$f(x) \pm g(x) = \sum_{n=0}^{\infty} (a_n \pm b_n)(x - c)^n.$$
It is not true that if two power series ∑ a_n x^n and ∑ b_n x^n have the same radius of convergence, then ∑ (a_n + b_n) x^n also has this radius of convergence. If a_n = (−1)^n and b_n = (−1)^{n+1}(1 − 1/3^n), then both series have the same radius of convergence of 1, but the series ∑ (a_n + b_n) x^n = ∑ (−1)^n x^n / 3^n has a radius of convergence of 3.
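A numerical check of this counterexample, using the root test |a_n|^{1/n} → 1/r, is sketched below; the functions a and b simply mirror the coefficient sequences given above.

```python
# Root-test check of the counterexample: a_n = (-1)^n and
# b_n = (-1)^(n+1) * (1 - 3**(-n)) each give radius 1, but their
# termwise sum a_n + b_n = (-1)^n / 3**n gives radius 3.

def a(n): return (-1)**n
def b(n): return (-1)**(n + 1) * (1 - 3.0**(-n))

n = 40
print(abs(a(n))**(1/n), abs(b(n))**(1/n))   # both near 1   => radius 1
print(abs(a(n) + b(n))**(1/n))              # near 1/3      => radius 3
```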
Multiplication and division
With the same definitions for f(x) and g(x), the power series of the product and quotient of the functions can be obtained as follows:

$$f(x) g(x) = \left( \sum_{n=0}^{\infty} a_n (x - c)^n \right) \left( \sum_{n=0}^{\infty} b_n (x - c)^n \right) = \sum_{n=0}^{\infty} \left( \sum_{i=0}^{n} a_i b_{n-i} \right) (x - c)^n.$$

The sequence m_n = ∑_{i=0}^{n} a_i b_{n−i} is known as the convolution of the sequences a_n and b_n.
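In coefficient form the product is a finite convolution at each order, as the sketch below shows for truncated series; the function name cauchy_product is an arbitrary label.

```python
# Cauchy product of two power series: the nth coefficient of f*g is the
# convolution m_n = sum_{i=0}^{n} a_i * b_{n-i} of the coefficient lists.

def cauchy_product(a, b):
    """Convolve two coefficient lists, truncated to the shorter length."""
    n = min(len(a), len(b))
    return [sum(a[i] * b[k - i] for i in range(k + 1)) for k in range(n)]

# 1/(1 - x) has coefficients [1, 1, 1, ...]; squaring it gives 1/(1 - x)^2,
# whose nth coefficient is n + 1.
geom = [1] * 6
print(cauchy_product(geom, geom))  # [1, 2, 3, 4, 5, 6]
```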
For division, if one defines the sequence d_n by

$$\frac{f(x)}{g(x)} = \frac{\sum_{n=0}^{\infty} a_n (x - c)^n}{\sum_{n=0}^{\infty} b_n (x - c)^n} = \sum_{n=0}^{\infty} d_n (x - c)^n,$$

then

$$f(x) = \left( \sum_{n=0}^{\infty} b_n (x - c)^n \right) \left( \sum_{n=0}^{\infty} d_n (x - c)^n \right)$$

and one can solve recursively for the terms d_n by comparing coefficients.
Solving the corresponding equations yields the formulae based on determinants of certain matrices of the coefficients of f(x) and g(x):

$$d_0 = \frac{a_0}{b_0},$$

$$d_n = \frac{1}{b_0^{n+1}} \begin{vmatrix} a_n & b_1 & b_2 & \cdots & b_n \\ a_{n-1} & b_0 & b_1 & \cdots & b_{n-1} \\ a_{n-2} & 0 & b_0 & \cdots & b_{n-2} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ a_0 & 0 & 0 & \cdots & b_0 \end{vmatrix}.$$
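The recursion can also be carried out directly on truncated coefficient lists, as in the sketch below; the function name series_divide is an arbitrary label, and b_0 ≠ 0 is assumed.

```python
# Dividing power series by comparing coefficients: from
# a_n = sum_{k=0}^{n} b_k * d_{n-k} one solves recursively for d_n,
# provided b_0 != 0.

def series_divide(a, b):
    """Coefficients d of f/g, truncated to len(a); assumes b[0] != 0."""
    d = []
    for n in range(len(a)):
        s = sum(b[k] * d[n - k] for k in range(1, min(n, len(b) - 1) + 1))
        d.append((a[n] - s) / b[0])
    return d

# 1 / (1 - x) should reproduce the geometric series 1 + x + x^2 + ...
print(series_divide([1, 0, 0, 0, 0], [1, -1]))  # [1.0, 1.0, 1.0, 1.0, 1.0]
```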
Differentiation and integration
Once a function is given as a power series as above, it is differentiable on the interior of the domain of convergence. It can be differentiated and integrated quite easily, by treating every term separately:

$$f'(x) = \sum_{n=1}^{\infty} n a_n (x - c)^{n-1} = \sum_{n=0}^{\infty} (n + 1) a_{n+1} (x - c)^n,$$

$$\int f(x)\, dx = \sum_{n=0}^{\infty} \frac{a_n (x - c)^{n+1}}{n + 1} + k = \sum_{n=1}^{\infty} \frac{a_{n-1} (x - c)^n}{n} + k.$$

Both of these series have the same radius of convergence as the original one.
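On truncated coefficient lists these operations are simple index shifts, as the following sketch illustrates; the function names and the exp(x) example are illustrative only.

```python
# Termwise differentiation and integration of a power series centered at c,
# acting directly on the coefficient list [a_0, a_1, a_2, ...].

def differentiate(a):
    """Coefficients of f': the nth one is (n + 1) * a_{n+1}."""
    return [(n + 1) * a[n + 1] for n in range(len(a) - 1)]

def integrate(a, constant=0):
    """Coefficients of an antiderivative: a_n / (n + 1), after a chosen constant term."""
    return [constant] + [a[n] / (n + 1) for n in range(len(a))]

exp_coeffs = [1, 1, 1/2, 1/6, 1/24]       # exp(x), truncated
print(differentiate(exp_coeffs))           # [1, 1.0, 0.5, 0.166...]            -- exp(x) again, one term shorter
print(integrate(exp_coeffs, constant=1))   # [1, 1.0, 0.5, 0.166..., 0.0416...] -- exp(x) again, one term longer
```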
Analytic functions
A function f defined on some open subset U of R or C is called analytic if it is locally given by a convergent power series. This means that every a ∈ U has an open neighborhood V ⊆ U, such that there exists a power series with center a that converges to f(x) for every x ∈ V.

Every power series with a positive radius of convergence is analytic on the interior of its region of convergence. All holomorphic functions are complex-analytic. Sums and products of analytic functions are analytic, as are quotients as long as the denominator is non-zero.
If a function is analytic, then it is infinitely differentiable, but in the real case the converse is not generally true. For an analytic function, the coefficients a_n can be computed as

$$a_n = \frac{f^{(n)}(c)}{n!},$$

where f^{(n)}(c) denotes the nth derivative of f at c, and f^{(0)}(c) = f(c). This means that every analytic function is locally represented by its Taylor series.
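As a sketch of this formula in practice, the following snippet computes the first few Taylor coefficients of a chosen analytic function symbolically; it assumes the SymPy library, and the particular function exp(x)·cos(x) is just an example.

```python
# Computing a_n = f^(n)(c) / n! for the local power series of an
# analytic function (assumes SymPy).
import sympy as sp

x = sp.symbols('x')
f = sp.exp(x) * sp.cos(x)    # an example analytic function
c = 0                        # center of the expansion

coeffs = [sp.diff(f, x, n).subs(x, c) / sp.factorial(n) for n in range(6)]
print(coeffs)                # [1, 1, 0, -1/3, -1/6, -1/30]
```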
The global form of an analytic function is completely determined by its local behavior in the following sense: if f and g are two analytic functions defined on the same connected open set U, and if there exists an element c ∈ U such that f^{(n)}(c) = g^{(n)}(c) for all n ≥ 0, then f(x) = g(x) for all x ∈ U.
If a power series with radius of convergence r is given, one can consider analytic continuations of the series, i.e. analytic functions f which are defined on larger sets than {x : |x − c| < r} and agree with the given power series on this set. The number r is maximal in the following sense: there always exists a complex number x with |x − c| = r such that no analytic continuation of the series can be defined at x.
The power series expansion of the inverse function of an analytic function can be determined using the Lagrange inversion theorem.
Behavior near the boundary
The sum of a power series with a positive radius of convergence is an analytic function at every point in the interior of the disc of convergence. However, different behavior can occur at points on the boundary of that disc. For example:

- Divergence while the sum extends to an analytic function: ∑_{n=0}^∞ z^n has radius of convergence equal to 1 and diverges at every point of |z| = 1. Nevertheless, the sum in |z| < 1 is 1/(1 − z), which is analytic at every point of the plane except for z = 1.
- Convergence at some points, divergence at others: ∑_{n=1}^∞ z^n / n has radius of convergence 1. It converges for z = −1, while it diverges for z = 1 (see the numerical sketch after this list).
- Absolute convergence at every point of the boundary: ∑_{n=1}^∞ z^n / n^2 has radius of convergence 1, while it converges absolutely, and uniformly, at every point of |z| = 1 due to the Weierstrass M-test applied with the hyper-harmonic convergent series ∑_{n=1}^∞ 1/n^2.
- Convergence on the closure of the disc of convergence but with a discontinuous sum: Sierpiński gave an example of a power series with radius of convergence 1, convergent at all points with |z| = 1, but whose sum is an unbounded function and, in particular, discontinuous. A sufficient condition for one-sided continuity at a boundary point is given by Abel's theorem.
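The second example in the list can be checked numerically, as in the sketch below; the 100000-term cutoff is arbitrary, and the partial sums at z = 1 only suggest divergence rather than prove it.

```python
# Partial sums of sum_{n>=1} z^n / n on the boundary |z| = 1:
# at z = -1 they approach -ln(2); at z = 1 they grow without bound.
import math

def partial_sum(z, terms):
    return sum(z**n / n for n in range(1, terms + 1))

print(partial_sum(-1.0, 100_000), -math.log(2))  # converges (alternating harmonic series)
print(partial_sum(1.0, 100_000))                 # roughly ln(100000) + 0.577..., keeps growing
```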
Formal power series
Power series in several variables
An extension of the theory is necessary for the purposes of multivariable calculus. A power series is here defined to be an infinite series of the form

$$f(x_1, \ldots, x_n) = \sum_{j_1, \ldots, j_n = 0}^{\infty} a_{j_1, \ldots, j_n} \prod_{k=1}^{n} (x_k - c_k)^{j_k},$$

where j = (j_1, …, j_n) is a vector of natural numbers, the coefficients a_{(j_1, …, j_n)} are usually real or complex numbers, and the center c = (c_1, …, c_n) and argument x = (x_1, …, x_n) are usually real or complex vectors. The symbol ∏ is the product symbol, denoting multiplication. In the more convenient multi-index notation this can be written

$$f(x) = \sum_{\alpha \in \mathbb{N}^n} a_{\alpha} (x - c)^{\alpha},$$

where \mathbb{N} is the set of natural numbers, and so \mathbb{N}^n is the set of ordered n-tuples of natural numbers.
The theory of such series is trickier than for single-variable series, with more complicated regions of convergence. For instance, the power series ∑_{n=0}^∞ x_1^n x_2^n is absolutely convergent in the set {(x_1, x_2) : |x_1 x_2| < 1} between two hyperbolas. On the other hand, in the interior of this region of convergence one may differentiate and integrate under the series sign, just as one may with ordinary power series.
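A minimal numerical illustration of this region of convergence, using only the standard library, is sketched below; the evaluation points are arbitrary.

```python
# The two-variable power series sum_n (x1 * x2)^n converges exactly when
# |x1 * x2| < 1 (the region between the hyperbolas x1*x2 = 1 and x1*x2 = -1),
# and its sum there is 1 / (1 - x1 * x2).

def partial_sum(x1, x2, terms=200):
    return sum((x1 * x2)**n for n in range(terms))

print(partial_sum(3.0, 0.25), 1 / (1 - 3.0 * 0.25))  # inside the region: both about 4.0
print(partial_sum(2.0, 0.75))                         # |x1 * x2| = 1.5 > 1: partial sums blow up
```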