Before you start this module, you must know how to find the Taylor polynomials of a given function. You should also be familiar with the geometric series, the notion of a power series, and in particular the concept of the radius of convergence of a power series.
Many functions can be written as a power series. The archetypical example is provided by the geometric series:
$$\frac{1}{1-x} = \sum_{n=0}^{\infty} x^n,$$
which is valid for $-1 < x < 1$.
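For instance, at $x = \tfrac{1}{2}$ the formula reproduces a familiar sum:
$$1 + \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \cdots = \frac{1}{1 - \frac{1}{2}} = 2.$$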
If we write a function as a power series with center $x_0$, we call the power series the Taylor series of the function with center $x_0$. (When the center is $x_0 = 0$, the Taylor series is also often called the Maclaurin series of the function.)
You probably like polynomials. Think of power series as "generalized" polynomials. Since (almost) all functions you encounter have a Taylor series, (almost) all functions can be thought of as "generalized" polynomials!
For instance, you will see that power series are easy to differentiate and integrate. No more techniques of integration, if one is satisfied with writing an integral as a power series!
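As a preview of how pleasant this is, integrate the geometric series above term by term (for $-1 < x < 1$):
$$\int_0^x \frac{dt}{1-t} = \sum_{n=0}^{\infty} \int_0^x t^n \, dt = \sum_{n=0}^{\infty} \frac{x^{n+1}}{n+1} = -\ln(1-x).$$
A power series for the logarithm, with no integration techniques in sight!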
In finding integrals and solving differential equations, one often faces the problem that the solutions can't be "found", just because they do not have a name, i.e., they cannot be written down by combining the familiar function names and the familiar mathematical notation. The error function and the functions describing the motion of a "simple" pendulum are important examples. Power series open the door to explore even functions like these!
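For instance, the error function $\operatorname{erf}(x) = \frac{2}{\sqrt{\pi}} \int_0^x e^{-t^2}\,dt$ has no elementary antiderivative, yet integrating the series for $e^{-t^2}$ term by term (just like above) gives a completely explicit power series:
$$\operatorname{erf}(x) = \frac{2}{\sqrt{\pi}} \sum_{n=0}^{\infty} \frac{(-1)^n\, x^{2n+1}}{n!\,(2n+1)}.$$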
As you increase the degree of the Taylor polynomial of a function, the approximation of the function by its Taylor polynomial becomes more and more accurate.
It is thus natural to expect that the function will coincide with the limit of its Taylor polynomials! So you should expect the Taylor series of a function to be given by the same formula as its Taylor polynomials: given a function $f(x)$ and a center $x_0$, we expect
$$f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(x_0)}{n!}\,(x - x_0)^n.$$
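If you want to experiment on a computer, here is a minimal sketch, assuming you have Python with the sympy library installed; its series function produces the same expansions:

```python
import sympy as sp

x = sp.symbols('x')

# Taylor series of cos(x) with center x0 = 0, through the x**7 term:
print(sp.series(sp.cos(x), x, 0, 8))
# 1 - x**2/2 + x**4/24 - x**6/720 + O(x**8)
```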
Finding the Taylor series of a function is nothing new! There are two problems, though.
1. It happens quite often that the right-hand side converges only for certain values of $x$. This is where the notion of the radius of convergence of a power series will become useful (see the example after this list).
2. There are rare occasions where the right-hand side is convergent but does not equal the function $f(x)$. I will not address this problem here; consult your book for a theorem usually called Taylor's Theorem.
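The geometric series at the top of this page already illustrates the first problem: the function $\frac{1}{1-x}$ is defined for every $x \neq 1$, but its power series representation converges only on the interval $-1 < x < 1$.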
Let's compute the Taylor series of $f(x) = e^x$ with center $x_0 = 0$. Since $f^{(n)}(x) = e^x$ for all $n$, we have $f^{(n)}(0) = 1$ for all $n$, so the formula yields
$$e^x = \sum_{n=0}^{\infty} \frac{x^n}{n!}.$$
Using the ratio test you can convince yourself that this power series converges everywhere; in fact, it is equal to $e^x$ for all values of $x$.
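Spelled out, the ratio test computation is routine: for any fixed value of $x$,
$$\lim_{n \to \infty} \left| \frac{x^{n+1}/(n+1)!}{x^n/n!} \right| = \lim_{n \to \infty} \frac{|x|}{n+1} = 0 < 1,$$
so the series converges for every $x$.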
Let's consider the natural logarithm $f(x) = \ln x$, and find its Taylor series with center $x_0 = 1$. Note that $x_0 = 0$ would be an unwise choice! Why? (The logarithm and its derivatives are not even defined at $x = 0$.)
Here are the derivatives of $f(x) = \ln x$, evaluated at the center $x_0 = 1$:

$f(x) = \ln x$, so $f(1) = 0$.
$f'(x) = x^{-1}$, so $f'(1) = +1$.
$f''(x) = -x^{-2}$, so $f''(1) = -1$.
$f'''(x) = 2\,x^{-3}$, so $f'''(1) = +2$.
$f^{(4)}(x) = -6\,x^{-4}$, so $f^{(4)}(1) = -6$.
Do you see a pattern evolve? I bet you see that the next term will be $f^{(5)}(x) = 24\,x^{-5}$, so $f^{(5)}(1) = +24$. It takes some practice to see how to translate this into a general formula:
We obtain that $f^{(n)}(1) = \pm\,(n-1)!$. (The factorial is 1 less than the order of the derivative.)
The alternating sign can be accommodated by inserting a factor of the form $(-1)^n$ or $(-1)^{n+1}$, depending on whether the first term (for $n = 1$) is negative or positive!
Thus we see that $f^{(n)}(1) = (-1)^{n+1}\,(n-1)!$. Check that this works for $n = 1, 2, 3, 4$. (It doesn't work for $n = 0$; since this term is 0 anyway, we will omit it.)
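If you would like the computer to confirm the pattern, here is a minimal sketch, again assuming Python with the sympy library:

```python
import sympy as sp

x = sp.symbols('x')
f = sp.log(x)

# Compare the n-th derivative of ln(x) at x = 1 with (-1)**(n+1) * (n-1)!
for n in range(1, 6):
    derivative_at_1 = sp.diff(f, x, n).subs(x, 1)
    formula = (-1) ** (n + 1) * sp.factorial(n - 1)
    print(n, derivative_at_1, formula)
# 1  1  1
# 2 -1 -1
# 3  2  2
# 4 -6 -6
# 5 24 24
```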
This yields the Taylor series
$$\ln x = \sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n}\,(x-1)^n.$$
In an earlier example (the two examples are almost identical!), we saw that this power series has a radius of convergence of 1. It turns out that the formula above is indeed valid for $0 < x < 2$. (Recall that the center of the power series is $x_0 = 1$.)
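As a final sanity check, you can compare partial sums of this series with the logarithm itself; a quick numerical sketch in plain Python:

```python
import math

def ln_series(x, terms=60):
    """Partial sum of the Taylor series of ln(x) with center 1."""
    return sum((-1) ** (n + 1) / n * (x - 1) ** n for n in range(1, terms + 1))

# Inside the interval of convergence 0 < x < 2 the partial sums
# approach the true value of the logarithm:
print(ln_series(1.5), math.log(1.5))  # both approximately 0.405465
# For x outside (0, 2), e.g. x = 3, the terms grow and the sum diverges.
```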