\(\def\ans#1{\bbox[border:1px solid green,6pt]{#1}}\)

Taylor and Maclaurin Series

Introduction

Here we apply what we've learned about infinite series, and particularly power series: the application is to represent functions as power series. This is useful for a variety of reasons; for instance, it allows us to write a transcendental function like \(e^x\) as what is essentially an infinite polynomial. As we have seen in many contexts by now, polynomials are very friendly, so it can greatly simplify things to rewrite a more complicated function as a power series.

Two Assumptions:

  1. The function \(f(x)\) that we're dealing with in a particular problem has a power series representation about some center \(x=a\). Basically, we aren't going to try any functions for which this won't work.
  2. This function \(f(x)\) has derivatives of every order (first derivative, second, etc.) and we can find them. This is crucial.

The Basic Idea

Ok, so how do we come up with a power series to represent a function? In other words, start with the general form of a power series: \[f(x)=\sum_{n=0}^\infty c_n (x-a)^n.\] How do we figure out what those coefficients \(c_n\) are? If we know what those are, we're done; that's the whole game.

In a moment, we'll go through and find a formula for them, and that's all you'll need to know to do the problems; you can memorize that and move on. But again, I like to show you where things like this come from, so we're going to derive that formula. Here's the basic idea, though: we pick a center point \(a\) (the center of the power series' interval of convergence), and we decide that AT THAT POINT, we want the function and the power series to agree. Simple enough, right? It turns out that that condition gives us the first coefficient. Then, we decide that at that point, we want the first derivative of the function and the first derivative of the power series to agree. This, as we'll see, gives us the second coefficient. We do the same with the second derivative to get the third coefficient, and so on.

It may help to try the following demo: (click to follow the link to Desmos)


Try entering \(\cos x\), \(\sin x\), and \(e^x\) into the function box. Then vary \(D\), the degree of the matching polynomial (if this \(D\) were infinite, it would be the power series). Notice that if \(D=1\), \(f(x)\) and the polynomial agree at the center point, and their slopes agree. Then, if \(D=2\), they agree, their slopes agree, and their concavities agree. Every time you increase \(D\), the matching polynomial lines up more and more closely with \(f(x)\).
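The same matching idea can be checked in code. Below is a small sketch (assuming the SymPy library is available; the names `f`, `D`, and `p` are just ours) that builds the degree-\(D\) matching polynomial for \(\cos x\) about \(a=0\) and confirms that the function and the polynomial share their value and their first \(D\) derivatives at the center:

```python
import sympy as sp

x = sp.symbols('x')
f = sp.cos(x)  # try sp.sin(x) or sp.exp(x) as well
D = 4          # degree of the matching polynomial

# Degree-D Taylor polynomial of f about a = 0 (removeO drops the remainder term)
p = sp.series(f, x, 0, D + 1).removeO()

# f and p agree at 0, and so do their first D derivatives there.
matches = [sp.diff(f, x, k).subs(x, 0) == sp.diff(p, x, k).subs(x, 0)
           for k in range(D + 1)]
print(matches)  # all True
```

Raising `D` forces more derivatives to match, which is exactly why the polynomial hugs the curve over a wider interval in the demo.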


Finding the Coefficients

To find the coefficients, as I said, we'll start taking derivatives of both the function \(f(x)\) and the power series, and each time we do, we'll set them equal when \(x=a\). To differentiate the power series, all we need is the Power Rule (and technically the Chain Rule, but it's pretty simple). You should double-check the table below (note that to write the nth derivative of \(f\), we write \(f^{(n)}\), so we don't get stuck writing a bunch of apostrophes; for instance, the fifth derivative would be written \(f^{(5)}(x)\)).

| | The Function | The Power Series |
| --- | --- | --- |
| | \(f(x)\) | \(\displaystyle\sum_{n=0}^\infty c_n (x-a)^n = c_0 + c_1(x-a) + c_2(x-a)^2 + c_3(x-a)^3 + c_4(x-a)^4 + \ldots\) |
| At \(x=a\) | \(f(a)\) | \(c_0\) |
| First Derivative | \(f'(x)\) | \(c_1 + 2c_2(x-a) + 3c_3(x-a)^2 + 4c_4(x-a)^3 + \ldots\) |
| At \(x=a\) | \(f'(a)\) | \(c_1\) |
| Second Derivative | \(f''(x)\) | \(2c_2 + 3 \cdot 2 c_3(x-a) + 4 \cdot 3 c_4(x-a)^2 + \ldots\) |
| At \(x=a\) | \(f''(a)\) | \(2c_2\) |
| Third Derivative | \(f'''(x)\) | \(3 \cdot 2 c_3 + 4 \cdot 3 \cdot 2 c_4(x-a) + \ldots\) |
| At \(x=a\) | \(f'''(a)\) | \(3 \cdot 2 c_3\) |
| \(\vdots\) | \(\vdots\) | \(\vdots\) |
| nth Derivative | \(f^{(n)}(x)\) | \(n! \ c_n + (n+1)! \ c_{n+1}(x-a) + \ldots\) |
| At \(x=a\) | \(f^{(n)}(a)\) | \(n! \ c_n\) |

This table gives us the coefficients (which is all we were after, remember). The last row says \(f^{(n)}(a) = n! \ c_n\), and solving for \(c_n\) gives the key takeaway: \[c_n = \dfrac{f^{(n)}(a)}{n!}.\]
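That formula is easy to test directly. The sketch below (assuming the SymPy library; `f` and `coeffs` are just illustrative names) computes \(c_n = f^{(n)}(a)/n!\) for \(f(x)=e^x\) about \(a=0\):

```python
import sympy as sp

x = sp.symbols('x')
f = sp.exp(x)   # the function to expand; swap in any smooth function
a = 0           # the center of the expansion

# c_n = f^(n)(a) / n!  for n = 0, 1, 2, 3, 4
coeffs = [sp.diff(f, x, n).subs(x, a) / sp.factorial(n) for n in range(5)]
print(coeffs)  # [1, 1, 1/2, 1/6, 1/24]
```

Since every derivative of \(e^x\) is \(e^x\), each numerator is \(e^0 = 1\), and the coefficients come out to \(1/n!\), matching the series we derive below by hand.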

Summary

If a power series representation of a function \(f(x)\) exists about a point \(x=a\), then the Taylor Series for this function about this point is given by \[f(x) = \sum_{n=0}^\infty \dfrac{f^{(n)}(a)}{n!} \ (x-a)^n.\] If the center point is \(a=0\), this is called a Maclaurin Series: \[f(x) = \sum_{n=0}^\infty \dfrac{f^{(n)}(0)}{n!} \ x^n.\]

As you'll see, doing a problem in this section simply consists of finding a pattern for these coefficients, which we'll do by taking derivatives and plugging in that center point.

Examples

Find the Maclaurin Series for \(f(x)=e^x\).

Solution

Since we're finding the Maclaurin Series, \(a=0\), so \[f(x)=e^x = \sum_{n=0}^\infty \dfrac{f^{(n)}(0)}{n!} \ x^n.\] All that we have to do is find the pattern in the derivatives evaluated at 0. This is easy, since all the derivatives are equal:

\[\begin{align} f(x) = e^x \longrightarrow f(0) &= e^0 = 1\\ f'(x) = e^x \longrightarrow f'(0) &= e^0 = 1\\ f''(0) &= 1\\ f'''(0) &= 1\\ &\vdots\\ f^{(n)}(0) &= 1 \end{align}\] Therefore, \[\ans{e^x = \sum_{n=0}^\infty \dfrac{1}{n!} x^n}\]
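Since the boxed series should equal \(e^x\) on its interval of convergence, a quick numerical check is possible with nothing beyond Python's standard library (the helper name `exp_partial_sum` is ours):

```python
import math

def exp_partial_sum(x, terms):
    """Sum the first `terms` terms of the Maclaurin series for e^x."""
    return sum(x**n / math.factorial(n) for n in range(terms))

x = 1.5
print(exp_partial_sum(x, 20))  # agrees with math.exp(1.5) to machine precision
print(math.exp(x))
```

Twenty terms already pin down \(e^{1.5}\) to about fifteen digits, a first hint that this series converges for every \(x\), which we confirm next.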

Interval of Convergence

When it comes to a Taylor or Maclaurin Series, the interval of convergence is important because this is the interval on which the function and the power series are equal. In other words, this is the interval of \(x\) values where the function can be evaluated by evaluating the power series.

To find the interval of convergence, use the Ratio Test: \[\begin{align} p &= \lim_{n \to \infty} \left|\dfrac{\dfrac{x^{n+1}}{(n+1)!}}{\dfrac{x^n}{n!}}\right|\\ &= \lim_{n \to \infty} \left|\dfrac{x^{n+1}}{(n+1)!} \cdot \dfrac{n!}{x^n}\right|\\ &= \lim_{n \to \infty} \left|x \cdot \dfrac{1}{n+1}\right| \end{align}\] Making sure this limit is less than 1: \[\begin{align} \lim_{n \to \infty} \left|x \cdot \dfrac{1}{n+1}\right| &< 1\\ |x| \lim_{n \to \infty} \dfrac{1}{n+1} &< 1 \end{align}\]

Since that limit is 0, the left-hand side equals 0 no matter what \(x\) is, so it is always less than 1, and the interval of convergence is \[(-\infty,\infty).\]

Try it yourself:

(click on the problem to show/hide the answer)

Find the Maclaurin Series for \(f(x)=e^{-x}\).
\(e^{-x} = \displaystyle\sum_{n=0}^\infty \dfrac{(-1)^n}{n!} x^n\)

Find the Taylor Series for \(f(x)=\dfrac{1}{x^2}\) centered about \(x=-1\).

Solution

We start by taking derivatives and evaluating them at \(x=-1\).

\[\begin{align} f(x) = x^{-2} = \dfrac{1}{x^2} &\longrightarrow f(-1) = 1\\ f'(x) = -2x^{-3} = \dfrac{-2}{x^3} &\longrightarrow f'(-1) = 2\\ f''(x) = (3)(2)x^{-4} = \dfrac{(3)(2)}{x^4} &\longrightarrow f''(-1) = (3)(2)\\ f'''(x) = (-4)(3)(2)x^{-5} = \dfrac{(-4)(3)(2)}{x^5} &\longrightarrow f'''(-1) = (4)(3)(2)\\ &\vdots\\ f^{(n)}(-1) &= (n+1)! \end{align}\]

Plugging this into the Taylor Series formula: \[\begin{align} f(x) = \dfrac{1}{x^2} &= \sum_{n=0}^\infty \dfrac{f^{(n)}(-1)}{n!} \ (x+1)^n\\ &= \sum_{n=0}^\infty \dfrac{(n+1)!}{n!} \ (x+1)^n \end{align}\]

Simplify by noting that \(\dfrac{(n+1)!}{n!} = n+1\):

\[\ans{\dfrac{1}{x^2} = \sum_{n=0}^\infty (n+1) (x+1)^n}\]
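As a sanity check (a plain-Python sketch; `inv_sq_partial_sum` is just our name), partial sums of this series should approach \(1/x^2\) for \(x\) near the center \(x=-1\), for instance at \(x=-1/2\):

```python
def inv_sq_partial_sum(x, terms):
    """Sum the first `terms` terms of sum (n+1)(x+1)^n."""
    return sum((n + 1) * (x + 1)**n for n in range(terms))

x = -0.5
print(inv_sq_partial_sum(x, 60))  # ≈ 4.0
print(1 / x**2)                   # 4.0
```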

Interval of Convergence

To find the interval of convergence, use the Ratio Test: \[\begin{align} p &= \lim_{n \to \infty} \left|\dfrac{(n+2)(x+1)^{n+1}}{(n+1)(x+1)^n}\right|\\ &= \lim_{n \to \infty} \left|(x+1) \cdot \dfrac{n+2}{n+1}\right| \end{align}\] Making sure this limit is less than 1: \[\begin{align} \lim_{n \to \infty} \left|(x+1) \cdot \dfrac{n+2}{n+1}\right| &< 1\\ |x+1| \lim_{n \to \infty} \dfrac{n+2}{n+1} &< 1 \end{align}\]

Since this limit is equal to 1: \[\begin{align} |x+1| \lim_{n \to \infty} \dfrac{n+2}{n+1} &< 1\\ |x+1| &< 1\\ -2 < x &< 0 \end{align}\]

At both endpoints, \(x=-2\) and \(x=0\), the terms \((n+1)(x+1)^n\) have absolute value \(n+1\), which does not approach 0, so the series diverges there by the Divergence Test, and the interval of convergence is \[(-2,0).\]

Find the Maclaurin Series for \(f(x)=\sin x\).

Solution

This one is a bit trickier, but we start the same way, by taking derivatives and plugging 0 into them to spot a pattern. The pattern is pretty clear, but the tricky part is writing an expression that matches this pattern. To do so, we'll drop off all the terms that equal 0 and write an expression for what remains.

\[\begin{align} f(x) = \sin x &\longrightarrow f(0) = 0\\ f'(x) = \cos x &\longrightarrow f'(0) = 1\\ f''(x) = -\sin x &\longrightarrow f''(0) = 0\\ f'''(x) = -\cos x &\longrightarrow f'''(0) = -1\\ f^{(4)}(x) = \sin x &\longrightarrow f^{(4)}(0) = 0\\ f^{(5)}(x) = \cos x &\longrightarrow f^{(5)}(0) = 1\\ \end{align}\]

Plugging this into the Maclaurin Series: \[\begin{align} \sin x &= \sum_{n=0}^\infty \dfrac{f^{(n)}(0)}{n!} \ x^n\\ &= \dfrac{0}{0!} \ x^0 + \dfrac{1}{1!} \ x^1 + \dfrac{0}{2!} \ x^2 + \dfrac{-1}{3!} \ x^3 + \dfrac{0}{4!} \ x^4 + \dfrac{1}{5!} \ x^5 + \ldots\\ &= \dfrac{1}{1!} \ x^1 + \dfrac{-1}{3!} \ x^3 + \dfrac{1}{5!} \ x^5 + \ldots\\ \end{align}\]

To find the pattern for this series, first note that it alternates: counting the remaining terms from \(n=0\), the odd-numbered terms are negative, so we'll need a factor of \((-1)^{n}\) in the expression. Then notice that the power on \(x\) and the factorial in the denominator match in each term, and they run through the odd numbers, which, as we've seen before, can be expressed as \(2n+1\). Therefore, the Maclaurin Series looks like

\[\ans{\sin x = \sum_{n=0}^\infty (-1)^n \cdot \dfrac{x^{2n+1}}{(2n+1)!}.}\]
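That boxed formula is easy to spot-check numerically (plain Python; the helper name `sin_partial_sum` is ours):

```python
import math

def sin_partial_sum(x, terms):
    """Sum the first `terms` terms of sum (-1)^n x^(2n+1)/(2n+1)!."""
    return sum((-1)**n * x**(2*n + 1) / math.factorial(2*n + 1)
               for n in range(terms))

x = 1.0
print(sin_partial_sum(x, 10))  # agrees with math.sin(1.0) to machine precision
print(math.sin(x))
```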

Try it yourself:

(click on the problem to show/hide the answer)

Find the Maclaurin Series for \(f(x)=\cos x\).
\(\cos x = \displaystyle\sum_{n=0}^\infty (-1)^n \cdot \dfrac{x^{2n}}{(2n)!}\)

Most Common Maclaurin Series

We've now derived the three most common Maclaurin Series, and you will probably want to memorize the following:

\[\ans{\begin{align} e^x &= \sum_{n=0}^\infty \dfrac{1}{n!} x^n\\ \\ \sin x &= \sum_{n=0}^\infty (-1)^n \cdot \dfrac{x^{2n+1}}{(2n+1)!}\\ \\ \cos x &= \displaystyle\sum_{n=0}^\infty (-1)^n \cdot \dfrac{x^{2n}}{(2n)!} \end{align}}\]

Note

The applicability of Taylor/Maclaurin Series is very broad, but one easy application is to use them to verify derivatives that you already know. I won't work this out for you, but you should be able to write out the first few terms of each of the three series above and take the derivative, term by term, to show that the derivative of the exponential function is itself, the derivative of the sine function is the cosine function, and the derivative of the cosine function is the negative of the sine function. Take a moment and try it.
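Here is one way to carry out that check for the sine series (a sketch assuming the SymPy library; the names and the choice of 6 terms are ours). Differentiating a partial sum of the sine series term by term reproduces the matching partial sum of the cosine series exactly:

```python
import sympy as sp

x = sp.symbols('x')
N = 6  # number of terms to keep in each partial sum

sin_partial = sum((-1)**n * x**(2*n + 1) / sp.factorial(2*n + 1) for n in range(N))
cos_partial = sum((-1)**n * x**(2*n) / sp.factorial(2*n) for n in range(N))

# d/dx [x^(2n+1)/(2n+1)!] = x^(2n)/(2n)!, so the derivative of the sine
# partial sum is exactly the cosine partial sum with the same number of terms.
print(sp.expand(sp.diff(sin_partial, x) - cos_partial))  # 0
```

The same term-by-term computation shows that the derivative of the \(e^x\) series is itself and that the derivative of the cosine series is the negative of the sine series.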