In mathematics, the Taylor series is a representation of a function as
an infinite sum of terms calculated from the values of its derivatives
at a single point. It is named after the English mathematician Brook
Taylor. If the series is centered at zero, the series is also called a
Maclaurin series, named after the Scottish mathematician Colin
Maclaurin. It is common practice to use a finite number of terms of the
series to approximate a function. The Taylor series may be regarded as
the limit of the Taylor polynomials. In this article we shall discuss the Taylor polynomial series. (Source: Wikipedia)
Taylor polynomial series:
Find the Taylor series for sin x.
sin x = `sum_(k=0)^oo ((-1)^k)/((2k + 1)!) x^(2k + 1)`
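As a quick numerical illustration (not part of the original derivation), we can evaluate partial sums of this series and compare them with a library value of sin x; the Python snippet below is a minimal sketch.

```python
import math

def sin_series(x, n_terms):
    """Partial sum of the Maclaurin series for sin x:
    sum of (-1)^k / (2k + 1)! * x^(2k + 1) for k = 0 .. n_terms - 1."""
    return sum((-1) ** k / math.factorial(2 * k + 1) * x ** (2 * k + 1)
               for k in range(n_terms))

# Compare a few partial sums with math.sin at x = 1.0
for n in (1, 2, 3, 5):
    approx = sin_series(1.0, n)
    print(n, approx, abs(approx - math.sin(1.0)))
```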
The remainder term is not expressible in any simple way but can be estimated by using Lagrange's form of the remainder. The coefficients
`((-1)^k)/((2k + 1)!)`
are easily verified by calculating successive derivatives of f(x) = sin x and using the formula
`a_k = (f^(k)(0))/(k!)`
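To see how this formula produces the coefficients above, the successive derivatives of sin x can be computed symbolically: they cycle through cos x, -sin x, -cos x, sin x, so at 0 they cycle through 1, 0, -1, 0. The sketch below uses the sympy library (an assumption made for illustration; it is not part of the original text).

```python
import sympy as sp

x = sp.symbols('x')
f = sp.sin(x)

# a_k = f^(k)(0) / k!  -- the nonzero values appear at odd k
# and reproduce the series coefficients above.
for k in range(8):
    a_k = sp.diff(f, x, k).subs(x, 0) / sp.factorial(k)
    print(k, a_k)   # prints 0, 1, 0, -1/6, 0, 1/120, 0, -1/5040
```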
To check convergence of the series, apply Lagrange's form for `R_n(x)`: for each x `in` R there exists z between 0 and x such that
`R_n(x) = (f^(n + 1)(z))/((n + 1)!) x^(n + 1)`
Now `|f^(n + 1)(z)|` equals either |cos z| or |sin z|. So, in either case, `|f^(n + 1)(z)| <= 1`, and
`|R_n(x)| <= (|x|^(n + 1))/((n + 1)!)`
Since the factorial in the denominator eventually outgrows any fixed power, `(|x|^(n + 1))/((n + 1)!) -> 0` as n `->` `oo` for all x `in` R, and so the remainder term `|R_n(x)| -> 0` as n `->` `oo` for all x `in` R. Thus the series representation is completely justified for all real x.
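The key point in this argument is that the factorial eventually outgrows any fixed power of x. As a quick check (a Python sketch added for illustration, using only the standard library), we can evaluate the bound `(|x|^(n + 1))/((n + 1)!)` at x = 10 for increasing n:

```python
import math

x = 10.0
for n in (5, 10, 20, 30, 40):
    bound = abs(x) ** (n + 1) / math.factorial(n + 1)
    print(n, bound)
# The bound grows at first (while n + 1 < |x|) and then collapses toward 0.
```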
Observe that our estimate for `|R_n(x)|`,
`|R_n(x)| <= (|x|^(n + 1))/((n + 1)!)`
also gives a sense of the rate of convergence of the series for fixed x. For example, for `|x| <= 1` we find
`|R_n(x)| <= 1/((n + 1)!)`
Thus, if we want to calculate sin x on (-1, 1) to within 0.01, we need to take only the first five terms of the series (n = 4) to achieve that degree of accuracy, since `1/(5!) = 1/120 < 0.01`.
Had we used the integral form for `R_n(x)`, we would have obtained a similar estimate.
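As a sanity check on that claim (again a Python sketch, not part of the original text), we can compare the degree-4 Taylor polynomial x - `(x^3)/(3!)` with sin x over a grid of points in (-1, 1); the largest observed error stays below the Lagrange bound `1/(5!) ~~ 0.0083`, and in particular below 0.01.

```python
import math

def taylor_sin_deg4(x):
    """Degree-4 Taylor polynomial of sin x about 0 (the x^4 term is zero)."""
    return x - x ** 3 / math.factorial(3)

# Largest deviation from math.sin over a grid inside (-1, 1)
xs = [i / 1000 for i in range(-999, 1000)]
max_err = max(abs(taylor_sin_deg4(x) - math.sin(x)) for x in xs)
print(max_err)                 # roughly 0.008, below 0.01
print(1 / math.factorial(5))   # the Lagrange bound 1/5! ~ 0.0083
```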
Sample problem for Taylor polynomial series:
Problem: Evaluate the definite integral `int_0^1 sin(x) dx`
Solution: We evaluate the integral by expanding the integrand in its Taylor series; this method is especially useful when an integrand has no antiderivative expressible in terms of familiar functions. We know that
sin t = t - `(t^3)/(3!)` + `(t^5)/(5!)` - `(t^7)/(7!)` + ···
Now if we substitute t = x, we have
sin x = x - `(x^3)/(3!)` + `(x^5)/(5!)` - `(x^7)/(7!)` + ···
We can antidifferentiate the Taylor series term by term:
`int_0^1 sin(x) dx` = `int_0^1` (x - `(x^3)/(3!)` + `(x^5)/(5!)` - `(x^7)/(7!)` + ···) dx
= (`(x^2)/(2)` - `(x^4)/(4*3!)` + `(x^6)/(6*5!)` - `(x^8)/(8*7!)` + ···) `|_0^1`
= `1/2` - `1/(4*3!)` + `1/(6*5!)` - `1/(8*7!)` + ···
Notice that this is an alternating series, so we know that it converges. If we add up the first four terms, we find that the integral is approximately 0.4597, which agrees with the exact value `1 - cos 1 ~~ 0.4597`.
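As a final check (a Python sketch added here, not part of the original solution), we can sum the first few integrated terms and compare the result with `1 - cos 1`:

```python
import math

# Each term x^(2k+1)/(2k+1)! of the sine series integrates over [0, 1]
# to 1 / ((2k + 2) * (2k + 1)!), with alternating signs.
partial = sum((-1) ** k / ((2 * k + 2) * math.factorial(2 * k + 1))
              for k in range(4))
print(partial)             # ~0.4597
print(1 - math.cos(1.0))   # exact value of the integral, ~0.4597
```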