
Here's the question I was given.

"Find the first four terms of the Taylor series for

$$f(x) = \log(1+x)$$

about the point x = 0 (i.e., that includes f and its first three derivatives). Evaluate the series for y1 = 0.1 and y2 = 0.01 and compute the errors of this approximation by comparing with the values f(y1) and f(y2), respectively. Derive a bound on the accuracy for the Taylor series and compare the bound with the actual errors."

I found the first four terms of the Taylor series and evaluated it at y1 and y2. The series looks like $$0+x-\frac{x^2}{2}+\frac{x^3}{3}$$
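For reference, these terms come from the derivatives at 0 (taking log to be the natural logarithm):

$$f(x)=\log(1+x),\qquad f'(x)=\frac{1}{1+x},\qquad f''(x)=-\frac{1}{(1+x)^2},\qquad f'''(x)=\frac{2}{(1+x)^3},$$

so $f(0)=0$, $f'(0)=1$, $f''(0)=-1$, $f'''(0)=2$, and

$$f(0)+f'(0)\,x+\frac{f''(0)}{2!}x^2+\frac{f'''(0)}{3!}x^3 = x-\frac{x^2}{2}+\frac{x^3}{3}.$$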

f(y1) = 0.0953

and

f(y2) = 0.0099
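For concreteness, the numbers above can be reproduced with a short sketch like this (purely illustrative, and again taking log to be the natural logarithm):

```python
import math

def taylor_log1p(x):
    # First four terms of the series about 0: 0 + x - x^2/2 + x^3/3
    return x - x**2 / 2 + x**3 / 3

for y in (0.1, 0.01):
    # Series value next to the exact value f(y) = log(1 + y)
    print(y, taylor_log1p(y), math.log(1 + y))
```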

I'm not sure how to compute the errors of this approximation by comparing with the values f(y1) and f(y2). I also don't know how to derive a bound on the accuracy for the Taylor series, or how to compare the bound with the actual errors. What does the question even mean by a bound?

Thanks in advance


1 Answer


There is a formula for the maximum error of a Taylor polynomial, called the Lagrange error bound. It says that the difference between the function and the degree-$n$ Taylor polynomial you built satisfies

$$\lvert f(x) - P_n(x)\rvert \le \frac{M}{(n+1)!}\,\lvert x-c\rvert^{\,n+1},$$

where $M$ is the maximum of $\lvert f^{(n+1)}\rvert$ on the interval between $c$ and $x$, $n$ is the degree of the Taylor polynomial, $c$ is the center of the Taylor series, and $x$ is the point where you are evaluating it.

Here’s the link where I got this: https://magoosh.com/hs/ap-calculus/2017/ap-calculus-bc-review-lagrange-error-bound/
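Applied to this question (a sketch, taking log to be the natural logarithm): here $n = 3$ and $c = 0$, the fourth derivative of $\log(1+x)$ is $-6/(1+x)^4$, and its absolute value on $[0, y]$ is largest at $t = 0$, so you can take $M = 6$ and the bound becomes $\frac{6\,y^4}{4!} = \frac{y^4}{4}$. Comparing that with the actual errors:

```python
import math

def taylor_log1p(x):
    # Degree-3 Taylor polynomial of log(1+x) about 0
    return x - x**2 / 2 + x**3 / 3

for y in (0.1, 0.01):
    actual_error = abs(math.log(1 + y) - taylor_log1p(y))
    # Lagrange bound with n = 3, c = 0, and M = max |f''''| on [0, y] = 6
    bound = 6 * y**4 / math.factorial(4)   # = y^4 / 4
    print(f"y = {y}: error = {actual_error:.3e}, bound = {bound:.3e}")
```

For y = 0.1 the bound is 2.5e-5 and for y = 0.01 it is 2.5e-9, and the actual errors should come in just under those values.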
