Here's the question I was given.
"Find the first four terms of the Taylor series for
$$f(x) = \log(1+x)$$
about the point x = 0 (i.e., that includes f and its first three derivatives). Evaluate the series for y1 = 0.1 and y2 = 0.01 and compute the errors of this approximation by comparing with the values f(y1) and f(y2), respectively. Derive a bound on the accuracy for the Taylor series and compare the bound with the actual errors."
I found the first four terms of the Taylor series and evaluated it for y1 and y2. The series looks like $$0+x-\frac{x^2}{2}+\frac{x^3}{3}$$
f(y1) = 0.0953
and
f(y2) = 0.0099
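In case it's useful, this is roughly how I evaluated the partial sum (a quick sketch in Python; the function name is just my own):

```python
import math

def taylor_log1p(x, terms=3):
    # Partial sum of log(1+x) = x - x^2/2 + x^3/3 - ...
    # using the first `terms` nonzero terms of the series.
    return sum((-1) ** (n + 1) * x ** n / n for n in range(1, terms + 1))

for y in (0.1, 0.01):
    print(y, taylor_log1p(y), math.log(1 + y))
```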
I'm not sure how to compute the errors of this approximation by comparing with the values f(y1) and f(y2). I also don't know how to derive a bound on the accuracy of the Taylor series, or how to compare that bound with the actual errors. What does the question even mean by a "bound"?
Thanks in advance