Guest
Lately we've been studying infinite sequences and series, including Taylor series. I'm working in a group with two other people, and the three of us are completely clueless. I'm trying to gradually lessen that, but still, this problem is quite perplexing. Any help that you can provide would be greatly appreciated! Here's the problem from our Calculus 2 class:
Now, teach your calculator to divide. We do this by teaching it to take 1/x for all x ≠ 0. It is easy to teach it certain fractions: 1/10 = 0.1, 1/100 = 0.01, and in general 1/(10^n) = 0.000...01, where there are n-1 zeros after the decimal point.
A) Assuming your calculator knows what 1/(10^n) is, find the Taylor series that will help your calculator approximate 1/x for any x ≠ 0. Justify convergence.
B) Set up an inequality that will help your calculator approximate 1/a to the nearest ten thousandth. Find the number of terms necessary for a = 53 and a = 122.
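In case it helps to see the idea in concrete form, here is a minimal sketch, assuming the intended approach is to expand 1/x as a geometric (Taylor) series about the nearest power of ten c = 10^n, i.e. 1/x = sum over k of (-1)^k (x-c)^k / c^(k+1), which converges when |x - c| < c. The function name and the stopping rule based on the geometric tail bound are just for illustration, not part of the assignment.

```python
def approx_reciprocal(a, c, tol=0.00005):
    """Approximate 1/a by partial sums of the series about c = 10^n.

    With r = (a - c)/c, the terms are (1/c) * (-r)^k, and the tail after
    the current term is bounded by |term| / (1 - |r|), so we stop once
    that bound drops below tol (here: nearest ten thousandth).
    Returns (approximation, number_of_terms_used).
    """
    r = (a - c) / c
    assert abs(r) < 1, "series only converges when |a - c| < c"
    total, term, n_terms = 0.0, 1.0 / c, 0
    while abs(term) / (1 - abs(r)) >= tol:   # geometric tail bound still too big
        total += term
        term *= -r
        n_terms += 1
    return total, n_terms

if __name__ == "__main__":
    for a in (53, 122):
        est, n = approx_reciprocal(a, 100)   # 100 = 10^2 is the nearest power of ten
        print(f"1/{a} ~ {est:.6f} using {n} terms (true value {1/a:.6f})")
```

Running this suggests how the count of terms depends on how close a is to the chosen power of ten: a = 122 (ratio 22/100) needs far fewer terms than a = 53 (ratio 47/100). Part B) would then be the inequality behind that stopping rule, solved by hand for the number of terms.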