Convergence of sequence

mathmari

Hi! I am looking at an exercise in Analysis, which asks me to find the limit of the sequence \(\displaystyle x_{n+1}=x_{n}+\frac{1}{x_{n}^2} \).

First, to show that this sequence converges, I have to show that it is monotonic (in this case, \(\displaystyle x_{n} \) is increasing) and then that it is bounded above. How can I show this?

Having shown the above, let \(\displaystyle x \) be the limit of the sequence. Then \(\displaystyle x=x+\frac{1}{x^2} \), which means \(\displaystyle \frac{1}{x^2}=0 \). Isn't this wrong?
 
First, you can see that \(\displaystyle x_{n+1}\) is \(\displaystyle x_n\) plus a positive number. That tells you it is increasing.
But you cannot show that the sequence is bounded, because it isn't. That is exactly what your "\(\displaystyle \frac{1}{x^2}= 0\)" shows: since the sequence is increasing, if it were bounded above it would converge, yet you have shown that it cannot converge to any finite limit. The answer to this exercise is that the sequence does NOT converge.
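Not part of the original answer, but here is a minimal numerical sketch of the divergence, assuming a starting value of \(\displaystyle x_1 = 1 \) (the thread never states one; any positive start behaves the same way). Cubing the recurrence gives \(\displaystyle x_{n+1}^3 = x_n^3 + 3 + \frac{3}{x_n^3} + \frac{1}{x_n^6} > x_n^3 + 3 \), so \(\displaystyle x_n^3 \) grows at least linearly and the printed ratio should settle near 3.

[CODE]
# Iterate x_{k+1} = x_k + 1/x_k^2 and watch the terms grow without bound.
# The starting value x1 = 1.0 is an assumption; the thread gives none.

def nth_term(x1: float, n: int) -> float:
    """Return x_n for the recurrence x_{k+1} = x_k + 1/x_k**2."""
    x = x1
    for _ in range(n - 1):
        x += 1.0 / x**2
    return x

for n in (10, 100, 1_000, 10_000, 100_000):
    x = nth_term(1.0, n)
    # Since x_{k+1}**3 > x_k**3 + 3, the cubes grow at least linearly,
    # so x**3 / n should approach 3 as n grows.
    print(f"n = {n:>7}   x_n = {x:9.4f}   x_n**3 / n = {x**3 / n:.4f}")
[/CODE]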
 