Sequence defined through an arithmetic mean recurrence

There is a nice problem about sequences that I've encountered several times while solving problems in real analysis.

Two real constants $a, b$ are given, and a sequence is defined as follows:

$$a_0 = a$$

$$a_1 = b$$

$$a_{n+2} = \frac{a_{n+1} + a_n}{2},\ \ \ n \ge 0$$

Prove that the sequence converges and find the limit $L = \lim_{n \to \infty} a_n$.

I won't post the full solution here, but it turns out the limit is this number:

$$L = \frac{1}{3} \cdot a + \frac{2}{3} \cdot b $$
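For the curious, here is a quick way to see where this value comes from once convergence is known (this is just one possible route, and the convergence proof itself I'll still leave out): the recurrence leaves the combination $a_{n+1} + \frac{1}{2} a_n$ unchanged.

$$a_{n+2} + \frac{1}{2} a_{n+1} = \frac{a_{n+1} + a_n}{2} + \frac{1}{2} a_{n+1} = a_{n+1} + \frac{1}{2} a_n$$

$$\Rightarrow\ a_{n+1} + \frac{1}{2} a_n = a_1 + \frac{1}{2} a_0 = b + \frac{a}{2},\ \ \ n \ge 0$$

$$\text{Letting } n \to \infty:\ \ \frac{3}{2} L = b + \frac{a}{2}\ \Rightarrow\ L = \frac{1}{3} \cdot a + \frac{2}{3} \cdot b$$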

Here is a nice illustration of this fact generated by a Python program. 
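A minimal sketch of such a program (the function name `mean_sequence` and the matplotlib styling are just one possible way to do it, not necessarily the original code):

```python
import matplotlib.pyplot as plt

def mean_sequence(a, b, n_terms):
    """First n_terms of the recurrence a_{n+2} = (a_{n+1} + a_n) / 2."""
    terms = [a, b]
    while len(terms) < n_terms:
        terms.append((terms[-1] + terms[-2]) / 2)
    return terms[:n_terms]

a, b = 10, 100
terms = mean_sequence(a, b, 20)
limit = a / 3 + 2 * b / 3  # the claimed limit L = a/3 + 2b/3

# Plot the terms together with the limiting value.
plt.plot(range(len(terms)), terms, "o-", label="$a_n$")
plt.axhline(limit, linestyle="--", color="red", label=f"L = {limit:.2f}")
plt.xlabel("n")
plt.ylabel("$a_n$")
plt.legend()
plt.show()
```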

[Figure: the terms $a_n$ for $a = 10$, $b = 100$, converging to the limit $70$]

In this case (depicted in the figure) the limit is $$\frac{1}{3} \cdot 10 + \frac{2}{3} \cdot 100 = \frac{1}{3} (10 + 200) = \frac{210}{3} = 70$$

