Methods of computing square roots
munchkin86
So we know the basic approximation in computing square roots is
√(a^2 + b) ≈ a + b/(2a)
Prove that
a ± b/(2a) > √(a^2 ± b)

Then use this repeatedly to get
√2 ≈ 1 + 1/3 + 1/12 - 1/408
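
One way to see the inequality (a sketch, assuming a > 0, b > 0, and b < a^2 when the minus sign is taken): squaring the left-hand side leaves an extra strictly positive term,

\[
\left(a \pm \frac{b}{2a}\right)^{2} = a^{2} \pm b + \frac{b^{2}}{4a^{2}} > a^{2} \pm b .
\]

Since both sides of the claimed inequality are positive under these assumptions, taking square roots gives a ± b/(2a) > √(a^2 ± b).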

Hint: the first application (with a = 1, b = 1) gives 3/2 > √2. Then 2 = (3/2)^2 - 1/4, so a second application gives 3/2 - 1/12 > √2. Since 2 = (17/12)^2 - 1/144, a third application gives the required result.
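
A quick numeric check of the hint, sketched in Python (the helper name refine and the starting value are illustrative choices, not from the post); it repeats the approximation with exact fractions and reproduces 3/2, 17/12 and 577/408 = 1 + 1/3 + 1/12 - 1/408:

from fractions import Fraction

def refine(a, n):
    # One application of sqrt(a^2 + b) ≈ a + b/(2a), where b = n - a^2.
    # b may be negative, which is the "minus" case of the approximation.
    b = n - a * a
    return a + b / (2 * a)

x = Fraction(1)          # first application uses a = 1, b = 1
for _ in range(3):
    x = refine(x, 2)
    print(x)             # prints 3/2, then 17/12, then 577/408

# 577/408 is exactly 1 + 1/3 + 1/12 - 1/408
print(x == 1 + Fraction(1, 3) + Fraction(1, 12) - Fraction(1, 408))   # True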