Calculating the Square Root of a Number using the Newton-Raphson Method [A How To Guide] by@suraj-regmi

January 18th 2020

Data Scientist, The World Bank -- the views and content here are my own and not my employer's.

Given the total number of students in a school, if you want to know whether they all fit in the assembly ground, you need to know the minimum number of lines to be formed. Given the dimensions of a door, you need to know how big a sheet of plywood can pass through it. You can’t do these without one thing: the square root. Whether you are finding the square root of a single number or of a sum of squares, a function (or command) to compute the square root is needed.

**Easy?**

OK, easy. We have built-in functions (or commands/buttons) in our calculators, computers, programming languages, mobiles and everywhere else to calculate the square root. Yes, it can be done in a straightforward way. But what if you are in a place where electricity is not available and your gadgets are dead?

**Help me …**

Yes, I am rescuing you from that situation if the day comes. I am bringing back your old friend Newton (and Raphson too), whom you loved a lot during your school days. Probably some of you also hated seeing his name everywhere: from the math textbook to the physics textbook, from classical mechanics to heat and thermodynamics, and from optics to cubics. I remember his name in the GK book too.

Here is the recipe:

1. Take a reasonable guess (approximate root) for the square root.
2. Average the approximate root with the original number divided by the approximate root:
*x_i := (x_i + n / x_i) / 2*
3. Repeat step 2 until the change in the approximate root between iterations is less than the desired precision.
4. The approximate root is the square root we want.

**Whiteboard demonstration**

Let’s see it on the whiteboard now with *n = 4*.

*Fig: Newton method for calculating the square root of 4*

It is well known that there are two square roots; we ignore the negative one here. The negative square root can be calculated easily by taking a first approximation close to it.
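To make the whiteboard demo concrete, here is a minimal sketch of the same iterations in Python (the starting guess *x = 4* is an assumption, chosen to mirror the demo):

```
# Trace the Newton iterations for N = 4, starting from the guess x = 4.
N = 4
x = 4.0  # first approximation (assumed starting guess)
for i in range(5):
    x = (x + N / x) / 2  # Newton update: average x and N / x
    print(i + 1, x)
```

The iterates shrink toward 2 very quickly; a handful of steps already gives many correct digits.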

So, this is it, the recipe for the rescue, but I don’t stop here, as my computer science, Python, and mathematics loving readers are expecting to see more. Let’s start with the Python code.

```
def mySqrt(x):
    """Square root of x using the Newton-Raphson iteration."""
    r = x  # initial guess for the root
    precision = 10 ** (-10)
    while abs(x - r * r) > precision:
        r = (r + x / r) / 2  # average r and x / r
    return r
```

Why don’t you implement it in your system?
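If you want a quick sanity check, the function can be compared against Python’s built-in math.sqrt; the snippet restates mySqrt so it runs on its own:

```
import math

def mySqrt(x):
    r = x
    precision = 10 ** (-10)
    while abs(x - r * r) > precision:
        r = (r + x / r) / 2
    return r

# Compare against the standard library for a few inputs.
for n in [2, 7, 100]:
    assert abs(mySqrt(n) - math.sqrt(n)) < 1e-6
```

One caveat: the loop uses a fixed absolute tolerance on x - r * r, so for very large inputs that tolerance may be unreachable in floating point; a relative tolerance would be safer in practice.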

Newton’s method, also known as the Newton-Raphson method, is a root-finding algorithm that produces successively better approximations of the roots of a real-valued function. The approximations of the root go as:

*x_(n+1) = x_n - f(x_n) / f’(x_n)*

*x_0* is the rough approximation of the root made at first, and the successive approximations go as *x_1, x_2, ….*

*f* is the function whose root is to be determined, and *f’* is its derivative; *f(x_n)* and *f’(x_n)* are their values at *x_n*.

**Description**

For a close enough approximation *x_n*, the better approximation of the root typically can be found by drawing the tangent to the graph at *x_n* and finding where the tangent intersects the *x*-axis.

The equation of a line with slope *m* passing through *(x_1, y_1)* would be:

*y - y_1 = m (x - x_1)*

Here, the tangent is the line, the derivative *f’(x_n)* is the slope, and *(x_n, f(x_n))* is the point.

So, *y - f(x_n) = f’(x_n)(x - x_n)*

As the better approximation will be the *x*-intercept of the tangent, put *y = 0* and solve for *x*.

*-f(x_n) = f’(x_n)(x - x_n)*

*or, x - x_n = -f(x_n) / f’(x_n)*

*or, x = x_n - f(x_n) / f’(x_n)*

This is Newton’s method for approximating the root of a function, *f(x).*
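The general recipe can be sketched as a small Python routine; the names (newton, fprime) and the default tolerance are my own choices, not from the original:

```
def newton(f, fprime, x0, tol=1e-10, max_iter=100):
    """Newton-Raphson: repeat x := x - f(x) / f'(x) until the step is tiny."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x = x - step
        if abs(step) < tol:
            break
    return x

# Example: root of f(x) = x^2 - 4, starting near the positive root.
root = newton(lambda x: x * x - 4, lambda x: 2 * x, 3.0)
```

Passing the derivative explicitly keeps the routine generic: the same loop finds roots of any differentiable function, not just x² - N.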

Let’s see now if we can come up with the algorithm provided above using the general formula.

**Newton’s method for square root**

If we have to find the square root of a number *N*, the function would be *f(x) = x² - N*, and we would have to find the root of the function *f(x)*.

Here, the value of *f* at *x = x_n* is:

*f(x_n) = x_n² - N*

And, the derivative at the point is:

*f’(x_n) = 2 * x_n*

Now, the better approximation can be found using Newton’s formula above.

*x_(n+1) = x_n - (x_n² - N) / (2 * x_n)*

*x_(n+1) = x_n - x_n² / (2 * x_n) + N / (2 * x_n)*

*x_(n+1) = x_n - x_n / 2 + N / (2 * x_n)*

*x_(n+1) = x_n / 2 + N / (2 * x_n)*

*x_(n+1) = (x_n + N / x_n) / 2*

This is how the algorithm for finding square root of a number comes.

The integer square root of a number is the floor of the square root. The algorithm can be modified a little to find it. The *while* condition here would be *approximate * approximate > N*: the algorithm terminates when the approximate squared is less than or equal to *N*.

The iteration relation here is:

*x_(n+1) = (x_n + N // x_n) // 2*

where // is integer division.
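Putting the modified condition and the integer iteration together gives a minimal sketch (the name int_sqrt and the starting guess x = N are assumptions):

```
def int_sqrt(N):
    """Integer square root: the largest x with x * x <= N."""
    if N < 0:
        raise ValueError("N must be non-negative")
    x = N  # starting guess; any value >= sqrt(N) works
    while x * x > N:
        x = (x + N // x) // 2
    return x
```

Because only integer operations are used, this works for arbitrarily large N without floating-point rounding trouble.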

**Proof of correctness**

First, let’s prove the *correctness for the while condition*.

The iteration relation is:

*x_(n+1) = (x_n + N // x_n) // 2*

It can be written as:

*x_(n+1) = floor((x_n + N / x_n) / 2)*

For *a ≥ 0* and *b ≥ 0*, *a + b ≥ 2 * sqrt(a * b)* (the AM-GM inequality).

Taking *a = x_n* and *b = N / x_n*:

*x_(n+1) ≥ floor(2 * sqrt(x_n * (N / x_n)) / 2)*

*or, x_(n+1) ≥ floor(sqrt(N))*

*So, x_(n+1) ≥ intsqrt(N)*

Therefore, the value of the approximation never goes below the value of *intsqrt(N)*.

Now, let’s prove that *the approximation is monotonically decreasing*.

Let’s find the difference *x_(n+1) - x_n.*

*x_(n+1) - x_n = (x_n + N // x_n) // 2 - x_n*

As *N // x_n* is less than or equal to *x_n* (based on the while condition), *(x_n + N // x_n) // 2 ≤ (2 * x_n) // 2*

So, *x_(n+1) - x_n ≤ (2 * x_n) // 2 - x_n*

As *x_n* is an integer, *(2 * x_n) // 2 = x_n.*

So, *x_(n+1) - x_n ≤ 0*

Hence, the sequence *{x_n}* is monotonically decreasing.

The values *x_(n+1)* and *x_n* are equal only when *N // x_n* is equal to *x_n*, and that is when we have found the solution and the while loop stops. In all other cases, the sequence *{x_n}* is strictly decreasing.

Hence, the correctness of the algorithm to calculate the integer square root is proved.
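The proved properties can also be checked empirically. This sketch (newton_int_sequence is a hypothetical helper name) records every iterate and asserts the monotonic decrease, the intsqrt(N) lower bound, and the final answer, using Python’s math.isqrt as the reference:

```
import math

def newton_int_sequence(N):
    """Return all integer Newton iterates for N, starting from the guess N."""
    x, seq = N, [N]
    while x * x > N:
        x = (x + N // x) // 2
        seq.append(x)
    return seq

for N in range(1, 2000):
    seq = newton_int_sequence(N)
    assert all(a >= b for a, b in zip(seq, seq[1:]))  # monotonically decreasing
    assert all(x >= math.isqrt(N) for x in seq[1:])   # never below intsqrt(N)
    assert seq[-1] == math.isqrt(N)                   # terminates at the answer
```

Such a brute-force check is no substitute for the proof, but it is a cheap way to catch an off-by-one slip in the iteration.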

Numerical analysis is the field of algorithms that use numerical approximation for problems of mathematical analysis. Ranging from engineering and the physical sciences to the life sciences, social sciences, medicine, business, and the arts, scientific computation is applied everywhere. The rise in computing power has encouraged and enabled the use of realistic mathematical models in science and engineering, and numerical analysis is required for their detailed implementation.

This is an example of numerical analysis, where we use Newton’s method to calculate the root of the function. There are two things that excite me about numerical analysis.

- Complex functions and equations that would otherwise be unsolvable can be solved.
- The algorithms can be programmed and automated making the solution easier to implement, reproduce and package.

Did you find *numerical analysis* interesting too? If so, comment with your favorite numerical analysis algorithm.