
The Fibonacci Sequence to the nᵗʰ number (Python & JavaScript) by@ethan.jarrell

March 9th 2018 9,065 reads

Programmers often use coding problems to sharpen their skills, test their knowledge, or prepare for technical interviews. Many of these problems are math-based, and one of the most common types deals with the Fibonacci sequence.

Each new term in the Fibonacci sequence is generated by adding the previous two terms. So, for example, starting with 1 and 2, the first 10 numbers in the sequence would be:

1, 2, 3, 5, 8, 13, 21, 34, 55, 89…

One of my favorite challenges involving the Fibonacci sequence asks for the index of some arbitrarily large number in the sequence. As an example, let’s say we want to find the index of the first Fibonacci number that contains 1000 digits.

Normally, an easy way to do something like this would be to put all of the numbers in an array and then cycle through them with a for loop. In the for loop, once we reach a number with 1000 digits, we would simply grab its index or the number itself. Our code could look something like this:
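The original snippet isn’t reproduced here, but a sketch of that array-and-for-loop approach in Python might look like the following (the batch size of `5000` and the helper name `first_fib_with_digits` are my own illustrative choices, not from the article):

```python
def first_fib_with_digits(target_digits):
    # Step 1: generate a large batch of Fibonacci numbers up front.
    # 5000 terms is an arbitrary guess at "enough".
    fibs = [1, 2]
    while len(fibs) < 5000:
        fibs.append(fibs[-1] + fibs[-2])

    # Step 2: cycle through the array looking for the first match.
    for i, n in enumerate(fibs):
        if len(str(n)) >= target_digits:
            return i, n  # index and the number itself

index, number = first_fib_with_digits(1000)
print(index)
```

This works, but it builds and keeps the entire list in memory just to find one entry.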

Now, this isn’t a particularly good idea, for a couple of reasons. First, it requires two different steps: one to generate the Fibonacci sequence, and a second to cycle through all the numbers we’ve generated. It’s also fairly memory-intensive. As a slightly better alternative, we could generate the sequence inside a while loop and end the loop as soon as we reach a number with a length of 1000 digits. And if all we want is the number or its index, we don’t actually have to keep an array of any of the other numbers.

Here’s how we could use this type of solution in both Python and JavaScript:

We’ll start off by creating our variables, which will represent the numbers in the Fibonacci sequence:

```python
a = 1
b = 1
c = a + b
```

Next, we’ll create two more variables: one for a `count`, and another for `length`. Since we don’t actually want to create a list or an array, I’ll use `count` to increment with each value. That way, I’m storing one number, which represents the index value, rather than an array that I then have to cycle through to get the index value. The `length` variable is what I’ll use to end the while loop. I’ll give it an initial value with a length of 3. Then I’ll find the value of each new Fibonacci number, and if the current number is longer than the `length` variable, I’ll set `length` to the new number. My while loop condition will be that the length of `length` is less than or equal to `999`. Here are my two variables:

```python
count = 0
length = 'aaa'  # placeholder string of length 3; will hold each new largest number as a string
```

And here’s the while loop:

```python
while len(length) <= 999:
    a = b
    b = c
    c = a + b
    count = count + 1
    str1 = str(c)
    if len(str1) > len(length):
        length = str1
```

At this point, the while loop will end once I find the first 1000-digit number. If I want the index of that number, I can print the variable `count`. Or if I want the number itself, I can print `length`, which is storing that number as a string.

```python
print(length)
print(count)
```

[Finished in 0.09s]

Pretty cool, right?

But what about JavaScript? Well, now you’re screwed. Technically, we could use the same approach that we used in our first example. Or we could change it up and do something similar to what I described not to do earlier. That code might look like the following:

```javascript
let x = [];
let a = 1;
let b = 1;
let c = a + b;

// Generate a large batch of Fibonacci numbers up front.
while (x.length < 5000) {
  a = b;
  b = c;
  c = a + b;
  x.push(c);
}

// Scan for a 1000-digit number. Numbers don't have a .length
// property, so convert each one to a string first.
let val = [];
for (let i = 0; i < x.length; i++) {
  let str1 = String(x[i]);
  if (str1.length === 1000) {
    val.push([x[i], i]);
  }
}

console.log(val);
```

However, either way, you will quickly run into a problem. Anything longer than about 15 or 16 digits won’t usually be processed correctly. This is because JavaScript represents numbers using the IEEE 754 double-precision (64-bit) format. This gives you 53 bits of precision, or fifteen to sixteen decimal digits. All numbers in JavaScript are floating point, which means that integers are always represented as

sign × mantissa × 2^exponent

The mantissa has 53 bits. You can use the exponent to get higher integers, but then they won’t be contiguous any more. For example, you generally need to multiply the mantissa by two (exponent 1) in order to reach the 54th bit. However, if you multiply by two, you will only be able to represent every second integer:

```javascript
> Math.pow(2, 53)  // 54 bits
9007199254740992
> Math.pow(2, 53) + 1
9007199254740992
> Math.pow(2, 53) + 2
9007199254740994
> Math.pow(2, 53) + 3
9007199254740996
> Math.pow(2, 53) + 4
9007199254740996
```

Rounding effects during the addition make things unpredictable for odd increments (+1 versus +3). The actual representation is a bit more complicated [1], but this explanation should help you understand the basic problem.
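For contrast, a quick Python sketch shows why the same arithmetic is safe there: Python integers are arbitrary precision, so nothing gets rounded no matter how large the numbers grow.

```python
# Python ints are arbitrary precision, so arithmetic past 2**53
# stays exact -- unlike JavaScript's 64-bit doubles.
big = 2 ** 53
print(big + 1)  # 9007199254740993 (not rounded back down to ...992)
print(big + 3)  # 9007199254740995

# A 1000-digit Fibonacci number is likewise computed exactly:
a, b = 1, 2
while len(str(b)) < 1000:
    a, b = b, a + b
print(len(str(b)))  # prints 1000
```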

If you come across a similar type of problem, especially in a technical interview, you are probably better off attempting it in something other than JavaScript. And to sound even more intelligent, you can solve it in Python and then explain how JavaScript handles integers.
