Fibonacci Recursive Algorithm

by Brian Kang


The Fibonacci sequence algorithm is one of my favorites as it is a great introduction to understanding and visualizing recursion.

Let’s talk about this sequence and how we can solve it both iteratively and recursively!


The Fibonacci sequence is defined as F(0) = 0, F(1) = 1, and F(n) = F(n-1) + F(n-2) for n>=2. So the sequence, starting with F(0), is 0, 1, 1, 2, 3, 5, 8 and so on…

Computing a single value of the sequence, e.g. F(n), can be done in a variety of ways, some faster than others. Today we will focus on solving it two different ways: recursively and iteratively.

What is Recursion

Recursion is the process in which a function calls itself until a condition is met. In other words, a function is recursive if, somewhere within its body, it calls itself, thereby restarting from the beginning, perhaps with a different set of inputs than before. There are three required components for effective recursion:

1. Base Case: When a function can determine (and return) an answer immediately, this is a base case.

2. Forward Progress: When a function cannot solve a problem but can narrow the range of possibilities. For recursion to be effective, you must make at least a little forward progress in every possible case.

3. Calling back into itself as if it were the original problem: Just like we talked about above, the key part of recursion is that a function invokes itself in order to narrow things down. This is simply called a recursive call.
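The three components above can be seen in a classic recursive function. This is my own illustrative example, not one from the original post, using the familiar factorial function:

```python
def factorial(n):
    # 1. Base case: we can answer 0! immediately, so return right away.
    if n == 0:
        return 1
    # 2. Forward progress: shrink the problem from n down to n - 1.
    # 3. Recursive call: invoke ourselves on the smaller problem.
    return n * factorial(n - 1)
```

Every call either hits the base case or makes forward progress toward it, which is exactly what keeps the recursion from running forever.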

What is Iteration

An iterative approach repeats a process until a condition fails, specifically through the use of loops such as while and for. A certain set of statements is repeated a certain number of times, as opposed to recursion, where a function calls itself until a condition is met. We will first look at solving the Fibonacci sequence iteratively, through the use of a simple loop.

This may be the most logical way to think of the solution once presented with the problem. As the definition of the sequence implies, in this implementation we simply calculate the next number by adding the current number to the previous one: F(n) = F(n-1) + F(n-2).
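A minimal sketch of that iterative approach might look like this (the function name is my own choice; the original code was not preserved in this text):

```python
def fibonacci_iterative(num):
    # Start from F(0) = 0 and F(1) = 1, per the definition above.
    previous, current = 0, 1
    for _ in range(num):
        # F(n) = F(n-1) + F(n-2): slide the window forward one step,
        # so `previous` always holds F(i) after i iterations.
        previous, current = current, previous + current
    return previous
```

Each pass through the loop advances the pair one position along the sequence, so after num iterations we have F(num) in hand.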

As for time complexity, this approach is pretty solid at O(n) so we can be happy with our pretty optimal solution. Now let’s check out the recursive approach.

Recursive Solution

Through recursion, we will have the function call itself, simplifying much of what is going on on the surface.
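A sketch of the recursive version, reconstructed from the definition and the `fibonacci(num - 1) + fibonacci(num - 2)` call quoted later in this post (the original code block did not survive in this text):

```python
def fibonacci(num):
    # Base case: F(0) = 0 and F(1) = 1, so for num <= 1 the answer
    # is simply num itself.
    if num <= 1:
        return num
    # Recursive calls implementing F(n) = F(n-1) + F(n-2).
    return fibonacci(num - 1) + fibonacci(num - 2)
```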

Looks much simpler! But what is really going on behind the scenes?

We have set up the base case so that while num is greater than 1, we hold off on returning anything. However, once num <= 1, we return num itself (since F(0) = 0 and F(1) = 1), allowing us to resolve the remaining calls 'on hold' one by one. To give an example, imagine you are standing in a line with an unknown number of people (n) in front of you. You can ask the person directly in front of you, but they won't have any idea either. This has to keep going until the question reaches the person at the very front. Since there is nobody in front of them, they know for certain that the line length, from their point of view, is 1: this is the base case returning an answer immediately.

Once this is figured out, they can relay the information to the person behind them, who takes that number and adds one for themselves before passing it back again. This relay of information represents both the forward progress and the recursive call we talked about before. Finally, once the information reaches you at the back, everyone in front of you is accounted for. All that's left is to add yourself, and you now have the length of the entire line!

By calling the function twice through fibonacci(num - 1) + fibonacci(num - 2), we are essentially recreating the relay of information while implementing the F(n) = F(n-1) + F(n-2) definition of the sequence, neat!

It’s not all roses

Unfortunately, our neat little 2 line solution is not without its drawbacks. The time complexity of this algorithm has increased considerably and is now exponential, O(2^n). Far from ideal. Even without going into optimizing this solution using memoization, we can see some of the major tradeoffs between solving iteratively and solving recursively.
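To see that blow-up concretely, here is a small instrumented sketch (my own addition, not from the original post) that counts how many times the naive recursion invokes itself:

```python
def fibonacci_counted(num, counter):
    # Tally every invocation so we can watch the call count explode.
    counter[0] += 1
    if num <= 1:
        return num
    return (fibonacci_counted(num - 1, counter)
            + fibonacci_counted(num - 2, counter))

def count_calls(num):
    counter = [0]
    fibonacci_counted(num, counter)
    return counter[0]
```

Computing F(10) this way takes 177 calls, while F(20) takes 21,891: each extra step roughly multiplies the work, which is exactly what exponential growth looks like in practice.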

But wait there’s more

Of course there’s more! That’s the beauty of it. Next chapter we will be going over dynamic programming and perhaps some matrix exponentiation and fast doubling when we dive deeper into time complexity!

Thanks for reading! If you have any questions about anything above, please don't hesitate to reach out!