Space complexity of recursive algorithms

I’m following along with an online course, and one of the topics it touched on was time and space complexity. They solve LeetCode problems both iteratively and recursively in Python and then calculate the complexities of each implementation.
They explained that coding recursively is usually more space consuming, since every recursive call gets added to the call stack. But the book Learn Functional Programming with Elixir, in the chapter on recursion, says this about Tail-Call Optimization:

Tail-call optimization is when the compiler reduces functions in memory without allocating more memory. It’s a common compiler feature in functional programming languages. To use it, we need to ensure that the last expression of our function is a function call. If the last expression is a new function call, then the current function’s return is the return of the new function call and it doesn’t need to keep the current function in memory.

Does this mean that in most functional languages, at least when using tail-call optimization, the space complexity of an algorithm shouldn’t be that much worse than that of the iterative version in an imperative language?
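
If I’m reading that right, it’s describing something like this (a toy sketch I wrote, not from the course or the book): the recursive call is the last expression, so nothing else is waiting on the current frame:

defmodule RecursiveSum do
  # Entry point: start the accumulator at 0.
  def sum(list), do: sum(list, 0)

  # Base case: the accumulated total is the result.
  defp sum([], acc), do: acc

  # Tail call: nothing happens after sum(rest, acc + n),
  # so the current stack frame can be reused.
  defp sum([n | rest], acc), do: sum(rest, acc + n)
end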

It depends on the values being recursed over and whether they can be stored on the stack. Some values are stored on the heap, so even when no stack frame points to them any more (because the frame has been replaced), the value on the heap is still there until it gets garbage collected. What can be allocated on the stack is usually small values, like small strings and numbers.


This seems orthogonal. The issue with recursion in languages like Python is that, regardless of whether a function’s values live on the stack or on the heap, each recursive call adds a stack frame. If it’s a GCed language (Python is), there is exactly the same issue with accumulating garbage. In fact it’s arguably worse, since there is a live stack frame pointing to the value, whereas in Elixir those frames are replaced and the value can be garbage collected.

The BEAM implements not only tail recursion but actually “last call elimination”: if the last thing a function does is call a local function, the stack frame of the new call replaces the current stack frame. This allows mutual recursion, where two functions recursively call each other, without the stack growing.
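
A rough sketch of what that enables (my own toy example): two mutually recursive functions where each one’s last expression is a call to the other, so the stack stays flat no matter how large the input is:

defmodule Parity do
  # Each clause’s last expression is a call to the other function,
  # so the BEAM replaces the current frame instead of stacking a new one.
  def even?(0), do: true
  def even?(n) when n > 0, do: odd?(n - 1)

  def odd?(0), do: false
  def odd?(n) when n > 0, do: even?(n - 1)
end

Calling Parity.even?(10_000_000) completes in constant stack space.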

Therefore:

Yes, in fact it can be identical to the iterative version.

The thing is, though, that even non-tail-recursive versions can have the same space complexity. Consider this simple function:

def multiply_by_2([]), do: []
def multiply_by_2([n | rest]), do: [n * 2 | multiply_by_2(rest)]

This is a body-recursive function, and therefore it will accumulate stack frames; its space complexity is O(n). The thing is, the space complexity of the iterative version is… also O(n), because the output itself is a list of n elements. Now, because of immutable data, the Elixir version may generate more garbage, but honestly that depends entirely on the underlying implementation of lists/arrays in the language.
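
For comparison, here is a tail-recursive sketch of the same function (names made up by me): the stack stays constant, but the result list still holds n elements, so the overall space complexity doesn’t change:

defmodule DoublerTail do
  def multiply_by_2(list), do: do_multiply(list, [])

  # The recursive call is the last expression, so the frame is reused;
  # the accumulator is built in reverse and flipped at the end.
  defp do_multiply([], acc), do: Enum.reverse(acc)
  defp do_multiply([n | rest], acc), do: do_multiply(rest, [n * 2 | acc])
end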
