Description
Currently, the interpreter's call mechanism is coupled to the host environment's call stack because of its recursive implementation. The very same issue exists in CPython, which eventually led to the development of Stackless Python.
The interpreter's main process may be viewed as a state machine. When entering a function, the state machine changes its state and then continues the main loop. Calling Rust functions during a state change is perfectly possible, but continuing the main loop should not involve recursion.
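As a rough illustration of this state-machine loop, consider the following minimal sketch, which assumes a bytecode-style design; all names (`Inst`, `Frame`, `run`) are hypothetical and not the interpreter's actual API. A call pushes the current frame onto an explicit frame stack and then continues the same loop, rather than recursing on the host stack.

```rust
#[derive(Clone, Copy)]
enum Inst {
    PushConst(i64),
    Call(usize), // index into the function table
    Return,
}

struct Frame {
    ip: usize,  // instruction pointer within the current function
    fun: usize, // which function this frame executes
}

fn run(funcs: &[Vec<Inst>], entry: usize) -> Option<i64> {
    let mut frames: Vec<Frame> = Vec::new(); // explicit frame stack
    let mut stack: Vec<i64> = Vec::new();    // object stack
    let mut frame = Frame { ip: 0, fun: entry };
    loop {
        let inst = funcs[frame.fun][frame.ip];
        frame.ip += 1;
        match inst {
            Inst::PushConst(x) => stack.push(x),
            Inst::Call(f) => {
                // State change: suspend the caller, enter the callee and
                // continue the same loop instead of recursing on the host stack.
                frames.push(frame);
                frame = Frame { ip: 0, fun: f };
            }
            Inst::Return => match frames.pop() {
                Some(caller) => frame = caller,
                None => return stack.pop(),
            },
        }
    }
}

fn main() {
    // funcs[1] pushes 42 and returns; funcs[0] calls it and returns.
    let funcs = vec![
        vec![Inst::Call(1), Inst::Return],
        vec![Inst::PushConst(42), Inst::Return],
    ];
    println!("{:?}", run(&funcs, 0)); // Some(42)
}
```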
By avoiding recursion you gain complete control over stack consumption. For example, if a continuation never returns, except by breaking the process, the object/frame stack does not need to be consumed. This allows for infinite recursion, but enforces continuation passing style.
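To make the point about non-returning continuations concrete, here is a small, self-contained trampoline sketch in continuation passing style (the names `Step`, `sum` and `trampoline` are illustrative assumptions, not part of the interpreter). The driver loop invokes each continuation itself, so the host call stack stays at a constant depth no matter how deep the logical recursion goes.

```rust
enum Step {
    Done(u64),
    Continue(Box<dyn FnOnce() -> Step>),
}

/// Sum 1..=n in continuation passing style: the "recursive call" is handed
/// back as a continuation instead of being performed on the host stack.
fn sum(n: u64, acc: u64) -> Step {
    if n == 0 {
        Step::Done(acc)
    } else {
        Step::Continue(Box::new(move || sum(n - 1, acc + n)))
    }
}

fn trampoline(mut step: Step) -> u64 {
    loop {
        match step {
            Step::Done(x) => return x,
            Step::Continue(k) => step = k(), // constant host-stack depth
        }
    }
}

fn main() {
    // Plain recursion of this depth would overflow the host call stack.
    println!("{}", trampoline(sum(1_000_000, 0)));
}
```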
It is also possible to enable dynamic stack growth, for example by giving access to `&mut Vec<Object>` instead of `&mut [Object]`. This might involve an additional pointer indirection, but not necessarily, since `(pointer, len)` is merely replaced by `(pointer, len, capacity)`. Note that in order to enable dynamic stack growth, the object stack's slices must be represented by indices and not by `(pointer, len)`; otherwise you will be stuck with mutable borrow aliasing, which is forbidden precisely to keep you from accessing dangling pointers that were not redirected to the reallocated buffer. A mutable stack may be split by `split_at_mut`, but in this situation the problem is already present.
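The following sketch illustrates the index-based representation, assuming some `Object` type; `Env` and `Window` are hypothetical names. A frame refers to its window of the object stack by base index and length, so the underlying `Vec<Object>` may reallocate on growth without invalidating the window.

```rust
#[derive(Clone, Debug)]
enum Object { Null, Int(i64) }

struct Env {
    stack: Vec<Object>, // the growable object stack
}

/// A frame's view of the stack: base index and length, not a borrowed slice.
struct Window { base: usize, len: usize }

impl Env {
    fn push(&mut self, x: Object) {
        // The Vec may reallocate here; indices remain valid,
        // raw (pointer, len) slices would not.
        self.stack.push(x);
    }
    fn get(&self, w: &Window, i: usize) -> &Object {
        assert!(i < w.len);
        &self.stack[w.base + i]
    }
}

fn main() {
    let mut env = Env { stack: Vec::new() };
    env.push(Object::Null);
    env.push(Object::Int(1));
    let w = Window { base: 0, len: 2 };
    env.push(Object::Int(2)); // growth does not invalidate `w`
    println!("{:?}", env.get(&w, 1)); // Int(1)
}
```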
By limiting the number of local variables, it becomes possible to store the mini object stacks inside a stack frame. Placing a `Vec<Object>` instead of an `[Object; N]` inside a stack frame again allows an unlimited number of local variables. Be aware that fixed-size object stacks might interfere with the way you implement list/tuple unpacking: unpacking into an object stack then becomes impossible, as the list/tuple may be of unlimited length.
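As a purely illustrative comparison, assuming some `Object` type and an arbitrary bound `N`, the two frame layouts might look like this. The fixed variant bounds the number of locals by `N` but stores them inline; the growable variant lifts that bound at the cost of a heap allocation per frame.

```rust
#[derive(Clone, Copy, Debug)]
enum Object { Null, Int(i64) }

const N: usize = 8;

/// Mini object stack of fixed size, stored inline in the frame.
struct FixedFrame {
    locals: [Object; N],
    sp: usize, // top of the mini stack
}

/// Growable mini object stack: unlimited locals, heap-allocated.
struct GrowableFrame {
    locals: Vec<Object>,
}

fn main() {
    let fixed = FixedFrame { locals: [Object::Null; N], sp: 0 };
    let mut growable = GrowableFrame { locals: Vec::new() };
    growable.locals.push(Object::Int(1)); // can keep growing
    println!("{:?} {} {}", fixed.locals[0], fixed.sp, growable.locals.len());
}
```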
Another construction is to give each process its own stack, or an index to its own stack. This construction presupposes growable stacks, but it enables general cooperative multitasking threads. It is closely linked to coroutines/yield and async/await: imagine a coroutine that can yield even during subroutine calls that consume the call stack. Note that for a basic coroutine such a construction is not necessary, as the coroutine can allocate its own fixed storage for its local variables at creation time. Nonetheless, a dedicated stack per process is somewhat desirable, because it permits yielding inside of an expression, i.e. while the call stack is being consumed.
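A hedged sketch of the per-process-stack construction follows; everything in it (the names, the toy "program" that merely pushes and pops frames) is an assumption for illustration. The point is only that, because each process owns its frame stack and object stack, the scheduler can suspend it between any two steps, even while nested calls are still on that stack.

```rust
struct Frame { ip: usize }

struct Process {
    id: usize,
    frames: Vec<Frame>, // this process's own call stack
    stack: Vec<i64>,    // this process's own object stack
    calls_to_make: usize,
}

impl Process {
    /// Execute one step; return `true` once the process has finished.
    fn step(&mut self) -> bool {
        if self.calls_to_make > 0 {
            // Enter another nested call; the scheduler may switch to another
            // process after this step, while the call is still open.
            self.frames.push(Frame { ip: 0 });
            self.calls_to_make -= 1;
        } else if let Some(frame) = self.frames.pop() {
            // Unwind one call and leave a value on the object stack.
            self.stack.push(frame.ip as i64);
        } else {
            return true;
        }
        false
    }
}

fn main() {
    let mut procs = vec![
        Process { id: 0, frames: Vec::new(), stack: Vec::new(), calls_to_make: 2 },
        Process { id: 1, frames: Vec::new(), stack: Vec::new(), calls_to_make: 3 },
    ];
    // Cooperative round-robin scheduler: one step per process per turn.
    while !procs.is_empty() {
        procs.retain_mut(|p| {
            let done = p.step();
            if done { println!("process {} finished", p.id); }
            !done
        });
    }
}
```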