At one point in time, there was no way to pass RETURN something that represented a VOID, because voids completely vanished: if a function took an argument and a void thing came after it, the evaluator would just keep retriggering until it found a value or hit the end.
This meant the only way to return a void was to have truly nothing after the RETURN. So RETURN became effectively arity-0-or-1: if you passed it no argument, it would consider itself VOID. It even had the nice property of letting you chain functions that might themselves be void.
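As a toy model of this (in Python, with all names hypothetical; this is a sketch of the idea, not the actual evaluator), an arity-0-or-1 RETURN can be imitated by peeking at the token stream: it takes an argument only if something follows it, and otherwise yields void.

```python
# Toy model of an arity-0-or-1 RETURN: it consumes the next expression
# only if one exists; otherwise it yields VOID.  (Hypothetical names --
# a sketch of the concept, not Ren-C's real evaluator.)

VOID = object()  # sentinel standing in for "no value at all"

def evaluate(tokens):
    """Evaluate a flat token list; 'return' grabs the next value if any."""
    i = 0
    while i < len(tokens):
        tok = tokens[i]
        if tok == "return":
            # Variadic: take an argument only if something follows.
            if i + 1 < len(tokens):
                return tokens[i + 1]
            return VOID  # truly nothing after it => RETURN is void
        i += 1
    return VOID

assert evaluate(["return", 42]) == 42  # arity-1 use
assert evaluate(["return"]) is VOID    # arity-0 use: void
```

The same peek-ahead trick is what lets a chain of possibly-void functions collapse gracefully: each one defers to whatever value (if any) comes next.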
Given that RETURN was doing this, some other functions followed suit. QUIT could take an argument or not. CONTINUE could as well.
But I Just Got Bit By a Variadic QUIT Bug
Without thinking about it, I tried to end some code prematurely by injecting a quit:
    some stuff I wanted to run
    quit  ; added this
    some stuff I wanted to avoid running
And that QUIT ran the stuff I didn't want to run anyway, because it was variadic.
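To see why, here is a toy illustration (a hypothetical Python evaluator, not Ren-C itself): a variadic QUIT reaches ahead for an argument, and "the stuff I wanted to avoid running" *is* that argument, so it gets evaluated anyway before the quit takes effect.

```python
# Toy illustration of the variadic-QUIT bug: 'quit' slurps the NEXT
# step as its argument, evaluating it in the process.  (Hypothetical
# model, not the real evaluator.)

log = []

def wanted():
    log.append("wanted")

def unwanted():
    log.append("unwanted")   # this should never run after QUIT...
    return "exit value?"

class Quit(Exception):
    def __init__(self, value=None):
        self.value = value

def run(steps):
    """Evaluate thunks in order; a variadic 'quit' consumes the next
    step as its argument if one follows."""
    i = 0
    while i < len(steps):
        step = steps[i]
        if step == "quit":
            if i + 1 < len(steps):
                # Variadic slurp: the following expression becomes
                # QUIT's argument -- so it runs even though we meant
                # to stop right here.
                raise Quit(steps[i + 1]())
            raise Quit()
        step()
        i += 1

try:
    run([wanted, "quit", unwanted])
except Quit:
    pass

assert log == ["wanted", "unwanted"]  # the "avoid" code ran anyway
```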
My Kneejerk Reaction Was To Kill The Variadicness
The original rationale for RETURN's behavior has changed, because so-called "non-interstitial invisibility" is dead: you can only make expressions void in their totality, not when they're used as arguments. Allowing otherwise caused more harm than good.
    return void

is a perfectly fine thing to write (or

    return ~

if you prefer the quasiform of void, which you probably don't, but it might come in handy somewhere if you've defined VOID to mean something else).
I'd been thinking that argument-less RETURN would thus go back to returning the default unfriendly value (currently called NONE, a ~~ isotope, i.e. a parameter pack with absolutely no values in it). But maybe we shouldn't support argument-less RETURN at all.
But Variadics Can Be Neat
I guess RETURN could always take an argument, and we go back to CONTINUE/WITH and QUIT/WITH.
But those are uglier.
We might question the behavior of the system just slurping up arguments from ensuing lines, especially when APPLY has such a convenient notation now: some-func/ [...]
From a usability perspective, there's certainly a lot of potential for error in getting the arity wrong. Being stricter could catch bugs, and make it more likely that variadic behavior is being used intentionally.
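For contrast, here is the strict alternative in the same toy style (again a hypothetical Python model, not the real thing): with a strictly arity-0 QUIT there is no lookahead, so injecting it really does stop execution before the unwanted code.

```python
# Toy contrast: a strictly arity-0 'quit' does no lookahead, so an
# injected quit actually ends things where it appears.  (Hypothetical
# model, standalone from the variadic sketch.)

log = []

class Quit(Exception):
    pass

def wanted():
    log.append("wanted")

def unwanted():
    log.append("unwanted")

def run_strict(steps):
    """Strict evaluator: 'quit' takes no argument and stops immediately."""
    for step in steps:
        if step == "quit":
            raise Quit  # no argument slurped from the ensuing line
        step()

try:
    run_strict([wanted, "quit", unwanted])
except Quit:
    pass

assert log == ["wanted"]  # the injected quit ends execution here
```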