Definitional errors came into PARSE as the answer to how it could return any value... simply make a failed parse a definitional error. Not only does this mean that NULL can be a legitimate synthesized product that you don't conflate with failure, but it also means parse failures can carry more information in the error about what went wrong (farthest parse point reached, etc.)...
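Since it's the out-of-band-ness doing the work here, the idea has a rough analogue in languages with exceptions. A minimal Python sketch (ParseFailure and parse_int_prefix are hypothetical names for illustration, not anything in Ren-C):

```python
class ParseFailure(Exception):
    """Out-of-band failure: carries how far the parse got."""
    def __init__(self, farthest):
        super().__init__(f"parse failed at index {farthest}")
        self.farthest = farthest

def parse_int_prefix(text):
    """Synthesize a product (which could legitimately be None)."""
    i = 0
    while i < len(text) and text[i].isdigit():
        i += 1
    if i == 0:
        raise ParseFailure(0)  # failure is out of band, not a sentinel
    return int(text[:i])

result = parse_int_prefix("42abc")   # 42
try:
    parse_int_prefix("abc")
except ParseFailure as e:
    farthest = e.farthest            # extra info a sentinel can't carry
```

The point of the sketch: because failure travels on a separate channel, the return value's entire range (including None) stays available for legitimate products.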
Looking at another problem where a sentinel NULL had been used... there's Generators and Yielders:
>> g: generator [print "A" yield 1, print "B" yield 2, print "C"]
>> g
A
== 1
>> g
B
== 2
>> g
C
== ~null~ ; anti
>> g
== ~null~ ; anti
>> g
== ~null~ ; anti
But Generators and Yielders predate antiforms in general... they came at a time when there were just two forms of NULL: "heavy" and "light". So the thought was that if you wanted to actually yield null, it would be the heavy form.
>> g: generator ["Start", yield null, print "Done"]
>> g
Start
== ~[~null~]~ ; anti
>> g
Done
== ~null~ ; anti
But this isn't nearly as good as using a specific raised error. If NULL is truly out of band, then you could use try g to get null when there are no more values.
If you try to YIELD or RETURN a raised error, then it would become an abrupt failure. So you'd be assured that the only time you'd ever get a definitional error back from a generator or yielder would be if it had been exhausted.
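For comparison, this is essentially how Python's generator protocol works: exhaustion is signaled with the out-of-band StopIteration exception, never a sentinel value, so None remains something a generator can legitimately yield (and the default argument to next() plays roughly the role of TRY):

```python
def g():
    yield None  # a real yielded value, not an exhaustion signal
    yield 2

it = g()
first = next(it)               # None -- legitimately yielded
second = next(it)              # 2
third = next(it, "exhausted")  # like TRY: fall back once exhausted
```

And as with the abrupt-failure rule above, Python forbids a generator from raising StopIteration itself from inside its body... the exhaustion signal is reserved for the protocol.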
PACK Returns for Things Like KEY and VALUE in Objects?
As the enumeration routines are retooled to be based on generator functions, it struck me that it would be very cool if generators returned packs, which could then be decayed:
>> obj: make object! [a: 10 b: 20 c: 30]
>> g: each obj
>> g
; first in pack of 2 items
== a
>> meta g
== ~['b '20]~
>> [key {val}]: g
== 30
>> key
== c
>> g
** Error: Generator/Yielder exhausted
>> try g
== ~null~ ; anti
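A rough Python analogue of the pack idea (the each function here is a hypothetical stand-in, not Ren-C's EACH): every pair arrives as a tuple, and the caller can either unpack the whole thing or let it "decay" to its first slot:

```python
def each(obj):
    # yield (key, value) "packs", one pair per step
    for k, v in obj.items():
        yield (k, v)

obj = {"a": 10, "b": 20, "c": 30}
it = each(obj)
key, val = next(it)     # unpack the whole pack: "a", 10
decayed = next(it)[0]   # "decay" to just the first slot: "b"
```

The difference, of course, is that in Python the decay is explicit at each call site, whereas packs decay automatically when you don't ask for the extra values.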
That's really cool, and seems like it would go along with decaying:
for 'key each obj [...]
for [key val] each obj [...]
This looks pretty nice. However, blocks used in FOR-EACH scenarios like this haven't meant "unpack"... they've meant "keep getting items from the series and assign one to each variable."
It seems like it would be unfortunate if you had to write the object unpacking as:
for [~[key val]~] each obj [...]
But that does seem sort of like what you'd have to do in order to make it generalized. It doesn't look that bad, and it does have a theory behind it...
>> obj: make object! [a: 10 b: 20 c: 30]
>> for [~[key val]~ :otherkey] each obj [  ; leading colon for optionality
       print ["key:" key]
       print ["val:" val]
       if otherkey [
           print ["otherkey:" otherkey]
       ]
   ]
key: a
val: 10
otherkey: b
key: c
val: 30
This assumes that plain blocks might be taken for some other intent in the FOR. If they weren't, it could be:
for [[key val] :otherkey] each obj [...]
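The distinction between the two meanings can be sketched in Python (take is a hypothetical helper for illustration, not part of any Ren-C API): a plain list of variables means "grab that many consecutive items per step," while destructuring means "unpack one tuple-valued item":

```python
from itertools import islice

def take(it, n):
    """FOR-EACH-style plain block: n consecutive items per iteration."""
    it = iter(it)
    while chunk := tuple(islice(it, n)):
        yield chunk

flat = ["a", 10, "b", 20]
grouped = list(take(flat, 2))          # [("a", 10), ("b", 20)]

packed = iter([("a", 10), ("b", 20)])  # tuple-valued items ("packs")
key, val = next(packed)                # unpack ONE item: "a", 10
```

Both consume the same surface data, which is exactly why the notation has to disambiguate which one you mean.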
But I had been thinking that plain BLOCK! would likely be good for type checking (@hiiamboris has some experiments in this vein):
for [x [integer!] y [tag!]] each [1 <a> 2 <b>] [...]
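The slot-by-slot type-checking idea could be sketched in Python (for_typed and its spec shape are hypothetical, just to show the mechanics):

```python
def for_typed(spec, items):
    """spec: list of (name, type) slots; yield dicts of checked bindings."""
    it = iter(items)
    while True:
        bindings = {}
        for name, typ in spec:
            try:
                value = next(it)
            except StopIteration:
                return  # series exhausted
            if not isinstance(value, typ):
                raise TypeError(f"{name}: expected {typ.__name__}")
            bindings[name] = value
        yield bindings

rows = list(for_typed([("x", int), ("y", str)], [1, "<a>", 2, "<b>"]))
# rows == [{"x": 1, "y": "<a>"}, {"x": 2, "y": "<b>"}]
```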
Big Design Space, but Full-Band Returns Opens It Up
I'd let the Generator and Yielder experiments atrophy while working on other things. But I stopped today to try to get the basics working again, and it's interesting to see how antiforms bring a lot to the table... as they seem to have done with nearly everything.