I found a post from Carl titled "UNSET! is not first class".
> It’s important to understand the unset! datatype; otherwise, we run the risk of assuming that it is first class (assignable, passable, returnable) when it’s really not intended for that kind of usage!
He gets at the idea that it’s not a “normal” value. While he didn’t take the step of making it illegal to put in blocks, I think that was just a matter of not having thought through how to prevent it. You can see the wish to quarantine something that is a “necessary evil”.
Tuning the model in Ren-C, the roles of the two cases of NONE!/UNSET! were reshaped into NULL, VOID!, and BLANK!. This has been very successful, putting the “hot potato” nature of null to good use by keeping it something you cannot assign…while allowing it to be conditionally false. The “neither-true-nor-false” role then falls to VOID!, a prickly value that is nonetheless a value, and can be put in blocks if you insist.
Should NULL assigns via SET-WORD! unset variables?
When NULL was first being introduced, it wasn’t the failure result from ANY or FIND. Those still used blank, which was more convenient since that era’s null was neither true nor false (like UNSET! had been).
Instead, null was sneaking in as the outcome of failed conditionals…as well as trying to be a “more correct” answer to things like select [a 10 b 20] 'c. There, a null result distinguished from a literal blank in a block, such as select [a 10 b 20 c _] 'c.
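To make that distinction concrete, here is a sketch of those two SELECT cases as they might look at the prompt (the `; null` comment annotates a null result, which is signaled out-of-band rather than displayed as a value):

```rebol
>> select [a 10 b 20] 'c
; null -- no such key in the block

>> select [a 10 b 20 c _] 'c
== _
; a literal BLANK! stored in the block, a real value
```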
But with null being such a hot potato, there were difficulties. So it was tried that foo: null would unset the foo variable rather than be an error. This made it a bit less awkward to work with if you wanted to write something like:
```
if not null? x: select block value [
    ...do stuff with x...
] else [
    ...x is unset, maybe do error handling here...
]
```
If SET-WORD!s caused errors, that would be trickier: you’d have to test the result of SELECT for null, but the test would return LOGIC! and lose the result you wanted to assign.
But, enfix to the rescue…once operations like ELSE and THEN came on the scene, they offered a new possibility…instead of needing IF and a test for a branch, the branch could react to the nullness before the assignment. Whether you needed error handling or default values, this pattern addressed most needs.
```
x: select block value else [<default-value>]
```
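The same enfix pattern covers the error-handling case as well. As a sketch (assuming the ordinary FAIL is used to raise the error, with a block composing the message):

```rebol
x: select block value else [
    fail ["No entry for" value "in block"]
]
```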
Then null changed to be conditionally false, with VOID! picking up its neither-true-nor-false duties. Routines like ANY and ALL and FIND began returning NULL on failure, which could be used in conditionals without any extra work. Then conversion of nulls to blanks was made as easy as the short, repurposed word TRY.
With this change, it was moved back to where null assignments to SET-WORD! were errors, though I’ve sometimes wondered if there would be an advantage to letting NULL implicitly unset variables. Should you be able to say pos: find block value and then if set? 'pos […] or do you really need to say pos: try find block value?
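Spelled out, the two competing styles from that question look like this (the implicit-unset form is hypothetical and was never kept):

```rebol
;-- hypothetical: a NULL assignment silently unsets POS
pos: find block value
if set? 'pos [...]

;-- actual: mark the fallible call with TRY, so null becomes blank
pos: try find block value
if pos [...]  ; blank is conditionally false, so a plain test works
```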
I think experience has spoken: the errors are good
Putting together Carl’s sentiment with my own experience: the signal of unsetness that is not a first-class value (which is NULL, now) should not silently assign to variables. TRY is the great equalizer here, which quite literally lets you mark up operations you are aware can fail. It has been a great success.
But we do have some edge cases here, for instance APPEND.
```
>> data: copy [a b c]

>> append data case [1 = 2 ['d] 3 = 4 ['e]]
[a b c]
```
Should APPEND choke on the NULL? Carl had written:
> …if you find I’m not enthusiastic about extending mezz functions to accept unset! values, you now know why. If you really think such a change is needed, you’ll need to write a short explanation for why the exception is required. I’m pretty open minded, but just because we can do something does not mean we should do it.
There are rather good reasons to avoid NULL arguments; one of which is that NULL is used in frames to denote unspecialized arguments. Hence you can’t really tell apart an APPEND which has had its appended value specialized to NULL from one that hasn’t had the value specialized out at all.
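As a sketch of that frame ambiguity (details of the FRAME! interface have shifted over time, so treat this as illustrative):

```rebol
>> f: make frame! :append
>> f/value
; null -- meaning "this argument has not been specialized"

;-- If NULL were also a legal VALUE to append, there would be no
;-- way to tell this unspecialized frame apart from one that was
;-- deliberately specialized with a null argument.
```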
But what if you used TRY, and APPEND without /ONLY followed the “BLANK!-in, NULL out” protocol?
```
>> data: copy [a b c]

>> append data try case [1 = 2 ['d] 3 = 4 ['e]]
; null

>> data
[a b c]

>> append/only data _
[a b c _]
```
One problem is that mutating routines generally aren’t supposed to follow “BLANK!-in, NULL out”.
Another possibility would be to make a refinement to APPEND that suggested a known tolerance for nulls, distinct from /ONLY… e.g. /OPT
```
>> data: copy [a b c]

>> append data case [1 = 2 ['d] 3 = 4 ['e]]
** Error: NULL input to APPEND illegal unless /OPT

>> append/opt data case [1 = 2 ['d] 3 = 4 ['e]]
[a b c]
```
The point of all this is that I’ve become pretty convinced that assignments via SET-WORD! should error on NULL. But I’m still struggling with whether there is a hard philosophical reason why APPEND should or should not error on NULL.