Conflation vs. Safety, RETURN and the Finer Points of ~null~ Isotopes

~null~ isotopes are a novel mechanism shaped to solve a specific problem. As a reminder of what the goal is...

The Goal is to Please @rgchris AND Please me

NULL is the signal of "soft failure". It's a unique result reserved for when a branch fails, when a loop is halted by a BREAK, when PARSE fails, etc.

Its property of not being storable in blocks makes it critical to disambiguating this historical problem:

redbol>> third [a b #[none]]
== #[none]

redbol>> third [a b]
== #[none]

In a language that prides itself on letting you work with code structure, this is the tip of the iceberg of the problems that null solves, and you will find the distinction's utility across the board (obviously, in tools like COMPOSE). It facilitates rigorous analysis and rearrangements...without needing to drop to C or write convoluted code:

>> third [a b _]
== _

>> third [a b]
; null

Hence NULL has taken the place of blank ("none!") in many places, one of which is the unique result of failed conditionals.

But unlike elements in a block, a branch that runs isn't required to evaluate to a non-NULL value. That leads to the long-running question of what to bend NULL branch results into so they don't conflate with the branch-not-taken result.

Chris has (rightly) expressed concern

At times I've said that it's not that big a deal that branches which evaluate to NULL get distorted. "You didn't have a NULL before, so why get so worked up about control constructs not returning it?"

But the now-pervasive nature of NULL means it can't be avoided. So:

"How do you express branching code which wants to do some work but also produce NULL as an evaluative product?"

Conflation was not a problem, e.g. in Rebol2:

rebol2>> exampler: func [x] [
     print "returning sample or none if not found"
     case [
         x = <string> [print "sample string" {hello}]
         x = <integer> [print "sample integer" 3]
         x = <none> [print "sample none" none]
     ]
 ]

rebol2>> exampler <string>
returning sample or none if not found
sample string
== "hello"

rebol2>> exampler <blatz>
returning sample or none if not found
== #[none]

rebol2>> exampler <none>
returning sample or none if not found
sample none
== #[none]

However, NULL is now the basic currency of "soft failure". As such, it would not be uncommon for a branching decision process to want to intentionally return NULL as part of the work it does.

Without something like the isotope decay mechanism, unpleasant convolutions would be needed, for instance surrounding anything that wanted to tunnel a NULL with a CATCH and THROW'ing it:

x: catch [
    throw switch 1 + 2 [
        1 [print "one" 1]
        2 [print "two", <two>]
        3 [print "three", throw null]
    ]
]

Definitely not good. But regarding the pleasing-me-part, remember I am trying to avoid this situation:

>> case [
     true [
          print "case branch"
          if 1 > 2 [print "failed inner"]
     ]
   ] else [
     print "else branch"
   ]
case branch
else branch  ; ugh

I don't want the CASE branch to evaluate to NULL just because the failed IF inside the branch was NULL. That would mean the ELSE tied to the CASE runs even though the code for the branch ran.

Enter Isotopes

One thing null isotopes have in common with NULL (like all BAD-WORD! isotopes) is that they can't be put in blocks. But they automatically "decay" into regular NULL when stored in variables.

>> ~null~
== ~null~  ; isotope

>> x: ~null~
== ~null~  ; isotope  <-- note the overall expression is still an isotope

>> x
; null

The twist is that they are different enough from true NULL such that a THEN or an ELSE can consider them a situation where the branch did not run:

>> if false [<ignored>]
; null

>> if true [null]
== ~null~  ; isotope

>> if true [null] else [print "This won't run"]
== ~null~  ; isotope

The reason functions like ELSE can "see" the isotope is that they don't take an ordinary parameter on their left. They take a ^META argument. These can see the distinction between a ~null~ isotope and a "true" NULL.
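Since the mechanic itself is language-agnostic, here's a minimal Python model of what's described above. NullIsotope, if_, else_, and decay are invented stand-ins for illustration only, not any real Rebol/Ren-C binding:

```python
class NullIsotope:
    """Stands in for the ~null~ isotope: 'a branch ran, but produced null'."""
    def __repr__(self):
        return "~null~  ; isotope"

NULL_ISOTOPE = NullIsotope()

def if_(condition, branch):
    """Toy IF: plain None when the branch is NOT taken; if the branch runs
    and yields None, hand back the isotope so ELSE can tell the difference."""
    if not condition:
        return None                       # branch not taken -> true NULL
    result = branch()
    return NULL_ISOTOPE if result is None else result

def else_(left, branch):
    """Toy ELSE: runs only on a true NULL; anything else (including the
    isotope) means a branch was taken, so it passes through untouched."""
    if left is None:
        return branch()
    return left

def decay(value):
    """Storing into a variable decays the isotope back to plain NULL."""
    return None if isinstance(value, NullIsotope) else value

# branch taken and produced null -> isotope, so ELSE does not run
r = else_(if_(True, lambda: None), lambda: "else ran")
assert isinstance(r, NullIsotope)

# branch not taken -> true NULL, so ELSE runs
assert else_(if_(False, lambda: None), lambda: "else ran") == "else ran"

# "assignment" decays the isotope to a plain null
x = decay(if_(True, lambda: None))
assert x is None
```

This is just a sketch of the decay-plus-ELSE interplay; the real mechanism works through ^META parameters rather than type checks.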

I'd largely say this has been working well...certainly better than its conceptual predecessors. It makes piping NULL out of branches trivially easy, when the fear of conflation is not a problem.

>> x: switch 1 + 2 [
     1 [print "one" 1]
     2 [print "two", <two>]
     3 [print "three", null]
   ]
== ~null~  ; isotope

>> x
; null

Thanks to the automatic decay in variable storage, you don't usually need an explicit operation to turn ~null~ isotopes into pure NULLs. But a DECAY operator exists if you want the overall expression itself to come out as a pure NULL:

>> x: decay switch 1 + 2 [
     1 [print "one" 1]
     2 [print "two", <two>]
     3 [print "three", null]
   ]
; null

But @rgchris Would Likely Want Any NORMAL Arg to Decay

At the very moment I am writing this, ~null~ isotopes are like all other BAD-WORD! isotopes and not accepted as normal parameters.

They could decay to pure NULL for all normal args. But let me explain the bummer of what we'd lose in that bargain.

It takes away a safety idea I had with functions like MATCH.

>> match [<opt> integer!] 3
== 3

>> match [<opt> integer!] "notaninteger"
; null

>> match [<opt> integer!] null
== ~null~  ; isotope

The idea was that it could say "yes, this matched" but if ~null~ isotopes were tested, they'd give an error:

>> if (match [<opt> integer!] null) [print "Yes it matched!"]
** Error: IF does not accept ~null~ isotopes without a ^META condition

Were MATCH to have passed through a plain NULL it would have succeeded in the match but not run the branch. So it's nice to get the warning on the isotope.
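A small Python sketch of that safety idea: a conditional that raises a localized error when handed a ~null~ isotope as its condition, instead of silently treating it as truthy. NullIsotope and if_ are invented names for illustration, not any real API:

```python
class NullIsotope:
    """Stands in for the ~null~ isotope (e.g. MATCH succeeding on a null)."""

def if_(condition, branch):
    """Toy IF: errors on an isotope condition; otherwise runs the branch
    only when the condition is truthy (not None, not False)."""
    if isinstance(condition, NullIsotope):
        raise TypeError(
            "IF does not accept ~null~ isotopes without a ^META condition")
    if condition is None or condition is False:
        return None
    return branch()

# ordinary conditions work as usual
assert if_(3, lambda: "ran") == "ran"
assert if_(None, lambda: "ran") is None

# a "matched null" result used as a condition trips the safety check
try:
    if_(NullIsotope(), lambda: "Yes it matched!")
except TypeError as err:
    caught = err
assert isinstance(caught, TypeError)
```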

Or it could just return a ~matched~ isotope. But this loses the following nice isotopic property:

>> x: match [<opt> integer!] null else [fail "NO MATCH"]
== ~null~  ; isotope

>> x
; null

In fact I instituted other decaying variants for ~blank~ and ~false~

>> y: match [blank!] _ else [fail "NO MATCH!"]
== ~blank~  ; isotope

>> y
== _

>> z: match [logic!] 1 = 2 else [fail "NO MATCH!"]
== ~false~  ; isotope

>> z
== #[false]

So What To Do About MATCH and its brethren?

Seems the options are:

  1. Don't worry about it. If you write match [logic!] 1 = 2 you get back #[false], and should you write an expression like if (match [logic!] 1 = 2) [print "Match!"] you get what you deserve.

  2. Use a different isotope. Let's say that match [<opt>] null is simply ~matched~ (isotope). It wouldn't have the decaying property, but would have the invalidness property.

  3. Have a MATCH/FALSEY variant. Let plain match on a falsey thing trigger an error and if you write if match/falsey ... then you clearly do know what you're doing so it becomes like case (1).

  4. Make all conditional arguments take ^META arguments for their conditions. This would put the responsibility for checking for isotopes on them, and they'd uniquely disallow them before UNMETA'ing them and then testing for truth/falsehood.

Option (4) is too taxing...impacting not just the interface to IF but the implementation of CASE and any conditional construct.

I think I like (3) because it kicks the can down the road a bit.

But this might still not suit Chris.

Should Non-Meta Arguments Decay Null isotopes?

The "auto-decay" of ~null~ isotopes means no variable can ever hold a ~null~ isotope. And there's also a rule that no normal parameter can ever be passed an isotope, only ^META parameters.

In the beginning, it seemed useful if normal arguments would automatically decay null isotopes:

>> foo: func [x] [if null? x [print "Yup, it's null"]]

>> foo if true [null]
Yup, it's null

>> metafoo: func [^x] [
    case [
        null? x [print "regular null"]
        x = '~null~ [print "null isotope"]
        true [print "something else"]
    ]
]

>> metafoo if false [null]
regular null

>> metafoo if true [null]
null isotope

There is a manual DECAY operator which could be used, but having to call it manually would not meet that wish:

>> ~null~
== ~null~  ; isotope

>> decay ~null~
; null

>> ~blank~
== ~blank~  ; isotope

>> decay ~blank~
== _

>> ~false~
== ~false~  ; isotope

>> decay ~false~
== #[false]

Should DECAY Conflation Be A Customization?

Another avenue of satisfaction could be to say that you simply customize your environment with some definitions to make auto-decaying constructs:

switch: chain [:switch | :decay]
case: chain [:case | :decay]

>> case [true [null]]
; null

>> case [false [10]]
; null

I don't like it, but if someone isn't going to use ELSE (or is willing to accept this all-too-easy unintentional conflation if they do) it could be an option.
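To show what that customization trades away, here's a hypothetical Python model of the CHAIN idea: compose a construct with a post-step so its result is automatically decayed. NullIsotope, chain, decay, and case_ are invented names for illustration:

```python
class NullIsotope:
    """Stands in for the ~null~ isotope result of a taken-but-null branch."""

def chain(*steps):
    """Run the first step on the arguments, then pipe its result through
    each remaining step (modeling chain [:case | :decay])."""
    def composed(*args):
        result = steps[0](*args)
        for step in steps[1:]:
            result = step(result)
        return result
    return composed

def decay(value):
    """Collapse the isotope back to a plain null (None)."""
    return None if isinstance(value, NullIsotope) else value

def case_(pairs):
    """Toy CASE: isotope if a branch ran but yielded None,
    plain None if no branch ran at all."""
    for condition, branch in pairs:
        if condition:
            result = branch()
            return NullIsotope() if result is None else result
    return None

# the auto-decaying customization...
decaying_case = chain(case_, decay)

# ...conflates "branch ran with null" and "no branch ran":
assert decaying_case([(True, lambda: None)]) is None   # ran, but null
assert decaying_case([(False, lambda: 10)]) is None    # didn't run
```

The composed version loses exactly the information an ELSE downstream would need, which is why it only suits code that never asks "was a branch taken?"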

I really do believe the ability to tell from outside the construct if a branch has been taken is an interesting property, which even those who think they won't use ELSE or THEN can leverage, especially when building constructs atop each other. But this isn't something that can be appreciated without usage, or trying to write something like UPARSE generically in usermode.

Should function RETURN decay by default?

Continuing along these lines, this has to do with the pattern of:

foo: func [x] [
    return switch x [
         1 [print "one", #one]
         2 [print "two", null]
         3 [print "three", <three>]
    ]
]

>> foo 1 + 1
== ???   ; should this be ~null~ isotope or just NULL?

Also, should it matter whether there's a RETURN there or not? Is this something the type spec should distinguish?

Right now there's a refinement called /ISOTOPE on RETURN which asks it not to decay.

If all non-^META parameters decay by default, then it seems isotopic decay is the right default for RETURN even though it takes a ^META parameter and returns non-decaying isotopes.

As Always, A Lot To Think About

Want to get this posted because it's preventing me from making new drafts (Discourse won't let you have multiple top-level post drafts in-flight for some reason).

Will keep mulling it all over.

"A designer knows he has achieved perfection not when there is nothing left to add, but when there is nothing left to take away." -- Antoine de Saint-Exupéry


Great refresher and summary of the issues.

Or as we learned from many a multiple choice test, the correct answer is: None of the above.

Option 5. Adjust (DID ...) as isotope-tolerant (NOT NULL? ...)

If you find yourself in a situation where isotopes are giving you a problem, switch to did match (or decay match, if you're trying to get the value and not test it as a condition).

I wouldn't reach for this by default. You're fine most of the time... if your MATCH doesn't contain [<opt> logic! blank!]. It's only these quirky edge cases where it's better to let the isotopes give you a localized and clear error than wind up on a wild goose chase for why the program is acting strangely.

Beginners might be cautious and write things like did match or did parse all over the place if they're scared of missing a case. But there's no need to write if did match integer! value, because the result can only be an integer! (always truthy) or null. Experts would use it sparingly, in cases like if match typeset value, where the typeset might later be expanded to include falsey values without your being fully cognizant of it.

This broadens the service of DID across the board, to do what it was originally intended to do: transform functions that return non-LOGIC! values and NULL as soft failure to give logic results. It can handle historical edge cases with an elegant touch, without burdening code that knows itself well enough to not hit those cases. I'm quite pleased with it! :man_dancing:
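In the Python model used earlier, the isotope-tolerant DID (i.e. "NOT NULL?") reduces to a one-liner. NullIsotope and did_ are invented names for illustration:

```python
class NullIsotope:
    """Stands in for the ~null~ isotope (branch ran / match succeeded)."""

def did_(value):
    """Toy DID: true for anything that isn't a plain null -- including the
    isotope and falsey values like False, since the question being asked
    is 'did it match?', not 'is the value truthy?'."""
    return value is not None

assert did_(None) is False           # soft failure -> no
assert did_(NullIsotope()) is True   # matched a null -> yes
assert did_(False) is True           # matched a #[false] -> yes
assert did_(3) is True
```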

This Means Auto-Decay Will Be Limited (sorry, Chris)

There's still some room for compromise. But the compromise won't be that ~null~ isotopes are interchangeable with NULL to plain function parameters.

Decaying variables seems acceptable, and hopefully it's learnable that DID is testing something transient...and does not obey substitution rules:

if did x: (match [<opt> integer!] null) [
    print "This will print (the ~null~ isotope denotes non-soft-failure)"
]

if did :x [
    print "This won't print, because X decayed to NULL in assignment"
]

I think that's learnable, if people can realize that DID is supposed to be paired with an evaluation and not a variable fetch. ("did x what? did x be a variable?")
