True, False, On, Off, Yes, No...?

Isotopes do make it measurably distinct...

And to competently play the hand I've dealt myself, NULL's complement in conditional logic (the thing returned by comparisons or functions like EVEN? when they succeed) must be an antiform.

Having struggled with names, I think the best choice is probably OKAY.

>> 10 > 20
== ~null~  ; anti

>> 10 < 20
== ~okay~  ; anti

There are tradeoffs to anything you'd put here. It's an antiform whose sole reason for existing is to not be null, and not be legal as a List element.

You could call it ~not-null~. :face_with_diagonal_mouth: Claude.ai was suggesting ~value~ (or ~valu~ if I wanted 4 letters, which I'd prefer, but not enough to go illiterate to get them). It goes with that vein of "I am just a generic value that a variable can hold, and there's nothing else interesting about me except that I'm not null". It's not meritless, but among its problems: putting it into a word (like value) competes with a very important variable name... and in the vernacular, an ANY-VALUE? is anything you can put into a variable, and ~null~ antiforms are in that set.
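To make that last point concrete, here's a minimal sketch (assuming ANY-VALUE? keeps its "anything a variable can hold" meaning, and that logic-returning functions now give back these antiforms):

>> any-value? 10
== ~okay~  ; anti

>> any-value? null
== ~okay~  ; anti  (null is a legal thing for a variable to hold, so it counts)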

Trying to use a non-word for it has a host of problems. ~()~ and ~[]~ etc. are taken for actual purposes (splicing, parameter packs). I don't want SIGIL! antiforms, and if ~#~ were permitted, that would probably require permitting ~#a~ and everything else. And if you used a non-word, you'd still have to name it to talk about it in discussion... or to put it in a WORD! so you didn't have to write the tildes at the callsites where you used it. So you don't really buy yourself anything by using a symbol.
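(For reference, a quick sketch of why those forms are spoken for; the rendering shown is approximate:)

>> append [a b] spread [c d]  ; SPREAD gives a splice, which is a ~(...)~ antiform
== [a b c d]

>> [x y]: pack [1 2]  ; PACK gives a ~[...]~ antiform, a multi-return parameter pack
== 1

>> y
== 2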

On balance, OKAY is just... okay. It's like "yeah, all right, here's your non-null thing". There are places where it makes sense, like in the tests: there the expectation is that the canonical branch-triggering value drops out of the block, or the test fails.

; Note that BLOCK!s in the test dialect isolate the code into a module, so
; local definitions (or redefinitions of library words) are isolated into
; that block of tests! 😎
[
    (var: 2, okay)  ; just `var: 2` would leave 2 as the result, and the test would fail
    (even? var + 2)
    (odd? var + 1)
    (for-each x [2 4 6] [assert [even? x]], okay)
    ~expect-arg~ !! (even? <banana>)
]
; Also: I like having the QUASI-WORD! choice there for dialecting error IDs.
; But I believe it should be illegal to EVAL/UNMETA those to get antiforms
; unless they're in the system-vetted list of word antiforms.
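As an aside on that EVAL/UNMETA point, here's roughly what the vetting would mean in practice (a sketch of the proposal, not current behavior):

>> eval [~okay~]
== ~okay~  ; anti  (quasiforms evaluate to their antiforms)

>> eval [~banana~]
; proposed: error, since BANANA isn't in the vetted list of word antiforms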

OKAY works pretty well in those tests.

It's a little strange that "used" refinements would be just "okay":

>> foo: func [:refine] [probe refine]  ; in the soon-to-be notation

>> foo
== ~null~  ; anti

>> foo:refine
== ~okay~  ; anti

But I don't know if that's any stranger than anything else. So far we've been getting # back as the "arbitrary truthy thing". I can't say how that's better, but I can articulate how it is worse (e.g. it has various dialected meanings, it appends zero bytes to binaries, it matches # in block parses, it COMPOSEs into slots without complaint, and so on).
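To spell a couple of those papercuts out (a sketch of the behaviors mentioned above; output approximate):

>> append #{DECAFBAD} #  ; # quietly acts as a zero byte (NUL codepoint) here
== #{DECAFBAD00}

>> compose [setting is (#)]  ; and COMPOSE drops it into slots without complaint
== [setting is #]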

Anyway, I think I'm comfortable with it. An added advantage is that there's an abbreviated form (ok, ok?)... so you can say frame.refinement: ok or write a test like (var: 2, ok), which is nearly as brief as # while being more meaningful and getting all those sweet, sweet antiform benefits. :+1:

I'm a little less comfortable with the idea of people casually embracing ~okay~ and ~null~ for their boolean-like variables just to get the benefit of being able to test with IF directly. But if the system can make that choice for argument-less refinements, it's pretty hypocritical to say you should never do it. Prescriptivism has limits.
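(For what it's worth, the pattern in question would look something like this sketch:)

verbose: okay  ; boolean-ish variable holding the ~okay~ antiform

if verbose [print "runs, since ~okay~ is truthy"]

verbose: null

if verbose [print "never runs, since ~null~ is falsey"]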