Partial Specialization Syntax in a NULL-refinement world

SPECIALIZE is used to make a version of a function that has some of its arguments fixed. The syntax I first came up with (which we have been using to this day) takes a block that it binds into a "FRAME!" for the function. So:

>> a10: specialize 'append [value: 4 + 6]

>> a10 [a b c]
== [a b c 10]

Not only do you get to set the parameters, you also have evaluation on your side... like when you make an object. Here it does an addition for demonstration purposes, but it could be arbitrary code with branching and switching/etc. The question is which parameters are assigned values and which are not.

Since frames are like OBJECT!s that contain keys that correspond to arguments, the idea was that anything in this frame that wound up NULL would be considered "unspecialized". So function calls would gather such arguments at the callsite as normal (e.g. the SERIES parameter of append, taken here as [a b c] normally). But since the VALUE parameter was set to 10 and hence not-NULL, it would not be gathered...nor typechecked. (The type checking occurs only at specialization time, helping performance.)
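To make that mechanic concrete, here's a minimal Python sketch (this is a model, not Ren-C source; `append_` and its fixed parameter list are stand-ins) of the rule that a non-None ("non-NULL") frame slot is fixed at specialization time, while None slots are still gathered at the callsite:

```python
# Python sketch (not Ren-C) of NULL-means-unspecialized frame semantics.
# append_ is a stand-in for APPEND with parameters SERIES, VALUE, /PART.

def append_(series, value, part=None):
    """Copy SERIES and add VALUE, honoring a /PART limit on VALUE."""
    result = list(series)
    items = value if isinstance(value, list) else [value]
    result.extend(items if part is None else items[:part])
    return result

def specialize(frame):
    """Non-None slots in FRAME are fixed; None slots gather at the callsite."""
    def specialized(*args):
        args = list(args)
        call = {}
        for name in ("series", "value", "part"):
            if frame.get(name) is not None:
                call[name] = frame[name]   # fixed at specialization time
            elif args:
                call[name] = args.pop(0)   # gathered from the callsite
            else:
                call[name] = None          # left NULL (refinement unused)
        return append_(**call)
    return specialized

a10 = specialize({"value": 4 + 6})   # evaluation happens up front, as with objects
```

In this model, `a10(["a", "b", "c"])` gathers only SERIES from the call, mirroring the A10 example above.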

But...what about when arguments are intentionally NULL?

Er, yeah. There was no good answer for this, so you'd have to do tricks. For example: first SPECIALIZE a parameter to a non-null value, and then ADAPT with code that turns around and nulls it out each time the function gets called:

an: adapt (specialize 'append [value: _]) [value: null]

That's inefficient and not too pretty, but it didn't come up often. With a few exceptions (like passing NULL to REPLACE as a way of deleting things), you usually didn't want to pass null parameters. So solving it was a back-burner issue.
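For illustration, the workaround can be modeled in Python (again just a sketch; `append_`, `specialize`, and `adapt` here are toy stand-ins for the real natives): SPECIALIZE fills the slot with a throwaway placeholder so it reads as "specialized", and an ADAPT layer overwrites it with None on every call:

```python
# Python sketch (not Ren-C) of the ADAPT-of-a-SPECIALIZE workaround.

def append_(series, value=None):
    """Stand-in for APPEND; a None ("NULL") VALUE appends nothing."""
    result = list(series)
    if value is not None:
        result.append(value)
    return result

def specialize(fixed):
    """Fix the given frame slots; they vanish from the interface."""
    def specialized(series, **frame):
        return append_(series, **{**fixed, **frame})
    return specialized

def adapt(func, prelude):
    """Run PRELUDE over the frame before the underlying call proceeds."""
    def adapted(series, **frame):
        prelude(frame)
        return func(series, **frame)
    return adapted

# an: adapt (specialize 'append [value: _]) [value: null]
an = adapt(specialize({"value": "_"}),              # placeholder fills the slot
           lambda frame: frame.update(value=None))  # ...nulled on every call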

But refinements offered "partial" specializations. This was the idea of saying that you had a routine like APPEND which might have a /PART parameter, and you wanted to say that there was a /PART but not say what it is. The intent was a variant of APPEND which took 3 arguments instead of 2, and just didn't require you to say /PART.

The notation for that had originally been:

>> ap: specialize 'append [part: true]

>> ap [a b c] [d e f] 2
== [a b c d e]

This worked because refinements used to have arguments. The argument to /PART (named LENGTH in R3-Alpha) was still NULL, even though PART was true. So partial specialization was encoded by virtue of having a value for the refinement but not the argument.
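A Python sketch of that old encoding (names hypothetical; LENGTH follows the R3-Alpha naming mentioned above): the refinement slot holds a flag, and a True flag with a still-None argument means "gather the argument at the callsite":

```python
# Python sketch (not Ren-C) of the old model: the PART refinement slot is
# a flag, and its argument (LENGTH, per R3-Alpha naming) is separate.

def append_(series, value, length=None):
    result = list(series)
    items = value if isinstance(value, list) else [value]
    result.extend(items if length is None else items[:length])
    return result

def specialize(frame):
    """part=True with length=None means: gather LENGTH at the callsite."""
    def specialized(series, value, *extra):
        length = frame.get("length")
        if frame.get("part") and length is None and extra:
            length = extra[0]   # partial specialization: taken from callsite
        return append_(series, value, length)
    return specialized

ap = specialize({"part": True})   # like: specialize 'append [part: true]
```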

Even then, there were problems with order. If you partially specialized both part: true and dup: true, which argument would be gathered first at the callsite? This was cleverly solved by offering an alternative way to do it, via specializing a PATH!:

>> apd: specialize 'append/part/dup []

>> apd [a b c] [d e f g] 2 3
== [a b c d e d e d e]   ; /PART 2, /DUP 3

>> adp: specialize 'append/dup/part []

>> adp [a b c] [d e f g] 2 3
== [a b c d e f d e f]   ; /DUP 2, /PART 3
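The ordering rule can be sketched in Python (a model, not the real implementation): the PATH! order simply dictates the order in which the partially specialized refinements' arguments are taken from the callsite:

```python
# Python sketch (not Ren-C) of path-ordered partial specialization.

def append_(series, value, part=None, dup=None):
    result = list(series)
    items = value if isinstance(value, list) else [value]
    if part is not None:
        items = items[:part]            # /PART limits how much is appended
    result.extend(items * (dup if dup is not None else 1))   # /DUP repeats
    return result

def specialize_path(order):
    """ORDER names the refinements whose arguments are gathered, in order."""
    def specialized(series, value, *refinement_args):
        return append_(series, value, **dict(zip(order, refinement_args)))
    return specialized

apd = specialize_path(["part", "dup"])   # like: specialize 'append/part/dup []
adp = specialize_path(["dup", "part"])   # like: specialize 'append/dup/part []
```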

Now that refinements are their own arguments, we're in a situation where this method for partial specialization still works, but the other doesn't. You can't say part: true anymore because the PART argument itself holds the number or series position that the partial is calculated relative to.

(I will point out that killing off the other partial specialization method is a good example of how the new refinement model purifies things. It eliminates a fuzzy guessing model, where SPECIALIZE had to count how many refinements you had specialized inside the block, check whether you'd also specialized their arguments, and error if you tried more than one partial specialization at a time.)

How do you remove a refinement from the interface, but leave it NULL?

Let's imagine now that we want to make a version of APPEND that does not offer a /PART refinement at all, acting as though /PART were never supplied rather than pinning it to a value.

We could try a riff on the ADAPT-of-a-SPECIALIZE trick above. Pick some arbitrary /PART value that passes the type check to specialize with, and then null it out when the call happens:

a-no-part: adapt (specialize 'append [part: 1020]) [part: null]

That will always act like APPEND with no /PART, and in the HELP it does not list /PART as an available refinement.

This is a pain, and not efficient. You have to know the types the refinement takes and pick some arbitrary value compatible with them (a choice which could break if you change the legal types on the specialized function, though it doesn't seem that should break the intent of a specialization removal). Then you're wasting runtime nulling it out on every call, with a useless additional layer of ADAPT-ation.

Worst case scenario, we need a counterpart to AUGMENT whose job is to remove parameters. (ABRIDGE? DIMINISH?) But it seems like this is SPECIALIZE's domain, since it removes other parameters when it says what value they have... so why can't we find a syntax to do it when a parameter is "fixed at no value"?

As a random example, we might come up with a wacky variation on the path notation:

a-no-part: specialize 'append/[part] []

So maybe if it sees a word in a BLOCK! in the path, that means "remove from the interface and set to null/unused".

We could be less creative, e.g.:

a-no-part: specialize/remove 'append [] [part]

What about normal parameters (and reordering)?

Whatever mechanism we come up with here needs to work with ordinary arguments, too. And something I've wondered is if pathing with ordinary arguments might tie into some kind of reordering scheme:

>> append/series 10 [a b c]  ; imagine pushes /SERIES to be last parameter
== [a b c 10]

If we accept this, then strange ideas like specialize 'append/[value] [...] meaning "remove the VALUE parameter but set it to NULL" fold into the notion that you can just name any argument this way. We can also imagine creating variations of a function where a previously mandatory argument becomes optional.

>> apmaybe: unrequire 'append [value]

>> apmaybe [a b c]
== [a b c]

>> apmaybe/value [a b c] 10
== [a b c 10]
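Here's a hypothetical Python model of what UNREQUIRE might mean (UNREQUIRE doesn't exist; this is only the speculated semantics, with `append_` as a stand-in): the named parameter drops out of the required positional interface and defaults to None unless explicitly supplied:

```python
# Python sketch of the hypothetical UNREQUIRE idea (not a real native):
# a formerly mandatory argument becomes optional, defaulting to None.

def append_(series, value=None):
    result = list(series)
    if value is not None:
        result.append(value)
    return result

def unrequire(func, names):
    """Make the named parameters optional, defaulting each to None."""
    def relaxed(series, **opts):
        return func(series, **{n: opts.get(n) for n in names})
    return relaxed

apmaybe = unrequire(append_, ["value"])
```

Calling `apmaybe(["a", "b", "c"])` appends nothing, while `apmaybe(["a", "b", "c"], value=10)` behaves like the /VALUE path call above.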

My point in bringing this up is that the one-parameter-per-name concept makes refinements and ordinary arguments have more in common. So we can imagine the mechanism to remove a refinement from the interface (while still leaving it null) working on ordinary arguments that want to be fixed at null as well.

This is by no means unsolvable, and we have a lot of the mechanics for doing it. It's just a question of what you write at source level, and whether we can do something nice that finesses it instead of looking like a mess.

I think you meant == [a b c 10] in that last code example ...

Yes, fixed... and that helps... though...
...of course
...if you can fix or dream up designs here that solve the underlying problems described, that would be useful. :slight_smile:

I have this feeling something will click and the whole of how parameters/frames/nulls/reordering works will clear up. The pieces seem to be in place. It's a lot closer than it could possibly have been before NULL or before refinements-are-their-arguments.