Understanding FRAME! "Lensing"

Good news: an old issue is (seemingly) mostly addressed!

Among the various implications of this design improvement, you can AUGMENT a function with new fields that share the name of either locals or specialized values. The only names you cannot use in extending a function are those that are public parameters on the interface!

>> /ap10: specialize append/ [value: 10]
>> ap10 [a b c]
== [a b c 10]

>> /wow: adapt (augment ap10/ [:value [integer!]]) [insert series value]
>> wow:value [a b c] 20
== [20 a b c 10]
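
For contrast, here's a sketch of the restriction mentioned above (I won't guess the exact error text): SERIES is still a visible public parameter of AP10--only VALUE was specialized out--so AUGMENT should reject that name.

>> augment ap10/ [series]  ; expected to fail--SERIES is a public parameter on AP10's interface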

So what's going on in the WOW example is that under the hood, the single FRAME! for this function call has two slots labeled VALUE. But they're never both in effect and visible at the same time. This is great news for the composability of functions.

I'm going to try to explain here a little bit of how this works.

Every Function Has a "ParamList" FRAME!

Some time ago I penned the prophetic post: "Seeing all ACTION!s as Variadic FRAME! Makers". This set the stage for what ultimately became an implementation mechanism where the interface to every action is defined by a FRAME!.

So if you write something like:

/foo: func [return: [integer!] x [tag! text!] y [integer!] <local> z] [
    print ["internal foo view:" mold binding of $x]
    return 5
]

Inside of FOO there is a FRAME! that lays out a map of the parameters and locals. This is called the "ParamList". Internally, it looks something like this:

#[frame! [
    return: #[parameter! [integer!]]
    x: ~#[parameter! [tag! text!]]~
    y: ~#[parameter! [integer!]]~
    z: ~
]]

This isn't an "execution" frame for the function. X and Y don't hold legitimate values for a function invocation...they are holding antiform parameters describing what the arguments accept. RETURN is a special slot known to FUNC, which it will fill in with a definitional RETURN function when the frame actually runs. The exception is Z: since it's a local, it holds the value that it will have when a frame is made. (more on that in a second)

So now let's try making an ordinary frame for the function:

>> f: make frame! foo/
== #[frame! [
    x: ~
    y: ~
]]

Okay, that's neat. It doesn't seem to have the RETURN or Z fields, because we aren't supposed to be setting those. They are there--the memory is part of the frame, and it's what will actually be backing those variables when you EVAL the frame. But the fields are hidden.
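
To drive the point home, a sketch of what "hidden" means in practice (assuming the behavior matches the intent described here, and without guessing the exact error message):

>> f.z: 304  ; should be rejected--Z is not part of this outside view of the frame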

I put code inside the function to print out its internal view of that same frame. Let's try running it and see what it says:

>> f.x: "Hello"

>> f.y: 1020

>> do f
internal foo view: #[frame! [
    return: ~#[frame! [^atom :run]]~
    x: "Hello"
    y: 1020
    z: ~
]]

Hey, look at that. When we see the frame from inside the function, it has access to RETURN and Z. How does it know to hide the fields on the outside, but give access to them on the inside?

The answer is that each FRAME! Cell instance can optionally hold a "Lens". A Lens is itself a ParamList, and it determines which of the fields are supposed to be visible.

Now, Let's SPECIALIZE It...

Let's make a new function SPFOO which fixes the value of Y.

/spfoo: specialize foo/ [y: 304]

And now let's look at what its internal "fake" exemplar FRAME! looks like:

#[frame! [
    return: #[parameter! [integer!]]
    x: ~#[parameter! [tag! text!]]~
    y: 304
    z: ~
]]

Something you'll notice is that the type information for Y is now gone: the slot where it would have been now holds the specialized value instead. That's a nice little efficiency trick. (We can still check the type, because FOO's ParamList has it.)

Now if we make a frame for SPFOO, the only thing it will let us set is X:

>> f: make frame! spfoo/
== #[frame! [
    x: ~
]]
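
If we fill in X and run that frame, it should behave just as if SPFOO had been called directly, with FOO's internal view showing the specialized Y alongside it (a sketch, following the output pattern shown earlier):

>> f.x: <demo>

>> do f
internal foo view: #[frame! [
    return: ~#[frame! [^atom :run]]~
    x: <demo>
    y: 304
    z: ~
]]
== 5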

What if We Were to ADAPT the Specialization?

So this raises an interesting question about the "inside" and "outside" view of things.

At an interface level, I would argue that it should not usually be possible to tell the difference between SPFOO and any other function that takes a single parameter X.

So what happens if we ADAPT the SPFOO function and get access to the frame on the inside?

/adspfoo: adapt spfoo/ [
    print ["inside adaptation:" mold binding of $x]
]

>> adspfoo "What happens?"
inside adaptation: #[frame! [
    x: "What happens?"
]]
internal foo view: #[frame! [
    return: ~#[frame! [^atom :run]]~
    x: "What happens?"
    y: 304
    z: ~
]]

Ta-da. ADAPT only saw a function with an X parameter; none of the other details are exposed to it, and its view of the frame shows only X. But it's all the same frame... the memory is being reused, it's just that access to it is controlled.

Pretty slick, huh? Anyway, I'm sure there are bugs but the groundwork is there. Please experiment and let me know if anything seems to be counterintuitive.

(Note that when you're inside the ADAPT, you don't have access to RETURN. It's not part of the interface, and we want you to be able to ADAPT functions that don't have RETURN. If you need greater control, use ENCLOSE.)
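
If you do want that greater control, here's a hedged sketch of the ENCLOSE route (ENCSPFOO is just a name made up for illustration): the outer function receives the whole FRAME! and decides how to run it and what to do with its result.

/encspfoo: enclose spfoo/ func [f [frame!]] [
    print ["enclosure sees x:" f.x]
    1000 + (do f)  ; run the inner frame, then post-process its result
]

Calling ENCSPFOO "hi" should print the enclosure's message, then FOO's internal view as before, and evaluate to 1005 (since FOO returns 5).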

The following might not ultimately turn out to be a great idea. But it's an idea I'm giving a shot to.

I'm trying to make an option available for easily pushing parameters down through the stack, which I'm calling "frame tunneling".

So that's to say you can capture the view of a function at a level where certain variables are visible, and pass that frame down to a lower level that is expecting it.

For instance, let's make a function that you can optionally pass a frame of an augmented function to:

/lower: func [x :augmented [frame!]] [
    print "running lower"
    compose [(x) (if augmented [spread reduce [augmented.y augmented.z]])]
]

And then, let's make a higher level wrapper that adds more arguments:

/higher: adapt (augment lower/ [y z]) [
    print "running higher"
    augmented: binding of $y
]

So what you get is:

>> lower 10
running lower
== [10]

>> higher 10 20 30
running higher
running lower
== [10 20 30]

I wanted to mention this issue, because now that we have this rather effective "functions are black boxes to the outside" mechanic, it's throwing a wrench into some hacks that were able to get away with seeing things they shouldn't have been able to see. So more formal methods of exchanging information between higher layers and lower layers of purposefully collaborating functions are needed.
