A Lot To UNPACK: (Replacing the SET of REDUCE BLOCK! Idiom)

We've fretted a lot about the result of REDUCE when an expression produces something that can't be put in a block. At first this was just NULL. But now it's all isotopic forms.

One reason to fret is the historical idiom of setting multiple values in a block. This was done with a SET of a REDUCE. Something along the lines of:

>> x: 3 + 4
== 7

>> set [a b] reduce [10 + 20 x]
== [30 7]

>> a
== 30

>> b
== 7

It's your choice whether to reduce the values or not. If you don't involve REDUCE, the mechanics all work out. But once NULLs and isotopes enter the picture, the reduced block can't store the values in order to convey them to the SET...
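For instance, if the values are ordinary and inert, SET of a plain BLOCK! poses no problem at all. Here's a minimal sketch of that working case; it's only when one of the reduced expressions produces a NULL or an isotope (say, a failed SELECT) that the intermediate block has no way to hold the result, so it never reaches the SET:

>> set [a b] [10 20]  ; plain values, no REDUCE involved
== [10 20]

>> b
== 20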

But What If A Single Operation Did Both...?

Let's imagine we have instead something called PACK that by default reduces to meta values... and that the SET-BLOCK! is willing to "unpack" them one at a time into UNMETA'd variables.

>> x: 3 + 4
== 7

>> [a b]: pack [10 + 20 x]
== 30

>> a
== 30

>> b
== 7

We can prototype the behavior by making PACK quote a SET-WORD! or SET-BLOCK! on its left, and combine that with the unpacking. PACK manages the evaluation one expression at a time, instead of using REDUCE. So as it goes it can set the variables to NULL or isotopes. And by following the multi-return convention of returning the first value, you avoid ever needing to synthesize a block aggregating all the results together.

>> [a b]: pack case [
       1 = 1 [
           print "This is pretty slick..."
           [select [a 10] 'b, 1 + 2]
       ]
   ] else [
       print "This won't run because the ELSE gets a BLOCK!"
       print "Which is what you want, because the ELSE should be"
       print "what runs if no CASE was matched and have the option"
       print "of providing the block to UNPACK"
   ]
This is pretty slick...
== ~null~  ; isotope

>> a
== ~null~  ; isotope

>> b
== 3

@[...] Can Even Avoid A REDUCE

If you already have a block in reduced or literal form, how would you tell the PACK about that? It could be a refinement like PACK/ONLY. BUT...what if that were signaled with the @ block type?

>> [a b]: pack @[1 +]
== 1

>> a
== 1

>> b
== +

A real source of power in this approach is the ability to mix and match. For instance, some branches in a CASE could have already-reduced data while others need evaluation, and they could all participate in the same PACK operation.

[op1 op2]: pack case [
    ... [
        print "This branch uses values as-is"
        @[+ -]
    ]
    ... [
       print "This branch needs evaluation"
       [operators.1, pick [- /] op-num]
   ]
]

Cool Dialecting Twists

The basic premise of multiple returns is that if you don't know about extra values, you don't have to worry about them... so by default, a pack with more values than variables should just ignore the extras.

>> [a b]: pack [1 2 3]
== 1

>> a
== 1

>> b
== 2

But for some applications, it might be nice to be able to check that an unpacking is exact:

>> [a b <end>]: pack [1 2 3]
** Error: Too many values for vars in PACK (expected <end>)

Borrowing from multi-return: "circling" a value to indicate which one you want the overall expression to evaluate to is a neat idea.

>> [a @b]: pack [1 2]
== 2

And For Show And Tell... A Prototype!

How hard is it to write such a thing, you ask? In Ren-C it's super easy, barely an inconvenience:

(Note: Prototype updated circa 2023...kept around as a test of usermode behavior. But now this is handled by SET-BLOCK! and PACK! isotopes.)

pack: enfixed func [
    {Prepare a BLOCK! of values for storing each in a SET-BLOCK!}
    return: [<opt> <void> any-value!]
    'vars [set-block! set-group!]
    block "Reduced if normal [block], but values used as-is if @[block]"
        [block! the-block!]
][
    if set-group? vars [vars: eval vars]

    ; Want to reduce the block ahead of time, because we don't want partial
    ; writes to the results (if one is written, all should be)
    ;
    ; (Hence need to do validation on the ... for unpacking and COMPOSE the
    ; vars list too, but this is a first step.)
    ;
    block: if the-block? block [
        map-each item block [quote item]  ; should REDUCE do this for @[...] ?
    ]
    else [
        reduce/predicate block :meta
    ]

    let result': void'  ; VOID' (meta of void) means "no result chosen yet"
    for-each val' block [
        if result' = void' [
            result': either blank? vars.1 [void'] [val']
        ]
        if vars.1 = <end> [
            fail "Too many values for vars in PACK (expected <end>)"
        ]
        if tail? vars [
            continue  ; ignore all other values (but must reduce all)
        ]
        switch/type vars.1 [
            blank! []  ; no assignment
            word! tuple! [set vars.1 unmeta val']  ; plain var gets real value
            meta-word! meta-tuple! [set vars.1 val']  ; ^var keeps the meta form
        ]
        vars: my next
    ]
    if vars.1 = <end> [
        if not last? vars [
            fail "<end> must appear only at the tail of PACK variable list"
        ]
    ] else [
        ; We do not error on too few values (such as `[a b c]: [1 2]`) but
        ; instead unset the remaining variables (e.g. `c` above).  There could
        ; be a refinement to choose whether to error on this case.
        ;
        for-each var vars [  ; if not enough values for variables, unset
            if not blank? var [unset var]
        ]
    ]
    return unmeta any [result' void']
]
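
To give a quick feel for using the prototype, here is a hypothetical session (assuming the vintage of Ren-C the prototype was written for). A BLANK! in the variable list skips a slot, and the first value landing in a non-blank slot becomes the overall result:

>> [x _ z]: pack [1 + 1, 2 + 2, 3 + 3]
== 2

>> x
== 2

>> z
== 6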

If the ^ and UNMETA seem confusing, the only thing you need to keep in mind is that the META protocol helps when you're storing a value that could be anything...and you still need to distinguish a separate state. I'm keeping the result in "meta" form so that a plain sentinel (the VOID' above) can signal that it hasn't been assigned yet. I could use a separate boolean variable instead, but then I'd have another variable to juggle, and I'd have to GET/ANY the result...
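
As a minimal sketch of the round trip for ordinary values (whose meta forms are simply one quote level higher); NULL and the isotopes are the interesting cases, since their meta forms give you something that can actually be stored in a block or a variable and compared:

>> meta 1020
== '1020

>> unmeta '1020
== 1020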

I'm sure people will start getting the hang of it! :slight_smile:
