YIELDER and GENERATOR (and thinking about Coroutines)

The stackless model so far has been built on a generic and comprehensible building block called a YIELDER. I thought I'd walk through it a little.

To understand YIELDER, first look at GENERATOR

I think a generator is pretty easy to understand. It is like a function, but instead of RETURN it has something called YIELD. Each time YIELD is called the generator gives back the result but is left in a suspended state, and the next call to the generator will pick up right after the YIELD:

counter: generator [
     let n: 0
     cycle [
          n: n + 1
          yield n
     ]
]

>> counter
== 1

>> counter
== 2

Generators are building blocks, meant to be used with functions

Generators don't take parameters. So if you want to parameterize them, you should combine them with a function. Imagine you wanted to be able to specify a bump amount for your counter:

make-counter: func [bump] [
     return generator [
         let n: 1
         cycle [
             yield n
             n: n + bump  ; advance by the parameterized amount
         ]
     ]
]

>> counter: make-counter 5

>> counter
== 1

>> counter
== 6

>> counter
== 11

But functions aren't limited to being just "generator makers"...

For instance: functions can be generator wrappers, that actually delegate to the generator...or perhaps even destroy it and make new ones. Consider making a resettable counter, as @giuliolunati has in his GENERATE usermode generator:

 counter: func [/reset <static> n (0) gen (null)] [
     if reset [n: 0, return]  ; reset the count; the suspended generator is kept
     return reeval gen: default [  ; DEFAULT only makes the generator on the first call
         generator [
             cycle [yield n: n + 1]
         ]
     ]
 ]

 >> counter
 == 1

 >> counter
 == 2

 >> counter/reset

 >> counter
 == 1

 >> counter
 == 2

This gives a lot of flexibility in the design of generator interfaces. Considering the above example alone: what if you think COUNTER/RESET should have returned 1, instead of being a separate step with no return result? Or maybe you think it should have returned the last value the generator produced.

By making generators a "simplistic" building block, you're in control of these interface choices.
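
For example, if you wanted COUNTER/RESET to hand back the start of the fresh sequence, here's one way it might look (just a sketch, reusing the <static> trick from above and letting the reset case fall through into making a new generator):

counter: func [/reset <static> n (0) gen (null)] [
    if reset [
        n: 0
        gen: null  ; throw the old suspended generator away
    ]
    return reeval gen: default [  ; a reset call falls through and returns 1
        generator [
            cycle [yield n: n + 1]
        ]
    ]
]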

The YIELDER hybridizes with functions for efficiency

I said that generators don't have parameters or a function spec, but that is because they are a specialization of a version that does have a spec... called a YIELDER.

weird-print: yielder [x] [
    cycle [
        print ["Odd print:" x]
        yield none
        print ["Even print:" x]
        yield none
    ]
]

>> weird-print "Hello"
Odd print: Hello

>> weird-print "Weird"
Even print: Weird

>> weird-print "World"
Odd print: World

This isn't anything you couldn't have achieved with a function that wrapped a generator, held that generator statically, and sub-dispatched to it. It's just cleaner and more efficient. (Since GENERATOR is implemented as yielder [] [...generator body...], it's kind of like the DOES analogue to FUNC.)
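
To make that concrete, here's roughly what the hand-rolled version might look like (just a sketch, shuffling the argument into a <static> so the suspended generator body can see each call's value):

weird-print: func [x <static> arg (null) gen (null)] [
    arg: x  ; copy this call's argument where the generator body can see it
    gen: default [
        generator [
            cycle [
                print ["Odd print:" arg]
                yield none
                print ["Even print:" arg]
                yield none
            ]
        ]
    ]
    return reeval gen
]

The YIELDER spares you that shuffling, since each call's x is visible to the suspended body directly.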

But this kind of gives you a sense of the parts box you have for building relatively efficient generator-type things.


Comparing with Python2 Coroutines

Generators are related to coroutines...and I thought I knew what they were 🤨. But looking at examples of Python coroutines made me wonder exactly what the difference was, and what they could do that yielders + wrapping functions could not.

So I thought I'd try translating a coroutine example. A top hit for Python coroutines is apparently https://www.geeksforgeeks.org/coroutine-in-python/. I now take it this is out of date, and Python3 looks to have a much more JavaScript/C# async/await type feel to it.

But just for the heck of it, I went ahead and converted what was there to see how much of it YIELDER could handle. Look over their example, and then what I put together...without changing names (just adding a couple of comments):

producer: func [sentence [text!] next_coroutine [action!]] [ 
    ; 
    ; Producer which just split strings and 
    ; feed it to pattern_filter coroutine 
    ;
    let tokens: split sentence space 
    for-each token tokens [ 
        next_coroutine token  ; Python says `next_coroutine.send(token)`
    ] 
    next_coroutine null  ; Python says `next_coroutine.close()`
]
  
pattern_filter: func [next_coroutine [action!] /pattern [text!]] [
    pattern: default ["ing"]
    print ["Searching for" mold pattern]

    ; Search for pattern in received token  
    ; and if pattern got matched, send it to 
    ; print_token() coroutine for printing 
    ;
    return yielder [token [<opt> text!]] [
        while [token] [  ; Python does a blocking `token = (yield)`
            if find token pattern [
                next_coroutine token
            ]
            yield void 
        ]
        next_coroutine null  ; Python throws when `token = (yield)` exhausted
        print "Done with filtering!!"  
    ]
]

print_token: func [] [
    ; 
    ; Act as a sink, simply print the 
    ; received tokens 
    ;
    print ["I'm sink, i'll print tokens"]

    return yielder [token [<opt> text!]] [
        while [token] [  ; Python does a blocking `token = (yield)`
            print [token]
            yield void
        ]
        print "Done with printing!"
    ] 
]

let pt: print_token
let pf: pattern_filter :pt
  
let sentence: "Bob is running behind a fast moving car"
producer sentence :pf

The output matches the example.

They're similar...the key difference is the bizarre overloading of the (yield) syntax in these routines, which is about suspending to take in a value instead of giving one back. It provides a blocking call that creates a kind of inversion of control. It's a subtle difference, but it allows a "listener" model in which a single "I'm done" publication by the producer causes a chain reaction of done-ness.

This does look like it's leading toward async/await, which as I mentioned is what the Python3 stuff looks like. As I shuffle around how this kind of producer/consumer pattern might be managed more legibly, I feel a resonance with the writeup of "What Color is Your Function". That author's conclusion seems to match mine (and maybe others'): Go's channels and "goroutines" are probably a better answer than trying to emulate these more bizarre coroutine examples.


I took a day to tinker and see what I could manage in mocking up Goroutines, standing on the shoulders of the stackless work.

Based on that, here's a reimagining of this sample in the style of Go...that works!

producer: func [sentence [text!]] [ 
    let out: make-chan
    let tokens: split sentence space 

    go (func [] [
        for-next t tokens [  ; FOR-EACH is not stackless (yet), so...
            send-chan out t/1  ; ...FOR-EACH can't unwind when this blocks
        ]
        close-chan out  ; tells receivers that no more tokens are coming
    ])

    return out
]
  
pattern_filter: func [in "channel" /pattern [text!]] [
    pattern: default ["ing"]
    print ["Searching for" mold pattern]

    let out: make-chan

    go (func [<local> token] [
        while [token: receive-chan in] [  ; gets null once IN is closed
            if find token pattern [
                send-chan out token
            ]
        ]
        close-chan out
        print "Done with filtering!!"
    ])

    return out
]

print_token: func [in] [
    print ["I'm sink, i'll print tokens"]

    let done: make-chan
    go (func [<local> token] [
        while [token: receive-chan in] [
            print [token]
        ]
        close-chan done

        print "Done with printing!"
    ])

    return done
]

  
let sentence: "Bob is running behind a fast moving car"

unfiltered: producer sentence
filtered: pattern_filter unfiltered
done: print_token filtered

receive-chan done

This replaces the idea of coroutines speaking to each other directly with the idea that they synchronize and communicate through "channels".

  • The producer isn't itself a "coroutine" but an ordinary function that instantiates a goroutine, and returns a channel object that represents its stream of produced tokens.
  • The filter also isn't a coroutine... again an ordinary function, but one that takes a channel as an input parameter...returns a filtered channel as an output. To get the filtered channel, it instantiates a goroutine whose job it is to take items from in and decide which ones to send to out.
  • The printer once again is an ordinary function that creates a goroutine to dump out a channel that it receives.

You get the desired output:

Searching for "ing"
I'm sink, i'll print tokens
running
moving
Done with filtering!!
Done with printing!

This seems a much smarter path than trying to deal with a contortion of generators like "special yielding syntax"! Does it look clear?

Note: I didn't have to write it as functions that run goroutines and return channels... in fact, most examples you see in Go aren't styled this way. Instead the main code makes the channel and then calls go on a goroutine that is parameterized with the channels. But I thought it made it clearer who was responsible for what, e.g. to have the producer both make the channel and close it--vs being given a pre-made channel to fill.
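
For a taste of that more conventional styling, here's a rough sketch of just the producer redone that way (a hypothetical restyling, using the same MAKE-CHAN / SEND-CHAN / CLOSE-CHAN primitives as above):

producer: func [sentence [text!] out "pre-made channel to fill"] [
    let tokens: split sentence space
    go (func [] [
        for-next t tokens [
            send-chan out t/1
        ]
        close-chan out  ; someone still has to take responsibility for closing
    ])
]

unfiltered: make-chan  ; main code makes the channel...
producer sentence unfiltered  ; ...and hands it to the producer to fill

Same behavior, just a different answer to who owns the channel.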
