Debunking the Arity-1 MAKE

Rebol2, R3-Alpha, and Red embrace the idea of an arity-1 MAKE, which is used for every type.

When a single parameter isn't sufficient to create a value of some datatype, the arguments just get stuffed into a BLOCK!. That led to things like this FUNC definition in the R3-Alpha bootstrap...which is a weird two-step process. FUNC wants to copy the spec and body in most cases, but the mezzanine is optimized and does not write code in a style that would need that copy...so there's a "funco" helper:

funco: make function! [[
    {Non-copying function constructor (optimized for boot).}
    spec [block!] {Help string (opt) followed by arg words}
    body [block!] {The body block of the function}
][
    make function! reduce [spec body]
]]

func: funco [
    {Defines a user function with given spec and body.}
    spec [block!] {Help string (opt) followed by arg words}
    body [block!] {The body block of the function}
][
    make function! copy/deep reduce [spec body]  ; (now it deep copies)
]
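
To be concrete about the cost: every call in this style REDUCEs the spec and body into a throwaway two-element block, which MAKE then immediately picks apart again:

>> spec: [a]
>> body: [a + 1]

>> reduce [spec body]
== [[a] [a + 1]]  ; intermediate block exists only to be split back up by MAKE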

Is This Just "Make-Work"? (pun intended)

Imagine a different approach, in which FUNC itself is just a native that takes two parameters and makes a function out of them. There would be several benefits:

  • You don't need to run a REDUCE to make a block that you're ultimately just going to separate back into two blocks. That's wasted space and computation. FUNC receives the two blocks separately.

  • You avoid looking up an entry in a table of MAKE dispatchers, then breaking down a block and re-type-checking it to make sure it's a block of exactly two other blocks. FUNC's own type checking takes care of that on the independent properties.

  • If there's a low-level property, such as not copying the blocks passed in, it could be handled by a refinement (/NOCOPY), which makes the distinction clearer at the call sites that use it.
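
Under that hypothetical design, the bootstrap code above would collapse to something like the following sketch (the native spec and the /NOCOPY refinement are illustrative, not an existing API):

; hypothetical arity-2 native (sketch, not actual shipping code)
func: native [
    {Defines a user function with given spec and body.}
    spec [block!] {Help string (opt) followed by arg words}
    body [block!] {The body block of the function}
    /nocopy {Use the spec and body as-is, without deep copying}
]

; bootstrap call sites that don't need the copy would just say:
f: func/nocopy [x [integer!]] [x + 1]

No FUNCO helper, no REDUCE, no intermediate block.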

Faster and clearer. And no one really uses the "polymorphism" of make some-type some-definition without knowing what the type or the definition is, right? That sort of genericity seems essentially meaningless.

So What Is The Theoretical Value of Arity-1 MAKE?

I'd assumed that the actual theoretical value was something along the lines of having a serialization form for every type as a BLOCK! (or other simple type).

Perhaps the reasoning went:

  • if you can create any type from a single argument like a BLOCK!
  • ...then that means you can MOLD any instance of a value into that same representation.
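
And for many simple cases that correspondence does hold. For instance, an OBJECT! with no external bindings round-trips (sketched here in Rebol2 style):

>> obj: make object! [x: 10]

>> loaded: do load mold obj  ; MOLD produces a re-MAKE-able form

>> loaded/x
== 10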

So in the early days of Ren-C, I tried to enforce a correspondence between make function! [...] and #[function! [...]]. The code dispatched by MAKE was the same code that constructing a molded function would call.

That was a nice thought, but due to binding, it doesn't actually work. A basic disproof:

>> obj: make object! [y: 10]

>> f: func [x] bind [x + y] obj

>> y: 100

>> molded: mold f
== "#[action! [[x] [x + y]]]"

>> g: load molded

>> f 1
== 11

>> g 1
== 101

The only real way to preserve the binding connections of loaded things is some kind of binary serialization format (like the one "redbin" is pursuing)...or always storing the session's memory as a persistent VM state (as Smalltalk's Squeak and similar systems do).

...any counter-arguments in defense of the usefulness of Arity-1 Make?

A post was split to a new topic: MAKE Should Be Using Dialected Constructors