New Build Executables, New Build Strategy

It's time to make a new batch of build executables. I think it can no longer be avoided: we need a simplified build process, and having new and recent executables provides a big bag of tricks for writing one. But first, let's review a little history.

The story of the build system goes like this:


R3-Alpha

R3-Alpha's build was driven by a slightly templated GNU makefile. You can see the text of the makefile template in %make-make.r, which you would run with the OS_ID of the system you wanted to build for. It pulled together a list of files from %file-base.r and made you a makefile.

It was a simple approach. But it meant you had to have GNU make on your platform, which made it an unworkable way to build on Windows. The README said:

This first release is intended for non-Windows systems like Linux, Mac, BSD,
Android, etc. However, it will build for Windows, but I've not included the
typical MSVC project files. I'll try to get that added, but it's not that
tough to create one yourself.

One of the steps run by the makefile was called "prep". This ran Rebol script code that generated a bunch of files to support the build. Responsibilities of this step were:

  • Pack up all the mezzanine Rebol text sources--with comments removed--into a compressed blob of data in a .c file. The C array of bytes in that file was effectively an embedded resource, used to bootstrap the system.

  • Look through the C sources of the interpreter and find function definitions. Then automatically generate an aggregate .h header file with those definitions. This keeps you from having to update function prototypes in more than one place. One downside of having everything automatically glommed into one big header is that the architectural lines between files aren't as clear as they would be if the headers were more granular...but overall I'd say it works out as an advantage.

  • Take some Rebol-structured tables and transform them into C code. Plenty of projects have similar steps to convert tabular data in whatever format into .h and .c files. Rebol is a better medium for representing freeform intent than what most people use.
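As an illustration of the first responsibility, here's a minimal sketch, in Python rather than Rebol, and not the actual prep code (the array name `Native_Specs` and the comment-stripping rule are my own simplifications), of packing comment-stripped sources into a compressed C byte array:

```python
# Sketch of "pack sources into a .c file": strip full-line comments,
# compress the concatenated text with zlib, and emit a C array of bytes
# that the interpreter could embed and decompress at boot.
import zlib

def strip_comments(rebol_source):
    """Drop full-line Rebol comments (lines starting with ';')."""
    kept = [line for line in rebol_source.splitlines()
            if not line.lstrip().startswith(";")]
    return "\n".join(kept)

def emit_c_blob(sources, array_name="Native_Specs"):
    """Compress concatenated sources and render them as a C byte array."""
    text = "\n".join(strip_comments(src) for src in sources)
    data = zlib.compress(text.encode("utf-8"))
    bytes_per_line = 12
    lines = []
    for i in range(0, len(data), bytes_per_line):
        chunk = data[i:i + bytes_per_line]
        lines.append("    " + ", ".join(f"0x{b:02X}" for b in chunk) + ",")
    return (
        f"const unsigned char {array_name}[{len(data)}] = {{\n"
        + "\n".join(lines)
        + "\n};\n"
    )
```

The real step did more (and real Rebol comment stripping needs a proper scanner, since ";" can appear inside strings), but the shape is the same: text in, embedded C resource out.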

The "prep step" has evolved over time, but all of the "make" processes run it in the same way. So what we're talking about changing here is really just about how the invocation of the compiler and linker is done.

Atronix R3

Atronix builds against a lot of dependencies, and thus uses CMake. CMake is a beast with a kind of archaic syntax. But if you have a complex cross-platform C or C++ project, you don't really have much in the way of better options. It has widely supported plugins for making sure your system pulls together the prerequisites it needs.

They used CMakeLists.txt files...written and maintained by hand. There was no generation process, and they did not use the makefile.

Earl's %build.reb Demo

The concept of using Rebol to build itself came up, by using CALL to invoke the compiler. @earl made a quick and dirty example of this, called %build.reb. It was a brief demonstration of a promising idea.

A disadvantage it had was that it could only do full builds. GNU make and CMake sense which files have changed, and rebuild only those. Doing full builds on every change is not viable for day-to-day operation.

Another disadvantage is that this couldn't be used to bootstrap onto a new platform. You can't have the only way to make an executable be that you already have one.

But this was a promising idea...and being able to build without needing GNU make, especially on Windows, was an idea that stuck.


Shixin's Rebmake

Shixin didn't care for maintaining the handwritten CMakeLists.txt in parallel with changes to the %make-make.r process. So he wanted a unified build.

But rather than thinking of the build as abstracting a makefile, as R3-Alpha was doing, he thought of it as abstracting a CMake file...which is much more complex. He wanted something that was "meta" enough that it could build to a CMakeLists.txt, a GNU makefile, a Visual Studio project file, a command-line MSVC nmake file... or just drive the process through CALL as %build.reb had done.

The result of this ambitious process was the generalized engine of %rebmake.r. But that was a generic framework, which then needed to be customized specifically for building the interpreter by the client script %make.r (the names are arguably backwards... %make.r would make more sense as the framework, and %rebmake.r as the script for building Rebol).

Additionally, rebmake tried to accommodate two important new features:

  • Per-OS or build type configuration files (%configs/) to help with common builds

  • Having make specifications in the folders of extensions, so that their build recipes could be broken out from the main file. For instance, today's Crypt module has extensions/crypt/%make-spec.r

To most observers (including myself), Rebmake is way too big. It makes sense if you think the goal is to generate CMake files or Visual Studio files. But what you wind up with in this line of thinking is kind of the direct opposite of Rebol's goals: instead of making a thing that is significantly less complicated than those systems, you have something that inherits all their complexity and then layers on more.

Giulio's makefile.reb experiment

We pretty much all agree Rebmake has to go, but build systems are just naturally difficult...and not a lot of fun.

@giuliolunati recently took a stab at seeing what kind of a minimal direction might be taken, and generated a %makefile.reb that steered back closer to the ideas of R3-Alpha. But there's a complicated set of requirements in this problem, so it's easy to see why it stalled.

His file is a good reminder of the kind of scale we're looking for the build solution to be on: to think smaller and simpler, and cut things back.

Pulling Together A New Design

So taking all this experience into account, I'm ready to take a stab at starting anew. I want to back off from thinking of this as building a generic make system you'd use for other C projects...I think that's a bad idea.

Here's my rough strategy:

  • Make the main build system we use day-to-day driven by Ren-C using CALL. That means it needs to check object file time stamps vs. the source files, and be incremental.

  • Forget about makefiles or Visual Studio projects or otherwise. For bootstrap, just have a way that the list of command lines which would be run by CALL can be output to a .bat or .sh file. A full build is good enough...then once you're bootstrapped, use the executable from then on.

  • Use a circa 2021 Ren-C to do the builds, unlocking more superpowers in the make script design. Hopefully @rgchris and I can coordinate so that this is "R3C"...with a maintenance contract and quality control. That's something the haphazard build-picking process hasn't done before in the history of Rebol, and would in itself be refreshing.

  • Make Building as a C library the Main Mission. Strangely enough, the web build was made available as a library with no included "main()" before we ever offered a similar static library for C. That's kind of silly. If a C library were enabled, we could break out the console into its own repository...or put together things like demo projects on GitHub that offered desktop GUIs. This was the original concept and it just never happened. If not now, when?

  • Revive Granular Extensions, in Particular Wasm Side-Modules ("the DLLs of Wasm"). There was a nice concept in Rebmake that extensions could be not built (-), included statically in the executable (+), or made as a DLL/shared-library (*). The DLL option did not really get tested and hasn't worked for a while. It needs to...and with the primacy of the web build, we have to get that working for loading extensions as little Wasm fragments. We also need to make it easier for extensions to live in their own git projects, vs. in the main interpreter repository.

  • Try building using the TCC extension in a web browser. This isn't necessarily as far fetched as it sounds. Ren-C running in wasm in the browser could pull a .ZIP of the sources, let you click some customizations, and using the person's own CPU power in the browser build an x86 executable that you could then download. It's really just a recycling of the conference demo.
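To make the first two bullets concrete, here's a minimal sketch of the CALL-driven loop, in Python standing in for Ren-C (the function names and the `cc` default are my own assumptions, not anything in the codebase): compare object-file timestamps against sources to decide what needs compiling, then either invoke the compiler directly or dump the command lines to a .sh file for bootstrap.

```python
# Sketch: incremental compile planning plus "run it or dump it to a script".
import os
import subprocess

def needs_rebuild(source, obj):
    """True if the object file is missing or older than its source."""
    if not os.path.exists(obj):
        return True
    return os.path.getmtime(source) > os.path.getmtime(obj)

def plan_commands(sources, objdir, cc="cc", flags=("-c",)):
    """List the compile commands for sources whose objects are out of date."""
    cmds = []
    for src in sources:
        base = os.path.splitext(os.path.basename(src))[0]
        obj = os.path.join(objdir, base + ".o")
        if needs_rebuild(src, obj):
            cmds.append([cc, *flags, src, "-o", obj])
    return cmds

def run_or_dump(cmds, script=None):
    """CALL the compiler for each command, or write a bootstrap .sh file."""
    if script is not None:
        with open(script, "w") as f:
            f.write("#!/bin/sh\n")
            for cmd in cmds:
                f.write(" ".join(cmd) + "\n")
    else:
        for cmd in cmds:
            subprocess.run(cmd, check=True)
```

A real version would also have to track header dependencies (e.g. via the compiler's -MMD output), which is where most of the honest complexity in incremental builds lives.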

As usual, this is all daunting stuff, but with some focus it may be realistically doable. There's a lot of experience here to draw on...at least enough to know what NOT to do...!


Encouraged by my increased familiarity with GitHub Actions from doing the TCC Bootstrap, I took it further and have done our first Windows-Server-based CI builds!

This contrasts with how we did Windows builds on Travis, which was with cross-compilation using MinGW: we built Windows executables on Linux. This wasn't so bad for just the basic core of the interpreter. But once you stretched outside of that to wanting to link to strange libraries (ZeroMQ, FFI) you would run into problems, because those libraries expected you to be running Windows if building for Windows.

To build on Windows Server without a GUI, I had to brush up the command-line-based building with Microsoft NMAKE...which is working well enough that I just deleted the support for Visual Studio altogether.

And good riddance! VSCode handles all the necessary C/C++ browsing features: you can jump to declarations, get autocompletion, and debug...all without needing some gargantuan XML project file.

This is a good step back toward the direction we want to go:

The TCC Bootstrap is helping put a sharper focus on what we actually want out of Rebmake: to build without needing any CMake or GNU Make or otherwise. Now to just make it good...


Instead of one giant thing building for every OS possible, it would make more sense to have one smaller thing building for a specific target.

Even if that means some double maintenance, it has to be done by someone who cares, and it would be doable because of the examples of existing build methods around.

And that would hold also for new targets.


Good thing you got rid of this. I had to work around this problem, because it was misleading when I tried to compile for Windows on a Windows OS.
