{ Rethinking Braces }... as an array type?

I've historically been pretty attached to braces for strings. They sure can be nice.

But I am increasingly thinking braces might be better applied as a new array type, called FENCE!:

[block] (group) {fence}

>> fence: first [{This [would] be @legal}]
== {This [would] be @legal}

>> length of fence
== 4

>> second fence
== [would]

(I like that FENCE is five letters...matching GROUP and BLOCK, and I like that it starts with a distinct character. The name is perfect, and I don't mind it conflicting with the three-backtick "code fence" notation name from other languages!)

So it would act like a BLOCK! or a GROUP! when it was inert. But the real benefit would be the idea that if this braced form got evaluated, it would effectively do a MAKE OBJECT! (or "something along those lines")

>> obj: {x: 10 y: 20, z: 30}
== make object! [  ; whatever this representation is, it's not {x: 10...}
    x: 10
    y: 20
    z: 30
]

This kills two birds with one stone: A neat new dialecting part that would also give a better source notation for objects!

Having its evaluator behavior be "make an object" pushes this from "frivolous third form of array" to being clearly useful on day 1. But I think the array form would soon turn out to not be frivolous.

Carl Himself Wants To Move Away From Braced Strings

In Carl's "ASON" pitch, he moves away from Rebol's choice to make braces an asymmetric string delimiter:

  • "Braces {} are used to denote objects. They are lexical and may be used directly without evaluation (the make constructor is not necessary)."

  • "Braces {} are not used for multi-line strings. A single+double quote format is used for multi-line strings."

I must admit braced strings can make a lot of situations in the text-processing world look better than they otherwise would.

But it comes at the cost of taking the asymmetric delimiter, and is a real weakness when compared against JavaScript and JSON. When rethought as this fun new dialecting part, it actually offers a new edge and plays to Rebol's strengths.

What might the new {...} type do in PARSE? As a branch type? In your own dialects?

My {...} Proposal Is Arrays, Not Object Literals

It might seem like having a source representation of objects that maps directly to the loaded, in-memory representation would be better. But in practice, you can't really get the loaded form to ever look completely like the source...there are so many issues with nested cyclical structures, or things that just don't mold out.

It doesn't work in JavaScript either. Note that you're not supposed to be loading JSON directly into JavaScript in any case...you're always supposed to go through parsers and serializers. So that should be weighed when looking at the suggestion of a structural type that happens to evaluate to give you an in-memory representation.

Map Representation Via : ?

There was another remark in the Altscript pitch on the role of the colon:

For JSON compatibility:

  • Keys (word definitions) can be written with quotes ("field":)
  • A lone colon (:) will automatically associate to the word/string immediately before it.
  • Commas as element separators are allowed as long as they are not directly followed by a non-digit character (to avoid confusion with comma-based decimal values.)

The note about the colon seems like it might be good for maps.

mapping: {
    1 : "One"
    "Two" : 2
}

This could help avoid the need for SET-INTEGER! or similar.
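(For comparison, today's way of getting non-word keys into a MAP! is to skip SET-WORD!s entirely and just alternate keys and values in a spec block. A small transcript, assuming Ren-C/Red-style MAP! behavior:)

>> mapping: make map! [1 "One" "Two" 2]

>> select mapping 1
== "One"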

4 Likes

Yes, "make object!" must die. I have wasted much time in the past trying to work out how to use blocks instead of objects just to avoid the mess that objects make on mold of a structure.

Yes please!

Damn, that's true, but surely there's a solution somewhere. Nesting should be straightforward, but cycling I guess needs to represent a context reference, and we don't have identifiers for contexts.

123 : {bizarre-thought : "is something like this the new type of" "context" cycling: @123}

But if that was feasible and allowed, would there be any significant difference between { and [ ?

3 Likes

Well my point is about ambiguity:

>> first [{a: 10 b: 10 + 20}]
== {a: 10 b: 10 + 20}  ; a FENCE! of length 6

>> {a: 10 b: 10 + 20}
== ???a 10 b 30???  ; an object or map or dictionary or something.

There would be some kind of semi-serialization operator:

>> obj: {a: 10 b: 10 + 10}
== ???a 10 b 20???

>> to fence! obj
== {
    a: 10
    b: 20
}

Which could navigate you from the object back to "what would create me". But this has limits.

And maybe that's what MOLD is. (Not that I ever liked the name "mold"... SERIALIZE?)

I'm just saying that the console needs to keep you grounded when you're talking about something that's in memory and structured vs. the source array notation. They're different and it's a losing battle to not keep people aware of that difference.

3 Likes

17 posts were merged into an existing topic: Alternate String Forms if {...} Becomes An Array Type

In Carl's "ASON" pitch, he proposes lexical braces to mean objects:

I don't think this would satisfy JavaScript programmers...nor Rebol programmers for that matter...as a replacement for MAKE OBJECT!. (Note there is no source code for ASON, so we don't know how many scenarios he has looked at or hasn't.)

The problem is that if you take braces to mean objects literally, it raises the question of when expressions would be evaluated... or if expressions are allowed at all.

You can't evaluate expressions at LOAD time because the variables and contexts are not set up. So what would you get from this?

center: {x: 10, y: 20}

points: [
     {x: center.x - 10, y: center.y - 10}
     {x: center.x + 10, y: center.y + 10}
]

print ["Point 1's X is" points.1.x]

See the problem? When could these expressions be evaluated? They couldn't be evaluated at load time (it doesn't know what CENTER is yet, the line assigning it hasn't happened). And it can't be when POINTS is assigned, because the BLOCK! is inert.

It seems you simply can't use expressions when braces are used to lexically represent an object instance.
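(The load-time limitation is easy to see with LOAD as it exists today; a small transcript, using Ren-C-style tuple syntax:)

>> load "x: center.x - 10"
== [x: center.x - 10]  ; pure data, nothing evaluated -- CENTER need not exist yet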

How Does Red's "Lexical MAP!" #(...) Deal With This?

It doesn't.

red>> points: [#(x: 1 + 1 y: 2 + 2)]
== [#(
    x: 1
    +: 2
    y: 2
)]

Since it couldn't evaluate the expressions it just did something weird, as if you'd written:

points: reduce [make map! [
    x: 1
    +: 1
    y: 2
    +: 2  ; overwrites earlier mapping of + -> 1
]]

:roll_eyes:

So if you want expressions, you need to go through the normal process of MAKE MAP!.

How Does JSON Deal With This?

It doesn't.

JSON is narrowly defined to not allow expressions...because it is for exchanging data.

You are discouraged from actually using a plain JavaScript eval() to process a JSON message precisely because it can evaluate...which could lead to bugs or vulnerabilities.

That doesn't change the fact that {...} within the JavaScript language evaluate when the line containing them is run. In this usage they are called object initializers.

How Would The FENCE! Proposal Handle This?

The proposal differs from "{} as lexical object" because it says that FENCE! is just another category of array...which when evaluated gives you an OBJECT!.

Arrays would be a case where this would surprise a JavaScript programmer, because they expect an array initialization to run the evaluations of the object initializers inside it. That's because in JavaScript, arrays evaluate their contents without needing anything like a REDUCE step.

But this applies for any expression, not just objects:

js>> var test = [1 + 2, 30 + 40]
js>> test
<- Array [ 3, 70 ]

js>> var test = [{x: 1 + 2, y: 30 + 40}]
js>> test
<- Array [ {...} ]
   > 0: Object {x: 3, y: 70}
     length: 1

We understand that the first case in Rebol would need some sort of evaluation request like REDUCE to run the expressions. With FENCE! the way I propose it, there would have to be a similar request for evaluation.
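(For reference, here's what the Rebol side of that comparison looks like with plain BLOCK!s today -- nothing runs until evaluation is explicitly requested:)

>> [1 + 2 30 + 40]
== [1 + 2 30 + 40]  ; inert, no expressions were run

>> reduce [1 + 2 30 + 40]
== [3 70]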

This calls for operators that parallel JSON.parse(), which would enforce that there weren't expressions. But also operators that would be more COMPOSE-like...looking for all the FENCE!s in nested arrays and generating new BLOCK!s with OBJECT!s in their place.

What About Accessing FENCE! In Object-Like Ways?

There's historical precedent of being able to pick from blocks like they were objects. Fences could try that as well.
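(The precedent in question, roughly: in Rebol2/Red-style path selection, a path applied to a plain BLOCK! does a SELECT-like lookup. A small transcript for comparison:)

>> data: [x 10 y 20]
== [x 10 y 20]

>> data/x
== 10  ; finds the word X, returns the value after it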

>> points: [{x: 10, y: 20}]
== [{x: 10, y: 20}]

>> type of points.1
== #[datatype! fence!]

>> points.1.y
== 20

But this leads to complex questions. What if they are expressions? (The accessor could error if it doesn't see a comma?) How would NULL be represented? (Going directly to the next SET-WORD?) FENCE! is a generic array so you can APPEND arbitrary material to it...what happens if you add another X?

I don't know how beneficial it is to go along with this illusion, but the subject matter already has a post for discussing it:

BLOCK! and OBJECT! Parity in Pathing/Picking

One nuance here is that if braces are used principally for objects, then maybe @rgchris's wish for BLOCK!s to think in terms of /SKIP counts (alternating keys and values?) could coexist with a SET-WORD!-seeking rule for FENCE!.

Summary...?

The lexical brace concept (which I have never lent my support to) would not replace MAKE OBJECT!, because it cannot be used for expression evaluation. It would be creating the objects it represents at LOAD time, not when expression contexts were known. It would thus serve a very niche purpose of making data exchange somewhat more JavaScript-compatible.

The FENCE! proposal would be able to replace MAKE OBJECT!, but would be essentially a synonym for that functionality today. You would still need to be in an evaluative context to get the OBJECT!...and until then it would remain as an unevaluated FENCE!. But at least you can use it for more than just data exchange. And on the plus side, this opens up fences for interesting dialecting purposes.

2 Likes

I feel like I've "disproved" the value of braces for lexical objects. And so the somewhat glum finding is that you aren't really getting all that much bang for your buck out of a new brace array type--there's not much it does that you couldn't do before with MAKE OBJECT!.

In fact, the new brace type couldn't be used in a lot of places. You wouldn't want to derive objects using it, because it would greedily create objects:

base-object: {a: 10, b: 20}
derived-object: extend base-object {c: 30}

This would wind up creating a useless intermediate OBJECT! with a key of C and a value of 30, because the parameter would be evaluated before EXTEND received it. You'd be better off passing [c: 30] so the structure could be analyzed and integrated into a single newly created object.
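(That is essentially how derivation already works with MAKE and a BLOCK! spec today -- the block is analyzed and bound into a single new object. A rough transcript, using Ren-C-style dot access:)

>> base-object: make object! [a: 10 b: 20]

>> derived-object: make base-object [c: 30]

>> derived-object.c
== 30  ; A and B inherited, C added -- only one new object created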

These sorts of findings rain on the parade of the idea of braces for objects.

4 Likes

If anyone could make this work in Rebol, it would be Carl. But it just might not feel like Rebol to most.

I believe that's the "why" behind his ASON pitch. He really believes this could be a better way to do a multifaceted approach with exchanged data.

I think Altscript would set the rules for evaluation. And you, through rules, would set Altscript.

I know someone else crazy enough to make this work.

Hostilefork...

Yeah, I'm just a little crazed that Rebol didn't do this out of the block... em, box, from the beginning, to help with code compatibility from the get-go.

So maybe let Rebol-Dom inspire you. Not the code (Lordy, it's all just prototype), but the idea of how the "{}" are being used.

How the DOM or node-element is the arrayed pack_age and the array-obj! var is the set builder notation.

It works very well with "{}" as an object, array, a replacement for Rebol blocks, and the brace is still used for interpolated printing.

For me it makes dialecting very easy. I can set the data to look like almost anything and still process it. Thank you, "{}"s.

Once the prototype is done I hope to get this to run in Ren-C and Redbol. Who knows, 5 to 10 years from now, maybe even ASON.

Well, actually it's JSON that got it wrong.

1 Like

Starts at 20:33 into the video, Rebol reference about a minute later.

A post was split to a new topic: JSON Envy: Serialization Dialect in Rebol?

I've been a little bit back on the FENCE! regarding the braces as a new type. The original proposal was this:

An alternative concept would be that it could be an inert new array type.

It would mean that we could unify PRINT and ECHO. We'd just say that how PRINT interpreted fences was different:

>> x: 1000

>> print ["Block semantics:" x + 20]
Block semantics: 1020

>> print {Fence semantics: @(x + 20)}
Fence semantics: 1020

(This is bolstered, albeit perhaps a bit deceptively, by the wide-ranging allowance of tokens in Ren-C... as well as by "sea of words" not creating variables for everything just because a WORD! is used... along with the word-symbol GC behavior vs. growing the word IDs unboundedly.)

So we'd not be all the way to getting an object from {...} directly. You'd still need some kind of operator for that, and picking a short word is probably a good idea... e.g. taking over MAKE:

>> {x: 10, y: 10 + 10}
== {x: 10, y: 10 + 10}

>> make {x: 10, y: 10 + 10}
== #[object! x: 10 y: 20]  ; whatever the rendering is

But we'd need something like this anyway. I mentioned that considering braces a "literal" notation would be DOA...and that there has to be some evaluative push anyway when your objects are inside a block, as an array--for instance:

 >> make [{x: 1 + 2, y: 3 + 4} {x: 5 + 6, y: 7 + 8}]
 == [#[object!...] #[object!...]]

This wouldn't be a problem if it was inert.

A weirder idea could be that FENCE! evaluates to a FENCE! isotope, and that fence isotopes decay to objects. This would permit anything that wanted to examine the fence structure to do so without having to quote it...and decide to not let it decay.

  • GROUP!s evaluate

  • BLOCK!s "block" evaluation

  • FENCE!s are "on the fence"... if evaluated, they enter an isotopic FENCE! state...so the array material is still there. But they will "decay" into an OBJECT! if nothing that claims interest in fences intercepts them first to interpret their source as structure.

That's probably too weird.

2 Likes

It's good to finally see unbound, inert, dialected code in BLOCK!, and now sequential (molded, bracketed) form, without the heavy use of objects and with less runaway binding being expressed in Redbol.

I've noticed that when the Dialect is code, is the User-mode, is data, and somewhat unbound, especially bracketed, relative expressions are easier to visualize, create, and understand.

The best part is you really can achieve the ...Rebol in the Deep..., even with all the language shortcomings, and run that same code in Rebol 2, Rebol 3, Red, Ren-C, Rye, ASON, etc. without much change.

Very good job, Hostilefork. You're finally entering into a beast mode. Get this right and Rebol itself, its purpose, and its place in the PL community will change.

The summation of the Rebol-Dialect Object Model was the vision for me in Rebol as a pathway forward. That may have come to a halt, because I can no longer log in and update it with past code that's very similar in action to what you've been expressing since January. Oh well.

Maybe the execution is all just tricks and following ducks. But now I get to see Rebol, Red, and Ren-C scrambling forward to give an alternative to Dialect, is data, is code, is User-mode, is (un)binding, all wrapped up as Rebol AI... Analytical Interpretation.

A post was merged into an existing topic: Alternate String Forms if {...} Becomes An Array Type