It might be a question of familiarity rather than objective usability. I'm writing this comment in Latin letters rather than Cyrillic or Hebrew because I find Latin letters much more usable than Cyrillic or Hebrew. But that's because I've been surrounded by Latin letters since I was born, and have only occasionally encountered Cyrillic or Hebrew.
I think it's obvious that Cyrillic isn't any less usable than the Latin alphabet in any objective sense. In fact, I'm using English orthography, which has all kinds of unnecessary usability problems which aren't present in any Cyrillic orthography that I know of. But familiarity is a much stronger factor; even today I can barely sound out words in Russian or Ukrainian, while English text printed in Latin letters is clearer to me than speech.
On theoretical grounds, I suspect that the APL syntax Gabi is calling RL-NOP is less usable for left-to-right readers than at least LR-NOP and maybe even conventional Please Brutally Execute My Dear Aunt Sally operator precedence. But familiarity is such a strong force that this hypothesis is very difficult to test.
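To make the comparison concrete, here is a toy sketch (my own Python, not APL; I'm taking RL-NOP to mean right-to-left grouping with no operator precedence, LR-NOP its left-to-right mirror, and using `*` for `×`):

```python
# Toy evaluators for flat token lists like [2, '*', 3, '+', 4]; just enough
# to show how the three conventions group the same tokens differently.
OPS = {'+': lambda a, b: a + b,
       '-': lambda a, b: a - b,
       '*': lambda a, b: a * b}

def rl_nop(tokens):
    # APL-style: each operator takes everything to its right as its right
    # argument, so evaluation effectively runs right to left.
    if len(tokens) == 1:
        return tokens[0]
    return OPS[tokens[1]](tokens[0], rl_nop(tokens[2:]))

def lr_nop(tokens):
    # Mirror image: each operator takes everything to its left.
    if len(tokens) == 1:
        return tokens[0]
    return OPS[tokens[-2]](lr_nop(tokens[:-2]), tokens[-1])

def precedence(tokens):
    # School rules: '*' binds tighter than '+' and '-', then left to right.
    toks = list(tokens)
    while '*' in toks:
        i = toks.index('*')
        toks[i-1:i+2] = [toks[i-1] * toks[i+1]]
    return lr_nop(toks)

expr = [2, '*', 3, '+', 4]
print(rl_nop(expr), lr_nop(expr), precedence(expr))   # 14 10 10
```

Same tokens, three different groupings: 2×(3+4) under RL-NOP versus (2×3)+4 under the other two.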
The theoretical grounds are that, when reading left to right, a reader must maintain a stack of pending operators and values in their mind, unless they are saved by parentheses. (The Iverson quote disagrees with this, but I think Iverson was wrong.) Maintaining mental stacks is difficult and error-prone; this is the reason for the Tim Peters proverb, "Flat is better than nested."
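As a rough illustration of that stack cost (again my own sketch, with a deliberately crude model of the reader): reading an RL-NOP line left to right, nothing can be reduced until the end of the line, whereas with precedence rules earlier subexpressions can be collapsed as soon as an operator of lower or equal precedence shows up.

```python
# How much must a strictly left-to-right reader hold in their head before
# anything can be collapsed?  Crude model: with precedence we reduce eagerly
# (shunting-yard style); under RL-NOP nothing reduces before the line ends.
PREC = {'+': 1, '-': 1, '*': 2}

def max_pending(tokens, use_precedence):
    vals, ops, worst = [], [], 0
    def reduce_once():
        b, a, op = vals.pop(), vals.pop(), ops.pop()
        vals.append((op, a, b))      # placeholder; the values don't matter here
    for t in tokens:
        if t in PREC:
            if use_precedence:
                while ops and PREC[ops[-1]] >= PREC[t]:
                    reduce_once()
            ops.append(t)
        else:
            vals.append(t)
        worst = max(worst, len(vals) + len(ops))
    return worst

expr = ['a', '*', 'b', '+', 'c', '*', 'd', '+', 'e', '*', 'f']
print(max_pending(expr, use_precedence=True))    # 5: stays bounded as the pattern grows
print(max_pending(expr, use_precedence=False))   # 11: every token stays pending
```

Whether real readers actually behave like this bookkeeping model is, of course, part of what makes the hypothesis hard to test.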
I suspect that operator precedence might be superior for two reasons:
1. It more often avoids parentheses, which are extra symbols to recognize and correctly pair up in your mind.
2. The meanings of high-precedence subexpressions like `x×b` are almost context-independent (a concrete example follows this list). An exponentiation operator or something like a C struct field selector could still follow `b` and change its meaning, but a following multiplication, division, addition, subtraction, or comparison will not, and neither will a preceding addition, subtraction, or comparison. I conjecture that this facilitates subconscious pattern recognition.
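For instance (using `*` for `×`): under school precedence, `2*3` contributes 6 both in `7 + 2*3` (13) and in `2*3 - 1` (5). Under RL-NOP, `7 + 2*3` is still 13, but `2*3 - 1` reads as `2×(3-1)`, which is 4, so the same written subexpression no longer denotes a fixed subcomputation.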
But the familiarity factor enormously outweighs these theoretical considerations for me.
> " I suspect that the APL syntax ... is less usable for left-to-right readers"
On the contrary, I find it much more usable for left-to-right readers, because it allows a "top-down" reading of the expressions, instead of a "bottom-up" reading.
When trying to understand an unfamiliar program, for debugging or maintenance, you normally do not want to waste time reading every expression in full; most of the computational detail is irrelevant.
You typically search for where certain variables are modified, and how and why. For this it is frequently enough to look only at the last operations performed before a modified value is stored into a variable.
With the Iverson notation, the last operations are always conveniently grouped at the left side of a text line. Thus you read from left to right only as much as necessary to find what you need, then you can skip the rest of the line.
With the school notation, the required information is not grouped at one end of the line, so reading becomes slower.
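A concrete way to see the difference (my own toy code, with invented names, and ignoring parentheses for simplicity): the root of the expression, i.e. the operation performed last, is always the leftmost operator in the Iverson notation, while in the school notation it sits wherever precedence and parentheses happen to put it.

```python
# Where in the written line does the last-performed (root) operation sit?
PREC = {'+': 1, '-': 1, '*': 2}

def root_rl_nop(tokens):
    # Iverson notation: the leftmost operator is always performed last.
    return next(i for i, t in enumerate(tokens) if t in PREC)

def root_precedence(tokens):
    # School notation (no parentheses): the root is the rightmost operator
    # of the lowest precedence present.
    lowest = min(PREC[t] for t in tokens if t in PREC)
    return max(i for i, t in enumerate(tokens) if PREC.get(t) == lowest)

expr = ['rate', '*', 'net', '-', 'rebate', '+', 'fee']
print(root_rl_nop(expr))       # 1: right next to the assignment target
print(root_precedence(expr))   # 5: near the right end here; with parentheses it could be anywhere
```

So a reader scanning for "what finally happened to this variable" always finds it in the same place in the Iverson notation.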
The opposite of the Iverson notation (postfix order, as used in some stack-oriented languages) also groups this information at one end, but in a way that is less usable for left-to-right readers.
From natural languages, left-to-right readers expect a sentence to start with its topic at the left side, i.e. with its most important part (here, the assignment target and the final operation), as in the Iverson notation, rather than to end with its topic, as in the opposite notation.
> "a reader must maintain a stack of pending operators and values in their mind"
I believe that few readers, if any, do this.
The normal case when reading is that you do not want to reproduce in your mind what the computer does, but only to trace the information flows between program variables. For this, it is enough to read partial expressions, as explained above.
In the very rare case when you wanted to make a mental calculation identical to that of the computer, you would normally read the expression from right to left.
When writing, the Iverson notation is usually more convenient than the school notation, even when writing in the normal direction, from left to right. The reason is that, for most computations, the natural way to find the expression that must be computed is to work backwards, from the desired result towards the available data.
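For example (an illustration of mine, with invented names): to compute a commission you think "the commission is... the rate, times... whatever is left of the revenue after the costs", and in the Iverson notation you can write the tokens in exactly that order, `commission ← rate × revenue - costs`, without ever backtracking. In the school notation you must open a parenthesis, `commission = rate * (revenue - costs)`, before you have even decided that a subtraction is coming.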