On 01/12/2024 09:36, Janis Papanagnou wrote:
On 30.11.2024 12:59, Bart wrote:
[About Algol68]
But I also considered it too high level and hard to understand.
This I find astonishing, given that it is (IMO, and different from C) such a cleanly defined language.
Algol68 was famous for its impenetrable specification. Its Revised Report was the programming language equivalent of James Joyce's 'Ulysses'.
I needed a clean, simple syntax and 100% obvious and explicit semantics.
Even the syntax had features I didn't like, like keyword stropping
Stropping was a way of coping with the limited characters available in the system character sets. Practically, as an implementer, you could use any mechanism you like. (On the mainframe I had used symbols preceded by a dot; the Genie compiler uses uppercase, for example. None of these is a problem for the implementer.)
Yes, but those features made writing, reading and maintaining source code a misery. You'd spend most of your time switching case, or babysitting semicolons (see below).
I can live without embedded spaces within identifiers - most languages do. That was the primary reason for the stropping.
If I really need to use a reserved word as an identifier now (which only happens if porting from another language), I can use a backtick:
int `Int, `INT, `Int
This also enables case-sensitivity (my syntax was case-insensitive). I don't think case-stropping, for example, can manage that.
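For comparison, here's a rough sketch of the same thing under A68G's upper stropping (my illustration, not code from anywhere in particular). A lower-case int is an ordinary identifier, but INT itself is off-limits, because upper case is how the bold words are written:

    BEGIN
       INT int := 42;       CO lower-case 'int' is just an ordinary identifier CO
       CO But INT cannot also be used as an identifier here, since upper case  CO
       CO is reserved for the bold words under this stropping regime.          CO
       print((int, newline))
    END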
and fiddly rules about semicolon placement.
Huh? - The placement of semicolons as delimiters is quite clear and (like so many things in Algol 68) clearly defined (IMO). - So what do you have in mind here?
It just makes life harder. It special-cases the last statement of any block, which must be semicolon-free, because the semicolon is strictly a separator. So:
* Adding a new statement at the end of a block, you must put a ; on the current last statement.
* Deleting the last line, you must also delete the ; on the line before it.
* Moving lines about, you may again need to update the semicolons if the last line was involved.
* Temporarily commenting out lines including the last, you must also temporarily remove the ; from the line before the comments.
* Copying the whole block elsewhere, you might need to add a ;.
* Temporarily commenting out a whole block (or starting off with an empty block to be populated later), you need to use SKIP, another annoyance.
Usually you're not aware of this until the compiler tells you and you have to go back in and fix it.
Allow semicolons to be a /terminator/, and all that goes away. It's a no-brainer. But then I don't like having to write semicolons at all, and generally I don't.
The whole thing with stropping and semicolons is just a colossal time-waster.
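Here's a small A68G-flavoured sketch of what I mean (just an illustration I've made up, not real code from anywhere):

    BEGIN
       INT x := 1;
       print((x, newline));          CO ; needed: another phrase follows        CO
       BEGIN
          SKIP                       CO an empty block isn't allowed, so SKIP   CO
       END;
       print(("done", newline))      CO last phrase of the block: no ; allowed  CO
    END

Add one more statement after that final print, or comment it out, and the semicolon juggling described above starts straight away.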
As for better languages than C, there were very few at that level.
(But you know you can use Algol 68 at the system-development level; we can read that it had been done in those days. - All that's "missing", and that's a good design decision, is pointers.)
The only actual implementation I've come across is A68G. That's an interpreter.
It runs my Fibonacci benchmark in 16 seconds. My main /dynamic/ language interpreter runs it in 1.3 seconds. My C interpreter, which I consider hopelessly slow, takes 6 seconds. My unoptimised C is 0.24 seconds.
Optimised C is 0.12 seconds, 130 times faster than A68G.
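To give a feel for the sort of code involved, a doubly recursive Fibonacci in Algol 68 looks something like this (a sketch only; the argument is arbitrary):

    BEGIN
       PROC fib = (INT n) INT:
          IF n < 2 THEN n ELSE fib(n - 1) + fib(n - 2) FI;
       print((fib(30), newline))
    END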
It's quite unsuited to systems programming, and not just because of its execution speed. However, I'd quite like to see A68G implemented in A68G!
Algol68 was a fascinating and refreshing language back then. It looked great when typeset in a book. But its practicalities were annoying, and now it is quite dated.
There would have been problems just getting it into the machine (since
on CP/M, every machine used its own disk format). And by the accounts I
read later on in old Byte magazine articles, C compilers were hopelessly
slow running on floppy disks. (Perhaps Turbo C excepted.)
(I don't get what argument you are trying to make. - That you wanted
some terse language, maybe, as you already said above?)
That there were practical problems in physically getting the program into the machine. And when it did run, it would have taken minutes to build anything rather than seconds.
(See: https://archive.org/details/byte-magazine-1983-08/page/n111/mode/2up
A chart of compile-times is on page 122. The same issue compares 8086 C compilers, and introduces the C language.)
A lot of my HLL programs were short and intended to test some hardware. My resident in-memory compiler translated them more or less instantly. A formal build using disk-based programs, source files, object files, and linkers would have been too time-consuming (little has changed!).
I was generally regarded as a whizz-kid; that would have been difficult to keep up if my boss saw me twiddling my thumbs every time he looked in.