
GCC SC approves inclusion of Algol 68 Front End

https://gcc.gnu.org/pipermail/gcc/2025-November/247020.html
229•edelsohn•2mo ago

Comments

zik•2mo ago
As a fan of Algol 68, I'm pretty excited for this.

For people who aren't familiar with the language, pretty much all modern languages are descended from Algol 60 or Algol 68. C descends from Algol 60, so pretty much every popular modern language derives from Algol in some way [1].

[1] https://ballingt.com/assets/prog_lang_poster.png

dhosek•2mo ago
Finally.
nine_k•2mo ago
If PL/I was like a C++ of the time, Algol-68 was probably comparable to a Scala of the time. A number of mind-boggling ideas (for the time), complexity, an array of kitchen sinks.
int_19h•2mo ago
It certainly has quite a reputation, but I suspect that has more to do with its dense formalism, which was quite unlike anything else. The language itself is actually surprisingly nice for its time, very orthogonal and composable.
j2kun•2mo ago
> I'm pretty excited for this

Aside from historical interest, why are you excited for it?

zik•2mo ago
I've actually been toying with writing an Algol 68 compiler myself for a while.

While I doubt I'll do any major development in it, I'll definitely have a play with it, just to revisit old memories and remind myself of its many innovations.

ofalkaed•2mo ago
Personally, I think the whole C tangent was a misstep and would love to see Algol 68 turn into Algol 26 or 27. I sort of like C and C++ and many of the other languages which came after, but they have issues. I think Algol 68 could develop into something better than C++; it has some of the pieces already in place.

Admittedly, every language I really enjoy and get along with is one of those languages that produced little compared to the likes of C (APL, Tcl/Tk, Forth), and as a hobbyist I have no real stake in the game.

inkyoto•2mo ago
Whilst I think that C has its place, my personal choice of Algol 26 or 27 would be CLU – a highly influential, yet little-known and underrated, Algol-inspired language. CLU is also very approachable and pretty compact.
uecker•2mo ago
I wonder what you think is wrong with C. C is essentially a much simplified subset of ALGOL68. So what is missing in C?
pjmlp•2mo ago
Proper strings and arrays, for starters, instead of decayed pointers for which the programmer is responsible for all the length housekeeping.
uecker•2mo ago
Arrays are not pointers and if you do not let them decay to one, they do preserve the length information.
pjmlp•2mo ago
They surely behave like one as soon as they leave local scope.

It's kind of hard when passing them around as function parameters, and the static trick doesn't really work in a portable way.

Let's see how far WG14 gets with cybersecurity laws when this kind of answer is analysed by SecDevOps and Infosec experts.

dfawcus•2mo ago
Then don't allow it to decay:

    #include <stdio.h>

    /* Pointer-to-array parameter: the length 15 is part of the type. */
    void arr_fn(char (*arr)[15]) {
        enum { len = sizeof *arr };
        printf("len of array: %d\n", len);
        printf("Got: %.*s\n", len, *arr);
    }

    /* 'static 15' promises the caller passes at least 15 elements. */
    void sptr_fn(char ptr[static 15]) { printf("Got: %s\n", ptr); }

    int main(void) {
        char array[15] = "Hello, World!";

        arr_fn(&array);
        sptr_fn(array);
        return 0;
    }
Using gcc (and similarly clang), removing the '15' from 'array' and letting it be allocated as 14 chars will result in warnings for both function calls.

One can hide that ptr to array behind a typedef to make it more readable:

    typedef char (Arr_15)[15];
    void arr_fn2(Arr_15 *arr) {
        printf("Got: %.*s\n", (int)sizeof *arr, *arr);
    }
What do you mean by 'the static trick'? Is that what I have in sptr_fn()?
pjmlp•2mo ago
That is the static trick.

The issues as things stand today are:

- It is still a warning instead of an error, and we all know how many projects have endless lists of warnings.

- Only GCC and clang issue such a warning; if we want to improve C, security must be imposed on all implementations.

https://c.godbolt.org/z/fEKzT4WfM

dfawcus•2mo ago
OK - assuming you're referring to 'char ptr[static 15]' as the 'static trick', then yeah - other compilers do not complain.

However, the other form 'char (*arr)[15]' has always been available, and other compilers do complain about its misuse.

I believe I remember using it in DOS based C-89 compilers back in the early 90s, possibly also in K&R (via lint) in the 80s.

NB: icc, msvc, mvc complain about the misuse of the traditional version if one adjusts your godbolt example.

Yes, one has to build with warnings treated as errors, which takes a bit of work to achieve if the code has previously been built without that.

uecker•2mo ago
There isn't really much difference between "ignoring warnings" in C and careless use of "unsafe" or "unwrap" in Rust. Once you have entered the realm of sloppiness, the programming language will not save you.

The point is to what extent the tools for safe programming are available. C certainly has gaps, but not having proper arrays is not one of them.

ordu•2mo ago

    int arr[4];
    foo(arr);
We can look at this code as if it passes an array by reference, but how do you pass `arr` by value?
uecker•2mo ago
You can pass it by value when putting it into a struct. You can also pass a pointer to the array instead of letting it decay.

    void foo(int (*arr)[4]);

    int arr[4];
    foo(&arr);
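A minimal, self-contained sketch of the struct-wrapping approach mentioned above (the struct and function names are made up for illustration):

    #include <stdio.h>

    struct Arr4 { int v[4]; };       /* wrap the array in a struct */

    void by_value(struct Arr4 a) {   /* the whole 4-int array is copied */
        a.v[0] = 99;                 /* modifies only the local copy */
    }

    int main(void) {
        struct Arr4 arr = { {1, 2, 3, 4} };
        by_value(arr);
        printf("%d\n", arr.v[0]);    /* still prints 1 */
        return 0;
    }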

ofalkaed•2mo ago
I think what C is missing is everything that people fall back on clever use of pointers and macros to implement. Not that I think C should have all those things; Zig does a decent job of showing alternatives.
uecker•2mo ago
Yeah, but I meant specifically from ALGOL68.
ofalkaed•2mo ago
I don't think C is missing anything from Algol 68, but FLEX and slices would be nice. Algol's slices are fairly limited, but even its limited slices are better than what C offers. Algol 68 operators are amazing, but I don't see them playing well with C.
Y_Y•2mo ago
I'd like to offer a complementary question to the sibling one. What are you going to add to (/remove from?) Algol 68 to get Algol 26?
ofalkaed•2mo ago
That task would be beyond my skills; as I said, I am just a hobbyist. I think it would be interesting to see what would result from going back to one of those early foundational languages and developing a modern language from it. With a language like Algol we don't have the decades of evolution (baggage) which are a big part of languages like C and C++ and which trickle into the languages they inspired, even when those try to shed that baggage. So, what would we get if we went back to the start and built a modern language off of Algol? What would that look like?
GhosT078•2mo ago
Consider exploring Ada 2022 as a capable successor to Algol. It's well supported in GCC and scales well from very small to very large projects. Some information is at https://learn.adacore.com/ and https://alire.ada.dev/
vintagedave•2mo ago
Wouldn't that be some form of Pascal?
Taniwha•2mo ago
I would argue C comes from Algol68 (structs, unions, pointers, a full type system etc, no call by name) rather than Algol60
inkyoto•2mo ago
That is indeed correct. Kernighan in his original book on C cited Algol 68 as a major influence.
adrian_b•2mo ago
C had 3 major sources, B (derived from BCPL, which had been derived from CPL, which had been derived from ALGOL 60), IBM PL/I and ALGOL 68.

Structs come from PL/I, not from ALGOL 68, together with the postfix operators "." and "->". The term "pointer" also comes from PL/I, the corresponding term in ALGOL 68 was "reference". The prefix operator "*" is a mistake peculiar to C, acknowledged later by the C language designers, it should have been a postfix operator, like in Euler and Pascal.

Examples of things that come from ALGOL 68 are unions (unfortunately C unions lack most useful features of the ALGOL 68 unions, which are implicitly tagged) and the combined operation-assignment operators, e.g. "+=" or "*=".

The Bourne shell scripting language, inherited by ksh, bash, zsh etc., also has many features taken from ALGOL 68.

The explicit "malloc" and "free" also come from PL/I. ALGOL 68 is normally implemented with a garbage collector.

themafia•2mo ago
> it should have been a postfix operator, like in Euler and Pascal.

I never liked Pascal-style Pointer^, as the postfix starts to get visually cumbersome with more than one layer of Indirection^^, especially when combined with other postfix Operators^^.AndMethods, or even just Operator^ := Assignment.

I also think the prefix "*" is the natural inverse of the "address-of" prefix operator. So we have "take the address of this value" and "look through the address to retrieve the value."

adrian_b•2mo ago
The "natural inverse" relationship between "address-of" and indirect addressing is only partial.

You can apply the "*" operator as many times you want, but applying "address-of" twice is meaningless.

Moreover, in complex expressions it is common to mix the indirection operator with array indexing and with structure member selection, and all these 3 postfix operators can appear an unlimited number of times in an expression.

Writing such addressing expressions in C is extremely cumbersome, because they require many levels of parentheses, and it is still difficult to see the order in which they are applied.

With a postfix indirection operator no parentheses are needed and all addressing operators are executed in the order in which they are written.

So it is beyond reasonable doubt that a prefix "*" is a mistake.

The only reason why they have chosen "*" as prefix in C, which they later regretted, was because it seemed easier to define the expressions "*++p" and "*p++" to have the desired order of evaluation.

There is no other use case where a prefix "*" simplifies anything. For the postfix and prefix increment and decrement it would have been possible to find other ways to avoid parentheses, and even if they had been used with parentheses, that would still have been simpler than having to mix "*" with array indexing and with structure member selection.

Moreover, the use of "++" and "--" with pointers was only a workaround for a dumb compiler, which could not determine by itself whether it should access an array using indices or pointers. Normally there should be no need to expose such an implementation detail in a high-level language; the compiler should choose the addressing modes that are optimal for the target CPU, not the programmer.

On some CPUs, including the Intel/AMD CPUs, accessing arrays by incrementing pointers, like in the old C programs, is usually worse than accessing the arrays through indices (because on such CPUs the loop counter can be reused as an index register, regardless of the order in which the array is accessed, including for accessing multiple arrays, avoiding the use of extra registers and reducing the number of executed instructions).

With a postfix "*", the operator "->" would have been superfluous. It has been added to C only to avoid some of the most frequent cases when a prefix "*" leads to ugly syntax.

themafia•2mo ago
> You can apply the "*" operator as many times you want, but applying "address-of" twice is meaningless.

This is due to the nature of lvalue and rvalue expressions. You can only get an object where * is meaningful twice if you've applied & meaningfully twice before.

    int a = 42;
    int *b = &a;
    int **c = &b;
I've applied & twice. I merely had to negotiate with the language instead of the parser to do so.

> and all these 3 postfix operators can appear an unlimited number of times in an expression.

In those cases the operator is immediately followed by a non-operator token. I cannot meaningfully write a[][1], or b..field.

> The only reason why they have chosen "*" as prefix in C, which they later regretted, was because it seemed easier to define the expressions "*++p" and "*p++" to have the desired order of evaluation.

It not only seems easier, it is easier. What you sacrifice is complication in defining function pointers. One is far more common than the other. I think they got it right.

> With a postfix "*", the operator "->" would have been superfluous.

Precisely the reason I dislike the Pascal**.Style. Go offers a better mechanism anyways. Just use "." and let the language work out what that means based on types.

I'm offering a subjective point of view. I don't like the way that looks or reads or mentally parses. I'm much happier to occasionally struggle with function pointers.

LeFantome•2mo ago
I do not think that is what they meant.

**c is valid but &&b makes no sense.

comex•2mo ago
Some languages do define &&b, like Rust, where its effect is similar to the parent post's C example: it creates a temporary stack allocation initialized with &b, and then takes the address of that.

You could argue this is inconsistent or confusing. It is certainly useful though.

Incidentally, C99 lets you do something similar with compound literal syntax; this is a valid expression:

    &(int **){&b}
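A minimal, compilable sketch of that idiom, assuming b is the int * from the example further up the thread:

    #include <stdio.h>

    int main(void) {
        int a = 42;
        int *b = &a;

        /* The compound literal creates an unnamed int ** object holding &b;
           taking its address yields an int ***, much like Rust's &&b. */
        int ***c = &(int **){ &b };

        printf("%d\n", ***c);   /* prints 42 */
        return 0;
    }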
inkyoto•2mo ago
> The only reason why they have chosen "*" as prefix in C, which they later regretted, was because it seemed easier to define the expressions "*++p" and "*p++" to have the desired order of evaluation.

There has been no shortage of speculation, much of it needlessly elaborate. The reality, however, appears far simpler – the prefix pointer notation had already been present in B and its predecessor, BCPL[0]. It was not invented anew, merely borrowed – or, more accurately, inherited.

The common lore often attributes this syntactic feature to the influence of the PDP-11 ISA. That claim, whilst not entirely baseless, is at best a partial truth. The PDP-11 did support post-increment and pre-decrement indirect address manipulation – but notably lacked their symmetrical complements: pre-increment and post-decrement addressing modes[1]. In other words, it exhibited asymmetry – a gap that undermines the argument for direct PDP-11 ISA inheritance, i.e.

  MOV (Rn)+, Rm

  MOV @(Rn)+, Rm

  MOV -(Rn), Rm

  MOV @-(Rn), Rm
existed but not

  MOV +(Rn), Rm

  MOV @+(Rn), Rm

  MOV (Rn)-, Rm

  MOV @(Rn)-, Rm
[0] https://www.thinkage.ca/gcos/expl/b/manu/manu.html#Section6_...

[1] PDP-11 ISA allocates 3 bits for the addressing mode (register / Rn, indirect register (Rn), auto post-increment indirect / (Rn)+ , auto post-increment deferred / @(Rn)+, auto pre-decrement indirect / -(Rn), auto pre-decrement deferred / @-(Rn), index / idx(Rn) and index deferred / @idx(Rn) ), and whether it was actually «let's choose these eight modes» or «we also wanted pre-increment and post-decrement but ran out of bits» is a matter of historical debate.

adrian_b•2mo ago
The prefix "*" and the increment/decrement operators have been indeed introduced in the B language (in 1969, before the launch of PDP-11 in 1970, but earlier computers had some autoincrement/autodecrement facilities, though not as complete as in the B language), where "*" has been made prefix for the reason that I have already explained.

The prefix "*" WAS NOT inherited from BCPL, it was purely a B invention due to Ken Thompson.

In BCPL, "*" was actually a postfix operator that was used for array indexing. It was not the operator for indirection.

In CPL, the predecessor of BCPL, there was no indirection operator, because indirection through a pointer was implicit, based on the type of the variable. Instead of an indirection operator, there were different kinds of assignment operators, to enable the assignment of a value to the pointer, instead of assigning to the variable pointed by the pointer, which was the default meaning.

BCPL made many changes to the syntax of CPL, mainly out of the necessity of adapting the language to the impoverished character set available on American computers, which lacked many of the characters that had been available in Europe before IBM and a few other US vendors succeeded in replacing the local vendors, thus also imposing the EBCDIC and later the ASCII character sets.

Several of the changes done between BCPL and B had the same kind of reason, i.e. they were needed to transition the language from an older character set to the then new ASCII character set. For instance the use of braces as block delimiters was prompted by their addition into ASCII, as they were not available in the previous character set.

The link that you have provided to a manual of the B language is not useful for historical discussions, as the manual is for a modernized version of B, which contains some features back-ported from C.

There is a manual of the B language dated 1972-01-07, which predates the C language, and which can be found on the Web. Even that version might have already included some changes from the original B language of 1969.

inkyoto•2mo ago
* was the usual infix multiplication operator in BCPL, and it was not used for pointer arithmetic.

The BCPL manual[0] explains the «monadic !» operator (section 2.11.3) as:

  2.11.3 MONADIC !

  The value of a monadic ! expression is the value of the storage cell whose address is the operand of the !. Thus @!E = !@E = E, (providing E is an expression of the class described in 2.11.2).

  Examples.

  !X := Y Stores the value of Y into the storage cell whose address is the value of X.

  P := !P Stores the value of the cell whose address is the value of P, as the new value of P.
The array indexing used the «V ! idx» syntax (section 2.13, «Vector application»).

So, the ! was a prefix operator for pointers, and it was an infix operator for array indexing.

In Richards' account of BCPL's evolution, he noted that on early hardware the exclamation mark was not easily available, and, therefore, he used the composite *( (i.e. a digraph):

  «The star in *( was chosen because it was available … and it seemed appropriate for subscription since it was used as the indirection operator in the FAP assembly language on CTSS. Later, when the exclamation mark became available, *( was replaced by !( and exclamation mark became both a dyadic and monadic indirection operator».
So, in all likelihood, !X := Y became *(X := Y, eventually becoming *X = Y (in B and C) whilst retaining the exact and original semantics of the !.

[0] https://rabbit.eng.miami.edu/info/bcpl_reference_manual.pdf

adrian_b•2mo ago
The BCPL manual linked by you is not useful, as it describes a recent version of the language, which is irrelevant for the evolution of the B and C languages. A manual of BCPL from July 1967, predating B, can be found on the Web.

The use of the character "!" in BCPL is much later than the development of the B language from BCPL, in 1969.

The asterisk had 3 uses in BCPL, as the multiplication operator, as a marker for the opening bracket in array indexing, to compensate for the lack of different kinds of brackets for function evaluation and for array indexing, and as the escape character in character strings. For the last use the asterisk has been replaced by the backslash in C.

There was indeed a prefix indirection operator in BCPL, but it did not use any special character, because the available character set did not have any unused characters.

The BCPL parser was separate from the lexer, and it was possible for the end users to modify the lexer, in order to assign any locally available characters to the syntactic tokens.

So if a user had appropriate characters, they could have been assigned to indirection and address-of, but otherwise they were just written RV and LV, for right-hand-side value and left-hand-side value.

It is not known whether Ken Thompson had modified the BCPL lexer for his PDP computer, to use some special characters for operators like RV and LV.

In any case, he could not have used asterisk for indirection, because that would have conflicted with its other uses.

The use of the asterisk for indirection in B became possible only after Ken Thompson had made many other changes and simplifications in comparison with BCPL, removing any parsing conflicts.

You are right that BCPL already had prefix operators for indirection and address-of, which was different from how this had been handled in CPL, but Martin Richards did not seem to have any reason for this choice and in BCPL this was a less obvious mistake, because it did not have structures.

On the other hand, Ken Thompson did want to have "*" as prefix, after introducing his increment and decrement operators, in order to need no parentheses for pre- and post-incrementation or decrementation of pointers, in the context where postfix operators were defined as having higher precedence than prefix.

Also in his case this was not yet an obvious mistake, because he had no structures and the programs written in B at that time did not use any complex data structures that would need correspondingly complex addressing expressions.

Only years later it became apparent that this was a bad choice, while the earlier choice of N. Wirth in Euler (January 1966; the first high-level language that handled pointers explicitly, with indirection and address-of operators) had been the right one. The high-level languages that had "references" before 1966 (the term "pointer" has been introduced in IBM PL/I, in July 1966), e.g. CPL and FORTRAN IV, handled them only implicitly.

Decades later, complex data structures became common, while the manual optimization of explicitly incrementing/decrementing pointers for addressing arrays became a way of writing inefficient programs, which prevents the compiler from correctly optimizing array accesses for the target CPU.

So the choice of Ken Thompson can be justified in its context from 1969, but in hindsight it has definitely been a very bad choice.

inkyoto•2mo ago
I take no issue with the acknowledgment of being on the losing side of a technical argument – provided evidence compels.

However, to be entirely candid, I have submitted two references and a direct quotation throughout the discourse in support of the position – each of which has been summarily dismissed with an appeal to some ostensibly «older, truer origin», presented without citation, without substantiation, and, most tellingly, without the rigour such a claim demands.

It is important to recall that during the formative years of programming language development, there were no formal standards, no governing design committees. Each compiled copy of a language – often passed around on a tape and locally altered, sometimes severely – became its own dialect, occasionally diverging to the point of incompatibility with its progenitor.

Therefore, may I ask that you provide specific and credible sources – ones that not only support your historical assertion, but also clarify the particular lineage, or flavour, of the language in question? Intellectual honesty demands no less – and rhetorical flourish is no substitute for evidence.

adrian_b•2mo ago
What you say is right, and it would have been less lazy for me to provide links to the documents that I have quoted.

On the other hand, I have provided all the information that is needed for anyone to find those documents through a Web search, in a few seconds.

I have the quoted documents, but it is not helpful to know from where they were downloaded a long time ago, because, unfortunately, the Internet URLs are not stable. So for links, I just have to search them again, like anyone else.

These documents can be found in many places.

For instance, searching "b language manual 1972" finds as the first link:

https://www.nokia.com/bell-labs/about/dennis-m-ritchie/kbman...

Searching "martin richards bcpl 1967" finds as the first link:

https://www.nokia.com/bell-labs/about/dennis-m-ritchie/bcpl....

Additional searching for CPL and BCPL language documents finds

https://archives.bodleian.ox.ac.uk/repositories/2/archival_o...

where there are a lot of early documents about the languages BCPL and CPL.

Searching for "Wirth Euler language 1966" finds the 2-part paper

https://dl.acm.org/doi/10.1145/365153.365162

https://dl.acm.org/doi/10.1145/365170.365202

There exists an earlier internal report about Euler from April 1965 at Stanford, before the publication of the language in CACM, where both indirection and address-of were prefix, like later in BCPL. However, before the publication in January 1966, indirection had been changed to a postfix operator, a choice that was retained in Wirth's later languages.

http://i.stanford.edu/pub/cstr/reports/cs/tr/65/20/CS-TR-65-...

The early IBM PL/I manuals are available at

http://bitsavers.org/pdf/ibm/360/pli/

Searching for "algol 68 reports" will find a lot of documents.

And so on, everything can be searched and found immediately.

zozbot234•2mo ago
A postfix "*" would be completely redundant since you can just use p[0] . Instead of *p++ you'd have (p++)[0] - still quite workable.
fastaguy88•2mo ago
You're kidding, right? (p++)[0] returns the contents of (p) before the ++. It's hard to imagine a more confusing juxtaposition.
psychoslave•2mo ago
A dash instead of a dot would be much more congruent with the way Latin script generally renders compound terms. And a reference/pointer (or even pin for short) is really not that different from any other function/operator/method.

some·object-pin-pin-pin-transform is no harder for a human to parse or interpret than (***some_object)->transform().

inkyoto•2mo ago
C's «static» and «auto» also come from PL/I. Even though «auto» has never been used in C, it has found its place in C++.

C also had a reserved keyword, «entry», which had never been used before eventually being relinquished from its keyword status when the standardisation of C began.

pjmlp•2mo ago
C23 has also reused auto like C++, although its type inference is more limited.
Taniwha•2mo ago
C originally had =+ and =- (up to and including Unix V6) - they were ambiguous (does a=-b mean a = -b or a = a-b?) and were replaced by +=/-=.

The original structs were pretty bad too - field names had their own address space and could sort of be used with any pointer (which sort of allowed you to make tacky unions); we didn't get a real type system until the late 80s.

adrian_b•2mo ago
ALGOL 68 had "=" for equality and ":=" for assignment, like ALGOL 60.

Therefore the combined operation-assignment operators looked like "+:=".

The initial syntax of C was indeed weird, and it was caused by the way the original parser in the first C compiler happened to be written and rewritten; the later form of the assignment operators was closer to their source in ALGOL 68.

ErikCorry•2mo ago
Yeah if you ever wondered why the fields in a lot of Posix APIs have names with prefixes like tm_sec and tm_usec it's because of this misfeature of early C.
somat•2mo ago
Yes, massively influential, but was it ever used or popular? I always think of it as sort of the poster child for the danger of "design by committee".

Sure, its ideas spawned many of today's languages, but wasn't that because at the time nobody could afford to actually implement the spec? So we ended up with a ton of "Algol, buts" (like Algol, but it can actually be implemented and runs on real hardware).

dboreham•2mo ago
Used extensively on Burroughs mainframes.
Taniwha•2mo ago
Burroughs used an Algol60 derivative (not '68)
pjmlp•2mo ago
ESPOL initially, which evolved into NEWP.
Taniwha•2mo ago
ESPOL was (is?) simply a version of the standard Algol compiler that let you do 'system' sorts of things.

The Burroughs large systems architecture didn't really protect you from yourself; system security/integrity depended on only letting code from vetted compilers run (only a compiler could make a code file, and only a privileged person could make a program a compiler) - so the Algol 60 compiler made code that was safe, while ESPOL could make code that wasn't and could do things a normal user couldn't - you kept the ESPOL compiler somewhere safe, away from the students ....

(there was a well known hole in this whole thing involving mag tapes)

pjmlp•2mo ago
As mentioned it evolved into NEWP, and you can get all the manuals from Unisys, as they keep selling it.

Given its architecture, it is sold for batch processing systems where security is paramount.

Yes, ESPOL and NEWP were among the first systems languages with UNSAFE code blocks; a binary compiled with unsafe code is tainted and requires administrator configuration before the system allows it to execute.

One cannot just compile such code and execute it right away.

somat•2mo ago
Wow, the Burroughs large systems had special instructions explicitly for efficient Algol use. You could almost say it was Algol hardware, but Algol 60, not 68.

https://en.wikipedia.org/wiki/Burroughs_Large_Systems

There is a large systems emulator that runs in a browser; I did not get any Algol written, but I did have way too much fun going through the boot sequence.

https://www.phkimpel.us/B5500/webUI/B5500Console.html

pjmlp•2mo ago
Yes, for example the UK Navy had a system developed in an Algol 68 subset.

https://academic.oup.com/comjnl/article-abstract/22/2/114/42...

Onavo•2mo ago
They can just fork off the Golang frontend and it would be the same, maybe patch the runtime a bit.
MangoToupe•2mo ago
Does gcc even support go?
ameliaquining•2mo ago
Yes, though language support runs behind the main Go compiler. https://go.dev/doc/install/gccgo
wahern•2mo ago
Until a few years ago, gccgo was well maintained and trailed the main Go compiler by 1 or 2 releases, depending on how the release schedules aligned. Having a second compiler was considered an important feature. Currently, the latest supported Go version is 1.18, but without Generics support. I don't know if it's a coincidence, but porting Generics to gccgo may have been a hurdle that broke the cadence.
ratmice•2mo ago
Seems doubtful. Given that generics and the gccgo compiler were both spearheaded by Ian Lance Taylor, his leaving Google seems to me the more likely suspect, but I don't track Go.
pjmlp•2mo ago
This has been stagnant long before he left.
syockit•2mo ago
The best thing about gccgo is that it is not burdened with the weirdness of golang's calling convention, so the FFI overhead is basically the same as calling an extern function from C/C++. Take a look at [0] and see how bad golang's cgo calling latency compares to C. gccgo is not listed there but from my own testing it's the same as C/C++.

[0]: https://github.com/dyu/ffi-overhead

wahern•2mo ago
Isn't that horribly out of date? More recent benchmarks elsewhere, performed after some Go improvements, show Go's C FFI having drastically lower overhead, by at least an order of magnitude, IIUC.
MangoToupe•2mo ago
> The best thing about gccgo is that it is not burdened with the weirdness of golang's calling convention

Interesting. I saw go breaking from the c abi as the primary reason to use it; otherwise you might as well use java or rust.

pjmlp•2mo ago
Being an old dog, as I mention elsewhere, I see a pattern with gcj.

GCC has some rules about adding frontends to, and keeping them in, the main compiler rather than in additional branches; e.g. GNU Pascal never got added.

So if there is no value for the maintenance effort, the GCC steering committee will eventually discuss this.

MangoToupe•2mo ago
Where might one look to find examples of such code? I've never found algol outside of wikipedia
geocar•2mo ago
https://rosettacode.org/wiki/Category:ALGOL_68

https://github.com/search?q=algol68&type=repositories

Without knowing what your interests/motivations and backgrounds are, it is hard to make good recommendations, but if you didn't know about rosettacode or github I figured I should start with that

MangoToupe•2mo ago
What I'm taking away from this is that there's absolutely zero code of interest that is Algol 68
geocar•2mo ago
Interests vary!

Just because you can’t find something interesting doesn’t mean it isn’t interesting.

That lesson once learned pays dividends

pjmlp•2mo ago
Old papers and computer manuals from the 1960's.

Many have been digitized over the years across Bitsavers, ACM/SIGPLAN, IEEE, or university departments.

It also heavily influenced languages like ESPOL, NEWP, PL/I and its variants.

jemarch•2mo ago
You can find some modern Algol 68 code, using the modern stropping which is the default in GCC, at https://git.sr.ht/~jemarch/godcc

Godcc is a command-line interface for Compiler Explorer written in Algol 68.

NooneAtAll3•2mo ago
any algol tutorial recommendations? just to feel what's it all about
jemarch•2mo ago
I would recommend the Informal Introduction to Algol 68, available in PDF at https://algol68-lang.org/resources
lanstin•2mo ago
Wow that is cool. Pass by name. I always wanted to try it.
Y_Y•2mo ago
Just pass a string and `eval` it.
Taniwha•2mo ago
Algol60 had call by name; Algol68 doesn't really. It does have "proceduring", which creates a function to call when you pass an expression to a parameter that's a function pointer with no parameters. You can use that to sort of do something like call by name, but the expense is more obvious.
0xpgm•2mo ago
In my mind this highlights something I've been thinking about, the differences between FOSS influenced by corporate needs vs FOSS driven by the hacker community.

FOSS driven by hackers is about increasing and maintaining support (old and new hardware, languages, etc.), while FOSS influenced by corporate needs is about standardizing around 'blessed' platforms, as is happening in Linux distributions with the adoption of Rust (architectures unsupported by Rust lose support).

gldrk•2mo ago
The big difference is that Algol 68 is set in stone. This is what allows a single dedicated person to write the initial code and for it to keep working essentially forever with only minor changes. The Rust frontend will inevitably become obsolete without active development.

Algol 68 isn’t any more useful than obsolete Rust, however.

jemarch•2mo ago
The core Algol 68 language is indeed set in stone.

But we are carefully adding many GNU extensions to the language, as was explicitly allowed by the Revised Report:

  [RR page 52]
  "[...] a superlanguage of ALGOL 68 might be defined by additions to
   the syntax, semantics or standard-prelude, so as to improve
   efficiency or to permit the solution of problems not readily
   amenable to ALGOL 68."
The resulting language, which we call GNU Algol 68, is a strict super-language of Algol 68.

You can find the extensions currently implemented by GCC listed at https://algol68-lang.org/

dfawcus•2mo ago
I had a small programming task a while ago, and decided to try doing it in algol68 (using the algol68 genie interpreter), simply because I'd had some exposure to the language many years ago at Uni.

It was an AWK like task, but I decided up front it was too much trouble to do in AWK, as I needed to build a graph of data structures from the input data.

In part the program had an AWK-like pattern matching and processing section, which wasn't too awkward. I found having to use REFs more trouble than dealing with pointers, in part due to the forms of auto dereferencing the language uses; but that was expected.

The real problem though was that I ended up needing something like a map / hash-table, and I concluded it was too much trouble to write from scratch.

So in the end I switched the program to be written in Go.

That then suggests a few things to me:

    - it should have an extension library (prelude) offering some form of hash table.

    - it would be useful to add syntax for explicit pointers (PTR keyword) which are not automatically dereferenced when used.

    - maybe have either something like the Go (or Zig) style syntax for selecting a member of a pointed-to struct (a.b), and maybe a Zig-like explicit deref (ptr.*).
Those latter pointer suggestions are because I found the "field OF struct" form too verbose, and especially confusing when juggling REFs which may or may not get auto-dereferenced.
keepamovin•2mo ago
It's funny, I have a different view. Corporates often need long-term maintenance and support for weird old systems. The majority of the global programming community often chases shiny new trends in their personal tinkering.

However I think there's the retro-computing, and other hobby niches that align with your hacker view. And certainly there's a bunch of corp enthusiasm for standardizing shiny things.

uecker•2mo ago
I think you both are partially right. In fact, the friction I see is where the industry relies on the open-source community for maintenance but then pushes through certain changes they think they need, even if this alienates part of the community.
fithisux•2mo ago
You nailed it. In my spare time I am maintaining old Win32 apps that corporates and the always-the-latest-and-greatest crowd have abandoned.

Most people don't care about our history, only what is shiny.

It is sad!

JoshTriplett•2mo ago
> while FOSS influenced by corporate needs is about standardizing around 'blessed' platforms like is happening in Linux distributions with adoption of Rust

Rust's target tier support policies aren't based on "corporate needs". They're based, primarily, on having people willing to do the work to support the target on an ongoing basis, and provide the logistics needed to make sure it works.

The main difference, I would say, is that many projects essentially provide the equivalent of Rust's "tier 3" ("the code is there, it might even work") without documenting it as such.

uecker•2mo ago
The issue is that certain specific parts of the industry currently pour a lot of money into the Rust ecosystem, but selectively only where they need it.
bhaak•2mo ago
How is that different than scratching one’s own itch?
pxc•2mo ago
Personal itches are more varied and strange than corporate itches. What companies are willing to pour time (money) into is constrained by market forces. The constraints on the efforts of independent hackers are different.

Both sets of constraints produce patterns and gaps. UX and documentation are commonly cited gaps for volunteer programming efforts, for example.

But I think it's true that corporate funding has its own gaps and other distinctive tendencies.

uecker•2mo ago
It is not, but the open-source community should be aware of this and not completely reorganize around the itches of specific stakeholders, at least the parts of the community who are not paid by them.
FrankenApps•2mo ago
The Rust Community is working on gcc-rs for this very reason.
SkiFire13•2mo ago
gcc-rs is far from being usable. If you want to use Rust with gcc-only targets you're probably better off with rustc_codegen_gcc instead.
seg_lol•2mo ago
One could also compile to wasm, and then convert that wasm to C.
mkornaukhov•2mo ago
It sounds convenient %)
physicsguy•2mo ago
I don’t know that that is fair.

A number of years ago I worked on a POWER9 GPU cluster. This was quite painful - Python had started moving to use wheels and so most projects had started to build these automatically in CI pipelines but pretty much none of these even supported ARM let alone POWER9 architecture. So you were on your own for pretty much anything that wasn’t Numpy. The reason for this of course is just that there was little demand and as a result even fewer people willing to support it.

gnufx•2mo ago
At least it's been fine for four years of research software on a POWER9 cluster I support (with nodes like the Summit system's).
SAI_Peregrinus•2mo ago
Not just little demand, also expensive and uncommon hardware. If the maintainers don't have the hardware to test on they can't guarantee support for that hardware. Not having hardware available often happens because there's little demand for it, but the difficulty of maintaining software for rare hardware further reduces the demand for that hardware.
Levitating•2mo ago
You don't think the movement to rust is driven by hackers?
samus•2mo ago
Rust is by no means allowed in the core yet, only as drivers. So far, there are only a few drivers. Currently, only the Nova driver, Google's Binder IPC and the (out of tree) Apple drivers are of practical relevance.
pjmlp•2mo ago
I find this great, finally an easy way to play with ALGOL 68, beyond the few systems that made use of it, like the UK Navy project at the time.

Ironically, Algol 68 and Modula-2 are getting more contributions than Go among the GCC frontends; gccgo seems stuck at version 1.18, in a situation similar to gcj.

Either way, today is for Algol's celebration.

LeFantome•2mo ago
This makes me worry for the GCC implementation of Rust. People do not seem to use or upkeep the GCC versions of languages whose primary Open Source implementations are elsewhere.
pjmlp•2mo ago
There is the advantage that GCC will be the only way for Rust to be available on some targets where LLVM isn't an option.

Regarding Go, gccgo was a way to have a better compiler backend for those that care about optimizations the reference Go compiler isn't capable of, due to the difference in history, philosophy, whatever.

Apparently that effort isn't seen as worthwhile by the community.

InfamousRece•2mo ago
Will it compile Knuth’s test? https://en.wikipedia.org/wiki/Man_or_boy_test
chuckadams•2mo ago
That test is short enough to just paste it in here:

    begin
      real procedure A(k, x1, x2, x3, x4, x5);
      value k; integer k;
      real x1, x2, x3, x4, x5;
      begin
        real procedure B;
        begin k := k - 1;
              B := A := A(k, B, x1, x2, x3, x4)
        end;
        if k ≤ 0 then A := x4 + x5 else B
      end;
      outreal(1, A(10, 1, -1, -1, 1, 0))
    end
The whole "return by assigning to the function name" is one of my least favorite features of Pascal, which I suppose got it from Algol 60. Where I'm confused though is, what is the initial value of B in the call to A(k, B, x1, x2, x3, x4)? I'm guessing the pass-by-name semantics are coming into play, but I still can't figure out how to untie this knot.
svat•2mo ago
Yeah that's one of the things the test was designed to catch: at that point, B is a reference, to the B that is being defined. Here's a C++ translation from https://oeis.org/A132343 that uses identity functions to make the types consistent:

    #include <functional>
    #include <iostream>
    using cf = std::function<int()>;
    int A(int k, cf x1, cf x2, cf x3, cf x4, cf x5)
    {
        int Aval;
        cf B = [&]()
        {
            int Bval;
            --k;
            Bval = Aval = A(k, B, x1, x2, x3, x4);
            return Bval;
        };
        if (k <= 0) Aval = x4() + x5(); else B();
        return Aval;
    }
    cf I(int n) { return [=](){ return n; }; }
    int main()
    {
        for (int n=0; n<10; ++n)
            std::cout << A(n, I(1), I(-1), I(-1), I(1), I(0)) << ", ";
        std::cout << std::endl;
    }
So in the expression `A(k, B, x1, x2, x3, x4)`, the `B` there is not called, it simply refers to the local variable `B` (inside the function `A`), that was captured by the lambda (by reference): the same B variable that is currently being assigned.
chuckadams•2mo ago
Thanks, that's a bit easier to trace: I think what broke my brain initially is that the x1-x5 parameters were declared as real, when they're apparently nullary functions returning a real. Brings to mind CAFs in Haskell. And all of that in 1960, when most things had less CPU power than the chip in my credit card.
fanf2•2mo ago
No, because Knuth’s test was for Algol 60 and Algol 68 is a very different programming language.
adsl731898322•2mo ago
This is great news for GCC! I love how this decision supports older languages like Algol 68, keeping them alive in the FOSS world. It shows the hacker community's dedication to preserving diverse tools.
LeFantome•2mo ago
It is awesome.

That said, it really stands out to me that the two latest GCC languages are Cobol and Algol68 while LLVM gets Swift and Zig.

And Rust and Julia come from LLVM as well of course.

LeFantome•2mo ago
Does GNU Algol 68 use a garbage collector?
dribblecup•2mo ago
ALGOL 68 (dc) was the go-to language for the Burroughs [6-8]x00 variants.

These were fairly popular for a while and supported advanced features like multiprocessing. The demand for exercising the full range of capabilities was kind of niche, but an "amateur" like myself could make a few bucks if you knew ALGOL.

I used to have the grey manual for the Burroughs variant - I'll have to poke around to see if it's in the attic somewhere.

gnufx•2mo ago
Not relevant to GCC, but one use for an old A68 compiler was apparently to be adapted for the old NA Software Fortran 90 compiler, I was told by a former colleague. I'd have expected Ada to be a closer fit, and I don't know how well the decision worked out.
firesteelrain•2mo ago
The GCC GNAT frontend is used for modern Ada development these days. Not sure if that's what you mean.