Now, I’m teaching undergraduate courses of my own and, while I do not have the flexibility to change the languages used in my current offerings, if I ever start teaching a systems programming course I will absolutely require the students to use Pony.
I’m not sure if Pony is still being used, but the language was making some headway, at least on the PLT side of things. Incorporating some of their reference-capabilities work (a practical implementation of prior research in the area) would benefit greenfield programming language design. I think they missed an opportunity by not going the process-calculus route, choosing actors instead, but overall I liked the direction.
- In the future, you'll carry in your pocket a computer more powerful than the sum of all computers currently present at the university
- The unchecked flat memory model of C will cause numerous security issues with sometimes grave consequences in the "real world"
- Follow the design of Standard ML (SML) and adapt it to systems programming (yeah, it appeared in 1983, but surely papers had been published before that)
- Do not even think about using unsigned types for sizes, and get rid of implicit numeric conversions: `if (v.size() - 1 < 0)` fails on an empty vector in today's C++ (see the sketch after this list)
- Deterministic resource management is still important and is _the_ feature that C++ gets praised for.
- Lack of a standard ABI will cause a lot of headaches and lost time.
- I would tell him about LLVM IR, .NET assemblies, metadata and encourage him to first standardize an intermediate format which the compiler could read and write. That'd ensure seamless interoperability between compilers and even other languages.
- Related to the above point: the header/source split will become a burden.
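A minimal sketch of the unsigned-size trap mentioned above (`std::ssize` is C++20, used here only to show the signed alternative; everything else is long-standing behavior):

```c++
#include <iostream>
#include <iterator>
#include <vector>

int main() {
    std::vector<int> v;  // empty: v.size() == 0

    // v.size() is unsigned (std::size_t), so v.size() - 1 wraps around to
    // SIZE_MAX instead of -1, and this comparison is always false.
    if (v.size() - 1 < 0) {
        std::cout << "never prints\n";
    }

    // A signed size (std::ssize, C++20) behaves as intended.
    if (std::ssize(v) - 1 < 0) {
        std::cout << "v is empty\n";
    }
}
```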
Also, why go with constexpr as a replacement for pre-processor macros? Unless I have badly misunderstood how constexpr works, it is not as expressive. There have been type-safe and sound implementations of macros, along with explicitly staged computation, since the early 2000s; why would that not be preferable?
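For what it's worth, a minimal sketch of the expressiveness gap being claimed: a constexpr function is type-checked and compile-time evaluable, but it can only compute values, whereas a macro (hygienic or not) can splice arbitrary code.

```c++
#include <cstdint>

// Textual macro: no types, no scoping, but it can generate arbitrary code.
#define SQUARE_MACRO(x) ((x) * (x))

// constexpr: type-checked and compile-time evaluable, but limited to
// computing values within the existing language.
constexpr std::int64_t square(std::int64_t x) { return x * x; }

static_assert(square(12) == 144, "evaluated at compile time");
static_assert(SQUARE_MACRO(12) == 144, "also folds, but unchecked");

int main() {}
```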
I think the article is a fun thought exercise, but it sticks too closely to what C++ has become in our timeline and ignores better alternatives. Explained and implemented at the outset, those alternatives would have produced a language with the performance and abstraction characteristics of today's C++, but on a sound foundation for further evolution as the language adapts to changes in the industry at large.
As far as structs go, all you could pass to a function were pointers to them. In fact, with one exception, all parameters to functions were basically a machine word: either a pointer or a full-size int. The exception was doubles (and all floating-point args were passed as doubles).
Hmm... maybe two exceptions? Not sure about long.
The treatment of structs as full values that could be assigned and passed to or returned from functions was only introduced in ANSI C, 1989.
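To make that concrete, here is a sketch in C-compatible code of the two calling styles; before ANSI C, only the pointer version was legal.

```c++
#include <stdio.h>

struct point { int x, y; };

/* Pre-ANSI style: pass a pointer; the struct itself cannot cross
   the function boundary. */
int sum_ptr(const struct point *p) { return p->x + p->y; }

/* ANSI C (1989) style: the struct is a first-class value that can be
   assigned, passed, and returned. */
struct point offset(struct point p, int d) { p.x += d; p.y += d; return p; }

int main(void) {
    struct point a = {1, 2};
    printf("%d\n", sum_ptr(&a));     /* 3 */
    struct point b = offset(a, 10);  /* by-value copy in, copy out */
    printf("%d %d\n", b.x, b.y);     /* 11 12 */
    return 0;
}
```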
And of course the correct recommendation to Bjarne would be: just look at what Brad is doing and copy that.
> During 1973-1980, the language grew a bit: the type structure gained unsigned, long, union, and enumeration types, and structures became nearly first-class objects (lacking only a notation for literals).
And
> By 1982 it was clear that C needed formal standardization. The best approximation to a standard, the first edition of K&R, no longer described the language in actual use; in particular, it mentioned neither the void or enum types. While it foreshadowed the newer approach to structures, only after it was published did the language support assigning them, passing them to and from functions, and associating the names of members firmly with the structure or union containing them. Although compilers distributed by AT&T incorporated these changes, and most of the purveyors of compilers not based on pcc quickly picked them up, there remained no complete, authoritative description of the language.
So passing structs entered the language before C89, and possibly was available in some compilers by 1979. I was very active in C during this period and was a member of X3J11 (I happen to be the first person ever to vote to standardize C, due to alphabetical order), but unfortunately I'm not able to pin down the timing from my own memory.
P.S. Page 121 of K&R C, first edition, says "The essential rules are that the only operations that you can perform on a structure are take its address with &, and access one of its members. This implies that structures may not be assigned to or copied as a unit, and that they cannot be passed to or returned from functions. (These restrictions will be removed in forthcoming versions.)"
So they were already envisioning passing structs to functions in 1978.
The plot is based on the premise that H.G. Wells actually invents a time machine, and it's used by Jack the Ripper to travel to 1979 San Francisco.
The second-best advice is probably: just do C with classes. Allow defining your own allocator to make objects of those classes. It's fine if objects built with one allocator can only refer to objects built by the same one.
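A rough sketch of what that might look like; `Arena`, `Node`, and `make_node` are hypothetical names for illustration, not anything C++ actually shipped:

```c++
#include <cstddef>
#include <new>

// Hypothetical fixed-size arena: objects live and die with it.
class Arena {
    alignas(std::max_align_t) unsigned char buf[4096];
    std::size_t used = 0;
public:
    void* allocate(std::size_t n) {
        // Round up so every allocation stays max-aligned.
        n = (n + alignof(std::max_align_t) - 1) & ~(alignof(std::max_align_t) - 1);
        if (used + n > sizeof buf) throw std::bad_alloc{};
        void* p = buf + used;
        used += n;
        return p;
    }
};

struct Node {
    Node* next = nullptr;  // by convention, must point into the same Arena
    int value = 0;
};

// Construct a Node inside a specific arena with placement new.
Node* make_node(Arena& a, int v) {
    Node* n = new (a.allocate(sizeof(Node))) Node;
    n->value = v;
    return n;
}

int main() {
    Arena arena;
    Node* head = make_node(arena, 1);
    head->next = make_node(arena, 2);  // same arena: fine by the rule above
}
```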
Don't do templates; just do the minimum needed for a container type to know what type it contains, for compile-time type checking. If you want to build a function that works on all numbers regardless of whether they are floats or complex or whatever, don't, or make it work on the classes and interfaces you just invented. A Float is a Number, as is an Integer. Put all the cleverness you'd waste on templates into making the compiler somewhat OK at turning that into machine types.
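A hedged sketch of that interface-based alternative; `Number`, `Integer`, `Float`, and `add` are illustrative names, not a real proposal:

```c++
#include <iostream>

// The interface: anything numeric exposes its value.
struct Number {
    virtual ~Number() = default;
    virtual double value() const = 0;
};

struct Integer : Number {
    int v;
    explicit Integer(int v) : v(v) {}
    double value() const override { return v; }
};

struct Float : Number {
    double v;
    explicit Float(double v) : v(v) {}
    double value() const override { return v; }
};

// Works on any Number without templates; the cost is dynamic dispatch
// and losing the static type of the result.
double add(const Number& a, const Number& b) { return a.value() + b.value(); }

int main() {
    Integer i{2};
    Float f{3.5};
    std::cout << add(i, f) << "\n";  // 5.5
}
```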
Very specifically, don't make the most prominent use of operator overloading a hack that repurposes the binary left-shift operator to mean `write to stream`. People will see that, do the worst things imaginable, and feel good about themselves for being so clever.
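The objection in two lines: the same token means "shift left" or "write to stream" depending only on operand types.

```c++
#include <iostream>

int main() {
    int x = 1;
    std::cout << (x << 3) << "\n";  // outer << writes; inner << shifts; prints 8
}
```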
The individual merits of language features hold relatively little value compared to the sausage-making machine that is the C++ language evolution process.