We take this for granted now, but at the time it was revolutionary. In part that's because we've done things like mandate Unicode and IEEE 754, but nowadays most of our languages also encourage portability. We think very little of moving an application from Windows on x86_64 to Linux on ARMv8 (apart from the GUI mess), but back when Cobol was being created, you normally threw your programs away (“reprogramming”) when you went to a new machine.
I haven't used Cobol in anger in 50 years (40 years since I even taught it), but for that emphasis on portability, I am very grateful.
You need special custom numeric types to come even close in, say, Java or C++ or any other language.
I guess you mean:
>digest -> digits
>loosing -> losing
Is that the same as BCD (Binary Coded Decimal)? IIRC, Turbo Pascal had that as an option, or maybe I am thinking of something else; sorry, it's been many years.
1100 in “regular” binary is 12 in decimal.
0001 0010 in BCD is 12 in decimal.
i.e., BCD is an encoding.
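For concreteness, here's a minimal sketch (Python, purely illustrative) of that encoding: each decimal digit gets its own 4-bit group, rather than the whole number being converted to base 2.

```python
def to_bcd(n: int) -> str:
    # Encode each decimal digit as its own 4-bit group (one nibble per digit).
    return " ".join(format(int(d), "04b") for d in str(n))

def from_bcd(bits: str) -> int:
    # Decode by reading each 4-bit group back as a single decimal digit.
    return int("".join(str(int(nibble, 2)) for nibble in bits.split()))

print(format(12, "04b"))       # '1100'      -> 12 in plain binary
print(to_bcd(12))              # '0001 0010' -> 12 in BCD
print(from_bcd("0001 0010"))   # 12
```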
High precision numbers are more akin to the decimal data type in SQL or maybe bignum in some popular languages. It is different from (say) float in that you are not losing information in the least significant digits.
You could represent high precision numbers in BCD or regular binary… or little endian binary… or trinary, I suppose.
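As a quick sketch of the precision point, using Python's `decimal` module just as a familiar stand-in for SQL's DECIMAL or COBOL's fixed-point types: a binary float drops information in the least significant digits, while a decimal type does not.

```python
from decimal import Decimal, getcontext

getcontext().prec = 30  # plenty of significant digits for the example

# Binary float: 0.10 has no exact base-2 representation, so the cents drift.
print(sum(0.10 for _ in range(1_000)))             # roughly 99.9999999999999, not 100

# Decimal: exact base-10 arithmetic, no drift in the least significant digits.
print(sum(Decimal("0.10") for _ in range(1_000)))  # 100.00
```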
(There are a few other threads with a smaller number of comments.)
"COBOL was one of the four “mother” languages, along with ALGOL, FORTRAN, and LISP."
Lisp isn't as widely used as, say, Python, but it's still something a lot of people touch every single day.
I feel that the article should have made this a lot clearer, as so many people code along the APL -> Matlab / R (via S) -> NumPy family tree.
My main beef, however, is that the last sentence in the section seems to suggest that the birth of Haskell killed SML on the vine because suddenly everybody only wanted pure, lazy FP. That's just wrong. The reality is that these two branches of Functional Programming (strict/impure and lazy/pure) have continued to evolve together to the present day.
This is when I started professionally, and we were asked to replace "slow, old Perl scripts." As a new entrant, I didn't ask many questions, but I also didn't see any of the replacements as improvements in any way. I think the number of devs left to take over messy Perl projects was shrinking.
As you might imagine, this job involved a lot of text processing. People still point to that as the arrow in Perl's quiver, but it seems especially quaint today since any language I'd reach for would blow it out of the water in terms of flexibility and ease of use.
Would I be wrong in saying that SQL has what feels to me to be a very COBOL-y syntax? By which I mean, I know it is not directly related to COBOL, but someone definitely looked at COBOL's clunky attempt at natural language and said "that, I want that for my query language."
I put the blame solely on the management of Borland. They had the world-leading language, and went off after C++ and in search of "Enterprise" instead of just riding the wave.
When Anders gave the world C#, I knew it was game over for Pascal, and also Windows native code. We'd all have to get used to waiting for compiles again.
For some reason I remember an odd feature of PL/1: Areas and offsets. If I am remembering correctly, you could allocate structures in an area and reference them by offset within that area. That stuck in my mind for some reason, but I never found a reason to use it. It struck me as a neat way to persist pointer-based data structures. And I don't remember seeing the idea in other languages.
Maybe the reason it stayed with me is that I worked on Object Design's ObjectStore. We had a much more elegant and powerful way of persisting pointer-based structures, but an area/offset idea could have given users some of the capabilities we provided right in the language.
>> An area is a region in which space for based variables can be allocated. Areas can be cleared of their allocations in a single operation, thus allowing for wholesale freeing. Moreover, areas can be moved from one place to another by means of assignment to area variables, or through input-output operations.

>> Based variables are useful in creating linked data structures, and also have applications in record input-output. A based variable does not have any storage of its own; instead, the declaration acts as a template and describes a generation of storage.

http://www.iron-spring.com/abrahams.pdf p. 19, 74
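Not PL/I, but here's a rough sketch (Python, purely illustrative) of the area/offset idea: nodes are allocated inside one contiguous area and link to each other by offset rather than by machine address, so the whole structure can be written out, reloaded, or copied elsewhere and the links stay valid.

```python
import struct

NODE = struct.Struct("<ii")   # (value, offset_of_next); -1 marks end of list
AREA_SIZE = 4096

class Area:
    """A relocatable region: nodes refer to each other by offset, not address."""
    def __init__(self):
        self.buf = bytearray(AREA_SIZE)
        self.top = 0                      # next free offset

    def alloc_node(self, value, next_off=-1):
        off = self.top
        NODE.pack_into(self.buf, off, value, next_off)
        self.top += NODE.size
        return off                        # the "pointer" is just an offset

    def node(self, off):
        return NODE.unpack_from(self.buf, off)

# Build a linked list entirely inside the area.
area = Area()
head = -1
for v in (3, 2, 1):
    head = area.alloc_node(v, head)

# Because links are offsets, the bytes can be saved to disk or copied
# to another area and the list is still intact -- no pointer fixup needed.
copy = Area()
copy.buf[:] = area.buf
off = head
while off != -1:
    value, off = copy.node(off)
    print(value)                          # 1, 2, 3
```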
When I was in grad school in the late 70s, there was a major competition to design a DoD-mandated language, to be used in all DoD projects. Safety and efficiency were major concerns, and the sponsors wanted to avoid the proliferation of languages that existed at the time.
Four (I think) languages were defined by different teams, DoD evaluated them, and a winner was chosen; that winner became Ada. It was a big thing in the PL community for a while. And then it wasn't. My impression was that it lost out to C, even though Ada provided much better safety (memory overruns were probably impossible, or close to it). It would be interesting to read a history of why Ada never took off the way that C did.
The things it got wrong were mostly that it had a rigorous mathematical definition (syntax and semantics) that was almost unreadable by humans ... and the use of two character sets (this was in the days of cards) rather than reserved words.
I've heard of enough Cobol and Fortran jobs existing, and Lisp continues to exist in some form or other, but Algol really does seem dead. I remember someone telling me about an Algol codebase that was decommissioned in 2005 and that seemed like a very late death for an Algol codebase.
(In contrast, Lisp retains some unique ideas that have not been adopted by other languages, so it survives by a slim margin.)
dlachausse•2d ago
Pascal, particularly the Delphi/Object Pascal flavor, is also still in widespread use today.
pessimizer•8h ago
edit: for ancient Greek to become a dead language, will we be required to burn all of the books that were written in it, or can we just settle for not writing any new ones?
eviks•2h ago
Same with a programming language - if no one is writing code in it, it's dead.
iLoveOncall•8h ago
No.
You have to put this relative to projects started in other languages, at which point new projects started in COBOL are even less than a rounding error; it probably wouldn't result in anything other than 0 with a float.
duskwuff•7h ago
As an aside, the article you linked to is pretty obvious AI slop, even aside from the image ("blockchin infarsucture" and all). Some of the details, like claims that MIT is offering COBOL programming classes or that banks are using COBOL to automatically process blockchain loan agreements, appear to be entirely fabricated.