If I were to venture to guess: I'd say "most developers" seek work-life balance and aren't interested in reading long-form articles on how to do X or Y when they are off the clock.
I am happy that you are here but there is no need for hyperbole.
Towards the end, I worked on a project to port Wang VS OS and its COBOL to AIX. I was tasked with finding issues with the COBOL programs we had on the VS. It was a good environment, but Wang went Chapter 11 before it was ready :) It was rather close to being complete.
I imagine it's the one place where LLMs would absolutely shine. COBOL jobs are usually very verbose, lots of boilerplate, but what they do is mostly very straightforward batch processing. It's ripe for automation with LLMs.
The flip side is that banks are usually very conservative about technology (for good reason).
For example, if I prompt ChatGPT: "Write me a BF program that produces the alphabet, but inverts the position of J & K" it will deterministically fail. I've never even seen one that produces the alphabet the normal way. I can run a GP algorithm over an example of the altered alphabet string and use simple MSE to get it to evolve a BF program that actually emits the expected output.
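For anyone curious what that looks like in practice, here is a minimal sketch of the idea (not the commenter's actual setup): a tiny Brainfuck interpreter plus a mutation-only hill climb that scores candidates by MSE against the target string. A real GP run would use a population and crossover, but the feedback loop has the same shape.

```python
import random

# Target: the alphabet with the positions of J and K swapped.
TARGET = b"ABCDEFGHIKJLMNOPQRSTUVWXYZ"
OPS = "+-<>[]."

def run_bf(prog, max_steps=20000, tape_len=64):
    """Interpret a Brainfuck program (no input op) and return its output bytes."""
    stack, jumps = [], {}
    for i, c in enumerate(prog):          # pre-match brackets; unbalanced = invalid
        if c == "[":
            stack.append(i)
        elif c == "]":
            if not stack:
                return b""
            j = stack.pop()
            jumps[i], jumps[j] = j, i
    if stack:
        return b""
    tape, out, ptr, pc, steps = [0] * tape_len, [], 0, 0, 0
    while pc < len(prog) and steps < max_steps:
        c = prog[pc]
        if c == "+":
            tape[ptr] = (tape[ptr] + 1) % 256
        elif c == "-":
            tape[ptr] = (tape[ptr] - 1) % 256
        elif c == ">":
            ptr = (ptr + 1) % tape_len
        elif c == "<":
            ptr = (ptr - 1) % tape_len
        elif c == ".":
            out.append(tape[ptr])
        elif c == "[" and tape[ptr] == 0:
            pc = jumps[pc]
        elif c == "]" and tape[ptr] != 0:
            pc = jumps[pc]
        pc += 1
        steps += 1
    return bytes(out)

def fitness(prog):
    """MSE between program output and TARGET, plus a heavy penalty per missing/extra byte."""
    out = run_bf(prog)
    err = sum((a - b) ** 2 for a, b in zip(out, TARGET))
    err += 255 ** 2 * abs(len(out) - len(TARGET))
    return err / len(TARGET)

def mutate(prog):
    """Insert, delete, or replace one random instruction."""
    kind = random.choice("idr") if prog else "i"
    op = random.choice(OPS)
    if kind == "i":
        i = random.randrange(len(prog) + 1)
        return prog[:i] + op + prog[i:]
    i = random.randrange(len(prog))
    return prog[:i] + prog[i + 1:] if kind == "d" else prog[:i] + op + prog[i + 1:]

best, best_err = "", fitness("")
for _ in range(200_000):                  # simple (1+1) evolutionary loop
    cand = mutate(best)
    err = fitness(cand)
    if err <= best_err:
        best, best_err = cand, err
    if best_err == 0:
        break
print(best_err, run_bf(best), best)
```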
The BPE tokenizer seems like a big part of the problem when considering the byte-per-instruction model, but fundamentally I don't think there is a happy path even if we didn't need to tokenize the corpus. The expressiveness of the language is virtually non-existent. Namespaces, type names, member names, attributes, etc., are a huge part of what allows an LLM to lock on to the desired outcome. Getting even one byte wrong is catastrophic for the program's meaning. You can get a lot of bytes wrong in C/C++/C#/Java/Go/etc. (e.g. member names) and still have the function do exactly the same thing.
BUT: Please, oh please, write up a blog entry! I bet that would be fun to read.
Mostly to figure out the best way to replace the old systems with something newer, so not really as a "COBOL dev", though.
I heard a story about replacing COBOL with JavaScript ... and my skin still crawls thinking about it.
It sounds kinda crazy, but with good change control, documentation, and a good relationship with the ETL team, it was pretty maintainable.
The very high salaries you hear about sometimes are always for VERY specific mainframes that are extremely old with a lot of quirks, and are usually being paid to consultants.
Seeing the horrible performance from Indian offshore firms with modern languages I cannot imagine the mess they make with legacy languages like Cobol. Or is it the other way around?
Ditto with market control, it's not some permanent crown you achieve. Companies have to keep performing to keep their market share.
E.g., if you opened an account at a major bank, and your transactions started failing, would you keep banking there?
A lot of people who land in that situation do continue banking there since they are either tied into that bank through loans/debt, or lack the time/energy to move elsewhere.
The argument that a market leader can screw up because it "owns" the market is not correct. Where's Xerox / IBM / Intel now?
In my country, no, COBOL jobs aren't well paid. They are below average.
If changes are made to these systems, it’s often due to changes in regulation or driven by changes in the business (new financial products being offered, etc.).
Off-topic: I’ve seen quite a few mainframe related posts on HN fly by over the years. I’ve been meaning to create an account and participate but I’ve only gotten around to it just now.
And welcome!
There are some free resources available that will allow you to get training but I haven’t tried them myself. IBM Z Xplore is worth a look as an example: https://www.ibm.com/products/z/resources/zxplore
I hope you find a way in; more mainframe developers and sysadmins (often called systems programmers in the mainframe niche) are always needed.
Edit*: Spelling and grammar
They have it hooked in to VS Code now. It’s weirdly modern. And you get to play on a real z machine.
Recommendable summer/holiday tinkering project. It’s amazing how much and yet how little has changed in computing and transaction processing.
https://www.coursera.org/professional-certificates/ibm-z-mai...
Also the MOSHIX mainframe YouTube channel has a lot of info, and helped me set up the HERCULES emulator for my own little mainframe experience.
You can also look at the IBM Redbooks site[2]. Search for terms like Z/OS, MVS, CICS, DB/2, etc. and you'll find a lot of IBM books, whitepapers (well, they call them redpapers, but whatever) and so on.
Computing total exposures and possible loss distributions can be more computationally heavy. That includes grouping together similar policies, which is multiplicative in complexity.
The system contained records of all their policies including all the premium factors (e.g. make, model, year of car, parking location, physical address, etc). These premium factors were then fed into a rating engine which would use actuarial tables with custom actuarial defined algorithms to determine premiums.
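As a rough illustration of the shape of that flow (hypothetical factor names and numbers, nothing from the actual system), the premium factors index into rating tables and the engine combines the resulting relativities with a base rate:

```python
# Toy rating-engine sketch with made-up factor tables; a real actuarial
# algorithm is far more involved (credibility weighting, caps, interactions).
BASE_RATE = 480.00  # hypothetical annual base premium

FACTOR_TABLES = {
    "vehicle_age":      {"0-3": 1.10, "4-9": 1.00, "10+": 0.92},
    "parking_location": {"garage": 0.95, "street": 1.12},
    "territory":        {"urban": 1.25, "suburban": 1.00, "rural": 0.88},
}

def rate_policy(policy):
    """Multiply the base rate by the relativity looked up for each premium factor."""
    premium = BASE_RATE
    for factor, table in FACTOR_TABLES.items():
        premium *= table[policy[factor]]
    return round(premium, 2)

print(rate_policy({"vehicle_age": "4-9", "parking_location": "street",
                   "territory": "urban"}))  # -> 672.0
```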
In insurance companies, working out the correct premium is core to everything. Insurance companies can have lots of different products and their competitive edge comes from how well they structure their offerings and determine the correct premiums based on risk factors. One does not simply rewrite such a thing.
Couple of things I thought were a bit interesting about the place:
- Their single server running Universe Basic (with a hot spare, I believe) had 4TB of RAM.
- While I was used to the devs being the stars of the show at the consulting house I worked at, at insurance companies it’s the actuaries.
The better your risk models, the more easily you can offer competitive premiums without over-exposure.
(I have worked in insurtech, although my work was on transactional services and our white-labelling capabilities rather than the math-heavy stuff.)
Life Insurance is mostly a savings product, and the insurance part protects you if you live too long.
Property and casualty insurance protects you from losses, including someone's life, but also houses, cars, etc.
The domains are quite different, but they both have specific "insurance business" computing that's related to actuarial science or analysis, i.e. the statistics needed to calculate reserves, policies, prices etc.
I doubt COBOL is used for any actuarial analysis. I think SAS is still strong, but I suspect R is used now. Maybe Python is used in the more static calculations that are handed over to developers, but the actuaries are typically coding whatever they need when they create their models.
The rest is just case management, automating business rules, bookkeeping, payments, and for life insurance also systems for trading securities and funds, and possibly in-house tools for asset management.
There isn't really a strong case for COBOL. The only reason COBOL is still used is that the insurance companies were early adopters and saw computing as a way to reduce administrative overhead. The investments were made at a time when trusting some hippies running UNIX wasn't really on the table, and even less so trusting some nerds and their rickety PCs. They built up a workforce of COBOL devs that also gained quite a lot of business knowledge.
Digitalisation created another problem: a lot of the older employees were hired to do simple administrative tasks. Even big corporations aren't totally psychopathic, so it has actually taken a long time to move those employees out and retrain the remainder for the jobs that got more demanding, even the employees that didn't really have much high-value domain-specific knowledge to begin with. So the case for more automation was actually not as strong as it could have been.
Even so, although life insurance in particular is a totally digital product (damage claims are not), at heart they primarily see IT as a cost centre, even if they probably claim they do not.
This has shaped their systems, and they have tended to replace their old systems only when forced by external factors, as the upsides (better digital sales, more automated decisions, a better trading experience for their customers, etc.) are not as easy to achieve as the more tangible administrative automation cost savings they started out with.
Actually, this also applies to banks. You could totally run an insurance company or bank without any mainframes or a single line of COBOL. But the organisations still have COBOL developers and, maybe more importantly, an upper management that comes from that tradition.
Currently working on the migration of a 30-year-old ERP, written in Progress and without tests, to Kotlin+PostgreSQL.
AI agents don’t care what code they have to read or convert into tests. They just need an automated feedback loop and some human oversight.
Then again, a human won't know all the requirements either; over time, the requirements end up literally encoded in the system.
Is there a delta? Debug and add a unit test to capture the bug. Then fix and move to the next delta.
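A minimal sketch of that delta-hunting loop (hypothetical stand-ins; the real harness would run against the Progress ERP and the Kotlin+PostgreSQL rewrite):

```python
# Hypothetical stand-ins for calls into the legacy system and the rewrite;
# here they differ only in rounding behaviour.
def old_system(order):
    # pretend the legacy system nudges .5 cases upward (round half up)
    return round(order["qty"] * order["price"] + 1e-9, 2)

def new_system(order):
    # pretend the rewrite uses Python's default rounding: a classic source of penny deltas
    return round(order["qty"] * order["price"], 2)

def find_deltas(orders):
    """Yield every input where the old and new implementations disagree."""
    for o in orders:
        if (old := old_system(o)) != (new := new_system(o)):
            yield o, old, new

for delta in find_deltas([{"qty": 3, "price": 0.835}, {"qty": 2, "price": 1.00}]):
    print(delta)   # each hit becomes a unit test pinning the expected value before the fix
```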
And nobody knew how the whole thing worked. Everyone had their niche of interaction with the system. They would be able to shave an insane percentage off expenses (in the form of employees whose jobs exist for no real reason), but the switching costs would also be immense.
I sometimes wonder what came of their company. The system was so complex that no one could grasp it all, they had no in-house devs, and they'd need people competent enough to judge which competencies they even needed.
They wouldn't hang around here though.
https://www.reddit.com/r/cobol/
If you're ever really bored, search around the HN archives to find out how I accidentally founded that community as a result of a joke. :-)
But keep in mind there was no internet, no podcasts, no youtube. If you needed to learn something you learned it at work, on work time. If something new was introduced, IBM came in and conducted training, at work. For something big you might get sent to an IBM training center for a week. There was no need for (and no real way to do) any learning on your own outside of work.
Lots of batch jobs running at night. Their alert system is an actual human who calls my mom when jobs fail in the middle of the night.
It's high paying for the city they live in, but not high paying for software development. They will both have full retirement and healthcare for life, assuming the government can fulfill it. They are both fully remote since COVID too.
She's also worked for state lottery, teacher's retirement system and DOT.
edit: she says they have a SQL database, but mostly store data in IBM IMS
They really are. I had a part-time coworker who moonlighted at some mainframe job, and he often had another laptop on his desk connected to a z/OS terminal. He would show me some of the jobs and code occasionally too; really fascinating stuff, and he was quite good at it and could navigate quickly.
Color in terminal emulators was one of the main perks of Linux over other Unixes for me at first!
This hasn't been virtualized?
(paraphrased: https://groups.google.com/g/golang-nuts/c/hJHCAaiL0so/m/kG3B...)
Pike genuinely doesn't seem to care for syntax highlighting, though understands that his is a minority opinion.
Fun fact: the first SMP UNIX implementation ran on top of EXEC 8, the kernel of OS 2200.
"Any configuration supplied by Sperry, including multiprocessor ones, can run the UNIX system."
https://www.nokia.com/bell-labs/about/dennis-m-ritchie/retro...
Edit: https://web.archive.org/web/20150611114648/https://www.bell-...
One of the modules I saw in action was written before the moon landing, written by a lady programmer.
Batch jobs, clunky and very verbose programs, nothing interesting. I... hated it.
I was part of a team that was writing web applications that needed to call z/OS transactions. The IBM solution was to use their Transaction Gateway product, which cost a ton and was slow as shit. We developed a framework that used annotations on the Java side to map COBOL records to Java objects and invoke transactions over a TCP socket. Learning how to pack decimals and convert encodings was pretty cool. We ended up with a framework that was at least a zillion times faster than the IBM solution. Left that job though, as the company was in distress and was losing customers (health plans). They eventually folded.
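For a flavour of the two conversions mentioned (illustrative only, not that framework's code), here is how a COBOL COMP-3 packed-decimal field and an EBCDIC text field can be decoded, using Python's cp037 codec for the EBCDIC side:

```python
def unpack_comp3(raw: bytes, scale: int = 2) -> float:
    """Decode a COBOL COMP-3 (packed decimal) field: two digits per byte,
    sign in the low nibble of the last byte (0xD = negative, 0xC/0xF = positive)."""
    digits = []
    for b in raw:
        digits.append((b >> 4) & 0x0F)
        digits.append(b & 0x0F)
    sign = digits.pop()                    # the final nibble is the sign
    value = int("".join(map(str, digits)))
    if sign == 0x0D:
        value = -value
    return value / (10 ** scale)

amount = unpack_comp3(bytes([0x12, 0x34, 0x5C]))               # 12 34 5C -> 123.45
name = bytes([0xC8, 0xC5, 0xD3, 0xD3, 0xD6]).decode("cp037")   # EBCDIC -> "HELLO"
print(amount, name)
```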
On the other hand, these guys generally write pretty neat, lean code that is quick, reliable, and directly responsive to the business. The really fun thing is watching the users fly through the keyboard-only screens, sometimes with muscle memory that is faster than the terminal emulator can update - they're literally working ahead of the screens.
ASCII tables, text only, with F key shortcuts. Hard to learn but blazing fast once you did.
Nothing modern approaches it.
A few points I can easily remember:
1. Navigating the code, e.g. easily seeing all the callers or navigating up/down the call tree, requires static code analysis. Super handy while reading someone else's code, which is like 90% of the work on large projects.
2. Quick refactorings. Oftentimes I see people discuss at length what would/could be instead of just going and trying it out quickly, seeing all the pros and cons. Many times I've proven myself wrong by trying it out and seeing pitfalls I didn't see earlier.
3. Warnings: so many real bugs could've been prevented if developers had seen (or cared about) the IDE showing a warning. Many PR review suggestions are detectable by a proper IDE without wasting the reviewer's time.
4. Hotkeys (what the parent comment was talking about) speed up all of that, especially refactorings, freeing the dev's brain for thinking about architecture and other problems.
I can go on and on. Sometimes it feels like 50%+ of AI usage for coding is to free up fingers, not knowing that they were already mostly free thanks to static analysis features and hotkeys.
From a technical-cultural perspective it was mostly sulkiness, and a complete and utter lack of embracing the paradigms of distributed computing. Also, like most internal clouds, there were plenty of issues as it was. Initially they just tried to replace mainframe application components 1:1 onto VMs in whatever way, and whenever anything was <100% reliable they complained that our cloud was not able to do it. I had to explain the realities of "cloud" in a very harsh way, under a lot of pressure (I believe not hitting the deadline for switching off the mainframes meant renewal for a year at 40 mil, or thereabouts).
The developers I spoke with in that time, though, were very much the opposite of the move-fast-break-things crowd. Intelligent, but also narrow-minded, I would say.
He’s trying to learn Go now and modernize himself to see if he can get out. I’m trying to help as much as I can. Hopefully, he’ll land a job somewhere else this year.
The recent hacking of BMP shows the risk this creates (a poorly paid employee with debts sold his password to hackers).
A legitimate question, but so far not many answers, and they're mostly from people who know people who know COBOL devs. This is to be expected.
Demographically, COBOL devs skew older, and there aren't a lot of graybeards left on HN. This place used to be full of them, and they always had interesting and unusual insights and techniques to share. Those days are long gone.
IMO, Graybeards have largely left HN for a few reasons:
- They're tired of being shouted down by the Reddit-quality ageism that lingers through this forum.
- They're mature enough to no longer be interested in chasing every little tech fad as if their lives depended on it, and that's 90% of what HN has become.
- Like most older people, they have other things in their lives that are more interesting than work. Family. Children. Hobbies. Traveling. Service. The world is full of things more rewarding than being terminally online, or being reminded of your day job.
I applaud your curiosity, but you're standing in a church asking, "Where are all the atheists?" COBOL devs aren't here. And where they are is likely not online.
We need more like this, please.
If such an online community exists, where did these graybeards go to?
This was around 1999, and I was building a system for configuring and ordering custom PCs at a large distribution company. One of the feature requirements was that we display inventory over the various options (i.e., there are 373 20G disks in stock, but only 12 30G disks in stock). The idea was that this would let a customer ordering 200 machines know that they should pick the 20G disk if they wanted it now.
The way inventory at this company was done was via two systems. There was a normal SQL database that had access to a daily snapshot of inventory taken from a mainframe that always had the up to date data. With the mainframe taking a while to process queries, we used the less current SQL database for the majority of the UI, but took the time to query the mainframe once a customer was in the shopping cart. Customers might see a change during the flow, but it would at least let them see the most current data prior to committing to a purchase.
The mainframe query itself was implemented by someone else as a COBOL job that produced the required inventory numbers. From my point of view, it was just a limited sort of query conducted over a specialized JDBC driver. (Not necessarily the weirdest aspect of that design.... for a variety of reasons, we did the UI in ASP/VBScript, the back end using Microsoft's JVM invoked via COM Automation, and the SQL database link was via a dubious use of a JDBC/ODBC bridge to connect to SQL Server. It all worked, but not the most solid architecture.)
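A rough sketch of that two-tier lookup (hypothetical function and data names; the real implementation sat behind ASP/VBScript, COM Automation, and a JDBC/ODBC bridge):

```python
# Hypothetical in-memory stand-ins for the two inventory sources.
SNAPSHOT  = {"DISK-20G": 373, "DISK-30G": 12}   # nightly SQL snapshot (may be stale)
MAINFRAME = {"DISK-20G": 361, "DISK-30G": 9}    # live mainframe counts (slow to query)

def snapshot_inventory(sku):
    """Cheap read from the daily snapshot, used while browsing options."""
    return SNAPSHOT.get(sku, 0)

def mainframe_inventory(sku):
    """Slow, authoritative query that ultimately ran the COBOL inventory job."""
    return MAINFRAME.get(sku, 0)

def inventory_for(sku, in_checkout=False):
    # Only the shopping cart pays the latency cost of the live number.
    return mainframe_inventory(sku) if in_checkout else snapshot_inventory(sku)

print(inventory_for("DISK-30G"))                     # browsing: 12 (snapshot)
print(inventory_for("DISK-30G", in_checkout=True))   # checkout: 9 (live)
```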
==
My only other mainframe experience was working as an intern for a utility company a few years prior (1991-1992). They used CDC Cyber mainframes to run the power grid, using something like 4MM lines of FORTRAN code. The dispatchers themselves interfaced with the system using consoles with four 19" color displays running at 1280x1024. Heady stuff for 1991. (The real-time weather radar screen was cool too, in an age before the internet made it commonplace.)
There’s not much maintenance work. There are very few bugs, as the core applications have been running for decades; most come up in interactions with external services.
Any major development projects are only in service of lowering overall COBOL development time, like transitioning some business logic to database updates.
And there is a decommission plan for the mainframe, so plenty of work helping that team.
We work on Phoenix, the Government of Canada payroll system. If you google it, you'll see some interesting coverage. However, the underlying PeopleSoft ERP itself is rock solid at every other client I've served over the last 25 years. PeopleSoft uses COBOL and SQR, as well as proprietary languages stored in the database: Application Engine and PeopleCode.
Key payroll processes are in COBOL. This is because of its tight integration with the database and the ability to manually control database cursors. We are very database-oriented when it comes to performance. Our developers need to know the programming language, but also need a deep understanding of client business processes and SQL optimization. They also work closely with our DBAs to ensure good performance. So our developers are technically proficient in COBOL and a couple of other languages, but also very, very strong in SQL optimization; they understand client payroll rules and can speak intelligently with compensation advisers and payroll processors.
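To make the cursor-control point concrete, here is a minimal sketch in Python (stdlib sqlite3 standing in for COBOL's embedded SQL DECLARE CURSOR / FETCH): the batch walks the result set in explicitly sized chunks, so memory stays flat and commit points stay predictable over millions of pay rows.

```python
import sqlite3

# Stand-in data set; imagine millions of pay lines in the real system.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pay_line (emp_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO pay_line VALUES (?, ?)",
                 [(i, 100.0 + i) for i in range(10_000)])

cur = conn.execute("SELECT emp_id, amount FROM pay_line ORDER BY emp_id")
processed = 0
while True:
    batch = cur.fetchmany(1000)       # explicit FETCH of a bounded batch of rows
    if not batch:
        break
    # ... apply pay rules to the batch here, then commit per batch ...
    processed += len(batch)
print(processed, "rows processed")
conn.close()
```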
I personally found that to be true for most COBOL programmers: whereas the typical Hacker News dev seems very technology-oriented and frequently on the move, the typical COBOL programmer is very business-process-aware and integrated with the corporate line of business. They don't move as much for several reasons, but that deep awareness of the client is one of them.
Edit: I should mention that while PeopleSoft can and does work on the mainframe, most of my clients are on Windows, Linux, or AIX. COBOL is not quite as mainframe-specific as it sometimes seems :-). See e.g. Micro Focus COBOL for a modern multi-platform compiler.
I'm being a little unkind to my Dad. He moved to management fairly early on and didn't really keep up with things.
He taught me a hell of a lot though, and did really know his shit at one point. It worries me how much his skills and understanding have declined over the years.
For some reason I think we’re all drawn to the idea of working with an older language. I wonder why!
- Item Database (SKU, UPC, attributes)
- Stock Status (Sales, Onhands, process sales transactions from POS systems)
- Replenishment (Create vendor and DC orders to stores)
- Reporting (Report sales and other performance data by stores, departments, classes, etc..)
- Pricing (make and maintain current and future price changes, send to store when needed).
- Many other applications (15+)
They have been saying they are going to get rid of these applications for over 20 years now. I am still waiting...
There are three flavors of COBOL that I deal with: PeopleSoft-delivered, Vendor-delivered, and University-modified. Most of the work I do in COBOL comes down to reading the code to determine why a given behavior is observed. Only once (in University-modified code) have I needed to make an actual edit. The rest of the time I either modify the flow of information into the COBOL code, or I modify the result after the code has run.
Most of our processes are EOD centric, we run a lot of batch jobs (mainly TSO, very little IMS). Integrations are mostly file based but we do both call and expose APIs (“regular REST APIs”) as well as consuming from and producing to Kafka among other things. We integrate with both mainframe and distributed systems on prem as well as “internal” systems hosted on cloud.
We use Git for source control but have a CI/CD solution that is built in house. Quite a lot of other infrastructure is also built in house.
I am in my mid-30s and on the younger side looking at the mainframe area as a whole at my employer; however, in my team we have several developers in their 20s/30s.
My background is mainly back- and frontend development on the Microsoft tech stack but I have worked, and do work, with whatever is at hand/required. But mostly stuff like .NET and SQL Server on the backend, and Angular/Vue/React on the front end before this.
People joke about old coders brought out of retirement to maintain a dusty COBOL/RPG program, while the reality is that the tooling is simple enough that a young developer could learn it in a month, and master it in less than a year.
Plus, the expertise is not lost after a few years, given the platform focus on incremental improvement, and backwards compatibility.
Many megacorps still run AS/400, and its uptimes and performance are legendary.
Edit: Forgot to mention that I was mentored by folks more than twice my age that time.
zTPF runs the airlines, the model for real online transaction systems... Not SQL relational database shopping carts.
Bottom line, everything is Assembler. It's all just bits.
a) the systems are very tightly coupled, like uber-monolith architecture, and it's hard to QA them without end-to-end testing through everything. Good luck getting anyone to refactor them because a1) they're going away and a2) they're so huge. Which leads into . . .
b) there's 40 years of undocumented business logic in there, or business logic that only exists in code comments. And it still makes billions of dollars a year. So good luck replicating 40 years of tribal knowledge in the new systems.
c) It was written by cowboy coders 40 years ago when the company was a startup, so no one can learn to work on it without first getting hired here. The joke is one of the original architects went and created his own dialect of COBOL.
I had less trouble with assembly than COBOL at the time I'm afraid.
They're both weird beasts, but once I learned the C calling conventions for assembly, I was able to make a lot more sense of the assembly.
COBOL is a world unto itself. I didn't hate it, but I didn't think I saw a career in it either.
I'm just glad I opted not to take RPG :-)
I work in COBOL for government systems.
We upgraded our old mainframes to IBM Z16's, and we just got some new ones in recently for our backup server.
Part of the job is taking whatever they decide in legislation and translating that into code that processes whatever they decided to make law, from fees, suspensions, special legislation for certain areas, etc.
Our programming environment primarily uses TSO/ISPF and changeman for version control. We have access to IBM's IDz (previously RDz) as an IDE for development, and I would personally prefer to use that but haven't been able to get it installed on my work computer due to licensing issues. Part of security protocol is that we cannot use anything open-source, so VSCode with Zowe is, unfortunately, out of the picture.
We maintain a lot of old programs and modules, but we are also actively developing new programs as we expand our IT department - and yes, that is new COBOL programs. We have a Linux side as well which mainly deals with the web-side, but they still interact with the mainframe to call on modules - but all they are really doing is sending data to CICS to get data back. They do not know anything about the COBOL itself or how to program in it.