If I were to venture a guess: I'd say "most developers" seek work-life balance and aren't interested in reading long-form articles on how to do X or Y when they are off the clock.
I am happy that you are here but there is no need for hyperbole.
Towards the end, I worked on a project to port Wang VS OS and its COBOL to AIX. I was tasked with finding issues in the COBOL programs we had on the VS. It was a good environment, but Wang went Chapter 11 before it was ready :) It was rather close to being complete.
I imagine it's the one place where LLMs would absolutely shine. COBOL jobs are usually very verbose, lots of boilerplate, but what they do is mostly very straightforward batch processing. It's ripe for automation with LLMs.
The flip side is that banks are usually very conservative about technology (for good reason).
For example, if I prompt ChatGPT: "Write me a BF (Brainfuck) program that produces the alphabet, but inverts the position of J & K" it will deterministically fail. I've never even seen one that produces the alphabet the normal way. I can run a genetic programming (GP) algorithm over an example of the altered alphabet string and use simple MSE to get it to evolve a BF program that actually emits the expected output.
The BPE tokenizer seems like a big part of the problem when considering the byte-per-instruction model, but fundamentally I don't think there is a happy path even if we didn't need to tokenize the corpus. The expressiveness of the language is virtually non-existent. Namespaces, type names, member names, attributes, etc., are a huge part of what allows for an LLM to lock on to the desired outcome. Getting even one byte wrong is catastrophic for the program's meaning. You can get a lot of bytes wrong in C/C++/C#/Java/Go/etc. (e.g. member names) and still have the function do exactly the same thing.
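To make that concrete, here's a minimal sketch of the idea in Python - a mutation-only hill climb rather than a full GP with a population, using a tiny BF interpreter and MSE over output bytes. All parameters here are arbitrary choices, not anything canonical:

    import random

    OPS = "+-<>[]."  # no input op needed for this task
    TARGET = b"ABCDEFGHIKJLMNOPQRSTUVWXYZ"  # alphabet with J & K swapped

    def run_bf(prog, max_steps=20000):
        """Interpret a BF program on a zeroed tape; return its output bytes."""
        stack, jumps = [], {}
        for i, c in enumerate(prog):  # pre-match brackets; unbalanced = invalid
            if c == "[":
                stack.append(i)
            elif c == "]":
                if not stack:
                    return b""
                j = stack.pop()
                jumps[i], jumps[j] = j, i
        if stack:
            return b""
        tape, ptr, out, ip, steps = [0] * 256, 0, [], 0, 0
        while ip < len(prog) and steps < max_steps and len(out) <= len(TARGET):
            c = prog[ip]
            if c == "+": tape[ptr] = (tape[ptr] + 1) % 256
            elif c == "-": tape[ptr] = (tape[ptr] - 1) % 256
            elif c == ">": ptr = (ptr + 1) % 256
            elif c == "<": ptr = (ptr - 1) % 256
            elif c == ".": out.append(tape[ptr])
            elif c == "[" and tape[ptr] == 0: ip = jumps[ip]
            elif c == "]" and tape[ptr] != 0: ip = jumps[ip]
            ip += 1
            steps += 1
        return bytes(out)

    def fitness(prog):
        """MSE between output and target, zero-padded to equal length."""
        out = run_bf(prog)
        n = max(len(out), len(TARGET), 1)
        return sum(((out[i] if i < len(out) else 0) -
                    (TARGET[i] if i < len(TARGET) else 0)) ** 2
                   for i in range(n)) / n

    def mutate(prog):
        """Insert, delete, or replace one random op."""
        r = random.random()
        if r < 0.4 or not prog:
            i = random.randrange(len(prog) + 1)
            return prog[:i] + random.choice(OPS) + prog[i:]
        i = random.randrange(len(prog))
        if r < 0.7:
            return prog[:i] + prog[i + 1:]
        return prog[:i] + random.choice(OPS) + prog[i + 1:]

    best = "+" * 65 + "."  # seed: prints 'A'
    best_fit = fitness(best)
    for gen in range(500000):  # may stall in local optima; restarts help
        cand = mutate(best)
        f = fitness(cand)
        if f <= best_fit:  # accept ties so the search can drift
            best, best_fit = cand, f
        if best_fit == 0:
            print("found:", best)
            break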
Mostly to figure out the best way to replace the old systems with something newer, so not really as a "COBOL dev", though.
I heard a story about replacing COBOL with JavaScript ... and my skin still crawls thinking about it.
It sounds kinda crazy, but with good change control, documentation, and a good relationship with the ETL team, it was pretty maintainable.
The very high salaries you hear about sometimes are always for VERY specific mainframes that are extremely old with a lot of quirks, and are usually being paid to consultants.
Seeing the horrible performance from Indian offshore firms with modern languages I cannot imagine the mess they make with legacy languages like Cobol. Or is it the other way around?
Ditto with market control, it's not some permanent crown you achieve. Companies have to keep performing to keep their market share.
E.g., if you opened an account at a major bank, and your transactions started failing, would you keep banking there?
A lot of people who land in that situation do continue banking there since they are either tied into that bank through loans/debt, or lack the time/energy to move elsewhere.
The argument that a market leader can afford to screw up because it "owns" the market is not correct. Where's Xerox / IBM / Intel now?
In my country, no, COBOL jobs aren't well paid. They are below average.
If changes are made to these systems it’s often due to changes in regulation, or driven by changes in the business (new financial products being offered, etc.).
Off-topic: I’ve seen quite a few mainframe related posts on HN fly by over the years. I’ve been meaning to create an account and participate but I’ve only gotten around to it just now.
And welcome!
There are some free resources available that will allow you to get training but I haven’t tried them myself. IBM Z Xplore is worth a look as an example: https://www.ibm.com/products/z/resources/zxplore
I hope you find a way in; more mainframe developers and sysadmins (often called systems programmers in the mainframe niche) are always needed.
Edit*: Spelling and grammar
https://www.coursera.org/professional-certificates/ibm-z-mai...
Also the MOSHIX mainframe YouTube channel has a lot of info, and helped me set up the Hercules emulator for my own little mainframe experience.
You can also look at the IBM Redbooks site. Search for terms like z/OS, MVS, CICS, DB2, etc. and you'll find a lot of IBM books, whitepapers (well, they call them redpapers, but whatever) and so on.
Computing total exposures and possible loss distributions can be more computationally heavy. It involves grouping together similar policies, which multiplies the complexity.
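To give a flavor of the aggregation part - a toy sketch in Python, assuming independent policies with made-up discrete loss distributions (real models are continuous and far richer); the portfolio loss distribution is the convolution of the per-policy ones:

    import numpy as np

    # Toy per-policy loss distributions: P(loss = $0, $10k, $20k).
    # Invented numbers, purely for illustration.
    policy_a = np.array([0.95, 0.03, 0.02])
    policy_b = np.array([0.90, 0.07, 0.03])

    # For independent policies, the combined loss distribution is the
    # convolution of the individual ones - and doing this across many
    # groups of similar policies is where the cost multiplies.
    portfolio = np.convolve(policy_a, policy_b)
    print(portfolio)        # P(total loss = 0, 10k, 20k, 30k, 40k)
    print(portfolio.sum())  # sanity check: should be ~1.0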
The system contained records of all their policies including all the premium factors (e.g. make, model, year of car, parking location, physical address, etc). These premium factors were then fed into a rating engine which would use actuarial tables with custom, actuary-defined algorithms to determine premiums.
In insurance companies, working out the correct premium is core to everything. Insurance companies can have lots of different products and their competitive edge comes from how well they structure their offerings and determine the correct premiums based on risk factors. One does not simply rewrite such a thing.
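For a sense of the shape of such a rating engine - a toy sketch where the base rate, factor tables, and multipliers are entirely invented; real actuarial algorithms are vastly more involved:

    from dataclasses import dataclass
    from datetime import date

    # Invented relativities; in a real engine these come from actuarial tables.
    MODEL_FACTOR = {"hatchback": 1.00, "sedan": 1.05, "sports": 1.60}
    PARKING_FACTOR = {"garage": 0.90, "street": 1.15}
    BASE_RATE = 400.0  # annual base premium, made up for this example

    @dataclass
    class Policy:
        model: str
        vehicle_year: int
        parking: str
        territory_factor: float  # loading keyed off the physical address

    def annual_premium(p: Policy) -> float:
        """Multiply the base rate by each premium factor's relativity."""
        age = max(0, date.today().year - p.vehicle_year)
        age_factor = 1.0 + min(age, 20) * 0.01  # toy vehicle-age curve
        return round(BASE_RATE * MODEL_FACTOR[p.model]
                     * PARKING_FACTOR[p.parking]
                     * p.territory_factor * age_factor, 2)

    print(annual_premium(Policy("sedan", 2018, "street", 1.08)))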
Couple of things I thought were a bit interesting about the place:
- Their single server running Universe Basic (with a hot spare I believe) had 4TB of RAM.
- While I was used to the devs being the stars of the show at the consulting house I worked at, at insurance companies it’s the actuaries.
Currently working on migrating a 30-year-old ERP system, written in Progress with no tests, to Kotlin+PostgreSQL.
AI agents don’t care which code they have to read or convert into tests. They just need an automated feedback loop and some human oversight.
Then again, a human won't know all the requirements either; over time, the requirements end up existing only as code.
Is there a delta? Debug and add a unit test to capture the bug. Then fix and move to the next delta.
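A minimal sketch of that loop, assuming you can drive both the legacy system and the port on the same captured inputs and diff the outputs; every path and command here is hypothetical:

    import subprocess
    from pathlib import Path

    CASES = Path("captured_inputs")  # hypothetical: inputs recorded from production

    def run(cmd: list[str], case: Path) -> str:
        """Run one system on one captured input, return its output."""
        return subprocess.run(cmd + [str(case)], capture_output=True,
                              text=True, check=True).stdout

    deltas = []
    for case in sorted(CASES.glob("*.txt")):
        legacy = run(["./legacy_erp"], case)  # hypothetical wrapper for the old system
        ported = run(["./ported_erp"], case)  # the rewrite under test
        if legacy != ported:
            deltas.append(case.name)

    # Each delta becomes a characterization test pinning legacy behavior;
    # the agent (or a human) fixes the port until the suite is green.
    print(f"{len(deltas)} deltas to turn into tests:", deltas[:10])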
And nobody knew how the whole thing worked. Everyone had their niche of interaction with the system. They would be able to shave an insane percentage off expenses (in the form of employees whose jobs exist for no real reason), but the switching costs would also be immense.
I sometimes wonder what became of their company. The system's complexity was far beyond what any one person could grasp; they had no in-house devs, and they'd need people with the competency to judge which competency they need.
They wouldn't hang around here though.
https://www.reddit.com/r/cobol/
If you're ever really bored, search around the HN archives to find out how I accidentally founded that community as a result of a joke. :-)
But keep in mind there was no internet, no podcasts, no YouTube. If you needed to learn something you learned it at work, on work time. If something new was introduced, IBM came in and conducted training, at work. For something big you might get sent to an IBM training center for a week. There was no need for (and no real way to do) any learning on your own outside of work.
Lots of batch jobs running at night. Their alert system is an actual human who calls my mom when jobs fail in the middle of the night.
It's high paying for the city they live in, but not high paying for software development. They will both have full retirement and healthcare for life, assuming the government can fulfill it. They are both fully remote since COVID too.
She's also worked for state lottery, teacher's retirement system and DOT.
edit: she says they have a SQL database, but mostly store data in IBM IMS
They really are. I had a part-time coworker who moonlighted at some mainframe job, and he often had another laptop on his desk connected to a z/OS terminal. He would show me some of the jobs and code occasionally too, really fascinating stuff, and he was quite good at it and could navigate quickly.
Color in terminal emulators was one of the main perks of Linux over other Unixes for me at first!
This hasn't been virtualized?
(paraphrased: https://groups.google.com/g/golang-nuts/c/hJHCAaiL0so/m/kG3B...)
Pike genuinely doesn't seem to care for syntax highlighting, though he understands that his is a minority opinion.
Fun fact: the first SMP UNIX implementation ran on top of EXEC 8, the kernel of OS 2200.
"Any configuration supplied by Sperry, including multiprocessor ones, can run the UNIX system."
https://www.nokia.com/bell-labs/about/dennis-m-ritchie/retro...
Edit: https://web.archive.org/web/20150611114648/https://www.bell-...
One of the modules I saw in action was written before the moon landing, by a lady programmer.
Batch jobs, clunky and very verbose programs, nothing interesting. I... hated it.
I was part of a team that was writing web applications that needed to call z/OS transactions. The IBM solution was to use their Transaction Gateway product, which cost a ton and was slow as shit. We developed a framework that used annotations on the Java side to map COBOL records to Java objects and invoke transactions over a TCP socket. Learning how to pack decimals and convert encodings was pretty cool. We ended up with a framework that was at least a zillion times faster than the IBM solution. Left that job though, as the company was in distress and was losing customers (health plans). They eventually folded.
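For anyone curious, "packing decimals" here means COBOL's COMP-3 format: two decimal digits per byte, with the final nibble carrying the sign (0xC positive, 0xD negative). A minimal sketch of the encoding idea in Python - just the concept, not the actual framework described above:

    def pack_comp3(value: int) -> bytes:
        """Encode an integer as COMP-3: digit nibbles plus a sign nibble."""
        digits = str(abs(value))
        if len(digits) % 2 == 0:  # pad so digits + sign fill whole bytes
            digits = "0" + digits
        nibbles = [int(d) for d in digits] + [0xC if value >= 0 else 0xD]
        return bytes((nibbles[i] << 4) | nibbles[i + 1]
                     for i in range(0, len(nibbles), 2))

    def unpack_comp3(data: bytes) -> int:
        """Decode COMP-3 bytes back to an integer."""
        nibbles = [n for b in data for n in (b >> 4, b & 0xF)]
        sign = -1 if nibbles[-1] == 0xD else 1
        return sign * int("".join(str(n) for n in nibbles[:-1]))

    assert pack_comp3(-1234) == bytes([0x01, 0x23, 0x4D])
    assert unpack_comp3(pack_comp3(12345)) == 12345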
On the other hand, these guys generally write pretty neat, lean code that is quick, reliable, and directly responsive to the business. The really fun thing is watching the users fly through the keyboard-only screens, sometimes with muscle memory that is faster than the terminal emulator can update - they're literally working ahead of the screens.
ASCII tables, text only, with F key shortcuts. Hard to learn but blazing fast once you did.
Nothing modern approaches it.
From a technical-cultural perspective it was mostly sulkiness, and a complete and utter lack of embracing the paradigms of distributed computing. Also, like most internal clouds, there were plenty of issues as it was. Initially they just tried to replace mainframe application components 1:1 onto VMs in whatever way, and whenever anything was <100% reliable they complained that our cloud was not able to do it. I had to explain the realities of "cloud" in a very harsh way, under a lot of pressure (I believe not hitting the deadline for switching off the mainframes meant renewal for a year at 40 mil, or thereabouts).
The developers I spoke with at that time, though, were very much the opposite of the move-fast-break-things crowd. Intelligent, but also narrow-minded, I would say.
He’s trying to learn Go now and modernize himself to see if he can get out. I’m trying to help as much as I can. Hopefully, he’ll land a job somewhere else this year.
The recent hacking of BMP shows the risk this creates (poorly paid employee with debts sold his password to hackers).
A legitimate question, but so far not many answers, and they're mostly from people who know people who know COBOL devs. This is to be expected.
Demographically, COBOL devs skew older, and there aren't a lot of graybeards left on HN. This place used to be full of them, and they always had interesting and unusual insights and techniques to share. Those days are long gone.
IMO, Graybeards have largely left HN for a few reasons:
- They're tired of being shouted down by the Reddit-quality ageism that lingers through this forum.
- They're mature enough to no longer be interested in chasing every little tech fad as if their lives depended on it, and that's 90% of what HN has become.
- As most older people do, they have other things in their lives that are more interesting than work. Family. Children. Hobbies. Traveling. Service. The world is full of things more rewarding than being terminally online, or being reminded of your day job.
I applaud your curiosity, but you're standing in a church asking, "Where are all the atheists?" COBOL devs aren't here. And where they are is likely not online.
We need more like this, please.
This was around 1999, and I was building a system for configuring and ordering custom PCs at a large distribution company. One of the feature requirements was that we display inventory over the various options (i.e., there are 373 20G disks in stock, but only 12 30G disks in stock). The idea was that this would let a customer ordering 200 machines know that they should pick the 20G disk if they wanted it now.
Inventory at this company was handled via two systems. There was a normal SQL database with access to a daily snapshot of inventory taken from a mainframe that always had the up-to-date data. With the mainframe taking a while to process queries, we used the less current SQL database for the majority of the UI, but took the time to query the mainframe once a customer was in the shopping cart. Customers might see a change during the flow, but it would at least let them see the most current data prior to committing to a purchase.
The mainframe query itself was implemented by someone else as a COBOL job that produced the required inventory numbers. From my point of view, it was just a limited sort of query conducted over a specialized JDBC driver. (Not necessarily the weirdest aspect of that design.... for a variety of reasons, we did the UI in ASP/VBScript, the back end using Microsoft's JVM invoked via COM Automation, and the SQL database link was via a dubious use of a JDBC/ODBC bridge to connect to SQL Server. It all worked, but not the most solid architecture.)
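The pattern, roughly, in case it helps anyone - browse against the cheap stale snapshot, re-check the authoritative source only once the customer is committing; all of these names are hypothetical:

    def browse_inventory(sku: str, snapshot_db) -> int:
        """Fast path: yesterday's snapshot, good enough for the catalog UI."""
        row = snapshot_db.execute(
            "SELECT on_hand FROM inventory_snapshot WHERE sku = ?", (sku,)
        ).fetchone()
        return row[0] if row else 0

    def checkout_inventory(sku: str, mainframe) -> int:
        """Slow path: ask the authoritative system once the cart matters."""
        return mainframe.query_on_hand(sku)  # hypothetical wrapper around the COBOL job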
==
My only other mainframe experience was working as an intern for a utility company a few years prior (1991-1992). They used CDC Cyber mainframes to run the power grid, using something like 4MM lines of FORTRAN code. The dispatchers themselves interfaced to the system using consoles with 4 19" color displays running at 1280x1024. Heady stuff for 1991. (The real time weather radar screen was cool too, in an age before the internet made it commonplace.)
There’s not much maintenance work. There are very few bugs, as the core applications have been running for decades; most come up in interactions with external services.
Any major development projects are only in service of lowering overall COBOL development time, like transitioning some business logic to database updates.
And there is a decommission plan for the mainframe, so plenty of work helping that team.
We work on Phoenix, the Government of Canada payroll system. If you google it, you'll see some interesting coverage. However, the underlying PeopleSoft ERP itself is rock solid at every other client I've served over the last 25 years. PeopleSoft uses COBOL and SQR, as well as proprietary languages stored in the database: Application Engine and PeopleCode.
Key payroll processes are in COBOL. This is because of its tight integration with the database and the ability to manually control database cursors. We are very database oriented when it comes to performance. Our developers need to know the programming language, but also need a deep understanding of client business processes and SQL optimization. They also work closely with our DBAs to ensure good performance. So our developers are technically proficient in COBOL and a couple of other languages, but also very, very strong in SQL optimization, and they understand the client's payroll rules and can speak intelligently with compensation advisers and payroll processors.
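For the non-mainframe folks, the cursor idea sketched against a generic Python DB-API connection (COBOL's EXEC SQL cursors are the real mechanism; this just shows the batched-fetch discipline, and apply_pay_rules is a hypothetical stand-in for the business logic):

    def process_payroll(conn, pay_period: str, batch_size: int = 5000):
        """Stream rows through a cursor in batches instead of loading them all."""
        cur = conn.cursor()
        cur.execute(
            "SELECT emp_id, gross_pay FROM pay_lines"
            " WHERE period = %s ORDER BY emp_id", (pay_period,))
        while True:
            rows = cur.fetchmany(batch_size)  # keeps memory flat on huge runs
            if not rows:
                break
            for emp_id, gross in rows:
                apply_pay_rules(emp_id, gross)  # hypothetical business-rule step
        conn.commit()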
I personally found that to be true for most COBOL programmers - whereas the typical Hacker News dev seems very technology oriented and moves frequently, the typical COBOL programmer is very business-process aware and integrated with the corporate line of business. They don't move as much for several reasons, but that deep awareness of the client is one of them.
Edit: I should mention, while PeopleSoft can and does work on mainframes, most of my clients are on Windows, Linux, or AIX. COBOL is not quite as mainframe specific as it sometimes seems :-). See e.g. Micro Focus COBOL for a modern multi-platform compiler.
I'm being a little unkind to my Dad. He moved to management fairly early on and didn't really keep up with things.
He taught me a hell of a lot though, and did really know his shit at one point. It worries me how much his skills and understanding have declined over the years.
For some reason I think we’re all drawn to the idea of working with an older language. I wonder why!
- Item Database (SKU, UPC, attributes)
- Stock Status (Sales, Onhands, process sales transactions from POS systems)
- Replenishment (Create vendor and DC orders to stores)
- Reporting (report sales and other performance data by stores, departments, classes, etc.)
- Pricing (make and maintain current and future price changes, send to store when needed).
- Many other applications (15+)
They have been saying they are going to get rid of these applications for over 20 years now. I am still waiting...
There are three flavors of COBOL that I deal with: PeopleSoft delivered, Vendor delivered, and University modified. Most of the work I do in COBOL breaks down to reading the code to determine why a given behavior is observed. Only once (in University modified code) have I needed to make an actual edit. The rest of the time I either modify the flow of information into the COBOL code or I modify the result after the code has run.
Most of our processes are EOD centric, we run a lot of batch jobs (mainly TSO, very little IMS). Integrations are mostly file based but we do both call and expose APIs (“regular REST APIs”) as well as consuming from and producing to Kafka among other things. We integrate with both mainframe and distributed systems on prem as well as “internal” systems hosted on cloud.
We use Git for source control but have a CI/CD solution that is built in house. Quite a lot of other infrastructure is also built in house.
I am mid-30s and on the younger side looking at the mainframe area as a whole at my employer; however, in my team we have several developers in their 20s/30s.
My background is mainly back- and frontend development on the Microsoft tech stack but I have worked, and do work, with whatever is at hand/required. But mostly stuff like .NET and SQL Server on the backend, and Angular/Vue/React on the front end before this.
People joke about old coders brought out of retirement to maintain a dusty COBOL/RPG program, while the reality is that the tooling is simple enough that a young developer could learn it in a month, and master it in less than a year.
Plus, the expertise is not lost after a few years, given the platform focus on incremental improvement, and backwards compatibility.
Many megacorps still run AS/400, and its uptime and performance are legendary.
Edit: Forgot to mention that I was mentored by folks more than twice my age that time.
zTPF runs the airlines, the model for real online transaction systems... Not SQL relational database shopping carts.
Bottom line, everything is Assembler. It's all just bits.
a) the systems are very tightly coupled, like uber-monolith architecture, and it's hard to QA them without end-to-end testing through everything. Good luck getting anyone to refactor them because a1) they're going away and a2) they're so huge. Which leads into...
b) there's 40 years of undocumented business logic in there, or business logic that only exists in code comments. And it still makes billions of dollars a year. So good luck replicating 40 years of tribal knowledge in the new systems.
c) It was written by cowboy coders 40 years ago when the company was a startup, so no one can learn to work on it without first getting hired here. The joke is one of the original architects went and created his own dialect of COBOL.