Must be the engineers who write the requirements and set the unrealistic deadlines that lead to such issues.
"How To Get The Most Out Of Vibe Coding | Startup School " => https://www.youtube.com/watch?v=BJjsfNO5JTo&t=494s
I like his advice on downloading relevant documentation and putting it in the code base. That makes sense to me for targeted use cases.
20-year-old me had rm -fr root access to way too many systems.
I don't think it's much different today.
If anything, I think the youngsters will learn faster from their mistakes because they already have a good mentor for the easy stuff and will get to grind on the hard stuff sooner.
Some observations:
Initial velocity was high. It felt like I was getting things done. This was exciting and satisfying.
The code I wrote was structurally unsound as it ended up having no overall plan. This became more and more evident as the prototype was completed. Yes, it was a prototype, but usually I use prototypes to think about structure. (Choosing the right way to structure a problem is a key part of solving the problem)
My retention of new knowledge was terrible. Ditto for developing any deep understanding of the tools and libraries I was using. I had skipped the entire process by which I usually learn and just had the LLMs tell me solutions. This provided a poor basis for turning the prototype into production code because I had to catch up on all the thinking that should have been part of producing the prototype. And I had to sit down and read the documentation.
LLMs are only as good as your prompts. You actually need to know quite a bit in order to formulate good prompts. Being a fairly experienced programmer (and also having some knowledge of LLMs and a former career in web search), I had significant advantages novices do not have.
Now imagine a novice who lacks the experience to see the shortcomings of their code (hey, it works doesn't it!?) and the ability to introspect and think about why they failed.
Half of the time the LLM would lead me astray, pursuing paths that resulted in poor solutions when better solutions would have been obvious if I'd just read the documentation. It is easy to become focused on exhausting the paths the LLM sent you down in the first place, poking it until it gives you something that works.
I have nearly 40 years of programming experience. It scared me how stupid relying too much on an LLM made me. Yes, it got me started quickly and it feels faster. However, the inflection point that comes some time into the learning process, where things start to click, didn't materialize. Mostly I was just poking the LLM in various ways to make it spit out solutions to tiny parts of the problem at a time.
I still use LLMs. Every day. But I use them more as a fancy search engine. Sometimes I use them to generate ideas - to try to detect blind spots (solutions I don't know about, alternative approaches, etc.). Having been burnt by LLMs hallucinating, I consistently ask them to list their sources and then go and look at those.
LLMs are *NOT* mentors. A mentor isn't someone who does the thinking and the work for you. A mentor is also not an alternative to reference material or the means by which you find information. You're expected to do that yourself. A mentor is not someone who eliminates the need to grind through things. You have to grind through things to learn. There is no alternative if you are going to learn.
A mentor is someone who uses their experience and insight to help your personal growth.
Maybe? Back when I had to troubleshoot coaxial network terminations uphill both ways in the snow, we had to learn how things actually work (e.g., what's in a TCP header, how to read an RFC), and that made debugging things a little more straightforward.
I'm pretty insulated from young developers these days, but my limited experience is that they might know the application, presentation and maybe session layers (if using the old OSI model) but everything below that is black magic.
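To make the "what's in a TCP header" point concrete, here is a toy sketch (mine, not the commenter's) that unpacks the fixed 20-byte header from RFC 793 using only the Python standard library; the hex bytes are a made-up SYN segment, not captured traffic.

    import struct

    # Hypothetical captured segment: src port 443, dst port 52000, SYN flag set.
    raw = bytes.fromhex("01bbcb2000000001000000005002ffff00000000")

    # !HHIIHHHH = src port, dst port, seq, ack, offset+flags, window, checksum, urgent
    src, dst, seq, ack, off_flags, window, checksum, urgent = struct.unpack("!HHIIHHHH", raw)

    data_offset = (off_flags >> 12) * 4   # header length in bytes (here: 20)
    flags = off_flags & 0x01ff            # NS CWR ECE URG ACK PSH RST SYN FIN bits
    print(src, dst, seq, data_offset, bin(flags))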
Sometimes people make mistakes, sometimes people are incompetent, sometimes managers suck, sometimes it's a multi-layered issue.
It is not even about being technical. Have a person put data into a spreadsheet and you can end up with so many errors if the procedure for doing it is bad.
Look, I've been an executive and a management consultant for a long time now (started as a sysadmin and developer), and it's quite often the case that everything was late (decisions, budgets, approvals, every other dependency) but for some reason it's ok that the planned 4 months of development is now compressed into 2.5 months.
I have been involved to some degree or another in probably close to 300 moderately complex to highly complex tech projects over the last 30 years. That's an extremely conservative number.
And the example I describe above is probably true for 85% of them.
Whole courses are built around forensically dissecting every error in major systems. Entire advanced fields are written in blood.
You probably don't hear about it often because the analysis is too dense and technical to go viral.
At the same time, there's a serious cultural problem: technical expertise is often missing from positions of authority, and even the experts we do have are often too narrow to bridge the levels of complexity modern systems demand.
Answer: we do not know either, but this is the standard response so that companies (or governments, or whoever this concerns) are absolved of any responsibility. In general, it is not possible to know for a specific case until a proper investigation is carried out (which is rarely the case). But most of the time, experience says that it is company policies/procedures that either create circumstances that make such errors more probable, or simply allow them to happen too easily due to a lack of guardrails. And usually that comes down to a push for "shipping products/features fast" or similar, with little concern for anything else.
It could be a different case here, but seeing that this is about Oracle, and keeping in mind that Oracle is really bad at taking accountability for anything going wrong, I doubt it (very recently they denied, until the very last moment, a hack of their systems and the leak of data belonging to companies that use them).
I would regularly write massive update/insert statements on production DBs to fix issues.
So, yeah, I'd imagine this was the engineer's fault.
Diamond might actually be better: low surface energy means a low coefficient of friction, so it would be much easier to clean. It would still suck the heat right out of your cheeks, though.
Realistically, porcelain or other ceramics are probably the ideal material.
https://www.guggenheim.org/exhibition/maurizio-cattelan-amer...
I'd say poor process management. Why is an engineer even deleting critical storage (I take that to mean they were deleting something off a file system)? Perhaps they were dropping a database, but you wouldn't do it like that in a critical environment. You'd disable database access first, and then after some time, weeks, you'd drop the database, after taking a final backup.
It could also be disconnecting a SAN, deleting a storage pool, something like that, but your process should say: check for read activity, take the storage offline, do anything non-destructive first, and only later, once you've verified that everything runs without this resource, do you delete.
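A minimal Postgres-flavoured sketch of that staged sequence (the psycopg2 usage, DSN and database name are illustrative assumptions, not details from this incident): block new access first, watch for stragglers, take a final backup, and leave the destructive step for a later, separately approved change window.

    import subprocess
    import psycopg2

    ADMIN_DSN = "dbname=postgres user=admin"   # hypothetical admin connection
    TARGET_DB = "legacy_app"                   # hypothetical database being retired

    conn = psycopg2.connect(ADMIN_DSN)
    conn.autocommit = True
    cur = conn.cursor()

    # Step 1: non-destructive -- stop new sessions from connecting.
    cur.execute(f"REVOKE CONNECT ON DATABASE {TARGET_DB} FROM PUBLIC;")

    # Step 2: observe for days or weeks -- is anything still using it?
    cur.execute("SELECT count(*) FROM pg_stat_activity WHERE datname = %s;", (TARGET_DB,))
    print("active sessions:", cur.fetchone()[0])

    # Step 3: final backup before anything irreversible.
    subprocess.run(["pg_dump", "-Fc", "-f", f"{TARGET_DB}.dump", TARGET_DB], check=True)

    # Step 4: only in a later change window, with sign-off, do the irreversible bit.
    # cur.execute(f"DROP DATABASE {TARGET_DB};")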
At previous jobs I've worked with healthcare systems. You have processes, you follow those processes to the letter, and you never delete anything as your first step. Deleting is almost always done by going into read-only mode and ageing out the data.
The fact that the recovery time is four days tells me that no one followed a single procedure. There should be a written step-by-step plan, including recovery and a risk assessment, and when the change manager sees "Recovery time: five days" they will ask questions and stop you.
This is probably the only answer less sexy than technical debt.
When I worked at a megacorp as a dev, I had near zero say in such purchase decisions. I had to work with what I was given. Thankfully I work for a much smaller shop now. Better pay and much better decision autonomy.
As a sysadmin/dba I get to handle the nasty side of Oracle: the bugs, the patches that fail, the redundant tools, the wordy documentation that always feels like a never-ending advert.
Our small company was building enterprise things for large clients. And one of them wanted Oracle on Windows Server. They also wanted a failover setup. How hard could that be?
Now I hate Oracle. I hate Oracle consultants. I despise the ignorance.
My favourite bit was the Oracle uninstall procedure. I had something like 4 pages printed just for that case.
Oracle bad. Postgres good.
That is only partially true. Oracle has a wide portfolio of products, and the "wine & dine the execs" routine is the sales cycle for software like the Oracle E-Business Suite ERP and Oracle Health (Cerner). E.g. it's the hospital CFO & CIO who are the true "customers" of Oracle Health, not the frontline doctors and nurses who use it.
However, for the Oracle RDBMS database ... it was often the developers that chose it over competitors such as IBM DB2, Sybase, Informix, MS SQL Server.
In the late 1980s and early 1990s, a lot of us devs did a "bake off" with various databases and Oracle often won on technical merit. E.g. Oracle had true row-level locking but MS SQL Server before v6.5 was page-level locking. And the open source alternatives of MySQL and PostgreSQL at that early timeframe were not mature enough to compete with advanced Oracle features such as Parallel Query Execution and Recovery from a Standby database.
E.g. the C Language programmer Shel Kaphan at Amazon chose Oracle in 1994: https://www.linkedin.com/posts/jpcharles_in-1994-amazons-fir...
(that anecdote cited this deep link: https://www.youtube.com/watch?v=u3qIWN-ZIPk&t=1h11m56s)
It took Amazon 25 years to finally migrate off of all Oracle databases: https://www.google.com/search?q=oracle+shuts+off+last+oracle...
So young devs today who aren't aware of history will wonder why Amazon ever got locked into Oracle in the first place?!? It was because in 1994, Oracle db was a very reasonable technical choice for devs to make.
So it's likely that, for a specific task, the Oracle solution is crap, but Oracle has crap for everything, so the Oracle sales droid can sell a "one throat to choke, one check to write" policy to a company that likely has technology problems but produces a "not technology" product.
Amazon is moving into this space, and their crap is even stickier than Oracle's...
(That was not an outage, though — as far as I understand the system was working, it just didn’t actually work...)
So I'm not surprised that any software not purpose-built for that market would fail. And we know Oracle is not investing in R&D for Millennium, since all efforts are being put into their forthcoming AI-based EHR - whatever that actually means....
Also - Cerner software in general allowed hospitals to freely and completely fuck up their own architecture. Sometimes irreparably.
If anyone has details about how this happened, I'd love to hear them.
buyucu•9h ago
jackvalentine•8h ago
chickenzzzzu•8h ago
Theory: A society collectively buys Oracle when it no longer views armed revolution as acceptable.
Scoundreller•7h ago
selivanovp•8h ago
And after a few years you find yourself in a situation where you have already paid Oracle so much and integrated it so deeply that switching to any alternative is a massive pain, and in most cases it's safer and easier to keep paying Oracle.
goodpoint•7h ago
netdevphoenix•6h ago
1718627440•4h ago
netdevphoenix•47m ago
gonzo41•8h ago
8fingerlouie•8h ago
COBOL is in the same category. When invented, it was the absolute easiest programming language to learn and use, so of course it gained popularity.
It then turned out to be rather good at what it did, along with running on hardware that "never fails", so most places didn't even think about replacing it until sometime in the 90s.
Also keep in mind that the reason companies are migrating away from COBOL is not due to the language as much as due to young people not taking an interest in COBOL and Mainframes, making it hard to recruit new talent.
Even then, a migration away from a typical mainframe is a huge task. In most cases you're talking 50-60 years of "legacy" (still running, still working, still updated) code, and replacing a system that has evolved for half a century is not an easy undertaking, at least not if you plan on getting it 100% right the first time.
pastage•8h ago
Having only fleeting professional experience with COBOL during one summer, my view is that it is a mix of a data-analyst job and programming, where the programming is horrible and the report-making is OK, though archaic. As long as you are modifying processes that already exist it is not so bad, but the developer experience was horrible.
With all that said, I actually liked some of the ideas in COBOL, but it is an extremely niche language that does not serve you at all in the real world.
decimalenough•8h ago
But yeah, if you're looking to code up a progressive web app or next blockbuster MMORPG, I wouldn't recommend COBOL.
sofixa•7h ago
It's all a matter of age and maturity. Nobody starting an airline, or financial company, or "hardcore data processing" today even bothers considering a mainframe or COBOL.
8fingerlouie•7h ago
oblio•5h ago
I wouldn't portray COBOL as some sort of magic "hardcore" pixie dust for anything.
pastage•5h ago
netdevphoenix•6h ago
It is more like young people not wanting to:
- throw their careers out of the window by pigeonholing themselves into zombieware tech
- experience high levels of stress trying to debug code older than their parents, writing code that can't be unit tested and pushing said code to production
consp•5h ago
Isn't that just "move fast" these days?
abduhl•4h ago
baridbelmedar•8h ago
And let's be honest, a lot of folks in IT aren't exactly top performers and don't seem to care all that much. It's really the developers you find on forums like this who are genuinely passionate. You're not likely to bump into the people actually buying those big Oracle or IBM systems around here though :)
vlovich123•7h ago
jeroenhd•6h ago
Sure, Postgres beats OracleDB, but Postgres doesn't integrate as well with Oracle Fusion and you need to migrate the code yourself. They're like SAP: they're big enough that you can make a career just out of configuring their software packages and make good money while doing so.
It's expensive and certainly not the best, but it's reasonably stable and has a huge company backing it. Oracle won't disappear any time soon, and they're not as likely as Google or Microsoft to shut down their services with only a few years' notice.
In some countries, Oracle is also very good at doing what Google and Microsoft do to students. The Brazilian programmers I've spoken to specifically learned OracleDB when they were taught relational databases. They learned to program in Java, and I'm sure Oracle also sponsored other parts of the professional tooling they got to use for free. Microsoft, on the other hand, didn't seem as generous towards their educational institutions (no free MS tooling for their schools like they offer over here). If all you know is Photoshop/Windows/Maya/OracleDB/iOS, you're going to look for jobs where you can use Photoshop/Windows/Maya/OracleDB/iOS, and employers looking for cheap juniors will need to offer Photoshop/Windows/Maya/OracleDB/iOS to make the best use of them.
bn-l•6h ago
Oracle have a relatively big presence here, and there's a comfortable "mates" system that runs Australia (soft bribery).
chickenzzzzu•6h ago
jeroenhd•6h ago
I was taught C# in uni for very similar reasons except the entire uni ran on Windows and the Microsoft platform, which made doing assignments on Linux rather inconvenient. With the status of dotnet core, I'd say Java finally has a good competitor when it comes to teaching OOP languages.
knifie_spoonie•6h ago
newsclues•6h ago
anonzzzies•6h ago
parthdesai•6h ago
Are we sure? I'm by no means a DBA, but DBA at our company (who is freaking smart btw) said if money wasn't an issue, he would actually go with OracleDB.
ie21•6h ago
I work with engineers and technical managers with 25+ years of experience, building and maintaining serious business software "you could run a country with". People here build React web apps, do scientific research, or work for a SaaS provider - a completely different world from building highly complex, regulated, mission-critical software that is supposed to run for decades and be supported at this level.
collingreen•5h ago
A more useful line of conversation might be discussing the vastly different requirements and environments (both physical and bureaucratic) that span our industry. Right now I'm a one man dev team slinging multiple products as fast as I can, trying to find revenue as the runway burns up. It would be silly to think everyone is in my same position with my same tradeoffs and I don't expect that to be particularly controversial.
If you have some good insight about when Oracle products are particularly well suited to the task I think many folks would love to read and discuss it. If you just want to act like you're the only one taking your job seriously then I suggest you just save your keystrokes and everybody's time.
jeroenhd•6h ago
parthdesai•3h ago
Again, I haven't worked with OracleDB at all, and my Postgres knowledge is limited, but making assumptions without experience of both systems isn't fair to either DB, IMO.
anonzzzies•6h ago
metadata•5h ago
Postgres won't even let you force an execution plan and it ignores hints (yes, there is an extension for that), so your optimized query can at some point just 10x in execution time and increase load in production when the plan changes.
In Oracle, I am told, you can prioritize certain queries at certain times of day - it's crazy what it can do. Yes, it's slow and expensive. If you have money to throw at the problem, it's fast and it solves your problem, whatever the scale. Their Exadata cluster, for example, has a wickedly fast storage layer that pre-filters the data it sends to the database.
Of course, I despise their business practices - especially the abuse of customers via audits. As a database, it absolutely has its place regardless of lobbying, corruption, and whatever else they are doing.
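On the hints point above, a small sketch of what that extension route looks like (assuming psycopg2, the pg_hint_plan library installed on the server, and a hypothetical orders table with an index orders_customer_idx): stock Postgres treats the /*+ ... */ block as a plain comment, while a session that has loaded pg_hint_plan reads it as a planner directive.

    import psycopg2

    conn = psycopg2.connect("dbname=shop")     # hypothetical connection string
    cur = conn.cursor()

    query = """
        /*+ IndexScan(orders orders_customer_idx) */
        EXPLAIN SELECT * FROM orders WHERE customer_id = %s
    """

    # Stock Postgres: the hint is just a comment, the planner picks whatever it wants.
    cur.execute(query, (42,))
    print("\n".join(row[0] for row in cur.fetchall()))

    # With pg_hint_plan loaded into this session, the same comment pins the plan.
    cur.execute("LOAD 'pg_hint_plan';")
    cur.execute(query, (42,))
    print("\n".join(row[0] for row in cur.fetchall()))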
akoboldfrying•3h ago
Finally, an actual technical argument. I agree that PostgreSQL's absolute insistence on trusting the query optimiser to Do The Right Thing is weird and annoying (despite being a sound general strategy). It even seems to contradict its own general spirit of being on the whole extremely customisable -- you can make your own data types, operators, indexing data structures, complete scripting-language plugins... but not, ya know, a way to 100% guarantee that this here query will use this here execution plan.
Spooky23•4h ago
My employer probably spends more money on databases for our learning management system than we do for one of our main customer-facing apps with thousands of concurrent users. It's literally a tally of training courses.
buyucu•5h ago
ManBeardPc•4h ago
Luckily, more and more customers have switched to Postgres and I no longer have to deal with it.