Do you think that maybe it’s possible the OP has a problem with the program and that crying racism whenever someone brings it up might actually be hurting your argument?
I'm radically pro-immigrant. I want the smartest people from around the world to come work here. I want to unshackle them from their corporate sponsors. The current system is unfair to immigrants (who are bound like serfs to their workplace) and to citizens (who lose jobs because corporations prefer serfs).
But, you know, a company that has invested billions in AI selling the idea that AI will be replacing labour is not surprising.
AI has provided a lot of unique value, but despite the countless headlines stoking fear of mass job loss, there is still little substance to the claims that it can automate anything but the most menial of jobs. Until we can point the finger directly at AI as the cause of rising job-loss numbers, and not at other unrelated economic factors, this all just smells of fear mongering with a profit incentive.
1. The existing codebase gets worse
2. The existing employees work more
3. The salaries stay flat
I’d argue that 1 is irrelevant provided the system continues to extract profit at the same or greater margin
Amazon lives and dies by not caring about #2 so that’s constant
#3 is desirable from Jassy and the board’s perspective
Seems like exactly what I’d expect from Amazon
AI is for coding velocity like electricity is for better room lighting.
We haven't seen the nature of work after AI yet; we're still in a nascent phase. Consider every single white-collar role, process, and workflow in your organization up for extreme disruption during this transition period, and it will take at least a decade to even begin to sort out.
Maybe startup development will significantly accelerate with AI churning out all the boilerplate to get your app started.
But enterprise development, where the app is already there and you’re building new features on top of a labyrinthine foundation, is a different beast. The hard part is sitting through planning meetings or untangling weird system dependencies, not churning out net new code. My two cents anyway.
Though AI will probably just proactively add features and open PRs, and people can choose.
Which I expect will be the gist of management consulting reports for the next decade.
If human decision-makers become the bottleneck... eventually that will be reengineered.
I'm fascinated to imagine what change control will need to look like in a majority-AI scenario. Expect there will be a lot more focus on TDD.
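To make the TDD angle concrete, here is a minimal sketch of what such a gate might look like, assuming a plain pytest workflow: the tests are written and reviewed first as the human-owned change-control artifact, and an AI-generated implementation is only merged once they pass in CI. The `parse_order_total` function and its behavior are hypothetical, purely for illustration.

```python
# A TDD-style gate: the tests below are written (and reviewed) first;
# an AI-generated implementation is only merged once they pass in CI.
import pytest


def parse_order_total(raw: str) -> float:
    """Hypothetical function the tests were written against.

    In the workflow sketched here, this body is what the AI agent
    produces; the tests are the human-owned change-control artifact.
    """
    cleaned = raw.strip().lstrip("$").replace(",", "")
    try:
        return float(cleaned)
    except ValueError as exc:
        raise ValueError(f"unparseable order total: {raw!r}") from exc


def test_handles_currency_symbol_and_commas():
    assert parse_order_total("$1,234.50") == pytest.approx(1234.50)


def test_rejects_garbage_input():
    with pytest.raises(ValueError):
        parse_order_total("not a number")
```

The point of the sketch is that review effort shifts from reading the diff to owning the tests that constrain it.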
I don’t know whether LLMs are particularly smart, whether they’re capable of (or will definitely end up) replacing humans at anything, or whether they’ll lead to better work. But I can already tell that their inherent lack of an ego DOES accelerate things at enterprises, for the simple reason that the self-imposed roadblocks above stop happening.
We are also explicitly NOT allowed to make any code changes that aren’t part of a story that our product owner has approved and prioritized.
The result is that we scrape together some stories to work on every sprint, but if we finish them early, we quickly run into red tape and circular conversations with other “decision makers” who need to tell us what we’re allowed to do before we actually do anything.
It’s fairly maddening. The whole org is hamstrung by a few workaholic individuals who control decision making for several teams and are chronically unavailable as a result.
I’ve seen this sort of thing happen at other big enterprises too but my current situation is perhaps an extreme example of dysfunction. Point being, when an org gets tangled up like this, LLMs aren’t gonna save it :)
I’ve already witnessed a certain big tech company that started to move much faster by removing TPMs & EMs across the board, even without LLMs to “replace” them. With LLMs, you need even fewer layers. Then eventually fewer middle-of-business decision makers. In your example, it’s entirely possible that the function of making those components could be entirely subsumed by a single AI bot. That’s starting to happen a lot in the devops space already.
All that said, I doubt your business would benefit from moving faster anyway - most businesses don’t actually need to move faster. I highly recommend the book “Bullshit Jobs” on this matter. Businesses will just need fewer and fewer people.
The upside is that both of these things are the kind of tasks that are probably good to give to AI. I've always got little UI bugs that bother me every time I use our application but don't actually break anything, won't impact revenue, and thus never get done.
I had a frontend engineer, who, when I could just find a way to give him time to do whatever he wanted, would just constantly make little improvements that would incrementally speed up pageload.
Both of those cases feel like places where AI probably gets the job done.
So, to clarify – developers want to make improvements to the codebase, and you want to give that work to AI? Have you never been in the shoes of making an improvement or a suggestion for a project that you want to work on, seeing it given to somebody else, and then being assigned just more slog that you don't want to do?
I mean, I'm no PM, but that certainly seems like a way to kill team morale, if nothing else.
> I had a frontend engineer, who, when I could just find a way to give him time to do whatever he wanted, would just constantly make little improvements that would incrementally speed up pageload.
Blows my mind to think that those are the things you want to give to AI. I'd quit.
The ability to untangle old bad code and make bigger broader plans for a codebase is precisely where you need human developers the most.
I wasn't clear, but getting that AI to admit that it has made an error and to actually correct it is like trying to fit a square peg into a round hole. It will take the blame and continue as if nothing needs to change, no matter what prompts you send it.
Amazon has a document-writing culture; all of those documents will be written by AI. People have built careers on writing documents. Same with operations, it's all about audit logs. Internally, there are MCPs that have already automated TPMs/PMs/Oncall/maintenance coding. Some orgs in AWS are 90% foreign; there is fear about losing visa status and going back, and the automation is just beginning. Sonnet 4 felt like the first time MCPs could actually be used to automate work.
A region expansion scoping project in AWS that required detailed design and inspection of tens of code bases was done in a day; it would usually require two or three weeks of design work.
The automation is real, and the higher-ups are directly monitoring token usage in their org, pushing senior engineers to increase Q/token usage metrics among low-level engineers. Most orgs have a no-backfill policy for engineers leaving; they are supplementing staffing needs with Indian contractors, the expectation being that fewer engineers will be needed in a year's time.
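For readers who haven't met the MCPs mentioned above: an MCP (Model Context Protocol) server simply exposes tools that an LLM agent can call. Below is a minimal sketch using the open-source MCP Python SDK (FastMCP); the `summarize_oncall_tickets` tool and its canned output are hypothetical and say nothing about Amazon's internal servers.

```python
# Minimal MCP server exposing one tool an LLM agent could call.
# Requires the open-source MCP Python SDK: pip install mcp
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("oncall-helper")


@mcp.tool()
def summarize_oncall_tickets(team: str, days: int = 7) -> str:
    """Return a plain-text summary of recent tickets for a team.

    Hypothetical example: a real server would query a ticketing
    system here; this just returns placeholder text.
    """
    return f"{team}: 3 open tickets in the last {days} days (placeholder data)"


if __name__ == "__main__":
    # Runs over stdio so an MCP-capable client/agent can attach to it.
    mcp.run()
```

An agent wired to a handful of tools like this is the shape of the "automated TPM/oncall" workflows described above.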
pryelluw•4h ago
Isn’t this their general approach since forever?
nitwit005•3h ago
Somehow they want to act like they are making a shift, rather than say they were ahead of the trend.
chneu•3h ago
The wording changes, the intention doesn't.
If they could pay you nothing they would.
ethbr1•2h ago
But I expect the increasing income stratification of the 10s+ is a harbinger that we're running out of high-paying jobs for the number of people who are qualified for them.
And the window is closing for countries to agree to something like a structural tax on AI with benefits going to society to address the ills.
Absent that: further stratification, more employee-less businesses, and not a great future.
827a•3h ago
Really bad look and poor leadership from Jassy. There's a good way to frame adoption of AI, but this is not it.
usefulcat•2h ago
For 6/17, the S&P 500 was down 0.84%, QQQ (Nasdaq stocks) was down 0.98% and AMZN was down 0.59%.
AMZN slightly outperformed the market today.