Can someone explain this? Are they suggesting that (eventually) one engineer can produce 1 million lines of Rust code in a month? Or replace 1 million lines of C code?
Using new "powerful code processing infrastructure"... but would it understand the semantics? Are those semantics clearly documented?
It really just boils down to AI writing a million lines of code.
I actually think it can probably already do something like that where there's a lot of boilerplate code.
How is it wild? On social media I kept seeing people falsely assuming the end goal would require manually reading through a million lines of code. It seemed more like people making up reasons to be mad, or trying to dunk on the author.
Which is absolutely batshit. There's no way that can be reviewed properly, even if it's putting all of the review work on all of the other teams.
This is "lets put our postgres database on blockchain because I think blockchain is cool" level of crap you see in peak bubble.
That’s not to trivialize what a compiler does, but it’s effectively going from a complex form to its building blocks while maintaining semantics.
Changing high level languages introduces fundamentally different semantics. Both can decompose to the same general building blocks, but you can’t necessarily compose them the same way.
At the simplest example, a compiler backend (the part you’re describing) can’t reason about data access rules. That is the domain of the language’s compiler frontend and a fundamental difference between C++ and Rust that can’t just be directly derived.
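For a concrete (if minimal) illustration of the kind of frontend rule I mean, here's a Rust sketch the borrow checker rejects; the equivalent pattern in C++ (holding a pointer into a vector, then calling push_back) compiles without complaint:

    fn main() {
        let mut v = vec![1, 2, 3];
        let first = &v[0];   // shared borrow of v's contents
        // v.push(4);        // rejected: can't mutate `v` while `first` is
        //                   // live -- push may reallocate and dangle it
        println!("{first}"); // borrow ends after its last use
        v.push(4);           // fine now: no outstanding borrows
        println!("{v:?}");
    }

A backend lowering both programs sees the same loads and stores; the aliasing rule lives entirely in the Rust frontend, which is why it can't just be derived from C++ source.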
AI code generation is not deterministic and has no guarantee of behavior, thus requires review unless incorrect code is acceptable.
AI doesn't have to be the thing generating the code, or you could require some kind of proof of equivalence to verify whatever code was generated.
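A hedged sketch of what that equivalence check could look like (all names invented for illustration): run the legacy implementation and the generated rewrite over the same input domain and demand identical outputs. Exhaustive comparison only works for a tiny domain like this 16-bit one; real code would need fuzzing, sampling, or an actual prover.

    // Stand-in for the legacy C routine (in practice, called via FFI).
    fn legacy_byteswap(x: u16) -> u16 {
        ((x & 0x00FF) << 8) | (x >> 8)
    }

    // Stand-in for the AI-generated rewrite under test.
    fn rewritten_byteswap(x: u16) -> u16 {
        x.rotate_left(8) // same result, different implementation
    }

    fn main() {
        // 2^16 inputs is small enough to check exhaustively.
        for x in 0..=u16::MAX {
            assert_eq!(
                legacy_byteswap(x),
                rewritten_byteswap(x),
                "divergence at input {x:#06x}"
            );
        }
        println!("implementations agree on all 65536 inputs");
    }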
They do still review code, but the first wave of layoffs in 2022 mainly hit principal engineers and above because some bean counters said "oh, these are the engineers that are costing us the most per head", so it's kind of the inmates running the asylum now.
And I'll say that their biggest sin was always that their code from the late 90s on was about 20% too clever for their own good. Kind of goes to that classic quip about how it takes twice your brainpower to debug code as it takes to write it, so if you were already maxing out just writing it, then you're not smart enough to debug it. That's half of why features seemed to get a 1.0 release, then get replaced with something else rather than iteratively improved (the other half being FAANG-style internal incentive structures).
We're all seeing the effects of them cleaning house of the weaponized autism that was barely keeping the wheels on the wagon. They do review, but they don't have the ability to do it properly at scale anymore. Which makes rewriting everything even more batshit.
As a third-party developer in the late 2000s, I remember my boss giving me a CDROM binder (binders?) of every single OS release that Microsoft had ever put out. I assume he'd been given it by his developer-relations rep at Microsoft. My team and I used it to ensure our code worked on every MSDOS/Win* platform we cared to target.
I expect that, internally, the Windows team have crazy amounts of resources to implement the most comprehensive regression testing suite ever created. To that extent, at least, you’d be able to tell if the Rust version did what the old code did even if you didn’t read the code itself.
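To sketch what that looks like (hypothetical names, nothing from Microsoft's actual suite): record golden outputs from the old implementation once, then run the rewritten code against them, without ever reading either implementation.

    // Hypothetical rewritten routine under test.
    fn version_string(major: u32, minor: u32, build: u32) -> String {
        format!("{major}.{minor}.{build}")
    }

    fn main() {
        // Golden outputs previously captured from the legacy code.
        let golden = [
            ((10u32, 0u32, 19045u32), "10.0.19045"),
            ((11, 0, 22631), "11.0.22631"),
        ];
        for ((maj, min, build), expected) in golden {
            assert_eq!(version_string(maj, min, build), expected);
        }
        println!("all golden regression cases pass");
    }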
That hasn't been nearly the same goal for decades now.
For instance, Crysis literally won't run on win10 or later anymore.
On top of that, security bugs aren't the kind of thing you can automate away during a rewrite that no one has the bandwidth to actually review.
That thing we don't have yet?
In a time when some of Satya Nadella's chickens are coming home to roost, with Windows being the most obvious example and most of their AI efforts quickly approaching the same fate, it's good to laugh at their stupidity as a consolation prize.
In the past, Microsoft fucked up so many times, but they had absolute dominance of the market and a huge pool of talented, knowledgeable people capable of making them try again and win. Times have changed; many have retired or been laid off to make way for the next round of "cheap young" talent in the form of contract workers.
Now that they have the Cloud, I'm not so sure the Windows division can turn this turd around this time. Xbox has tangentially been the canary.
The rest of Microsoft might go the same way. I guess now I know what it felt like looking at IBM in 1989.
A couple of quotes from the article above:
"WebView2-based Microsoft Teams consistently uses 1-2GB of RAM while doing nothing. Microsoft likely doesn’t know how to make these web apps use fewer resources, so it’s instead moving Teams calling to a separate process to reduce crashes."
"But Teams is not the only web app causing trouble when RAM prices are about to soar, as we also have WhatsApp. When WhatsApp debuted on Windows, it was an Electron app. However, Meta later upgraded it to WinUI/XAML (also known as native code on Windows), and WhatsApp eventually became one of the best apps [... using] less than 200MB of RAM and had smoother animations and faster load times."
It seems that most developers these days focus on web-exclusive technologies and try to force desktop and system level programs into this paradigm.
C, C++, and C# programmers seem to be as rare as hen's teeth today.
Are colleges and universities not teaching these languages anymore? Is this a symptom of 'cloud-first' strategies where it's easier to 'just use JavaScript' for everything, or perhaps developer laziness/reluctance to learn another 'lower level' language?
I really don't understand the appeal of web-centric languages like JavaScript and TypeScript in the desktop and systems realm. They lack a standard library (which genuinely scares me: supply chain attacks...), which likely contributes to the RAM consumption issue as developers just keep piling on packages for one specific function missing from another imported library. And they can't be natively compiled to small binaries that don't depend on a runtime or a bulky embedded interpreter.
Yes, C# technically falls afoul of this (in .NET), but C# at least has a standard library that is comprehensive and is supported by an enterprise (Microsoft, for all its faults), not random developers on the internet.
Microsoft allowing key components of Windows 11 to be rewritten in web-wrappers is only going to drive people further into Linux, as the RAM affordability crisis continues.
> My goal is to eliminate every line of C and C++ from Microsoft by 2030. Our strategy is to combine AI and Algorithms to rewrite Microsoft’s largest codebases
From his follow-up:
> It appears my post generated far more attention than I intended… with a lot of speculative reading between the lines.. Just to clarify… Windows is NOT being rewritten in Rust with AI.
So either he doesn't think that Windows is written in C/C++, it's not "from Microsoft", or he doesn't know what "reading between the lines" means, because those literally are the words he said. Sure, he also said "and algorithms", but I'd argue that inferring that to be a significant difference would require a lot more reading between the lines.
I guess he could also quibble that "eliminating every line of C and C++ from Microsoft" was supposed to mean new lines of code being written rather than existing ones, but that's both not the way most people would read it (if I said I wanted to eliminate all water from the planet, most people wouldn't think I meant I was eliminating rain but leaving the oceans alone) and a bit dubious from a technical perspective (since leaving the existing Windows codebase intact would make it pretty hard not to at least occasionally need to write a new line of code in the existing language).
Further, this is not a random speculative post; it is an announcement for a job opening on the poster's team.
I'll own up to not considering that when I wrote my comment, but I still think discussing Microsoft's seemingly headfirst dive into massive AI generation of code is entertaining, even if it's not really as important (or important at all) as it would be if this were a post from the CEO.