Edit: maybe Cursor forced this and Microsoft is taking its choice to open-license VS Code on the chin. Will be interesting to see the strategy with Visual Studio going forward.
Makes sense that MS would partner with Anthropic, since Anthropic's tool use for productivity (Claude Code) seems superior. I personally rarely code with ChatGPT, almost strictly Claude.
Even the benchmarks don’t seem all that helpful.
I often would use Claude to do a “make it pretty” pass after implementation with GPT-5. I find Claude’s spatial and visual understanding when dealing with frontend to be better.
I am sure others will have the exact opposite experience.
On the other hand, I wish ChatGPT had GitHub integration in Projects, not just in Codex.
I can confirm Sonnet is good for vibe coding but makes an absolute mess of large and complex codebases, while GPT-5 tends to be pretty respectful.
Overall, I think Google has a better breadth of knowledge encoded, but Anthropic gets work done better.
Surely Microsoft's expertise these days is in cross-selling passable but not best-in-class products to enterprises who already pay for Microsoft products.
It says something about how they view the AI coding market, or perhaps the level of the gap between Anthropic and OpenAI here, that they've gone the other way.
I had the same experience recently with:

- Ticketmaster
- Docusign
- Vercel
Probably a handful more I forgot.
I believe the main stated reason is that it prevents fraud.
But I see a deeper motive: phone numbers are high-friction to change, so our “real” numbers become hard-to-change identity codes that can easily be used to pull tons of info about you.
You give them that number and they can immediately look up your name, addresses, age, and tons of other mined info connected to you. Probably credit score, household income, etc.
Phone numbers carry tons of “metadata” you provide without really knowing it, much like the Exif data in a photo can reveal a lot about your location and device.
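To make the analogy concrete, here is a minimal sketch of dumping that Exif metadata with Pillow (assumes a reasonably recent Pillow; "photo.jpg" is just a placeholder file name):

    # Minimal sketch: dump Exif metadata, including GPS tags, from an image.
    # Assumes Pillow >= 9.4; "photo.jpg" is a placeholder, not a real file.
    from PIL import ExifTags, Image

    img = Image.open("photo.jpg")
    exif = img.getexif()

    # Device-level tags: Make, Model, Software, DateTime, ...
    for tag_id, value in exif.items():
        print(ExifTags.TAGS.get(tag_id, tag_id), value)

    # GPS sub-IFD: latitude/longitude, if the camera embedded them
    gps = exif.get_ifd(ExifTags.IFD.GPSInfo)
    for tag_id, value in gps.items():
        print(ExifTags.GPSTAGS.get(tag_id, tag_id), value)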
Plenty of free VoIP services exist, including ones that can receive SMS.
Even when the free providers are manually blocklisted, one-time validations can be defeated with private numbers on real networks/carriers for under a dollar per validation, and repeated ongoing validations can be handled with rented private numbers on real networks/carriers for under ten dollars per month.
The rent-an-SMS services that enable this are reachable through a web interface that accepts connections from Tor, VPNs, etc. There is also no guarantee that the telecom provider's location records for the IMEI tied to that phone number are anywhere near the end user's real geographic location, so this isn't even helpful for law enforcement purposes where telecom records can be subpoenaed.
This "phone number required" practice exists for one primary reason: for businesses to track non-fraudulent users, data mine their non-fraudulent users, and monetize the correlated personal information of non-fraudulent users without true informed consent (almost nobody reads ToS's, but many would object to these invasive practices if given a dialogue box that let them accept or decline the privacy infringements but still allowed the user to use the business' service either way).
Sometimes there is also a secondary reason: it lets the business cheap out on developer costs by cutting corners on proper, secure MFA. There is no need to implement modern passkeys, RFC-compliant TOTP, FIDO2, or U2F when you can just put your users in harm's way by pretending that SMS is a secure channel, rather than one easily compromised by even common criminals using SS7 attacks, which are no longer relegated to nation-state actors.
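For contrast, RFC 6238 TOTP fits in a few lines of standard-library Python; a minimal sketch (the base32 secret below is a made-up example, not a real credential):

    # Minimal RFC 6238 / RFC 4226 TOTP sketch, standard library only.
    import base64, hashlib, hmac, struct, time

    def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
        key = base64.b32decode(secret_b32, casefold=True)
        counter = int(time.time()) // period          # RFC 6238 time step
        msg = struct.pack(">Q", counter)              # 8-byte big-endian counter
        digest = hmac.new(key, msg, hashlib.sha1).digest()
        offset = digest[-1] & 0x0F                    # dynamic truncation (RFC 4226)
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    print(totp("JBSWY3DPEHPK3PXP"))  # matches what an authenticator app shows

None of that requires an SMS gateway, a phone number, or any trust in the carrier network.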
They lost me when they expired my money and then tried to take more without asking.