If it were preventing a mass murder I might feel differently...
But this is protecting the money supply (and, indirectly, the government's control).
Not a reason to violate privacy IMO, especially when at the time this was done these people were only suspected of fraud, not convicted.
Well you can't really wait until the conviction to collect evidence in a criminal trial.
There are several stages that law enforcement must go through to get a warrant like this. The police didn't literally phone up Microsoft and ask for the keys to someone's laptop on a hunch. They had to have already confiscated the laptop, which means they had to have collected enough early evidence to prove suspicion and get a judge to sign off and so on.
At least they are honest about it, but a good reason to switch over to linux. Particularly if you travel.
If microsoft is giving these keys out to the US government, they are almost certainly giving them to all other governments that request them.
I use BitLocker on my Windows box without uploading the keys. I don't even have it connected to a Microsoft account. This isn't a requirement.
> If they have a key in their possession [...]
So they do have a choice.
The only explanation that makes sense to me is that there's an element of irrationality to it. Apple has a well known cult, but Microsoft might have one that's more subtle? Or maybe it's a reverse thing where they hate Linux for some equally irrational reasons? That one is harder to understand because Linux is just a kernel, not a corporation with a specific identity or spokesperson (except maybe Torvalds, but afaik he's well-regarded by everyone)
This is why the FBI can compel Microsoft to provide the keys. It's possible, perhaps even likely, that the suspect didn't even know they had an encrypted laptop. Journalists love the "Microsoft gave" framing because it makes Microsoft sound like they're handing these out because they like the cops, but that's not how it works. If your company has data that the police want and they can get a warrant, you have no choice but to give it to them.
This makes the privacy purists angry, but in my opinion it's the reasonable default for the average computer user. It protects their data in the event that someone steals the laptop, but still allows them to recover their own data later from the hard drive.
Any power users who prefer their own key management should follow the steps to enable Bitlocker without uploading keys to a connected Microsoft account.
But you can still help prevent abuses of mass surveillance without probable cause by making such surveillance as expensive and difficult as possible for the state
As of today at 00:00 UTC, no.
But there's an increasingly possible future where authoritarian governments will brand users who practice 'non-prescribed use' as enemies of the state. And when we have a government whose leader openly gifts deep, direct access to federal power to unethical tech leaders who've funded elections (e.g. Thiel), that branding would be a powerful perk to have access to (even if indirectly). E.g. in England you're already an enemy of the state when you protest against Israel's actions in Gaza; in America, if you don't like civilians being executed by ICE.
This is really a bad time to throw "enemy of the state" around as if this only applies to the worst people.
Current developments are the ideal time to show that these powers can be abused.
Criticizing the current administration? That sounds like something an enemy of the state would do!
Prepare yourself for the 3am FBI raid, evildoer! You're an enemy of the state, after all, that means you deserve it! /s
In the opposite scenario - without key escrow - we would be reading sob stories of "Microsoft raped my data and I'm going to sue them for irreparably ruining my business" after someone's drive fails and they realize the data is encrypted and they never saved a backup key (and of course have no backups either).
>Journalists love the "Microsoft gave" framing because it makes Microsoft sound like they're handing these out because they like the cops, but that's not how it works.
Because in 2026 "journalism" has devolved into ragebait farming for clicks. Getting the details correct is secondary, and sometimes optional.
Once the feature exists, it's much easier to use it by accident. A finger slip, a bug in a Windows update, or even a cosmic ray flipping the "do not upload" bit in memory, could all lead to the key being accidentally uploaded. And it's a silent failure: the security properties of the system have changed without any visible indication that it happened.
Stats on this very likely scenario?
Also, around 1999-2000, Sun blamed cosmic-ray bit flips for random crashes of their UltraSPARC II CPU modules.
From the Wikipedia article on "Soft error", if anyone wants to extrapolate.
So you could expect this to happen roughly once every two hundred million years on any given machine.
Assuming there are about 2 billion Windows computers in use, that’s about 10 computers a year that experience this bit flip.
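The arithmetic in that estimate checks out; as a quick back-of-the-envelope sketch (using the per-machine rate quoted above as an assumption):

```python
# Back-of-the-envelope check of the bit-flip estimate above.
# Assumption (from the thread): a specific single bit flips roughly
# once every 200 million machine-years.
rate_per_machine_year = 1 / 200e6   # flips per machine per year
machines = 2e9                      # rough count of Windows PCs in use

expected_flips_per_year = rate_per_machine_year * machines
print(expected_flips_per_year)  # roughly 10 machines per year
```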
That's wildly more than I would have naively expected to experience a specific bit-flip. Wow!
It was in DEF CON 19.
Can't find an explanatory video though :(
More on the topic: Single-event upset[1]
This is absurd, because it's basically a generic argument about any sort of feature that vaguely reduces privacy. Sorry guys, we can't have automated backups in windows (even opt in!), because if the feature exists, a random bitflip can cause everything to be uploaded to microsoft against the user's will.
Before you downvote, please entertain this one question: were you able to predict that mandatory identification of online users, under the guise of protecting children, would be implemented in leading Western countries this quickly? If you were, then upvote my comment because you know this will happen too. If you couldn't even imagine it in, say, 2023, then upvote my comment anyway, because you equally can't imagine mandatory key extraction.
We have mandatory identification for all kinds of things that are illegal to purchase or engage in under a certain age. Nobody wants to prosecute 12-year-old kids for lying when they clicked the "I am at least 13 years old" checkbox when registering an account. The only alternative is to do what we do with R-rated movies, alcohol, tobacco, firearms, risky physical activities (i.e. bungee jumping liability waivers), etc.: we put the onus of verifying identification on the suppliers.
I've always imagined this was inevitable.
When I go buy a beer at the gas station, all I do is show my ID to the cashier. They look at it to verify DOB and then that's it. No information is stored permanently in some database that's going to get hacked and leaked.
We can't trust every private company that now has to verify age to not store that information with whatever questionable security.
If we aren't going to do a national registry that services can query to get back only a "yes or no" on whether a user is of age or not, then we need regulation to prevent the storage of ID information.
We should still be able to verify age while remaining pseudo-anonymous.
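The "yes or no" registry idea above can be sketched as a minimal interface. All names here are hypothetical illustrations, and a real scheme would need authentication and blinded tokens on top:

```python
from datetime import date

# Toy sketch of a yes/no age-check registry: it holds dates of birth,
# but callers only ever receive a boolean, never the DOB itself.
class AgeRegistry:
    def __init__(self):
        self._dob = {}  # user_id -> date of birth (never exposed)

    def register(self, user_id, dob):
        self._dob[user_id] = dob

    def is_over(self, user_id, years, today=None):
        """Answer only 'is this user at least `years` old?'"""
        today = today or date.today()
        dob = self._dob[user_id]
        # Age in whole years, accounting for whether the birthday has passed.
        age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
        return age >= years

registry = AgeRegistry()
registry.register("alice", date(2000, 6, 15))
print(registry.is_over("alice", 18, today=date(2026, 1, 1)))  # True
```

The point of the design is that a service querying `is_over` learns one bit, not an identity document it can store and leak.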
This is a part of Settings that you will never see at a passing glance, so it's easy to forget that you may have it on.
I'd also like to gently push back against the cynicism expressed about having a feature like this. There are more people who benefit from a feature like this than not. They're more likely thinking "I forgot my password and I want to get the pictures of my family back" than fully internalizing the principles and practices of self custody - one of which is that if you lose your keys, you lose everything.
There are two ways to log into macOS: a local user account or an LDAP (e.g. OpenDirectory, Active Directory) account. Either of these types of accounts may be associated with an iCloud account. macOS doesn’t work like Windows where your Microsoft account is your login credential for the local machine.
FileVault key escrow is something you can enable when enabling FileVault, usually during initial machine setup. You must be logged into iCloud (which happens in a previous step of the Setup Assistant) and have iCloud Keychain enabled. The key that wraps the FileVault volume encryption key will be stored in your iCloud Keychain, which is end-to-end encrypted with a key that Apple does not have access to.
If you are locked out of your FileVault-encrypted laptop (e.g. your local user account has been deleted or its password has been changed, and therefore you cannot provide the key to decrypt the volume encryption key), you can instead provide your iCloud credentials, which will use the wrapping key stored in escrow to decrypt the volume encryption key. This will get you access to the drive so you can copy data off or restore your local account credentials.
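The key-wrapping arrangement described above can be illustrated with a toy sketch. XOR with a random one-time key stands in for real authenticated encryption purely for illustration; do not use this for actual crypto:

```python
import secrets

# Toy illustration of key escrow via key wrapping: the volume key never
# leaves the machine in the clear; only a wrapped (encrypted) copy is
# escrowed with the cloud provider.
def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

volume_key = secrets.token_bytes(32)     # encrypts the disk
wrapping_key = secrets.token_bytes(32)   # lives in the user's E2E keychain

escrowed_blob = xor_bytes(volume_key, wrapping_key)  # what the cloud stores

# Recovery: the user proves ownership of the keychain, gets the wrapping
# key back, and unwraps the volume key locally.
recovered = xor_bytes(escrowed_blob, wrapping_key)
assert recovered == volume_key
```

Under this scheme, whoever holds only the escrowed blob (and not the keychain-protected wrapping key) learns nothing about the volume key.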
And just in case it wasn't clear enough, I'd add: a local user account is the default. The only way you'd end up with an LDAP account is if you're in an organization that deliberately set your computer up for networked login; it's not a standard configuration, nor is it a component of iCloud.
> that just so happens to take a screenshot of your screen every few seconds
Recall is off by default. You have to go turn it on if you want it.
Microsoft also happens to own LinkedIn which conveniently "forgets" all of my privacy settings every time I decide to review them (about once a year) and discover that they had been toggled back to the privacy-invasive value without my knowledge. This has happened several times over the years.
>Microsoft is an evil corporation, so we must take all bad stories about them at face value. You're not some corpo bootlicker, now, are you? Now, in unrelated news, I heard Pfizer, another evil corporation with a dodgy history[1] is insisting their vaccines are safe...
There is the old passwords-for-chocolate study: https://blog.tmb.co.uk/passwords-for-chocolate
Do users care? I would posit that the bulk of them do not, because they just don't see how it applies to them, until they run into some kind of problem.
Users absolutely 100% will lose their password and recovery key and not understand that even if the bytes are on a desk physically next to you, they are gone. Gone baby gone.
In university, I helped a friend set up encryption on a drive with his work on it, after a pen drive with work on it was stolen. He insisted he would not lose the password. We went through the discussion of "this is real encryption. If you lose the password, you may as well have wiped the files. It is not in any way recoverable. I need you to understand this."
6 weeks is all it took him.
Bitlocker on by default (even if Microsoft does have the keys and complies with warrants) is still a hell of a lot better than the old default of no encryption. At least some rando can't steal your laptop, pop out the HDD, and take whatever data they want.
There is always a choice.
The real issue is that you can't be sure that the keys aren't uploaded even if you opt out.
At this point, the only thing that can restore trust in Microsoft is open sourcing Windows.
The fully security conscious option is to not link a Microsoft account at all.
I just did a Windows 11 install on a workstation (Windows mandatory for some software) and it was really easy to set up without a Microsoft account.
And newer builds of Windows 11 are removing these methods, to force use of a Microsoft account. [0]
[0] https://www.windowslatest.com/2025/10/07/microsoft-confirms-...
By "really easy" do you mean you had a checkbox? Or "really easy" in that there's a secret sequence of key presses at one point during setup? Or was it the domain join method?
Googling around, I'm not sure any of the methods could be described as "really easy" since it takes a lot of knowledge to do it.
False. If you only put the keys on the Microsoft account, and Microsoft closes your account for whatever reason, you are done.
But this is irrelevant to the argument made above, right?
The 5th Amendment gives you the right to refuse speech that might implicate you in a crime. It doesn’t protect Microsoft from being compelled to provide information that may implicate one of its customers in a crime.
Only fix is apparently waiting until there's enough support to cram through an Amendment or set a precedent to fix it.
One of the reasons given for (usually) now requiring a warrant to open a phone they grab from you is the amount of third-party data you can access through it, although IIRC they framed it as a regular 4th Amendment issue by saying that if you had a security camera inside your house, the police would be bypassing the warrant requirement by seeing directly into your abode.
But the 5th Amendment is also why it's important to not rely on biometrics. Generally (there are some gray areas), in the US you cannot be compelled to give up your password, but biometrics are viewed as physical evidence and not protected by the 5th.
I think the reasonable default here would be to not upload to MS servers without explicit consent about what that means in practice. I suspect that if you actually asked the average person whether they're okay with MS having access to all of the data on their device (including browser history, emails, photos), they'd probably say no if they could.
Maybe I'm wrong though... I admit I have a bad theory of mind when it comes to this stuff because I struggle to understand why people don't value privacy more.
They can fight the warrant; if they don't at least object to it, then "giving the keys away" is not an incorrect characterization.
Except the steps to do that are: disable BitLocker, create a local user account (assuming you initially signed in with a Microsoft account, because MS now forces it on you for Home editions of Windows), delete your existing keys from OneDrive, then re-encrypt using your local account and make sure not to sign into your Microsoft account or link it to Windows again.
A much more sensible default would be to give the user a choice right from the beginning, much like Apple does. When you go through Setup Assistant on a Mac, it doesn't assume you are an idiot and literally asks you up front: "Do you want to store your recovery key in iCloud or not?"
That's not so easy. Microsoft tries really hard to get you to use a Microsoft account. For example, logging into MS Teams will automatically link your local account with the Microsoft account, thus starting the automatic upload of all kinds of stuff unrelated to MS Teams.
In the past I also had Edge importing Firefox data (including stored passwords) without me agreeing to do so, and then uploading those into the Cloud.
Nowadays you just need to assume that all data on Windows computers is available to Microsoft; even if you temporarily find a way to keep your data out of their hands, an update will certainly change that.
It's 2026. The abuses of corporations are well documented. Anyone who still chooses Windows of their own volition is quite literally asking for it and they deserve everything that happens to them.
What do you suggest? I’ll try it in a VM or live usb.
Bon appetit!
If you want something that "just works," Linux Mint[1] is a great starting point. That gets you into Linux without any headache. Then, later, when bored, you can branch out into the thousands[2] of Linux distributions that fill every possible niche.
Regardless of which distro you choose, your "desktop experience" will be mostly based on what desktop environment you pick, and you are free to switch between them regardless of distro. Ubuntu for example provides various installers that come with different DEs installed by default (they call them "flavours": https://ubuntu.com/desktop/flavors), but you can also just switch them after installation. I say "mostly" because some distros will also customise the DE a bit, so you might find some differences.
"Nicest desktop experience" is also too generic to really give a proper suggestion. There are DE's which aim to be modern and slick (e.g. GNOME, KDE Plasma, Cinnamon), lightweight (LXQt), or somewhere in between (Xfce). For power users there's a multitude of tiling window managers (where you control windows with a keyboard). Popular choices there are i3/sway or, lately, Niri. All of these are just examples, there are plenty more DEs / WMs to pick from.
Overall my suggestion would be to start with something straightforward (Mint would probably be my first choice here), try all the most popular DEs and pick the one you like, then eventually (months or years later) switch to a more advanced distro once you know more what your goals are and how you want to use the system. For example I'm in the middle of migrating to NixOS because I want a fully declarative system which gives the freedom to experiment without breaking your system because you can switch between different temporary environments or just rollback to previous generations. But I definitely wouldn't have been ready for that at the outset as it's way more complex than a more traditional distro.
I'd already long since migrated away from Windows but if I'd been harbouring any lingering doubts, that was enough to remove them.
Will they shoot me in head?
What if I truly forgot the password to my encrypted drive? Will they also shoot me in the head?
What about your wife's head? Your kids' heads?
This news piece from a non-tech organization will help educate non-tech people.
Don't think Apple wouldn't do the same.
If you don't want other people to have access to your keys, don't give your keys to other people.
Of course Apple offers a similar feature. I know lots of people here are going to argue you should never share the key with a third party, but if Apple and Microsoft didn't offer key escrow they would be inundated with requests from ordinary users to unlock computers they have lost the key for. The average user does not understand the security model and is rarely going to store a recovery key at all, let alone safely.
> https://support.apple.com/en-om/guide/mac-help/mh35881/mac
Apple will escrow the key to allow decryption of the drive with your iCloud account if you want, much like Microsoft will optionally escrow your BitLocker drive encryption key with the equivalent Microsoft account feature. If I recall correctly it's the default option for FileVault on a new Mac too.
Besides, Apple's lawyers aren't stupid enough to forget to carve out a law-enforcement demand exception.
Nope. For this threat model, E2E is a complete joke when both E's are controlled by the third party. Apple could be compelled by the government to insert code in the client to upload your decrypted data to another endpoint they control, and you'd never know.
As a US company, it's certainly true that given a court order Apple would have to provide these keys to law enforcement. That's why getting the architecture right is so important. Also check out iCloud Advanced Data Protection for similar protections over the rest of your iCloud data.
[0] https://sixcolors.com/post/2025/09/filevault-on-macos-tahoe-...
Except for that time they didn't.
As of macOS Tahoe, the FileVault key you (optionally) escrow with Apple is stored in the iCloud Keychain, which is cryptographically secured by HSM-backed, rate-limited protections.
You can (and should) watch https://www.youtube.com/watch?v=BLGFriOKz6U&t=1993s for all the details about how iCloud is protected.
Back in the day, Hacker News had some fire and resistance.
Too many tech workers decided to roll over for the government, and that's why we are in this mess now.
This isn't an argument about law, it's about designing secure systems. And lazy engineers build lazy key escrow the government can exploit.
Most of the comments are fire and resistance, but they commonly take ragebait and run with the assumptions built-in to clickbait headlines.
> Too many tech workers decided to roll over for the government and that's why we are in this mess now.
I take it you've never worked at a company when law enforcement comes knocking for data?
The internet tough guy fantasy where you boldly refuse to provide the data doesn't last very long when you realize that it just means you're going to be crushed by the law and they're getting the data anyway.
Microsoft (and every other corporation) wants your data. They don't want to be a responsible custodian of your data, they want to sell it and use it for advertising and maintaining good relationships with governments around the world.
> I'm sure there's some cryptographic way to avoid Microsoft having direct access to the keys here.
FTA (3rd paragraph): don't default upload the keys to MSFT.
>If you design it so you don't have access to the data, what can they do?
You don't have access to your own data? If not, they can compel you to reveal testimony on who/what is the next step to accessing the data, and they chase that.
The solution to that is to not have the data in the first place. You can't avoid the warrants for data if you collect it, so the next best thing is to not collect it in the first place.
Trying to resist building ethically questionable software usually means quitting or being fired from a job.
In the 90s and 00s people overwhelmingly built stuff in tech because they cared about what they were building. The money wasn't bad, but no one started coding for the money. And that mindset was so obvious when you looked at the products and cultures of companies like Google and Microsoft.
Today however people largely come into this industry and stay in it for the money. And increasingly tech products are reflecting the attitudes of those people.
When you get high up in an org, choosing Microsoft is the equivalent of the old "nobody ever got fired for buying IBM". You are off-loading responsibility. If you ever get high up at a fortune 500 company, good luck trying to get off of behemoths like Microsoft.
False. You can design truly end-to-end encrypted secure system and then the state comes at you and says that this is not allowed, period. [1]
[1] https://medium.com/@tahirbalarabe2/the-encryption-dilemma-wh...
It's time - it's never been easier, and there's nothing you'll miss about Windows.
https://cointelegraph.com/news/fbi-cant-be-blamed-for-wiping...
Perhaps next time, an agent will copy the data, wipe the drive, and say they couldn't decrypt it. 10 years ago agents were charged for diverting a suspect's Bitcoin, I feel like the current leadership will demand a cut.
Will this make people care? Probably not, but you never know.
If you open the BitLocker control panel applet your drive(s) will be labelled as "Bitlocker waiting for activation".
Yes, the American government retrieves these keys "legally". But so what? The American courts won't protect foreigners, even if they are heads of state or dictators. The American government routinely frees criminals (the ones that donate to Republicans) and persecutes lawful citizens (the ones that cause trouble to Republicans). The "rule of law" in the U.S. is a farce.
And this is not just about the U.S. Under the "Five Eyes" agreement, the governments of Canada, the UK, Australia, and New Zealand could also grab your secrets.
Never trust the United States. We live in dangerous times. Ignore it at your own risk.
Bitlocker isn't serious security. What is the easiest solution for non-technical users? Does FDE duplicate Bitlocker's functionality?
As far as I can see this particular case is a straightforward search warrant. A court absolutely has the power to compel Microsoft to hand over the keys.
The bigger question is why Microsoft has the recovery feature at all. But honestly I believe Microsoft cares so little about privacy and security that they would do it just to end the "help customers who lose their key" support tickets, with no shady government deal required. I'd want to see something more than speculation to convince me otherwise.