Users: save files "on their PC" (they think)
Microsoft: rolls out an AI photo-scanning feature to unknowing users, hoping to learn something from it.
Users: WTF? And there are rules on turning it on and off?
Microsoft: We have nothing more to share at this time.
Favorite quote from the article:
> [Microsoft's publicist chose not to answer this question.]
Not in a million years. See you in court. As is often the case, just because a press statement says something doesn't make it true; it may only be meant to defuse public perception.
Astonishing. They clearly feel their users have no choice but to accept this onerous and ridiculous requirement. As if users wouldn't realize that Microsoft had to go well out of its way to write the code that enforces this limit. All for a feature of dubious benefit to me. I know who the people in my photographs are. Why is Microsoft so eager to know it too?
Privacy legislation is clearly lacking. This type of action should bring the hammer down swiftly and soundly upon these gross and inappropriate corporate decision makers. Microsoft has needed that hammer blow for quite some time now. This should make that obvious. I guess I'll hold my breath while I see how Congress responds.
Presumably it can be used for filtering as well - find me all pictures of me with my dad, etc.
Depending on the algorithm and parameters, you can easily get a scary number of false positives, especially with algorithms that shrink images during hashing, which is most of them.
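To make that concrete, here's a toy sketch of an average hash ("aHash"), one of the common shrink-then-threshold perceptual hashes. Everything here is illustrative: images are plain 2D lists of grayscale values rather than real image files, and the two sample images are synthetic. The point is that the downscaling step discards fine detail, so visibly different images can collapse to identical hashes.

```python
# Toy average-hash ("aHash") sketch. Real pipelines (e.g. the imagehash
# library) do the same three steps: shrink, average, threshold.

def average_hash(img, size=8):
    """Downscale img to size x size by block averaging, threshold at the mean."""
    h, w = len(img), len(img[0])
    bh, bw = h // size, w // size
    small = [
        [sum(img[y][x] for y in range(r * bh, (r + 1) * bh)
                       for x in range(c * bw, (c + 1) * bw)) / (bh * bw)
         for c in range(size)]
        for r in range(size)
    ]
    mean = sum(sum(row) for row in small) / (size * size)
    # One bit per block: brighter than the mean or not.
    return tuple(1 if px > mean else 0 for row in small for px in row)

# Two different 64x64 images: a clean bright/dark split, and the same split
# buried under heavy per-pixel checkerboard noise.
base = [[255 if x < 32 else 0 for x in range(64)] for y in range(64)]
noisy = [[min(255, max(0, px + (60 if (x + y) % 2 else -60)))
          for x, px in enumerate(row)] for y, row in enumerate(base)]

# Half the pixels differ, yet the shrink step averages the noise away and
# both images produce the exact same hash: a false positive.
print(average_hash(base) == average_hash(noisy))  # prints True
```

The same property that makes these hashes robust to compression and resizing is what produces collisions between unrelated images at scale.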
The biggest worry would always be that the tools would be stultifying and shit but executives would use them to drive layoffs on an epic scale anyway.
And hey now here we are: the tools are stultifying and shit, the projects have largely failed, and the only way to fix the losses is: layoffs.
* Does not apply to the military and the politicians. They have trust issues with the technology.
Presumably, it's somewhat expensive to run face recognition on all of your photos. When you turn it off, they have to throw away the index (they'd better be doing this for privacy reasons), and then rebuild it from scratch when you turn the feature on again.
My wife has a phone with a button on the side that opens the microphone to ask Google questions. I guess 90% of the audio clips they get are "How the /&%/&#"% do I close this )(&(/&(%)?????!?!??"
This is probably the case. But Redmond being Redmond, they put their foot in their mouth by saying "you can only turn off this setting 3 times a year" (emphasis mine).
But that's not necessarily true for everyone. And it doesn't need to be this way, either.
For starters, I think it'd help if we understood why they do this. There's surely a cost to the compute Microsoft spends running AI over all your photos; turning the feature off under privacy rules means throwing that work away, and turning it back on means paying the whole cost again for nothing. Limiting that makes sense.
What doesn't make sense is that I'd expect virtually nobody to toggle it on and off more than 3 times, to the point that the cost increases by more than a rounding error. What type of user would do that, and why would that type of user not be exceedingly rare?
And even in that case, it'd make more sense to do it the other way around: you can turn on the feature 3 times per year, and off anytime. i.e. if you abuse it, you lose out on the feature, not your privacy.
So I think it is an issue that could and should be quickly solved.
In fact, if you follow the linked page, you'll find a screenshot showing it was originally worded differently, "You can only change this setting 3 times a year," dating all the way back to 2023. So at some point someone made a conscious decision to change the wording to restrict the number of times you can turn it _off_.
The issue is that this is a feature that, in any sane world, should 100% be opt-in, not opt-out.
Microsoft privacy settings are a case of: "It was on display in the bottom of a locked filing cabinet stuck in a disused lavatory with a sign on the door saying 'Beware of the Leopard.'"
If the user leaves it off for a year, then delete the encrypted index from the server...
Even my 10(?) year old iPhone X can do facial recognition and memory extraction on device while charging.
My Sony A7-III can detect faces in real time, and distinguish them among 5 registered faces to prioritize focus, the moment I half-press the shutter.
That workload would take mere minutes on Azure when batched and fed through GPUs.
If my hunch is right, the option will have a "disable AI use for x months" slider and will turn itself on without letting you know. So you can't opt out of it completely, ever.
We all know why.
That means that all Microsoft has to do to get your consent to scan photos is turn the setting on every quarter.
Very likely true, but we shouldn't have to presume. If that's their motivation, they should state it clearly up front and leave the feature off by default. They can put a (?) callout on the UI for design decisions that have external constraints.
Just stop using Microsoft shit. It's a lot easier than untangling yourself from Google.
But Microsoft is pretty easy to avoid after their decade of floundering.
They are exactly where I left them 20 years ago.
It's very sad that I can't stop using them again for doing this.
Just as linking to original documents, court filings, etc. should be a norm in news reporting, it should also be a norm to summarize PR responses (helpful, dismissive, evasive, or whatever) and link to the full PR text, rather than treating it as valid body copy.
The privacy violations they are racking up are very reminiscent of prior behavior we've seen from Facebook and Google.
They'd probably do it happily even without a warrant.
I'd bet Microsoft is doing this more because of threats from USG than because of advertising revenue.
https://www.pcmag.com/news/the-10-most-disturbing-snowden-re...
I'm old enough to remember when companies were tripping over themselves after 9/11 trying to give the government anything they could to help them keep an eye on Americans. They eventually learned to monetize this, and now we have the surveillance economy.
They are a hard-nosed company focused with precision on their own dominance.
Did they ever open-source anything that really made you think "wow"? The best I could see was them "embracing" Linux, but embrace, extend, extinguish was always a core part of their strategy.
- EU governments keep auditing us, so we gotta stay on our toes, do things by the book, and be auditable - it's bad press when we get caught doing garbage like that. And bad press is bad for business
In my org, doing anything with customers' data that isn't directly bringing them value is theoretically not possible. You can't deliver anything that isn't approved by privacy.
Don't forget that this is a very big company. It's composed of people who actually care and want to do the right thing, people who don't really care and just want to ship and would rather not be impeded by compliance processes, and people who are actually trying to bypass these processes because they'd sell your soul for a couple bucks if they could. For the little people like me, the official stance is that we should care about privacy very much.
Our respect for privacy was one of the main reasons I'm still there. For a good while the actual sentiment was "we're the good guys," especially compared to Google and Facebook. A solid part of that was that our revenue was driven by subscriptions rather than ads. I guess the appeal of taking customers' money and exploiting their data anyway is too big. That's the kind of shit that will get me to leave.
How long has MS been putting ads in the start menu?
Everything I see from that company is farcical. Massive security lapses, lazy AI features with huge privacy flaws, steamrolling OS updates that add no value whatsoever, and heavily relying on their old playbook of just buying anything that looks like it could disrupt them.
P.S. The skype acquisition was $8.5B in 2011 (That's $12.24B in today's money.)
AHHAHAHAHAHAHAHAHA.
Ha.
Nice one.
The sad thing is that they've made it this way, as opposed to Windows being inherently deficient; it used to be a great blend of GUI convenience with ready access to advanced functionality for those who wanted it, whereas MacOS used to hide technical things from a user a bit too much and Linux desktop environments felt primitive. Nowadays MS seems to think of its users as if they were employees or livestock rather than customers.
I wonder if this is also a thing for their EU users. I can think of a few laws this violates.
Look, scanning with AI is available!
Wow, scanning with AI is now free for everyone!
What? Scanning with AI is now opt-out?
Why would opting-out be made time-limited?
WTF, what's so special about 3x a year? Is it because it's the magic number?
Ah, the setting's gone again, I guess I can relax. I guess the market wanted this great feature, or else they wouldn't have gradually forced it on us. Anyway, you're a weird techie for noticing it. What do you have to hide?
...I guess Microsoft believes that they're making up for it in AI and B2B/Cloud service sales? Or that customers are just so locked-in that there's genuinely no alternative? I don't believe that the latter is true, and it's hard to come back from a badly tarnished brand. Won't be long before the average consumer hates Microsoft as much as they hate HP (printers).
Microsoft hate was something else in the '90s and 2000s. Yet people stayed with it as if they had no choice while OS/2, AmigaOS, NeXTSTEP, BeOS, and all those UNIXes died.
That's your problem right there.
> Microsoft only lets you opt out of AI photo scanning
Their _UI_ says they let you opt out. I wouldn't bet on that actually being the case. At the very least - a copy of your photos goes to the US government, and they do whatever they want with it.
> [Microsoft's publicist chose not to answer this question.]
Heaven forfend!