… although I really extend that to: why are you wearing an internet-connected camera that is obviously going to be monitored by Meta?
Of course, anyone who has opened a newspaper in the last 10 years or so would know better, but I can definitely see some people not giving a fuck about it.
Maybe a company with those standards should not get our business. Oops, no wait, maybe they mean the Friedman Doctrine standards? In that case they are entitled to do anything and everything to make a profit, no matter the harm.
[edit: add last two sentences]
There was an example in the article where a user's glasses kept recording the user's wife after he took them off. That's bad, but that's on the user, not Facebook.
Seems similar to a situation where someone takes nudes of someone without their consent and then sends them off to a lab to be printed. The lab isn’t doing anything illegal or unethical printing them when they ask the user “are these legal” and the user replies “yes.” Unless you want to stop photo printers from ever printing nudes, I think the responsibility is on the user, not the firm.
So when I say that they really do have a zero-tolerance policy for anyone using their internal systems to violate user privacy, it's not because I'm eager to defend them. It's just true (at least, it was when I was there). There are internal systems dedicated to making sure you have access to what you need to do your job, and absolutely nothing else. All content you interact with through internal tools is monitored and logged. If you get caught trying to use whatever access your job gives you for anything other than doing your job, security immediately escorts you out of the building. This is drilled into new hires early and often.
You are the frog being boiled.
There is no expectation of privacy in public.
In the US at least, any private homeowner/renter can deny entry to their property, barring legal warrants and exceptional circumstances. A business can have a policy, and is generally legally protected as long as the policy is 1) equally applied, and 2) does not violate the ADA... A court would have to weigh in on whether such glasses must be allowed under the ADA... but I suspect there's already a case where a movie theater banned such glasses, and the theater would probably(?) win, since such individuals could be expected to have non-recording glasses.
> Meta said this was for the purpose of improving the customer experience, and was a common practice among other companies.
Am I reading this correctly?! This is probably the weirdest statement I've read on the internet in twenty years.
edit 2: OK, I see what you mean. But I'm wondering if it should be possible to consent to this via T&C. Basically the same issue as with many online services, turned up to 11, sure. And it involves OTHER people, who have not consented.
Stuff like this used to be outrage fuel even when it was more of a social experiment, e.g. the documentary "We Live in Public" or the "Big Brother" TV show. By now, I'm sure there have been millions of influencers doing similar things, yet it's still very much not considered normal?
Streaming to an unknown number of employees might be considered different from streaming to the public, sure.
But the core question here is whether there's informed consent, and, IMO also, if it should be possible to consent to this when the other party is a company like Meta and the pretext is not deliberately seeking attention (like influencers and streamers do).
edit, clarified social media comparison
> Am I reading this correctly?! This is probably the weirdest statement I've read on the internet in twenty years.
It's total fantasy. I've worked in big tech. Casually uploading and providing company/contractor access to non-redacted intimate photos or pictures of the insides of people's homes vaguely "for the purpose of improving the customer experience" would not pass even a surface-level privacy or data-protection review anywhere I've ever worked. Do Meta even read what they are saying?
Or they might start scanning for "problematic" behavior, a bit like the Apple CSAM fingerprinting initiative.
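For context on what "fingerprinting" means here: Apple's NeuralHash was a proprietary perceptual hash, but the general idea can be sketched with a much simpler stand-in, the classic "average hash". This toy version (all names and values below are illustrative, not Apple's actual scheme) reduces an image to a short bit-fingerprint that survives small edits, so two fingerprints can be compared by counting differing bits:

```python
# Toy perceptual fingerprint ("average hash") -- NOT Apple's NeuralHash,
# just an illustration of the general scanning technique.

def average_hash(pixels):
    """pixels: flattened grayscale values (0-255) of a small thumbnail."""
    avg = sum(pixels) / len(pixels)
    # Each bit records whether a pixel is brighter than the average.
    return sum(1 << i for i, p in enumerate(pixels) if p > avg)

def hamming_distance(h1, h2):
    """Number of differing bits; a small distance means a likely match."""
    return bin(h1 ^ h2).count("1")

img = [10 * i % 256 for i in range(64)]       # fake 8x8 thumbnail
tweaked = [min(255, p + 3) for p in img]      # slightly brightened copy

# The fingerprint is robust to the small edit: the distance stays tiny,
# which is exactly what lets a scanner match re-encoded or resized copies.
d = hamming_distance(average_hash(img), average_hash(tweaked))
```

The privacy concern follows directly from the mechanism: once such fingerprinting runs on-device, what gets flagged is entirely a question of whose hash list is loaded.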
So not one part of me would ever buy Meta glasses (or the Snap glasses before that). You simply don't have sufficient control over the recordings and big tech companies can't be trusted, as we've witnessed from outsourced workers sharing explicit images. And I bet that's just the tip of the iceberg.
I honestly don't understand why anyone would get these and trust Meta to manage the risks.
Things like audio-scanning your living space: those Alexa smart speakers can use ultrasonics to get an image of not only everything in your space, but where you are in that space as well.
That technological use case only came out within the last five or so years, maybe closer to eight. Either way, I could see it coming before it became a thing: ultrasound imaging of your unborn child is a thing, ultrasound imaging of the sea floor is a thing, so why wouldn't ultrasound imaging of your living space be a thing for a company that wants to know what you buy?
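The underlying physics is just echo ranging: emit a pulse, record the room, find the echo's delay, and convert that delay to distance. Here is a minimal simulated sketch (the pulse shape, sample rate, and delay are all made-up illustrative values; a real system would read actual microphone samples and emit an inaudible ~20 kHz tone):

```python
# Sketch of speaker-to-mic echo ranging -- simulated data only.

SPEED_OF_SOUND = 343.0   # m/s in air
SAMPLE_RATE = 48_000     # samples/s, typical consumer audio hardware

def cross_correlate_peak(signal, pulse):
    """Return the lag (in samples) where the pulse best matches the signal."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(len(signal) - len(pulse) + 1):
        score = sum(signal[lag + i] * p for i, p in enumerate(pulse))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

pulse = [1.0, -1.0, 1.0, -1.0]     # toy ultrasonic "chirp"
true_delay = 560                    # samples until the echo returns

recording = [0.0] * 2000            # simulated mic capture
for i, p in enumerate(pulse):       # weak echo buried at the delay
    recording[true_delay + i] += 0.3 * p

lag = cross_correlate_peak(recording, pulse)
# Round trip time, halved for the one-way distance to the reflector.
distance = (lag / SAMPLE_RATE) * SPEED_OF_SOUND / 2
```

Sweep enough pulses from a speaker array and you get a rough map of reflectors in the room, which is the "imaging" step the comment describes.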
I never, ever had Alexa. I only ever had a Google Home, because I got it for free with GPM, but I almost never used it because I hated the idea of it always listening.
I already regret Wi-Fi, because they've now figured out how to look through walls with it.
Probably this is people asking the glasses something about what they see and the glasses uploading video for classification to generate an answer.
People think it is "just AI" so are not very concerned about privacy.
Which is why I'd never touch a personal tech device from Meta.
Their entire DNA is written to exploit their users for profit. In my judgement, they literally cannot and will never consider those issues as anything other than something to obscure to keep people unaware of the depth of the exploitation.
So it doesn't surprise me that Meta didn't renew/cancelled a contract that is a net negative for them. Arguing over the reason seems fruitless, as no reason is needed per the terms of the contract (I assume, since breach of contract wasn't brought up by the sub).
Meta isn't lying; you should assume other companies are doing it too. Tesla did it with their cameras, and you should assume any company with access to your camera does the same. I would even assume CCTV cameras. It's why, for anything sensitive, you should try to use open-source stacks: you might lose some features, but it's a needed compromise.
bredren•1h ago
Just because you don’t notice it doesn’t mean it doesn’t happen.
However, this is still a different thing than smart glasses which can further be segmented into who designed the smart glasses.
NBJack•23m ago
Smart glasses, however, are always aimed at whatever the wearer is looking at. They may or may not be recording (note the reports of people hiding the LED indicators), and at a fair distance could easily be mistaken for a normal pair.
The general populace is much more likely to notice the former recording rather than the latter.
amelius•1h ago
It's the camera of their smartphone.
Not sure if it's ON though.
powvans•1h ago
There's also nothing stopping us from stigmatizing the use of smartphones in public. Even a slight discouragement of it would be progress. It doesn't have to be all or nothing.
divan•35m ago
Calls to stop speaking or interacting with people who use smart glasses sounds like the dumbest thing I've read on HN ever.
elevation•29m ago
Great! Now do people with smart TVs and people with smartphones.
getnormality•1h ago
Not that I am remotely interested in defending Meta, or optimistic that they would proactively address privacy issues. But I don't feel that sympathetic to the outsourcing company here either.
I don't know what happened behind the scenes. I'm just going off what is said and not said in the article. If I were whistleblowing about something like this, I would take pains to describe what measures I took internally before going public. I didn't see any of that here.
EDIT: Look, to be clear, I think it's bad that naive or uninformed people are buying video recorders from Meta and unintentionally having their private lives intruded on by a company that, based on its history, clearly can't be trusted to be a helpful, transparent partner to customers on privacy. I think it's good that the media is giving people a reminder of this. I think it's good that the sources said something, even though the consequences they suffered seem inevitable. But to me, there is nothing essentially new to be learned here, and I don't know what can or should be done to improve the situation. I think for now, the best thing for people to do is not buy Meta hardware if they have any desire for privacy. Maybe there are laws that could help, but what should be in the laws exactly? It's not obvious to me what would work. I suspect that some of the reason people buy these products is for data capture, and that will sometimes lead to sensitive stuff being recorded. What should the rules be around this and who should decide? Personally I don't know.
ImPostingOnHN•1h ago
The secondary issue is that it's generally frowned upon to make your employees view nudity in the workplace. Are there extenuating circumstances here? No, we have no evidence there are any extenuating circumstances here.
elphinstone•1h ago
Why reflexively defend a massive tech corporation caught repeatedly violating the law?
ImPostingOnHN•1h ago
Congratulations, you have a bright future in politics and/or tech CEOing.
SlinkyOnStairs•43m ago
OpenAI had them classify CSAM, so Sama dropped OpenAI as a client back in 2022. https://time.com/6247678/openai-chatgpt-kenya-workers/
We're 4 years on, 3 years since that report broke. Not a single thing has improved about how tech companies operate.
prepend•10m ago
It’s a terrible job, I wouldn’t want to do it, but someone needs to. Perhaps one day, AI will be accurate enough to not need it, but even then you need someone to process complaints and waivers (like someone’s home photos being inaccurately flagged).
stackghost•39m ago
Name a more iconic duo.
Frieren•32m ago
I do not care which country the outsourcing company is in. When criminals go global, whistleblower protection should go global too.