> Tools for Humanity bragged about many large partnerships that should make any privacy advocate shiver in dread: the Match Group dating-app conglomerate (Tinder, OkCupid, Hinge, Plenty of Fish), Stripe, and Visa are some of them.
Visa and Stripe being involved in this should indeed make everyone shiver and push back against this. Altman is not a good person and this company has already had its unethical practices exposed[0].
[0] https://icj-kenya.org/news/worldcoin-case-postponed-amid-con...
(Obviously Worldcoin is shady as shit, I'm not defending it.)
Easy Remote Job Opportunity! Pay is $1/hr. Perfect for retirees, disabled, and even kids! Requirements: have an eyeball. Duties: whatever you want, except when this device beeps, look into the camera.
Is Amazon's Mechanical Turk or whatever paying people to solve captchas still a thing?
Or the tried-and-true method of trusting only friends, friends of friends, recommendations from friends, etc.
Now, if we designed technology for humans, we'd realize that most humans have local networks of trust. E.g. I talk to my friend in person, she tells me her discord handle, now we've established trust. In addition, trust is something that's gradually built, not given all at once in a EULA[2].
[1]: https://www.ribbonfarm.com/2010/07/26/a-big-little-idea-call...
[2]: https://ruben.verborgh.org/blog/2024/10/15/trust-takes-time/
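As a minimal sketch of that gradual-trust idea (the store, thresholds, and handles below are hypothetical, just to illustrate trust accruing interaction by interaction rather than being granted all at once in one click):

    from dataclasses import dataclass, field

    @dataclass
    class Peer:
        handle: str
        score: int = 0  # trust accrues interaction by interaction

    @dataclass
    class TrustStore:
        peers: dict = field(default_factory=dict)

        def record_interaction(self, handle: str, weight: int = 1) -> None:
            # e.g. met in person (+3), vouched for by a trusted friend (+2), chatted online (+1)
            peer = self.peers.setdefault(handle, Peer(handle))
            peer.score += weight

        def allows(self, handle: str, action: str) -> bool:
            thresholds = {"read_posts": 1, "direct_message": 3, "see_location": 10}
            peer = self.peers.get(handle)
            return peer is not None and peer.score >= thresholds.get(action, float("inf"))

    store = TrustStore()
    store.record_interaction("friend#1234", weight=3)      # exchanged handles in person
    print(store.allows("friend#1234", "direct_message"))   # True
    print(store.allows("friend#1234", "see_location"))     # False: that level of trust isn't built yet

Nothing here is a protocol, just the shape of the idea: capabilities unlock as the relationship does, not via a one-time grant.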
If somebody is trying to get billions, they're doing it because they're the type of person who wants to be part of politics, a power player. That kind of person was never going to just give up the opportunity to make more money and gather more power to do more things that they see as necessary and good.
For that matter, do we expect that the impoverished people the gp commenter refers to would resist, say, government-led efforts to compel their biometrics from them? [0]
[0] e.g. https://www.csmonitor.com/World/Asia-South-Central/2022/0425...
Down that road lies a paternalistic flavor of charity, a spirit of “protecting them from themselves.” And that seems to evoke the idea that poor is the same as ignorant. That there’s only one correct value to assign to your biometric data, and anyone who values theirs differently must do so because they’re ignorant, rather than just having different values from you.
We can advocate for political freedom, material security, and just societies—and probably get better results—if we don’t model people as helpless or uninformed or without agency just because they’re in a socially vulnerable position.
Sam Altman has a far greater capacity for agency than an impoverished Filipino signing away his biometric data for the price of a Domino’s pizza.
> > Is it certain that impoverished people would weigh those potential consequences more heavily than being paid today?
The answer is: No, it is not certain. Why? Because they are poor.
Poor people have less agency. That’s just a fact. And they are being preyed upon by Altman. Making this about whether poor Filipinos are making an informed agreement with an AI bro is tone-deaf.
That is not a good measure of how willingly a decision was made.
By all means let’s engineer a world where people are never faced with crappy choices. But people are living in the present, not the glorious future: Taking away the choice in this case doesn’t seem to fix the situation, and deprives people of a benefit they’d accept if you let them choose.
By your logic we shouldn’t fight child labor or drug trafficking.
All that feeds people too.
That’s not what anyone who objects to this thinks and you know it. Anyone who objects to people selling away something dear because they are poor wants (1) those people to not be poor and (2) those other people to not prey on them. People are outraged when people are forced to drink dirty water—they are not outraged at desperate people for drinking dirty rainwater.
It’s a false choice.
> By all means let’s engineer a world where people are never faced with crappy choices. But people are living in the present, not the glorious future: Taking away the choice in this case doesn’t seem to fix the situation, and deprives people of a benefit they’d accept if you let them choose.
I’m Sam Altman and I approve this message.
I suspect that my face has been recorded and linked to my profile at several stores. Palantir or similar have probably scraped all of the internet looking to link a face to an identity.
Real ID just became fully required for domestic air travel.
100% this. The fact that the governments and corporations have enough information at their fingertips to identify people from chance photos is, IMO, not good.
However that genie is out of the bottle and there is no way to get it back in. Cameras are ubiquitous, and one can get a decent-quality fingerprint from store camera footage. Any time I get an eye exam, the doctor takes an eye scan and uploads it somewhere. Passport biometrics are becoming required, and most countries will match them with a face scan at border crossings. And this is just the tip of the iceberg.
I would like to be wrong, but IMO the only solution to the government being able to track anyone they like (or, rather, do not like) is via legislation, not technology. And with various 3-letter agencies being routinely allowed special access "because security" this path is unlikely to be viable either. My 2c.
My teeth were 3D scanned at very high resolution by my dentist the other day. He is leading edge and is now doing it for all patients (was previously only patients with replacement needs). I assume the information is going to some US provider somewhere.
Iris scanning and lots of other biometrics like capillaries can be done from a distance (e.g. iris scan at airport security in NZ).
That would require a big "source" for this claim.
Feel free to downvote me, but please provide a source, because so far it's false.
A story from about a month ago about a police department looking to trade its mugshot face database for access to facial recognition software: https://news.ycombinator.com/item?id=43853297
Government directly looking to share biometrics.
As you say, the future can become heaven or hell.
They might have known the consequences, but money is money. I feel like for 99% of people there is a certain sum of money for which they will give in to pretty much any kind of data collection. Even I'd give in if they gave me enough. The bar is just higher, but there does exist a certain $X for which I would give in as well.
I guess I am in some sense compensated by the data brokers who psychometrically profile my internet use and resell their conclusions—but “free ad-supported internet content” isn’t exactly fungible cash…
I'd finally be able to afford a house, never have to work for a toxic company again in my life, and could afford various preventative medical care without relying on insurance.
Basically, "life-changing" money.
It's worse with 23andMe, too, because the blast radius is all of your relatives who didn't take the test at all.
https://wydaily.com/latest/regional-national/2025/05/08/23an...
Given the recent events of hired thugs rifling through government databases (including the OPM's, which supposedly holds very sensitive data from security clearance applications), I hope it's clear that letting people collect and store data on you should be avoided at all costs.
The older I get, the more I understand the stereotype of the eccentric former techie who no longer wants anything to do with modern technology or society.
I like to go with the simpler "I hold lots of sensitive data for people who trust me: my family, my friends, my employer. One would have to be a sociopath to disclose other people's secrets without their consent."
Even then, if the government is weak then the ‘more power over you’ is simply false. Maybe the magnitude of the power is greater for a government, but companies apply their power with much more frequency.
I see your history teachers did a poor job. My condolences.
> the ostensible motive of the government is to serve its people
Your conclusion doesn't seem to match your usage of "ostensible". Yes, /democratic/ governments claim to serve their people, but they do not necessarily do so. You should always be suspicious and critical of your government, in an effort to ensure that its stated goals and its actions are aligned. You should always treat your government as adversarial. In fact, if you read a lot of the writings of people influential to the founding of the US, you'll find that they were explicitly trying to design a system whose biggest adversary, as they saw it, was itself.

But also, there are plenty of governments that do not even pretend to serve their people. They are completely self-serving and transparent about that fact. You never know when one is going to turn into the other, but the slide from ostensibly benevolent to explicitly malevolent is often relatively mild, while gaining back freedom usually requires a lot of bloodshed. There are always exceptions, but this is common. The worst part is that people frequently vote in the malevolent leaders. Democracies can turn into autocracies without spilling a single drop of blood. I'm unaware of the reverse ever happening.
That's a very load-bearing word there.
Mostly the police and military, not the companies.
From one day to the next, they sold out to the Italian "developer company" Bending Spoons.
That company had acquired Brightcove a few months earlier, along with several other services like Evernote and Meetup, companies which have nothing to do with "outdooring".
From one day to the next, they got access to my 13,000 tracked km and 800 hours of data. My profile is set to private, but now I have to assume that all this data will be sold to advertisers. "Anonymized".
This comes across as if you are attempting to downplay the importance of this biometric data, which is weird considering Altman is paying to get access to just a photo of your eyes.
The first means one data breach anywhere lasts a lifetime, and the second is a bigger media story because people believe more can be done with the DNA 23andMe collects than is actually the case.
And on the other hand, I do wish there was a way to distinguish real humans from bots on the Internet. I think it’s only a matter of time until the web becomes useless thanks to AI. What’s a better solution?
If we assume that all this information is permanently available in a public blockchain, how does it change anything for society, really? I can think of security checks becoming better; what are the negative possibilities?
My realistic assumption is that all this data will become public one way or another, so I'm trying to understand how we can make sure it can't be abused.
At least if it's open, access is equal.
like fingerprints or facial scans
This piqued my interest, but I couldn't find anything. Do you know where I could find more information about it?
[0] https://www.statewatch.org/news/2023/august/eu-and-usa-ploug...
In most cases the only biometric data involved is your photo and, I guess, your height?
So beginning to normalize the collection of eyeball data as a thing is a pretty significant escalation.
Perhaps generating “proof of humanity” digital signatures from retina scans isn’t the optimal solution, but I’ve yet to hear of any other privacy-preserving approaches. Perhaps transacting online will require a government-issued ID.
https://www.congress.gov/bill/118th-congress/senate-bill/884...
For the first time, you can now give your biometrics to OpenAI in order to do nothing more than you already can. This is just a pure cult of personality.
Proving authenticity in an increasingly diasporic society is difficult.
We should seek to either reduce or embrace entropy in the design of our systems. You either want systems which prove there are no Sybil attacks, or you manufacture halls of mirrors.
This is a continuous battle, there’s no panacea here, even the eye scan has threat vectors.
Calling KYC a solved problem is ludicrous.
Banks tend not to over-engineer things. KYC can be seen as a three-sided trade-off: the cost of KYC infrastructure/processes, lost revenue from denied business, and fines from regulators.
They (the banks) don’t really care about the social goals of KYC; they just try to optimize for expected value across those trade-offs.
The regulators understand this, and are basically fine with it. They have their own trade offs they are balancing.
Both sides mostly find an equilibrium.
One of the more important goals isn't to directly stop fraud but instead to provide tools that give end users results that scale with the amount of effort invested. The level of risk should be a tradeoff that the end user is able to make.
Current solutions mostly allow for that but certainly have some rough edges.
> Simply put, the premise is this: scan your eyeball, get a biometric tag, verify yourself, buy our apps (and cryptocurrency). ... Minority Report style technology
It's not really like that. They take a photo of your eye to check that you're a new person and not someone who already has an account, then give you an account, which is like an anonymous crypto wallet with a private key. You never do an eye scan again in normal use. They give you free crypto/money rather than you needing to buy anything. I've been given ~$300; it fluctuates a fair bit with crypto prices.
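As a rough sketch of that flow (the iris-template hashing and dedupe set here are my own illustration, not how Worldcoin actually implements it):

    import hashlib
    import secrets

    seen_iris_hashes = set()  # registry of already-enrolled eyes (illustrative stand-in)

    def enroll(iris_template: bytes):
        """One-time enrollment: hash the iris template, refuse duplicates,
        then hand back a fresh wallet key. The eye never needs scanning again."""
        digest = hashlib.sha256(iris_template).digest()
        if digest in seen_iris_hashes:
            return None  # this person already has an account
        seen_iris_hashes.add(digest)
        return secrets.token_bytes(32)  # stand-in for a newly generated wallet private key

    key = enroll(b"fake-iris-template")
    assert enroll(b"fake-iris-template") is None  # a second enrollment of the same eye is refused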
I recommend it to anyone who's curious / positive about new tech.
What does this have to do with curiosity or new-tech positivity? Nothing.
Give biometric data, get fluctuating ~$300. You did nothing else than sell something you have. I'm not judging.
I would ask that you elaborate a little more. I am an example of one. I like LLMs, but I cringe, internally and externally, at some of the things people seem to want to use them for (I just saw a presentation that basically said the equivalent of "add AI here, happy sunshine leaves there". And how? Magic. Nobody knows.). I like crypto, but it is impossible not to see the million rugpulls, scams, and so on out there. I like technology, but I am very, very aware of the issues with basic human nature.
"Tools for Humanity (TFH), a for-profit company co-founded by Sam Altman, Alex Blania, and Max Novendstern in 2019."
On the topic of eyes, my dad recently had surgery on his eyes and they did one at a time, for obvious reasons. That could be a way to transition. Register both, have surgery on one, let it heal, register the healed eye, have surgery on the other, then register that. Always using the good and registered eye to authenticate. But this isn’t realistic. It requires way too much forethought and planning, when people’s minds are elsewhere.
The more biometric tech converges on the ability to get a cryptographic hash of one's body, the further it retreats from the kind of thing that a layperson will trust. You end up with a root of trust that <1% of the population can verify and then you end up asking 100% of them to rely on systems built on that root. You're never going to be able to convince even a majority of people that some clever hacker hasn't cracked an iris scanner and associated millions of fake IDs with millions of AIs for scam purposes.
It needs to be the kind of thing that lets Alice assert that this key goes with Bob just after she shook Bob's hand in meatspace. Something where, in order for Bob to have two identities according to Alice, he'll have to meet her in meatspace twice and manage to have her not notice that she's already met him once before. PGP key signing parties were pretty much there, they just came too early (and not enough work was done to teach the masses about them).
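A minimal sketch of that kind of in-person attestation, using Ed25519 signatures from the pyca/cryptography package (the attestation format is made up for illustration):

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

    # Bob shows Alice his public key fingerprint in meatspace; Alice signs a claim about it.
    bob_key = Ed25519PrivateKey.generate()
    alice_key = Ed25519PrivateKey.generate()

    bob_pub = bob_key.public_key().public_bytes(Encoding.Raw, PublicFormat.Raw)
    attestation = b"met in person; this key belongs to Bob: " + bob_pub
    signature = alice_key.sign(attestation)

    # Anyone who already trusts Alice's key can check her claim about Bob's.
    try:
        alice_key.public_key().verify(signature, attestation)
        print("Alice's attestation verifies")
    except InvalidSignature:
        print("forged or tampered attestation")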
The web becomes more of a dark forest with each passing day. Eventually the cost of maintaining your part of the trust graph will be lower than the cost of getting screwed by some root of trust that you can't influence or verify. I'm sad to say that I think the point where these lines cross is significantly down and to the right of where we are.
I won't dispute that PGP key signing parties coupled with government ID work very well for certain very specific usecases such as validating distro maintainers.
However, for more mainstream and widespread uses, that adoption never occurred. And what about work on the tooling? I've yet to see a web of trust implementation that really felt properly generalized, scalable, and intuitive to interact with.
Case in point, if you wanted to implement a distributed code auditing solution on top of git and signed commits, what library would you use for the web of trust graph calculations? And would key signing parties be a usable root of trust for that with the current state of the software ecosystem? My personal view is that both of those things are woefully lacking.
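For what it's worth, the core graph calculation itself can be small. Here's a sketch: a breadth-first search over signed endorsements, with a hop limit standing in for a trust policy (the names and data are illustrative, not a real audit tool):

    from collections import deque

    # endorsements[a] = set of keys that key `a` has personally signed
    endorsements = {
        "alice": {"bob", "carol"},
        "bob": {"dave"},
        "carol": set(),
        "dave": set(),
    }

    def trust_distance(root, target, max_hops=2):
        """Number of signature hops from root to target, or None if unreachable within max_hops."""
        queue = deque([(root, 0)])
        seen = {root}
        while queue:
            key, hops = queue.popleft()
            if key == target:
                return hops
            if hops == max_hops:
                continue
            for signed in endorsements.get(key, ()):
                if signed not in seen:
                    seen.add(signed)
                    queue.append((signed, hops + 1))
        return None

    print(trust_distance("alice", "dave"))              # 2 (alice -> bob -> dave)
    print(trust_distance("alice", "dave", max_hops=1))  # None: outside the trust policy

The hard parts (revocation, key rotation, UX) are exactly where I think the tooling is still lacking.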
https://news.ycombinator.com/item?id=38398910
DonHopkins on Nov 23, 2023, on: The Eyes Have It (1953)
"Lies, Inc." aka "The Unteleported Man" had an eye eater!
https://www.academia.edu/2360689/The_Missing_Pages_of_THE_UN...
Freya said, "Tell me. What is the 'eye-eater'? I have to know." Her breath caught in her throat; raggedly, she managed to breathe, but with difficulty.
"A fungiform," the taller of the THL agents said briefly. "One that resides here." He said nothing further. [...]
The eye-eater said pleasantly, "Mr. Ben Applebaum, reach inside me and you will find a slightly-different edition of Dr. Bloode's Text. A copy of the twentieth edition, which I ingested some time ago... but as far as I can determine, not already dissolved by my gastric juices." The idea seemed to amuse it; the lower portion of its face split apart in a peal of excruciatingly-penetrating laughter.
"You're serious?" Rachmael said, feeling disorganized. And yet the eye-eater was correct; if it did possess a later edition of the text he most certainly had reason to seek it out -- wherever it lay, even within the body of the offensive eye-eater. "Look, look," the eye-eater exclaimed; it held in one of its longer [...]
https://sickmyduck.narod.ru/dick15-0.html
"A lie," the eye-eater rumbled ominously; again its pseudopodia whipped viciously, seeking out the agile creditor balloon, which dipped and bobbed barely beyond the flailing reach of the several sucker-impregnated arms. "As a matter of fact, this gentleman here-" It indicated Rachmael. "My understanding is that the lady and this individual are emotionally involved. Miss Holm is-was, whatever-a friend of mine, a very close friend. But hardly my mistress." The eye-eater looked embarrassed. [...]
https://survivorbb.rapeutation.com/viewtopic.php?f=179&t=419...
Let me quote one of Dr. Bloode's quite singular Thingisms.
"'Thingisms'?" Rachmael felt baffled -- and wary. He had a deep intuition that the Thingism, whatever it was, would not be amusing. Not to him, anyhow, or to any human.
"I always enjoyed this one," the eye-eater intoned, its saliva spilling from its mouth as it writhed with glee. "Consider: since you are about to read the book, here is Thingism Number Twenty, dealing with books.
"Ahem. 'The book business is hidebound.'"
After a pause, Rachmael said, "That's it?"
"Perhaps you failed to understand. I'll give you another gem, one more particular favorite of mine. And if that fails to move you ... Oooohhh! That's a Thingism! Listen! 'The representative of the drayage firm failed to move me.' Oooohhh! How was that?" It waited hopefully.
Baffled, Rachmael said, "I don't get it."
"All right." The eye-eater's tone was now harsh. "Read the book purely for educational purposes, then. So be it. You want to know the origin of this form which I have taken. Well, everyone will take it, sooner or later. We all do; this is how we become after we die."
He stared at it.
"While you ponder," the eye-eater continued, "I'll delight you with a few more Thingisms of Dr. Bloode's. This one I always enjoy. 'The vidphone company let me off the hook.' How was that? Or this one: 'The highway construction truck tore up the street at forty miles an hour.' Or this: 'I am not in a position to enjoy sexual relations.' Or --"
-- "Lies, Inc.", by Phil Dick