Alexander Linton of the Session Technology Foundation on building decentralized messaging and why platform-wide content moderation is impractical on encrypted platforms.
Not familiar with him or the platform. But sounds like an interesting exchange.
Make sure that it's impossible for my online actions to be traced to my identity, and then I don't need privacy, because there is no association that needs to be hidden and protected.
Similarly, sometimes people say there's a tradeoff between security and privacy, which doesn't make sense since privacy (confidentiality) is one dimension of information security.
If you still disagree, could you attempt to define privacy and anonymity, and explain how you can prioritize meaningful anonymity without caring about privacy?
Food for thought: rather than nitpicking people who apparently use the word in earnest, in its actual meaning, and thereby conceding that those policies really are "taking your privacy seriously" (which renders the word void of meaning), why don't we educate and take the word back?
Saying "Make sure that it's impossible for my online actions to be traced to my identity, and then I don't need privacy" is in a way giving up on privacy alltogether.
If you're planning to also be very careful about not revealing anything about yourself and your activities online, then that sounds less like "I don't need privacy" and more like "I guard my own privacy."
If that's what you mean, and you don't want people to misconstrue your intentions...maybe be clearer in your wording?
Here is just one solution that helps parents, and respects everyone's privacy:
Zero knowledge proofs.
Allow any organization that already legitimately verifies ages (i.e. a credit card company, driver's license issuer, ...) to provide a cryptographic key to its clients, which they can use to anonymously but verifiably assert they are 18+ to any adult site they visit. This solution (1) gives sites no user information except the 18+ verification, and (2) gives key providers no information about the sites their clients visit.
This is what zero knowledge proofs are for.
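For the curious, here is a rough sketch of the kind of unlinkable attestation this points at. It uses a Chaum-style RSA blind signature rather than a full zero-knowledge proof (a real deployment would use a vetted anonymous-credential or ZK library, proper key sizes, expiry, and revocation); the names and toy parameters below are mine, purely for illustration:

    # Toy sketch of an unlinkable "over 18" token using a Chaum-style RSA blind
    # signature. The issuer never sees the token it signs, and the site never
    # learns who the issuer verified. Toy primes, no padding, no expiry: for
    # illustration only.
    import hashlib
    import secrets
    from math import gcd

    # Issuer key material (Mersenne primes as stand-ins; NOT secure).
    P = 2305843009213693951            # 2^61 - 1
    Q = 618970019642690137449562111    # 2^89 - 1
    N, E = P * Q, 65537
    D = pow(E, -1, (P - 1) * (Q - 1))  # issuer's private exponent

    def h(msg: bytes) -> int:
        """Hash a message into the RSA modulus."""
        return int.from_bytes(hashlib.sha256(msg).digest(), "big") % N

    # Client: create a random token and blind its hash.
    token = secrets.token_bytes(16)        # the adult site will only ever see this
    while True:
        r = secrets.randbelow(N - 2) + 2   # blinding factor
        if gcd(r, N) == 1:
            break
    blinded = (h(token) * pow(r, E, N)) % N  # the issuer sees only this value

    # Issuer: after verifying age out of band, sign the blinded value.
    blind_sig = pow(blinded, D, N)

    # Client: unblind, yielding an ordinary RSA signature on h(token).
    sig = (blind_sig * pow(r, -1, N)) % N

    # Adult site: verify against the issuer's public key (N, E) alone.
    assert pow(sig, E, N) == h(token)
    print("valid 18+ token, unlinkable to the issuing transaction")

The point of the flow: the issuer signs a value it never sees, so it cannot link the final token back to the verification session, and the site only ever learns "some trusted issuer attested 18+."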
Everyone wins:
• Parents' jobs get easier.
• Children are less likely to encounter adult material.
• Everyone's privacy is protected.
• Adult sites can verify 18+ ages, without driving users away.
Not solving/mitigating endemic child access to adult sites is (1) a great disservice to parents and children, and (2) makes the success of draconian surveillance legislation MORE likely.
(If you have a critique of this solution, please frame it as an issue to resolve, not a categorical swipe at crafting solutions. The cynical prevalence of the latter is so damaging to these debates.)
PICS https://en.wikipedia.org/wiki/Platform_for_Internet_Content_...
POWDER https://en.wikipedia.org/wiki/Protocol_for_Web_Description_R...
ASACP/RTA https://en.wikipedia.org/wiki/Association_of_Sites_Advocatin...
Or why anyone would discourage the use of cryptographically hard, privacy-protecting solutions.
This is the perfect opportunity to take zero knowledge proofs mainstream, like end-to-end encryption, as a solution for the myriad privacy-leaking services and infrastructure we have today.
The alternative to cryptographically protected privacy is sites increasingly collecting people's identifiable information and associating their identities with access/behavior logs. Information that can never be assumed to stay private.
Friend’s phones, home computers and devices of other family members.
Unattended PCs and laptops at school. According to a music teacher who has literally had to clean her work computer after it was used for erotic viewing by students when the music room in a temp building wasn’t otherwise in use.
Web browsers on game consoles, e-readers, VR headsets, smart TVs, tablets, …
Now throw in constant device turnover, software updates, including settings panel changes, and settings values that get reverted, across the board.
I am not sure why you wanted my opinion. That’s less of an opinion and more of a list of what counts as ordinary for the last decade or so.
In my opinion that's more than enough, especially when you compare it to requiring everyone to identify themselves. It may say ZKP on the tin, but it will likely be a closed-source, corporation-owned implementation, which will have holes. Then in a few years we will learn that Meta exploited them for years to sell your soul for ad money.
Btw - students occasionally steal teachers' cars. Should we block engine start with an ID check too?
The solution I proposed was the opposite of people identifying themselves.
Zero knowledge proofs. Enabling trusted verification without revealing identity is exactly what cryptographers designed them for.
We should be using them everywhere. Like end-to-end encryption they provide massive privacy, security, and trust (I.e. ability to verify intended disclosure) improvements.
Or we can complain about parents, the ones who care enough to ask for better help, while legislatures keep passing identity revealing anti-privacy rules. That seems to be the direction many are taking here. Complain, condescendingly, don’t solve anything. Repeat.
Useful situations. On devices parents don't control.
Expecting parents to follow their children around 24/7, in case they access some adult site from a public or friend's device they don't control is beyond ridiculous.
Privacy protecting, anonymous validation of 18+ status solves the problem, in a way that doesn't require unrealistic "parenting" behavior, protects everyone's privacy, and is even helpful to responsible adult sites.
Condescendingly telling parents to "parent" in a way that is virtually impossible, instead of helping, is just rolling out the red carpet for alternate non-anonymous age verification legislation.
Zero knowledge tech, like end-to-end encryption, protects privacy.
I think a device level setting is actually quite pragmatic.
Your solution sounds good, should work fine, and would be easy to implement, which is perfect! But people will soon wonder what all the elderly people living in retirement homes without internet access are doing on porn sites, watching mostly the Overwatch and Fortnite cosplay themed videos...
If you are pointing out that the technical solution I proposed isn't perfect, that children may steal their older family members' identities, I agree.
As noted, imperfection is a common, unhelpful argument against improvement. However, identifying imperfections is constructive, if the point is to continue to solve problems. (Kids stealing identities isn't great for many reasons.)
It's like electronic voting: you can have the best cryptography hardware and software in the world, but if the end user does not understand at least on a surface level how it works, it will be vulnerable to manipulation. You can certainly keep the same system and educate all users, but that's a whole other class of problem.
Or we can continue to have our identities and activities logged by more and more actors. And our online and even offline experiences “personalized” for us for ends that are not friendly. Now add AI, which is only in its early stages, actively participating in our surveillance and manipulation.
Privacy holes are serious security holes.
Ironic or not, zero knowledge proofs allow people to volunteer exactly the information needed for an interaction and no more.
Isn’t that the ideal? Maximize both freedom and trust? With existing tech.
Flip the parental concern upside down. Let's take the side of genuinely responsible adult sites. Isn't their ideal to be able to verify that visitors are adults, without surveilling them? Avoiding becoming a resource and target for other actors? A target for lawsuits if they are hacked or leak information?
Lots of adult sites are already unhappy about being put in that role in a growing number of regions.
But as design criteria go, that is certainly a sensible one to include.
Just a random first idea: the key effectively auto-updates, i.e. it's a time-varying key chain. I can think of several ways to do that, so the time-varying nature can't be replicated by someone else without the same originating account. But I couldn't say whether any of them are good or not. It is something to design carefully, as all cryptographic systems need to be.
Other criteria would be easy revocation by the original key holder, and keys created from any number of independent accounts, blind to each other, that the key recipient chooses.
Again, just throwing out first thoughts.
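To make the "time-varying key chain" idea a bit more concrete, here is a minimal sketch of the rotation mechanics only (hypothetical names, HMAC-based, TOTP-style). It deliberately ignores the anonymity layer, which would sit on top, and in this toy form the verifier holds the seed, which a real unlinkable design would avoid:

    # Minimal sketch of a "time-varying key chain": the holder derives a fresh
    # value per epoch from a seed issued alongside the original key, so a copied
    # value stops working once its epoch passes. Rotation mechanics only; the
    # anonymity layer (blinded tokens, ZK proofs) would sit on top.
    import hashlib
    import hmac
    import time

    EPOCH_SECONDS = 3600  # hypothetical rotation interval: one hour

    def current_epoch(now=None):
        return int((time.time() if now is None else now) // EPOCH_SECONDS)

    def epoch_credential(seed: bytes, epoch: int) -> bytes:
        """Derive the credential that is valid for a single epoch."""
        return hmac.new(seed, epoch.to_bytes(8, "big"), hashlib.sha256).digest()

    def verify(seed: bytes, presented: bytes, drift: int = 1) -> bool:
        """Accept the current epoch plus a small clock-drift window."""
        now = current_epoch()
        return any(
            hmac.compare_digest(presented, epoch_credential(seed, now + d))
            for d in range(-drift, drift + 1)
        )

    # Issuer and holder share `seed` at verification time (placeholder value).
    seed = b"issued-at-verification-time"
    cred = epoch_credential(seed, current_epoch())
    print(verify(seed, cred))  # True now; False once the epoch window rolls over

A value copied from an old epoch stops verifying once the window rolls over, which is the replication-resistance property described above.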
gjsman-1000•1mo ago
For years, technical people insisted it was the parent’s job to monitor Internet safety. Parents, especially with the advent of social media, hardcore pornography, and every childhood friend having a device, correctly said this was unreasonable and impossible.
Technical people had a chance for two decades to solve this problem on their terms. Instead we collectively decided that anyone articulating this point of view must be a morally panicked maniac, and that this is a problem with “no reasonable solutions” that we would tolerate. Now we don’t get to dictate the terms, because everyone has had enough, and nobody has patience left for kids watching BDSM on their friend’s phones at 12.
Because of our industry’s refusal to take those concerns seriously, we lost our voice, we lost the grounds of sounding reasonable, and the floodgates are now open - for everything. Nobody is listening anymore to our point of view and arguably correctly so.
akersten•1mo ago
Since the dawn of pseudoanonymous communication, politicians have been trying to get their nasty little claws into it. See Clipper Chip in the 90's. They've tried many avenues to deanonymize and centralize. Going after the parents is just their latest - they've discovered they could use convincing language like this to trick a bunch of people who previously had no reason to care about The Internet to now suddenly "realize" oh gosh it's scary out there, what can we do to help.
Unfortunately their latest tactic is working. They figured out how to recruit a (possibly) well-intentioned bloc into supporting efforts that undermine privacy in an irreversible way.
> Because of our industry’s refusal to take those concerns seriously, we lost our voice,
Fighting against demands to censor, unmask, and neuter the closest thing we've got to a global platform of freedom is a valiant effort. Not entertaining these bureaucrats isn't some moral failing of our industry, any more than ignoring a persistent busker on the street entitles him to your money just because some uninvolved observer has arbitrarily decided he's repeated the demand often enough that it's somehow starting to make sense, since the victim hasn't yelled back with a good enough argument against it.
In other words: yep, still the parents' job, yep, internet was still there when I grew up, yep, I turned out fine, yep, politicians have been trying to take away our privacy for 30 years (and unfortunately, they're finding more creative and convincing ways to disguise it). Hint: it's never about the kids
Nevermark•1mo ago
Well yes it is. It is about both the cover problem (child safety), and the ulterior motive (surveillance, control).
And not taking the reasonably concerning cover problem seriously, by finding sensible solutions, both leaves it festering unsolved in its own right and lets it keep growing in usefulness as a cover problem.
lisbbb•1mo ago
I stopped worrying about it as a parent. My kids want to look at porn? Let them. If they want to see horrible, violent shit that gives them nightmares and they can't un-see it, let them. They'll either figure it out or they will be lifetime children looking for big daddy government to solve all their stupid problems for them and society will collapse.
blueflow•1mo ago
Personally I prefer the beheading video, as it's not far from what's already on TV.
ToucanLoucan•1mo ago
I don't even think you're strictly 100% incorrect here. That said, I refuse to entertain the "think of the children" horseshit when parents happily park their kid in front of an iPad for hours a day because it shuts them up, not with kink content they shouldn't see, but with AI/algorithmically generated garbage, or for that matter, human generated garbage, that rots their brains far more than any gimp costume ever could.
For fucks sake, 12 kids in America die PER DAY due to gun violence, and we can't even pass common sense gun regulations that the vast majority of gun owners are completely fine with. We don't give a fuck about our kids here. This has been from the jump, and continues to be, astroturfed puritan whining aimed not merely at pornography of whatever preferred kind of the moment, but at the existence of queer people as a whole.
It IS parents' fucking responsibility, and maybe that is an onerous burden, but instead of even attempting to meet it, the majority of parents have abdicated it. And it isn't porn fucking their kids up, it's non-stop screen exposure leaving them with no attention span and no ability to simply BE BORED.
AJ007•1mo ago
Also to point out, if a 10 year old walks in to the street and is hit by a car, their parent gets charged now (in some US jurisdictions.) The idea that the adult is not responsible for the child does not correspond with US law. Maybe it's different in Europe.
A secondary issue is tech companies onboarding children to having public social media profiles (and their parents posting the child's entire life on their own) -- which is completely asinine and appalling. That bridge was breached years ago and somehow no one complained loudly enough. Certainly a mistake.
godelski•1mo ago
When I was 12 our computer was in the living room and shared by everyone. It could access porn but you had to wait till you were the only one in the home.
There's a pretty simple solution here if you don't already see it and I'm not sure why it isn't more acceptable. It solves all the problems you mention. You don't want kids being sucked into social media? Sucked into porn? Constantly staring at their screens?
Have you ever considered not giving your kids smartphones?
Or have you decided the benefits are worth the costs?
Let's be honest here, even with strong government intervention this is always a cat and mouse game. The play of "no smartphone" is going to be far stronger than anything the government can ever do. Why is this not an option?
j45•1mo ago
The reality is it's not the smartphone, but the slot machine type software running on it.
There's more than enough science showing that placing this kind of content in front of humans before their prefrontal cortex is fully formed, around age 25-26, leads them to lean on the prefrontal cortex of the adults around them, and, missing that, on whatever they're spending the most time with, which is then effectively raising them.
Screens at lower resolutions and quality didn't seem to be as much of an issue compared to the hyper-saturated motion and sound effects that are consciously chosen to keep eyeballs.
Like anything, digital can be used for good or bad, and in the absence of good, the other can happen and become more of a default.
godelski•1mo ago
And hey, maybe if we actually take some autonomy and remove that market from actors who don't want to build the things the market is requesting, then they'll actually build the things the market is requesting... It's easy to say we want something, but no one listens when we still buy the thing we say we hate. Maybe it's addiction, but it's still hard to fight against. (Though we could still do better by accommodating those who are trying to break the network effects. You can complain about how hard it is to get off Facebook, but if you're not going to make the minuscule extra effort to accommodate those who do leave, then how can you expect the ground to be laid for you to do the same?)
At the end of the day though, with kids, the OP's argument fails because it either assumes smartphones are an inevitability or that the benefits outweigh the costs. It's a bad argument.
j45•1mo ago
That being said, it could be used for more positive things too, beyond attention farming and resale to ads alone.
Dylan16807•1mo ago
No it's not. I could imagine sentences where it would be, but not this sentence. Here, watch me replace the word:
"99% of adults are not allowing their teenager to to experiment with heroin or giving their 12 year old permission to drive their car down the freeway."
Even if most people aren't ""reasonable"", they are whatever adjective that sentence describes.
altairprime•1mo ago
The position itself is, of course, completely unreasonable: boundaries are never inappropriate to consider (and to contrast with the parent's boundaries about immediate versus deferred conversations in unsafe circumstances, the child's age and cognitive ability to assess risk, and so on), no matter how uncomfortable it is to teach a child about boundaries by honoring one they've presented! But that intolerance is presented in such a reasonable guise, with a tone of majority support to quash any brief qualms, that it causes many to overlook its true nature.
See also “pleasant”, as in “Pleasantville”.
Dylan16807•1mo ago
Whether you agree or disagree with their general stance, the word "reasonable" isn't load bearing.
In particular they didn't say that limiting cell phones was reasonable, or requires parents to be reasonable. They just wanted an example of parents enforcing boundaries.
The only load-bearing part of that sentence is the idea that parents do enforce those boundaries. Which they do. It's irrelevant if they are doing it because they're "reasonable".
TL;DR: I do know what the discussion is about. But your comment wasn't about the general discussion, it was about the quality of a specific point, and I'm defending that specific point.
iLoveOncall•1mo ago
This is not at all about children, by the way, because all millennials grew up consuming porn and social media and haven't turned into degenerates. It is 100% an excuse to spy on citizens, and nothing else.
j45•1mo ago
If it's simple, where does one begin doing this? If there are specific things to try that can build any parent's skills and competencies in this area, mind sharing them?
j45•1mo ago
Junk food and processed sugar create dietary-based ADHD in kids.
One's information diet also changes how the brain develops, that's not a pseudo threat.
I sense some trepidation around not having unfettered access to swim in the whole ocean as a child with more and more sharks and angel fish floating around.
The environment to monitor is increasingly digital, not just in person. See my note above about parents who think their kids are safe at home, when they're letting the entire unfiltered world into their kids devices, eyes, minds without context.
Nevermark•1mo ago
The myriad settings, accessed differently on every site or service, that everyone needs to be aware of while actively fighting dysfunctional defaults and frequent "helpful" resets after updates, are not the solution parents or anyone else are looking for.
Why are the defaults set to favor the company, in a way that makes them real customer/user-targeted security holes that people have to play endless whack-a-mole to secure themselves against?
That is industry taking plausible deniability and the monetizing potential of parasitical behavior at scale "seriously" indeed.
ninkendo•1mo ago
In my view, freedom of speech is a natural right that no government can take away. But no such guarantee exists that you can do so anonymously.
Now before you immediately flame me to death, please read a bit further:
We already have a hodgepodge of laws in basically every jurisdiction around logging IP addresses, cooperating with law enforcement when there's a warrant, being able to track down who is (say) organizing violence, posting CSAM, etc. Like it or not, the government is entitled to search and seizure if there is a warrant signed by a judge to do so, and if you run an online service where people can post things publicly, you better damn well keep logs of who's posting things and you better cooperate if a law enforcement officer with a warrant asks you to.
So what I propose is that we streamline all of this. At age 16 you get a digital ID that works something like a FIDO chip that can be used to prove your identity to a government authentication server. Sub in/out whatever tech you want, it can be a passkey (blech), something resembling a yubikey, etc. You get them at your local post office, where you can actually prove your identity in person. There’s post offices everywhere, and they’re already meant to serve everyone in the country.
But critically, this key isn’t used to auth to any sites except a government-run signin service. The service itself would be a modified form of OAuth/OIDC that preserves privacy from the site you’re making an account on. They don’t know who you are, they just get a signed payload from the government signin site saying “this is a user over the age of 16”, and via a pre-established relationship between the website in question and the government auth site, a UUID is minted for that legal person. It will be the same UUID for that website for the same person, so you can’t just pretend you’re multiple people when you create multiple accounts.
With this system, and using Reddit as an example site that may leverage this:
- Reddit can’t know who you actually are, they only get a UUID and a signed payload indicating you’re over 16 (or whatever other set of properties are legally salient for the account.) In the event of a breach, all you’d get is a list of Reddit-specific UUID’s. You’d have to also hack the government auth service to know who these people actually are.
- The government doesn’t know who owns your username on Reddit, they only know the list of citizens that have Reddit accounts at all.
- In the event of a crime with a warrant, the government can compel Reddit to inform them which UUID corresponds to some account. Reddit continues to not know who the account belongs to.
- Every site using this system gets a completely different UUID for the same legal person and has no ability to correlate them
- Every legal person using this system has no idea what their UUID is for any site
- Every site using this doesn't have to worry at all about proving identity. They get working auth for every legal person in $country, streamlining signups and onboarding, and don't have to worry about asking the user to prove they're over 16.
- You still get pseudo-anonymity in that you can use an alias (as many as the site allows, too), the site can remain blissfully ignorant as to who you really are, as well as everyone who reads your posts, etc.
- The government can find out who owns an account, but only with a warrant. They don’t have a list of account/UUID mappings anywhere.
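A minimal sketch of how the per-site UUID minting described above could work, along the lines of OIDC's pairwise subject identifiers. All names here are hypothetical, and the HMAC "signature" is only a stand-in; a real service would sign the claim asymmetrically (e.g. as an ID token):

    # Sketch of pairwise, per-site UUIDs (hypothetical names throughout). The
    # auth service derives each site's UUID for a citizen from a server-side
    # secret, so no account/UUID mapping table has to be stored; re-deriving a
    # mapping requires the secret, which is what a warrant would compel.
    import hashlib
    import hmac
    import json
    import uuid

    SERVER_SECRET = b"held only by the government auth service"  # placeholder
    SIGNING_KEY = b"stand-in for an asymmetric signing key"      # placeholder

    def pairwise_uuid(citizen_id: str, site_id: str) -> uuid.UUID:
        """Same citizen + same site -> same UUID; different sites -> uncorrelatable."""
        digest = hmac.new(
            SERVER_SECRET, f"{site_id}|{citizen_id}".encode(), hashlib.sha256
        ).digest()
        return uuid.UUID(bytes=digest[:16])

    def mint_assertion(citizen_id: str, site_id: str, over_16: bool) -> dict:
        """What the site receives: a pseudonym plus an age claim, nothing else.
        A real service would sign this asymmetrically (e.g. as an OIDC ID token)."""
        claim = {"sub": str(pairwise_uuid(citizen_id, site_id)), "over_16": over_16}
        payload = json.dumps(claim, sort_keys=True).encode()
        claim["sig"] = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
        return claim

    print(mint_assertion("citizen-123", "reddit.example", over_16=True))
    print(mint_assertion("citizen-123", "othersite.example", over_16=True))  # different "sub"

Because the pseudonym is derived rather than stored, there is no standing account/UUID table; answering a warrant means re-deriving the value with the server-side secret for a specific citizen and site.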
This system is probably the most closely aligned to how I would do things if I was somehow “in charge”… you have a right to pseudo-anonymity, but you don’t have a right to cover your tracks so thoroughly that the government can’t track you down with a proper warrant.
With such a system, saying “social media is for ages 16 and up” is a simple checkbox in the signup flow. Done.
You can argue all day about whether a government should be able to uncloak your accounts with a warrant, but to me that question is already settled: yes, they absolutely can do that; they do so today all the time. Except today we have messy data breaches where everyone's identity gets leaked, because every site has to reinvent its own way of proving your legal identity (in the case of Facebook/etc.) or simply proving you're a certain age (uploading ID, etc.). I'd take a centralized, government-run approach over what we have today any day.
You could ask “but should government be in the business of electronic authentication and identity?” And my answer is: “YES.” It’s basically the primary function of a working government! We trust them to issue passports for chrissake. To me this is basic table stakes in the 21st century. If we did government all over again, having the government provide a service to prove online identity is basically right up there with “collects tax revenue.”
Now, is the current government up to the challenge of doing this, and not fucking it up? Yeeesh, probably not. You got me there. But one can dream…
bgbntty2•1mo ago
The question should be about whether any future government would be up to the challenge of not fucking it up, as the system would stay in place and only grow in size. That's why privacy-invading infrastructure like this should be kept to a minimum.
sfRattan•1mo ago
At this point, any parent saying, "I just don't understand technology," or, "I just don't have the time to mind my children's computer use or monitor their Internet access," is morally equivalent to saying, "I just don't understand traffic safety," and, "I just don't have the time to teach my children the rules of the road or how to cross the street," while living in a big, bustling city.
It is lazy, entitled, and negligent. The world is full of networked computers and, barring some new massive Carrington event, is never going back to the way it was before.
j45•1mo ago
In the past, the general disconnection of the world and information had a natural insulation factor. Probably less so today.
Rather than admonishing adults, consider that it's actually quite common for people in many professions not to know how to purchase or implement software in their day-to-day work, let alone at home.
Maybe this is an aspect of digital literacy that has been lacking - we know that the consumer habit loop that smartphones go after is not really about digital health or the user's digital literacy, it's about capturing their attention.
Parents actually seem to want the same kind of quality curation not just with the internet, but all areas of their children's lives.
The free for all they may have grown up with 20-40 years ago is simply not the same any more online, or offline.
In that way, even trying to make an effort sometimes isn't enough I'd say. Ignorance is one thing, but maybe it could seem like negligence to others.
Would there be some possible solutions or approaches you or others could offer here to help parents build the skills that lead to not giving up? Sincerely curious where folks see the starting point of these skills.
sfRattan•1mo ago
In person I've found the difference to be usually very clear. It's why I distinguish between the sympathetic attitude of "I don't know where to start" and the contemptible reaction that "therefore I will act as if nothing is wrong, or write it off as someone else's fault."
>Would there be some possible solutions or approaches you or others could offer here to help parents build the skills that lead to not giving up? Sincerely curious where folks see the starting point of these skills.
It's a hard problem. I've spent a lot of time thinking about it, and almost as much time talking to relatives with children. I have come to no happy, easy answers.
Carey Parker's Firewalls Don't Stop Dragons is a good starting point for security and privacy, but not really for how computers work. There are various books I remember from childhood about how computers work, but I don't really remember the process of coming to understand computers distinctly, because it was both early and continuous over a long period.
I think the best we'll do as a species is one-to-three computer people per extended family. And I think the key will be teaching those people how to be of service to their families. But there's not really an existing framework for a "computer court wizard" in each family nor for what maxims and/or proscriptions such a person might teach computer illiterate family members in order to use computers safely and protect the family's children from the worst of dark algorithms, surveillance, and abuse/predators online.
It's completely uncharted territory.
j45•1mo ago
I'm not sure it's entirely uncharted territory.
TV channels used to get managed. Magazines used to get managed.
There is a lot more volume now, obviously.
Tools like Circle can provide some level of family-level DNS filtering, which can help.
Something that stands out is also helping parents get a handle on their own consumption and habits to be able to better teach kids on what to look out for.
sfRattan•1mo ago
Analog mass media couldn't dynamically adapt itself to individual viewer proclivities in order to attract attention. Parents can understand that children shouldn't watch TV at night because society has agreed to constrain more mature programming to when children are mostly asleep. We can understand not to leave a Playboy magazine lying around in reach of children or, better, not to have such things in a family home at all. But digital media defies more than convention... It defies all points of reference a human computer-layperson might have in the analog world that could help to understand it fully.
Try to explain to someone on the street the sorts of things that are and are not possible with computers and networks, and why various things fall into one or the other category. Watch their eyes glaze over. Absent points of reference and useful context it's almost anticomprehensible.
>Something that stands out is also helping parents get a handle on their own consumption and habits to be able to better teach kids on what to look out for.
Strongly agree. I've always shied away from algorithmically managed feeds and dark patterns. It felt instinctual to me but I think those instincts were born from coming to understand computers at a young age. Humanity at large has basically zero instincts for the digital world... Yet. Square one may be feeling the difference when you cut slopfeed content and targeted advertising out of your life. Square two may be new, computer-age fables to cultivate those instincts among those who aren't (and largely won't ever be) deeply computer literate. The sort of parables every single American grew up deciphering in McGuffey readers, once upon a time, but concerning things like, "a person can pretend to be anyone online, and that can cause trouble," or, "the boy who gave away his secrets could never get them back."
Madmallard•1mo ago
Is that the reason? I really don't think so.
I just think it's a power grab by the participants in government and tech companies.
Nevermark•1mo ago
That plays into the hands of those using subterfuge.
The power grab is riding on reasonable concerns about children. So it's worth improving safety for children for two reasons: (1) reducing parents' reasonable concerns and making parenting in the modern age a bit easier, and (2) taking that issue off the table (or meaningfully reducing the leverage it supplies) for the power grabbers.
Ironically, the fact that the underhanded motives lie beneath a reasonable concern, makes solving the reasonable concern in a healthy way even more critical.
Madmallard•1mo ago
Parents are supposed to raise their children.
Nevermark•1mo ago
Yes.
> Parents are supposed to raise their children.
Yes.
Vapid questions and statements aside, ...
Here is just one solution that helps parents, and respects everyone's privacy: Zero knowledge proofs.
Which allows anyone who is already verified by some organization (i.e. a credit card company, ...) to get a cryptographic key from that organization, which they can use to anonymously but verifiably assert they are 18+ to sites, (1) without giving sites any other information, and (2) without their key source getting any information about what sites they visit.
Bradsburied•1mo ago
It is neither unreasonable nor impossible. There is no need to hand kids a device that can access the open internet. If parents do, restrict or deny gadget time. Stick to a Nintendo or console.
Learned helplessness and codependency on the internet are part of the sales pitch of big tech, and of salesmen in general. They push self-doubt and sell their solution.
Fill the time showing kids how to rotate tires and change oil. Show them how to make pizza crust.
Learned helplessness is a feature of capitalism, not an immutable feature of physical existence.
wizardforhire•1mo ago
razakel•1mo ago