Firefox, especially in its first versions, permitted much less control over cookies than Mozilla did, but I think it still always allowed disabling third party cookies.
Which you know, because you say "you can also use CORS to interact with a truly third party site". But now, I invite you to go the rest of the way - what if the third party site isn't Project Gutenberg but `goodreads.com/my-reading-lists`? That is, what if the information you want to pull into System A from System B should only be available to you, and not to anyone on the net?
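To make the credentialed case concrete: a cross-origin request only carries cookies if the page asks for them, and the browser only exposes the response if System B explicitly opts in. A minimal sketch of the check the browser performs, per the CORS rules in the Fetch spec (function and variable names are mine):

```javascript
// Sketch of the decision a browser makes before exposing a credentialed
// cross-origin response to the page. Names here are illustrative, not spec text.
function credentialedReadAllowed(requestOrigin, allowOriginHeader, allowCredentialsHeader) {
  // With credentials (cookies), the wildcard "*" is never acceptable:
  // the server must echo the exact origin and opt in explicitly.
  if (allowOriginHeader === "*") return false;
  return allowOriginHeader === requestOrigin &&
         allowCredentialsHeader === "true";
}
```

A page would trigger this path with `fetch(url, { credentials: "include" })`; unless System B sends a matching `Access-Control-Allow-Origin` plus `Access-Control-Allow-Credentials: true`, the response stays opaque to System A.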
So the problem for regular humans is the disappearance of features that they've grown used to having without paying any money. Finding a better way to support themselves has proven remarkably difficult.
For a long time I thought pinterest was search spam that no human could possibly want to see, but then I met real people in the world who like it and intentionally visit the site. I bet there are people who like ehow and the rest, too.
https://privacysandbox.com/news/privacy-sandbox-next-steps/
> Taking all of these factors into consideration, we’ve made the decision to maintain our current approach to offering users third-party cookie choice in Chrome, and will not be rolling out a new standalone prompt for third-party cookies.
It is going to be interesting to see whether antitrust enforcement manages to separate Google from its financial and practical hold on web standards/browsers.
The opportunity to increase ethical norms of web browsing would be welcome to me.
Well, it doesn't prevent them, but it does make it a little bit harder ...
I don't understand how being forced to divest Chrome will even help. Once another company owns Chrome and can remove third party cookies, Google gets the same benefit.
So limiting Google's control over browsers will create more competition. More competition on implementations. And also more competition in terms of features and user centric service.
--
Question: Does Google really not gather information from anything but its search engine and first party apps? That would seem financially non-optimal for any advertising funded business.
I would think that, sure, they log everything people search for.
But that they would also find a way to track post-search behavior as well. Google leaving money on the table seems ... unusual if there isn't some self-serving reason to forgo it.
I am happy to become better informed.
Their long wait to do it is part of why we ended up in a regulatory mess.
Of course, ad companies scream bloody murder, and the UK market watchdog had to step in so Google wouldn't turn off third party cookies by default.
The only distinction is that I can do a decent job of blocking third-party cookies today with my existing solutions like uBlock Origin, but I will probably have a much more difficult time getting around login/paywalls.
You can also easily redirect from your site to some third party tracking site that returns back to your successful login page - and fail the login if the user is blocking the tracking domain. The user then has to choose whether to enable tracking (by not blocking the tracking domain) or not seeing your website at all. Yes the site might lose viewers, but if they weren’t making the site any money, that might be a valid trade off if there’s no alternative.
Not saying I agree with any of this, btw, I hate ads and tracking with a passion - I run various DNS blocking solutions, have ad blockers everywhere possible, etc. Just stating what I believe these sort of sites would and can do.
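The redirect gate described above is cheap to build. A hypothetical sketch of the two steps (all domain and parameter names are invented for illustration):

```javascript
// Step 1: instead of finishing login, bounce the user through the tracking
// domain. If the user blocks tracker.example, this redirect never completes.
function loginRedirect(sessionId) {
  const next = encodeURIComponent(`https://site.example/login/complete?sid=${sessionId}`);
  return `https://tracker.example/ping?next=${next}`;
}

// Step 2: the tracker redirects back with a token appended; no token means
// the tracking domain was blocked, and the site can fail the login.
function completeLogin(query) {
  return query.trackerToken ? { ok: true } : { ok: false, reason: "tracker blocked" };
}
```

The user's only choices are then to unblock the tracking domain or give up on the site entirely.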
The only problem is that then the tracking companies have to trust the first party to give them real data.
But they're doing it, actually, see confection.io for example
Now, down with the rest.
The only reason we see this movement is that advertisers feel confident about removing third party cookies.
But once anyone gets a creepy fingerprinting system working, the barriers drop, and it becomes cheaper to resell the capability as a library or service.
It may offer some minor benefits in terms of enabling companies that "want to be more ethical than the competition", but that too seems like a long-shot. :p
Edit: for clarity, I believe anything with the ability to analyze the user environment via Javascript/etc on major sites is likely fingerprinting regardless. Blocking, environment isolation and spoofing are already necessary to mitigate this.
Section 2: Third party cookies have gotten bad. Ok.
Section 3: There are legitimate use cases that third party cookies currently cover. Also ok. Then they throw in, "Be aware that a set of new technologies which carry minimal risk individually, could be used in combination for tracking or profiling of web users." Yes? Huge scope increase in the document though and all of a sudden we're now talking about tons of tracking technologies in aggregate? The authors move on without further comment.
Section 4: I think the first half is essentially saying that new technology coming online in the web platform will make the third party cookie problem worse, so we should fix it soon. OK, I'm back with you. Then the document suddenly pivots to proposing general standards for web privacy again, saying that the burden of proof is on the people originating the proposal, before concluding by saying (apparently without irony?) that the business impact of removing third-party cookies is outside the scope of the document.
I'm missing a ton of cultural context here about how W3C works, so I'm guessing this probably amounts to rough notes that somebody intends to clean up later, and that I'm being overly critical; they presumably didn't expect it to get any traction on Hacker News.
No. It's sabotage.
Nothing is private: https://nothingprivate.gkr.pw
More effort ought to be put into making the web specs unable to track users even when JS is turned on.
Browser vendors like Brave and Firefox, supposedly privacy browsers, are NOT doing anything about it.
At this point, do we need a JS-disabled browser to really get privacy on the web?
My gut says no, not possible.
Maybe we need a much lighter way to express logic for UI interactions. Declarative is nice, so maybe CSS grows?
But I don’t see how executing server-controlled JS could ever protect privacy.
You stand out when you obviously hide.
It's why Tor Browser is set to a specific window dimension (in terms of pixel size), has the same set of available fonts, etc.
But if privacy is truly the desired goal, the regular browser ought to behave just like Tor Browser.
Yes, of course: restrict its network access. If JS can't phone home, it can't track you. This obviously lets you continue to write apps that play in a DOM sandbox (such as games) without network access.
You could also have an API whereby users can allow the JS application to connect to a server of the user's choosing. If that API works similarly to an open/save dialog (controlled entirely by the browser) then the app developer has no control over which servers the user connects to, thus cannot track the user unless they deliberately choose to connect to the developer's server.
This is of course how desktop apps worked back in the day. An FTP client couldn't track you. You could connect to whatever FTP server you wanted to. Only the server you chose to connect to has any ability to log your activity.
And how does one know their privacy has been invaded? How does the user know to enforce the DMCA law for privacy?
I think the solution has to be technological. Just like encryption, we need some sort of standard to ensure all browsers are identical and unidentifiable (unless the user _chooses_ to be identified - like logging in). Tor-browser is on the right track.
The comments on the story are completely unconvinced that anyone at Apple will ever be convicted. Any fines for the company are almost guaranteed to be a slap on the wrist since they stand to lose more money by complying with the law.
I think the same could be said about anti-cookie/anti-tracking legislation. This is an industry with trillions of dollars at stake. Who is going to levy the trillions of dollars in fines to rein it in? No one.
With a technological solution at least users stand a chance. A 3rd party browser like Ladybird could implement it. Or even a browser extension with the right APIs. Technology empowers users. Legislation is the tool of those already in power.
Don't want to be tracked? Don't go on the internet.
You could make something similar where fingerprint-worthy information can't be POSTed or used to build a URL. For example, you read the screen size and add it to an array. The array is "poisoned" and can't be posted anymore. If you use the screen size for anything, those things and everything affected may stay readable, but they are poisoned too. New fingerprinting methods can be added as they are found. Complex calculations and downloads might temporarily make time a sensitive value too.
In those scenarios, tainted variables were ones which were read from untrusted sources, so could cause unexpected behaviour if made part of SQL strings, shell commands, or used to assemble html pages for users. Taint checking was a way of preventing potentially dangerous variables being sent to vulnerable places.
In your scenario, poisoned variables function similarly, but with "untrusted" and "vulnerable" being replaced with "secret" and "public" respectively. Variables read from privacy-compromising sources (e.g. screen size) become poisoned, and poisoned values can't be written to public locations like urls.
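A minimal sketch of that taint model, assuming a userland wrapper type (a real implementation would need engine-level tracking, since page scripts could otherwise unwrap the values):

```javascript
// Wrapper marking a value as derived from a privacy-compromising source.
class Poisoned {
  constructor(value) { this.value = value; }
}

// Any operation involving a poisoned input yields a poisoned result,
// so the taint propagates through derived values.
function op(fn, ...args) {
  const raw = args.map(a => (a instanceof Poisoned ? a.value : a));
  const result = fn(...raw);
  return args.some(a => a instanceof Poisoned) ? new Poisoned(result) : result;
}

// Sinks that can reach the network refuse poisoned values.
function buildUrl(base, param) {
  if (param instanceof Poisoned) throw new Error("poisoned value in URL");
  return `${base}?q=${param}`;
}
```

Here `screen.width` would enter the program as `new Poisoned(screen.width)`; anything computed from it stays poisoned and can never reach `buildUrl`.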
There's still some potential to leak information without using the poisoned variables directly, based on conditional behaviour - some variation on
if poisoned_screenwidth < poisoned_screenheight then load(mobile_css) else load(desktop_css)
is sufficient to leak some info about poisoned variables, without specifically building URLs with the information included.

Just create a _strict_ content security profile, which doesn't allow any external requests (fetch) and only allows loading resources (css, images, whatever) from a predefined manifest.
App cannot exfiltrate any data in that case.
You may add permissions mechanisms of course (local disk, some cloud user controls, etc).
That's a big challenge in standards, and I'm not sure anyone is working on such a strongly restricted profile for web/JS.
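Much of that restricted profile can already be approximated with existing CSP directives, though the per-resource manifest part can't. A sketch, assuming this particular directive set (my choice, not an agreed-upon profile):

```javascript
// Serialize a directive map into a Content-Security-Policy header value.
function cspHeader(directives) {
  return Object.entries(directives)
    .map(([name, values]) => `${name} ${values.join(" ")}`)
    .join("; ");
}

// A "no exfiltration" profile: nothing loads by default, resources come
// only from the site's own origin, and fetch/XHR/WebSocket are denied.
const strictProfile = {
  "default-src": ["'none'"],
  "script-src": ["'self'"],
  "style-src": ["'self'"],
  "img-src": ["'self'"],
  "connect-src": ["'none'"],
};
```

Pinning individual resources beyond the origin would need something extra on top, like Subresource Integrity hashes.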
My thoughts are that we need a distinction between web pages (no JS), which are minimally interactive documents that are safe to view, and web apps (sites as they exist now), which require considerable trust to allow on your device. Of course, looking at the average person's installed app list indicates that we have a long way to go culturally with regards to establishing a good sense of digital hygiene, even for native software.
What am I supposed to witness?
What we need to do is turn the hoarding of personal information into a literal crime. They should be scrambling to forget all about us the second our business with them is concluded, not compiling dossiers on us as though they were clandestine intelligence agencies.
They run arbitrary code from sketchy servers called "websites" on people's hardware with way too many privileges. Meanwhile, free and open source standalone web applications exist that use only minimal JS to access the same web resources with a much better user experience: without trackers, without ads and third parties.
I don’t mean to sound glib. But people derive a ton of utility from the web as it stands today. If they were asked if they supported the removal of web browsers they would absolutely say no. The privacy costs are worth the gains. If you want change you have to tackle that perception.
Doing what this demo shows, is clearly a violation of the GDPR if it works the way I assume it does (via fingerprints stored server side).
IMO this service should straight up be made illegal. I love the tagline they have of supposedly "stopping fraud" or "bots", when it's obvious it's just privacy invasive BS that straight up shouldn't exist, least of all as an actual company with customers.
But if you cannot have a third party cookie, the tracker's remote site cannot be sure that the script was actually downloaded, nor executed.
I could even see a data broker centralizing this and distributing tracking to all of their clients. The client would just need to communicate with the central broker, which is not hard at all.
That script would execute with the origin of the server. Its access to resources and /shared state/ would be hampered by this. So as a cross-site tracking strategy, I don't think this works.
> I don't understand how this helps the web unless protections against tracking itself, not the methods used, are established.
Which is why I think state partitioning[0] and CHIPS[1] are good technologies. They allow previously existing standards, like cookies, to continue to exist and function mostly as expected, while providing the user a good amount of default protection against cross-site trackers and other malware.
[0]: https://developer.mozilla.org/en-US/docs/Web/Privacy/Guides/...
[1]: https://developer.mozilla.org/en-US/docs/Web/Privacy/Guides/...
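The CHIPS opt-in is small enough to show in one line: a third party asks for a partitioned cookie jar with an extra `Set-Cookie` attribute. A sketch based on the CHIPS proposal (the helper function is mine):

```javascript
// Build the Set-Cookie line an embedded third-party service would send to
// opt into CHIPS partitioned storage. Partitioned cookies must be Secure;
// the __Host- prefix and Path=/ keep them host-locked, and SameSite=None
// covers the cross-site embed case.
function partitionedCookie(name, value) {
  return `Set-Cookie: __Host-${name}=${value}; Secure; Path=/; SameSite=None; Partitioned`;
}
```

The embedded widget's cookie then lives in a jar keyed by the top-level site, so the same embed on two different sites sees two unrelated cookies and can't link the visits.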
The workaround is looking more and more like IP, fingerprinting, and AI. I’d argue this is worse than 3p cookies, which were at least dumb and easy to clear.
Which is just going to be in addition to 3rd-party cookies. Google's own study concluded removing 3rd-party cookies loses revenue and "privacy-preserving" tracking increases revenue: https://support.google.com/admanager/answer/15189422 So they'll just do both: https://privacysandbox.com/news/privacy-sandbox-next-steps/
I feel like storing my "preferences" locally without letting me edit them is a stupid move.
Unpopular opinion: there is no privacy-preserving way to do "fraud mitigation".
Either you accept fraud as a cost of doing business, or you do away with privacy. Most business owners don't want a fraudulent user to come back, ever. If we value user privacy, we have to harm some businesses.
Idea: domains should be able to publish a text record in their DNS (similarly to SPF record for mail domains) designating other domains which are allowed to peek at their cookies.
Suppose I operate www.example.com. My cookie record could say that foo.com and bar.com may ask for example.com cookies (in addition to example.com, of course). A website from any other domain may not. As the operator of example.com, I can revoke that at any time.
Whenever a page asks for a cookie outside of its domain, the browser will perform a special DNS query for that cookie's domain. If that query fails, or returns data indicating that the page does not have access, then it is denied.
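A sketch of how the record and the browser-side check might look. The `v=cookie1` syntax is invented for illustration, loosely modeled on SPF, and the affiliate cap echoes the limit suggested further down the thread:

```javascript
// Parse a hypothetical TXT record like "v=cookie1 allow=foo.com,bar.com"
// into the list of domains permitted to read this domain's cookies.
function parseCookieRecord(txt) {
  const m = txt.match(/^v=cookie1\s+allow=([a-z0-9.,-]*)$/i);
  return m ? m[1].split(",").filter(Boolean) : null;
}

// Browser-side decision: deny on a missing or invalid record, deny overly
// promiscuous records, otherwise check membership in the allow list.
function mayReadCookies(requestingDomain, cookieDomain, txtRecord, maxAffiliates = 3) {
  if (requestingDomain === cookieDomain) return true; // first party always may
  const allowed = parseCookieRecord(txtRecord);
  if (!allowed) return false;
  if (allowed.length > maxAffiliates) return false;
  return allowed.includes(requestingDomain);
}
```

Revocation is then just publishing a new TXT record, subject to DNS TTLs.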
I just feel uncomfortable putting more data into DNS. DNS is not encrypted. DNSSEC is easy to bypass (or breaks so often that nobody wants to enforce it).
-- but these are not w3c's problem.
If someone hijacks example.com's cookie record, that won't be caught; they just write themselves permission to have their page access example.com's cookies.
The same info could just be hosted by example.com (at some /.well-known path or whatever). The web could generate a lot of hits against that.
The DNS records could be (optionally?) signed. You'd need the SSL key of the domain to check the signature.
Using a SPF record, a domain indicates hosts that are allowed to deliver mail on its behalf (meaning using an envelope sender address from that domain).
Browsers and browser extensions will be able to use that info to identify shit sites, turning the whitelist around for blacklisting uses like ad blocking and whatnot.
One simple mechanism would be for the browser to deny the cookie request if the requested domain's cookie DNS record contains more than, say, three affiliated domains. (At the discretion of the browser developer, and user settings.) The proliferation of that sort of config would discourage domains from being overly promiscuous with their tracking cookie access.
Plus, existing cookie control mechanisms don't go away.
[1] The exact figure may depend on which source you use, and there is some indication that ad and tracker blocking may artificially deflate Firefox and friends. https://gs.statcounter.com/browser-market-share
[2] https://www.wired.com/story/the-doj-still-wants-google-to-di...
Maybe Chrome can get away with "the spec says it, sorry advertisers" but I doubt the courts will accept that.