I suspect Shopify's terms inform their customers (webshop operators) that they are responsible for disclosure, etc., and for being compliant with state privacy laws - however, since the majority of web shops are exempt (due to small size, revenue, etc.), these shops did not (knowingly or otherwise) publish these terms. That's just speculation on my part...
If this is true, I find this case troubling and weak, and hope it is overturned. It is squarely on the shop operator to be compliant - Shopify is just a platform vendor and shoppers are not Shopify customers; rather, they are customers of the shop. This seems to be akin to suing Google because a website uses Google Analytics but didn't disclose it in their privacy statement - silly...
This particular case gives me ADA and Prop65 vibes... lots of bottom-feeding lawyers using serial plaintiffs to extort businesses out of money. At least in this case they're going after someone with deep pockets and not just small businesses...
Relentlessly stalking millions of people makes it millions of times worse than stalking one person, not somehow okay.
It disgusts me that companies who want to transact with me don't vet their partners better. Off-Meta is another one that's despicable. Companies like my bank or their partners have NO business uploading lists of their users to third parties like that (even if it was induced by use of their analytics SDKs).
I disagree energetically. If Shopify wants to run a service identifying people across every site it serves as a backend for, it should ask those people whether they want to be included in that. Otherwise, the only way to stop the illegal activity is to print a list of Shopify's customers and visit (and sue) them one by one in California. Shopify is running the service, and the shop owner probably doesn't even know how it works.
I'd even think that a shop owner sued over this should in turn be able to sue Shopify. If Shopify knows that something it does is not legal in California, it should tell its clients who may do business in California.
> If Shopify knows that something it does is not legal in California
This is what is being debated. This ruling is mostly expected out of the 9th... we'll see what happens when a real court hears this case.
Most of my work is in the Shopify app dev ecosystem, and while I haven't been following this case very closely, I do think it's ironic how Shopify is behaving here given the privacy standards they enforce on their app developers.
Some context: all Shopify app developers are required to follow the EU's GDPR rules for customer data, full stop. Your app must implement Shopify's mandatory GDPR webhooks. You must delete customer data when a shop's customer is deleted; you must produce all data you store on a shop's customer within 7 days upon receipt of a certain GDPR webhook; and you must delete all the data you store on the shop itself after the shop uninstalls your app.
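For the curious, handling those mandatory webhooks tends to look roughly like the sketch below. The topic names are the ones Shopify documents for its compliance webhooks, as best I recall; the export/delete functions are hypothetical stand-ins for whatever your app's storage layer does, and HMAC verification of the request is omitted for brevity:

```typescript
import express from "express";

const app = express();
app.use(express.json());

// Hypothetical stand-ins for your app's own storage layer.
async function exportCustomerData(customerId: unknown, shop: string) { /* ... */ }
async function deleteCustomerData(customerId: unknown, shop: string) { /* ... */ }
async function deleteShopData(shop: string) { /* ... */ }

app.post("/webhooks/compliance", async (req, res) => {
  // A real app must verify the X-Shopify-Hmac-Sha256 header first; omitted here.
  const topic = req.get("X-Shopify-Topic");
  const payload = req.body;

  switch (topic) {
    case "customers/data_request":
      // A shop's customer asked for their data: produce everything you store on them.
      await exportCustomerData(payload.customer?.id, payload.shop_domain);
      break;
    case "customers/redact":
      // A shop's customer was deleted: erase their data from your app.
      await deleteCustomerData(payload.customer?.id, payload.shop_domain);
      break;
    case "shop/redact":
      // The shop uninstalled your app: erase the data you hold on the shop itself.
      await deleteShopData(payload.shop_domain);
      break;
  }
  res.sendStatus(200); // Acknowledge so Shopify doesn't keep retrying.
});

app.listen(3000);
```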
Additionally, if your app requires access to any customer data (whether it's via the Customer API, or via other APIs, e.g. to get the name of a customer who placed an order), you need to apply for access to that data on an app-by-app basis – replete with an explanation for why your app needs the data. Shopify's app store staff has to manually review and approve that data access application before you can publish your app on their app store.
To be clear, I think these restrictions are a good thing†, as apps used to have access to a veritable firehose of private customer data. But it's ironic to see Shopify enforce such standards on their app developers, while at the same time arguing that they should be able to track their own potential customers anywhere and everywhere across the internet regardless of privacy laws.
† Though I think it's a little odd that a Canadian company is making me, an American app developer, think about/adhere to the EU's GDPR rules. Not to mention other privacy laws like the one in California. Why not just call it "Shopify's Privacy Standards?"
https://www.eff.org/deeplinks/2024/07/courts-should-have-jur...
The important thing is what protections are gained now, directly as a result.
Also, sales tax already goes to the state where a good is delivered. Texas already knows when you buy from Good Vibrations, if you live there.
I don't have any horse in this race, though I know the EFF is very popular on HN, and that many people here are also against data collection.
For completeness, I think "cease to insecurely extract, aggregate, and abuse all that user data" should also be mentioned as an alternative to the different ways they could skirt regulation.
The only change I would make to your suggestion would be to remove the word "insecurely".
They shouldn't extract or aggregate user data in any fashion whatsoever.
The author seems to think that there should be some way for Shopify to avoid jurisdiction while still offering services in California, but I don’t really understand why he thinks so.
It is baffling to hear the author ask the question “did Shopify ‘expressly aim an intentional act’ at California?” and subsequently conclude that Shopify’s entire business model is in doubt if it doesn’t do business in California.
A company in California, selling to a customer in California, shouldn't be able to say "California law doesn't apply because my payment processor is Canadian." And if Shopify wants to take a cut of every sale from retailers based in California, they should be willing to comply with California law as well, at least insofar as it applies to the services provided via those California-based retailers' web sites.
(The actual opinion is linked at the bottom of the submission; I humbly suggest folks commenting here should read it first: https://cdn.ca9.uscourts.gov/datastore/opinions/2025/04/21/2...)
The linked opinion's discussion of jurisdiction seems to be about whether the case can be heard in any court, not if there's an alternate venue that would be more proper. FWIW, the retailer's web site[0] does not seem to contain a terms of service or anything with a choice-of-law provision. If the argument is that no court has jurisdiction in which the plaintiff can seek redress, that seems equivalent to saying that the law doesn't apply to the defendants.
Shopify already exercises a huge amount of discretion as to which businesses it's willing to provide services for, and how much it charges them. It does not seem unreasonable to me that the company would either a) be willing to answer to California courts, or b) stop selling its services to California retailers.
That’s why the author said that when the court stated that it would be absurd if there were no state that had jurisdiction, the court was being misleading, because there is always at least one state in which a U.S. entity is subject to jurisdiction.
Seems bizarre to me that several lower courts ruled in favor of Shopify, though...
Jurisdiction has nothing to do with whether laws are applicable to a case. It has to do with whether a particular court can hear a case. Personal jurisdiction questions go back hundreds of years to resolve issues like “is it just to force a defendant to travel hundreds of miles by horse and carriage to defend himself?” versus allowing him to defend himself closer to home.
If the plaintiff sued Shopify in New York, this issue wouldn't have come up at all. (And, yes, a New York court can apply California law.)
This stuff is all covered in first-year civil procedure in law school.
For example, imagine an American tourist visiting the UK drives dangerously and kills a motorcyclist, then escapes to the US before being caught.
A UK criminal court can order jail and a driving ban, but has no way to enforce the sentence unless the tourist decides to return to the UK, or the US decides to extradite.
A US criminal court can't do anything, the crime took place outside of their jurisdiction.
The victim's family can sue the tourist in a US civil court, and that court can take UK law into account - but can basically only impose a cash settlement that gets paid out of the tourist's insurance. They don't actually lose a cent.
So in a sense one state's courts can apply foreign state laws. But in terms of results, i.e. days spent in jail, laws don't appear to apply across borders.
For the moment, for the purpose of consumer protections, fine. But on longer time scales, I'm not sure. Does it really make sense for legacy states to be able to bind transactions on the internet? Doesn't that just make it a very large intranet?
Obviously information refuses to be stopped by borders. Are we going to have a situation where states of various sizes try to claim jurisdiction, but only those with sufficiently violent tendencies (and thus the ability to achieve compliance by fear) succeed? Won't that corrupt the internet even worse than an absence of state-based consumer protections?
If two people who live 500 miles apart in the area currently claimed by the State of California, but who do not recognize the State of California and regard themselves as citizens of the internet, transact with each other over the internet, who is right: them, or the government of California?
Most of us will probably say that there is some social contract by which, for better or worse, the State of California is right.
But what if, in 100 years, California goes bankrupt? Does that change the calculus? If so, why? And does it change retroactively, for the purposes of historical classification of internet transactions? The diplomatic and economic affairs of state don't change the operation of internet protocols. It's hard to even fully imagine how to create an internet whose shapes are coterminous with the boundaries asserted by various states.
I'm broadly skeptical of any judicial rulings which extend the laws of the legacy states onto the internet, even if they appear to be on the side of short-term justice. This whole thing is starting to feel like a bandaid better ripped off quickly.
Yes.
We have a problem right now where the only place democracy, sensible laws and due process take place is in meatspace.
The internet - insomuch as it’s a real place - is a feudal society. It’s made up of small fiefdoms (websites) and some larger kingdoms which exert tyrannical power within their borders. They watch everything you do - usually to advertise to you. And they can banish you at a moment’s notice if doing so would result in more profit for their rulers.
There’s an interesting argument you can make that the internet should be its own sovereign space. “Information wants to be free” and all that. Maybe if the internet was created 200 years ago, during the period of time when constitutions were being written everywhere, we would have created one for the internet. And then, maybe, the internet could have policies and courts and rules that uphold the rights of people. But that hasn’t happened. We have, through our collective inaction, delegated judicial oversight of the internet to sovereign states in meatspace. And thank goodness. Somebody needs to tell internet companies that my personal data is not for sale. Or tell Apple that they aren’t entitled to 15-30% of Netflix’s revenue after already selling a user their phone. (And don’t they dare redirect users to their website!)
If we technologists won’t govern ourselves, we delegate that important job to the state of California. To the European Union. To Australia’s department of fair trade & ACCC. And so on. It means we get a fractured Internet. But people have inalienable rights that need to be defended. Those rights must not be undermined just because we’re online and there’s a profit to be made.
So the flip side of your position - that someone needs to be subject to a foreign law when dealing with a foreign party, because otherwise that party's rights might be stomped - is that they also need the means to block interactions with that foreign party, so their own rights aren't potentially stomped.
In the case of sales you might actually know where the other party resides, but in the majority of online interactions you don't, and there is no good way to control your exposure to other jurisdictions.
Yes. That's the point. If you choose to do business with someone you also accept the jurisdiction of their local courts and laws. If you don't want that risk then do business locally.
Sites like PayPal and escrow.com need licenses for every state in the US to carry on business (mostly called money transmission licenses, but there are a few other names and regulations depending on the state). Yes, it is just as big a compliance nightmare as you'd expect.
So yes, it does happen.
And anyway, as this case shows, if you have customers in a state you need to follow the laws of that state. This is why Pornhub has stopped servicing various states.
I haven't laughed that hard in a while. Poor Shopify, they couldn't possibly protect the privacy and data of their customers.
At its core, a payment system is a form. Yes, many bells and whistles around that form are powered by cookies/local storage, but they aren't necessary.
> First, the majority might say that Shopify should not engage in privacy-invasive activities. I didn’t invest the energy to figure out the irreducible privacy elements of the plaintiffs’ claims, but if using cookies to track users is an essential part of the claim, then more privacy-protective options are not feasibly available to Shopify.
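To make the "it's just a form" point concrete, here's a bare-bones sketch - purely hypothetical, not how Shopify's checkout is actually built - of a payment flow that is literally a form served and processed without any cookies or local storage anywhere:

```typescript
// Minimal cookie-less checkout sketch: buyer details travel only in the
// POST body of the form submission; no Set-Cookie header is ever sent.
import { createServer } from "node:http";

const checkoutPage = `
<form method="POST" action="/charge">
  <input name="card" placeholder="Card number" />
  <input name="amount" placeholder="Amount" />
  <button>Pay</button>
</form>`;

createServer((req, res) => {
  if (req.method === "GET") {
    // Serve the form. Nothing is stored in the shopper's browser.
    res.writeHead(200, { "Content-Type": "text/html" });
    res.end(checkoutPage);
  } else if (req.method === "POST" && req.url === "/charge") {
    let body = "";
    req.on("data", (chunk) => (body += chunk));
    req.on("end", () => {
      const fields = new URLSearchParams(body);
      // Hand the card details off to the payment processor here; nothing
      // about the shopper needs to persist client-side to finish this.
      res.writeHead(200, { "Content-Type": "text/plain" });
      res.end(`Charged ${fields.get("amount") ?? "0"} - no cookies involved.`);
    });
  } else {
    res.writeHead(404).end();
  }
}).listen(3000);
```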
By and large, states having different laws is a pain, but arguing that you can do business in every state while only following the laws of one state is a very messy rejection of states' rights, and leads to using the commerce clause to basically negate most state-level regulations and jurisdiction.
> If a company develops a web-based business for the purpose of conducting online transactions in all 50 States, it should not be surprised that it may be sued in any State for unlawful transactions that may occur within that State.
Obviously. But the author calls this "chilling". Without this, companies could circumvent state laws, to conduct actions that are illegal in that state within that state, simply by headquartering or hosting in another. That would be absurd.
It would create a race-to-the-bottom of consumer rights, where states wanting business tax revenue are incentivized to make their states surveillance/ data harvesting/ consumer exploitation havens, whose businesses could then operate across all other states freely.
Trying to conflate a mom & pop shop vs. shopify is silly and not something that the court attempted to do.
Shopify is trying to avoid California having jurisdiction (to try them for privacy violations), because Shopify *knows* they're violating California consumer privacy laws.
The idea that enforcing consumer protection laws will somehow result in an explosion of frivolous lawsuits, or the death of business confidence, or will chill business development, is just a lie that companies push to fearmonger away consumer protections.
> just a lie
That's just some words; how will it not affect Mom & Pop?
> Why is a mom and pop ceramics business violating consumer privacy laws?
Because they don't have a legal department to make sure they comply with the rules in 50 states and N countries.
> The broadest possible interpretation of this ruling is that any website that downloads any digital asset–cookies, javascript, heck maybe even HTML–onto a California resident’s computer can be sued in California, even if the website doesn’t know where the users are. If this is correct, the majority effectively would be saying: if you place a cookie on a reader’s device, you’ve done something more than passive publishing (i.e., you can passively publish without the cookie) and must accept the jurisdictional consequences.
This feels so close, but a little off from my interpretation. In Collins's concurrence, he specifically calls out that the parties required for the transaction's “minimum contacts” were both in the state, as part of the reason the jurisdiction is unambiguous, with Stripe being a third, unexpected party to the information. To insert themselves into the transaction is in effect to insert themselves into the jurisdiction as well. This paragraph:
> When a State specifically regulates the conduct of electronic systems with respect to transactions within its borders, the as-intended operation of those systems within that State is the relevant tortious conduct for minimum-contacts purposes, and that conduct is attributable to those persons who deliberately intended that such systems reach into that State and operate in that manner when they do so...
For me, this implies that a third-party service that is not required or expected when a user interacts with a site, but that runs scripts or sets cookies for anything outside that “minimum-contacts” requirement, is liable for what it does with that data and access, and for what that code does.
There are exceptions. I believe Napoleonic Law (France, Spain, Mexico, etc.) follows a different model, as does German law, where court decisions do not establish precedent; see e.g. <https://paperity.org/p/232486962/on-the-role-of-precedents-a...>.