If I see a flash from a speed camera operated by a business on behalf of a police department, your argument says I should be able to use the CCPA to force the business to delete my picture and the record of me speeding, if I can get the request to them before the police file with the court and request that data as evidence.
The data belongs to the government, and you can't get around that by going to the business that holds the data and asking them to delete it.
Sounds reasonable to me. If the police want to put up a camera, then the police should put up a camera.
Offloading their legal responsibilities to a third party company is shitty.
We're talking about Flock. A company offering surveillance as a service. Per their website:
>Trusted by over 12,000 public safety customers including cities, towns, counties, and business partners.
If Flock's argument holds, then much of the CCPA could be circumvented this same way. All it takes is a few entities and clever contract language.
Would you ask your local ISP to delete data they provided to Tinder like your IP address? That doesn't make sense to me.
I'm not convinced this is the case. It might be equipment made by them, but does that necessarily mean they were ever even in possession of the data in question?
Would you ask the manufacturer of your oven what you ate for dinner last week? No, you're just using an appliance that they made.
In the case of Flock I don't think we have any evidence of whether Flock themselves ever hold or store any data produced by their devices when operated by a customer.
Flock seems to leave ownership of the data with the government. They are just providing the service of being custodians who store and provide access to that data.
You would probably get a similar response by submitting your request to Amazon Web Services or Google Cloud or whoever hosts Flock's data: "sorry, we're just holding the data on behalf of Flock."
In either my example case or your stated case, you would have a very hard time convincing the hosting business to destroy their customers' data without a court order or a court case showing their policy is invalid and they must comply.
Not a lawyer, just noting the parallel.
I do appreciate that Flock's response says they cannot use the data they've collected for other purposes, which further reinforces my cloud storage analogy -- the cloud vendor can't look at the data you upload to storage to, e.g., build profiles on you or your business.
Would our main check on this be whistleblowers?
The best source of this information is https://deflock.org/ . FWIW, this is run by a neighbor in Boulder, CO which has been wrestling with the use of these cameras.
2x4
rebar
spraypaint
spray foam
battery powered metal cutter
And bash those pieces of shit to chunks or completely ruin the lens and solar.

Republican community? They love corporate surveillance. Democrat community? They too love corporate surveillance.
There is no "Peoples' Party" that rejects this garbage.
It would be a pity if someone made dense point clouds of these devices.
Did YC house style change a while back to drop the "(YC xxx)" annotation, either because so many popular firms participate or because it's well known?
> (2) (A) “Personal information” does not include publicly available information [...]
> (B) (i) For purposes of this paragraph, “publicly available” means any of the following:
> (I) Information that is lawfully made available from federal, state, or local government records.
> (II) Information that a business has a reasonable basis to believe is lawfully made available to the general public by the consumer
For example, if Flock receives a legitimate request to delete some data, then Flock must forward that request to all their Data Processors (e.g. including AWS/GCP/Cloudflare) and they must delete it as well.
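To make the cascade concrete, here's a minimal sketch of what forwarding a deletion request downstream could look like. All names here (the processor list, `handle_deletion_request`, the callbacks) are invented for illustration; this is not Flock's or any real vendor's API.

```python
# Hypothetical sketch: a data controller cascading a CCPA deletion
# request to its downstream service providers. All names are invented.

PROCESSORS = ["aws-s3", "gcp-storage", "cloudflare-r2"]

def handle_deletion_request(subject_id, delete_local, notify_processor):
    """Delete the subject's data locally, then forward the same
    request to every downstream processor and collect their replies."""
    delete_local(subject_id)
    results = {}
    for processor in PROCESSORS:
        results[processor] = notify_processor(processor, subject_id)
    return results

# Example run with stand-ins for the real deletion and API calls:
acks = handle_deletion_request(
    "subject-42",
    delete_local=lambda sid: None,             # stand-in for local deletion
    notify_processor=lambda p, sid: "deleted", # stand-in for an API call
)
print(acks)  # {'aws-s3': 'deleted', 'gcp-storage': 'deleted', 'cloudflare-r2': 'deleted'}
```

The point is structural: the controller is the single point of contact, and fan-out to processors is its obligation, not the data subject's.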
By analogy, Google Docs isn't marketed for healthcare use. If you wanted, you could put a bunch of PHI in a Google doc and it wouldn't be their responsibility. They certainly didn't tell you to do that. However, if they marketed Google Docs as a great place to store PHI, yeah, then suddenly they're on the hook for complying with the relevant laws like HIPAA.
(Although in this case Google will sign a HIPAA business associate agreement with you and voluntarily agree to comply. They still don't market it that way, or at least don't predominantly do so.)
> In accordance with its Terms and Conditions, Flock Safety may access, use, preserve and/or disclose the LPR data to law enforcement authorities, government officials, and/or third parties, if legally required to do so or if Flock has a good faith belief that such access, use, preservation or disclosure is reasonably necessary to comply with a legal process, enforce the agreement between Flock and the customer, or detect, prevent or otherwise address security, privacy, fraud or technical issues. Additionally, Flock uses a fraction of LPR images (less than one percent), which are stripped of all metadata and identifying information, solely for the purpose of improving Flock Services through machine learning.
In this document, to which they linked in their reply, it says clearly "address ... privacy ... issues."
Does your case not constitute a privacy issue? I would say so.
Continuing further down, their "trust us" claim about how they employ machine learning would need some proper transparency into how that can be guaranteed.
Wait till you see their "Transparency Portal" which, if my county and its neighbors are any sample, doesn't even name at least 30% of the agencies using Flock.
Does this hold water? I'm reading the CCPA rules now but if anyone knows, it would save me some tedious research.
If you write the police and ask them to delete all their data about you, that isn't a thing that they do. It shouldn't matter if the police store their data on AWS or their own servers.
Flock is a tool used by the police so it should work the same way.
But that's not what Flock is claiming. They're claiming that they don't even have to consider the request because they don't own the data.
[0] https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-re...
[1] https://www.clarip.com/data-privacy/ccpa-erasure-exemptions/
Or vote for/against them, that might work too.
kstrauser•2h ago
> Flock Safety’s customers own the data and make all decisions around how such data is used and shared.
which seems to directly oppose the CCPA. It's my data, not their customers'.
Again, I didn't really expect this to work. And yet, I'm still disappointed with the path by which it didn't work.
carefree-bob•2h ago
But a reasonable person would say -- the data is stored on Flock servers, not with the camera owners. And Flock would say, just because we sell data storage functionality to camera owners doesn't mean we own the data, any more than a storage service you rent a space from owns what you put in that space.
But then an even more reasonable person would say: the infrastructure is designed in such a way as to create inadvertent sharing, and the system has vulnerabilities that compromise the data, so Flock has responsibility for setting up the system in such a way that it's basically designed to violate privacy.
And that is the main criticism of Flock. You need to have a more nuanced criticism. It would be really interesting to see this litigated.
fudgy73•2h ago
[0] https://www.flocksafety.com/blog/flock-safety-does-my-neighb...
mminer237•2h ago
kstrauser•2h ago
Flock's cameras aren't in bathrooms. However, they're still recording people who haven't opted into it. ("But you have no expectation of privacy in a public place!" "You have the expectation that someone might inadvertently overhear you. You don't have the expectation that someone is actively recording you at all times.")
danielsunsu•2h ago
yabutlivnWoods•2h ago
If the DSLR uploaded them to Rent-A-Center owned/leased servers it would in fact require Rent-A-Center to take the necessary steps.
As Rent-A-Center would be the only group with proper access to data storage, they would have inserted themselves into the chain of custody, and thereby have an obligation to ensure others' data is wiped from systems they control.
tptacek•1h ago
itsdesmond•1h ago
But you knew that.
tptacek•1h ago
They're contractually forbidden from "selling their access to it" to arbitrary parties; they can share data only with the consent of their customers, almost all of whom actively want that data shared --- this is a very rare case of a data collection product where that's actually the case.
Mordisquitos•1h ago
However, I suspect that is not the case. AWS is agnostic as to the type of data stored on S3, and deletion of PII stored on S3 is the sole responsibility of the AWS customer that chooses to store it.
danudey•1h ago
If Flock were just an opaque cloud storage service for law enforcement to back up their mass surveillance to, then sure, your argument would have merit. It's not; it's a giant database of photos, locations, times, license plate information, and likely a lot more. They're not selling cloud storage, they're selling (leasing?) surveillance devices and tools.
tptacek•1h ago
My experience on HN is that these kinds of discussions almost immediately devolve into debates about what people want the law to be, as opposed to what it actually is.
Karrot_Kream•25m ago
ldoughty•2h ago
So they can't sell the fact that you're at Target at 8:00 p.m. on Thursday to anybody... Nor build profiles to sell to advertisers... And if that's the case that's very similar to cloud storage vendors.
If I access Hacker News, and the record of my visit is stored in an AWS S3 bucket, I can't ask AWS to delete my visitor record. Even though the server, network cards, wires, and storage medium are AWS property, it was Hacker News' website that generated that record, and it's their responsibility to handle my request to delete it. AWS' stance would rightly be "talk to the website operator for CCPA requests."
tptacek•1h ago
fsckboy•1h ago
All attorneys represent their clients; your attorney does not have to share your opinion of the law or public policy, and they can still interpret what the law means for you.
If you are afraid your attorney might have a bias (they are human), you may get better advice from the "misaligned" POV: the flaws/holes in a privacy law found by a pro-business conservative attorney are more likely to find sympathy in the courts from both conservative and progressive judges.
shermantanktop•15m ago
It's not hard to see how this enables an institution to gate itself from criticism.
thaumaturgy•1h ago
The fact of the matter is that Flock is playing two-step with the concept of "ownership" of data. They disclaim ownership as a way to leave local agencies holding the bag for liabilities, but they fight tenaciously to retain complete and unfettered access to that data.
(After organizing a community group that won Flock contract cancellations in multiple jurisdictions in Oregon, I went on to coauthor state legislation regulating ALPRs. I am very well familiar with all the dirty ball they play.)
Also, Flock's cameras collect more data than is provided to police agencies. Who owns that data, I wonder?
necovek•1h ago
ScoobleDoodle•1h ago
tptacek•59m ago
necovek•52m ago
cwillu•50m ago
tptacek•48m ago
This is what I mean by the fruitlessness of these kinds of legal discussions on HN. What do you want me to argue, that you're wrong to want the law to work that way?
jaredwiener•31m ago
tadfisher•39m ago
Does Flock do some kind of P2P dance to avoid the data transiting their systems?
close04•29m ago
unethical_ban•25m ago
I'm not saying that's what's happening, but that's what I thought was happening before reading this thread, and now I have to go and run through their policies.
Either way ALPRs and AI-facial scanners in public are a huge violation of privacy and I loathe them, but I hope it's correct that Flock customers cannot easily share information with one another.
unethical_ban•28m ago
Are you saying Flock itself does not have access to any of the data, and that the data they store on behalf of local governments is not fed into any central datalake? That every organization's data is completely, unalterably separate from everyone else's?
If so, that makes the panopticon slightly less powerful.
valeriozen•10m ago
Flock operates a federated network. If you drive past an unmarked camera, you have absolutely no way of knowing which specific HOA or town leased it, so how are you realistically supposed to know who the "data controller" is to send your CCPA or deletion request to?
jnovek•5m ago
It should absolutely be Flock’s responsibility to remove my data and we should absolutely require it by law. Full stop.
mistrial9•2h ago
halJordan•1h ago
thebaine•13m ago
dozerly•1h ago
themafia•2h ago
https://leginfo.legislature.ca.gov/faces/codes_displayText.x...
The enforcement provisions are rather bleak as well and afford no opportunity to directly bring a case against the agency that operates the system but instead just the individual who misuses it.
I think one of the more direct attacks would be going after jurisdictions that chronically have officers misusing the system. I think you're going to have to create precedent in this way to foment actual change.
everdrive•2h ago
nainachirps_•2h ago
If they are processing data after being told it was not obtained with consent, do they not have any liability?
Glyptodon•2h ago
kstrauser•2h ago
kube-system•2h ago
kstrauser•1h ago
kube-system•1h ago
Edit: from https://oag.ca.gov/privacy/ccpa
> If a service provider has said that it does not or cannot act on your request because it is a service provider, you may follow up to ask who the business is. However, sometimes the service provider will not be able to provide that information. You may be able to determine who the business is based on the services that the service provider provides, although sometimes this may be difficult or impossible.
singleshot_•1h ago
kube-system•1h ago
ratdragon•2h ago
kstrauser•2h ago
tptacek•2h ago
kstrauser•1h ago
* The right to know about the personal information a business collects about them and how it is used and shared;
* The right to delete personal information collected from them (with some exceptions);
* The right to opt-out of the sale or sharing of their personal information including via the GPC;
This isn't someone incidentally taking pictures of license plates in an otherwise noncommercial setting. It's a company literally created to collect and sell PII. Laws are different for them than for us.
[0]https://oag.ca.gov/privacy/ccpa
snowwrestler•1h ago
kstrauser•1h ago
uoaei•1h ago
lcnPylGDnU4H9OF•1h ago
uoaei•1h ago
lcnPylGDnU4H9OF•1h ago
kube-system•1h ago
> “Personal information” does not include [...] Information that a business has a reasonable basis to believe is lawfully made available to the general public by the consumer
tptacek•4m ago
danudey•1h ago
Even your full legal name and birth date cannot be guaranteed to refer only to you specifically (as there could be someone else with an identical name and birth date), but it's obviously still PII because it helps narrow the field immensely if you can combine it with other information - for example, your IP address.
So yeah, "anyone could have been driving my car", but if you also know that the car drove from your home to your work then that narrows down the list of likely individuals immensely.
Conversely, if your license plate was spotted parked near an anti-ICE rally, then they can be pretty confident that you or someone you know was near an anti-ICE rally, which means they can harass you about it, follow you around, shoot you in the street, etc.
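The narrowing argument above can be shown with a toy example: each observation alone is ambiguous, but intersecting them shrinks the candidate set fast. All the data below is made up.

```python
# Toy illustration of why "anyone could have been driving my car"
# is a weak defense: individually ambiguous observations combine
# into a near-unique identification. All data here is fictional.

drives_this_plate = {"alice", "alice_spouse", "alice_kid"}
home_to_work_route = {"alice", "coworker_bob"}
seen_near_rally = {"alice", "stranger_1", "stranger_2"}

# Intersect the candidate sets implied by each observation:
candidates = drives_this_plate & home_to_work_route & seen_near_rally
print(candidates)  # {'alice'}
```

That's exactly the sense in which a plate plus a route is "reasonably capable of being associated with" a particular person.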
tomwheeler•1h ago
adrr•46m ago
lcnPylGDnU4H9OF•1h ago
> (v) (1) “Personal information” means information that identifies, relates to, describes, is reasonably capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household. Personal information includes, but is not limited to, the following [...]:
> (E) Biometric information.
> (H) Audio, electronic, visual, thermal, olfactory, or similar information.
https://leginfo.legislature.ca.gov/faces/codes_displaySectio...
To your point, the intent would presumably still matter for exceptions to when deletion requests must be honored (say for journalism), but a photo of someone walking down a public street would still logically be considered the subject's personal information, by the above definition.
necovek•1h ago
Obviously, the idea is to allow someone to capture you as a background, passing figure while they take a front-and-center photo of their family, but not to let you be the main subject unknowingly, especially when you object explicitly.
On the other hand, a photographer still owns the copyright to a photo, so a subject (including in a portrait) cannot claim it or distribute it without permission even if they can potentially stop the photographer from distributing that photo.
IANAL, but you are not by default allowed to use anyone's "likeness" for your individual profit.
patrickmay•1h ago
This is not the case in the United States. There is no presumption of privacy in public. In fact, there is a whole genre known as "street photography" that involves taking pictures in public without explicit consent of the subjects.
necovek•54m ago
If https://legalclarity.org/can-you-post-someones-picture-witho... is to be trusted, though, you at least get protection against your likeness being used for commercial purposes, although that seems a bit more limited than I'd expect.
tadfisher•45m ago
[0]: https://www.eff.org/deeplinks/2025/10/flock-safety-and-texas...
tptacek•37m ago
I think what's happening here is that people are trying to colloquially define "selling access to data" to fit the camera data sharing that Flock enables, and then saying that because you have to pay to be a Flock customer to get access to that data, they're effectively selling it. I don't think that's how data brokerage laws work. Flock doesn't own the data they're providing access to, and they're providing that sharing access with the (avid!) consent of their customers.
snowwrestler•1h ago
tadfisher•1h ago
[0]: https://www.scotusblog.com/cases/case-files/chatrie-v-united...
snowwrestler•55m ago
tptacek•1h ago
You're going to get a lot of cheerleading and support about this in venues like HN and Reddit, because you're narrowcasting to an audience already primed to be hyperconcerned about surveillance technology (I am too). I think you're going to find those attitudes do not in fact generalize to the public at large, and especially not to the legal system.
Best of luck either way. It'll be an interesting experience to write up, and I'm happy to read about the outcome, even if I do think it's highly predictable.
john_strinlai•1h ago
fyi, flock owns the cameras.
"We operate using a lease model. What does that mean? Since we own the hardware, we own the problems that occur."
kstrauser•1h ago
tptacek•57m ago
kstrauser•37m ago
mindslight•1h ago
But yes, data that can be used to track my movements in my vehicle is certainly a type of personally identifiable information. I'd argue there should be some exemptions for individuals operating on a small scale, which I believe the CCPA has (and if we actually got a US GDPR, that it should have). But also that kind of exception shouldn't apply to a camera operated by and backhauling to Ring.
necovek•1h ago
Now, with you likely not keeping that Ring tied to a business account, how that applies to non-businesses holding PII is a different matter.
danudey•1h ago
kasey_junk•58m ago
captaincrisp•6m ago
mindslight•2h ago
Personally I would really like to see torts for attorneys who willfully promulgate blatantly incorrect legal interpretations - they're effectively providing incorrect legal advice. A non-attorney is likely to believe such advice coming from a member of the Bar, and the net goal is to discourage the target from seeking further legal advice.
SoftTalker•1h ago
snowwrestler•1h ago
But maybe I am unclear on how Flock works.
bjt•1h ago
What you own is the image copyright. But the right to copy is only one of the rights at issue.
Under various state laws (California in particular), you might not be entitled to do all the things with that picture that you could do of one that doesn't have my likeness. Privacy laws like the CCPA are one possible carve-out. A "right of publicity" is another.
There's an old saying about property law that "property is a bundle of sticks". The bundle can be subdivided.
https://www.law.cornell.edu/wex/publicity
pksebben•1h ago
Like, say I have an interview in your office and you step out for coffee. I take a picture of the applicant list on your desk. That doesn't make the list of applicants "my data".
lotsofpulp•41m ago
If the list is sitting there out in the open, then yes, it does make it your data.
throwway120385•22m ago
The legal system thrives on specifics of a situation, so simply asserting that the list of applicants is or is not "yours" because you can see it seems like a gross oversimplification. The specifics of how you came to be there, what your relationship with the officeholder is, and so on probably matters a lot in that situation and I think there might be some unwritten rules or social norms that you'd be expected to follow as well.
throwway120385•27m ago
goodluckchuck•1h ago
necovek•1h ago
In short, Flock is a "service provider" and not the entity doing the recording.
Perhaps you can make a case that they are a "data broker" instead (https://oag.ca.gov/privacy/ccpa#collapse1i), but that is a separate law, and what you are really looking at is a combination of license plate, time and location being collected as data being collected and sold without your consent.
Obviously, I am not a lawyer (and not even US-based), but I like when privacy is respected :)
zbrozek•1h ago
wakamoleguy•56m ago
To the extent that Flock is only storing the data on behalf of their customers, I'd understand they wouldn't be required to delete it. But to the extent that they are indexing it, deriving from it, aggregating it across customers, and sharing it via their platform, it seems they should be required to remove that data from those services.
But then again, I am not a lawyer!
tgsovlerkhgsel•42m ago
Which, hilariously, means that under the GDPR you only need to contact the web site, and they have to go talk to their 1207 partners that value your privacy to fulfill your request (I'm sure that in practice they'll say "sorry, it's all 'anonymous' so we can't" or "we can't be sure it's you even though you provided the identifier from your cookies"). I'm really disappointed that NOYB hasn't started going after web sites like that - that would quickly put a damper on the whole web surveillance economy.
charcircuit•36m ago
Just because data is about you, that doesn't mean it is your data.
john_strinlai•30m ago
"Personal information is information that identifies, relates to, or could reasonably be linked with you or your household."
and, you do have the rights set forth in the ccpa (know, delete, correct, limit exposure, etc.) regarding that data.
lmkg•34m ago
I have some background in data privacy compliance.
It sounds like they are claiming to be a Service Provider under CCPA, which is similar to a Processor under GDPR. Long story short, a Controller is the one legally responsible for ensuring the rights of the data subject, and a service provider/processor is a "dumb pipe" for a Controller that does what they're told. So IF they are actually a Service Provider, they're correct that the legal responsibility for CCPA belongs to their customers and not them.
That's a big IF, though.
Being a Processor/Service Provider means trade-offs. The data you collect isn't yours, and you're not allowed to benefit from it. If Flock aggregates data from one customer and sells that aggregate to a different customer, they're no longer just a service provider. They're using the data for their own purposes and cannot claim to be "just" a service provider.
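The "big IF" can be phrased as a simple test: the moment an entity uses the data for its own purposes, it stops being a mere service provider. A rough sketch of that logic (a simplification for illustration, not legal advice):

```python
# Simplified sketch of the Service Provider test described above.
# The categories and thresholds are illustrative only.

def role_under_ccpa(uses_data_for_own_purposes: bool,
                    aggregates_across_customers: bool,
                    sells_derived_data: bool) -> str:
    """Classify an entity holding personal data, per the simplified
    rule: any self-interested use forfeits service-provider status."""
    if (uses_data_for_own_purposes
            or aggregates_across_customers
            or sells_derived_data):
        return "business/controller"  # carries the compliance obligations
    return "service provider"         # obligations fall on its customer

# A pure "dumb pipe" custodian:
print(role_under_ccpa(False, False, False))  # service provider
# An entity selling cross-customer aggregates:
print(role_under_ccpa(False, True, True))    # business/controller
```

Under this framing, Flock's ML training on LPR images and any cross-customer sharing features are exactly the facts a court would weigh.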