Wife and I are expecting our third child, and despite my not doing much googling or research into it (we already know a lot from the first two) the algorithms across the board found out somehow. Even my instagram "Explore" tab that I accidentally select every now and then started getting weirdly filled with pictures of pregnant women.
It is what it is at this point. Also I finally got my last settlement check from Equifax, which paid for Chipotle. Yay!
There's clearly quite the active market for this information.
But there should also be punishments for data breaches, or at least compensation for those affected. Then corporations would have an incentive to take their users' privacy more seriously.
Your personal data is basically the currency of the digital world. This is why data about you is collected left, right, and center. It's valuable.
When I trust a bank to safely lock away my grandmother's jewelry, I have to pay for it, but in return, if the bank happens to get broken into and my valuables stolen, at least I'll get (some) compensation.
When I give my valuable data to a company, I have already paid them (with the data themselves), but I have no case whatsoever if they get compromised.
It's worth keeping in mind that this is basically untrue.
In most of these algorithms, there's no "is_expecting: True" field. There are just some strange vectors of mysterious numbers, which can be more or less similar to other vectors of mysterious numbers.
The algorithms have figured out that certain ad vectors are more likely to be clicked if your user vector exhibits some pattern, and that some actions (keywords, purchases, slowing down your scroll speed when you see a particular image) should make your vector go in that direction.
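A toy sketch of that mechanism, with entirely made-up dimensions (real embeddings are learned and not individually interpretable): ads are ranked by how similar their vector is to yours, and a behavioral signal nudges your vector toward the content you lingered on.

```python
import numpy as np

rng = np.random.default_rng(0)

user_vec = rng.normal(size=8)       # learned from clicks, purchases, dwell time
ad_vecs = rng.normal(size=(3, 8))   # one vector per candidate ad

def rank_ads(user, ads):
    """Score ads by cosine similarity to the user vector; best match first."""
    sims = ads @ user / (np.linalg.norm(ads, axis=1) * np.linalg.norm(user))
    return np.argsort(sims)[::-1]

# A signal (e.g. slowing your scroll on a particular image) nudges the
# user vector toward that content's vector; subsequent rankings shift.
signal = ad_vecs[2]
user_after = user_vec + 0.5 * signal / np.linalg.norm(signal)

order_before = rank_ads(user_vec, ad_vecs)
order_after = rank_ads(user_after, ad_vecs)
```

Nowhere in this is an explicit "is_expecting" flag; the system only knows that your vector drifted toward a region where certain ads get clicked.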
Have the government own data collection? Yeah, I don't even know where to start with all the problems this would cause.
Ignore it and let companies keep abusing customers? Nope.
Stop letting class-action lawsuits slap the company's wrists and then give $0.16 payouts to everyone?
What exactly do we do without killing innovation, building moats around incumbents, giving all the power to politicians who will just do what the lobbyists ask (statistically), or accepting things as is?
Otherwise it would suggest you think the problem is that they didn't ask. When was the last time you saw a customer read a terms of service? Or better yet, reject a product because of those terms once they hit that part of the customer journey?
The issue isn't the asking; it's that, for take-your-pick reasons, no one ever says no. The asking is thus pro forma and irrelevant.
Could you delete it after the shortest possible period of use? Do you keep data after someone stops being a customer, or stops actively using the tech?
This could also be drastically improved by the government spearheading a FOSS project for medical data management (archival, backup, etc). A single offering from the US federal government would have a massive return on investment in terms of impact per dollar spent.
Maybe the DOGE staff could finally be put to good use.
I'm aware. I thought we were talking about something a bit higher effort than that.
> online data backup service
That isn't what I said. I suggested federally backed FOSS tooling for the specific use case. If nothing else, that would ensure that low-effort scanners came up empty, by providing purpose-built software hardened against the expected attack vectors. Since it seems we're worrying about the potential for broader system misconfiguration, they could even provide a blessed OS image.
The breach in the article has nothing to do with what we're talking about. That was a case of shadow IT messing up. There's not much you can do about that.
Ask me how many medical practices connect every day via IE on Windows 8.
As all the comments in this thread suggest, the cost of holding an extra record, even an extra breached record, is low. The cost of failing to produce a required medical record is high.
Put this together with dropping storage prices, razor-thin margins, and IT estates made of thousands of specialized point solutions cobbled together with every integration pattern ever invented, and you get a de facto retention of infinity paired with a de jure obligation of could-be-anything-tomorrow.
Yes, some breaches (actual hack attacks) are unavoidable, so you don't slap a fine on every breach. But the vast majority of "breaches" are pure negligence.
That's a terrible argument for allowing our data to be sprayed everywhere. How about regulations with teeth that prohibit "dragons" from hoarding data about us? I do not care what the impact is on the "economy". That ship sailed with the current government in the US.
Or, both more and less likely, cut us in on the revenue. That would at least compensate for some of the time we waste doing a bunch of work every time some company "loses" our data.
I'm tired of subsidizing the wealth and capital class. Pay us for holding our data or make our data toxic.
Obviously my health provider and my bank need my data. But no one else does. And if my bank or health provider need to share my data with a third party it should be anonymized and tokenized.
None of this is hard; we simply lack the will (and most consumers, like voters, are pretty ignorant).
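One hedged sketch of what "tokenized" could mean in practice: direct identifiers are replaced with a keyed HMAC before sharing, so the third party can still join records for the same patient without ever seeing the raw identifier. The key name, MRN, and diagnosis code below are invented placeholders.

```python
import hmac
import hashlib

# Placeholder key; in reality held only by the bank/provider, never shared.
SECRET_KEY = b"example-key-held-only-by-the-data-owner"

def tokenize(identifier: str) -> str:
    """Replace a direct identifier (SSN, MRN, email) with a stable,
    non-reversible keyed token. Same input -> same token, so the
    recipient can link records without learning the identifier."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"patient_id": "MRN-0012345", "diagnosis_code": "O09.90"}
shared = {**record, "patient_id": tokenize(record["patient_id"])}
```

Worth noting: this pseudonymization alone is not anonymization; the quasi-identifiers left in the record (diagnoses, dates, locations) can still narrow someone down, as discussed elsewhere in this thread.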
Or if it's a freebie then it's hidden behind a plain text link 3 levels deep on their website.
33 bits is all that's required to individually identify any person on Earth.
If you'd like to extend that to the 420 billion or so who've lived since 1800, that extends to 39 bits, still a trivially small amount.
Every bit[1] of leaked data cuts that set in half, and simply anonymising IDs does virtually nothing of itself to obscure identity. Such critical medical and billing data as date of birth and postal code are themselves sufficient to narrow things down remarkably, let alone a specific set of diagnoses, procedures, providers, and medications. Much as browser fingerprints are often unique or nearly so without any universal identifier, so too are medical histories.
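The arithmetic behind the "33 bits" figure, plus a rough upper-bound estimate (assuming uniform distributions, which real data aren't) of how many bits just a birth date and a postal code contribute:

```python
import math

world_population = 8.1e9                 # roughly, mid-2020s
bits_needed = math.ceil(math.log2(world_population))   # 33 bits to single anyone out

# Rough entropy of common quasi-identifiers, treating values as uniform.
# Real distributions are skewed, so these are upper bounds on each field,
# but the point stands: a couple of "harmless" fields nearly exhausts 33 bits.
birth_date_bits = math.log2(365 * 80)    # ~14.8 bits over an 80-year range
zip_code_bits = math.log2(40_000)        # ~15.3 bits; ~40k US ZIP codes
sex_bits = math.log2(2)                  # 1 bit

combined = birth_date_bits + zip_code_bits + sex_bits  # ~31 bits
```

This is the same arithmetic behind the famous result that date of birth, ZIP code, and sex alone uniquely identify a large majority of the US population.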
I'm personally aware of diagnostic and procedure codes being used to identify "anonymised" patients across multiple datasets dating to the early 1990s, and of research into de-anonymisation in Australia as of the mid-to-late 1990s. Australia publishes anonymisation and privacy guidelines, e.g.:
"Data De‑identification in Australia: Essential Compliance Guide"
<https://sprintlaw.com.au/articles/data-de-identification-in-...>
"De-identification and the Privacy Act" (2018)
<https://www.oaic.gov.au/privacy/privacy-guidance-for-organis...>
It's not sufficient merely to substitute an alternative primary key; you must also fuzz the data, including birthdates, addresses, diagnostic and procedure codes, treatment dates, etc., all of which both reduces the clinical value of the data and is difficult to do sufficiently.
________________________________
Notes:
1. In the "binary digit" sense, not in the colloquial "small increment" sense.
https://www.cms.gov/priorities/burden-reduction/overview/int...
The governor of Missouri at the time, Mike Parson, called him a hacker and advocated prosecuting him. Fortunately, the prosecutor's office declined to file charges.
So a state employee/contractor (doesn't say) uploaded unaggregated customer records to a mapping website hosted on the public internet?
I think it would be less wrong to say this is where covered entities that discover reportable breaches of PHI (whether their own or that of a BA) that trigger the immediate reporting obligation report them.
This is a narrower scope of coverage and shallower depth of epistemic obligation than you implied.
On top of common OWASP vulnerabilities, the bigger concern is that EHR and provider service apps do not have the robust security practices needed to defend against attacks. They aren't doing active pen testing, red-teaming, supply chain auditing -- all of the recurring and costly practices necessary to ensure asset security.
There are many regulations, HIPAA being the most notable, but their requirements and the audit process are incredibly primitive. They are still using a 1990s threat model. Despite HIPAA audits being expensive, the discoveries are trivial, and they are not recurring, so vulnerabilities can be introduced between the audit itself and the delivery of the audit summary.
This is why I am almost always very reluctant to give out any information that is not absolutely necessary to provide me the service that I need. If they don't know it, they can't leak it.
Every company wants you to fill out their standard form that tries to get you to volunteer way more info than they really need.
A4ET8a8uTh0_v2•1d ago
Anyway, short of collapsing the current data broker system, I am not sure what the answer is. The Experian debacle showed us they are too politically entrenched to be touched by regular means.
At this point, I am going through life assuming most of my data is up for grabs. That is not a healthy way to live though.
hmokiguess•1d ago
At some point, everything that we have ever assumed to be confidential and secure will be exposed and up for grabs.
hmokiguess•1d ago
[1] https://torrentfreak.com/annas-archive-urges-ai-copyright-ov...
stackskipton•1d ago
It will have to be political and it's got to be fines/damages that are business impacting enough for companies to pause and be like A) Is it worth collecting this data and storing it forever? and B) If I don't treat InfoSec as important business function, it could cost me my business.
It's also clear that certification systems do not work, and any law/policy around this should not offer any upside for acquiring them.
EDIT: I also realize in United States, this won't happen.
fc417fc802•17h ago
What the health system should impose is a standard for interoperability. Not an internal network that presents a juicy target.
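Such standards do exist for health data; HL7 FHIR is the widely adopted one, defining documented JSON resource shapes and REST semantics so systems can interoperate without a shared internal network. A minimal, locally constructed sketch of a FHIR R4 `Patient` resource (all values invented):

```python
import json

# Minimal FHIR R4 "Patient" resource. FHIR specifies standard JSON
# structures like this so any conforming system can exchange records
# over documented APIs instead of bespoke point-to-point integrations.
patient = {
    "resourceType": "Patient",
    "id": "example-123",
    "name": [{"family": "Doe", "given": ["Jane"]}],
    "birthDate": "1985-04-12",
}

def is_minimal_patient(resource: dict) -> bool:
    """Cheap structural check: right resourceType and a well-formed name list.
    (Real validation would check against the full FHIR StructureDefinition.)"""
    return (
        resource.get("resourceType") == "Patient"
        and isinstance(resource.get("name"), list)
        and all("family" in n or "given" in n for n in resource["name"])
    )

serialized = json.dumps(patient)
```

The design point: an interoperability standard makes the *format* public while leaving each party to expose only the endpoints it chooses, rather than pooling everything into one juicy target.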
dajtxx•17h ago
I struggle to see how data brokers, social media, etc are a net benefit to society so would be happy to see those sorts of businesses cease to exist, but I suspect I'm in the minority.
miki123211•5h ago
The "social contract" is that many services are fully or partially financed by advertising. Rich people produce more ad revenue (because they spend more), but they get the same quality of service, effectively subsidizing access for the poorer part of the population, who couldn't afford it otherwise.
If we break this social contract down, companies will still try to extract as much revenue as possible, but the only way to do that will be through feature gating, price discrimination, and generally making your life a misery unless you make a lot of money.
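A back-of-envelope version of that cross-subsidy argument, with entirely invented numbers (not real ad-market figures):

```python
# Hypothetical user base and per-user ad revenue, in dollars per year.
users = {"high_income": 1_000, "low_income": 9_000}
ad_revenue_per_user = {"high_income": 50.0, "low_income": 5.0}

total_revenue = sum(users[g] * ad_revenue_per_user[g] for g in users)
total_users = sum(users.values())

# What "free" access earns the service on average, across everyone.
revenue_per_user = total_revenue / total_users

# Low-income users generate less than the average; the gap is what
# high-income users are effectively covering for them.
subsidy_per_low_income_user = revenue_per_user - ad_revenue_per_user["low_income"]
```

Under these made-up numbers, the richer 10% of users cover almost half the average revenue the service needs per head; take advertising away and that gap has to be recovered some other way, e.g. the feature gating and price discrimination described above.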
closeparen•20h ago
I get that data brokers and big tech are a much sexier topic, but this breach - like so many of the most pressing threats to our privacy - is a mundane shortage of competence and giving-a-shit in the IT activities of boring old organizations.
A4ET8a8uTh0_v2•19h ago
<< I get that data brokers and big tech are a much sexier topic, but this breach - like so many of the most pressing threats to our privacy - is a mundane shortage of competence and giving-a-shit in the IT activities of boring old organizations.
I posit that both could be true at the same time.
miki123211•5h ago
I see so many comments about how punishments for data breaches should be increased, but not a single story about quantifiable harm that any of those commenters has suffered from them.
rationalist•15h ago
Each of the big three credit bureaus offers a free account where they email me if something changes and allow me to freeze and thaw my credit.