«This Regulation applies to the processing of personal data of data subjects who are in the Union by a controller or processor not established in the Union, where the processing activities are related to: (a) the offering of goods or services, irrespective of whether a payment of the data subject is required, to such data subjects in the Union;»
So the GDPR applies. The EU can of course sanction violators outside, but given the current political climate it is likely to be more difficult than before.
No, they can't. That passage only says where the law applies; it says nothing about prosecuting entities that don't reside in the EU. The EU's legal arm can't extend beyond the EU's borders, and short of an outright military invasion it's toothless against foreign entities.
>So the GDPR applies.
I never said it doesn't apply. I asked how the EU is going to prosecute an entity that doesn't reside in the EU.
Extraditions are tough to negotiate even for serious crimes like murder. No foreign judge is going to take GDPR violations seriously enough to grant one.
In this case 23andMe is on the Data Privacy Framework list, so they have volunteered to follow the GDPR while still being based in the US. This is basically the same as a number of other GDPR cases, including the GDPR fine against Clearview AI. Fining 23andMe if they violate GDPR should be trivial in that case.
We generally don't need to worry about EU judgments against US companies being enforced, or vice versa. It doesn't get far enough that we need to think about how it's going to be enforced as a practical matter.
Even if they are, 6.5 million in fines is chump change.
The fine is revenue adjusted.
Fines are scaled according to revenue. That's the reason for the monumental fines leveled at Google and Facebook. They don't mess around.
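For reference, a minimal sketch of how that revenue scaling caps out under GDPR Article 83(5), which sets the ceiling at the higher of EUR 20 million or 4% of worldwide annual turnover (the turnover figure below is illustrative, not any real company's):

```python
# GDPR Article 83(5) fine ceiling: the higher of EUR 20 million
# or 4% of total worldwide annual turnover of the preceding year.
def gdpr_max_fine(annual_turnover_eur: float) -> float:
    return max(20_000_000, 0.04 * annual_turnover_eur)

# A small company is capped at the EUR 20M floor; a giant with
# EUR 250 billion turnover faces a EUR 10 billion ceiling.
print(gdpr_max_fine(5e6))    # 20000000
print(gdpr_max_fine(250e9))  # 10000000000.0
```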
I believe Grindr's fine was just based on their revenue from Norwegian users. Probably plenty hurtful enough to make them implement data protection in the jurisdictions which demand it.
All the EU lawsuits against tech companies outside the EU have been carried out without any military invasion required. You are delusional if you think the USA is going to say, “Google won’t pay the fines; invade us if you want the money.”
Why don't you read my comment thoroughly before accusing people of being delusional? Google is registered in Europe (HQ in Ireland, IIRC) and does business in Europe as a European company, so there is a physical, legal, tax-paying entity the EU can fine, and even European management they can send to jail if it doesn't comply, the same way they did with VW over the diesel scandal.
If you had read my comment thoroughly (hard ask, I know), you would have seen I'm specifically asking how the EU would fine foreign companies that hold EU citizens' data but have no HQ or any legal entity in Europe that EU law enforcement can reach in case of GDPR violations.
>On April 18, 2018, a DNA sample was surreptitiously collected from the door handle of DeAngelo's car;[64] another sample was later collected from a tissue found in DeAngelo's curbside garbage can.[185]
DNA serves as the guilt-by-association lead; from there they go gather the evidence.
https://en.wikipedia.org/wiki/Joseph_James_DeAngelo#Arrest,_...
As far as re-making you goes, I think people like Thiel are into transhumanism, so probably to help build the ultimate Borg.
I'm currently in the process of figuring out where my most distant known paternal-line ancestor came from. Took a Y-DNA test, found a very distant all-male line cousin who was open to taking a Y DNA test (because he had already taken one at 23andMe, I knew the odds were good that he would be OK with it - the 23andMe test also had enough Y information that I knew it was likely we both descended from the same man).
His first results came in yesterday. In another couple of weeks, when his Y SNPs come in, we'll know which of my ~10 private mutations (ones no one else has been found to share) we have in common, which will in turn put better time estimates on our distance to other testers in our part of the Y-tree.
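A hedged sketch of the kind of arithmetic involved, assuming a simplified constant-rate model (the ~83 years per SNP figure is my illustrative assumption; real services use calibrated rates, coverage adjustments, and confidence intervals):

```python
# Toy TMRCA (time to most recent common ancestor) estimate from Y-SNP counts.
# ASSUMPTION: roughly one SNP per ~83 years on each line; real estimates
# are far more careful than this.
YEARS_PER_SNP = 83

def tmrca_estimate(private_snps_a: int, private_snps_b: int) -> float:
    """Average the two branch lengths back to the shared ancestor."""
    return YEARS_PER_SNP * (private_snps_a + private_snps_b) / 2

# If the cousin turns out to share 4 of my ~10 private SNPs, I keep 6
# private ones; say he has 7 of his own. Our common ancestor is very roughly:
print(tmrca_estimate(6, 7))  # ~540 years ago
```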
I did the 23andMe thing years ago (foolishly or not, I found it interesting), and everything on there, both at the time and when I checked recently, said I had no markers for anything. They are very proactive in reminding you, all over the app/website and for each condition they track, that they only check a few markers, so the results are not really that predictive of anything, and that practicing a healthy lifestyle (whatever your genetic markers) is the most important thing to do.
So, I suppose, if everyone could use the data, everyone could use it to develop drugs, not just 23andMe. That’s good if you want more drugs to be developed and released… bad if you don’t.
Note: I'm not saying you should not delete your DNA. Do what you want, of course. I'm just saying that, for me, when I signed up I fully expected my DNA to be used for research. The fact that I got interesting graphs and some health info was just a bonus.
Is the best way to hasten the next bankruptcy to not delete your data?
The obvious solution is legislation mandating transparency, plus a better health care system.
s/insured/hired
Wait until we have DNA detectors wired up to collect the DNA we exhale and rapid sequencers that handle what might be below the limit of detection today.
Maybe that's fifty years down the road, but it's coming.
Gattaca was prescient; it was just a hundred years ahead of its time.
1. Why take the chance?
2. Your DNA being out in the wild also impacts the privacy of your relatives (including those you might not know, and those who don't exist yet (a child, a nephew, a niece for instance)), so if not for you, do it for them.
It won't be a guarantee, but it maximizes the chances.
I do think that the health stuff from 23andMe is only marginally better than astrology, and that the ethnicity estimates are inferior to what most people can get from good old-fashioned genealogy, but the matching may be useful if you place a lot of value on knowing who you're related to.
But indeed, that's me thinking that you should care a lot more about your privacy and the privacy of your relatives than anything 23andMe provides.
I mean, I think I would find it fun to discover relatives, but not at this cost.
While some DNA characteristics can be statistically linked with some costly health conditions, the connection to "being a good hire" seems totally imaginary to me, has always been and will always be.
For what it's worth, public posts and comments on internet are probably a much better indicator of whether someone is going to be an obedient employee, and this dystopia is technically doable right now, and certainly many are working on it already.
Sure it is.
Easy case in point: count the chromosomes. The wrong number alters gene dosage, leading to disorders such as Down syndrome.
Neuroticism amongst many other personality traits is heavily genetically linked, obesity and cardiovascular health are genetically linked, ...
Lots of things an employer or insurer would be interested in if our laws do not protect us. Some may eventually be willing to skirt the law if it gives them an edge and the penalties don't outweigh the benefits.
We're not there yet, but give it 25-50 years.
And as for DNA being used baselessly to make decisions: sure, just as graphology or astrology could be used... but then concealing your DNA is like concealing your handwriting or your exact date of birth. Why not, but at that point we are no longer talking about privacy protection; we are talking about fighting superstition.
Good luck, Blue Cross.
If these companies have the legal clearance to use DNA data, why would they be satisfied only having secondhand access to that data for a relatively small subset of the population? They'll obviously want that data for everyone.
Makes perfect sense to me, to be honest. Outside of certain genetic diseases, the behavioral aspect is what ultimately matters. And AFAIK, in the US a large proportion of the population dies from behavioral causes (I would count obesity rates as an outcome of that).
A simple example: yes, I would say I am genetically predisposed to have addictive habits, suffer from liver failure due to alcohol+drugs, and/or deal with lung problems from smoking (given that this was the fate of multiple males up my ancestry tree from both sides of the family). Naturally, I believe that it is more valuable for an insurance company to know whether I actually engage in those harmful behaviors vs. my genes dictating the likelihood of me engaging in those behaviors. In the absence of the former, the latter could be pretty useful, but otherwise there is no competition at all.
Sure, genes can give you a probability of those behavioral aspects occurring, but going for the primary metric you are looking for vs. a weak predictor of it clearly makes way more sense.
‘You said on your application you don’t smoke, but data dump X shows you do. Why?’ Or they just deny your claim for that reason and make you fight it.
But what about the reverse? There's something intuitively unjust about the customer not knowing why they're being charged a higher rate, especially if it means the company believes there's a potential danger (enough that it affects the bottom-line) but conceals it.
So yeah, I think "transparency" is a robust principle to follow here, especially if anyone is arguing market competition is going to curb the worst abuses.
Except of course in the US, with scammy hospitals charging over-inflated prices to uninsured patients, while only insurers can access realistic prices.
For example, suppose you can accurately determine that someone has exactly a 5% chance of developing a problem that will cost exactly $X to treat. Find 10,000 of those people, and insurance is still useful for spreading the risk and financing the treatments.
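A quick simulation of that pooling argument, with hypothetical numbers (5% risk, $100,000 treatment cost, 10,000 insureds): individual outcomes are all-or-nothing, but the pooled average cost per member hovers near the fair premium.

```python
import random

# Hypothetical numbers: each insured has a known 5% chance of a
# condition that costs $100,000 to treat.
P_RISK, COST, N = 0.05, 100_000, 10_000
expected_per_person = P_RISK * COST  # $5,000: the fair premium before loading

# Simulate one year for the pool. Each individual pays $0 or $100k;
# the pooled average stays close to the expected value.
claims = sum(COST for _ in range(N) if random.random() < P_RISK)
print(f"fair premium:  ${expected_per_person:,.0f}")
print(f"actual cost per member this year: ${claims / N:,.2f}")
```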
Also, legislation requires health insurance to pay for all healthcare, so there is no point paying to insure against specific ailments.
You might have supplemental insurance that pays you in case of a specific illness, but that is not what is commonly referred to as health insurance.
Can you provide a link to a business selling the type of policy you are referring to? I am curious what these look like.
Deliberately extracting personal data into un-audited environments without good reason (e.g. printing a shipping label) should be punished with GDPR-style, global-turnover-based penalties and jail for those responsible.
There already are, but only for Europeans through the GDPR.
How it should be is that personal data's current and historical disposition is always available to the person in question.
If that's a problem for the company processing the data (other than being fiddly to implement at first), that sounds like the company is up to some shady shit that they want to keep quiet about.
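One way to picture that requirement, as a minimal sketch (the record type and field names here are my own invention, not any standard): an append-only access log that the data subject can query at any time.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical sketch: an append-only record of every touch on a
# subject's PII, queryable by the data subject. Fields are illustrative.
@dataclass(frozen=True)
class PiiAccessEvent:
    subject_id: str
    actor: str      # service or employee that touched the data
    purpose: str    # e.g. "print shipping label"
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

log: list[PiiAccessEvent] = []

def record_access(subject_id: str, actor: str, purpose: str) -> None:
    log.append(PiiAccessEvent(subject_id, actor, purpose))

def history_for(subject_id: str) -> list[PiiAccessEvent]:
    """What the data subject sees when they ask."""
    return [e for e in log if e.subject_id == subject_id]

record_access("subject-42", "fulfilment-service", "print shipping label")
print(history_for("subject-42"))
```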
Nothing to hide, nothing to fear should apply here, and companies should be fucking terrified, with an existential dread of screwing up their data handling, always looking for ways to avoid handling PII at all. The analogy of PII as radioactive material is a good one: you need excellent processes and excellent reasons to be doing it in the first place, you must show you can do it safely and securely, and if you fuck up, you'd better hope your process documentation is top tier or you'll be in the dock. Or, better, decide that you can make do by handling the nuclear material only in some safer form, like encapsulated vitrified blocks, for most of your processes.
The data processing industry has repeatedly demonstrated that they cannot be trusted, and so they should reap the whirlwind.
Fully agree with what you are saying, and my popcorn is ready for August, when the penalties part of the AI Act comes into force. There is a two-year grace period for certain systems already on the market, but any new model introduced after August this year has to be compliant. AI Act + GDPR will be a great show to watch...
How are scientists able to work on encrypted genomes?
That would not be very comforting; it would mean that even in encrypted form we can learn things from the data, which kind of defeats the purpose, unless I'm missing something.
That being said, scientists can implement their own protocols, and use whatever technique they want. For example: https://github.com/securegenomics/protocol-alzheimers-sensit....
The point is that our platform makes federated computing + homomorphic encryption analysis easy, while keeping the protocols customizable.
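To illustrate the homomorphic encryption piece in the abstract (this is not the securegenomics protocol, just a generic sketch using the python-paillier library, assuming it is installed): an untrusted aggregator can sum encrypted allele counts without ever seeing any plaintext.

```python
from phe import paillier  # pip install phe (python-paillier)

# Generic additively homomorphic sketch, NOT the securegenomics protocol:
# participants encrypt their genotype dosages; the aggregator sums
# ciphertexts; only the key holder can decrypt the total.
public_key, private_key = paillier.generate_paillier_keypair()

allele_counts = [0, 1, 2, 1, 0, 2]  # per-participant dosages (illustrative)
encrypted = [public_key.encrypt(c) for c in allele_counts]

# The aggregator works on ciphertexts only.
encrypted_total = sum(encrypted[1:], encrypted[0])

print(private_key.decrypt(encrypted_total))  # 6, no individual value exposed
```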
It seems like you could do lots of useful things without having a name attached to any particular sample. There must be some kind of differential privacy approach here that would work well.
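A minimal sketch of that differential privacy idea, the Laplace mechanism applied to a simple count query (the epsilon value and the query itself are illustrative):

```python
import numpy as np

# Laplace mechanism: release a noisy count so that no single sample's
# presence or absence is revealed. Epsilon here is illustrative.
def dp_count(true_count: int, epsilon: float = 1.0,
             sensitivity: float = 1.0) -> float:
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

# e.g. "how many samples carry variant V?" answered without tying
# the answer to any one (even pseudonymous) sample.
print(dp_count(true_count=1234))
```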