
France's homegrown open source online office suite

https://github.com/suitenumerique
207•nar001•2h ago•110 comments

Start all of your commands with a comma (2009)

https://rhodesmill.org/brandon/2009/commands-with-comma/
374•theblazehen•2d ago•134 comments

Hoot: Scheme on WebAssembly

https://www.spritely.institute/hoot/
65•AlexeyBrin•3h ago•12 comments

Reinforcement Learning from Human Feedback

https://arxiv.org/abs/2504.12501
40•onurkanbkrc•3h ago•2 comments

OpenCiv3: Open-source, cross-platform reimagining of Civilization III

https://openciv3.org/
749•klaussilveira•18h ago•234 comments

Coding agents have replaced every framework I used

https://blog.alaindichiappari.dev/p/software-engineering-is-back
108•alainrk•2h ago•117 comments

The Waymo World Model

https://waymo.com/blog/2026/02/the-waymo-world-model-a-new-frontier-for-autonomous-driving-simula...
1002•xnx•23h ago•569 comments

Show HN: One-click AI employee with its own cloud desktop

https://cloudbot-ai.com
7•fainir•1h ago•2 comments

First Proof

https://arxiv.org/abs/2602.05192
11•samasblack•32m ago•5 comments

Stories from 25 Years of Software Development

https://susam.net/twenty-five-years-of-computing.html
6•vinhnx•1h ago•1 comment

Vocal Guide – belt sing without killing yourself

https://jesperordrup.github.io/vocal-guide/
132•jesperordrup•8h ago•55 comments

Unseen Footage of Atari Battlezone Arcade Cabinet Production

https://arcadeblogger.com/2026/02/02/unseen-footage-of-atari-battlezone-cabinet-production/
91•videotopia•4d ago•20 comments

Ga68, a GNU Algol 68 Compiler

https://fosdem.org/2026/schedule/event/PEXRTN-ga68-intro/
30•matt_d•4d ago•6 comments

Making geo joins faster with H3 indexes

https://floedb.ai/blog/how-we-made-geo-joins-400-faster-with-h3-indexes
148•matheusalmeida•2d ago•40 comments

Reputation Scores for GitHub Accounts

https://shkspr.mobi/blog/2026/02/reputation-scores-for-github-accounts/
6•edent•2h ago•0 comments

Show HN: Look Ma, No Linux: Shell, App Installer, Vi, Cc on ESP32-S3 / BreezyBox

https://github.com/valdanylchuk/breezydemo
253•isitcontent•18h ago•27 comments

Monty: A minimal, secure Python interpreter written in Rust for use by AI

https://github.com/pydantic/monty
266•dmpetrov•18h ago•142 comments

A Fresh Look at IBM 3270 Information Display System

https://www.rs-online.com/designspark/a-fresh-look-at-ibm-3270-information-display-system
6•rbanffy•3d ago•0 comments

Show HN: Kappal – CLI to Run Docker Compose YML on Kubernetes for Local Dev

https://github.com/sandys/kappal
10•sandGorgon•2d ago•2 comments

Hackers (1995) Animated Experience

https://hackers-1995.vercel.app/
531•todsacerdoti•1d ago•258 comments

Sheldon Brown's Bicycle Technical Info

https://www.sheldonbrown.com/
409•ostacke•1d ago•105 comments

Show HN: I spent 4 years building a UI design tool with only the features I use

https://vecti.com
353•vecti•20h ago•159 comments

Show HN: If you lose your memory, how to regain access to your computer?

https://eljojo.github.io/rememory/
321•eljojo•21h ago•198 comments

An Update on Heroku

https://www.heroku.com/blog/an-update-on-heroku/
448•lstoll•1d ago•296 comments

What Is Ruliology?

https://writings.stephenwolfram.com/2026/01/what-is-ruliology/
54•helloplanets•4d ago•54 comments

Cross-Region MSK Replication: K2K vs. MirrorMaker2

https://medium.com/lensesio/cross-region-msk-replication-a-comprehensive-performance-comparison-o...
6•andmarios•4d ago•1 comment

Microsoft open-sources LiteBox, a security-focused library OS

https://github.com/microsoft/litebox
365•aktau•1d ago•190 comments

How to effectively write quality code with AI

https://heidenstedt.org/posts/2026/how-to-effectively-write-quality-code-with-ai/
292•i5heu•21h ago•246 comments

Dark Alley Mathematics

https://blog.szczepan.org/blog/three-points/
103•quibono•5d ago•30 comments

Female Asian Elephant Calf Born at the Smithsonian National Zoo

https://www.si.edu/newsdesk/releases/female-asian-elephant-calf-born-smithsonians-national-zoo-an...
53•gmays•13h ago•22 comments

23andMe is out of bankruptcy and it still hasn’t substantially changed its ways

https://www.washingtonpost.com/technology/2025/07/17/23andme-bankruptcy-privacy/
111•1vuio0pswjnm7•6mo ago

Comments

jmole•6mo ago
Is there any intrinsic value whatsoever in the DNA or SNPs themselves? Or is it just the link between your name and your DNA that is so concerning?

It seems like you could do lots of useful things without having a name attached to any particular sample. There must be some kind of differential privacy approach here that would work well.

hirvi74•6mo ago
Can one even truly delete their DNA from 23andMe? Wouldn't deleting someone's DNA require deleting not only the existing record, but also the records in all historical backups? And what is to say 23andMe doesn't just flip a bit in its database and claim one's DNA is (soft) deleted?
vintermann•6mo ago
Well, GDPR compliance is one thing. It could be very expensive if you have European customers and get caught doing this.
FirmwareBurner•6mo ago
If they don't have any offices or assets in Europe, then the EU can't do anything against them for breaking GDPR.
chha•6mo ago
Sure they can. GDPR article 2 (2) says:

«This Regulation applies to the processing of personal data of data subjects who are in the Union by a controller or processor not established in the Union, where the processing activities are related to: (a) the offering of goods or services, irrespective of whether a payment of the data subject is required, to such data subjects in the Union;»

So the GDPR applies. The EU can of course sanction violators outside, but given the current political climate it is likely to be more difficult than before.

FirmwareBurner•6mo ago
>Sure they can. GDPR article 2 (2) says:

No they can't. That only says where the law applies; it says nothing about prosecuting entities not residing in the EU. The EU's legal arm can't extend outside the borders of the EU; without an outright military invasion, it's toothless against foreign entities.

>So the GDPR applies.

I never said it doesn't apply. I said how is the EU gonna prosecute an entity that doesn't reside in the EU?

Extraditions are tough to negotiate even for serious stuff like murder. No foreign judge will take GDPR violations seriously to do that.

chha•6mo ago
Sorry for that, guess I'm just too used to the common misconception some people have that the GDPR doesn't apply if a company isn't established in the EU.

In this case 23andMe is on the Data Privacy Framework list, so they have volunteered to follow the GDPR while still being based in the US. This is basically the same as a number of other GDPR cases, including the GDPR fine against Clearview AI. Fining 23andMe if they violate GDPR should be trivial in that case.

vintermann•6mo ago
EU will presumably stop you from doing business in the EU, if you break EU laws and ignore judgments. Companies don't want that. Grindr LLC, a US company with no EU corporate presence as far as I know, was fined 6.5 million for breaching GDPR by an Oslo Court, upheld in appeals. They paid that fine. If they didn't, they'd probably be kicked out of the app stores from Norway (if not from all of EU). Apple and Google do have an EU presence. Even if they didn't, they wouldn't start a war over 23andMe.

We generally don't need to worry about EU judgments against US companies being enforced, or vice versa. It doesn't get far enough that we need to think about how it's going to be enforced as a practical matter.

culopatin•6mo ago
You really think they are getting access to their data and looking for some German dude’s data in every server they have? At most 23andMe gets some 250 item questionnaire that some poor soul with a red stapler in a basement has to answer, some middle manager puts a signature on it, sends it saying “yeah we good” and that’s it. Unless there is enough noise for someone to sue, no one is looking at that shit hard enough.

Even if they are, 6.5 million in fines is chump change.

Novosell•6mo ago
Meta has been fined €1.2b for breaching GDPR, though it seems to still be appealing.

The fine is revenue adjusted.

vintermann•6mo ago
Filling out some form won't help if someone finds out they're not doing what they said they will do. The data protection agencies will sue if they find that out. And they did find out that Grindr was selling data it wasn't allowed to.

Fines are scaled according to revenue. That's the reason for the monumental fines leveled at Google and Facebook. They don't mess around.

I believe Grindr's fine was just based on their revenue from Norwegian users. Probably plenty hurtful enough to make them implement data protection in the jurisdictions which demand it.

chha•6mo ago
Fines are based on a number of things: the type of data (PII or Article-9 PII), the number of people affected, the amount of data, previous violations, and as far as I know also the country issuing the fine. 6.5M might be a small fine by some standards, but if fined and nothing improves, the fine is likely to be a lot higher the next time around.
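
For what it's worth, the headline cap for the most serious infringements (Article 83(5) GDPR) is EUR 20 million or 4% of total worldwide annual turnover, whichever is higher; the factors above then determine where below that cap a fine lands. A tiny illustration of how the cap scales with revenue:

    def gdpr_max_fine(annual_worldwide_turnover_eur: float) -> float:
        # Art. 83(5) GDPR: up to EUR 20 million or 4% of total worldwide annual
        # turnover of the preceding financial year, whichever is higher.
        return max(20_000_000, 0.04 * annual_worldwide_turnover_eur)

    print(gdpr_max_fine(50_000_000))        # flat EUR 20M cap dominates for small firms
    print(gdpr_max_fine(120_000_000_000))   # 4% of turnover (EUR 4.8B) dominates for giants
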
florbnit•6mo ago
> The EU's legal arm can't extend outside the borders of the EU; without an outright military invasion

All the lawsuits from the EU against tech companies outside the EU have been carried out without any military invasions required. You are delusional if you think USA is going to go “Google won’t pay the fines, invade us if you want the money”

FirmwareBurner•6mo ago
>You are delusional if you think USA is going to go “Google won’t pay the fines, invade us if you want the money”

Why don't you read my comment thoroughly before accusing people of being delusional? Google is registered in Europe (HQ in Ireland, IIRC) and does business in Europe as a European company, so there's a physical, legal, tax-paying entity the EU can fine, and even European management they can send to jail in case they don't comply, the same way they did with VW in the diesel scandal.

If you had read my comment thoroughly (hard ask, I know), you would have seen I'm specifically asking how the EU would fine foreign companies that have EU citizens' data but have no HQ or any legal entity in Europe that can be reached by EU law enforcement in case of GDPR violations.

UltraSane•6mo ago
The only truly reliable way to do this is to have a per-customer encryption key used to encrypt ALL data for that customer. Then you can simply delete the key to delete the customer data.
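
A minimal sketch of that "crypto-shredding" idea, assuming Python's `cryptography` package (illustrative only, not 23andMe's actual setup):

    from cryptography.fernet import Fernet

    # One data-encryption key per customer; only ciphertext ever hits storage/backups.
    customer_keys = {}      # in practice a KMS/HSM, not an in-memory dict
    encrypted_store = {}

    def store(customer_id: str, genome_blob: bytes) -> None:
        key = customer_keys.setdefault(customer_id, Fernet.generate_key())
        encrypted_store[customer_id] = Fernet(key).encrypt(genome_blob)

    def read(customer_id: str) -> bytes:
        return Fernet(customer_keys[customer_id]).decrypt(encrypted_store[customer_id])

    def forget(customer_id: str) -> None:
        # "Deleting" = destroying the key. Every copy of the ciphertext,
        # including old backups, becomes permanently unreadable.
        del customer_keys[customer_id]

    store("alice", b"rs53576:AA;rs1815739:CT;...")
    forget("alice")
    # read("alice") now fails; even with the ciphertext, decryption is infeasible.

The catch, of course, is that the keys themselves must live somewhere that isn't swept into the same backups.
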
seanvelasco•6mo ago
i'm curious, what's the worst-case scenario if one's whole DNA data were exposed publicly? would a future civilization re-make your image? or are there societal benefits?
wslh•6mo ago
Clearly, one risk is that your DNA, or that of a close relative, could be linked to a crime, even if you weren't directly involved.
hammyhavoc•6mo ago
DNA isn’t guilt by association. Cops still need real evidence—this isn’t CSI fan fiction.
a_bonobo•6mo ago
>Identification of DeAngelo began in December 2017 when officials, led by detective Paul Holes and FBI lawyer Steve Kramer, uploaded the killer's DNA profile from a Ventura County rape kit to the personal genomics website GEDmatch.[182] The website identified 10 to 20 people who had the same great-great-great-grandparents as the Golden State Killer; a team of five investigators working with genealogist Barbara Rae-Venter used this list to construct a large family tree.[183] From this tree, they established two suspects; one was ruled out by a relative's DNA test, leaving DeAngelo the main suspect.[184]

>On April 18, 2018, a DNA sample was surreptitiously collected from the door handle of DeAngelo's car;[64] another sample was later collected from a tissue found in DeAngelo's curbside garbage can.[185]

DNA is the guilt-by-association; they then go and get the evidence.

https://en.wikipedia.org/wiki/Joseph_James_DeAngelo#Arrest,_...

hammyhavoc•6mo ago
You're describing the process working—DNA gave investigators a lead, which they followed up with legally obtained, direct evidence tying DeAngelo to the crime. That’s not guilt by association; that’s good policing. Framing that as something sinister ignores the outcome: a serial rapist and murderer finally faced justice. That’s not dystopia—it’s what society should be doing—or would you rather a rapist, kidnapper and murderer walked freely?
kstrauser•6mo ago
For example, long-term care and disability insurers aren't blocked at the federal level from discriminating based on DNA. If they suspect you might get bad arthritis some day, they can block you from insurance (barring state laws saying otherwise).
vpShane•6mo ago
The worst case scenario would have already happened: your privacy being violated, and your info being in the hands of anybody or anything that wants to use it against you.

As far as re-making you goes, I think those like Thiel and others are into transhumanism, so probably to help make the ultimate Borg.

chneu•6mo ago
what's the best case scenario?
vintermann•6mo ago
Best case scenario is that you can answer a ton of historical questions that you wouldn't have a chance to otherwise.

I'm currently in the process of figuring out where my most distant known paternal-line ancestor came from. Took a Y-DNA test, found a very distant all-male line cousin who was open to taking a Y DNA test (because he had already taken one at 23andMe, I knew the odds were good that he would be OK with it - the 23andMe test also had enough Y information that I knew it was likely we both descended from the same man).

His first results came in yesterday. In another couple of weeks, when his Y SNPs come in, we'll know which of my ~10 private mutations (ones no one else has been found to share) we have in common, which will in turn put better time estimates on our distance to other testers in our part of the Y-tree.
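
For readers unfamiliar with the arithmetic: time-to-common-ancestor estimates from Y-SNPs are roughly linear in the number of mutations separating two testers. A rough sketch, assuming an illustrative ballpark of ~83 years per SNP for high-coverage Y tests (the figure varies by test, and this is not the commenter's exact method):

    YEARS_PER_SNP = 83  # illustrative average; depends on test coverage

    def tmrca_years(private_snps_a: int, private_snps_b: int) -> float:
        # Average the two branch lengths: each man's SNPs accumulated
        # since the shared paternal ancestor.
        return YEARS_PER_SNP * (private_snps_a + private_snps_b) / 2

    # e.g. if, after comparison, 6 of the ~10 private SNPs turn out to be shared,
    # each tester carries ~4 truly private ones:
    print(tmrca_years(4, 4))  # ~332 years back to the shared paternal ancestor
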

newman8r•6mo ago
Best case scenario is it's used to identify unknown remains, or help locate a murder suspect who's probably distantly related to you (i.e. a 5th cousin you never met). More of these samples makes it harder for serial killers to stay hidden for too long.
sjw987•6mo ago
All your DNA markers show that you have no markers for any adverse genetic conditions, and you could therefore get lower insurance rates if one of the worst-case scenarios occurs (them selling data to health insurers).

I did the 23andme thing years ago (foolishly or not, I found it interesting), and everything on there at the time, and when I checked recently, said I had no markers for anything. They are very proactive in reminding you, all over the app/website and for each condition they track, that they only track a few markers, so the results are obviously not really that predictive of anything, and that practicing a healthy lifestyle (whatever your genetic markers) is the most important thing to do.

ashdksnndck•6mo ago
23andMe’s new business model (Anne Wojcicki and Regeneron both had the same plan for it) is to use the data for drug development. That’s theoretically where the $$$$ is. Turns out you can use statistical association between genes and other biomarkers to discover drugs that will succeed in clinical trials.

So, I suppose, if everyone could use the data, everyone could use it to develop drugs, not just 23andMe. That’s good if you want more drugs to be developed and released… bad if you don’t.
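
The statistical core of that approach is simple case/control association testing: does a variant occur more often in people with the biomarker or condition than in people without it? A toy sketch assuming scipy is available (illustrative, not 23andMe's or Regeneron's actual pipeline):

    from scipy.stats import chi2_contingency

    # Counts at one variant: rows = carriers / non-carriers of the risk allele,
    # columns = affected / unaffected. Made-up numbers for illustration.
    table = [[412, 1588],    # carriers:     412 affected, 1588 unaffected
             [268, 1732]]    # non-carriers: 268 affected, 1732 unaffected

    chi2, p_value, dof, expected = chi2_contingency(table)
    print(f"chi2={chi2:.1f}, p={p_value:.2e}")
    # A genome-wide scan repeats this at hundreds of thousands of sites, keeps hits
    # that survive multiple-testing correction, and treats the genes behind robust
    # hits as drug-target candidates.
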

beaugunderson•6mo ago
someone could synthesize your DNA and leave it at a crime scene, to use an example popularized by the consent form of the Harvard Personal Genome Project.
socalgal2•6mo ago
I signed up for 23andme to specifically make my DNA available. I believe that more DNA = more cures

Note: I'm not saying you should not delete your DNA. Do what you want, of course. I'm just saying that I, personally, signed up fully expecting my DNA to be used to conduct research. The fact that I could get interesting graphs and some health info was just a bonus for me.

PaulHoule•6mo ago
When people say "data is the new oil", I think https://en.wikipedia.org/wiki/Exxon_Valdez_oil_spill

Is the best way to hasten the next bankruptcy to not delete your data?

ulfw•6mo ago
Deleted it years ago. Don't trust any of them to actually have deleted it.
ks2048•6mo ago
I do worry about my data with them, but when I think about the worst-case scenario - you will not get insured (or have high rates) because you have {some genetic condition}.. it seems just as likely that they will simply require my DNA to apply for insurance. (or get my DNA from a blood test within their system, etc.).

The obvious solution is legislation for transparency and a better health care system.

echelon•6mo ago
> you will not get insured (or have high rates) because you have {some genetic condition}..

s/insured/hired

Wait until we have DNA detectors wired up to collect the DNA we exhale and rapid sequencers that handle what might be below the limit of detection today.

Maybe that's fifty years down the road, but it's coming.

Gattaca was a prescient premonition, it was just a hundred years ahead of its time.

ks2048•6mo ago
Yes, I can imagine those dystopias - my point was that I don't imagine my choice to try 23andMe in 2019 is what dooms me - while others are saved by not making that choice.
jraph•6mo ago
I kind of understand your point, but my view would be that this should not be a reason to skip deleting the data from them, in case deleting actually helps:

1. Why take the chance?

2. Your DNA being out in the wild also impacts the privacy of your relatives (including those you might not know, and those who don't exist yet (a child, a nephew, a niece for instance)), so if not for you, do it for them.

It won't be a guarantee, but it maximizes the chances.

vintermann•6mo ago
Why take the chance? Because genealogical data is valuable to you, of course. If it isn't, no amount of legal or technical security will make it worth it.

I do think that the health stuff from 23andMe is only marginally better than astrology, that the ethnicity estimates are inferior to what most people can get from good old fashioned genealogy, but that the matching may be useful, if you value knowing who you're related to a lot.

jraph•6mo ago
OP said "I do worry about my data with them", it seems to be the best they could do to try resolve this would be to delete the data.

But indeed, that's me thinking that you should care a lot more about your privacy and the privacy of your relatives than anything 23andMe provides.

I mean, I think I would find it fun to discover relatives, but not at this cost.

hammyhavoc•6mo ago
What makes you believe this is coming? What evidence points to this inevitability?
rixed•6mo ago
Either you or I am deeply wrong about how genotypes relate to phenotypes.

While some DNA characteristics can be statistically linked with some costly health conditions, the connection to "being a good hire" seems totally imaginary to me, always has been, and always will be.

For what it's worth, public posts and comments on the internet are probably a much better indicator of whether someone is going to be an obedient employee, and that dystopia is technically doable right now; certainly many are working on it already.

lnsru•6mo ago
DNA samples from the best employees will be collected, evaluated, and compared to applicants'. So if your DNA is similar, they will let you in. Wait for a clever startup to offer a complete solution for this comparison soon.
rixed•6mo ago
I'm not disputing that it would be doable if it worked, or that some startups will be up for it. I'm saying that DNA is not a predictor of being a "good employee", and if you believe otherwise I'd be interested to know why.
echelon•6mo ago
> I'm saying that DNA is not a predictor of being a "good employee"

Sure it is.

Easy case in point: count the chromosomes. The wrong number alters gene dosage, leading to conditions such as Down syndrome.

Neuroticism, among many other personality traits, is heavily genetically linked; obesity and cardiovascular health are genetically linked; ...

Lots of things an employer or insurer would be interested in if our laws do not protect us. Some may eventually be willing to skirt the law if it gives them an edge and the penalties don't outweigh the benefits.

We're not there yet, but give it 25-50 years.

rixed•6mo ago
Come on, I don't need a DNA analysis to tell if someone has Down syndrome or is obese. Again, a Google search teaches me much more about a candidate than his or her DNA.

And as for DNA being used baselessly to make decisions, sure, just like graphology or astrology could be used... but then concealing DNA is like concealing your handwriting or exact date of birth. Why not, but then we are no longer talking about privacy protection, we are talking about fighting superstition.

lnsru•6mo ago
But good employees have DNA that can be analyzed and compared to applicants'. I am not saying there is science behind it. I am also not saying that the "good" employees are the really good ones rather than just the good-looking ones.
sidewndr46•6mo ago
Whether or not the connection exists is not the question. The question is whether or not someone will make decisions based on that limited information.
barbazoo•6mo ago
Why would that matter at hiring time? If the person develops a health issue during employment they'll just fire them. Unlike insurance, where they'd have to spend money.
qingcharles•6mo ago
50 years down the road AI will have taken all the jobs, so I'm not sure we should be worried about getting hired. That ship will have sailed.
willcipriano•6mo ago
I put a fake name in when I signed up.

Good luck, Blue Cross.

data_maan•6mo ago
I always wondered why people are so trusting (gullible?) to use their real data
Terr_•6mo ago
If they have enough DNA and not-so-secret genealogical data, they can derive your real name anyway.
echelon•6mo ago
They don't even need your DNA. Just your relatives.
slg•6mo ago
The analogy I have used in the past is that this fear is like thinking health insurance companies were more likely to buy the old Marlboro Miles database than to simply make detailing your smoking history a required part of the application process.

If these companies have the legal clearance to use DNA data, why would they be satisfied only having secondhand access to that data for a relatively small subset of the population? They'll obviously want that data for everyone.

vintermann•6mo ago
Yes. Behavioral data is in most cases far more useful to them than DNA. No need to go all the way of using the non-coding SNPs in a genealogical test to infer your coding DNA, to infer your propensity for smoking, when they can just find out if you smoke instead.
abbotcabbot•6mo ago
It may have been the lower-hanging fruit. But they already have all the tools, and behavior analysis is no longer the competitive edge. They can ask for your smoking status and deny your claim for fraud later (and any genetic data on behavior just adds value to that fraud investigation), but they can't know your propensity for non-smoker cancers.
sidewndr46•6mo ago
The data that 23andme collects goes far beyond DNA. It goes into explicit family history, which as it turns out is pretty close to behavioral information.
filoleg•6mo ago
> Yes. Behavioral data is in most cases far more useful to them than DNA.

Makes perfect sense to me, to be honest. Outside of certain genetic diseases, the behavioral aspect is what ultimately matters. And afaik in the US, a large proportion of the population dies due to behavioral factors (and I would count obesity rates as an outcome of that).

A simple example: yes, I would say I am genetically predisposed to have addictive habits, suffer from liver failure due to alcohol+drugs, and/or deal with lung problems from smoking (given that this was the fate of multiple males up my ancestry tree from both sides of the family). Naturally, I believe that it is more valuable for an insurance company to know whether I actually engage in those harmful behaviors vs. my genes dictating the likelihood of me engaging in those behaviors. In the absence of the former, the latter could be pretty useful, but otherwise there is no competition at all.

Sure, genes can give you a probability of those behavioral aspects occurring, but going for the primary metric you are looking for vs. a weak predictor of it clearly makes way more sense.

lazide•6mo ago
It’s much ‘better’ for them to use it as an anti-fraud measure later when someone makes a claim, not early on when people are paying them money.

‘You said on your application you don’t smoke, but on x data dump it shows you do. Por que?’ Or just deny your claim for that reason, and make you fight it.

Terr_•6mo ago
One aspect to this tangle is knowledge asymmetry: One of the traditional justifications for insurers poking around is to guard against an applicant that conceals important factors as a kind of fraud.

But what about the reverse? There's something intuitively unjust about the customer not knowing why they're being charged a higher rate, especially if it means the company believes there's a potential danger (enough that it affects the bottom-line) but conceals it.

So yeah, I think "transparency" is a robust principle to follow here, especially if anyone is arguing market competition is going to curb the worst abuses.

cm2187•6mo ago
The paradox is that the better insurance companies are at pricing risk, the more irrelevant their business becomes. If they can predict your disease with 100% reliability and charge you based on that, then what is the point of buying insurance if you are going to be charged either nothing or the full cost of the treatment, just as if you were not insured?

Except of course in the US, with scammy hospitals charging over-inflated prices to uninsured patients, while only insurers can access realistic prices.

Terr_•6mo ago
I think that conflates accurate probabilities with predicting discrete future events.

For example, suppose you can accurately determine that someone has exactly a 5% chance of developing a problem that will cost exactly $X to treat. Find 10,000 of those people, and insurance is still useful for spreading the risk and financing the treatments.
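
To make that concrete, a quick back-of-the-envelope in Python: even with a perfectly known 5% probability, the individual faces an all-or-nothing outcome, while the pooled cost per member is predictable. (Illustrative numbers only.)

    import random

    p, cost, members = 0.05, 100_000, 10_000   # 5% risk, $100k treatment, pool of 10k
    fair_premium = p * cost                     # $5,000 expected cost per person

    random.seed(1)
    claims = sum(random.random() < p for _ in range(members))
    actual_per_member = claims * cost / members

    print(f"fair premium:         ${fair_premium:,.0f}")
    print(f"simulated per-member: ${actual_per_member:,.0f}")
    # The pooled figure lands close to $5,000, while any single uninsured person
    # pays either $0 or $100,000. That spread is what insurance is buying down.
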

lotsofpulp•6mo ago
Health problems and healthcare are not that simple. Almost everyone will experience healthcare issues in their life, so the insurance becomes more of a subsidy mechanism to time-shift costs from one population to another (which at some point becomes indistinguishable from taxpayer-funded healthcare, just with a different administrator).

Also, legislation requires health insurance to pay for all healthcare, so there is no point paying to insure against specific ailments.

sidewndr46•6mo ago
This isn't really true in the US; I have specific policies that insure me against other types of illness. They pay directly to me in the form of a predetermined cash settlement in the event of a diagnosis.
lotsofpulp•6mo ago
The Affordable Care Act does not allow for benefit maximums. Necessary healthcare beyond the out of pocket maximum is (theoretically) required to be paid for by the insurer.

You might have supplemental insurance that pays you in case of a specific illness, but that is not what is commonly referred to as health insurance.

Can you provide a link to a business selling the type of policy you are referring to? I am curious what these look like.

cm2187•6mo ago
At 5% you are still loosely predicting a disease, but with progress in genetics you may one day be predicting with 80% accuracy. The insurance premium will then converge to the cost of treatment.
PorterBHall•6mo ago
Currently, due to the Affordable Care Act aka Obamacare, health insurance companies are prohibited from setting rates based on individual risk (except that they can charge higher premiums to tobacco users). Before the law, insurers would typically put applicants through a health screening that would determine their rate. People with pre-existing medical conditions could be denied coverage or charged higher rates. Women routinely paid higher premiums than men.
grues-dinner•6mo ago
Things like financial and medical data should be required to have an audit log that you can see in real time and subscribe to updates for, including extraction into "anonymised" formats, along with a description of that process and format and a justification for why it is robust against deanonymisation. If data is handled well, there is nothing to fear here. Fiddly, perhaps. Expensive, probably. But personal data processing should be risky and expensive.

Deliberately extracting personal data into un-audited environments without good reason (eg printing a label for shipping), should be punished with GDPR-style global turnover-based penalties and jail for those responsible.
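
A minimal sketch of what such a subject-visible audit log could look like: an append-only, hash-chained record of every access or extraction, so after-the-fact tampering is detectable. (A hypothetical structure, not an existing product or a GDPR requirement.)

    import hashlib, json, time

    audit_log = []  # append-only; in practice replicated and independently verifiable

    def record_access(subject_id: str, actor: str, purpose: str, export_format=None):
        prev_hash = audit_log[-1]["hash"] if audit_log else "genesis"
        entry = {
            "ts": time.time(), "subject": subject_id, "actor": actor,
            "purpose": purpose, "export_format": export_format, "prev": prev_hash,
        }
        # Chain each entry to the previous one so deletions/edits break the chain.
        entry["hash"] = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
        audit_log.append(entry)   # the data subject subscribes to exactly this stream

    record_access("subject-42", "lab-pipeline", "ancestry report")
    record_access("subject-42", "partner-export", "research cohort", export_format="anonymised-parquet")
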

chha•6mo ago
> Deliberately extracting personal data into un-audited environments without good reason (eg printing a label for shipping), should be punished with GDPR-style global turnover-based penalties and jail for those responsible.

There already are, but only for Europeans through the GDPR.

grues-dinner•6mo ago
Technically not quite, because even in the EU you don't have to provide the audit log for someone's data specifically, and you as a subject have to make specific requests to delete or retrieve your data; it's not made transparent to you as a default position. But yes, you can't just dump it out anywhere you want.

How it should be is that personal data's current and historical disposition is always available to the person in question.

If that's a problem for the company processing the data (other than being fiddly to implement at first), that sounds like the company is up to some shady shit that they want to keep quiet about.

Nothing to hide, nothing to fear should apply here, and companies should be fucking terrified, with an existential dread of screwing up their data handling, and looking for ways to avoid handling PII at all costs. The analogy of PII being like radioactive material is a good one. You need excellent processes, excellent reasons to be doing it in the first place, you must show you can do it safely and securely, and if you fuck up, you'd better hope your process documentation is top tier or you'll be in the dock. Or, better, you can decide that you can make do by handling the nuclear material only in some safer form, like encapsulated vitrified blocks, at least for most of your processes.

The data processing industry has repeatedly demonstrated that it cannot be trusted, and so it should reap the whirlwind.

chha•6mo ago
It doesn't say audited environments as such, but you are required to use secure environments that you control as a baseline. What "secure" means can always be discussed, but in general it depends on what data you process and what you do with it; if it is a large volume / big population / Article 9 data, auditable environments should be expected - though not publicly auditable. Although that would be nice...

Fully agree with what you are saying, and my popcorn is ready for August, when the penalties part of the AI Act comes into force. There is a grace period of two years for certain systems already on the market, but any new model introduced after August this year has to be compliant. AI Act + GDPR will be a great show to watch...

barisozmen•6mo ago
I started working on a relevant project: https://github.com/barisozmen/securegenomics . I believe the 23andMe event will make people more wary of sharing their genetic data, and we need ways to let people contribute to genetic research without exposing their data.
jraph•6mo ago
In what you show, people encrypt their genome before uploading data on some server, and then scientists can work on the data.

How are scientists able to work on encrypted genomes?

vintermann•6mo ago
Homomorphic encryption, presumably? It's not impossible. But I also think it's overkill. Also, I don't know of good open source software that lets me do the kind of analysis I want even on non-encrypted data.
jraph•6mo ago
> Homomorphic encryption, presumably?

That would not be very comforting; it would mean that even encrypted, we can learn things from the encrypted form, which kind of defeats the purpose, unless I'm missing something.

barisozmen•6mo ago
Yes, it's by homomorphic encryption as @vintermann mentioned.

That being said, scientists can implement their own protocols, and use whatever technique they want. For example: https://github.com/securegenomics/protocol-alzheimers-sensit....

The point is that our platform makes federated computing + homomorphic encryption analysis easy, but protocols are customizable.
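
On the concern above: with (additively) homomorphic encryption the server can combine ciphertexts but learns nothing from them; only the study's key holder can decrypt the aggregate. A minimal sketch assuming the python-paillier (`phe`) package (illustrative; the actual securegenomics protocols may differ):

    from phe import paillier

    # The researcher generates the keypair; participants only ever see the public key.
    public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

    # Each participant encrypts their own value locally, e.g. 1 if they carry a
    # risk allele and 0 if not, and uploads only the ciphertext.
    carrier_flags = [1, 0, 1, 1, 0]
    ciphertexts = [public_key.encrypt(flag) for flag in carrier_flags]

    # The server sums ciphertexts without being able to read any individual value.
    encrypted_total = sum(ciphertexts[1:], ciphertexts[0])

    # Only the researcher holding the private key can decrypt the aggregate count.
    print(private_key.decrypt(encrypted_total))  # 3
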

kazinator•6mo ago
The idea that someone who cares about their personal data should happily give it to a company as long as they are not near bankruptcy is absurd.
mgh2•6mo ago
https://archive.is/7i4GP
ProAm•6mo ago
Now that they have been 'sold and bought' I don't believe any of the contractual agreements have to be kept. And since the old CEO just bought the company, I can only imagine the worst in terms of data to sell.
anothernewdude•6mo ago
Be careful not to delete all your DNA if you're still alive and using it.
CITIZENDOT•6mo ago
Why not? Elaborate
7734128•6mo ago
It's quite hazardous to your health to all of a sudden lack any DNA. Your cells won't know how to replicate for one.
gattr•6mo ago
You could always restore from incidental backups (hairbrush, etc.).
lion__93332•6mo ago
That feels really risky. If your DNA is preserved, could a replica of you appear decades later?
vintermann•6mo ago
Well, a 23andMe v5 test looks at approximately 640000 locations on your genome, out of approximately 3 billion. So they have to use a lot of "frog DNA" to fill out the rest.