Anymore I have zero desire to keep any copy of work code or other data on any personal device. Nope, never gonna need it, don't want it, just a potential legal headache with no upside.
But when I was younger? I could totally imagine getting a big juicy dataset like that and wanting a copy for myself. It'd make me feel special, having information no one else had.
Broken logic, of course: the people you would have stolen the data from still had it. A question pops up, though... what's in your possession that you shouldn't be in possession of?
I'm not doing anything wrong! It's not like I'm selling it! I'm just showing off the cool data no one else has! I'm saving the day, probably, by letting us solve a problem with my cool data that would be impossible otherwise.
I had access to insane amounts of highly sensitive data as an early 20-y/o and never once felt inclined to share it or brag about it with anyone.
Hiring processes around these roles should distinguish between past-me and past-you.
Like, any system will fail if too many of its members don't care about maintaining it, but you're going to hire the wrong person from time to time.
It's important to design your systems to minimize access: don't allow everyone access to everything, only give people as much access as they need to do their jobs, require multiple people to sign off on temporary access grants, create audit trails, actually audit them, and have consequences for violating the rules.
(Which, in this case, DOGE purposefully dismantled.)
It doesn't just protect the data from nefarious villains, it also protects young idiots from themselves, who don't realize you can cause harm just by being curious.
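A minimal sketch of the controls described above: least-privilege grants, dual sign-off for temporary access, and an audit trail that records every decision so it can actually be reviewed. All names and structure here are hypothetical, for illustration only.

```python
# Illustrative sketch only: least-privilege grants, two-person sign-off
# for temporary access, and an audit log of every access decision.
from dataclasses import dataclass, field
import time

@dataclass
class AccessControl:
    grants: dict = field(default_factory=dict)     # user -> set of datasets
    audit_log: list = field(default_factory=list)  # every decision, allowed or denied

    def grant_temporary(self, user, dataset, approvers):
        # Require two distinct sign-offs for any temporary grant.
        if len(set(approvers)) < 2:
            raise PermissionError("temporary access needs two approvers")
        self.grants.setdefault(user, set()).add(dataset)
        self.audit_log.append((time.time(), "grant", user, dataset, tuple(approvers)))

    def read(self, user, dataset):
        # Log the attempt whether or not it is allowed: denied requests
        # are exactly the ones an auditor wants to see.
        allowed = dataset in self.grants.get(user, set())
        self.audit_log.append((time.time(), "read", user, dataset, allowed))
        if not allowed:
            raise PermissionError(f"{user} has no access to {dataset}")
        return f"contents of {dataset}"

ac = AccessControl()
try:
    ac.read("intern", "numident_extract")  # denied, but still logged
except PermissionError:
    pass
ac.grant_temporary("intern", "numident_extract", approvers=["lead", "security"])
ac.read("intern", "numident_extract")      # now allowed, and logged
```

The point of the toy: access is deny-by-default, grants take two people, and the log captures failures as well as successes, which is what makes "actually audit them" possible.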
I'm proposing that we both have systems to mitigate insider risk and we try to avoid hiring ideologically motivated and ethically compromised goobers to highly sensitive government jobs.
And I'm proposing that we don't write this off as, "welp he's a kid!"
Oh, wait. No I would never have done that. That's just insane.
In the DOGE case, they specifically broke all the controls that existed to manage insider risk and keep people from making copies like this. But (especially 20-30 years ago) I've been on plenty of networks that had no concept of insider risk at all, where everything was open for anyone to access (or protected by shared passwords everyone knew).
I don't think there's a risk it will influence the rare person in power who enforces the rules into going lighter. I just think it encourages people who might otherwise put themselves in danger to be less reckless about hoarding data.
Same. I won't even have Teams or Authenticator on my phone unlike most others here (though wrt Teams, that is at least as much about not wanting work to bother me as it is about the danger of data seepage). I need the authenticator to do the job, but I have an old factory-reset phone that has that (and, just in case, Teams) on it.
> But when I was younger? I could totally imagine getting a big juicy dataset like that and wanting a copy for myself.
I'm pretty sure I never would have done. I've always resisted knowing credentials and personal information that aren't mine (partly for the less selfish reasons, and partly so that if anything untoward happens with or using that information there is no way it can be my fault or doing), despite people falling over themselves to do things like tell me their passwords and such when they wanted some form of tech support.
But I think there is a different attitude to data risk in that age group today. They've grown up in a world where very little is really private, and every app and its dog has wanted their contact details and other information (and all too often information about their friends and family), so the idea that data is a free-for-all is dangerously normalised in their heads.
I find older people are similarly very lax with their own data, in fact often rather too trusting of others generally, but not so much with other people's. There are a lot more people who are appropriately careful (or even paranoid) in their 30s/40s/50s (I'm late 40s myself). I think we are lucky to be in the middle: exposed to information dangers enough not to have that naivety of age, and not desensitised by having lax information security pushed at us from an early age.
But:
1) That's why we have traditionally had the safeguards that we have had, to protect against this sort of crime, and
2) The allegation in this case is that he later approached coworkers to do something with this data, even if they ultimately didn't help him do it. So it doesn't appear to be hoarding just for the sake of it here.
Agency: "Social Security initially denied Borges’s allegations and said the data referenced in his complaint is stored in a secure environment walled-off from the internet."
Ah, walled off from the internet, so no one can get there and copy the data to a flash drive. Move on, move on!
You can't make that up.
I suspect the whistleblower is correct, but I don't think it's proven to the point where we can confidently state that "it happened." SSA isn't trying to dispute the method, they're trying to dispute the fundamental claim.
Unfortunately it seems quite believable. This is the same outfit that fired a bunch of people responsible for overseeing the US Nuclear Arsenal. [0] The combination of arrogance and stupidity was breathtaking.
[0] https://thebulletin.org/2025/04/doges-staff-firing-fiasco-at...
I have a sinking suspicion this engineer won't see the inside of a jail cell.
But why? The only conclusion I can come to is "stealing elections". I'll include this partial list I made of Republican voter suppression efforts going back decades [1].
I believe out there someone is collecting all this data into an AI model to predict how people will vote, something that Cambridge Analytica was a toy version of. But it goes beyond how people will vote but whether they will vote. Likewise, data will be constructed to strike off people from voter rolls if the system believes they won't vote how you want. We've seen efforts like this where similar-sounding names of felons in other states are used to strike off people from voter rolls. And that's a real problem because people might not know they're no longer registered to vote and in some states you have to register 30 or more days before the election.
There is essentially infinite money available to fund Republicans stealing elections because it results in public funding cuts to give even more tax breaks to billionaires.
You can't directly use the SSA database, obviously, so any effort must be small enough not to draw attention, involve doing part or all of the computing overseas to avoid legal scrutiny, and/or involve "washing" that data through data-provider services. I would bet if you started exhaustively looking at various companies in or adjacent to these spaces, you'd find some pretty dodgy stuff.
https://www.onthewing.org/user/Bonhoeffer%20-%20Theory%20of%...
See if Musk was in any way involved, or acted with such reckless disregard for known security standards that he could be civilly or criminally liable. Do the same as above for him.
The only way this stops is if consequences are introduced.
I think given the performance of DOGE, the wars, the executive orders, the Epstein files, we can make a SMALL logical stretch here and assume, FOR THE MOMENT, that this happened.
Waiting for the outcome of an investigation is the only prudent decision.
https://www.poynter.org/commentary/2025/the-daily-beast-retr...
The topic at hand was a whistleblower report, which would have serious ramifications if proven false. It isn't apples-to-apples.
[0] https://thehill.com/homenews/media/fox-news-donald-trump-dig...
I was for the admin based on claims of lawful immigration enforcement and keeping out of foreign wars. However, after the inept efforts with immigration, DOGE, and the Iran war, I will not be for Republicans again.
Now, your turn to answer the question.
If those people weren't granted unprecedented access to our data, there would be no whistle to blow. You can wait for the "investigation" to play out, the rest can see that obvious risks were ignored to benefit someone.
Again, there doesn't need to be evidence. The point is that a claim like this is clearly plausible and worth investigating because of political decisions this administration made. They took a non-political issue (access to social security data) and explicitly made it political. You don't get to later use those same politics as a protective shield for criticism.
> it maps perfectly onto an existing fear people were already primed for.
People were primed because of the repeated warning that experts were giving about the security of this data and carelessness in allowing access. You are helping to prove my point that the administration encouraged this by their own actions.
That is to say, there is no reason to extend this administration or anything DOGE-related the benefit of the doubt.
So many years of dealing with this administration, and people are still attempting to point out hypocrisy and hold people to standards with regard to principle, past statements, character, etc. None of it will work here.
Real quote from a friend when this whole thing was going down.
It's hardly the first time that side effects have been ignored in the pursuit of a goal (in this administration, yes, but also in every previous administration, and in any previous governing body at all). In due time, this one will fall out of your mental stack, too.
Either way this data is definitely going to spread behind closed doors.
It’s interesting (horrifying) to think of the implications actually. People wouldn’t buy this data directly, it’s too obviously illegally procured. But laundered through an LLM to provide “insights” without citation? That’s plausible deniability.
Nobody should have permission to query data on 70M Americans; it's a huge security flaw for the average citizen. But the Pentagon has been doing this for a while, as Snowden revealed, and the average American doesn't seem worried, with Snowden cast as a menace rather than a hero.
Once private government data from Americans starts being heavily used to mess up elections, or even worse, persecute people with a different opinion than the ruling party...
Americans will finally wake up to the fact that GDPR doesn't stifle innovation, but rather protects citizens from evil actors.
But it may be too late, like when the NSDAP started persecuting Jews and migrants. There was nothing they could do other than flee to survive.
- Terry Pratchett
Why not? Shouldn't the public be allowed to learn who all the DOGE employees were? Federal employees are public record, are they not?
Yet here on HN, what have we been arguing about? Big tech. Google and Meta have been allowed to become boogeymen in this community out of all proportion to the actual threat they posed[1].
While the actual boogeyman stealing our data to exploit in the market? It was us.
[1] I mean, let's be honest: while everyone has abstract complaints, the truth is that they've actually been remarkably benign stewards of our data over the past 20 years. Much, much, MUCH more responsible than the glibertarian dude in the cubicle next to you, as it turns out.
Justin Fox not being able to say what DEI is really tells you everything you need to know about how grants were cancelled.