
Ask HN: What have you built/shipped with Claude-code

1•blhack•1m ago•0 comments

Digital Omnibus Report V2: Analysis of Select GDPR and EPrivacy Proposals by EC

https://noyb.eu/en/digital-omnibus-report-v2-analysis-select-gdpr-and-eprivacy-proposals-commission
1•buzer•2m ago•0 comments

Penn Calls Government's Demand for Lists of Jewish Staff 'Disconcerting'

https://www.nytimes.com/2026/01/20/us/university-of-pennsylvania-trump-jewish-staff.html
1•duxup•2m ago•1 comments

macOS Stats: Local Privilege Escalation via Exposed XPC Method

https://github.com/exelban/stats/security/advisories/GHSA-qwhf-px96-7f6v
1•inatreecrown2•2m ago•0 comments

Zero to One: Learning Agents and Agentic Patterns

https://pradyumnachippigiri.dev/blogs/understanding-ai-agents
1•PraddyChippzz•2m ago•1 comments

In a warming world, freshwater production is moving deep beneath the sea

https://apnews.com/article/climate-solutions-desalination-oceans-drinking-water-faba2579f83df4c06...
1•embedding-shape•3m ago•0 comments

Skyreader: A RSS Reader on the AT Protocol

https://www.disnetdev.com/blog/2026-01-20-skyreader-a-rss-reader-on-the-at-protocol/
2•erlend_sh•4m ago•1 comments

Being creative requires taking risks

https://www.henrikkarlsson.xyz/p/being-creative-requires-taking-risks
1•Curiositry•8m ago•0 comments

Hacker Lists Vibecoded Apps: 198 Scanned, 196 Found Vulnerable

https://firehound.covertlabs.io
2•birdculture•11m ago•0 comments

Spotlight Rules

https://spotlight-rules.com/
1•mooreds•13m ago•0 comments

Claude Chill: Fix Claude Code's Flickering in Terminal

https://github.com/davidbeesley/claude-chill
1•behnamoh•13m ago•0 comments

The Surprising Way AI Models Are Helping Humans Communicate Better

https://www.bbc.com/future/article/20251218-how-ai-can-teach-us-to-really-listen
1•xthe•15m ago•1 comments

How to generate 50K token documents using an agentic scaffold

https://www.dataframer.ai/posts/long-text-generation-dataframer-vs-baseline/
1•alex_aimon•15m ago•0 comments

FastMCP 3.0: From Tool Servers to Context Applications

https://mcpstatus.io/blog/202601-fastmcp-3
2•qave•17m ago•0 comments

Show HN: Free interactive security awareness library

https://ransomleak.com/learning/
1•dkozyatinskiy•18m ago•0 comments

AMD Ryzen AI Halo

https://twitter.com/AMDRyzen/status/2013642938106986713
2•polyrand•19m ago•0 comments

When Buttons Were the Hottest New Thing in Radio

https://paleofuture.com/blog/2024/12/30/when-buttons-were-the-hottest-new-thing-in-radio
1•ohjeez•22m ago•0 comments

Death Is an Engineering Challenge

https://danburonline.substack.com/p/death-is-an-engineering-challenge
1•kvee•25m ago•0 comments

I created an app to fight my sedentarism

https://movedoro.com/
1•gllermaly•26m ago•1 comments

Show HN: Poopyfeed – free, private, no account newborn tracking

https://develop.poopyfeed.com
1•mjmasia•26m ago•0 comments

JustHTML 1.0.0 Released

https://github.com/EmilStenstrom/justhtml
1•EmilStenstrom•31m ago•1 comments

F16 Falcon 2.0 – 3D Flight Simulator on Casio Calculator [video]

https://www.youtube.com/watch?v=cu6AIIK-RDc
1•starkparker•32m ago•0 comments

Anime.js Layout

https://animejs.com/documentation/layout/
2•handfuloflight•32m ago•0 comments

National income per adult has increased 1.1% per year on average 2010-2025

https://bsky.app/profile/gabrielzucman.bsky.social/post/3mcv2eqdb7s2y
2•doener•33m ago•1 comments

Amnesty urges halt to execution of 19-year-old Iranian protester

https://www.iranintl.com/en/202601209686
2•ukblewis•36m ago•0 comments

Ask HN: Have you integrated LLMs into any of your bash scripts or aliases?

1•detectivestory•37m ago•1 comments

Trying to exercise my data privacy rights led me to build a small opt-out tool

https://privacypartnersapp.com/
1•longbread•38m ago•1 comments

We're Still Underestimating What AI Means

https://tinyclouds.org/underestimating-ai/
2•thegeomaster•38m ago•0 comments

Reacting to news is basically a cheat code for traffic

https://jackseo.io/
2•janekfollendorf•39m ago•1 comments

Ask HN: Employment Dot Com Boom

3•_RPM•42m ago•0 comments

The challenges of soft delete

https://atlas9.dev/blog/soft-delete.html
33•buchanae•2h ago

Comments

cj•1h ago
We deal with soft delete in a Mongo app with hundreds of millions of records by simply moving the objects to a collection (table) separate from the “not deleted” data.

This works well especially in cases where you don’t want to waste CPU/memory scanning soft deleted records every time you do a lookup.

It also avoids situations where app/backend logic forgets to apply the “deleted: false” filter.
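cj's move-on-delete pattern can be sketched in a few lines. This is a hypothetical relational version using Python's sqlite3 (the original is MongoDB with two collections); the table names are made up:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT);
    -- deleted rows live in a physically separate table,
    -- so normal queries never scan them
    CREATE TABLE users_deleted (
        id INTEGER PRIMARY KEY, email TEXT,
        deleted_at TEXT DEFAULT CURRENT_TIMESTAMP);
    INSERT INTO users (id, email)
        VALUES (1, 'a@example.com'), (2, 'b@example.com');
""")

def soft_delete(user_id: int) -> None:
    # move the row in one transaction: copy to the archive, then remove
    with conn:
        conn.execute(
            "INSERT INTO users_deleted (id, email) "
            "SELECT id, email FROM users WHERE id = ?", (user_id,))
        conn.execute("DELETE FROM users WHERE id = ?", (user_id,))

soft_delete(1)
live = conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]
archived = conn.execute("SELECT COUNT(*) FROM users_deleted").fetchone()[0]
```

Because both statements run in one transaction, a row is always in exactly one of the two tables.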

vjvjvjvjghv•59m ago
I guess that works well with NoSQL. In a relational database it gets harder to move records out if they have relationships with other tables.
tempest_•45m ago
Eh you could implement this pretty simply with postgres table partitions
buchanae•42m ago
Ah, that's an interesting idea! I had never considered using partitions. I might write a followup post with these new ideas.
tempest_•31m ago
There are a bunch of caveats around primary keys and uniqueness but I suspect it could be made to work depending on your data model.
nemothekid•58m ago
The trigger architecture is actually quite interesting, especially because cleanup is relatively cheap. As far as compliance goes, it's also simple to declare that "after 45 days, deletions are permanent" as a catch-all, and then you get to keep restores. For example, I think (IANAL) the CCPA gives you a 45-day buffer for right-to-erasure requests.

Now instead of chasing down different systems and backups, you can simply ensure your archival process runs regularly and you should be good.
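The "after 45 days, deletions are permanent" policy amounts to a periodic purge job over the archive. A minimal sketch, assuming a hypothetical archive table with a `deleted_at` timestamp (SQLite here purely for illustration):

```python
import sqlite3
from datetime import datetime, timedelta, timezone

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE archive (id INTEGER PRIMARY KEY, payload TEXT, deleted_at TEXT)")

now = datetime.now(timezone.utc)
old = (now - timedelta(days=60)).isoformat()      # past the 45-day window
recent = (now - timedelta(days=10)).isoformat()   # still restorable
conn.executemany("INSERT INTO archive VALUES (?, ?, ?)",
                 [(1, "old row", old), (2, "recent row", recent)])

def purge(retention_days: int = 45) -> int:
    """Make deletions permanent once the retention window has passed."""
    cutoff = (datetime.now(timezone.utc)
              - timedelta(days=retention_days)).isoformat()
    with conn:
        # ISO-8601 strings in the same timezone compare lexicographically
        cur = conn.execute("DELETE FROM archive WHERE deleted_at < ?", (cutoff,))
    return cur.rowcount

purged = purge()
remaining = conn.execute("SELECT COUNT(*) FROM archive").fetchone()[0]
```

Run on a schedule, this gives the compliance-friendly property that nothing soft-deleted outlives the declared window.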

whalesalad•46m ago
A good solution here can be to utilize a view. The underlying table has the soft-delete field, and the view hides rows that have been soft deleted. Then the application doesn't need to worry about this concern all over the place.
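The view approach might look like this (SQLite for illustration; the table and view names are made up, and Postgres views work the same way):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, item TEXT, deleted_at TEXT);
    -- the view hides soft-deleted rows; the app only ever queries the view
    CREATE VIEW active_orders AS
        SELECT id, item FROM orders WHERE deleted_at IS NULL;
    INSERT INTO orders VALUES (1, 'keyboard', NULL), (2, 'mouse', '2026-01-20');
""")

visible = [item for _, item in conn.execute("SELECT id, item FROM active_orders")]
```

The `deleted_at IS NULL` predicate lives in exactly one place, so no query path can forget it.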
elyobo•32m ago
Postgres with RLS to hide soft-deleted records means that most of the app code doesn't need to know or care about them: it still issues reads, writes, and deletes against the same source table, and as far as the app knows it's working.
maxchehab•39m ago
How do you handle schema drift?

The data archive serializes the deleted object with the schema as it was at that point in time.

But fast-forward through some schema changes: now your system has to migrate the archived objects to the current schema?

buchanae•23m ago
In my experience, archived objects are almost never accessed, and if they are, it's within a few hours or days of deletion, which leaves a fairly small chance that schema changes will have a significant impact on restoring any archived object. If you pair that with "best-effort" tooling that restores objects by calling standard "create" APIs, perhaps it's fairly safe to _not_ deal with schema changes.

Of course, as always, it depends on the system and how the archive is used. That's just my experience. I can imagine that if there are more tools or features built around the archive, the situation might be different.

I think maintaining schema changes and migrations on archived objects can be tricky in its own ways, even kept in the live tables with an 'archived_at' column, especially when objects span multiple tables with relationships. I've worked on migrations where really old archived objects just didn't make sense anymore in the new data model, and figuring out a safe migration became a difficult, error-prone project.

talesmm14•23m ago
I've worked at companies where soft delete was implemented everywhere, even in irrelevant internal systems... I think it's a cultural thing! I still remember a college professor scolding me on an extension project because I hadn't implemented soft delete... in his words, "In the business world, data is never deleted!!"
MaxGabriel•22m ago
This might stem from the domain I work in (banking), but I have the opposite take. Soft delete pros to me:

* It's obvious from the schema: If there's a `deleted_at` column, I know how to query the table correctly (vs thinking rows aren't DELETEd, or knowing where to look in another table)

* One way to do things: Analytics queries, admin pages, it all can look at the same set of data, vs having separate handling for historical data.

* DELETEs are likely fairly rare by volume for many use cases

* I haven't found soft-deleted rows to be a big performance issue. Intuitively this should be true, since queries should be O(log N)

* Undoing is really easy, because all the relationships stay in place, vs data already being moved elsewhere (In practice, I haven't found much need for this kind of undo).

In most cases, I've really enjoyed going even further and making rows fully immutable, using a new row to handle updates. This makes it really easy to reference historical data.
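The immutable-rows idea can be sketched as an append-only versioned table: an "update" inserts a new version, and reads take the latest one. A hypothetical schema, again in SQLite:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- rows are never updated in place; each change appends a new version
    CREATE TABLE profiles (
        entity_id INTEGER, version INTEGER, name TEXT,
        PRIMARY KEY (entity_id, version));
""")

def save(entity_id: int, name: str) -> None:
    # allocate the next version number for this entity and append a row
    with conn:
        conn.execute("""
            INSERT INTO profiles
            SELECT ?, COALESCE(MAX(version), 0) + 1, ?
            FROM profiles WHERE entity_id = ?""", (entity_id, name, entity_id))

save(1, "Alice")
save(1, "Alicia")  # an "update" is just a new version

current = conn.execute("""
    SELECT name FROM profiles WHERE entity_id = 1
    ORDER BY version DESC LIMIT 1""").fetchone()[0]
versions = conn.execute("SELECT COUNT(*) FROM profiles").fetchone()[0]
```

Every historical state remains queryable by version, which is what makes referencing historical data easy.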

If I was doing the logging approach described in the article, I'd use database triggers that keep a copy of every INSERT/UPDATE/DELETEd row in a duplicate table. This way it all stays in the same database—easy to query and replicate elsewhere.
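That trigger-based audit copy could be sketched like this, using SQLite trigger syntax for illustration (Postgres would use trigger functions instead; the table names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER);
    -- history table: one row per write, same columns plus op and timestamp
    CREATE TABLE accounts_history (
        op TEXT, id INTEGER, balance INTEGER,
        changed_at TEXT DEFAULT CURRENT_TIMESTAMP);

    CREATE TRIGGER accounts_ins AFTER INSERT ON accounts BEGIN
        INSERT INTO accounts_history (op, id, balance)
        VALUES ('I', NEW.id, NEW.balance);
    END;
    CREATE TRIGGER accounts_upd AFTER UPDATE ON accounts BEGIN
        INSERT INTO accounts_history (op, id, balance)
        VALUES ('U', NEW.id, NEW.balance);
    END;
    CREATE TRIGGER accounts_del AFTER DELETE ON accounts BEGIN
        INSERT INTO accounts_history (op, id, balance)
        VALUES ('D', OLD.id, OLD.balance);
    END;
""")

conn.execute("INSERT INTO accounts VALUES (1, 100)")
conn.execute("UPDATE accounts SET balance = 50 WHERE id = 1")
conn.execute("DELETE FROM accounts WHERE id = 1")

ops = [op for (op,) in conn.execute(
    "SELECT op FROM accounts_history ORDER BY rowid")]
```

Since the history lives in the same database, it replicates and gets queried with the same tooling as the live data.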

nine_k•8m ago
> DELETEs are likely fairly rare by volume for many use cases

All your other points make sense, given this assumption.

I've seen tables where 50%-70% were soft-deleted, and it did affect the performance noticeably.

> Undoing is really easy

Depends on whether undoing even happens, and whether the act of deletion and undeletion require audit records anyway.

In short, there are cases when soft-deletion works well, and is a good approach. In other cases it does not, and is not. Analysis is needed before adopting it.

theLiminator•9m ago
Privacy regulations make soft delete unviable in many of the cases where it's useful.
sedatk•8m ago
The opposite is true in countries where there are data retention laws. Soft-delete is mandatory in those cases.
rorylaitila•9m ago
Databases store facts. Creating a record = new fact. "Deleting" a record = new fact. But destroying rows from tables = disappeared fact. That is not great for most cases. In rare cases the volume of records may be a technical hurdle; in that case, move the facts to another database. The number of times I've wanted to destroy large volumes of facts is approximately zero.
ntonozzi•7m ago
I've given up on soft delete -- the nail in the coffin for me was my customers' legal requirement that data be fully deleted, not archived. It never worked that well anyway. I never had a successful restore from a large set of soft-deleted rows.
zahlman•3m ago
> customers' legal requirements that data is fully deleted

Strange. I've only ever heard of legal requirements preventing deletion of things you'd expect could be fully deleted (in case they're needed as evidence at trial or something).