frontpage.

Reverse Engineering Medium.com's Editor: How Copy, Paste, and Images Work

https://app.writtte.com/read/gP0H6W5
1•birdculture•2m ago•0 comments

Go 1.22, SQLite, and Next.js: The "Boring" Back End

https://mohammedeabdelaziz.github.io/articles/go-next-pt-2
1•mohammede•7m ago•0 comments

Laibach the Whistleblowers [video]

https://www.youtube.com/watch?v=c6Mx2mxpaCY
1•KnuthIsGod•9m ago•1 comment

I replaced the front page with AI slop and honestly it's an improvement

https://slop-news.pages.dev/slop-news
1•keepamovin•13m ago•1 comment

Economists vs. Technologists on AI

https://ideasindevelopment.substack.com/p/economists-vs-technologists-on-ai
1•econlmics•15m ago•0 comments

Life at the Edge

https://asadk.com/p/edge
2•tosh•21m ago•0 comments

RISC-V Vector Primer

https://github.com/simplex-micro/riscv-vector-primer/blob/main/index.md
3•oxxoxoxooo•25m ago•1 comment

Show HN: Invoxo – Invoicing with automatic EU VAT for cross-border services

2•InvoxoEU•25m ago•0 comments

A Tale of Two Standards, POSIX and Win32 (2005)

https://www.samba.org/samba/news/articles/low_point/tale_two_stds_os2.html
2•goranmoomin•29m ago•0 comments

Ask HN: Is the Downfall of SaaS Started?

3•throwaw12•30m ago•0 comments

Flirt: The Native Backend

https://blog.buenzli.dev/flirt-native-backend/
2•senekor•32m ago•0 comments

OpenAI's Latest Platform Targets Enterprise Customers

https://aibusiness.com/agentic-ai/openai-s-latest-platform-targets-enterprise-customers
1•myk-e•34m ago•0 comments

Goldman Sachs taps Anthropic's Claude to automate accounting, compliance roles

https://www.cnbc.com/2026/02/06/anthropic-goldman-sachs-ai-model-accounting.html
2•myk-e•37m ago•5 comments

Ai.com bought by Crypto.com founder for $70M in biggest-ever website name deal

https://www.ft.com/content/83488628-8dfd-4060-a7b0-71b1bb012785
1•1vuio0pswjnm7•38m ago•1 comment

Big Tech's AI Push Is Costing More Than the Moon Landing

https://www.wsj.com/tech/ai/ai-spending-tech-companies-compared-02b90046
4•1vuio0pswjnm7•40m ago•0 comments

The AI boom is causing shortages everywhere else

https://www.washingtonpost.com/technology/2026/02/07/ai-spending-economy-shortages/
2•1vuio0pswjnm7•42m ago•0 comments

Suno, AI Music, and the Bad Future [video]

https://www.youtube.com/watch?v=U8dcFhF0Dlk
1•askl•43m ago•2 comments

Ask HN: How are researchers using AlphaFold in 2026?

1•jocho12•46m ago•0 comments

Running the "Reflections on Trusting Trust" Compiler

https://spawn-queue.acm.org/doi/10.1145/3786614
1•devooops•51m ago•0 comments

Watermark API – $0.01/image, 10x cheaper than Cloudinary

https://api-production-caa8.up.railway.app/docs
1•lembergs•53m ago•1 comment

Now send your marketing campaigns directly from ChatGPT

https://www.mail-o-mail.com/
1•avallark•56m ago•1 comment

Queueing Theory v2: DORA metrics, queue-of-queues, chi-alpha-beta-sigma notation

https://github.com/joelparkerhenderson/queueing-theory
1•jph•1h ago•0 comments

Show HN: Hibana – choreography-first protocol safety for Rust

https://hibanaworks.dev/
5•o8vm•1h ago•1 comment

Haniri: A live autonomous world where AI agents survive or collapse

https://www.haniri.com
1•donangrey•1h ago•1 comment

GPT-5.3-Codex System Card [pdf]

https://cdn.openai.com/pdf/23eca107-a9b1-4d2c-b156-7deb4fbc697c/GPT-5-3-Codex-System-Card-02.pdf
1•tosh•1h ago•0 comments

Atlas: Manage your database schema as code

https://github.com/ariga/atlas
1•quectophoton•1h ago•0 comments

Geist Pixel

https://vercel.com/blog/introducing-geist-pixel
2•helloplanets•1h ago•0 comments

Show HN: MCP to get latest dependency package and tool versions

https://github.com/MShekow/package-version-check-mcp
1•mshekow•1h ago•0 comments

The better you get at something, the harder it becomes to do

https://seekingtrust.substack.com/p/improving-at-writing-made-me-almost
2•FinnLobsien•1h ago•0 comments

Show HN: WP Float – Archive WordPress blogs to free static hosting

https://wpfloat.netlify.app/
1•zizoulegrande•1h ago•0 comments

6NF File Format

https://habr.com/en/articles/942516/
76•sergeyprokhoren•5mo ago

Comments

nikolayasdf123•5mo ago
> habr.com

interesting to see this forum show up again.

remember 15 years ago there were posts about a DIY drone from some random guy, with lots of theoretical physics deriving stability conditions. it got a lot of criticism. now, looking back and following what DJI is doing with sensors, his approach was totally wrong and that community nailed it with its feedback. the forum had some extravagant ideas and some worthy criticism. at least back then.

artemonster•5mo ago
I remember visiting this site daily 10-15 years ago, in Russian, ofc. The moderation bar was super high, the karma system worked great, and the content quality was astonishing. Then it changed owners, they tried heavy monetization with corpo-pseudo-blogpost-marketing crap, and it all went downhill from there.
balamatom•5mo ago
habr is an institution. it's like the "runet hn", minus the wild-west vc ecosystem, plus integrated blog posting like the lj ogs intended. that probably helps a lot with original work like TFA getting traction. more power to that!

runet sites of that era were often born out of the hacker's characteristic contrarian attitude of "because we can". attempts to monetize them in more recent years are bound to accomplish little more than fuck up the content quality and/or end in the "owner cashes out and opens a cafe" thing.

nevertheless, to this day, when i think habrahabr, i think of a way higher bar for technical competence than hn. it's all in the attitude.

wearable•5mo ago
What are the modern equivalents of habr?
throw-the-towel•5mo ago
There's probably none. The Russian Internet has been Eternal Septembered too much for something similar to appear.
balamatom•5mo ago
if i knew any, i sure as fuck wouldn't post them on hn of all places.
0x457•5mo ago
It went downhill when they allowed getting an invitation via a single blog post, requiring just one person to like it enough to hand out an invitation. Such a post wasn't hard to write - just translate something popular from Hacker News before anyone else did.

Shortly after, it became hilariously easy to farm and manipulate karma balances across the entire site. With 50 accounts (multis or real people, all the same) you could create a new account a day.

Monetization started when the site was already in a death spiral.

NooneAtAll3•5mo ago
don't forget the awful redesign, including completely replacing the post formatter

all the mastery of composing posts that experienced authors had accumulated - gone overnight

jojobas•5mo ago
It was also notoriously politics-free, until something happened.
unquietwiki•5mo ago
Looks interesting, but there are few comments on the forum and even a negative vote count ATM. The format looks kinda "old school" in terms of defining records, but I guess that can be a positive in some circumstances?
inkyoto•5mo ago
I would say it is a niche solution that solves a specific problem.

Modern data sources increasingly lean towards producing nested and deeply nested semi-structured datasets (i.e. JSON) that are heavily denormalised and rely on organisation-wide entity IDs rather than system-generated referential-integrity IDs (PKs and FKs). That is why modern data warehouse products (e.g. Redshift) have added extensive support for nested data processing: it neither makes sense to flatten/un-nest the nested data, nor is it easy to do anyway.

sergeyprokhoren•5mo ago
This is a fairly common problem. Data is often transferred between information systems in denormalized form (tables with hundreds of columns/attributes). In the data warehouse the data is normalized (duplication is eliminated by using references to reference tables) to make complex analytical queries easier. Usually it is normalized to 3NF and very rarely to 6NF, since there is still no convenient tool for 6NF (see my DSL: https://medium.com/@sergeyprokhorenko777/dsl-for-bitemporal-... ). The data is then denormalized again in data marts to generate reports for external users.

All these cycles of normalization - denormalization - normalization - denormalization are very expensive for IT departments. That gave me the idea of transferring data between information systems directly in normalized form, so that nothing would need to be normalized again. The prototypes were the Anchor Modeling and (to a much lesser extent) Data Vault methodologies.
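A minimal sketch of the decomposition being described, assuming Anchor-Modeling-style 6NF (one attribute per relation, keyed by entity ID plus a temporal column) - the table and field names here are illustrative, not taken from the article's actual format:

```python
# A denormalized row as it typically arrives from a source system:
wide_row = {"bank_id": "B1", "bank_name": "Acme Bank AG", "country": "DE"}

# The same facts decomposed one-attribute-per-relation (6NF style).
# Each attribute carries its own history; changing the name never
# duplicates the country, and vice versa.
bank_name = [
    {"bank_id": "B1", "value": "Acme Bank",    "valid_from": "2020-01-01"},
    {"bank_id": "B1", "value": "Acme Bank AG", "valid_from": "2023-06-01"},
]
bank_country = [
    {"bank_id": "B1", "value": "DE", "valid_from": "2020-01-01"},
]

def value_as_of(relation, entity_id, as_of):
    """Latest value whose valid_from <= as_of (the open upper bound is implied)."""
    rows = [r for r in relation
            if r["bank_id"] == entity_id and r["valid_from"] <= as_of]
    return max(rows, key=lambda r: r["valid_from"])["value"] if rows else None

print(value_as_of(bank_name, "B1", "2021-01-01"))  # Acme Bank
print(value_as_of(bank_name, "B1", "2024-01-01"))  # Acme Bank AG
```

The point of shipping data in this shape is that the receiving system can load each one-attribute relation as-is, with no further normalization pass.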
snthpy•5mo ago
Nice. Anchor Modelling is underappreciated.

Gonna have a look at your DSL.

gregw2•5mo ago
Cool to see you tackle this problem.

If I were you though, I'd consider if I'd get more traction with an open source extension of Iceberg format that supports row based reporting and indexes for a unified open source HTAP ecosystem.

sergeyprokhoren•5mo ago
What does "looks old school" mean? Do you want to wrap this format in JSON, like JSON-LD? I don't mind.
mhalle•5mo ago
This format requires temporal validity with `valid_from`, but doesn't include `valid_to`. I don't understand how `valid_from` and the equally required `recorded_at` interact.
STKFLT•5mo ago
I don't have any additional insight into the format, but I think the idea is that there is an implied upper bound of infinity on every date range. Each bank can only have one bank_name at a time, so multiple bank_names for the same bank entity can be sorted on the 'valid' and 'recorded' axes to find the upper bound of each.
dragonwriter•5mo ago
In the bitemporal model, both system and valid times are half-open intervals, and both the preceding and following interval can either have a different value or no value. Using only start times means that while records can be updated in either time stream, they cannot be logically deleted (in transaction time) or invalidated (in valid time) once they exist. There are databases where this assumption is valid, but in general it is problematic.
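The distinction can be sketched in a few lines, assuming half-open intervals on both axes (all field names here are illustrative, not from the article's spec):

```python
import datetime as dt

OPEN = dt.date.max  # stand-in for an open (unbounded) upper end

record = {
    "bank_id": "B1",
    "bank_name": "Acme Bank",
    "valid_from": dt.date(2020, 1, 1),   # valid time: true in the real world
    "valid_to": OPEN,
    "recorded_at": dt.date(2020, 1, 5),  # transaction time: asserted by the system
    "recorded_to": OPEN,
}

def logical_delete(rec, when):
    """Stop asserting the fact without erasing history: close transaction time."""
    return {**rec, "recorded_to": when}

def invalidate(rec, when):
    """The fact stopped being true in the real world: close valid time."""
    return {**rec, "valid_to": when}

deleted = logical_delete(record, dt.date(2024, 3, 1))
assert deleted["recorded_to"] < OPEN   # no longer asserted from 2024-03-01 on
assert record["recorded_to"] == OPEN   # original row untouched (append-only)
```

With only `valid_from` and `recorded_at`, neither `logical_delete` nor `invalidate` can be expressed: there is no end column to close, so a fact, once written, is asserted forever.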
deepsun•5mo ago
Improvement idea -- in my experience "valid_from" is always a date (no time, no timezone). That's how it's reported in documents (e.g. a contract validity period).

Rows that need seconds (e.g. bank transactions) are events; they aren't "valid" from a particular point in time onward, they just happen.

nine_k•5mo ago
In my experience, validity may start at the start of a business day, and likely has a specific time zone. In particular, I've seen documents related to stock trading on NASDAQ specify Eastern Standard Time as the applicable timezone.

I understand how convenient it is to use UTC-only timestamps. It works in most cases, but not all.

adammarples•5mo ago
No point losing information like that. What do you do if someone opens and closes an account on the same day? Changes their email address three times in one day? Etc
deepsun•5mo ago
Agree - if your updates have a time component, then datetime it should be. It's just that in my work everything is date-only; e.g. employment starts on a date, not a datetime, so there's no data loss.
spennant•5mo ago
Odd that I'm seeing this right now. I recently implemented a 6NF schema for parsed XBRL files from EDGAR. The architecture was the right call... too bad the data is not useful for analytics.
rixed•5mo ago
> country_code 01K3Y07Z94DGJWVMB0JG4YSDBV

A 7th normal form should mandate that no identifiers should ever be assigned to identifiers.

hdjrudni•5mo ago
Not sure about that. Some folks would argue you should always use surrogate keys.

I probably wouldn't for country_code specifically, but for most things it's useful even when you have a 'natural' key.

rixed•5mo ago
It can be useful in some cases but can also be a hindrance: first because identifiers are more useful when they actually allow you to identify the thing, and also because they can now change from instance to instance, from customer to customer, etc.
sergeyprokhoren•5mo ago
https://biggo.com/news/202509050143_6NF_File_Format_Debate