
Show HN: TheorIA – An Open Curated Physics Dataset (Equations, Explanations, JSON)

https://theoria-dataset.github.io/theoria-dataset/
9•ManuelSH•11mo ago
We’re building TheorIA, an open, high-quality dataset of theoretical physics results: equations, derivations, definitions, and explanations, all in structured, machine- and human-readable JSON.

Why? Physics is rich with beautiful, formal results — but most of them are trapped in PDFs, LaTeX, or lecture notes. That makes it hard to:

- train symbolic/physics-aware ML models,
- build derivation-checking tools,
- or even just teach physics interactively.

TheorIA fills that gap. Each entry includes:

- A result name (e.g., Lorentz transformations)
- Clean equations (AsciiMath)
- A straightforward step-by-step derivation with reasoning
- Symbol definitions & assumptions
- Programmatic validation using sympy
- References, arXiv-style domain tags, and contributor metadata

Everything is in open, self-contained JSON files. No scraping, no PDFs, just clear structured data for physics learners, teachers, and ML devs.
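As an illustration, here is a minimal, hypothetical sketch of what such an entry and its sympy validation might look like. The field names and the specific check are illustrative guesses, not the project's actual schema:

```python
# Hypothetical sketch of a TheorIA-style entry plus a sympy check.
# Field names are illustrative; see the repo for the real schema.
import sympy as sp

entry = {
    "result_name": "Lorentz transformation (x-coordinate)",
    "equation": "x' = gamma * (x - v * t)",  # AsciiMath in the real dataset
    "assumptions": ["inertial frames", "v < c"],
    "symbols": {"gamma": "Lorentz factor, 1/sqrt(1 - v^2/c^2)"},
}

# Programmatic validation: verify that the transformation reduces to the
# Galilean form x' = x - v*t in the limit c -> infinity.
x, t, v, c = sp.symbols("x t v c", positive=True)
gamma = 1 / sp.sqrt(1 - v**2 / c**2)
x_prime = gamma * (x - v * t)
galilean = sp.limit(x_prime, c, sp.oo)
assert sp.simplify(galilean - (x - v * t)) == 0
```

The point of the sympy step is that each entry's equations can be checked mechanically rather than trusted on faith.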

Contributors Wanted: We’re tiny right now and trying to grow. If you’re into physics or symbolic ML:

- Add an entry (any result you love)
- Review others' derivations
- Build tools on top of the dataset

GitHub: https://github.com/theoria-dataset/theoria-dataset/

Licensed under CC-BY 4.0, and we welcome educators, students, ML people, or just anyone who thinks physics deserves better data.

Comments

somethingsome•11mo ago
There are only 3 entries, am I correct?
ManuelSH•11mo ago
Yes, we are at a very early stage. Looking for other physics experts to help grow it.
somethingsome•11mo ago
I like the idea of having a dataset for physics, but those entries are very basic. Most of physics happens with very complicated maths, and it will be difficult to make an entry for a lot of physics.

For example, imagine the entry for the Standard Model equation: should all the derivation and symbolic implementation be done as a single entry? It will be difficult to separate it into logical entries that reference each other, and many physical ideas are fundamentally different, leading to divergences.

I have the impression that it should be easier to just parse reference books and format each paragraph/section as an entry, and maybe build a graph. (considering the reference book as authoritative on the subject)

ManuelSH•11mo ago
I guess you mean the Lagrangian of the Standard Model… which, I agree, will be daunting… although there is no size limit on a JSON entry…

The idea of automatically parsing books is very nice and possibly faster, but note that:

- there are already various datasets of physics papers and similar content
- the result will be quite different from what we intend here, which is a high-quality dataset of physics results with clear derivations (wherever a derivation exists)

Maybe we can still use your idea to achieve the last point in some way… maybe there is a book that is already formatted like the dataset and we could use it as a starting point. But I don’t know of any.

BrandiATMuhkuh•11mo ago
This is some cool work.

Not sure if it fits, but I still have ~20k curated step-by-step solutions for mathematics (pedagogical math) lying around from my previous startup. They are all hand-curated, and could even be used for fine-tuning.

Here are some details: the dataset has 20,600 Abstract Exercises which turn into 1,193,958 Concrete Exercises.

An Abstract Exercise looks like this: a + b = c
A Concrete Exercise looks like this: 2 + 3 = 5
Total compiled file size (JSONL): 11.6 GB

And here is an explorer to see some of the data https://curriculum.amy.app/ToM
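The abstract-to-concrete fan-out described above can be sketched roughly as follows. The template string and the `instantiate` helper are hypothetical illustrations, not the dataset's actual format:

```python
# Hypothetical sketch: one Abstract Exercise template expanded into many
# Concrete Exercises. The real dataset's schema may differ.
import itertools

abstract_exercise = "a + b = c"  # template with placeholder operands

def instantiate(template: str, a: int, b: int) -> str:
    """Fill the template with concrete numbers, computing the result."""
    return (template.replace("a", str(a))
                    .replace("b", str(b))
                    .replace("c", str(a + b)))

# A small operand range yields 9 concrete exercises from one template;
# larger ranges explain the ~58x fan-out quoted above.
concrete = [instantiate(abstract_exercise, a, b)
            for a, b in itertools.product(range(1, 4), repeat=2)]
```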

ManuelSH•11mo ago
Very nice! Maybe you can put this dataset in a repository like GitHub, Kaggle, or Hugging Face if you are not doing anything with it. It could be helpful for training models.
