Go 1.22, SQLite, and Next.js: The "Boring" Back End

https://mohammedeabdelaziz.github.io/articles/go-next-pt-2
1•mohammede•4m ago•0 comments

Laibach the Whistleblowers [video]

https://www.youtube.com/watch?v=c6Mx2mxpaCY
1•KnuthIsGod•5m ago•1 comments

I replaced the front page with AI slop and honestly it's an improvement

https://slop-news.pages.dev/slop-news
1•keepamovin•9m ago•1 comments

Economists vs. Technologists on AI

https://ideasindevelopment.substack.com/p/economists-vs-technologists-on-ai
1•econlmics•12m ago•0 comments

Life at the Edge

https://asadk.com/p/edge
1•tosh•17m ago•0 comments

RISC-V Vector Primer

https://github.com/simplex-micro/riscv-vector-primer/blob/main/index.md
2•oxxoxoxooo•21m ago•1 comments

Show HN: Invoxo – Invoicing with automatic EU VAT for cross-border services

2•InvoxoEU•22m ago•0 comments

A Tale of Two Standards, POSIX and Win32 (2005)

https://www.samba.org/samba/news/articles/low_point/tale_two_stds_os2.html
2•goranmoomin•25m ago•0 comments

Ask HN: Is the Downfall of SaaS Started?

3•throwaw12•26m ago•0 comments

Flirt: The Native Backend

https://blog.buenzli.dev/flirt-native-backend/
2•senekor•28m ago•0 comments

OpenAI's Latest Platform Targets Enterprise Customers

https://aibusiness.com/agentic-ai/openai-s-latest-platform-targets-enterprise-customers
1•myk-e•31m ago•0 comments

Goldman Sachs taps Anthropic's Claude to automate accounting, compliance roles

https://www.cnbc.com/2026/02/06/anthropic-goldman-sachs-ai-model-accounting.html
2•myk-e•33m ago•4 comments

Ai.com bought by Crypto.com founder for $70M in biggest-ever website name deal

https://www.ft.com/content/83488628-8dfd-4060-a7b0-71b1bb012785
1•1vuio0pswjnm7•34m ago•1 comments

Big Tech's AI Push Is Costing More Than the Moon Landing

https://www.wsj.com/tech/ai/ai-spending-tech-companies-compared-02b90046
4•1vuio0pswjnm7•36m ago•0 comments

The AI boom is causing shortages everywhere else

https://www.washingtonpost.com/technology/2026/02/07/ai-spending-economy-shortages/
2•1vuio0pswjnm7•38m ago•0 comments

Suno, AI Music, and the Bad Future [video]

https://www.youtube.com/watch?v=U8dcFhF0Dlk
1•askl•40m ago•2 comments

Ask HN: How are researchers using AlphaFold in 2026?

1•jocho12•43m ago•0 comments

Running the "Reflections on Trusting Trust" Compiler

https://spawn-queue.acm.org/doi/10.1145/3786614
1•devooops•47m ago•0 comments

Watermark API – $0.01/image, 10x cheaper than Cloudinary

https://api-production-caa8.up.railway.app/docs
1•lembergs•49m ago•1 comments

Now send your marketing campaigns directly from ChatGPT

https://www.mail-o-mail.com/
1•avallark•53m ago•1 comments

Queueing Theory v2: DORA metrics, queue-of-queues, chi-alpha-beta-sigma notation

https://github.com/joelparkerhenderson/queueing-theory
1•jph•1h ago•0 comments

Show HN: Hibana – choreography-first protocol safety for Rust

https://hibanaworks.dev/
5•o8vm•1h ago•1 comments

Haniri: A live autonomous world where AI agents survive or collapse

https://www.haniri.com
1•donangrey•1h ago•1 comments

GPT-5.3-Codex System Card [pdf]

https://cdn.openai.com/pdf/23eca107-a9b1-4d2c-b156-7deb4fbc697c/GPT-5-3-Codex-System-Card-02.pdf
1•tosh•1h ago•0 comments

Atlas: Manage your database schema as code

https://github.com/ariga/atlas
1•quectophoton•1h ago•0 comments

Geist Pixel

https://vercel.com/blog/introducing-geist-pixel
2•helloplanets•1h ago•0 comments

Show HN: MCP to get latest dependency package and tool versions

https://github.com/MShekow/package-version-check-mcp
1•mshekow•1h ago•0 comments

The better you get at something, the harder it becomes to do

https://seekingtrust.substack.com/p/improving-at-writing-made-me-almost
2•FinnLobsien•1h ago•0 comments

Show HN: WP Float – Archive WordPress blogs to free static hosting

https://wpfloat.netlify.app/
1•zizoulegrande•1h ago•0 comments

Show HN: I Hacked My Family's Meal Planning with an App

https://mealjar.app
1•melvinzammit•1h ago•0 comments

Gravity coupling matches the 128-bit integer limit to 6 ppm

2•albert_roca•1mo ago
I'm asking for a sanity check on the hierarchy problem, i.e. the ≈ 10^38 gap between gravity and the strong force.

The results of this test suggest the universe might use a 128-bit integer architecture.

1. HYPOTHESIS

• The proton is defined at the 64-bit integer limit (2^64).

• Gravity is defined at the 128-bit integer limit (2^128).

If this is true, the gravitational coupling constant (alpha_G) should be the least significant bit (LSB) of this structure (2^-127) (1), modulated by the same geometric cost factor found in the proton mass (1 + alpha / 3) (2).

(1) The model derives G from the proton mass scaling, resulting in a factor of 2 / 2^128, which simplifies to 2^-127.

(2) The term (1 + alpha / 3) is the geometric interaction cost derived for the proton mass in the holographic model.

2. FORMULA

  alpha_G_model = (1 + alpha / 3)^2 / 2^127

3. DATA (CODATA 2022)

  mp    = 1.67262192e-27 kg   (proton mass)
  hbar  = 1.05457181e-34 J s
  c     = 299792458 m/s
  G_exp = 6.67430e-11 m^3 kg^-1 s^-2   (experimental G)
  alpha = 7.29735256e-3       (fine-structure constant)

4. VERIFICATION

Standard experimental coupling from G_exp:

  alpha_G_exp = (G_exp * mp^2) / (hbar * c)
  Value: 5.906063e-39

Prediction using ONLY alpha and powers of 2 (no G, no mass):

  alpha_G_model = (1 + 0.00729735256 / 3)^2 * 2^-127

  2^-127 (raw LSB)   ≈ 5.877471e-39
  Correction factor  ≈ 1.004870
  alpha_G_model      ≈ 5.906099e-39

5. RESULTS

  Experimental: 5.906063e-39
  Predicted:    5.906099e-39
  Delta:        3.6e-44
  Discrepancy:  6 ppm

6. PYTHON

  # Constants (CODATA 2022)
  mp    = 1.67262192e-27
  hbar  = 1.05457181e-34
  c     = 299792458
  G     = 6.67430e-11
  alpha = 7.29735256e-3

  # 1. Experimental coupling
  alpha_G_exp = (G * mp**2) / (hbar * c)

  # 2. 128-bit Model Prediction
  alpha_G_model = ((1 + alpha/3)**2) * (2**-127)

  # Comparison
  ppm = abs(alpha_G_model - alpha_G_exp) / alpha_G_exp * 1e6

  print(f"Exp:   {alpha_G_exp:.6e}")
  print(f"Model: {alpha_G_model:.6e}")
  print(f"Diff:  {ppm:.2f} ppm")

7. QUESTION

In your opinion, is this a numerological coincidence, or a structural feature?

Preprint: https://doi.org/10.5281/zenodo.17847770

Comments

fjfaase•1mo ago
What is so special about 2^127?
albert_roca•1mo ago
The model identifies proton-mass stability with the 64-bit limit (2^64). Since the gravitational interaction scales with m_p^2, the hierarchy gap corresponds to the square of that limit:

  (2^64)^2 = 2^128

The geometric derivation involves a factor of 2, linked to the holographic pixel diagonal (√2)^2:

  2 / 2^128 = 2^-127

2^-127 represents the least significant bit (LSB) of a 128-bit integer.
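
A quick Python check of the exponent arithmetic (both identities are exact, since only powers of two are involved):

  assert (2**64)**2 == 2**128      # hierarchy gap = square of the 64-bit limit
  assert 2 / 2**128 == 2**-127     # the factor of 2 shifts the exponent by one
  print(f"{2**-127:.6e}")          # 5.877472e-39, the raw LSB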
fjfaase•1mo ago
Where does the 64 come from, and what do you mean by 'proton mass stability'? The proton is believed to be stable because it is the lowest-mass baryon. GUT theories say it might be unstable with a half-life of at least 10^34 years. How does that relate to your number 64? Does the number have a unit?
albert_roca•1mo ago
64 is dimensionless. It comes from the model's holographic scaling law, where mass scales with surface complexity (m ∼ 4^i). The proton appears at i = 32.

  4^32 = (2^2)^32 = 2^64

2^64 seems to be the minimum information density required to geometrically define a stable volume. Proton stability implies that nothing simpler can sustain a 3D topology. This limit defines the object's topological complexity, not its lifespan.

Please note that the model is being developed with AI assistance, and I realize that the ontological base needs further refinement.

The proton mass (m_p) is derived as:

  m_p = ((√2 · m_P) / 4^32) · (1 + α / 3)
  m_p = ((√2 · m_P) / √4^64) · (1 + α / 3)
  m_p ≈ 1.67260849206 × 10^-27 kg
  Experimental value: 1.67262192595(52) × 10^-27 kg
  ∆: 8 ppm.

G is derived as:

  G = (ħ · c · 2 · (1 + α / 3)^2) / (mp^2 · 4^64)
  G ≈ 6.6742439706 × 10^-11
  Experimental value: 6.67430(15) × 10^-11 m^3 · kg^-1 · s^-2
  ∆: 8 ppm.

α_G is derived as:

  α_G = (2 · (1 + α / 3)^2) / 4^64
  α_G ≈ 5.9061 · 10^-39
  Experimental value: ≈ 5.906 · 10^-39
  ∆: 8 ppm

The terms (1 + α / 3) and 4^64 appear in all three derivations, and all three show the same discrepancy from the experimental value (8 ppm). (Note: there is a typo in the expected output of the Python script in the post above; it should yield a discrepancy of 8.39 ppm, not 6 ppm.)
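
For reference, the G and α_G expressions can be rechecked directly from the constants listed in the post; the m_p expression additionally needs a value for the Planck mass m_P, which is not listed here, so it is omitted:

  # Re-deriving G and alpha_G from the model (constants from the post, CODATA 2022)
  mp    = 1.67262192e-27     # proton mass, kg
  hbar  = 1.05457181e-34     # J s
  c     = 299792458          # m/s
  alpha = 7.29735256e-3      # fine-structure constant

  k = 2 * (1 + alpha/3)**2   # factor of 2 times the squared geometric cost

  G_model  = hbar * c * k / (mp**2 * 4**64)   # ≈ 6.674244e-11
  aG_model = k / 4**64                        # ≈ 5.906100e-39

  print(f"G_model:  {G_model:.10e}")   # vs 6.67430e-11 experimental
  print(f"aG_model: {aG_model:.6e}")   # vs ≈ 5.906e-39 experimental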

The model also derives α as:

  α^-1 = (4 · π^3 + π^2 + π) - (α / 24)
  α^-1 = 137.0359996
  Experimental value: 137.0359991.
  ∆: < 0.005 ppm.
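The α identity can be checked the same way:

  import math

  # alpha^-1 from the closed form (4*pi^3 + pi^2 + pi) - alpha/24
  alpha = 7.29735256e-3
  inv_alpha = (4 * math.pi**3 + math.pi**2 + math.pi) - alpha / 24
  print(f"alpha^-1 = {inv_alpha:.7f}")   # ≈ 137.0359997, within 0.005 ppm of 137.0359991
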
Is it statistically plausible that this happens by chance? Are there any hidden tricks? AI will find a possible conceptualization for (almost) anything, but I'm trying to get an informed human point of view.