
Gravity coupling matches the 128-bit integer limit to 6 ppm

2•albert_roca•1mo ago
I'm asking for a sanity check on the hierarchy problem, i.e. the ≈ 10^38 gap between gravity and the strong force.

The results of this test suggest the universe might use a 128-bit integer architecture.

1. HYPOTHESIS

• The proton is defined at the 64-bit integer limit (2^64).

• Gravity is defined at the 128-bit integer limit (2^128).

If this is true, the gravitational coupling constant (alpha_G) should be the least significant bit (LSB) of this structure (2^-127) (1), modulated by the same geometric cost factor found in the proton mass (1 + alpha / 3) (2).

(1) The model derives G from the proton mass scaling, resulting in a factor of 2 / 2^128, which simplifies to 2^-127.

(2) The term (1 + alpha / 3) is the geometric interaction cost derived for the proton mass in the holographic model.

2. FORMULA

  alpha_G_model = (1 + alpha / 3)^2 / 2^127

3. DATA (CODATA 2022)

  mp    = 1.67262192e-27 kg   (proton mass)
  hbar  = 1.05457181e-34 J s
  c     = 299792458 m/s
  G_exp = 6.67430e-11         (experimental G)
  alpha = 7.29735256e-3       (fine-structure constant)

4. VERIFICATION

Standard experimental coupling from G_exp:

  alpha_G_exp = (G_exp * mp^2) / (hbar * c)
  Value: 5.906063e-39

Prediction using ONLY alpha and powers of 2 (no G, no mass):

  alpha_G_model = (1 + 0.00729735256 / 3)^2 * 2^-127

  2^-127 (raw LSB)   ≈ 5.877471e-39
  Correction factor  ≈ 1.004870
  alpha_G_model      ≈ 5.906099e-39

5. RESULTS

  Experimental: 5.906063e-39
  Predicted:    5.906099e-39
  Delta:        3.6e-44
  Discrepancy:  6 ppm

6. PYTHON

  # Constants (CODATA 2022)
  mp    = 1.67262192e-27
  hbar  = 1.05457181e-34
  c     = 299792458
  G     = 6.67430e-11
  alpha = 7.29735256e-3

  # 1. Experimental coupling
  alpha_G_exp = (G * mp**2) / (hbar * c)

  # 2. 128-bit Model Prediction
  alpha_G_model = ((1 + alpha/3)**2) * (2**-127)

  # Comparison
  ppm = abs(alpha_G_model - alpha_G_exp) / alpha_G_exp * 1e6

  print(f"Exp:   {alpha_G_exp:.6e}")
  print(f"Model: {alpha_G_model:.6e}")
  print(f"Diff:  {ppm:.2f} ppm")

7. QUESTION

In your opinion, is this a numerological coincidence, or a structural feature?

Preprint: https://doi.org/10.5281/zenodo.17847770

Comments

fjfaase•1mo ago
What is so special about 2^127?
albert_roca•1mo ago
The model identifies proton mass stability with the 64-bit limit (2^64). Since the gravitational interaction scales with m_p^2, the hierarchy gap corresponds to the square of that limit:

  (2^64)^2 = 2^128

The geometric derivation involves a factor of 2, linked to the square of the holographic pixel diagonal, (√2)^2:

  2 / 2^128 = 2^−127

2^−127 represents the least significant bit (LSB) of a 128-bit integer.
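The power-of-two bookkeeping above can be verified exactly with Python's arbitrary-precision integers; a trivial sanity check:

```python
# Check the power-of-two identities above, using exact integer arithmetic.
assert (2**64) ** 2 == 2**128   # hierarchy gap as the square of the 64-bit limit
assert 2 * 2**127 == 2**128     # hence 2 / 2^128 = 2^-127

# The LSB of a 128-bit integer, as a float:
print(f"2^-127 = {2.0**-127:.6e}")   # prints 2^-127 = 5.877472e-39
```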
fjfaase•1mo ago
Where does the 64 come from, and what do you mean by 'proton mass stability'? The proton is believed to be stable because it is the lowest-mass baryon. GUT theories say it might be unstable, with a half-life of at least 10^34 years. How does that relate to your number 64? Does the number have a unit?
albert_roca•1mo ago
64 is dimensionless. It comes from the model's holographic scaling law, where mass scales with surface complexity (m ∼ 4^i). The proton appears at i = 32.

  4^32 = (2^2)^32 = 2^64

2^64 seems to be the minimum information density required to geometrically define a stable volume. Proton stability implies that nothing simpler can sustain a 3D topology. This limit defines the object's topological complexity, not its lifespan.

Please note that the model is being developed with AI assistance, and I realize that the ontological basis needs further refinement.

The proton mass (m_p) is derived as:

  m_p = ((√2 · m_P) / 4^32) · (1 + α / 3)
  m_p = ((√2 · m_P) / √4^64) · (1 + α / 3)
  m_p ≈ 1.67260849206 × 10^-27 kg
  Experimental value: 1.67262192595(52) × 10^-27 kg
  ∆: 8 ppm.

G is derived as:

  G = (ħ · c · 2 · (1 + α / 3)^2) / (m_p^2 · 4^64)
  G ≈ 6.6742439706 × 10^-11
  Experimental value: 6.67430(15) × 10^-11 m^3 · kg^-1 · s^-2
  ∆: 8 ppm.

α_G is derived as:

  α_G = (2 · (1 + α / 3)^2) / 4^64
  α_G ≈ 5.9061 × 10^-39
  Experimental value: ≈ 5.906 × 10^-39
  ∆: 8 ppm.

The terms (1 + α / 3) and 4^64 appear in all three derivations, and all three show the same discrepancy from the experimental value (8 ppm). (Note: there is a typo in the expected output of the previous Python script; it should yield a discrepancy of 8.39 ppm, not 6 ppm.)
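The three derivations can be cross-checked numerically from the CODATA values quoted in the original post. The sketch below is an illustration, not part of the model: it takes the Planck mass as m_P = sqrt(ħc/G_exp), computed from those same constants, so the m_p check is partly circular and the exact ppm figures depend on rounding and on the value chosen for m_P.

```python
# Cross-check of the three derivations (CODATA 2022 inputs).
hbar   = 1.05457181e-34   # J s
c      = 299792458        # m/s
G_exp  = 6.67430e-11      # m^3 kg^-1 s^-2
alpha  = 7.29735256e-3    # fine-structure constant
mp_exp = 1.67262192e-27   # kg (proton mass)

m_P = (hbar * c / G_exp) ** 0.5   # Planck mass, derived from G_exp (partly circular)
k   = 1 + alpha / 3               # geometric cost factor

mp_model = (2 ** 0.5 * m_P / 4 ** 32) * k                   # proton mass
G_model  = hbar * c * 2 * k ** 2 / (mp_exp ** 2 * 4 ** 64)  # Newton's constant
aG_model = 2 * k ** 2 / 4 ** 64                             # gravitational coupling
aG_exp   = G_exp * mp_exp ** 2 / (hbar * c)

for name, model, exp in [("m_p", mp_model, mp_exp),
                         ("G  ", G_model, G_exp),
                         ("a_G", aG_model, aG_exp)]:
    print(f"{name}: model={model:.6e}  exp={exp:.6e}  "
          f"delta={abs(model - exp) / exp * 1e6:.1f} ppm")
```

Each line prints a model value, the corresponding experimental value, and their relative difference in ppm; all three land within roughly 10 ppm of experiment.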

The model also derives α as:

  α^-1 = (4 · π^3 + π^2 + π) - (α / 24)
  α^-1 = 137.0359996
  Experimental value: 137.0359991.
  ∆: < 0.005 ppm.
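The α relation above is implicit (α appears on both sides), but it can be solved by fixed-point iteration in a few lines; a minimal check:

```python
import math

# Solve alpha^-1 = (4*pi^3 + pi^2 + pi) - alpha/24 by fixed-point iteration.
base  = 4 * math.pi ** 3 + math.pi ** 2 + math.pi   # ~137.0363
alpha = 1 / base                                    # initial guess
for _ in range(10):
    alpha = 1 / (base - alpha / 24)

print(f"alpha^-1 = {1 / alpha:.7f}")   # compare with CODATA: 137.0359991
```

The iteration converges after a single step, since the α/24 correction is ~3 × 10^-4.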

Is it statistically plausible that this happens by chance? Are there any hidden tricks? AI will find a possible conceptualization for (almost) anything, but I'm trying to get an informed human point of view.