Teaching LLMs to compose math symbolically, not execute it

2•CheerfulDreamer•2mo ago
Right now, LLMs cannot be counted on to perform arithmetic reliably. The solution I propose: instead of teaching LLMs to execute the math, teach them to compose the mathematical expressions correctly and leave execution to a post-processing step.

My core method: place a single special token ᶜ (U+1D9C) before each element that needs computation, then compute the result afterwards. For known math that does not need to be computed, the ᶜ is simply absent.

Thus we would see in the output:

Normal (already computed): 847 * 293 = 248171

Requesting computation: ᶜ847 ᶜ* ᶜ293 ᶜ= ᶜx

The Core Mechanic: Post-Process Computation

This is what makes everything work:

1. Model generates output with ᶜ-marked expressions (fast, no blocking)
2. Generation completes
3. Parse all ᶜ-marked expressions
4. Execute computations with perfect precision
5. Substitute results back into the output
6. Show the user the final result in normal mathematical notation

The model never waits for computation results. It reasons symbolically with variables, and values are computed after generation is complete.

Multi-step example:

Model generates: "First, ᶜ847 ᶜ* ᶜ293 ᶜ= ᶜa, then ᶜa ᶜ+ ᶜ150 ᶜ= ᶜb. The answer is ᶜb."

Post-processing:

- Execute: 847 * 293 = 248171 (bind to ᶜa)
- Execute: 248171 + 150 = 248321 (bind to ᶜb)
- Substitute the bound values back into the output

User sees: "First, 847 * 293 = 248171, then 248171 + 150 = 248321. The answer is 248321."

This is how the model can compose complex calculations without blocking: it just manipulates symbols, and we handle execution separately.
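The whole pipeline can be sketched in a few lines of Python. This is a toy illustration of the idea, not a production parser: it assumes ᶜ-marked tokens are whitespace-separated, and it uses a restricted eval() as a stand-in for a real, safe expression evaluator.

```python
import re

C = "\u1d9c"  # the ᶜ marker (U+1D9C)

def postprocess(text: str) -> str:
    bindings = {}  # placeholder name -> computed value, e.g. "a" -> "248171"
    # A run of ᶜ-marked tokens like "ᶜ847 ᶜ* ᶜ293 ᶜ= ᶜa"
    pattern = re.compile(rf"(?:{C}[\w*+/=()-]+ ?)+")

    def execute(match: re.Match) -> str:
        span = match.group(0)
        trail = " " if span.endswith(" ") else ""
        tokens = [t.lstrip(C) for t in span.split()]
        if "=" not in tokens:
            # Bare placeholder(s): substitute previously bound values.
            return " ".join(bindings.get(t, t) for t in tokens) + trail
        eq = tokens.index("=")
        # Resolve earlier bindings, then execute the expression.
        expr = " ".join(bindings.get(t, t) for t in tokens[:eq])
        value = str(eval(expr, {"__builtins__": {}}))  # toy executor only
        bindings[tokens[eq + 1]] = value  # bind result to the placeholder
        return f"{expr} = {value}" + trail

    return pattern.sub(execute, text)

print(postprocess(
    "First, ᶜ847 ᶜ* ᶜ293 ᶜ= ᶜa, then ᶜa ᶜ+ ᶜ150 ᶜ= ᶜb. The answer is ᶜb."
))
```

Because re.sub calls the replacement function on matches left to right, bindings created by earlier steps (ᶜa) are already available when later steps (ᶜa ᶜ+ ᶜ150) are executed.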

Training Path 1: During Base Model Training

We augment the training set such that:

- Instructional ("Calculate 847 × 293") → add ᶜ tokens
- Expository ("The result 847 × 293 = 248,171 shows...") → leave as-is

The model learns both patterns during pretraining. When it generates ᶜ-marked expressions during training, they get post-processed (executed and substituted) before computing the loss. The model learns that ᶜ notation leads to computed results.

Training Path 2: Fine-Tuning Existing Models

*If you already have a trained base model:*

1. *Add ᶜ token to vocabulary*

2. *Generate synthetic training data:*

```
Q: "What is 847 multiplied by 293?"
A: "Let me calculate: ᶜ847 ᶜ* ᶜ293 ᶜ= ᶜx. The result is ᶜx."

Post-process: → "Let me calculate: 847 * 293 = 248171. The result is 248171."
```

3. *Train on the post-processed version.*
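Such synthetic pairs could be generated mechanically. A sketch, assuming multiplication questions only; the function name and the dict format are illustrative, not part of the proposal:

```python
import random

C = "\u1d9c"  # the ᶜ marker (U+1D9C)

def make_example():
    a, b = random.randint(100, 999), random.randint(100, 999)
    question = f"What is {a} multiplied by {b}?"
    # What the model should learn to emit: composition only, no arithmetic.
    raw = f"Let me calculate: {C}{a} {C}* {C}{b} {C}= {C}x. The result is {C}x."
    # What we actually train on: the post-processed version with executed values.
    target = f"Let me calculate: {a} * {b} = {a * b}. The result is {a * b}."
    return {"question": question, "raw": raw, "target": target}

print(make_example())
```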

Loss and rewards:

- High penalty: arithmetic errors without using ᶜ
- Small penalty: unnecessary ᶜ use (like for 2+2)
- Reward: correct ᶜ usage and accurate composition
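This asymmetric loss could be expressed as a simple reward function for RL-style fine-tuning. The numeric weights below are illustrative; the proposal only fixes their relative order:

```python
def reward(uses_marker: bool, arithmetic_correct: bool, trivial: bool) -> float:
    """Toy reward shaping for the fine-tuning objective described above."""
    if not uses_marker and not arithmetic_correct:
        return -1.0   # high penalty: raw arithmetic error without ᶜ
    if uses_marker and trivial:
        return -0.1   # small penalty: unnecessary ᶜ (e.g. for 2 + 2)
    if uses_marker:
        return 1.0    # reward: correct ᶜ usage and composition
    return 0.0        # plain correct arithmetic without the marker: neutral
```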

The model learns: "I already know math notation from base training. Now I'm learning to mark computations with ᶜ and let the execution engine handle them."

Fine-tuning is faster since the model already understands mathematical notation - you're just teaching when to use the ᶜ pattern.

Why This Works

Separation of concerns:

- Model: mathematical composition, deciding when to calculate, symbolic reasoning
- Execution engine: precise arithmetic, guaranteed correctness

Post-processing is the key: the model never waits for results during generation. It composes symbolically; we compute separately. The model doesn't waste parameters learning that 847 × 293 = 248,171. It learns "multiplication is needed here" and delegates execution.

Extensions

The same pattern applies to any deterministic operation:

Dates: ᶜdate_2023 ᶜ- ᶜdate_2022 ᶜ= ᶜdays

Counting: ᶜcount ᶜ( ᶜlist ᶜ) ᶜ= ᶜn

Memory: ᶜstore ᶜ( ᶜslot ᶜ, ᶜvalue ᶜ)
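These extensions fit naturally into an operation registry on the execution-engine side. A sketch; the operation names and argument formats here are illustrative, not from the proposal:

```python
from datetime import date

# Hypothetical registry mapping ᶜ-marked operation names to executors.
OPS = {
    "date_diff": lambda a, b: (date.fromisoformat(a) - date.fromisoformat(b)).days,
    "count": lambda items: len(items.split(",")),
}

def execute_op(name: str, *args: str) -> str:
    """Look up a registered deterministic operation and run it."""
    return str(OPS[name](*args))

print(execute_op("date_diff", "2023-01-01", "2022-01-01"))  # days between dates
print(execute_op("count", "a,b,c"))                          # items in a list
```

New operations only require a new registry entry; the model's job stays the same: mark the call with ᶜ and leave a placeholder for the result.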

Public Domain

Anyone may use, implement, modify, or build upon this approach for any purpose, commercial or non-commercial, without restriction. I specifically disclaim any patent rights and intend this publication to serve as prior art preventing future patent restrictions.

My goal is to help advance AI capabilities in a way that benefits everyone. All praise to Jesus and God who created this amazing universe for us to enjoy.