frontpage.

Evolution: Training neural networks with genetic selection achieves 81% on MNIST

https://github.com/A1CST/GENREG_ALPHA_MNIST
2•AsyncVibes•2h ago

Comments

AsyncVibes•2h ago
I've been working on GENREG (Genetic Regulatory Networks), an evolutionary learning system that trains neural networks without gradients or backpropagation. Instead of calculating loss derivatives, genomes accumulate "trust" based on task performance and reproduce through trust-based selection. Training uses a GPU, but inference runs on low-end CPUs. Today I hit 81.47% accuracy on the official MNIST test set using pure evolutionary pressure.

The Setup

- Architecture: simple MLP (784 → 64 → 10)
- Population: 200 competing genomes
- Selection: trust-based (high performers reproduce)
- Mutation: Gaussian noise on offspring weights
- Training time: ~600 generations, ~40 minutes

Results

- MNIST (64 neurons, 50K params): 81.47% test accuracy
- Best digits: 0 (94%), 1 (97%), 6 (85%)
- Hardest: 5 (61%), 8 (74%), 3 (75%)

The 32-neuron version (25K params) achieved 72.52% - competitive performance with half the parameters.

UMAP embeddings reveal the learning strategy:

- 32-neuron model: can't separate all 10 digits. It masters 0 and 1 (>90%), but confusable digits like 5/3/8 collapse into overlapping clusters.
- 64-neuron model: clean 10-cluster topology with distinct regions. Errors sit at decision boundaries between similar digits.

Key Discoveries
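
The loop described above (trust-ranked selection plus Gaussian mutation on offspring) can be sketched in a few lines of NumPy. Everything here is illustrative: the function names, hyperparameters, and the bias-free MLP are my simplifications for exposition, not the GENREG code.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_genome(sizes=(784, 64, 10)):
    # One genome = the weight matrices of a small MLP (784 -> 64 -> 10).
    return [rng.normal(0, 0.1, (a, b)) for a, b in zip(sizes, sizes[1:])]

def forward(genome, x):
    h = np.tanh(x @ genome[0])           # hidden layer
    return (h @ genome[1]).argmax(-1)    # predicted digit

def fitness(genome, xs, ys):
    # Accuracy over a fixed evaluation batch; this plays the role of
    # the "trust" signal that drives selection.
    return (forward(genome, xs) == ys).mean()

def evolve(pop, xs, ys, sigma=0.05):
    # Trust-based selection: the top half reproduces; children get
    # Gaussian noise - the child mutation the post says drives exploration.
    scored = sorted(pop, key=lambda g: fitness(g, xs, ys), reverse=True)
    parents = scored[: len(pop) // 2]
    children = [[w + rng.normal(0, sigma, w.shape) for w in p] for p in parents]
    return parents + children

# Toy run on random data, just to show the shape of the loop.
xs = rng.normal(size=(200, 784))
ys = rng.integers(0, 10, 200)
pop = [init_genome() for _ in range(20)]
for gen in range(5):
    pop = evolve(pop, xs, ys)
```

No derivatives anywhere: the only signal flowing back into the population is the ranking induced by `fitness`.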

- Fitness signal stability is critical: training plateaued at 65% with 1 random image per digit - the variance was too high. Switching to 20 images per digit fixed this immediately.
- Child mutation drives exploration: mutation during reproduction matters far more than mutating the existing population. Disabling it completely flatlined learning.
- Capacity forces trade-offs: the 32-neuron model initially masters the easy digits (0, 1), then evolutionary pressure forces it to sacrifice some accuracy there to improve the hard digits - a different optimization dynamic than gradient descent.
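
The first point has a simple back-of-the-envelope explanation. If we model a genome's fitness estimate as the mean of n Bernoulli trials with true accuracy p (a hypothetical simplification, not GENREG's actual evaluation), the noise on the estimate shrinks like 1/sqrt(n):

```python
def fitness_std(n_images, p=0.65):
    # Standard deviation of an accuracy estimate from n_images
    # independent Bernoulli(p) samples.
    return (p * (1 - p) / n_images) ** 0.5

# 1 image per digit (10 samples) vs 20 per digit (200 samples):
print(round(fitness_std(10), 3))   # 0.151: noise swamps real genome differences
print(round(fitness_std(200), 3))  # 0.034: ~4.5x less noise
```

With only 10 evaluation images, two genomes whose true accuracies differ by a few points are frequently mis-ranked, so selection pressure points the wrong way; 200 images make the ranking stable enough to act on.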

Most MNIST baselines reach 97-98% using 200K+ parameters. GENREG achieves 81% with 50K params and 72% with 25K params, showing strong parameter efficiency despite a lower absolute ceiling.

Other Results

- Alphabet recognition (A-Z): 100% mastery in ~1800 generations
- Currently testing generalization across 30 font variations

Limitations

- Speed: ~40 minutes to 81% vs. ~5-10 minutes for gradient descent
- Accuracy ceiling: haven't beaten gradient baselines yet
- Scalability: unclear how this scales to larger problems

Current Experiments

- Architecture sweep (16/32/64/128/256 neurons)
- Mutation rate ablation studies
- Curriculum learning emergence
- Can we hit 90%+ on MNIST? What is the minimum viable capacity for digit recognition?
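
The quoted parameter counts follow directly from the MLP shapes; a quick check (weights plus biases for a single hidden layer):

```python
def mlp_params(n_in, n_hidden, n_out):
    # weights + biases for an MLP with one hidden layer
    return n_in * n_hidden + n_hidden + n_hidden * n_out + n_out

print(mlp_params(784, 64, 10))  # 50890 -> the "~50K params" figure
print(mlp_params(784, 32, 10))  # 25450 -> the "~25K params" figure
```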

Ship comms: how do they work? – lightning talk by Matt Kulukundis [video]

https://www.youtube.com/watch?v=RFvnXCHS57M
1•leoc•5m ago•0 comments

Show HN: Beauty Arena – An Elo-based experiment to measure aesthetics

https://beautyarena.vercel.app
1•railing1024•5m ago•0 comments

"For You" Feed for Science

https://www.researchhub.com/
1•Tardigrade10•6m ago•0 comments

Outside, Dungeon, Town: Integrating the Three Places in Videogames (2024)

https://keithburgun.net/outside-dungeon-town-integrating-the-three-places-in-videogames/
1•vector_spaces•7m ago•0 comments

39C3 – Breaking architecture barriers: Running x86 games and apps on ARM

https://www.youtube.com/watch?v=3yDXyW1WERg
1•doener•8m ago•0 comments

Stranger Things Creator: Turn Off TV's Garbage Settings

https://www.instagram.com/reel/DRiaP2zEow4/
1•next_xibalba•9m ago•1 comment

Why A.I. Didn't Transform Our Lives in 2025

https://www.newyorker.com/culture/2025-in-review/why-ai-didnt-transform-our-lives-in-2025
1•petethomas•10m ago•0 comments

Show HN: McNeal – Encrypted messaging over acoustic channels

https://github.com/AntonioLambertTech/McnealV2
1•netIsNewWrldOrd•11m ago•0 comments

Show HN: SAFi, a Governance Engine for LLMs

https://safi.selfalignmentframework.com
1•jnamaya•13m ago•0 comments

'Capital of capital': how Abu Dhabi rose as a sovereign wealth power

https://www.ft.com/content/711e670c-4832-49cb-8b10-5018103ec785
1•petethomas•17m ago•0 comments

The Magic of Merlin

https://www.birds.cornell.edu/home/the-magic-of-merlin/
1•Group_B•21m ago•1 comment

Why most software startups don't need VCs anymore (most)

https://sderosiaux.substack.com/p/why-most-software-startups-dont-need
1•chtefi•23m ago•0 comments

Episteme: A New System for Science

https://episteme.com/
1•momeara•24m ago•0 comments

Comparative Overview of EU-US Vehicle Standards

https://etsc.eu/comparative-overview-eu-us-vehicle-standards/
4•throw0101c•30m ago•0 comments

Show HN: Giselle – open-source visual editor for building AI workflows

https://github.com/giselles-ai/giselle
1•codenote•32m ago•0 comments

Don't call yourself a programmer, and other career advice (2011)

https://www.kalzumeus.com/2011/10/28/dont-call-yourself-a-programmer/
1•teleforce•33m ago•1 comment

Five myths about learning a new language – busted

https://theconversation.com/five-myths-about-learning-a-new-language-busted-266946
2•zeristor•34m ago•0 comments

Miniray – A WGSL Minifier

https://hugodaniel.com/posts/miniray/
1•todsacerdoti•35m ago•0 comments

MongoDB Server Security Update, December 2025

https://www.mongodb.com/company/blog/news/mongodb-server-security-update-december-2025
12•plorkyeran•38m ago•1 comment

The Manus Debate and Why Some Bubbly AI Moonshots Aren't Bubbles

https://substack.com/inbox/post/182749582
1•theno0b•42m ago•1 comment

Show HN: DynamicHorizon – Dynamic Island for macOS

https://www.dynamichorizon.app
2•DHDEV•56m ago•0 comments

The Second Great Error Model Convergence

https://matklad.github.io/2025/12/29/second-error-model-convergence.html
1•kartikarti•59m ago•0 comments

Hyaluronic Acid in Topical Applications: Hero Molecule in the Cosmetics Industry

https://www.mdpi.com/2218-273X/15/12/1656
2•PaulHoule•1h ago•0 comments

Robots Are Hard – Revisiting the original Roomba and its simple architecture

https://robotsinplainenglish.com/e/2025-12-27-roomba.html
3•ArmageddonIt•1h ago•0 comments

Capital in the 22nd Century

https://philiptrammell.substack.com/p/capital-in-the-22nd-century
2•jger15•1h ago•0 comments

Will Skyrocketing Silver Prices Make Photo Film More Expensive?

https://petapixel.com/2025/12/29/will-skyrocketing-silver-prices-make-photo-film-even-more-expens...
1•geox•1h ago•0 comments

Show HN: J_PyDB – tiny encrypted file-based Python DB

https://github.com/NovaDev404/J_PyDB
1•SuperGamer474•1h ago•1 comment

Stranger Things Creator Says Turn Off "Garbage" Settings

https://screenrant.com/stranger-things-creator-turn-off-settings-premiere/
29•1970-01-01•1h ago•16 comments

Bye Bye Big Tech: How I Migrated to an Almost All-EU Stack (and Saved 500€/Year)

https://www.zeitgeistofbytes.com/p/bye-bye-big-tech-how-i-migrated-to
5•alexcos•1h ago•0 comments

Yae – Powerful yet Minimal Nix Dependency Manager

https://github.com/Fuwn/yae
1•MrJulia•1h ago•0 comments