
Beyond Agentic Coding

https://haskellforall.com/2026/02/beyond-agentic-coding
1•todsacerdoti•25s ago•0 comments

OpenClaw ClawHub Broken Windows Theory – If basic sorting isn't working, what is?

https://www.loom.com/embed/e26a750c0c754312b032e2290630853d
1•kaicianflone•2m ago•0 comments

OpenBSD Copyright Policy

https://www.openbsd.org/policy.html
1•Panino•3m ago•0 comments

OpenClaw Creator: Why 80% of Apps Will Disappear

https://www.youtube.com/watch?v=4uzGDAoNOZc
1•schwentkerr•7m ago•0 comments

What Happens When Technical Debt Vanishes?

https://ieeexplore.ieee.org/document/11316905
1•blenderob•8m ago•0 comments

AI Is Finally Eating Software's Total Market: Here's What's Next

https://vinvashishta.substack.com/p/ai-is-finally-eating-softwares-total
1•gmays•8m ago•0 comments

Computer Science from the Bottom Up

https://www.bottomupcs.com/
1•gurjeet•9m ago•0 comments

Show HN: I built a toy compiler as a young dev

https://vire-lang.web.app
1•xeouz•10m ago•0 comments

You don't need Mac mini to run OpenClaw

https://runclaw.sh
1•rutagandasalim•11m ago•0 comments

Learning to Reason in 13 Parameters

https://arxiv.org/abs/2602.04118
1•nicholascarolan•13m ago•0 comments

Convergent Discovery of Critical Phenomena Mathematics Across Disciplines

https://arxiv.org/abs/2601.22389
1•energyscholar•13m ago•1 comments

Ask HN: Will GPU and RAM prices ever go down?

1•alentred•14m ago•0 comments

From hunger to luxury: The story behind the most expensive rice (2025)

https://www.cnn.com/travel/japan-expensive-rice-kinmemai-premium-intl-hnk-dst
2•mooreds•15m ago•0 comments

Substack makes money from hosting Nazi newsletters

https://www.theguardian.com/media/2026/feb/07/revealed-how-substack-makes-money-from-hosting-nazi...
5•mindracer•16m ago•1 comments

A New Crypto Winter Is Here and Even the Biggest Bulls Aren't Certain Why

https://www.wsj.com/finance/currencies/a-new-crypto-winter-is-here-and-even-the-biggest-bulls-are...
1•thm•16m ago•0 comments

Moltbook was peak AI theater

https://www.technologyreview.com/2026/02/06/1132448/moltbook-was-peak-ai-theater/
1•Brajeshwar•16m ago•0 comments

Why Claude Cowork is a math problem Indian IT can't solve

https://restofworld.org/2026/indian-it-ai-stock-crash-claude-cowork/
1•Brajeshwar•16m ago•0 comments

Show HN: Built a space travel calculator with vanilla JavaScript v2

https://www.cosmicodometer.space/
2•captainnemo729•17m ago•0 comments

Why a 175-Year-Old Glassmaker Is Suddenly an AI Superstar

https://www.wsj.com/tech/corning-fiber-optics-ai-e045ba3b
1•Brajeshwar•17m ago•0 comments

Micro-Front Ends in 2026: Architecture Win or Enterprise Tax?

https://iocombats.com/blogs/micro-frontends-in-2026
2•ghazikhan205•19m ago•0 comments

These White-Collar Workers Actually Made the Switch to a Trade

https://www.wsj.com/lifestyle/careers/white-collar-mid-career-trades-caca4b5f
1•impish9208•19m ago•1 comments

The Wonder Drug That's Plaguing Sports

https://www.nytimes.com/2026/02/02/us/ostarine-olympics-doping.html
1•mooreds•20m ago•0 comments

Show HN: Which chef knife steels are good? Data from 540 Reddit threads

https://new.knife.day/blog/reddit-steel-sentiment-analysis
1•p-s-v•20m ago•0 comments

Federated Credential Management (FedCM)

https://ciamweekly.substack.com/p/federated-credential-management-fedcm
1•mooreds•20m ago•0 comments

Token-to-Credit Conversion: Avoiding Floating-Point Errors in AI Billing Systems

https://app.writtte.com/read/kZ8Kj6R
1•lasgawe•21m ago•1 comments

The Story of Heroku (2022)

https://leerob.com/heroku
1•tosh•21m ago•0 comments

Obey the Testing Goat

https://www.obeythetestinggoat.com/
1•mkl95•21m ago•0 comments

Claude Opus 4.6 extends LLM Pareto frontier

https://michaelshi.me/pareto/
1•mikeshi42•22m ago•0 comments

Brute Force Colors (2022)

https://arnaud-carre.github.io/2022-12-30-amiga-ham/
1•erickhill•25m ago•0 comments

Google Translate apparently vulnerable to prompt injection

https://www.lesswrong.com/posts/tAh2keDNEEHMXvLvz/prompt-injection-in-google-translate-reveals-ba...
1•julkali•25m ago•0 comments

Graph Continuous Thought Machines

1•Sai-dewa•6mo ago
We propose a method by which the connections between the dispositional nodes of a neural graph continuous thought machine (GCTM) may be designed to be faithful to a human brain. A GCTM replaces the synapse-level and neuron-level models with a graph CNN. In some sense, the nodes of the graph at any one time represent the instantiation of the nodes of the dispositional neural model it is part of, instantiating only those nodes that are currently firing. The GCNN then outputs the next graph as the system searches graph space for solutions, guided by learnt property vectors. The outputs from its neural synchronization matrix then modulate the attention given to the inputs as well as to the nodes of the dispositional network. In this way it designs the dispositional neural model's connections (the disposition for particular graphs to follow others).

We then employ neural training modules: spiking neural networks whose nodes are mapped to keys of a musical keyboard. When exposed to the state of a teacher system, the nodes are trained to harmonize musically; when exposed to the state of the untrained agent, they are dissonant. The agent then tries to maximise consonance in the spiking network by using it as a reward signal, and by this method it is trained to perform like the teacher system.

We introduce text-conditioned neural training modules, which condition the input on text, and we show a method to modulate not just the behavior of the system but also the connectivity of the dispositional network of a GCTM. https://www.researchgate.net/publication/392733228_Text_Conditioned_Self_Architecture_Search_for_Building_Brain_Like_Connectivity_by_Describing_It
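The consonance-based reward described above can be sketched roughly as follows. The interval scores, the node-to-note mapping, and the function name `consonance_reward` are illustrative assumptions, not values from the paper:

```python
# Hypothetical sketch of the "musical" reward signal: spiking nodes are mapped
# to keyboard notes (MIDI numbers here), and the reward is the consonance of
# the chord currently sounding. The interval table is an assumption: unisons
# and perfect fifths rate high, minor seconds and tritones low.
CONSONANCE = {0: 1.0, 1: 0.1, 2: 0.4, 3: 0.6, 4: 0.7, 5: 0.8,
              6: 0.2, 7: 0.9, 8: 0.6, 9: 0.7, 10: 0.4, 11: 0.1}

def consonance_reward(spiking_notes):
    """Average pairwise consonance of the notes currently sounding."""
    notes = sorted(spiking_notes)
    pairs = [(a, b) for i, a in enumerate(notes) for b in notes[i + 1:]]
    if not pairs:
        return 0.0  # a lone spike has no intervals to score
    return sum(CONSONANCE[(b - a) % 12] for a, b in pairs) / len(pairs)
```

Under this reading, a teacher-aligned spiking pattern lands on consonant intervals (high reward) while an untrained agent's pattern lands on dissonant ones (low reward), giving the agent a dense scalar signal to maximise.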

Comments

Sai-dewa•6mo ago
I have a paper on graph continuous thought machines that replaces the synapse model and the neuron model with a graph convolutional network.

The GCNN outputs the next graph in the thought process, guided by learnt property vectors.
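A minimal sketch of that step, assuming only the currently firing nodes of the dispositional network are instantiated and a learnt property vector scores which nodes fire next. All names (`gctm_step`, `dispositional_adj`, `property_vec`) and the one-layer aggregation rule are illustrative, not the paper's method:

```python
# Hypothetical one-step sketch of a GCTM: the active node set is the
# instantiated graph; a toy one-layer graph convolution aggregates features
# from active neighbours, scores candidates against a learnt property vector,
# and the k best-scoring nodes form the next graph.
def gctm_step(dispositional_adj, features, active, property_vec, k=2):
    """One thought step: active node set -> next active node set."""
    n = len(dispositional_adj)
    scores = []
    for node in range(n):
        # Aggregate features from active (instantiated) neighbours only.
        agg = [0.0] * len(features[0])
        for nb in range(n):
            if nb in active and dispositional_adj[node][nb]:
                for d in range(len(agg)):
                    agg[d] += features[nb][d]
        # Score the candidate against the learnt property vector.
        scores.append((sum(a * p for a, p in zip(agg, property_vec)), node))
    # Instantiate the k best-scoring nodes in the next graph.
    return {node for _, node in sorted(scores, reverse=True)[:k]}
```

Iterating `gctm_step` is then the "search of graph space"; in the paper that role is played by a learnt GCNN rather than this fixed aggregation.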

What's interesting is that the synchronization matrix regulates the attention given to the nodes as well as to the input.

So these nodes may be seen as neurons in their own right, and consecutive graphs have connections between them that send virtual signals and cause them to spike.

The nodes and potential nodes exist in a dispositional neural network, and only the nodes that are currently activated are instantiated in the GCNN.

So as the outputs from the synchronization matrix modulate attention, a subset of the attended dispositional neurons will represent memory, while other parts of the dispositional network and parts of the input represent keys that index the next presentation of memory.

In fact, only the prefrontal cortex's dispositional nodes will contribute to the synchronization matrix.

So the PFC performs read and write operations on memory this way.
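A rough sketch of that read path, assuming the synchronization matrix emits one score per attended item (inputs first, then dispositional nodes); the names `attend` and `sync_out` are illustrative:

```python
# Hypothetical sketch of attention modulation: synchronization-matrix outputs
# are turned into softmax attention weights over the inputs and the
# dispositional nodes; the highest-weighted nodes act as the memory being read.
import math

def softmax(xs):
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attend(sync_out, inputs, node_states):
    """sync_out holds one score per attended item: inputs first, then nodes."""
    weights = softmax(sync_out)
    k = len(inputs)
    attended_inputs = [w * x for w, x in zip(weights[:k], inputs)]
    attended_nodes = [w * x for w, x in zip(weights[k:], node_states)]
    return attended_inputs, attended_nodes
```

Restricting which scores the PFC nodes contribute to `sync_out` would then gate what gets read from (and written back into) the attended memory subset.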

Sai-dewa•6mo ago
So the actual connections between dispositional neurons change as the property vectors are learnt.
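One way that connection change could be sketched is a Hebbian-style rule that strengthens the dispositional link from each node of one graph to each node of the next, encoding the "disposition for particular graphs to follow others". The rule and the name `update_dispositions` are illustrative assumptions, not the paper's update:

```python
# Hypothetical sketch: after each thought step, strengthen the dispositional
# weight from every node of graph t to every node of graph t+1, saturating
# toward 1.0 so repeatedly co-occurring transitions dominate.
def update_dispositions(weights, prev_active, next_active, lr=0.1):
    """weights is a dict {(i, j): w}; returns it with prev->next links boosted."""
    for i in prev_active:
        for j in next_active:
            w = weights.get((i, j), 0.0)
            weights[(i, j)] = w + lr * (1.0 - w)  # saturating increase
    return weights
```

Over many steps, the weight matrix would come to encode which graphs tend to succeed which, which is the connectivity the text-conditioned modules are said to modulate.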