frontpage.

Show HN: AI agent forgets user preferences every session. This fixes it

https://www.pref0.com/
1•fliellerjulian•59s ago•0 comments

Introduce the Vouch/Denouncement Contribution Model

https://github.com/ghostty-org/ghostty/pull/10559
1•DustinEchoes•2m ago•0 comments

Show HN: SSHcode – Always-On Claude Code/OpenCode over Tailscale and Hetzner

https://github.com/sultanvaliyev/sshcode
1•sultanvaliyev•3m ago•0 comments

Microsoft appointed a quality czar. He has no direct reports and no budget

https://jpcaparas.medium.com/microsoft-appointed-a-quality-czar-he-has-no-direct-reports-and-no-b...
1•RickJWagner•4m ago•0 comments

Multi-agent coordination on Claude Code: 8 production pain points and patterns

https://gist.github.com/sigalovskinick/6cc1cef061f76b7edd198e0ebc863397
1•nikolasi•5m ago•0 comments

Washington Post CEO Will Lewis Steps Down After Stormy Tenure

https://www.nytimes.com/2026/02/07/technology/washington-post-will-lewis.html
1•jbegley•6m ago•0 comments

DevXT – Building the Future with AI That Acts

https://devxt.com
2•superpecmuscles•6m ago•4 comments

A Minimal OpenClaw Built with the OpenCode SDK

https://github.com/CefBoud/MonClaw
1•cefboud•7m ago•0 comments

The silent death of Good Code

https://amit.prasad.me/blog/rip-good-code
2•amitprasad•7m ago•0 comments

The Internal Negotiation You Have When Your Heart Rate Gets Uncomfortable

https://www.vo2maxpro.com/blog/internal-negotiation-heart-rate
1•GoodluckH•8m ago•0 comments

Show HN: Glance – Fast CSV inspection for the terminal (SIMD-accelerated)

https://github.com/AveryClapp/glance
2•AveryClapp•9m ago•0 comments

Busy for the Next Fifty to Sixty Bud

https://pestlemortar.substack.com/p/busy-for-the-next-fifty-to-sixty-had-all-my-money-in-bitcoin-...
1•mithradiumn•10m ago•0 comments

Imperative

https://pestlemortar.substack.com/p/imperative
1•mithradiumn•11m ago•0 comments

Show HN: I decomposed 87 tasks to find where AI agents structurally collapse

https://github.com/XxCotHGxX/Instruction_Entropy
1•XxCotHGxX•15m ago•1 comments

I went back to Linux and it was a mistake

https://www.theverge.com/report/875077/linux-was-a-mistake
3•timpera•16m ago•1 comments

Octrafic – open-source AI-assisted API testing from the CLI

https://github.com/Octrafic/octrafic-cli
1•mbadyl•17m ago•1 comments

US Accuses China of Secret Nuclear Testing

https://www.reuters.com/world/china/trump-has-been-clear-wanting-new-nuclear-arms-control-treaty-...
2•jandrewrogers•18m ago•1 comments

Peacock. A New Programming Language

1•hashhooshy•23m ago•1 comments

A postcard arrived: 'If you're reading this I'm dead, and I really liked you'

https://www.washingtonpost.com/lifestyle/2026/02/07/postcard-death-teacher-glickman/
2•bookofjoe•24m ago•1 comments

What to know about the software selloff

https://www.morningstar.com/markets/what-know-about-software-stock-selloff
2•RickJWagner•28m ago•0 comments

Show HN: Syntux – generative UI for websites, not agents

https://www.getsyntux.com/
3•Goose78•29m ago•0 comments

Microsoft appointed a quality czar. He has no direct reports and no budget

https://jpcaparas.medium.com/ab75cef97954
2•birdculture•29m ago•0 comments

AI overlay that reads anything on your screen (invisible to screen capture)

https://lowlighter.app/
1•andylytic•30m ago•1 comments

Show HN: Seafloor, be up and running with OpenClaw in 20 seconds

https://seafloor.bot/
1•k0mplex•30m ago•0 comments

Tesla turbine-inspired structure generates electricity using compressed air

https://techxplore.com/news/2026-01-tesla-turbine-generates-electricity-compressed.html
2•PaulHoule•32m ago•0 comments

State Department deleting 17 years of tweets (2009-2025); preservation needed

https://www.npr.org/2026/02/07/nx-s1-5704785/state-department-trump-posts-x
3•sleazylice•32m ago•1 comments

Learning to code, or building side projects with AI help, this one's for you

https://codeslick.dev/learn
1•vitorlourenco•33m ago•0 comments

Effulgence RPG Engine [video]

https://www.youtube.com/watch?v=xFQOUe9S7dU
1•msuniverse2026•34m ago•0 comments

Five disciplines discovered the same math independently – none of them knew

https://freethemath.org
4•energyscholar•35m ago•1 comments

We Scanned an AI Assistant for Security Issues: 12,465 Vulnerabilities

https://codeslick.dev/blog/openclaw-security-audit
1•vitorlourenco•35m ago•0 comments

The Curious Case of Entropy

https://keccak-doomsday.com/the_curious_case_of_entropy
2•j_chance•5mo ago

Comments

sunscream89•5mo ago
This is actually pretty good stuff.

> Entropy exists, not just from thermodynamics but also from the existence of information theory.

Try “entropy is the existential phenomenon of potential distributing over the surface area of negative potential.” This fits both sides while only invalidating modern information theory. A state is a single instance of potential resolving. The number of states is a boundary condition of potential interfering with itself (constructively or destructively). Existential reality is not made up of information; existential reality is potential resolving into state in the moment of now. Probability is the projected distribution of potential (over whatever surface area, such as the inverse square of distance, heat dissipation, voltage resistance, or the number of cars on a highway).

Potential, and potential resolving (true entropy), is the underlying phenomenon, not state (information).

This could somehow fit in, such as how some minds are limited by what they interpret (numbers of states) while other minds see an intrinsic whole (a potential distribution, like a moment of force to an architect).

j_chance•5mo ago
Thanks!

> Existential reality is not made up of information

> Potential and potential resolving (true entropy) is the underlying phenomenon not state (information).

I think this leads to the question “is information a function of perception?” Knowledge of the resolution of potential is information, and the properties of entropy should pass through, IMO.

What about when information is encoded into physical state? If I sample an entropy source and record each sample as magnetism on a disk, does information become a physical phenomenon?

How can we be sure there are no non-human entities observing all resolutions of potential at any given time and then modifying reality (i.e., existing beyond our understanding)? Also consider actors at future points in time who may observe and act upon information that formed in the past: it's easy to dismiss the information from a dinosaur dying millions of years ago, until some random human discovers its toe and triggers a full excavation.

I don't have an answer to this; I just don't feel confident dismissing information as not being a physical phenomenon itself. It's certainly a real feedback mechanism; it's just unclear whether an actor is required for it to be a phenomenon.

sunscream89•5mo ago
I define information as “the removal of uncertainty.” If it does not remove uncertainty, it is not information (it may be something else, like data or noise). Therefore, yes, you are right: information is dependent upon the beholder.

To solve this, I declare information as a vector.

A vector is meaningless without context.

A vector is its own dimensionality (size, speed, density, etc.) and value (magnitude).

This clears up some of your insightful wanderings. Other parts may be outside of information's responsibility.
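The “removal of uncertainty” framing matches how information is quantified in Shannon's theory: the information gained from an observation is the uncertainty you had before minus the uncertainty that remains after. A minimal sketch (the scenario and numbers are illustrative, not from the thread):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: one full bit of uncertainty before any observation.
prior = entropy([0.5, 0.5])          # 1.0 bit

# A noisy peek that is right 90% of the time leaves some uncertainty.
posterior = entropy([0.9, 0.1])      # ~0.469 bits

# "Information removes uncertainty": the gain is the difference.
information_gained = prior - posterior
print(round(information_gained, 3))  # ~0.531 bits
```

On this account, information is always relative to a beholder's prior uncertainty, which is exactly the dependence the comment describes.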

Let us return to your apparently lesser footnote deity, who was secretly the supreme instigator all along.

Decay!

Even your smartest reading shall some day be incomprehensible dust (or rust), or otherwise long since faded.

In a world governed by uncertainty, ignorance and confusion are the natural state of minds. It is the illuminant mind which wonders, restlessly tugging at a loose stitch until the whole fabric unravels (for better or worse).

All existential change comes by decay or interference (constructive or destructive). That's what this whole entropy thing is, and vectors transform potential into meaningful states (which themselves secretly decay when not being watched).

j_chance•5mo ago
Ok, this took some time to think about. I've always conflated entropy and randomness/data, but this is wrong. Say that I sample 512 bytes of data from an entropy source. That data is no longer entropic, because it's no longer uncertain. It's now information (assuming it can be used to inform about the entropy source). It may look random, but it is not entropic.

I think I see what you mean. Information and entropy are fundamental opposites, information cannot be entropy and entropy cannot be information. Information is certainty, entropy is uncertainty.

So with information theory, we measure entropy sources to get information, but it's disconnected from the reality of entropy in physical processes. Which is maybe fine? It seems like an abstraction that is zero-cost. If I roll a die and compare that to flipping a coin, the relative amount of information yielded is based on the number of distinct outcomes, given some arbitrary constraints (which side is facing “up” for each object, assuming the human operates the object fairly).

If we then do some reasoning using this relative difference in amount of information, the physical, thermodynamic reality of the human doing the action, the object moving through space, etc. is arbitrarily stripped away, but with no impact on the soundness of the reasoning about the information (assuming the human has not rigged the system and the entropy source is sound). Sorta like algebra over a field: any field can be used so long as the axioms of the algebra hold.

So it all comes down to soundness of entropy sources...?

(this has improved my understanding of information theory _significantly_ :pray:)
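The die-versus-coin comparison above can be made concrete: with equiprobable outcomes, the information yield of one draw is log2 of the number of distinct outcomes, regardless of the thermodynamics of flipping or rolling. A small sketch (function name is illustrative):

```python
import math

def bits_per_sample(n_outcomes):
    """Information in bits from one fair draw with n equally likely outcomes."""
    return math.log2(n_outcomes)

coin = bits_per_sample(2)   # 1.0 bit
die = bits_per_sample(6)    # ~2.585 bits

# The relative yield depends only on outcome counts, not on the physics
# of the human operating the object (assuming the source is fair).
print(die / coin)           # ~2.585
```

This is the zero-cost abstraction the comment describes: the physical process is stripped away, and only the outcome count survives into the reasoning.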

sunscream89•5mo ago
> Information is certainty, entropy is uncertainty.

That is wrong.

There is NEVER certainty.

Certainty is a lie.

Certainty is a delusion.

Listen carefully: “information removes uncertainty.” Do you see? The most well-intending self-deceptions begin with a misinterpretation.

Like Shannon and his prime example: he did not say “entropy is the number of possible states.”

If you excite an electron's valence (increase potential), would these values not change to the next shell? Everyone heard what they wanted to hear.
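For reference, what Shannon actually wrote weights each state by its probability; the raw count of states appears only in the equiprobable special case:

```latex
H(X) = -\sum_{i} p_i \log_2 p_i,
\qquad\text{and } H(X) = \log_2 N \text{ only when every } p_i = \tfrac{1}{N}.
```

So “number of possible states” is the ceiling of the entropy, not its definition; shift the probabilities and the entropy changes even while the state count stays fixed.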

There is no certainty, even the memory fades. You will always ask yourself if you did the right thing or picked the right one. Ten years from now some missing piece will come to light and perspectives will change.

Certainty never “existed”; only the illusion of closure did. We do the best we can when necessary with what we have. That is success, not certainty.

We cannot find certainty, only optimal solutions.

In a universe governed by entropy, that everything dissolves in time is the only certainty.

sunscream89•5mo ago
> What about when information is encoded into physical state? If I sample an entropy source and record each sample as magnetism on a disk, information becomes a physical phenomenon?

Something specific and technical:

If we consider “information” a “vector”, a contextually significant magnitude [value] and dimensionality [scale for that kind of value], then bits themselves are derived from dubious sources. While zero-entropy random numbers will behave idempotently, changes in ambient temperature or power fluctuations during boot-up may change the clock, resulting in a different random seed. If you're always taking random reads, this amplifies whatever drift the low-entropy system has from environmental fluctuations, which is low yet not absent. Ideally one would read “entropically” from an analog source, needing lots of error correction to figure out what the bit renderings should be. Look at any signal specification and you will see what lengths the engineers went to for reliable signal decoding (patterns of voltage thresholds, with other tricks like checksums or parity).

Existential forms are in “balance”; they are not permanent or “real” in the continuity sense (ideas are shadows on the wall, whatever contours of perception made them). Physical reality is a noisy reading; your digital device cleans it up with filters, performs error correction, and records its most confident value (which a magnet, static electricity, or a stray cosmic ray could upset).

So after your 1 or 0 is on the disk, you will always have to wonder whether there was line noise when you took that reading, or whether the external universe has influenced anything since the last check.

It is an endless cycle of reducing uncertainty and making the optimal resolve (which may be no interference!)
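The point about zero-entropy generators behaving idempotently, while seed drift changes everything, can be sketched with a deterministic PRNG (the `stream` helper is hypothetical; real clock-skew drift is simulated here by simply perturbing the seed):

```python
import random

def stream(seed, n=5):
    """Deterministic PRNG output for a given seed: same seed, same bytes."""
    rng = random.Random(seed)
    return [rng.getrandbits(8) for _ in range(n)]

# Zero entropy in: the sequence is idempotent across runs.
assert stream(42) == stream(42)

# A tiny environmental drift (say, clock skew at boot shifting the seed)
# yields an entirely different sequence.
assert stream(42) != stream(43)
```

This is why the comment distinguishes seeded digital sources from analog ones: the digital path is perfectly repeatable given its seed, so all of its unpredictability rides on however that seed was gathered.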

It is the pathological and neurotic mind, driven to restlessness, which compulsively acts when no action, or less action, would be least disturbing to the pre-established natural order (natural dynamics heal themselves when you stop messing with them, etc.).

Life is miraculous because it creates all we have through patient time, converting potential into embodied substance through work and vectors (protein folding through catalog curation, all full of error correction and lasting uncertainty). Life may be made a chaotic hell by those devilish busybodies trying to make things certain in a world best enjoyed through wonder.

sunscream89•5mo ago
Btw, I wrote an earlier response which may have been more concise.

https://news.ycombinator.com/item?id=44770889