Something specific and technical
If we consider “information” a “vector”, a contextually significant magnitude [value] and dimensionality [scale for that kind of value], then bits themselves are derived from dubious sources. While zero-entropy random numbers will behave idempotently, changes in ambient temperature or power fluctuations during boot-up may change the clock, resulting in a different random seed. If you are always taking random reads, you amplify whatever drift the low-entropy system picks up from environmental fluctuations, which is small yet not absent. Ideally one would read “entropically” from an analog source, needing lots of error correction to figure out what the bit renderings should be. Look at any signal specification and you will see the lengths engineers go to for reliable decoding (voltage-threshold patterns plus tricks like checksums or parity).
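As a sketch of that decoding dance (Python, with made-up voltage levels and noise figures): threshold a noisy analog read into bits, then catch flips with a parity bit.

    import random

    THRESHOLD_V = 1.65  # assumed midpoint of a 3.3 V logic swing

    def noisy_read(true_bit):
        """Simulate an analog read: nominal level plus ambient noise."""
        nominal = 3.3 if true_bit else 0.0
        return nominal + random.gauss(0, 0.4)  # noise scale is made up

    def to_bit(voltage):
        """Resolve the noisy reading into a 0 or 1 by threshold."""
        return 1 if voltage >= THRESHOLD_V else 0

    frame = [1, 0, 1, 1, 0, 1, 0]
    parity = sum(frame) % 2  # sender appends an even-parity bit
    received = [to_bit(noisy_read(b)) for b in frame + [parity]]

    # If a fluctuation flipped one bit, the parity check catches it.
    if sum(received) % 2 == 0:
        print("frame accepted:", received[:-1])
    else:
        print("parity error: re-read the line")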
Existential forms are in “balance”; they are not permanent or “real” in the continuity sense (ideas are shadows on the wall, whatever contours of perception made them). Physical reality is a noisy reading: your digital device cleans it up with filters, performs error correction, and records its most confident value (which a magnet, static electricity, or a stray cosmic ray could upset).
So even after your 1 or 0 is on the disk, you will always have to wonder whether there was line noise when you took that reading, or whether the external universe has influenced anything since the last check.
It is an endless cycle of reducing uncertainty and making the optimal resolve (which may be no interference!)
It is the pathological and neurotic mind, driven to restlessness, which compulsively acts even when no action, or less action, would be least disturbing to the pre-established natural order (natural dynamics heal themselves when you stop messing with them, etc.)
Life is miraculous because it creates all we have through patient time, converting potential into embodied substance through work and vectors (protein folding through catalog curation, all of it full of error correction and lasting uncertainty). Life may be made a chaotic hell by those devilish busybodies trying to make things certain in a world best enjoyed through wonder.
sunscream89•5mo ago
> Entropy exists, not just from thermodynamics but also from the existence of information theory.
Try “entropy is the existential phenomenon of potential distributing over the surface area of negative potential.” This fits both sides, only invalidating modern information theory. A state is a single instance of a potential resolve. The number of states is a boundary condition of potential interfering with itself (constructively or destructively). Existential reality is not made up of information; existential reality is potential resolving into state in the moment of now. Probability is the projected distribution of potential (over whatever surface area: the inverse square of distance, heat dissipation, voltage across a resistance, the number of cars on a highway).
Potential, and potential resolving (true entropy), is the underlying phenomenon, not state (information).
This could somehow fit, such as how some minds are limited by what they interpret (numbers of states) while other minds see an intrinsic whole (a potential distribution, like a moment of force to an architect).
j_chance•5mo ago
> Existential reality is not made up of information
> Potential and potential resolving (true entropy) is the underlying phenomenon not state (information).
I think this leads to the question "is information a function of perception?" Knowledge of the resolution of potential is information, and the properties of entropy should pass through, IMO.
What about when information is encoded into physical state? If I sample an entropy source and record each sample as magnetism on a disk, doesn't information become a physical phenomenon?
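(As a toy sketch, with an arbitrary path and sample count: pull from the OS entropy pool, pin it down as disk state, then measure it again.)

    import os

    samples = os.urandom(16)              # entropy source

    with open("samples.bin", "wb") as f:  # now it's magnetic/charge state
        f.write(samples)

    with open("samples.bin", "rb") as f:  # reading back is a fresh physical
        recovered = f.read()              # measurement of that state

    assert recovered == samples           # ...assuming nothing disturbed it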
How can we be sure there are no non-human entities observing all resolutions of potential at any given time and then modifying reality (e.g. existing beyond our understanding)? Also consider actors at future points in time that may observe and act upon information that formed in the past: it's easy to dismiss the information from a dinosaur dying millions of years ago, until some random human discovers its toe and triggers a full excavation.
I don't have an answer to this, I just don't feel confident dismissing information as not a physical phenomenon itself. It's certainly a real feedback mechanism, it's just unclear if an actor is required for it to be a phenomenon.
sunscream89•5mo ago
To solve this, I declare information as a vector.
A vector is meaningless without context.
A vector is its own dimensionality (size, speed, density, etc.) and value (magnitude).
This clears up some of your insightful wanderings. Other parts may be outside of information's responsibility.
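As a sketch (the names here are mine, nothing standard):

    from dataclasses import dataclass

    @dataclass
    class InfoVector:
        magnitude: float  # the value
        dimension: str    # the scale for that kind of value

    reading = InfoVector(3.3, "volts")  # contextual: information
    bare = 3.3                          # contextless: just a number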
Let us return to your apparently lesser footnote deity, who was secretly the supreme instigator all along.
Decay!
Even your smartest reading shall someday be incomprehensible dust (or rust), or otherwise long since faded.
In a world governed by uncertainty, ignorance and confusion are the natural state of minds. It is the illuminated mind which wonders, restlessly tugging at a loose stitch until the whole fabric unravels (for better or worse).
All existential change comes by decay or interference (constructive or destructive). That's what this whole entropy thing is, and vectors transform potential into meaningful states (which themselves secretly decay when not being watched).
j_chance•5mo ago
I think I see what you mean. Information and entropy are fundamental opposites: information cannot be entropy, and entropy cannot be information. Information is certainty; entropy is uncertainty.
So with information theory, we measure entropy sources to get information, but it's disconnected from the reality of entropy in physical processes. Which is maybe fine? It seems like an abstraction that is zero-cost. Like if I roll a die and compare that to flipping a coin, the relative amount of information yielded is based on the number of distinct outcomes, given some arbitrary constraints (which side is facing "up" for each object, assuming the human operates the object fairly).
If we then do some reasoning using this relative difference in the amount of information, the physical, thermodynamic reality of the human doing the action, the object moving through space, etc. is stripped away, but with no impact on the soundness of the reasoning about the information (assuming the human has not rigged the system and the entropy source is sound). Sorta like algebra over a field: any field can be used so long as the axioms of the algebra hold.
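(Sketching that with the textbook log2 measure, nothing fancier:)

    import math

    coin_bits = math.log2(2)  # 1.0 bit per fair flip
    die_bits = math.log2(6)   # ~2.585 bits per fair roll

    # The relative difference is all the reasoning needs; the hand, the
    # object, the table are stripped away.
    print(f"one die roll is worth {die_bits / coin_bits:.3f} coin flips")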
So it all comes down to soundness of entropy sources...?
(this has improved my understanding of information theory _significantly_ :pray:)
sunscream89•5mo ago
That is wrong.
There is NEVER certainty.
Certainty is a lie.
Certainty is a delusion.
Listen carefully: “information removes uncertainty.” Do you see? The most well-intentioned self-deceptions begin with a misinterpretation.
Take Shannon and his prime example. He did not say “entropy is the number of possible states.”
If you excite a valence electron (increase its potential), would those states not shift to the next shell? Everyone heard what they wanted to hear.
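What he wrote down weighs probabilities; it is not a head-count of states. A minimal sketch of the textbook formula:

    import math

    def shannon_entropy(probs):
        """H = -sum(p * log2(p)) over the nonzero probabilities."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit
    print(shannon_entropy([0.9, 0.1]))  # same two states: ~0.469 bits

    # Same number of states, different entropy: the distribution carries
    # the measure, not the state count.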
There is no certainty; even memory fades. You will always ask yourself whether you did the right thing or picked the right one. Ten years from now some missing piece will come to light and perspectives will change.
Certainty never “existed”; only the illusion of closure did. We do the best we can, when necessary, with what we have. That is success, not certainty.
We cannot find certainty, only optimal solutions.
In a universe governed by entropy, the only certainty is that everything dissolves in time.