Huh. I've been stewing on Up And Atom's video on the Landauer limit: https://youtu.be/XY-mbr-aAZE . If you don't know, Landauer's result connects Shannon's information entropy to Boltzmann's thermodynamic entropy by giving the minimum waste heat required to erase one bit of information, i.e. cut the state space in half.
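For concreteness, here's a quick back-of-the-envelope sketch of that bound (the standard k_B·T·ln 2 form); the temperatures are just arbitrary examples:

```python
from math import log

K_B = 1.380649e-23          # Boltzmann constant, J/K (exact, 2019 SI)

def landauer_limit_joules(temp_kelvin):
    """Minimum heat dissipated to erase one bit at temperature T: k_B * T * ln 2."""
    return K_B * temp_kelvin * log(2)

for T in (300, 77, 4.2, 0.01):   # room temp, liquid N2, liquid He, dilution fridge
    print(f"T = {T:7.2f} K  ->  {landauer_limit_joules(T):.3e} J per bit erased")
```

At room temperature that works out to roughly 3e-21 J per erased bit, and the cost scales linearly with temperature, which is the part that matters below.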
And a while later, Google's creepy Android newsfeed algorithm recommended this article to me, about some researchers who claim that absolute zero is reachable in finite time, using finite energy, if you take quantum physics into account, i.e. a violation of the Third Law: https://thedebrief.org/quantum-version-of-the-third-law-of-thermodynamics-says-absolute-zero-is-theoretically-possible/ .
However, this claim is more reasonable than it seems, because the feat is only possible if you can perform infinitely complex manipulations. Complexity here is just a synonym for information, so this amounts to having infinite knowledge of the quantum state of the system being cooled, which you could only have if you knew the exact history of every particle since the last time each one was individually at 0 K.
It occurred to me that, since Landauer's equation connects energy to information/entropy, it's entirely possible that energy and information are the same thing, in almost exactly the same way that energy and mass are. The weird difference is the dependency on absolute temperature. Temperature is, roughly, average kinetic energy per degree of freedom, so at fixed volume it acts like an energy density; this implies that the more bits you pack into a given quantum system of fixed volume, the more energy it costs to erase one bit, i.e. cool the system. That feels like it ties in very directly with the Bekenstein bound / black hole entropy, and the relationship with temperature as an "energy density" feels analogous to gravitational potential energy, i.e. bits get more expensive to remove as they fall into the gravity well because they are attracted to each other.
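To get a feel for the Bekenstein side of that, here's a rough sketch using the standard bound I ≤ 2πRE/(ħc·ln 2); the 1 kg mass in a 1 m sphere is just an arbitrary example:

```python
from math import pi, log

HBAR = 1.054571817e-34      # reduced Planck constant, J*s
C = 2.99792458e8            # speed of light, m/s

def bekenstein_bound_bits(radius_m, energy_j):
    """Bekenstein bound on the information in a sphere of radius R containing
    energy E: I <= 2*pi*R*E / (hbar * c * ln 2), expressed in bits."""
    return 2 * pi * radius_m * energy_j / (HBAR * C * log(2))

mass_kg = 1.0
print(f"{bekenstein_bound_bits(1.0, mass_kg * C**2):.2e} bits")   # ~2.6e43 bits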
And, getting *very* far out on a limb: since the heat death of the universe is the process by which the expanding universe cools toward 0 K over infinite time, the heat death is actually the *start* of time, is it not? Someone else's time, if not ours. After all, zero temperature means zero information entropy per volume. So I wonder if we actually have a twin universe for which our Big Bang is their heat death and vice versa, and whether that's what the AdS/CFT and dS/dS stuff is about. Perhaps our gravity could be dual to their thermodynamics, and their gravity to our thermodynamics, via some duality compatible with our respective geometries. Hell, since attraction looks like repulsion when you reverse the flow of time, maybe the thermodynamic information content of their universe (attractive) is dual to dark energy (repulsive) in ours?
#physics #QuantumPhysics #QuantumGravity #entropy #InformationEntropy #ItFromBit
Text of a comment I just wrote under Up and Atom's latest video:
I'm currently working my way through some ideas involving information theory that might have something to say about NP-completeness, especially why individual problem instances are often solvable in polynomial time but the general case is not.
In particular, there are 2**(2**k) boolean functions of k inputs, since there are 2**k distinct assignments to the input variables and 2 possible outputs for each assignment. So you can imagine a guessing game in which Bob picks a boolean function and Alice has to guess which one it is by choosing variable assignments and trying to reverse engineer the rules, gaining Shannon entropy until she can identify the function. If the function is chosen uniformly, she needs 2**k bits (log2 of 2**(2**k)), which sucks because each question gains her at most one bit: the relationship between the number of guesses and the number of bits gained is linear.
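Here's a small sketch of that guessing game (my own toy encoding: functions as truth-table bitmasks, one query per input assignment), showing that with a uniform prior each answer halves the candidate set, so Alice needs all 2**k questions:

```python
import random

k = 3                       # number of boolean inputs
n_rows = 2 ** k             # distinct input assignments
n_funcs = 2 ** n_rows       # number of boolean functions of k inputs

# Bob picks a function uniformly at random, encoded as a truth-table bitmask:
# bit r of `secret` is the output on input assignment r.
secret = random.randrange(n_funcs)

# Alice's knowledge: the set of functions still consistent with Bob's answers.
candidates = set(range(n_funcs))

for r in range(n_rows):
    answer = (secret >> r) & 1                             # Bob reveals one output
    candidates = {f for f in candidates if ((f >> r) & 1) == answer}
    print(f"after query {r + 1}: {len(candidates)} candidates left")

# With a uniform prior each query is worth exactly one bit, so Alice always
# needs all 2**k queries: log2(2**(2**k)) = 2**k bits of Shannon entropy.
assert candidates == {secret}
```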
But every boolean function can be represented by a countably infinite set of boolean formulae. Each such formula is a binary tree with input variables as leaves and 2-input boolean gates as internal nodes, and the number of leaves in the minimal formula differs from function to function. What if the probability of each boolean function were determined by the number of leaves in its minimal formula(e)? You could take the logarithm of the number of leaves to turn it into a "function entropy", and use that plus information theory to compute a consistent probability distribution, so that low-entropy functions are more likely than high-entropy ones. The guessing game then becomes a matter of Alice constructing a series of function models and then using her questions to disprove them.
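Here's a sketch of the minimal-leaf-count part; the encoding and the dynamic programming are mine (leaves are variables only, any of the 16 two-input gates at internal nodes), and how exactly to turn the counts into a prior is still the open question. This just computes the minimum leaf count for every function of K=3 inputs and prints the proposed log2 "function entropy" for 3-input parity:

```python
from math import log2
from collections import Counter

K = 3                       # number of input variables
N_ROWS = 2 ** K             # rows in a truth table
N_FUNCS = 2 ** N_ROWS       # boolean functions of K inputs

def var_table(i):
    """Truth table of variable x_i as a bitmask over the 2**K input rows."""
    return sum(1 << row for row in range(N_ROWS) if (row >> i) & 1)

def apply_gate(g, f1, f2):
    """Apply 2-input gate g (a 4-bit truth table) to truth tables f1 and f2."""
    out = 0
    for row in range(N_ROWS):
        a = (f1 >> row) & 1
        b = (f2 >> row) & 1
        out |= ((g >> (2 * a + b)) & 1) << row
    return out

# min_leaves[f] = fewest leaves in any formula (binary tree of 2-input gates
# over variable leaves) that computes truth table f.
by_size = {1: {var_table(i) for i in range(K)}}
min_leaves = {f: 1 for f in by_size[1]}

size = 1
while len(min_leaves) < N_FUNCS:
    size += 1
    new = set()
    for left in range(1, size):
        for f1 in by_size[left]:
            for f2 in by_size[size - left]:
                for g in range(16):
                    h = apply_gate(g, f1, f2)
                    if h not in min_leaves and h not in new:
                        new.add(h)
    by_size[size] = new
    min_leaves.update({f: size for f in new})

for n, count in sorted(Counter(min_leaves.values()).items()):
    print(f"{count} functions need a minimum of {n} leaves")

parity = var_table(0) ^ var_table(1) ^ var_table(2)
print("3-input parity needs", min_leaves[parity], "leaves;",
      "proposed 'function entropy' =", round(log2(min_leaves[parity]), 3))
```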
That doesn't solve the problem of NP-completeness entirely, because I haven't figured out exactly how to relate the boolean-function guessing game to SAT yet. But it feels like a promising line of thought, since it gives structure to the guessing step in NP's definition, and the intractability of P vs NP seems tied to the fact that we treat NP instances as unstructured black boxes (which is exactly what the standard proof technique, oracle relativization, requires). In essence, SAT instances that are almost always true or almost always false are easy to solve because the boolean expressions involved have lower entropy, compared to SAT instances that come out true or false with closer to 50/50 probability. Basically, the expression tree in the hard instances is full of XOR and XNOR operations, because those are the only 2-input boolean operations that have 50/50 truth tables while depending on both inputs, and 50/50 truth tables maximize the information entropy of the output over the input space.
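To make the 50/50 point concrete, this little sketch scores all 16 two-input gates by the Shannon entropy of their output under uniform inputs; XOR and XNOR are the only gates that are both balanced and dependent on both inputs:

```python
from math import log2

def H(p):
    """Shannon entropy of a Bernoulli(p) output, in bits."""
    return 0.0 if p in (0.0, 1.0) else -(p * log2(p) + (1 - p) * log2(1 - p))

NAMES = {0b0110: "XOR", 0b1001: "XNOR", 0b0001: "AND", 0b0111: "OR"}

# Each 2-input gate is a 4-bit truth table g; bit (2*a + b) is its output on (a, b).
for g in range(16):
    p_one = bin(g).count("1") / 4          # P(output = 1) under uniform inputs
    depends_on_a = any(((g >> (2 * 0 + b)) & 1) != ((g >> (2 * 1 + b)) & 1) for b in (0, 1))
    depends_on_b = any(((g >> (2 * a + 0)) & 1) != ((g >> (2 * a + 1)) & 1) for a in (0, 1))
    label = NAMES.get(g, "")
    print(f"gate {g:04b} {label:5s} P(1)={p_one:.2f}  H(output)={H(p_one):.3f} bits  "
          f"depends on both inputs: {depends_on_a and depends_on_b}")
```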
[Here's a link to the video itself: https://youtu.be/ctwX--JEzSA ]
#UpAndAtom #ComputerScience #math #InformationTheory #InformationEntropy