Friday, June 4, 2010

Probably informed…

The following are a few citations to go with my recent post on scientism, which meandered into some thoughts about information theory and entropy. The key words in my dialogue were:

It's about the cost of information for any closed system. Information about the system produces a decrease in available energy in that system, which is diverted from it, in the form of entropy, into information that is about the system but emerges from outside it. Information is a new, higher-level energy complex which, you could say, "feeds on" the disorder in that system until an informed judgment is made about the system. Greater information about the system--at a higher level--means greater entropy in that system. Hence, an informationally complete judgment of the whole cosmos--a true theory of everything--would require an informed perspective outside the cosmos. And this perspective would require more energy than is available within the system to make its judgment into information.

The following citations are leads I want to keep pursuing, first to see whether I'm wildly off-base, and then to see whether there's any way to develop the idea.

From Wikipedia:

"In information theory, entropy is a measure of the uncertainty associated with a random variable. The term by itself in this context usually refers to the Shannon entropy, which quantifies, in the sense of an expected value, the information contained in a message, usually in units such as bits. …

"…in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics."

From Howard Rheingold:

"The first law of thermodynamics stated that the energy in a closed system is constant. That means that energy cannot be created or destroyed in such systems, but can only be transformed. The second law states, in effect, that part of that unchangeable reservoir of energy becomes a little less stable every time a transformation takes place. When you pour hot water into cold water, you can't separate it back into a hot and a cold glass of water again (without using a lot more energy). Entropy, meaning "transformation," was the word Claudius later proposed for that lost quantity of usable energy. …

"When it was discovered that heat is a measure of the average motion of a population of molecules, the notion of entropy became linked to the measure of order or disorder in a system. If this linkage of such disparate ideas as "heat," "average motion," and "order of a system" sounds confusing, you have a good idea of how nineteenth-century physicists felt. For a long time, they thought that heat was some kind of invisible fluid that was transferred from one object to another. When it was discovered that heat is way of characterizing a substance in which the molecules were, on the average, moving around faster than the molecules in a "cold" substance, a new way of looking at systems consisting of large numbers of parts (molecules, in this case) came into being. …

"A system with high entropy has a low degree of order. A system with low entropy has a higher degree of order. In a steam engine, you have the heat in one place (the boiler) and it is dissipated into the cold part (the condenser). This is a very orderly (low entropy) system in the sense that anyone can reliably predict in which part of the engine the hot molecules are likely to be found. …

"An unshuffled deck of cards has a lower degree of entropy because energy went into arranging it in an unlikely manner. Less energy is then required to put the deck into a more probable, less orderly, less predictable, more highly entropic state….

"What would happen, Maxwell asked, if you could place a tiny imp at the molecular gate, a demon who didn't contribute energy to the system, but who could open and close the gate between the two sides of the container? Now what if the imp decides to let only the occasional slow-moving, colder molecule pass from the hot to the cold side when it randomly approaches the gate? Taken far enough, this policy could mean that the hot side would get hotter and the cold side would get colder, and entropy would decrease instead of increase without any energy being added to the system!

"In 1922, a Hungarian student of physics by the name of Leo Szilard (later to be von Neumann's colleague in the Manhattan project), then in Berlin, finally solved the paradox of Maxwell's demon by demonstrating that the demon does indeed need to contribute energy to the system, but like a good magician the demon does not expend that energy in its most visible activity -- moving the gate -- but in what it knows about the system. The demon is a part of the system, and it has to do some work in order to differentiate the hot and cold molecules at the proper time to open the gate. Simply by obtaining the information about molecules that it needs to know to operate the gate, the demon adds more entropy to the system than it subtracts.

"If there are only two cases in the population, a single yes or no decision reduces the uncertainty to zero. In a group of four, it takes two decisions to be sure. In a group of trillions, you have to guess a little. When you are making predictions about such large populations, averages based on the overall behavior of the population have to replace precise case-by-case calculations based on the behavior of individual members of the population.

"One of the properties of a statistical average is that it is quite possible for a population to be characterized by an average value that is not held by any particular element of the population. If you have a population consisting of three people, and you know that one is three feet tall, one five feet tall, and one is six feet tall, you have quite precise information about that population, which would enable you to pick out individuals by height. But if all you know is that the average height of the population is four feet, eight inches, you wouldn't know anything useful about any one of the three particular individuals. Whenever a system is represented by an average, some information is necessarily lost, just as two energy states lose a little energy when they are brought into equilibrium. …

"… One of the things Shannon demonstrated in 1948 was that the entropy of a system is represented by the logarithm of possible combinations of states in that system -- which is the same as the number of yes-or-no questions that have to be asked to locate one individual case. Entropy, as it was redefined by Shannon, is the same as the number of binary decisions necessary to identify a specific sequence of symbols.

"Human bodies can be better understood as complex communication networks than as clockwork-like machines. …"

From Thomas D. Schneider:

"Shannon called his measure not only the entropy but also the "uncertainty". I prefer this term because it does not have physical units associated with it. If you correlate information with uncertainty, then you get into deep trouble. …

"Information is the very opposite of randomness! … Information is always a measure of the decrease of uncertainty at a receiver (or molecular machine). If you use this definition, it will clarify all the confusion in the literature. …

"…the typical physicists' muddle about "erasure" in which they set the state of a device to one of several states and call this a "loss of information". But setting a device to one state (no matter what it is) decreases the entropy and increases the information. The main mistake that the physicists make is not having any real working examples. It's entirely theoretical for them. (These people believe that they can beat the Second Law. I would simply ask them to build the perpetual motion machine and run the world power grid from it before making such a claim.) …

"Shannon information is a difference between uncertainties. …"

From Thomas D. Schneider*:

"Because of noise, after a communication there is always some uncertainty remaining, Hafter and this must be subtracted from the uncertainty before the communication is sent, Hbefore. In combination with the R/H pitfall, this pitfall has lead many authors to conclude that information is randomness."

From Thomas D. Schneider**:

"A better term to use for measuring the state of a set of symbols is "uncertainty". I take the middle road and say that entropy and uncertainty can be related under the condition when the microstates of the system correspond to symbols, as they do for molecular machines."

From Gregory Chaitin, "Gödel's Theorem and Information":

"Gödel's theorem may be demonstrated using arguments having an information-theoretic flavor. In such an approach it is possible to argue that if a theorem contains more information than a given set of axioms, then it is impossible for the theorem to be derived from the axioms. In contrast with the traditional proof based on the paradox of the liar, this new viewpoint suggests that the incompleteness phenomenon discovered by Gödel is natural and widespread rather than pathological and unusual. …

"A great many different proofs of Gödel's theorem are now known, and the result is now considered easy to prove and almost obvious: It is equivalent to the unsolvability of the halting problem, or alternatively to the assertion that there is an r.e. (recursively enumerable) set that is not recursive. And it has had no lasting impact on the daily lives of mathematicians or on their working habits; no one loses sleep over it any more.

"Gödel's original proof constructed a paradoxical assertion that is true but not provable within the usual formalizations of number theory. In contrast I would like to measure the power of a set of axioms and rules of inference. … "

TO BE CONTINUED…
