Information and Order

The field of information theory was founded by Claude Shannon, a mathematician and engineer who worked on one of the first analogue computers in the 1930s and went on to show how the electromechanical switches of telephone systems could be used to perform logical tasks. His landmark 1948 paper, A Mathematical Theory of Communication, treated information as ordered arrangements that could be reproduced at another point. He showed how information content could be calculated in terms of the number of 1s and 0s needed to transmit it, an approach that paved the way for digital communication and modern computers.1

Shannon showed that information is equivalent to a reduction in uncertainty. The uncertainty we can have about a system can be calculated from the number of possible states available for each of the system’s elements to adopt.2 Everything that is certain about the system, such as a given pattern of indentations on a CD that encodes a piece of music, or the fixed sequence of bases along a section of nucleic acid, is information.
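
A minimal sketch in Python makes the idea concrete. The function name shannon_entropy and the example distributions are our own illustration, not anything from Shannon’s paper:

    import math

    def shannon_entropy(probabilities):
        # Entropy in bits: the average number of yes/no questions
        # needed to pin down which state the system is in.
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # An element that could be in any of 8 equally likely states
    # carries 3 bits of uncertainty...
    print(shannon_entropy([1/8] * 8))          # 3.0
    # ...an element whose state is fixed carries none...
    print(shannon_entropy([1.0]))              # zero bits (prints -0.0)
    # ...and partial certainty sits in between.
    print(shannon_entropy([0.5, 0.25, 0.25]))  # 1.5

Each state that is fixed rather than left open removes bits of uncertainty; those removed bits are the information.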

Life and Information

Living things embody information because of the certainty in the forms they adopt. The molecular structures, the patterns of cell organelles, the relationships between the cells in a multicellular organism: they are all fixed. And the organism propagates this embodied information when it recreates the same definite forms in succeeding generations.

Shannon showed that this certainty of form is exactly the same thing, mathematically, as thermodynamic order.3
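
The correspondence can be seen directly in the formulas quoted in notes 2 and 3: both entropies are the same sum over probabilities, rescaled by a constant. A small Python sketch, with names of our own choosing:

    import math

    K_BOLTZMANN = 1.380649e-23  # J/K, the exact SI value

    def shannon_entropy(probabilities):
        # Shannon's entropy in bits: -sum(p * log2(p)).
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    def gibbs_entropy(probabilities):
        # Thermodynamic (Gibbs) entropy in J/K: -k * sum(p * ln(p)).
        return -K_BOLTZMANN * sum(p * math.log(p) for p in probabilities if p > 0)

    probs = [0.5, 0.25, 0.125, 0.125]
    # Whatever the distribution, the two differ only by the fixed
    # conversion factor k * ln(2).
    print(gibbs_entropy(probs) / shannon_entropy(probs))  # ~9.57e-24
    print(K_BOLTZMANN * math.log(2))                      # the same number

The ratio is the same for any distribution, which is what ‘exactly the same thing, mathematically’ amounts to: the two measures differ only in their choice of units.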

So both the ability of living systems to build up ordered structures by channelling energy and their ability to propagate information turn out to be aspects of the same truth – that, in the face of the vastness of possibility within the universe, life behaves with precision, building up thermodynamically unlikely and information-rich elements that will themselves go on to behave with precision in the next round of life’s enduring cycle.

Flourishing

If we look at human flourishing through a lens of order, the states of affairs in which people flourish reveal themselves as particularly information-rich, for the states in which a person’s needs are met are a tiny proportion of the full range of possible states. There are far more possibilities in which a person is thirsty, for instance, than those in which they drink what they need to keep healthy.
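
A toy calculation (our own construction, with invented numbers) shows how quickly such need-met states become rare as the number of needs grows:

    # Suppose a body's health depends on 10 variables (hydration,
    # temperature, blood sugar, and so on), each of which must fall
    # within a narrow band covering 10% of its possible range.
    healthy_fraction_per_variable = 0.1
    n_variables = 10

    # If every variable were left to chance, the share of possible
    # states in which all needs are met at once shrinks exponentially:
    p_all_needs_met = healthy_fraction_per_variable ** n_variables
    print(p_all_needs_met)  # 1e-10: roughly one state in ten billion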

It is unlikely that the high-information, low-entropy states in which needs are met will come about by accident. It’s only because we regularly perform the complex sets of actions needed for liquids to end up sliding down our throats, for instance, that we are able to keep ourselves properly hydrated. If we didn’t act with precision to keep hydrated, the order embodied in our healthy bodies would start to disintegrate.

As the need-meeting actions build information into the body of the world, it makes sense to think of them as intelligent. Indeed, intelligence can be defined as the ability to create order.

Delightful Futures

It is often not just the meeting of needs that makes a possible future desirable. We all find some things inherently attractive. Some of us are excited by the idea of climbing a particular mountain, others by the idea of bringing a new work of art into existence and others by the idea of a particularly tasty meal.

We all have our personal landscapes of meaning, rooted in our personal histories. We are able to see wonder in places where others can’t. Possibilities can stand out in our minds as if they were glowing. For whatever reason, they have an appeal.

Like the futures in which needs are met, the futures that appeal to us because we see the wonder of them are rare, information-rich states that it takes intelligent action to reach.

  1. “The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages.” C. E. Shannon, A Mathematical Theory of Communication, Bell System Technical Journal, vol. 27, pp. 379–423 and 623–656, July and October 1948, introduction.
  2. The scope for uncertainty can be calculated using the formula S = −k ∑ᵢ Pᵢ log(Pᵢ), where S is ‘Shannon’s entropy’, k is a positive constant that fixes the units (taking logarithms to base 2 with k = 1 gives the entropy in bits), and Pᵢ is the probability of the i-th possible state of the system.
  3. ΔS = Q/T: the change in entropy (S) equals the heat energy (Q) transferred to the system divided by the absolute temperature (T) at which the transfer takes place. So, since it takes approximately 80 times more energy to melt a snowflake than it does to raise the temperature of the melted snowflake by a single degree, and both transfers happen at almost the same temperature, the increase in entropy when the snowflake is melted is a corresponding 80 times greater. Entropy can also be expressed statistically: S = −k ∑ᵢ Pᵢ ln(Pᵢ), where k is Boltzmann’s constant, Pᵢ is the probability that the system will be found in microstate i, and the ∑ tells us to calculate Pᵢ ln(Pᵢ) for every possible microstate and add all the results together. A low entropy then expresses the improbability of the system’s arrangement as a whole.
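
As a rough check on the snowflake figures in note 3, here is a short Python calculation using standard textbook values rather than anything given in the text (latent heat of fusion of ice ≈ 334 J/g, specific heat of liquid water ≈ 4.18 J/(g·K), melting point 273.15 K):

    import math

    L_FUSION = 334.0   # J/g, latent heat of fusion of ice
    C_WATER = 4.18     # J/(g*K), specific heat of liquid water
    T_MELT = 273.15    # K, melting point of ice

    # Energy: melting 1 g of snowflake vs warming the meltwater by 1 K.
    print(L_FUSION / C_WATER)  # ~79.9: the footnote's factor of 80

    # Entropy via Clausius (dS = dQ/T): both transfers happen at almost
    # the same temperature, so the entropy ratio is also roughly 80.
    dS_melt = L_FUSION / T_MELT                          # ~1.22 J/(g*K)
    dS_warm = C_WATER * math.log((T_MELT + 1) / T_MELT)  # ~0.0153 J/(g*K)
    print(dS_melt / dS_warm)                             # ~80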