The root of these misunderstandings lies in a lack of attention to context. Science is very strong on content, but it has a habit of ignoring 'external' constraints on the systems being studied.
Context is an important but neglected feature of information. It is so easy to focus on the combinatorial clarity of the message and to ignore the messy, complicated processes carried out by the receiver when it decodes the message. Context is crucial to the interpretation of messages: to their meaning. In his book The User Illusion Tor Nørretranders introduced the term exformation to capture the role of the context, and Douglas Hofstadter made the same general point in Gödel, Escher, Bach. Observe how, in the next chapter, the otherwise incomprehensible message 'THEOSTRY' becomes obvious when context is taken into account.
Instead of thinking about a DNA 'blueprint' encoding an organism, it's easier to think of a CD encoding music. Biological development is like a CD that contains instructions for building a new CD-player. You can't 'read' those instructions without already having one. If meaning does not depend upon context, then the code on the CD should have an invariant meaning, one that is independent of the player. Does it, though?
Compare two extremes: a 'standard' player that maps the digital code on the CD to music in the manner intended by the design engineers, and a jukebox. With a normal jukebox, the only message that you send is some money and a button-push; yet in the context of the jukebox these are interpreted as a specific several minutes' worth of music. In principle, any numerical code can 'mean' any piece of music you wish; it just depends on how the jukebox is set up, that is, on the exformation associated with the jukebox's design. Now consider a jukebox that reacts to a CD not by playing the tune that's encoded on it, as a series of bits, but by interpreting that code as a number, and then playing some other CD to which that number has been assigned. For instance, suppose that a recording of Beethoven's Fifth Symphony starts, in digital form, with 11001. That's the number 25 in binary. So the jukebox reads the CD as '25', and looks for CD number 25, which we'll assume is a recording of Charlie Parker playing jazz. On the other hand, elsewhere in the jukebox is CD number 973, which actually is Beethoven's Fifth Symphony. Then a CD of Beethoven's Fifth can be 'read' in two totally different ways: as a 'pointer' to Charlie Parker, or as Beethoven's Fifth Symphony itself (triggered by any CD whose code starts with 973 in binary). Two contexts, two interpretations, two meanings, two results.
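The jukebox's reading of the disc can be sketched in a few lines of Python. The slot numbers and bit pattern are the ones from the example above; the catalogue itself, and the five-bit 'header' the jukebox reads, are illustrative assumptions.

```python
# A hypothetical jukebox catalogue: slot number -> recording in that slot.
catalogue = {
    25: "Charlie Parker (jazz)",
    973: "Beethoven's Fifth Symphony",
}

def jukebox_play(cd_bits: str) -> str:
    """The jukebox's context: read the opening five bits as a binary
    slot number, then play whatever recording sits in that slot."""
    slot = int(cd_bits[:5], 2)          # '11001' -> 25
    return catalogue.get(slot, "empty slot")

# A disc of Beethoven's Fifth whose code happens to open with 11001
# (the rest of the bit string is arbitrary filler for the sketch):
beethoven_cd = "11001" + "0110100111"
print(jukebox_play(beethoven_cd))       # -> Charlie Parker (jazz)
```

A standard player fed the same bit string would decode it as music; nothing in the bits themselves selects one interpretation over the other.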
Whether something is a message depends upon context, too: sender and receiver must agree upon a protocol for turning meanings into symbols and back again. Without this protocol a semaphore is just a few bits of wood that flap about. Tree branches are bits of wood that flap about, too, but no one ever tries to decode the message being transmitted by a tree. Tree rings, the growth rings that appear when you saw through the trunk, one ring per year, are a different matter. We have learned to 'decode' their 'message' about climate in the year 1066 and the like.
A thick ring indicates a good year with lots of growth on the tree, probably warm and wet; a thin ring indicates a poor year, probably cold and dry. But the sequence of tree rings only became a message, only conveyed information, when we figured out the rules that link climate to tree growth. The tree didn't send its message to us.
In biological development the protocol that gives meaning to the DNA message is the laws of physics and chemistry. That is where the exformation resides. However, it is unlikely that exformation can be quantified. An organism's complexity is not determined by the number of bases in its DNA sequence, but by the complexity of the actions initiated by those bases within the context of biological development. That is, by the meaning of the DNA 'message' when it is received by a finely tuned, up-and-running biochemical machine. This is where we gain an edge over those amoebas. Starting with an embryo that develops little flaps, and making a baby with those exquisite little hands, involves a series of processes that produce skeleton, muscles, skin, and so on. Each stage depends on the current state of the others, and all of them depend on contextual physical, biological, chemical and cultural processes.
A central concept in Shannon's information theory is something that he called entropy, which in this context is a measure of how statistical patterns in a source of messages affect the amount of information that the messages can convey. If certain patterns of bits are more likely than others, then their presence conveys less information, because the uncertainty is reduced by a smaller amount. In English, for example, the letter 'E' is much more common than the letter 'Q'. So receiving an 'E' tells you less than receiving a 'Q'. Given a choice between 'E' and 'Q', your best bet is that you're going to receive an 'E'*. And you learn the most when your expectations are proved wrong. Shannon's entropy smooths out these statistical biases and provides a 'fair' measure of information content.
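Shannon's measure can be sketched directly. The letter frequencies below are illustrative assumptions (real English values vary by corpus; roughly 13 per cent for 'E' and 0.1 per cent for 'Q' are typical estimates):

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

# The rarer the symbol, the more it tells you when it arrives.
p_e, p_q = 0.13, 0.001
surprise_e = -log2(p_e)   # about 2.9 bits
surprise_q = -log2(p_q)   # about 10 bits: a 'Q' is far more informative

# A heavily biased source conveys less per symbol than a fair one.
fair = shannon_entropy([0.5, 0.5])       # exactly 1 bit per symbol
biased = shannon_entropy([0.99, 0.01])   # about 0.08 bits per symbol
```

The second comparison is the 'fair measure' in action: once a pattern makes some symbols predictable, each symbol carries less information on average.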
In retrospect, it was a pity that he used the name 'entropy', because there is a longstanding concept in physics with the same name, normally interpreted as 'disorder'. Its opposite, 'order', is usually identified with complexity. The context here is the branch of physics known as thermodynamics, whose classical setting is a specific, simplified model of a gas. In this model, the molecules of a gas are treated as 'hard spheres', tiny billiard balls. Occasionally the balls collide, and when they do, they bounce off each other as if they were perfectly elastic. The Laws of Thermodynamics state that a large collection of such spheres will obey certain statistical regularities. In such a system, there are two forms of energy: mechanical energy and heat energy.
The First Law states that the total energy of the system never changes. Heat energy can be transformed into mechanical energy, as it is in, say, a steam engine; conversely, mechanical energy can be transformed into heat. But the sum of the two is always the same. The Second Law states, in more precise terms (which we explain in a moment), that heat cannot of its own accord be transferred from a cool body to a hotter one. And the Third Law states that there is a specific temperature below which the gas cannot go - 'absolute zero', which is around -273 degrees Celsius.
The most difficult, and the most interesting, of these laws is the Second. In more detail, it involves a quantity that is again called 'entropy', which is usually interpreted as 'disorder'. If the gas in a room is concentrated in one corner, for instance, this is a more ordered (that is, less disordered!) state than one in which it is distributed uniformly throughout the room. So when the gas is uniformly distributed, its entropy is higher than when it is all in one corner. One formulation of the Second Law is that the amount of entropy in the universe always increases as time passes. Another way to say this is that the universe always becomes less ordered, or equivalently less complex, as time passes. According to this interpretation, the highly complex world of living creatures will inevitably become less complex, until the universe eventually runs out of steam and turns into a thin, lukewarm soup.
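The corner-versus-uniform comparison can be made concrete with Boltzmann's formula S = k log W, where W counts the arrangements ('microstates') consistent with a given overall state. A minimal sketch, with a toy gas of 100 molecules, each sitting in either the left or the right half of the room, and k set to 1 for illustration:

```python
from math import comb, log

def boltzmann_entropy(W, k=1.0):
    """Boltzmann entropy S = k * ln(W); k fixed at 1 here, so S is in
    units of Boltzmann's constant."""
    return k * log(W)

N = 100  # toy number of molecules

# 'All in one corner': exactly one arrangement puts every molecule
# in the left half of the room.
W_corner = 1
# 'Uniformly distributed': count the arrangements with exactly half
# the molecules on each side.
W_uniform = comb(N, N // 2)

S_corner = boltzmann_entropy(W_corner)    # ln(1) = 0.0
S_uniform = boltzmann_entropy(W_uniform)  # about 66.8, in units of k
```

The uniform state wins not because any single arrangement is special, but because there are overwhelmingly more arrangements that look uniform, which is why the gas drifts towards higher entropy.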
This property gives rise to one explanation for the 'arrow of time', the curious fact that it is easy to scramble an egg but impossible to unscramble one. Time flows in the direction of increasing entropy. So scrambling an egg makes the egg more disordered, that is, it increases its entropy, in accordance with the Second Law. Unscrambling the egg would make it less disordered, decreasing its entropy, which conflicts with the Second Law. An egg is not a gas, mind you, but thermodynamics can be extended to solids and liquids, too.