Information Theory

In “Recent Contributions to the Mathematical Theory of Communication” (University of Illinois Press, Urbana, 1964), Warren Weaver summarised the paradoxical character of Information Theory (as formulated by C. Shannon):

“2.2 Information

“The word information, in this theory, is used in a special sense that must not be confused with its ordinary usage. In particular, information must not be confused with meaning.

“In fact, two messages, one of which is heavily loaded with meaning and the other of which is pure nonsense, can be exactly equivalent, from the present viewpoint, as regards information. It is this, undoubtedly, that Shannon means when he says that “the semantic aspects of communication are irrelevant to the engineering aspects”. But this does not mean that the engineering aspects are necessarily irrelevant to the semantic aspects.

“To be sure, this word information in communication theory relates not so much to what you do say, as to what you could say.

“That is, information is a measure of one’s freedom of choice when one selects a message. If one is confronted with a very elementary situation where he has to choose one of two alternative messages, then it is arbitrarily said that the information, associated with this situation, is unity. Note that it is misleading (although often convenient) to say that one or the other message conveys unit information. The concept of information applies not to the individual messages (as the concept of meaning would), but rather to the situation as a whole, the unit information indicating that in this situation one has an amount of freedom of choice in selecting a message which it is convenient to regard as a standard or unit amount.” (pp. 8-9)
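Weaver’s “unit” can be made concrete with Shannon’s entropy formula, which measures freedom of choice over the whole set of possible messages. The short Python sketch below (my own illustration, not part of Weaver’s text) shows that a choice between two equally likely alternatives yields exactly one unit (one bit), while a heavily skewed choice yields far less:

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: a measure of the freedom of choice
    over a whole set of possible messages, not of any one message."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Two equally likely alternative messages: unit information (1 bit),
# regardless of what either message "means".
print(entropy([0.5, 0.5]))    # 1.0

# A heavily skewed choice offers little freedom of choice:
print(entropy([0.99, 0.01]))  # about 0.08 bits
```

Note that the number attaches to the situation (the probability distribution over possible messages), not to any individual message, exactly as Weaver insists.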

In my view this fragment summarises the essence of the Shannon-Weaver theory of information. Understanding these clarifications about the concept of information, and especially about what is not meant by “information” in this theory, is necessary to avoid a lax and incorrect view of every aspect of our “information economy.” For example, let us consider the role of “noise” in information exchange. A trivialised understanding of information theory, common nowadays, assumes that data always has “value” and that “information flows” are always unidirectional (from the sender to the receiver). “Noise” appears as an obstacle in this picture, and is interpreted as purely negative or value-destroying. Nevertheless, a closer reading of the original Shannon-Weaver approach shows that, if information is a measure of “freedom of choice” in message selection, then an increase in this choice (caused by “noise”) could paradoxically be interpreted as an “increase in information.”

Uncertainty and information are both increased by the presence of noise in the information channel. The measure of information therefore says nothing about the value a message has for the receiver. In the Shannon-Weaver formulation, information as a measure of “entropy” is not a measure of economic value; hence, having more or less information also says nothing about the “content” or “relevance” of a message for the receiver.
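This can be made explicit with a small numerical sketch. Assuming a binary symmetric channel and a skewed source (both choices of mine, purely for illustration), noise raises the entropy H(Y) of what the receiver sees, even as the mutual information I(X;Y), the part actually shared with the sender, collapses:

```python
import math

def h(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

p0 = 0.9  # a skewed source: symbol 0 is sent 90% of the time

for e in (0.0, 0.1, 0.3):             # channel error (noise) probability
    q0 = p0 * (1 - e) + (1 - p0) * e  # P(received symbol = 0)
    h_y = h(q0)                       # entropy of what the receiver sees
    i_xy = h_y - h(e)                 # mutual information: H(Y) - H(Y|X)
    print(f"noise={e:.1f}  H(Y)={h_y:.3f} bits  I(X;Y)={i_xy:.3f} bits")
```

Running it, H(Y) grows from about 0.47 to 0.93 bits as the noise increases, while I(X;Y) falls from 0.47 towards 0.04: the received signal carries more “information” in Shannon’s measure, yet less and less of it is about the message that was sent.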

In fact, more information can have (and in many cases does have) a negative or counterproductive effect on the receiver. In other cases, the uncertainty arising from an increased choice on the receiver’s side may be desirable, but only if the receiver is able to process the increased information and reduce its own level of uncertainty.

We have to reach a point where we recognise that only an active receiver (in a bidirectional information exchange) allows for a complete picture of the exchange. An active receiver is one which injects information from its side into the channel, and in this way controls, filters and reduces incoming uncertainty/information.
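As a deliberately simple illustration of such an active receiver (the parity check and retransmission request below are stand-ins of my own, not part of the Shannon-Weaver text), consider a receiver that injects a signal back into the channel whenever an incoming frame fails a consistency check:

```python
import random

def noisy_send(bits, flip_prob, rng):
    """Transmit bits over a channel that flips each bit with probability flip_prob."""
    return [b ^ (rng.random() < flip_prob) for b in bits]

def active_receive(frame, flip_prob, rng, max_retries=10):
    """A toy active receiver: it checks a parity bit and, on failure, injects
    a retransmission request back into the channel. (Parity detects only an
    odd number of flips; real protocols use stronger checks.)"""
    for attempt in range(1, max_retries + 1):
        received = noisy_send(frame, flip_prob, rng)
        data, parity = received[:-1], received[-1]
        if sum(data) % 2 == parity:   # consistency check passed: accept
            return data, attempt
    return None, max_retries          # channel too noisy: give up

rng = random.Random(42)
payload = [1, 0, 1, 1]
frame = payload + [sum(payload) % 2]  # append an even-parity bit
data, attempts = active_receive(frame, flip_prob=0.2, rng=rng)
print(data, "accepted after", attempts, "attempt(s)")
```

Even this toy feedback loop shows the point: by sending information of its own, the receiver filters the incoming stream and keeps its uncertainty bounded, which a passive endpoint cannot do.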