Information is what we know or communicate.

In 1948, Claude Shannon published "A Mathematical Theory of Communication." Shannon's theory, one of the major advances in our understanding of information, came at a time of rapid growth in the telephone and telegraph communication networks. Shannon was motivated by the problem of evaluating and comparing different ways of sending signals.

The basis of Shannon's treatment is a distinction between information and meaning. Shannon asserted that to design a communication system, one need not know what message will be sent or what that message means; one need only know the set of possible messages that might be sent. The number of possible messages determines how much information the system can carry.
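
To make the count concrete: for a system that can send any one of N equally likely messages, Shannon's measure assigns log2(N) bits of information. Here is a minimal Python sketch; the message counts below are illustrative choices, not taken from the text.

```python
import math

def information_bits(num_messages: int) -> float:
    """Information capacity, in bits, of a system that can send any
    one of `num_messages` equally likely messages."""
    return math.log2(num_messages)

# Illustrative counts: a single yes/no signal, one of 26 letters,
# and one of a million distinct telegraph messages.
for n in (2, 26, 1_000_000):
    print(f"{n:>9} possible messages -> {information_bits(n):.2f} bits")
```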

If we have two ways of communicating, such as printed English and Morse code, and they allow the same number of possible messages, we can map the messages of one system one-to-one onto the messages of the other. As long as the sender and recipient agree on a unique way to translate between the two, the same messages can be transferred in either system, and the two systems carry the same information. If the two systems can transfer different numbers of messages, the one that can transfer more can substitute for the one that can transfer fewer, but not the other way around.
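
A small Python sketch of such a translation, using a deliberately tiny (and therefore incomplete) fragment of the Morse code table; any one-to-one, mutually agreed-upon mapping would serve the same purpose.

```python
# A one-to-one (invertible) mapping between a few printed letters and
# their Morse code equivalents. Partial table, for illustration only.
to_morse = {"S": "...", "O": "---", "E": ".", "T": "-"}
from_morse = {code: letter for letter, code in to_morse.items()}

message = "SOS"
encoded = " ".join(to_morse[ch] for ch in message)
decoded = "".join(from_morse[code] for code in encoded.split(" "))

print(encoded)             # ... --- ...
print(decoded == message)  # True: the translation loses no information
```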

The number of messages that can be sent in a given amount of time is an important characteristic of a communication system: a system that takes longer to send a given number of messages takes correspondingly longer to convey the same information. Counting possible messages also connects information to the length of a text of printed English, a string of Morse code signals, or a string of binary digits. The amount of information grows with the length of the text, the number of Morse code signals, or the length of the binary string.
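
Since each added symbol multiplies the number of possible messages by the alphabet size, the message count grows exponentially with length, and the information, its logarithm, grows linearly. A short Python illustration, with alphabet sizes chosen for the example:

```python
import math

def possible_messages(alphabet_size: int, length: int) -> int:
    """Number of distinct messages of the given length."""
    return alphabet_size ** length

# Doubling the length squares the number of possible messages,
# so the information (its logarithm) simply doubles.
for length in (1, 8, 16):
    n = possible_messages(2, length)    # binary strings
    m = possible_messages(26, length)   # 26-letter texts
    print(f"length {length:>2}: binary = {math.log2(n):.0f} bits, "
          f"26-letter text = {math.log2(m):.1f} bits")
```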

Connection to the complexity profile:

We can use this concept of communication to consider how one person would describe a system to another. The amount of information that must be communicated in a complete description is an intuitive measure of the system's complexity, and measuring complexity is a key problem in describing and studying complex systems. Applying this concept is complicated by the question of the level of detail of a description: it can be difficult to compare the lengths of descriptions of different systems because they may be given at different levels of detail. Nonetheless, the correspondence between a system's complexity and the amount of information needed to describe it is an important concept.
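
One rough way to see the idea, offered only as a sketch: treat the compressed length of a system's recorded state as a stand-in for the length of a complete description. The Python example below uses the standard zlib module and invented data; it illustrates the correspondence between complexity and description length, not the complexity profile itself.

```python
import random
import zlib

# Compressed length as a crude stand-in for the length of a complete
# description: a highly ordered system needs far less information to
# describe than a disordered one of the same size.
random.seed(0)
ordered = b"H" * 10_000  # e.g., 10,000 coins, all heads
disordered = bytes(random.getrandbits(8) for _ in range(10_000))

for name, data in (("ordered", ordered), ("disordered", disordered)):
    print(f"{name:>10}: {len(zlib.compress(data))} bytes to describe")
```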

The original article (C. E. Shannon, "A Mathematical Theory of Communication," Bell System Technical Journal, 1948) may still be the best description of Shannon's information theory. It is also available in book form, with an introduction by Warren Weaver, as The Mathematical Theory of Communication.

Related concepts: meaning, complexity

Copyright © 2011 Yaneer Bar-Yam All rights reserved.