Significant Points in the Study of Complex Systems

by Yaneer Bar-Yam


To help establish a backdrop for the ICCS conferences, I have compiled (with much feedback and many contributions from others) a list of "significant points" in the study of complex systems. These are intended to represent key conceptual insights coupled with mathematical tools for the analysis and discussion of complex systems in general. Feedback and additions are welcome. The points are provided only in brief, and some familiarity with the subject is assumed. Some controversial points are included (what field has no controversy?).

  1. Multi-scale descriptions are needed to understand complex systems. Relevant mathematical tools are scaling laws, fractals & trees, renormalization, and multigrid methods. These specific methods do not exhaust the more general issue of relating finer scale descriptions to larger scale descriptions (e.g. which fine scale parameters are relevant on larger scales). Examples: weather - patterns on all scales (cyclones, tornadoes, dust devils); proteins - secondary, tertiary, quaternary structure; physiology - molecules, cells, tissues, systems; brain - hemispheres, lobes, functional regions, etc.; economy/society - individuals, organizations, institutions, nations.
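The relation between scales can be sketched with a toy coarse-graining, a crude one-dimensional analogue of block-spin renormalization (the configuration and noise positions below are invented purely for illustration):

```python
# A fine-scale configuration: two large domains plus a little fine-scale noise.
spins = [1] * 32 + [-1] * 32
spins[5] = -1   # noise inside the +1 domain
spins[40] = 1   # noise inside the -1 domain

# Coarse-grain by majority rule over blocks of 4 sites.
blocks = [spins[i:i + 4] for i in range(0, len(spins), 4)]
coarse = [1 if sum(block) > 0 else -1 for block in blocks]

# The large-scale domain structure survives; the fine-scale noise does not.
print(coarse)
```

Which fine-scale features survive such averaging is exactly the question of which parameters are relevant on larger scales.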
  2. Fine scales influence large scale behavior. Relevant mathematical tools are nonlinear feedback, iterative maps, the mathematics of deterministic chaos, and amplification & dissipation. The specific methods of deterministic chaos do not exhaust the more general issue of fine scale effects on large scale behavior. Examples: weather - the "butterfly effect"; proteins - enzymatic activity is amplification; physiology - neuromuscular control (a nerve cell action triggering a muscle); economy/society - the relevance of individuals to larger scale behaviors (how many people watch Michael Jordan).
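The butterfly effect can be seen in the logistic map, a standard iterative-map example of deterministic chaos (the initial conditions here are arbitrary):

```python
def logistic(x, r=4.0):
    """One step of the logistic map; r = 4 puts it in the chaotic regime."""
    return r * x * (1.0 - x)

# Two histories differing by one part in 10^10 at the fine scale.
x, y = 0.3, 0.3 + 1e-10
gaps = []
for _ in range(100):
    x, y = logistic(x), logistic(y)
    gaps.append(abs(x - y))

# The microscopic difference is amplified to a macroscopic one.
```

The same amplification is why specifying a chaotic system's long-term behavior requires ever finer knowledge of its initial state.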
  3. Pattern formation: Prominent among simple mathematical models that capture pattern formation are local activation / long range inhibition models, e.g. Turing patterns and the work of Prigogine. Examples: weather - cells of airflow; proteins - alpha and beta structure; physiology - processes of pattern formation in development; brain/mind - various patterns of interconnection and pattern recognition mechanisms (on-center off-surround); technology - magnetic bubble memories; evolution - patterns of species in phenome or genome space; economy/society - patterns of industrial/residential/commercial areas.
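A minimal local activation / long range inhibition model can be sketched on a one-dimensional ring with a "Mexican hat" interaction; the ranges, weights, and lattice size below are all invented for illustration:

```python
import numpy as np

n = 60  # ring of 60 sites

def step(u):
    """Threshold the local field: activation within 2 sites, inhibition out to 6."""
    new = np.empty_like(u)
    for i in range(n):
        field = 0.0
        for d in range(-6, 7):
            weight = 1.0 if abs(d) <= 2 else -0.5  # near: activate; far: inhibit
            field += weight * u[(i + d) % n]       # periodic boundary
        new[i] = 1.0 if field >= 0 else -1.0
    return new

rng = np.random.default_rng(0)
u = rng.choice([-1.0, 1.0], n)  # random initial state
for _ in range(50):
    u = step(u)
# Typically the random start organizes into alternating domains (stripes).
```

Whether stripes or a uniform state emerges depends on the balance of activation and inhibition, the same trade-off that controls Turing patterns.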
  4. Multiple (meta)stable states: Small displacements (perturbations) lead to recovery; larger ones can lead to radical changes of properties. Dynamics on such a landscape do not average simply. Mathematical models are generally based upon local frustration, e.g. spin glasses and random Boolean nets. Attractor networks use local minima as memories. Examples: weather - persistent structures; proteins - results of displacements in sequence or physical space; physiology - the effect of shocks, dynamics of e.g. the heart; brain/mind - memory, recovery from damage; economy/society - e.g. suggested by dynamics of market responses.
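Recovery from a small perturbation can be sketched with a one-pattern Hopfield-style attractor network (the pattern and the corrupted bits are arbitrary); the stored pattern sits in a local minimum, and a mildly perturbed state falls back into it:

```python
import numpy as np

# Store one +/-1 pattern via the Hebbian rule (zero self-coupling).
pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1,
                    1, 1, -1, 1, -1, -1, 1, -1], dtype=float)
W = np.outer(pattern, pattern)
np.fill_diagonal(W, 0)

# Perturb the stored state by flipping three bits...
state = pattern.copy()
state[[0, 5, 9]] *= -1

# ...and let the network relax: one synchronous sign update recovers the memory.
state = np.sign(W @ state)
```

With many stored patterns, a large enough perturbation instead tips the state into a different minimum, i.e. a radical change of properties.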
  5. Complexity - answer to question "How complex is it?": There is much discussion of this question. A general answer: The amount of information necessary to describe the system. There are various important issues that require clarification. One of these relates to the use of inference to obtain the description from a seemingly smaller amount of information. This leads to the concept of algorithmic complexity. Another relevant point: The apparent complexity depends on the scale at which the system is described, however, once a particular scale is chosen the complexity should be well defined and bounded (at a particular instant) by the information necessary to describe the microstate of the system (the entropy). Also note that complexity on a large scale requires correlations on a small scale, which reduces the smaller scale complexity. Example: random motion (high small scale complexity) averages out on a larger scale.
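The "amount of information necessary to describe the system" can be made concrete with the Shannon entropy per symbol of a description at a chosen scale (the strings below are invented for illustration):

```python
import math
from collections import Counter

def bits_per_symbol(s):
    """Shannon entropy of the empirical symbol distribution, in bits per symbol."""
    n = len(s)
    return -sum(c / n * math.log2(c / n) for c in Counter(s).values())

# A constant description needs no information per symbol;
# a uniform 4-symbol description needs 2 bits per symbol.
low = bits_per_symbol("aaaaaaaa")
high = bits_per_symbol("abcd" * 8)
```

The entropy of the microstate bounds the description length at the finest scale; coarse-graining, as in the random-motion example, can only reduce it.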
  6. Behavior (response) complexity: To describe the behavior (actions) of a system acting in response to its environment, where the complexity of the environmental variables is C(e) and that of the action is C(a), we often try to describe the response function f, where a=f(e). However, unless simplifying assumptions are made, specifying the response to each environment requires an amount of information that grows exponentially with the complexity of the environment (a response must be specified for each possible environment). Specifically, C(f)=C(a)*2^C(e). This is impossible for all but simple environments (e.g. less than a few tens of bits). This means that behaviorism in psychology, strict phenomenology in any field, exhaustive testing of the effects of multiple drugs, and exhaustive testing of computer chips with many input bits are all fundamentally impossible.
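The blow-up is simple arithmetic: a full lookup table needs C(a) bits for each of the 2^C(e) possible environments (the numbers below are chosen only to illustrate):

```python
def response_table_bits(c_action, c_environment):
    """Bits needed to tabulate a response for every environment: C(f) = C(a) * 2^C(e)."""
    return c_action * 2 ** c_environment

# Even a 1-bit response to a 30-bit environment needs about a billion table entries.
print(response_table_bits(1, 30))
```

At a few hundred environment bits the table exceeds any physically realizable storage, which is the sense in which strict phenomenology is impossible.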
  7. Emergence: Related to the dependence of the whole on parts, the interdependence of parts, and specialization of parts. This is directly relevant to questions about how we study systems both theoretically and experimentally. Parts must be studied "in vivo". For example - "If you remove a vacuum tube from a radio and the radio squeals do not conclude that the purpose of the tube is to suppress squeals." While studying the parts in isolation does not work, the nature of complex systems can be probed by investigating how changes in one part affect the others, and the behavior of the whole.
  8. 7+/-2 rule: This is related to the interdependence of parts of a system. For a system divided into components, looking at the dependencies between them we can ask when the state / behavior of one component depends on the state of each of the other ones individually, rather than on an average. This pertains to the question of how many independent variables are needed before the central limit theorem applies. The conclusion is that this number is approximately 7. It is supported by the phenomenology of substructure branching ratios in proteins, physiology, the brain, and social systems (e.g. organizational rules about the number of members of a committee).
  9. The relationship of descriptions and systems: This is relevant to our understanding of theory and simulations, the recognition of systems in their models, encoding and decoding (compression), and the subject of algorithmic complexity. Specific applications are apparent in biological development (genome vs. physiology), engineering design, and memory vs. experience.
  10. Selection is information (à la Shannon theory): The amount of information necessary to specify a system is obtained by enumerating the possible states and comparing them with the possible states of the description e.g. a bit string, or e.g. English language (at about 1 bit / character). This enables the systems to be enumerated and one of them specified. Selection as information is relevant to the issue of multiple selection: replication (reproduction) with variation, and comparative selection (competition) as a mechanism for POSSIBLE increase in complexity. Consistent with modern biological views of evolution it is essential to emphasize that selection does not have to increase complexity.
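The Shannon view can be stated in one line: selecting one of N equally likely possibilities conveys log2(N) bits, so a description carrying about 1 bit per character (as English roughly does) can single out one of about 2^L systems with L characters (the numbers below are illustrative):

```python
import math

def selection_bits(n_possibilities):
    """Information conveyed by selecting one of n equally likely possibilities."""
    return math.log2(n_possibilities)

# Selecting one of 8 candidate systems conveys 3 bits...
bits = selection_bits(8)
# ...so ~20 characters of English (~1 bit/char) can specify one of ~2^20 systems.
```

Competitive selection among replicated variants picks out one of many, which is why it can (but need not) increase the information embodied in a system.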
  11. Composites: To form a new complex system take parts (aspects) of other complex systems and recombine them. For this to work parts must be partially independent. Examples - sexual reproduction, creativity (e.g. seeing a person walking and a bird flying and imagining a person flying by combining information of shape and motion represented in different parts of the brain), and modular construction (building blocks) in artificial systems. The purpose of composites is to allow rapid evolution.
  12. Control hierarchy: When (if) a single component controls the collective behavior (not the individual behaviors of all the components) of a system, then the collective behavior cannot be more complex than the individual behavior. i.e. there is no emergent complexity. Examples: muscle (since the muscle is controlled by a single neuron, its collective behavior is no more complex than the neuron behavior), society/economy: corporate hierarchies/dictatorships/etc. (to the extent that central control is exercised complexity of collective behavior is bounded by the complexity of the controlling individual).
  13. Modeling and simulation: There are a number of simulation methodologies that have arisen as having general application in the study of complex systems. These include: Monte Carlo, simulated annealing, cellular automata. Other methods have been mentioned above.
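As one example of these methods, a minimal cellular automaton (rule 90 on a periodic lattice; the seed and lattice width are arbitrary) already generates structure on all scales, tracing out the Sierpinski triangle:

```python
def rule90_step(row):
    """Each cell becomes the XOR of its two neighbors (periodic boundary)."""
    n = len(row)
    return [row[i - 1] ^ row[(i + 1) % n] for i in range(n)]

row = [0, 0, 0, 0, 1, 0, 0, 0, 0]   # a single seed cell
history = [row]
for _ in range(4):
    row = rule90_step(row)
    history.append(row)
# Successive rows trace out the Sierpinski triangle pattern.
```

That such a simple local rule produces a fractal ties this point back to the multi-scale theme of point 1.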
