Cite as:

Benjamin Allen, Blake C. Stacey, and Yaneer Bar-Yam, An information-theoretic formalism for multiscale structure in complex systems, arXiv:1409.4708 (September 16, 2014).



Excerpt from the Science News article:

There’s a new way to quantify structure and complexity

by Tom Siegfried

In searching for sense in the complexities of nature, science has often found success by identifying common aspects of diverse phenomena. 

When one principle explains how different things behave, nature becomes more comprehensible, and more manageable. Modern science took flight when Newton showed how one idea – universal gravitation – explained both the motions of celestial bodies and apples falling on Earth. Most important, it didn’t matter whether the apple was red or green, or even if it was an apple. Newton’s law described how everything else fell, from bricks to bullets. 

But Newton’s gravity, and his laws of motion, and the rest of science built on that foundation had limits. Newton’s science couldn’t cope with really strong gravity, extremely fast motion or supertiny particles. Relativity theory and quantum physics helped with that. But there remains a realm where standard science has struggled to find unifying principles among different behaviors. That would be the kingdom of complexity, the universe of systems that defy simplification. 

Such complex systems are everywhere, of course. Some are physical — the electric power grid, for instance. Many are biological — brains, bodies, ecosystems. And others are social — financial markets, interlocking corporate directorates, and yes, for God’s sake, Twitter. 

It’s hard to find simple scientific principles from which to deduce all the multifaceted things that such complex systems do. But there is, for sure, one thing that they do all have in common. They all have a structure. And it’s by quantifying structure, three scientists suggest in an intriguing new paper, that complexity can be tamed...

Continue reading at Science News.


Abstract

We develop a general formalism for representing and understanding structure in complex systems. In our view, structure is the totality of relationships among a system's components, and these relationships can be quantified using information theory. In the interest of flexibility we allow information to be quantified using any function, including Shannon entropy and Kolmogorov complexity, that satisfies certain fundamental axioms. Using these axioms, we formalize the notion of a dependency among components, and show how a system's structure is revealed in the amount of information assigned to each dependency. We explore quantitative indices that summarize system structure, providing a new formal basis for the complexity profile and introducing a new index, the "marginal utility of information". Using simple examples, we show how these indices capture intuitive ideas about structure in a quantitative way. Our formalism also sheds light on a longstanding mystery: that the mutual information of three or more variables can be negative. We discuss applications to complex networks, gene regulation, the kinetic theory of fluids and multiscale cybernetic thermodynamics.
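The abstract's closing point, that the mutual information of three or more variables can be negative, can be made concrete with the standard XOR example (an illustration, not code from the paper): if X and Y are independent fair bits and Z = X XOR Y, then the three-variable mutual information computed by inclusion-exclusion over Shannon entropies comes out to -1 bit. A minimal Python sketch, with the entropy helper and variable names chosen here purely for illustration:

    import itertools
    import math
    from collections import Counter

    def entropy(symbols):
        # Shannon entropy (in bits) of the empirical distribution over `symbols`.
        counts = Counter(symbols)
        total = sum(counts.values())
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    # Joint outcomes of (X, Y, Z): X and Y are independent fair bits, Z = X XOR Y,
    # so each of the four listed outcomes is equally likely.
    triples = [(x, y, x ^ y) for x, y in itertools.product([0, 1], repeat=2)]

    def H(*indices):
        # Joint entropy of the selected coordinates of the triples.
        return entropy([tuple(t[i] for i in indices) for t in triples])

    # Inclusion-exclusion form of the three-variable mutual information:
    # I(X;Y;Z) = H(X) + H(Y) + H(Z) - H(X,Y) - H(X,Z) - H(Y,Z) + H(X,Y,Z)
    I3 = H(0) + H(1) + H(2) - H(0, 1) - H(0, 2) - H(1, 2) + H(0, 1, 2)

    print(I3)  # prints -1.0: one negative bit of three-way shared information

In this example every pair of variables is independent (zero pairwise mutual information), yet any two of them together determine the third; that combination is what pushes the three-way term negative.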