Complexity Science aims to understand what it is that makes some systems "more than the sum of their parts". A natural first step towards this question is to study networks of pairwise interactions, which has been done with great success in many disciplines -- to the extent that many people today identify Complexity Science with network analysis. In contrast, multivariate complexity remains a vast and mostly unexplored territory. Indeed, the "modes of interdependency" that can exist between three or more variables are often nontrivial and poorly understood, and yet they are paramount for our understanding of complex systems in general, and of emergence in particular.

In this talk we present an information-theoretic framework to analyse high-order correlations, i.e. statistical dependencies between groups of variables that cannot be reduced to pairwise interactions. Following the spirit of information theory, our approach is data-driven and model-agnostic, being applicable to discrete, continuous, and categorical data. We review the evolution of related ideas in the context of theoretical neuroscience, and discuss the most prominent extensions of information-theoretic metrics to multivariate settings. We then introduce the O-information, a novel metric that quantifies various structural (i.e. synchronous) high-order effects. Finally, we provide a critical discussion of the framework of Integrated Information Theory (IIT), which suggests an approach for extending the analysis to dynamical settings. To illustrate the presented methods, we show how the analysis of high-order correlations can reveal critical structures in various scenarios, including cellular automata, Baroque music scores, and various EEG datasets.
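For readers who want a concrete handle on the O-information: for n variables it can be written as Omega(X) = (n-2) H(X) + sum_i [H(X_i) - H(X_{-i})], where H denotes Shannon entropy and X_{-i} is the system without variable i; positive values indicate redundancy-dominated interdependencies, negative values synergy-dominated ones. Below is a minimal, self-contained Python sketch for discrete samples (the function names `entropy` and `o_information` are ours, not from the cited papers); it uses empirical plug-in entropies, so it is illustrative rather than a statistically careful estimator.

```python
from collections import Counter
from math import log2

def entropy(samples):
    # Shannon entropy (in bits) of the empirical distribution of the samples.
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def o_information(data):
    # data: list of tuples, one joint sample of the n variables per row.
    # Computes Omega = (n-2) H(X) + sum_i [H(X_i) - H(X_{-i})].
    n = len(data[0])
    cols = list(zip(*data))
    omega = (n - 2) * entropy(data)
    for i in range(n):
        rest = [tuple(x for j, x in enumerate(row) if j != i) for row in data]
        omega += entropy(cols[i]) - entropy(rest)
    return omega

# Synergy-dominated example: X3 = X1 XOR X2, inputs uniform.
xor_data = [(a, b, a ^ b) for a in (0, 1) for b in (0, 1)]
print(o_information(xor_data))   # -> -1.0 (synergistic triplet)

# Redundancy-dominated example: three identical copies of one bit.
copy_data = [(a, a, a) for a in (0, 1)]
print(o_information(copy_data))  # -> 1.0 (redundant triplet)
```

The two toy cases recover the textbook signs: the XOR triplet, where no pair carries information about the whole, yields a negative O-information, while fully redundant copies yield a positive one.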

References:

[1] F. Rosas, P.A. Mediano, M. Gastpar and H.J. Jensen, "Quantifying High-order Interdependencies via Multivariate Extensions of the Mutual Information", submitted to PRE, under review.

https://arxiv.org/abs/1902.11239

[2] F. Rosas, P.A. Mediano, M. Ugarte and H.J. Jensen, "An information-theoretic approach to self-organisation: Emergence of complex interdependencies in coupled dynamical systems", Entropy, vol. 20, no. 10, art. 793, pp. 1-25, Sept. 2018.

https://www.mdpi.com/1099-4300/20/10/793