The Local Information Dynamics of Distributed Computation in Complex Systems
Author: Joseph T. Lizier
Publisher: Springer Science & Business Media
Total Pages: 249
Release: 2012-11-06
Genre: Technology & Engineering
ISBN: 9783642329524

The nature of distributed computation in complex systems has often been described in terms of memory, communication and processing. This thesis presents a complete information-theoretic framework to quantify these operations on information (i.e. information storage, transfer and modification), and in particular their dynamics in space and time. The framework is applied to cellular automata, and delivers important insights into the fundamental nature of distributed computation and the dynamics of complex systems (e.g. that gliders are dominant information transfer agents). Applications to several important network models, including random Boolean networks, suggest that the capabilities for information storage and coherent transfer are maximised near the critical regime in certain order-chaos phase transitions. Further applications to study and design information structure in the contexts of computational neuroscience and guided self-organisation underline the practical utility of the techniques presented here.
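For orientation, the storage and transfer measures this framework localises in space and time have standard pointwise definitions in the information-dynamics literature; the following is a sketch of those definitions rather than a quotation from the thesis. Writing x_t^{(k)} for the length-k history of a target process X and y_t for the state of a source process Y, the local active information storage and local transfer entropy are

a(t+1) = \log_2 \frac{p(x_{t+1} \mid x_t^{(k)})}{p(x_{t+1})}, \qquad t_{Y \to X}(t+1) = \log_2 \frac{p(x_{t+1} \mid x_t^{(k)}, y_t)}{p(x_{t+1} \mid x_t^{(k)})}.

Averaging these local values over time (and over cells in a cellular automaton) recovers the usual active information storage and transfer entropy; it is the spatiotemporal maps of the local values that allow structures such as gliders to be identified as dominant information transfer agents.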

Guided Self-Organization: Inception
Author: Mikhail Prokopenko
Publisher: Springer Science & Business Media
Total Pages: 488
Release: 2013-12-19
Genre: Technology & Engineering
ISBN: 9783642537349

Is it possible to guide the process of self-organisation towards specific patterns and outcomes? Wouldn’t this be self-contradictory? After all, a self-organising process assumes a transition into a more organised form, or towards a more structured functionality, in the absence of centralised control. Then how can we place the guiding elements so that they do not override rich choices potentially discoverable by an uncontrolled process? This book presents different approaches to resolving this paradox. In doing so, the presented studies address a broad range of phenomena, ranging from autopoietic systems to morphological computation, and from small-world networks to information cascades in swarms. A large variety of methods is employed, from spontaneous symmetry breaking to information dynamics to evolutionary algorithms, creating a rich spectrum reflecting this emerging field. Demonstrating several foundational theories and frameworks, as well as innovative practical implementations, Guided Self-Organisation: Inception will be an invaluable tool for advanced students and researchers in a multiplicity of fields across computer science, physics and biology, including information theory, robotics, dynamical systems, graph theory, artificial life, multi-agent systems, theory of computation and machine learning.

Directed Information Measures in Neuroscience
Author: Michael Wibral, Raul Vicente, Joseph T. Lizier
Publisher: Springer
Total Pages: 234
Release: 2014-03-20
Genre: Technology & Engineering
ISBN: 9783642544743

Analysis of information transfer has found rapid adoption in neuroscience, where a highly dynamic transfer of information continuously runs on top of the brain's slowly-changing anatomical connectivity. Measuring such transfer is crucial to understanding how flexible information routing and processing give rise to higher cognitive function. Directed Information Measures in Neuroscience reviews recent developments of concepts and tools for measuring information transfer, their application to neurophysiological recordings and analysis of interactions. Written by the most active researchers in the field, the book discusses the state of the art, future prospects and challenges on the way to an efficient assessment of neuronal information transfer. Highlights include the theoretical quantification and practical estimation of information transfer, description of transfer locally in space and time, multivariate directed measures, information decomposition among a set of stimulus/response variables, and the relation between interventional and observational causality. Applications to neural data sets and pointers to open source software highlight the usefulness of these measures in experimental neuroscience. With state-of-the-art mathematical developments, computational techniques and applications to real data sets, this book will be of benefit to all graduate students and researchers interested in detecting and understanding the information transfer between components of complex systems.

Information-based methods for neuroimaging: analyzing structure, function and dynamics
Author: Jesus M. Cortés, Daniele Marinazzo, Miguel Angel Muñoz
Publisher: Frontiers Media SA
Total Pages: 192
Release: 2015-05-07
Genre: Neurosciences. Biological psychiatry. Neuropsychiatry
ISBN: 9782889195022

The aim of this Research Topic is to discuss the state of the art on the use of Information-based methods in the analysis of neuroimaging data. Information-based methods, typically built as extensions of the Shannon Entropy, are at the basis of model-free approaches which, being based on probability distributions rather than on specific expectations, can account for all possible non-linearities present in the data in a model-independent fashion. Mutual Information-like methods can also be applied to interacting dynamical variables described by time series, thus addressing the uncertainty reduction (or information) in one variable obtained by conditioning on another set of variables. In recent years, different Information-based methods have been shown to be flexible and powerful tools to analyze neuroimaging data, with a wide range of different methodologies, including formulations based on bivariate vs. multivariate representations, frequency vs. time domains, etc.

Apart from methodological issues, the information bit as a common unit represents a convenient way to open the road for comparison and integration between different measurements of neuroimaging data in three complementary contexts: Structural Connectivity, Dynamical (Functional and Effective) Connectivity, and Modelling of brain activity. Applications are ubiquitous, ranging from the resting state in healthy subjects to modulations of consciousness and other aspects of pathophysiology. Mutual Information-based methods have provided new insights about common principles in brain organization, showing the existence of an active default network when the brain is at rest. It is not clear, however, how this default network is generated, how its different modules interact, or how it disappears in the presence of stimulation. Some of these open questions at the functional level might find their mechanisms in their structural correlates. A key question is the link between structure and function and the use of structural priors for the understanding of functional connectivity measures.

As far as effective connectivity is concerned, a common framework has recently been proposed for Transfer Entropy and Granger Causality, a well-established methodology originally based on autoregressive models. This framework can open the way to new theories and applications. This Research Topic brings together contributions from researchers from different backgrounds who are either developing new approaches or applying existing methodologies to new data, and we hope it will set the basis for discussing the development and validation of new Information-based methodologies for the understanding of brain structure, function, and dynamics.
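For reference, the "uncertainty reduction by conditioning" described above corresponds to conditional mutual information, and transfer entropy is such a conditional mutual information between the next state of a target X and the past of a source Y, given the target's own past; these are standard definitions rather than material from the Research Topic itself:

I(X ; Y \mid Z) = H(X \mid Z) - H(X \mid Y, Z), \qquad TE_{Y \to X} = I(X_{t+1} ; Y_t^{(l)} \mid X_t^{(k)}).

The common framework with Granger Causality mentioned above rests on the result that, for jointly Gaussian variables, linear Granger causality and transfer entropy (measured in nats) are equivalent up to a factor of two: F_{Y \to X} = 2 TE_{Y \to X}.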

Transfer Entropy
Author: Deniz Gençağa
Publisher: MDPI
Total Pages: 335
Release: 2018-08-24
Genre: Electronic books
ISBN: 9783038429197

This book is a printed edition of the Special Issue "Transfer Entropy" that was published in Entropy.

From Matter to Life
Author: Sara Imari Walker, Paul C. W. Davies, George F. R. Ellis
Publisher: Cambridge University Press
Total Pages: 517
Release: 2017-02-23
Genre: Science
ISBN: 9781107150539

This book tackles the most difficult and profound open questions about life and its origins from an information-based perspective.

Complexity, Criticality and Computation (C³)
Author: Mikhail Prokopenko
Publisher: MDPI
Total Pages: 269
Release: 2018-04-06
Genre: Electronic Book
ISBN: 9783038425144

This book is a printed edition of the Special Issue "Complexity, Criticality and Computation (C³)" that was published in Entropy.

An Introduction to Transfer Entropy
Author: Terry Bossomaier, Lionel Barnett, Michael Harré, Joseph T. Lizier
Publisher: Springer
Total Pages: 190
Release: 2016-11-15
Genre: Computers
ISBN: 9783319432229

This book considers a relatively new metric in complex systems, transfer entropy, derived from a series of measurements, usually a time series. After a qualitative introduction and a chapter that explains the key ideas from statistics required to understand the text, the authors then present information theory and transfer entropy in depth. A key feature of the approach is the authors' work to show the relationship between information flow and complexity. The later chapters demonstrate information transfer in canonical systems, and applications, for example in neuroscience and in finance. The book will be of value to advanced undergraduate and graduate students and researchers in the areas of computer science, neuroscience, physics, and engineering.
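To make the metric concrete, here is a minimal sketch of how transfer entropy can be estimated from discrete time-series data. It is a naive plug-in (counting) estimator with source and target history lengths of one, written for this listing rather than taken from the book; the function name and the toy example are illustrative only.

# Naive plug-in estimate of transfer entropy T(Y -> X) in bits for discrete
# time series, with history lengths k = l = 1. Counting-based probabilities
# are only reliable for long series over small alphabets (e.g. binary data).
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Estimate T(Y -> X) in bits from two equal-length discrete sequences."""
    x, y = np.asarray(x), np.asarray(y)
    n = len(x) - 1
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))   # counts of (x_{t+1}, x_t, y_t)
    pairs_tt = Counter(zip(x[1:], x[:-1]))          # counts of (x_{t+1}, x_t)
    pairs_sy = Counter(zip(x[:-1], y[:-1]))         # counts of (x_t, y_t)
    singles = Counter(x[:-1])                       # counts of x_t
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n                                  # p(x_{t+1}, x_t, y_t)
        p_full = c / pairs_sy[(x0, y0)]                  # p(x_{t+1} | x_t, y_t)
        p_self = pairs_tt[(x1, x0)] / singles[x0]        # p(x_{t+1} | x_t)
        te += p_joint * np.log2(p_full / p_self)
    return te

# Toy check: x copies y with a one-step lag, so information flows from y to x.
rng = np.random.default_rng(42)
y = rng.integers(0, 2, size=100_000)
x = np.empty_like(y)
x[0] = 0
x[1:] = y[:-1]                      # x_{t+1} = y_t
print(transfer_entropy(x, y))       # close to 1 bit
print(transfer_entropy(y, x))       # close to 0 bits

Continuous-valued data, such as the neural and financial series mentioned above, typically require kernel or nearest-neighbour estimators rather than simple counting; the sketch is only meant to make the definition concrete.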