Bits from Biology for Computational Intelligence

@article{Wibral2014BitsFB,
  title={Bits from Biology for Computational Intelligence},
  author={Michael Wibral and Joseph T. Lizier and Viola Priesemann},
  journal={ArXiv},
  year={2014},
  volume={abs/1412.0291}
}
Computational intelligence is broadly defined as biologically-inspired computing. Usually, inspiration is drawn from neural systems. This article shows how to analyze neural systems using information theory to obtain constraints that help identify the algorithms run by such systems and the information they represent. Algorithms and representations identified information-theoretically may then guide the design of biologically inspired computing systems (BICS). The material covered includes the… 
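The measures this kind of analysis rests on are the standard Shannon quantities. As a quick orientation (these are textbook definitions consistent with the paper's framing, not an excerpt from it), entropy, mutual information, and transfer entropy are

    H(X) = -\sum_x p(x)\,\log_2 p(x)

    I(X;Y) = \sum_{x,y} p(x,y)\,\log_2 \frac{p(x,y)}{p(x)\,p(y)}

    T_{Y \to X} = \sum p\!\left(x_{n+1}, x_n^{(k)}, y_n\right)\,\log_2 \frac{p\!\left(x_{n+1} \mid x_n^{(k)}, y_n\right)}{p\!\left(x_{n+1} \mid x_n^{(k)}\right)}

where x_n^{(k)} denotes the length-k past of X. These are the quantities that the citing papers and references below build on.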

Citations

Can Transfer Entropy Infer Information Flow in Neuronal Circuits for Cognitive Processing?
TLDR
This work tests how often cryptographic logic emerges in an evolutionary process that generates artificial neural circuits for two fundamental cognitive tasks (motion detection and sound localization), and suggests that transfer entropy will sometimes fail to infer directed information flow when it exists, and will sometimes suggest a causal connection where there is none.
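The failure mode described in this entry can be reproduced with a toy XOR circuit: if a target is driven by the XOR of two independent binary sources, the pairwise (apparent) transfer entropy from either source is near zero, while conditioning on the other source recovers the full bit. The sketch below is illustrative only, using plug-in estimators on synthetic data; it is not code from either paper, and all names are invented for the example:

    import numpy as np
    from collections import Counter

    def cond_mutual_info(x, y, z):
        """Plug-in estimate of I(X;Y|Z) in bits from discrete samples."""
        n = len(x)
        n_xyz, n_xz = Counter(zip(x, y, z)), Counter(zip(x, z))
        n_yz, n_z = Counter(zip(y, z)), Counter(z)
        return sum((c / n) * np.log2(c * n_z[zi] / (n_xz[(xi, zi)] * n_yz[(yi, zi)]))
                   for (xi, yi, zi), c in n_xyz.items())

    rng = np.random.default_rng(0)
    T = 100_000
    x = rng.integers(0, 2, T)              # hidden driver 1
    y = rng.integers(0, 2, T)              # hidden driver 2
    z = np.zeros(T, dtype=int)
    z[1:] = x[:-1] ^ y[:-1]                # Z(n+1) = X(n) XOR Y(n)

    # Apparent TE X -> Z with history length 1: I(Z_{n+1}; X_n | Z_n),
    # close to 0 bits even though X causally drives Z.
    te_apparent = cond_mutual_info(z[1:].tolist(), x[:-1].tolist(), z[:-1].tolist())

    # Conditioning on the other source as well: I(Z_{n+1}; X_n | Z_n, Y_n),
    # close to 1 bit.
    joint_past = list(zip(z[:-1].tolist(), y[:-1].tolist()))
    te_conditional = cond_mutual_info(z[1:].tolist(), x[:-1].tolist(), joint_past)

    print(f"apparent TE X->Z:    {te_apparent:.3f} bits")
    print(f"conditional TE X->Z: {te_conditional:.3f} bits")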
Quantifying Information Modification in Developing Neural Networks via Partial Information Decomposition
TLDR
It is found that information modification rose with maturation but ultimately collapsed when redundant information among neurons took over, indicating that this developing neural system initially built up intricate processing capabilities but ultimately displayed information processing that was highly similar across neurons, possibly due to a lack of external inputs.
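Partial information decomposition (PID), used in this entry and in several references below, splits the information that two sources S1 and S2 carry about a target T into redundant, unique, and synergistic parts; information modification is then typically identified with the synergistic term. The bookkeeping relations below are the standard Williams-Beer decomposition, stated here for orientation rather than quoted from the cited paper:

    I(T; S_1, S_2) = R + U_1 + U_2 + S
    I(T; S_1) = R + U_1, \qquad I(T; S_2) = R + U_2

Here R is redundancy, U_1 and U_2 are the unique contributions, and S is synergy; these identities leave one degree of freedom, so an additional redundancy (or synergy) measure must be chosen to pin down the individual terms.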
Self-organization of information processing in developing neuronal networks
TLDR
These results demonstrate experimentally, for the first time, that approaching criticality with maturation goes hand in hand with increasing processing capabilities.
Contextual Modulation in Mammalian Neocortex is Asymmetric
TLDR
It is shown that contextual modulation is fundamentally asymmetric, contrasts with all four simple arithmetic operators, can take various forms, and can occur together with the anatomical asymmetry that defines pyramidal neurons in mammalian neocortex.
A Method to Present and Analyze Ensembles of Information Sources
TLDR
This work demonstrates how an ensemble built of mutual information connections can be compared to null surrogate data to determine whether the ensemble is significantly different from noise, and shows how two ensembles can be compared using a randomization process to determine whether the sources in one contain more information than those in the other.
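The surrogate comparison described here follows a common recipe: estimate mutual information on the real pairing, repeatedly shuffle one side to destroy the dependence while preserving the marginals, and ask where the real value falls within the shuffled null distribution. A minimal sketch of that recipe for discrete data (not the authors' code; names are illustrative):

    import numpy as np
    from collections import Counter

    def mutual_info(x, y):
        """Plug-in estimate of I(X;Y) in bits from discrete samples."""
        n = len(x)
        n_xy, n_x, n_y = Counter(zip(x, y)), Counter(x), Counter(y)
        return sum((c / n) * np.log2(c * n / (n_x[xi] * n_y[yi]))
                   for (xi, yi), c in n_xy.items())

    def surrogate_test(x, y, n_surrogates=1000, seed=0):
        """Return observed MI and the fraction of shuffled surrogates reaching it."""
        rng = np.random.default_rng(seed)
        observed = mutual_info(x, y)
        null = [mutual_info(x, rng.permutation(y)) for _ in range(n_surrogates)]
        p_value = (np.sum(np.array(null) >= observed) + 1) / (n_surrogates + 1)
        return observed, p_value

    # Example: y copies x on roughly 70% of samples, otherwise it is noise.
    rng = np.random.default_rng(1)
    x = rng.integers(0, 2, 500)
    y = np.where(rng.random(500) < 0.7, x, rng.integers(0, 2, 500))
    mi, p = surrogate_test(x, y)
    print(f"MI = {mi:.3f} bits, surrogate p = {p:.3f}")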
High-Degree Neurons Feed Cortical Computations
TLDR
These are the first results to show that the extent to which a neuron modifies incoming information streams depends on its topological location in the surrounding functional network.
Partial information decomposition as a unified approach to the characterization and design of neural goal functions
TLDR
It is shown how to better understand existing neural goal functions using PID measures, and an information-theoretic framework is provided for the design of novel goal functions for artificial neural networks.
Control of criticality and computation in spiking neuromorphic networks with plasticity
TLDR
A relation between criticality, task performance, and information-theoretic fingerprint is demonstrated in a spiking neuromorphic network with synaptic plasticity, and an understanding is provided of how the collective network state should be tuned to task requirements.
Information processing dynamics in neural networks of macaque cerebral cortex reflect cognitive state and behavior
TLDR
It is found that different brain regions processed information in dynamic and flexible ways, with signals flowing up and down the sensory-motor hierarchy depending on the demands of the moment, which shows how “computation” in the brain can reflect complex behaviors and cognitive states.
Transitions in brain-network level information processing dynamics are driven by alterations in neural gain
TLDR
The results suggest that the modulation of neural gain via the ascending arousal system may fundamentally alter the information processing mode of the brain, which in turn has important implications for understanding the biophysical basis of cognition.
...
...

References

Showing 1-10 of 160 references
JIDT: An Information-Theoretic Toolkit for Studying the Dynamics of Complex Systems
  • J. Lizier • Computer Science • Front. Robot. AI • 2014
TLDR
The Java Information Dynamics Toolkit (JIDT) is introduced: a Google Code project which provides a standalone, open-source (GNU GPL v3 licensed) implementation for empirical estimation of information-theoretic measures from time-series data.
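JIDT itself is a Java library, but it is commonly driven from Python through JPype, as in the demo scripts bundled with the toolkit. The fragment below follows the pattern of those demos for a discrete transfer entropy calculation; the jar path is a placeholder, and class and method names should be checked against the JIDT documentation for the installed version:

    import random
    from jpype import startJVM, getDefaultJVMPath, JPackage, JArray, JInt

    # Start the JVM with the JIDT jar on the classpath (path is a placeholder).
    jar_location = "infodynamics.jar"
    startJVM(getDefaultJVMPath(), "-ea", "-Djava.class.path=" + jar_location)

    # Binary source that drives the destination with a one-step lag.
    source = [random.randint(0, 1) for _ in range(1000)]
    dest = [0] + source[:-1]

    # Discrete transfer entropy calculator: alphabet size 2, destination history 1.
    TeCalc = JPackage("infodynamics.measures.discrete").TransferEntropyCalculatorDiscrete
    te_calc = TeCalc(2, 1)
    te_calc.initialise()
    te_calc.addObservations(JArray(JInt, 1)(source), JArray(JInt, 1)(dest))
    print("TE(source -> dest) = %.3f bits" % te_calc.computeAverageLocalOfObservations())

This mirrors the binary-data transfer entropy demo shipped with JIDT; for continuous-valued recordings the toolkit also provides Kraskov-type estimators under infodynamics.measures.continuous.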
The local information dynamics of distributed computation in complex systems
TLDR
A complete information-theoretic framework to quantify operations on information (storage, transfer, and modification), and in particular their dynamics in space and time, is applied to cellular automata and delivers important insights into the fundamental nature of distributed computation and the dynamics of complex systems.
Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations
TLDR
A new computational model for real-time computing on time-varying input that provides an alternative to paradigms based on Turing machines or attractor neural networks, based on principles of high-dimensional dynamical systems in combination with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry.
Towards a synergy-based approach to measuring information modification
TLDR
This work outlines how a recently introduced axiomatic framework for measuring information redundancy and synergy, called partial information decomposition, can be applied to a perspective of distributed computation in order to quantify component operations on information.
Local active information storage as a tool to understand distributed neural information processing
TLDR
It is suggested that LAIS will be a useful quantity for testing theories of cortical function, such as predictive coding, when measured on a local scale in time and space, as done here in voltage-sensitive dye imaging data from area 18 of the cat.
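For reference, local active information storage (LAIS) is the pointwise mutual information between a process's embedded past and its next value, and it averages back to the active information storage. The standard definition (not quoted from the paper) is

    a_X(n+1) = \log_2 \frac{p\!\left(x_{n+1} \mid x_n^{(k)}\right)}{p(x_{n+1})},
    \qquad
    A_X = \left\langle a_X(n+1) \right\rangle_n = I\!\left(X_n^{(k)}; X_{n+1}\right)

where x_n^{(k)} is the length-k embedded past of the process.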
A free energy principle for the brain
Measuring the Dynamics of Information Processing on a Local Scale in Time and Space
TLDR
This chapter reviews the mathematics of how to measure local entropy and mutual information values at specific observations of time-series processes, and describes how these measures can reveal much more intricate details about the dynamics of complex systems than the better-known “average” measures do.
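The local (pointwise) quantities referred to here assign an information value to every individual observation and average back to the familiar Shannon measures. As a quick orientation (standard definitions, not an excerpt from the chapter):

    h(x_n) = -\log_2 p(x_n), \qquad i(x_n; y_n) = \log_2 \frac{p(x_n \mid y_n)}{p(x_n)}

with \langle h(x_n) \rangle = H(X) and \langle i(x_n; y_n) \rangle = I(X;Y); unlike the averages, local mutual information can be negative, flagging observations where one variable is misleading about the other.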
How to measure the information gained from one symbol.
TLDR
This work proposes an alternative measure for the information per observation, proves that it is the only definition that satisfies additivity, and shows that it allows additional interpretation of several published results, suggesting that the neurons studied are operating far from their information capacity.
Emergence of Glider-like Structures in a Modular Robotic System
TLDR
The first known application of a direct measure of information transfer, transfer entropy, as a fitness function to evolve a self-organized multi-agent system is reported, with glider-like structures emerging in the fittest snakebot of the final generation.
The Bayesian brain: the role of uncertainty in neural coding and computation
...
...