From Neuroelectrodynamics to Thinking Machines

@article{Aur2011FromNT,
  title={From Neuroelectrodynamics to Thinking Machines},
  author={Dorian Aur},
  journal={Cognitive Computation},
  year={2011},
  volume={4},
  pages={4-12}
}
  • D. Aur
  • Published 1 March 2012
  • Biology, Computer Science, Psychology
  • Cognitive Computation
Natural systems can provide excellent solutions for building artificial intelligent systems. The brain represents the best model of computation that leads to general intelligent action. However, current mainstream models reflect a weak understanding of the computations performed in the brain, which translates into a failure to build powerful thinking machines. Specifically, temporal reductionist neural models elude the complexity of information processing, since spike timing models reinforce the idea…
Reply to Comments on Neuroelectrodynamics: Where are the Real Conceptual Pitfalls?
TLDR
The paper associates the general failure to build intelligent thinking machines with current reductionist principles of temporal coding and advocates for a change in paradigm regarding the brain analogy.
Can we build a conscious machine?
TLDR
Since, with training, meaningful information accumulates and is electrically integrated in the brain, one can predict that this gradual process of training will trigger a tipping point for conscious experience to emerge in the hybrid system.
Computing by physical interaction in neurons.
TLDR
The main conceptual idea is that, under the influence of electric fields, efficient computation by interaction occurs between charge densities embedded within molecular structures and the transient flow of electrical charges that develops.
Comments on Aur’s “From Neuroelectrodynamics to Thinking Machines”
TLDR
Dorian Aur announces a new approach to understanding neural computation, so far unnoticed by neurophysiologists, in which meaningful patterns are built upon spike directivity vectors that quantify the transient charge densities arising during the action potential.
State of the Art: Mathematical Approaches in Brain Science
This chapter explores some of the most relevant mathematical structures used by brain researchers to unravel the brain's structure, function, and dynamics. It starts by looking into the concept of brain
On Having No Head: Cognition throughout Biological Systems
TLDR
The study of cognitive processes implemented in aneural contexts is a fascinating, highly interdisciplinary topic that has many implications for evolution, cell biology, regenerative medicine, computer science, and synthetic bioengineering.
Reply to “Comments on Aur’s From Neuroelectrodynamics to Thinking Machines”
The paper [1] relates the general failure to build intelligent thinking machines to reductionist models and suggests a change in paradigm regarding the brain analogy. With the main focus on temporal
A Spiking Neuron as Information Bottleneck
TLDR
This work proposes a simple learning rule for the weights of spiking neurons, derived from the information bottleneck (IB) framework, that minimizes the loss of relevant information transmitted in the output spike train, and shows that the proposed IB learning rule allows spiking neurons to learn a predictive code, that is, to extract those parts of their input that are predictive of future input.
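The IB rule in this reference is derived specifically for spiking neurons; purely as a minimal illustration of the underlying information-bottleneck objective, the sketch below runs the standard iterative IB equations on a small made-up discrete distribution (the distribution, cluster count, and beta value are assumptions for illustration, not anything taken from the paper):

```python
# Toy illustration of the information-bottleneck objective (not the paper's
# spiking-neuron rule): iterate the standard IB self-consistent equations on a
# small made-up joint distribution p(x, y) and report the two information terms.
import numpy as np

rng = np.random.default_rng(0)

# Made-up joint distribution over 6 input symbols x and 3 relevance labels y.
p_xy = rng.random((6, 3))
p_xy /= p_xy.sum()
p_x = p_xy.sum(axis=1)                # p(x)
p_y_given_x = p_xy / p_x[:, None]     # p(y|x)

def mutual_info(p_ab):
    """I(A;B) in nats from a joint distribution p(a, b)."""
    pa = p_ab.sum(axis=1, keepdims=True)
    pb = p_ab.sum(axis=0, keepdims=True)
    mask = p_ab > 0
    return float((p_ab[mask] * np.log(p_ab[mask] / (pa @ pb)[mask])).sum())

def iterative_ib(n_clusters=3, beta=5.0, n_iter=200):
    """Soft-cluster X into T, trading compression I(X;T) against relevance I(T;Y)."""
    p_t_given_x = rng.random((6, n_clusters))
    p_t_given_x /= p_t_given_x.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        p_t = p_x @ p_t_given_x                              # p(t)
        p_xt = p_t_given_x * p_x[:, None]                    # p(x, t)
        p_y_given_t = (p_xt.T @ p_y_given_x) / p_t[:, None]  # p(y|t)
        # D_KL( p(y|x) || p(y|t) ) for every (x, t) pair.
        kl = (p_y_given_x[:, None, :] *
              np.log(p_y_given_x[:, None, :] / p_y_given_t[None, :, :])).sum(axis=-1)
        p_t_given_x = p_t[None, :] * np.exp(-beta * kl)
        p_t_given_x /= p_t_given_x.sum(axis=1, keepdims=True)
    p_xt = p_t_given_x * p_x[:, None]
    p_ty = p_xt.T @ p_y_given_x
    return mutual_info(p_xt), mutual_info(p_ty)

i_xt, i_ty = iterative_ib()
print(f"compression I(X;T) = {i_xt:.3f} nats, relevance I(T;Y) = {i_ty:.3f} nats")
print(f"reference  I(X;Y)  = {mutual_info(p_xy):.3f} nats (upper bound on I(T;Y))")
```

Raising beta shifts the trade-off toward preserving relevance I(T;Y) at the cost of less compression, which is the same tension the spiking-neuron rule negotiates in the output spike train.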
Greater Than The Sum: Integrated Information In Large Brain Networks
TLDR
This work presents a solution that reduces the computation time for large systems from longer than the timespan of the universe to just several hours, demonstrates that brain connectomes are structured in ways that facilitate high integrated information, and provides the first measurement of integrated information in a real nervous system: the brain of Caenorhabditis elegans.
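The fast algorithm itself is the authors'; purely as a toy picture of why measuring integration means comparing a whole system against its parts, the sketch below computes a crude whole-minus-sum temporal mutual information for a hypothetical two-node binary Markov system (the coupling rule and noise level are invented, and the quantity is only a rough proxy, not a proper Phi):

```python
# Toy "whole-minus-sum" interaction measure for a 2-node binary Markov system.
# This is only a crude proxy illustrating the whole-versus-parts comparison
# behind integrated information; it is NOT the paper's algorithm or a real Phi.
import itertools
import numpy as np

eps = 0.1  # per-node noise: probability a node fails to copy its partner's bit

def next_state_prob(past, present):
    """P(present | past) when each node copies the *other* node's previous bit."""
    p = 1.0
    for node in (0, 1):
        target = past[1 - node]
        p *= (1 - eps) if present[node] == target else eps
    return p

states = list(itertools.product((0, 1), repeat=2))
# Joint distribution p(past, present) assuming a uniform distribution over past states.
joint = np.array([[0.25 * next_state_prob(s, t) for t in states] for s in states])

def mutual_info(p_ab):
    pa = p_ab.sum(axis=1, keepdims=True)
    pb = p_ab.sum(axis=0, keepdims=True)
    mask = p_ab > 0
    return float((p_ab[mask] * np.log2(p_ab[mask] / (pa @ pb)[mask])).sum())

i_whole = mutual_info(joint)   # temporal information carried by the whole system

i_parts = 0.0                  # temporal information carried by each node alone
for node in (0, 1):
    marg = np.zeros((2, 2))
    for i, s in enumerate(states):
        for j, t in enumerate(states):
            marg[s[node], t[node]] += joint[i, j]
    i_parts += mutual_info(marg)

print(f"I(whole) = {i_whole:.3f} bits, sum of parts = {i_parts:.3f} bits, "
      f"whole-minus-sum = {i_whole - i_parts:.3f} bits")
```

In this cross-coupled toy, each node alone carries no temporal information while the whole system does, so the whole-minus-sum term is strictly positive; the hard part addressed by the paper is doing such whole-versus-parts comparisons efficiently at the scale of real connectomes.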

References

SHOWING 1-10 OF 72 REFERENCES
Towards an artificial brain.
Resonate-and-fire neurons
A Spiking Neuron as Information Bottleneck
TLDR
This work proposes a simple learning rule for the weights of spiking neurons, derived from the information bottleneck (IB) framework, that minimizes the loss of relevant information transmitted in the output spike train, and shows that the proposed IB learning rule allows spiking neurons to learn a predictive code, that is, to extract those parts of their input that are predictive of future input.
Towards an integrative theory of cognition.
TLDR
Integrative neural modeling is shown to be an important methodology for analyzing the response activities of functional imaging studies and for elucidating the relationship between brain structure, function, and behavior.
Neuronal spatial learning
TLDR
Analyzing spatial spike propagation in expert medium spiny neurons with the charge movement model shows that electrical flow has a directionality that becomes organized with behavioral learning, implying that neurons within a network may behave as “weak learners” attending to preferred spatial directions.
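The charge-movement model used here to extract spike directivity is the authors' own; purely as a geometric toy (the tetrode tip coordinates, waveform amplitudes, and centroid-shift weighting below are all invented for illustration and are not the published method), one can picture a directivity-like vector as the shift of the amplitude-weighted centroid of a spike between its early and late phases across the four tetrode channels:

```python
# Toy geometric picture of a "spike directivity" vector (NOT the published
# charge-movement model): track how the amplitude-weighted centroid of a spike
# moves across the four tetrode tips between the early and late phase of the
# waveform. Tip coordinates and amplitudes below are invented for illustration.
import numpy as np

# Hypothetical tetrode tip positions in micrometres.
tips = np.array([[0.0, 0.0, 0.0],
                 [25.0, 0.0, 0.0],
                 [0.0, 25.0, 0.0],
                 [12.5, 12.5, 20.0]])

# Peak spike amplitude on each channel (uV) during the early (depolarization)
# and late (repolarization) phase of one extracellular action potential.
amp_early = np.array([80.0, 35.0, 20.0, 15.0])
amp_late = np.array([30.0, 60.0, 25.0, 40.0])

def centroid(amplitudes):
    """Amplitude-weighted centroid of activity over the four tips."""
    w = amplitudes / amplitudes.sum()
    return w @ tips

directivity = centroid(amp_late) - centroid(amp_early)
directivity /= np.linalg.norm(directivity)   # keep only the direction
print("toy spike directivity (unit vector):", np.round(directivity, 3))
```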
Where is the ‘Jennifer Aniston neuron’?
TLDR
A new paradigm regarding the neural code is confirmed, in which information processing, computation, and memory formation in the brain can be explained in terms of the dynamics and interaction of electric charges.
Reading the Neural Code: What do Spikes Mean for Behavior?
TLDR
The present study reveals the existence of an intrinsic spatial code within neuronal spikes that predicts behavior: a hidden feature that captures the semantics of each spike and, in the current experiment, predicts the correct turn the animal will subsequently make to obtain a reward.
Spike timing-dependent plasticity: a Hebbian learning rule.
TLDR
This work has examined the functional consequences of STDP directly in an increasing number of neural circuits in vivo and revealed several layers of complexity in STDP, including its dependence on dendritic location, the nonlinear integration of synaptic modification induced by complex spike trains, and the modulation of STDP by inhibitory and neuromodulatory inputs.
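For reference, the standard pair-based STDP window (exponential potentiation when the presynaptic spike precedes the postsynaptic spike, exponential depression for the reverse ordering) can be sketched as follows; the amplitudes and time constants are generic textbook values, not parameters from any experiment in the review:

```python
# Minimal pair-based STDP rule: potentiate when the presynaptic spike precedes
# the postsynaptic spike, depress otherwise, with exponentially decaying windows.
# Amplitudes and time constants are generic textbook values, not fitted data.
import math

A_PLUS, A_MINUS = 0.01, 0.012     # potentiation / depression amplitudes
TAU_PLUS, TAU_MINUS = 20.0, 20.0  # time constants in ms

def stdp_dw(t_pre, t_post):
    """Weight change for a single pre/post spike pair (times in ms)."""
    dt = t_post - t_pre
    if dt > 0:      # pre before post -> long-term potentiation
        return A_PLUS * math.exp(-dt / TAU_PLUS)
    elif dt < 0:    # post before pre -> long-term depression
        return -A_MINUS * math.exp(dt / TAU_MINUS)
    return 0.0

def apply_stdp(weight, pre_spikes, post_spikes, w_min=0.0, w_max=1.0):
    """All-to-all pairing of one pre- and one postsynaptic spike train."""
    for t_pre in pre_spikes:
        for t_post in post_spikes:
            weight += stdp_dw(t_pre, t_post)
    return min(max(weight, w_min), w_max)

w = apply_stdp(0.5, pre_spikes=[10.0, 30.0, 52.0], post_spikes=[12.0, 35.0, 50.0])
print(f"updated weight: {w:.4f}")
```

The complexities listed in the TLDR (dendritic location, nonlinear spike-train integration, neuromodulation) are precisely what this minimal pair-based window leaves out.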
Population Encoding With Hodgkin–Huxley Neurons
  • A. Lazar
  • Biology, Computer Science
    IEEE Transactions on Information Theory
  • 2010
TLDR
The recovery of (weak) stimuli encoded with a population of Hodgkin-Huxley neurons is investigated and formulated as a spline interpolation problem in the space of finite-length bounded-energy signals.
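The population encoding and spline-based recovery machinery belongs to this reference; independently of it, the underlying Hodgkin-Huxley model is standard, and a minimal single-neuron forward-Euler simulation with the classic 1952 squid-axon parameters looks roughly like this (the drive current and integration step are arbitrary choices):

```python
# Minimal single-compartment Hodgkin-Huxley neuron driven by a constant current,
# integrated with forward Euler. Parameters are the classic squid-axon values;
# this is only the standard model, not the paper's population encoding scheme.
import numpy as np

C_m = 1.0                               # membrane capacitance, uF/cm^2
g_Na, g_K, g_L = 120.0, 36.0, 0.3       # max conductances, mS/cm^2
E_Na, E_K, E_L = 50.0, -77.0, -54.387   # reversal potentials, mV

def alpha_beta(V):
    """Voltage-dependent rate constants for the gating variables n, m, h."""
    a_n = 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
    b_n = 0.125 * np.exp(-(V + 65.0) / 80.0)
    a_m = 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
    b_m = 4.0 * np.exp(-(V + 65.0) / 18.0)
    a_h = 0.07 * np.exp(-(V + 65.0) / 20.0)
    b_h = 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
    return a_n, b_n, a_m, b_m, a_h, b_h

def simulate(I_ext=10.0, T=50.0, dt=0.01):
    """Membrane potential trace (mV) for a constant input current (uA/cm^2)."""
    V, n, m, h = -65.0, 0.317, 0.053, 0.596   # approximate resting state
    trace = []
    for _ in range(int(T / dt)):
        a_n, b_n, a_m, b_m, a_h, b_h = alpha_beta(V)
        I_Na = g_Na * m**3 * h * (V - E_Na)
        I_K = g_K * n**4 * (V - E_K)
        I_L = g_L * (V - E_L)
        V += dt * (I_ext - I_Na - I_K - I_L) / C_m
        n += dt * (a_n * (1.0 - n) - b_n * n)
        m += dt * (a_m * (1.0 - m) - b_m * m)
        h += dt * (a_h * (1.0 - h) - b_h * h)
        trace.append(V)
    return np.array(trace)

v = simulate()
spikes = int((np.diff((v > 0).astype(int)) == 1).sum())
print(f"peak membrane potential: {v.max():.1f} mV, spikes in 50 ms: {spikes}")
```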
A carbon nanotube implementation of temporal and spatial dendritic computations
TLDR
This work presents a neural dendritic computational circuit design that demonstrates linear, superlinear, and sublinear summation of both spatially and temporally separated EPSPs.
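The circuit itself is nanotube hardware; purely as a numerical picture of what linear, sublinear, and superlinear summation of EPSPs mean, consider the sketch below (the alpha-function EPSP shape and the two toy nonlinearities are invented for illustration, not the circuit's measured transfer function):

```python
# Numeric illustration of linear, sublinear and superlinear summation of two
# EPSPs. The alpha-function EPSP and the two toy nonlinearities are invented
# for illustration; they do not describe the nanotube circuit's behaviour.
import numpy as np

t = np.arange(0.0, 60.0, 0.1)   # time in ms

def epsp(t_onset, amplitude=5.0, tau=5.0):
    """Alpha-function EPSP (mV) starting at t_onset."""
    s = np.clip(t - t_onset, 0.0, None)
    return amplitude * (s / tau) * np.exp(1.0 - s / tau)

a, b = epsp(10.0), epsp(12.0)                 # two nearly coincident inputs
linear = a + b                                # passive summation
sublinear = 12.0 * np.tanh((a + b) / 12.0)    # saturating dendrite
superlinear = a + b + 0.15 * a * b            # cooperative, spike-like boost

for name, v in [("linear", linear), ("sublinear", sublinear),
                ("superlinear", superlinear)]:
    print(f"{name:11s} peak depolarization: {v.max():.2f} mV")
```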