Artificial Neural Networks

@article{Katz2010ArtificialNN,
  title={Artificial Neural Networks},
  author={William T. Katz and J. W. Snell and Michael B. Merickel},
  journal={Methods in enzymology},
  year={2010},
  volume={210},
  pages={610-36}
}
Such mappings have been found to exist, for example in the perception process of the human eye, where properties of an image are mapped directly to an area of the brain (Figure 1). A form of cognition can thus be simulated by replicating the behaviour of the brain and its massively parallel architecture. Some artificial neural networks attempt to recreate this by mapping complex inputs into a lower-dimensional space.
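To make the idea concrete, here is a minimal sketch (not the construction discussed in the article) of mapping inputs into a lower-dimensional space with a linear autoencoder trained by gradient descent; all dimensions and names are illustrative assumptions.

import numpy as np

# Minimal sketch: a linear autoencoder that maps 10-dimensional inputs
# onto a 2-dimensional code, illustrating the idea of projecting complex
# inputs into a lower-dimensional space. Sizes and data are illustrative.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))                 # toy input patterns

W_enc = rng.normal(scale=0.1, size=(10, 2))    # encoder weights
W_dec = rng.normal(scale=0.1, size=(2, 10))    # decoder weights
lr = 0.01

for _ in range(200):
    Z = X @ W_enc                              # low-dimensional code
    X_hat = Z @ W_dec                          # reconstruction
    err = X_hat - X                            # reconstruction error
    # gradients of the mean squared reconstruction error
    grad_dec = Z.T @ err / len(X)
    grad_enc = X.T @ (err @ W_dec.T) / len(X)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

codes = X @ W_enc                              # 2-D representation of each input
print(codes.shape)                             # (500, 2)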
Künstliche neuronale Netze zur Steuerung von Heimbeatmungsgeräten [Artificial neural networks for controlling home ventilators]
Evaluation of artificial neural networks (ANNs) regarding their reliability in directing ventilator settings for home-ventilated patients, and how this affects applicability, requires further testing in a real patient setting.
Review: Neural networks to formulate special fats
Neural networks are a branch of artificial intelligence based on the structure and functioning of biological systems, whose main characteristic is the ability to learn and
Comparison of Chemical-Biological Flocculation Process Model Based on Artificial Neural Network
Based on experimental research on a pilot unit of the chemical-biological flocculation process, a multi-input multi-output (MIMO) model and a multi-input single-output (MISO) model have been
A normalized plot as a novel and time-saving tool in complex enzyme kinetic analysis.
A new data treatment is described for designing kinetic experiments and analysing kinetic results for multi-substrate enzymes; it reduces the amount of data necessary for a proper description of the system to less than half.
Predictive modeling of an industrial UASB reactor using NARX neural network
A NARX (Nonlinear Autoregressive with Exogenous Input) neural network model of an industrial UASB reactor was developed in this research work. A total of 111 days' data were used for the modeling process,
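As a rough illustration of the NARX structure (not the industrial UASB model itself), the sketch below builds lagged outputs and lagged exogenous inputs as regressors and fits a small feedforward network; the synthetic data and the use of scikit-learn's MLPRegressor are assumptions made only for the example.

import numpy as np
from sklearn.neural_network import MLPRegressor

# Minimal NARX-style sketch: predict y(t) from lagged outputs y(t-1), y(t-2)
# and lagged exogenous inputs u(t-1), u(t-2), with a small feedforward
# network as the nonlinear map. The time series here is synthetic.
rng = np.random.default_rng(1)
T = 300
u = rng.uniform(-1, 1, size=T)                       # exogenous input
y = np.zeros(T)
for t in range(2, T):
    y[t] = 0.6 * y[t-1] - 0.2 * y[t-2] + np.tanh(u[t-1]) + 0.05 * rng.normal()

lags = 2
X = np.column_stack([y[1:-1], y[:-2], u[1:-1], u[:-2]])   # lagged regressors
target = y[lags:]                                          # one step ahead

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X, target)
print("one-step-ahead R^2:", model.score(X, target))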

References

Showing 1-10 of 137 references
Constructing associative memories using neural networks
This paper proposes using a generalized Hopfield model [1], also known as the McCulloch-Pitts model [2], as an associative memory, and determines the information capacity of this generalized model.
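For orientation, a Hopfield-style associative memory with Hebbian (outer-product) storage might look like the following minimal sketch; it illustrates the general idea rather than the generalized model analyzed in the paper, and all sizes are illustrative.

import numpy as np

# Minimal Hopfield-style associative memory: patterns are stored with a
# Hebbian outer-product rule and recalled by iterating the threshold
# update until the state stops changing.
rng = np.random.default_rng(2)
n = 64                                           # number of bipolar units
patterns = rng.choice([-1, 1], size=(3, n))      # stored memories

W = np.zeros((n, n))
for p in patterns:
    W += np.outer(p, p)                          # Hebbian storage
np.fill_diagonal(W, 0)
W /= n

def recall(probe, steps=20):
    s = probe.copy()
    for _ in range(steps):
        s_new = np.where(W @ s >= 0, 1, -1)      # synchronous threshold update
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

# corrupt a stored pattern and recover it
noisy = patterns[0].copy()
flip = rng.choice(n, size=8, replace=False)
noisy[flip] *= -1
print("overlap after recall:", recall(noisy) @ patterns[0] / n)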
The perceptron: a probabilistic model for information storage and organization in the brain.
This article will be concerned primarily with the second and third questions, which are still subject to a vast amount of speculation, and where the few relevant facts currently supplied by neurophysiology have not yet been integrated into an acceptable theory.
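A minimal sketch of the perceptron learning rule on toy, linearly separable data follows; it is illustrative only and is not Rosenblatt's original formulation or data.

import numpy as np

# Minimal perceptron sketch: update the weights only on misclassified
# examples until the toy data are separated.
rng = np.random.default_rng(3)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + 0.5 * X[:, 1] > 0, 1, -1)    # separable labels

w = np.zeros(2)
b = 0.0
for _ in range(20):                                 # epochs
    for xi, yi in zip(X, y):
        if yi * (w @ xi + b) <= 0:                  # misclassified
            w += yi * xi                            # perceptron update
            b += yi

pred = np.where(X @ w + b > 0, 1, -1)
print("training accuracy:", (pred == y).mean())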
An additional hidden unit test for neglected nonlinearity in multilayer feedforward networks
  • H. White
  • Computer Science
    International 1989 Joint Conference on Neural Networks
  • 1989
The author presents a statistical test of the hypothesis that a given multilayer feedforward network exactly represents some unknown mapping subject to inherent noise against the alternative that the
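A rough sketch in the spirit of such a test is given below, assuming a linear null model, randomly weighted hidden-unit activations as test directions, and an n·R² chi-square statistic; this is an illustration of the idea, not the exact procedure from the paper.

import numpy as np
from scipy import stats

# Sketch of a neglected-nonlinearity check: fit a linear null model, then
# test whether hidden-unit activations with random weights explain the
# residuals. Under the null, n * R^2 from the auxiliary regression is
# approximately chi-square with q degrees of freedom.
rng = np.random.default_rng(4)
n, q = 400, 3
x = rng.uniform(-2, 2, size=(n, 2))
y = x[:, 0] - 0.5 * x[:, 1] + 0.8 * np.tanh(2 * x[:, 0] * x[:, 1]) + 0.1 * rng.normal(size=n)

X_lin = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X_lin, y, rcond=None)
resid = y - X_lin @ beta                           # residuals of the linear fit

gamma = rng.normal(size=(3, q))                    # random hidden-unit weights
H = np.tanh(X_lin @ gamma)                         # q extra "hidden unit" activations

Z = np.column_stack([X_lin, H])                    # auxiliary regression
coef, *_ = np.linalg.lstsq(Z, resid, rcond=None)
fitted = Z @ coef
r2 = 1 - np.sum((resid - fitted) ** 2) / np.sum((resid - resid.mean()) ** 2)
stat = n * r2
print("test statistic:", stat, "p-value:", 1 - stats.chi2.cdf(stat, df=q))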
A Growing Neural Gas Network Learns Topologies
An incremental network model is introduced which is able to learn the important topological relations in a given set of input vectors by means of a simple Hebb-like learning rule. In contrast to
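The core Hebb-like idea, connecting the two units nearest to each input, can be sketched as follows; this simplified version keeps the reference vectors fixed and omits the unit-insertion and adaptation steps of the full incremental algorithm, so it is only a rough illustration.

import numpy as np

# Simplified sketch of Hebb-like topology learning: for every input, the
# two nearest units are connected by an edge, and edges that go unused
# age out. The resulting graph reflects the topology of the input data.
rng = np.random.default_rng(5)

# inputs drawn from a ring-shaped distribution
angles = rng.uniform(0, 2 * np.pi, size=2000)
data = np.column_stack([np.cos(angles), np.sin(angles)]) + 0.05 * rng.normal(size=(2000, 2))

units = rng.uniform(-1.2, 1.2, size=(30, 2))      # fixed reference vectors
age = {}                                          # edge -> age
max_age = 50

for x in data:
    d = np.linalg.norm(units - x, axis=1)
    s1, s2 = np.argsort(d)[:2]                    # two nearest units
    edge = (min(s1, s2), max(s1, s2))
    age[edge] = 0                                 # create / refresh edge (Hebb-like)
    for e in list(age):
        if s1 in e and e != edge:
            age[e] += 1                           # age the winner's other edges
            if age[e] > max_age:
                del age[e]                        # drop stale edges

print("learned edges:", sorted(age))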
Construction of neural nets using the radon transform
The authors present a method for constructing a feedforward neural net implementing an arbitrarily good approximation to any L_2 function over (-1, 1)^n. The net uses n input nodes, a
Approximation theory and feedforward networks
Finding Structure in Time
A proposal along these lines, first described by Jordan (1986), is developed; it involves the use of recurrent links to provide networks with a dynamic memory and suggests a method for representing lexical categories and the type/token distinction.
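A minimal Elman-style sketch of such a dynamic memory follows: the hidden state is copied into context units and fed back as extra input on the next time step, with plain backprop applied at each step (the context treated as fixed input, no backpropagation through time). The toy task and sizes are assumptions for the example.

import numpy as np

# Minimal Elman-style recurrent sketch: context units carry the previous
# hidden state, giving the network a dynamic memory; training is plain
# per-step backprop with the context treated as a constant input.
rng = np.random.default_rng(6)
seq = np.sin(np.linspace(0, 8 * np.pi, 400))       # predict the next value

n_in, n_hid = 1, 8
W1 = rng.normal(scale=0.3, size=(n_hid, n_in + n_hid))   # input + context -> hidden
W2 = rng.normal(scale=0.3, size=(1, n_hid))              # hidden -> output
lr = 0.05

for epoch in range(30):
    h = np.zeros(n_hid)                            # context starts at zero
    for t in range(len(seq) - 1):
        inp = np.concatenate(([seq[t]], h))        # current input + context units
        h = np.tanh(W1 @ inp)                      # new hidden state
        pred = W2 @ h
        err = pred - seq[t + 1]
        grad_W2 = np.outer(err, h)
        grad_h = (W2.T @ err) * (1 - h ** 2)
        grad_W1 = np.outer(grad_h, inp)
        W2 -= lr * grad_W2
        W1 -= lr * grad_W1

print("final squared error:", float(err ** 2))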
...