Multiprocessor And Memory Architecture Of The Neurocomputer Synapse-1

Ulrich Ramacher, W. Raab, Joachim K. Anlauf, Ulrich Hachmann, Jörg Beichter, Nico Brüls, Matthias Wesseling, Elisabeth Sicheneder, Reinhard Männer, Joachim Gläß, Andreas Wurz. International Journal of Neural Systems, 4(4).
A general-purpose neurocomputer, SYNAPSE-1, built around a multiprocessor and memory architecture, is presented. It offers wide flexibility with respect to neural algorithms and a speed-up factor of several orders of magnitude, including for learning. The computational power is provided by a two-dimensional systolic array of neural signal processors (NSPs). Since the weights are stored outside these NSPs, memory size and processing power can be adapted individually to the needs of the application. A neural…
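The key architectural idea above, a grid of processing elements performing multiply-accumulates on weight tiles streamed in from memory held outside the array, can be illustrated with a minimal sketch. This is a generic tiled matrix-vector product, not the actual SYNAPSE-1 instruction set or data flow; the function name and grid layout are illustrative assumptions.

```python
def systolic_mvm(weights, x, grid=2):
    """Sketch of y = W @ x on a grid x grid array of hypothetical
    processing elements, each owning one tile of the weight matrix.
    The weights for a tile are conceptually streamed in from external
    memory, so weight storage scales independently of the PE array."""
    n = len(weights)
    assert n % grid == 0 and len(x) == n
    tile = n // grid
    y = [0.0] * n
    for bi in range(grid):        # PE row index
        for bj in range(grid):    # PE column index
            # multiply-accumulate over this PE's weight tile
            for i in range(bi * tile, (bi + 1) * tile):
                for j in range(bj * tile, (bj + 1) * tile):
                    y[i] += weights[i][j] * x[j]
    return y

W = [[1, 0, 0, 0],
     [0, 2, 0, 0],
     [0, 0, 3, 0],
     [0, 0, 0, 4]]
print(systolic_mvm(W, [1, 1, 1, 1]))  # [1.0, 2.0, 3.0, 4.0]
```

Decoupling the weight store from the PEs, as the abstract notes, is what lets memory size and compute be scaled separately.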

SYNAPSE-1: a high-speed general purpose parallel neurocomputer system

The paper describes the general-purpose neurocomputer SYNAPSE-1, which has been developed in cooperation between Siemens Munich and the University of Mannheim. This system contains one of the most…

Competitive Learning Algorithms and Neurocomputer Architecture

This paper gives an overview of several competitive learning algorithms in artificial neural networks, including self-organizing feature maps, focusing on the properties of these algorithms that matter for hardware implementation, and presents a reconfigurable parallel neurocomputer architecture built from digital signal processing chips and field-programmable gate arrays.
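The algorithms surveyed above share a winner-take-all update: find the prototype closest to the input and move only that winner toward it. The following is a generic one-step sketch of that idea, not the specific algorithms or hardware mapping from the paper.

```python
import math

def competitive_step(prototypes, x, lr=0.1):
    """One winner-take-all competitive-learning step (illustrative):
    pick the prototype nearest to input x (Euclidean distance) and
    move only that winner toward x by learning rate lr."""
    win = min(range(len(prototypes)),
              key=lambda k: math.dist(prototypes[k], x))
    prototypes[win] = [w + lr * (xi - w)
                       for w, xi in zip(prototypes[win], x)]
    return win

protos = [[0.0, 0.0], [10.0, 10.0]]
competitive_step(protos, [1.0, 1.0])  # prototype 0 wins and moves
```

The hardware appeal noted in the abstract comes from this structure: the distance computations across prototypes are independent and map naturally onto parallel DSP or FPGA units.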

Overview of neural hardware

An overview of the current state of the art in neural hardware implementations is presented, with special attention to multiprocessor projects that focus on scalability, flexibility, and adaptivity of the design and thus seem suitable for brain-style (cognitive) processing.

A parallel neurochip for neural networks implementing the reactive tabu search algorithm: application case studies

This work presents two applications implemented on the Totem NC3001 neurocomputer from NeuriCam to test the performance of this parallel unit, which consists of 32 digital signal processors, and to evaluate its suitability for neural network applications.

A Digital VLSI Architecture for Real-World Applications

As the other chapters of this book show, the neural network model has significant advantages over traditional models for certain applications. It has also expanded our understanding of biological…

2 Artificial Neuron Model and Neural Network Structures

An overview of neural network hardware is presented, including two examples of neurohardware, CNAPS and SYNAPSE-1, and some real-world applications of neural network hardware are described in detail.

The Human Brain Project and neuromorphic computing.

This paper summarizes how these objectives will be pursued in the Human Brain Project, which will move away from current "bit-precise" computing models and towards new techniques that exploit the stochastic behavior of simple, reliable, very fast, low-power computing devices embedded in intensely recursive architectures.

Implementing Neural Models in Silicon

  • L. Smith
  • Computer Science
    Handbook of Nature-Inspired and Innovative Computing
  • 2006
This work discusses the technologies involved in incorporating a neural model directly into a system, permitting real-time input and output, and presents some example systems used in computational neuroscience and pattern recognition.

Parallel Environments for Implementing Neural Networks

This paper surveys parallel environments for implementing ANNs, prescribes desired characteristics to look for in such implementations, and describes how to implement ANNs on parallel machines.

A VLSI array processor for neural network algorithms

A chip based on a new scalable parallel systolic VLSI architecture is presented for executing the compute-bound algorithmic primitives used by search and learning algorithms in neural networks and…