Syed Shakib Sarwar

requirements stretch the capabilities of computing platforms. The fundamental components of these neural networks are the neurons and their synapses. The core of a digital hardware neuron consists of a multiplier, an accumulator, and an activation function. Multipliers consume most of the processing energy in digital neurons, and thereby in the hardware…
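
As a rough, purely illustrative sketch (not the design proposed in the paper), the following Python snippet models the digital neuron described above as a multiply-accumulate loop followed by an activation function; the per-input multiplication inside the loop is the operation the abstract identifies as the dominant energy cost. The function names and the choice of ReLU are assumptions made for this example.

    def relu(x: float) -> float:
        # Activation stage; ReLU is used here only for illustration.
        return max(0.0, x)

    def neuron(inputs: list[float], weights: list[float], bias: float = 0.0) -> float:
        # Multiplier + accumulator stage: one multiplication per synapse,
        # which is where most of the processing energy goes in a digital neuron.
        acc = bias
        for x, w in zip(inputs, weights):
            acc += x * w
        return relu(acc)

    # Example: a neuron with three synapses.
    print(neuron([0.5, -1.0, 2.0], [0.8, 0.3, -0.1]))
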
Multilayered artificial neural networks have found widespread utility in classification and recognition applications. The scale and complexity of such networks, together with the inadequacies of general-purpose computing platforms, have led to significant interest in the development of efficient hardware implementations. In this work, we focus on designing…
Artificial neural networks (NNs) have shown significant promise in difficult tasks such as image classification and speech recognition. Even well-optimized hardware implementations of digital NNs show significant power consumption, mainly due to non-uniform pipeline structures and the inherent redundancy of the numerous arithmetic operations that have to be…
Neuromorphic algorithms are being increasingly deployed across the entire computing spectrum, from data centers to mobile and wearable devices, to solve problems involving recognition, analytics, search, and inference. For example, large-scale artificial neural networks (popularly called deep learning) now represent the state of the art in a wide and…
In this paper, we propose a Spin-Torque (ST) based sensing scheme that can enable energy-efficient multi-bit long-distance interconnect architectures. Current-mode interconnects have recently been proposed to overcome the performance degradation associated with conventional voltage-mode copper (Cu) interconnects. However, the performance of current-mode…