Sridhar Narayan

Multilayer perceptron (MLP) networks trained using backpropagation can be slow to converge in many instances. The primary reason for slow learning is the global nature of backpropagation. Another reason is the fact that a neuron in an MLP network functions as a hyperplane separator and is therefore inefficient when applied to classification problems in(More)
Introducing large-scale problems early in the CS1 course has been shown to be an effective way to teach algorithmic concepts. Adopting this approach in a CS1 course taught in Java, however, presents some significant challenges. This paper describes a tool, the Algoritharium, that facilitates the process. The Algoritharium allows CS1 students to explore(More)
Multi-layer Perceptron (MLP) networks function as hyperplane classifiers when applied to classification problems. Therefore, MLP networks can be inefficient when applied to problems in which class boundaries are inadequately modeled by hyperplanes. Attempts to remedy this problem typically necessitate the introduction of a new neural network model in(More)
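The hyperplane behavior described in these abstracts can be illustrated with a minimal sketch (not the papers' actual models): a single MLP-style neuron thresholds a weighted sum, so its decision boundary is the hyperplane w·x + b = 0. The weights below are hand-picked for AND, which is linearly separable; XOR is the classic case that no single such neuron can separate.

```python
import numpy as np

# A single MLP-style neuron: threshold a weighted sum, so the
# decision boundary is the hyperplane w.x + b = 0.
def neuron(x, w, b):
    return 1 if np.dot(w, x) + b > 0 else 0

inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]

# AND is linearly separable: one hyperplane suffices.
w_and, b_and = np.array([1.0, 1.0]), -1.5
print([neuron(x, w_and, b_and) for x in inputs])  # [0, 0, 0, 1]

# XOR is not linearly separable: no single (w, b) gets all four
# points right, which is why purely hyperplane-based units can be
# an inefficient fit for some class boundaries.
```

This is only an illustration of the general limitation the abstracts refer to; the papers' proposed remedies are not shown here.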
Despite advances in high-bandwidth technology, the growing demands for video and graphics information transfer necessitate the development of effective image compression techniques. In recent years, the emergence of neural network technology has led to the development of neural image compression schemes. This paper extends existing neural techniques for(More)
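The neural compression idea mentioned above can be sketched as a bottleneck autoencoder, the standard formulation of neural image compression; the paper's specific scheme is not shown, and the weights below are untrained random stand-ins. Each image block is mapped to a lower-dimensional code (what would be stored or transmitted), then decoded back.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 4x4 "image" blocks, flattened to 16-dimensional vectors.
blocks = rng.random((100, 16))

# Bottleneck encoder/decoder pair, 16 -> 4 -> 16. In a trained
# scheme these matrices are learned; here they are random
# placeholders used only to show the data flow.
W_enc = rng.standard_normal((16, 4)) * 0.1
W_dec = rng.standard_normal((4, 16)) * 0.1

code = blocks @ W_enc    # compressed representation: 4 values per block
recon = code @ W_dec     # reconstructed blocks, same shape as input

print(code.shape, recon.shape)   # (100, 4) (100, 16) -> 4:1 ratio
```

Storing the 4-value code instead of the 16-pixel block gives the compression; reconstruction quality then depends entirely on how the encoder/decoder weights are trained.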