# STRIP - a strip-based neural-network growth algorithm for learning multiple-valued functions

@article{Ngom2001STRIPA, title={STRIP - a strip-based neural-network growth algorithm for learning multiple-valued functions}, author={Alioune Ngom and Ivan Stojmenovic and Veljko M. Milutinovic}, journal={IEEE Transactions on Neural Networks}, year={2001}, volume={12}, number={2}, pages={212--227}}

We consider the problem of synthesizing multiple-valued logic functions by neural networks. A genetic algorithm (GA) which finds the longest strip in V ⊆ Kⁿ is described. A strip contains points located between two parallel hyperplanes. Repeated application of the GA partitions the space V into a certain number of strips, each of them corresponding to a hidden unit. We construct two neural networks based on these hidden units and show that they correctly compute the given but arbitrary…
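The strip-extraction loop described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the paper evolves the separating weight vector with a genetic algorithm, whereas this sketch substitutes plain random search over small integer weight vectors; `longest_strip`, `strip_partition`, and all parameter values are hypothetical names chosen for the example.

```python
import random
from itertools import groupby

def longest_strip(points, values, trials=500, seed=0):
    """Find the longest strip over the given points: a weight vector w and
    thresholds t1 <= t2 such that every point x with t1 <= w.x <= t2 carries
    the same function value. Random search stands in for the paper's GA."""
    rng = random.Random(seed)
    n = len(points[0])
    best = (0, None)  # (strip size, (w, t1, t2, value))
    for _ in range(trials):
        w = tuple(rng.randint(-3, 3) for _ in range(n))
        if not any(w):
            continue
        # Project every point onto w and sort along the projection axis.
        proj = sorted((sum(wi * xi for wi, xi in zip(w, x)), v)
                      for x, v in zip(points, values))
        # Collapse points sharing a projection; a group is usable only if all
        # of its points carry the same value ("pure").
        groups = []
        for p, grp in groupby(proj, key=lambda t: t[0]):
            vals = [v for _, v in grp]
            groups.append((p, vals[0] if len(set(vals)) == 1 else None, len(vals)))
        # The longest run of consecutive pure groups sharing one value is the
        # best strip for this weight vector.
        i = 0
        while i < len(groups):
            v = groups[i][1]
            if v is None:
                i += 1
                continue
            j, size = i, 0
            while j < len(groups) and groups[j][1] == v:
                size += groups[j][2]
                j += 1
            if size > best[0]:
                best = (size, (w, groups[i][0], groups[j - 1][0], v))
            i = j
    return best[1]

def strip_partition(points, values, seed=0):
    """Repeatedly extract the longest strip from the remaining points; each
    strip (w, t1, t2, value) corresponds to one hidden unit."""
    remaining = list(zip(points, values))
    strips = []
    while remaining:
        pts, vals = zip(*remaining)
        w, t1, t2, v = longest_strip(list(pts), list(vals), seed=seed)
        strips.append((w, t1, t2, v))
        remaining = [(x, y) for x, y in remaining
                     if not t1 <= sum(wi * xi for wi, xi in zip(w, x)) <= t2]
    return strips
```

Because each strip is pure only with respect to the points still remaining when it is extracted, the strips behave like an ordered decision list: a point's value is read off the first strip (in extraction order) that contains it.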

## 20 Citations

### Evolutionary strategy for learning multiple-valued logic functions

- Computer Science
- Proceedings. 34th International Symposium on Multiple-Valued Logic
- 2004

An evolutionary strategy which finds the longest strip in V ⊆ Kⁿ is described and neural networks based on these hidden units are constructed.

### The Upper Bound on the Number of Hidden Neurons in Multi-Valued Multi-Threshold Neural Networks

- Computer Science
- 2009 International Workshop on Intelligent Systems and Applications
- 2009

The lower bound on the number of hidden neurons in three-layer multi-valued multi-threshold neural networks for implementation of an arbitrary q-valued function defined on a set of N points in n dimensions (N ≤ qⁿ) is presented.

### Using Three Layer Neural Network to Compute Multi-valued Functions

- Computer Science
- ISNN
- 2007

This paper concerns how to compute multi-valued functions using three-layer feedforward neural networks with one hidden layer. Firstly, we define strongly and weakly symmetric functions. Then we give…

### The Computing Capacity of Three-Input Multiple-Valued One-Threshold Perceptrons

- Computer Science
- Proceedings 30th IEEE International Symposium on Multiple-Valued Logic (ISMVL 2000)
- 2000

An efficient algorithm is obtained for counting the number of k-valued logic functions simulated by a three-input k-valued one-threshold perceptron in three-dimensional space.

### Multi-valued Neural Network Trained by Differential Evolution for Synthesizing Multiple-Valued Functions

- Computer Science
- 2015 2nd International Conference on Information Science and Control Engineering
- 2015

Results indicate that differential evolution is suitable to train MVL networks for synthesizing MVL functions and the optimum window and biasing parameters to be chosen for convergence are derived.

### Using Three Layer Neural Networks to Compute Discrete Real Functions

- Computer Science
- Third International Conference on Natural Computation (ICNC 2007)
- 2007

This paper concerns how to compute discrete real functions using three-layer feedforward neural networks with one hidden layer. Firstly, we define strongly and weakly symmetric real functions. Then…

### Training of a feedforward multiple-valued neural network by error backpropagation with a multilevel threshold function

- Computer Science
- IEEE Trans. Neural Networks
- 2001

A technique for the training of multiple-valued neural networks based on a backpropagation learning algorithm employing a multilevel threshold function is proposed and trials performed on a benchmark problem demonstrate the convergence of the network within the specified range of parameters.

### Topology of Constructive Neural Network Algorithms

- Computer Science
- 2013

This study aims to evaluate constructive neural network algorithms by clustering them; the results are evaluated with respect to the average error calculation method, the topographic product, and Dunn's index.

### On the number of multilinear partitions and the computing capacity of multiple-valued multiple-threshold perceptrons

- Computer Science
- Proceedings 1999 29th IEEE International Symposium on Multiple-Valued Logic (Cat. No.99CB36329)
- 1999

Results on the capacity of a single (n, k, s)-perceptron are obtained for V ⊂ Rⁿ in general position and for V = K².

### Shape reconstruction by genetic algorithms and artificial neural networks

- Computer Science
- 2002

A new surface reconstruction method based on complex form functions, genetic algorithms and neural networks that can be used for CAD model reconstruction of 3D objects and free smooth shape modelling is presented.

## References

Showing 1–10 of 40 references

### A constructive algorithm for binary neural networks: the oil-spot algorithm

- Computer Science
- IEEE Trans. Neural Networks
- 1995

A constructive training algorithm for supervised neural networks that dynamically builds a two-layer neural network by successively incorporating binary examples, based on the representation of the mapping of interest on the binary hypercube of the input space.

### On sequential construction of binary neural networks

- Computer Science
- IEEE Trans. Neural Networks
- 1995

A new technique called sequential window learning (SWL), for the construction of two-layer perceptrons with binary inputs is presented, and the introduction of a new type of neuron, having a window-shaped activation function, considerably increases the convergence speed and the compactness of resulting networks.

### CARVE-a constructive algorithm for real-valued examples

- Computer Science
- IEEE Trans. Neural Networks
- 1998

A constructive neural-network algorithm is presented, which extends the "sequential learning" algorithm of Marchand et al. from Boolean inputs to the real-valued input case, and uses convex hull methods for the determination of the network weights.

### A Growth Algorithm for Neural Network Decision Trees

- Computer Science
- 1990

This paper considers the problem of constructing a tree of perceptrons able to execute a given but arbitrary Boolean function defined on N input bits, and applies a sequential and parallel learning procedure to add hidden units until the task at hand is performed.

### A geometric approach to learning in neural networks

- Computer Science
- International 1989 Joint Conference on Neural Networks
- 1989

A geometric view is presented of how information is processed in feedforward networks of linear threshold units, and a new class of learning algorithms is introduced which provides a suboptimal solution in a polynomial number of steps.

### The Patch algorithm: fast design of binary feedforward neural networks

- Computer Science
- 1993

A new constructive learning algorithm for generating binary neural networks is presented; it handles analogue inputs and problems with multiple output states, and allows training sets of up to several thousand patterns.

### A Fast Partitioning Algorithm and a Comparison of Binary Feedforward Neural Networks

- Computer Science
- 1992

A comparison of several learning algorithms for training feedforward neural networks with linear threshold units is carried out, and a fast method is presented for selecting input patterns that can be identified by a single neuron.

### Learning in feedforward layered networks: the tiling algorithm

- Computer Science
- 1989

A new algorithm is presented that builds a feedforward layered network in order to learn any Boolean function of N Boolean units; the network is grown by adding layers, and units inside a layer, as needed until convergence.