
- Saratha Sathasivam, Wan Ahmad Tajuddin Wan Abdullah
- Computing
- 2011

Intelligent systems can be produced by integrating logic programming with connectionist systems. The radial basis function (RBF) neural network is a commonly used type of feedforward network. In this paper, we propose a method for connectionist model generation that uses an RBF neural network to encode higher-order logic programming. We encode each…
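The RBF forward pass mentioned in the abstract above can be sketched as follows. This is a minimal illustration assuming Gaussian basis functions; the centres, width, and output weights are made-up values, not the paper's trained parameters.

```python
import numpy as np

def rbf_forward(x, centres, width, out_weights):
    """Forward pass of a Gaussian RBF network (illustrative sketch)."""
    # Hidden layer: one Gaussian activation per centre.
    dists = np.linalg.norm(x - centres, axis=1)
    hidden = np.exp(-(dists ** 2) / (2 * width ** 2))
    # Output layer: linear combination of hidden activations.
    return hidden @ out_weights

# Hypothetical centres and weights, for illustration only.
centres = np.array([[0.0, 0.0], [1.0, 1.0]])
out_weights = np.array([0.5, -0.5])
y = rbf_forward(np.array([0.5, 0.5]), centres, width=1.0, out_weights=out_weights)
```

An input equidistant from both centres activates them equally, so with opposite output weights the contributions cancel.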

Knowledge can be gained from experts and specialists in the area of interest, or it can be induced from sets of data. The automatic induction of knowledge from data sets, usually stored in large databases, is called data mining. Data mining methods are important in the management of complex systems. There are many technologies available to data…

- Nawaf Hamadneh, Waqar A. Khan, Saratha Sathasivam, Hong Choon Ong
- PloS one
- 2013

Particle swarm optimization (PSO) is employed to investigate the overall performance of a pin fin. This study examines the effect of governing parameters on overall thermal/fluid performance for different fin geometries, including rectangular plate fins as well as square, circular, and elliptical pin fins. The idea of entropy…
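The PSO procedure named above can be sketched on a toy objective. This is a generic PSO loop, assuming standard inertia/cognitive/social parameters and a simple sphere function; it is not the fin-geometry model or the parameter settings from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def pso_minimise(f, dim, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimisation sketch (illustrative parameters)."""
    pos = rng.uniform(-5, 5, (n_particles, dim))
    vel = np.zeros((n_particles, dim))
    pbest = pos.copy()                                  # personal best positions
    pbest_val = np.array([f(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()            # global best position
    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # Velocity update: inertia + cognitive + social terms.
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([f(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

best, best_val = pso_minimise(lambda x: np.sum(x ** 2), dim=2)
```

In a real thermal-design application, the objective `f` would be replaced by the entropy-generation model of the fin, with `dim` equal to the number of design variables.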

Synaptic weights for neurons in logic programming can be calculated either by Hebbian learning or by Wan Abdullah's method. In other words, Hebbian learning for governing events corresponding to some respective program clauses is equivalent to learning using Wan Abdullah's method for the same program clauses. In this paper we will…

The Little-Hopfield neural network programmed with Horn clauses is studied. We argue that the energy landscape of the system, corresponding to the inconsistency function for logical interpretations of the sets of Horn clauses, has minimal ruggedness. This is supported by computer simulations. Recurrent single-field neural networks are…
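The energy landscape referred to above is, for a Little-Hopfield network, the standard quadratic energy over bipolar states. The sketch below assumes the usual symmetric-weight form with a bias term; the two-neuron weight matrix is an illustrative toy, not a programmed clause set.

```python
import numpy as np

def hopfield_energy(s, W, b):
    """Little-Hopfield energy for bipolar state s, symmetric weights W, bias b."""
    return -0.5 * s @ W @ s - b @ s

# Toy two-neuron network with a positive coupling and no bias.
W = np.array([[0.0, 1.0], [1.0, 0.0]])
b = np.zeros(2)

# Aligned states sit lower in the energy landscape than opposed ones.
e_aligned = hopfield_energy(np.array([1.0, 1.0]), W, b)
e_opposed = hopfield_energy(np.array([1.0, -1.0]), W, b)
```

In the logic-programming setting, `W` and `b` are chosen so that this energy tracks the inconsistency of the interpretation, making consistent models the minima of the landscape.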

- Saratha Sathasivam
- 2008 Fifth International Conference on Computer…
- 2008

There are two ways to calculate synaptic weights for neurons in logic programming: Hebbian learning and Wan Abdullah's method. Hebbian learning for governing events corresponding to some respective program clauses is equivalent to learning using Wan Abdullah's method for the same program clauses. We will evaluate…

In recent studies on artificial intelligence, logic programming occupies a significant position because of its attractive features. Neural networks are dynamic systems in their learning and training phase, and convergence is an essential feature, so researchers developing the models and their learning algorithms need to find a…

The logics of abduction and deduction contribute to our conceptual understanding of a phenomenon, while the logic of induction adds quantitative detail to our conceptual knowledge. In this paper, we look into how these reasoning techniques (abduction, deduction, and induction) are relevant to logic programming in neural networks. Deduction simplifies the…

The two well-known neural networks, the Hopfield network and the radial basis function (RBF) network, have different structures and characteristics: they are among the most commonly used types of feedback and feedforward networks, respectively. This study gives an overview of the Hopfield neural network and the RBF neural…