Isaac E. Lagaris

Gaussian mixture models (GMMs) constitute a well-known type of probabilistic neural network. One of their many successful applications is in image segmentation, where spatially constrained mixture models have been trained using the expectation-maximization (EM) framework. In this letter, we elaborate on this method and propose a new methodology for the …
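As a point of reference, the sketch below segments a grayscale image by fitting a plain GMM to pixel intensities with scikit-learn. It does not include the spatial constraints or the EM variant developed in the letter; it is only the pixel-wise baseline that such models extend, and the synthetic test image is made up for the example.

```python
# Baseline pixel-intensity GMM segmentation (no spatial constraints).
import numpy as np
from sklearn.mixture import GaussianMixture

def gmm_segment(image: np.ndarray, n_classes: int = 3, seed: int = 0) -> np.ndarray:
    """Assign each pixel of a 2-D grayscale image to one of n_classes segments."""
    pixels = image.reshape(-1, 1).astype(float)       # one feature per pixel: intensity
    gmm = GaussianMixture(n_components=n_classes, random_state=seed)
    labels = gmm.fit_predict(pixels)                   # EM fit + hard assignment
    return labels.reshape(image.shape)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.normal(loc=rng.choice([0.2, 0.5, 0.8], size=(64, 64)), scale=0.05)
    seg = gmm_segment(img, n_classes=3)
    print(seg.shape, np.unique(seg))
```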
We present a method to solve initial and boundary value problems using artificial neural networks. A trial solution of the differential equation is written as a sum of two parts. The first part satisfies the initial/boundary conditions and contains no adjustable parameters. The second part is constructed so as not to affect the initial/boundary conditions. …
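A minimal sketch of this trial-solution idea, for the hypothetical test problem y'(x) = -y(x), y(0) = 1: the trial form y_t(x) = 1 + x·N(x, p) satisfies the initial condition by construction, and the network parameters p are tuned so that the ODE residual vanishes at collocation points. The network size, optimizer, and finite-difference derivative below are illustrative choices, not the paper's.

```python
import numpy as np
from scipy.optimize import minimize

H = 8                                # hidden units of a one-hidden-layer tanh network
x = np.linspace(0.0, 1.0, 20)        # collocation points

def net(xs, p):
    """N(x, p): single-hidden-layer perceptron with tanh activations."""
    w, b, v = p[:H], p[H:2*H], p[2*H:]
    return np.tanh(np.outer(xs, w) + b) @ v

def trial(xs, p):
    return 1.0 + xs * net(xs, p)     # first term fixes y(0)=1, second is trainable

def residual(p, h=1e-4):
    """Sum of squared ODE residuals y' + y, derivative by central differences."""
    dy = (trial(x + h, p) - trial(x - h, p)) / (2 * h)
    return np.sum((dy + trial(x, p)) ** 2)

p0 = 0.1 * np.random.default_rng(0).standard_normal(3 * H)
sol = minimize(residual, p0, method="BFGS")
print("max error vs exp(-x):", np.max(np.abs(trial(x, sol.x) - np.exp(-x))))
```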
In this paper, we propose a new methodology for analysis of microarray images. First, a new gridding algorithm is proposed for determining the individual spots and their borders. Then, a Gaussian mixture model (GMM) approach is presented for the analysis of the individual spot images. The main advantages of the proposed methodology are modeling flexibility …
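For illustration only, a simple projection-profile heuristic for the gridding step: sum pixel intensities along rows and columns and place grid lines at the valleys between the resulting peaks. This stands in for (and is not) the gridding algorithm proposed in the paper; the per-spot GMM analysis would then run on each cell of the grid. The synthetic 4x4 spot image is made up for the example.

```python
import numpy as np
from scipy.signal import find_peaks

def grid_lines(image: np.ndarray, axis: int, min_sep: int = 10) -> np.ndarray:
    """Coordinates separating consecutive spots along the given axis (0=rows, 1=cols)."""
    profile = image.sum(axis=1 - axis)           # 1-D intensity profile
    valleys, _ = find_peaks(-profile, distance=min_sep)
    return valleys

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.random((120, 120)) * 0.1
    for cy in range(15, 120, 30):                # synthetic 4x4 spot pattern
        for cx in range(15, 120, 30):
            yy, xx = np.ogrid[:120, :120]
            img[(yy - cy) ** 2 + (xx - cx) ** 2 < 64] += 1.0
    print("row separators:", grid_lines(img, axis=0))
    print("col separators:", grid_lines(img, axis=1))
```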
A novel method for solving ordinary and partial differential equations, based on grammatical evolution, is presented. The method forms generations of trial solutions expressed in an analytical closed form. Several examples are worked out, and in most cases the exact solution is recovered. When the solution cannot be expressed in a closed analytical form, then …
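A hedged sketch of the core mechanism, for the hypothetical test problem y' = y, y(0) = 1: an integer genome is mapped through a tiny BNF-style grammar into a closed-form expression, and its fitness is the ODE residual plus an initial-condition penalty. The grammar, the random search in place of a full evolutionary loop, and all constants are illustrative, not the paper's setup.

```python
import math
import random

GRAMMAR = {
    "<expr>": ["(<expr> + <expr>)", "(<expr> * <expr>)", "<func>(<expr>)", "x", "1.0"],
    "<func>": ["math.sin", "math.cos", "math.exp"],
}

def decode(genome, max_rewrites=40):
    """Map an integer genome to an expression string (None if it fails to terminate)."""
    expr, i = "<expr>", 0
    for _ in range(max_rewrites):
        nts = [(expr.find(nt), nt) for nt in GRAMMAR if nt in expr]
        if not nts:
            return expr
        _, nt = min(nts)                          # leftmost non-terminal
        rules = GRAMMAR[nt]
        expr = expr.replace(nt, rules[genome[i % len(genome)] % len(rules)], 1)
        i += 1
    return None

def fitness(expr, h=1e-4):
    """Squared residuals of y' - y = 0 on a grid, plus a penalty for y(0) != 1."""
    f = lambda x: eval(expr, {"math": math, "x": x})
    try:
        grid = [0.1 * k for k in range(11)]
        res = sum(((f(x + h) - f(x - h)) / (2 * h) - f(x)) ** 2 for x in grid)
        return res + 100.0 * (f(0.0) - 1.0) ** 2
    except (OverflowError, ValueError, ZeroDivisionError):
        return float("inf")

random.seed(0)
candidates = [e for e in (decode([random.randrange(256) for _ in range(16)])
                          for _ in range(500)) if e is not None]
best = min(candidates, key=fitness)
print(best, fitness(best))
```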
A stochastic global optimization method based on Multistart is presented. In it, the local search is applied conditionally, with a probability that takes into account the topology of the objective function at the level of detail offered by the current state of the exploration. As a result, the number of unnecessary local searches is drastically reduced, yielding an …
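A skeleton of a Multistart procedure in which the expensive local search is applied only conditionally. The acceptance test used here, skipping a start point if a nearby already-sampled point had a lower function value, is a simple illustrative placeholder, not the topology-aware probability proposed in the method; the Rastrigin test function and all constants are likewise assumptions for the demo.

```python
import numpy as np
from scipy.optimize import minimize

def multistart(f, bounds, n_samples=200, radius=0.2, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    sampled, minima = [], []                        # sampling and minima histories

    for _ in range(n_samples):
        x = rng.uniform(lo, hi)
        fx = f(x)
        # Conditional start: launch a local search only if no previously sampled
        # point within `radius` already has a better value (placeholder rule).
        nearby_better = any(np.linalg.norm(x - xs) < radius and vs < fx
                            for xs, vs in sampled)
        sampled.append((x, fx))
        if nearby_better:
            continue
        res = minimize(f, x, method="L-BFGS-B", bounds=list(zip(lo, hi)))
        if not any(np.linalg.norm(res.x - xm) < 1e-4 for xm, _ in minima):
            minima.append((res.x, res.fun))
    return sorted(minima, key=lambda m: m[1])

if __name__ == "__main__":
    rastrigin = lambda x: 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))
    found = multistart(rastrigin, bounds=[(-2.0, 2.0)] * 2)
    print(f"{len(found)} distinct minima; best value {found[0][1]:.4f}")
```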
In this article we present an incremental method for building a mixture model. Given the desired number of clusters K ≥ 2, we start with a two-component mixture and optimize the likelihood by repeatedly applying a Split-Merge operation. When an optimum is obtained, we add a new component to the model by splitting a properly chosen cluster in two. This …
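A rough sketch of growing a Gaussian mixture from 2 to K components by repeatedly splitting one component and refitting with EM, using scikit-learn. The split criterion here (split the heaviest component) is only a placeholder; the article selects the cluster to split, and interleaves split/merge moves, according to its own likelihood-based criteria.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def incremental_gmm(X, K, seed=0):
    gmm = GaussianMixture(n_components=2, random_state=seed).fit(X)
    while gmm.n_components < K:
        j = int(np.argmax(gmm.weights_))            # placeholder: split heaviest component
        eps = 0.1 * np.sqrt(np.diag(gmm.covariances_[j]))
        means = np.vstack([np.delete(gmm.means_, j, axis=0),
                           gmm.means_[j] + eps, gmm.means_[j] - eps])
        weights = np.append(np.delete(gmm.weights_, j), [gmm.weights_[j] / 2] * 2)
        gmm = GaussianMixture(n_components=gmm.n_components + 1,
                              weights_init=weights / weights.sum(),
                              means_init=means,
                              random_state=seed).fit(X)
    return gmm

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(c, 0.3, size=(200, 2)) for c in ([0, 0], [3, 0], [0, 3])])
    model = incremental_gmm(X, K=3)
    print("log-likelihood per sample:", model.score(X))
```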
Partial differential equations (PDEs) with boundary conditions (Dirichlet or Neumann) defined on boundaries with simple geometry have been successfully treated using sigmoidal multilayer perceptrons in previous works. This article deals with the case of complex boundary geometry, where the boundary is determined by a number of points that belong to it and …
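One generic way to handle a boundary given only as a set of points is a penalty/collocation scheme, sketched below for the hypothetical test case of Laplace's equation on the unit disk with u = x on the boundary: a small network u(x, y; p) is fitted by minimizing the PDE residual at interior collocation points plus a penalty on the boundary error at the supplied boundary points. This is not the construction used in the article; it is only the simplest penalty-based variant.

```python
import numpy as np
from scipy.optimize import minimize

H = 10
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 40)
boundary = np.c_[np.cos(theta), np.sin(theta)]          # points defining the boundary
g = boundary[:, 0]                                       # boundary values u = x
r, phi = np.sqrt(rng.uniform(0, 1, 200)), rng.uniform(0, 2 * np.pi, 200)
interior = np.c_[r * np.cos(phi), r * np.sin(phi)]       # interior collocation points

def u(pts, p):
    """Single-hidden-layer tanh network evaluated at an (n, 2) array of points."""
    W, b, v = p[:2 * H].reshape(H, 2), p[2 * H:3 * H], p[3 * H:]
    return np.tanh(pts @ W.T + b) @ v

def loss(p, h=1e-3, penalty=100.0):
    lap = (u(interior + [h, 0], p) + u(interior - [h, 0], p)
           + u(interior + [0, h], p) + u(interior - [0, h], p)
           - 4 * u(interior, p)) / h**2                  # finite-difference Laplacian
    return np.sum(lap**2) + penalty * np.sum((u(boundary, p) - g) ** 2)

p0 = 0.1 * rng.standard_normal(4 * H)
sol = minimize(loss, p0, method="BFGS")
print("boundary RMS error:", np.sqrt(np.mean((u(boundary, sol.x) - g) ** 2)))
```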
We present three new stopping rules for Multistart-based methods. The first uses a device that enables the determination of the coverage of the bounded search domain. The second is based on the comparison of asymptotic expectation values of observable quantities with the actually measured ones. The third offers a probabilistic estimate for the number of local minima …
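For context, a sketch of the classical Bayesian stopping rule of Boender and Rinnooy Kan against which Multistart stopping criteria are commonly compared (the three rules proposed here are not reproduced): after n local searches have found w distinct local minima, the expected total number of local minima is estimated as w(n - 1)/(n - w - 2), and the search stops once that estimate exceeds w by less than 0.5.

```python
def keep_searching(n_searches: int, n_distinct_minima: int) -> bool:
    """True while the estimated number of undiscovered local minima is non-negligible."""
    n, w = n_searches, n_distinct_minima
    if n <= w + 2:                        # estimate undefined; too early to stop
        return True
    expected_total = w * (n - 1) / (n - w - 2)
    return expected_total - w >= 0.5

# Example: 5 distinct minima after 30 searches -> estimate ~6.3, keep going;
# the same 5 minima after 200 searches -> estimate ~5.2, stop.
print(keep_searching(30, 5), keep_searching(200, 5))
```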
Given a data set, a dynamical procedure is applied to the data points in order to shrink and separate possibly overlapping clusters. Specifically, Newton's equations of motion are employed to concentrate the data points around their cluster centers, using an attractive potential constructed specially for this purpose. During this process, important information …
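A toy version of the shrinking step: each data point is treated as a particle and moved by integrating Newton's equations of motion under a short-range pairwise attraction, so that points belonging to the same cluster collapse toward a common center. The Gaussian-well force, the damping, and all constants below are illustrative placeholders, not the specially constructed potential of the article.

```python
import numpy as np

def shrink(points, steps=200, dt=0.05, sigma=0.5, damping=0.9):
    x = points.copy()
    v = np.zeros_like(x)
    for _ in range(steps):
        diff = x[None, :, :] - x[:, None, :]              # pairwise displacement vectors
        dist2 = np.sum(diff**2, axis=-1, keepdims=True)
        # Force on particle i: attraction toward nearby particles, fading with distance.
        force = np.mean(diff * np.exp(-dist2 / (2 * sigma**2)), axis=1)
        v = damping * (v + dt * force)                     # damped velocity update
        x = x + dt * v
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pts = np.vstack([rng.normal([0, 0], 0.3, (100, 2)), rng.normal([4, 4], 0.3, (100, 2))])
    shrunk = shrink(pts)
    print("cluster-1 spread before/after:", pts[:100].std(), shrunk[:100].std())
```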
We present two sequential and one parallel global optimization codes belonging to the stochastic class, together with an interface routine that enables the use of the Merlin/MCL environment as a non-interactive local optimizer. This interface proved extremely important, since it provides flexibility, effectiveness, and robustness to the local search task that is in …