M. José Ramírez-Quintana

One of the main drawbacks of many machine learning techniques, such as neural networks or ensemble methods, is the incomprehensibility of the model produced. One possible solution to this problem is to consider the learned model as an oracle and generate a new model that "mimics" the semantics of the oracle by expressing it in the form of rules. In this …
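The mimicking idea described above can be sketched in a few lines. This is a toy illustration of ours, not the paper's actual system: an opaque model is used only as a labelling oracle on a random sample, and a single comprehensible threshold rule is fitted to its answers.

```python
import random

def oracle(x):
    # Stand-in for an incomprehensible model (e.g. a trained neural network).
    return 1 if x * x > 0.25 else 0

def learn_threshold_rule(points, labels):
    # Fit the comprehensible rule "|x| > t" that best mimics the oracle
    # on the labelled sample, by scanning candidate thresholds.
    best_t, best_acc = None, -1.0
    for t in [i / 100 for i in range(101)]:
        acc = sum((1 if abs(x) > t else 0) == y
                  for x, y in zip(points, labels)) / len(points)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t, best_acc

random.seed(0)
sample = [random.uniform(-1, 1) for _ in range(1000)]
rule_t, rule_acc = learn_threshold_rule(sample, [oracle(x) for x in sample])
# The learned rule "|x| > 0.5" reproduces the oracle's behaviour while being
# readable, which is the point of the mimicking approach.
```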
Quantification is the name given to a novel machine learning task which deals with correctly estimating the number of elements of one class in a set of examples. The output of a quantifier is a real value; since the training instances are the same as in a classification problem, a natural approach is to train a classifier and derive a quantifier from it. Some …
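The classifier-to-quantifier derivation can be illustrated with the two standard baselines, "classify and count" and its adjusted variant (the names and the toy classifier are ours, not the paper's):

```python
def classify_and_count(classify, items):
    # CC: the naive quantifier simply counts predicted positives.
    return sum(classify(x) for x in items) / len(items)

def adjusted_count(classify, items, tpr, fpr):
    # ACC: correct CC with the classifier's true/false positive rates,
    # solving  cc = p * tpr + (1 - p) * fpr  for the true prevalence p.
    cc = classify_and_count(classify, items)
    p = (cc - fpr) / (tpr - fpr)
    return min(max(p, 0.0), 1.0)

# Toy classifier on integers: predicts "positive" for x >= 3, while the
# true positives are x >= 4, so it over-counts (tpr = 1.0, fpr = 1/3).
classify = lambda x: 1 if x >= 3 else 0
test_set = [1, 2, 3, 4, 5, 6, 7, 8]          # true prevalence: 5/8
cc = classify_and_count(classify, test_set)  # naive estimate: 6/8
est = adjusted_count(classify, test_set, tpr=1.0, fpr=1/3)  # recovers 5/8
```

Note how the adjustment turns a biased classifier into an unbiased quantifier, provided tpr and fpr can be estimated (e.g. by cross-validation).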
We present a framework for the Induction of Functional Logic Programs (IFLP) from facts. This can be seen as an extension of the now consolidated field of Inductive Logic Programming (ILP). Inspired by the inverse resolution operator of ILP, we study the reversal of narrowing, the most usual operational mechanism of Functional Logic Programming. We also …
In this work, we consider the extension of the Inductive Functional Logic Programming (IFLP) framework in order to learn functions in an incremental way. In general, incremental learning is necessary when the set of examples is infinite or very large, or when examples are presented one by one. We have performed this extension in the FLIP system, an implementation of the IFLP …
A new IFLP schema is presented as a general framework for the induction of functional logic programs (FLP). Since narrowing (which is the most usual operational semantics of FLP) performs a unification (mgu) followed by a replacement, we introduce two main operators in our IFLP schema: a generalisation and an inverse replacement or intra-replacement, which …
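The "unification followed by replacement" view of narrowing can be made concrete with a toy first-order implementation of our own (root position only, no occurs check, uppercase strings as variables; this is an illustration, not the paper's calculus):

```python
def is_var(t):
    # Convention for this sketch: strings starting uppercase are variables.
    return isinstance(t, str) and t[:1].isupper()

def walk(t, s):
    while is_var(t) and t in s:
        t = s[t]
    return t

def unify(a, b, s):
    # Compute an mgu extending substitution s, or None on failure.
    a, b = walk(a, s), walk(b, s)
    if a == b:
        return s
    if is_var(a):
        return {**s, a: b}
    if is_var(b):
        return {**s, b: a}
    if (isinstance(a, tuple) and isinstance(b, tuple)
            and len(a) == len(b) and a[0] == b[0]):
        for x, y in zip(a[1:], b[1:]):
            s = unify(x, y, s)
            if s is None:
                return None
        return s
    return None

def substitute(t, s):
    t = walk(t, s)
    if isinstance(t, tuple):
        return (t[0],) + tuple(substitute(x, s) for x in t[1:])
    return t

def narrow(term, lhs, rhs):
    # One root narrowing step: mgu(term, lhs) followed by replacement by rhs.
    s = unify(term, lhs, {})
    return None if s is None else substitute(rhs, s)

# Rule: add(s(X), Y) = s(add(X, Y)); narrowing instantiates the goal's
# variables via the mgu, unlike plain rewriting.
lhs = ("add", ("s", "X"), "Y")
rhs = ("s", ("add", "X", "Y"))
result = narrow(("add", ("s", "z"), "Z"), lhs, rhs)
```

Real narrowing also applies at inner positions of the term; restricting to the root keeps the unify-then-replace structure visible.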
The combination of classifiers is a powerful tool for improving accuracy: the predictions of multiple models are merged into a single decision. Many practical and useful combination techniques work by using the output of several classifiers as the input of a second-layer classifier. The problem of this and other multi-classifier approaches is that …
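The second-layer scheme described above (often called stacking) can be sketched with toy classifiers of our own choosing: the outputs of the first-layer classifiers become the input features of the meta-classifier.

```python
def base_a(x):
    # First-layer classifier 1: looks at the sign of x.
    return 1 if x > 0 else 0

def base_b(x):
    # First-layer classifier 2: looks at the magnitude of x.
    return 1 if abs(x) > 2 else 0

def train_meta(xs, ys):
    # Second-layer classifier: for each combination of base outputs,
    # learn the majority true label seen in training.
    table = {}
    for x, y in zip(xs, ys):
        table.setdefault((base_a(x), base_b(x)), []).append(y)
    return {k: round(sum(v) / len(v)) for k, v in table.items()}

def meta_predict(table, x):
    return table.get((base_a(x), base_b(x)), 0)

# Target concept "x > 2" needs BOTH base outputs; neither base classifier
# alone gets it right, but the meta layer combines them correctly.
xs = [-3, -1, 1, 3, -4, 4, 2, -2]
table = train_meta(xs, [1 if x > 2 else 0 for x in xs])
```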
Ensemble methods improve accuracy by combining the predictions of a set of different hypotheses. However, there are two important shortcomings associated with ensemble methods. Huge amounts of memory are required to store a set of multiple hypotheses and, more importantly, the comprehensibility of a single hypothesis is lost. In this work, we devise a new …
In this paper, we present a method for generating very expressive decision trees over a functional logic language. The generation of the tree follows a short-to-long search guided by the MDL principle. Once a solution is found, the construction of the tree continues in order to obtain further solutions, also ordered by description length. The result …
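The MDL-guided preference can be illustrated with a simplified scoring function of our own (not the paper's exact coding scheme): a hypothesis is preferred when the bits needed to encode the model plus the bits needed to encode its exceptions are minimal.

```python
import math

def description_length(model_size_bits, n_examples, n_errors):
    # Bits for the model itself, plus bits to identify which of the
    # n_examples it misclassifies (log2 of the number of error sets).
    return model_size_bits + math.log2(math.comb(n_examples, n_errors))

# A short tree with a few errors can beat a longer, error-free one,
# which is what drives the short-to-long search.
short_tree = description_length(model_size_bits=20, n_examples=100, n_errors=2)
long_tree = description_length(model_size_bits=120, n_examples=100, n_errors=0)
```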
Distance-based methods have been a successful family of machine learning techniques since the inception of the discipline. Basically, the classification or clustering of a new individual is determined by the distance to one or more prototypes. From a comprehensibility point of view, this is not especially problematic in propositional learning, where …
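The prototype-based decision rule amounts to very little code in the propositional case; this is a minimal nearest-prototype sketch over hypothetical data of ours:

```python
import math

def nearest_prototype(prototypes, x):
    # prototypes: {label: feature vector}; the Euclidean distance to each
    # prototype decides the class of the new individual x.
    return min(prototypes, key=lambda lbl: math.dist(prototypes[lbl], x))

protos = {"a": (0.0, 0.0), "b": (5.0, 5.0)}
label = nearest_prototype(protos, (1.0, 1.5))  # nearer the "a" prototype
```

In propositional learning the prototype itself is a readable feature vector, which is why comprehensibility only becomes an issue in richer representations.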