# RIONA: A New Classification System Combining Rule Induction and Instance-Based Learning

```bibtex
@article{Gra2002RIONAAN,
  title   = {RIONA: A New Classification System Combining Rule Induction and Instance-Based Learning},
  author  = {Grzegorz G{\'o}ra and Arkadiusz Wojna},
  journal = {Fundam. Informaticae},
  year    = {2002},
  volume  = {51},
  pages   = {369-390}
}
```

The article describes a method combining two widely-used empirical approaches to learning from examples: rule induction and instance-based learning. In our algorithm (RIONA), the decision is predicted not on the basis of the whole support set of all rules matching a test case, but on the support set restricted to a neighbourhood of the test case. The size of the optimal neighbourhood is automatically induced during the learning phase. The empirical study shows the interesting fact that it is enough to…
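The core idea, restricting the rule support set to a neighbourhood of the test case, can be sketched as follows. This is a simplified illustration, not the paper's full algorithm; `Example`, `riona_sketch`, and the rule representation (rules as boolean predicates) are hypothetical names chosen for the sketch:

```python
from collections import Counter, namedtuple

Example = namedtuple("Example", ["features", "label"])

def riona_sketch(test, train, rules, k, dist):
    """Vote over the support set of rules matching the test case,
    restricted to its k nearest neighbours (a simplified sketch)."""
    # restrict attention to the k training examples closest to the test case
    neighbours = sorted(train, key=lambda ex: dist(test, ex.features))[:k]
    votes = Counter()
    for ex in neighbours:
        # a neighbour contributes to the vote only if some rule
        # covers both the test case and the neighbour itself
        if any(rule(test) and rule(ex.features) for rule in rules):
            votes[ex.label] += 1
    return votes.most_common(1)[0][0] if votes else None
```

In the full method the neighbourhood size `k` is not fixed by hand but selected automatically during the learning phase, e.g. by estimating classification accuracy for candidate values of `k` on the training data.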


## 63 Citations

### Combination of Metric-Based and Rule-Based Classification

- Computer Science, RSFDGrC
- 2005

In the combined model the notions of rule, rule minimality and rule consistency are generalized to a metric-dependent form, and the rule-based algorithm takes the role of the nearest neighbor voting model.

### Analogy-Based Reasoning in Classifier Construction

- Computer Science, Trans. Rough Sets
- 2005

This dissertation introduces two new classification models based on the k-NN algorithm and proposes a method based on locally induced metrics that significantly improved classification accuracy over methods with global models on the hardest tested problems.

### Hybrid Methods in Data Classification and Reduction

- Computer Science, Rough Sets and Intelligent Systems
- 2013

Three algorithms are presented that use the notion of surroundings and a k-NN method for data reduction, along with five algorithms that use reducts and deterministic or inhibitory decision rules for feature selection.

### On Combined Classifiers, Rule Induction and Rough Sets

- Computer Science, Trans. Rough Sets
- 2007

The main aim of this paper is to summarize the author's own experience with applying one of his rule induction algorithms, called MODLEM, in the framework of different combined classifiers, namely bagging, the n2-classifier and the combiner aggregation.

### On k-NN Method with Preprocessing

- Computer Science, Fundam. Informaticae
- 2006

A new model of data classification is introduced, based on preliminary reduction of the training set of examples (preprocessing) in order to facilitate the use of nearest-neighbour (NN) techniques in near-real-time applications; the work also addresses minimising the computational resource requirements of NN techniques, memory as well as time.
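Training-set reduction of this general kind can be sketched with a classic scheme in the spirit of condensed nearest neighbour (a hedged illustration, not necessarily the paper's own preprocessing; `Example` and `condense` are illustrative names): keep only the examples that the current store misclassifies.

```python
from collections import namedtuple

Example = namedtuple("Example", ["features", "label"])

def condense(train, dist):
    """Reduce the training set for 1-NN classification: grow a store,
    adding any example the current store misclassifies, until stable."""
    store = [train[0]]
    changed = True
    while changed:
        changed = False
        for ex in train:
            nearest = min(store, key=lambda s: dist(ex.features, s.features))
            if nearest.label != ex.label and ex not in store:
                store.append(ex)   # keep examples the store gets wrong
                changed = True
    return store
```

The reduced store typically classifies the original training set identically while using far less memory, which is exactly the trade-off that matters in near-real-time applications.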

### Beyond Sequential Covering - Boosted Decision Rules

- Computer Science, Advances in Machine Learning I
- 2010

A general scheme for learning an ensemble of decision rules in a boosting framework is presented, using different loss functions and minimization techniques; the scheme covers such algorithms as SLIPPER, LRI and MLRules.

### A Hierarchical Approach to Multimodal Classification

- Computer Science, RSFDGrC
- 2005

This paper proposes a hierarchical, or layered, approach to the problem of overlapping and insufficient coverage in classifier construction, which considers a series of models under gradually relaxed conditions to form a hierarchical structure.

### Solving Regression by Learning an Ensemble of Decision Rules

- Computer Science, ICAISC
- 2008

A novel decision rule induction algorithm for solving the regression problem is presented; the prediction model, in the form of an ensemble of decision rules, is powerful, as shown by the results of the experiment presented in the paper.

### ENDER: a statistical framework for boosting decision rules

- Computer Science, Data Mining and Knowledge Discovery
- 2010

A learning algorithm called ENDER constructs an ensemble of decision rules tailored for regression and binary classification problems; it uses the boosting approach for learning, which can be treated as a generalization of sequential covering.

### Predicting the number of nearest neighbors for the k-NN classification algorithm

- Computer Science, Intell. Data Anal.
- 2014

This work proposes a novel method of using back-propagation neural networks to explore the relationship between data set characteristics and the optimal value of k; the learned relationship and the characteristics of a new data set are then used to recommend a value of k for that data set.

## References

Showing 1-10 of 37 references.

### RIONA: A Classifier Combining Rule Induction and k-NN Method with Automated Selection of Optimal Neighbourhood

- Computer Science, ECML
- 2002

The article describes a method combining two widely-used empirical approaches, rule induction and instance-based learning; the combination results in a significant acceleration over the algorithm that uses all minimal rules.

### Unifying instance-based and rule-based induction

- Computer Science, Machine Learning
- 2004

In an extensive empirical study, RISE consistently achieves higher accuracies than state-of-the-art representatives of both its parent approaches, as well as a decision tree learner (C4.5).

### Instance-Based Classification by Emerging Patterns

- Computer Science, PKDD
- 2000

A new instance-based classifier using EPs, called DeEPs, is proposed; it achieves much better accuracy and efficiency than previously proposed EP-based classifiers and is superior to other classifiers on accuracy.

### Instance-based learning algorithms

- Computer Science, Machine Learning
- 2004

This paper describes how storage requirements can be significantly reduced with, at most, minor sacrifices in learning rate and classification accuracy, extending the nearest neighbor algorithm, which otherwise has large storage requirements.

### A study of distance-based machine learning algorithms

- Computer Science
- 1994

It is shown that the k-nearest neighbor algorithm (kNN) outperforms the first nearest neighbor algorithm only under certain conditions; methods for choosing the value of k for kNN are investigated, and two methods for learning feature weights for a weighted Euclidean distance metric are proposed.
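A weighted Euclidean metric of the kind mentioned above can be written directly (the function and weight names are illustrative, not from the paper):

```python
import math

def weighted_euclidean(x, y, w):
    # Per-feature weights w_i scale each squared difference;
    # a learning procedure chooses w to emphasise informative
    # features and suppress irrelevant ones.
    return math.sqrt(sum(wi * (xi - yi) ** 2
                         for wi, xi, yi in zip(w, x, y)))
```

Setting a weight to zero removes that feature from the distance entirely, so feature selection is a special case of feature weighting.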

### Lazy Decision Trees

- Computer Science, AAAI/IAAI, Vol. 1
- 1996

This work proposes a lazy decision tree algorithm, LAZYDT, that conceptually constructs the "best" decision tree for each test instance, and is robust with respect to missing values without resorting to the complicated methods usually seen in induction of decision trees.

### A Weighted Nearest Neighbor Algorithm for Learning with Symbolic Features

- Computer Science, Machine Learning
- 2004

A nearest neighbor algorithm for learning in domains with symbolic features, which produces excellent classification accuracy on three problems that have been studied by machine learning researchers: predicting protein secondary structure, identifying DNA promoter sequences, and pronouncing English text.
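For symbolic features, a common choice in this line of work is a value difference metric: two symbolic values are close when they induce similar class distributions. The sketch below is a hedged illustration of that idea, not necessarily this paper's exact formula, and `value_difference` is an illustrative name:

```python
from collections import defaultdict

def value_difference(column, labels, v1, v2):
    """Distance between two symbolic values: sum over classes of
    |P(class | v1) - P(class | v2)|, estimated from the data."""
    counts = defaultdict(lambda: defaultdict(int))
    totals = defaultdict(int)
    for v, c in zip(column, labels):
        counts[v][c] += 1   # joint counts of (value, class)
        totals[v] += 1      # marginal count of each value
    return sum(abs(counts[v1][c] / totals[v1] - counts[v2][c] / totals[v2])
               for c in set(labels))
```

Under this metric, "DNA base A" and "DNA base G" are near each other exactly when they predict the same classes with similar frequencies, which is why it works well on problems like promoter identification.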

### Programs for Machine Learning

- Computer Science
- 1994

In his new book, C4.5: Programs for Machine Learning, Quinlan has put together a definitive, much-needed description of his complete system, including the latest developments; it will be a welcome addition to the library of many researchers and students.

### Improving Rule-Based Systems Through Case-Based Reasoning

- Computer Science, AAAI
- 1991

A novel architecture is presented for combining rule-based and case-based reasoning; the result that this performance was better than what could be achieved with the rules alone illustrates the capacity of the architecture to improve on the rule-based system it starts with.