A Kernel-Based Framework for Learning Graded Relations From Data

@article{Waegeman2012AKF,
  title={A Kernel-Based Framework for Learning Graded Relations From Data},
  author={Willem Waegeman and Tapio Pahikkala and Antti Airola and Tapio Salakoski and Michiel Stock and Bernard De Baets},
  journal={IEEE Transactions on Fuzzy Systems},
  year={2012},
  volume={20},
  pages={1090--1101}
}
Driven by a large number of potential applications in areas such as bioinformatics, information retrieval, and social network analysis, the problem setting of inferring relations between pairs of data objects has recently been investigated intensively in the machine learning community. To this end, current approaches typically consider datasets containing crisp relations, so that standard classification methods can be adopted. However, relations between objects like similarities and preferences… 

Efficient regularized least-squares algorithms for conditional ranking on relational data

TLDR
A general kernel framework for learning conditional rankings from various types of relational data, where rankings can be conditioned on unseen data objects, is presented, and it is shown theoretically that learning with the ranking loss is likely to generalize better than with the regression loss.

A Comparative Study of Pairwise Learning Methods Based on Kernel Ridge Regression

TLDR
This work reviews and unifies kernel-based algorithms that are commonly used in different pairwise learning settings, ranging from matrix filtering to zero-shot learning, and shows that independent-task kernel ridge regression, two-step kernel ridge regression, and a linear matrix filter arise naturally as special cases of Kronecker kernel ridge regression, implying that all these methods implicitly minimize a squared loss.
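The unification described above hinges on Kronecker kernel ridge regression. A minimal sketch of that core solver follows; the function names, the kernel matrices G and K, and the regularizer lam are illustrative placeholders, not notation from the paper.

```python
import numpy as np

def kron_krr_fit(G, K, Y, lam):
    """Solve the Kronecker kernel ridge regression system
    (G (x) K + lam*I) vec(A) = vec(Y) without ever forming G (x) K,
    using eigendecompositions of the two small kernel matrices."""
    sg, U = np.linalg.eigh(G)   # G = U diag(sg) U^T
    sk, V = np.linalg.eigh(K)   # K = V diag(sk) V^T
    # Eigenvalues of G (x) K are all pairwise products sg[i] * sk[j].
    S = np.outer(sg, sk) + lam
    return U @ ((U.T @ Y @ V) / S) @ V.T

def kron_krr_predict(G_test, K_test, A):
    # Scores for new dyads, given test-versus-train kernel matrices.
    return G_test @ A @ K_test.T
```

In matrix form the fitted coefficients satisfy G A K + lam*A = Y, which is exactly the Kronecker system written without the explicit vec operator; the eigendecomposition reduces the cost from cubic in the number of dyads to cubic in the number of objects per domain.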

Multi-target prediction: a unifying view on problems and methods

TLDR
A unifying view on what is called multi-target prediction (MTP) problems and methods is presented by identifying a number of key properties, which distinguish such methods and determine their suitability for different types of problems.

Generalized vec trick for fast learning of pairwise kernel models

TLDR
All the reviewed kernels can be expressed as sums of Kronecker products, allowing the use of the generalized vec trick to speed up their computation, and an extensive comparison of the kernels on a number of biological interaction prediction tasks is presented.
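The speed-up mentioned here rests on the classical vec-trick identity (A ⊗ B) vec(X) = vec(B X Aᵀ), which replaces one multiplication by a large Kronecker matrix with two small matrix products. A minimal illustration, with random placeholder matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((5, 2))
X = rng.standard_normal((2, 4))  # shape: (cols of B) x (cols of A)

# Naive: explicitly build the (15 x 8) Kronecker product, then multiply.
# vec stacks columns, i.e. Fortran-order flattening.
naive = np.kron(A, B) @ X.reshape(-1, order="F")

# Vec trick: two small matrix products, no Kronecker matrix formed.
fast = (B @ X @ A.T).reshape(-1, order="F")

assert np.allclose(naive, fast)
```

For n-by-n factors the naive product costs O(n⁴) time and memory while the vec trick costs O(n³) time and O(n²) memory, which is what makes pairwise kernel models tractable at scale.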

Fast Gradient Computation for Learning with Tensor Product Kernels and Sparse Training Labels

TLDR
The training of tensor kernel RLS models for pair-input problems can be further accelerated by taking advantage of the sparsity of the training labels, as demonstrated in a running-time experiment and in a practical problem of predicting drug-target interactions.

A new case-based reasoning method based on dissimilar relations

TLDR
A new CBR approach that exploits information about dissimilar relations when solving new problems; identifying dissimilar cases enables global use of more information from the case library and thereby avoids the similarity constraint of conventional CBR methods.

Efficient Pairwise Learning Using Kernel Ridge Regression: an Exact Two-Step Method

TLDR
This work analyzes kernel-based methods for pairwise learning, with a particular focus on a recently suggested two-step method, and shows that this method offers an appealing alternative to commonly applied Kronecker-based methods that model dyads by means of pairwise feature representations and pairwise kernels.
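A minimal sketch of the two-step idea: chain two ordinary kernel ridge regressions, one per object domain. The names G, K and the regularizers lg, lk are illustrative, not the paper's notation.

```python
import numpy as np

def two_step_krr(G, K, Y, lg, lk, G_test=None, K_test=None):
    """Two-step kernel ridge regression on a dyadic label matrix Y
    (rows: objects with kernel G, columns: objects with kernel K)."""
    n, m = Y.shape
    # Step 1: an ordinary KRR across the row objects.
    W = np.linalg.solve(G + lg * np.eye(n), Y)
    # Step 2: an ordinary KRR across the column objects.
    A = np.linalg.solve(K + lk * np.eye(m), W.T).T
    if G_test is None:
        G_test = G
    if K_test is None:
        K_test = K
    return G_test @ A @ K_test.T
```

Because each step is a standard single-task KRR, the method inherits closed-form leave-one-out shortcuts and allows separate regularization per domain, which is part of its appeal relative to a single Kronecker-kernel model.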

Algebraic Shortcuts for Leave-One-Out Cross-Validation in Supervised Network Inference

TLDR
A series of leave-one-out cross-validation shortcuts is presented to rapidly estimate the performance of state-of-the-art kernel-based network inference techniques.

Identification of Functionally Related Enzymes by Learning-to-Rank Methods

TLDR
It is shown that rankings of that kind can be substantially improved by applying kernel-based learning algorithms, which enables the detection of statistical dependencies between similarities of the active cleft and the biological function of annotated enzymes.

Learning monadic and dyadic relations: three case studies in systems biology

TLDR
This article elaborates on three applications that represent such a learning scenario: predicting functional relationships between enzymes in bioinformatics, predicting protein-ligand interactions in computational drug design, and predicting heterotroph-methanotroph interactions in microbial ecology.

References

SHOWING 1-10 OF 83 REFERENCES

Conditional Ranking on Relational Data

TLDR
This work presents a general kernel framework for learning conditional rankings from various types of relational data, where rankings can be conditioned on unseen data objects and shows empirically that incorporating domain knowledge in the model about the underlying relations can improve the generalization performance.

A new pairwise kernel for biological network inference with support vector machines

TLDR
The metric learning pairwise kernel is a new formulation to infer pairwise relationships with SVM, which provides state-of-the-art results for the inference of several biological networks from heterogeneous genomic data.

Inferring biological networks with output kernel trees

TLDR
Output kernel tree-based methods provide an efficient tool for the inference of biological networks from experimental data, and their simplicity and interpretability should make them of great value to biologists.

Link Prediction in Relational Data

TLDR
It is shown that the collective classification approach of RMNs, and the introduction of subgraph patterns over link labels, provide significant improvements in accuracy over flat classification, which attempts to predict each link in isolation.

On Representing and Generating Kernels by Fuzzy Equivalence Relations

TLDR
This paper provides a novel view on kernels based on fuzzy-logical concepts which allows one to incorporate prior knowledge in the design process, and demonstrates that kernels mapping to the unit interval with constant one on the diagonal can be represented by a commonly used fuzzy-logical formula for representing fuzzy rule bases.

Learning intransitive reciprocal relations with kernel methods

Large Margin Methods for Structured and Interdependent Output Variables

TLDR
This paper proposes to appropriately generalize the well-known notion of a separation margin and derive a corresponding maximum-margin formulation and presents a cutting plane algorithm that solves the optimization problem in polynomial time for a large class of problems.

Logical and relational learning

TLDR
A new view on logical and relational learning and its role in machine learning and artificial intelligence is presented by identifying some of the lessons learned and formulating some challenges for future developments.

Preference Learning

TLDR
The editors first offer a thorough introduction, including a systematic categorization according to learning task and learning technique, along with a unified notation, and the first half of the book is organized into parts on applications of preference learning in multiattribute domains, information retrieval, and recommender systems.

Kernel Methods for Pattern Analysis

TLDR
This book provides an easy introduction for students and researchers to the growing field of kernel-based pattern analysis, demonstrating with examples how to handcraft an algorithm or a kernel for a new specific application, and covering all the necessary conceptual and mathematical tools to do so.
...