Support vector machine learning for interdependent and structured output spaces

Abstract

Learning general functional dependencies is one of the main goals in machine learning. Recent progress in kernel-based methods has focused on designing flexible and powerful input representations. This paper addresses the complementary issue of problems involving complex outputs such as multiple dependent output variables and structured output spaces. We propose to generalize multiclass Support Vector Machine learning in a formulation that involves features extracted jointly from inputs and outputs. The resulting optimization problem is solved efficiently by a cutting plane algorithm that exploits the sparseness and structural decomposition of the problem. We demonstrate the versatility and effectiveness of our method on problems ranging from supervised grammar learning and named-entity recognition, to taxonomic text classification and sequence alignment.
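
As a rough illustration of the joint input-output formulation described above (a sketch of one common variant, not reproduced verbatim from the paper), the learning problem can be posed as margin maximization over a joint feature map \Psi(x, y), with a slack variable per training example, a loss function \Delta on the output space, and one constraint per incorrect output:

\min_{w,\;\xi \ge 0}\ \tfrac{1}{2}\lVert w\rVert^{2} + \tfrac{C}{n}\sum_{i=1}^{n}\xi_i
\quad \text{s.t.}\quad \langle w,\ \Psi(x_i, y_i) - \Psi(x_i, y)\rangle \;\ge\; \Delta(y_i, y) - \xi_i \qquad \forall i,\ \forall y \neq y_i.

Because the number of constraints grows with the size of the structured output space (often exponentially), a cutting plane algorithm of the kind the abstract mentions would repeatedly add only the most violated constraint for each example and re-optimize over this small working set.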

DOI: 10.1145/1015330.1015341


Citations: 1,296 (Semantic Scholar estimate)

Cite this paper

@inproceedings{Tsochantaridis2004SupportVM,
  title     = {Support vector machine learning for interdependent and structured output spaces},
  author    = {Ioannis Tsochantaridis and Thomas Hofmann and Thorsten Joachims and Yasemin Altun},
  booktitle = {ICML},
  year      = {2004}
}