Large Margin Methods for Structured and Interdependent Output Variables


Learning general functional dependencies between arbitrary input and output spaces is one of the key challenges in computational intelligence. While recent progress in machine learning has mainly focused on designing flexible and powerful input representations, this paper addresses the complementary issue of designing classification algorithms that can deal with more complex outputs, such as trees, sequences, or sets. More generally, we consider problems involving multiple dependent output variables, structured output spaces, and classification problems with class attributes. In order to accomplish this, we propose to appropriately generalize the well-known notion of a separation margin and derive a corresponding maximum-margin formulation. While this leads to a quadratic program with a potentially prohibitive, i.e. exponential, number of constraints, we present a cutting plane algorithm that solves the optimization problem in polynomial time for a large class of problems. The proposed method has important applications in areas such as computational biology, natural language processing, information retrieval/extraction, and optical character recognition. Experiments from various domains involving different types of output spaces emphasize the breadth and generality of our approach.
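The abstract's core idea, a margin maximized over a structured output space, with a cutting-plane loop that repeatedly adds the most violated constraint, can be illustrated on a toy instantiation. The sketch below uses multiclass classification as the "structured" output (the simplest case of a joint feature map), a 0/1 loss, and plain subgradient steps in place of the paper's QP solve; all function names and parameters here are illustrative assumptions, not the authors' SVMstruct implementation.

```python
import numpy as np

def psi(x, y, n_classes):
    # Joint feature map Psi(x, y): copy x into the block for class y.
    f = np.zeros(n_classes * len(x))
    f[y * len(x):(y + 1) * len(x)] = x
    return f

def loss(y, y_hat):
    # 0/1 loss; the paper allows arbitrary structured losses Delta(y, y').
    return 0.0 if y == y_hat else 1.0

def most_violated(w, x, y, n_classes):
    # Separation oracle: argmax over y' of loss(y, y') + <w, psi(x, y')>.
    # For small output spaces this is exhaustive; for structured outputs
    # it would be a combinatorial search (e.g. Viterbi for sequences).
    scores = [loss(y, yp) + w @ psi(x, yp, n_classes)
              for yp in range(n_classes)]
    return int(np.argmax(scores))

def cutting_plane_train(X, Y, n_classes, C=1.0, eps=1e-3,
                        max_iter=50, lr=0.1):
    # Cutting-plane loop: grow a working set of margin constraints,
    # re-optimize, stop when no constraint is violated by more than eps.
    w = np.zeros(n_classes * X.shape[1])
    working_set = []  # pairs (delta_psi, loss value)
    for _ in range(max_iter):
        added = 0
        for x, y in zip(X, Y):
            y_hat = most_violated(w, x, y, n_classes)
            dpsi = psi(x, y, n_classes) - psi(x, y_hat, n_classes)
            if loss(y, y_hat) - w @ dpsi > eps:
                working_set.append((dpsi, loss(y, y_hat)))
                added += 1
        if added == 0:
            break  # all constraints satisfied up to eps
        # Crude re-optimization over the working set via subgradient
        # descent on (1/2C)||w||^2 + sum of hinge terms; a stand-in
        # for the restricted QP solved in the paper.
        for _ in range(100):
            grad = w / C
            for dpsi, l in working_set:
                if l - w @ dpsi > 0:  # hinge active
                    grad -= dpsi
            w -= lr * grad / max(1, len(working_set))
    return w
```

Prediction is then `argmax_y w @ psi(x, y)`; the point of the sketch is that only the `most_violated` oracle touches the (potentially exponential) output space, which is what makes the polynomial-time guarantee possible.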


Cite this paper

@article{Tsochantaridis2005LargeMM,
  title   = {Large Margin Methods for Structured and Interdependent Output Variables},
  author  = {Ioannis Tsochantaridis and Thorsten Joachims and Thomas Hofmann and Yasemin Altun},
  journal = {Journal of Machine Learning Research},
  year    = {2005},
  volume  = {6},
  pages   = {1453--1484}
}