Contrastive Estimation: Training Log-Linear Models on Unlabeled Data

@inproceedings{Smith2005ContrastiveET,
  title={Contrastive Estimation: Training Log-Linear Models on Unlabeled Data},
  author={Noah A. Smith and Jason Eisner},
  booktitle={ACL},
  year={2005}
}
Conditional random fields (Lafferty et al., 2001) are quite effective at sequence labeling tasks like shallow parsing (Sha and Pereira, 2003) and named-entity extraction (McCallum and Li, 2003). CRFs are log-linear, allowing the incorporation of arbitrary features into the model. To train on unlabeled data, we require unsupervised estimation methods for log-linear models; few exist. We describe a novel approach, contrastive estimation. We show that the new technique can be intuitively understood…
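The abstract's core idea, training a log-linear model by contrasting each observed sentence against a neighborhood of perturbed alternatives, can be sketched as follows. This is a toy illustration under assumed feature, labeling, and neighborhood functions (`featurize`, `labelings`, `neighborhood` are all hypothetical names), not the paper's actual implementation:

```python
import itertools
import math

def score(weights, features):
    # Log-linear score: w . f(x, y), with features as a sparse dict.
    return sum(weights.get(k, 0.0) * v for k, v in features.items())

def log_sum_exp(vals):
    # Numerically stable log(sum(exp(v) for v in vals)).
    m = max(vals)
    return m + math.log(sum(math.exp(v - m) for v in vals))

def contrastive_log_likelihood(weights, featurize, x, labelings, neighborhood):
    # Contrastive objective: log p(x | N(x))
    #   = log sum_y exp(w.f(x, y))
    #   - log sum_{x' in N(x)} sum_y exp(w.f(x', y)),
    # i.e., normalize over the neighborhood rather than all sentences.
    num = log_sum_exp([score(weights, featurize(x, y)) for y in labelings(x)])
    den = log_sum_exp([score(weights, featurize(xp, y))
                       for xp in neighborhood(x) for y in labelings(xp)])
    return num - den

# Toy instantiation: word/tag indicator features, all tag sequences over
# {A, B} as latent labelings, and a neighborhood of x plus one transposition.
def featurize(x, y):
    return {f"{w}/{t}": 1.0 for w, t in zip(x, y)}

def labelings(x):
    return list(itertools.product("AB", repeat=len(x)))

def neighborhood(x):
    return [x, tuple(reversed(x))]

w = {"dog/A": 1.0, "runs/B": 0.5}
ll = contrastive_log_likelihood(w, featurize, ("dog", "runs"),
                                labelings, neighborhood)
# Always <= 0 here, since x is a member of its own neighborhood.
```

Maximizing this objective pushes probability mass toward the observed sentence and away from its perturbed neighbors, which is what lets the model learn from unlabeled data.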
This paper has highly influenced 26 other papers.

6 Figures & Tables

Statistics

[Chart: citations per year, 2004–2018]

Semantic Scholar estimates that this publication has 351 citations based on the available data.
