
- Tommi S. Jaakkola, David Haussler
- NIPS
- 1998

Generative probability models provide a principled way of handling missing information and variable-length sequences.

- Michael I. Jordan, Zoubin Ghahramani, Tommi S. Jaakkola, Lawrence K. Saul
- Machine Learning
- 1999

This paper presents a tutorial introduction to the use of variational methods for inference and learning in graphical models (Bayesian networks and Markov random fields). We present a number of examples of graphical models, including the QMR-DT database, the sigmoid belief network, the Boltzmann machine, and several variants of hidden Markov models, in… (More)
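The core mean-field idea surveyed in this tutorial can be illustrated on one of its own examples, the Boltzmann machine. The sketch below is not the paper's code; it assumes a symmetric coupling matrix with zero diagonal and iterates the classic mean-field fixed-point equations.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mean_field(W, b, n_iters=200):
    """Coordinate-wise mean-field for a Boltzmann machine
    p(s) ∝ exp(0.5 s^T W s + b^T s), s in {0,1}^n,
    with W symmetric and zero on the diagonal.
    Iterates the fixed point mu_i = sigmoid(b_i + sum_j W_ij mu_j)."""
    mu = np.full(len(b), 0.5)
    for _ in range(n_iters):
        for i in range(len(b)):
            mu[i] = sigmoid(b[i] + W[i] @ mu)
    return mu
```

With zero couplings the update decouples and each `mu[i]` converges to `sigmoid(b[i])`, the exact marginal; with nonzero couplings it returns the variational (approximate) marginals.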

- Nathan Srebro, Jason D. M. Rennie, Tommi S. Jaakkola
- NIPS
- 2004

We present a novel approach to collaborative prediction, using low-norm instead of low-rank factorizations. The approach is inspired by, and has strong connections to, large-margin linear discrimination. We show how to learn low-norm factorizations by solving a semi-definite program, and discuss generalization error bounds for them.
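The low-norm idea can be sketched in a few lines. The paper learns the factorization with hinge loss via a semi-definite program; the hypothetical `low_norm_factorize` below substitutes squared loss and plain gradient descent to keep the sketch short, but keeps the key ingredient: penalizing the factor norms rather than fixing a small rank.

```python
import numpy as np

def low_norm_factorize(X, observed, k, lam=0.01, lr=0.01, n_iters=2000, seed=0):
    """Gradient descent on
        sum over observed entries of (X_ij - (U V^T)_ij)^2
        + lam/2 * (||U||_F^2 + ||V||_F^2).
    The factor-norm penalty is the variational form of the trace norm,
    so it encourages a low-norm rather than a hard low-rank fit."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    U = 0.1 * rng.standard_normal((n, k))
    V = 0.1 * rng.standard_normal((m, k))
    for _ in range(n_iters):
        R = (U @ V.T - X) * observed        # residual on observed entries only
        # simultaneous gradient step on both factors
        U, V = (U - lr * (2 * R @ V + lam * U),
                V - lr * (2 * R.T @ U + lam * V))
    return U, V
```

Because the rank `k` can be taken large, the regularization strength `lam`, not `k`, controls capacity — the connection to large-margin methods the abstract mentions.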

- Martin J. Wainwright, Tommi S. Jaakkola, Alan S. Willsky
- IEEE Transactions on Information Theory
- 2005

We develop and analyze methods for computing provably optimal maximum a posteriori probability (MAP) configurations for a subclass of Markov random fields defined on graphs with cycles. By decomposing the original distribution into a convex combination of tree-structured distributions, we obtain an upper bound on the optimal value of the original problem… (More)

- Tommi S. Jaakkola, Mark Diekhans, David Haussler
- Journal of Computational Biology
- 2000

A new method for detecting remote protein homologies is introduced and shown to perform well in classifying protein domains by SCOP superfamily. The method is a variant of support vector machines using a new kernel function. The kernel function is derived from a generative statistical model for a protein family, in this case a hidden Markov model. This… (More)
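The kernel construction can be sketched generically. The paper derives it from a hidden Markov model; the sketch below stands in a much simpler generative model (a product of Bernoullis) so the Fisher score has a one-line closed form, and uses the common identity-matrix simplification in place of the inverse Fisher information.

```python
import numpy as np

def fisher_score(x, theta):
    """Fisher score U_x = grad_theta log p(x | theta) for a
    product-of-Bernoullis model: d/dtheta_i log p = x_i/theta_i - (1-x_i)/(1-theta_i)."""
    return x / theta - (1 - x) / (1 - theta)

def fisher_kernel(x, y, theta):
    """Fisher kernel K(x, y) = U_x^T U_y, with the identity matrix
    substituted for the inverse Fisher information (a common simplification)."""
    return fisher_score(x, theta) @ fisher_score(y, theta)
```

The same recipe applies to any generative model with a differentiable log-likelihood: the gradient maps each example into a fixed-length feature vector, which is what lets a discriminative classifier such as an SVM sit on top of an HMM.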

- Martin J. Wainwright, Tommi S. Jaakkola, Alan S. Willsky
- IEEE Transactions on Information Theory
- 2002

We introduce a new class of upper bounds on the log partition function of a Markov random field (MRF). This quantity plays an important role in various contexts, including approximating marginal distributions, parameter estimation, combinatorial enumeration, statistical decision theory, and large-deviations bounds. Our derivation is based on concepts from… (More)
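To make the bounded quantity concrete: the log partition function is the log of a sum over all configurations, which is only tractable by enumeration for tiny models. The sketch below (an illustration, not the paper's method) computes it exactly for a small Ising-type MRF; the paper's contribution is upper-bounding this quantity when enumeration is hopeless.

```python
import itertools
import numpy as np

def log_partition(J, h):
    """Exact log partition function of a small Ising model
    p(s) ∝ exp(0.5 s^T J s + h^T s), s in {-1,+1}^n, by enumeration
    over all 2^n states (using a max-shift for numerical stability)."""
    n = len(h)
    energies = []
    for s in itertools.product([-1, 1], repeat=n):
        s = np.array(s, dtype=float)
        energies.append(0.5 * s @ J @ s + h @ s)
    m = max(energies)
    return m + np.log(sum(np.exp(e - m) for e in energies))
```

For decoupled variables (`J = 0`) this reduces to the closed form `sum_i log(2 cosh(h_i))`, a handy sanity check.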

- Itamar Simon, John Barnett, +8 authors Richard A. Young
- Cell
- 2001

Genome-wide location analysis was used to determine how the yeast cell cycle gene expression program is regulated by each of the nine known cell cycle transcriptional activators. We found that cell cycle transcriptional activators that function during one stage of the cell cycle regulate transcriptional activators that function during the next stage. This… (More)

Linear Programming (LP) relaxations have become powerful tools for finding the most probable (MAP) configuration in graphical models. These relaxations can be solved efficiently using message-passing algorithms such as belief propagation and, when the relaxation is tight, provably find the MAP configuration. The standard LP relaxation is not tight enough in… (More)

- Amir Globerson, Tommi S. Jaakkola
- NIPS
- 2007

We present a novel message passing algorithm for approximating the MAP problem in graphical models. The algorithm is similar in structure to max-product, but unlike max-product it always converges and can be proven to find the exact MAP solution in various settings. The algorithm is derived via block coordinate descent in a dual of the LP relaxation of MAP,… (More)
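For reference, the classical algorithm this method structurally resembles is max-product, which is exact on trees. The sketch below implements max-product (Viterbi) MAP inference on a chain with a shared pairwise potential; it is the baseline, not the paper's convergent dual algorithm.

```python
import numpy as np

def map_chain(unary, pairwise):
    """Max-product (Viterbi) MAP inference on a chain MRF.
    unary: (n, k) log-potentials per node; pairwise: (k, k) shared
    log-potential between neighbors. Exact on chains/trees."""
    n, k = unary.shape
    msg = np.zeros((n, k))               # msg[t, j]: best score of a path ending in state j at t
    back = np.zeros((n, k), dtype=int)   # backpointers for decoding
    msg[0] = unary[0]
    for t in range(1, n):
        scores = msg[t - 1][:, None] + pairwise    # (k, k): prev state x next state
        back[t] = np.argmax(scores, axis=0)
        msg[t] = unary[t] + np.max(scores, axis=0)
    x = np.zeros(n, dtype=int)
    x[-1] = int(np.argmax(msg[-1]))
    for t in range(n - 1, 0, -1):        # trace backpointers
        x[t - 1] = back[t, x[t]]
    return x
```

On graphs with cycles this style of message passing can oscillate, which is exactly the failure mode the paper's block-coordinate-descent derivation removes.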

- Nathan Srebro, Tommi S. Jaakkola
- ICML
- 2003

We study the common problem of approximating a target matrix with a matrix of lower rank. We provide a simple and efficient (EM) algorithm for solving weighted low-rank approximation problems, which, unlike their unweighted counterparts, do not admit a closed-form solution in general. We analyze, in addition, the nature of locally optimal solutions that arise in… (More)
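The EM idea for 0/1 weights is short enough to sketch: fill in the down-weighted entries with the current low-rank estimate, then solve the resulting *unweighted* problem in closed form with a truncated SVD, and repeat. This is an illustrative sketch under those assumptions, not the paper's code.

```python
import numpy as np

def weighted_low_rank(X, W, k, n_iters=50):
    """EM-style weighted low-rank approximation, W in [0, 1]:
    E-step fills entries with the current estimate L, M-step takes the
    best rank-k fit of the completed matrix (closed form via SVD)."""
    L = np.zeros_like(X)
    for _ in range(n_iters):
        filled = W * X + (1 - W) * L                 # E-step: complete the matrix
        U, s, Vt = np.linalg.svd(filled, full_matrices=False)
        L = (U[:, :k] * s[:k]) @ Vt[:k]              # M-step: truncated SVD
    return L
```

With all weights equal to one the loop reduces to a single truncated SVD; with missing entries each iteration provably does not increase the weighted squared error, which is the EM monotonicity the abstract alludes to.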