
- Tuyen N. Huynh, Raymond J. Mooney
- ICML
- 2008

Markov logic networks (MLNs) are an expressive representation for statistical relational learning that generalizes both first-order logic and graphical models. Existing methods for learning the logical structure of an MLN are not discriminative; however, many relational learning problems involve specific target predicates that must be inferred from given…
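The MLN definition in the abstract above can be sketched in a few lines: each first-order formula carries a weight, and a world's probability is proportional to exp of the weighted count of true groundings. This is a minimal illustrative sketch, not code from any of the listed papers; the predicate names (`Smokes`, `Friends`) and the single weighted formula are assumed for illustration only.

```python
import math

# Hypothetical two-person domain (assumed names, for illustration only).
world = {
    "Smokes(A)": True,
    "Smokes(B)": False,
    "Friends(A,B)": True,
}

# Each formula: (weight, ground clauses). A clause is a list of
# (atom, required_truth_value) literals, satisfied if any literal holds.
# Here: Friends(x,y) => (Smokes(x) <=> Smokes(y)), grounded for (A, B).
formulas = [
    (1.1, [
        [("Friends(A,B)", False), ("Smokes(A)", False), ("Smokes(B)", True)],
        [("Friends(A,B)", False), ("Smokes(B)", False), ("Smokes(A)", True)],
    ]),
]

def n_true_groundings(clauses, x):
    """Count how many ground clauses world x satisfies."""
    return sum(any(x[atom] == val for atom, val in clause) for clause in clauses)

def unnormalized_prob(x):
    """MLN potential: exp(sum_i weight_i * n_i(x))."""
    return math.exp(sum(w * n_true_groundings(cls, x) for w, cls in formulas))
```

In this world only one of the two groundings is satisfied (A smokes but B does not, violating one direction of the biconditional), so the potential is exp(1.1).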

- Lilyana Mihalkova, Tuyen N. Huynh, Raymond J. Mooney
- AAAI
- 2007

Transfer learning addresses the problem of how to leverage knowledge acquired in a source domain to improve the accuracy and speed of learning in a related target domain. This paper considers transfer learning with Markov logic networks (MLNs), a powerful formalism for learning in relational domains. We present a complete MLN transfer system that first…

- Hung Hai Bui, Tuyen N. Huynh, Sebastian Riedel
- UAI
- 2013

Using the theory of group action, we first introduce the concept of the automorphism group of an exponential family or a graphical model, thus formalizing the general notion of symmetry of a probabilistic model. This automorphism group provides a precise mathematical framework for lifted inference in the general exponential family. Its group action…

- Tuyen N. Huynh, Raymond J. Mooney
- ECML/PKDD
- 2009

Markov logic networks (MLNs) are an expressive representation for statistical relational learning that generalizes both first-order logic and graphical models. Existing discriminative weight learning methods for MLNs all try to learn weights that optimize the Conditional Log Likelihood (CLL) of the training examples. In this work, we present a new…

- Tuyen N. Huynh, Raymond J. Mooney
- SDM
- 2011

Most of the existing weight-learning algorithms for Markov Logic Networks (MLNs) use batch training, which becomes computationally expensive and even infeasible for very large datasets since the training examples may not fit in main memory. To overcome this problem, previous work has used online learning algorithms to learn weights for MLNs. However, this…

- Raymond J. Mooney, William W. Cohen, +6 authors Maytal Saar-Tsechansky
- 2003


- Hung B. Bui, Tuyen N. Huynh, Rodrigo de Salvo Braz
- AAAI
- 2012

The presence of non-symmetric evidence has been a barrier for the application of lifted inference, since the evidence destroys the symmetry of the first-order probabilistic model. In the extreme case, if distinct soft evidence is obtained about each individual object in the domain then, often, all current exact lifted inference methods reduce to traditional…

- Hung Hai Bui, Tuyen N. Huynh, David A. Sontag
- UAI
- 2014

We analyze variational inference for highly symmetric graphical models such as those arising from first-order probabilistic models. We first show that for these graphical models, the tree-reweighted variational objective lends itself to a compact lifted formulation which can be solved much more efficiently than the standard TRW formulation for the ground…

- Pedro Domingos, Kristen Grauman, +7 authors John Tai Hung Wong
- 2009


- Tuyen N. Huynh, Raymond J. Mooney
- Statistical Relational Artificial Intelligence
- 2010

Most of the existing weight-learning algorithms for Markov Logic Networks (MLNs) use batch training, which becomes computationally expensive and even infeasible for very large datasets since the training examples may not fit in main memory. To overcome this problem, previous work has used online learning algorithms to learn weights for MLNs. However, this…