Latent-Variable Modeling of String Transductions with Finite-State Methods
A conditional log-linear model is presented for string-to-string transduction that uses overlapping features over latent alignment sequences and learns latent classes and latent string-pair regions from incomplete training data; the latent variables are shown to improve results dramatically, even on small training sets.
LatticeRnn: Recurrent Neural Networks Over Lattices
- Faisal Ladhak, Ankur Gandhe, Markus Dreyer, Lambert Mathias, A. Rastrow, Björn Hoffmeister
- Computer Science, INTERSPEECH
- 8 September 2016
This model generalizes recurrent neural networks to process weighted lattices, rather than sequences, as input, and it is shown that making decisions based on the full ASR output lattice makes SLU systems more robust to ASR errors.
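The central idea, propagating recurrent states over a lattice's topologically ordered nodes and pooling incoming-arc states by arc weight, can be sketched as a toy example. This is an illustrative simplification with made-up dimensions and weights, not the paper's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)
D = 4  # hidden size

# Toy lattice: nodes 0..3, arcs (src, dst, word_vector, weight).
# Two paths 0->1->3 and 0->2->3 share start and end, as in an ASR lattice.
arcs = [
    (0, 1, rng.normal(size=D), 0.7),
    (0, 2, rng.normal(size=D), 0.3),
    (1, 3, rng.normal(size=D), 1.0),
    (2, 3, rng.normal(size=D), 1.0),
]

W_h = rng.normal(size=(D, D)) * 0.1
W_x = rng.normal(size=(D, D)) * 0.1

# One hidden state per node; a node's state is the arc-weight-weighted
# average of recurrent updates propagated along its incoming arcs.
h = {0: np.zeros(D)}
for node in (1, 2, 3):  # topological order
    incoming = [(s, x, w) for (s, d, x, w) in arcs if d == node]
    total = sum(w for _, _, w in incoming)
    h[node] = sum(
        (w / total) * np.tanh(W_h @ h[s] + W_x @ x) for s, x, w in incoming
    )

print(h[3].shape)  # the final node's state summarizes all lattice paths
```

The key difference from a sequence RNN is the pooling step: where a chain has exactly one predecessor per step, a lattice node may have several, so incoming states must be combined, here by normalized arc weight.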
Zero-Shot Learning Across Heterogeneous Overlapping Domains
- Anjishnu Kumar, Pavankumar Reddy Muddireddy, Markus Dreyer, Björn Hoffmeister
- Computer Science, INTERSPEECH
- 20 August 2017
This work presents a zero-shot learning approach for text classification that predicts which natural language understanding domain can handle a given utterance; compared with generative baselines, the approach requires less storage and performs better on new domains.
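One common way to realize zero-shot domain classification is to embed utterances and domain descriptions in a shared space and score by similarity, so a new domain needs only a description, not retraining. The sketch below illustrates that idea with a stand-in bag-of-features encoder and hypothetical domain descriptions; it is not the paper's model:

```python
import numpy as np

D = 8  # embedding dimension

def embed(text):
    # Stand-in for a learned encoder: deterministically hash words
    # into a bag-of-features vector, then L2-normalize.
    v = np.zeros(D)
    for w in text.lower().split():
        v[sum(ord(c) for c in w) % D] += 1.0
    return v / (np.linalg.norm(v) + 1e-9)

# Hypothetical domain descriptions; adding a domain is just adding an entry.
domains = {
    "Weather": "forecast temperature rain weather",
    "Music": "play song artist album music",
}

def classify(utterance):
    u = embed(utterance)
    # Cosine similarity (both vectors are unit-normalized).
    return max(domains, key=lambda d: float(u @ embed(domains[d])))

print(classify("play a song by the artist"))  # -> Music
```

Because classification reduces to a similarity lookup against description embeddings, storage grows only with the number of domain descriptions, which matches the storage advantage the summary describes.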
Efficiently Summarizing Text and Graph Encodings of Multi-Document Clusters
- Ramakanth Pasunuru, Mengwen Liu, Mohit Bansal, Sujith Ravi, Markus Dreyer
- Computer Science, NAACL
- 1 June 2021
This paper presents an efficient graph-enhanced approach to multi-document summarization (MDS) with an encoder-decoder Transformer model, yielding an average improvement of 1.8 ROUGE points over previous work on the Multi-News dataset.
Transfer Learning for Neural Semantic Parsing
This paper proposes sequence-to-sequence models in a multi-task setup for semantic parsing, with a focus on transfer learning, and shows that the multi-task setup aids transfer from an auxiliary task with large labeled data to a target task with smaller labeled data.
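The transfer mechanism in such multi-task setups is typically parameter sharing: one encoder is shared across tasks while each task keeps its own decoder, so training on the large auxiliary task shapes the representations the small target task reuses. A minimal sketch of that structure, with a mean-of-embeddings encoder standing in for a real sequence encoder (all names and sizes are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)
V, D = 20, 6  # vocabulary size, hidden size

# One shared encoder, separate output layers per task.
shared_encoder = rng.normal(size=(V, D)) * 0.1
decoders = {
    "auxiliary": rng.normal(size=(D, V)) * 0.1,  # large labeled dataset
    "target": rng.normal(size=(D, V)) * 0.1,     # small labeled dataset
}

def encode(token_ids):
    # Mean of shared token embeddings stands in for an RNN/Transformer encoder.
    return shared_encoder[token_ids].mean(axis=0)

def decode_logits(task, token_ids):
    # Gradients from either task would update shared_encoder; only the
    # matching decoder is task-specific.
    return encode(token_ids) @ decoders[task]

logits = decode_logits("target", np.array([1, 4, 7]))
print(logits.shape)  # (20,) -- target-task scores via the shared encoding
```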
Comparing Reordering Constraints for SMT Using Efficient BLEU Oracle Computation
An empirical evaluation of popular reordering constraints is presented, covering local constraints, the IBM constraints, and the Inversion Transduction Grammar (ITG) constraints; reordering under the ITG constraints is shown to improve over the baseline by more than 7.5 BLEU points.
Final Report of the 2005 Language Engineering Workshop on Statistical Machine Translation by Parsing
Exploiting prosody for PCFGs with latent annotations
Novel methods are proposed for integrating prosody into syntax using generative models, and prosody is shown to improve a grammar in terms of accuracy as well as parsimonious use of parameters.
HyTER: Meaning-Equivalent Semantics for Translation Evaluation
An annotation tool is developed that enables the creation of representations compactly encoding an exponential number of correct translations for a sentence, and it is shown that the resulting metric provides better estimates of machine and human translation accuracy than alternative evaluation metrics.
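The "exponential number of translations from a compact encoding" effect is easy to see with a flat stand-in: if a reference is split into slots and each slot lists interchangeable phrasings, n slots with k alternatives each encode k**n full references. This is a simplified, hypothetical mini-format (HyTER's actual representations are recursive networks), shown only to illustrate the counting argument:

```python
from itertools import product

# Hypothetical slot-based reference: each slot lists interchangeable phrases.
# 3 slots with (3, 2, 2) alternatives encode 12 references using 7 phrases.
reference = [
    ["the president", "the US president", "Obama"],
    ["met", "held a meeting with"],
    ["the delegation", "the delegates"],
]

def num_encoded(ref):
    # Product of per-slot alternative counts: exponential in slot count.
    n = 1
    for slot in ref:
        n *= len(slot)
    return n

def expand(ref):
    # Enumerate the full references the compact encoding represents.
    for combo in product(*ref):
        yield " ".join(combo)

print(num_encoded(reference))  # -> 12
```

Scoring a hypothesis against the compact structure (rather than against one enumerated string) is what makes such a metric tractable despite the exponential reference set.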
Just ASK: Building an Architecture for Extensible Self-Service Spoken Language Understanding
- Anjishnu Kumar, Arpit Gupta, Julian Chan, S. Tucker, Björn Hoffmeister, Markus Dreyer
- Computer Science, arXiv
- 1 November 2017
The design of the machine learning architecture underlying the Alexa Skills Kit (ASK) is presented; as far as the authors are aware, ASK was the first Spoken Language Understanding Software Development Kit (SDK) for a virtual digital assistant.