
- Thang Luong, Hieu Pham, Christopher D. Manning
- EMNLP
- 2015

An attentional mechanism has lately been used to improve neural machine translation (NMT) by selectively focusing on parts of the source sentence during translation. However, there has been little work exploring useful architectures for attention-based NMT. This paper examines two simple and effective classes of attentional mechanism: a global approach…
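
The "global" attentional approach mentioned above attends over all source positions at every decoding step. A minimal sketch, assuming NumPy arrays and the dot-product scoring variant (function and variable names here are illustrative, not taken from the paper):

```python
import numpy as np

def global_attention(decoder_state, encoder_states):
    """Global (dot-product) attention: score every source position,
    softmax the scores, and return the weighted context vector."""
    # scores[i] = decoder_state . encoder_states[i]
    scores = encoder_states @ decoder_state      # shape: (src_len,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                     # softmax over source positions
    context = weights @ encoder_states           # shape: (hidden,)
    return context, weights

# toy example: 3 source positions, hidden size 2
enc = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
dec = np.array([1.0, 0.0])
ctx, w = global_attention(dec, enc)
```

The local approach instead restricts the softmax to a window of source positions around a predicted alignment point, trading coverage for cost.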

- Thang Luong, Hieu Pham, Christopher D. Manning
- VS@HLT-NAACL
- 2015

Recent work in learning bilingual representations tends to tailor towards achieving good performance on bilingual tasks, most often the crosslingual document classification (CLDC) evaluation, but to the detriment of preserving the clustering structure of word representations monolingually. In this work, we propose a joint model to learn word representations…

- Irwan Bello, Hieu Pham, Quoc V. Le, Mohammad Norouzi, Samy Bengio
- ArXiv
- 2016

This paper presents a framework to tackle combinatorial optimization problems using neural networks and reinforcement learning. We focus on the traveling salesman problem (TSP) and train a recurrent neural network that, given a set of city coordinates, predicts a distribution over different city permutations. Using negative tour length as the reward signal,…
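
The reward signal described above is simple to compute: the negative total length of the predicted tour, so that shorter tours earn higher reward. A minimal sketch, assuming Euclidean city coordinates (all names are illustrative):

```python
import math

def tour_length(coords, perm):
    """Total length of the closed tour visiting cities in order `perm`."""
    total = 0.0
    n = len(perm)
    for i in range(n):
        x1, y1 = coords[perm[i]]
        x2, y2 = coords[perm[(i + 1) % n]]  # wrap around to close the tour
        total += math.hypot(x2 - x1, y2 - y1)
    return total

def reward(coords, perm):
    """Negative tour length: shorter tours get higher reward."""
    return -tour_length(coords, perm)

# corners of the unit square: the perimeter tour [0, 1, 2, 3]
# has length 4.0, so its reward is -4.0
coords = [(0, 0), (1, 0), (1, 1), (0, 1)]
```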

- Hieu Pham, Thang Luong, Christopher D. Manning
- VS@HLT-NAACL
- 2015

We propose a novel approach to learning distributed representations of variable-length text sequences in multiple languages simultaneously. Unlike previous work, which often derives representations of multi-word sequences as weighted sums of individual word vectors, our model learns distributed representations for phrases and sentences as a whole. Our work is…

- Hieu Pham, Zihang Dai, Lei Li
- 2015

Motivation and contribution. Recurrent neural networks (RNNs) with long short-term memory (LSTM) have recently been proposed to model sequences without prior domain knowledge [3, 6]. In these works, the authors empirically observed that RNN-LSTMs trained with vanilla optimization algorithms, such as stochastic gradient descent (SGD) with a simple learning rate…

- Azalia Mirhoseini, Hieu Pham, +7 authors Jeff Dean
- ICML
- 2017

The past few years have witnessed a growth in size and computational requirements for training and inference with neural networks. Currently, a common approach to address these requirements is to use a heterogeneous distributed environment with a mixture of hardware devices such as CPUs and GPUs. Importantly, the decision of placing parts of the neural…

- Hieu Pham
- 2016

Single-source shortest paths. In the single-source shortest paths problem (SSSP), we are given a graph G = (V,E) and a source node s ∈ V , and we must compute d(s, v) for all v ∈ V . If the graph is unweighted, we can solve this in O(m + n) time by breadth-first search (BFS). If the graph has nonnegative integer weights, we can use Dijkstra’s algorithm with…
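
The unweighted case mentioned above can be sketched directly: BFS first reaches each node at exactly its shortest distance from s, giving the O(m + n) bound. A minimal sketch, assuming the graph is given as an adjacency-list dict (names are illustrative):

```python
from collections import deque

def bfs_sssp(adj, s):
    """Single-source shortest paths in an unweighted graph in O(m + n)
    time via breadth-first search. `adj` maps node -> list of neighbours."""
    dist = {s: 0}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:        # first visit = shortest distance
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

# path graph 0-1-2-3 plus a shortcut edge 0-3
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
```

For nonnegative integer weights, Dijkstra's algorithm replaces the FIFO queue with a priority queue keyed on tentative distance.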

- Hieu Pham, Tam Bui, Hiroshi Hasegawa
- 2014

This paper describes an evolutionary strategy called PSOGA-NN, which uses a Neural Network (NN) for self-adaptive control of a hybrid of Particle Swarm Optimization and an Adaptive Plan system with Genetic Algorithm (PSO-APGA) to solve large-scale problems and constrained real-parameter optimization. This approach combines the search ability of all optimization…

- Tam Bui, Hieu Pham, Hiroshi Hasegawa
- AsiaSim
- 2012