Shiyu Wu

Neural network based methods have made great progress on a variety of natural language processing tasks. However, modeling long texts, such as sentences and documents, remains a challenging task. In this paper, we propose a multi-timescale long short-term memory (MT-LSTM) neural network to model long texts. MT-LSTM partitions the hidden states of the …
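The abstract is cut off before the partitioning scheme is described, so the following is only a minimal sketch of the general multi-timescale idea: the LSTM hidden and cell states are split into groups that are refreshed at different rates, so slower groups can retain longer-range context. The class name `MultiTimescaleLSTM`, the `2**g` update periods, and the NumPy implementation are illustrative assumptions, not the authors' code.

```python
# Minimal sketch of a multi-timescale LSTM: hidden/cell units are split into
# groups, and group g is only updated every 2**g time steps; on other steps
# its values are carried forward unchanged. (Assumed schedule, for illustration.)
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MultiTimescaleLSTM:
    def __init__(self, input_size, hidden_size, num_groups=3, seed=0):
        rng = np.random.default_rng(seed)
        self.hidden_size = hidden_size
        # Indices of the hidden units belonging to each timescale group.
        self.groups = np.array_split(np.arange(hidden_size), num_groups)
        # One weight matrix for all four gates (input, forget, output, candidate).
        self.W = rng.normal(0.0, 0.1, (4 * hidden_size, input_size + hidden_size))
        self.b = np.zeros(4 * hidden_size)

    def step(self, x, h, c, t):
        z = self.W @ np.concatenate([x, h]) + self.b
        i, f, o, g = np.split(z, 4)
        i, f, o, g = sigmoid(i), sigmoid(f), sigmoid(o), np.tanh(g)
        c_new = f * c + i * g
        h_new = o * np.tanh(c_new)
        # Only groups whose period divides t are refreshed this step.
        h_out, c_out = h.copy(), c.copy()
        for k, idx in enumerate(self.groups):
            if t % (2 ** k) == 0:
                h_out[idx], c_out[idx] = h_new[idx], c_new[idx]
        return h_out, c_out

    def run(self, xs):
        h = np.zeros(self.hidden_size)
        c = np.zeros(self.hidden_size)
        for t, x in enumerate(xs):
            h, c = self.step(x, h, c, t)
        return h  # final text representation

# Example: encode a "document" of 50 random word vectors.
doc = np.random.default_rng(1).normal(size=(50, 16))
rep = MultiTimescaleLSTM(input_size=16, hidden_size=12, num_groups=3).run(doc)
```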
Recently, neural network based sentence modeling methods have achieved great progress. Among these methods, recursive neural networks (RecNNs) can effectively model the combination of the words in a sentence. However, RecNNs need a given external topological structure, such as a syntactic tree. In this paper, we propose a gated recursive neural network …
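To make the stated limitation concrete, here is a minimal sketch of a plain recursive neural network that composes word vectors bottom-up along a binary parse tree supplied by the caller, which is exactly the external topological structure the abstract refers to. The names, the single shared composition matrix, and the toy tree are illustrative assumptions.

```python
# A plain RecNN composes child vectors into parent vectors along a given tree;
# the tree itself must be provided externally (e.g. from a syntactic parser).
import numpy as np

rng = np.random.default_rng(0)
DIM = 8
W = rng.normal(0.0, 0.1, (DIM, 2 * DIM))  # shared composition weights
b = np.zeros(DIM)

def compose(left, right):
    # Combine two child vectors into one parent vector.
    return np.tanh(W @ np.concatenate([left, right]) + b)

def encode(node, word_vecs):
    # A node is either a word index (leaf) or a (left, right) pair (internal node).
    if isinstance(node, int):
        return word_vecs[node]
    left, right = node
    return compose(encode(left, word_vecs), encode(right, word_vecs))

# "the cat sat" with an externally supplied parse ((the cat) sat):
word_vecs = rng.normal(size=(3, DIM))
sentence_vec = encode(((0, 1), 2), word_vecs)
```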
Using a cross-modal priming task, the present study explores whether Chinese-English bilinguals process goal-related information during auditory comprehension of English narratives like native speakers do. Results indicate that English native speakers adopted both suppression and enhancement mechanisms to modulate the activation of goals and keep track of …