- Long short-term memory (LSTM) networks are a state-of-the-art technique for sequence learning. They are less commonly applied to…
- This paper introduces Grid Long Short-Term Memory, a network of LSTM cells arranged in a multidimensional grid that can be…
- Relation classification is an important semantic processing task in the field of natural language processing (NLP). State-of-the…
- In this paper we address the question of how to render sequence-level networks better at handling structured input. We propose a…
- Because of their superior ability to preserve sequence information over time, Long Short-Term Memory (LSTM) networks, a type of…
- Long Short Term Memory (LSTM) networks have been demonstrated to be particularly useful for learning sequences containing longer…
- Both Convolutional Neural Networks (CNNs) and Long Short-Term Memory (LSTM) have shown improvements over Deep Neural Networks…
- This work was sponsored in part by the U.S. Army Research Laboratory and the U.S. Army Research Office under contract/grant…
- Long Short-Term Memory (LSTM) is a specific recurrent neural network (RNN) architecture that was designed to model temporal…
- Abstract: A dichotomy of human memory into immediate memory and long-term memory (associative memory, habit) has been widely…