
Bidirectional recurrent neural networks

Bidirectional Recurrent Neural Networks (BRNNs) were invented in 1997 by Schuster and Paliwal. BRNNs were introduced to increase the amount of input…
Wikipedia
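To make the idea in the excerpt concrete, here is a minimal NumPy sketch of a bidirectional RNN forward pass: one recurrence reads the input left to right, a second reads it right to left, and their hidden states are concatenated at every time step so each output position sees both past and future context. The tanh cell and all weight names are illustrative assumptions, not the exact formulation of Schuster and Paliwal.

# Minimal NumPy sketch of a bidirectional RNN forward pass (illustrative only;
# the tanh cell and weight names are assumptions, not the original paper's setup).
import numpy as np

def brnn_forward(x, W_f, U_f, W_b, U_b):
    """x: (T, d) input sequence; returns (T, 2h) concatenated forward/backward states."""
    T, d = x.shape
    h = U_f.shape[0]
    h_fwd = np.zeros((T, h))
    h_bwd = np.zeros((T, h))
    prev = np.zeros(h)
    for t in range(T):                      # forward recurrence: left to right
        prev = np.tanh(x[t] @ W_f + prev @ U_f)
        h_fwd[t] = prev
    prev = np.zeros(h)
    for t in reversed(range(T)):            # backward recurrence: right to left
        prev = np.tanh(x[t] @ W_b + prev @ U_b)
        h_bwd[t] = prev
    # each time step now carries both past and future context
    return np.concatenate([h_fwd, h_bwd], axis=1)

rng = np.random.default_rng(0)
T, d, h = 5, 3, 4
out = brnn_forward(rng.normal(size=(T, d)),
                   rng.normal(size=(d, h)), rng.normal(size=(h, h)),
                   rng.normal(size=(d, h)), rng.normal(size=(h, h)))
print(out.shape)  # (5, 8)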

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
Review
2018
CHAPTER 1: INTRODUCTION
1.1 Research Problem and Motivation
1.2 Contributions
CHAPTER 2: LITERATURE SURVEY
2.1 Traditional Methods for Natural Language Processing
2.1.1 N-gram Models
2.1.2 Structured Language Models
2.1.3 Word Vector Representations
2.2 Neural Networks: Basics and Definitions
2.2.1 Neural Network Language Models (NNLMs)
2.2.2 Feedforward Neural Network Based Language Models (FFNNLMs)
2.3 Deep Learning Background
2.4 Deep Learning for Natural Language Processing
2.4.1 Window-Based Neural Networks
2.5 Convolutional Neural Networks (CNNs)
2.5.1 Pooling Layer
2.6 Convolutional Neural Networks for Natural Language Processing (CNNs-NLP)
2.7 GoogLeNet: Inception Convolutional Neural Networks
2.8 Recurrent Neural Networks (RNNs)
2.8.1 Recurrent Neural Network Based Language Models (RNNLMs)
2.8.2 The Problem of Long-Term Dependencies
2.8.3 Vanishing and Exploding Gradients
2.9 Long Short-Term Memory (LSTM)
2.10 Bidirectional Recurrent Neural Networks (BRNNs)
2.11 Gated Recurrent Unit (GRU)
2.12 Vector Representations of Words
2.13 Combination of Convolutional and Recurrent Neural Networks (CNNs-RNNs)
CHAPTER 3: RESEARCH PLAN
3.1 Deep Neural Network Language Model for Text Classification
3.2 The Embedding Layer
2018
Describing videos in human language is of vital importance in many applications, such as managing massive videos online and…
2017
We propose a neural network model for coordination boundary detection. Our method relies on the two common properties… 
Review
2017
Recently, the area of image captioning has received a lot of attention from researchers in academia. Image caption generation…
2017
Learning algorithms for natural language processing (NLP) tasks traditionally rely on manually defined relevant contextual… 
2016
Vanilla attention-based neural machine translation has achieved promising performance because of its capability to leverage…
2016
Convolutional and bidirectional recurrent neural networks have achieved considerable performance gains as acoustic models in… 
2015
Deeply stacked Bidirectional Recurrent Neural Networks (BiRNNs) are able to capture complex, short- and long-term context…
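As a rough illustration of what "deeply stacked" means here, the sketch below builds a multi-layer bidirectional encoder, assuming PyTorch as the framework; the layer count and sizes are arbitrary and not taken from the cited paper. Each bidirectional layer feeds its concatenated forward and backward outputs to the next layer, which is what lets upper layers aggregate longer-range context.

# Hypothetical sketch of a deeply-stacked bidirectional recurrent encoder
# (assumes PyTorch is available; all sizes are illustrative, not from the cited work).
import torch
import torch.nn as nn

frames = torch.randn(8, 50, 40)    # (batch, time, features), e.g. acoustic frames
encoder = nn.LSTM(
    input_size=40,
    hidden_size=128,
    num_layers=3,                   # three stacked layers; upper layers see wider context
    bidirectional=True,             # each layer reads the sequence in both directions
    batch_first=True,
)
outputs, _ = encoder(frames)        # (8, 50, 256): forward and backward states concatenated
print(outputs.shape)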
2003
Predicting the secondary structure of a protein is a central topic in bioinformatics. A reliable predictor is needed by threading…
2003
In this paper, the problem of text-to-phoneme mapping of isolated words for the English language is studied. Multilayer…