Predictions of short-term driving intention using recurrent neural network on sequential data

@article{Xing2018PredictionsOS,
  title={Predictions of short-term driving intention using recurrent neural network on sequential data},
  author={Zhou Xing and Fei Xiao},
  journal={ArXiv},
  year={2018},
  volume={abs/1804.00532}
}
Predicting drivers' intentions and their on-road behaviors is of great importance for the planning and decision-making processes of autonomous driving vehicles. In particular, relatively short-term driving intentions are the fundamental units that constitute more sophisticated driving goals and behaviors, such as overtaking a slow vehicle in front, exiting, or merging onto a highway. While it is not uncommon that, most of the time, a human driver can rationalize, in advance, various on-road…
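The abstract describes classifying short-term intentions from sequences of driving data with a recurrent network. As a rough, hedged sketch only (not the authors' code), a PyTorch intention classifier over such sequences might look as follows; the feature dimension, label set, and layer sizes are illustrative assumptions:

import torch
import torch.nn as nn

# Assumed label set, echoing the intentions named in the abstract.
INTENTIONS = ["keep_lane", "overtake", "exit_highway", "merge_highway"]

class IntentionRNN(nn.Module):
    def __init__(self, feat_dim=16, hidden=64, n_classes=len(INTENTIONS)):
        super().__init__()
        self.rnn = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):          # x: (batch, time, feat_dim)
        _, (h_n, _) = self.rnn(x)  # last hidden state summarizes the sequence
        return self.head(h_n[-1])  # logits over intention classes

model = IntentionRNN()
seq = torch.randn(8, 50, 16)       # 8 sequences, 50 time steps, 16 features
logits = model(seq)                # (8, 4) class scores
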
2 Citations

Prediction of driver lane-change behavior: modeling, feature selection and evaluation
TLDR
This study uses statistical methods that give a deeper understanding of each individual feature's contribution to driver lane-change (LC) behavior, and it generalizes better than prior research, where feature-selection methods tend to work only for one specific algorithm.

References

Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling
TLDR
Advanced recurrent units that implement a gating mechanism, such as the long short-term memory (LSTM) unit and the recently proposed gated recurrent unit (GRU), are evaluated on sequence modeling tasks, and the GRU is found to be comparable to the LSTM.
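As a hedged illustration of that comparison (not code from the paper): in PyTorch the two gated units expose the same interface, so they can be swapped directly, the GRU using three gate blocks where the LSTM uses four:

import torch
import torch.nn as nn

x = torch.randn(4, 30, 10)              # (batch, time, features), sizes arbitrary
lstm = nn.LSTM(10, 32, batch_first=True)
gru = nn.GRU(10, 32, batch_first=True)

out_lstm, _ = lstm(x)                   # (4, 30, 32)
out_gru, _ = gru(x)                     # (4, 30, 32), same shape
print(sum(p.numel() for p in lstm.parameters()))  # 4 gate blocks
print(sum(p.numel() for p in gru.parameters()))   # 3 gate blocks, ~25% fewer
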
Sequence to Sequence Learning with Neural Networks
TLDR
This paper presents a general end-to-end approach to sequence learning that makes minimal assumptions about sequence structure. It finds that reversing the order of the words in all source sentences markedly improved the LSTM's performance, because doing so introduces many short-term dependencies between the source and target sentences, which makes the optimization problem easier.
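A minimal sketch of that source-reversal trick, assuming a toy PyTorch encoder and made-up token IDs:

import torch
import torch.nn as nn

src = torch.tensor([[5, 9, 2, 7]])         # one tokenized source sentence
src_reversed = torch.flip(src, dims=[1])   # [[7, 2, 9, 5]]

embed = nn.Embedding(20, 8)                # vocab 20, embedding dim 8 (arbitrary)
encoder = nn.LSTM(8, 16, batch_first=True)
_, (h, c) = encoder(embed(src_reversed))   # (h, c) would seed the decoder; after
                                           # reversal the first target word is
                                           # close to the first source word
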
On the Properties of Neural Machine Translation: Encoder–Decoder Approaches
TLDR
It is shown that the neural machine translation performs relatively well on short sentences without unknown words, but its performance degrades rapidly as the length of the sentence and the number of unknown words increase.
Generating Sequences With Recurrent Neural Networks
This paper shows how Long Short-Term Memory recurrent neural networks can be used to generate complex sequences with long-range structure, simply by predicting one data point at a time. The approach is demonstrated for text (where the data are discrete) and online handwriting (where the data are real-valued).
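A minimal sketch of that one-data-point-at-a-time generation, assuming a toy PyTorch model and an arbitrary discrete vocabulary (not the paper's text or handwriting setup):

import torch
import torch.nn as nn

vocab, hidden = 50, 32
embed = nn.Embedding(vocab, hidden)
rnn = nn.LSTM(hidden, hidden, batch_first=True)
head = nn.Linear(hidden, vocab)

token = torch.tensor([[0]])                 # assumed start-symbol ID
state = None                                # hidden state carried across steps
generated = []
for _ in range(20):
    out, state = rnn(embed(token), state)
    probs = torch.softmax(head(out[:, -1]), dim=-1)
    token = torch.multinomial(probs, 1)     # sample the next point from the model
    generated.append(token.item())
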
Modeling of the Latent Embedding of Music using Deep Neural Network
TLDR
A deep convolutional neural network is proposed and evaluated to imitate how the human brain processes hierarchical structures in auditory signals, such as music and speech, at various timescales, in order to discover latent factor models of music based upon acoustic hyper-images extracted from the raw audio waveforms.
ImageNet classification with deep convolutional neural networks
TLDR
A large, deep convolutional neural network was trained to classify the 1.2 million high-resolution images in the ImageNet LSVRC-2010 contest into 1000 different classes; it employed a recently developed regularization method called "dropout" that proved to be very effective.
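As an illustrative sketch of the dropout regularization mentioned above (layer sizes are arbitrary, not AlexNet's):

import torch
import torch.nn as nn

clf = nn.Sequential(
    nn.Linear(256, 128),
    nn.ReLU(),
    nn.Dropout(p=0.5),   # randomly zeros activations, but only in train() mode
    nn.Linear(128, 10),
)
clf.train()
logits = clf(torch.randn(4, 256))  # dropout active during training
clf.eval()                         # dropout becomes the identity at inference
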
Going deeper with convolutions
Christian Szegedy, Wei Liu, Yangqing Jia, Pierre Sermanet, Scott Reed, Dragomir Anguelov, Dumitru Erhan, Vincent Vanhoucke, and Andrew Rabinovich. CoRR, abs/1409.4842