Detecting dynamics of action in text with a recurrent neural network

@article{Gruber2021DetectingDO,
  title={Detecting dynamics of action in text with a recurrent neural network},
  author={N. Gruber},
  journal={Neural Comput. Appl.},
  year={2021},
  volume={33},
  pages={15709-15718}
}
  • N. Gruber
  • Published 15 June 2021
  • Computer Science
  • Neural Comput. Appl.
According to dynamics of action (DoA) theory, action is an interplay of instigating and consummatory forces over time. The TAT/PSE, a psychological test instrument, is intended to measure these dynamics: participants are presented with different pictures and instructed to invent stories. These periodic tendencies should be visible in the stories, but this has not yet been demonstrated. I reanalyzed two datasets regarding category IS: They were coded by a human expert, a recurrent neural network (RNN… 
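The abstract describes coding stories with a recurrent neural network. As a minimal illustration of the kind of model involved (a plain Elman-style RNN with a sigmoid readout, not the author's actual architecture, which is not specified here), a single forward pass can be sketched in pure Python:

```python
import math

def rnn_classify(sequence, w_xh, w_hh, w_hy, hidden_size):
    """Run a minimal Elman RNN over a sequence of scalar inputs and
    return a sigmoid probability for a binary category (e.g. whether
    a sentence is coded as the target category).
    All weights are plain Python lists; this is an illustrative
    sketch, not the model from the paper."""
    h = [0.0] * hidden_size
    for x in sequence:
        # New hidden state: tanh of input contribution plus recurrent contribution.
        h = [math.tanh(w_xh[j] * x
                       + sum(w_hh[j][k] * h[k] for k in range(hidden_size)))
             for j in range(hidden_size)]
    # Linear readout of the final hidden state, squashed to a probability.
    logit = sum(w_hy[j] * h[j] for j in range(hidden_size))
    return 1.0 / (1.0 + math.exp(-logit))
```

In practice the inputs would be learned word embeddings rather than scalars, and the weights would be trained by backpropagation through time; the sketch only shows how the hidden state carries sequence context into the final classification.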
1 Citation
The Implicit Achievement Motive in the Writing Style.
  • N. Gruber
  • Linguistics
    Journal of psycholinguistic research
  • 2022
Linguistic theories and research indicate that unconscious processes should influence not only the content but also the way things are expressed. As the first is well researched and the second

References

SHOWING 1-10 OF 40 REFERENCES
Are GRU Cells More Specific and LSTM Cells More Sensitive in Motive Classification of Text?
TLDR
Results show that GRUs outperform LSTMs for overall motive coding, with higher specificity (true negative rate) and better learning of less prevalent content, while a closer look at a picture × category matrix reveals that LSTMs outperform GRUs only where deep context understanding is important.
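One reason the GRU/LSTM comparison above is of practical interest is model size: a GRU uses three gates where an LSTM uses four, so for the same hidden width the GRU layer has roughly three quarters of the parameters. A standard per-layer count (ignoring implementation variants such as cuDNN's extra GRU bias vector) can be computed as:

```python
def rnn_param_count(input_size, hidden_size, gates):
    """Parameter count for one gated RNN layer: each gate has an
    input-to-hidden matrix, a hidden-to-hidden matrix, and a bias
    vector. LSTM has 4 gates; GRU has 3. Illustrative figures only;
    the cited paper's exact model sizes are not given here."""
    per_gate = (hidden_size * input_size    # input-to-hidden weights
                + hidden_size * hidden_size  # hidden-to-hidden weights
                + hidden_size)               # bias
    return gates * per_gate

# Example: 100-dim embeddings, 128 hidden units.
lstm_params = rnn_param_count(100, 128, gates=4)  # 117248
gru_params = rnn_param_count(100, 128, gates=3)   # 87936
```

The GRU's smaller parameter budget is one plausible factor behind its reported advantage on less prevalent content, where fewer parameters can mean less overfitting.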
Why Self-Attention? A Targeted Evaluation of Neural Machine Translation Architectures
TLDR
The experimental results show that: 1) self-attentional networks and CNNs do not outperform RNNs in modeling subject-verb agreement over long distances; 2) self-attentional networks perform distinctly better than RNNs and CNNs on word sense disambiguation.
Understanding Convolutional Neural Networks for Text Classification
TLDR
An analysis of the inner workings of convolutional neural networks for processing text shows that filters may capture several different semantic classes of n-grams by using different activation patterns, and that global max-pooling induces behavior which separates important n-grams from the rest.
Relation Classification: CNN or RNN?
  • Dongxu Zhang, Dong Wang
  • Computer Science
    NLPCC/ICCPOL
  • 2016
TLDR
A model based on recurrent neural networks (RNN) is presented and the experimental results strongly indicate that even with a simple RNN structure, the model can deliver much better performance than CNN, particularly for long-span relations.
Comparative Study of CNN and RNN for Natural Language Processing
TLDR
This work is the first systematic comparison of CNN and RNN on a wide range of representative NLP tasks, aiming to give basic guidance for DNN selection.
An Empirical Evaluation of Generic Convolutional and Recurrent Networks for Sequence Modeling
TLDR
A systematic evaluation of generic convolutional and recurrent architectures for sequence modeling concludes that the common association between sequence modeling and recurrent networks should be reconsidered, and that convolutional networks should be regarded as a natural starting point for sequence modeling tasks.
Question Answering over Freebase via Attentive RNN with Similarity Matrix based CNN
TLDR
An attentive recurrent neural network with a similarity-matrix-based convolutional neural network (AR-SMCNN) model, which captures comprehensive hierarchical information by exploiting the advantages of both RNN and CNN, and a new heuristic extension method for entity detection that significantly decreases the effect of noise.
Recent Trends in Deep Learning Based Natural Language Processing [Review Article]
TLDR
This paper reviews significant deep learning related models and methods that have been employed for numerous NLP tasks and provides a walk-through of their evolution.
The projective expression of needs; the effect of different intensities of the hunger drive on thematic apperception.
TLDR
The first experiment in this series attempted to measure the effects of different intensities of the hunger drive on perception for which the objective determinants had been reduced to a minimum, and provided clues as to how perceptual material should be interpreted to diagnose the strength of a drive such as hunger.
...