An introduction to Deep Learning in Natural Language Processing: Models, techniques, and tools

@article{Lauriola2021AnIT,
  title={An introduction to Deep Learning in Natural Language Processing: Models, techniques, and tools},
  author={Ivano Lauriola and Alberto Lavelli and Fabio Aiolli},
  journal={Neurocomputing},
  year={2021}
}

Natural Language Processing with Improved Deep Learning Neural Networks

TLDR
A transition-based dependency parser is proposed that uses a feed-forward neural network as a classifier, taking features extracted by the syntactic analyzer as input, and further trains a recurrent neural network classifier optimized over sentences.
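
The summary above describes a feed-forward network that scores parser transitions from embedded parser-state features. A minimal NumPy sketch of that idea follows; the sizes, feature count, and transition set are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 18 parser-state features, 50-dim embeddings,
# 200 hidden units, 3 transitions (SHIFT, LEFT-ARC, RIGHT-ARC).
VOCAB, N_FEATS, EMB, HIDDEN, N_TRANS = 1000, 18, 50, 200, 3

E  = rng.normal(0, 0.01, (VOCAB, EMB))            # embedding table
W1 = rng.normal(0, 0.01, (N_FEATS * EMB, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(0, 0.01, (HIDDEN, N_TRANS))

def score_transitions(feature_ids):
    """Score parser transitions for one parser state."""
    x = E[feature_ids].reshape(-1)   # concatenate the feature embeddings
    h = np.tanh(x @ W1 + b1)         # single hidden layer
    logits = h @ W2
    p = np.exp(logits - logits.max())
    return p / p.sum()               # softmax over transitions

probs = score_transitions(rng.integers(0, VOCAB, N_FEATS))
```

At each parsing step, the highest-probability transition would be applied to the parser state and new features extracted, until the sentence is fully parsed.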

Computational linguistics and discourse complexology: Paradigms and research methods

The dramatic expansion of modern linguistic research and the enhanced accuracy of linguistic analysis have become a reality due to the ability of artificial neural networks not only to learn and adapt …

Analysis of Audio Signals Using Deep Learning Algorithms Applied to COVID Diagnostic Systems

TLDR
This article presents a literature review of the main techniques that have been used in recent years for analysis, feature extraction and classification from audio spectra or spectrograms, as well as examples of application in the context of the COVID-19 pandemic.

Aggression Detection in Social Media from Textual Data Using Deep Learning Models

TLDR
This work extracted eight novel emotional features and used a newly designed deep neural network with only three layers to identify aggressive statements, achieving an F1 score of 97% and surpassing state-of-the-art models by a significant margin.
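
Since the result above is reported as an F1 score, it is worth recalling how that metric is computed: the harmonic mean of precision and recall over the positive (aggressive) class. A small self-contained sketch:

```python
def f1_score(y_true, y_pred):
    """F1 for binary labels: harmonic mean of precision and recall."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    denom = precision + recall
    return 2 * precision * recall / denom if denom else 0.0

# Toy sanity check: 2 true positives, 1 false positive, 1 false negative
score = f1_score([1, 1, 1, 0, 0], [1, 1, 0, 0, 1])  # precision = recall = 2/3
```

Unlike raw accuracy, F1 is informative when aggressive posts are a small minority of the data.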

Forward Composition Propagation for Explainable Neural Reasoning

TLDR
An algorithm called Forward Composition Propagation (FCP) is proposed to explain the predictions of feed-forward neural networks operating on structured pattern-recognition problems, along with a case study concerning bias detection in a state-of-the-art problem in which the ground truth is known.

Densely Convolutional Neural Network for Transcription Factor Binding Sites Prediction Using DNA Sequence and Histone Modification

TLDR
This work proposes a novel densely convolutional model using DNA sequences and histone modifications for TFBSs prediction, which significantly outperforms several state-of-the-art prediction methods in terms of accuracy, ROC-AUC and PR-AUC.
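
Convolutional models over DNA sequence, like the one summarized above, typically take a one-hot encoding of the four nucleotides as input. A minimal sketch of that encoding step (the function name and channel order A, C, G, T are illustrative assumptions):

```python
import numpy as np

def one_hot_dna(seq):
    """Encode a DNA string as a (length, 4) one-hot matrix over A, C, G, T."""
    idx = {"A": 0, "C": 1, "G": 2, "T": 3}
    m = np.zeros((len(seq), 4))
    for i, base in enumerate(seq.upper()):
        if base in idx:              # unknown bases (e.g. N) stay all-zero
            m[i, idx[base]] = 1.0
    return m

x = one_hot_dna("ACGTN")             # 5 positions, 4 channels
```

The resulting matrix can be fed to 1-D convolutions whose filters act as learnable motif detectors; histone-modification signals would enter as additional input channels.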

Multi-Label Text Analysis with a Hybrid Deep Learning Model Based on CNN and LSTM

  • Halit Çetiner
  • Computer Science
    Adıyaman Üniversitesi Mühendislik Bilimleri Dergisi
  • 2022
TLDR
A hybrid model based on CNN and LSTM is proposed to automatically classify written social-media posts, both positive and negative, into defined target tags, and the reported performance results show that the proposed method can be applied to different multi-label text analysis problems.
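
The multi-label setting described above differs from ordinary classification in its output layer: each tag gets an independent sigmoid, trained with binary cross-entropy, so a post can carry several tags at once. A minimal NumPy sketch of that head (the logits here stand in for the output of an encoder such as the paper's CNN+LSTM; all values are illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def multilabel_bce(logits, targets):
    """Mean binary cross-entropy over independent tags."""
    p = sigmoid(logits)
    eps = 1e-12                      # numerical safety for log(0)
    return float(-np.mean(targets * np.log(p + eps)
                          + (1 - targets) * np.log(1 - p + eps)))

logits  = np.array([3.0, -2.0, 0.5])   # per-tag scores from the encoder
targets = np.array([1.0, 0.0, 1.0])    # ground truth: tags 0 and 2 apply
preds = (sigmoid(logits) > 0.5).astype(int)   # -> [1, 0, 1]
loss = multilabel_bce(logits, targets)
```

A softmax head would force the tags to compete for probability mass, which is wrong when labels are not mutually exclusive; the per-tag sigmoid avoids that.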

Recognition of Persian/Arabic Handwritten Words Using a Combination of Convolutional Neural Networks and Autoencoder (AECNN)

TLDR
A new subword fusion algorithm is proposed based on the similarity of the main subwords and signs, which proves far more capable than other methods known in the literature.

Face Recognition Based on Deep Learning and FPGA for Ethnicity Identification

TLDR
A new Deep Learning (DL) approach based on a Deep Convolutional Neural Network (DCNN) model is developed that reliably determines the ethnicity of people from their facial features; its performance on an FPGA is compared against that obtained using graphics processing units (GPUs).

Phish Responder: A Hybrid Machine Learning Approach to Detect Phishing and Spam Emails

TLDR
This research investigated the threat of phishing and spam and developed a detection solution to address this challenge; the solution was evaluated by comparison with other approaches and via an independent t-test, which demonstrated that the numerical-based technique is statistically significantly better than existing approaches.

References

(Showing 1–10 of 96 references)

Natural Language Processing Advancements By Deep Learning: A Survey

TLDR
This survey categorizes and addresses the different aspects and applications of NLP that have benefited from deep learning and describes how deep learning methods and models advance these areas.

A Survey of the Usages of Deep Learning for Natural Language Processing

TLDR
An introduction to the field and a quick overview of deep learning architectures and methods is provided and a discussion of the current state of the art is provided along with recommendations for future research in the field.

Exploring the Limits of Language Modeling

TLDR
This work explores recent advances in Recurrent Neural Networks for large-scale Language Modeling and extends current models to deal with two key challenges present in this task: corpora and vocabulary sizes, and the complex, long-term structure of language.

Survey of Neural Text Representation Models

TLDR
This survey systematizes and analyzes 50 neural models from the last decade, focusing on task-independent representation models, discusses their advantages and drawbacks, and identifies promising directions for future neural text representation models.

Transformers: State-of-the-Art Natural Language Processing

TLDR
Transformers is an open-source library that consists of carefully engineered state-of-the-art Transformer architectures under a unified API and a curated collection of pretrained models made by and available for the community.

GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding

TLDR
A benchmark of nine diverse NLU tasks, an auxiliary dataset for probing models for understanding of specific linguistic phenomena, and an online platform for evaluating and comparing models, which favors models that can represent linguistic knowledge in a way that facilitates sample-efficient learning and effective knowledge-transfer across tasks.

Sequence to Sequence Learning with Neural Networks

TLDR
This paper presents a general end-to-end approach to sequence learning that makes minimal assumptions on the sequence structure, and finds that reversing the order of the words in all source sentences improved the LSTM's performance markedly, because doing so introduced many short-term dependencies between the source and the target sentence, which made the optimization problem easier.

A Survey on Neural Network Language Models

TLDR
The structure of classic NNLMs is described first; then some major improvements are introduced and analyzed, and research directions for NNLMs are discussed.

A Convolutional Neural Network for Modelling Sentences

TLDR
A convolutional architecture dubbed the Dynamic Convolutional Neural Network (DCNN) is described that is adopted for the semantic modelling of sentences and induces a feature graph over the sentence that is capable of explicitly capturing short and long-range relations.
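
The DCNN summarized above is built around (dynamic) k-max pooling: rather than keeping one maximum per feature, it keeps the k largest activations along the sentence axis while preserving their relative order, so positional structure survives pooling. A minimal NumPy sketch of the fixed-k version (the dynamic variant additionally scales k with sentence length and network depth):

```python
import numpy as np

def k_max_pooling(x, k):
    """Keep the k largest values along the time axis, preserving their order.
    x: (time, features) feature map; returns a (k, features) array."""
    # per-column indices of the k largest entries, re-sorted by position
    idx = np.sort(np.argsort(x, axis=0)[-k:, :], axis=0)
    return np.take_along_axis(x, idx, axis=0)

x = np.array([[1.0, 9.0],
              [5.0, 2.0],
              [3.0, 7.0],
              [4.0, 0.0]])
pooled = k_max_pooling(x, 2)   # -> [[5., 9.], [4., 7.]]
```

Note that column 0 keeps 5 before 4 and column 1 keeps 9 before 7, matching their original order in the sentence; ordinary max pooling would collapse each column to a single value and discard that structure.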

Deep Contextualized Word Representations

TLDR
A new type of deep contextualized word representation is introduced that models both complex characteristics of word use and how these uses vary across linguistic contexts, allowing downstream models to mix different types of semi-supervision signals.
...