Publications
Opinion Mining and Information Fusion: A survey
TLDR: In this paper we present a survey on Information Fusion applied to Opinion Mining.
Deep contextualized word representations for detecting sarcasm and irony
TLDR: We propose a deep learning model that uses character-level vector representations of words, based on ELMo.
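To make the character-level representation idea in the entry above concrete, here is a minimal PyTorch sketch that builds a word vector from its characters with a small BiLSTM. It is only an illustrative stand-in: the paper itself relies on ELMo's pretrained character encoder, and every name and dimension below is an assumption, not taken from the paper.

    import torch
    import torch.nn as nn

    class CharWordEncoder(nn.Module):
        """Builds a vector for each word from its characters (illustrative stand-in for ELMo's char encoder)."""
        def __init__(self, n_chars=256, char_dim=25, hidden=50):
            super().__init__()
            self.char_emb = nn.Embedding(n_chars, char_dim, padding_idx=0)
            self.lstm = nn.LSTM(char_dim, hidden, bidirectional=True, batch_first=True)

        def forward(self, char_ids):                 # (n_words, max_chars)
            _, (h, _) = self.lstm(self.char_emb(char_ids))
            # Concatenate the final forward and backward hidden states per word.
            return torch.cat([h[0], h[1]], dim=-1)   # (n_words, 2 * hidden)

    # Toy usage: encode two words, padding them to the same character length.
    words = ["ironic", "sure"]
    max_len = max(len(w) for w in words)
    char_ids = torch.tensor([[ord(c) for c in w] + [0] * (max_len - len(w)) for w in words])
    print(CharWordEncoder()(char_ids).shape)         # torch.Size([2, 100])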
Combining eye tracking, pupil dilation and EEG analysis for predicting web users click intention
TLDR: A physiologically based analysis for assessing web users' click intention by merging pupil dilation and electroencephalogram (EEG) responses.
Refining Raw Sentence Representations for Textual Entailment Recognition via Attention
TLDR: In this paper we present the model used by the team Rivercorners for the 2017 RepEval shared task.
Mining fine-grained opinions on closed captions of YouTube videos with an attention-RNN
TLDR: In this paper we target this phenomenon and introduce the first dataset created from closed captions of YouTube product review videos, as well as a new attention-RNN model for aspect extraction.
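As a rough picture of what an attention-RNN aspect extractor can look like, the sketch below encodes tokens with a BiLSTM, computes an attention-weighted sentence context, and predicts a per-token B/I/O tag. The architecture, tag set, and dimensions are illustrative assumptions rather than the model described in the paper.

    import torch
    import torch.nn as nn

    class AttentionRNNTagger(nn.Module):
        """BiLSTM encoder + additive self-attention + per-token B/I/O classifier (illustrative only)."""
        def __init__(self, vocab_size=10000, emb_dim=100, hidden=128, n_tags=3):
            super().__init__()
            self.emb = nn.Embedding(vocab_size, emb_dim)
            self.rnn = nn.LSTM(emb_dim, hidden, bidirectional=True, batch_first=True)
            self.attn = nn.Linear(2 * hidden, 1)
            self.out = nn.Linear(4 * hidden, n_tags)   # token state concatenated with attended context

        def forward(self, token_ids):                  # (batch, seq_len)
            h, _ = self.rnn(self.emb(token_ids))       # (batch, seq_len, 2*hidden)
            weights = torch.softmax(self.attn(h), dim=1)           # attention over positions
            context = (weights * h).sum(dim=1, keepdim=True)       # (batch, 1, 2*hidden)
            context = context.expand_as(h)
            return self.out(torch.cat([h, context], dim=-1))       # (batch, seq_len, n_tags) logits

    tagger = AttentionRNNTagger()
    logits = tagger(torch.randint(0, 10000, (2, 12)))
    print(logits.shape)   # torch.Size([2, 12, 3]) -> B/I/O scores per token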
IIIDYT at IEST 2018: Implicit Emotion Classification With Deep Contextualized Word Representations
TLDR: In this paper we describe our system designed for the WASSA 2018 Implicit Emotion Shared Task (IEST), which obtained 2nd place out of 30 teams with a test macro F1 score of 0.710.
Predicting Web User Click Intention Using Pupil Dilation and Electroencephalogram Analysis
TLDR: In this work, a new approach for analysing Web user behavior is introduced, consisting of a physiologically based click-intention assessment built on the evaluation of pupil dilation and electroencephalogram responses.
Content Aware Source Code Change Description Generation
TLDR: We propose to study the generation of descriptions from source code changes by integrating the messages included in code commits and the intra-code documentation inside the source in the form of docstrings.
Gating Mechanisms for Combining Character and Word-level Word Representations: An Empirical Study
TLDR: We provide strong empirical evidence that modeling characters improves the learned representations at the word and sentence levels, and that doing so is particularly useful when representing less frequent words.
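The gating idea studied in this paper can be written down compactly. The sketch below shows one common variant: a sigmoid gate, conditioned on the word-level vector, that interpolates elementwise between word-level and character-level embeddings. The conditioning choice and dimensions here are assumptions for illustration, not necessarily the configuration the paper finds best.

    import torch
    import torch.nn as nn

    class GatedWordCharEmbedding(nn.Module):
        """Interpolates word-level and character-level vectors with a learned elementwise gate."""
        def __init__(self, dim=100):
            super().__init__()
            self.gate = nn.Linear(dim, dim)

        def forward(self, word_vec, char_vec):       # both (batch, seq_len, dim)
            g = torch.sigmoid(self.gate(word_vec))   # gate values in [0, 1]
            return g * word_vec + (1.0 - g) * char_vec

    word_vec = torch.randn(2, 7, 100)   # e.g. pre-trained word embeddings
    char_vec = torch.randn(2, 7, 100)   # e.g. output of a character-level encoder
    mixed = GatedWordCharEmbedding()(word_vec, char_vec)
    print(mixed.shape)                  # torch.Size([2, 7, 100])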
IIIDYT at SemEval-2018 Task 3: Irony detection in English tweets
TLDR: We propose a representation learning approach that relies on a multi-layered bidirectional LSTM and pre-trained word embeddings for irony detection in English tweets, as part of SemEval 2018.
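A minimal sketch of this kind of classifier is shown below: an embedding layer (which would be initialized from pre-trained vectors in practice, but is random here), a multi-layer bidirectional LSTM, max-pooling over time, and a binary ironic / non-ironic output. Layer sizes and the pooling choice are guesses for illustration, not the submitted system's settings.

    import torch
    import torch.nn as nn

    class BiLSTMIronyClassifier(nn.Module):
        """Multi-layer BiLSTM over word embeddings, max-pooled over time, then a binary classifier."""
        def __init__(self, vocab_size=10000, emb_dim=100, hidden=256, n_layers=2, n_classes=2):
            super().__init__()
            # In practice the embedding matrix would be loaded from pre-trained word vectors.
            self.emb = nn.Embedding(vocab_size, emb_dim)
            self.rnn = nn.LSTM(emb_dim, hidden, num_layers=n_layers,
                               bidirectional=True, batch_first=True)
            self.clf = nn.Linear(2 * hidden, n_classes)

        def forward(self, token_ids):                # (batch, seq_len)
            h, _ = self.rnn(self.emb(token_ids))     # (batch, seq_len, 2*hidden)
            pooled, _ = h.max(dim=1)                 # max-pool over time
            return self.clf(pooled)                  # (batch, n_classes) logits

    model = BiLSTMIronyClassifier()
    print(model(torch.randint(0, 10000, (4, 20))).shape)   # torch.Size([4, 2])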