Rapformer: Conditional Rap Lyrics Generation with Denoising Autoencoders
- Nikola I. Nikolov, Eric Malmi, Curtis G. Northcutt, Loreto Parisi
- Computer Science, INLG
- 14 December 2020
Experimental results show that Rapformer is capable of generating technically fluent verses that offer a good trade-off between content preservation and style transfer, and a Turing-test-like experiment reveals that the method fools human lyrics experts 25% of the time.
Character-level Chinese-English Translation through ASCII Encoding
This paper enables character-level NMT for Chinese by breaking down Chinese characters into linguistic units similar to those of Indo-European languages using the Wubi encoding scheme, which preserves the original shape and semantic information of the characters while also being reversible.
Character-Level Translation with Self-attention
- Yingqiang Gao, Nikola I. Nikolov, Yuhuang Hu, Richard H. R. Hahnloser
- Computer Science, ACL
- 30 April 2020
We explore the suitability of self-attention models for character-level neural machine translation. We test the standard transformer model, as well as a novel variant in which the encoder block…
Large-scale Hierarchical Alignment for Author Style Transfer
It is shown that pseudo-parallel sentences extracted from comparable corpora representative of two different author styles not only supplement existing parallel data, but can even lead to competitive performance on their own.
Data-driven Summarization of Scientific Articles
This work generates two novel multi-sentence summarization datasets from scientific articles and tests the suitability of a wide range of existing extractive and abstractive neural network-based summarization approaches, demonstrating that scientific papers are suitable for data-driven text summarization.
Large-Scale Hierarchical Alignment for Data-driven Text Rewriting
It is shown that pseudo-parallel sentences extracted with the proposed unsupervised method not only supplement existing parallel data, but can even lead to competitive performance on their own.
Abstractive Document Summarization without Parallel Data
This work develops an abstractive summarization system that relies only on large collections of example summaries and non-matching articles, consisting of an unsupervised sentence extractor that selects salient sentences to include in the final summary, as well as a sentence abstractor that is trained on pseudo-parallel and synthetic data.
Conditional Rap Lyrics Generation with Denoising Autoencoders
A method is developed for automatically synthesizing a rap verse from an input text written in another form, such as a summary of a news article, by reconstructing rap lyrics from content words.
Abstractive Document Summarization in High and Low Resource Settings
- Nikola I. Nikolov
- Computer Science
- 1 May 2020
Summary Refinement through Denoising
- Nikola I. Nikolov, Alessandro Calmanovici, Richard H. R. Hahnloser
- Computer Science, RANLP
- 25 July 2019
We propose a simple method for post-processing the outputs of a text summarization system in order to refine its overall quality. Our approach is to train text-to-text rewriting models to correct…