Learning to Capitalize with Character-Level Recurrent Neural Networks: An Empirical Study

@inproceedings{Susanto2016LearningTC,
  title={Learning to Capitalize with Character-Level Recurrent Neural Networks: An Empirical Study},
  author={Raymond Hendy Susanto and Hai Leong Chieu and Wei Lu},
  booktitle={EMNLP},
  year={2016}
}
In this paper, we investigate case restoration for text without case information. Previous work on this task operates at the word level. We propose an approach using character-level recurrent neural networks (RNNs), which performs competitively with language modeling and conditional random field (CRF) approaches. We further provide quantitative and qualitative analyses of how character-level RNNs help improve truecasing.
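To make the task concrete, below is a minimal PyTorch sketch of one way a character-level recurrent truecaser can be framed: each lowercased input character is classified as "uppercase" or "keep lowercase", and the predicted labels are applied back to the characters. This is an illustrative sketch under stated assumptions, not the authors' implementation; the CharTruecaser class, the layer sizes, and the bidirectional LSTM are hypothetical choices.

    # Hypothetical character-level truecasing sketch (not the paper's
    # exact architecture). Each lowercased character gets a binary
    # case label: 0 = keep lowercase, 1 = uppercase.
    import torch
    import torch.nn as nn

    class CharTruecaser(nn.Module):
        def __init__(self, vocab_size, embed_dim=32, hidden_dim=64):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            # Bidirectional LSTM over the character sequence, so each
            # decision sees both left and right context.
            self.lstm = nn.LSTM(embed_dim, hidden_dim,
                                batch_first=True, bidirectional=True)
            # Per-character binary decision from both LSTM directions.
            self.out = nn.Linear(2 * hidden_dim, 2)

        def forward(self, char_ids):           # char_ids: (batch, seq_len)
            h, _ = self.lstm(self.embed(char_ids))
            return self.out(h)                 # logits: (batch, seq_len, 2)

    # Toy usage: map characters to ids, predict one case label per
    # character (untrained here, so the labels are random).
    chars = "abcdefghijklmnopqrstuvwxyz ."
    char2id = {c: i for i, c in enumerate(chars)}
    model = CharTruecaser(vocab_size=len(chars))
    text = "hello world."
    x = torch.tensor([[char2id[c] for c in text]])
    labels = model(x).argmax(-1)[0].tolist()
    restored = "".join(c.upper() if l == 1 else c
                       for c, l in zip(text, labels))

In practice such a model would be trained with per-character cross-entropy against correctly cased text whose input side has been lowercased.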
