BERT in Plutarch's Shadows

Ivan P. Yamshchikov, Alexey N. Tikhonov, Yorgos Pantis, Charlotte Schubert, Jürg Jost
The extensive surviving corpus of the ancient scholar Plutarch of Chaeronea (ca. 45-120 CE) also contains several texts which, according to current scholarly opinion, did not originate with him and are therefore attributed to an anonymous author known as Pseudo-Plutarch. These include, in particular, the work Placita Philosophorum (Quotations and Opinions of the Ancient Philosophers), which is extremely important for the history of ancient philosophy. Little is known about the identity of that…




Aëtiana: The Method and Intellectual Context of a Doxographer, Volume III, Studies in the Doxographical Traditions of Ancient Philosophy

Ancient doxography, particularly as distilled in the work on problems of physics by Aetius, is a vital source for our knowledge of early Greek philosophy up to the first century BCE. But its purpose

Latin BERT: A Contextual Language Model for Classical Philology

It is shown that Latin BERT achieves a new state of the art for part-of-speech tagging on all three Universal Dependency datasets for Latin and can be used for predicting missing text (including critical emendations) and for semantically-informed search by querying contextual nearest neighbors.

A survey of modern authorship attribution methods

A survey of recent advances of the automated approaches to attributing authorship is presented, examining their characteristics for both text representation and text classification.

Quantitative patterns of stylistic influence in the evolution of literature

This study conducts the first large-scale temporal stylometric study of literature by using the vast holdings in the Project Gutenberg Digital Library corpus and gives quantitative support to the notion of a literary “style of a time” with a strong trend toward increasingly contemporaneous stylistic influence.

Restoring ancient text using deep learning: a case study on Greek epigraphy

Pythia is presented, the first ancient text restoration model that recovers missing characters from a damaged text input using deep neural networks, and sets the state of the art in ancient text restoration.

BERT is Not an Interlingua and the Bias of Tokenization

Canonical Correlation Analysis of the internal representations of a pre-trained, multilingual BERT model reveals that the model partitions representations for each language rather than using a common, shared, interlingual space.
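The comparison technique named in this entry can be illustrated with a toy sketch: canonical correlations between two sets of representations are the singular values of the whitened cross-covariance matrix, with values near 1 indicating strongly aligned subspaces. The data below is synthetic stand-in material, not actual BERT activations.

```python
import numpy as np

def canonical_correlations(X, Y, reg=1e-6):
    """Canonical correlations between two representation matrices (rows = examples)."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    # Regularized covariance and cross-covariance estimates.
    Sxx = X.T @ X / n + reg * np.eye(X.shape[1])
    Syy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Sxy = X.T @ Y / n

    def inv_sqrt(S):
        # Symmetric inverse square root via eigendecomposition.
        vals, vecs = np.linalg.eigh(S)
        return vecs @ np.diag(vals ** -0.5) @ vecs.T

    # Singular values of the whitened cross-covariance are the canonical correlations.
    M = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    return np.linalg.svd(M, compute_uv=False)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))
Y = X @ rng.normal(size=(8, 8)) + 0.01 * rng.normal(size=(200, 8))  # near-linear image of X
corrs = canonical_correlations(X, Y)
```

Because Y here is an almost exact linear transform of X, all canonical correlations come out close to 1; representations occupying disjoint subspaces, as the paper reports for different languages, would instead yield low correlations.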

Aëtiana: The Method and Intellectual Context of a Doxographer, Volume I, The Sources


BertAA: BERT fine-tuning for Authorship Attribution

BertAA is introduced, a fine-tuning of a pre-trained BERT language model with an additional dense layer and a softmax activation to perform authorship classification, reaching competitive performance on the Enron Email, Blog Authorship, and IMDb datasets.
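The architecture described in this entry (a dense layer plus softmax on top of a pre-trained encoder) can be sketched in a few lines. This is a minimal NumPy illustration of the classification head only, with the BERT encoder stubbed out by random pooled-output features; the hidden size of 768 matches BERT-base, but everything else is an assumption for illustration.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class AuthorshipHead:
    """Dense layer + softmax over a BERT [CLS]/pooled embedding (sketch only)."""
    def __init__(self, hidden_size, num_authors, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0, 0.02, (hidden_size, num_authors))
        self.b = np.zeros(num_authors)

    def predict_proba(self, cls_embeddings):
        # cls_embeddings: (batch, hidden_size), e.g. BERT pooled output.
        return softmax(cls_embeddings @ self.W + self.b)

# Stand-in for BERT pooled outputs for a batch of 4 documents.
feats = np.random.default_rng(1).normal(size=(4, 768))
head = AuthorshipHead(hidden_size=768, num_authors=10)
probs = head.predict_proba(feats)  # (4, 10) author probabilities
```

In the actual fine-tuning setting, the encoder and this head are trained jointly with cross-entropy loss over author labels.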

Shakespearizing Modern Language Using Copy-Enriched Sequence to Sequence Models

This paper explores automated methods to transform text from modern English to Shakespearean English using an end-to-end trainable neural model with a pointer mechanism to enable copying and pre-trained word embeddings.

Neural Machine Translation of Rare Words with Subword Units

This paper introduces a simpler and more effective approach, making the NMT model capable of open-vocabulary translation by encoding rare and unknown words as sequences of subword units, and empirically shows that subword models improve over a back-off dictionary baseline for the WMT 15 translation tasks English-German and English-Russian by 1.3 BLEU.