Corpus ID: 231792203

Entropy of Sounds: Sonnets to Battle Rap

@inproceedings{Ackerman2020EntropyOS,
  title={Entropy of Sounds: Sonnets to Battle Rap},
  author={Jordan Ackerman},
  booktitle={CogSci},
  year={2020}
}
Poetry and lyrics across cultures, from Sonnets to Rap, demonstrate an obvious human cognitive capacity for the perception and production of various multi-syllable sound patterns. Here we use entropy to measure discrete serialized representations of phones and to explore the complexity of these sound structures across genres of creative language arts. The present exploratory analysis has two main objectives. First, our aim is to broaden the scope of cognitive processes and data that are… 
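The core measurement the abstract describes, entropy over discrete serialized phone representations, can be sketched as plain Shannon entropy over a symbol sequence. This is a minimal illustration, not the paper's exact pipeline; the ARPAbet-style phone sequence below is a made-up example.

```python
from collections import Counter
from math import log2

def shannon_entropy(symbols):
    """Shannon entropy (bits per symbol) of a discrete sequence,
    estimated from relative frequencies."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Hypothetical ARPAbet-style phones for "the cat sat" (illustrative only).
phones = ["DH", "AH", "K", "AE", "T", "S", "AE", "T"]
h = shannon_entropy(phones)
```

A more varied phone inventory pushes the estimate toward its maximum of log2(number of distinct phones); heavy repetition, as in a tight rhyme scheme, pulls it down.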


References

Showing 1-10 of 20 references

Vowel transitions in the sonnets of Shakespeare: an information theoretic analysis

This paper argues for the importance of articulatory phonology in the study of poetic form and style, showing how a shift of focus away from symbol-probability-centric analyses and towards…

Information theoretic approaches to phonological structure: the case of Finnish vowel harmony

A natural implementation of autosegmental phonology within an information theoretic perspective is explored, and it is found that it is empirically inadequate; that is, it performs more poorly than a simple bigram model.

Unsupervised Rhyme Scheme Identification in Hip Hop Lyrics Using Hidden Markov Models

This work addresses a woefully under-explored language genre, lyrics in music, introducing a novel hidden Markov model based method for completely unsupervised identification of rhyme schemes in hip hop lyrics; it is the first such effort.

Information entropy of humpback whale songs.

This analysis demonstrates that there is a strong structural constraint in the generation of the songs, and the structural constraints exhibit periodicities with periods of 6-8 and 180-400 units, implying that no empirical Markov model is capable of representing the songs' structure.

Universal Entropy of Word Ordering Across Linguistic Families

A relative entropy measure is computed to quantify the degree of ordering in word sequences from languages belonging to several linguistic families. The results indicate that, despite the differences in the structure and vocabulary of the languages analyzed, the impact of word ordering on the structure of language is a statistical linguistic universal.
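The intuition behind such an ordering measure is the entropy gained by destroying word order: compare a sequence's conditional (bigram) entropy against that of random shuffles of the same symbols. The sketch below illustrates that idea only loosely; it is not the paper's estimator, and the `trials` averaging is an assumption for stability.

```python
import random
from collections import Counter
from math import log2

def bigram_entropy(symbols):
    """Conditional entropy H(X_t | X_{t-1}) in bits from bigram counts."""
    bigrams = Counter(zip(symbols, symbols[1:]))
    unigrams = Counter(symbols[:-1])
    n = sum(bigrams.values())
    return -sum((c / n) * log2(c / unigrams[a]) for (a, _), c in bigrams.items())

def ordering_information(symbols, trials=20, seed=0):
    """Rough ordering measure: average bigram entropy of shuffled copies
    minus that of the original sequence. Larger values mean word order
    carries more structure."""
    rng = random.Random(seed)
    base = bigram_entropy(symbols)
    total = 0.0
    for _ in range(trials):
        shuffled = list(symbols)
        rng.shuffle(shuffled)
        total += bigram_entropy(shuffled)
    return total / trials - base
```

A strictly alternating sequence has zero bigram entropy, so shuffling can only raise it, giving a positive ordering score.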

Entropy and Long-Range Correlations in Literary English

We investigated long-range correlations in two literary texts, "Moby Dick" by H. Melville and Grimm's tales. The analysis is based on the calculation of entropy-like quantities such as the mutual information.

Measuring Information in Jazz Improvisation

Scott J. Simon, School of Library and Information Science, University of South Florida.

A convergent gambling estimate of the entropy of English

In his original paper on the subject, Shannon found upper and lower bounds for the entropy of printed English based on the number of trials required for a subject to guess subsequent symbols in a given text; this work develops a gambling-based estimate that converges via the Shannon-McMillan-Breiman theorem.

Prediction and entropy of printed English

A new method of estimating the entropy and redundancy of a language is described. This method exploits the knowledge of the language statistics possessed by those who speak the language, and depends on experimental results in the prediction of the next letter when the preceding text is known.
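Shannon's guessing experiment turns human predictions into entropy bounds: from the frequency with which subjects guess the next letter correctly on the 1st, 2nd, ... try, he derived an upper bound (the entropy of the guess-rank distribution) and a lower bound. The sketch below follows those published formulas; the `guess_counts` input is a hypothetical tally for illustration.

```python
from math import log2

def shannon_bounds(guess_counts):
    """Shannon's (1951) upper and lower bounds on entropy (bits/symbol).
    guess_counts[i] = number of symbols a subject guessed correctly
    on the (i + 1)-th attempt."""
    n = sum(guess_counts)
    q = [c / n for c in guess_counts]  # q[i-1] is q_i, the rank-i frequency
    # Upper bound: entropy of the guess-rank distribution.
    upper = -sum(p * log2(p) for p in q if p > 0)
    # Lower bound: sum_i i * (q_i - q_{i+1}) * log2(i).
    lower = sum(
        i * (q[i - 1] - (q[i] if i < len(q) else 0.0)) * log2(i)
        for i in range(1, len(q) + 1)
    )
    return lower, upper
```

If every symbol is guessed on the first try the text is fully predictable and both bounds are zero; more spread across guess ranks widens and raises the bounds.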

Unsupervised Discovery of Rhyme Schemes

This paper describes an unsupervised, language-independent model for finding rhyme schemes in poetry, using no prior knowledge about rhyme or pronunciation.