How to Do Digital Philosophy of Science

  Charles H. Pence and Grant Ramsey
  Philosophy of Science, pp. 930–941
Philosophy of science is expanding via the introduction of new digital data and tools for their analysis. The data comprise digitized published books and journal articles, as well as heretofore unpublished material such as images, archival text, notebooks, meeting notes, and programs. The growth in available data is matched by the extensive development of automated analysis tools. The variety of data sources and tools can be overwhelming. In this article, we survey the state of digital work in… 
Digital Literature Analysis for Empirical Philosophy of Science
Empirical philosophers of science aim to base their philosophical theories on observations of scientific practice. But since there is far too much science to observe it all, how can we form and test
When philosophy (of science) meets formal methods: a citation analysis of early approaches between research fields
Citation analysis is used to identify, among the articles published in Synthese and Philosophy of Science between 1985 and 2021, those that cite the specialist literature in game theory and network science. A reference map of philosophy is constructed, on which logic is distributed more uniformly than recently encountered disciplines such as game theory and network science.
Eight journals over eight decades: a computational topic-modeling approach to contemporary philosophy of science
As a discipline of its own, the philosophy of science can be traced back to the founding of its academic journals, some of which go back to the first half of the twentieth century. While the
Quantitative methods in philosophy of language
Correspondence: Rafael Ventura, Department of Philosophy, Bilkent University, Ankara 06800, Turkey. In this paper, I survey and defend the use of quantitative
Revisiting three decades of Biology and Philosophy: a computational topic-modeling perspective
This article proposes to approach the history of the philosophy of biology with a complementary data-driven perspective that makes use of statistical algorithms applied to the complete full-text corpus of one major journal of the field—Biology and Philosophy—from its launch in 1986 up until 2017.
Ordinary Meaning and Consilience of Evidence
In this chapter I note two recent trends, one in experimental jurisprudence and one in experimental philosophy. First, some work in experimental jurisprudence has pushed for moving beyond textual
Acknowledgments-based networks for mapping the social structure of research fields. A case study on recent analytic philosophy
This study introduces a new interpretative framework in which the acknowledgments of academic publications are intended as positioning signals exchanged by researchers, and provides the formal definition of the four acknowledgments-based networks that stand at the core of the method.
Media memory in the digital world
The article analyses how historical memory is being formed in the modern digital realm. The authors show the emergence of a new form of historical memory, characteristic of the digital era, which we
The unexamined philosophy is not worth doing: An introduction to New Directions in Metaphilosophy
  • Y. Shan
  • Philosophy
  • 2022
Recently there has been an increasing interest in metaphilosophy. The aim of philosophy has been examined, and its development scrutinised. With the development of new
The public relevance of philosophy
Various authors have recently expressed doubts about the public relevance of philosophy. These doubts target both academic philosophy in general and particular subfields of philosophy. This paper


Quantitative Analysis of Culture Using Millions of Digitized Books
This work surveys the vast terrain of ‘culturomics,’ focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000, and shows how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology and the pursuit of fame.
Text Analysis with R for Students of Literature
  • L. Lei
  • Art
    J. Quant. Linguistics
  • 2016
The author is more of a storyteller than a serious professor and always explains the seemingly monotonous technical odds and ends with analogies and metaphors, which makes the book an obviously witty and enjoyable read.
Characterizing the Google Books Corpus: Strong Limits to Inferences of Socio-Cultural and Linguistic Evolution
Overall, the findings call into question the vast majority of existing claims drawn from the Google Books corpus, and point to the need to fully characterize the dynamics of the corpus before using these data sets to draw broad conclusions about cultural and linguistic evolution.
JSTOR - Data for Research
JSTOR has created a new tool called "Data for Research" that allows users to interact with the corpus in new ways. Using DfR, researchers can now explore the content visually, analyze the text and the references, and download complex datasets for offline analysis.
Culturomics: statistical traps muddy the data.
In their generally worthwhile discussion of developments in the English language ("Quantitative analysis of culture using millions of digitized books," Research Article, 14 January, p. 176),
What Is Textual Analysis
The starting point for our study consisted of two different kinds of analysis of 51 texts authored by 45 astronauts and cosmonauts, either during their space travel (n = 17), available at
Exploration and exploitation of Victorian science in Darwin’s reading notebooks
Ten Simple Rules for Creating a Good Data Management Plan
A data management plan is a document that describes how you will treat your data during a project and what happens with the data after the project ends, and is used in part to evaluate a project’s merit.
Computational methods in authorship attribution
Three scenarios are considered here for which solutions to the basic attribution problem are inadequate; it is shown how machine learning methods can be adapted to handle the special challenges of that variant.
Voyant Tools
Voyant Tools is a free, open source text analysis software package that is a great option as an entry-level digital humanities suite. This web-based set of tools has an