A New History of Modern Computing

by Thomas Haigh and Paul E. Ceruzzi

Auditing Advanced Information Systems and Technologies in a Modern Digital World

In 1965 Gordon Moore posited that the number of transistors on a microchip doubles roughly every two years, implying that the technical developments underlying increasingly complex information systems continue to advance at an impressive pace.
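The growth implied by Moore's observation is simple exponential arithmetic; a minimal sketch (function name and parameters are illustrative, not from the source):

```python
def transistor_growth(years, doubling_period=2):
    """Growth factor implied by a fixed doubling period.

    With a two-year doubling period, a decade yields
    2 ** (10 / 2) = 32x as many transistors.
    """
    return 2 ** (years / doubling_period)

# Growth factor over the decade 1965-1975, assuming two-year doubling:
print(transistor_growth(10))  # 32.0
```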

The emergence of protein dynamics simulations: how computational statistical mechanics met biochemistry

It illustrates how Martin Karplus and his research group set the engine of molecular dynamics simulations of biomolecules in motion between 1969 and the early 1970s, following Karplus's move into biology.

A Case Report On The "A.I. Locked-In Problem": social concerns with modern NLP

Practical experimentation with GPT-3 shows a recurring problem with these modern NLP systems: they can “get stuck” in a narrative, so that further conversation, prompt execution, or commands become futile.

Becoming universal

A new history of modern computing.

The immortal soul of an old machine

Taking apart a book to figure out how it works is a great way to learn more about how the world works.



This Is Not a Computer: Negotiating the Microprocessor

  • Z. Stachniak
  • Computer Science
    IEEE Annals of the History of Computing
  • 2013
The Intel 4004 μ-Computer is the earliest known microprocessor-based hardware distributed by Intel; it dates to a liminal period in the company's history, when Intel was wrestling with the "one-chip CPU: computer or component?" dilemma.

Demography and Decentralization: Measuring the Bulletin Board Systems of North America

  • History
  • 2020
For many home computer enthusiasts of the 1980s and 1990s, a local dial-up bulletin board system, or BBS, provided the first opportunity to get online, chat with strangers, share files, and play games.

Programmed Inequality: How Britain Discarded Women Technologists and Lost Its Edge in Computing

How Britain lost its early dominance in computing by systematically discriminating against its most qualified workers: women. In 1944, Britain led the world in electronic computing. By 1974, the British computer industry was all but extinct.

Historical Reflections: Where Code Comes From: Architectures of Automatic Control from Babbage to Algol

The work of Charles Babbage and Ada Lovelace provides an important milestone on the road to this invention, but it marks the beginning of the story rather than its end; the account then moves on to the 1940s, when their ideas were independently rediscovered, extended, and finally realized in actual machinery.

The Graphics System for the 80's

This is the story of the creation of a new graphics system and the startup company that produced it in the early days of raster computer graphics.

Cookies: a legacy of controversy

Cookies are a legacy of the early commercial web. Developed and deployed by Netscape in 1994, debated by developers since 1995, and subject to political scrutiny since 1998, cookies have

Fighting and Framing Fake News

We begin by looking at definitions of fake news, taking ideas from science studies and philosophy to argue that the status of a news story as real or fake depends not on its truth content or on the