Production of Large Computer Programs

@article{Benington1983ProductionOL,
  title={Production of Large Computer Programs},
  author={H. D. Benington},
  journal={Annals of the History of Computing},
  year={1983},
  volume={5},
  pages={350-361}
}
  • H. D. Benington
  • Published 1983
  • Engineering, Computer Science
  • Annals of the History of Computing
The paper is adapted from a presentation at a symposium on advanced programming methods for digital computers sponsored by the Navy Mathematical Computing Advisory Panel and the Office of Naval Research in June 1956. The author describes the techniques used to produce the programs for the Semi-Automatic Ground Environment (SAGE) system. 

Citations

Role of Mid-Fidelity Prototypes in Facilitating Open Book Accounting
  • Mohammadhossein Khadjehali
  • Master of Science thesis, Tampere University of Technology, 110 pages, 3 appendix pages
  • June 2014
Software Development Processes
TLDR: One of software engineering's primary concerns is to establish methods and processes that should be followed in order to develop quality products in the best possible way in a given setting.
On Building Prediction Systems for Software Engineers
TLDR: It is argued that these indicators are statistics that describe properties of the estimation errors or residuals, and that the sensible choice of indicator is largely governed by the goals of the estimator.
On Building Prediction Systems for Software Engineers 1
Building and evaluating prediction systems is an important activity for software engineering researchers. Increasing numbers of techniques and datasets are now being made available. Unfortunately …
An Annotated Bibliography of Secondary Sources on the History of Software
  • W. Aspray
  • Engineering, Computer Science
  • Annals of the History of Computing
  • 1987
TLDR: This bibliography is a product of the National Collection Strategy (NCS) program being undertaken by the Charles Babbage Institute to develop a national collecting strategy for preserving the historic records of computing.
Low-cost identification system of ICT for depression in children 8-11 years old: The case of the I.E.D. Edgardo Vives de Santa Marta, Colombia
The objective of this research was to implement a low-cost diagnostic information system for children aged 8-11 years at the District Educational Institution (DEI) Edgardo Vives Campo de Santa Marta …
Software Defect Prediction from Code Quality Measurements via Machine Learning
TLDR: A number of machine learning techniques, such as neural networks and random forests, are used to determine whether seemingly innocuous rule violations can be used as significant predictors of software defect rates.
A Contrast and Comparison of Modern Software Process Models
TLDR: The progression and remarkable change in software processes and their respective models are explained, and a contrast of classical software processes with Agility and CBSE is summarized.
Software development lifecycle models
TLDR: The waterfall model is considered before the other models because it has had a profound effect on software development and has additionally influenced many SDLC models prevalent today.
A Novel, Model-Based, Specification-Driven Embedded Software Integration Platform
This paper presents a model-based, specification-driven Embedded Software integration Platform (ESiP) we refer to as Ensemble. Its primary objective is to shorten the portion of the embedded systems' …