The “idea” of programming a computer, initially Babbage’s Analytical Engine, started a revolution in the relationship between
men and machines, or as we will soon reveal, between women and machines. It may be easier to trace the idea of
programming a computer closer to its point of origin and identify the critical ideas in the development of software than the idea
of the computer itself. How much did Babbage owe Pascal? Was the Frenchman Pascal aware of the work of the Spanish monk Ramon Lull centuries earlier in his effort to produce a “machine” that executed Aristotle’s syllogisms? Or, more recently, did Mauchly actually visit Atanasoff in Iowa before ENIAC? It is interesting to compare the parallel development of computer hardware from Babbage to Aiken to Eckert and Mauchly to von Neumann to Amdahl to Cray alongside that of software from Countess Ada Lovelace to Betty Holberton to Admiral Grace Hopper to Mandalay Grems, and perhaps even to Kim Polese and Laura Lemay.


This is a story I have told many times and been as often encouraged to write down. Finally, I have had the occasion to do so. I
hope the reader will forgive my telling the story in first person. I do so because, except for Charles Babbage and Ada Lovelace,
I have known all of the people I will mention and have worked with most of them at some time during the past fifty years.

It has been said that humans have been able to achieve civilization not only because of brain size, language ability, and their opposable thumbs, but also because men and women are the same size (within 15% or so) and endowed with similar, but
perhaps slightly different mental abilities. We do not know what Charles Babbage thought about his colleague, the
mathematician Ada Lovelace’s concept of programming, but we do know that Howard Aiken strongly disapproved of his
mathematician and colleague Grace Hopper’s notion of a compiler, or a program that would write computer programs. It is
interesting that Betty Holberton and Grace Hopper came up with two of the three seminal ideas in the development of
automatic programming while working in Aiken’s Computation Laboratory at Harvard, almost in spite of his insistence that it could not be done, and that if it could, it would not be desirable or useful. I was not a student there until several years later, but even then he was still fuming about it, and firmly holding to the notion that engineers and mathematicians must write their own programs in carefully crafted machine language. Aiken’s vision was to build computers that could compute tables of
mathematical functions to aid the solution of partial differential equations by the separation of variables. For example, the
solution of the wave equation by this technique produces a series solution at each point in polar-coordinate three-space, each term of the infinite series being the product of a spherical Neumann function, a spherical Bessel function, and a Legendre polynomial. Aiken thought the role of computers was to compute countless tables of exotic functions in multiple coordinate systems and then to publish them each in encyclopedia-sized sets of books. For example, he computed standard (not spherical) Bessel functions out to order 129 and published them at the Computation Laboratory. In spite of his genius as a computer architect, I
don’t think he ever entertained the idea of solving a problem on the computer. Years later when I was principal programmer at
UNIVAC, I worked with J. Presper Eckert who was then chief engineer, and with Grace Hopper who was chief scientist. I asked
them if they agreed with my opinion, and whether they did have the idea of solving the problem on the computer. Both agreed
with my surmise about Howard Aiken, and both clearly had the idea of solving the problem on the computer. They had been under the spell of Dr. John Mauchly, co-founder of UNIVAC, whose vision was to solve the weather problem on the computer, and furthermore in real time, something we still cannot do.

Still, we cannot be too critical of a pioneer like Aiken; after all, he was funded, as were Charles Babbage a century earlier and Eckert and Mauchly five years later, to develop an engine to create tables for exterior ballistics. It is easy to criticize the genius
after the fact for his or her high threshold to new data, when we should be lauding them for their focus on a vision and their
tenacity in achieving it. It is often said that Thomas A. Edison, who had 1,093 patents, still a personal record, went to his grave as a founder and director of General Electric muttering that this new-fangled alternating current would never work out.

Programming the Analytical Engine

Charles Babbage designed the first computer in two stages between 1819 and 1834. The first stage was a Difference Engine
for computing mathematical tables by the difference method and the second was the Analytical Engine for more general
numerical computation, that is, computing more complex tables than the Difference Engine could do. A modern copy of the first engine sits in the Science Museum in London and is run every year on Babbage’s birthday. It is powered by a five-horsepower electric motor rather than the steam engine that Babbage originally envisioned. The more advanced Analytical Engine was never completed due to funding and technical problems. However, some modern commentators have reversed conventional wisdom on this issue and now think that it may actually have been achievable within the scope of the mechanical clockwork technology of his day, and that if it had been completed, it would have worked to specification. After all, it would have been built with advanced mechanical clockwork technology, and the clock makers of the late Middle Ages were the first geeks, rivaling even the computer designers and programmers of today.

Charles Babbage’s assistant in the development of the first computer was Countess Ada Lovelace, the daughter of the British
poet Lord Byron, and herself a gifted mathematician.1, 2, 3 She was also a capable linguist, for we first read about her notion of “programming” the Analytical Engine in a note to a report she translated from French to English. General Menabrea, an
Italian Artillery officer, had visited Babbage in London and viewed the plans and progress for the Analytical Engine; upon his
return to Rome he wrote a report on the visit for the Italian Government describing the potential of the machine for computing
firing tables for the exterior ballistics of artillery weapons. Interestingly, Aiken’s Mark I development at Harvard in the late
1930s was funded by the Navy, and Eckert and Mauchly’s ENIAC at the University of Pennsylvania in the early 1940s was
funded by the Army for the same application. In Note C of her annotated translation of General Menabrea’s report Lovelace
introduces the idea of programming the Analytical Engine.4 She writes:

Those who may desire to study the principles of the Jacquard loom in the most efficient manner, viz. that of practical
observation, have only to step into the Adelaide Gallery or the Polytechnic Institution. In each of these valuable repositories of
scientific illustration, a weaver is constantly working at a Jacquard loom, and is ready to give any information that may be
desired as to the construction and modes of acting of his apparatus. The volume on the manufacture of silk, in Lardner’s
Cyclopedia, contains a chapter on the Jacquard loom, which may be consulted with advantage.

The mode of application of the cards, as hitherto used in the art of weaving, was not found, however, to be sufficiently
powerful for all the simplifications which it was desirable to attain in such varied and complicated processes as those required
in order to fulfill the purposes of an Analytical Engine. A method was devised of what was technically designated backing the
cards in certain groups according to certain laws. The object of this extension is to secure the possibility of bringing any
particular card or set of cards into use any number of times successively in the solution of one problem. Whether this power
shall be taken advantage of or not, in each particular instance, will depend on the nature of the operations which the problem
under consideration may require. This is the process alluded to by M. Menabrea on page 239, and it is a very important simplification.
It has been proposed to use it for the reciprocal benefit of that art, which, while it has itself no apparent connexion with the
domains of abstract science, has yet proved so valuable to the latter, in suggesting the principles which, in their new and
singular field of application, seem likely to place algebraical combinations not less completely within the province of mechanism,
than all those varied intricacies of which intersecting threads are susceptible. By the introduction of the system of backing into
the Jacquard loom itself, patterns which should possess symmetry, and follow regular laws of any extent, might be woven by
means of comparatively few cards.

Those who understand the mechanism of this loom will perceive that the above improvement is easily effected in practice, by
causing the prism over which the train of pattern-cards is suspended to revolve backwards instead of forwards, at pleasure,
under the requisite circumstances, until, by so doing, any particular card, or set of cards, is brought back to the position it occupied just before it was used the preceding time. The prism then resumes its forward rotation, and thus brings the card or
set of cards in question into play a second time. This process may obviously be repeated any number of times.

Ada L. Lovelace

Programming the ENIAC

The first programmers of the ENIAC at the Moore School of Electrical Engineering at the University of Pennsylvania were the
wives of its creators: Hester Eckert and Kay Mauchly, and Adele Goldstine, wife of the funding Army officer, Captain Herman Goldstine. Unfortunately, they did not get to program the “killer” application that proved the technology and paved the way for
future government support for the development of computing technology. They did not have the level of security clearances
needed, but they did teach Drs. Edward Teller and Nicholas Metropolis of the Atomic Energy Commission how to program the ENIAC. That program was the first simulation of a hydrogen bomb explosion; it was reported to a congressional oversight committee by Dr. John von Neumann of the Institute for Advanced Study in Princeton, a consultant to the AEC, along with his strong recommendation that funding also be provided to build more computers.5 Naturally, he was one of the first recipients of such funding at Princeton, and Captain Goldstine left the Army to join him at the IAS to build a computer which John von Neumann modestly named the Johnniac. After the machine was running, Goldstine went to IBM to be part of the team
that developed their first electronic digital computer, the IBM 701.5

There was an interesting rivalry among the world’s first three computer programmers as to which one was actually the first to
get a program running on the ENIAC, a plugboard-programmed, rather than stored-program, computer. Hester Eckert sadly died in childbirth, but the rivalry continued until Adele Goldstine passed away prematurely as well. The epitaph on her tombstone reads: “Adele Goldstine, The World’s First Computer Programmer.”5 She won.

Invention of the Sort/Merge Generator

I first met Betty Holberton when she was responsible for FORTRAN language standards at the NIST in Washington, D.C. in the
1980s. I had just finished developing an application generator as a consultant for Analysts International Corporation in Minneapolis. The AiC team was busy commercializing the COBOL application program generator, named Corvet™, but I had been retained further by the firm to obtain a software patent for the product.6, 7 My first effort in 1982 was rejected by the patent office on the grounds of prior art, but none of the art they cited as anticipating it was actually relevant. So, I decided to find out what the prior art for this technology really was before rewriting the patent claims, and soon traced the origin of application generation back to an early report by Betty Holberton at the Eckert-Mauchly Division of Remington Rand Inc. in 1952. I
went to visit her in her office at the NIST. When I told her what I was doing and showed her the rejected patent application,
she went to her file cabinets and pulled out her original paperwork and the first examples of her sort/merge generator. In
those days programmers at Harvard wrote their programs in the same black and red leather bound ledger books that students
use for taking class notes. (I have long since given all of mine from classes at the Computation Laboratory to the Charles
Babbage Institute.) They rarely had to face a blank piece of paper when assigned a new program to write, rather just went
back to something similar and copied it back into the book with modifications. She had the entire history of her development of
the idea of automatic program generation recorded in these ancient ledger books. It is always fascinating to see the birth of an
idea neatly recorded for posterity.

She said that every time she wrote a sort/merge program for the computer she noticed that some code sequences kept
recurring in spite of the fact that each program was written for a different report specification and had different input data
stored on magnetic tape in different formats.8 She began annotating the code sequences that didn’t change and noting the
fields that varied in those that did change from program to program. Then she got the idea: instead of writing a program to sort a file, why not write a program to write the program that sorts the file? Operating on an existing manually written
program as if it were data was not new; in fact, self-modifying programs were the norm before index registers were invented.
But the idea of creating a program from scratch from a template by mapping into it given values and derived values from the
program specification was new.9, 10 This novel insight was, and still is, a significant intellectual contribution to the
development of the idea of computer programming.
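Holberton’s insight can be sketched in modern terms as a template-filling program. The following Python sketch is purely illustrative, assuming a made-up specification format and field names; it is not her actual generator, only the shape of the idea: map given values from a program specification into a fixed template, producing a new program.

```python
# A minimal, hypothetical sketch of template-based program generation:
# instead of writing the sort program by hand, write a program that
# writes the sort program. Template and field names are invented.

TEMPLATE = """\
def sort_{name}(records):
    # Sort records of a {name} file on field index {key}, {order}.
    return sorted(records, key=lambda r: r[{key}], reverse={descending})
"""

def generate_sort(spec):
    """Map given values from a program specification into the template."""
    source = TEMPLATE.format(
        name=spec["name"],
        key=spec["key_field"],
        order="descending" if spec["descending"] else "ascending",
        descending=spec["descending"],
    )
    namespace = {}
    exec(source, namespace)  # "run" the generated program text
    return source, namespace[f"sort_{spec['name']}"]

# Generate a sort program for a hypothetical "payroll" file keyed on field 1.
source, sort_payroll = generate_sort(
    {"name": "payroll", "key_field": 1, "descending": False}
)
print(source)
print(sort_payroll([("b", 2), ("a", 1)]))  # [('a', 1), ('b', 2)]
```

The generated text, not the generator, is the program that sorts the file; the generator only ever manipulates it as data, which is precisely the meta-program idea described above.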

The Development of Programming Languages and Compilers

Dr. Grace Murray Hopper wrote the first compilers shortly after she left the Harvard Computation Laboratory to join the Eckert
and Mauchly Computer Corporation in Philadelphia. They were called MATH-MATIC, a technical applications programming
language11 and FLOW-MATIC, a business applications programming language.12 One can argue that the former is the
precursor of FORTRAN and the latter the precursor of IBM’s Commercial Translator and ultimately, COBOL. I did not learn the
particulars of their development until I went to UNIVAC as principal programmer in 1961, but I first heard about them as a
student at Harvard in 1956. In a class on computer design taught by Howard Aiken, a student innocently interrupted one of Professor Aiken’s tirades on how engineers and scientists must program their computer application programs in machine language with a question about Dr. Hopper’s new compilers at UNIVAC (E&M had become UNIVAC by then). Aiken was a tall, gaunt, dour man with a boiling point of about 98.7 degrees. I was the only undergraduate in the class, sitting right in the front row. I remember first his eyes bulging, then his drawn Lincolnesque cheeks ballooning out, then his face becoming very red, then a purple line surrounding his mouth as he roared at the hapless graduate student: “Don’t ever mention that woman’s name or her G*dd*m compilers in my presence again!” I don’t think anyone in the class asked him another question the rest of that semester. But Aiken wasn’t especially spiteful toward his former colleague and her compilers; I saw him get just about as angry and abusive if a student absentmindedly referred to data or program storage as “memory.” That always produced a
furious lecture on the unfortunate and inappropriate use of anthropomorphic terminology when referring to computing
machinery. One term he really hated was “flip-flop;” it was “an Eccles-Jordan trigger pair,” he would quickly point out.

Dr. Hopper had more than creative genius going for her; she was also well endowed with tenacity. Maybe she developed it working with Howard Aiken at the Harvard Computation Laboratory during the development of the Mark I, II, III, and IV computers. She was a very smart, pleasant, even scintillating person, but she had a will of iron. When COBOL was developed by the CODASYL committee and announced in 1960, she got squarely behind it and pushed. It is hard to overestimate her contribution to the acceptance of COBOL. The Navy asked her to return to full-time duty as a Captain to shepherd the use of COBOL (she had entered the Navy as a Lt. Cdr. when she left Vassar as a mathematics professor to join the Harvard Computation Laboratory as a Navy employee). She did the job for the Navy and eventually retired as the first woman admiral,
but continued to be active as a speaker promoting the application and development of programmer-oriented languages.

When I finished at Harvard in December of 1956 I went to work in the structures programming group at The Boeing Airplane
Company in Wichita, Kansas. I was the first person hired in that group who had actually studied programming in college so
they put me in charge of training. One of the things I taught was the 701 BACAIC interpretive programming system written by
Mandalay Grems of Boeing-Seattle.13, 14 BACAIC was an acronym for Boeing Airplane Company Algebraic Interpretive
Compiler, which Mandy had written for the IBM 701 computers that Boeing had in both Seattle and Wichita. The 701 had a five-
bit op code so it had only 32 instructions, one of which was a NO OP. Most users of the machine programmed it using hand-
scaled fixed-point binary arithmetic until various floating-point interpretative systems were developed. BACAIC provided the
user with floating-point arithmetic and numerous other amenities that did not become available from IBM until the 704 came
out. As the name suggests it was much more than just an interpreter. For example, it had a very valuable feature which allowed
the user to slip out of the interpretive language at any point and write machine code in octal and then go back to the
interpretive coding style. FORTRAN on the 704 did not offer a similar capability until subroutines were introduced in FORTRAN II.

When an early IBM 704 was delivered to United Aircraft an engineer named Roy Nutt got the idea for FORTRAN, or FORmula
TRANslator, a language very similar to Grace Hopper’s MATH-MATIC but uniquely styled for the IBM 704 ISA (instruction set
architecture). The famous arithmetic IF in FORTRAN compiled to only one 704 op code, CAS (compare accumulator with storage), plus a three-line jump table with branch address entries for less than, equal to, and greater than. On any other computer it could take as many as 11 assembly language instructions to do a FORTRAN IF. By the time we upgraded from the 701 to a 704
in Wichita along with Seattle, I had been promoted to head of systems programming (at age 21) and had to make the decision
whether we would rewrite BACAIC for the IBM 704 or use the new FORTRAN compiler that IBM had co-opted from UA (by
lending Roy Nutt seven people to help him finish it) and was now promoting as “IBM FORTRAN.” It was a difficult decision
because BACAIC was better, and if it had still been United Aircraft FORTRAN there was no way Boeing would have used it, but it
was now IBM FORTRAN and therefore aircraft industry neutral. So I decided for FORTRAN and Mandy made the same decision in
Seattle. When challenged by the flight test engineers (who owned most of the punched card data at Boeing) as to why I had
made this decision, I simply said: “When I woke that morning and looked out the window, it was a FORTRAN world out there.”
All computer people were thought to be a little crazy by engineers and mathematicians in those days so this remark was
accepted as meaningful. The flight test engineers at Seattle mounted a much more serious challenge to the IBM 704 and
FORTRAN because the pair would fully enable floating-point arithmetic, and these Luddites had millions of punched cards in fixed-point binary data format. They insisted that IBM remove the floating-point arithmetic from the 704 computer their group had ordered, and of course IBM refused. To punish IBM, they ordered the competing UNIVAC 1103A computer for flight test instead, on the condition that the firm eliminate the floating-point section. UNIVAC merely disabled it in the op code decoder, knowing that one day
flight test would have a sudden rush of brains to their collective heads and ask for it to be reinstalled. They did, and it took
about 15 minutes to “re-install” it.
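The three-way branch of the arithmetic IF described above can be sketched in any modern language. The Python sketch below is illustrative only, standing in for a FORTRAN statement such as IF (X - Y) 10, 20, 30, which branches to label 10, 20, or 30 as the expression is negative, zero, or positive; the label strings and function names are invented.

```python
# A sketch of FORTRAN's arithmetic IF, e.g.  IF (X - Y) 10, 20, 30.
# On the 704 this mapped onto a single CAS instruction plus a
# three-entry jump table; here each "label" is a callable.

def arithmetic_if(value, negative, zero, positive):
    """Dispatch to one of three 'labels' on the sign of value."""
    if value < 0:
        return negative()
    elif value == 0:
        return zero()
    return positive()

result = arithmetic_if(3 - 7,
                       negative=lambda: "label 10",
                       zero=lambda: "label 20",
                       positive=lambda: "label 30")
print(result)  # label 10
```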

Language Design by Committee

Individuals developed the first high-level, or programmer-oriented, computer languages, until the languages became so important they had to be developed by committees. The standardization process is a political rather than a technical one, or perhaps an “electro-political” problem at best. It is important that languages be standardized to support their utility and, ultimately, program
interchangeability. Roy Nutt recast FORTRAN as FORTRAN IV as a founder and vice-president of Computer Sciences Corporation
in 1961. It was a tremendous success and he was willing to implement it for anyone, not just IBM. The UNIVAC 1107 version
was the most efficient compiler available in 1962. As Leiter der Rechengruppe (head of the computing group) at the Technical University of Stuttgart, I ran the GAMM (Gesellschaft fuer Angewandte Mathematik und Mechanik) index of some ten or twelve standard scientific and
engineering computations in FORTRAN, ALGOL 60, and assembly language on a number of computers that had been proposed
to the (then West) German Government. The GAMM efficiency of a programming language was computed by dividing the time to
do a weighted average of the typical or benchmark computations written in high-level language by the time it took to do them
written in carefully hand-optimized assembly language. That ratio for Roy Nutt’s 1107 FORTRAN IV was 1.09; that is to say, the FORTRAN program was only nine percent less efficient than a hand-optimized assembly language version of the same
computation. By comparison, UNIVAC 1107 ALGOL 60, written in 1962 by Don Knuth, Joe Speroni, and Nick Hubacker at Case
Western Reserve University in only six weeks, although it was the most efficient ALGOL compiler in the world, had a GAMM
index of 8.1 with array bounds checking on, and 2.9 with bounds checking off. Compare these to the Gier ALGOL compiler
(Danish) with a GAMM index of 80. IBM did not catch up in FORTRAN IV efficiency until the H compiler for the 360/75 was
announced in 1966 with its level 2 optimization turned on.
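The GAMM index as described above is just a ratio of weighted times. The sketch below illustrates the arithmetic; the benchmark names, weights, and timings are invented for the example, not the actual GAMM suite figures.

```python
# Hypothetical illustration of the GAMM efficiency index: a weighted
# average of benchmark times in the high-level language, divided by the
# same weighted average in hand-optimized assembly. All numbers invented.

def gamm_index(benchmarks):
    hll = sum(b["weight"] * b["hll_time"] for b in benchmarks)
    asm = sum(b["weight"] * b["asm_time"] for b in benchmarks)
    return hll / asm

benchmarks = [
    {"name": "matrix inversion", "weight": 0.5, "hll_time": 11.0, "asm_time": 10.0},
    {"name": "polynomial eval",  "weight": 0.3, "hll_time": 5.5,  "asm_time": 5.0},
    {"name": "series summation", "weight": 0.2, "hll_time": 2.2,  "asm_time": 2.0},
]

print(round(gamm_index(benchmarks), 2))  # 1.1, i.e. ten percent slower than assembly
```

An index of 1.0 would mean the compiled code matched hand assembly; Roy Nutt’s 1.09 was remarkable precisely because it sat so close to that floor, while bounds-checked ALGOL compilers of the day sat far above it.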

As FORTRAN gained importance it became standardized and subject to revision by committee, but ALGOL 58, 60, and 68 were
the result of the International Algol Committee from the start, although Prof. Doug Halstead at U. C. San Diego made some
pioneering personal contributions to ALGOL 58, the prototype. Some claim that no great work of art was ever created by a
committee. The noted Australian supercomputer (CDC STAR 100) architect and sheep rancher Peter Jones was famous for his
algorithm for computing the IQ of a committee as the IQ of the dumbest member of the committee divided by N, the number of
members on the committee. I have two counter examples to this conventional wisdom: the King James Bible and the ALGOL 60
programming language are two outstanding intellectual and artistic achievements and they are both the result of committee
deliberation. The generality and logical sophistication of ALGOL have not been matched by later language developments. Java
may be the great-great-grandson of ALGOL (computers, invented by men, are feminine in gender whereas software, invented
by women, is masculine in gender) and both are strongly typed languages, but the subtle differences are critical. The most
serious difference is that Java does not have a go to verb and the average ALGOL program is festooned with go tos. The
second difference stems from the fact that there are two ways to solve every finite problem in logic and mathematics: one
is by abstraction and the other is by exhaustion. One may add 100 numbers to get the sum or use the formula Gauss
discovered when he was ten. Strongly typed ALGOL has an exhaustive proliferation of data types to support scientific
computation including complex and complex array, for example. Java has only eight primitive data types but invites the
programmer to create new ones willy-nilly, a la C++ structs, as Abstract Data Types. Unfortunately, Java does not also
have the overloaded operators feature of C++ (with the single exception of “+” for addition and string concatenation) so the
programmer has to create Java methods to provide these extended functions for extended or newly defined data types.
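The abstraction-versus-exhaustion point above is exactly the schoolroom story of Gauss: one can add the first hundred integers one by one, or apply the closed formula n(n+1)/2. A minimal sketch of the two routes:

```python
# The two routes to the same answer: exhaustion (add all 100 numbers)
# versus abstraction (Gauss's closed formula n*(n+1)/2).

def sum_by_exhaustion(n):
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_by_abstraction(n):
    return n * (n + 1) // 2

print(sum_by_exhaustion(100))   # 5050
print(sum_by_abstraction(100))  # 5050
```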

Recent Developments and the Age of Java

Java was developed by Jim Gosling at SUN Microsystems with a goal of universality (“Write once run anywhere”) but aimed at
rather small-scale and specialized applications. Credit must go to Kim Polese’s marketing skills for the amazingly successful grass
roots campaign she carried out at SUN to popularize it. I was sitting in the audience as guest of Scott McNealy the day he
announced Java in San Francisco. As I listened to his speech and watched his laser light show I thought of COBOL, of ALGOL, of
Ada, and thought “this is never going to happen”, not without someone like Grace Hopper. I did not know that there was someone like Grace Hopper: Kim Polese had already recognized that Java had much more potential than anyone at SUN thought, and she also had Laura Lemay of HTML fame writing a book entitled “Teach Yourself Java in 21 Days.” The rest,
as they say, is history. Grace Hopper was an admiral, she always used push; but Kim Polese is a marketing guru and she knew
how to employ pull. Today we are progressing from the Age of COBOL (and FORTRAN) into the Age of Java.


Countess Ada Lovelace appears to have first thought of the idea of programming a computer in 1840, reasoning by analogy
from the punched card programming of a Jacquard loom. Her insight was to control the Analytical Engine by a similar punched
card program, but she added the idea of looping: rather than repetitively and manually restacking card sequences in the card hopper, the machine could loop through a sequence a given number of times or until some numerical criterion in the computation was met.
John von Neumann once said that the idea of looping within a computer program was essential to the computer being able to
do useful work for a reasonable expenditure in programming effort. Ada Lovelace also seems to have had the idea of solving
the problem on the computer in addition to just creating tables of functions. In the quoted Note C above she compares
computing algebraic combinations with weaving the various intricacies of thread intersection on the loom.
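Lovelace’s “backing the cards” is, in modern terms, a loop over a reusable instruction sequence. The sketch below is a loose, hypothetical analogy, with invented function names and an arbitrary example computation; each “card” is an operation, and the prism revolving backwards corresponds to replaying the sequence until a numerical criterion is met.

```python
# A sketch of Lovelace's "backing the cards": re-using one card sequence
# (here, one list of operations) until a numerical criterion is met,
# instead of restacking the cards by hand. Names and example invented.

def run_card_sequence(value, cards, criterion, max_passes=100):
    """Repeat a sequence of operations until criterion(value) holds."""
    for _ in range(max_passes):      # the prism revolves backwards...
        if criterion(value):
            break
        for card in cards:           # ...and the cards play again
            value = card(value)
    return value

# Halve repeatedly until the value drops below 1.
result = run_card_sequence(
    value=100.0,
    cards=[lambda x: x / 2],
    criterion=lambda x: x < 1,
)
print(result)  # 0.78125
```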

Betty Holberton seems to be the inventor of application generation; her sort/merge generator was the first documented
program of its kind. Application generation had a modest development into other areas, for example Joe Ross’ STAR accounts
receivable application generator, but was eclipsed by Grace Hopper’s compiler technology until the 1980s when it was
rediscovered. Today the automatic generation of Java code by PDEs (programming development environments) is
commonplace. Many have round tripping; that is, not only can they generate Java code from UML (Unified Modeling Language)
diagrams but they can also generate UML diagrams from Java code. The user can modify the Java and then reconstruct the UML
specification, then modify the UML and recreate the Java, and so on.

Mandalay Grems may not have written the very first floating-point interpretative system but her algebraic interpretive compiler
was rather more than a simple floating-point interpretative package and anticipated later developments in FORTRAN compilers
and perhaps even ALGOL. Its ability to include machine-coded macros anticipated Larry Liddiard’s template-based pre-generation of machine-coded macros in the MNF (Minnesota Fortran) compiler for the CDC 6600 in 1969.

Of course, Grace Hopper is the leading heroine of the software story. Not only did she invent and demonstrate the idea of
compiling a horizontal programmer-oriented language into a vertical machine-oriented language at Eckert and Mauchly, later UNIVAC, but she also later managed as a Navy officer to see COBOL standardized by the Department of Defense. All this
against considerable odds, since IBM did not give up its own monopolistic position favoring Commercial Translator easily. To
see how difficult it was to standardize a programming language even with government backing, just observe that it was never
repeated! The Air Force was not able to do it with ALGOL (JOVIAL) and the Navy’s effort to standardize Ada for CCC (command,
control, and communications) data processing was less than completely successful. Also, witness the fact that while there may
be no new COBOL programs being written today, some 75% of existing business data processing applications are still running
in COBOL, some 43 years after it was first announced. The new COBOL 2000 standard even includes object-oriented capability
for COBOL so it may yet have a comeback.

Finally, while we have Jim Gosling and SUN Microsystems to thank for the wonderful Java language, credit is also due to Kim
Polese’s vision and marketing skills and Laura Lemay’s ability as a technical writer.

It seems to me that the history of ideas in the development of programming is first of all the idea of doing it for the Analytical
Engine, for which Ada Lovelace seems to have been inspired by watching the Jacquard loom turn out a paisley patterned bolt
of silk. The next three ideas came at about the same time and at first appeared to compete with each other. These were Betty
Holberton’s idea of the meta-program, or program that wrote a program to do some specific task; Grace Hopper’s idea of an
artificial language that was a subset of some human language or genre in which one could express an algorithm which could
then be statically translated to machine language and executed; and Mandalay Grems’s idea of a similar language that could be
dynamically interpreted to carry out the computation. We see all three ideas in play simultaneously today in most languages
and development environments. COBOL and FORTRAN are static compiler languages, yet MOVE and MOVE EXAMINING in COBOL
and formatted I/O in FORTRAN are interpreted. Most PDEs today generate HLL code and many provide round tripping. Java is
an interpreted language so that it can be architecture independent, yet just-in-time compilers are available for optimizing hot
spots in Java programs.


1. Stein, Dorothy, Ada: A life and legacy, Cambridge: MIT Press, 1985. p.33.
2. Perl, Teri, Math Equals: Biographies of Women Mathematicians and Related Activities, Menlo Park: Addison-Wesley, 1978.
3. Moore, Doris, Ada, Countess of Lovelace, New York: Harper and Row, 1977.
4. Menabrea, L. F., Sketch of the Analytical Engine Invented by Charles Babbage, in Morrison, Philip and Emily, Charles Babbage
and his Calculating Engines, New York: Dover Publications, 1961, pp. 225-297.
5. Goldstine, Herman, Personal communication, 1996. While CIO at the University of Pennsylvania from 1992 through 1996, I was responsible for planning and funding the 50th Anniversary of the ENIAC, which had been announced in the New York Times on February 14, 1946. Herman Goldstine and J. Presper Eckert were a big help in setting up the celebration and regaled me with stories of the early days of ENIAC.
6. Patton, P. C., CORVETTM Product Opportunity Description, Minneapolis, Analysts International Corp., October 1981, 56 pp.
7. Messerich, P. J., et al., Automated Programming System for Machine Creation of Application Program Source Code from Non-Procedural Terminal Input, US Patent No. 4,742,467, May 3, 1988.
8. Holberton, F. E., Personal communication, 1983.
9. Holberton, F. E., Master Generating Routine for 2-Way Sorting, Eckert-Mauchly Division of Remington Rand, Inc., Report 965-1, 1952.
10. Holberton, F. E., Sorting Rerun Procedure Incorporated in the Master Generation Routine 2-Way Sorting, Navy Department, David Taylor Model Basin, August 1954.
11. Univac, MATH-MATIC Remington Rand Automatic Programming System, New York: Remington Rand Univac, 1959.
12. Univac, FLOW-MATIC Remington Rand Automatic Programming System, New York: Remington Rand Univac, 1959.
13. Grems, Mandalay, Boeing Airplane Company Algebraic Interpretive Compiler, Seattle: The Boeing Airplane Company, 1956.
14. Grems, Mandalay, Porter, R. E., A Truly Automatic Computing System, Proceedings of the Western Joint Computer
Conference, 1959, pp. 10-21.
White Paper

The Development of the Idea of Computer Programming
Peter C. Patton, Ph.D.