SOFTWARE - THE EARLY DAYS
A little over twenty years ago, there was growing concern on the
part of the American Department of Defense about the number and
variety of programming languages being used in the various projects
it was funding. A study conducted at that time revealed more than
450 languages and dialects in use. It was felt that considerable
cost savings could be made by reducing this number and standardising
on a single language for all future work. In the decade that
followed, a considerable amount of work went into producing such a
language. The outcome of it all was Ada, about which we hear so much
nowadays (although not all of it is quite as enthusiastic as it used
to be).
Today, the abundance of computer languages and software tools that led
to the development of Ada has become a superabundance. The contrast
with the situation just 40 years ago couldn't be more marked.
In 1955 there were no software tools whatsoever: no assemblers,
compilers, linkage editors, program generators, editors, interpreters,
operating systems, and so on. There wasn't even an assembly-level
language, and all programming was done via machine 'orders' or
'instructions', effectively what we would nowadays consider to be
machine code. The term 'software' was itself still very new.
Primitive as this may sound, it's worth remembering that there were
still very few computers in use worldwide; in Britain there were no
more than a dozen or so, and these were usually specially built
'one-off' designs. At the beginning of the '50s there were fewer than
a handful in operation in Britain. The age of the computer had begun
just over ten years previously, but so far there were very few signs
of it. The revolution in hardware and software, and the widespread use
of computers in business, industry, and science, was yet to come.
In order to appreciate just how much had already been achieved, it's
worth reviewing the considerable amount of progress that had been made
since the introduction of the first programmable machine some 200
years earlier.
Lady Ada Lovelace is usually credited with being the world's first
programmer for her work on Babbage's Analytical Engine, although a
more appropriate title would be that of the world's first systems
analyst. The credit for being the first programmer must go to someone
who preceded her by more than a hundred years, for in the early 18th
century Falcon introduced the novel idea of a chain of punched cards
to control the operation of weaving looms. This idea was improved on
by Jacquard, who introduced his own version of the programmable loom
in 1804. It quickly caught on, and by 1824 was in widespread use in
both France and Great Britain. When Babbage proposed the use of a
similar arrangement for his Analytical Engine in the mid-1800s, the
idea of a control program was already well over a century old.
Programmable process control equipment is thus not a new thing, being
over two and a quarter centuries old.
Although Babbage's Analytical Engine was never built, a considerable
amount of 'documentation' was produced (including the extensive notes
written by Lady Lovelace), and a part of the mill (arithmetic unit)
was eventually constructed by Babbage's son in 1906.
Over the centuries there had been various attempts at producing
calculating machines, some more successful than others. Perhaps the
earliest of these was Pascal's adding machine, the Pascaline. Built
between 1642 and 1644, it was crude by present-day standards, but it
was the first reliable device to perform an arithmetic operation
automatically.
But apart from Babbage's, none of the several machines proposed or
built was designed to be programmable. It wasn't until 1937 that
Babbage's idea of an automatic programmable calculating machine was
taken up again, this time by IBM, following an approach by Howard
Aiken of Harvard. At that time IBM were one of the biggest
manufacturers of what were known as 'book-keeping machines'. These
were basically mechanical calculators using punched cards, widely used
for business and scientific purposes. IBM's expertise in this field
made them ideal for the project, and seven years later, in 1944, the
world's first programmable digital computer was finished.
It was called the Automatic Sequence Controlled Calculator (ASCC).
Upon completion it was presented to Harvard University, and was
subsequently known as the Harvard Mark I. Like Babbage's proposed
Analytical Engine, it was a mechanical computer. Unlike his, it was
powered by electric motors.
Like many of the early computers it made use of a decimal rather than
a binary number representation, and stored numbers in genuine hardware
registers - sets of wheels driven by rotating shafts. It had a massive
memory of 72 such registers, each capable of holding a 23-digit
decimal number, which could be set up and controlled by means of an
external program. There was no provision for a stored program,
although constants could be set up on banks of hand switches.
Input and output of data was by punched cards, and program control via
24-hole punched tape. Programming was of a most elementary kind, with
each tape frame split into three groups of eight holes, consisting of
numerical codes for input register, output register, and operation.
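The frame layout described above can be sketched as a simple decoder. Everything here - the binary reading of each group, the function names, and the example values - is an illustrative assumption, not the machine's real encoding:

```python
# Hypothetical sketch of decoding an ASCC-style tape frame: 24 holes
# read as three 8-hole groups for input register, output register,
# and operation. The binary interpretation is assumed for illustration.

def encode_group(value):
    """Punch an 8-hole group for a number, most significant hole first."""
    return [bool((value >> (7 - i)) & 1) for i in range(8)]

def decode_frame(holes):
    """holes: list of 24 booleans (True = hole punched)."""
    assert len(holes) == 24
    def group_value(bits):
        value = 0
        for bit in bits:
            value = (value << 1) | int(bit)
        return value
    in_reg = group_value(holes[0:8])
    out_reg = group_value(holes[8:16])
    operation = group_value(holes[16:24])
    return in_reg, out_reg, operation

# Example frame: take register 3, send to register 7, operation code 1.
frame = encode_group(3) + encode_group(7) + encode_group(1)
print(decode_frame(frame))   # (3, 7, 1)
```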
The ability to evaluate functions, a necessary attribute of any
scientific calculator, was implemented by special program tapes, which
were loaded as required during the course of a program. In order to do
this the programmer had to include a command to stop the calculation
to enable the required function tape to be placed in the tape reader.
Some idea of the speed of the machine can be gained from the fact that
it took about 6 seconds to perform a multiplication, and about 11
seconds to perform a division, both being done in a part of the
machine's hardware dedicated to this purpose. There was also
special-purpose hardware for calculating such things as sines, logs,
and exponentials, although these took rather longer (a minute being
the minimum time required).
Even as IBM's ASCC was being completed, construction of the world's
first electronic machine was under way. It was being used as early as
1945, although it wasn't finally completed until 1946. It was known as
ENIAC (Electronic Numerical Integrator And Calculator). It was big,
occupying some 1,500 square feet of floor space, weighing 30 tons, and
requiring 150 kW of power. The sheer number of components, which
included some 18,000 valves and 1,500 relays, made maintenance a
full-time job.
Despite its size and complexity, the programming facilities provided
were of the most primitive kind. Like the ASCC it was a decimal
machine and made use of punched cards for input and output of data.
However, to make good use of ENIAC's considerably greater speed, the
use of an external control program (as in the ASCC) was dispensed
with. Instead, it used a stored program, set up by hand on plug and
socket patch panels, and banks of switches.
This 'hardware programming' literally set up data and control routes
from one part of the machine to another, the operations to be
performed, the registers to be used, the number of iterations of an
operation and so on. To give an example: if the sign of a number was
to be tested, a 'programmer' would have to set up an electrical
connection from the appropriate part of the accumulator holding the
number to the part of the machine where the sign was required.
Despite the tedious and time-consuming process of setting up and
checking programs, ENIAC's use of stored program control was
considered a big advance because of the vastly increased operational
speeds it made possible. This use of a stored rather than external
program is a feature of modern machines (albeit implemented in a
different way) that most of us take for granted.
ENIAC's method of implementing a stored program was implicit. It
wasn't so much a case of holding it in the machine, as making it a
part of the machine. Programming consisted essentially of physically
reconfiguring the hardware by hand - a process which was extremely
time-consuming and prone to error.
The concept of actually holding in a machine the program to be
executed, in the same way as numbers, was described by the now
legendary John von Neumann in a report published as early as 1945.
Nowadays it's difficult to appreciate that this was once a new and
revolutionary idea. Even as late as the mid-1960s it was still
considered by some to be a difficult idea to grasp, and one which
required a considerable amount of explanation.
The original von Neumann report set out the requirements for a
general-purpose stored-program digital computer. This report was to
lead to the development of EDVAC (Electronic Discrete Variable
Automatic Calculator), the first computer to make use of a genuine
stored program as we understand it today. Like ENIAC this was also
developed by the University of Pennsylvania for the US Army, but it
took a lot longer to complete: work having started in 1946, it wasn't
finished until 1952. Not only was it the first stored-program
computer, it was also the first to store numbers in binary form,
another of John von Neumann's proposals.
The ASCC could hold up to 72 numbers of 23 decimal digits, plus 60
'constants'. EDVAC, by contrast, had a mercury delay line memory that
could handle 1,024 numbers, each one 43 bits long. This was a
considerable increase in main memory, although some precision in
number representation was lost, and of course the memory was also
being used to hold the program to be executed.
The program, or 'programme' as it was spelt in those days, was made up
of a series of 'orders' which were directly interpreted by fixed
hardware as they were fetched from store (microcode - now widely used
and essentially a level lower than machine code, implementing
processor instruction sets - was nowhere to be seen). Orders in the
EDVAC made use of a four-address code system: the first two addresses
specified the locations from which numbers were to be taken, the third
where the result was to be stored, and the fourth gave the address
from which the next order was to be fetched. The 'function' or
operation to be performed was given in a very abbreviated mnemonic
form - A for Add, S for Subtract, etc. One such order might be:

203 204 207 500 A
Roughly translated this meant: take the contents of locations 203 and
204, add them together and place the result in 207, then fetch the
next instruction from location 500 and execute it. An interesting
point to note is that operations were performed between locations in
memory, not using registers. Also, the lack of any automatic
sequencing mechanism or control unit meant that the programmer had to
define explicitly, by means of the fourth address field, the order in
which instructions were to be obeyed.
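The four-address scheme described above can be modelled as a small toy interpreter. The representation of orders, the opcode mnemonics beyond A and S, and the halt convention are all assumptions for illustration, not EDVAC's actual encoding:

```python
# Toy interpreter for an EDVAC-style four-address order code. Each
# order names two operand addresses, a result address, and the address
# of the next order to fetch; there is no automatic program counter.
# The 'H' halt order is an invented convention for this sketch.

def run(memory, start):
    """memory: dict mapping address -> number or (a, b, result, next, op)."""
    pc = start
    while pc is not None:
        a, b, result, nxt, op = memory[pc]
        if op == 'A':                      # Add
            memory[result] = memory[a] + memory[b]
        elif op == 'S':                    # Subtract
            memory[result] = memory[a] - memory[b]
        elif op == 'H':                    # assumed halt order
            return
        pc = nxt                           # fourth field picks the next order

memory = {
    203: 10, 204: 32, 207: 0,
    100: (203, 204, 207, 500, 'A'),        # the order from the text
    500: (0, 0, 0, None, 'H'),
}
run(memory, 100)
print(memory[207])   # 42
```

Note how the program and its data share one address space: exactly the stored-program idea the article attributes to von Neumann's report.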
This may sound a bit tedious, not to say inefficient, but two factors
are worth bearing in mind. Firstly, the order code (the only
programming language available) was an integral part of the machine,
with its form determined largely by the hardware design. Secondly,
there were severe time penalties involved in fetching instructions and
data from the mercury delay line store.
The delays caused by having to wait for addresses to become accessible
meant that a lot of ingenuity had to be exercised in writing programs
if execution times were to be kept down to a reasonable level. By not
laying down the orders in continuous execution sequence, it was
possible to arrange for the next order required to be almost
immediately available when the current one had completed. It meant, of
course, that programmers had to take into account the execution times
of different orders when writing the code, and to juggle the addresses
at which numbers and orders were stored (not things that many of us
have to bother about these days).
This technique, known as minimum access coding (the objective being to
reduce the overall time taken to access locations in store), was not
limited to EDVAC. Because delay line stores and relatively slow
magnetic drums served as main memory in the majority of early digital
computers, it was a subject of considerable importance to programmers
in general.
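The pay-off from minimum access coding can be sketched with a rough model of a circulating store. The figures here (a 32-word line, four word-times per order) are invented purely to show the effect, not taken from any real machine:

```python
# Rough model of delay line latency. A word at address a only emerges
# when the circulation reaches it, so fetching it at time t costs
# (a - t) % N word-times of waiting. Line length and execution time
# below are assumed values for illustration.

N = 32       # words circulating in one delay line (assumed)
EXEC = 4     # word-times each order takes to execute (assumed)

def total_time(addresses):
    """Total word-times to fetch and execute orders at these addresses."""
    t = 0
    for a in addresses:
        t += (a - t) % N     # wait for the word to come round
        t += EXEC            # then execute it
    return t

sequential = [0, 1, 2, 3]    # orders laid down consecutively: each one
                             # has just gone past when it is wanted
staggered = [0, 4, 8, 12]    # each order placed to emerge exactly as
                             # the previous one finishes
print(total_time(sequential), total_time(staggered))   # 103 16
```

In this toy model the consecutively stored program spends almost all its time waiting a full revolution for each order, while the staggered layout waits not at all - which is precisely the juggling of addresses the text describes.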
EDSAC (Electronic Delay Storage Automatic Calculator) was the first
large British machine to make use of the principles embodied in EDVAC.
However, it wasn't the first British machine to make use of a stored
program, for by the time EDSAC was completed in 1949, three
experimental machines had already been built at Manchester University.
From these, and later work done at Manchester, were to come the early
Ferranti machines, such as the Mark 1 and the Mercury.
Unlike its American counterpart EDVAC, EDSAC made use of a control
unit which determined the sequence in which orders were executed. As a
consequence, its order code didn't need a 'next instruction address'
field. EDSAC also had an accumulator to store intermediate values,
which meant that it needed neither a result field nor one of the
operand address fields. This very much pared-down system was, for
obvious reasons, known as a single-address order code. Although
perhaps not seen that way at the time, this was a necessary
evolutionary step in the development of software; one that freed the
programmer from some of the more irksome aspects of program
construction.
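The single-address style can be contrasted with the four-address example earlier by a similarly minimal sketch. The tiny order set and its mnemonics are assumptions chosen for illustration, not EDSAC's full repertoire:

```python
# Toy single-address accumulator machine in the EDSAC style: each
# order carries one operand address, intermediate results live in the
# accumulator, and a control unit steps through the orders in sequence
# (so no 'next address' field is needed).

def run(program, memory):
    """program: list of (op, address) orders, executed in order."""
    acc = 0
    for op, addr in program:
        if op == 'A':            # add the memory word into the accumulator
            acc += memory[addr]
        elif op == 'S':          # subtract the memory word
            acc -= memory[addr]
        elif op == 'T':          # transfer accumulator to store, then clear
            memory[addr] = acc
            acc = 0
    return memory

memory = {203: 10, 204: 32, 207: 0}
run([('A', 203), ('A', 204), ('T', 207)], memory)
print(memory[207])   # 42
```

The same sum that took one four-address order on EDVAC now takes three single-address orders, but each order is far simpler, and sequencing is the machine's job rather than the programmer's.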
The code in which programmers wrote was still determined by the
machine, but they were no longer required to be involved with the
finer details of program implementation. However, as we know, that
wasn't the end of the story by any means. So far the majority of
effort had been expended on developing suitable hardware, with
programming a necessary, though very much secondary, activity. In
fact, programming methods, and generalised software as a major
component of computer systems, had yet to be developed.
THE END