The rate of technological progress
----------------------------------
by Steven Flintham (15A)
------------------------
I'd like to start by apologising for
the fact that this isn't strictly about
the 8-bit BBC Micros - or even about
computing in general. However, it is
about technology and it strikes me that
a computer "magazine" is probably going
to be read by people interested in that
sort of thing. Besides, where else am I
going to get the chance to sound off
about it to people who might be
interested?
It seems less common now, but four or
five years ago it was hard to avoid
articles and books about just how fast
technology was progressing. In the
introduction to a
book published in about 1984 (I can't
remember the title), I read something
that claimed that if aviation
technology had progressed as fast as
computing technology, it would cost
just £5.00 to fly to America. Although
it has never been stressed quite as
strongly, the rate of general
technological progress was also
continually proclaimed as
extraordinarily fast. For completeness,
I will quote the following example,
which should serve to illustrate the
point (this is not an exact quote, just
an approximation):
Imagine that the entire time for which
man has existed up until the present
day is represented by a twelve-hour
clockface. On this scale, over the
period of time represented by ONE
SECOND on the clockface, we have
discovered electricity, placed objects
(and people) in space, and so on - you
know the sort of thing.
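To put rough figures on that analogy,
here is a quick sketch in Python - the
500,000-year lifespan of man is my own
assumption, not the book's:

# Scale of the clockface analogy.
# ASSUMPTION: man has existed for about
# 500,000 years; the book's own figure
# is not known.
HUMAN_YEARS = 500_000
CLOCKFACE_SECONDS = 12 * 60 * 60  # 43,200

# Real years represented by one second
# on the clockface:
print(HUMAN_YEARS / CLOCKFACE_SECONDS)
# -> 11.57... (about a decade)

So each second on the clockface stands
for roughly a decade of real time.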
What worries me is that I can't see any
evidence that this rate of
technological progress is continuing -
or did it never exist? I'd like to
stress that I'm extremely keen on
technological innovation - I'm not
pleased at this apparent shortage.
Being the well-informed person I am, I
tend to watch "Tomorrow's World"
whenever I remember, and it seems to me
that the so-called "progress" is
getting less and less spectacular. A
few years ago, we were told of new
computing techniques, innovations in
the space programme, the discovery of a
new gene. Now, all we seem to hear
about is techniques for looking after
premature babies, features about
food-related science, and new
discoveries about the ozone layer. I'm
not claiming that this isn't important,
but none of it seems revolutionary any
more.
Please, will someone tell me that the
world isn't really slowing down? Have I
just become disillusioned, or is this
really happening?
Looking at the computer magazines tells
the same story. Where are the
innovations? Where's the invention of
the new compression technique which can
cram 5Mb of data into 3K? Where's the
new style of processor? Where's the
parallel processing we were promised?
Personally, I find it difficult to get
excited about 24-bit colour (impressive
though it is), and although computers
might be getting faster, it's only by
notching up the clock rate. Yes, the
ARM250 is quite a good idea, but I
don't find it stunning. Yes, the new
A30XX series is quite cheap, but not
revolutionary.
It could be that this article is just
revealing me to be an exceptionally
badly informed person - all over the
country, people could be crying out,
"But what about...?" Well, I'll gladly
look like a fool if it means that I can
stop worrying. Please, somebody, tell
me I've missed something...
-- Well, I'll do my best to put you out
of your misery. First of all, it's all
very well reading about how fast
technology has progressed in the last
eighty years or so, or flicking through
two decades of computer development in
a couple of paragraphs. But perception
of time on a day-to-day basis is rather
different. You can't expect to find
major new developments arriving with
enough frequency to give the same
impression of speed.
I can certainly see your point about
Tomorrow's World, but it may just be
that these days the emphasis of the
programme is less upon scientific and
technological advance itself than upon
the applications being derived from it
(since the majority of the population
are more interested in new gadgetry
than actual science), and that such
down-to-earth details can very rarely
be as exciting to the informed
bystander as major scientific
discoveries.
At the same time I think it is simply a
natural part of the development of
scientific knowledge that things seem
to have slowed down now, while a
century ago (or more) momentous steps
forward from ignorance were being
made. The
discovery that the Earth goes round the
Sun was an immensely important step
forward, while modern astronomers
discovering a particular crater on the
Moon cannot be compared. Similarly, the
discovery of electricity was an immense
step forward simply because of the
ignorance of electricity that had gone
before, while a new discovery about the
way electricity behaves in certain
obscure superconducting materials is
pretty unimpressive by comparison.
What I mean to say is that as science
progresses from the large-scale
fundamentals of how physics works to
the smaller details, advance becomes a
slow accumulation of overall knowledge
rather than a progression of sudden
leaps. Though I could be proved wrong
if somebody comes up with a major leap
forward; nuclear fusion, maybe?
To get back to computers: it seems to
me that things haven't really slowed
down here. Bear in mind that the 286 PC
became available in 1986, and the 386
took something like another four years
to arrive, while the 486 appeared
almost immediately after the 386, and
can now be obtained for under £1000.
Effectively, the power of the average
personal computer is doubling every
year-and-a-half, although as I've said
before, seeking processing power for
its own sake is rather foolish.
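As a quick check of what that rate
implies, here is a minimal sketch in
Python; the eighteen-month doubling
period is the figure given above, and
the rest is arithmetic:

# Compound growth implied by a doubling
# period of 1.5 years.
DOUBLING_YEARS = 1.5

for years in (3, 6, 9):
    factor = 2 ** (years / DOUBLING_YEARS)
    print(years, "years ->", int(factor), "times the power")
# 3 years -> 4 times the power
# 6 years -> 16 times the power
# 9 years -> 64 times the power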
This may not sound revolutionary in
itself, but the new advances, together
with things like CD-ROM and so on, will
allow a whole new spectrum of
applications: things like games with
television-quality graphics fetched
ready-made from CD, backed by
CD-quality sound (effectively
interactive television programmes),
multimedia applications, videophones on
your PC, and developments like virtual
reality.
Progress on this sort of thing has so
far been incredibly sluggish
considering the possibilities, but it
will happen; within two years a
complete system offering things like
this will be available for the
equivalent of under £1000.
There will never be a technique to
allow 5Mb to be compressed into 3K;
lossless compression can only go so
far.
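The reason is a simple counting
argument (the pigeonhole principle).
Here is a minimal sketch in Python,
reading the article's 5Mb and 3K as
megabytes and kilobytes:

# A lossless compressor must give every
# possible input a distinct output, or
# it cannot decompress reliably.
bits_5mb = 5 * 1024 * 1024 * 8  # 41,943,040
bits_3k = 3 * 1024 * 8          # 24,576

# There are 2**bits_5mb possible 5Mb
# inputs but only 2**bits_3k possible
# 3K outputs, so the inputs outnumber
# the outputs by a factor of
# 2**(bits_5mb - bits_3k):
print(bits_5mb - bits_3k)  # -> 41918464

Almost no 5Mb file can have its own 3K
form; such a scheme could only ever
work on a vanishingly small fraction
of inputs.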
But parallel processing is ALREADY
here; the use of multiple processors
handling different tasks is the basis
of the 486 PC. It is also the basis of
my system, and offers significant
advantages: with the 65C12 performing
input/output operations, the 32016
processor carrying out language
processing, and the 32081 (built into
the 32016) dealing with floating-point
arithmetic, the system can carry out
floating-point operations faster than
an A540, the top-of-the-range
Archimedes.
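That division of labour is easy to
sketch in a modern language. The
following Python fragment illustrates
only the principle, not my actual
system; the two jobs are arbitrary
stand-ins:

# Task-level parallelism: independent
# jobs run at the same time on separate
# processors, as in the
# 65C12/32016/32081 split above.
from multiprocessing import Pool

def io_work(n):
    # stand-in for input/output work
    return sum(range(n))

def float_work(n):
    # stand-in for floating-point work
    return sum(1.0 / i for i in range(1, n))

if __name__ == "__main__":
    with Pool(processes=2) as pool:
        a = pool.apply_async(io_work, (10**6,))
        b = pool.apply_async(float_work, (10**6,))
        print(a.get(), b.get())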
On a larger scale, prototype
"superneurocomputers" have in fact been
built, and demonstrated to be
significantly more cost-effective than
conventional computers. Networks of PCs
can also be made to work together on
long calculations (imagine an entire
office-block full of 486s working
together). My university's
microcomputer society are apparently
planning to take the outdated ARM2
processors out of their old Archimedes
machines and build a parallel computer
with them.
I'm afraid the software to support all
this innovation properly has not yet
been written, and although there is now
more than enough processing power
around to handle the vast quantity of
data available, the artificial
intelligence needed for a computer to
actually understand it is still a long
way off - talks I have
attended on serious AI research and
applications seem to indicate that the
subject is still stuck somewhere in the
early 1970s.