The Magician of Budapest

John von Neumann is a name to be conjured with. A US mathematician of Hungarian extraction, he is the closest thing modern mathematics and engineering have had to a magician. His influence stretched from theoretical biology to logic and proof theory, and extended across most of mathematics, physics and economics. He is also one of the founding fathers of modern computer science, where his work spanned automata theory as well as the beginnings of computer architecture.

His feats of memory and sheer intellectual capacity are obvious from his biography: he managed to acquire a PhD in mathematics at the ripe old age of 23 in his native Budapest, without ever attending a single lecture at the local university. This exercise in single-minded absenteeism was entirely understandable, since he was simultaneously studying for a diploma in chemistry, first in Berlin and then at Einstein's alma mater in Zurich.

Precocity

Born in 1903, he made most of his contemporaries look like laggards. His father was one of the richest men in Budapest and grew worried about his son's genius: he regarded an academic career as suitable only for paupers, even though he understood that Janos' talent was exceptional. Mathematics degrees were a fairly new invention at the time, and most people during the interwar years regarded engineering and the law as the royal road to riches. Many sons and daughters of the upper Central European classes faced similar dilemmas: the novelists Robert Musil and Hermann Broch both had mathematical backgrounds, and both faced difficult career decisions that turned them away from mathematics and towards more lucrative endeavours. Even the philosopher Ludwig Wittgenstein started out studying engineering before turning his attention to the philosophy of mathematics. Engineers were understood to be the vanguard of modern 20th-century society.

Locality

Scientism seemed the solution to the ills of modern industrial society. Talents like the young Janos, as he was then called, were always rare in the history of the sciences, but at a time when technology and science were regarded as the solution to most problems, the resources that the Hoyles and Einsteins of the age could command were limited only by what nation states could provide. The Soviet aeronautical engineer Mikoyan, the Russian boy genius Andrei Sakharov, Turing, von Braun, Einstein, de Broglie, and even Howard Hughes exemplify the myth and the tragedies of the 20th-century scientist and engineer.
The difficulty of understanding a mathematical genius with a photographic memory was pointed out by his high-school and university teachers, who stood in awe as the young Janos solved difficult problems before the lectures in which they had been mentioned came to an end.

A celebrity in mathematical circles before the age of 30, he became a mathematics professor at Princeton at the age of 26 and went on to participate in the Manhattan Project during the Second World War. The dream of a computing engine meant rather less to him personally than it did to, say, Konrad Zuse or to the man who dreamt of artificial intelligence, Alan Turing.

Gödelian Dreams

John von Neumann was intimately familiar with automata theory and number theory, both fields that moved to the centre of mathematical attention in Princeton and Cambridge in the mid-1930s, and both central to theoretical computer science as we understand it today. Automata theory was in fact being created at the very time, and mathematicians like Alonzo Church, who worked at Princeton for most of his life, developed a formulation of the problem that preoccupied Alan Turing: namely, that there are mathematical problems that cannot be resolved by mechanical means. Kurt Gödel, another Central European genius, whose mind was less firmly rooted in society and the pleasures of the world than Neumann's, had shown the inherent limits of formal systems of number theory, paving the way for the negative solution of the decision problem (the "Entscheidungsproblem").

Church demonstrated a similar result, using a calculus of his own invention to prove that there were mathematical problems that could not be solved by traditional mathematical, or indeed any other mechanical, means. Turing had devised his theoretical machines, which are generally regarded as the starting point of all modern computer science. Church's ideas found their expression in functional programming generally, and in LISP in particular, LISP being a notation for Church's lambda calculus.

Turing, Church and Neumann were all present in Princeton during the late 1930s; Turing managed to spend a year there and met both. Neumann, whose role as the prince of mathematics had been confirmed when he became one of the founding mathematics professors at the newly created Institute for Advanced Study in Princeton, spent the rest of his life working on the same floor as Albert Einstein and Kurt Gödel. Freeman Dyson, a British mathematician and physicist whose practical talents were not inconsiderable, was appointed to the Institute in the early 1950s without ever having bothered to finish a PhD. In all, Neumann's social world consisted of the greatest scientific minds of his time. His numerous consultancies and memberships of scientific societies did nothing to further his teaching career, an enterprise he had given up on even before his work on theoretical computer science commenced.

The Cardinal of Science

So far we have gained the impression that John von Neumann, as he was called after he moved to the US in 1933, was comfortable with a position as a research professor at an institution that was fast becoming the most prestigious research institution in the Western World. He had published a definitive work on quantum mechanics and had done valuable work on the nuclear bomb.

There is a little wrinkle in this picture. John von Neumann also gave his name to the computer architecture that underlies almost all single-core CPUs of today. Some of the groundwork had been laid by the group of engineers who worked on the ENIAC in the mid-1940s, but the von Neumann architecture, or von Neumann machine as it is also called, was set out in reports von Neumann published in 1945 and 1946. What is curious is that this development, at least as far as John von Neumann was concerned, was not just a theoretical exercise, but something to be put into practice.

Neumann's Engineers

The von Neumann machine was realized and built under the supervision of John von Neumann himself. This, in itself, is nothing new. That it was built at the Institute for Advanced Study, the Vatican of pure science, is the more interesting piece of history. George Dyson, the son of another IAS resident genius, the aforementioned Professor Freeman Dyson, stumbled upon the evidence while doing research at the IAS on John von Neumann's career. What was interesting was that it was done at the IAS at all, and what resistance von Neumann had to overcome to complete the first von Neumann machine.

In 1946, when construction of the first von Neumann machine was under way, Professor Lighthill of the IAS sent a letter to the faculty registering his concerns about the computer project running within the confines of the Institute. One of his comments, made on another occasion, ran like this: "It is time that von Neumann revolutionized some other subject! He has spent rather too long in the field of automatic computation..." The IAS faculty was not amused to find someone from their midst using government money and Institute facilities to build an instrument ostensibly to be used in applied science - to be more precise, in the calculation of nuclear detonation waves and for meteorology.

We crammed it all into 5k of RAM

But there were other uses this computer was put to. The IAS computer took quite a long time to build. When the team assembled shortly after the Second World War to begin construction, about 5KB of memory was assumed and no code existed. There was nothing approximating what we recognise as code libraries, and the very concept of free programmability was still unproven. There were some very wild dreams about what the lack of distinction between numbers and words, or code and data, might mean in practice, but up to that point most computing experts, who hardly numbered more than a few dozen at the time, had little idea that computers were anything more than programmable calculators. Computers were considered the outcome of engineering and, as such, they were supposed to be engineering tools. That the world would soon be run using computers was something the pioneers of computer science like Turing and Neumann were aware of - but the powers that be at the IAS were less than enthusiastic that a substantial part of that revolution was happening inside their hallowed halls.

It is interesting to see how much of our present phraseology had its beginnings in the IAS memos: "The heart of the system is a central clock, carrying an enormous load..." And the benefits of modularity in hardware design were emphasized even back then. The diagrams, consisting of a central control unit, arithmetical processor, input, output and memory, would not be difficult for any high-school student to read. A time traveller going back to the IAS in November 1945 might be struck by the irony that construction began in rooms adjacent to Professor Gödel's offices.
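That diagram - control unit, arithmetical unit, input, output and a single memory shared by code and data - is easy to animate in a few lines. The following toy machine is a sketch, not the IAS instruction set: the opcode names and the accumulator-style encoding are invented for illustration, but the defining property is the real one, namely that instructions and data live in the same memory.

```python
# A toy stored-program machine in the von Neumann style.
# Instructions and data share one memory; a program counter walks the code.
# Opcodes and encoding are invented for illustration only.

def run(memory):
    """Fetch-decode-execute until HALT; return the accumulator."""
    acc = 0     # single accumulator register
    pc = 0      # program counter
    while True:
        op, arg = memory[pc]        # fetch the next instruction
        pc += 1
        if op == "LOAD":            # acc <- memory[arg]
            acc = memory[arg]
        elif op == "ADD":           # acc <- acc + memory[arg]
            acc += memory[arg]
        elif op == "STORE":         # memory[arg] <- acc
            memory[arg] = acc
        elif op == "HALT":
            return acc

# Program occupies cells 0-3; its data lives in cells 4-6 of the same memory.
memory = [
    ("LOAD", 4),   # 0: acc = 2
    ("ADD", 5),    # 1: acc = 2 + 3
    ("STORE", 6),  # 2: memory[6] = 5
    ("HALT", 0),   # 3: stop
    2, 3, 0,       # 4-6: data
]
print(run(memory))   # prints 5
```

Because the program is just more memory contents, a program could in principle overwrite its own instructions - exactly the "wild dream" about erasing the line between code and data mentioned above.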

Intellectual property concerns were addressed as well: Neumann, Arthur Burks and Herman Goldstine, all of whom were managing the project, insisted that all inventions coming out of the project were to be placed in the public domain. This is important, since some of the money came from US companies like RCA, whose IP portfolio might have profited had the von Neumann architecture been patented.

It took 15 engineers, three managers and a few administrative personnel five years to build the computer. It filled several basement rooms and was generally considered a great nuisance by the mathematicians and scholars of the Institute's School of Historical Studies.

Documentation is Boring. Not.

When it was finished, there was a by-product of its genesis that few, if any, hardware engineers would fully appreciate today. The IAS reports written by Julian Bigelow, Herman Goldstine and others amounted to a set of instructions on how to build another von Neumann machine - a liberty freely taken by a number of departments and scientific institutions all over the world. It is not so much the computer itself, which ran from 1951 to 1958, that was a technological wonder; it was the fact that all the technical reports were available to anyone who cared to ask. In fact, it would still be possible to reconstruct this machine with some ease, provided one took care to build memory with 40-bit words, use IBM punch cards as input and teletype printers as output. The magnetic drum memory might prove a challenge to reconstruct.

The IAS computer, however, was put to some rather ingenious uses. Teletype printer output had some fairly evident problems, such as lack of speed and the fact that the printers often failed. Early experiments with graphical output were tried in 1955, and something resembling a command line was tried at that time as well.

Hackers and Professors

What stunned George Dyson when he was going through the records at the IAS was that, even back then, hacker culture was in full flow. Human error was recognized as a significant source of problems; long nights and copious amounts of very sugary tea and coffee were applied to effect speedy solutions to coding problems. Unreliable hardware was a much greater problem in the early 1950s: cathode ray tubes were used with abandon, and their failure rate did not endear them to engineers. Engineers commented on memory problems with log entries like "maniac lost its memory" and "maniac regained its memory". The individual parts of the machine seemed to have acquired personalities. But most participants in the project soon realized that unreliable hardware could be dealt with, even by design and careful coding; Neumann published an important article on this very problem in the early 1950s. What became a major source of irritation was that even the best engineers made unforeseen errors that proved extremely difficult to track down. Of course, when the building's air conditioning failed and the input readers tried to consume punch cards with melted tar dripping onto them from the overheated roof, little could be done except repair the air conditioning unit.
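The idea behind Neumann's article on reliable computation from unreliable parts can be illustrated with the simplest redundancy scheme: run several unreliable copies of a component and take a majority vote. The simulation below is a sketch under invented parameters (a 5% failure rate, an AND gate); it is not Neumann's own construction, which was considerably more elaborate, but it shows why redundancy tames unreliable components.

```python
import random

def unreliable(gate, p_fail, rng):
    """Wrap a boolean gate so its output flips with probability p_fail."""
    def wrapped(*args):
        out = gate(*args)
        return (not out) if rng.random() < p_fail else out
    return wrapped

def majority(a, b, c):
    """Majority vote of three booleans (bool counts as int in Python)."""
    return (a + b + c) >= 2

rng = random.Random(1)
and_gate = lambda x, y: x and y

# Triple modular redundancy: three 5%-unreliable copies plus a vote.
copies = [unreliable(and_gate, 0.05, rng) for _ in range(3)]

errors = 0
for _ in range(10_000):
    votes = [g(True, True) for g in copies]
    if majority(*votes) is not True:
        errors += 1
# A single 5%-unreliable gate is expected to fail ~500 times in 10,000
# trials; the voted triple fails only when two or more copies fail at once,
# which is far rarer.
```

The same logic applied at the coding level - checksums, consistency checks, reruns - is how the IAS team coped with its cathode ray tubes.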

Dirty Bio-engineers

By 1956, however, one programmer was moved to write in the operations log that the "machine is a thing of beauty and a joy forever". After 10 years, the IAS computer was finally becoming predictable in its behaviour and most of the hardware bugs had been ironed out.

There was a revolution at the IAS that, for reasons entirely due to historical accident, remained hidden from view until fairly recently. That John von Neumann wrote the "Theory of Self-Reproducing Automata" during the last years of his life is a well-known fact; its influence on cellular computation and theoretical biology is immense to this day. But there was someone working at the Institute whose thoughts went much further: Nils Aall Barricelli quite literally wanted to run the evolution of artificial species in the 5KB memory of the IAS computer, and in the mid-1950s he began to do precisely that. His ideas were seminal to what we today call genetic algorithms. His goal was to "find analogies or, possibly, essential discrepancies between bionumerical and biological pehnomena (sic)" and "to observe how numerical evolution takes place."

Barricelli's Dreams

Barricelli was aware of early research into the structure of DNA. He immediately realized that DNA encoded chemical copying actions and molecular building instructions, similar to the way computer code could be made to replicate. The precise details are fascinating: Barricelli seems to have run programs large enough to go through several hundred permutations of genetic information, gathering ample material on his algorithms. His programs and their output survive on punch cards and on graphical (!) media.
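The core of "numerical evolution" fits comfortably in a few lines today. The sketch below is not Barricelli's propagation rule, which was far more intricate; it is a minimal modern genetic-algorithm loop, with a fitness function and mutation rule invented for illustration, showing the mutate-select cycle his experiments pioneered.

```python
import random

def evolve(population, generations, target, seed=0):
    """Minimal numeric evolution: numbers mutate, the fittest survive.
    Fitness here is invented for illustration: closeness to `target`."""
    rng = random.Random(seed)
    for _ in range(generations):
        # Each number produces one slightly mutated offspring.
        offspring = [n + rng.choice([-1, 0, 1]) for n in population]
        # Selection: keep the half of the pool closest to the target.
        pool = sorted(population + offspring, key=lambda n: abs(n - target))
        population = pool[:len(population)]
    return population

# Over generations the population drifts towards the target value.
pop = evolve([0, 10, 20, 30], generations=50, target=25)
```

Barricelli's runs in the IAS machine's 5KB of memory did essentially this, but with numbers that could cooperate, parasitize and cross over - which is why his work reads today like a premonition of artificial life.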

One has to wonder why the results lay hidden for so long in the vaults of the IAS, and why the history of artificial life and genetic algorithms seems largely uninformed by Barricelli's experiments. A plausible guess points to the fact that the IAS never publicised applied research to any great extent. After Neumann's death, the Institute took a policy decision never to allow such applied projects on its premises again, and Barricelli's programs, run on the world's first von Neumann computer, were lost to the world.

An unsung hero

Herman Goldstine is one of the many unsung heroes of early computing. Academic superstars like Turing or charismatic loners like Zuse usually get all the credit, yet early computers were normally the product of long-term developments costing millions of dollars, pounds or marks and taking many years of work.

Goldstine was a University of Chicago-educated mathematician who ended up leading the ENIAC (Electronic Numerical Integrator And Computer) project at the University of Pennsylvania during the Second World War. Some able persuasion on his part led to the US Army funding the project. Goldstine's full-time involvement in the project led to a chance encounter with von Neumann, who saw an opportunity to think about, and work on, computing engines that so far existed in only three places in the world. Neumann was aware of projects in England and was fully conversant with the theory, but the ENIAC group was already moving towards stored-program computing before John von Neumann helped the project along.

Neumann invited Goldstine to join his IAS project, an appointment he richly deserved, since he had contributed to both the theory and the implementation of the granddaddy of modern American computing. For the next 13 years, Goldstine led a group of engineers and programmers building the predecessor of all modern sequential processors.

Born in 1913, Goldstine worked for IBM after Neumann's death. He continued to serve in leading research positions until 1984, when he retired to write and comment on modern developments in computing. He saw the full extent of the modern computing revolution and was fairly amused to find that his grandchildren preferred their PCs simply to work, rather than having to repair failing vacuum tubes. He died in 2004.

Frank Pohlmann


References

John von Neumann's biography
http://en.wikipedia.org/wiki/John_von_Neumann
http://www-groups.dcs.st-and.ac.uk/~history/Mathematicians/Von_Neumann.h...

The Institute of Advanced Studies in Princeton www.ias.edu
The importance of Baricelli for evolutionary simulations
www.onlamp.com/pub/a/onlamp/2003/06/17/dyson.html
www.edge.org/3rd_culture/dyson/dyson_p2.html

George Dyson, biography and research
www.annonline.com/interviews/971217/biography.html
http://conferences.oreillynet.com/cs/os2003/view/e_sess/4375


