it now calls for exceedingly expensive structures and equipment . . . which already have outrun the financial capacity of private resources, and this will increasingly be the case. Only the Federal Government . . . will be able to meet the deficiency after all possible private resources have been utilized.50
Scientists and engineers outside the military and atomic laboratories were having difficulty accessing computers due to heavy security constraints. The high cost of maintaining a modern computation laboratory and the challenges and pitfalls of charging usage fees, “a practice which affects the character of its scientific program,”51 limited access to academic computing centers.
The NSF entered into an agreement with the Applied Mathematics Laboratories of the National Bureau of Standards (NBS) for “advice on the methods of numerical analysis and the choice of machines for specific computation involved in requests . . . ”52 That year (1955), NSF made computational grants (with advice from NBS) to the Ohio State University; the University of Texas; the University of California, Berkeley; and the University of Illinois.53
In February 1955 the NSF appointed an ad hoc Advisory Panel on University Computing Facilities, led by John von Neumann.54 The panel recommended “that the Foundation establish a limited program to provide computing equipment and partial support for appropriate staff in order to carry on research and training in high-speed computation.” The report also noted that research in the advanced design of computing machines should be recognized as being of basic importance: “it is desirable that the speed of computing machines be increased by a factor of at least 50 and that their capacity be substantially increased.”55 At its October 1955 meeting the panel recommended that “$5 million be expended for the development of a fast, large computing machine of advanced design.”56
Leading this panel was not the only instance in which von Neumann played a role in developing NSF’s computing facilities program. He had earlier proposed the stored-program concept in his “First Draft of a Report on the EDVAC,”57 and he built such a machine at the Institute for Advanced Study (IAS) in Princeton. Computer simulations were frequently used in both meteorology and nuclear weapons research, and von Neumann had realized that these two fields were closely connected scientifically: both were centrally concerned with highly nonlinear fluid dynamics.58 Von Neumann was the principal investigator on an NSF grant to organize the Conference on High-Speed Computing in Meteorology and Oceanography,59 held May 13–15, 1954, at the University of California, Los Angeles. Following this meeting, NSF funded the aforementioned advisory panel convened by von Neumann, then at the Atomic Energy Commission (AEC). In May 1956, von Neumann outlined to the National Science Board the needs for facilities that were critical to the advancement of science yet beyond the financial means of universities; the Board subsequently approved a computer facilities program.60 Von Neumann died early the following year.
The career of John Pasta connected von Neumann, his IAS machine, the AEC, the Los Alamos National Laboratory (LANL), and NSF. Pasta had a long and unusual career, beginning as a New York City police officer, then an Army Signal Corps officer, a physics PhD student, and eventually a staff member at Los Alamos. In 1953, Pasta, Stanislaw Ulam, and Enrico Fermi used the LANL MANIAC computer, based on von Neumann’s design for the IAS computer, to identify the Fermi-Pasta-Ulam (FPU) problem,61 a fundamental advance in soliton theory. In 1956, von Neumann invited Pasta to head what became the AEC Division of Mathematics and Computer Research. In 1961, Pasta left the AEC to join the University of Illinois as chair of the computer science department and later became director of the NSF Office of Computing Activities, director of the NSF Division of Computer Research (DCR), and director of the NSF Division of Mathematical and Computing Sciences (DMCS).
NSF continued to make grants for university computing centers and research in numerical analysis through the 1950s, for example at Cal Tech, MIT, Oregon State, Washington, and Wisconsin in 1956. Research grants went to Cal Tech, Berkeley, Cornell, MIT, Oregon State, Penn, Princeton, Purdue, Stanford, Washington, and Wisconsin the following year.
In July 1960, an institutional grants program was created to assist institutions in strengthening their general research and training functions. NSF made 6 grants in 1961 totaling $1,685,000 for the acquisition or rental of high-speed computers and 20 grants totaling $796,000 for computing centers and the procurement of small computers. Because its funding was limited, the Foundation restricted computer center support to an amount equal to 5% of a proposing institution’s research grant income, capped at $50,000 (later reduced to $37,500). Using this formula, NSF made institutional grants for computing infrastructure totaling $1,496,604 to 248 institutions; more than half the awards amounted to $2,000 or less, while just 10 institutions received the maximum grant of $37,500.62
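To make the formula concrete (an illustrative calculation, not figures drawn from NSF records): an institution with annual research grant income R qualified for an award of min(0.05 × R, $37,500). A school with $200,000 in research grant income would thus qualify for at most $10,000, while only institutions with research grant income of $750,000 or more reached the $37,500 cap.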
In June 1962, NSF Director Alan Waterman requested that the National Academy of Sciences’ National Research Council undertake a study of “the status and likely growth of computer uses. . . . ” J. Barkley Rosser prepared the National Academy of Sciences report, “Digital Computer Needs in Universities and Colleges.” The Rosser Report63 was completed in 1966 and made a strong case for universities having access to high-performance computers, but it said little about education. In 1963, given the magnitude of the need, the Foundation was able to provide only limited support for computing facilities. Institutions were required to provide as much as two-thirds of the purchase price from a non-federal source. Even though funding increased to $4,980,000 in fiscal year 1963,64 only 13 grants were made.
Arthur Grad, who administered the computer facilities grants at NSF beginning in 1959, recalled that the Rosser Report:
. . . all started with Phil Morse at MIT. They needed a bigger computer. They estimated they would need about ten million dollars. And I told them, well, there wasn’t much I could do about it since my entire budget was only five (million). And I suggested to him that probably the best thing he could do was to have a National Academy study done pointing out the need for more money for computers. So, the Academy duly appointed the committee to make those studies. . . . But it all started from Phil Morse’s need for a big computer.65
At the time Morse was seeking additional funding, MIT had received an IBM 7094 computer on which MIT faculty began development of the CTSS operating system.66 CTSS, a forerunner of Project MAC, Multics, and eventually Unix, was based on an idea of John McCarthy, then at MIT. In an influential memo titled “A Time-Sharing Operator Program for Our Projected IBM 709,” he proposed interactive time-shared debugging. Herb Teager and McCarthy gave a presentation entitled “Time-Shared Program Testing”67 at the national ACM meeting in September 1959.68 Much of the CTSS research was funded by NSF grants to the MIT Computation Center. This is clearly an example where fundamental advances