Code Nation. Michael J. Halvorson

This counterculture narrative emerged in the 1970s when a group of West Coast technology enthusiasts argued that using and programming computers should be an enlightening, communal experience. These values strongly shaped some segments of the computer industry, including the microcomputer community in California and authors of programming tutorials in the 1970s and 1980s.

      The third mythology relates to a belief about the commitments of professors and administrators in the emerging discipline of computer science. In the U.S., many computer professionals came to believe that computer scientists were occupied primarily with theoretical problems related to computational logic, algorithms, and engineering principles, rather than the practical skills needed to implement projects in the computing industry. Although some academics did assume an aloof posture in relation to business computing, this stereotype was largely inaccurate, and it had important consequences for how professional programmers were trained (or not trained) in the coming decades. Finally, there are several mythologies related to what is often called “the PC Revolution,” a phrase that tries to capture the excitement surrounding the creation of the first stand-alone microcomputers and personal computers (PCs) in the 1970s and 1980s. This term draws attention to vital energies in the American computer industry, but it also tends to lionize the experience of PC users and entrepreneurs over professionals working in other areas of digital computing. After gently nudging aside this rhetoric, Code Nation proceeds by exploring how the microcomputer movement actually did contribute in important ways to the development of programming culture and the commercial software industry in America. We will investigate how this upward trajectory took place in waves or stages—from the time-sharing systems of the 1960s and 1970s, to the bare-bones microcomputers and PCs of the late 1970s and early 1980s, to the powerful graphical user interface (GUI) workstations of the late 1980s and early 1990s, to the corporate and enterprise computing systems of the late 1990s and early 2000s. These stages involved fascinating operating system platforms, including CP/M, Apple DOS, MS-DOS, the Apple Macintosh, Microsoft Windows, Unix/Xenix, OS/2, and Windows NT Server.

      By giving powerful computing mythologies their due, we acknowledge the importance of cultural memories in the history of business and technology, including the problems that people encountered in the past, and the aspirations of users and programmers for the future. The learn-to-program movement succeeded in part because it wove together each of these mythologies, creatively transforming past memories into a shared vision of progress and human belonging. The movement’s visionaries, authors, tinkerers, and entrepreneurs deserve recognition for their contributions to the gradual computerization of society, a process with major cultural and economic consequences that is still underway.

      We’ll begin with a technical problem and a story.

      In October 1968, there was a sense of crisis in the air.

      Although this year has been described as one of the most turbulent in the 20th century, the turning point was not related to political or military disruptions, but to a crisis in the nascent field of software engineering. In fact, executives in North America and Europe had been sounding the alarm since the mid-1960s. Now that powerful mainframe computers were transforming the world’s business and engineering systems, the software that drove these machines was taking on an oversized role in public life. Just weeks before Richard Nixon, Hubert Humphrey, and George Wallace faced off in the 1968 American presidential election, the world’s engineers were worrying about computers and software.

      To list a few of the problems, there was a perpetual shortage of programmers to create software for the new systems. These programs were often massive, stretching to tens of thousands—even millions—of lines of code in computer languages like COBOL, FORTRAN, and ALGOL. The code configured American military systems and corporate data processing tools—programs like the banking, billing, and reservation systems that proliferated in the late 1960s. However, good software developers were hard to find. There was no clear procedure for locating, hiring, and training the specialists needed to build and maintain the required systems.

      The growing complexity of software also required robust management techniques to ensure that projects were completed on time and on budget, but neither outcome was very common. To make matters worse, even growing professional organizations like the Association for Computing Machinery (ACM) found it difficult to improve programmer productivity or software quality. Although revenue was pouring in to successful American hardware manufacturers like IBM, Burroughs, and Digital Equipment Corporation (DEC), the budding software industry seemed undisciplined in its workflows, ill-prepared to expand, and in a perpetual state of disorder.

      Much has been written about the “software crisis” of the late 1960s, and some have argued that “software engineering” was the wrong metaphor to address the problems.3 But the dilemma was noted by many computer professionals around the world, from North America to Asia to Europe. The 1960s was a time of expansion, as organizations were drawn to utopian visions of mainframe and minicomputer technologies. But reassessments soon followed, and critics pointed to bloated software systems that were complex and error-prone; programs designed for engineers with pocket protectors but not real people. The job performance of corporate software developers also came under fire. “When a computer programmer is good, he is very, very good,” concluded one IBM study published in 1968. “But when he is bad, he is horrid.”4

      Figure 2.1 IBM executives face the camera in front of a bank of IBM 729 magnetic tape drives, 1962. (Photo by The LIFE Picture Collection via Getty Images)

      The gender implications embedded in this statement are subtle, but important to catch. In 1968, approximately 88% of professional programmers in the U.S. were men. Although women made significant contributions to computing in the 1950s, programming work underwent a process of masculinization in Britain and America in the 1960s and 1970s.5 As part of this transition, cultural stereotypes of programming as a male activity intensified, and the era’s sources often associate programming with masculinity. The implications of the underrepresentation of women in computing will be examined in Chapters 3, 7, 8, and 10, which explore how Americans learned to code, and how new programmers negotiated for status in the communities that either welcomed or rejected them. Keep an eye on this complex issue; it surfaces in predictable but also unlikely settings. (See Figure 2.1.)

      To address the global software crisis, the Science Committee of NATO sponsored a conference in October 1968, in the Garmisch-Partenkirchen district of West Germany (Bavaria). The organizers planned to discuss the design and production of computer software, the experiences of software users, and the persistent problem of meeting software schedules and budgets. The term “Software Engineering” was chosen as a framing title for the meetings, hinting at a proposed solution to the dilemma: The computer industry should infuse programming with theory and practice from the disciplines of science and engineering to address its problems. It was high time to demand structured approaches to design, coding practices, and testing that would improve reliability. If this did not happen soon, the complexity of software, and its concomitant unpredictability, would likely stifle the electronic computer revolution.

      The 5-day conference was attended by 50 people from 11 countries, with

