
Communications of the ACM

Historical reflections

When Hackers Were Heroes


Figure: male at glowing keyboard. Credit: Andrij Borys Associates, Shutterstock

Forty years ago, the word "hacker" was little known. Its march from obscurity to newspaper headlines owes a great deal to tech journalist Steven Levy, who in 1984 defied his publisher's advice by calling his first book Hackers: Heroes of the Computer Revolution.11 Hackers were a subculture of computer enthusiasts for whom programming was a vocation and playing around with computers constituted a lifestyle. Levy locates the origins of hacker culture among MIT undergraduates of the late 1950s and 1960s, before tracing its development through the Californian personal computer movement of the 1970s and the home videogame industry of the early 1980s. (The most common current meaning of hacker, online thieves and vandals, was not established until a few years later.)

Hackers was published only three years after Tracy Kidder's The Soul of a New Machine, explored in my last column (January 2021, p. 32–37), but a lot had changed during the interval. Kidder's assumed readers had never seen a minicomputer, still less designed one. By 1984, in contrast, the computer geek was a prominent part of popular culture. Unlike Kidder, Levy had to make people reconsider what they thought they already knew. Computers were suddenly everywhere, but they remained unfamiliar enough to inspire a host of popular books to ponder the personal and social transformations triggered by the microchip. The short-lived home computer boom had brought computer programming into the living rooms and basements of millions of middle-class Americans, sparking warnings about the perils of computer addiction. A satirical guide, published the same year, warned of "micromania."15 The year before, the film WarGames had suggested that computer-obsessed youth might accidentally trigger nuclear war.

Making hackers into heroes, rather than figures of fun or threat, was a bold move. Even within the computing community, "hacker" was an insult as often as a point of pride. Managers lamented the odd work habits and unmaintainable code produced by those who programmed for love rather than money, while computer scientists decried the hackers' focus on practice over theory. According to Levy, though, "beneath their often unimposing exteriors, they were adventurers, risk-takers, artists." His book established a glorious lineage for the amateur programmers of the 1980s, crediting their tribe with the invention of videogames, personal computing, and word processing.

The Hacker Ethic

There is a good chance you have read Levy's book, which sits perennially near the top of Amazon's computer history bestseller list and received more than 130 new citations in 2020. Even if you have not, you have probably come across his list distilling the "hacker ethic" into six bullet points. Levy proclaimed it "their gift to us" with "value even to those of us with no interest at all in computers." It goes:

  • Access to computers—and anything that might teach you something about the way the world works—should be unlimited and total. Always yield to the Hands-On Imperative!
  • All information should be free.
  • Mistrust authority—promote decentralization.
  • Hackers should be judged by their hacking, not criteria such as degrees, age, race, sex, or position.
  • You can create art and beauty on a computer.
  • Computers can change your life for the better.

Like an anthropologist visiting a remote tribe, Levy had the outsider perspective needed to recognize and document the core assumptions of an unfamiliar culture. In a sense those bullet points are timeless. If you are reading Communications you surely recognize some of these beliefs in yourself or in people you have worked or studied with. Yet reading the list is no alternative to understanding these beliefs in their original context. In this column I provide that context.

The MIT Hackers of the 1960s

The first part of the book, explaining the origins of the hacker ethic at MIT, is by far the most influential. The word hack was ingrained in MIT's culture. Like students elsewhere, MIT students liked to play pranks. For gifted and highly competitive engineers, pranking was a chance to show off with breathtaking but pointless feats of engineering creativity. The quintessential MIT hack took place in 1994, when what appeared to be a campus police car materialized on top of the Great Dome.

Computer programs that demonstrated great skill yet served no apparent utilitarian function could also be "hacks." The computer hacker community formed around MIT's TX-0 computer from about 1958, expanding when one of the first minicomputers, a Digital Equipment Corporation PDP-1, was installed in 1962. MIT was then a world leader in computing, thanks to clusters of expertise built around Project Whirlwind (an early digital computer that became a prototype of the SAGE air defense network), pioneering work on timesharing (which became Project MAC), and the Artificial Intelligence lab founded by John McCarthy and Marvin Minsky.


The shared joy of the hackers was manipulating the functioning of formal, rule-based systems to produce unanticipated results. Levy describes them hacking Robert's Rules of Order during meetings of the MIT Tech Model Railroad Club, hacking the English language to produce words that should logically exist like winnitude; even hacking the Chinese symbols printed on the menu of their favorite restaurant to create the inedible "sweet and sour bitter melon."

Computer programming offered unparalleled opportunities for the virtuoso manipulation of symbols. Levy demonstrates this by showing the hackers shaving instructions from a decimal print routine. A communal frenzy of competitive programming over several weeks culminated in the quietly triumphant posting of an optimal routine on a noticeboard. This was not so unusual: getting a computer to do anything during the 1950s required feats of efficient programming. When interviewed, computing pioneers often recall the joy of loader programs squeezed onto a single punched card, subroutine calling mechanisms that saved a few instructions, or assemblers that automatically distributed instructions around a magnetic drum so that they would be read just in time to be executed. Many celebrated computer scientists began as systems programmers, a group known for its unconventional appearance. As early as 1958, before either hackers or hippies were documented, a Business Week article complained that "computers have been in the wrong hands. Operations were left to the long-hairs—electronics engineers and mathematicians …"1 In 1966, as the Data Processing Management Association, a group for supervisors of administrative computing centers, began to contemplate a relationship with ACM, one of its leaders repeated a rumor that ACM members "were part of the sweatshirt and sneaker group."4

What was exceptional about MIT was not that it had a computer or that unkempt programmers were devising impressive tricks. It was that MIT had enough computers that a couple of surplus machines could be left out for members of the community to play with. Most computers were in the hands of specialist operators whom Levy, in an oddly anti-Catholic metaphor, dismisses as a "priesthood" standing between the faithful and direct access to the instruments of salvation. Students would submit programs and get results, but never touch the computer or interact with it directly. Even universities that treated computers like lab equipment, letting researchers sign up for an hour or two with the machine, required a documented purpose.

Levy carefully distinguishes his hackers from the "officially sanctioned users" for whom computing was a means to solve research problems and publish papers rather than an end in itself. For the hackers, most of whom started as undergraduates, computer programming was not an aid to formal studies but an alternative. Many of Levy's characters drop out of college to spend more time playing with the machines. One hacker implements the first LISP interpreter, others leave for California to develop operating systems at Berkeley, and Minsky hires several to produce software for his lab. Yet Levy keeps their official work on timesharing and AI offstage, instead focusing on their nighttime pursuits. They are a rich cast, and Levy does a wonderful job of bringing them to life as quirky individuals with their own characteristics rather than as interchangeable geeks.


Freed of the need to demonstrate any useful purpose for their programs, the hackers pioneered applications of computer technology that became widespread once hardware costs dropped. It helped that the PDP-1 was equipped with a vector-based graphical display, an unusual capability for the era. One of their programs was the "expensive typewriter," which used the screen to edit program code. It made little economic sense to tie up an entire computer to program, which is why the usual approach was to write code out with pencils, to be punched onto cards or paper tape. Another was the "expensive calculator," which replicated the interactive functioning of an electromechanical desk calculator on a device hundreds of times more costly.

Levy spends an entire chapter on the most glorious misapplication of resources undertaken by the hacker collective: the video game Spacewar (or, as its main author Steve Russell likes to call it, "Spacewar!"). Inspired by old science fiction books, the hackers programmed routines to simulate and visualize the movement of rocket ships in space. Adding photon torpedoes, an intricate starfield background, and the gravitational pull of a star created an addictive combat game. This was not quite the first videogame, but it was the first to matter. DEC began to distribute the code as a diagnostic for its computers, spreading it to many other sites. Spacewar was profiled in a 1972 Rolling Stone article by Stewart Brand, founder of the Whole Earth Catalog, bringing it more fame.2 By then, Nolan Bushnell and Ted Dabney had reimplemented the game in hardware as Computer Space, the first coin-operated video arcade game. It proved too complicated for drunken users in bars, but Bushnell and Dabney kickstarted the video arcade industry with their next release: Pong.

Celebrating Hacker Culture

The original hackers were neither destructive nor dedicated to the pilfering of proprietary data, unlike the online vandals and criminals who later appropriated the word, but they were quite literally antisocial. Levy describes their lack of respect for any rules or conventions that might limit their access to technology or prevent them from reconfiguring systems. They are seen by-passing locked doors, reprogramming elevators, and appropriating tools.

Most technology writers can be pegged as critics or cheerleaders. To the cheerleaders, new technologies open utopian possibilities and unlock human potential. To the critics, each new technology is a study in unintended consequences or a way to reinforce injustice and oppression. Levy is not uncritical, but he is unmistakably more interested in capturing how his protagonists view the world than in hectoring them. The book's subtitle, "Heroes of the Computer Revolution," does not admit very much nuance.

Not all observers of hacker culture were so accepting. Levy rejected MIT professor Joseph Weizenbaum's portrayal of the institute's "computer bums" (a term borrowed from Brand), which recalled the sordid opium dens found in Victorian novels: "bright, young men of disheveled appearance, often with sunken glowing eyes, can be seen sitting at computer consoles, their arms tensed and waiting to fire their fingers, already poised to strike, at the buttons and keys on which their attention seems to be as riveted as a gambler's on the rolling dice …. Their food, if they arrange it, is brought to them: coffee, Cokes, sandwiches. If possible, they sleep on cots near the computer …. Their rumpled clothes, their unwashed and unshaven faces, and their uncombed hair all testify that they are oblivious to their bodies and to the world in which they move."18

MIT professor Sherry Turkle presented an equally biting picture of MIT's hacker culture in her ethnographic study The Second Self: Computers and the Human Spirit, another classic study of early computer use.16 As a humanist joining MIT's faculty, she immersed herself in what she called "a world that was altogether strange to me." Turkle spent most of the book exploring the cognitive possibilities computing opened for education and personal development. Yet she used the hackers primarily as a cautionary illustration of what happens when human development goes wrong.

Hacker life was for the most part celibate, but it was nevertheless highly gendered. Levy writes that "computing was much more important than getting involved in a romantic relationship. It was a question of priorities. Hacking had replaced sex in their lives." Women were almost invisible, as hackers "formed an exclusively male culture." "The sad fact," notes Levy, "was that there never was a star-quality female hacker. There were women programmers, and some of them were good, but none seemed to take hacking as a holy calling …" Levy's silence on other matters communicates that his hackers were white and that the sex they were not having was with women, the default assumptions of that era if not, thankfully, of ours.

Hackers had created a new masculine space as culturally distinctive as the Catholic priesthood or the U.S. Marine Corps. Turkle judges the hackers more harshly than Levy for these choices. Even within the dysfunctional culture of MIT, she suggests, computer science students were the "ostracized of the ostracized … archetypal nerds, loners, and losers."16 Her chapter "Hackers: Loving the Machine for Itself" begins by describing MIT's anti-beauty pageant, an annual competition to choose "the ugliest man on campus." This, she suggests, is evidence of a social illness of self-loathing that "accepts and defensively asserts the need for a severed connection between science and sensuality." According to Turkle, hackers had "got stuck" part way through the normal course of psychological development, in which adults make accommodations with what hackers called the "real world" of human relationships, jobs, and personal responsibilities. That means accepting uncertain outcomes and emotional risks, by giving up the adolescent need for "perfect mastery" of a controlled world of things. Hackers refused to do this, instead creating a "highly ritualized" culture to support and normalize that choice. She was appalled by their rejection of the sensual elements of art and culture, particularly their tendency to hear music only as an expression of algorithmic progression rather than an activity in which human emotion and instrument tuning were important.

Levy does acknowledge some negative aspects of hacker society. Hackers judged each other purely on programming skill and commitment to hacking, rather than on more conventional social markers. They were elitist, making harsh judgments of "winners and losers" based on an ethical code that privileged coding ability and commitment to programming over all other virtues.

The closest Levy came to direct criticism was flagging the most glaring contradiction of hacker life: its relationship to the military-industrial complex. Hacker values decried both commercialism and hierarchical authority, favoring the free exchange of code and ideas between individuals. Yet the TX-0 had been paid for by the government. Having fulfilled its military purpose, it could be diverted for student use. And, as Levy notes, all of the AI lab's activities, "even the most zany or anarchistic manifestations of the Hacker Ethic, had been funded by the Department of Defense." In the late 1960s, Levy's hackers, enjoying a "Golden Age of hacking" in the AI lab above Technology Square, were mystified to see protestors outside opposing the role of computing in the Vietnam War. As Levy puts it, "a very determined solipsism reigned on the 9th floor" as the hackers denied any connection between the geopolitics of the Cold War and their military-sponsored anarchist utopia, now protected by steel barricades and electronic locks. (Solipsism is the attitude that nothing outside one's own mind is clearly real.)

The Californian Hackers of the 1970s

In the second part of the book, Levy moves to California, where government and military contracts had nurtured the production of semiconductors and electronic devices by companies in what, for the first time, people were starting to call Silicon Valley. Hacker culture arrived, in his telling at least, via Stanford University's Artificial Intelligence lab, with its strong ties (including the newly constructed ARPANET) to MIT. In California, hacker culture merged both with local countercultural movements and with preexisting communities of electronics hobbyists and professionals.

Hackers is not, alas, as well engineered as The Soul of a New Machine. Reviewing the book for the New York Times, Christopher Lehmann-Haupt noted that it starts "to limp halfway through—to bog down in details that are somehow less and less exciting."8 Part of the problem is overfamiliarity. The MIT hackers are known to most of us only through Levy's reporting, whereas the founding of the personal computer industry was well covered in two other 1984 works: Fire in the Valley6 and Michael Moritz's definitive rendition of the early Apple story The Little Kingdom.14 Dozens of subsequent retellings have followed the same basic outlines, most notably Robert X. Cringely's scurrilously entertaining Accidental Empires, Walter Isaacson's exhaustive biography of Steve Jobs, and several movies and television shows.3,7

Levy's distinctive twist was his focus on the story's countercultural strands, personified in his central character for this section: Lee Felsenstein, a gifted electronic engineer and committed member of the Berkeley counterculture. Fred Turner's classic book From Counterculture to Cyberculture, focused on Stewart Brand, showed deep connections between cybernetic ideas developed in the early Cold War and elements of the Californian counterculture of the late 1960s. Their interaction did more to shape future politics, culture, and the application of online communication than to spur the development of core computing technologies.17 Felsenstein, in contrast, provided a rare direct connection between the classic Berkeley antiwar variant of the counterculture and the emerging personal computer industry of the mid-1970s. Joining a collective that had appropriated an obsolete minicomputer but lacked the skills and work ethic to do much with it, he created the "Community Memory Project," a short-lived online community accessed via public terminals.

Felsenstein's commitment to timesharing was soon tempered by the realization that the microprocessors and memory chips he planned to use for cheap video terminals could also power freestanding computers. Levy gives a vivid description of the Homebrew Computer Club, an informal group hosted on the Stanford campus that introduced the technologies of personal computing to the Bay Area community of electronics hobbyists. It inspired Felsenstein to create the Sol 20 personal computer, an elegant design optimized for easy repair even after civilization collapsed. Because Felsenstein's business partners had little knack for business, the Sol was quickly eclipsed, though he reappeared a few years later as designer of the budget-priced, suitcase-sized Osborne 1 portable computer.

Levy's romantic attachment to the "hacker ethic," similar in a way to Tracy Kidder's celebration of engineers who worked to find meaning rather than to make money, creates unresolved tensions here. Personal computers were not given away free, but then neither were minicomputers. The original hackers relied on other people's money. The invention of computers that could be purchased by individual users broadened access to hacking and freed it from military patronage. By the early 1980s recreational programming was a feasible hobby for millions of (mostly middle-class) Americans rather than the exclusive preserve of tiny communities centered on places like MIT and Stanford.a

Felsenstein is an undeniably fascinating character. Yet Levy's insistence on personal computing as the expression of an anticommercial, university-derived hacker ethic makes it hard for him to deal with the success of Apple, and the often overlooked Radio Shack and Commodore, in selling hundreds of thousands of computers to individual buyers by the end of the 1970s. Steve Wozniak, Apple's founding engineer, gets a fine portrait that helped to establish him in the public imagination as the embodiment of the hardware hacker, more interested in impressing fellow hackers with the elegance of his circuits than in making money. Despite Wozniak's personal virtue, implies Levy, Apple soon betrayed the hacker ethic. As Lehmann-Haupt acidly observed, "it's hard to tell whether [Levy] is celebrating the arrival of an inexpensive home computer or lamenting its astonishing profitability."

The Videogame Hackers of the Early 1980s

The third part of Levy's book is the narrowest: a portrait of the relationship between a young videogame programmer, John Harris, and his personal computer software publisher. Harris's biggest accomplishment, a skilled conversion of the arcade game Frogger, did little to alter the course of history. Instead he serves as an everyman programmer, representing the new commercial opportunities for self-trained software developers.

The shy and unworldly Harris was part of an early-1980s generation of teenage computer programmers. Home computers were marketed as programming machines, displaying a BASIC command prompt when they were plugged in and connected to a television set. The most dedicated programmers graduated to assembly language, like the original MIT hackers. Replicating the smooth animations of coin-operated arcade games on consumer hardware required code perfectly timed to manipulate the unique quirks of each machine. The best programmers, like Harris, tended to work alone and confine themselves to the hardware of a single machine, in his case the Atari 800, for which he had to figure out undocumented features of the sound and graphics chips. Harris never fully moved on to later platforms or more modern development methods, continuing to code for the long-obsolete Atari computers.b

Their games were distributed using a business model borrowed from rock music and book publishing—"software houses" packaged, promoted, and distributed the programs, paying royalties to their authors. Levy casts Porsche-driving college dropout Ken Williams, co-founder and manager of the fast-growing publisher Sierra On-Line, as the villain. Williams makes millions of dollars from the efforts of Harris and the other young programmers. Although Williams works hard to "get Harris laid," he resents paying a 30% royalty and hates being reliant on unpredictable hackers. Williams therefore colludes with venture capitalists, hires a professional manager to bring order, and flirts with software engineering methods taken from large corporate projects. His industry embraces un-hackerlike behavior such as intellectual property lawsuits and copy protection.

I think of this section as a long magazine article that was somewhat arbitrarily bound in the same volume as Levy's historical research. Lehmann-Haupt complained that each section of the book "seems more trivial than the one preceding it." If, he suggested, "the point of the entire computer revolution was to try to get a frog across a road and stream without being either run over by trucks or eaten by crocodiles, then it's not only unsurprising that the hacker ethic died; it isn't even sad."

That is not entirely fair, but Frogger does make an odd end to the main story. Levy refused to judge hackers for failing to shower, but he did not hesitate to condemn them for selling out their values. He was once a writer for Rolling Stone, and that attitude mirrors the culture of old-school music journalism in which beloved artists were expected to disdain commercialism while selling millions of albums. Rock journalists sneered at record labels and their besuited executives for their unseemly interest in making money and vilified them for placing constraints on artistic freedom.

The "Last True Hacker"

Levy finishes with an epilogue on Richard Stallman, who had recently launched an apparently quixotic effort to implement a free version of the hacker-friendly UNIX operating system. Stallman worked at MIT's AI lab in the 1970s but became lonely when fellow hackers left to build and sell specialized LISP workstations.

The chapter's title, "The Last of the True Hackers," gives an idea of how likely Levy thought Stallman was to succeed in his effort (or even to recruit a successor). Yet within a decade, software produced by Stallman's GNU project and Linus Torvalds' work to replicate the Unix kernel had begun to challenge commercial versions of Unix. The GNU project pioneered a new model of software licensing that protected the rights of users to adapt and redistribute the software for their own needs, replicating key aspects of the original hacker culture. By the early 2000s, free software was eclipsing commercial rivals in crucial areas such as Web browsers and servers, database management systems, and programming platforms. Dominant operating systems such as Google's Android platform are built on top of free software. A broader open culture movement, similarly inspired by the hacker ethic, has produced essential resources such as Wikipedia.

Levy's decision to end the book with Stallman, drawing a direct line from the MIT hacker community to today's world of free and open source software, has held up much better than his premature suggestion that commercialism had killed the hacker dream. In the 25th anniversary edition of Hackers he acknowledged that "Stallman's fear that he would become like Ishi, the last Yahi, was not realized." Instead, he observed, some of the ideas in the hacker ethic "now seem so obvious that new readers may wonder why I even bothered writing them down."10


Levy did not just capture hacker culture; he spread it to many who would never set foot in MIT, Stanford, or the Homebrew Computer Club. Since Levy wrote his book, hacker culture has become far more visible thanks to the success of the free software movement and related open culture projects such as Wikipedia. These inspired anthropological and sociological studies by scholars such as Gabriella Coleman and Christopher Kelty. Others have made broad claims for hacking as an activity central to the modern world. McKenzie Wark, for example, issued A Hacker Manifesto, which posited the emergence of a hacker class and mimicked the 1848 Communist Manifesto in its call for hackers to rise up against the oppressive "vectoralists" of capitalism.13

The mainstreaming of hacker culture may have changed the character of computer science itself. The proportion of computer science students who were female rose steadily from the field's beginnings in the 1960s until 1985, when it began a precipitous fall even as women's participation in other science and engineering disciplines continued to rise.c That is a few years after the typical first experience of computing shifted from a tool encountered in an academic context to a recreational home device for videogame playing and hacking. Computer scientists began to complain that the minds of incoming students were now contaminated by exposure to undisciplined programming methods. As Edsger Dijkstra memorably put it, "It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration."5 Correlation is not causation, but it certainly seems plausible that images of bedroom hackers and victims of "micromania" created a polarizing association of computing with a new and distinctive form of masculinity. And, as Levy himself acknowledged, classic hacker culture neither attracted nor accommodated young women.

Hacker Hybrids

Although the free and open source software movements are thriving, Levy, correctly I think, notes that the influence of hacker culture now undergirds a "world where commerce and hacker were never seen as opposing values." In the 2010 epilogue he chatted amiably with hacker nemesis Bill Gates and mentioned that a Google executive had credited the book with inspiring his entire career. Back in the 1980s, Turkle had complained that MIT's hackers refused to grow up and join the "real world." As Levy showed in his 2011 book In the Plex, a closely observed study of Google, tech giants have created amenity-filled Never Lands where hacker-infused cultures reign unchallenged and nobody ever has to grow up.12 Levy's titular insistence that hackers were the true "heroes of the computer revolution" was echoed thirty years later when Walter Isaacson titled his blockbuster history The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution. Where Levy had described a marginal, usually overlooked subculture of computing, Isaacson's title implied that the innovators who created personal computers, the Internet, and the modern tech industry, many of them spectacularly wealthy, famous, and powerful, somehow composed a single geeky group of hackers.

In preparing a new overview history of computing, I found myself quoting and referencing Levy more often than any other writer—not just Hackers but also his writing about the Macintosh project, the iPod, VisiCalc, Google, and Facebook. Levy's great strength is his focus on people. Time after time, he has delivered the most closely observed accounts of the most important tech companies. Levy is to computer companies what veteran political journalist Bob Woodward is to presidential administrations. And like Woodward, he receives that insider access because his sources have a reasonable expectation that they will be portrayed sympathetically.

Levy's writing suggests he is drawn to smart, eccentric, and creative people. He wants to see the best in them. In Hackers that works wonderfully, because his sympathetic gaze was turned on obscure characters who might be harshly judged by casual observers. He described the blinkered world view of his characters, their lack of respect for regulations and institutions, and their conviction that nothing should stand between them and the possibilities opened by new technologies. When the hackers saw something that struck them as inefficient or illogical they would go ahead and redesign it, without seeking permission or investigating alternative perspectives. That is not so far from Facebook's motto of "move fast and break things," or from an admonition of Zuckerberg's that Levy quotes in the introduction to his recent Facebook: The Inside Story: "think of every problem as a system. And every system can be better."9

The blending of hacker culture with big tech dominance concerns me, because a mind-set that might seem harmless, or at worst narrowly self-destructive, in obsessive systems programmers is more worrying in executives running the world's most powerful corporations. Consider the unchecked personal power of Peter Thiel, Elon Musk, or Mark Zuckerberg to circumvent government regulation, manipulate the legal system, or short-circuit the democratic process. Indeed, when writing about Facebook Levy was forced by changes in public opinion and the stirring of his own conscience to treat the firm and its leaders far more harshly than he had Google a decade earlier. The "determined solipsism" that engulfed the 9th floor of MIT's Tech Square back in 1968 now hangs like a dense cloud over much of Silicon Valley. Perhaps we need some different heroes.

Further Reading

  • Pretty much anything by Levy is worth reading. My personal favorite is his 1984 essay "A Spreadsheet Way of Knowledge" (republished at https://www.wired.com/2014/10/a-spreadsheet-way-of-knowledge/), a deeply perceptive appreciation of the early impact of spreadsheet software. If you enjoyed Hackers because it shows eccentric men doing stubbornly creative things in academic environments, then you are more likely to enjoy his follow-ups Artificial Life: The Quest for a New Creation (Pantheon, 1992) and Crypto: How the Code Rebels Beat the Government, Saving Privacy in the Digital Age (Viking, 2001) than his later books on Apple, Facebook, and Google.
  • I already mentioned the work of Chris Kelty and Gabriella Coleman on hacker culture. But if you are interested in how the other sense of hacker, the online vandal or data thief, came to predominate, then there are several key books from the 1980s that helped to spread the new meaning to a world still unfamiliar with online communication. These include Out of the Inner Circle by Bill Landreth and Howard Rheingold (Microsoft Press, 1985) and The Cuckoo's Egg: Tracking a Spy Through the Maze of Computer Espionage by Clifford Stoll (Doubleday, 1989).
  • Law professor and activist Lawrence Lessig played a crucial role in broadening the free software movement to a more general free culture movement. His classic contribution, Code and Other Laws of Cyberspace (Basic Books, 1999), remains readable and provocative.
  • Levy stressed the playful nature of early hacker culture. A similar sensibility drove the cult favorite Gödel, Escher, Bach: An Eternal Golden Braid (Basic Books, 1979), by physicist turned cognitive scientist Douglas Hofstadter. Although Hofstadter denies any personal interest in computers, his book showcases the hacker fondness for word play, recursion, baroque music, mathematical codes, and the manipulation of symbols.

References

1. Anonymous. Business Week reports to readers on: Computers. Business Week (June 21, 1958).

2. Brand, S. Spacewar: Fanatic life and symbolic death among the computer bums. Rolling Stone (Dec. 7, 1972), 50–58.

3. Cringely, R.X. Accidental Empires: How the Boys of Silicon Valley Make their Millions, Battle Foreign Competition, and Still Can't Get a Date. Addison-Wesley, Reading, MA, 1992.

4. Data Processing Management Association, Executive Committee Meeting Minutes, Aug. 5–6, 1966, contained in Data Processing Management Association Records (CBI 88), Charles Babbage Institute, University of Minnesota, Minneapolis.

5. Dijkstra, E.W. EWD 498: How do we tell truths that might hurt? In Selected Writings on Computing: A Personal Perspective. Springer-Verlag, New York, 1982.

6. Freiberger, P. and Swaine, M. Fire in the Valley: The Making of the Personal Computer. Osborne/McGraw-Hill, Berkeley, CA, 1984.

7. Isaacson, W. Steve Jobs. Simon & Schuster, New York, NY, 2011.

8. Lehmann-Haupt, C. Hackers as heroes. New York Times (Dec. 24, 1984); https://nyti.ms/3uapqyn

9. Levy, S. Facebook: The Inside Story. Blue Rider Press, New York, 2020.

10. Levy, S. Hackers. O'Reilly, Sebastopol, CA, 2010.

11. Levy, S. Hackers: Heroes of the Computer Revolution. Anchor Press/Doubleday, Garden City, NY, 1984.

12. Levy, S. In the Plex: How Google Thinks, Works, and Shapes Our Lives. Simon & Schuster, New York, NY, 2011.

13. Wark, M. A Hacker Manifesto. Harvard University Press, Cambridge, MA, 2004.

14. Moritz, M. The Little Kingdom: The Private Story of Apple Computer. William Morrow, New York, NY, 1984.

15. Platt, C. and Langford, D. Micromania: The Whole Truth about Personal Computers. Sphere, London, 1984.

16. Turkle, S. The Second Self: Computers and the Human Spirit. Simon and Schuster, New York, NY, 1984.

17. Turner, F. From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism. University of Chicago Press, Chicago, IL, 2006.

18. Weizenbaum, J. Computer Power and Human Reason: From Judgment To Calculation. W.H. Freeman, San Francisco, CA, 1976.

Author

Thomas Haigh (thomas.haigh@gmail.com) is a professor of history at the University of Wisconsin—Milwaukee and a Comenius Visiting Professor at Siegen University. He is the author, with Paul Ceruzzi, of A New History of Modern Computing, to be published by MIT Press later this year. Learn more at www.tomandmaria.com/tom.

Footnotes

a. Educational timesharing systems also played an important part in democratizing computing, largely overlooked by Levy but explored recently in Joy Lisi Rankin, A People's History of Computing in the United States (Harvard University Press, Cambridge, MA, 2018).

b. See https://dadgum.com/halcyon/BOOK/HARRIS.HTM

c. For a recent summary of the huge literature on this topic, see Misa, Thomas J. "Gender Bias in Computing." In Historical Studies in Computing, Information, and Society. William Aspray, Ed., Cham, Switzerland: Springer Nature, 2019, 113–133.

This work was funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) - Project-ID 262513311 - SFB 1187 Media of Cooperation.


Copyright held by author.