Lost and Found
In December of 1922, a young woman named Hadley Richardson decided she would give her husband, Ernest Hemingway, a surprise. Hem was over in Switzerland to report on the Lausanne Peace Conference, and Hadley had been planning a train trip to visit him. While Hemingway was a fine newspaper reporter, Hadley knew that his real passion was for fiction, though he hadn’t managed to publish any of his stuff yet. So she hit upon a wonderful idea. She searched through their Paris apartment, rounded up all the short stories he had been working on, and packed them into her suitcase. That way, when she arrived, he’d be able to work on them in Lausanne.
But Hadley was a bit ill the day of the trip. So she hopped off the train, just before it pulled away, to grab a bottle of Evian. When she returned, the suitcase was gone. Sobbing the entire length of the train ride, she eventually arrived in Lausanne and broke the news to her husband. Hem only laughed. Didn’t she know he had carbon copies of all his stories? Then came the turn of the screw: because she had wanted to be thorough, she’d packed the carbons in the suitcase too.1
If you had asked Hem in, say, November of 1922 whether it would have been a good idea to destroy all of his work, the answer probably would have been a resounding gtfo. But the right answer may well have been yes. We’ll never know what those lost manuscripts looked like. But a good guess is that they did not look like the stripped-down, polished stories that would go on to permanently change English prose style.
This isn’t a unique story. Plenty of writers, at least, have had similar experiences. In fact, some make a habit of it. Philip Roth would write at least 100 pages bound for the wastebin before he figured out where his novel would begin.2 And a 1956 time-lapse film of the artist at work shows that Pablo Picasso would typically run through five to twenty separate ideas on a single canvas before settling on the painting that would eventually get fleshed out.3
Probably not a lot of people think about the internet in this way: as a first draft that might be just as easily discarded as kept. But if you take the long view, the thirty years over which the internet has come to dominate culture is not very long at all. And the recent history of computing, which is, in large part, the history of the internet, may turn out to look less like the opening chapter, and more like a series of early stories that we’re glad to have left back at the Gare du Nord.
And then there was the internet before the internet, which was not really the internet at all. These days, the terms networked computing and the internet are used almost interchangeably. But the first popular way to connect from one’s own personal computer to another’s was directly, through a Bulletin Board System, or BBS.
The design of these systems, as one might guess, was modeled on physical bulletin boards. Anyone with a modem hook-up could dial straight into a given board and find games, messages, and file exchanges. The philosophy was a lot like that of Lee Felsenstein’s Community Memory, the terminal for the people in the streets of Berkeley—only in the case of BBSes, you could dial in from anywhere, instead of having to visit one specific spot.
The story goes that the first BBS, known as CBBS, was invented during a particularly bad Chicago blizzard, when, for want of a good way to contact their computer club colleagues, CBBS founders Ward Christensen and Randy Suess simply invented one. The system they created ran over telephone wires: a user dialed straight in to a given BBS, which meant the host machine had to be powered up and waiting for the call. Users were also confined by, or perhaps liberated by, the challenge of staying within a system defined by the ASCII and ANSI character sets.4 (Some BBSes also used graphical user interfaces, resembling what we see when we look at our computer screens today.)
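To make the mechanics concrete, here is a minimal sketch, in Python, of the kind of single-caller, plain-ASCII exchange a BBS host conducted. This is only an illustration of the shape of the interaction, not any historical board’s software: a TCP socket stands in for the modem and phone line, and the port number and menu text are invented.

    import socketserver

    WELCOME = (
        b"\r\n*** TOY BOARD ***\r\n"
        b"[M]essages  [F]iles  [G]oodbye\r\n> "
    )

    class ToyBoardHandler(socketserver.StreamRequestHandler):
        """Serve a caller a plain-ASCII menu, roughly the shape of an
        early dial-up BBS session."""

        def handle(self):
            self.wfile.write(WELCOME)
            for line in self.rfile:  # one command per line from the caller
                choice = line.strip().upper()
                if choice == b"M":
                    self.wfile.write(b"No new messages.\r\n> ")
                elif choice == b"F":
                    self.wfile.write(b"File area is empty.\r\n> ")
                elif choice == b"G":
                    self.wfile.write(b"NO CARRIER\r\n")
                    return
                else:
                    self.wfile.write(b"?\r\n> ")

    if __name__ == "__main__":
        # As with a real BBS host, the machine must be powered up and
        # waiting before anyone can call in.
        with socketserver.TCPServer(("", 2323), ToyBoardHandler) as server:
            server.serve_forever()

Point a telnet client at port 2323 and the whole session is line-oriented text, much as a caller’s terminal program would have displayed it.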
According to the Fidonet History Timeline,5 the number of active BBS nodes peaked in August 1995, with a total of 35,787. Today that number, according to the Telnet BBS Guide,6 is down to 946. What happened in the interim, of course, was the internet. The switch wasn’t immediately beloved by all. As one former BBS user would put it, “That personal connection was sorely missing on big-name online subscription services of the time—Prodigy, CompuServe, and AOL. Even today, the internet is so overwhelmingly intertwined that it doesn't have the same intimate feel.”7 What it would have, eventually, was storage space, reliability, and, thanks to some savvy designers, easy onboarding for those who only knew of personal computers from a Super Bowl ad.8
PRINT 2+2
Marc Andreessen did not have an excessively cheery childhood. As his current business partner Ben Horowitz has put it, comparing Andreessen to none other than Kanye West, “his childhood was so intensely bad he just won’t go there.” Andreessen grew up in the hinterlands of New Lisbon, Wisconsin, a “no-stoplight town” populated in large part by “Scandinavian, hard-core, very self-denying people who go through life never expecting to be happy.” Describing, for instance, his attitude toward independent bookstores, whose Amazon-induced demise has been widely bemoaned, Andreessen notes that “there weren’t any where I grew up. There were only ones in college towns. The rest of us could go pound sand.”9
From this perspective, it’s not hard to see how Andreessen arrived at one of the insights—perhaps the insight—that would decide the future of computing. In 1990, CERN researcher Tim Berners-Lee announced a fairly rarefied-seeming project that he was calling the “World Wide Web.” “The project,” Berners-Lee explained, was “based on the philosophy that much academic information should be freely available to anyone.”10 A close reading of that clause might lead one to zero in on the word academic. Berners-Lee, for the most part, was imagining a way for scholars to share research papers. It would be fair to assume that he did not have in mind lolcats or porn. According to Marc Andreessen, he didn’t even have in mind pictures. “He only wanted text,” Andreessen said of Berners-Lee. “His view was that images are the first step on the road to hell. And the road to hell is multimedia content and magazines, garishness and games and consumer stuff. I’m a Midwestern tinkerer type. If people want images, they get images. Bring it on.”11 This was his thinking when he created the first internet browser for the average man, X Mosaic. Immediately, it was a hit. And while Andreessen’s invention would eventually get co-opted by the government-funded research center he was working for when he built it, he himself headed to Silicon Valley to form Netscape, a company that would become paradigmatic for all start-ups from that day hence, and whose browser, Netscape Navigator, soon came to dominate the market.
Netscape and the browsers that followed it made the internet something that could be fun and interesting for everyone from seasoned computer programmers to complete noobs. But the other necessary element was, obviously, personal computers. As is so often the case, the first pioneer here was not the person who would profit most from his invention.
Born in Miami in 1941, Henry Edward “Ed” Roberts was a mammoth of a man at 6’4” and 250 pounds, who would “consume shelves in libraries” in order to learn about whatever subject he might fancy. Earlier in his life, Roberts had acquired experience in cryptography, rocketry, and bee-keeping. And then there was the subject of computing. After settling in Albuquerque, New Mexico, Roberts founded a company, Micro Instrumentation & Telemetry Systems (MITS), focusing on the manufacture of model rocket equipment and calculators. But by the mid-70s, MITS was on the verge of crumbling. Its bread-and-butter business selling make-your-own-calculator kits had been severely undercut by larger corporations that were able to keep prices low.12
Then Roberts got one of his wildest ideas of all: a microcomputer—something that hobbyists could assemble at home and keep for personal use. The soon-to-be-manufactured Intel 8080 chip would just be powerful enough. Surely some other people as crazy as Roberts himself would bite on the idea?
As it turned out, Roberts’s Altair 8800, the world’s first personal computer, generated more interest than he could handle. According to Microsoft co-founder Paul Allen, the balance on Roberts’s checking account “swung a half-million dollars from red to black in six weeks,” based solely on interest in the Altair. These were computers that could hardly run a useful program, and that required painstaking, often frustrating assembly. But no matter. People wanted personal computing. And when Paul Allen and his friend Bill Gates got word of Roberts’s project, in December 1974, they decided they’d write the software for it. Gates, ever the schemer, spoke to Roberts and let him know that the interpreter for the BASIC programming language, which he and Allen were writing for the Altair, was just about complete. Roberts said that was great, and they should show it to him in a month or so. Then and only then, Allen and Gates got down to the real work. They wrote their code without ever seeing the computer it was supposed to run on, and when Allen delivered the paper tape by hand to Roberts, he was as nervous as could be. But lo and behold, the command PRINT 2+2 gave an instantaneous 4.13
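For a loose sense of what Gates and Allen’s program had to do, here is a toy sketch in Python (not Altair BASIC, which was written in 8080 assembly and did vastly more): a single interpreter step that reads a PRINT statement, evaluates a small plus-and-minus expression, and prints the result.

    import re

    def run_line(line: str) -> None:
        """Toy interpreter step: execute one 'PRINT <expression>' line,
        where the expression is whole numbers joined by + and -."""
        keyword, _, rest = line.strip().partition(" ")
        if keyword.upper() != "PRINT":
            raise SyntaxError(f"unknown statement: {keyword}")
        # Tokenize into numbers and operators, then fold left to right.
        tokens = re.findall(r"\d+|[+-]", rest.replace(" ", ""))
        total, op = 0, "+"
        for tok in tokens:
            if tok in ("+", "-"):
                op = tok
            else:
                total = total + int(tok) if op == "+" else total - int(tok)
        print(total)

    run_line("PRINT 2+2")  # prints 4, the answer Roberts was waiting for

The real interpreter had to fit routines like this, plus line numbers, variables, and floating-point arithmetic, into a few kilobytes of memory.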
Gates and Allen, of course, would go on to found Microsoft. Microsoft, along with Apple, would play a foundational part in making computers easy to use for non-specialists. And eventually, Microsoft would overtake Netscape as the king of internet browsers, as it had the advantage of being able to bundle its Internet Explorer with every copy of Windows it shipped—while also taking measures to prevent users from replacing Explorer with Netscape. (The United States government would eventually file an antitrust action against Microsoft for this very practice.) Later, Internet Explorer found itself supplanted by a browser that made things even easier for its user base: Google Chrome.
The history of computing in the age of the personal computer and the internet would appear to be a continuation of the story Marc Andreessen began to tell when he noted that people would want pictures on the web. Consumers appear to want as much cool stuff as they can have on their computers, and they appear to want it simplified as much as possible. The main question, over the past thirty years, has been who will have the insight and the muscle to get a monopoly over the coolest, easiest-to-use stuff. In 1995, when Netscape was proving itself a force to be reckoned with, a journalist named George Gilder predicted that some day Marc Andreessen would be bigger than Bill Gates in the internet world. Given the success of the venture capital firm Andreessen went on to co-found, Andreessen Horowitz, the jury may still be out on this one. But in the sense in which Gilder meant his statement, he was dead wrong. Netscape Navigator was dead in the water by 1999. The reason for this was observed at the time by an editor of Windows Watcher, who wrote in response to Gilder’s piece: “Gilder’s fantasy of a Microsoft-free computer industry might come to pass if Bill Gates spent his time in his vault counting his money. He doesn’t. He obsessively watches the horizon for threats to his hegemony. When he spots a danger, he works feverishly to use his current monopolies to leverage his way into the new arena.”14
If that sounds a bit like Mark Zuckerberg and his Facebook/Meta empire, it’s because—well, it’s the exact same story. In internet applications as in other businesses, there are advantages to central control. The biggest companies can offer the most competitive salaries and draw the most talented engineers. Just as Microsoft did, Meta has every incentive to poach any good idea that comes along, and those who build software have all kinds of incentives to go work for the big dogs. But just as in other organizations, centralized control has its disadvantages too. If Meta were a country, it would be by far the most populous country in history. It has a lot of different types of people to try to keep satisfied under the same umbrella. And it has the major disadvantage that all of its ideas must, by definition, align with the general concept of more. A private capital-seeking company necessarily relies on the idea of growth, not mere satisfaction of its current user base. Even if one of its co-founders wants to break up the company,15 Meta will never want to break up Meta. But is it possible, even theoretically, to have a single platform that keeps the whole world happy?
All the Cokes are Good
Unquestionably, an easy-to-use software package pre-loaded onto just about every personal computer makes great business sense for the company that’s providing the software; and just as unquestionably, it makes great business sense to pack a small computer into a device that also functions as a phone. These were both tremendous business ideas, executed at just the right time. The people behind both of these ideas have been mythologized, described as visionaries. But more than anything, the stories of Microsoft and Apple, and of the big software companies that followed them, are the stories of timing, marketing, and corporate acumen. Both Bill Gates and Steve Jobs had good practical sense, good expertise in their fields, and no small amount of cunning. But if Gates hadn’t cornered the PC software market when he did, likely someone else would have. If Jobs hadn’t gotten a team together to build something like the iPhone, someone else would have done that too. (In fact, a company called General Magic did so a decade before Jobs. Only the timing was wrong.)16
Of course, it’s all too easy to be blasé about these innovations after the fact. Only 30 years ago, the idea that everyone would have a computer on his or her desk, to say nothing of his or her pocket, seemed absurd. Why on earth would all those people want computers? But once there was even a small amount of data to work with, the answers were clear: Yes, people in disparate geographical areas want to talk to each other, with as few limitations as possible. Yes, if people can gather information on a given subject with a point and a click, they’ll want to do that. Yes, in addition to playing physical games, and having physical sex, people will want to play games in the digital world, and experience simulations of sex in the digital world, and they’ll want these things as much as, or more than, the real versions, which tend to require somewhat more discipline, risk, humility, and responsibility.
In the end, the more interesting developments of the last 30 years may not be the actual technologies at our fingertips—although those are extraordinary enough—but the information we’ve gleaned, or are in the process of gleaning, about how humans behave in a world where so much of existence can be digitally reproduced. Some of these developments have produced clear-cut answers. Computer and video games are popular, and it’s hard to imagine a world in which they’ll cease to be popular. That we’re hard-wired to enjoy games is something that a neuroscientist could have told us,17 without the benefit of computers. But perhaps, before the advent of computer games, we wouldn’t have seen just how strong the wiring really is.
However, certain questions about our lives in the digital age have yet to be settled. One of those questions is just how public or how private we want our lives to be, now that we can, if we choose, spread our images across the phones of billions. In a 2005 interview at the early Facebook offices, Mark Zuckerberg described the platform he had created as “an online directory for colleges, and it’s kind of interactive. So if I want to look you up, or get information about you, I just go to the Facebook, and type in your name, and it brings me up, hopefully, all the information I’d care to know about you.” There’s no reason to believe he was being particularly sly or withholding in this interview. For Zuckerberg, Facebook was, at first, just a directory. Early Facebook investor Peter Thiel recognized something similar when he noted that Facebook was the social network that successfully ported real-world identity over to the digital world: “We are not,” he commented, “trying to escape the world and create a fictional alternate world.”18 Zuckerberg likely wasn’t thinking, back in 2005, about the ways in which his company might require him to settle difficult questions about the differences between public life and private life, and between digital life and meatspace life. In the end, he appears to have settled on whatever answers have made him (and his investors) the most money.19
For Thiel, despite his huge success investing in internet companies like Facebook, the last 50 years of innovation have been a terrible disappointment. The only genuine technological acceleration, he has pointed out, is in the realm of bits, as opposed to the realm of atoms.20 Another way of stating this is that our technological innovations over the past 50 years have for the most part been an elaboration of Moore’s Law—a prediction made by Gordon Moore in 1965 that the number of transistors per silicon chip would double every year. The fulfillment of this prediction has meant that computers have gotten smaller and smaller, and faster and faster.21 But this doesn’t necessarily mean they’ve gotten better. Better, of course, is a value-judgment. And a computer is only better if the thing it’s aiding a human in doing is more—well—good.
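Moore’s prediction, stated as arithmetic, is simple compounding: a chip holding N0 transistors in year t0 should hold N0 × 2^(t − t0) of them in year t. A quick sketch of that growth, with an illustrative starting figure rather than Moore’s exact number:

    def transistors(n0: int, start_year: int, year: int) -> int:
        """Moore's 1965 prediction as compounding: the count on a chip
        doubles once for every year after the starting year."""
        return n0 * 2 ** (year - start_year)

    # Illustrative starting point: a 64-transistor chip in 1965.
    for y in (1965, 1970, 1975):
        print(y, transistors(64, 1965, y))
    # 1965 64
    # 1970 2048
    # 1975 65536

Ten years of annual doubling multiplies the count a thousandfold, which is the sense in which the whole personal-computer era can be read as an elaboration of one 1965 observation.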
What computers obviously are good at, in 2021, is attracting the attention of humans. Comparatively speaking, we spend a lot of time on them. In the same 2005 interview quoted above, Zuckerberg notes that Facebook was meant as a “mirror” for whatever community “existed in real life.” And more than anything, computers today act as mirrors. Game-playing has its mirror, sex has its mirror, talking has its mirror, music has its mirror, analog images have their mirror. As these mirror images become larger and more important than the originals they reflect, the landscape comes to seem flat: refreshingly egalitarian, or distressingly homogeneous, depending on your point of view.
In a way, the computer has merely sped up what was already a global trend, of which the United States of America has, for the past hundred years or so, been a symbol. As America’s greatest 20th century philosopher Andy Warhol once put it,
What’s great about this country is that America started the tradition where the richest consumers buy essentially the same things as the poorest. You can be watching TV and see Coca-Cola, and you know that the President drinks Coke, Liz Taylor drinks Coke, and just think, you can drink Coke, too. A Coke is a Coke and no amount of money can get you a better Coke than the one the bum on the corner is drinking. All the Cokes are the same and all the Cokes are good. Liz Taylor knows it, the President knows it, the bum knows it, and you know it.22
This chapter has, perhaps unfairly, passed over most of the nearly miraculous technical things that computers can do: fly airplanes, detect illnesses, 3-D print buildings. These are fantastic, and have required huge amounts of innovation, and produced huge amounts of good. And yet there’s a zero-to-one shift that one can imagine with computers, which has not yet taken place. The above innovations, and the personal computer revolution especially, are examples of ways lives can be simpler, more utilitarian, or in some cases simply flatter, when the meatspace world is successfully mirrored in bits. The question still remains, though, whether computers will simply mirror our world, and draw us more and more toward the simulacra they create; or whether they can be tools for fulfilling human potential. This is a difficult question to think about—in part because it requires careful thought about what human means, and in part because the concept is so far from the minds of most of those who have built our digital world. But in the next chapters, we can at least venture a few preliminary remarks on the subject.