More Awesome Than Money: Four Boys and Their Heroic Quest to Save Your Privacy from Facebook

by Jim Dwyer

eBook

$13.99 

Overview

Can Facebook be trusted with your data? Years ahead of its time, Diaspora tried to do better. This is their David-versus-Goliath effort to build a revolutionary social network that would give us back control of our privacy.
 
In June of 2010, four nerdy NYU undergrads moved to Silicon Valley to save the world from Facebook. Their idea was simple—to build a social network that would allow users to control the information they shared about themselves instead of surrendering it to big business. Their project was called Diaspora, and just weeks after launching it on Kickstarter, the idealistic twenty-year-olds had raised $200,000 from donors around the world. Profiled in the New York Times, wooed by venture capitalists, and cheered on by the elite of the digital community, they were poised to revolutionize the Internet and remap the lines of power in our digital society—until things fell apart, with tragic results.

The story of Diaspora reaches far beyond Silicon Valley to today’s urgent debates over the future of the Internet. In this heartbreaking yet hopeful account, drawn from extensive interviews with the Diaspora Four and other key figures, Pulitzer Prize–winning journalist Jim Dwyer tells a riveting tale of four ambitious and naive young men who dared to challenge the status quo. 

Product Details

ISBN-13: 9780698176300
Publisher: Penguin Publishing Group
Publication date: 10/16/2014
Sold by: Penguin Group
Format: eBook
Pages: 384
File size: 778 KB

About the Author

Jim Dwyer is a Pulitzer Prize–winning reporter who writes the "About New York" column for The New York Times. He has written or co-written six books, including 102 Minutes, which spent twelve weeks on the New York Times bestseller list. He lives in New York City.

Read an Excerpt

INTRODUCTION

If his laptop had been a mirror, the face staring back at Dan Grippi would have been some blend of boy and wild man. Nature had given him full, pouty, rebel-without-a-cause lips, and a jewelry shop in the East Village had given him a piercing in the bottom one a few days after he started college. He fastened a ring in the hole. His father was not thrilled.

Long and lean to begin with, Dan grew sideburns that ran down below each ear, sketch-strokes of whiskers that further drew his angular face to a point. He gave himself another inch of height by heaping his black hair in a pile, occasionally pinning it to the air with a generous, retro slathering of gel. On his ring finger he wore what appeared to be a cyanide capsule. Most of his arms rippled from short-sleeved white T-shirts. With the Elvis hair, the Springsteen sideburns, and the scary jewelry, Dan Grippi turned out for the world looking like a hybrid greaser-punk.

Except that he was neither. He smiled easily and warmly, spoke softly, but thought twice or three times before uttering a syllable. On the cross-country team, he logged miles in silence. His digital graphic artwork won awards for him while he was a teenager. A portfolio of that work had gotten him into college, overcoming indifferent grades at his high school in the Long Island suburbs of New York. His striking looks got him work as a model, and his mastery of the blend and stitch of music beats scored him gigs as a DJ. The menacing-looking cyanide ring was really a spare piece salvaged from a build-it-yourself printer he had helped assemble. He was a nerd with muscles.

On a February night in 2010, he stared at a page on Facebook, the soul-sorting social network machine. It was time for him to get out. He had joined in 2005, when he was sixteen. Facebook had started circling the globe a year earlier, college by college, working its way to high school students. It became the gyroscope of a generation, a tool for high-speed social navigation. Long after Dan’s high school classmates had scattered to colleges all over the map, their friendship had a digital pulse on Facebook. At New York University in Greenwich Village, Dan would check out the Facebook listings of people who caught his eye in a class or at a party. He could see who they knew in common, sniff their electronic pheromones. By now, in his final months of college, he was so hooked on Facebook that even when he was entertaining guests in his apartment, he would often sit with a computer in his lap so he could keep track of what people were up to elsewhere. That was sick, he knew. But that wasn’t the worst of it.

After a week of fiddling with the settings, clicking and unchecking boxes, disabling certain notifications, he kept coming to the same end point. It was hopeless. While he might be able to regulate some settings on his own account, he had no control over what his friends did with the applications that they ran on theirs. They could, for instance, permit a game to contact everyone in their address books, giving access to everything Dan had shared with friends to these third parties. Facebook was everywhere: go to a music website, and there would be a Facebook “like” button, meaning that it could follow him there. It knew what he read and knew what he listened to and knew what he watched. This was not just an extension of the high-octane party scene that defined his weekends. He could barely think a thought without giving some hint to the machinery that was recording it all. The site had grafted onto his personality.

Enough, he decided.

Using a piece of software, he scraped all the information and pictures he had posted on Facebook. That got him a copy of everything.

One last time, he checked updates from friends. Then he found his way to the account settings. Buried several layers down was the box he was looking for.

Delete account.

Was he sure? the software asked.

Dan clicked his affirmation.

Okay, then. The account would be deactivated but not actually deleted for two more weeks. Any activity by him during that time—logging in, checking on friends, the least flicker of digital life—would be interpreted by Facebook as a sign that he had changed his mind and didn’t really want to shut it down. That was all boilerplate delivered to anyone who declared an intention to leave the flock.

The next message from Facebook was tailored for Daniel Grippi. “Your friends are going to miss you,” it said.

Dan had about four hundred “friends” on Facebook, a number of them people he had never spoken with in the flesh, or, for that matter, online. Someone who knew someone would send a friend request. He’d say yes. And that would be the end of it. No loss.

But there were others: real buds from high school, people he cared about in college, family members. Authentic friends. Their pictures were the ones appearing on the screen. It was a simple matter for the Facebook servers to figure out whose profile he had checked out, with whom he had exchanged messages, even people who were in the same picture.

Facebook kept records of what he cared about. Knowing who and what Dan Grippi wanted in his life—knowing more than Dan himself consciously knew—was Facebook’s business. It sold that knowledge.

Daniel, Facebook pleaded. Wouldn’t you want to stay in touch with these people?

They were hostages; Facebook was transforming his attempt to quit the network into a betrayal of his fondness for them.

Manipulation.

He clicked the delete button.

In the winter of 2010, four talented young nerds with time on their hands decided that they should bring an end to Facebook’s monopoly on social networking. They were man-boys, college students who became friends while staying up all night, eating pizza and hacking at geeky projects in a computer club at New York University. Their mission would have been literally inconceivable just one generation earlier; social networking barely existed when they began high school; Mark Zuckerberg had not yet started “The Facebook” in his dorm room at Harvard. But by 2010, Facebook had become such a global behemoth, joined by hundreds of people every minute, that a revolt seemed inevitable. Someone had to do it.

Like most “free” web services, Facebook was a middleman that made money by peeking into everybody’s business. With the right software tools, the four guys believed, no one would need to surrender everything to big companies like Zuckerberg’s. People could connect to friends and networks of friends without going through the servers of Facebook, Inc. It was a simple notion of vast disruption. Such an alternative would mean remapping the lines of power in digital society. To Dan Grippi and three other guys from the NYU computer club, Ilya Zhitomirskiy, Max Salzberg, and Raphael Sofaer, it would be a fun way to spend the summer of 2010. In the months before then, they spent nearly every conscious minute in room 311 of the math building at NYU working on their upstart social network project. They called it Diaspora.

“We’re not about killing off Facebook,” Raphael said. In Diaspora, they aimed to create a tool that would emancipate the power and vitality of social networks from the control of a single company. “Social networks have only existed for 10 years. We don’t really know what’s going to happen to our data, but it’s going to exist into the foreseeable future. We need to take control of it.”

His words were printed in the New York Times, and the notion that such a thing was possible thrilled hundreds of thousands of people. Social networking occupied an essential place in society, but was Facebook—avatar for the commercial surveillance embedded in the digital infrastructure of modern life—an irreplaceable link for all vital communication and connections? The Diaspora vision was thrillingly countercultural.

The four set out at a pivot point in human history, their project a single tiny pixel on a vast screen of rapid change, watched by millions. In June 2010, they moved to California to build the tools that would save the world from Facebook. Also, they hoped, to have fun. Dan and Max had just graduated. They had the most programming experience and chops. Dan, besides his strong eye for design, had a will of steel when it came to coding. Max, the eldest of the four, would pile thankless tasks on his back until he resembled an ant carrying five hundred times his weight. With a gift for turning phrases, Max described himself as an “effusive salesman type,” and said that his rhetoric often benefited from a “cold bucket of water” dumped on it by Rafi, the youngest of the group, who possessed a cool detached intelligence. He was a quick study in programming. With some trepidation, he took a leave from school. So did Ilya, a math prodigy, Russian émigré, and idea-heaving furnace. Technically the weakest software programmer of the group, Ilya was their most potent spiritual force. No one could imagine Diaspora without him. Once they relocated to San Francisco, Ilya’s home, an apartment on Treat Street in the Mission District, became a center of gravity for all manner of grand schemes and hopes. It was called the Hive.

The neighborhood of the Hive pulsed with people in their early twenties, constellations of dreamers and engineers and strivers in the expanding universe of the tech boom. Google ran a private bus system to bring employees to its offices forty miles south. But the area was not just home to the well-salaried and corporate-backed. Computers and computing had become dead cheap, a fraction of what they had been a decade earlier, and shrinking by the day. Brainpower, not money, was the essential capital.

Not only were there fortunes to be made; there were also fortunes to shape. Who could resist? Like steel at the start of the twentieth century, software had the power to win wars, bring prosperity, change lives. It could guess your appetite before the first pang of desire had struck.

This generation had come from cities around the world, out of schools and universities certainly, but also old factories and basements, derelict garages, and warehouses that had been turned into hacker spaces—rooms where they could tinker with software and cheap electronic parts, write programs and build robots or break into virtual cabinets where they didn’t belong.

Said one hacker: these spaces are a force for chaos in the world; when you walk in, you expect to see kittens in jet packs flying around the room.

They slept in mansions tricked up as dorms, rented rooms in flophouse hotels, surfed from couch to couch, shared apartments at places like the Hive. Creature comforts were for later; right now, the act of creation, of making—not of making it—was all that mattered. Fall on your face? Get up. Fail faster. Like mentalists playing tricks with a spoon, the young people were sure they could bend the steel of their era just by thinking long and hard.

They had a suppleness of spirit unburdened by the thickness of middle-age expertise in what couldn’t be done.

The Hive was a perfect space for parties, and Ilya was the ringmaster. Its spacious backyard included a tent-cabin, a fixture adopted from the utopianist desert festival Burning Man, and a necessity for outdoor gatherings in the foggy perma-chill of the Bay Area. Parties at the Hive customarily had a meme, or perhaps it was the other way around: someone would think of a meme and a party would follow. There was “We Are the Internet and We Come in Peace,” and later, “FuckYeahCarlSagan.” When someone in the circle revealed that his Wi-Fi password was “FuckTomCruise,” the meme for the next party was declared “FuckYeahTomCruise,” and Tom Cruise–themed drinks, like the Top Gun cocktail, were concocted, while the dome of the tent became the screen for a continuously looping video of the actor speaking about Scientology. On any given night, the sound system might be knocking out techno-jams, or music for swing or ballroom dance, or all of them.

Hardly any activity would not be considered: a woman who had cultivated an interest in a diet of insects was invited to whip up something on the barbecue, and she produced plates of grilled-cheese-and-mealworm-larvae sandwiches. These were washed down with craft beer and Two Buck Chuck, the value wine from Trader Joe’s.

Raucous and humming, the Hive parties were descendants of the gatherings eighty years earlier of the young and poor, philosophers and scientists and artists, in a marine biologist’s lab in Pacific Grove, forty miles to the south, caught in the amber of John Steinbeck’s Cannery Row.

One weekend at the Hive, there was a contest to break the world record for links of plastic monkeys, each arm hooked into the next until the line stretched up three flights and right to the big bay window that looked out over the yard, the tent, and the city. The window opened from Ilya’s bedroom. He had practically invented this place.

Convener of many parties, Ilya was the one who stayed up latest talking longest with roommates, or with strangers he’d met somewhere and had invited to hang out, or with people who’d wandered in.

One night, from behind the closed door to his bedroom at the Hive, he’d heard his lawyer roommate and math roommate talking in the living room. The lawyer asked the math guy about the importance of a high-level mathematical theorem; Ilya, dissatisfied with the explanation he heard through the door, burst out of the bedroom clad only in boxer shorts. He proceeded to sketch out the implications of the theorem on the floor. Then he returned to his bedroom, where he had been entertaining a young woman visitor. Of which there were many: he had little reticence about hitting on girls, and many found it hard to resist the package of a slightly disheveled guy who wore giant plastic orange sunglasses and aquamarine neon pants, a brainy polymath who actually wanted to learn what other people were passionate about.

“Gather epic people,” he often declared, “and make unreasonable demands.” He was a nerd who glowed with warmth and humor. The world he and the other Diaspora guys had settled into was one of pure buoyancy; the work they were doing was all gravity.

In the tenth century, Rabbeinu Gershom, a sage in Mayence, Germany, set out a series of prohibitions that would be enduring parts of Jewish life. They were issued in response to human impulses that he deemed to be socially destructive, not to mention an offense against the Creator. For instance, Rabbeinu Gershom barred polygamy and forbade divorce of a woman without her consent.

The rabbi also banned the reading of other people’s private mail.

A millennium after Rabbeinu Gershom, other people’s business was sheathed in fiber-optic cable. Ancient human urges to snoop had lost none of their voltage, but the prohibitions and social inhibitions were dissolving. In a virtual instant, forward-thinking businesses data-mined, data-scraped, data-aggregated. As these became exalted missions, digital culture erased social and legal boundaries that had been honored, however imperfectly, for centuries. Commercial surveillance was built into the ecology of everyday life.

Like nothing else since humans first stood upright, the World Wide Web has allowed people to connect and learn and speak out. Its dominant architecture has also made it possible for businesses and governments to fill giant data vaults with the ore of human existence—the appetites, interests, longings, hopes, vanities, and histories of people surfing the Internet, often unaware that every one of their clicks is being logged, bundled, sorted, and sold to marketers. Together, they amount to nothing less than a full-psyche scan, unobstructed by law or social mores.

“Facebook holds and controls more data about the daily lives and social interactions of half a billion people than 20th-century totalitarian governments ever managed to collect about the people they surveilled,” Eben Moglen, a technologist, historian, and professor of law at Columbia University, observed in 2010.

That shirtless, drunken picture? The angry note to a lover, the careless words about a friend, the so-called private messages? Facebook has them all. And its marketers know when it is time to pop an advertisement for a florist who can deliver make-up flowers, or for a private investigator.

Students at MIT figured out the identities of closeted gay people by studying what seemed like neutral information disclosed on Facebook. At the headquarters of Facebook in Palo Alto, Mark Zuckerberg was said to amuse himself with predictions on which Facebook “friends” would eventually hook up by watching how often someone was checking a friend’s profile, and the kinds of messages they were sending.

The uproar over Facebook’s privacy policies obscured intrusions on an even grander scale by other powerful forces in society. Everyone had heard of AOL and Microsoft; few were familiar with their subsidiaries. Atlas Solutions, purchased by Microsoft in 2007, told businesses that it deploys “tracking pixels”—a kind of spy cookie that cannot be detected by most filters—to follow Internet users as they look at billions of web pages per month. These invisible bugs watch as we move across the web, shopping, reading, writing. Our habits are recorded. The pixels live in computer caches for months, waiting to be pulsed. Facebook bought Atlas in 2013, helping it track users when they left the site.

And virtually unknown to users, AOL’s biggest business was never the cheery voice announcing, “You’ve got mail”; it was the billions of data items its subsidiary, Platform A, mined from Internet users, linking their interests and purchases, zip codes and significant others. The data was stored on servers physically located in giant warehouses near Manassas, Virginia. AOL boasted that it followed “consumer behavior across thousands of websites.”

Facebook was a proxy in a still larger struggle for control over what used to be the marrow of human identity: what we reveal and what we conceal, what we read and what we want. Just as human tissue is inhabited by trillions of bacteria, so, too, our online life is heavily colonized by external forces, invisible bits of code that silently log our desires and interests, and, at times, manipulate them.

An experiment conducted in 2010 by the Wall Street Journal showed how far commercial interests could penetrate personal information, unbeknownst to web users. As part of a remarkable series called “What They Know,” the Journal team set up a “clean” computer with a browser that had not previously been used for surfing. The results: after visiting the fifty most popular websites in the United States, the reporters found that 131 different advertising companies had planted 2,224 tiny files on the computer. These files, essentially invisible, kept track of sites the users visited. This allowed the companies to develop profiles that included age, gender, race, zip code, income, marital status, health worries, purchases, favorite TV shows, and movies. Deleting the tracking files did not eliminate them: some of them simply respawned.

Handling all the data they collected was possible because computing power continued to double every eighteen months to two years, the rate predicted in 1965 by the technologist Gordon Moore. Cheap and prolific by 2010, that power enabled the creation of bare-bones start-ups and the granular monitoring of personal habits. “We can uniquely see the full and complete path that your customers take to conversion, from seeing an ad on TV to searching on their smartphone to clicking on a display ad on their laptop,” a business called Converto boasted on its website in 2013.

The online advertising industry argued that the ability to tailor ads that appeared on a screen to the presumed appetites of the person using the computer was the foundation of the free Internet: the advertising dollars supported sites that otherwise would have no sources of revenue.

Whatever the merits of that argument, it was hard to defend the stealthiness of the commercial surveillance. No national law requires that this monitoring be disclosed, much less forbids it. A few halfhearted efforts by the Federal Trade Commission to regulate the monitoring have gone nowhere. There was no way for people to get their data back.

Or even their thoughts.

In mid-2013, two researchers published a paper entitled “Self Censorship on Facebook,” reporting that in a study of 3.9 million users, 71 percent did not post something that they started to write. That is, they changed their minds. While this might look like prudence, or discretion, or editing, the researchers—both working at Facebook—described it as “self-censorship” and wrote that such behavior was a matter of concern to social network sites. When this happens, they wrote, Facebook “loses value from the lack of content generation.”

The company maintained that users are told that it collects not only information that is openly shared but also when you “view or otherwise interact with things.” That means, the company asserted, the right to collect the unpublished content itself. “Facebook considers your thoughtful discretion about what to post as bad, because it withholds value from Facebook and from other users. Facebook monitors those unposted thoughts to better understand them, in order to build a system that minimizes this deliberate behavior,” Jennifer Golbeck, the director of the Human-Computer Interaction Lab at the University of Maryland, wrote in Slate.

At every instant, the fluid dynamics of the web—the interactions, the observations, the predations—are logged by servers. That such repositories of the lightning streams of human consciousness existed was scarcely known and little understood. “Almost everyone on Planet Earth has never read a web server log,” Eben Moglen, the Columbia law professor, said. “This is a great failing in our social education about technology. It’s equivalent to not showing children what happens if cars collide and people don’t wear seat belts.”

One day in the 1970s, a man named Douglas Engelbart was walking along a beachfront boardwalk in California when he spotted a group of skateboarders doing tricks.

“You see these kids skateboarding actually jumping into the air and the skateboard stays attached to their feet, and they spin around, land on the skateboard, and keep going,” Engelbart remembered many years later.

For Engelbart, those skateboarders were a way to understand the unpredictability of technology. “I made them stop and do it for me six times, so I could see how they did it. It’s very complicated—shifting weight on their feet, and so on. You couldn’t give them the engineering and tell them to go out and do that. Fifteen years ago, who could have designed that? And that’s all we can say about computers.”

A little-celebrated figure in modern history, Engelbart had spent decades thinking about how computers could be linked together to augment human intelligence, to solve problems of increasing complexity. At a gathering of technologists in 1968, he showed what could happen when computers talked to one another. For the occasion, he demonstrated a device that made it easier for the humans to interact with the computer. The device was called the mouse. The cofounder of Apple, Steve Wozniak, said that Engelbart should be credited “for everything we have in the way computers work today.”

The emergence of the personal computer and the Internet, with its vast democratizing power, were part of Engelbart’s vision. He died in July 2013, a few weeks after revelations by a man named Edward Snowden that the United States National Security Agency was collecting spectacular amounts of data. Writing in the Atlantic, Alexis C. Madrigal noted: “We find ourselves a part of a ‘war on terror’ that is being perpetually, secretly fought across the very network that Engelbart sought to build. Every interaction we have with an Internet service generates a ‘business record’ that can be seized by the NSA through a secretive process that does not require a warrant or an adversarial legal proceeding.”

The business purposes of such data collection are apparent, if unsettling. But what need did governments have for it? Among Western democracies, the stated purpose was piecing together suggestive patterns that might reveal extremists plotting attacks like those carried out on September 11, 2001. The dystopic possibilities of such powers had, of course, been anticipated by George Orwell in 1984, and by the visionary cyberpunk novelist William Gibson in Neuromancer. But fiction was not necessary to see what could be done: in 2010, its utility as an instrument of surveillance and suppression had been realized in, among other places, Syria, Tunisia, Iran, and China.

So, too, were its other properties: as the Diaspora guys were making plans for their project in 2010, the Arab Spring was stirring to life, some of it in subversive online communications that either were not noticed or not taken seriously by the regimes that would soon be toppled. The same mechanisms allowed more alert regimes to surveil opponents, or to be led directly to the hiding places of dissidents who had unwittingly emitted location beams from phones in their pockets.

By 2010, in just the two years since Raphael Sofaer had entered college, Facebook had grown by 300 million users, almost five new accounts every second of the day. The ravenous hunger for new ways to connect in a sprawling world was not invented by Facebook, but the company was perfectly positioned to meet it, thanks to skill, luck, and the iron will of its young founder, Mark Zuckerberg. A manager in Facebook’s growth department, Andy Johns, described going to lunch for the first time with his new boss, one of Zuckerberg’s lieutenants.

“I remember asking him, ‘So what kind of users am I going after? Any particular demographics or regions? Does it matter?’ and he sternly responded ‘It’s fucking land-grab time, so get all of the fucking land you can get.’”

Could four young would-be hackers build an alternative that preserved the rich layers of connection in social networking without collecting the tolls assessed by Facebook? Would anyone support their cause?

When word got out about their project, they were swamped.

In a matter of days, they received donations from thousands of people in eighteen countries; tens of thousands more started to follow their progress on Twitter, and in time, a half million people signed up to get an invitation. That was more weight than the four were ready to carry. On the night that their fund-raising drive exploded, as money was pouring in through online pledges, nineteen-year-old Rafi Sofaer toppled off the even keel where he seemed to live his life. It was too much. They were just trying to build some software. “Make them turn it off!” he implored the others. It couldn’t be done.

Four guys hanging around a little club room at NYU suddenly found themselves handed a global commission to rebottle the genie of personal privacy. They were the faces of a movement, a revolution against the settled digital order. Their job was to demonetize the soul.

The problem they set out to solve was hard. That was its attraction. They were young, smart, quick to learn what they did not know, and girded for battle. Suddenly, they had a legion of allies. And expectations. It was delightful, for a while.

PART ONE

CHAPTER ONE

Sharply turned out in a tailored charcoal suit accented with a wine-red tie, the burly man giving the lecture had enchanted for twenty minutes, one moment summoning John Milton from the literary clouds, the next alluding to the lost continent of Oceania, then wrapping in Bill Gates and Microsoft. Every offhand reference was, in fact, a purposeful stitch in a case for how the entire architecture of the Internet had been warped into a machine for surveilling humans doing what they do: connecting, inquiring, amusing themselves. Then he made the subject concrete.

“It is here,” the speaker said, “of course, that Mr. Zuckerberg enters.”

Seated in an NYU lecture hall in Greenwich Village on a Friday evening, the audience stirred. Most of those attending were not students but members of the Internet Society, the sponsor of the talk. But no one listened more avidly than two NYU students who were seated near the front, Max Salzberg and Ilya Zhitomirskiy.

And they were keen to hear more about “Mr. Zuckerberg.” That, of course, was Mark Zuckerberg, boy billionaire, founder and emperor of Facebook, and a figure already well known to everyone in the crowd that had come to hear a talk by Eben Moglen, a law professor, history scholar, and technologist. The Social Network, a fictionalized feature film about the creation of Facebook, was still eight months away from its premiere. Nevertheless, the name Zuckerberg needed no annotation. And at age twenty-five, he had never gotten an introduction of the sort that Moglen was about to deliver to him in absentia.

“The human race has susceptibility to harm, but Mr. Zuckerberg has attained an unenviable record,” Moglen said. “He has done more harm to the human race than anybody else his age. Because—”

Moglen’s talk was being live-streamed, and in an East Village apartment a few blocks away, an NYU senior named Dan Grippi, who had been only half listening, stopped his homework.

“Because,” Moglen continued, “he harnessed Friday night.

“That is, everybody needs to get laid, and he turned it into a structure for degenerating the integrity of human personality.”

Gasps. A wave of laughs. A moment earlier, this had been a sober, if engaging, talk, based on a rigorous analysis of how freedom on the Internet had been trimmed until it bled. As lawyer, hacker, and historian, Moglen possessed a rare combination of visions. He blended an engineer’s understanding of the underlying, intricate architecture of the Internet and the evolving web with a historian’s panoramic view of how those structures supported, or undermined, democratic principles and a lawyer’s grasp of how far the law lagged behind technology. For nearly two decades, Moglen had been the leading consigliere of the free-software movement in the United States, and even if not everyone in the auditorium at New York University was personally acquainted with him, they all knew of him. A master orator, Moglen knew that he had just jolted his audience.

He immediately tacked toward his original thesis, this time bringing along Zuckerberg and Facebook as Exhibit A, saying: “He has, to a remarkable extent, succeeded with a very poor deal.”

Most of the regalia of Facebook, its profile pages and activity streams and so on, were conjured from a commonplace computer language called PHP, which had been created when Mark Zuckerberg was eleven years old. By the time he built the first Facebook in 2004, PHP was already in use on more than 10 million websites; much of the web came to billowing life on computer screens thanks to those same text scripts, always tucked out of sight behind Oz’s curtain. Knowing that, Moglen put the terms of the Zuckerberg/Facebook deal with the public in currency that his audience, many of them technologists, could grasp in an instant.

“‘I will give you free web hosting and some PHP doodads, and you get spying for free, all the time.’ And it works.”

The audience howled.

“That’s the sad part, it works,” Moglen said. “How could that have happened? There was no architectural reason, really.”

As lightning bolts go, this one covered a lot of ground. A mile away in his apartment, Dan listened and thought, what if he’s right? The guy who created PHP called it that because he needed some code scripts for his personal home page. Which is sort of what Facebook was: You had a home page that could be played with in certain, limited ways. Post a picture. Comment on a friend’s. Read an article that someone liked or hated. Watch a funny cat video. But all these things were possible on any web page, not just Facebook, which was really just a bunch of web pages that were connected to one another. Maybe there was no good technical reason that social networks should be set up the way Facebook was. For Moglen, it was an example—just one, but a globally familiar one—of what had gone wrong with the Internet, a precise instance of what he had been talking about for the first twenty minutes of his speech, in every sentence, even when he seemed to be wandering.

Dan was starting to wish he had gone to the talk in person. He knew that Max and Ilya were there. Practically from the moment Moglen had opened his mouth to make what sounded like throw-away opening remarks, they had been galvanized. “I would love to think that the reason that we’re all here on a Friday night is that my speeches are so good,” Moglen had said. The audience tittered. In truth, the speeches of this man were widely known not just as good but as flat-out brilliant, seemingly unscripted skeins of history, philosophy, technology, and renaissance rabble-rousing, a voice preaching that one way lay a dystopic digital abyss, but that just over there, in striking distance, was a decent enough utopia.

“I actually have no idea why we’re all here on a Friday night,” Moglen continued, “but I’m very grateful for the invitation. I am the person who had no date tonight—so it was particularly convenient that I was invited for now. Everybody knows that. My calendar’s on the web.” No need for Moglen to check any other calendars to know that quite a few members of the audience did not have dates, either. His confession was an act of kinship, but it also had a serious edge.

“Our location is on the web,” Moglen said. Cell phones could pinpoint someone’s whereabouts. Millions of times a year, the major mobile phone companies asked for and were given the precise location of people with telephones. There was no court order, no oversight, just people with law enforcement ID cards in their pockets.

“Just like that,” he said, getting warmed up.

He was making these points three years before Edward Snowden emerged from the shadows of the National Security Agency to fill in the shapes that Moglen was sketching.

“The deal that you get with the traditional service called ‘telephony’ contains a thing you didn’t know, like spying. That’s not a service to you but it’s a service and you get it for free with your service contract for telephony.”

For those who hacked and built in garages or equivalent spaces, Moglen was an unelected, unappointed attorney general, the enforcer of a legal regimen that protected the power of people to adjust the arithmetic that made their machines work.

As the volunteer general counsel to the Free Software Foundation, Moglen was the legal steward for GNU/Linux, an operating system that had been largely built by people who wrote their own code to run their machines. Why pay Bill Gates or Steve Jobs just so you could turn your computer on? For the low, low price of zero, free software could do the trick just as well, and in the view of many, much better. And GNU/Linux was the principal free system, built collaboratively by individuals beginning in the mid-1980s. It began as GNU, a code bank overseen by a driven ascetic, Richard M. Stallman, and found a path into modern civilization when a twenty-one-year-old Finnish computer science student, Linus Torvalds, adopted much of the Stallman code and added a key piece known as the kernel, to create a free operating system. (One of his collaborators called Torvalds's contribution Linux, and as the GNU/Linux release became the most widespread of the versions, it was routinely shorthanded as Linux, to the dismay of Stallman.) Legions of big businesses and governments followed the hackers down the free-software road. On average, more than nine thousand new lines of code were contributed to Linux every day in 2010, by hundreds of volunteers and by programmers working for businesses like Nokia, Intel, IBM, Novell, and Texas Instruments.

The successor to Bill Gates as CEO of Microsoft, Steve Ballmer, fumed that Linux had, “you know, the characteristics of communism that people love so very, very much about it. That is, it’s free.”

It was indeed. As the Free Software Foundation saw things, in principle, the strings of 1s and 0s that make things happen on machines were no more the property of anyone than the sequence of nucleotides that provide the instructions for life in deoxyribonucleic acid, DNA.

Linux was the digital genome for billions of phones, printers, cameras, MP3 players, and televisions. It ran the computers that underpinned Google’s empire, was essential to operations at the Pentagon and the New York Stock Exchange, and served as the dominant operating system for computers in Brazil, India, and China. It was in most of the world’s supercomputers, and in a large share of the servers. In late 2010, Vladimir Putin ordered that all Russian government agencies stop using Microsoft products and convert their computers to Linux systems by 2015.

Linux had no human face, no alpha dog to bark at the wind; it had no profit-and-loss margins, no stock to track in the exchanges, and thus had no entries on the scorecards kept in the business news sections of the media. It was a phenomenon with few precedents in the modern market economy, a project on which fierce competitors worked together. In using GNU/Linux, they all had to agree to its licensing terms, whose core principles were devised primarily by Stallman, of the Free Software Foundation, in consultation with Moglen and the community of developers.

The word “free” in the term “free software” often threw people off. It referred not to the price but to the ability of users to shape the code, to remake, revise, and pass it along, without the customary copyright limitations of proprietary systems. Think of free software, Stallman often said, not as free as in free beer, but free as in free speech. So the principles of free software were spelled out under the license that people agreed to when they used it: anyone could see it, change it, even sell it, but they could not make it impossible for others to see what they had done, or to make their own subsequent changes. Every incarnation had to be available for anyone else to tinker with. Ballmer of Microsoft called it “a cancer that attaches itself in an intellectual property sense to everything it touches.”

As the chief legal engineer for the movement, who helped to enforce the license and then to revise it, Moglen was the governor of a territory that was meant to be distinctly ungovernable, or at least uncontrollable, by any individual or business.

Having started as a lawyer for the scruffy, Moglen often found himself, as the years went by, in alliances that included powerful corporations and governments that were very pleased to run machines with software that did not come from the laboratories of Microsoft in Redmond, Washington, or of Apple in Cupertino, California. It was not that Moglen or his original long-haired clients had changed or compromised their views: the world simply had moved in their direction, attracted not necessarily by the soaring principles of “free as in free speech,” or even because it was “free as in free beer.” They liked it because it worked. And, yes, also because it was free.

The hackers had led an unarmed, unfunded revolution: to reap its rewards, all that the businesses—and anyone else—had to do was promise to share it. The success of that movement had changed the modern world.

It also filled the lecture hall on a Friday night. Yet Moglen, as he stood in the auditorium that night in February 2010, would not declare victory. It turned out that not only did free software not mean free beer, it didn’t necessarily mean freedom, either. In his work, Moglen had shifted his attention to what he saw as the burgeoning threats to the ability of individuals to communicate vigorously and, if they chose, privately.

“I can hardly begin by saying that we won,” Moglen said, “given that spying comes free with everything now. But we haven’t lost. We’ve just really bamboozled ourselves and we’re going to have to unbamboozle ourselves really quickly or we’re going to bamboozle other innocent people who didn’t know that we were throwing away their privacy for them forever.”

His subject was freedom not in computer software but in digital architecture. Taken one step at a time, his argument was not hard to follow.

In the early 1960s, far-flung computers at universities and government research facilities began communicating with one another, a network of peers. No central brain handled all the traffic. Every year, more universities, government agencies, and institutions with the heavy-duty hardware joined the network. A network of networks grew; it would be called the Internet.

The notion that these linked computers could form a vast, open library, pulsing with life from every place on earth, gripped some of the Internet’s earliest engineers. That became possible in 1989, when Tim Berners-Lee developed a system of indexing and links, allowing computer users to discover what was available elsewhere on the network. He called it the World Wide Web. By the time the public discovered the web in the mid-1990s, the personal computers that ordinary people used were not full-fledged members of the network; instead, they were adjuncts, or clients, of more centralized computers called servers.

“The software that came to occupy the network was built around a very clear idea that had nothing to do with peers. It was called server-client architecture,” Moglen said.

So for entry to the promise and spoils of the Internet, individual users had to route their inquiries and communications through these central servers. As the servers became more powerful, the equipment on the desktop became less and less important. The servers could provide functions that once had been built into personal computers, like word processing and spreadsheets. With each passing day, the autonomy of the users shrank. They were fully dependent on central servers.

“The idea that the network was a network of peers was hard to perceive after a while, particularly if you were a, let us say, an ordinary human being,” Moglen said. “That is, not a computer engineer, scientist, or researcher. Not a hacker, not a geek. If you were an ordinary human being, it was hard to perceive that the underlying architecture of the net was meant to be peerage.”

Then, he said, the problem became alarming, beginning with an innocent, and logical, decision made by naïve technologists. They created logs to track the traffic in and out of the servers. “It helps with debugging, makes efficiencies attainable, makes it possible to study the actual operations of computers in the real world,” Moglen said. “It’s a very good idea.”

However, the logs had a second effect: they became a history of every inquiry that users made, any communications they had—their clicks on websites to get the news, gossip, academic papers; to buy music or to stream pornography; to sign in to a bird-watchers’ website or to look at the latest snapshots of the birth of the universe from NASA and of the outfit that Lady Gaga wore to a nightclub the night before. The existence of these logs was scarcely known to the public.
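Moglen's point about what a log quietly records can be made concrete. Here is a minimal sketch in Python, parsing one line in the Common Log Format that most web servers write by default; the address and URL in the sample line are invented for illustration:

```python
import re

# One invented entry in the Common Log Format, the default logging
# style of Apache-era web servers: who asked, when, and for what.
LOG_LINE = ('203.0.113.7 - - [05/Feb/2010:21:14:03 -0500] '
            '"GET /news/lady-gaga-outfit.html HTTP/1.1" 200 5120')

# Fields: client IP, two unused columns, timestamp, request, status, bytes.
PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<when>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<bytes>\d+)'
)

def parse(line):
    """Return the fields a single log line reveals about one visit."""
    m = PATTERN.match(line)
    return m.groupdict() if m else None

record = parse(LOG_LINE)
print(record['ip'])       # who asked
print(record['when'])     # exactly when
print(record['request'])  # exactly what they read
```

Each line is one click; a server's log file is millions of them, which is why a diagnostic convenience doubles as a history of everything its users did.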

“We kept the logs, that is, information about the flow of information on the net, in centralized places far from the human beings who controlled, or thought they controlled, the operation of the computers that increasingly dominated their lives,” Moglen said. “This was a recipe for disaster.”

No one making decisions about the architecture of the Internet, Moglen said, discussed its social consequences; the scientists involved were not interested in sociology or social psychology, or, for the most part, freedom. “So we got an architecture which was very subject to misuse. Indeed, it was in a way begging to be misused, and now we are getting the misuse that we set up.”

The logs created as diagnostic tools for broken computers were quickly transformed into a kind of CT scan of the people using them, finely scaled maps of their minds. “Advertising in the twentieth century was a random activity; you threw things out and hoped they worked. Advertising in the twenty-first century is an exquisitely precise activity.”

These developments, Moglen said, were not frightening. But, he warned: “We don’t remain in the innocent part of the story for a variety of reasons.

“I won’t be tedious and Marxist on a Friday night and say it’s because the bourgeoisie is constantly engaged in destructively reinventing and improving its own activities. And I won’t be moralistic on a Friday night and say that it is because sin is ineradicable and human beings are fallen creatures and greed is one of the sins we cannot avoid committing. I will just say that as an ordinary social process, we don’t stop at innocent. We go on. Which is surely the thing you should say on a Friday night. And so we went on.

“Now, where we went on—is really toward the discovery that all of this would be even better if you had all the logs of everything. Because once you have the logs of everything, then every simple service is suddenly a gold mine waiting to happen. And we blew it, because the architecture of the net put the logs in the wrong place. They put the logs where innocence would be tempted. They put the logs where the fallen state of human beings implies eventually bad trouble. And we got it.”

The locus of temptation, to dawdle with Moglen in the metaphysical, is not an actual place: the servers that held all this succulent data were not necessarily in a single physical location. Once the data was dragnetted from someone’s Facebook entries, for instance, it could be atomized, the pieces spread across many servers, and then restored in a wink by software magic. The data was in a virtual place, if one that was decidedly not virtuous. The data was in the cloud, and thus beyond the law.

“You can make a rule about logs, or data flow, or preservation, or control, or access, or disclosure,” Moglen said, “but your laws are human laws, and they occupy particular territory and the server is in the cloud and that means the server is always one step ahead of any rule you make or two, or three, or six, or poof! I just realized I’m subject to regulation, I think I’ll move to Oceania now.

“Which means that, in effect, we lost the ability to use either legal regulation or anything about the physical architecture of the network to interfere with the process of falling away from innocence that was now inevitable.”

In 1973, at age fourteen, Moglen had gotten a job writing computer programs for the Scientific Time Sharing Corporation in Westchester, north of New York City, work that he continued for one company and another for the next decade. By 1986, at age twenty-six, he was a young lawyer, clerking for the Supreme Court justice Thurgood Marshall, and also working his way toward a PhD in history, with distinction, from Yale. His dissertation was titled “Settling the Law: Legal Development in New York, 1664–1776.” At midlife, his geeky side, his legal interests, his curiosity about how human history was shaped, had brought him to the conclusion that software was a root activity of humankind in the twenty-first century, just as the production of steel had been an organizing force in the twentieth century. Software would undergird global societies.

Software, then, was not simply a rattle toy for playpens filled with geeks, the skeleton of amusements for a naïve public, but a basic moral and economic force whose complexity had to be faced coolly, with respect, not fear.

In the eighteenth century, Jeremy Bentham, a British social theorist, conceived of a prison where all the inmates could be seen at once, but without knowing that they were being observed. He called it the panopticon and predicted it would be “a new mode of obtaining power of mind over mind, in a quantity hitherto without example.” From the 1970s onward, the cypherpunks, many of them pioneers at leading technology companies, saw that dystopian possibilities were built into the treasures of a networked world.

Moglen said: “Facebook is the web with ‘I keep all the logs, how do you feel about that?’ It’s a terrarium for what it feels like to live in a panopticon built out of web parts.

“And it shouldn’t be allowed. It comes to that. It shouldn’t be allowed. That’s a very poor way to deliver those services. They are grossly overpriced at ‘spying all the time.’ They are not technically innovative. They depend upon an architecture subject to misuse and the business model that supports them is misuse. There isn’t any other business model for them. This is bad.

“I’m not suggesting it should be illegal. It should be obsolete. We’re technologists, we should fix it.”

The crowd roared. Moglen said he was glad they were with him, but he hoped they would stay with him when he talked about how to fix it. “Because then,” he said, “we could get it done.”

By now, Dan in his apartment, Ilya and Max in the auditorium, were mesmerized. His own students, Moglen said, comforted themselves that even though their Gmail was read by Google software robots for the purpose of inserting ads that were theoretically relevant to the content of their e-mails, no actual humans at Google were reading their correspondence. No one could entertain such a delusion about Facebook. News accounts based on various internal documents and sources suggested a streak of voyeurism on the premises.

“Facebook workers know who’s about to have a love affair before the people involved do, because they can see X obsessively checking the Facebook page of Y,” Moglen said. Any inferences that could be drawn, would be.

Students “still think of privacy as ‘the one secret I don’t want revealed,’ and that’s not the problem. Their problem is all the stuff that’s the cruft, the data dandruff, of life, that they don’t think of as secret in any way, but which aggregates to stuff that they don’t want anybody to know,” Moglen said. Flecks of information were being used to create predictive models about them. It was simple to deanonymize data that was thought to be anonymized, and to create maps of their lives.

The free-software movement could be proud of the tools it had created and protected from being absconded with. It was not enough. “We have to fess up: if we’re the people who care about freedom, it’s late in the game, and we’re behind,” Moglen said. “I’m glad the tools are around but we do have to admit that we have not used them to protect freedom because freedom is decaying.”

An illusion of convenience had eroded freedom, he said. “Convenience is said to dictate that you need free web hosting and PHP doodads in return for spying all the time because web servers are so terrible to run. Who could run a web server of his own and keep the logs? It would be brutal!” The crowd laughed: so many there actually did run web servers.

“What do we need?” Moglen asked.

“We need a really good web server you can put in your pocket and plug in anyplace. It shouldn’t be any larger than the charger for your cell phone and you should be able to plug it into any power jack in the world, and any wire near it, or sync it up with any Wi-Fi router that happens to be in its neighborhood.” Inside the little box would be software that would turn itself on, would collect stuff from social networks, and would send a backup copy of vital stuff—encrypted—to a friend’s little box.

It all might have sounded far-fetched, except that the plug-in computers were already being made; they cost ninety-nine dollars, a price that was sure to drop, and needed only the right collection of free software to run them. He ran through the requirements: a program for social networks, for blogging, streaming music, and so forth. The servers of the world were already running on the free software of GNU/Linux. “The bad architecture is enabled, powered by us,” Moglen noted. “The re-architecture is, too. If we have one copy of what I’m talking about, we’d have all the copies we need. We have no manufacturing or transport or logistics constraints. If we do the job, it’s done. We scale.”

That is: one copy of a piece of free software, and everything afterward is distributed over the air.

“It’s a frontier for technical people to explore. There is enormous social payoff for exploring. The payoff is plain because the harm being ameliorated is current and people you know are suffering from it.”

He reflected for a moment on the history of the free-software movement in meeting such challenges, and then moved back to the case in point. “Mr. Zuckerberg richly deserves bankruptcy,” and the crowd applauded.

“Let’s give it to him.”

A voice shouted from the aisles: “For free!”

“For free,” Moglen agreed.

This effort was not about Facebook. The architecture of the web provided scaffolding for “immense cognitive auxiliaries for the state—enormous engines of listening for governments around the world.” The software inside the plug-in computer could include special routing devices that disguise the digital traffic, making it harder to trace any individual on the Internet. “By the time you get done with all of that, we have a freedom box. We have a box that actually puts a ladder up for people who are deeper in the hole than we are.”

All this from free software. “The solution is made of our parts. We’ve got to do it. That’s my message. It’s Friday night. Some people don’t want to go right back to coding, I’m sure. We could put it off until Tuesday, but how long do you really want to wait? You know every day that goes by, there’s data we’ll never get back.”

The first critical problem was identifying a way to attack it. “The direction in which to go is toward freedom—using free software to make social justice.”

Someone shouted, “Yeah,” and the applause washed across the room.

“But you know this,” Moglen said. “That’s the problem with talking on a Friday night. You talk for an hour and all you tell people is what they know already.”

As the applause petered out, Max and Ilya felt like gongs that had been struck. They had arrived early for the talk, prodded by two teachers important to them: the adviser to the campus computer club, Evan Korth, who was also an officer of the New York division of the Internet Society; and Biella Coleman, an anthropologist who studied hacker culture and was Max’s senior paper adviser. Now they did not budge. The moment, the possibility, the necessity of what Moglen had mapped out was nothing less than an alternative universe. It called to their idealism, and held the transgressive promise of, maybe, subverting a powerful institution. Its gravity absorbed them. So, yes, Friday would be spent in the math and computer science building. They had plenty to explore right there and then.

“Max,” someone said from behind them.

Fred Benenson, who had graduated from NYU a few years earlier, had met Max at meetings of the campus branch of Students for Free Culture, a movement to ease copyright restrictions on the use of creative material.

Benenson could see that Max had been roused by Moglen’s talk, and he could not resist playing devil’s advocate. It was fine to talk about anonymity and privacy, he said, but in the real world, online retailers like Amazon and Netflix collected data from their customers, and used it to make recommendations on books, music, and movies.

“It’s incredible how powerful those recommendation engines are,” Benenson said. He had recently started at a job where he did research on just this kind of information gathering.

“You need data in aggregate form, even if it’s anonymized, to make these interesting features that the users expect,” he said.

“I know,” Max said.

Yet they both knew that even if the records were kept anonymously, it was possible to match them up with other data, and identify people who had not realized how vulnerable they were to being easily deanonymized through reverse engineering, often by comparing anonymous and public databases. One famous example involved movies people watched and rated on Netflix. In 2007, two researchers at the University of Texas, Austin, showed they could easily figure out a person’s supposedly private Netflix movie-viewing history by using other publicly available data.

A year earlier, Netflix had released 100 million movie ratings, from 500,000 people, and announced a $1 million prize for anyone who could improve the formula for making recommendations. The customer identities were scrubbed from the rankings. But the researchers were able to unmask them by comparing the Netflix movies they had privately rated with ones they had publicly discussed on IMDb, a website that catalogs and ranks movies.

That study was not the only example of “anonymous” data being laced with other, public information to trace its origins. AOL had released 20 million search queries after first stripping away identifying information like the users’ names and their computer addresses. The users were then assigned random numbers. It did not take long for Michael Barbaro, a business reporter with the New York Times, to track down the user who had searched for “numb fingers” and for “60 single men” and “dog that urinates on everything.” The same person also queried women’s underwear, landscapers in Lilburn, Georgia, and homes in a certain subdivision. The digital bread crumbs led directly to a sixty-two-year-old woman living with three dogs.

People wanted the convenience of recommendations, and many also wanted their privacy, and even sincere promises by web companies that data was being used only in big, random, atomized clumps were no protection, since information that had been teased apart could be reconstituted. The advances in computing power meant that a jar of bread crumbs could be turned into a full loaf.
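The linkage technique behind both the Netflix and AOL unmaskings can be sketched in a few lines. Everything below, the user numbers, names, movies, and matching rule, is invented for illustration; the actual Netflix study used far more careful statistical matching than this exact-join toy:

```python
from datetime import date

# Released dataset: names scrubbed, each user replaced by a random number.
anonymized = [
    (4817, "Brazil",           5, date(2005, 3, 14)),
    (4817, "The Conversation", 5, date(2005, 3, 20)),
    (4817, "Gigli",            1, date(2005, 4, 2)),
]

# Public profile: the same person rating the same movies under their own name.
public_imdb = [
    ("jane_doe", "Brazil",           5, date(2005, 3, 15)),
    ("jane_doe", "The Conversation", 5, date(2005, 3, 21)),
    ("jane_doe", "Gigli",            1, date(2005, 4, 2)),
]

def matches(anon_row, pub_row, slack_days=3):
    """Two ratings 'match' if movie and score agree and the dates are close."""
    _, movie_a, score_a, day_a = anon_row
    _, movie_p, score_p, day_p = pub_row
    return (movie_a == movie_p and score_a == score_p
            and abs((day_a - day_p).days) <= slack_days)

def deanonymize(anonymized, public, threshold=3):
    """Link an anonymous user number to a public identity when enough
    of their ratings line up."""
    links = {}
    for user in {row[0] for row in anonymized}:
        anon_rows = [r for r in anonymized if r[0] == user]
        for name in {row[0] for row in public}:
            pub_rows = [r for r in public if r[0] == name]
            hits = sum(any(matches(a, p) for p in pub_rows) for a in anon_rows)
            if hits >= threshold:
                links[user] = name
    return links

print(deanonymize(anonymized, public_imdb))  # → {4817: 'jane_doe'}
```

The point of the sketch is that no single rating identifies anyone; it is the overlap of several unremarkable facts that does, which is why scrubbing names alone was never enough.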

“How are you going to reconcile that aggregation of data with the desire to remain private?” Benenson said.

Max was ready. People should be allowed to choose if they wanted to be included in the mass repositories, he said. That is, they ought to be asked what they wanted.

“It should be a conscious, explicit, opt-in,” Max said.

“That’s a great answer,” Benenson said, surprised. “You’ve really thought about this. Wow. I’m inspired.”

After the lecture hall emptied, the computer club convened in the little office on the third floor of the computer science building, electricity surging. High spirits reigned. Rafi and his brother Mike had come to NYU after a splendid meal with their father.

“I just got there for the end of his speech,” Rafi said.

“It was awesome,” Ilya said.

By temperament, Rafi was measured, even after a few glasses of wine at dinner, a cool yin to Ilya’s hot yang. For that matter, he had developed immunities to lawyerly eloquence: his father, Abe Sofaer, was a retired federal judge, former senior counsel in the U.S. State Department, and partner in a big Washington law firm. Rafi, the youngest of five sons, also didn’t need to hear the others rave about a talk that he had mostly missed.

“No speech could be better than the meal I just had,” Rafi said.

Still, the mood was contagious. Even he was intrigued. They all were. And they kept talking about it, late into the night. It was February 2010. The four young men most moved by Moglen’s speech had met a few months earlier in the computer club office, a campus chapter of the ACM, or Association for Computing Machinery.

CHAPTER TWO

Just as in cartoons when toys come to secret, robust life after dark, being inside the Courant Institute of Mathematical Sciences in the middle of the night had an emancipating effect on members of the computer club in 2009 and 2010. A student or two learned to pick locks, a skill of necessity when late-night tinkering required something that was not at hand but could be cannibalized from a piece of hardware on the other side of a door.

Raphael Jedidiah Sofaer was the first of the family not born in New York City. He had been aiming toward the city throughout high school. The family had moved to Palo Alto, California, when his father was appointed a fellow at the Hoover Institution at Stanford. The family home in Palo Alto often hosted prominent government figures, and the Sofaer sons were always ready to debate. Rafi, baby-faced, was the most level-headed of the Diaspora group, eager to discourage overblown expectations of what they were going to accomplish, but also to spell out its necessity.

Before NYU, Rafi went to the Woodside Priory, a small independent boarding school founded by Benedictine monks near his family’s home. He spent summers at Hebrew camp. Hacking was not the center of his life, but a way to scratch his intellectual and political itches. Slight in build, short in height, and an acute listener, he would sit in a room filled with raging argument, a half-smile laced to his face. No one would mistake him for the stereotypical math whiz, corralled and isolated by his intellect and drawing satisfaction only from numbers; though not a practitioner of small talk, he was open and friendly.

Going to New York had been Rafi’s dream, but it was a long distance from the cozy atmosphere of the friary. And NYU itself is an unruly village that sprawls across square miles of lower Manhattan in patches from the financial district to Union Square. Many dormitories are a subway or bus ride from the core of classrooms near Washington Square Park. Students lug books and laptops for hours. Everyone improvises places to take breaks, preferably somewhere they can just put their stuff down.

The ACM room on the third floor of the Courant Institute was only a few hundred square feet, with five or six computers, but it offered a psychic niche, a hangout between classes. Rafi had been looking for a kind of nest, or safe harbor, and he found one in the ACM room.

At twenty-two, a few months from graduating, Max seemed to have found his footing in the tech world. He had come to NYU with wide-eyed dreams of being a musicologist, and went to work as a radio DJ on the campus Internet station, then took a part-time job with an independent record company. By the end of two years, the culture of nonchalance in the company, the all-purpose dismissive “whatever” that he heard when he pushed to get music out the door, had battered his spirit. “Nobody likes the enthusiastic music kid,” he said later. “I want to be in a situation to succeed, and have that be a good thing—and not to be shunned because of that.”

Born in 1987, Max had been reared, like his classmates, in the digital age. He was fluent from an early age in computer programming. His father had spent much of his life in retail food and beverage stores, having worked for the Campbell Soup Company and Gallo wines. As a boy, Max had tagged along on his dad’s visits to grocery stores, and took in the thinking behind their strict layouts: fresh foods like vegetables, meat, and dairy along the perimeters, and categories like soups, soda, and soaps in the center. They were completely engineered spaces, he thought, much like websites.

His capacious curiosity took him to a course in the anthropology of hackers, and there he became certain that the technology world would welcome his wholeheartedness. “In Silicon Valley, the enthusiastic kid who has the crazy idea gets very lucky,” Max would say. He was also dogged: good at completing a task. The previous September, he enrolled in a class on heuristics, strategies for solving problems. After the first meeting, he was approached by a slightly disheveled, smiling student. His name was Ilya Zhitomirskiy, and he had noticed Max wearing a button with the logo of the Electronic Frontier Foundation, a legal nonprofit that fought to keep civil liberties pinned into place as technology created new platforms for speech and commerce that had not been anticipated by centuries of legal precedent and law. Ilya was impressed.

“I really like your pin,” he said.

“Do you even know what it is?” Max asked.

Ilya snorted. “Of course I know what it is,” he said.

Officially, Ilya was not taking the heuristics class but dropped in because he had heard, as Max had, that the subject was hard and the professor interesting. That curiosity and fearlessness proved a strong adhesive. They had no doubt about the centrality of programming to life in the twenty-first century.

Ilya dropped in on another one of Max’s classes, taught by his adviser, Gabriella Coleman. She had asked the class a question: What happens when people are being watched? No one had the answer, but the question she posed and then answered herself stayed with Ilya.

When people are being watched, she had said, they perform. Ilya loved that insight. He was an expert at lighting the stage, literally.

Two years younger than Max, a puppy in spirit, Ilya was already well into PhD-level math courses. His grandfather was a mathematician. So was his father. When he was twelve, the family moved from Orekhovo-Zuevo, a town outside Moscow, to the United States, and began their American life in New Orleans. Ilya was decidedly off-kilter, nervous about fitting in, but saw no reason that he shouldn’t wear pants in the brightest neon colors that he could find, or giant plastic orange sunglasses.

At his first school in the new country, he was astounded by the familiarity of the teachers, with their cheery greetings of “Hi!” They had none of the sonorous formalities of the teachers in Russia who had taught him bits of the Queen’s English. One day the principal appeared at his classroom door and summoned Ilya to come with him to the office.

What, Ilya fretted, had he done?

At the office door, the principal turned to him. “I can’t get into my computer,” he said. “I lost the password. They tell me that you might be able to fix it.”

Five minutes later, the principal was back on his computer. Ilya was rewarded, to his amazement, with a one-hundred-dollar gift certificate. Still, he managed to run into a bewildering string of trouble with some Americans. The family moved to Boston from New Orleans, and he was not yet sure-footed in English. Another boy from Russia often translated for him. After weeks of this, Ilya realized that the other boy had been entertaining himself by deliberately warping the translation and watching Ilya squirm.

By junior year in high school, Ilya had full command of English, and the Zhitomirskiys had settled in Lower Merion, Pennsylvania, just outside Philadelphia. His Russian accent had been sanded down by four years in the United States. He decided to leverage his perennial newcomer act into a social project. Every day, he would meet a new person. Just walk up to someone he didn’t know and, in the easy American way, say, “Hi, I’m Ilya.” The trick was making it seem casual. In truth, before he approached strangers, the anxiety sweated into his palms, and he would have to dry them on his sleeves. He was one of the few kids in school who did not have a Facebook account. Instead, he built his own social network, one handshake at a time. Before the year was out, he was working with the stage crew on the school drama club productions, happy to dry his palms and pretend that climbing the high rails to adjust the lighting was no big deal. In the summer before his senior year, he knocked on the stage door of every theater in Philadelphia, looking for work. Few people had time for him, but he kept going.

At the Academy of Music, which was showing The Lion King, the manager said that he had no jobs, that Disney did all the hiring, and that Ilya wouldn’t be hired in any event because he wasn’t a union member.

Another rejection.

But then: “Do you want to come in and see how things work?” the manager asked.

Soon, Ilya was striding along catwalks in the upper reaches of the hall. Eventually, he landed tech jobs at a few small theaters. For college, he wanted a top-ranked math program, and was accepted to one at the University of Maryland, though his seat would not be available until the January after his graduation. That meant he had a term to kill, so he registered at Tulane in New Orleans, where he joined the juggling team and learned to unicycle. Following that semester, he transferred to the University of Maryland in College Park, and after math classes took up competitive swing dancing. He also power-kited, propelling himself over the crests of hills and gliding as far as he could. For meals, he practiced Dumpster diving—salvaging edible sandwiches and even sushi from the bins outside a coffee shop. The Maryland program was strong, but he was drawn to a place he had never lived: the heart of a big city. The Courant Institute of Mathematical Sciences at New York University, housed in Greenwich Village, piqued his academic and personal interests. So for the third time in three years, he began at a new college.

It was as if he had landed in a black-and-white photograph of the mythic, shimmering metropolis. His first dorm was across from the Brooklyn Bridge; his roommate was a jazz musician who came home from gigs at three in the morning and played a keyboard while Ilya worked on math problems. In the lounge of the math department, he would laughingly harangue students who spent time on Facebook; why, he wondered, were they wasting time with such fake relationships, where people just spread themselves out? Where was the joy of discovery? One day, as a prank, a classmate set up a Facebook account in Ilya’s name and friended everyone in their circle, including other students and the graduate teaching assistants. When he discovered the trick, it was the only time most people saw him truly angry. He confronted another student, Stephanie Lewkiewicz, with whom he had developed a romantic relationship. She conceded knowing that the scheme was afoot, but swore that she had not instigated it or taken part. Another student had.

“That guy friended all my TAs,” Ilya sputtered. He was shutting down the account. “What are they going to think? Everyone is going to be offended thinking I just defriended them.”

“But honestly,” Stephanie said, “I didn’t actually do it.”

“I believe you that you didn’t do it. You probably just thought it was hysterical,” Ilya said.

“I did,” she said.

He hunted for the Facebook account cancellation procedure, but had no faith that it would actually work. Somewhere, he was certain, his personal data was going to be stashed on a server.

“They’re not going to let me really delete it,” he told Stephanie. “It’s going to be stored on the Internet. It’s never going to go away.”

That single episode aside, he was a bright spirit among the muted tones of the math program. He often wore a shirt designed as an American flag and neon blue pants; he sat near the front, hand up, more with questions than with answers. His wandering curiosity brought him to Max’s heuristics class, and then to the computer club room, which had a much livelier hangout scene than the math lounge.

As it happened, Max and another senior, Dan Grippi, were officers of the ACM, and they had an intuitively subversive approach to computing. Or perhaps they saw it as unmapped land, mostly ungoverned. They turned the club office into a hangout through a satisfying series of small hacks. Until they took office at the start of their senior year, it was hard for anyone to just drop in, because only the club officers had keys. Max and Dan fixed that, though not by going to a locksmith for more keys. One night, they put a little radio frequency identification chip, known as an RFID, on the door. An RFID is a simple gadget that sends signals a few yards; it’s what lets an identification card swipe open a turnstile, or lets a driver pay a highway toll without cash. Max and Dan set up theirs to signal when the door was opened or closed. The signal was strong enough to reach the computers in the room, and a computer would then send out a tweet under a Twitter account registered as @acmroom.

So the door had its own Twitter account. It tweeted simple messages.

The ACM Room is open.
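The door hack described above is simple enough to sketch in a few lines. Everything in this example is hypothetical, since the book does not record how Max and Dan actually wired the sensor or posted to Twitter; the sketch only shows the core logic of turning door-state changes into status tweets, with the posting step left as a pluggable callback.

```python
# Hypothetical sketch of the @acmroom door-tweeting hack. In the real
# setup, an RFID sensor on the door fed state changes to a computer in
# the room, which posted to Twitter; here `post` stands in for that call.

def door_message(is_open):
    """Format the status tweet for a door-state change."""
    return "The ACM Room is open." if is_open else "The ACM Room is closed."

def watch_door(events, post):
    """Feed a stream of door events (True=open, False=closed) to a
    posting callback, skipping repeats so the account tweets only
    on actual state changes."""
    last = None
    for state in events:
        if state != last:
            post(door_message(state))
            last = state

# Instead of calling the Twitter API, collect the messages in a list.
sent = []
watch_door([True, True, False, True], sent.append)
print(sent)
# → ['The ACM Room is open.', 'The ACM Room is closed.', 'The ACM Room is open.']
```

Keeping the posting step behind a callback is what makes the sketch testable without a network; the real version would swap in a Twitter API client for the @acmroom account.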

Praise for More Awesome Than Money
 
“The courageous and ingenious actions of these four NYU students and the Diaspora hackers who come in their wake will make you want to stand up and cheer. In an age of self-absorbed tweeting and friending, these young people are our Rocky Balboas and Martin Luther Kings. This book is proof that we are no longer customers of social networks, but rather the merchandise. The advertisers are the true customers, and our private thoughts, desires, and needs are exploited, sold, and bartered among them like trading cards—long after we’ve hit the delete button. The tragic death of the talented programmer Ilya Zhitomirskiy stands as testimony to our own inertia about the commercial forces that seek to control us. I’m glad I met this young man on these pages, and I’m glad that the deeply talented Jim Dwyer—who also wrote the best book on 9-11 you’ll ever read—brought him and his friends to us with such stirring clarity. It’s a superb work, and a great read.”
—James McBride, author of The Good Lord Bird and The Color of Water, winner of the National Book Award

“Jim Dwyer’s More Awesome Than Money is the story of four young men who dared to go up against the (new) machine—in this case, Facebook. By turns funny, poignant, scary, heartbreaking, and hopeful, More Awesome Than Money includes everything you need to know about how your personal information is being manipulated on the Internet, and what to do about it.”
—Kevin Baker, author of The Big Crowd

“Books have been written about those who struck it rich in Silicon Valley. The four young idealists in this engrossing book did not. Their dreams of creating a more noble social network failed. Their names will not shadow Mark Zuckerberg. They may not be deemed ‘cool.’ In the deft hands of author Jim Dwyer, they are ‘cool,’ and complicated. We follow them down the rabbit hole as they, like other forgotten names, travel from euphoria, to doubt, to dissension, to dissolution. Readers of this suspenseful narrative will not soon forget the mountaintop-to-valley drama they endured, the classic business and human mistakes they made, nor the nobility of what they hoped to do.”
—Ken Auletta, author of Googled and Greed and Glory on Wall Street

“Failure is all too common for startups, but this is the best-told story of failure I’ve read. I was rooting for the improbable the whole way. It perfectly captures the texture of Silicon Valley’s humanity and dreams better than any success story could.”
—Kevin Kelly, founding editor of Wired, and author of What Technology Wants

“[A] worthy endeavor…Dwyer has painted a detailed portrait of the enormous difficulties facing female programmers and entrepreneurs in Silicon Valley.”—New York Times Book Review
 
“[A] tumultuous story of four young men…offers a useful vantage point for assessing the strengths and weaknesses of Silicon Valley’s culture…”—Wall Street Journal
 
“[A] lively account…[that] finds heroism and success, betrayal and even, ultimately, tragedy in the hurtling pursuit of a cause.”—Washington Post
 
“Dwyer’s account…is a thrilling read, astoundingly detailed and researched, alternately suspenseful and heartbreaking.”—Daily Beast
 
“[A] lively account of Diaspora’s creation as an alternative to the Silicon Valley megaliths. Like any account of the memorable early days of a revolution, Dwyer’s reporting finds heroism and success, betrayal and even, ultimately, tragedy in the hurtling pursuit of a cause.”—Denver Post
 
“A thoroughly compelling account recommended for those interested in general technology books and business narratives. This book is a welcome addition to the literature on start-ups, particularly for its focus on notions of privacy in the digital era and how entrepreneurs are working to address these critical needs.”—Library Journal
 
“This is a greatly informative book.”—Booklist

Praise for 102 Minutes

“A masterpiece.”—Kevin Baker, The New York Times

“A heartstopping, meticulous account.”—The New York Times Book Review

“Impressive.”—People magazine
