Cognitive Surplus: Creativity and Generosity in a Connected Age
Clay Shirky
The Penguin Press, $25.95 (Hardcover)

Internet enthusiasts come in two flavors: utopians and populists. The rhetoric of both camps is revolutionary, but the revolutions are different.

Utopians believe that the Internet provides promising new solutions to our most intractable problems. With enough tweets, all global bugs—war, poverty, illiteracy, fascism—can be quashed.

Populists promise no such lofty goals. They see the profound social confusion sown by the Internet as a historic opportunity to snatch power from elites and their institutions and redistribute it more evenly among netizens, the ordinary citizens who have been empowered by the Internet. Like the participatory democrats of earlier eras, the populists want a more direct democracy, and they think that most social institutions, from the traditional media to political organizations, are unnecessary ballast.

Of the two camps, the populists—whose ranks include social innovators (Jimmy Wales of Wikipedia and Craig Newmark of Craigslist), professors of journalism (NYU’s Jay Rosen and CUNY’s Jeff Jarvis), and social-diva-cum-publisher Arianna Huffington—are in much better shape. Although the cyber-utopian project is not dead—consider the irrational exuberance over Iran’s “Twitter revolution”—the traces of evidence it relies upon don’t support a coherent or convincing theory.

The resurgent cyber-populists, in contrast, have a theory and a plan. For them, the Internet is what a hand-made grenade was to 19th-century Russian anarchists. They want to completely rewire our social relations in order to maximize the role that the individual plays in this new—to use their buzzword—“eco-system.”

Clay Shirky, an adjunct professor at NYU’s Tisch School of the Arts, is a towering figure in this camp, with considerable credibility among business executives, technologists, and media critics. Having cut his teeth at several dotcoms, Shirky emerged as a leading popular theorist of Web 2.0 and used his blog as his main publishing platform. His 2008 bestseller, Here Comes Everybody: The Power of Organizing Without Organizations, was Web 2.0’s equivalent of Thomas Friedman’s The World Is Flat. Building on insights from institutional economics and public-choice theory, Shirky argued that the Internet obviated the need for hierarchical structures and the sluggish organizations that perpetuated them: it was now possible to do things on the cheap—and, most importantly, on your own.

The success of Here Comes Everybody owed much to its propitious timing: public anxiety over the cultural barbarity of Wikipedia, YouTube, and MySpace was beginning to subside, while buzzwords like “crowdsourcing” and “the long tail” were becoming integrated into our everyday language. Thanks to his immense charisma (in his pre-Internet life Shirky was a theater director), his Gladwellian eye for the anecdotal, and, ironically, the growing institutional demand for his unabashedly anti-institutional Web 2.0 wizardry, Shirky found himself advising the World Bank, the U.S. State Department, and, in a bizarre turn, the Libyan government.

Cognitive Surplus: Creativity and Generosity in a Connected Age, Shirky’s new book, grew out of a fifteen-minute lecture that became an Internet sensation. It’s not hard to see why: Shirky is a rarity in today’s Internet punditry, where bad Twitter jokes increasingly pass for original insights. When Shirky talks, he gives the impression that he not only writes books, but actually reads them.

From gin to the Internet

The main argument of Cognitive Surplus rests on a striking analogy. Just as gin helped the British to smooth out the brutal consequences of the Industrial Revolution, the Internet is helping us to deal more constructively with the abundance of free time generated by modern economies.

Shirky argues that free time became a problem after the end of WWII, as Western economies grew more automated and more prosperous. Heavy consumption of television provided an initial solution. Gin, that “critical lubricant that eased our transition from one kind of society to another,” gave way to the sitcom.

More recently TV viewing has given way to the Internet. Shirky argues that much of today’s online culture—including videos of toilet-flushing cats and Wikipedia editors wasting 19,000 (!) words on an argument about whether the neologism “malamanteau” belongs on the site—is much better than television. Better because, while sitcoms give us couch potatoes, the Internet nudges us toward creative work.

That said, Cognitive Surplus is not a celebration of digital creativity along the lines of Richard Sennett’s The Craftsman or Lawrence Lessig’s “remix culture.” Shirky instead focuses on the sharing aspect of online creation: we are, he asserts, by nature social, so the Internet, unlike television, lets us be who we really are. “No one would create a lolcat to keep for themselves,” Shirky argues, referring to the bête noire of Internet-bashers, the humorous photos of cats spiced up with funny and provocative captions. “Cognitive surplus” is what results when we multiply our constantly expanding free time by the tremendous power of the Internet to enable us to do more with less, and to do it together with others.

Arguments about infinite digital opportunities for doing good have been a commonplace of cyber-utopians since the mid-1990s. But Shirky is a populist, not a utopian. His only benchmark of success is the relative standing of “us” against dominant institutions and, in particular, against the mind-numbing, brain-damaging, creativity-suppressing beast that is the traditional media.

For Shirky, doing anything online beats the passivity nurtured by the traditional media. The argument is beautiful in its simplicity: “the real gap is between doing nothing and doing something, and someone making lolcats has bridged that gap,” for “the stupidest possible creative act is still a creative act.”

To drive that point home, he proposes a thought experiment: while Americans spend 200 billion hours a year watching television, the whole of humanity spent something like 100 million hours to create Wikipedia (or, at least, its 2008 version). By that arithmetic, a single year of American TV viewing equals some 2,000 Wikipedias, so even a tiny change in our TV watching habits could yield significant social gains. Not every Internet project would become a Wikipedia—lolcats are still the currency of the day—but Shirky urges us to keep trying. Short-selling the Internet may prevent us from stumbling upon a technology as revolutionary as the printing press.

Unencumbered by facts

Shirky’s strong suit is not hard data, but clever anecdotes. He draws on a vast array of provocative and memorable stories—from anime communities in Japan to skaters in Santa Monica, garbage-collectors in Pakistan, and car-poolers in Canada—that help to bolster his thesis. But the anecdotes don’t make up for the lack of rigor. In a book that claims to document broad social shifts across different media eco-systems, revolutionary changes are presumed to be self-evident, linear, and transparent.

This is not the first time that Shirky has made very liberal use of his evidence. In Here Comes Everybody, he discussed how young Belarusians used blogs to organize anti-government flash mobs. When I pointed out that the campaign was short-lived and had died out at the very moment that the country’s secret police—which calls itself KGB even twenty years after the fall of the Soviet Union—started reading the same blogs, Shirky conceded that his book was “about social media rather than politics” and that it offered only “an imbalanced account of the arms race between citizens and their governments.” Unfortunately, Shirky’s confessions came too late: his flash-mob myth is now part of the populist lore.

In Cognitive Surplus, Shirky is similarly inventive. This time, the tech-savvy teenage protesters of South Korea make a prominent appearance. The South Korean example is worth discussing in detail because it highlights how easy it is to draw misleading conclusions from anecdotes.

For more than a month between May and June 2008, the streets of Seoul brimmed with tens of thousands of angry people, unhappy that newly elected president Lee Myung-Bak had lifted a five-year ban on imports of American beef. Many South Koreans felt that the ban, originally imposed because of fears of mad cow disease, had been rescinded too hastily, giving public safety a back seat to the exigencies of foreign policy.

So they took to Seoul’s parks and public squares and mounted candlelight vigils and sang “No to mad cow!” By late June, their efforts paid off: the president was forced to apologize on national television, reshuffle his cabinet, and add a few extra restrictions to the trade agreement.

Shirky zeroes in on the high-school students—most of them girls—who spearheaded the protests. He is particularly impressed to report that they learned about the ban through postings on an Internet forum dedicated to their favorite boy band. “Massed together, frightened and angry that Lee’s government had agreed to what seemed a national humiliation and a threat to public health, the girls decided to do something about it,” Shirky writes, pointing out that the band’s Web site “provided a place and a reason for Korea’s youth to gather together by the hundreds of thousands.”

For Shirky, this suggests nothing less than a revolution in revolution-making: “When teenage girls can help organize events that unnerve national governments, without needing professional organization or organizers to get the ball rolling, we are in new territory.” He uses the story to illustrate the limitations of the South Korean media in fostering such revolutionary pursuits: a similar protest would have been unimaginable in the sitcom age.

The media, he contends, were passive, as was their audience: “a large number of mostly uncoordinated amateur media consumers.” Meanwhile anything posted on the band’s site “was as widely and publicly available as any article in a Korean newspaper, and more available than much of what was on TV.” The girls “weren’t silent consumers but noisy producers themselves, able to both respond to and redistribute those messages at will”; as a result, “connected South Korean citizens, even thirteen-year-olds, radicalized one another” and were able to shake a government “used to a high degree of freedom from public oversight.”

But before the tale of candle-holding South Korean high schoolers forcing ministers to resign joins the Belarus myth, it might pay to look a little more carefully at what happened.

Discontent with Lee had been brewing before he lifted the ban, especially among students. One of his most controversial ideas involved a radical change to the country’s education system, which would have made English the language of instruction in most high schools. The candlelight protests also were not a novelty: the country went through a similar phase in 2002, when two girls were killed by a vehicle belonging to U.S. forces stationed in the country. Protests are common in South Korea, with about 11,000 annually.

Perhaps because of his scorn for the professional media, Shirky misses what may have been the real cause of the protests: a television report, provocatively titled “Is American Beef Really Safe from Mad Cow Disease?” that aired on PD Notebook, a current affairs program on the popular channel MBC. According to that program, a woman in Virginia had recently died from mad cow disease, the South Korean government had surrendered its sovereignty, South Koreans were genetically predisposed to the disease, and the disease could spread through the powdered soup base in instant noodles.

Shirky never mentions the TV show, nor does he say anything about the role of Korean celebrities in mobilizing the masses (a well-known actress claimed she would rather drink acid than eat American beef). Videos of the MBC broadcast did go viral online, and this “rebroadcast” played a role in getting people onto the streets. Still, rather than a triumph of the digital public sphere, the story of the high school protesters ultimately is an example of old-media alarmism spread with a little help from new-media friends.

The problem isn’t just that Shirky overlooks some facts. His central narrative—people vs. corrupt and irresponsible government—blinds him to the ambiguous implications of that mix of free time and Internet access that he celebrates as “cognitive surplus.” Yes, South Korea is prosperous and wired. But it still harbors numerous social ills that information technology may aggravate.

Shirky ignores South Korea’s epidemic of Internet addiction, from which 2 million residents (4 percent of the population) reportedly suffer. (Remember the South Korean couple that let their three-month-old starve to death while they reared their virtual child?) Nor does he mention the growth of xenophobic cyber-vigilante groups that troll social-networking sites in search of evidence that foreigners who come to teach English in the country behave immorally. And Shirky is similarly oblivious to the patriotic netizens who organize cyber-attacks on Japanese Web sites over matters as petty as figure skating. More substantial issues between the two countries—like the future of the disputed Liancourt Rocks islands—result in even greater online vitriol.

If your only metric of social progress concerns who has access to what tools and at what costs, such “negative externalities” do not matter. But if you are not already a committed populist, such risks may give you pause.

What Dwight Macdonald said of the work of Marshall McLuhan, that earlier media sage, aptly describes Shirky’s as well: “A single page is impressive, two are stimulating, five raise serious doubts, ten confirm them.” Macdonald also gave us an excellent diagnosis of this method:

McLuhan is a fast man with a fact. Not that he is careless or untruthful, simply that he’s a system-builder and so interested in data only as building stones; if a corner has to be lopped off, a roughness smoothed to fit, he won’t hesitate to do it.

When it comes to system-building and corner-smoothing, Shirky is an ultra-McLuhanite.

Populism without politics

A cyber-utopian polemic—a passionate call for the younger generation to ditch consumerist culture and pour its creative energies into fighting all the evil that exists in the world, one tweet at a time—could have made a worthy contribution. But Shirky’s populism urges us simply to stop worrying, love the Web, and focus on liberating ourselves from the oppression of the traditional media.

“However pathetic you may think it is to sit in your basement pretending to be an elf,” Shirky writes of those poor souls who waste too much time on computer games like World of Warcraft, “I can tell you from personal experience: it’s worse to sit in your basement trying to decide whether Ginger or Mary Ann [from Gilligan’s Island] is cuter.” We have fundamentally misunderstood media, he argues, and what it should offer us. “Media is actually like a triathlon. . . . People like to consume, but they also like to produce and to share. We’ve always enjoyed all three of those activities, but until recently, broadcast media rewarded only one of them.”

The Internet rewards all three. Furthermore, now that we know what really matters, we should disregard the people shilling—that is, working—for the print, radio, and television industries. Such people are only obstructing progress, and they will continue doing so, for “those deeply committed to old solutions cannot see how society would benefit from an approach incompatible with older models.”

Like other books that attack television, Cognitive Surplus conveniently glosses over the fact that the BBC, for example, has been churning out superb cultural programming for decades. Forced to choose between the shallow activity fostered by the production and consumption of lolcats and the enlightened passivity fostered by watching BBC Four’s in-depth documentaries, many people might reasonably favor the latter. But calling for publicly funded media, and, particularly, arguing that quality content matters, would be an Internet populist’s suicide.

The broader societal implications of Shirky’s argument are clear: universal access to tools for producing and disseminating information is the ultimate public good, even if it crowds out other such goods. To that end Shirky closes the book with a powerful—if abstract—call to arms:

We look everywhere a reader or a viewer or a patient or a citizen has been locked out of creating and sharing . . . and we’re asking, If we carve out a little bit of the cognitive surplus and deploy it here, could we make a good thing happen? (Emphasis original.)

Maybe. But Shirky’s digital populism not only blinds him, McLuhan-style, to inconvenient facts, it blinds him to the immense complexities and competing values inherent in democratic societies. He says he is writing about Western democracies, but they are unrecognizable in his book, for they appear to have been sterilized completely of social conflict.

Shirky presents a world without nationalism, corruption, religion, extremism, terrorism. It is a world without any elections, and thus no need to worry about informed voters. Class, gender, and race make a few appearances, but not as venues of systemic oppression. They are just more testimony to the mainstream media’s elitism. Describing the media habits of his young students, Shirky remarks that they “have never known a world with only three television channels, a world where the only choice a viewer had in the early evening was which white man was going to read them the news in English.”

But while Shirky seems content to gloss over the deficiencies of democratic politics and declare them transformed, a more sober analyst will realize that the transformation of those politics is far from complete and in fact requires more determined popular engagement. Even in the age of the Internet, the fate of the nation depends on who organizes in the public sphere, who shows up at the voting booth, and how well-informed those people are.

We want to cultivate voters who are less susceptible to propaganda than Shirky’s beloved South Korean teenagers. Very little suggests that we are enjoying greater success in this quest than we did in the golden era of network television. The environment of media scarcity produced voters who, on average, were far less partisan and far better informed about politics than are today’s voters. Yes, this was an accident—viewers had nothing else to watch at 9 p.m.—but the byproducts were valuable.

As Markus Prior points out in his excellent 2007 book Post-Broadcast Democracy, today’s environment of information abundance splits the public into a small cohort of news junkies, who know everything there is to know about politics, and a much larger contingent of entertainment fans, who know the names of the latest YouTube celebrities and their favorite lolcats, but not of their home senators. “Although it is comforting to know that [viewers] finally get to watch what they always wanted to watch,” Prior writes, “their newfound freedom may hurt both their own interests and the collective good.” That was the case with those South Korean Internet users, who helped to spread panic that harmed their country’s diplomatic standing.

Shirky, of course, would never talk about viewers’ interests: that is not populist-speak. Populists prefer to make normative claims about the need to break up the traditional media without specifying how we should nurture responsible citizenship and promote good public policy in their absence. This just happens, apparently.

But the Internet will not automatically preserve—never mind improve—the health of democratic politics. Yes, a wired future might look good for democracy if some of the social functions currently performed by traditional media are taken up by new Internet projects. But that outcome needs to be demonstrated—perhaps constructively aimed at—rather than assumed. For populists such as Shirky, the need for considered political commitment does not even merit discussion. The triathlon must go on, even if the athletes become brainwashed and bigoted.

To paraphrase an old gospel song, do we really want to get what we wanted—but lose what we had?