May 19, 2014
I have a confession to make, though it may diminish my professorial mystique. Before I wrote this, I was playing Real Racing 3 on my iPad.
Whenever I play the game, the developer, Electronic Arts, records my performance. The same is true every time I use a search engine, make an online purchase, or tweet something snarky: companies log it, watch it, and seek to exploit what they’ve learned. The panoptic gaze of these companies is likely to be far more complete than the U.S. government’s surveillance.
Reed Hundt is correct to decry the distinctive harms of excessive governmental surveillance but wrong to diminish the harms of collection by corporations. While a few of us may suffer intense injustice from the surveillance state, everyone suffers from exploitation by corporate collectors such as Google and Facebook. They gather even more data than does the state, and they use it all.
Two kinds of harm result from massive corporate collection: revelation and manipulation.
Writing in 1890, Samuel Warren and Louis Brandeis worried that new practices—such as photography and mass journalism—threatened “the right to be let alone.” They argued that everyone needs ample private space—free from surveillance—in order to form free thoughts. Indeed, they wrote that the possibility of the public revelation of letters, writing, art, conversations, and conduct imperils “inviolate personality.”
We have come to accept as routine much more invasive revelations than those Warren and Brandeis had in mind. Currently, that is the price of admission to the information age. Our experience of such violations can be mildly irritating, as Hundt suggests—for example, receiving advertisements for Brazilian bikini waxing after searching online for flights to Brazil. But they are not always benign. What about the father who angrily discovered that Target was sending coupons for maternity products to his teenage daughter, whom he believed couldn’t possibly be pregnant? As the New York Times reported, the company’s predictive model was right: the girl was pregnant and was hiding that fact from her parents.
And revelations can be materially harmful. Many employers search the social media footprints of potential employees to assess their rectitude. To the extent that high school and college behavior is available to the digital panopticon—and much of it is—those crucial spaces for the formation of our personalities are routinely exploited for other purposes.
Yet the deeper harm of digital revelation concerns what doesn’t happen. When we watch what we say and do, even with our friends and confidants, we close ourselves off to thoughts that we might have had: we change what we think.
The second harm of massive data collection is manipulation. Marketing in the twentieth century relied upon influencing a mass public, or big chunks of it, to buy or do things. Today the collection and analysis of data empowers organizations to influence people according to their individual characteristics.
This phenomenon, sometimes called micro-targeting, is much more effective than traditional advertising. An advertiser can guess how we will respond, not just in the aggregate or in large demographic slices (moms, urban shoppers, etc.), but as individuals. Micro-targeting allows organizations to better advance their ends, and if those ends match ours, it is purely coincidental.
Micro-targeting is especially pernicious in the political realm. Political campaigns will soon use big data to tailor messages to individuals, based upon study of specific behavior. Political marketers may be able to direct advertisements not just to the shows you watch, but also to your individual television through your satellite hookup, cable connection, or video streaming box. Campaigns are supposed to contribute to a deliberative process in which the democratic public weighs political options. Micro-targeting at best will produce a fragmented aggregation of individual views and at worst a cynically manipulated consensus.
I agree with Hundt that encryption would help limit collection by government. It would help with companies too. However, reliance on user-driven encryption could create a geek aristocracy in which those with know-how are protected, and the rest are vulnerable.
I offer two suggestions. First, we need to nurture broad norms that support privacy and update the vision of a free Internet for an era in which it is characterized not only by millions of decentralized creators but also by powerful information utilities such as Google. I worry that the values of many of the Internet’s creators and stewards—from Richard Stallman to Tim Berners-Lee and not least Hundt himself—cannot be sustained under enormous corporate pressure. My students talk much more in the technocratic terms of innovation, software development, and startups than in the democratic terms of freedom, creativity, access, and equality.
Second, we must develop organizations committed to privacy and broaden the constituencies supporting them. Transparency creates opportunities for political action and regulation. Mobilized constituencies should act on those opportunities.