Have the thought police won?
What happens when your darkest secret or most agonizing experience can be recorded, stored, duplicated endlessly and broadcast for the world to see? In a two-part article (part one here and part two here) for his Inside Higher Ed column, the author reviews a book on privacy in the 21st century.
PRIVACY DOES not appear to have much of a future. As with the Arctic ice, we have no plan in place to halt the erosion, let alone to restore what has already melted away. "The right to be let alone," wrote U.S. Supreme Court Justice Louis D. Brandeis in a classic formulation from 1928, is "the most comprehensive of rights and the right most valued by civilized men." The menace to privacy he was addressing came from the state; at issue, evidence collected by tapping a bar's telephone during Prohibition. He issued his opinion with a definite sense of the shape of things to come.
"Ways may someday be developed," he wrote, "by which the Government, without removing papers from secret drawers, can reproduce them in court, and by which it will be enabled to expose to a jury the most intimate occurrences of the home. Advances in the psychic and related sciences may bring means of exploring unexpressed beliefs, thoughts and emotions."
What he did not foresee, and probably could not, was how the right to privacy might be undermined--dissolved, really--by forces within the sphere of private decision making. Today, access to most of the digital tools and conveniences we've grown to depend on comes in exchange for permission to record and analyze our use of them. Every search and purchase, each comment and click, can now be assumed to go on your permanent record.
The scope and intensity of the monitoring are difficult, and unpleasant, to imagine. Meg Leta Jones, an assistant professor of communication, culture and technology at Georgetown University, sketches some of the facts of everyday life in the digital panopticon in her book Ctrl+Z: The Right to Be Forgotten (NYU Press):
When a user logs on to the Internet and visits a website, hundreds of electronic tracking files may be triggered to capture data from the user's activity on the site and from other information held in existing stored files, and that data is sent to companies. A study done by The Wall Street Journal found that the nation's top 50 websites installed an average of 64 pieces of tracking technology, and a dozen sites installed over 100...Some files respawn, known as zombie cookies, even after the user actively deletes them. All of this information is then sold on data exchanges...Sensors creating data about owners and others from phones, cars, credit cards, televisions, household appliances, wearable computing necklaces, watches and eyewear, and the growing list of "Internet of things" devices, mean that more personal information is being disclosed, processed and discovered. All of these nodes of discoverability will be added to the structure of discoverability sooner or later, if they have not been already.
THE SCALE and voracity of Big Data (which is to surveillance what Big Pharma is to medication) are not the only forces at work in liquidating the private sphere. The Internet and the roaming devices that feed into it--cameras, tablets, phones, etc.--now form the individual's auxiliary backup brain. Two imaginative exercises that Jones proposes will make clear the potential vulnerabilities this entails.
First, recall the most painful thing in your life: something so excruciating, regrettable or tinged with shame or guilt, or both, that you would burn out the very memory of it, were that an option. Then, think about the worst person you have ever known: someone whose capacity for, in Jones's words, "inconsiderate, nasty, spiteful or twisted behavior" you wish you had learned about early on and so avoided experiencing firsthand.
There's no substitute for a thick skin or good intuition--least of all the external, digitized gray matter used to store written and visual documents and to mediate what seems like a larger share of each succeeding generation's social interaction. Instead, it has the potential (even, to go by anecdotal evidence, a talent) for making things much worse. The dark secret or agonizing experience can be recorded, stored, duplicated endlessly and broadcast for all the world to see. The barrier between public and private self can vaporize with a press of the "send" button, whether in haste or by accident. Conversely, the malicious person might be able to cover his own tracks--taking down his slanderous blog or threatening comments, for example--and carry on after the damage is done.
Both could happen at once, of course: the vilest person you know could get hold of whatever you most want to hide or escape, and thereupon, presumably, burn your life to the ground. But the situations that Jones describes need not come to pass to have an effect: "Worry or fear may curb your behavior on- and off-line to avoid the risks of unwanted attention, misinterpretation or abuse."
It is tempting to reply that worry and fear can be useful emotions, and that learning "to curb your behavior on- and off-line to avoid the risks of unwanted attention, misinterpretation and abuse" is otherwise called growing up. One problem with moralism of this caliber is that it dodges a hard truth: yes, maturity includes knowing to avoid behavior that is dangerous or likely to have undesirable consequences. (Example: consuming a great deal of alcohol and telling someone what you really think of them.) Sometimes this knowledge must be acquired the hard way and mulled over in private; the important thing is that enough of it accumulates. The learning process involves recognizing and repudiating mistakes, and gaining a certain amount of distance from the earlier self who made them.
But what happens to the possibility of growth and change if the infraction can't be escaped? If a document of the regrettable memory is out there, retrievable by others, perhaps impossible to live down? For that matter, what becomes of "the right to be let alone" when seemingly private life generates data that is compiled and analyzed and sold?
The legal and moral implications require a rethinking of much of what we take for granted, and Jones is plugged in to many of the conversations.
WHEN WINSTON Smith discovers the blind spot in his apartment--the niche just out of range of the telescreen, Big Brother's combination video feed and surveillance system--it is, George Orwell tells us, "partly the unusual geography of the room" that allows him to take the risk of writing in a diary.
Later Smith finds another room with no telescreen at all, where he and Julia create another zone of privacy: the shared kind, intimacy. It can't last, of course, and it doesn't, with brutal consequences for both of them. (Thoughtcrime does not pay.)
The dystopia of Orwell's 1984 is very much the product of its era, which spanned roughly the period between Hitler's ascension to power in 1933 and Stalin's death 20 years later. And while the novel's depiction of a world without privacy can still raise a reader's hackles, its technology now looks both retrofuturist and surprisingly inefficient. The telescreens are menacing, but there's always a chance that Big Brother's watchers will overlook something. And look at the tools that Winston uses to carve out his own domain of personal memory and antitotalitarian sentiment: a pen and paper. The authorities manage to read his thoughts eventually, but it takes most of the novel to get to that point. Today, Winston would be in Room 101 before he powered down his notebook.
Jones's book arrives at a time when ever-fewer activities or communicative exchanges occur without some form of information technology intervening. Digital traces generated along the way are gathered, analyzed, sold. And the right to privacy becomes a little more purely notional each time one's eyes slide down the text of a user agreement on the way to clicking "accept."
A kind of fatalism is involved, one resting on the tacit but powerful tendency to assume that technology itself defines what information will be gathered, and how, and the use to be made of it. Implied is a trade-off between privacy and various benefits--with both the cost and the reward determined by what our devices do and require. Privacy is, in this view, a function of engineering necessities, not of political or moral decisions.
THE INITIAL, blunt challenge to technological determinism comes in Ctrl+Z's opening chapters, where Jones contrasts how the European Union and the United States frame their policies concerning the availability of personal information online. Here personal information would include employment history, financial data and arrest records, as well as, say, material communicated via social media.
In the United States, she writes, the default attitude "permits the collection and transfer of personal information and prevents abuse through self-regulation and market forces," while E.U. states "operate under comprehensive regimes that protect information across both the public and private sectors and are enforced by specialized data-protection agencies."
The contrast becomes striking when "data protection" might be better described as protecting the reputation or well-being of the individual to whom the data pertains. Take the case of someone who, as a young adult, is arrested for vandalism and destruction of property and serves a jail sentence, all of which was written up in a newspaper in 1990 as well as being documented in official records. Once released, he swears off his old ways and spends the next 25 years in steady employment and overall irreproachable conduct. Then one day he wakes to find that the newspaper has digitized its archives and made them searchable via Google.
If our reformed graffiti artist lives in America, he can do little if anything about it, apart from asking the paper to take down its accurate but deeply embarrassing article. There is also a chance his conviction will be publicized on any of various websites dedicated to posting mug shots.
In a number of E.U. countries, by contrast, he could appeal to laws that forbid public reference to someone's criminal record if it is no longer news or if the ex-con has undergone significant rehabilitation. He might also file a request with Google to remove links to sites mentioning the old transgression. In 2014, the Court of Justice of the European Union ruled that the search engine had to establish a take-down system for people who wanted personal information removed from its search results.
There are variations from country to country, but Jones finds that the E.U. "data subject" (in effect, the citizen's digital doppelgänger) can claim a "general right to personality"--a certain degree of dignified immunity from unwelcome attention. The American data subject, by contrast, is presumed to take the Wild West ethos of the Internet pretty much as a given, with any effort to delete information or limit its circulation being labeled, almost inevitably, as Orwellian. (Even so, a number of piecemeal efforts have been made in the United States to protect children and victims of harassment and bullying, including laws against revenge porn.)
But as Jones goes on to show, any preference for one of these frameworks over the other will soon enough be faced with the much harder matter of dealing with new and unanticipated shades of gray left out of the public/private distinction. And the other dichotomy--between having every bit of personal data (flattering, humiliating or neither) either preserved forever in a digital archive or destined for the memory hole--is also looking out of date. Jones's book doesn't predict what comes next, but it's a great stimulant for anyone bracing themselves to think about it.
First published in two parts (one and two) at Inside Higher Ed and slightly edited here to appear as a single article.