Right now we're being watched. It might not be literal watching: it might be that a computer somewhere, owned by a government or a corporation, is collecting or mining the crumbs of data we all left around the world today. You probably know all this. But apart from the targeted ads, it's often difficult to grasp just what that watching means, now and down the road.
That's at least one reason we need to start scrambling our tracks. "Our data will be shared, bought, sold, analyzed and applied, all of which will have consequences for our lives," Finn Brunton and Helen Nissenbaum write in Obfuscation: A User's Guide for Privacy and Protest. "Will you get a loan, or an apartment, for which you applied? How much of an insurance risk or a credit risk are you? What guides the advertising you receive? How do so many companies and services know that you're pregnant, or struggling with an addiction, or planning to change jobs? Why do different cohorts, different populations and different neighborhoods receive different allocations of resources? Are you going to be, as the sinister phrase of our current moment of data-driven antiterrorism has it, 'on a list'?"
When it comes to maintaining their digital privacy, many people probably think of software like encrypted messaging apps and Tor browsers. But as Brunton and Nissenbaum detail in Obfuscation, there are many other ways to hide one's digital trail. Obfuscation, the first book-length look at the topic, contains a wealth of ideas for prankish disobedience, analysis-frustrating techniques, and other methods of collective protest. The aim, Brunton tells Motherboard, was to create an approach that could be adopted by people without access to the best tools or training in their use, or in situations where they can't get away with using strong crypto, for instance.
The project has its roots in the days before Edward Snowden's revelations. In 2011, Brunton and Nissenbaum, who are both professors at NYU focused on technology and privacy, struck up a conversation about Nissenbaum's TrackMeNot project, a lightweight Firefox browser extension that periodically makes false web queries to confuse online tracking technologies. Brunton mentioned his interest in decades-old techniques for concealing one's position from radar detection, and particularly in chaff, the material used by military planes to confuse enemy radar signals.
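TrackMeNot's core idea, periodically issuing plausible decoy searches so that real queries vanish into the noise, can be sketched in a few lines of Python. This is an illustrative assumption of how such a tool might work, not TrackMeNot's actual implementation (which, among other things, evolves its query list dynamically from feeds and past results); the seed terms and search URL are invented.

```python
import random
import time
import urllib.parse

# Hypothetical seed phrases a decoy generator might draw from.
SEED_TERMS = [
    "weather forecast", "chocolate cake recipe", "used cars",
    "movie showtimes", "python tutorial", "local news",
]

def decoy_query():
    """Build a plausible-looking search URL from two random seed terms."""
    phrase = " ".join(random.sample(SEED_TERMS, k=2))
    return "https://search.example.com/?q=" + urllib.parse.quote_plus(phrase)

def run(interval_seconds=60, rounds=3):
    """Periodically emit decoy queries, mimicking the extension's cadence."""
    for _ in range(rounds):
        url = decoy_query()
        print("decoy:", url)  # a real extension would actually fetch this URL
        time.sleep(interval_seconds)

run(interval_seconds=0, rounds=3)  # print three decoys back to back
```

The point of the design is that an observer logging queries cannot cheaply tell the handful of genuine searches from the stream of automated ones.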
Chaff consists of strips of black paper backed with aluminum foil, cut to half the wavelength of the radar it is meant to defeat. When released by the pound from an invading airplane, the material can fill a radar screen with more signals than a human operator can handle—a "perfect and intuitive example" of signal obfuscation, says Brunton.
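The "half the wavelength" figure is simple arithmetic: wavelength is the speed of light divided by the radar's frequency. For an assumed 3 GHz surveillance radar (the frequency here is illustrative, not from the book), the resonant strip length comes out to about five centimeters:

```python
# Chaff dipoles reflect most strongly when cut to half the radar's wavelength.
C = 299_792_458   # speed of light, m/s
freq_hz = 3e9     # radar frequency (illustrative assumption: 3 GHz)

wavelength_m = C / freq_hz          # ~0.1 m
chaff_length_m = wavelength_m / 2   # ~5 cm strips

print(f"wavelength: {wavelength_m * 100:.1f} cm, "
      f"chaff strip: {chaff_length_m * 100:.1f} cm")
```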
You figure out what you need—time—and what your adversary is looking for—radar pings—and you give them an overload of precisely that
"For those situations when you can't escape observation—you can't decide not to fly, you can't choose to fly in a radar-transparent plane—you figure out what you need—time—and what your adversary is looking for—radar pings—and you give them an overload of precisely that," he says. "Fake signals that conceal the real one, buying you the minutes you need in a cloud of 'false echoes.'"
"In the course of trading ideas back and forth, we suddenly realized there was a shared shape to these two problems," he recalls.
But in an environment made of data, a world of laptops, desktops and mobile devices, different strategies are required. Each piece of technology may require a variety of tactics at once. "One of the pertinent issues about obfuscation is its increasing utility for our mobile environment where more and more of our data and activity are offloaded to the cloud, where tools for generating a proliferation of ambiguous, confusing, and misleading data are especially useful," Brunton says. "That's one of the most promising areas for future research."
Brunton and Nissenbaum also see a social purpose in obfuscation, born out of a responsibility that those who have "nothing to hide" owe to those who might: "to conceal, muddle, and obfuscate our activities precisely to confuse the construction of normalcy that can be used to identify the abnormal and secretive." Uploading or appearing to upload non-sensitive material to a whistleblowing website, for instance, could help bury the identity of a leaker in noise, which "helps protect the one who really needs it."
Excessive documentation is a similar tactic. If a user is obliged to provide documents and materials, rather than turning over specific documents, they might hand over a massive ream. It is both a delaying tactic and a way of burying whatever information could be of use to the adversary. Think of sending pallets of documents in a court case to frustrate prosecutors, a tactic that can be used by good and bad actors alike, from dissidents to misbehaving governments.
A multi-use name can also serve privacy purposes. The hacktivist group Anonymous is a recent example, but collective-name projects go back centuries, from the British Luddites' "Captain Swing" to "Poor Conrad," the name used by secret peasant groups in rebellion against the Duke of Württemberg in 16th-century Germany. Brunton deems this a good approach to concealing the actions of any one individual within the actions of many. Anonymous, Brunton writes, refuses "the culture of celebrity, publicity, and reputation for any one person."
Still, many of us are easily named and tracked by a system that a pseudonym alone can't foil. "Everyday internet users are trivially identified through things like cookies and browser fingerprinting," he writes. "Collective names—unless handled with the deliberate care and opsec that characterizes the best Anonymous operations—are more of a gesture than protection and should be treated as such." While Tor and strong encryption can foil spies, governments still rely on a wide array of legal and technical tools to unmask internet users and break through passwords. In the U.S. those tools are aimed at terrorists and criminals that officials warn are "going dark"; in countries with poorer human rights records, those tools can be aimed at dissidents and journalists.
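Browser fingerprinting works by combining traits a site can observe passively into a stable identifier. A toy sketch of the idea follows; real trackers use far more signals (canvas rendering, installed fonts, audio quirks), and the attribute values below are made up for illustration:

```python
import hashlib

def fingerprint(attributes: dict) -> str:
    """Hash passively observable browser traits into a short stable ID.
    Sorting keys makes the result order-independent and repeatable."""
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Hypothetical traits a site could read without cookies or consent.
browser = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64) Firefox/115.0",
    "screen": "1920x1080",
    "timezone": "Africa/Johannesburg",
    "language": "en-ZA",
}

print(fingerprint(browser))  # same browser, same ID, across sites and sessions
```

Because the ID is derived rather than stored, clearing cookies doesn't reset it, which is why a pseudonym alone offers little protection.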
For Brunton and Nissenbaum, one exemplary instance of hiding in plain sight was Operation Vula. This covert action in apartheid South Africa was remarkable for its time: a secret communication network built on cryptic phone calls, a cipher, a recorder, and a personal computer.
In the mid-1980s, leaders of the African National Congress in South Africa needed a way to communicate with the imprisoned Nelson Mandela, as well as with sympathizers and generals around the world. First, the ANC encrypted a message on a personal computer using simple one-time pads. Then, they expressed the ciphered message, as Brunton and Nissenbaum explain, as a "rapid series of tones recorded onto a portable cassette player." An agent would then visit a public pay phone and dial a London phone number, which was routed to an answering machine that one of the network's architects had modified so that it recorded up to five minutes' worth of sound.
The agent placing the call would play that recording into the phone mouthpiece, and the tones recorded on the receiving end were then played on an acoustic modem into a computer and, finally, decrypted.
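The one-time pad at the heart of Vula's scheme can be illustrated with a minimal modern sketch: XOR each message byte with a byte of truly random key material that is kept secret, is at least as long as the message, and is never reused. This shows the general cipher, not the ANC's actual software:

```python
import secrets

def otp_encrypt(plaintext: bytes, pad: bytes) -> bytes:
    """XOR each message byte with a pad byte. The pad must be random,
    secret, at least as long as the message, and used exactly once."""
    assert len(pad) >= len(plaintext), "pad too short"
    return bytes(p ^ k for p, k in zip(plaintext, pad))

# XOR is its own inverse, so decryption is the same operation.
otp_decrypt = otp_encrypt

message = b"Vula is operational"          # illustrative plaintext
pad = secrets.token_bytes(len(message))   # fresh random key material

ciphertext = otp_encrypt(message, pad)
assert otp_decrypt(ciphertext, pad) == message
```

Used correctly, the scheme is information-theoretically unbreakable; the hard part, as Vula's couriers and pay-phone tradecraft show, is distributing and protecting the pads.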
"For me, the crucial thing about Operation Vula is that it shows a full, complex covert system in action—that it's not about single magic fixes but a lot of different elements carefully implemented together: encryption, tradecraft, help from people with specialized skill-sets and occupations, and obfuscation," Brunton says. "They encrypted their communications, but also found a context where encrypted messages wouldn't arouse suspicion or be singled out for particular scrutiny."
Brunton sees Operation Vula as proof that there are no "turnkey solutions" for privacy in the internet age, and that there is plenty of peril. Systems like the one Operation Vula used are anything but simple, requiring not only technical know-how but persistence: regimens like this can be difficult and tiring to abide by perfectly, all the time.
The advance of technology means that old methods are easily made obsolete. Brunton points to facial recognition and routine video surveillance as two of the trends that will require new solutions. There are emerging methods for foiling face recognition systems, like strange makeup or eyeglasses, for instance, but these may be short lived as software and hardware advance. Whatever happens, Brunton sees a crisis in the making for a free society.
What of the ethics of obfuscation? While critics might see individuals misleading and misdirecting businesses and states as inherently dishonest, Brunton and Nissenbaum generally believe the ends justify the means. For them, limiting corporate or government data mining is, in most instances, ethically defensible.
Social media services and data brokers might argue that obfuscation tactics pollute the data flow, contaminating its integrity. For Brunton and Nissenbaum, "[d]ata pollution is unethical only when the integrity of the data flow or data set in question is ethically required." The repository must have general value, the authors argue, not just value for the data collector, if its integrity is to override the interests of the obfuscator. (It could be argued that when Google and others mine our data, they aren't doing it just for their investors, but to create better services for users. But this doesn't seem to pass Brunton and Nissenbaum's test.)
Obfuscation ethics gets a bit more complex when it comes to the issue of free riding. If a user has an ad blocker turned on, Brunton and Nissenbaum acknowledge that Facebook and Foursquare can legitimately ask whether these users are free riding off their services, not fulfilling their end of a terms of service agreement. But if the obfuscation method is freely available and doesn't put other non-obfuscators at a disadvantage, then the authors believe there is "no moral wrong." If, on the other hand, the method isn't freely available to the average user, and puts non-obfuscators at a disadvantage, then the ethics need further probing.
Governments and corporations are likely to continue to fight obfuscation, but they will always be engaged in battle with those seeking privacy
In the Los Angeles Review of Books, critic Rob Horning worries about the unintended effects of "muddying the waters": "Our phony data may feed into algorithmic models determining whether other people will receive discounts, get offered loans, or end up on government watch lists." And obfuscation isn't just hard and complicated, Horning warns: its strategies do little to weaken the systems of power they try to evade, and may be counterproductive.
"They provide means of getting along under conditions of enemy occupation, not strategies of revolution or resistance," he writes. "They consolidate rather than dismantle the asymmetries of power; they are fugitive, rearguard actions in the midst of an ongoing collective surrender. As clever and satisfyingly creative as obfuscation's false flags, diversions, and decoys can be, they do not speak truth to power so much as mock it behind its back."
Brunton and Nissenbaum do not propose obfuscation as a be-all-end-all approach to surveillance, but rather a starting point, and as one tool among many. And as technology evolves, and deepens the electronic web we live in, obfuscation techniques will also evolve. Governments and corporations are likely to continue to fight obfuscation, but they will always be engaged in battle with those seeking privacy, both for themselves and for society at large.
The book, its authors imply, is as much a handbook for today as a primer for tomorrow. If the prophecies about data collection are true, obfuscators, by disrupting or at least complicating the information asymmetry between powerful businesses and states and their often unaware users, can help level an inherently uneven playing field. Perhaps then, the book implies, those who hide won't be dishonest sneaks but critical players, essential in the effort to take back power, little by little.