When we think of the wars that will take place in the coming decades—whether they're fought for oil, water, or something else—we also have to think about which war technologies are the biggest threat to human rights. Will it be killer robots? Virtual reality torture, perhaps? The answer may actually be closer to home than we'd like to imagine: government surveillance.
At Wednesday's Future of War conference in Washington, DC, a panel of legal experts discussed which human rights are most likely to be violated during the conflicts of tomorrow. Autonomous weapons were broached, as was the possibility of an endless state of war. But one topic unified all of these perspectives: the mass collection of increasingly varied kinds of data by nations with the authority to kill and detain people after analyzing it.
"Think of what drones really do [that's] revolutionary, or at least transformative: they're mainly intelligence, surveillance, and reconnaissance platforms," said Daniel Rothenberg, co-director of the Future of War project and law professor at Arizona State University. "They hold missiles, but they also hold the capacity to gather enormous amounts of information over the area they surveil."
"And that information isn't solitary—it's unified with human intelligence, with signals intelligence, and it allows for the picturing and imaging of targeting operations and all sorts of other operations that begin to take us to a place that we really haven't yet been or at least we don't fully recognize the implications," he continued.
"The gathering of data in a digital form presents an extraordinary affront to human dignity"
Dignity is the basis for all human rights, Rothenberg argued, and "dignity" is mentioned in Article 1 of the UN's Universal Declaration of Human Rights. Additionally, in November 2014 the UN General Assembly adopted a resolution that condemned mass surveillance and stated that it "may contradict the tenets of a democratic society."
"The gathering of data in a digital form, its correlation through current mechanisms and through future mechanisms, presents an extraordinary affront to human dignity that we don't yet even have the language to process," he said.
Rothenberg's statement comes just weeks after the Justice Department said it could not find the privacy impact assessment reports for the drone missions the Federal Bureau of Investigation has been flying for surveillance purposes inside the US since 2005. That admission came six months after the FBI refused to release the reports, which are supposed to be publicly available.
"It's not clear under what certain legal regimes and policy mechanisms this [data] is gathered, and it's certainly less clear under whose authority that material can be continually processed and analyzed forever," said Rothenberg.
Governments have amply demonstrated that they have no qualms about undertaking surreptitious activities, normally considered immoral if not illegal, in the pursuit of data. Last week, The Intercept revealed that the NSA and Britain's GCHQ had broken into the network of Gemalto, a global SIM card manufacturer, to steal millions of encryption keys that allowed the NSA to spy on potentially all voice, text, and data traffic on every cell phone carrying a Gemalto SIM card.
There's no reason to assume that these kinds of activities will abate. "If this is the beginning of this process, imagine where this will take us," Rothenberg said.
The legal standing of these kinds of actions is still up for debate. Gemalto may have an unparalleled opportunity to take the NSA to court over the intrusion, but legal opinion is divided on the correct path forward, according to Ryan Goodman, a law professor at New York University.
"One answer is that the law has a certain flexibility in it, just like how the Constitution has a certain flexibility to deal with certain types of technology today that were unimaginable when it was written," said Goodman. "The other answer might be that these really outstrip even the consideration of what the framers of those original codes anticipated, and what kinds of trade-offs they were considering. These are very different trade-offs."