More than 280,000 people globally have died from COVID-19. As countries begin to contain the pandemic and consider slowly reopening their economies, more than twenty governments are developing, and pushing citizens to use, apps designed to detect when someone comes into contact with an infected person and to track how the infection spreads.
After years of privacy and surveillance scandals, from the Snowden revelations to Cambridge Analytica, people are wary of such apps. A recent survey found that more than half of Americans would not want to use this kind of app. That is a serious problem: a recent study from Oxford estimated that for contact tracing to be effective, at least 56 percent of the population needs to use the app. On top of it all, it’s not even clear how this kind of app would really help.
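One reason uptake matters so much: a contact can only be logged when both people involved are running the app, so under the simplifying assumption of random mixing, an adoption rate p captures only roughly p squared of all contacts. A back-of-the-envelope sketch (the function name is ours, purely for illustration):

```python
def detectable_contact_fraction(adoption_rate: float) -> float:
    """Both parties in a contact must run the app for that contact
    to be detected, so under random mixing the detectable fraction
    of contacts is roughly the square of the adoption rate."""
    return adoption_rate ** 2

# Even at the 56 percent adoption the Oxford study cites,
# only about a third of contacts between people are detectable.
print(round(detectable_contact_fraction(0.56), 2))  # 0.31
print(round(detectable_contact_fraction(0.25), 2))  # 0.06
```

This is why survey results showing that half the population would refuse the app are so damaging: halving adoption quarters the fraction of contacts the system can see.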
All this raises the question: is it ethical to refuse to use contact tracing apps during a deadly global pandemic?
We asked Brent Mittelstadt, a bioethicist at the Oxford Internet Institute at the University of Oxford who has studied the ethical implications of health monitoring devices and the digital ethics surrounding the use of personal data for health research.
Mittelstadt was also the lead author on a 2018 academic paper that attempted to explain whether people have a moral duty to participate in digital epidemiology, a field that studies the use of people’s internet and mobile data with the goal of tracking epidemics.
“When there is a serious threat to population health—such as in light of a possible pandemic—public health measures infringe on individual rights and interests for the sake of collective interests, i.e. the health of a population,” Mittelstadt and his colleagues wrote in the paper.
The researchers laid out eight “justificatory conditions,” such as a strong public interest in preventing disease, or ensuring that the “minimal amount of identifiable data necessary is being used,” to help people assess whether there’s a moral duty to opt in and participate in an epidemiological study, or to install an app whose purpose is tracking an epidemic or pandemic.
I caught up with Mittelstadt by phone last week. What follows is an edited version of our conversation.
Let's start with a personal question: Am I an asshole for putting privacy above anything else and refusing to use a Coronavirus tracking app?
Brent Mittelstadt: It's an interesting way of putting it. [LAUGHS] No, no. I mean, you can't make a blanket statement like that. Certainly not.
I can absolutely understand—I have hesitation about using a contact tracing app myself. I think it's very reasonable to have concerns about it.
I think at least here in the UK—I've been focusing much more on the development of contact tracing in the UK and EU more than in the US—it seems like there's been an absence of a positive case to reassure people on some of the privacy concerns where that reassurance is justified. But then also just to make this positive public health case, like: 'Hey, you know, you could actually be a significant help in fighting this thing.' I feel like the privacy side of it has really sort of overshadowed the rest of it.
Is it just because of all the stories about Cambridge Analytica and other privacy issues in the last few years? Or is it because governments have not been good at communicating what you just said: that this is perhaps more important than privacy, because it is about saving lives?
Yeah, I think before the pandemic started, privacy and data protection were certainly on the minds of a lot of people in Europe, in the UK, and in the USA. You have GDPR of course in Europe, and you have the California Consumer Privacy Act.
Basically [privacy] was a relatively hot topic before the pandemic started. I think it's also, in a way, a very reasonable or very predictable response to a pandemic situation. The positive case that I'm talking about is: you can help save lives, and here's why. Here's what you'd be giving up in terms of privacy, or here's what you'd be giving up in terms of data. And here's how that data that you're giving up could actually be helpful.
For example, making an argument for a particular app design or collecting a particular type of data that may or may not be on the table in other countries, all of that depends on evidence of efficacy or evidence of the utility of those things. You can make an argument for it up front, when you're actually designing the app, but you can't prove it, you can only prove it months or years down the line, once the thing already exists. There's a lot of uncertainty around the utility side and the efficacy side.
Because there is this absence of evidence I think it's quite natural to then turn to topics that feel much more concrete and privacy is one of those. If we know the type of data that the apps would collect, for example, then we can start to imagine all the different ways that those types of data could be misused.
It's also uncertain how the data will actually be used in the future. But we can look at past instances, like Cambridge Analytica, for example, and have a much more serious prediction about how these things can be misused in the future. There's been a lot of that, for example, around sharing identifiable data through these apps, such as your contact networks, which could have huge epidemiological benefit or utility. But that case hasn't been made, and almost cannot be made, at this point. So it's quite understandable to be very concerned about it.
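For context on what not sharing your contact networks can look like in practice: decentralized designs such as DP-3T and the Apple/Google exposure notification framework have phones exchange random, rotating Bluetooth identifiers and do the matching on the device, so no central server ever sees who met whom. A heavily simplified sketch of that idea (all class and function names here are hypothetical, not any real app's API):

```python
import secrets

def new_rolling_id() -> bytes:
    """A random 16-byte identifier, rotated frequently so that
    observers cannot link one phone's broadcasts over time."""
    return secrets.token_bytes(16)

class Phone:
    """Toy model of a handset in a decentralized scheme: it
    broadcasts random IDs and remembers the IDs it hears nearby."""
    def __init__(self):
        self.own_ids = []       # IDs this phone has broadcast
        self.heard_ids = set()  # IDs heard from nearby phones

    def broadcast(self) -> bytes:
        rid = new_rolling_id()
        self.own_ids.append(rid)
        return rid

    def observe(self, rid: bytes) -> None:
        self.heard_ids.add(rid)

    def check_exposure(self, published_ids) -> bool:
        """Matching happens on the device; the server only ever
        sees the random IDs of users who chose to report."""
        return any(rid in self.heard_ids for rid in published_ids)

# alice and bob meet; bob later tests positive and voluntarily
# publishes the random IDs his phone broadcast.
alice, bob = Phone(), Phone()
alice.observe(bob.broadcast())
bob.observe(alice.broadcast())

published = bob.own_ids  # voluntary upload on a positive test
print(alice.check_exposure(published))  # True: alice was near bob
```

The point of the design is that the published identifiers are random and reveal nothing about the social graph; contrast this with centralized approaches, where identifiable contact networks are uploaded to a public health authority.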
Considering all you've said about the current circumstances, and especially considering the extraordinary nature of this pandemic, do I have a moral duty to participate in something like that, downloading an app to save lives?
To be honest it's such a hard question to answer right now because some of the apps have been rolled out already. The one in the UK, the NHS app, is currently being tested, and it's being picked apart as we speak by engineers to see how it works exactly. But a lot of the concerns would be ways that the data that's being collected could actually be repurposed in the future, used for broader surveillance. A lot of the concerns, rightly so, are around future surveillance that's enabled by this sort of technology becoming commonplace or socially accepted.
So, I don't know. I'm torn on it because at least in the UK, I don't think we're at a point where I could definitely say even just to friends and family that there is an obligation—a moral duty—to use this app. I think it's very easy to say: look, if you're willing to accept the risks, the uncertainty that maybe the data could be used in ways you wouldn't completely agree with in the future. And maybe—in an ideal situation—you would not be sharing data, say, about your contact networks.
If you're willing to take on those risks yourself, then it can be effectively just an altruistic act to say yes, I am going to use this app because it could potentially help other people or save the lives of other people.
This is all sort of separate from the political conversations which should be going on at the same time around setting clear limits on how long the data is retained for, how long the app functions for, how the data can be reused in the future. Those conversations should absolutely be going on simultaneously. And even if we say now that it can be an altruistic act, it can be a good thing to use the apps because of their potential utility, it doesn't then give a blank check to public health authorities or to government to say: "okay, society has accepted this so we can use it however we please."
In the paper we were focusing primarily on potential moral duties on individuals to participate in epidemiological research. But there's also duties on the other side, placed on researchers and on data controllers, or whoever it is that's actually governing the systems. I think it's really important to remember that. We have to have some level of faith in our ability to govern things through law, through policy, through ethical frameworks. We've built these things up over time, and it feels like in a crisis situation, we're suddenly forgetting that actually, hey, we do have—depending on the country—relatively robust regulatory frameworks that can help us in these situations.
Given the magnitude of this pandemic, is there a case for governments to make the adoption of an app like this mandatory?
It would be really interesting to know what that would look like in specific countries. Because as soon as you start thinking about it being mandatory, then you run into the problem of who actually has phones that are capable of running it? What are the demographics of people that don't have access to phones, don't have access to the internet? What sort of problems do you run into in terms of bias in your data and the representativeness of your data? How are you going to actually address those problems?
I think in the UK, there are not significant enough safeguards in place. There's not nearly a strong enough framework in place yet to say that yes, it could justifiably be made mandatory. That's just on the downloading and actually using the app question. The question of making it mandatory for the purposes of say, limiting access to employment, or travel, or restaurants, or just society in general, that, to me is a completely different question. And something that should not even be on the table at this point.
Why is that question or discussion completely different?
If you're talking about making it mandatory as a way to basically limit people's access to society, to employment, we're talking about things that are connected to fundamental human rights. To suddenly turn around and use the pandemic as an excuse to severely limit people's ability to exercise their rights, to me that would require a completely different sort of justification, one that would not be accepted certainly within Europe, and I would hope in the U.S. as well, though I take nothing for granted in the U.S. in terms of politics.
This is sort of a complete authoritarian turn, where you're saying, okay, we're not just going to handle it by trying to detect cases early, we're actually going to prevent new cases by locking people down in their houses or preventing them from essentially participating in the economy and society.
How much does consent from citizens play into this whole discussion? How much importance should we give to consent?
You should always give it significant importance. It's one of the strongest protections that we have in Western society around medical research. And that shouldn't change. We're seeing quite a bit of importance being attached to it where apps are being designed so that disclosing that you've had a positive test for COVID is actually voluntary, that it's your choice to upload that to a public health authority and to researchers. And the same goes for downloading the app in the first place.
I feel like we need to give it that importance, because we have examples of countries that have successfully contained the virus without going towards more invasive options such as mandatory app usage or mandatory uploading of different types of identifiable data. The ideal here is that you would always ask for consent, that you would always respect people's privacy as much as you possibly could.
Anything else that you would like to tell people like me who don't really have any experience in this and may be soon faced with this dilemma?
I think because of our recent experiences with Cambridge Analytica and misinformation campaigns, the immediate gut reaction of a lot of people will be to say, “no, I'm not going to use that, because that's not information I want to share with anybody, or my privacy is more important than what's being asked for here.” I would just urge people to give it some serious consideration, to really talk to other people, and to actually take a look at what is being offered, not just in terms of the design of the apps, but also the governance of them.
Keep up with the evidence and see whether it turns out that these apps are particularly useful, or if they are being repurposed for other means, and are actually being used in a way that doesn't match up with the initial bargain that was being presented to the public.