Op-ed: The Case For Contact Tracing Apps Built On Apple And Google’s Exposure Notification System

Apple and Google have now updated their mobile operating systems to include a new capability for COVID-19 exposure notification. This new technology, which will support contact tracing apps developed by public health agencies, is technically impressive: it enables notifications of possible contact with COVID-positive individuals without leaking any sensitive personal data. The only data exchanged by users are rotating random keys (i.e., unique 128-bit numbers, strings of 0s and 1s) and encrypted metadata (i.e., the protocol version in use and transmitted power levels). When a user tests positive, her keys, but not her identity or her location, are shared with the network with the approval of a government-sanctioned public health app.

Despite being a useful tool in the pandemic arsenal and adopting state-of-the-art techniques to protect privacy, the Apple-Google system has drawn criticism from several quarters. Privacy advocates are dreaming up ways the system could be abused. Anti-tech campaigners are decrying “tech solutionism.” None of these critiques stands up to scrutiny.

How the exposure notification API works

To get a sense for how the Apple-Google exposure notification system works, it is useful to consider a hypothetical system involving raffle tickets instead of Bluetooth beacons. Imagine you were given a roll of two-part raffle tickets to carry around with you wherever you go. Each ticket has two copies of a randomly-generated 128-digit number (with no relationship to your identity, your location, or any other ticket; there is no central record of ticket numbers). As you go about your normal life, if you happen to come within six feet of another person, you exchange a raffle ticket, keeping both the ticket they gave you and the copy of the one you gave them. You do this regularly and keep all the tickets you’ve exchanged for the most recent two weeks.

If you get infected with the virus, you notify the public health authority and share only the copies of the tickets you’ve given out–the public health officials never see the raffle tickets you’ve received. Each night, on every TV and radio station, a public health official reads the numbers of the raffle tickets it has collected from infected patients (it is a very long broadcast). Everyone listening to the broadcast checks the tickets they’ve received in the last two weeks to see if they’ve “won.” Upon confirming a match, an individual has the choice of doing nothing or seeking out a diagnostic test. If they test positive, then the copies of the tickets they’ve given out are announced in the broadcast the next night. The more people who collect and hand out raffle tickets everywhere they go, and the more people who voluntarily announce themselves after hearing a match in the broadcast, the better the system works for tracking, tracing, and isolating the virus.
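The raffle-ticket scheme can be sketched in a few lines of Python. This is an illustration of the analogy only, not the actual protocol; the names and data structures are invented for clarity:

```python
import secrets

def new_ticket():
    """A randomly generated 128-bit ticket number, with no link to
    identity, location, or any other ticket."""
    return secrets.randbits(128)

# Each person keeps copies of the tickets they give out and the tickets
# they receive from others.
alice_given, alice_received = [], []
bob_given, bob_received = [], []

# Alice and Bob come within six feet of each other and exchange tickets.
t_alice, t_bob = new_ticket(), new_ticket()
alice_given.append(t_alice)
bob_received.append(t_alice)
bob_given.append(t_bob)
alice_received.append(t_bob)

# Alice tests positive. She shares ONLY the tickets she has given out;
# the tickets she received never leave her possession.
broadcast = set(alice_given)

# Everyone checks their received tickets against the nightly broadcast.
bob_exposed = any(t in broadcast for t in bob_received)    # Bob "wins"
```

Note that the broadcast reveals nothing about who Alice met: it contains only the numbers she handed out, and only the people who already hold those numbers learn anything at all.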

The Apple-Google exposure notification system works similarly, but instead of raffle tickets, it uses low-power Bluetooth signals. Every modern phone comes with a Bluetooth radio capable of transmitting and receiving data over short distances, typically up to around 30 feet. Under the design agreed to by Apple and Google, iOS and Android phones that have been updated to the new OS, have their Bluetooth radios on, and have a public health contact tracing app installed will broadcast a randomized number that changes every 10 minutes. In addition, phones with contact tracing apps installed will record any keys they encounter that meet criteria set by the app developers (public health agencies) on exposure time and signal strength (say, a signal strength correlating with a distance of up to around six feet). These parameters can change with new versions of the app to reflect growing understanding of COVID-19 and the levels of exposure that will generate the most value to the network. All of the keys that are broadcast or received and retained are stored on the device in a secure database.
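The key rotation works roughly like this sketch. In the published specification, a phone generates a daily key and derives from it the rotating identifier broadcast during each 10-minute interval (the real derivation uses AES over an HKDF-derived key; truncated HMAC-SHA256 is used below as a simplified stand-in, and the function names are invented for illustration):

```python
import hmac
import hashlib
import secrets

INTERVAL_SECONDS = 600  # broadcast identifiers rotate every 10 minutes

def daily_key() -> bytes:
    """A fresh 128-bit key, generated on-device each day and never
    tied to identity or location."""
    return secrets.token_bytes(16)

def rolling_identifier(day_key: bytes, interval: int) -> bytes:
    """Derive the 128-bit identifier broadcast during one 10-minute
    interval. Simplified stand-in for the spec's AES-based derivation."""
    mac = hmac.new(day_key, interval.to_bytes(4, "little"), hashlib.sha256)
    return mac.digest()[:16]

key = daily_key()
# The broadcast identifier changes every interval, so a passive observer
# cannot link one 10-minute broadcast to the next...
id_now = rolling_identifier(key, 2_650_000)
id_next = rolling_identifier(key, 2_650_001)
# ...but anyone later given the daily key can regenerate every identifier
# it produced, which is what makes positive-key matching possible.
```

The design choice to derive identifiers from a compact daily key, rather than storing every broadcast value, is what keeps the eventual upload small: a positive user shares a handful of daily keys, not thousands of individual broadcasts.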

When an individual receives a positive COVID-19 diagnosis, she can alert the network to her positive status. Using the app provided by the public health authority, and with the authority’s approval, she broadcasts her recent keys to the network. Phones download the list of positive keys and check to see if they have any of them in their on-device databases. If so, they display a notification to the user of possible COVID-19 exposure, reported in five-minute intervals up to 30 minutes. The notified user, who still does not know the name or any other data about the person who may have exposed her to COVID-19, can then decide whether or not to get tested or self-isolate. No data about the notified user leaves the phone, and authorities are unable to force her to take any follow-up action.
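The on-device matching step described above amounts to a local set lookup plus duration bucketing. A minimal sketch, assuming a simple dictionary of observed keys and a rounding-to-5-minutes scheme (the exact bucketing formula here is an assumption for illustration):

```python
# On-device secure database: identifiers this phone has observed,
# mapped to approximate minutes of exposure.
observed = {
    bytes.fromhex("aa" * 16): 12,   # seen for ~12 minutes
    bytes.fromhex("bb" * 16): 45,   # seen for ~45 minutes
    bytes.fromhex("cc" * 16): 3,    # seen for ~3 minutes
}

# Keys published by the public health authority after verified
# positive diagnoses, downloaded periodically by every phone.
positive_keys = {bytes.fromhex("bb" * 16), bytes.fromhex("dd" * 16)}

def exposure_reports(observed: dict, positive_keys: set) -> list:
    """Check for matches entirely on-device, reporting exposure
    durations in 5-minute increments capped at 30 minutes."""
    reports = []
    for key, minutes in observed.items():
        if key in positive_keys:
            bucketed = min(30, 5 * round(minutes / 5))
            reports.append(bucketed)
    return reports

# Nothing about the matching leaves the phone; only the notification
# is shown to the user.
```

Because the comparison happens on the phone, the server learns nothing about who downloaded the positive list or whether any match occurred.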

Risks to privacy and abuse are extremely low

As global companies, Google and Apple have to operate in nearly every country around the world, and they need to set policies that are robust to the worst civil liberties environments. This decentralized notification system is exactly what you would design if you needed to implement a contact tracing system but were concerned about adversarial behavior from authoritarian governments. No sensitive data ever leaves the phone without the user’s express permission. The broadcast keys themselves are worthless, and cannot be tied back to a user’s identity or location unless the user declares herself COVID-positive through the public health app.

Some European governments think Apple and Google’s approach goes too far in preserving user privacy, saying they need more data and control. For example, France has indicated that it will not use Apple and Google’s API and has asked Apple to disable other OS-level privacy protections to let the French contact tracing app be more invasive (Apple has refused). The UK has also said it will not use Apple and Google’s exposure notification solution. The French and British approach creates a single point of failure ripe for exploitation by bad actors. Furthermore, when the government has access to all that data, it is much more likely to be tempted to use it for law enforcement or other non-public health-related purposes, risking civil liberties and uptake of the app.

Despite the tremendous effort the tech companies exerted to bake privacy into their API as a fundamental value, it is not enough for some privacy advocates. At Wired, Ashkan Soltani speculates about a hypothetical avenue for abuse. Suppose someone set up a video camera to record the faces of people who passed by, while also running a rooted phone–one where the user has circumvented controls installed by the manufacturer–that gave the perpetrator direct access to the keys involved. Then, argues Soltani, when a COVID-positive key was broadcast over the network, the snoop could be able to correlate it with the face of a person captured on camera and use that to identify the COVID-positive individual.

While it is appropriate for security researchers like Soltani to think about such hypothetical attacks, the real-world threat posed by such an inefficient exploit seems minimal. Is a privacy attacker going to place cameras and rooted iPhones every 30 feet? And how accurate would this attack even be in crowded areas? In a piece for the Brookings Institution with Ryan Calo and Carl Bergstrom, Soltani doubles down, pointing out that “this decentralized architecture isn’t completely free of privacy and security concerns” and “opens apps based on these APIs to new and different classes of privacy and security vulnerabilities.”

Yet if “completely free of privacy and security concerns” is the standard, then any form of contact tracing is impossible. Traditional physical contact tracing involves public health officials interviewing infected patients and their recent contacts, collecting that information in centralized government databases, and connecting real identities to contacts. The Google-Apple exposure notification system clearly outperforms traditional approaches on privacy grounds. Soltani and his collaborators raise specious problems and offer no solution other than privacy fundamentalism.

Skeptics of the Apple-Google exposure notification system point to a recent poll by the Washington Post that found “nearly 3 in 5 Americans say they are either unable or unwilling to use the infection-alert system.” About 20% of Americans don’t own a smartphone, and of those who do, around 50% said they definitely or probably would not use the system. While it’s too early to know how much each component of coronavirus response contributes to suppression, evidence from Singapore and South Korea suggests that technology can augment the traditional public health toolbox (even with low adoption rates). In addition, there are other surveys with contradictory results. According to a survey by Harris Poll, “71% of Americans would be willing to share their own mobile location data with authorities to receive alerts about their potential exposure to the virus.” Notably, cell phone location data is much more sensitive than the encrypted Bluetooth tokens in the Apple-Google exposure notification system.
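The poll's two figures reconcile with the "nearly 3 in 5" headline by simple composition, which a few lines make explicit:

```python
no_smartphone = 0.20         # share of Americans without a smartphone
unwilling_owners = 0.50      # share of smartphone owners who say they
                             # definitely or probably would not use it

# Anyone without a smartphone is "unable"; half of owners are "unwilling".
unable_or_unwilling = no_smartphone + (1 - no_smartphone) * unwilling_owners
# 0.20 + 0.80 * 0.50 = 0.60, i.e. "nearly 3 in 5"
```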

Any reasonable assessment of the tradeoff between privacy and effectiveness for contact tracing apps will conclude that if the apps are at all effective, they are overwhelmingly beneficial. For cost-benefit analysis of regulations, the Environmental Protection Agency has established a benchmark of about $9.5 million per life saved (other government agencies use similar values). By comparison, the value of privacy varies depending on context, but the range is orders of magnitude lower than the value of saving a life, according to a literature review by Will Rinehart.

If we have any privacy-related criticism of the tech companies’ exposure notification API, it is that it requires the user to opt in by downloading a public health contact tracing app before it starts exchanging keys with other users. This is a mistake for two reasons. First, it signals that there is a privacy cost to the mere exchange of keys, which there is not. Even the wildest scenarios concocted by security researchers entail privacy risks from the API only when a user declares herself COVID-positive. Second, it means that the value of the entire contact tracing system is dependent on uptake of the app at all points in time. If keys were exchanged all along, then even gradual uptake of the app would unlock the value of the contact network that had accumulated before users installed it.

The exposure notification API is part of a portfolio of responses to the pandemic

Soltani, Calo, and Bergstrom raise other problems with contact tracing apps. They will result in false positives (notifications about exposures that didn’t result in transmission of the disease) and false negatives (failures to notify about exposure because not everyone has a phone or will install the app). If poorly designed (without verification from the public health authority), apps could allow individuals who are not COVID-positive to “cry wolf” and frighten a bunch of innocent people, a practice known in the security community as “griefing.” They want their readers to understand that the rollout of a contact tracing app using this API will not magically solve the coronavirus crisis.

Well, no shit. No one is claiming that these apps are a panacea. Rather, the apps are part of a portfolio of responses that can together reduce the spread of COVID and potentially avoid the need for rolling lockdowns until a cure or vaccine is found (think of how many more false negatives there would be in a world without any contact tracing apps). We will still need to wear masks, supplement phone-based tracing methods with traditional contact tracing, and continue some level of distancing until the virus is brought fully under control. (For a point-by-point rebuttal of the Brookings article, see here from Joshua B. Miller).

The exposure notification API developed by Google and Apple is a genuine achievement: it will enable the most privacy-respecting approach to contact tracing in history. It was developed astonishingly quickly at a time when the world is in desperate need of additional tools to address a rapidly spreading disease. The engineers at Google and Apple who developed this API deserve our applause, not armchair second-guessing from unpleasable privacy activists.

Under ordinary circumstances, we might have the luxury of interminable debates as developers and engineers tweaked the system to respond to every objection. However, in a pandemic, the tradeoff between speed and perfection shifts radically. In a viral video in March, Dr. Michael J. Ryan, the executive director of the WHO Health Emergencies Programme, was asked what he’s learned from previous epidemics and he left no doubt with his answer:

Be fast, have no regrets. You must be the first mover. The virus will always get you if you don’t move quickly. […] If you need to be right before you move, you will never win. Perfection is the enemy of the good when it comes to emergency management. Speed trumps perfection. And the problem in society we have at the moment is that everyone is afraid of making a mistake. Everyone is afraid of the consequence of error. But the greatest error is not to move. The greatest error is to be paralysed by the fear of failure.

We must move forward. We should not be paralyzed by the fear that somewhere someone might lose an iota of privacy.

CGO scholars and fellows frequently comment on a variety of topics for the popular press. The views expressed therein are those of the authors and do not necessarily reflect the views of the Center for Growth and Opportunity or the views of Utah State University.