Commentary

Managing health privacy and bias in COVID-19 public surveillance

Illustration: a smartphone displaying a COVID-19 app in Rotterdam, the Netherlands, on April 19, 2020. (Photo by Robin Utrecht/ABACAPRESS.COM)
Editor's note:

A version of this blog post appeared on the website of the John Locke Foundation.

The Center for Technology Innovation is hosting a webinar titled “Public health surveillance, AI bias, and risks to privacy in the fight against COVID-19” on April 21 at 2:00 PM. Register to watch the event here.

Most Americans are currently under a stay-at-home order to mitigate the spread of the novel coronavirus, or COVID-19. But in a matter of days and weeks, some U.S. governors will decide if residents can return to their workplaces, churches, beaches, commercial shopping centers, and other areas deemed non-essential over the last few months.

Re-opening states will require widespread and immediate coronavirus testing, which may not be possible at this time because some standard supplies, like cotton swabs and reagents, remain in short supply. A comprehensive plan for COVID-19 contact tracing, which is actively tracking and monitoring people potentially exposed, is also required before resuming some level of normalcy. For contact tracing to be effective, the Centers for Disease Control and Prevention (CDC) recommends that individuals potentially exposed be quarantined, that large cadres of people be deployed as formal contact tracers, and that digital health tools be used to expand the reach and effectiveness of these workers.

On April 10, Apple and Google announced their response to the call for digital contact tracing, which would involve subscribers voluntarily downloading an app. Both companies issued press releases about the partnership, which would first release APIs in May to enable interoperability with apps from public health authorities. The next phase of the joint project will involve a Bluetooth-based contact tracing platform to allow for more interactions between individuals who opt in and public health authorities. Both companies have asserted that the design does not collect location data, and that no personal or health data is shared from anyone without a positive COVID-19 diagnosis.
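To make the decentralized approach concrete, the following is a minimal sketch rather than the companies' actual API: the function names, key size, and 10-minute interval here are illustrative assumptions. It shows how an opt-in app could broadcast only short-lived, rotating identifiers derived from a random daily key, so the Bluetooth beacon never contains a stable personal identifier.

```python
# Hypothetical sketch (not the actual Apple/Google API): deriving short-lived,
# rotating Bluetooth identifiers from a random daily key.
import os
import hmac
import hashlib

def new_daily_key() -> bytes:
    """Generate a fresh random key each day; in this model it never leaves the
    device unless the user tests positive and consents to share it."""
    return os.urandom(16)

def rolling_id(daily_key: bytes, interval: int) -> bytes:
    """Derive a short identifier for one broadcast interval (assumed ~10 minutes).
    Nearby devices see only this value, which changes every interval."""
    msg = b"rolling-proximity-id" + interval.to_bytes(4, "big")
    return hmac.new(daily_key, msg, hashlib.sha256).digest()[:16]

# Example: a day's worth of identifiers, one per 10-minute interval.
key = new_daily_key()
broadcast_ids = [rolling_id(key, i) for i in range(144)]
```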

Privacy and bias risks

While it is seemingly clear that widespread contact tracing and surveillance can help identify coronavirus cases and possible hot spots for new and recurring infections, several questions remain. The first one is related to the security and anonymity of one's personal data. Both firms have proposed that the use of Bluetooth-enabled technology will obscure the personal identities of the infected person and the people in close proximity. However, more discussion is needed on just how anonymous the data is and whether it can be easily de-anonymized, a risk that may discourage individuals from downloading the app. The platform also needs to ensure that the collected location data won't engender inferences about the infected person and his or her environment, e.g., the use of one's location as an indicator of neighborhood quality.

Second, who has access to the data also matters. While both companies have made assurances around their handling of collected data and the intent to stop tracking once the pandemic has ended, what expectations have federal and local public health authorities shared around their data collection and use? How long will the data be retained, and what is the risk that, the longer it is kept, it will be used for other purposes? In the absence of comprehensive federal privacy legislation, these are all important considerations.

In a worst case scenario, communities with higher rates of coronavirus infection could be subjected to geofencing by public health officials, enabled through location tracking, which would limit residents' mobility. Some countries are already deploying such digital tools, resulting in invasive mass surveillance. In response to such concerns, the tech companies have proposed a solution that relies upon advanced cryptography, where randomly generated IDs from devices are distributed through Bluetooth signals to others with the app. However, despite these technological safeguards, transparency from government agencies and potential third parties remains important, because they will ultimately hold the test results.
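A minimal sketch of the on-device matching step this rotating-ID design implies, under the same illustrative assumptions as the earlier snippet (the derivation, interval count, and function names are hypothetical, not the published protocol): the phone keeps only the identifiers it has overheard, and when health authorities publish the daily keys of users who tested positive (with consent), each phone re-derives those identifiers locally and checks for overlap. In this model, no location data or contact list ever leaves the device.

```python
# Hypothetical sketch of decentralized, on-device exposure matching.
import hmac
import hashlib

def rolling_id(daily_key: bytes, interval: int) -> bytes:
    """Same illustrative derivation as before: one short-lived ID per interval."""
    msg = b"rolling-proximity-id" + interval.to_bytes(4, "big")
    return hmac.new(daily_key, msg, hashlib.sha256).digest()[:16]

def ids_for_day(daily_key: bytes) -> set:
    """All IDs a diagnosed user's phone would have broadcast that day
    (144 ten-minute intervals, an assumption for illustration)."""
    return {rolling_id(daily_key, i) for i in range(144)}

def exposed(observed_ids: set, published_daily_keys: list) -> bool:
    """True if any ID overheard locally matches one derived from a published key."""
    return any(observed_ids & ids_for_day(k) for k in published_daily_keys)
```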

For these reasons, full encryption and cryptographic protection of collected health information, for those who are infected and for the people with potential exposure, must be the standard. Without the possibility of an enticing "back door" into the app, individuals who opt in to use the service will be better served and protected from potential misuse by government and other companies.

In such uncertain times, digital health surveillance will become a plausible supplement to the thousands of human contact tracers who may be overwhelmed by demand for their services. Other technology companies are also in discussions with the federal government about leveraging similar resources, like cell phone data or mobile advertising data, to fight the coronavirus. And certain government agencies are already using cell phone data to monitor the movement of individuals.

Any such use of digital tools should continue to raise legal and ethical questions around privacy to avoid unintended consequences for the people being helped. The conversation about the privacy risks should lend itself to a broader conversation on inequality, especially racial and ethnic profiling and the digital divide.

COVID-19 infections and fatalities have hit majority-black and brown cities the hardest, where poverty, lack of access to quality health care, dense living situations, and higher rates of pre-existing medical conditions converge. Contact tracing among individuals who live or spend a large portion of their time in these communities will indicate a higher likelihood that the people around them have been exposed to the virus, which could stigmatize rather than support them. Moreover, the recent revelation that the proposed smartphone apps may not work on older devices could inadvertently exclude these communities, where the cost of newer devices and limited access to technology are prohibitive.

Engaging in public health surveillance will be critical to reduce current and future outbreaks of COVID-19. But any supplement to traditional practices must be done in ways that ensure security, transparency, and more importantly, equity, especially at a time when the U.S. is expeditiously working to keep the curve of infections flat.


Google and Apple are general, unrestricted donors to the Brookings Institution. The findings, interpretations, and conclusions posted in this piece are solely those of the authors and not influenced by any donation.
