When Privacy News Online first wrote about Covid-19, in February, we noted that it would touch on key concerns of this blog: freedom of speech, surveillance and privacy. Already by March, it was becoming clear that the actions taken by governments to deal with the pandemic posed a serious threat in that regard. Since then, this blog has reported on various examples of how privacy was being eroded as a result of national responses to Covid-19.
The US human rights organization Freedom House has produced a report called “The Pandemic’s Digital Shadow”, which reviews what has been happening around the world since February (available online and as a free downloadable PDF). It has two main sections. The first looks at how governments are using the pandemic as a pretext to crack down on free expression and access to information. The second part, entitled “False Panacea: Abusive Surveillance in the Name of Public Health”, looks more closely at the impact on privacy of pandemic responses. The section picks up on many of the themes that have appeared on this blog over the last six months:
Brick by brick, governments and companies responding to the public health crisis are laying a foundation for tomorrow’s surveillance state. Opaque smartphone apps collect biometric and geolocation data in an effort to automate contact tracing, enforce quarantines, and determine individuals’ health status. State agencies are gaining access to larger swaths of user data from service providers in a process that lacks oversight and safeguards against abuse. Police and private companies are accelerating the rollout of advanced technologies to monitor citizens in public, including facial recognition, thermal scanning, and predictive tools.
It provides a good discussion of which countries have deployed contact tracing apps – some of them compulsory, and even requiring mandatory Bluetooth bracelets to ensure round-the-clock monitoring. On the plus side, the authorities in countries like Estonia, Brazil and the US are using privacy-respecting architectures such as DP3T. This shows that it is possible to tackle the Covid-19 pandemic without jettisoning privacy protections, although the report notes that even these decentralized, opt-in contact-tracing apps are not risk-free.
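The decentralized idea behind architectures like DP3T can be illustrated with a short sketch. This is a deliberately simplified illustration, not the actual protocol: each phone derives short-lived ephemeral IDs from a secret daily key, broadcasts them over Bluetooth, and all matching happens locally on the device. The epoch count and key-derivation details below are invented for illustration.

```python
import hashlib
import hmac
import secrets

EPOCHS_PER_DAY = 4  # the real protocol rotates IDs far more often; simplified

def next_day_key(sk: bytes) -> bytes:
    """Day t+1 key is a one-way hash of day t's key (old keys unrecoverable)."""
    return hashlib.sha256(sk).digest()

def ephemeral_ids(sk_day: bytes) -> list[bytes]:
    """Derive the day's broadcast IDs from the daily key with a PRF."""
    prf = hmac.new(sk_day, b"broadcast key", hashlib.sha256).digest()
    return [hashlib.sha256(prf + i.to_bytes(2, "big")).digest()[:16]
            for i in range(EPOCHS_PER_DAY)]

# A diagnosed user uploads only their daily keys; other phones re-derive
# the ephemeral IDs and compare them against IDs they overheard locally.
sk = secrets.token_bytes(32)
heard = {ephemeral_ids(sk)[2]}      # an ID another phone recorded nearby
rederived = set(ephemeral_ids(sk))  # recomputed from the uploaded key
print(bool(heard & rederived))      # local match => exposure notification
```

The key property is that the central server never learns who met whom: it only relays the keys of diagnosed users, and the risk calculation stays on each phone.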
Alongside the use of smartphone apps that often act, in effect, as spies people carry around in their pockets, the researchers found that governments in at least 30 countries are using the pandemic to carry out mass surveillance through direct partnerships with telecom companies and others. Often a country’s intelligence services are given access to personal data from mobile phone operators, with little or no oversight. Again, on the plus side, the authorities in Australia and the US have instead drawn on aggregated and anonymized data sets in order to understand population movements and the effectiveness of social-distancing regulations. However, such anonymized data sets are always vulnerable to de-anonymization, for example when combined with other data sets, or analyzed using big data tools to find underlying patterns. The lack of robust privacy protections in some countries exacerbates this problem.
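The de-anonymization risk mentioned above can be made concrete with a toy linkage attack: an "anonymized" mobility data set keyed only by a random ID is re-identified by joining its quasi-identifiers with an auxiliary data set the attacker already holds. All records, names, and field names here are invented for illustration.

```python
# "Anonymized" records: names stripped, but home/work cells retained.
anonymized = [
    {"id": "u1", "home_cell": "A12", "work_cell": "B07"},
    {"id": "u2", "home_cell": "C03", "work_cell": "B07"},
]

# Auxiliary data an attacker might already hold (e.g. scraped profiles).
auxiliary = [
    {"name": "Alice", "home_cell": "A12", "work_cell": "B07"},
    {"name": "Bob",   "home_cell": "C03", "work_cell": "B07"},
]

def link(anon, aux):
    """Re-identify any record whose quasi-identifier pair is unique."""
    matches = {}
    for a in anon:
        hits = [p["name"] for p in aux
                if (p["home_cell"], p["work_cell"]) ==
                   (a["home_cell"], a["work_cell"])]
        if len(hits) == 1:  # a unique home/work pair pins down one person
            matches[a["id"]] = hits[0]
    return matches

print(link(anonymized, auxiliary))  # {'u1': 'Alice', 'u2': 'Bob'}
```

Even two coarse location attributes are often enough to make a record unique, which is why removing names alone does not make a data set anonymous.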
This section also notes the rise of AI-based surveillance as a way of controlling the pandemic and populations. Privacy News Online has been reporting on this issue in China, which is the leader in this sphere, but the new report notes that other countries are following in its footsteps, for example France and Russia. It also reminds us that such high-tech tools may not be very effective at tackling the spread of Covid-19 because of their dependence on inaccurate or biased training data. Other kinds of biometric technology, such as DNA collection and the highly dubious “emotion recognition”, are similarly affected by discriminatory inaccuracies built into the systems. The pandemic has allowed governments around the world to deploy these technologies, but has not helped to address their flaws.
Finally, the section on surveillance notes the growing use of algorithmic decision making. In the context of healthcare, black box approaches are likely to create new inequalities and further disadvantage those already vulnerable to discrimination. As the report rightly says, the pandemic is ushering in a new era of “digital social sorting”, in which people are identified and categorized according to their perceived health status or risk of catching the virus. And once these labels have been applied by an inscrutable and unchallengeable system, there is a real risk that people may face limits on their ability to access public services or education, return to work, send their children to school, go shopping or use public transport. This new kind of discrimination may even affect family and friends, who are penalized simply because they are associated with individuals who have been branded as a risk in some way. This part concludes with thoughts about what needs to be done going forward:
The future of privacy and other fundamental rights depends on what we do next. As schools reopen, people head back to offices, and travel resumes despite the ongoing pandemic, the push for mandatory mobile apps, biometric technology, and health passports will only grow. It is vital for the public to consider whether certain new forms of surveillance are necessary or desirable in a democratic society, to resist overblown or unrealistic promises from promoters of high-tech tools, and to push elected officials to build strong privacy protections and other democratic safeguards into law.
Although we are barely six months into the Covid-19 pandemic, the danger is that the current undermining of privacy around the world could become the norm. Action needs to be taken now to eliminate the worst excesses described in the Freedom House report, and to put in place safeguards for the future.
Feature image from Scientific Animations.