The global crisis caused by the spread of COVID-19 thrust data and technology issues into the center of civic life, as people around the world turned to technology to improve public health, work and study remotely, and connect, communicate, and organize in new ways.
As government leaders, policymakers, and tech companies searched for solutions to slow the spread of COVID-19, CDT actively monitored responses and worked to ensure that they were grounded in civil rights and liberties.
Early in the pandemic, we launched the Coronavirus: Data for Life and Liberty Task Force, a group of civil liberties advocates, industry representatives, academics, health professionals, and technologists convened to explore how data — particularly mobile location and proximity information — should and should not be used to fight the pandemic. We urged governments to favor consent-based, transparent, and time-bound measures that avoid disparate impacts, and highlighted situations in which aggregated data could be used instead of individualized data.
The COVID-19 pandemic has driven home the importance of strong privacy protections for the ever-growing amount of information we share online: from web browsing history that could reveal someone’s positive COVID status, to schools gathering new types of student health data, to the considerations around notifying people of possible exposure to the virus.
In testimony to the U.S. Senate Commerce Committee on the role of big data in the fight against COVID-19, Michelle Richardson, who heads our Data and Privacy Project, highlighted both the absence and the importance of clear, meaningful federal privacy rules, and the need to use data only in ways that support effective public health responses.
In 2020, we also expanded CDT’s health privacy work, collaborating with the Robert Wood Johnson Foundation, the eHealth Initiative, and dozens of partners to build a framework that identifies what standards and rules should govern consumer health data. The effort culminated in the release of proposed standards that we hope will shape industry practices and provide guidance for future regulation in the space.
In education, technology allowed many students to continue schooling from home, but the pandemic also worsened the inequities of the digital divide and created new issues for student privacy. As school districts enacted distance learning plans, we provided educators and parents with resources on protecting students’ privacy as they made choices about how technology could support remote learning. We conducted original research on parents’ and teachers’ views on student privacy during the pandemic, and called on policymakers to ensure that, as they tackle the digital divide by funding remote connectivity for students, students’ data are not commodified by the infrastructure they rely on.
From the ways companies moderate content, to the types of data that are collected about people, to the ways people socialize, work, and get an education, the pandemic has changed a lot — but more than anything, it has shown how much modern societies rely on online services. It is clearer than ever that there is a public interest imperative in setting the right frameworks for how digital technologies operate. When built thoughtfully, technology can help us effectively manage the COVID-19 pandemic — and so much more — while also protecting civil liberties and human rights.