Experts examine privacy, surveillance and public health issues during Future Tense conversation
The proliferation of advanced technology has created many obvious benefits. But it has also raised new questions about privacy, data collection and data monitoring. Debating the trade-offs has inspired books, documentaries, movies, podcasts and TV shows, and it has made its way into political scandals and congressional hearings. As the world faces an unprecedented pandemic, how will technology help us move past the crisis? And how will our conceptions of privacy change as a result?
Privacy, surveillance and public health were the topics of conversation during the April 16 “social distancing social” convened by Future Tense, a partnership between New America, Arizona State University and Slate that explores emerging technologies and their transformative effects on society and public policy. This conversation was part of a series of events that take in-depth, provocative looks at issues that will dramatically reshape the policy debates in the coming decade.
In this webinar, Jennifer Daskal, professor and faculty director of the Tech, Law and Security Program at American University Washington College of Law, sat down with Kathryn Waldron, of the R Street Institute, and Al Gidari, director of privacy at the Stanford Center for Internet and Society, to discuss the ways in which data can help us understand and respond to the current health crisis.
“We need to think about good health and good privacy as going hand in hand,” Daskal said. “Both are essential as we move our way through this crisis. Good privacy and good privacy practices help ensure that data is used in ways that can promote the public health goals.”
The discussion around technology and public health typically focuses on location data. Gidari described two primary uses of such data that are especially relevant right now. The first is contact tracing, which allows companies to see whom individuals have been in physical contact with and lets individuals know if and when they need to self-isolate. The second is aggregate tracking, which allows public-health experts to follow the trajectory of the disease at the population level. Both approaches raise privacy concerns.
For instance, several companies are looking into ways to create apps that use Bluetooth technology to track the people you have been in contact with and alert you when one of them tests positive for COVID-19.
“The benefit of this approach is that it’s decentralized,” Gidari explained. “None of the data resides with a government agency. It’s as close to anonymous as you can get. There are some potential weaknesses or flaws that people have identified in using Bluetooth from a security perspective. But on balance those risks are pretty small.”
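The decentralized design Gidari describes can be illustrated with a simplified sketch. The model below is loosely inspired by rotating-identifier schemes such as DP-3T and the Apple/Google exposure notification approach, but the class and method names are illustrative, not any real app's API: each phone broadcasts short-lived random tokens over Bluetooth, remembers the tokens it hears, and only a user who tests positive publishes their own broadcast tokens. Matching happens on each device, so no contact list is ever held centrally.

```python
import secrets

class Device:
    """Toy model of decentralized Bluetooth contact tracing.

    Illustrative only -- real schemes (e.g., DP-3T) add key rotation,
    time windows, and signal-strength thresholds.
    """

    def __init__(self):
        self.sent_tokens = []      # random tokens this device broadcast
        self.heard_tokens = set()  # tokens received from nearby devices

    def broadcast(self):
        # A fresh random token per broadcast: observers cannot link
        # tokens back to an identity, which is what makes the scheme
        # "as close to anonymous as you can get."
        token = secrets.token_hex(16)
        self.sent_tokens.append(token)
        return token

    def receive(self, token):
        self.heard_tokens.add(token)

    def report_positive(self):
        # On a positive test, only the user's own broadcast tokens are
        # published; the list of tokens heard never leaves the device.
        return list(self.sent_tokens)

    def check_exposure(self, published_tokens):
        # Matching runs locally, so no agency or company learns
        # whom anyone has met.
        return any(t in self.heard_tokens for t in published_tokens)

alice, bob, carol = Device(), Device(), Device()
bob.receive(alice.broadcast())        # Alice and Bob were in proximity
carol.receive(bob.broadcast())        # Bob and Carol were in proximity

published = alice.report_positive()   # Alice tests positive
print(bob.check_exposure(published))     # True: Bob was near Alice
print(carol.check_exposure(published))   # False: Carol was not
```

Note that the only data ever shared is a list of meaningless random strings from confirmed-positive users, which is why no central record of contacts accumulates with any government agency.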
In other parts of the world, countries are developing apps that require individuals in self-isolation to upload photos or use location data to prove they are staying home.
“There is a wide spectrum of how intrusive the data is,” said Waldron, who is a national security and cybersecurity resident fellow at the R Street Institute. “Up until now there have been stay-at-home orders, but there hasn’t been as much legally required quarantining.”
Waldron offered a warning to consumers as these apps and forms of data collection continue to be developed during the public-health crisis.
“Always keep in mind: ‘What is the data being collected?’ ‘Who’s going to hold the data?’ ‘How long will the data be held?’ and ‘What type of data is it?’” Waldron said. “Without asking those questions it’s really hard to compare and contrast all of these different types of surveillance apparatuses and apps that are being designed.”
Gidari urged the companies developing these apps, and the governments using the data, to be as transparent as possible.
“Trust and transparency are critical for these systems to work. People need to understand there is no long tail to the data in the hands of the government,” he said. “If you just don’t know where your data is going to end up afterwards, you’re just not going to share it and none of these systems will be very effective. Transparency is key.”