
When the US Begins to Reopen, Plenty of Privacy Questions Will Remain


(This article is brought to you courtesy of the International Association of Privacy Professionals (IAPP) and first appeared in The Privacy Advisor, IAPP’s original content publication for privacy professionals.)

Everyone wants the world to go back to normal. The last six weeks have been taxing in just about every way imaginable. We all dream of a time when we can talk to our friends and family face-to-face and when trips to the grocery store aren’t riddled with anxiety.

Depending on where you’re hanging your hat at present, that time is likely far away. But as speakers indicated during a recent Brookings Institution webinar, that doesn’t mean it’s too early to plan for what a reopening of the economy looks like, what may happen to the data that will be used in those efforts and whether a U.S. privacy law may help to provide clarity should a similar event occur in the future.

During the webinar, Brookings Institution Rubenstein Fellow Alex Engler said people are familiar with the traditional technologies used to fight COVID-19; the use of artificial intelligence during the pandemic, however, has proven nebulous thus far.

Engler believes AI has the potential to help track the spread of the disease, but most efforts to incorporate the technology aren’t panning out.

“Over the last decade, we’ve seen a fundamental change in how valuable AI can be. You might be tempted to think, ‘Well, it helped in all these other ways; obviously, it’s going to have a big impact with COVID-19.’ That, at least so far, hasn’t turned out to be true,” Engler said during the event. “We have examples of AI in the news that are probably snake oil. I point toward using AI and thermal imaging cameras to detect people walking around with fevers. The evidence that that is working or good to implement to keep people out of grocery stores is not very good.”

Michelle Richardson of the Center for Democracy and Technology’s Privacy and Data Project warned AI should not be treated as a “cure-all.” In some instances in which it may be tempting to use AI, a simpler, safer way to gather the same information is likely already available, she said.

“There was an article that cited a small vendor that said, ‘Just give me access to people’s medical records and that way I can predict where we should be sending (personal protective equipment).’ That data is already available,” Richardson said. “We have public health officials screaming it from the rooftops saying, ‘We know where it needs to go.’ We do not need to throw open everyone’s medical records for AI processing to get that information.”

Contact tracing has been cited as a vital method to help mitigate the spread of COVID-19. Google and Apple made news by announcing they would develop tools to help notify smartphone users when they come into contact with someone who tested positive for COVID-19. A smartphone user would voluntarily download a contact-tracing app and, through Bluetooth, would be notified after coming into proximity with another user of the app who reported testing positive.

For the system to work, Engler said, users would have to update their operating system, download the app and provide consent. He also highlighted how privacy concerns could affect the number of people willing to participate.

“Pew said 81% of people in the U.S. own smartphones, so you are starting with a baseline of 81%. Of those, how many update their operating system and then download the app? If they have real privacy concerns, they may be disincentivized to engage in this if they do not trust what is going to happen to their data. Of those, how many voluntarily report that they got sick or enable their public health organization to?” he said.
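The funnel Engler describes compounds multiplicatively, and Bluetooth exposure notification only works when both parties in a contact run the app. A minimal back-of-the-envelope sketch of that effect is below; the 81% figure is the Pew number cited above, but every other rate is an assumed, purely illustrative value, not data from the article.

```python
# Adoption-funnel sketch. Only smartphone_ownership comes from the
# article (Pew's 81%); the other rates are hypothetical assumptions.
smartphone_ownership = 0.81
os_updated = 0.80            # assumption: share who update their OS
app_downloaded = 0.40        # assumption: share who install the app
reports_infection = 0.70     # assumption: share who report a positive test

# Share of the population whose phone can participate at all.
individual_coverage = smartphone_ownership * os_updated * app_downloaded

# An encounter is only logged when *both* people run the app,
# so pairwise coverage falls off roughly quadratically.
pairwise_coverage = individual_coverage ** 2

# Chance a given contact with an infected person yields a notification:
# both phones participate and the sick party actually reports.
notification_rate = pairwise_coverage * reports_infection

print(f"individual coverage: {individual_coverage:.1%}")
print(f"pairwise coverage:   {pairwise_coverage:.1%}")
print(f"notification rate:   {notification_rate:.1%}")
```

Under these assumed rates, roughly a quarter of the population participates individually, yet well under a tenth of contacts would be covered, which is why each drop-off point Engler lists matters so much.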

There is also the matter of how anonymous the data will remain. The information collected by the apps is not officially location data, according to Engler. That does not mean it can’t quickly become location data.

“We’ve seen some of those overseas identified and they are being harassed. It’s quite easy to reassociate people. You only need a few data points to figure out a real person’s identity, especially when you are talking about location. While we say this is not location per se, you only need a few data points to put in there before it becomes location,” Richardson said. “It’s going to be easier to reassociate some of this than people realize.”

A lot of information is set to be collected from all these different initiatives, and it will be important to protect it from malicious actors. Engler pointed to another avenue of data misuse that should be considered. If the pandemic goes on for longer than most anticipate, all that information may become very valuable, and Engler believes the erosion of standards around privacy is a bigger concern than “people literally stealing” data.

“You can imagine a circumstance where the pandemic really starts to drag on and we are a couple of years in, and there’s a network of health applications that are using this data for COVID-19. But they’ve also mission creeped into various other things and suddenly there’s a financial market for the data,” Engler said. “That’s how I think about this becoming a bigger problem when the data comes out. Not necessarily the lack of the security, though I think that’s worth being concerned about, but I worry more about the systemic leakage. This just becomes another market for data.”

There is a lot of uncertainty around what will happen to all this data as the pandemic continues and how it can be safely used. Part of that is fueled by the lack of a U.S. privacy law. Richardson said federal rules would have helped fill in some of those gray areas, not just for citizens, but for companies as well. She hopes Congress will work to get a federal law on the books to avoid this level of uncertainty the next time a major event occurs.

“Corporate behavior has not met people’s expectations. It’s been surprising to them, and they have become suspicious. We may be in a better situation right now if we had a law that better aligned those things. People could trust technology more,” Richardson said. “They might be more comfortable sharing their information knowing that it would be locked down and not repurposed for things that would be surprising or offensive to them.”

“Companies would also then have more clarity. A lot of the proposals we see at the federal level have clear exceptions for public interest research, so to the extent some of them want to help public officials and contribute to this, they would be able to do so with some clear parameters and liability protection instead of worrying about where the line is.”

Ryan Chiavetta
Associate Editor at IAPP
Ryan Chiavetta, associate editor, produces content for the IAPP’s Daily Dashboard, European Dashboard Digest, Canada Dashboard Digest and Asia-Pacific Dashboard Digest. In addition, Ryan is a regular contributor to the Privacy Tech blog, where he is able to focus on new technologies.

Before landing at the IAPP, Ryan was the Online Content Coordinator at HealthLeaders Media, worked at Tufts Health Plan Medicare Preferred, assisting seniors with Medicare sign-ups, and interned as a Boston sports writer for NESN.

When he’s not sitting at his desk, Ryan might be watching the continuous Boston sports renaissance (12 titles since 2001), waiting for the next comic book movie to hit theaters (Batman vs. Superman was terrible), and waiting patiently for Game of Thrones’ next season.
