Privacy advocates testify on preventing COVID-19 data from being exploited: Senate ‘Paper Hearing’

April 15, 2020


The amount of data being tracked and shared between tech companies, health organizations, and government bodies in response to COVID-19 presents a litany of privacy concerns over how that data can be exploited.

Testifying in the new “Paper Hearing” format of the US Senate Committee on Commerce, Science, and Transportation on Thursday, representatives from tech and data privacy think tanks told lawmakers they must consider mitigating the many privacy risks of how COVID-19 data is used when drafting data privacy legislation.

As keen observers of recent history, the witnesses presented evidence of how companies have exploited sensitive health data in the past, and how lawmakers can better protect individual rights from exploitation of the coronavirus data now being collected.

The risks include the use of data to identify, track, and police individuals; the sharing of sensitive health data between companies; and the exploitation of that data by governments, businesses, and organizations once the pandemic is over.

Immediate and Secondary Privacy Risks

In her written testimony, Stacey Gray, Senior Counsel, Future of Privacy Forum (FPF), said that the privacy risks to individuals and their communities come in two forms: immediate and secondary.

These risks range from “discrimination, endangered physical security, disproportionate loss of privacy, and unjustified limitations on freedom of movement, to long term risks to freedom, civil liberties and even to democracy,” Gray testified.

Big tech companies like Google and governments like China’s are tracking where citizens are going during the pandemic by analyzing location data from people’s mobile phones.

“The collection and use of data, including personal data, to respond to a public health crisis like a pandemic can be compatible with privacy and data protection principles”

Stacey Gray

While Google’s Community Mobility Reports say the tracking data is anonymous, the Chinese government is using location data as a surveillance tool through a color-coded app, “which dictates whether they [citizens] can leave the house and where they can go,” according to Business Insider.

Since the Chinese Communist Party has time and again shown itself to be an authoritarian regime that weaponizes data against its citizens, the app could easily become a policing tool for law enforcement to arrest those who venture out of their homes.

This leads us to secondary risks, which have to do with what the data will be used for after the health crisis is over.

“For example,” Gray testified, “some apps may require an individual to take multiple pictures of themselves during the day and upload the selfies to a centralized database.

“With the current state of facial recognition technology it is not far-fetched to foresee the use of the database as a training set for machine learning algorithms.”

Feeding algorithms is just one way the data could be exploited after the COVID-19 crisis is over, but there are many other ways.

Gray covered both data privacy risks and recommendations for lawmakers in more detail in her 12-page written testimony and 13-page Q&A followup.

How Health Data Is Collected and Shared Among Stakeholders

Michelle Richardson, Director of the Data and Privacy Project at the Center for Democracy and Technology, told lawmakers in the Paper Hearing how health data is collected and shared among stakeholders “in ways that would surprise individuals.”

“There is mounting evidence that non-HIPAA-covered entities are regularly collecting, using, and sharing consumer health information in ways that would surprise individuals and arguably exploit their sensitive data,” Richardson testified.

“Successfully fighting the coronavirus will mean ensuring that a government response does not evolve into law enforcement and broad surveillance functions”

Michelle Richardson

She added, “Here are just a few examples where unregulated health data about consumers was collected and shared:”

  • A 2019 study reported that a range of health and wellness applications, including smoking cessation, fitness tracking, mental health, and period tracking applications had shared their users’ sensitive information with third parties.
  • A Financial Times report found that popular health websites share sensitive consumer information, including medical symptoms, diagnosis, drug names, and menstrual and fertility information with advertisers and data brokers.
  • The app GoodRx shared consumer information, including specific prescription information, with advertisers.
  • DNA testing company 23andMe recently sold the rights to a drug that it developed using its customers’ data.
  • Voice recordings from consumer smart speakers are being studied to detect signs of dementia.
  • Consumer pregnancy apps are sharing users’ fertility data with their employers.
  • Data brokers and advertisers have used inferred health information to target ads in a harmful and exploitative manner, such as serving treatment ads to someone facing challenges with addiction.
  • Insurance companies are using data from their partners, such as purchase and browsing history and social media activity, to make decisions regarding eligibility, rates, and targeted marketing.

“The United States does not have a comprehensive privacy law to protect Americans’ personal information,” added Richardson.

“This has led to the explosion of risky and exploitative data-driven behaviors in the vast unregulated space in between.”

Richardson, too, covered both data privacy risks and recommendations for lawmakers more extensively in her 16-page written testimony and 13-page Q&A.

Health Data Privacy Recommendations to Lawmakers

If used ethically, health data collected during COVID-19 can be used for the betterment of society. If used unethically, the data can be abused by governments for law enforcement purposes and be exploited by businesses looking to sell the information to the highest bidder.

“The collection and use of data, including personal data, to respond to a public health crisis like a pandemic can be compatible with privacy and data protection principles,” Gray testified.

“In many cases, commercial data can be shared in a way that does not reveal any information about identified or identifiable individuals,” she added.

Both Gray and Richardson agreed that a federal data privacy law is required to protect individual rights, especially during a health crisis.

Richardson suggested that lawmakers consider the following when formulating policy on data collection, use, and sharing:

  • Focus on prevention and treatment, not punishment — Successfully fighting the coronavirus will mean ensuring that a government response does not evolve into law enforcement and broad surveillance functions.
  • Require corporate and government practices that respect privacy — The burden for constructing privacy-protective products and responses must not be on concerned citizens but on companies and governments.
  • Be transparent to build trust — Companies that provide data, or inferences from data, and the governmental entities that use such information, must be transparent to users and residents about how data will be used.

On behalf of the FPF, Gray relayed her organization’s recommendations to lawmakers for consideration when drafting data privacy laws.

“FPF has long supported comprehensive federal privacy legislation and observed that it should be flexible enough to support data-driven public health initiatives under the right safeguards and within limits consistent with privacy and civil liberties.”

Gray recommended:

  • Protections for Sensitive Data — A federal privacy law should create heightened legal protections for sensitive data, including for health information and precise geo-location data, in line with global norms and legal standards.
  • Independent Ethical Review Boards — Although many institutions already conduct research under sectoral privacy laws (e.g. healthcare centers, hospitals, pharmaceutical development, or academic institutions abiding by the federal Common Rule), the current pandemic is demonstrating that a broad range of beneficial research currently falls outside the scope of these regulations.
  • Purpose Limitation (Secondary Uses of Data) — Purpose limitation is a core principle of data protection, and it is uniquely important for considerations of whether and how to use data for public health initiatives.

The COVID-19 pandemic has made big data an invaluable tool in the fight against the virus, with many public and private entities enlisting the technology.

Once the pandemic is over, however, what will become of all that data? Will it be used to prevent further outbreaks, or will it be used as a tool of oppression?

If Gray and Richardson have anything to say about it, legislation will be drafted to ensure privacy is protected at all costs.

