Steven B. Roosa is a Partner in our New York office; Christopher G. Cwalina is a Partner and Kaylee A. Cox is an Associate in our Washington, D.C. office.

HIGHLIGHTS:

  • The FTC intends to leverage the work of independent researchers to identify potential privacy and data security violations and bolster enforcement actions on these issues.
  • Companies not only have to be acutely aware of what they are doing with the data internally, but they also must take reasonable measures to understand what downstream parties are doing.
  • Even where a company does not have actual knowledge of data being used contrary to how it was originally intended, it can nevertheless still be held liable for its privacy and data security practices.

The Federal Trade Commission (FTC) recently released a report on "Big Data" (the Report), raising concerns about potential misuse, or unintended uses, of these data resources.  The Report focused on potential harms to low-income, disadvantaged, and vulnerable populations through the use of Big Data and machine learning.  While recognizing that Big Data provides several benefits, the FTC is ultimately concerned that Big Data may be used to categorize consumers in ways that result in the exclusion of certain groups.

Importantly, the FTC did not take issue with the question of whether Big Data should be used in the first place, acknowledging that it "now fuels the creation of innovative products and systems that consumers and companies quickly are coming to rely upon and expect."  Instead, the FTC homed in on how companies use this data, and just today, at the FTC's first-ever "PrivacyCon," Chairwoman Edith Ramirez said the FTC intends to use the work of independent researchers to identify potential privacy and security violations pertaining to companies' data practices.

Best Practices

The FTC identified several consumer protection laws that could apply to Big Data, including the Fair Credit Reporting Act, federal equal opportunity laws, and, importantly, the FTC Act.  With respect to the FTC Act, the Commission articulated four specific practices that companies should adopt when engaging in Big Data analytics:

  • ensure the company is not violating any material promises to consumers
  • ensure the company has not failed to disclose material information to consumers
  • implement measures to "reasonably secure" consumers' data
  • at a minimum, refrain from selling Big Data analytics products to customers the company knows, or has reason to know, will use the data for fraudulent or discriminatory purposes

This last recommendation is particularly significant because, in effect, it means that companies not only have to be acutely aware of what they are doing with the data internally, but they also must take reasonable measures to understand what downstream parties are doing with the data.

The FTC explained that, while each inquiry will be fact-specific, in every case, the standard will be whether the company is offering or using Big Data analytics in a deceptive or unfair way.

Similarly, the FTC stated that companies should stay apprised of important research in the field of Big Data, aimed at identifying biases and inaccuracies.  The Commission encouraged companies to take into consideration how representative the data is, whether their model accounts for biases, the accuracy of the data, and whether any ethical or fairness concerns are at issue.

Enforcement

In the Report, the FTC reiterated its authority to regulate unfair and deceptive practices, including with respect to Big Data, and emphasized that this authority is very broad, applying to "most companies acting in commerce."  The Report underscores the need for companies to clearly understand what data they have, where it is going, and how it is being used.  Moreover, as noted above, FTC Chairwoman Ramirez said today that the Commission intends to leverage the work of independent researchers to identify potential privacy and data security violations and bolster enforcement actions on these issues.

How to Mitigate Your Risk

Both the Chairwoman's statement and the Report make clear that, even where a company does not have actual knowledge of data being used contrary to how it was originally intended, it can nevertheless still be held liable for its privacy and data security practices.  As mentioned, it is therefore crucial that companies understand, at the technical level, what is actually taking place with their data.

Holland & Knight's custom-built, in-house Data Privacy and Security Lab helps organizations stay off the collective radar of the FTC and the very independent researchers described by Chairwoman Ramirez by using similar tools and technical approaches to assess the data collection and sharing practices of mobile apps, websites, and network-aware products and services.  This process is intentionally conducted under the attorney-client privilege so that companies can safely identify and mitigate potential privacy and security flaws without fear of garnering the attention of independent researchers or regulators. 

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.