On May 31, the Federal Trade Commission (FTC or Commission) announced two separate enforcement actions against Amazon: one involving its cloud-based voice service, Alexa, and the other involving Ring, its home security camera and video doorbell business. Though the two enforcement actions rest on different sets of facts, both complaints center on allegations that Amazon violated users' privacy in connection with Alexa and Ring in a variety of ways, including through misleading data retention practices, overbroad employee access to user data and inadequate cybersecurity practices. As part of these settlements, Amazon will be required to pay more than $30 million to the federal government. The company has responded to the FTC's allegations in both the Ring and Alexa enforcement actions.
These settlements highlight the types of activities where the FTC may focus its future enforcement attention (and indeed, the agency has already announced another enforcement action involving children's data since these settlements were publicized). Most notably, the underlying complaints in these actions reflect a broad conception of what the FTC believes constitutes "unfair" practices within the meaning of Section 5 of the FTC Act, sweeping in such practices as overbroad employee access to customer data, inadequate cybersecurity controls, unnecessary retention of customer data and a failure to honor deletion requests. Moving forward, companies can expect the FTC to rely on this expansive interpretation of unfair practices to sanction companies that violate their customers' data privacy.
Additionally, companies operating in the artificial intelligence (AI) space should take note that the FTC treats privacy compliance as a prerequisite to developing AI models. As part of the proposed order in the Alexa action, the FTC required Amazon to delete inactive child accounts and certain voice recordings and geolocation information, and also prohibited the company from using such data to train its algorithms, because it alleged that the information was processed in violation of the Children's Online Privacy Protection Act (COPPA). As AI products continue to proliferate, companies should be aware that the FTC is paying close attention to how these models are developed (and the impact that these AI products may have) and will use data disgorgement as a remedy where it deems appropriate.
These enforcement actions continue what has been an active year for FTC enforcement in the data privacy and security space, with the Commission having already brought several enforcement actions against companies for misuses of health data, signaled increased scrutiny of biometric technologies and warned companies of the legal risks of using AI tools. Given this environment of heightened regulatory scrutiny, companies should ensure that they develop, implement, and adhere to robust and transparent data collection and use policies.
In this post, we summarize the FTC's complaints and proposed stipulated orders in the Amazon enforcement actions and highlight key takeaways for companies looking to understand how these enforcement actions could impact their data privacy and security programs moving forward.
ALEXA ENFORCEMENT ACTION
The Complaint
Alexa is Amazon's voice assistant service that allows users to access a range of products and services through hardware devices (such as Amazon Echo speakers) and a mobile app. Alexa's offerings include, among other things, products and services targeted specifically to children, such as the "Echo Dot Kids Edition" smart speaker and the "FreeTime Unlimited on Alexa" service, which gives children access to audiobooks, games and other offerings. Notably, as part of its operation, the Alexa service collects users' voice recordings and geolocation information.
The core narrative of the FTC's complaint regarding Alexa is that Amazon allegedly engaged in unfair and deceptive practices by representing to consumers that Alexa was a privacy-conscious offering that allowed users to exercise control over how sensitive data (including voice and geolocation data) was collected, used and retained by Amazon. These representations, the Commission alleges, clashed with the reality of Alexa's data collection and use practices and amounted to violations of Section 5 of the FTC Act and the COPPA Rule.
The complaint's specific allegations include the following.
1. Deceptive Representations Regarding Deletion of Geolocation Data. The FTC alleges that Amazon made deceptive statements about the ability of Alexa app users to delete the geolocation information collected by the app, asserting that Amazon failed to apply deletion requests to data housed at "secondary data storage locations" and continued to use that data for product improvement purposes.
2. Deceptive Representations Regarding Access to and Deletion and Retention of User Voice Recordings. The complaint alleges that Amazon made deceptive statements regarding its handling of voice recordings. Specifically, the FTC alleges that Amazon retained written transcripts of voice recordings even after deleting the accompanying audio files, granted overbroad employee access to voice recordings and indefinitely retained children's voice recordings by default.
3. Unfair Privacy Practices. The complaint highlights a series of alleged privacy practices that the FTC determined to be unfair, including (1) retaining children's voice recordings for longer than reasonably necessary; (2) insulating geolocation data from user deletion requests, and continuing to access such data for product improvement purposes even after a deletion request had been submitted; (3) failing to fully comply with deletion requests for geolocation data and voice recordings; and (4) failing to notify consumers that Amazon had not satisfied consumer requests to delete their or their children's geolocation information or voice recordings.
4. COPPA Rule Violations. The complaint alleges that Amazon's Alexa-related data privacy and security practices violated the COPPA Rule, 16 C.F.R. Part 312, which imposes requirements on operators of online services directed at children under the age of 13. 16 C.F.R. § 312.3. Among the provisions allegedly violated were the requirements to notify parents of their right to delete their children's personal information and to provide an opportunity to exercise that right, see 16 C.F.R. §§ 312.4, 312.6, as well as the prohibition on retaining children's personal information for longer than reasonably necessary to fulfill the purpose for which it was collected, see 16 C.F.R. § 312.10.
The Proposed Stipulated Order
The proposed stipulated order imposes the following key requirements on Amazon.
1. Deletion of Information. Amazon is required to implement a process to identify inactive Alexa child profiles and to delete any children's personal information associated with such profiles. In addition, Amazon must ensure the deletion of any geolocation information, voice recordings or children's personal information that a user (or a child user's parent) has previously asked to be deleted, and it is prohibited from subsequently using any of that data to create or improve models and other data-related tools.
2. Privacy Program. Amazon must implement a privacy program focused on its collection and use of geolocation information. This program must include features such as annual risk assessments, implementation of safeguards (including privacy reviews, employee training and access controls) and service provider screening.
3. No Misrepresentations About Privacy of Geolocation and Voice Information. Amazon is prohibited from making misrepresentations about Alexa's data retention, access and deletion practices with regard to geolocation and voice information, including the extent to which users can exercise control over those practices.
4. Data Retention and Deletion Notices. Amazon must provide consumers with data retention and deletion disclosures that explain why voice and/or geolocation information is being collected and the mechanisms through which users can request the deletion of such information.
5. $25 Million Monetary Judgment. Amazon is required to pay $25 million as a civil penalty.
RING ENFORCEMENT ACTION
The Complaint
Ring is a home security camera company that was acquired by Amazon in 2018. Ring sells "Internet-connected, video-enabled security cameras, doorbells, and related accessories and services," including cameras that monitor private spaces within consumers' homes. As part of its offerings, Ring stores and analyzes video recordings captured by its devices.
Broadly speaking, the FTC's complaint against Ring alleges that the company granted overbroad employee access to customers' video data, failed to notify customers of or obtain customer consent for its employees' viewings of video recordings, and did not implement reasonable data security practices to protect its devices from cybersecurity compromise. These practices, the complaint alleges, amounted to deceptive and unfair practices actionable under Section 5 of the FTC Act.
The complaint's specific allegations include the following.
1. Overbroad Employee Access to Customer Video Recordings. The complaint alleges that Ring failed to impose adequate restrictions on employee access to customer video recordings, resulting in every Ring employee (in addition to hundreds of contractors) being able to access and download any customer video, regardless of whether they had a legitimate business purpose for such activity.
2. Failure to Notify Customers of and Obtain Consent for Human Review of Video Recordings. The complaint further alleges that even when Ring employees and contractors had a legitimate business purpose for accessing customer recordings, Ring did not notify or obtain consent from customers for such human review. In particular, the complaint takes aim at Ring's pre-January 2018 Terms of Service and Privacy Policy. Prior to December 2017, the complaint alleges, Ring failed to disclose in these documents that its employees and contractors would be able to review all video recordings, instead including a general description of the company's right to use such recordings for product improvement and development. Similarly, the complaint notes that Ring's December 2017 to January 2018 Privacy Policy "described [Ring's] use of device recordings for product improvement and development," but it contends that this description was "buried ... in dense and lengthy legalese."
3. Inadequate Cybersecurity Practices. The complaint alleges that Ring failed to implement "standard security measures" for preventing common cybersecurity attacks. Specifically, despite multiple red flags—including cybersecurity compromises, vulnerability reports and media reports of attacks against competitors—Ring allegedly declined to implement basic safeguards or implemented protections that were "too little and too late."
The Proposed Stipulated Order
The proposed stipulated order imposes the following key requirements on Ring.
1. Deletion of Pre-March 2018 Video Recordings and Derivative Products. Ring is required to delete pre-March 2018 covered video recordings, pre-March 2018 "Face Embeddings" and work product (e.g., models and algorithms) developed from review of pre-March 2018 recordings.
2. Privacy and Data Security Program. Ring is required to implement a privacy and data security program, including such features as periodic risk assessments, development of a policy to prohibit human review of customer video recordings unless certain conditions are satisfied, employee training, data access controls (such as multifactor authentication), requirements for strong passwords, encryption of customer video recordings in transit and at rest, and testing and monitoring safeguards. Ring is further required to have its compliance with the aforementioned program validated via biennial assessments by an independent third-party assessor.
3. Incident Reports. Ring is required to provide reports to the FTC of "Covered Incidents," defined to include any incident that either results in Ring notifying a government entity or involves the compromise of video recordings from 10 or more Ring accounts.
4. No Misrepresentations About Employee Data Access and Cybersecurity Protections. Ring is prohibited from misrepresenting its practices regarding employee and contractor access to customer video recordings as well as the extent to which its products are protected against "online attacks resulting from external actors' misuse of valid authentication credentials."
5. $5.8 Million Monetary Judgment. Ring must pay the FTC $5.8 million, which the Commission may then use to provide consumer redress.
KEY TAKEAWAYS FROM BOTH ENFORCEMENT ACTIONS
1. Expansive Interpretation of Unfair Practices. Across both complaints, the FTC adopts a broad conception of what constitutes unfair practices within the meaning of the FTC Act. Included within the scope of this broad conception are practices ranging from overbroad employee access to customer data and inadequate cybersecurity controls (Ring) to unnecessary retention of customer data and failure to honor deletion requests (Alexa). Taken together, these enforcement actions indicate that the FTC is continuing to expand its interpretation of unfair practices.
2. Data Disgorgement as a Remedy. In addition to the other penalties levied in the Alexa enforcement action, the FTC required Amazon to delete certain data used to train Alexa, which the Commission alleged had been processed in violation of Section 5 of the FTC Act and COPPA. Data disgorgement is a remedy that the FTC has used before, and one it is likely to continue using in enforcement actions involving AI tools. Businesses operating in this space should be aware of this potential risk as it pertains to their privacy compliance programs.
3. Specificity Regarding Internal Uses of Personal Information. The Ring complaint highlights the need for companies to clearly specify in their privacy documentation how they intend to use customer data. In that complaint, the FTC emphasized Ring's alleged failure to specifically disclose to customers that their video recordings would be subject to human review. In this context, the FTC reasoned, general descriptions of video recordings being used for internal product improvement and development were insufficient. Moving forward, companies should be careful not to rely on "internal product improvement and development" as a catchall disclosure for internal uses of customer data and should instead seek to provide more specificity as to how customer information will be used (including who will use it).
4. Limiting Employee Access to Personal Information. The FTC's complaints also highlight the need for companies to restrict employee access to customer information to those with a true business justification for access. The FTC criticized Amazon and Ring alike for allegedly failing to adequately limit the pool of employees and contractors with access to sensitive customer information, and this is likely to be an area of focus for the Commission moving forward (one way of operationalizing such a restriction is shown in the first sketch following this list).
5. Full Compliance With Deletion Requests. The Alexa complaint should serve as a stark reminder to companies of the need to fully comply with data deletion requests. This includes flowing deletion requests down to every system that holds the relevant data, so that it is removed from, for example, the secondary data storage locations at issue in the Alexa complaint. And companies, of course, should ensure that they do not continue to use, even if only for internal purposes, data that a customer has requested be deleted (the second sketch following this list illustrates both points in simplified form).
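For technical teams implementing takeaway 4, the following minimal Python sketch shows one way to gate employee access to customer recordings behind a permitted role, a recorded business justification and customer consent, with every decision written to an audit log. It is purely illustrative: the roles, fields and identifiers are hypothetical and are not drawn from the FTC's complaints or orders.

```python
# Illustrative only: a hypothetical gate limiting employee access to customer
# recordings. All roles, fields and identifiers are invented for illustration.

from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class AccessRequest:
    employee_id: str
    recording_id: str
    business_justification: str  # e.g., a support-ticket reference


@dataclass
class AccessGate:
    # Least privilege: only these roles may ever view customer recordings.
    permitted_roles: set = field(default_factory=lambda: {"support_tier2", "trust_and_safety"})
    audit_log: list = field(default_factory=list)

    def is_allowed(self, request: AccessRequest, employee_role: str,
                   customer_consented_to_review: bool) -> bool:
        """Grant access only when the role is permitted, a business justification
        is recorded and the customer has consented to human review."""
        allowed = (
            employee_role in self.permitted_roles
            and bool(request.business_justification.strip())
            and customer_consented_to_review
        )
        # Log every decision (granted or denied) so access can be audited later.
        self.audit_log.append({
            "time": datetime.now(timezone.utc).isoformat(),
            "employee_id": request.employee_id,
            "recording_id": request.recording_id,
            "justification": request.business_justification,
            "granted": allowed,
        })
        return allowed


if __name__ == "__main__":
    gate = AccessGate()
    request = AccessRequest("emp-42", "rec-123", "support ticket #9001")
    print(gate.is_allowed(request, "support_tier2", customer_consented_to_review=True))   # True
    print(gate.is_allowed(request, "engineering", customer_consented_to_review=True))     # False
```

The point of the sketch is the combination of controls: a narrow set of permitted roles, a justification tied to each access, an explicit consent check and a complete audit trail that can be reviewed after the fact.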
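For takeaway 5, the second sketch illustrates, under simplified assumptions, how a deletion request might be propagated to every store that holds a customer's records, including secondary or backup stores, and how the customer can be added to an exclusion list so deleted data is not pulled into future model-training jobs. The store names and interfaces are hypothetical and are offered only as a conceptual illustration, not as a description of Amazon's systems or the FTC's requirements.

```python
# Illustrative only: propagating a deletion request across primary and secondary
# stores and excluding the customer's data from future model training.
# Store names and interfaces are hypothetical.

from typing import Dict, List, Protocol, Set


class DataStore(Protocol):
    name: str

    def delete_records(self, customer_id: str, data_types: Set[str]) -> int:
        """Delete matching records and return how many were removed."""
        ...


class InMemoryStore:
    """A stand-in for any system of record (database, cache, backup, etc.)."""

    def __init__(self, name: str):
        self.name = name
        self.records: List[dict] = []

    def delete_records(self, customer_id: str, data_types: Set[str]) -> int:
        before = len(self.records)
        self.records = [
            r for r in self.records
            if not (r["customer_id"] == customer_id and r["type"] in data_types)
        ]
        return before - len(self.records)


def process_deletion_request(customer_id: str, data_types: Set[str],
                             stores: List[DataStore],
                             training_exclusions: Set[str]) -> Dict[str, int]:
    """Apply the request to every store (primary and secondary alike) and record
    the customer on an exclusion list so deleted data is not reused, for example
    in model-training pipelines."""
    deleted_counts = {store.name: store.delete_records(customer_id, data_types)
                      for store in stores}
    training_exclusions.add(customer_id)
    return deleted_counts


if __name__ == "__main__":
    primary = InMemoryStore("primary")
    secondary = InMemoryStore("secondary-backup")
    primary.records = [{"customer_id": "cust-7", "type": "geolocation", "value": "..."}]
    secondary.records = [{"customer_id": "cust-7", "type": "voice", "value": "..."}]

    exclusions: Set[str] = set()
    print(process_deletion_request("cust-7", {"geolocation", "voice"},
                                   [primary, secondary], exclusions))
    # -> {'primary': 1, 'secondary-backup': 1}; 'cust-7' is now excluded from training.
```

The design choice to highlight is that the deletion routine iterates over a registry of all stores rather than targeting only the primary database, which is the kind of flow-down the Alexa complaint suggests was missing, and that the exclusion list prevents continued internal use of data a customer has asked to have deleted.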
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.