That sixth sense you have that someone is listening – could it be your smart speaker? There's a chance the answer is yes, even when you don't ask it to listen. A new study from Northeastern University finds that smart speakers often activate accidentally and record conversations, although how often (up to 19 times a day) and for how long (recordings as long as 43 seconds) depends on the device. Notably, the study also found that devices do not consistently record conversations, and that the accidental listening and recording can be triggered by dialogue in television shows.

Setting aside whatever concerns you might have as an individual about the information being captured and recorded by your home's smart speaker (and to be clear, those concerns are important), the accidental capture of potentially sensitive information raises a set of compliance issues for smart speaker designers. For example, under the California Consumer Privacy Act (CCPA), an entity that collects information from California residents (assuming other criteria are met) must disclose what information it is capturing "at or before the point of collection." Do the terms of use or privacy policies related to the smart speakers reflect such accidental capture? The EU's General Data Protection Regulation (GDPR) requires "Data Protection by Design and by Default," and specifically notes that data controllers (those determining how personal data is to be processed) must

both at the time of the determination of the means for processing and at the time of the processing itself, implement appropriate technical and organisational measures . . . which are designed to implement data-protection principles . . . in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of this Regulation and protect the rights of data subjects.

Does a smart speaker repeatedly turning on and accidentally recording information meet the requirements of "data protection by design"? Does it matter what the data is being used for? Even if the data is being used to improve the speaker and reduce accidental capture, the fact that the user has neither given permission for the capture nor any knowledge of it could run afoul of the GDPR's requirements.

What about biometric information laws? The Illinois Biometric Information Privacy Act (BIPA) requires notice and consent before information derived from biometric identifiers (such as a voiceprint) may be collected. Is a conversation recorded by a smart speaker that has previously taken a voiceprint "information derived from" a biometric identifier, and does recording it without notice or consent violate BIPA?

Even if laws like the CCPA, GDPR, and BIPA are not directly implicated, what about privacy policies themselves? Do they put the user on notice that there is a risk of accidental capture? If not, that could raise consumer protection issues in the United States, implicating Federal Trade Commission or state attorney general enforcement.

In short, accidental capture of personal data by smart speakers could raise a variety of compliance concerns, some of which might be resolved through terms and policies, but others of which might only be resolvable through design fixes. For now, you might want to move your speaker away from your television set.

This article originally appeared on Foley Hoag's Security, Privacy and The Law Blog.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.