Key Takeaways
- Massachusetts AG Andrea Campbell co-led a group of 47 state AGs in calling on technology companies to increase efforts to combat generative AI "deepfake" images, including deepfake pornography.
- The letters to providers of payment platforms and search engines represent a shift in approach, directly engaging with companies that may not themselves produce the AI product causing the identified harm.
- While AGs push for legislative change, they may continue to lean on existing legal frameworks, including consumer protection law, to address perceived harms from AI.
- Companies should map their potential touchpoints with AI enterprises, including where AI firms act as business partners or consumers of services, to assess potential exposure to expanding state AG and regulatory inquiries into AI-related harms.
On August 26, a bipartisan group of 47 state attorneys general
issued letters calling on major tech companies to increase their
efforts to stop the spread of computer-generated and nonconsensual
"deepfake" images, particularly including deepfake
pornography. The AGs directed their calls to two groups of
companies providing either search engines or payment platforms.
In the letters, the AGs requested dialogues with the companies to
learn what steps payment platforms and search engines,
respectively, were already taking to protect against the dangers
of deepfake pornography, and what more could be done within their
existing technical limitations and terms of service. The AGs
pointed to existing safeguards in search engines that channel
certain dangerous searches, such as "how to build a
bomb" or "how to kill yourself," toward safer
results such as government sources or suicide-prevention
resources, and suggested the companies adopt similar guardrails for
searches related to deepfake pornography. They also pointed to
exemplar terms of service and acceptable-use policies that payment
platforms could use to block payments intended
to promote deepfake pornography.
Massachusetts AG Andrea Campbell co-led the effort with the AGs of
Kentucky, New Jersey, Pennsylvania, Utah and Vermont. In a press release, AG Campbell's office
expressly connected these letters to her office's broader
efforts to address the potential dangers resulting from the
emergence of artificial intelligence. These have included a recent
letter to AI companies regarding AI chatbots' ability
to engage in sexual conversations with children, as well as a September 2023 letter to Congress and a
first-of-its-kind April 2024 legal advisory to AI developers,
suppliers, and users regarding obligations under existing state
consumer protection, anti-discrimination, and data security laws
(which we wrote about at the time).
Nonetheless, the August 26 letters signal a slight shift in
approach by sweeping in companies (search engines and payment
platforms) that are not themselves the developers of the AI
technologies that pose the identified harm. The letters
also demonstrate that while the AGs have encouraged legislative
action on AI-specific protections, they will not necessarily wait
for legislative reforms, and will continue to lean on existing
legal frameworks and practices, including state consumer protection
laws, to address perceived harms in the meantime.
The letters provide occasion for companies operating in a variety
of technological spaces to evaluate their exposure to AI-related
harms, including through their relationships with AI developers as
business partners or consumers of services, and to consider how
their existing practices or policies could be modified to account
for and mitigate such harms. Foley Hoag's State Attorneys
General and Privacy and Data Security Practices have substantial
experience anticipating and resolving inquiries from state
attorneys general and other regulatory authorities, and can assist
with navigating this increasingly scrutinized space.