The use of ChatGPT and similar generative language models poses stark risks to trade secret protection, and organizations should be vigilant that no employee inadvertently shares trade secrets with ChatGPT.
ChatGPT is an AI which can be used to generate text of any kind.
ChatGPT (Chat Generative Pre-trained Transformer) is a chatbot (or generative language model) developed by OpenAI that uses large language models to generate text in response to user prompts. ChatGPT had reached over 100 million users by January 2023.
ChatGPT may be used to generate:
- Advertising copy
- Quiz questions based on a general topic
- Summaries in any style
OpenAI, the creator of ChatGPT, will soon begin charging a $42 monthly fee for ChatGPT Professional, which offers unlimited use and faster response times. OpenAI will also soon offer an API (Application Programming Interface) for the artificial intelligence chatbot. This API will allow other programs to integrate ChatGPT into their applications. For example, ChatGPT could be integrated into a word processing program to generate introductions or provide editing.
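In practice, such an integration is a thin wrapper that packages the user's text into a request sent to the provider. The sketch below is illustrative only: the model name, prompt wording, and helper function are assumptions rather than OpenAI's published interface, and it stops short of the actual network call.

```python
# Illustrative sketch of a word-processor integration with a ChatGPT-style API.
# The model name, prompt wording, and helper below are assumptions for
# illustration; a real integration would send this payload to the provider
# using its client library and an API key.

def build_intro_request(document_summary: str) -> dict:
    """Package the user's text as a chat-completion request.

    Note: everything placed in this payload leaves the organization's
    control once the request is sent -- which is the trade secret risk
    discussed in this article.
    """
    return {
        "model": "gpt-3.5-turbo",  # assumed model name
        "messages": [
            {"role": "system", "content": "You are a writing assistant."},
            {
                "role": "user",
                "content": "Write a short introduction for a document about: "
                           + document_summary,
            },
        ],
    }

request = build_intro_request("quarterly sales results")
# A real integration would post this payload to the provider's
# chat-completion endpoint and read the generated text from the response.
print(request["messages"][1]["content"])
```

The point for present purposes is not the mechanics but the data flow: whatever the employee types into the document summary travels outside the organization in the request body.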
How does ChatGPT work?
ChatGPT, and other generative language models, are trained on a large amount of text. The current version of ChatGPT was trained before its release and does not directly absorb new text. However, it indirectly absorbs new text through user prompts. For example, a user could enter the outline of an internal memo and ask ChatGPT to generate the full memo, or upload a vendor contract or non-disclosure agreement and ask ChatGPT to provide edits.
Later models of ChatGPT will likely include the ability to train the program for specific uses by uploading text. One could, for example, use ChatGPT to generate annual reviews by uploading previous monthly reviews, or to generate research summaries by uploading the underlying research. Other programs already offer the ability to upload one's own data to ChatGPT to produce more specific responses.
Any data entered in ChatGPT, directly or indirectly, is no longer private. It can be used by ChatGPT to respond to future questions or even for other purposes by the AI creators.
In other words, ChatGPT remembers what is provided to it, and ChatGPT (or whatever API or large language model is used) may reproduce that information in response to queries by other users. This carries risk if the information provided to ChatGPT contains confidential or trade secret information: the user may be inadvertently publishing information it would otherwise have kept confidential. For example, the user might describe the trade secret in detail and ask for a summary or a title, or might enter a secret recipe or process and ask ChatGPT to summarize the steps, identify potential safety hazards, or generate a title. A search in a normal search engine is limited to keywords, but in ChatGPT the user can upload information or conduct a dialogue to search, create new steps, edit, summarize, refine instructions, identify risks, or take any number of other actions that exceed mere searching.
Business use of ChatGPT comes with legal risks that include creating advertising copy with material misrepresentations, destroying attorney-client privilege and abandoning trade secrets.
Advertising copy generated by ChatGPT might contain material admissions or misrepresentations.
But the advertising copy—or memorandum, or summary, or disclosure—generated by ChatGPT may be insufficiently specific. The temptation then would be to "feed" the ChatGPT bot text to produce more specific advertising copy. For example, the company might upload internal memos, yearly benchmarks, strategic summaries or emails to "train" ChatGPT to produce more tailored content.
Uploading such documents could waive attorney-client privilege if those documents were privileged. A recent case highlights how easily attorney-client privilege may be waived: as easily as by forwarding a privileged email to the front desk at a hotel to have it printed. See Fourth Dimension Software v. Der Touristik Deutschland GmbH, No. 19CV05561CRBAGT, 2021 WL 4170693, at *4 (N.D. Cal. Sept. 14, 2021). Attorney-client privilege does not exist where the communications are no longer confidential, as could occur through ChatGPT.
We expect this to be an evolving area that merits ongoing attention.
Business use of ChatGPT risks destroying trade secrets.
A trade secret is information, practices or procedures that give a company a competitive advantage. Like a patent, a trade secret is part of a company's intellectual property, but, unlike patents, trade secrets do not require registration, nor do they "expire" as patents do. Well-known examples of trade secrets include the recipe for Coca-Cola, Google's search algorithm, the New York Times's process for identifying best-sellers, software programs, client lists, databases and strategic plans.
Trade secret protections can ultimately be lost if they are made public. And once lost, those protections are lost forever. This is true even if the disclosure is accidental or unintentional: "upon disclosure, even if inadvertent or accidental, the information ceases to be a trade secret and will no longer be protected." Defiance Button Mach. Co. v. C & C Metal Prod. Corp., 759 F.2d 1053, 1063 (2d Cir. 1985).
Suing for misappropriation of a trade secret may not be possible where ChatGPT is the method of disclosure.
If a trade secret is misappropriated, often by a departing employee, the company that owns the trade secret may want to sue for trade secret misappropriation. Trade secret misappropriation is illegal under state law in all fifty states and under federal law.
Under the Texas Uniform Trade Secrets Act, for example, the company might sue for (1) an injunction ordering an immediate halt to the use of the misappropriated trade secrets as well as (2) damages. Florida has an identical provision. There is also a similar provision under the Defend Trade Secrets Act, a 2016 federal law that permits individuals to sue in federal court for trade secret theft. See 18 U.S.C.A. § 1836.
However, state and federal laws regarding trade secrets have requirements that may not apply if ChatGPT is the mechanism by which the trade secret is lost.
On a claim for the misappropriation of trade secrets, the most important consideration remains whether the information was secret. Medtech Products Inc. v. Ranir, LLC, 596 F. Supp. 2d 778 (S.D.N.Y. 2008). "An entity claiming trade secret status bears the burden to identify and demonstrate that the material is included in categories of protected information under the statute and additionally must take some active steps to maintain its secrecy." 100 Am. Jur. Proof of Facts 3d 195 (originally published in 2008).
A common defense against a claim of trade secret misappropriation is that the original company abandoned the trade secret by failing to keep it sufficiently secret.
This raises the question of whether what is "fed" into ChatGPT becomes part of ChatGPT as a whole, meaning ChatGPT could use the information to create responses to other users. Does that constitute trade secret "abandonment"?
Trade secrets shared with ChatGPT are no longer secret.
In Bunner, an early case of its kind, a program written by a Norwegian teenager to decrypt DVD copy protection quickly spread on the Internet, with scores of sites linking to it. The DVD Copy Control Association sued Andrew Bunner, who had republished the program, for trade secret misappropriation. The California court ultimately held that, due to distribution on the Internet, there was no trade secret for the court to protect: "Widespread, anonymous publication of the information over the Internet may destroy its status as a trade secret." DVD Copy Control Assn., Inc. v. Bunner, 116 Cal. App. 4th 241, 251, 10 Cal. Rptr. 3d 185, 192 (2004).
Bunner remains the central case on whether "widespread, anonymous publication" destroys trade secret status.
There is no straightforward test to determine whether something has entered the public domain. The only other case law that might provide analogies to trade secret loss through ChatGPT is the line of cases addressing whether the intentional use of bots to scrape publicly available databases is trade secret misappropriation.
In Compulife, Compulife provided a database of life insurance rate tables to customers. Compulife Software Inc. v. Newman, 959 F.3d 1288, 1296 (11th Cir. 2020). The company alleged that the defendant scraped the data to create its own database. Id. The court ultimately held that the company's database constituted a trade secret as a "unique compilation" of publicly available data. Id. at 1315. But this is far from an exact match to the scenario posited above. Here, even though a "bot" was used to obtain the data, the acquisition was intentional: the defendant company hired a "hacker" to scrape Compulife's database.
Other cases have found the opposite: that the use of bots to scrape publicly available information is not trade secret misappropriation, because, "for good or bad, the exponential proliferation of information made available through full-blown use of the Internet and the powerful tools it provides" prevents a finding that the information remains secret. Sasqua Grp., Inc. v. Courtney, No. CV 10-528 ADS AKT, 2010 WL 3613855, at *22 (E.D.N.Y. Aug. 2, 2010), report and recommendation adopted, No. 10-CV-528 ADS ETB, 2010 WL 3702468 (E.D.N.Y. Sept. 7, 2010).
When the Compulife court held that the trade secrets were misappropriated, it did so in part because the acquisition was intentional.
This raises the far more difficult barrier to pursuing trade secret claims where the mechanism of loss is ChatGPT: intent. With ChatGPT, there is no intent to misappropriate trade secrets, and therefore no one to sue.
The court in Bunner noted that a trade secret case involving widespread publication and dissemination by the alleged thief:
does not fit neatly into classic business or commercial law concepts. The typical defendant in a trade secret case is a competitor who has misappropriated the plaintiff's business secret for profit in a business venture. In that scenario, the defendant has as much interest as the plaintiff has in keeping the secret away from good faith competitors and out of the public domain. But here...the alleged misappropriators not only wanted the information for themselves, they also wanted the whole world to have it.
DVD Copy Control Assn., Inc. v. Bunner, 116 Cal. App. 4th 241, 254–55, 10 Cal. Rptr. 3d 185, 195 (2004). There is a similar mismatch between traditional scenarios of trade secrets misappropriation and that which occurs through ChatGPT. ChatGPT—and its users on both ends—do not intend to misappropriate or use trade secrets, and insofar as the creators of ChatGPT have any intent, it is to make information available to "the whole world."
Both state and federal statutes require intent for misappropriation, either on the part of the person stealing the trade secret or the person using it. For example, Texas law requires either that the person who discloses the trade secret knew or knows that it was improperly acquired, or that the person who acquires the trade secret knows that it was acquired by improper means. Tex. Civ. Prac. & Rem. Code Ann. § 134A.002 (West). In other words, this phrasing allows the original company to sue for trade secret theft even if the thief sold the trade secrets to an innocent third party or, conversely, if the third party is not innocent. Federal law similarly allows suit against either the thief or the recipient, as long as the disclosure or the acquisition was done knowingly.
The problem with ChatGPT is that neither the person disclosing the trade secrets (i.e., the employee who trains ChatGPT using internal records) nor the recipient knows that trade secrets are being disclosed.
In other words, through ChatGPT, trade secrets could be lost without any intent, and therefore without any culpability, leaving the original company with no one from whom to seek damages.
Business use of ChatGPT and similar programs should proceed with caution.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.