As artificial intelligence (AI) continues to revolutionize industries, businesses are increasingly entering into contracts for the licensing, subscription, or use of AI tools. These contracts can be complex, and understanding the key issues is crucial to protecting your business interests.
Below, we take a quick look at some of the top issues to consider when customers do their due diligence on — or enter an agreement to access — any AI tool:
1. Quality Control
One of the top issues to consider is the accuracy and quality of the tool's output. More specifically, this includes such factors as:
- Accuracy: Generally speaking, how accurate are the answers? How is accuracy monitored and maintained by the vendor?
- Bias: AI tools are susceptible to "bias" based on who programmed the tool and what sources were used to train the model. To assess neutrality, consider asking about the sources and training materials so that you can determine whether they are reliable and unbiased.
- Hallucinations: How is the tool programmed to handle unknowns? Will it admit that it does not know the answer, or will it attempt to make one up? If it is prone to such hallucinations, will it warn you that the answer has no source?
- Source Attribution: Does it cite and link to its sources so you can dig deeper and verify the information, or does it lack transparency and operate as a so-called "black box"?
2. Intellectual Property Rights
Another key issue is understanding who owns the intellectual property (IP) rights to the AI tool and any derivative works created by it. As the customer, you will generally want to retain rights to any data or insights generated by the tool. IP rights are a common issue in tech deals, but with AI models, additional nuances require further consideration:
- Proprietary Data: If you upload proprietary materials for analysis (documents, contracts, images, and/or other data you own), it is critical that you retain all rights to those materials. For example, if you upload 50 confidential contracts and ask the tool to create a chart with renewal dates, can you be certain that you retain ownership of those documents?
- Training AI: Ideally, the tool will not be permitted to use your confidential data to train its model. Otherwise, future users of this tool who ask the right questions may see part or all of your data in the tool's output. Using the example above, the preference is that the content of those 50 contracts remains confidential and does not become part of the model's future "knowledge base." In other words, consider a tool that is programmed to "forget" your data after generating the output.
- Prompts: The queries/prompts fed into an AI tool are often carefully crafted and highly detailed. In some cases, they may contain proprietary information. When reviewing tools, consider whether the vendor will claim ownership of your uniquely crafted prompts or whether they will be treated as your confidential information.
- Outputs: Answers generated by AI (also known as outputs) are typically derived from both your prompts and the tool's training data (which may include third-party content gathered by the tool). You may want to determine which parts of these answers are proprietary to you based on your inputs/data, and who owns the output. Questions to consider include:
- Are these derivative works of the underlying content and owned by the authors of such content?
- Are they owned by the vendor as part of its service?
- Are they owned by you, as the user of the tool?
- Are they owned by some combination of the above?
3. Liability and Indemnity
Finally, there are the ever-important issues of liability and indemnity. Before entering an AI service agreement, you may want to ensure that each party's liability is clearly defined, including indemnity clauses to protect against third-party claims. Also, consider confirming that the agreement requires both the vendor and its tool to comply with all applicable laws — particularly with respect to how the tool is used and its underlying design. More specifically:
- Liability Limitations: The vendor may try to limit its liability for infringement and/or for errors or failures in the AI tool. Customers, on the other hand, prefer that the vendor remain fully liable for any infringement by the tool and/or its outputs, and for any of the other issues mentioned above.
- Indemnification Clauses: You may want to clarify who is responsible for legal claims arising from use of the tool, including IP infringement and data breaches.
- Compliance with Laws: Consider whether the AI tool complies with applicable laws and standards — like HIPAA (for healthcare) or the EU AI Act — and whether your use complies, too.
Conclusion
Navigating AI tool licensing contracts requires careful consideration of various legal and operational risks. By addressing these key areas, businesses can mitigate risk and leverage AI tools more effectively. Ideally, the agreement and the chosen tool will align with your business goals and risk tolerance; if you're unsure, consult legal counsel who understands both the technology and the law.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.