Disney has long been among the most aggressive enforcers of film and character intellectual property.
For a company that has historically tried to squash new uses of its content, the December OpenAI Sora deal reads as an "if you can't beat them, join them" moment.
In that respect, its posture now resembles that of the major record labels during the early years of digital music distribution. The Sora deal appears to recall the point at which the music industry shifted from litigation-driven resistance to negotiated adoption through select services such as iTunes and later Spotify and other streaming platforms.
This analogy has its limits. In music, the first wave of digital disputes centered on obvious unauthorized copying and redistribution of finished tracks on peer-to-peer networks, which immediately undercut the core recorded-music revenue model. In artificial intelligence, by contrast, many of the early cases are aimed further upstream, at how models ingest and learn from protected works in training, and a large share of visible use so far has been casual or non-commercial, even as the technology is already bleeding into commercial production.
With Napster-era platforms, consumers could obtain the exact same songs they would otherwise have bought, so revenue dropped as listeners shifted to free access. By contrast, most AI tools today are generating new or transformative content, and a "fake Drake" track is not yet a full substitute for the original catalog in the way MP3 file-sharing was for labels.
Even so, the strategic logic is similar: Once it became clear that new technology and user behavior were not going away, rightsholders shifted from a pure enforcement posture to licensing a small number of intermediaries on carefully structured terms, and using those deals to draw the boundaries of acceptable use.
Viewed in that light, the Disney-OpenAI agreement is less an anomaly than an early attempt to define what licensed generative use of entertainment IP looks like in practice, including how AI user-generated content is permitted without eroding ownership and control.
A New Kind of Studio-AI Relationship
Public disclosures indicate that Disney will invest approximately $1 billion in OpenAI and license hundreds of characters and related assets for use in Sora, OpenAI's video-generation model.
The license reportedly covers 2D and 3D character models, environments and props across Disney, Pixar, Marvel and Star Wars, while excluding real-world performer likenesses and voice rights. Select fan-inspired videos generated through Sora may be curated and distributed on Disney+.
For OpenAI, the arrangement provides both capital and a flagship entertainment partner. For Disney, it offers controlled access to generative video technology and a way to bring fan creativity into a licensed and governed environment, rather than allowing it to proliferate across unauthorized third-party user-generated content platforms.
While the dollar figure is eye-popping, the significance of the Disney-OpenAI deal lies less in its headline value than in its structure and implications. It is one of the first large-scale efforts by an IP owner to integrate generative AI into its licensing and distribution model while retaining control over ownership, brand use, downstream exploitation and user behavior.
As such, it functions as an early test case for how ownership, risk and responsibility will be allocated in generative entertainment ecosystems that remain unsettled as a matter of law.
What the Deal Likely Says About IP
At its core, the agreement allocates ownership, risk and operational control over Disney's worlds inside a third-party generative system.
The key IP provisions likely fall into five interrelated categories: ownership of outputs and derivative works; ownership of technology, models and data; name, image and likeness; content moderation; and risk allocation. User-generated content and data rights are likely embedded throughout these provisions rather than treated as separate or independent concerns.
Ownership of Outputs and Derivative Works
For a company like Disney, the most likely structure is one in which any Sora-generated video that incorporates Disney characters, settings or other branded elements is treated as a Disney-owned or Disney-controlled derivative work, regardless of how the rights first vest under the user terms.
In practice, end-user terms typically give OpenAI broad rights in user outputs, and the Disney-OpenAI agreement likely then requires OpenAI to assign or grant back to Disney the exclusive rights in any output that uses Disney IP, so that Disney — not OpenAI or the end user — holds the exploitation and enforcement rights.
Under this regime, OpenAI receives a limited, functional license to generate, store and display the content solely as necessary to operate Sora.
An alternative, but unlikely, structure would treat Sora outputs as owned by OpenAI, with Disney receiving a license — likely exclusive — to exploit outputs incorporating its IP. While operationally cleaner for OpenAI, that is hard to square with Disney's historical insistence on clear derivative ownership, particularly where evergreen characters and franchises are involved.
Disney has long preferred to sit in the driver's seat on enforcement, takedowns and exploitation rights, and outright ownership of derivatives preserves remedies, bargaining power and flexibility in ways a mere license back cannot fully replicate.
Ownership of Technology, Models and Data: What OpenAI Needs From the Deal
The deal is not just about protecting Disney's characters; OpenAI also has to protect one of its core and flagship products.
Even with Disney's massive investment, and the legitimacy the partnership lends Sora in the licensed-IP space, OpenAI will want the agreement to state plainly that it alone owns the Sora models, source code and underlying technology, as well as any general improvements those systems make over time, while Disney's rights are limited to using the service and exploiting approved outputs.
The same logic applies to data. OpenAI is likely to treat platform-level information, such as user prompts, logs and aggregated usage analytics, as its own confidential and proprietary material, while giving Disney visibility into how its IP is being used and how Disney-branded content performs, probably in an aggregated or anonymized form.
Disney gets insight, not a portable dataset it can use to build or train a rival tool. OpenAI will also want clear non-reverse-engineering and non-extraction obligations, with any Disney-specific tuning ring-fenced for Disney's benefit, but broader enhancements to Sora remaining OpenAI's to deploy with other customers.
In practice, that will also mean Sora is contractually barred from training on or reusing Disney-provided assets outside the Disney-specific environment, so that Disney's IP does not bleed into models or use cases outside the scope of the partnership.
Name, Image and Likeness Inside Sora
One piece of the deal that sits alongside traditional IP is name, image and likeness. Public disclosures make clear that Disney is licensing fictional characters — not actors — and that real-world performer likenesses and voices are excluded unless separately agreed.
That bright line matters in a world of emerging digital replica laws and talent contracts that now routinely carve out AI cloning: Sora can put Darth Vader or Elsa on screen, but it cannot, by default, sound or look exactly like the actors who portray or voice them.
User likenesses raise a parallel issue. Even if Disney controls the derivative clip that results, the user's own face and voice remain protected by publicity and privacy laws. The Sora end-user terms will therefore need to make clear when and how Disney and OpenAI can reuse a fan's identifiable likeness outside the tool, for example, if a fan-created clip is selected to run on Disney+ or in marketing, and will likely require additional, explicit consent for those elevated uses.
Content Moderation as a License Condition
The Sora partnership pushes content moderation out of the trust and safety layer and into the core licensing bargain. When users can generate video with hundreds of Disney characters, the central question is not only what the tool permits, but what the license allows to be created, shared or amplified.
Disney is therefore likely to treat moderation standards as express conditions of the IP license itself. The right to use Disney characters in Sora would be tied to a defined set of content restrictions, including bans on sexual, graphic, hateful or political depictions, certain mash-ups with third-party brands or real children, and uses that imply official sponsorship outside approved channels.
Failures to enforce those standards would not merely be product issues; they could give Disney grounds to suspend or terminate the license.
On the product side, this structure may require a Disney-specific safety layer built into Sora.
Prompt filtering, output classification and mandatory human review for any content elevated beyond the tool are not optional features, but license compliance mechanisms. Because misuse can spread off-platform, the framework also likely includes watermarking or provenance signals, shared logging and audit trails, and coordinated takedown and communications procedures.
The broader takeaway is that, for deals of this kind, moderation can no longer be just a soft policy commitment. It must be a core term that determines whether AI-assisted fan creativity remains a controlled asset or becomes an unmanageable brand risk.
Risk Allocation and Indemnities
All of the ownership and control pieces are ultimately backed by indemnities. Disney is the only party that can realistically stand behind its own catalog, so it will likely indemnify OpenAI if a third party claims that Disney-supplied characters, settings or other assets infringe their rights.
Once those assets are transformed by Sora or combined with user-supplied material, however, the risk profile changes and Disney's indemnification obligations typically stop.
OpenAI, in turn, will be expected to stand behind the system itself. That usually means indemnifying Disney for claims that Sora's software, its generic outputs (to the extent they do not rely on Disney IP), or its marketing infringe third-party rights, as well as for security incidents, such as data breaches or violation of data privacy laws, that are within OpenAI's control.
Users form the third leg of the stool: They will be responsible for the non-Disney material they introduce, e.g., music, logos, real-person likenesses, and for any prohibited uses, and will be required by the end-user terms to indemnify both Disney and OpenAI if that behavior triggers legal claims.
What Users Will Be Allowed To Do
The combined effect of these IP provisions appears most clearly in Sora's user-facing terms. Users will be informed that interacting with Disney IP does not confer ownership of characters or a right to commercialize Disney-branded outputs. Uses involving prohibited themes or unlicensed third-party IP, including real-person likenesses, will be barred.
Users will be required to represent that they have the necessary rights in any material they supply as inputs, and that any non-Disney material they introduce is cleared for such use, and to indemnify Disney and OpenAI if those representations prove false. The terms will also grant Disney and OpenAI broad rights to host, modify, remove and selectively reuse content, including for distribution on Disney+.
Export controls are likely to be strict. Users may be prohibited from downloading high-quality files or reposting Disney-branded clips in ways that imply official endorsement or enable monetization. Where sharing is permitted, it will likely be subject to labeling and watermarking requirements designed to make clear that the content is user-generated and AI-assisted rather than studio-produced.
Royalties and Creator Participation
OpenAI's shift to an opt-in model for copyrighted IP is not just about control; it is also a move toward familiar "platform + rightsholder + creator" economics.
In effect, it sketches a path where video-generation revenue can be split, so the platform covers computing costs, rightsholders are paid for authorized use of their works, and, in some cases, individual creators share in upside if their content performs well.
In the Disney deal, nothing public suggests clip-level royalties; economically, the focus appears to be Disney's $1 billion equity stake and the broader license, not per-video payments to OpenAI or fans whose clips make it onto Disney+. That is consistent with Disney's centralized, family brand posture.
But as a template, the Sora model looks a lot like how YouTube and TikTok turned early piracy and fan remixes into a licensed ecosystem where labels, platforms and creators all participate in ad or subscription revenue, and like what user-generated content-heavy games have done with creator funds, in-game storefronts and revenue splits on user-built maps, skins and experiences.
For smaller or less child-focused brands, that analogy is powerful. They can use Sora-style opt-in plus revenue share to say: "If you build with our IP here, on terms we define, you can actually get paid." That might mean offering a share of platform revenue or brand marketing budgets to creators whose AI-assisted clips drive views or sales, much as YouTube shares ad revenue and games like Roblox or Fortnite share spend on user-generated content.
Done well, those incentives can coexist with tight rules on derivative ownership, training limits and moderation, while giving brands a realistic way to compete for creator attention in a world currently dominated by a large, unlicensed grey market.
What This Means Going Forward
Disney's deal with OpenAI is unlikely to be an outlier; it is more plausibly the first major template. As more legacy rightsholders confront the reality that generative tools and creator-driven distribution are not going away, the question will shift from "Can we stop this?" to "On what terms do we participate?"
Going forward, the center of gravity will sit in these negotiated terms, not in broad declarations for or against AI. Brands that want to keep their IP culturally relevant will need to decide how much fan play they will tolerate, how much data and model leverage they will give up to platforms, and whether they are prepared to share revenue and creative credit in ways that look more like YouTube or user-generated content-driven games than traditional licensing.
Platforms, in turn, will have to prove they can offer not just cutting-edge tools, but also credible guardrails, auditability and economic models that respect both catalog owners and the creators who keep those catalogs alive.
Looking ahead, the real impact of the Disney-OpenAI deal may be less about who moved first and more about who follows. Studios, game publishers, sports leagues and consumer brands will all face the same set of questions: whether to license their worlds into generative tools at all, what guardrails they need if they do, and how much of the upside they are willing to share with platforms and creators.
They will also have to confront a harder, more technical question the contract itself cannot fully answer: When a license ends, can a model meaningfully unlearn a catalog, or will traces of that IP always persist in the system's behavior and features built while the deal was in force, even if the agreement requires certifications of deletion or destruction of the underlying assets?
In an environment where innovation will always outpace regulation, the companies that start grappling with those questions now, with carefully selected and vetted partners, will have far more influence over the emerging norms than those that sit back and wait to see where the law eventually lands.
Originally published in Law360