
Why Disney and Universal Suing Midjourney Will Change Hollywood

And so it begins …

And it finally happened …

Those were the two reactions — seemingly opposite, actually harmonious — to the news that Disney and Universal had finally bitten the bullet Wednesday and sued an AI company, the startup image-generator Midjourney.

It had, after all, been nearly 18 months since The New York Times dropped the first shoe, suing OpenAI and its backer Microsoft in December 2023 over alleged unlawful training of its models on Times journalism. As the months dripped by and the Times lawsuit withstood key court challenges — “this is just fair use,” the AI firms cried — it made sense that Disney and its studio peers would follow the Times’ lead.

And then it became uncomfortable when they didn’t. What was the other shoe waiting for? Some media companies were cutting deals to license content to AI models — a Dotdash Meredith here, a Vox Media there. But clearly Disney and the Hollywood crew wouldn’t do that — the stakes were too high, their legal options too varied.

So it was with an air of inevitability that Disney and Universal filed, alleging a “bottomless pit of plagiarism” in the Shrek- and Yoda-like creatures that Midjourney spits out because it has been trained on, well, Shrek- and Yoda-like images. “Piracy is piracy,” Disney’s chief legal officer said in an accompanying statement.

And yet, for all the waiting, this was but the first shot — even an early symbol — of what will almost certainly be a larger war, with other companies joining up to go after Midjourney, and perhaps Disney and Universal going after other companies. (Midjourney is the least capitalized and weakest of the Gen AI bunch, which is probably why it was targeted.) If 2023 was defined by authors suing AI firms and 2024 was marked by a bunch of media outfits doing the same, 2025 could shape up as the year the studios confront the silicon.

Which brings us to 2026. And beyond. Because this lawsuit isn’t your standard intellectual-property dispute. It goes to the heart of what studios are — and what their owners ultimately want them to be.

Let’s play this out. There are several ways the lawsuit could unfold. The most obvious is the way of most lawsuits — with a settlement. In this scenario, Midjourney (and no doubt other AI model operators) pays the studios for its infringement and strikes a deal to license their material going forward. (They’re never going to yank studio fare from their models – by the executives’ own words the models would collapse without Big Content.) So AI models keep getting trained on, and spitting out facsimiles of, Hollywood material.

Similarly, studios could simply lose. That nets them less money, but it ends in the same place: OpenAI, Google Gemini and the others crank out Hollywood-trained content at will.

Then there’s the other way: a studio legal victory. The AI models are barred from training on this content — this “fair use,” a judge says, ain’t that. In such a scenario we are assured that for the indefinite future what gets generated in the way of Hollywood images comes from Hollywood and Hollywood alone.

What does this lead to? Well, it leads to studios continuing to do what they have always done — being the main incubators for and generators of so much of the film, television and other entertainment we consume.

And what does the first option lead to? Well, it hardly takes an imaginative leap to see where we end up if anyone can go to an AI model and plug in prompts to generate stuff that looks a lot like the movies and television we know. It means the end of studios doing it for us.

I know that can seem like a bold statement, but it really isn’t. Once content gets automated like that for the masses, there is no need or incentive for studios to do it themselves. Why would you maintain a whole infrastructure to generate original content – entire deals and offices and hierarchies of development and production – when your audiences can get so much of what they want by going directly to the model? If you thought TikTok creators were challenging studios now, imagine when they can just utter a few words and get the next Yoda.

Sure, there’d be some boutique operations to do something new even in such a circumstance; originality gonna originate. But it would be the exception. Plus that stuff would eventually get devoured by the maw too, since AI models could just grab it to train on. You see how this game goes.

And if this whole vision leads you to say “that doesn’t exactly seem like it will produce the next Godfather or Star Wars!” – well no, it won’t. Capitalism sends its regrets.

It’ll lead to some cool stuff, sure; creativity isn’t dead. The next MrBeast? Man, he’s got some tools. And some filmmakers will have some fun; we’re already seeing what Harmony Korine and Darren Aronofsky can do with these things. But studios as we know them? Nope.

In this scenario, Hollywood studios morph into something else: IP rights managers. They’re still here to make money off the property they created. Disney is still running theme parks, for instance, and people are still employed on the lot to work with the AI companies to make sure everything runs smoothly and the checks come in on time.

But the idea of a studio as we think of it, as it has existed for a century – the idea of a Dream Factory in any meaningful sense of the term — it’s gone. We’ve woken up. Disney stops being what Walt Disney founded Disney to be. In fact, you wouldn’t really need a lot, come to think of it.

I don’t think Bob Iger wants to be the guy to do that to his company. That’s why I don’t see him settling. But the imperatives of the dollar are strong. And if his legal advisors are saying he might lose…

Plus some of you cynics out there might say studios have been heading in the IP-management direction for a while now.

Incidentally, the studios are in an especially interesting position because they want to use AI themselves. They may not be tech companies, eager to rip through content libraries so they can hawk products based on the scavenged material. But they’re not actors and writers either, trying to protect human endeavor. If they can make the next Avatar at a fraction of the cost? “Sign us up for that AI model!” (“Just as long as you don’t feed our stuff into it.”) Yeah, the only way these models will be good enough for the studios to use is if they train on the data the studios don’t want them to have. Catch-22. A pretty good studio movie, by the way.

So here we are, a federal court in Los Angeles holding the future of Hollywood in its hands. What would you do if you were the judge? What would you do if you were an executive?

Take the cash given all these headwinds and pivot your model? Or stand pat and try to preserve the concept of a studio as it’s always been constructed, knowing full well you could end up with neither the old way nor the new money? It’s a juicy question. And one drama, at least, that ChatGPT couldn’t engineer.
