
How to Detect AI-Generated Video From OpenAI’s Sora

As generative AI continues to improve at a breakneck pace, it’s becoming increasingly difficult to tell when something you see online is real or fabricated. And it’s not just static images that we have to worry about either. AI-generated videos are on the rise as well, and with the impending arrival of OpenAI’s Sora text-to-video tool, discerning fact from AI-generated fiction could soon be tougher than ever.

So how exactly do you spot videos made by Sora and other AIs? Software-aided detection tools for AI-generated video are currently scarce (unless you're focused solely on deepfakes with a visible face), so for now you'll have to trust your own judgment instead of a robot's.

To help you sharpen your skills and learn what to watch out for, we’ve gathered some Sora video samples from OpenAI (including a few that accompanied the company’s research paper “Video Generation Models as World Simulators.”) Using these samples as examples, we’ll show you some of the telltale signs that a video might be AI-generated. Here’s what you should look for:

Defying Physics

Unless you’re watching Inception, most filmed things follow the laws of physics. This is not so for Sora. In its paper, OpenAI talks about Sora’s limitations as a simulator, saying “For example, it does not accurately model the physics of many basic interactions, like glass shattering.”

Here is a video showing a similar scenario. A glass jumps into the air with no apparent cause nearby, and the liquid passes through the solid glass, which itself dissolves as it hits the table.

When trying to determine whether a video has been created by AI, you should closely observe all of the phenomena in it, whether it’s the main action, or something taking place in the background.

Also, gauge your feelings while you’re looking. When things behave differently than they should, it’s often uncomfortable for us humans—a phenomenon known as the uncanny valley. So trust your gut. If something seems a bit off—even if you can’t quite put your finger on what exactly it is—take that as a sign that the video deserves some additional scrutiny.

Unreal States

Bite into an apple in real life and a chunk disappears. That’s not necessarily how it works in Sora’s world. OpenAI notes that “interactions, like eating food, do not always yield correct changes in object state.”

So while you should be monitoring actions to tell whether something is AI-generated, it’s just as important to track the reactions, especially on solid objects.

A lot happens in the video above, but watch where the man walks through the snow: you can see an existing set of footprints behind him, yet no new footprints appear as he walks. Also, note the space helmets, which appear to be made of yarn.

Nonsense Sequence

Pay careful attention to a video created by AI and you’ll likely notice inconsistencies. They might be glaring, or just something that tugs at the corners of your thoughts. If it’s the latter, note where you start to sense that something is off and rewatch that moment closely.


In the video above, a hand paints a cherry blossom tree. While the painting develops along with the brushstrokes, there are moments when the paint from the brush changes color despite the paintbrush never leaving the canvas. 

Look for the Familiar

In an interview with the Wall Street Journal’s Joanna Stern, OpenAI CTO Mira Murati repeatedly would not say what Sora was trained on beyond “publicly available and licensed data.” Putting aside the ethical and potential legal repercussions of this, it means that you can attempt to reverse engineer a video to see if it’s real or not. 

Nick St. Pierre, a creative director and fan of Midjourney, followed a feeling he had about the sources of Sora’s work. He came up with prompts that suited Sora videos he’d seen and put them through Midjourney to generate AI still images. Sure enough, he came up with over a dozen examples where Sora’s videos are not much more than Midjourney images in motion.


If you want to use more than your senses when assessing a video, come up with the prompt you believe was used to create it, and run it through a text-to-image generator or two. See how similar the results are. You can also turn the prompt into a Google search to locate sources that might have been used for the video.
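If you go this route, one lightweight way to compare the video's frames against the images your prompt produces is a perceptual "average hash": downscale each image to a tiny grid and record whether each cell is brighter than the grid's average. Visually similar images yield hashes with a small Hamming distance. This is a toy sketch, not a detection tool; the 4x4 grids and the distance thresholds here are illustrative assumptions, and in practice you'd load real frames with an image library and use a larger grid.

```python
# Toy sketch of an "average hash" perceptual fingerprint.
# Assumption: images are represented as 2D lists of grayscale values (0-255);
# real use would load and downscale actual video frames / generated stills.

def average_hash(pixels):
    """Turn a 2D grid of grayscale values into a bit string:
    1 where a pixel is brighter than the grid's mean, else 0."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    return "".join("1" if p > avg else "0" for p in flat)

def hamming_distance(a, b):
    """Count differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

# A toy "video frame" and two comparison images.
frame = [
    [ 10,  20, 200, 210],
    [ 15,  25, 205, 215],
    [190, 195,  30,  40],
    [185, 200,  35,  45],
]
similar = [[min(255, p + 10) for p in row] for row in frame]   # brightened copy
different = [[255 - p for p in row] for row in frame]          # inverted image

h_frame = average_hash(frame)
print(hamming_distance(h_frame, average_hash(similar)))    # small: likely related
print(hamming_distance(h_frame, average_hash(different)))  # large: likely unrelated
```

Because the hash only captures coarse brightness structure, it survives small edits (compression, color shifts, the slight motion between a still and a video frame) while still separating unrelated images, which is why this family of techniques underpins real reverse-image-search tools.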

Stay Vigilant!

Text-to-video AI is still very much in its infancy right now, and since the technology is constantly evolving, the best way to avoid being fooled is to evolve along with it. As AI progresses and generates increasingly realistic footage, we’ll need to adopt new tools and techniques in order to effectively spot it. So don’t get comfortable just yet! Keep an eye out for new detection tools and authenticity verification systems, and be sure to check back on this article periodically—we’ll update it as new tools emerge.

