
OpenAI Sora video tool large-scale deployment could require 720,000 NVIDIA H100 GPUs worth $21.6 billion

OpenAI’s impressive new text-to-video tool, Sora, is hungry for GPU compute power. New numbers from Factorial Funds estimate that around 720,000 NVIDIA H100 AI GPUs would be needed to serve Sora at peak demand.

AI inference compute comparison (source: Factorial Funds)

720,000 NVIDIA AI GPUs is a monumental amount of AI computing power. At around $30,000 per H100, 720,000 of them would cost roughly $21.6 billion. It’s not just a mountain of money, either: at 700W per GPU, the fleet would draw around 504,000,000W, or 504 megawatts. Yeah, that’s a lot of power.
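Those headline figures fall out of simple multiplication. Here is a minimal sketch (Python) that reproduces them, assuming the roughly $30,000 per-GPU price and 700W board power quoted above:

```python
# Back-of-envelope sketch of the cost and power figures quoted above.
# Assumptions (from the article): ~$30,000 per H100 and 700W per GPU.

num_gpus = 720_000            # peak H100 count estimated by Factorial Funds
price_per_gpu_usd = 30_000    # rough per-unit price assumed above
power_per_gpu_w = 700         # per-GPU board power assumed above

total_cost_usd = num_gpus * price_per_gpu_usd   # 21,600,000,000 -> $21.6 billion
total_power_w = num_gpus * power_per_gpu_w      # 504,000,000 W  -> 504 MW

print(f"Total cost:  ${total_cost_usd / 1e9:.1f} billion")
print(f"Total power: {total_power_w / 1e6:.0f} MW")
```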

Factorial Funds estimated that Sora used between 4,200 and 10,500 NVIDIA H100 AI GPUs for one month, with a single H100 AI GPU able to generate a one-minute video in about 12 minutes, or around five one-minute videos per hour.

NVIDIA’s record-breaking revenue has pushed the company past a $2.1 trillion market cap, and it completely dominates AI accelerators with 90%+ of the AI GPU market share. OpenAI’s new Sora text-to-video tool will be used by some of the world’s biggest companies and people, so AI GPU demand will only skyrocket from here.

I would love to see a breakdown of NVIDIA’s new Blackwell B200 AI GPU powering Sora videos, as it represents a gigantic leap in AI performance over Hopper H100.

Factorial Funds estimates the inference compute that “would be needed to run Sora-like models at significant scale, meaning that AI-generated videos achieve a significant market penetration on popular video platforms like TikTok and YouTube,” assuming around 5 minutes of video generated per NVIDIA H100 AI GPU per hour:

  • We assume 5 minutes of video produced per NVIDIA H100 per hour (see above for details), equivalent to 120 minutes of video per H100 per day
  • TikTok: 17M minutes of video per day (34M total videos × avg. length of 30s), assuming 50% penetration by AI (source)
  • YouTube: 43M minutes of video per day, assuming 15% penetration by AI (mostly videos below 2 min)
  • Total video minutes produced daily by AI: 8.5M + 6.5M = 10.7M minutes
  • Total NVIDIA H100s needed to support the creator community on TikTok & YouTube: 10.7M / 120 ≈ 89k (a sketch of this arithmetic follows below)
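The sketch below (Python) walks through the same baseline arithmetic, taking Factorial Funds’ quoted inputs at face value; the throughput figure comes from the roughly 12 minutes of H100 time per one-minute clip mentioned earlier:

```python
# Sketch of the baseline estimate in the list above. All inputs are
# Factorial Funds' assumptions as quoted in the article, not measured data.

generation_minutes_per_clip = 12                              # ~12 min of H100 time per 1-minute video
clips_per_gpu_per_hour = 60 / generation_minutes_per_clip     # ~5 one-minute videos per hour
video_minutes_per_gpu_per_day = clips_per_gpu_per_hour * 24   # ~120 minutes of video per day

daily_ai_video_minutes = 10_700_000   # quoted TikTok + YouTube total (10.7M minutes)

baseline_gpus = daily_ai_video_minutes / video_minutes_per_gpu_per_day
print(f"Baseline H100s needed: {baseline_gpus:,.0f}")         # ~89,000
```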

The site continues, saying that this figure is “likely too low due to various factors that need to be accounted for”:

  • We assume 100% FLOPS utilization and do not consider memory and communication bottlenecks. In reality, a utilization of 50% is more realistic, which adds a factor of 2x.
  • Demand is not distributed equally across time but instead is bursty. Peak demand is especially problematic, since you need proportionally more GPUs to still serve all traffic. We think that peak demand adds another factor of 2x to the maximum number of GPUs needed.
  • Creators will likely generate multiple candidate videos and select the best one from these candidates. We make the conservative assumption that on average 2 candidates are generated for each uploaded video, which adds another factor of 2x.
  • In total, this leaves us with ~720k NVIDIA H100 GPUs at peak (see the sketch below)
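As a quick check, the sketch below (Python) applies those three 2x factors to the ~89k baseline; each factor is Factorial Funds’ assumption rather than a measured value:

```python
# Scaling the ~89k baseline by the correction factors listed above.

baseline_gpus = 89_000

utilization_factor = 2    # ~50% FLOPS utilization instead of the assumed 100%
peak_demand_factor = 2    # bursty traffic means provisioning for peak load
candidate_factor = 2      # ~2 candidate generations per uploaded video

peak_gpus = baseline_gpus * utilization_factor * peak_demand_factor * candidate_factor
print(f"Peak H100s needed: ~{peak_gpus:,}")   # ~712,000, rounded to ~720k in the article
```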

