Nvidia (NVDA) is assuaging Wall Street’s concerns about any potential slowdowns in the production of its next-generation Blackwell chip, telling investors during the company’s earnings call on Wednesday that it’s on pace to exceed its previous revenue estimates for the current quarter.
“Blackwell production is in full steam,” CEO Jensen Huang said during the call. “I think we're in great shape with respect to the Blackwell ramp at this point.”
Prior to the earnings call, The Information reported on Nov. 17 that Nvidia was contending with overheating issues related to its Blackwell-based servers, leading suppliers to adjust the design of the racks that house the servers. Nvidia responded at the time, saying that design iterations are normal and expected.
The report followed a separate Blackwell design issue that Nvidia addressed over the summer, which gave Wall Street pause about whether the chip would land on customers' doorsteps on time.
Now Nvidia is saying that the chips aren't just shipping; they're also in the hands of all of the company's major partners. In Q3, Nvidia reported that cloud service providers, including companies like Microsoft (MSFT), Amazon (AMZN), and Google (GOOG, GOOGL), made up 50% of the chipmaker's data center revenue. Ensuring they get Blackwell chips sooner rather than later is paramount to Nvidia's continued growth. And the company appears to be doing just that.
“With any product ramp, there's always a great deal of complexity, and Blackwell is no different,” Dan Flax, senior research analyst and managing director at Neuberger Berman, told Yahoo Finance.
“And I think what's notable is that Nvidia and its partners are executing well. I think supply will improve over the next couple of months as they really scale Blackwell.”
But there’s one problem that continues to bedevil Nvidia: supply constraints. The sheer number of companies jockeying for position to grab the AI behemoth’s chips makes it difficult to meet demand.
“It is the case that demand exceeds supply,” Huang told analysts. “And that’s expected as we’re in the beginning of this generative AI revolution.”
For some perspective, Meta CEO Mark Zuckerberg told investors during the company’s most recent earnings call that the social media giant is training its Llama 4 AI models on a server cluster made up of more than 100,000 Nvidia H100 chips.
The H100 is Nvidia’s prior-generation Hopper AI accelerator. With Blackwell promising far better performance than Hopper, it only makes sense that massive AI companies like Meta (META) are angling to get as many of those chips as possible.