The logo of NVIDIA as seen at its corporate headquarters in Santa Clara, California, May 2022. /Reuters
Nvidia's stock climbed on Tuesday after the chipmaker said its new flagship artificial intelligence (AI) processor is expected to ship later this year, and after CEO Jensen Huang said he is chasing a data center market potentially worth more than $250 billion.
Nvidia's stock rose nearly 2 percent to $901 after Huang and Chief Financial Officer Colette Kress answered questions from investors at the company's annual developer conference in San Jose, California. The shares had dipped nearly 4 percent earlier in the day.
"We think we're going to come to market later this year," Kress said, referring to the company's new AI chip, which the company debuted on Monday.
Huang estimated that companies operating data centers will spend more than $250 billion a year to upgrade them with accelerated computing components that Nvidia specializes in developing. He said that market was growing by as much as 25 percent a year.
Nvidia is shifting from selling single chips to selling total systems, potentially winning a larger chunk of spending within data centers.
"Nvidia doesn't build chips, it builds data centers," Huang said.
Called Blackwell, Nvidia's new processor combines two squares of silicon, each the size of the company's previous offering. Nvidia also detailed a new set of software tools to help developers sell AI models more easily to firms that use its technology.
Nvidia is working with contract chip manufacturer TSMC to avoid bottlenecks in packaging chips that slowed shipments of its previous flagship AI processor, Huang said.
"The volume ramp in demand happened fairly sharply last time, but this time, we've had plenty of visibility" into demand for Blackwell chips, Huang said.
Some analysts said Wall Street has already factored in the debut of the B200 Blackwell chip, which the company claims is 30 times faster at some tasks than its predecessor. The Blackwell chip will be priced between $30,000 and $40,000, Huang told CNBC.
Huang later clarified that comment, saying Nvidia will include its new chip in larger computing systems and that prices will vary based on how much value they provide.
"The Blackwell technology shows a significant performance uplift compared to Hopper (the current flagship chip) but it's always hard to live up to the hype," said David Wagner, portfolio manager at Aptus Capital Advisors.
In a discussion about Nvidia's cooperation with South Korean chipmakers, Huang said Nvidia is qualifying Samsung Electronics' high bandwidth memory (HBM) chips.
Samsung's cross-town rival SK Hynix on Tuesday said it has begun mass production of next-generation HBM3E chips, with sources saying initial shipments will go to Nvidia this month.
At the center of Wall Street's AI euphoria, Nvidia's stock has more than tripled over the past 12 months, making it the U.S. stock market's third-most valuable company, behind only Microsoft and Apple.
Nvidia expects major customers including Amazon.com, Alphabet's Google, Meta Platforms, Microsoft, OpenAI and Tesla to use its new chip.
Its hardware products will likely remain "best-of-breed" in the AI industry, Morningstar analysts said, lifting their estimates for Nvidia data-center revenue for 2026 and 2028.
"We remain impressed with Nvidia's ability to elbow into additional hardware, software and networking products and platforms," they said.
The software push shows how Nvidia, whose chips are mostly used to train large language models like Google's Gemini, is trying to make its hardware easier to adapt for companies rushing to integrate generative AI into their businesses.
Many analysts expect Nvidia's market share to drop several percentage points this year, as competitors launch new products and the company's largest customers make their own chips, although its dominance is expected to remain unchallenged.
(With input from Reuters)