He also detailed a new set of software tools to help developers sell AI models more easily.
Nvidia's new flagship chip, called the B200, takes two squares of silicon the size of the company's previous offering and binds them together into a single component.
While the B200 "Blackwell" chip is 30 times speedier at tasks like serving up answers from chatbots, Huang did not give specific details about how well it performs when chewing through huge amounts of data to train those chatbots, which is the kind of work that has powered most of Nvidia's soaring sales. He also gave no price details.
Tom Plumb, CEO and portfolio manager at Plumb Funds, whose largest holdings include Nvidia, said the Blackwell chip was not a surprise.
“But it reinforces that this company is still at the cutting edge and the leader in all graphics processing. That doesn't mean the market is not going to be big enough for AMD and others to come in. But it shows that their lead is pretty insurmountable,” said Plumb.
Nvidia said major customers, including Amazon.com, Alphabet's Google, Microsoft, OpenAI and Oracle, are expected to use the new chip in the cloud-computing services they sell, as well as for their own AI offerings.
Though Nvidia is widely known for its hardware offerings, the company has also built a significant portfolio of software products.
Nvidia also introduced a new line of chips designed for cars, with new capabilities to run chatbots inside the vehicle. The company deepened its already-extensive relationships with Chinese automakers, saying that electric vehicle makers BYD and Xpeng will both use its new chips.
Toward the end of his keynote speech, Huang also outlined a new series of chips for creating humanoid robots, inviting several robots built with the chips to join him on stage.