Nvidia's early investment in AI bears fruit as its A100 chip now powers ChatGPT
Nvidia is already getting ready to introduce its next-generation system called H100.
Nvidia is known in the gaming industry for its top-of-the-line computer graphics cards. But gamers might not know that the company also made a big bet on artificial intelligence, one that is now starting to pay off.
Since entering the computer gaming sector in 1999 with the introduction of the GeForce 256, Nvidia has dominated the graphics processing unit (GPU) market. The company's gaming revenue soared past $9 billion (roughly €8.53 billion) last year despite a recent downturn in the sector.
It now appears that Nvidia's GPUs are no longer just for gaming or the occasional crypto-mining rig. In fact, the technology is now at the centre of the recent boom in artificial intelligence.
"We had the good wisdom to go put the whole company behind it," Nvidia CEO Jensen Huang told CNBC. "We saw early on, about a decade or so ago, that this way of doing software could change everything. And we changed the company from the bottom all the way to the top and sideways. Every chip that we made was focused on artificial intelligence."
Thanks to its early investment in AI, Nvidia is reaping the rewards as its chips are now used as the engine behind large language models (LLMs) such as ChatGPT. While the company has successfully opened up a new revenue source, Nvidia is still affected by the U.S.-China trade tensions. For instance, it ran into some issues in October last year when the U.S. banned AI chip exports to China, affecting sales of its popular AI chip, the A100.
"It was a turbulent month or so as the company went upside down to reengineer all of our products so that it's compliant with the regulation and yet still be able to serve the commercial customers that we have in China," Huang added. "We're able to serve our customers in China with the regulated parts, and delightfully support them."
Nvidia is expected to showcase its AI plans at its annual GTC developer conference, which runs from March 20 to March 23. Huang also discussed his company's role in the explosion of generative AI.
"We just believed that someday something new would happen, and the rest of it requires some serendipity," he said. "It wasn't foresight. The foresight was accelerated computing."
While Nvidia has multiple revenue streams, GPUs remain its primary business, accounting for more than 80 per cent of the company's total revenue. GPUs are sold as separate components that plug into a PC's motherboard to add computing power alongside the CPU.
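To illustrate what "adding computing power alongside the CPU" looks like in practice, here is a minimal sketch using the PyTorch library (an assumption for illustration; the article does not name any software) that runs the same matrix multiplication on the CPU and then offloads it to an Nvidia GPU when one is installed:

```python
# Minimal sketch: offloading a heavy parallel workload from the CPU to a GPU.
import torch

# A large matrix multiplication -- the kind of maths GPUs accelerate.
a = torch.randn(4096, 4096)
b = torch.randn(4096, 4096)

cpu_result = a @ b                        # runs on the CPU

if torch.cuda.is_available():             # an Nvidia GPU is present
    a_gpu, b_gpu = a.cuda(), b.cuda()     # copy the data to the GPU
    gpu_result = a_gpu @ b_gpu            # same maths, thousands of GPU cores
    print(torch.allclose(cpu_result, gpu_result.cpu(), atol=1e-3))
```

The point of the sketch is that the GPU does not replace the CPU; it sits beside it and takes over the highly parallel arithmetic, which is exactly the workload AI training and inference consist of.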
As more companies try to challenge ChatGPT, they often promote their projects by boasting about how many Nvidia A100s they have. Microsoft, for instance, revealed that the supercomputer it built for OpenAI used 10,000 A100s.
"It's very easy to use their products and add more computing capacity," Bank of America Securities senior semiconductor analyst Vivek Arya explained. "Computing capacity is basically the currency of the valley right now."
Despite the current popularity of its A100 AI chip, Nvidia is already getting ready to introduce its next product. The company's next-generation system is called the H100, where the H stands for Hopper.
"What makes Hopper really amazing is this new type of processing called transformer engine," CEO Jensen Huang said. "The transformer engine is the T of GPT, generative pre-trained transformer. This is the world's first computer designed to process transformers at enormous scale. So large language models are going to be much, much faster and much more cost-effective."
While other tech companies are bound to develop their own hardware to compete with whatever Nvidia releases, some investors are not too worried about the competition. Instead, they are paying more attention to volatile U.S.-China relations, given that sales to China account for around a quarter of Nvidia's revenue.
"The biggest risk is really U.S.-China relations and the potential impact of TSMC. If I'm a shareholder in Nvidia, that's really the only thing that keeps me up at night," Evercore analyst C.J. Muse said. "This is not just a Nvidia risk, this is a risk for AMD, for Qualcomm, even for Intel."