Nvidia's competitors are circling, but they still have years of catching up to do

Nvidia (NVDA) is the king of artificial intelligence. Its share of the global AI chip market is estimated at between 70% and 90%. Its high-powered graphics processors, which are ideal for training and running AI models, are in such high demand that simply obtaining them is a challenge.

In June, with the AI craze in full swing, Nvidia's market cap crossed the $1 trillion mark. On Friday, the company's shares reached an all-time high of $549.91.

It's not just Nvidia's hardware that helps it stay ahead of its competitors. The company's CUDA software, which developers use to create AI platforms, is just as important to Nvidia's staying power.

“Software remains the strategic moat for Nvidia,” explained Chirag Dekate, a vice president analyst at Gartner. “These…ready experiences enable Nvidia to be at the forefront of mindshare, as well as adoption.”

Nvidia's progress didn't happen overnight. It has been working on AI products for years, even as investors questioned the move.

“To its credit, Nvidia started about 15 years ago working with universities to find new things you can do with GPUs, aside from gaming and visualization,” explained Patrick Moorhead, CEO of Moor Insights & Strategy.

“What Nvidia does is it helps create markets and that puts competitors in a very difficult position there, because by the time they catch up, Nvidia is on its way to the next new thing,” he added.

Jensen Huang, co-founder and CEO of Nvidia Corp., speaks during Hon Hai Tech Day in Taipei on October 18, 2023. (I-Hwa Cheng/AFP via Getty Images)

But threats to Nvidia's reign are on the rise. Rivals Intel (INTC) and AMD (AMD) are marshaling their forces to grab their own slice of the AI pie. In December, AMD debuted its MI300 accelerator, which is designed to go head-to-head with Nvidia's own data center accelerators. Meanwhile, Intel is building its Gaudi 3 AI accelerator, which will also compete with Nvidia's offerings.


However, it's not just AMD and Intel. Hyperscalers, which include cloud providers Microsoft (MSFT), Google (GOOG, GOOGL), and Amazon (AMZN), as well as Meta (META), are increasingly turning to their own chips in the form of what are known as ASICs, or application-specific integrated circuits.

Think of AI graphics accelerators from Nvidia, AMD, and Intel as jacks of all trades. They can be used for a range of different AI-related tasks, ensuring that the chips can handle anything a company needs.

ASICs, on the other hand, are masters of a single trade. They're specifically designed to meet a company's AI needs and are often more efficient than GPUs from Nvidia, AMD, and Intel.

This is a problem for Nvidia, since hyperscalers spend big on GPUs for AI. But as they focus more on their own ASIC hardware, they may have less need for Nvidia's chips.

On the whole, however, Nvidia's technology still outperforms its competitors'.

“They have…a long-term research pipeline to continue driving the future of GPU leadership,” Dekate explained.


Another thing to consider when it comes to AI chips is how they're used. The first use is training AI models, which is called, well, training. The second is putting those models to work so that people can use them to generate the outputs they want, whether that's text, images, or something else entirely. This is called inference.

OpenAI runs inference for ChatGPT, while Microsoft runs inference for Copilot. Every time you send a request to either program, it takes advantage of AI accelerators to generate the text or image you want.


Over time, inference will likely become the primary use case for AI chips, as more companies seek to leverage different AI models.

However, the AI explosion is still in its infancy. The vast majority of companies that will benefit from AI are not yet in the game. So, even if Nvidia's market share takes a hit, its revenue will continue to increase as the AI field booms.

Daniel Howley is the technology editor at Yahoo Finance. He's been covering the tech industry since 2011. You can follow him on Twitter @DanielHowley.

Click here for the latest technology news that will impact the stock market.

Read the latest financial and business news from Yahoo Finance
