NVIDIA GPUs have powered the AI revolution. The new Blackwell chips are up to 30 times faster

In less than two years, NVIDIA — whose H100 chips are used by almost every AI company in the world to train the large language models that power services like ChatGPT — has become one of the most valuable companies in the world. On Monday, NVIDIA announced a next-generation platform called Blackwell, whose chips are seven to 30 times faster than the H100 and use 25 times less power.

“Blackwell GPUs are the engine powering this new industrial revolution,” NVIDIA CEO Jensen Huang said at the company's annual GTC event in San Jose, attended by thousands of developers — a crowd some compared to a Taylor Swift concert. “Generative AI is the defining technology of our time. By working with the world's most dynamic companies, we will realize the promise of AI for every industry,” Huang added in a press release.

NVIDIA's Blackwell chips are named after David Harold Blackwell, a mathematician who specialized in game theory and statistics. NVIDIA claims Blackwell is the world's most powerful chip. It offers a significant performance upgrade for AI companies, with speeds of up to 20 petaflops compared to just 4 petaflops for the H100. Much of this speed comes from the 208 billion transistors in the Blackwell chips, versus 80 billion in the H100. To achieve this, NVIDIA connected two large chip dies that can talk to each other at speeds of up to 10 terabytes per second.

In an indication of how much the modern AI revolution relies on NVIDIA chips, the company's press release included testimonials from CEOs who collectively lead trillion-dollar companies. These include OpenAI CEO Sam Altman, Microsoft CEO Satya Nadella, Alphabet CEO Sundar Pichai, Meta CEO Mark Zuckerberg, Google DeepMind CEO Demis Hassabis, Oracle CEO Larry Ellison, Dell CEO Michael Dell, and Tesla CEO Elon Musk.


“There is currently nothing better than NVIDIA hardware for AI,” Musk said in the statement. “Blackwell delivers huge leaps in performance and will accelerate our ability to deliver leading-edge models,” said Altman. “We are excited to continue working with NVIDIA to advance AI compute.”

NVIDIA did not disclose the cost of the Blackwell chips. Its H100 chips are currently priced between $25,000 and $40,000 each, according to CNBC. Complete systems running these chips can cost up to $200,000.

Despite their cost, NVIDIA's chips are in high demand. Last year, delivery wait times stretched as long as 11 months. Access to NVIDIA's AI chips is increasingly seen as a status symbol for technology companies looking to attract AI talent. Earlier this year, Zuckerberg touted Meta's efforts to build a “tremendous amount of infrastructure” to support its AI work. “At the end of this year, we will have approximately 350,000 Nvidia H100 cards — and a total of 600,000 H100 compute equivalents if you include other GPUs,” Zuckerberg wrote.
