
Opinion: Nvidia is pushing to stay ahead of Intel, AMD in a high-stakes, high-performance computing race

Nvidia (NVDA, +2.13%) was built on the promise of the GPU for gaming and advanced rendering, but its rise to a $1 trillion valuation was on the back of high-performance computing and AI. What began as a small project called “general purpose GPU” (GPGPU) that looked at in-game physics and video transcoding applications transformed the company into the titan of the silicon space, displacing Intel as the clear thought leader for the future of computing.

What has made Nvidia successful in this transition from gaming company to one of the premier computing and AI leaders is its ability not just to build silicon, but to build an entire platform and ecosystem. It calls this the “Nvidia Scientific Computing Platform,” a combination of hardware, including its Hopper GPU and Grace CPU, with system software such as CUDA and PhysX that aims to simplify the programming model for developers.

Add to that end-to-end platforms including Nvidia Omniverse and Nvidia AI, and you get applications that can spur scientific development and AI advancement, all of course optimized for Nvidia silicon.

But like all titans, Nvidia constantly needs to stay ahead of the curve and competitors. The company’s struggle is to maintain the momentum that got it where it is.

At SC23, the leading conference for high-performance computing, Nvidia this week made a couple of interesting product announcements and gave some insight into its thinking about the next big thing in high-performance computing.

A new GPU to maintain leadership 

The primary announcement from Nvidia was the H200, a mid-generation update to the H100 Hopper GPU that makes significant advancements in memory design. By moving from HBM3 (high-bandwidth memory) to HBM3e, the new H200 improves memory bandwidth and maximum memory capacity, and as a result offers significant performance gains over the current-generation H100.

Memory technology is often just as important to high performance computing (HPC) systems as the GPU or CPU itself. New AI models and HPC workloads will test the limits of memory capacity. The new Nvidia H200 increases memory capacity by 76% and memory bandwidth by 43%.
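
To see where those percentages come from, here is a minimal Python sketch, assuming Nvidia's published per-GPU figures of 80 GB and roughly 3.35 TB/s for the H100 (SXM) and 141 GB and 4.8 TB/s for the H200; those specific numbers come from Nvidia's spec sheets rather than from the announcement itself.

# Rough check of the H200's quoted gains over the H100, using published
# per-GPU figures (an assumption; they are not stated in the article above):
#   H100 SXM: 80 GB HBM3,   ~3.35 TB/s memory bandwidth
#   H200:     141 GB HBM3e,  ~4.8 TB/s memory bandwidth
h100 = {"capacity_gb": 80, "bandwidth_tbs": 3.35}
h200 = {"capacity_gb": 141, "bandwidth_tbs": 4.8}

capacity_gain = h200["capacity_gb"] / h100["capacity_gb"] - 1       # ~0.76
bandwidth_gain = h200["bandwidth_tbs"] / h100["bandwidth_tbs"] - 1  # ~0.43

print(f"Memory capacity:  +{capacity_gain:.0%}")   # prints +76%
print(f"Memory bandwidth: +{bandwidth_gain:.0%}")  # prints +43%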

Nvidia was already the market leader in this space, so the improvements might seem superficial. But as Intel (INTC, +3.09%) continues to push its GPU and Gaudi AI accelerator strategy, and AMD (AMD, +2.65%) executes the rollout of its MI300 AI chip, it's critical for Nvidia to stay aggressive.

Read: Nvidia’s stock notches longest winning streak in seven years, with all-time high in view

For investors interested in what Nvidia has planned next, the company teased the performance of its upcoming architecture (codenamed Blackwell): the B100 GPU, slated for 2024, is projected to offer more than twice the performance of the H200 on a GPT-3 AI inference benchmark, the kind of work needed for AI chatbots like ChatGPT.

Nvidia brought out numerous partners and customers to back up both the product claims and long-term commitments. For example, the EuroHPC Joint Undertaking (a group founded in 2018 to invest in European supercomputing systems) showed plans for a supercomputer named ‘Jupiter’ with almost 24,000 GH200 nodes, a combination of Arm-based Grace CPUs and Hopper GPUs. Another was the Texas Advanced Computing Center, one of the U.S. centers of computational excellence, whose ‘Vista’ system will use both GH200 Grace Hopper and Grace CPU superchips.

The next computing frontier: quantum computing

Perhaps the most substantial change to computing will be the move to quantum systems. Quantum computing differs from classical computing in that it relies on quantum mechanics rather than the simple electrical impulses used in classical machines. Quantum computers can help solve problems that require calculating large numbers of combinations, such as in machine learning and encryption, faster than any currently available supercomputer.

For now, Nvidia does not have a QPU (quantum processing unit) in its stable of technology, but that isn't stopping the company from being involved in this developing field. Whether Nvidia chooses to build or perhaps buy quantum capability, it is creating the supporting hardware and software ecosystem for quantum computing to make sure it isn't left out.

From a hardware perspective, nearly all quantum computing systems today have associated classical computing systems that are used to simulate or control the quantum hardware. This can be for error correction, systems control, or simply secondary processing where quantum devices do not excel. Nvidia calls these “hybrid quantum systems,” and they are a way for the company to ensure that quantum engineers and systems researchers are working with Nvidia throughout the process.

Maybe even more ingenious is the creation of “CUDA Quantum,” the quantum equivalent of CUDA, the programming and software model that has enabled Nvidia to dominate the AI and HPC spaces for the past 15 years. CUDA Quantum includes a high-level programming language that allows quantum system designers and application developers to write code that can run on both classical and quantum systems, dividing work between them or simply simulating the quantum portion.
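
For readers curious what that looks like in practice, here is a minimal sketch using CUDA Quantum's Python interface to build and sample a simple two-qubit entangling circuit; the commented-out target selection and exact simulator names depend on the local CUDA Quantum installation, so treat this as an illustration of the programming model rather than a definitive recipe.

# Minimal CUDA Quantum (cudaq) sketch: build a two-qubit Bell circuit and
# sample it on a simulator. The same kernel could be dispatched to a QPU
# backend by changing the target (backend availability varies by install).
import cudaq

# Optional: pick an explicit simulator target, e.g. a CPU-based one.
# cudaq.set_target("qpp-cpu")

kernel = cudaq.make_kernel()
qubits = kernel.qalloc(2)         # allocate two qubits
kernel.h(qubits[0])               # put qubit 0 into superposition
kernel.cx(qubits[0], qubits[1])   # entangle qubit 0 with qubit 1
kernel.mz(qubits)                 # measure both qubits

# Sample the circuit; on an ideal simulator the results split roughly 50/50
# between "00" and "11", the signature of an entangled pair.
counts = cudaq.sample(kernel)
print(counts)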

Nvidia claims that 92% of the top 50 quantum startups currently use Nvidia GPUs and software and 78% of companies building and deploying quantum processors use CUDA Quantum as their programming model.

To me, this represents a move that no other technology company today can make: it leverages a high-performance computing industry that is already tailoring software for Nvidia GPUs and ensures that the next generation of quantum applications is created “Nvidia-first,” even before the company has a clear stake in the quantum computing hardware landscape.

Ryan Shrout is the founder and lead analyst at Shrout Research. Follow him on X @ryanshrout. Shrout has provided consulting services for AMD, Qualcomm, Intel, Arm Holdings, Micron Technology, Nvidia and others. Shrout owns shares of Intel.

More: These tech stocks scored this earnings season even as AI hype slowed — and the winner may surprise you

Also read: PCs and cars are the future for this giant but little-known chip-designer


