
Nvidia is talk of the town at AI events leading into this week’s earnings

Last week, OpenAI’s technology chief hailed Nvidia CEO Jensen Huang for “bringing us the most advanced” GPUs, which the company needed to run a live demo of its latest AI models during a presentation.

At Google’s annual developer conference, Alphabet CEO Sundar Pichai emphasized the company’s “longstanding partnership with Nvidia,” announcing that Google Cloud will use the chipmaker’s Blackwell GPUs in early 2025.

This week, Microsoft, which provides OpenAI’s servers, will unveil new AI features built on its enormous Nvidia GPU clusters at its Build conference in Redmond, Washington.

Heading into its quarterly earnings announcement on Wednesday, Nvidia finds itself at the center of the technology action, a position that has become increasingly familiar for the 31-year-old company, whose market capitalization has surpassed $2 trillion this year.

Nvidia is expected to report year-over-year sales growth of more than 200% for the third consecutive quarter, with analysts estimating a 243% increase to $24.6 billion for the fiscal first quarter, according to LSEG. Nvidia’s data center division, which includes the advanced processors sold to Google, Microsoft, Meta, Amazon, OpenAI, and others, is estimated to generate over $21 billion in revenue.

Nvidia’s suite of AI products is so profitable that net income is expected to jump more than fivefold from a year earlier to $13.9 billion.

The stock is up 91% this year, after more than tripling in 2023.

Dan Niles, founder of Niles Investment Management, compares Nvidia’s position in the AI boom to Cisco’s role in the “internet buildout” of the 1990s. Cisco, he noted, endured several severe pullbacks over a three-year stretch before climbing 4,000% to its peak in 2000, and he expects Nvidia to go through similar cycles.

“We’re still really early in the AI build,” Niles said on CNBC’s “Money Movers” on Monday. “I think the revenue will go up three to four times from current levels over the next three to four years, and I think the stock goes with it.”

Bernstein estimates that Google, Amazon, Microsoft, Meta, and Apple will spend $200 billion in capital expenditures this year, with Nvidia chips accounting for a significant chunk of the total.


In addition, OpenAI’s latest chatbot, GPT-4o, is powered by Nvidia technology. Meta revealed plans in March to buy and build systems containing 350,000 Nvidia GPUs, an outlay of billions of dollars, and CEO Mark Zuckerberg even swapped coats with Huang and posed for a photo with him.

“If you look at today for the AI build out, who’s really driving that?” Niles said. “It’s the most profitable companies on the planet — it’s Microsoft, it’s Google, it’s Meta, and they’re driving this.”

Prior to the recent AI boom, Nvidia was best known as the leading maker of chips for 3D gaming. About a year ago, the chipmaker gave investors their first indication that it would see unprecedented growth, telling Wall Street it expected roughly 50% more in sales for the July 2023 quarter than analysts had projected.

Growth rates have since accelerated. Starting in the fiscal second quarter, however, growth is expected to slow, with analysts forecasting considerable deceleration in each of the next three quarters.


“We just don’t know how long this investment cycle lasts and how much excess capacity will be created over that time in case this AI thing doesn’t materialize as quickly as expected,” Bernstein analysts wrote in a note earlier this month.

That is not to say Nvidia is at risk of losing a significant portion of the AI chip market to competitors. Piper Sandler analysts estimate it will retain at least 75% of the AI accelerator market, even as companies such as Google develop their own proprietary processors.

“We expect the percentage of hyperscaler spend dedicated to compute to increase in 2024 and 2025,” Piper Sandler analyst Harsh Kumar wrote in a note.

One worry for the business is how smoothly the transition to its next generation of AI chips, known as Blackwell, will go. Those chips are expected to ship later this year, and some are concerned there will be a lull as customers hold off on buying current Hopper GPUs such as the H100 in favor of Blackwell-based chips such as the GB200.

“To some degree, the setup has shifted,” wrote Morgan Stanley analyst Joseph Moore in a note on Monday. “Six months ago, short-term expectations were high, but there was concern about durability. Now, following hyperscalers’ discussion of longer-term expenditure plans for AI, those longer-term views are more optimistic, but there is concern about a slowdown in front of Blackwell.”









