Attention Nvidia shareholders…

The fat lady isn’t singing yet, but she’s warming up.

This year, the world’s No. 1 AI chipmaker is up 213%.

But as I’ll show you today, now is a good time to consider taking some profits.

You see, next year is shaping up to be a challenging year for the semiconductor giant.

In fact, Nvidia may be more vulnerable than ever. The company accomplished a lot this year. But it also made a critical mistake that’s about to come home to roost.

Let me explain…

Don’t Make This Bad Bet

Nvidia is one-dimensional. It specializes in the design of Graphics Processing Units (GPUs).

A GPU is a kind of specialized chip that goes into gaming consoles and AI systems.

GPUs perform many calculations in parallel. That’s essential for rendering realistic computer graphics and for helping AIs “think.”

And GPU sales account for almost all of Nvidia’s revenue.
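To see why parallelism matters, consider a toy sketch in plain Python. Each pixel adjustment below is independent of the others, so a GPU can run thousands of them simultaneously across its cores; this loop, by contrast, runs them one at a time the way a CPU would. (The function and numbers are illustrative, not from any real graphics pipeline.)

```python
# Toy illustration of the arithmetic a GPU parallelizes.
# Rendering graphics or running a neural network boils down to huge
# batches of independent multiply-add operations like these.

def brighten(pixels, factor):
    """Scale every pixel value. Each multiplication is independent,
    so a GPU can execute thousands at once; this plain Python loop
    does them one by one instead."""
    return [min(255, int(p * factor)) for p in pixels]

row = [10, 50, 128, 200]
print(brighten(row, 1.5))  # -> [15, 75, 192, 255]
```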

In Nvidia’s favor, its GPUs weren’t just top-tier in 2023. They were also available in substantial volumes.

Other chipmakers sell GPUs for gaming and personal computing. But they couldn’t match Nvidia when it came to the premium GPUs used to make AI apps viable at scale.

That left Microsoft, Amazon, Google, Meta, and other major AI players spending billions of dollars on Nvidia’s GPUs this year.

And the demand for GPUs for AI computing took Wall Street by surprise.

In a year when some Wall Street analysts expected Nvidia’s sales to fall, they grew instead. Second-quarter sales jumped a jaw-dropping 87% over the first quarter, to $13.5 billion.

That sent shares soaring… and for good reason.

But if you’re buying Nvidia shares now, you’re betting on the company repeating this massive increase in sales.

And that’s a mistake. The odds of Nvidia surprising Wall Street again next year are basically zero.

And Nvidia has bigger problems than just meeting Wall Street estimates.

The company has backed itself into a corner. And that decision is going to have a lasting impact on its future.

Playing With Fire

Nvidia had a monopoly on AI data center GPUs for most of 2023. So, it could name its price without worrying about the competition undercutting it with lower prices.

And yes, Nvidia sold a ton of its GPUs this year. It’s on track to sell 550,000 of its H100 GPUs by the end of 2023.

That’s worth more than $16.5 billion in chips, depending on how they’re configured.
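A quick back-of-the-envelope check on those figures. The roughly $30,000 average is simply what the two numbers above imply; it isn’t an official list price, and actual prices vary with configuration:

```python
# Implied average selling price from the figures above.
units = 550_000      # H100 GPUs Nvidia is on track to sell in 2023
revenue = 16.5e9     # dollars ("more than $16.5 billion")

avg_price = revenue / units
print(f"${avg_price:,.0f} per GPU")  # -> $30,000 per GPU
```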

But instead of catering to the needs of the giant cloud providers, Nvidia chose a different path.

All year, Amazon, Microsoft, Meta, Oracle, Tesla, IBM, and Google wanted to buy as many Nvidia GPUs as the company would sell them.

As Tesla boss Elon Musk put it in July…

“We’ll actually take Nvidia hardware as fast as Nvidia will deliver it to us.”

But instead of maxing out these clients’ order books, Nvidia allocated supply to CoreWeave, Lambda, and other smaller cloud-computing providers.

On the surface, this seems like a savvy strategy.

Nvidia reportedly got stock investments in each of those companies as part of the deal to supply them with its GPUs.

That’s made Nvidia a part-owner in these companies.

And what’s a good way to boost a startup cloud provider’s valuation in a year when AI server demand is red-hot?

Allocate your GPUs to the startup instead of selling them to Amazon, Microsoft, and Google!

But Nvidia is playing with fire.

It boosted the valuation of its equity stakes in CoreWeave and Lambda. But the largest cloud providers aren’t going to sit around and watch some startups gobble their market share without putting up a fight.

These big tech companies have the resources to build their own computer chips. And that’s exactly what they are doing.

Faster and More Efficient Than a GPU

Since 2016, Google has been designing and using an AI chip it calls a Tensor Processing Unit (TPU).

TPUs are custom-built for a type of mathematical operation, known as a tensor operation, that’s common in machine learning tasks.

This makes them faster and more efficient than GPUs for those workloads.

This means Google has little use for Nvidia GPUs. It can use its own TPUs instead.
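The tensor operations TPUs accelerate are mostly matrix multiplications, which neural networks chain together by the millions. A minimal sketch of one such operation in plain Python (a TPU performs this same pattern in dedicated hardware, at vastly larger scale):

```python
# A tensor operation: small matrix multiplication, the core workload
# TPUs are built to accelerate.

def matmul(a, b):
    """Multiply two square matrices the textbook way: each output
    entry is a sum of element-wise products (multiply-accumulate)."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

a = [[1, 2], [3, 4]]
b = [[5, 6], [7, 8]]
print(matmul(a, b))  # -> [[19, 22], [43, 50]]
```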

And like Google, Amazon has been designing and using its own custom-designed chips for years.

Now, Microsoft is expected to throw its hat in the ring with a custom AI chip set to launch later this year.

Even Tesla – an electric car maker – has been forced to design its own chips.

Bottom line: Nvidia backed the wrong horses in the AI race.

Backing startup cloud providers with early access to its GPUs isn’t a durable business model. Thanks to their scale, Amazon, Google, and Microsoft can offer steep discounts and parallel services to their cloud-computing clients that CoreWeave and Lambda can’t match.

Worse, Nvidia is throttling supply to the major players, further encouraging them to develop and advance their own silicon.

It’s a big mistake.

Nvidia should cater to the large cloud providers. This would lock in Nvidia GPUs as the go-to device for AI applications for decades.

Now, Nvidia has a bunch of giant competitors with tons of resources, deep customer relationships, and a reason not to use Nvidia’s products in the future.

This is why I’ve zeroed in on a company that’s set to be the “Next Nvidia.”

It has a more diversified product portfolio than Nvidia. It’s catering to the giant cloud companies in a way that Nvidia isn’t. And it’s already working with Microsoft and Meta on their AI projects.

You can buy this company’s shares before the market realizes Nvidia’s days as the top AI chipmaker are numbered. Wall Street will catch on soon, and by then it will be too late.

Paid-up subscribers of my flagship tech investing advisory, The Near Future Report, can read the full details here.

And if you’re not already a subscriber you can check out my presentation on the Next Nvidia here.

Regards,

Colin Tedards
Editor, The Bleeding Edge