Managing Editor’s Note: Before we get to today’s Bleeding Edge on the status of semiconductor stocks from senior analyst Nick Rokke, I wanted to put another chip-powered opportunity on your radar…
Specifically, it’s a new AI “super chip” that’s 50 times faster than NVIDIA’s and is so powerful that it could transform the artificial intelligence sector and establish a new order in the stock market.
You can go here to sign up to join Jeff as he gets into the details of this opportunity – including the firm behind this AI “super chip” – next Wednesday, May 21, at 10 a.m. ET.
Over the past few months, we’ve been unapologetically bullish here in The Bleeding Edge. And if you’ve been following along, it’s paid off handsomely.
Since the S&P 500 bottomed near 5,000 on April 8, the index has surged 18%. But semiconductor stocks have done even better.
We predicted this the day before the bounce in “Tariffs Are Here – And the Market’s Reaction Has Been Swift,” where we also highlighted three names where institutional money was quietly piling in…
One sector we’re watching closely right now is semiconductors […]
[The] Trump administration knows that semiconductors are the foundation of the AI arms race.
As Jeff has predicted, Elon Musk’s xAI will reach artificial general intelligence (AGI) within 12 months, and very possibly by the end of this year. That’s when they plan to have successfully linked together one million NVIDIA GPUs.
That’s the path to artificial superintelligence (ASI) – and it will represent a technological and economic edge that will shape the future for decades.
[…]
And that’s why this beaten-down sector – trading at deeply discounted valuations – is where smart capital is quietly returning…
We saw that today with major semiconductor companies like NVIDIA (NVDA), Micron (MU), and Broadcom (AVGO) all gaining on the day.
Things are a bit hectic right now. Emotions are running high and it’s triggering a lot of volatility. But we firmly believe the turnaround is coming.
NVIDIA (NVDA), Broadcom (AVGO), and Micron (MU) are now up an average of 41.6% from when we featured them.
If you acted on that insight, we’d love to hear from you. Tell us how you played it and what you’re doing with the profits. We love hearing these stories.
Now, for everyone wondering whether they’ve missed the move in semiconductors, the answer is short and simple…
No. Not even close.
We’re still in the early innings of the largest and most explosive infrastructure buildout in history.
One of the best forward indicators of semiconductor demand is the number of hyperscale data centers under construction.
These aren’t like traditional server closets tucked in the back of an office. Hyperscale data centers are sprawling, energy-intensive facilities – many the size of a dozen football fields. And they are packed with semiconductors, the most expensive chips being GPUs from NVIDIA and AMD.
The cost of these new data centers is reaching into the hundreds of billions of dollars. Yet demand just keeps increasing.
In 2024, hyperscalers completed 137 new data centers. And according to Synergy Research Group, another 504 are currently in development.
But it’s not just the number of data centers that matters. It’s the size and scale of each data center.
Five years ago, we measured data centers in megawatts (MW). Now, we measure them in gigawatts (GW). Power capacity has become the yardstick of scale.
Electricity is the single most important input for an AI factory. It’s the fuel, the food, that artificial intelligence requires to train, think, and ultimately evolve. The amount of energy a facility consumes – its wattage – is therefore a good proxy for the number of GPUs inside it.
Just look at what’s coming online within the next 18–24 months:
And that’s just in the U.S.
Abroad, the AI infrastructure race is just as intense. This week, the Trump administration was in the Middle East negotiating AI trade deals. One proposal: allowing the UAE to import 1.5 million NVIDIA chips – four times what was previously allowed under Biden-era controls.
Saudi Arabia’s Humain – a new global AI company under the Saudi Public Investment Fund (PIF) – has already ordered 18,000 of NVIDIA’s new GB300 chips and is working with AMD on a $10 billion AI infrastructure push, intending to accelerate global AI adoption.
And NVIDIA’s scaled-down H20 chip, designed to comply with U.S. export restrictions on sales to China, is set to ship in July. It already has an $18 billion backlog.
This isn’t just a United States AI boom. The entire globe is racing to build AI capacity. And wherever there’s AI infrastructure, there’s demand for semiconductors.
Bottom line: Wall Street still hasn’t grasped the full scale of what’s coming. Especially when it comes to semiconductor revenue. And to demonstrate this, let’s have a look at NVIDIA’s future revenue…
At NVIDIA’s recent GTC conference, CEO Jensen Huang dropped a stunning data point. He said that a 2-gigawatt AI data center costs around $100 billion to build – and NVIDIA captures half that value.
That number gives us a new framework for understanding NVIDIA’s future revenue. And it helps us challenge the conservative estimates from Wall Street analysts.
Consulting firm McKinsey has done great work modeling the growth of data center capacity through 2030. Their projections vary depending on the scenario.
Data Center Growth Projections | Source: McKinsey & Company
McKinsey’s mid-range estimate calls for 164 GW of new global hyperscale data center capacity by 2030. That alone is staggering.
Now, not every one of these facilities will be packed wall-to-wall with NVIDIA chips. Some will use AMD, Intel, Broadcom, or Marvell for inference and specialty workloads. And many will support traditional cloud computing.
But when it comes to training artificial intelligence, NVIDIA dominates with more than 90% market share.
So let’s stay conservative and assume NVIDIA powers only half of that future capacity. At $50 billion of revenue per 2 GW (the figure Jensen Huang outlined at GTC), that puts NVIDIA on track for $2.05 trillion (trillion – that’s not a typo) in cumulative data center revenue between now and 2030.
Wall Street’s consensus forecast? Just $1.81 trillion over the same period.
That means Wall Street’s consensus falls roughly 13% short of even these very modest assumptions.
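For anyone who wants to check that arithmetic, here’s a rough back-of-the-envelope sketch. The inputs are simply the assumptions laid out above – the McKinsey capacity figure, the 50% share, and Huang’s revenue-per-gigawatt rate – so treat it as illustrative math, not NVIDIA guidance.

```python
# Rough back-of-the-envelope check of the mid-range scenario described above.
# Assumptions (from the framework in this piece, not NVIDIA guidance):
#   - 164 GW of new hyperscale capacity by 2030 (McKinsey mid-range)
#   - NVIDIA powers only half of that capacity
#   - NVIDIA captures ~$50 billion of revenue per 2 GW (Jensen Huang's GTC figure)

new_capacity_gw = 164
nvidia_share = 0.5
revenue_per_gw = 50e9 / 2        # $50B per 2 GW -> $25B per GW

nvidia_cumulative_revenue = new_capacity_gw * nvidia_share * revenue_per_gw
wall_street_consensus = 1.81e12  # consensus cumulative data center revenue through 2030

print(f"Our estimate: ${nvidia_cumulative_revenue / 1e12:.2f} trillion")                 # ~$2.05 trillion
print(f"Gap vs. consensus: {nvidia_cumulative_revenue / wall_street_consensus - 1:.0%}")  # ~13%
```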
And while a 13% upside might not sound like a gamechanger… there’s something else to keep in mind.
At NVIDIA’s GTC keynote, Jensen Huang told us that in just one year – between 2023 and 2024 – AI compute demand increased by 100x relative to the industry’s prior expectations.
That’s not a rounding error. That’s a complete failure of forecasting.
And we’re seeing massive adoption of AI by people and businesses.
At a recent TED Conference in Canada, OpenAI CEO Sam Altman said, “Last we disclosed [months ago], we have 500 million weekly active users, growing fast.”
Then the host responded with an interesting comment… He said, “But backstage, you told me that it doubled in just a few weeks.”
That implies more than 1 billion users – or over 10% of the global population – are actively engaging with one AI model. And OpenAI is now projecting revenue to rise from $4 billion last year to $125 billion in 2029.
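As a quick sanity check on those figures, here’s the rough math. The world-population number and the straight-line compounding are our own ballpark assumptions, used only for context.

```python
# Quick sanity check on the user and revenue figures above.
# The world-population figure and the constant-growth path are rough assumptions.

weekly_active_users = 500e6 * 2        # 500M disclosed, then roughly doubled
global_population = 8.1e9              # approximate world population (assumption)
print(f"Share of global population: {weekly_active_users / global_population:.0%}")  # ~12%

# Revenue projection: $4B (2024) growing to $125B (2029)
implied_annual_growth = (125 / 4) ** (1 / 5) - 1
print(f"Implied annual revenue growth: {implied_annual_growth:.0%}")                  # ~99% per year
```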
It’s no wonder Altman recently said the GPUs are “melting.” Demand is off the charts.
In fact, Altman even joked on X that OpenAI is spending “tens of millions of dollars” just processing users who say “please” and “thank you” to the chatbot.
And it’s not just consumer curiosity driving this. A study from Stripe revealed that the top 100 AI startups on their platform reached $5 million in annualized revenue in just 24 months – a pace 50% faster than traditional software-as-a-service (SaaS) companies.
Source: Stripe Data
We’ve never seen anything like this.
The product-market fit is real. The economics are real. And the venture capital is flooding in accordingly.
As AI becomes more powerful and embedded in enterprise workflows, this exponential growth will continue.
That’s why we believe most analysts are dramatically underestimating how many AI data centers will come online in the next five years.
Let’s go back to McKinsey’s high-end scenario: 243 GW of new capacity by 2030.
Using our same framework – NVIDIA capturing half that capacity at $50 billion per 2 GW – puts their cumulative data center revenue at $3.04 trillion over that stretch.
That’s a 68% upside to current analyst projections.
And if we zoom in on the fiscal year ending January 2031, our estimates show some big numbers.
Even assigning a modest 30x EV/EBITDA multiple – below what many high-growth tech companies trade for today – NVIDIA would be worth $14.1 trillion.
That’s a 370% gain from today’s $3 trillion valuation. It would easily make NVIDIA the most valuable company the world has ever seen.
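Here’s the same back-of-the-envelope framework applied to McKinsey’s high-end case, along with the rough valuation math behind that $14.1 trillion figure. Note that the EBITDA line is simply back-calculated from the $14.1 trillion value and the 30x multiple – it’s an implied figure, not a published estimate.

```python
# Same framework applied to McKinsey's high-end scenario, plus the rough valuation math.
# Assumptions (illustrative, as laid out above):
#   - 243 GW of new hyperscale capacity by 2030
#   - NVIDIA powers half of it, at $50B of revenue per 2 GW
#   - A 30x EV/EBITDA multiple applied to the fiscal-2031 estimate

new_capacity_gw = 243
nvidia_share = 0.5
revenue_per_gw = 50e9 / 2                          # $25B per GW

cumulative_revenue = new_capacity_gw * nvidia_share * revenue_per_gw
print(f"Cumulative data center revenue: ${cumulative_revenue / 1e12:.2f}T")    # ~$3.04T
print(f"Upside vs. $1.81T consensus: {cumulative_revenue / 1.81e12 - 1:.0%}")  # ~68%

# $14.1T at 30x EV/EBITDA implies roughly $470B of EBITDA in the fiscal year
# ending January 2031 (back-calculated here, not a published estimate).
implied_ebitda = 14.1e12 / 30
print(f"Implied FY2031 EBITDA: ${implied_ebitda / 1e9:.0f}B")                   # ~$470B
print(f"Gain from today's ~$3T valuation: {14.1e12 / 3e12 - 1:.0%}")            # ~370%
```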
Sound crazy?
Maybe. But we’ve already been through one AI demand curve that analysts failed to predict. And there’s no reason to think it won’t happen again.
With that kind of long-term upside, does it really matter whether you bought NVDA at $100, $120, or even $140? Not really. The compounding effect of exponential growth will make up for it.
And NVIDIA is just one example.
We believe the entire semiconductor sector is underestimated and undervalued.
Why? Because most investors are still stuck using outdated frameworks. They haven’t updated their spreadsheets to account for the new AI buildout cycle. They’re underestimating both the scale of new data center construction and the semiconductor density of these next-gen facilities.
That’s a huge blind spot.
Meanwhile, the Philadelphia Semiconductor Index (SOX) has pulled back from its July 2024 highs. It’s now trading near the same valuation levels we saw in 2022 – before ChatGPT unleashed the AI megatrend into the public consciousness.
5-Year Chart of the Philadelphia Semiconductor Index (SOX) | Source: Bloomberg
So there’s still time to buy semiconductor stocks at valuations below their ChatGPT-era average. The stock market looks forward. Once investors realize this, they’ll buy in anticipation of those future earnings – and likely push valuation multiples back up to where they were in 2024, roughly 50% higher than today.
Semiconductors remain one of the biggest growth stories in the market right now – if not the biggest. Demand for AI will only increase as we approach and ultimately reach artificial general intelligence. That’s why we are heavily invested in semiconductor stocks across our Near Future Report and Exponential Tech model portfolios.
In other words, the market has given us a second chance.
The average semiconductor stock is still trading below its peak multiples from the early AI wave. But the business fundamentals today are far stronger. That’s why we know a rerating is coming.
Once institutional capital catches up to this reality, valuations will rise once again. And semiconductor stocks will soar.
As demand for AI continues to explode, and as we race closer to AGI, the companies powering this transformation will be the ones creating huge wealth.
One way to get exposure to the trend is the iShares Semiconductor ETF (SOXX). But it’s a very broad fund, holding a wide range of semiconductor companies across very different parts of the industry.
Subscribers to the Near Future Report and Exponential Tech Investor, however, know our top picks – the names with the most exposure and leverage to the most powerful trends in high tech, and the ones we believe will deliver the largest gains going forward.
Regards,
Nick Rokke
Senior Analyst, The Bleeding Edge
The Bleeding Edge is the only free newsletter that delivers daily insights and information from the high-tech world as well as topics and trends relevant to investments.