Managing Editor’s Note: Tomorrow night, Jeff is diving into all the details of the “Hyper Acceleration”…
It’s a market phenomenon we’ve only seen twice before in history – with the Industrial Revolution in the 19th century and with the 1990s tech boom.
Both times saw major technologies of the time converge… and fortunes were made.
And now, it’s happening again. We’re at a critical convergence point… and tomorrow night, August 20, at 8 p.m. ET, Jeff is sharing how we can make the most of this Hyper Acceleration period.
Just go here to automatically sign up to join him.
It’s official. AI agents are no longer experimental. They’re mainstream.
No coding. No workarounds. Just a few clicks. This shift will change how we work, how we consume news, how we shop online, and how the world uses data centers.
On June 16, Elon Musk’s xAI rolled out its new “Tasks” feature to all paying customers. It lets Grok automatically run daily, weekly, or monthly queries. And when the job is done, it emails the results straight to your inbox.
That was the starting gun. In July, OpenAI followed with its own AI agent, which Jeff recently explored in The Bleeding Edge – The Frictionless Future of Shopping. Soon after, Perplexity and Claude released their versions. What was once experimental is now mainstream.
Regular readers will remember that Jeff has been writing about agentic AI since July of last year – and about the underlying ideas for years before “agentic AI” even had a name. Back then, the technology was less accessible. For most, it was more theory than practice. Today, it’s real.
Think of an agentic AI as a super-smart digital assistant.
As mentioned above, Jeff has written at length about this technology in past issues of The Bleeding Edge…
Agentic AI, or agentic reasoning, is just what it sounds like.
The technology, the AI, is given agency. It is given the authority or directive to solve a problem or complete a task through a series of steps.
This differs from today’s LLM technology, which provides users with a zero-shot response. When we use something like ChatGPT, we give it a prompt, and then it returns a complete response. The response is based on the information from our prompt, along with its pre-trained knowledge, and returned in a matter of seconds.
An agentic workflow is quite different. It is an iterative process, where an agentic AI uses a more human-like workflow to accomplish a task.
Agentic AI is artificial intelligence built with a measure of autonomy. Enough that it can move and reason through an assigned task in a similar way to how a human would.
Unlike a typical chatbot, agentic AI doesn’t just answer questions. It can plan, adapt, and execute without you needing to micromanage and prompt at every step.
Tell your phone, “Plan my vacation to Hawaii.” A chatbot will give you a list of options.
An AI agent will research flights and book them based on your preferences. Then it will book the hotel, check the weather, build a packing list, and more… and adjust the itinerary on the fly if a flight gets delayed.
That’s the difference. LLM chatbots just respond. Agents act, learn, and react to handle complex, multi-step tasks autonomously.
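For readers who like to see the mechanics, here is a toy sketch of that structural difference – a single-shot chatbot call versus an agent loop that plans, executes, and adapts along the way. Every function here is a hypothetical stand-in, not any vendor’s actual API.

```python
# Toy illustration of chatbot vs. agent. All functions are hypothetical
# stand-ins; no real AI or booking service is called.

def chatbot(prompt: str) -> str:
    """One prompt in, one complete answer out -- no follow-up actions."""
    return f"Here are some options for: {prompt}"

def execute(step: str) -> str:
    # Stand-in for real tool calls (flight search, booking APIs, email...).
    return "flight delayed" if step == "book flight" else "done"

def agent(goal: str) -> list[str]:
    """Plan a task list, execute each step, and adapt when something changes."""
    plan = ["research flights", "book flight", "book hotel", "build packing list"]
    log = []
    for step in plan:                       # work through the plan...
        result = execute(step)
        if result == "flight delayed":      # ...observe the outcome...
            plan.append("adjust itinerary") # ...and extend the plan on the fly
        log.append(f"{step}: {result}")
    return log

print(chatbot("Plan my vacation to Hawaii"))   # a single canned answer
for line in agent("Plan my vacation to Hawaii"):
    print(line)                                # a multi-step, adaptive run
```

The chatbot returns one answer and stops. The agent works through a plan, notices the delayed flight, and adds a recovery step – the same loop, just simulated in a dozen lines.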
At home, that can mean tasking an agentic AI with planning meals, building the grocery list, and arranging delivery – taking the stress and time out of sorting out dinner for the week.
Taking it a step further, when we integrate agentic AI with autonomous robotics, your at-home robot assistant will be able to get the groceries, prepare the meal, and get everything on the table without you having to worry about a thing.
In the office, that means agents can manage your emails, schedule meetings, or even tackle work projects by breaking them down and completing them step by step.
Until recently, that kind of capability required an expensive service to connect an LLM with apps… And a working knowledge of programming. Not anymore.
Today, I use Grok to run daily scans for important technology and stock announcements. Here’s a screenshot of Grok with a red box showing where to click to set up a recurring task.
I also use Perplexity to summarize my carefully curated financial news emails each morning to get a jump start on the overnight news.
With OpenAI, it takes just a few clicks to link ChatGPT with other apps and let it take action. Plus subscribers can access Agent Mode by clicking where I’ve boxed below in the ChatGPT prompt area, then selecting Agent Mode.
And we can see it gives a few suggested prompts to help spur ideas for what the agent can do.
When asked a question, the OpenAI agent will open a fresh desktop and start trying to answer your question.
For example, I’m feeling the urge to go beach hopping in October. So I asked it to find the cheapest direct flight from South Florida to Hawaii for a seven-day vacation.
After an extensive search, the agent found that the nearest airport offering a direct flight to Hawaii was Orlando. The ticket would cost about $600.
It took the agent about 15 minutes to complete the task – and I didn’t have to sit there watching it think through each step. I’ve spent over an hour booking a flight myself before.
Then I asked it to book the flight, and it set up the purchase page for me to enter my payment information… Soon, agents will make purchases directly for us.
Welcome to the world of AI-agent to AI-agent interactions. And this is only the beginning.
AI agents are the next iteration of LLMs.
First, the LLMs could answer basic questions. Then came “chain-of-thought reasoning,” which let AIs show their work by breaking complex problems into logical steps. It was the foundation for deep research.
Now they’ve crossed another threshold: reasoning across multiple steps, coordinating with other agents, and acting beyond the original prompt.
That shift has massive implications.
Every time an agent plans its next move, it requires more compute. When it collaborates with another agent to complete a task, the compute requirements multiply.
Michael Intrator, CEO of data center operator CoreWeave (CRWV), which went public this March, recently described the impact:
We have seen a massive increase in our workloads that are being used for inference… the infrastructure that we’re building has increasingly been used for chain of reasoning, which is driving a substantial amount of consumption on the inference level.
Now take that chain of reasoning and add multiple agents working together. The demand for compute skyrockets.
Booking a plane ticket may not sound like the kind of complex task that needs to run daily or weekly. But other tasks will.
For instance, a software developer building a weather app could simply ask an AI, “Build me a simple weather app that pulls live data and displays a forecast.” Then the following would happen…
The orchestrator agent assigns one task to a research agent to find the best API, another to a code-generation agent, and another to a testing agent. It loops the process until the app works.
The same will happen across research tasks, which will demand even more compute than software programming. Already, we see biotech companies use AI to develop new drugs and judge the likelihood of successful drug trials.
That’s not science fiction. It’s already here for some and will soon be available to the masses. And when it arrives, the strain on compute infrastructure will be unlike anything we’ve seen before.
And that means one thing for investors: we’re going to need a lot more data centers, built at a scale few can imagine today.
The evidence of this shift is already here.
Last week, networking giant Cisco (CSCO) made it clear that agentic AI will bring about an entirely new level of strain on global infrastructure. CEO Charles Robbins explained it plainly:
As we move towards Agentic AI and the demand for inferencing expands to the enterprise and end-user networking environments, traffic on the network will reach unprecedented levels. Network traffic will not only increase beyond the peaks of current chatbot interaction, but will remain consistently high with agents in constant interaction.
That’s the critical point. Chatbots cause usage spikes because they only consume compute when actively prompted. AI agents, by contrast, will run constantly – billions of tasks a day – creating sustained demand… and a lot of it.
And it’s not just networking executives saying this. OpenAI CEO Sam Altman explained what it would take to succeed at a recent dinner with reporters:
You should expect OpenAI to spend trillions of dollars on data center construction in the not very distant future… We have to make these horrible trade-offs right now. We have better models, and we just can’t offer them because we don’t have the capacity.
Notice he didn’t say a trillion. He said trillions. And he was talking about the near future.
That’s the size of the secular trend we’re tracking. Investors are still quibbling over quarterly earnings. Meanwhile, the companies at the center of this buildout are preparing to spend, over the coming years, on a scale we’ve never seen before.
Morgan Stanley now projects that $2.9 trillion will be spent on data centers from 2025 to 2028. Roughly half of that will come directly from the cash flows of hyperscalers like Microsoft, Google, Amazon, Meta, xAI, Oracle, and a handful of others.
These projects are already underway. A hyperscaler doesn’t casually decide to break ground on a $10 billion facility. These decisions were approved months or even years ago. The spending is locked in.
This is why we don’t get lost in the noise of quarter-to-quarter earnings. The data center boom is a generational investment cycle. And for those positioned in the right infrastructure, semiconductor, and power companies, it will be one of the most profitable trends of our lifetime.
Even in the middle of a secular boom, markets don’t move in straight lines. August and September are historically weak months. A 5% pullback in the indexes would not surprise us here at Brownstone Research. And when that happens, some individual stocks could easily fall 20% or more.
But that’s not a sell signal. It’s a chance to buy great companies at a discount.
We saw this volatility last week with Applied Materials (AMAT). It’s one of the most critical companies in the semiconductor supply chain. AMAT makes the advanced equipment that chipmakers like TSMC, Samsung, and Intel depend on to fabricate their semiconductors.
Last Thursday, AMAT released earnings. The numbers were excellent. The company beat expectations on both sales and EBITDA. Yet the stock plunged 14% the very next day, its worst drop since March 2020.
The reason? Management issued cautious guidance for next quarter. Not because orders are weak, but because of uncertainty around what they can ship into China.
In other words, fear drove the sell-off. Not the actual fundamentals of the business.
That single earnings call dragged the entire semiconductor sector down 3% in a matter of days. Even NVIDIA, arguably the strongest growth story in the market today, suffered its first down week since May.
This is where most investors make the wrong move. At the first sign of turbulence, they panic and sell. They lock in losses and miss the life-changing gains that come from holding through volatility.
Some more “sophisticated” traders hedge by shorting a semiconductor ETF like the SPDR S&P Semiconductor ETF (XSD). Since April, short interest in that fund has grown from just 1% of shares outstanding to nearly 6%.
XSD is an interesting way to play this trend. Its top three holdings include Credo Technologies (CRDO) and Astera Labs (ALAB), on which Exponential Tech Investor subscribers are currently up 426% and 216%, respectively.
The third position is AMD (AMD), on which NFR subscribers are up a more modest 12%. AMD is a semiconductor powerhouse in AI and is trading at a very reasonable valuation considering its growth rate, gross margins, and incredible free cash flow growth.
But hedging isn’t the solution. It’s a distraction. The real prize is owning the companies at the heart of the AI buildout and holding them with conviction through the noise.
We can’t predict every twist in the market. We don’t know what Fed Chair Jerome Powell will say at Jackson Hole on Friday. That speech could send stocks rocketing higher or cratering lower… or have no impact at all.
That’s the nature of short-term volatility. It’s unpredictable. And in the context of a five-year secular trend, it’s irrelevant.
The five-year trend is crystal clear.
Multiple companies – including OpenAI – are preparing to spend trillions of dollars on new data centers. Networking demand will surge. Semiconductor innovation will accelerate. Power infrastructure will scale in ways we haven’t seen in decades.
This is the secular boom.
Yes, August and September may test investors’ resolve. Yes, some names will stumble. But when we look back five years from now, these dips will look like opportunities to buy into the foundation of a once-in-a-generation trend.
That’s why we don’t panic if the semiconductor sector sells off because of cautious guidance from a couple of companies.
Remember, as we discussed last week, this earnings season has been very strong. And earnings-revision momentum has surged to levels not seen since 2021.
Businesses are raking in record profits – in large part due to the AI infrastructure buildout and to companies using AI to increase efficiency and profitability.
And that’s why we keep holding and adding to the companies that will benefit most from the AI data center buildout.
The future is set… trillions in capital investment, exponential compute demand, and AI agents transforming daily life.
For patient investors, that’s the roadmap to enormous gains. And for consumers, life is about to get a lot easier.
Stay disciplined. Stay invested. And remember, we’re just getting started.
Regards,
Nick Rokke
Senior Analyst, The Bleeding Edge
The Bleeding Edge is the only free newsletter that delivers daily insights and information from the high-tech world as well as topics and trends relevant to investments.