The Next Phase of AI Begins

Colin Tedards | Jul 19, 2023 | Bleeding Edge | 3 min read

Dear Reader,

This week, Microsoft unveiled the AI add-on for its Office365 suite of products, including Word, Excel, and PowerPoint.

Known as Microsoft 365 Copilot, it’s a $30-per-user, per-month upgrade. Microsoft is starting off by targeting enterprise accounts.

365 Copilot uses the same OpenAI technology behind ChatGPT to answer emails, summarize meetings, and create PowerPoint presentations.

Adding Copilot raises the cost of a typical enterprise Office365 subscription by roughly 83%. But given everything we know about the capabilities of generative AI, I wouldn’t be surprised to see plenty of customers upgrade. And that could have important implications for the stock.

Last year, Office365 accounted for about 32% of Microsoft’s revenue. If just a third of Office365 users upgrade to Copilot, Microsoft could grow total revenue by roughly 8%.
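
For anyone who wants to check that math, here’s a quick back-of-envelope sketch using only the figures above. The roughly $36-per-user base price is inferred from the 83% increase, not a number quoted in this letter, so treat it as an assumption:

    # Back-of-envelope check of the Copilot revenue math above.
    # The ~$36 base price is inferred from the quoted 83% increase (an assumption).
    copilot_price = 30.0                       # $ per user per month
    implied_base_price = copilot_price / 0.83  # roughly $36 per user per month
    office_share_of_revenue = 0.32             # Office365's share of Microsoft revenue
    upgrade_rate = 1 / 3                       # assume a third of users add Copilot

    # Revenue lift = Office's revenue share x price uplift x share of users upgrading
    revenue_lift = office_share_of_revenue * 0.83 * upgrade_rate

    print(f"Implied base price: ${implied_base_price:.0f} per user per month")
    print(f"Estimated total revenue growth: {revenue_lift:.1%}")  # about 8-9%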

That’s why shares reached an all-time high of $365.

To me, this is an illustration of an important dynamic in the adoption cycle of AI.

The first stage of the AI cycle was focused on hardware makers. After all, companies like Microsoft, Google, Meta, Amazon, and Tesla have spent billions on chips and other components to build and train their AI models.

This is why Nvidia (NVDA) is up 220% this year alone. And it’s why I continue to like AMD (AMD) as a hardware play on the AI trend.

But hardware is only one aspect of this adoption cycle. After all, companies aren’t spending billions of dollars on AI hardware without a plan to monetize.

AI Is Expensive

ChatGPT is estimated to cost $700,000 per day to run. That breaks down to about $0.36 per question… or about $255 million per year just in operating costs. By comparison, each Google search costs just over 1 cent.
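
Those estimates hang together. Here’s a quick sketch, taking the quoted figures at face value:

    # Sanity check on the quoted ChatGPT cost estimates.
    daily_cost = 700_000        # estimated $ per day to run ChatGPT
    cost_per_question = 0.36    # estimated $ per question

    annual_cost = daily_cost * 365                       # about $255 million per year
    questions_per_day = daily_cost / cost_per_question   # about 1.9 million questions per day

    print(f"Annual operating cost: ${annual_cost / 1e6:.1f} million")
    print(f"Implied questions per day: {questions_per_day / 1e6:.1f} million")

At $0.36 a question, the $700,000-per-day figure implies ChatGPT is fielding on the order of two million questions a day.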

That’s on top of the roughly $13 billion Microsoft has invested in OpenAI.

Microsoft isn’t alone in spending heavily on AI.

Meta’s been developing its AI Research SuperCluster since 2020. The supercomputer has over 16,000 GPUs at an estimated cost of $240 million.

This supercomputer is helping Meta develop its large language model, Llama 2.

Elon Musk’s xAI bought 10,000 GPUs at an estimated cost of at least $150 million.

Google invested $300 million in Anthropic, an OpenAI rival.

These costs are just the beginning. All told, each of these companies will end up spending billions on AI to staff, operate, and upgrade the technology.

That means AI developers are going to have to monetize. And Microsoft’s release of Copilot tells us where we are in the monetization phase.

AI’s Payoff

Microsoft’s decision to announce its paid Copilot add-on this week makes perfect sense. The company reports quarterly earnings on July 25.

I expect the company to tell investors it has sunk even more into AI development than what we know about today.

If Microsoft didn’t have an answer for how it plans to monetize AI and offset these costs, it would risk analyst downgrades and a sell-off in the stock.

Right now, Copilot is an optional add-on. But in the next year or so, I expect Microsoft to make it standard for Office365. That move would boost its revenue by $52 billion, about 24% higher than last year.
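
For a sense of what that $52 billion projection assumes, here’s a quick reverse calculation. The implied seat count is a derived estimate, not a figure Microsoft or this letter provides directly:

    # Reverse-engineering the $52 billion Copilot projection above.
    copilot_price_per_year = 30 * 12   # $360 per user per year
    projected_boost = 52e9             # projected annual revenue boost ($)

    implied_seats = projected_boost / copilot_price_per_year
    print(f"Implied paying Office365 seats: {implied_seats / 1e6:.0f} million")  # ~144 million

In other words, the projection roughly assumes something like 144 million Office365 users paying the full $30 per month.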

That’s a massive opportunity for Microsoft, and it more than justifies the billions in investment costs.

I expect Google to follow suit with a paid AI service in the coming months. Meta is taking a different approach with its open-source Llama 2 model. In a future letter, I’ll cover exactly how Meta could monetize its AI.

The important thing to understand now is that we’ve just moved into the second phase of AI mass adoption. That means the focus will shift from just hardware companies like Nvidia to software companies that are finding ways to monetize this technology.

Regards,

Colin Tedards
Editor, The Bleeding Edge

