- Your future weatherman could be an AI…
- The maker of The Sims is jumping into the NFT gaming space…
- Google just made a power move…
Not only is 2021 proving to be a record year for venture capital financing and initial public offerings (IPOs), but it’s also now a breakout year for mergers and acquisitions (M&A).
Now that the numbers are in through the third quarter of this year, we know that $4.4 trillion of M&A activity has taken place in the past nine months. As we can see below, this exceeds the previous full-year record of $4.3 trillion, set back in 2015.
In addition to the deal value, the number of M&A deals is also at record levels. Year to date, over 40,000 deals have been announced. That’s already 24% higher than last year, and we still have three months to go.
The U.S. market has been the most active globally, responsible for $2.14 trillion of M&A out of the $4.4 trillion total.
I doubt Bleeding Edge subscribers will be surprised to learn that the largest sector for M&A deals is high tech. The reason is simple. High tech is where the fastest growth is happening, which means it is where the best investment opportunities are found.
And the pandemic has accelerated the adoption of bleeding-edge technology. This is forcing companies to refocus, restructure, and make strategic acquisitions to better position themselves for this accelerated pace of technological advancement.
Also interesting is that special purpose acquisition corporations (SPACs) have already reached record levels this year. Deals have totaled almost half a trillion dollars already. SPACs are considered M&A activity because they are reverse mergers that enable private companies to go public. And of course, the most active sectors in SPACs have been high tech and biotech.
What’s causing all this excitement? Several things are happening in parallel:
Near-zero interest rates, which push private equity to hunt for higher returns.
An abundance of capital, or "dry powder" – about $2 trillion – looking for a place to invest.
An economy that was so strong in 2020 that it shrugged off the pandemic and just kept going.
An acceleration of technological advancement that is requiring companies to restructure and acquire assets for future growth.
The fear that interest rates and taxes are going higher, which will make for less desirable conditions for M&A.
It’s the last point that presents the largest risk for our economy and the markets. If an egregious tax regime were to be put in place, coupled with a rapid increase in interest rates, I’m afraid we’re all in trouble.
Everyone loses in that scenario. That’s what we’re going to be keeping an eye out for…
DeepMind is taking on the weather…
Google’s U.K.-based artificial intelligence (AI) division DeepMind just revealed its next application for AI: the weather.
As a reminder, DeepMind’s initial claim to fame came when it mastered the game of chess. That turned out to be far too easy a task. So DeepMind set its sights on the incredibly complex game of Go.
What made Go such a challenge is that the game has more possible board positions than there are atoms in the universe. That sheer scale makes it impossible to calculate every possible outcome.
So the AI had to use pattern recognition to infer the best possible moves. And DeepMind’s AlphaGo AI crushed the reigning world champion of Go.
That led DeepMind to set its sights on massively multiplayer games. So it created AlphaStar, the AI that achieved “grandmaster” status in the game StarCraft II by demonstrating it could beat 99.8% of all human players. Again, this was a task that proved to be all too easy.
And most recently, DeepMind rolled out AlphaFold, which is quite possibly the biggest scientific development of the century. AlphaFold can predict the folded structure of a protein with 92.4% accuracy. This will lead to an onslaught of new therapeutic developments to cure diseases.
So DeepMind already has an incredible track record of using AI to solve complex problems. And now it is taking on the weather. This is possibly the toughest challenge yet.
Weather is so difficult to model because factors on one side of the planet contribute to conditions on the other side. Even small, localized events affect each other on a planetary scale.
So DeepMind is starting small. Its first task is to master the ability to predict rain within one or two hours in a given area. DeepMind calls this “nowcasting.”
Here’s a visual of how it works:
As we can see, DeepMind feeds national weather data from the past 20 minutes into its deep neural networks. The AI processes the data and then predicts exactly where and when it will rain over the next 90 minutes.
And this model has already proven to be accurate 90% of the time, making it the best tool we have for short-term rainfall prediction. That prompted DeepMind to publish a paper in the scientific journal Nature explaining how it works.
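To make the nowcasting framing concrete, here is a toy sketch in Python. The shapes and frame counts are illustrative assumptions on my part, and the prediction rule shown is the trivial “persistence” baseline (repeat the latest radar frame), not DeepMind’s actual deep generative model – it simply shows the input/output structure any learned nowcasting model works with and must beat.

```python
import numpy as np

def persistence_nowcast(past_frames: np.ndarray, horizon: int) -> np.ndarray:
    """Predict future radar frames by repeating the most recent observation.

    past_frames: (T, H, W) array of rainfall-radar frames
                 (e.g., T=4 frames covering the past 20 minutes).
    horizon:     number of future frames to predict
                 (e.g., 18 frames covering the next 90 minutes).
    """
    last = past_frames[-1]                       # most recent radar frame
    return np.repeat(last[None, :, :], horizon, axis=0)

# Hypothetical example: 4 past frames over an 8x8 radar grid
past = np.random.rand(4, 8, 8)
forecast = persistence_nowcast(past, horizon=18)
print(forecast.shape)  # (18, 8, 8)
```

A learned model replaces the `persistence_nowcast` step with a neural network trained on historical radar data, but the framing – a short window of past frames in, a 90-minute sequence of predicted frames out – is the same.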
This is a great start. And what gets me excited is that DeepMind now has a clear path forward.
One of the biggest challenges with modeling the weather is that it has been impossible to process all the data on a planetary scale. The datasets are simply too big for classical computers to handle.
Yet here we have powerful neural networks that are successfully processing weather data on a local level. Now we simply need the computing power to scale this up to something that can model weather patterns all over the planet. It’s just a matter of time.
The semiconductors that will power those supercomputers are being designed and built now. And with the advent of quantum computing, which is also advancing at an exponential pace, we’re going to be able to tackle problems that have been impossible to even attempt.
And once we are there, we will get some incredible insights into how Earth’s ecosystems work together and impact each other. That could inform sound decision-making on how we can be good stewards of our planet.
A legendary game designer just set his sights on NFTs…
I doubt many readers will be familiar with the name Will Wright. But I suspect more than a few will recognize his popular video game The Sims. It was a genre-defining game that originally launched back in 2000. It’s since sold nearly 200 million copies worldwide.
What made The Sims so popular was that it was a life simulation game. Players created an avatar and entered a world in which they built a digital life for themselves by making decisions and interacting with the people and objects around them.
To the industry’s surprise, Wright largely disappeared after creating just one more simulation-based game, Spore, in 2008. He didn’t follow it up with any next-generation games. Until now…
Through his company, Gallium Studios, Wright is now developing a game called Proxi. And the concept is fascinating.
The game empowers players to create their own environments by describing memories. The players start with a blank space and then fill it with memories, each creating its own “proxi.” Here’s a good visual of how it works:
Source: Gallium Studios
Here we can see that the player’s memory created a snapshot in time within the virtual world. And it doesn’t stop there.
Players create a conceptual map of many different memories, each with its own proxi. And then players can interact with each other and their proxis as they see fit.
And here’s the kicker – Wright is partnering with blockchain gaming company Forte Labs to make Proxi a blockchain game that uses non-fungible tokens (NFTs).
In fact, each memory-created proxi will be an NFT that is wholly owned by the player. What’s more, many of the items and decorations created in the proxi will also be NFTs. And players will be free to buy, sell, and trade these NFTs any time they want. This enables players to earn real money from their efforts.
This is certainly a unique concept. And given how category-defining The Sims was, I think Proxi is absolutely a game to watch. It could even be an accelerant for the adoption of NFT gaming applications.
Given how quickly new technology and games are adopted now, if Proxi goes viral, it could have more players in a year than The Sims had in two decades.
The game is set to launch before year-end. I’m very curious to see how it is received and what the gameplay is like.
And in the meantime, if any readers are interested in exploring the NFT trend, I recommend going here for more information.
Google Search is getting a major upgrade…
We have talked about great alternatives to Google’s search engine quite a bit recently. And Brave – the browser we have called the “Google killer” – just launched its own search index back in June. That was a big step toward bypassing Google.
Well, it seems Google is not going down without a fight. The tech giant is making a major upgrade to Google Search’s capabilities.
Google has been working hard on a new language model called MUM. That stands for Multitask Unified Model. MUM is 1,000 times more powerful than the language model Google was using previously.
Applied to Google Search, MUM will be able to infer what types of information would be useful to the person conducting each search. Then it will instantly scour the internet for bite-sized pieces of useful information and provide that to searchers in a “Things to know” section at the top of the screen. Here’s a visual:
“Things to Know” Search Results
Here we can see that somebody searched for “acrylic painting.” Before getting to its normal search results, Google used MUM to generate the “Things to know” section. We see that this section can answer specific questions that somebody interested in acrylic painting may have.
So if the person conducting the search was only interested in how to clean acrylic paint, they could click that dropdown to get specific information on how to do so. This saves them the time it would take to go through the normal search results, click links, and then read through the articles to find information on cleaning.
As regular readers know, I’m not a fan of Google’s business practices. But objectively, this is a fantastic feature. It puts useful snippets of information right at a user’s fingertips.
What’s more, MUM can prompt users to make their search more specific if the model determines that it is too broad.
For example, let’s say someone searched for “tomato gardening.” That’s a very broad search. MUM may come back with a list of prompts:
Are you searching for growing techniques?
Are you searching for information on fertilizers?
Are you looking for natural pesticides?
Clicking one of those prompts would allow MUM to provide more targeted information for the searcher. That will be very useful and a great time saver.
So Google just made a power move here. The company will deploy MUM by year-end, so it won’t be long until we see it in action.
Editor, The Bleeding Edge
Like what you’re reading? Send your thoughts to [email protected].