• A glimpse of Apple’s upcoming car technology…
  • Interior designers will love this technology…
  • The hottest tech topic in the second half of 2022…

Dear Reader,

We’re in for a tough week.

I wish it weren’t the case, but what has transpired in the last three days is pretty remarkable. The bad news kicked off with the consumer price index (CPI), which surprised the markets in the wrong way:

This is the worst reading we’ve seen since December of 1981. That was so long ago that I was a teenager and didn’t even know what the CPI was.

The prevailing expectation from the “experts” was that the Federal Reserve’s aggressive talk would reduce demand and inflation would start to decline. This is often referred to as demand destruction. Another common refrain is that “the cure for high prices is high prices.”

The problem is that these aren’t normal market conditions. Never before have we seen the kind of money printing that we’ve seen in the last two years. And inflation, at its core, is a monetary phenomenon.

The unexpected spike in the CPI has led to something much more worrisome. The mortgage-backed security (MBS) market went “no-bid” on Friday. That means that there were literally no buyers for MBS. 

The last time I remember this happening was back in 2008 amidst the financial crisis… Not good.

For the last couple of years, I’ve been strongly encouraging my readers to lock in a 30-year fixed-rate mortgage. It’s arguably the single smartest thing we can do in an environment of irresponsible monetary and fiscal policy. It protects against inflation, and we get to pay back our mortgage with devalued currency.

Today, it’s easy to understand why that was such a smart move. A friend thanked me for the advice this morning, as he locked in his 30-year at 2.875% last fall. Brilliant. 

Even as recently as January of this year, we could lock in a rate in the low 3% range. Now, rates are spiking above 6%, and anyone on a variable-rate loan is getting crushed.
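To put rough numbers on that squeeze, here is a quick sketch using the standard fixed-rate amortization formula. The $400,000 loan amount and the 6.25% rate are illustrative assumptions, not figures from the text:

```python
def monthly_payment(principal: float, annual_rate: float, months: int = 360) -> float:
    """Standard fixed-rate mortgage amortization formula."""
    r = annual_rate / 12  # monthly interest rate
    return principal * r * (1 + r) ** months / ((1 + r) ** months - 1)

loan = 400_000  # hypothetical loan amount for illustration
low = monthly_payment(loan, 0.02875)   # the rate locked in last fall
high = monthly_payment(loan, 0.0625)   # roughly where rates are heading
print(f"At 2.875%: ${low:,.0f}/mo; at 6.25%: ${high:,.0f}/mo; "
      f"difference: ${high - low:,.0f}/mo")
```

On these assumptions, the same loan costs roughly $800 more per month at the higher rate, which is why floating-rate borrowers are feeling the pain.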

If that weren’t bad enough, the University of Michigan Consumer Sentiment Index, not surprisingly, hit its lowest level on record.

And digital assets are feeling the pain as well. Bitcoin and Ethereum are being punished and have broken key support levels.

What’s next? All eyes are on the Federal Open Market Committee (FOMC) meeting this Wednesday and Thursday. And the Fed is in a very tight spot right now.

If it raises rates aggressively, it will inflict even more economic pain on everyone. And higher rates would risk an implosion of the housing market.

Or it can step in and engage in some form of quantitative easing to ease the pain. Back in 2008, the Fed’s answer to the MBS market going no-bid was to step in and buy all the MBS it could get its hands on. It printed money and expanded its balance sheet to get out of the crisis.

That actually worked. It’s not good in the long term, but it helped to address the short-term problem.

The real question is, when will the government choose to pay the price for its foolish policies? Will it happen in the second half of this year? Or will it be pushed out into 2023?

Both scenarios are possible, but I believe the pain would be too great to let the housing market fail. Sometime between now and September, the Fed is going to have to step in and accommodate to avoid an all-out collapse.

The feeling in my gut today reminds me a lot of what I felt back in 2008. I remember the day the market bottomed. I wasn’t an analyst at the time, so it was easy for me to stand up, step away from the computer, and go for a walk. These bouts of market panic never last.

A sneak peek at what Apple’s self-driving cars will look like…

Apple’s annual Worldwide Developer Conference (WWDC) took place last week. And as always, it did not disappoint.

WWDC is an event Apple hosts every summer to share its latest tech with its developer community. These are professionals who create software applications that run on Apple’s hardware.

And today, we’ll discuss three of the most exciting revelations from Apple’s big conference…

To start, one of Apple’s big announcements last week centers on the next generation of its CarPlay system. It gives us an idea of what Apple has been up to with its automotive strategy.

And Apple is taking it far beyond simple infotainment.

I suspect many readers will be familiar with CarPlay. It allows an iPhone to connect to a car’s user interface system. Apple says CarPlay is available in 98% of new cars sold in the US.

Here’s what the new and improved CarPlay looks like:

The New CarPlay System

Source: Apple

Here we can see that the next generation of CarPlay basically takes over the car’s entire control panel. To me, this makes perfect sense.

Most cars come with a rather clunky display system. For the carmakers, it’s largely an afterthought.

That’s where CarPlay comes in. It can replace the standard display with a clean and sleek design that is simple to use. This is what Apple is great at.

And this gives us a window into what Apple’s own self-driving car will look like on the inside. We can expect that it’s going to be as simple to operate as an iPhone. That will make it very easy for consumers to get acclimated to it.

Apple has already partnered with some automotive manufacturers for its new CarPlay system. We can expect to hear about these partnerships in the second half of next year. Given the timing of the announcement, we could see this new concept emerging in production cars as early as the 2024 model year.

And this is just the beginning of Apple’s foray into the automotive sector. These developments are going to lead to incredible investment opportunities as Apple reveals its new car. To learn more about how we’re going to position ourselves, simply go right here to see a recent presentation I gave on this topic.

More breadcrumbs on Apple’s AR front…

Continuing our theme on Apple’s WWDC, many were hoping that Apple would unveil its upcoming augmented reality (AR)/virtual reality (VR) headset at the conference.

That didn’t happen. There were some rumors that the current supply chain issues in China, exacerbated by the lockdowns in Shanghai, put production at risk. Given everything that we’ve seen in the first half of this year, that would make perfect sense.

However, software development wouldn’t be affected by these things, and Apple did reveal some compelling AR features. These announcements give us strong hints at the direction Apple is going with its approach to AR.

And one of the fantastic features Apple revealed combines AR software with lidar technology. This is the same 3D depth-sensing technology found in recent Pro-model iPhones.

Pairing AR with 3D sensing enables something Apple calls RoomPlan, a new Swift API. This tech can automatically generate a 3D map of a room.

Using just an iPhone, we can walk through our homes and create a digital 3D floorplan. And it’s not just the space. The 3D map identifies objects and their dimensions.

Here’s a visual:

3D Roomplan

Source: Apple
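To make “objects and their dimensions” concrete, here is a hypothetical Python sketch of the kind of structured scan data a room-scanning API like this might return. Every name, field, and number here is illustrative; it is not Apple’s actual RoomPlan output format:

```python
from dataclasses import dataclass

@dataclass
class ScannedObject:
    kind: str        # e.g. "sofa", "table" — the object the scan identified
    width_m: float   # dimensions in meters
    depth_m: float
    height_m: float

@dataclass
class RoomScan:
    floor_width_m: float
    floor_length_m: float
    objects: list  # ScannedObject instances detected in the room

    def floor_area(self) -> float:
        # Usable floor area of the mapped room
        return self.floor_width_m * self.floor_length_m

    def fits_along_wall(self, obj: ScannedObject) -> bool:
        # Naive check: does the object fit against the longer wall?
        return obj.width_m <= max(self.floor_width_m, self.floor_length_m)

scan = RoomScan(4.2, 5.6, [ScannedObject("sofa", 2.1, 0.9, 0.8)])
print(f"Floor area: {scan.floor_area():.1f} m^2")  # Floor area: 23.5 m^2
print(scan.fits_along_wall(scan.objects[0]))       # True
```

Once a scan is reduced to structured data like this, a design app can answer questions like “will this sofa fit?” in one line, which is the practical payoff of the feature.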

Of course, this has immense implications for interior design and home renovations.

Instead of photographs and hand drawings, Apple’s RoomPlan API would allow designers to map out homes with a level of accuracy that was not possible before. And it can be done in a matter of minutes.

This technology will enable better, more precise designs for renovation projects, and it even has useful applications for shopping for home furnishings. Once a room is mapped, new design templates can be overlaid onto it so that a consumer can see what the redesigned room will look like.

In addition, these 3D maps are perfect for training household robots.

We talked about Dyson’s foray into home robotics earlier this month. The big takeaway is that Jetsons-style robot butlers are closer than most people realize. And if these robots can ingest an accurate 3D map of their home, it will greatly improve their ability to navigate from room to room.

Lastly, this tech will be even more important once Apple’s AR headset comes out.

The combination of artificial intelligence with the lidar technology in our iPhones, and ultimately in AR eyewear, will enable all sorts of functionality. Apple’s AR eyewear will be able to map out any environment and sense depth for any object.

The most obvious application outside of commerce and design would be gaming. Mapping out an environment enables software applications, like games, to insert virtual objects and creatures into our field of view. A simple example would be playing Pokémon Go through AR eyewear instead of using our phones.

Or imagine wearing Apple’s new AR eyewear to a sporting event, where we can see a data overlay just like the one we see when watching an event on TV. There could be gaming and social media overlays, as well as cool interactive features like seeing how far a ball was kicked or thrown, all in real time.

Feature announcements like this are breadcrumbs that give us clear hints about what the future product will look like.

The iPhone can now translate foreign languages…

We’ll wrap up today with yet another cool feature revealed at WWDC last week. This one will also enable some great AR applications…

Apple just unveiled the iPhone’s auto-translate feature. This allows users to hold their phone’s camera over any text, and the camera app will instantly translate it into the user’s native language.

Here’s a visual:

Translation by Phone

Source: Apple

Here we can see the original text in the images on the left. And the image on the right shows us the text translated into English. This all happens right within the iPhone’s camera application.

As we can imagine, this feature could be invaluable for anyone traveling internationally. It will allow travelers to hold their phone up to a sign, a menu, or written directions in another country and immediately be able to read what they say.
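Conceptually, the pipeline behind a feature like this is simple: recognize the text in the camera frame, translate it, and redraw it in place. Here is a toy Python sketch with the text-recognition and translation steps stubbed out by a tiny dictionary; on a real device, on-device machine learning models would replace both stubs:

```python
# Toy stand-in for a translation model (French -> English here)
TOY_DICTIONARY = {"sortie": "exit", "menu du jour": "menu of the day"}

def recognize_text(image: dict) -> list:
    # Stub: a real OCR step would detect text regions in the camera frame.
    return image["text_regions"]

def translate(phrase: str) -> str:
    # Stub: a real translation model replaces this dictionary lookup.
    return TOY_DICTIONARY.get(phrase.lower(), phrase)

def camera_translate(image: dict) -> list:
    # Translate each recognized region, preserving the original order.
    return [translate(region) for region in recognize_text(image)]

photo = {"text_regions": ["Sortie", "Menu du jour"]}
print(camera_translate(photo))  # ['exit', 'menu of the day']
```

The hard parts in practice are the two stubs, which is exactly where Apple’s on-device silicon and machine learning work pay off.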

That said, we might wonder what this has to do with augmented reality…

Well, this feature will almost certainly be integrated into Apple’s AR headset product. And think about what can happen if the forward-facing camera can read and understand text.

I can envision walking into a café and looking up at the menu on the wall. The AR headset would be able to ingest the items listed and display nutritional information right in the lenses.

From there, we could order what we want directly through the headset. It would relay that order to the café and then pay for it using Apple Pay. And all this could happen in seconds. We would never need to take our phone or wallet out of our pockets.

I know that something like this might feel a bit odd to some of us, but the world is on the verge of becoming a lot more interactive through the use of this kind of technology.

And the younger generations are going to gravitate quickly toward this kind of technology. It disintermediates human interaction and puts transactions at our fingertips, or just a blink away.

Apple just showed us all the pieces of its AR system. All the functionality is in place. And now the developer community has everything it needs to create some amazing AR-based apps.

The only thing Apple didn’t demonstrate at WWDC last week was the actual AR headset. Whether Apple’s contract manufacturers can get enough control over their supply chains will determine whether the AR eyewear is ready by this fall or early next year. Either way, it’s right around the corner.

And as we begin seeing these bleeding-edge headsets appear around us, we are going to do our best to profit from this rising trend. And I see one easy way to do it, without even buying shares of Apple (AAPL)…


Jeff Brown
Editor, The Bleeding Edge

Like what you’re reading? Send your thoughts to [email protected].