• We just got new insight into the nature of our universe…
  • We’re getting closer to Star Trek’s universal translator…
  • Mojo’s AR contact lenses are coming along fast…

Dear Reader,

BNPL has become a popular term in the world of e-commerce. Short for “buy now, pay later,” it allows consumers to purchase products and pay for them in installments over a set period of time.

It’s easy to understand why this is attractive. Consumers can purchase products now – products they would otherwise have to save up for and buy later. And retailers are effectively able to pull consumption forward. They do this by offering attractive terms like “zero-interest payments” over a series of installments.

An Example of BNPL

Source: Affirm.com

This reminds me of the old-style financing deals that used to be offered by home appliance retailers or furniture stores. To take advantage of the terms, we had to fill out a lengthy credit application and get approved for the offer.

No more. BNPL companies have become pure technology plays… No applications required.

I’ve tested a few different platforms myself. It’s seamless. Just a couple more clicks, a few seconds for the approval, and we’re done. 

The process slides right into a normal e-commerce checkout flow. We hardly know it’s there, and it’s immensely useful when we need it.

BNPL is made possible by machine learning technology. In seconds, an artificial intelligence model assesses a consumer’s creditworthiness by pulling in a suite of data about the consumer – sometimes even their social media information – to build a risk profile and decide whether the “loan” should proceed. No human underwriters are required.
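
To make this concrete, here is a toy sketch of what that kind of automated underwriting looks like under the hood. To be clear, this is my own illustration, not any BNPL provider’s actual model: the features, labels, and approval threshold below are all made up, and real systems draw on far richer data.

    # Toy credit-scoring model (hypothetical features and labels)
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Made-up features: [order size ($), account age (months), prior on-time payments]
    X = rng.uniform([20, 0, 0], [2000, 120, 50], size=(500, 3))
    # Made-up labels: 1 = repaid; seasoned accounts repay more often in this toy data
    y = (rng.random(500) < 0.5 + 0.004 * X[:, 1]).astype(int)

    model = LogisticRegression(max_iter=1000).fit(X, y)

    # Score a new checkout in milliseconds, no human underwriter involved
    applicant = np.array([[350.0, 24.0, 6.0]])
    p_repay = model.predict_proba(applicant)[0, 1]
    print(f"P(repay) = {p_repay:.2f}, approve = {p_repay > 0.8}")  # cutoff is arbitrary here

Real underwriting models are trained on millions of historical outcomes rather than synthetic rows, but the shape of the process is the same: features in, a repayment probability out, and a threshold decision made in milliseconds.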

Businesses like this are always attractive to me. It’s a service that many consumers will need from time to time, it’s tech-enabled, and it doesn’t require additional human underwriters to scale. So naturally, I follow companies active in this space.

Affirm Holdings (AFRM) was one such company that went public early last year. It was growing like a weed, but there were two things I didn’t like: The gross margins were in the low-20% range, and the valuation made no sense.

The company was losing money, and it had negative free cash flow, yet it was trading at 23 times annual sales in the first quarter of last year. By the third quarter, that multiple had jumped to 33 times sales – a $32 billion valuation. Insane.

It was one of those companies that I knew would collapse. And not surprisingly, it has fallen more than 90% from its high last year.

And then there’s Square, now known as Block, which was already active in the space in the U.S. market. It stepped up and purchased an Australian BNPL player, Afterpay, for $29 billion last August. The acquisition was designed to strengthen the company’s international presence and extend it further into consumer markets.

Which leads me to Klarna, the giant in the space – or at least it was. Klarna had a remarkable run as a private company over the last five or six years, and was valued at an incredible $45 billion last summer. 

That valuation, however, was always suspect… as none other than SoftBank was behind the valuation. SoftBank is notorious for muscling into deals by paying absurd valuations in a frantic attempt to deploy capital as quickly as it can. It’s a strategy that has not proven to work well for the company.

And Klarna is a perfect example. We’ve just suffered through the worst first half of a year in the stock markets in more than half a century. Valuations for publicly traded growth companies were severely compressed, even more so for those companies that have negative free cash flow.

The private markets have now caught up. This is especially true for later-stage, high-growth private companies that are not yet generating free cash flow – like Klarna. Just this month, the company raised a new venture capital round at a mere $6.5 billion valuation. That’s an 86% decline from last year’s $45 billion mark.

The drop in valuation closely mirrors what we saw with publicly traded Affirm. This may sound counterintuitive, but I see this as a good thing. The valuations that we’re seeing now in both the public and private markets are more consistent with what we see around market bottoms.

The froth is definitely out of the private markets now as much as it is in the public markets… And that means that growth is ahead.

Digging deeper into quantum physics…

It’s 2022 and we’re still discovering what our universe is made of. And the latest exciting discovery was made in the world of quantum physics at the Large Hadron Collider (LHC) in Switzerland.

We last talked about the LHC back in February.

As a reminder, this is a massive underground particle collider shaped in a loop that runs for 27 kilometers. It is used to smash protons – and sometimes heavy ions – into one another, in a controlled environment, in an effort to discover new particles of matter. This machine allows physicists to run experiments in conditions similar to those just after the origin of the universe.

Here it is:

The Large Hadron Collider

Source: CERN

Running experiments at the LHC allows researchers to study the structure of matter and quantum mechanics – how particles interact with one another. And this allows us to better understand the nature and origins of our universe.

And here’s the big news… The LHC just discovered three new particles: one pentaquark and two tetraquarks.

Here they are:

New Particles

Source: CERN

These are absolutely tiny, ephemeral exotic particles built from the same building blocks as all matter as we know it. Until just a few days ago, we didn’t know that they existed.

As a refresher, matter is composed of atoms. Inside each atom is a nucleus made of protons and neutrons, with electrons orbiting around it. Protons and neutrons are themselves built from smaller particles called quarks – each contains three. A tetraquark binds four quarks together, and a pentaquark five. The fundamentals of quantum physics revolve around these subatomic particles.

Well, the LHC captured a glimpse of these pentaquarks and tetraquarks in the debris of its proton collisions. They appeared for only a fleeting fraction of a second and then decayed. Without the LHC, we wouldn’t have been able to detect them.

This concept of matter appearing seemingly out of nowhere – and disappearing in such an infinitesimally small period of time – is absolutely fascinating. This research is relevant to the origins of the universe and the Big Bang Theory, which posits that the universe expanded from an extremely dense, extremely hot initial state…

The Big Bang Theory

Source: Space.com

The Big Bang Theory is one of the most popular cosmological models for explaining the origins of the universe. And while cosmology might not be interesting to everyone, understanding the fundamental nature of matter – what we’re made of and how particles interact with one another – is very relevant to our daily lives. That’s why this research is so important.

The LHC was just recently turned back on for experimentation after a multi-year upgrade. More exciting discoveries will be made over the next three years or so before the LHC is taken offline again for its next round of upgrades.

We’ll have a lot to look forward to over the next few years between the discoveries here on Earth with the LHC, and out in space with the James Webb Space Telescope.

No language left behind…

Meta (formerly Facebook) just released a new artificial intelligence (AI)-based language model. And it’s quite ambitious.

Meta’s project is called “No Language Left Behind.” The AI can now translate directly between 200 different languages, including many obscure ones spoken by relatively few people.

The overarching goal here is to preserve all languages and dialects. But the implications are massive…

We talked about Google Translate just a few months ago. Google’s AI model can now accurately translate 133 different languages. Now we have Meta reaching even further with 200 languages.

These advancements are moving us toward a universal language translator… something out of Star Trek. This kind of software is bringing us closer to a world where everyone can understand each other, regardless of the language they speak.

Obviously, this is great for Meta’s metaverse. It would enable people from all over the world to interact with each other within Meta’s virtual world – without any language barriers whatsoever.

It’s also easy to envision this software working with augmented reality (AR) glasses.

The wearer of the glasses could listen to any spoken language and hear the words in their native tongue as the glasses translate. A small microphone would capture the audio, and a small speaker on the temple of the glasses, near the ear, would “speak” the translation in the wearer’s native language.

If we take this one step further and add computer vision to the mix, the AR glasses could also translate written text in real time. This would allow the wearer to read signs and menus in a language that’s foreign to them.
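
For a sense of how those pieces could fit together with today’s tools, here is a minimal sketch of the listen-translate-speak loop. It assumes OpenAI’s open-source Whisper model for speech recognition, Meta’s publicly released NLLB translation model via the Hugging Face transformers library, and pyttsx3 for speech output. These are desktop stand-ins for whatever purpose-built hardware and models AR glasses would actually embed, and “overheard.wav” is a placeholder file.

    # Minimal listen -> translate -> speak loop (desktop stand-in for AR glasses)
    import whisper                       # OpenAI's open-source speech recognizer
    import pyttsx3                       # simple offline text-to-speech
    from transformers import pipeline

    asr = whisper.load_model("base")
    translator = pipeline("translation",
                          model="facebook/nllb-200-distilled-600M",
                          src_lang="spa_Latn", tgt_lang="eng_Latn")
    voice = pyttsx3.init()

    heard = asr.transcribe("overheard.wav")     # audio from the glasses' microphone
    english = translator(heard["text"])[0]["translation_text"]
    voice.say(english)                          # played through the temple speaker
    voice.runAndWait()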

And there’s a nuance here that makes Meta’s model even more powerful. It can translate between any two of those languages directly. It doesn’t need to pivot through a base language to make it work.

Here’s what I mean…

Many translator services use English as their base language. These services translate the words into English first. Then they translate English into the desired language.

This makes for a simpler model, but there are two problems…

The first is that there’s a delay involved. Because there are two steps in this process, it takes a few moments to get the final translation. That can be awkward in real-time situations.

In addition, the final translation is often less accurate. That’s because certain words can get lost in translation when going through this two-step process.

So the ability to translate directly between languages speeds up the process and ultimately provides greater accuracy. That makes Meta’s AI language model incredibly powerful.
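
Meta released NLLB-200 checkpoints publicly, so we can see the difference ourselves. Here is a small sketch comparing a direct French-to-Spanish translation against the two-step pivot through English, assuming the Hugging Face transformers library and the distilled 600M-parameter checkpoint:

    # Direct translation vs. pivoting through English with NLLB-200
    from transformers import pipeline

    MODEL = "facebook/nllb-200-distilled-600M"

    direct = pipeline("translation", model=MODEL,
                      src_lang="fra_Latn", tgt_lang="spa_Latn")
    fr_to_en = pipeline("translation", model=MODEL,
                        src_lang="fra_Latn", tgt_lang="eng_Latn")
    en_to_es = pipeline("translation", model=MODEL,
                        src_lang="eng_Latn", tgt_lang="spa_Latn")

    text = "Ce modèle traduit sans passer par l'anglais."
    print(direct(text)[0]["translation_text"])        # one hop
    english = fr_to_en(text)[0]["translation_text"]   # hop 1
    print(en_to_es(english)[0]["translation_text"])   # hop 2

The pivot path runs the model twice, which is exactly the latency and error-compounding problem described above.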

We can expect to see some big developments around this in the months to come.

The next generation of augmented reality is advancing…

We’ll wrap up today with an exciting update on Mojo Vision. We’ve been tracking this company for about a year now.

As a reminder, Mojo Vision is developing AR-enabled contact lenses. And the latest development is that the first prototype has been “productized.” It is going through human testing right now.

It’s somewhat ironic… We haven’t yet seen the launch of a mass-market AR headset, and we’re already talking about AR-enabled contacts – the next generation of AR. This tells us just how much opportunity there is here.

And let’s look at Mojo’s first product. Here’s a beautiful picture:

Mojo’s Product

Source: Mojo Vision

Here we can see the AR-enabled lens is about the same size as a regular contact lens. But look at all that green around the edges… Those are the tiny circuits that enable the augmentation, along with a tiny micro-battery to power the semiconductors.

The very first person to step up and test these contacts was Mojo’s CEO, Drew Perkins. To start with, he’s wearing only one lens at a time, for one hour at a time… Best to start slow.

But Perkins is very encouraged by what he’s seen. Mojo’s contact lens sports a micro-LED display with 14,000 pixels per inch – and the display itself is only about half a millimeter in diameter. That makes it the highest-density display ever created.
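
To put that spec in perspective, here is a quick back-of-the-envelope calculation (my arithmetic, not Mojo’s) of what 14,000 pixels per inch means at that size:

    # Rough pixel count across a 0.5 mm display at 14,000 ppi
    ppi = 14_000                  # pixels per inch, per Mojo Vision
    diameter_in = 0.5 / 25.4      # half a millimeter, converted to inches
    print(f"~{ppi * diameter_in:.0f} pixels across")  # prints "~276 pixels across"

That’s roughly 276 pixels packed across a dot half a millimeter wide.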

And right now, the contact lens supports some interesting capabilities.

One feature that I really like is the ability to display a compass in the wearer’s field of vision. That way, the wearer can quickly determine which direction they are facing. If we take this a step further, it’s easy to envision receiving turn-by-turn directions for a street, a station, or a highway right inside the contact lens. That would eliminate the need to look down at a screen.

The contact lens can also display text messages right in the user’s field of view. That’s another great feature.

Obviously, this is especially useful when we are driving. We can read our texts without needing to fumble with our phones or take our eyes off the road.

So Mojo Vision’s first AR-enabled contact lens product is certainly on the right track. It will take years before we start to see something like this adopted by the mass market, but there is clearly a path to getting there. The key is further miniaturization of the technology inside the lens.

As we have discussed before, the leading edge of semiconductor manufacturing right now is 5 nanometers (nm). Companies like TSMC will roll out 3 nm chips later this year. And we’ll see the first 1 nm chips by 2024.

The key here is that smaller manufacturing nodes will shrink the electronics that go into items like Mojo Vision’s AR-enabled contacts even further. That means Mojo will be able to fit even more functional electronics into its contact lenses two generations out.

And that means that functional, lightweight contacts like the ones Mojo Vision has been developing will become practical in the second half of this decade.

This will be the next evolution of AR technology, following the AR eyewear that we are already beginning to see enter the market today.

Regards,

Jeff Brown
Editor, The Bleeding Edge