• Google Brain’s freakish text-to-image AI …
  • Will we all have “souls” in the future?
  • This startup is taking a shot at DoorDash…

Dear Reader,

Never before have so many eyes been focused on the actions of the Federal Reserve. With market volatility and government-induced inflation, the markets are watching with both fear and hope to see what the Fed may or may not do.

Earlier this year, the consensus was that the Fed would conduct 11 rate hikes throughout the course of 2022. Since late last year, I’ve maintained that was never going to happen.

And the latest rumblings are becoming far less hawkish.

It’s certainly possible that the Fed pushes rates up by another 25 or 50 basis points at the June and July Federal Open Market Committee meetings. But if it does, I’ll go so far as to predict that it will end up pivoting by September anyway, injecting stimulus into the economy again and even dropping rates.

To put things in perspective, I had a look at the actions of the Federal Reserve over the last decade. In particular, let’s look at a rolling chart of the S&P 500’s percent change year over year. Put more simply, the chart below shows how far the S&P 500 sits above or below where it stood one year earlier:

S&P 500 Percent Change Year Over Year
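In case the metric isn’t familiar, here’s a quick sketch of the calculation behind that chart. The prices below are made-up placeholders, purely to illustrate the math:

```python
# Rolling year-over-year percent change: how far the index sits above
# (positive) or below (negative) its level one year earlier.
# The prices here are hypothetical, not actual S&P 500 closes.

def yoy_percent_change(price_today: float, price_year_ago: float) -> float:
    """Percent the index is over or under its level a year ago."""
    return (price_today / price_year_ago - 1) * 100

# Illustrative closes one year apart
print(yoy_percent_change(3900.0, 4200.0))  # about -7.1 -> negative territory
print(yoy_percent_change(4300.0, 4200.0))  # about +2.4 -> positive territory
```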

What we find is very useful.

In 2015, we can see how the chart dropped into negative territory. Fed Chair Yellen had been openly signaling her intent to start raising the Fed Funds rate after it had been kept at zero since the end of 2008.

Zero-interest rate policy (ZIRP) was the aftermath of the global financial crisis… that was ironically caused by the Federal Reserve, government policies, and investment banks.

Yellen held off on raising rates when the S&P 500 went negative year-over-year, ultimately raising them by just 25 basis points in late 2015. She did the same in 2016, delaying further hikes to calm the markets.

The Fed then went on an aggressive course, raising the Fed Funds rate to 225 basis points over 2017 and 2018 (with Chair Powell taking over in early 2018), only to crash the stock markets in the fourth quarter of 2018.

And what did he do when the year-on-year returns became persistently negative? He quickly reversed course and dropped rates back down to zero in even less time than it took to raise them!

Of course, in the spring of 2020, the markets crashed again due to policy and politics around the pandemic. When the year-on-year returns turned strongly negative, the Fed stepped right back in and started flooding the country with trillions of dollars of stimulus. Party time!

And here we are again. 

With all of the aggressive talk, we’re already in June and the Fed Funds rate is just at a measly 75 basis points. It’s basically free money for those that can access it. But the year-on-year S&P 500 returns have dropped into negative territory again.

And that’s how we know something is coming.

Powell is going to pivot again, and it will happen in the second half of this year. I believe we’ll see it by September at the very latest. We can definitely expect stimulus and an expansion of the Fed’s balance sheet, and I believe we’ll see a reversal of course in the Fed Funds rate as well.

Short term, this will be good for the markets, which have been expecting exactly the opposite. Long term though, we’re eventually going to have to pay the piper.

Google is going head-to-head with OpenAI…

Google just made another major development in artificial intelligence (AI). And this one seems to put the tech giant in direct competition with OpenAI…

Regular readers may remember OpenAI’s text-to-image generator, DALL-E. It’s an AI system that can generate photo-quality images from scratch, simply using text input.

For example, someone typed in “Teddy bears mixing sparkling chemicals as mad scientists, steampunk,” and here’s what the AI produced:

DALL-E-Produced Image

Source: OpenAI

It’s amazing to think that an AI produced this image in a split second from its own “mind.”

But it turns out OpenAI isn’t the only one building such a powerful system…

Google Brain – another AI division within Google (separate from DeepMind) – just released what it calls “Imagen.”

It is Google’s text-to-image generator. And it’s even more advanced than DALL-E.

Check it out:

Imagen-Produced Graphic

Source: Google

This looks like a photo somebody snapped from their phone… But it’s entirely AI-generated. Google Brain created it using this text: “A blue jay standing on a large basket of rainbow macarons.”

Here’s another great example…

Another Imagen-Produced Graphic

Source: Google

This image came from the text, “A photo of a raccoon wearing an astronaut helmet, looking out of the window at night.”

It’s incredible to see the AI work from such a specific – yet complex – request. And again, this almost looks like it could be a real photo.

As much as I dislike Google’s business practices, I must give them credit here: Imagen outperforms OpenAI’s DALL-E head-to-head. In fact, it’s better than anything else we have seen.

This progress is indicative of how quickly advancements are happening with deep learning and neural networks. The team at Google Brain found a way to effectively amplify the quality of the images, using super-resolution models that add detail and clarity.
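For a rough mental model of how that pipeline fits together, here’s a minimal sketch based on Google’s public description of Imagen: a frozen text encoder conditions a small base image model, and two super-resolution stages progressively add detail. The function bodies below are placeholders for illustration only, not Google’s actual code:

```python
import numpy as np

def encode_text(prompt: str) -> np.ndarray:
    # Stand-in for a frozen large language-model text encoder.
    # Here we just derive a dummy embedding from the prompt.
    rng = np.random.default_rng(abs(hash(prompt)) % (2**32))
    return rng.normal(size=128)

def base_model(text_embedding: np.ndarray) -> np.ndarray:
    # Stand-in for the base diffusion model that generates a small
    # 64x64 image conditioned on the text embedding.
    return np.zeros((64, 64, 3))

def super_resolution(image: np.ndarray, text_embedding: np.ndarray, size: int) -> np.ndarray:
    # Stand-in for a super-resolution diffusion model that upscales the
    # image while adding detail, again conditioned on the text.
    return np.zeros((size, size, 3))

def generate(prompt: str) -> np.ndarray:
    emb = encode_text(prompt)
    img = base_model(emb)                   # 64x64 draft image
    img = super_resolution(img, emb, 256)   # sharpen to 256x256
    img = super_resolution(img, emb, 1024)  # sharpen to 1024x1024
    return img

print(generate("A blue jay standing on a large basket of rainbow macarons").shape)
```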

On a practical level, this technology could empower the graphic design industry in a way that’s never been possible before.

Gone are the days of spending weeks or even months producing designs. Instead, an AI can do the work in seconds. It just needs a description from a designer, and then it can produce a number of images that can be used or edited.

And I’m sure there will be some fun consumer applications that leverage this kind of technology. It’s easy to envision all kinds of social media applications being built from this. Consumers would go crazy over the ability to instantly create GIFs for social media that are specific to something that is happening.

And when that happens, readers of The Bleeding Edge will be among the few who understand how it works behind the scenes.

Ethereum’s founder has an interesting vision for the future of society…

Vitalik Buterin, the co-founder of top-tier digital asset Ethereum, just co-authored a new research paper. And it’s a major thought piece that envisions a new use case for blockchain technology.

The paper is titled “Decentralized Society.” Some are referring to it now as “DeSoc.”

The core idea here is that every individual has unique credentials and experiences that are a representation of who they are and what they have accomplished.

However, in the world we live in today, there’s no mechanism for documenting and validating our key credentials. We can list these things on a resumé, but there’s no way to know if they are true without an extensive amount of effort.

The DeSoc vision solves this problem with what the paper calls “soulbound tokens” (SBTs). These are non-fungible tokens (NFTs) that serve as certifications for an individual’s past accomplishments.

For example, we would each receive an SBT when we earn a bachelor’s degree. We would hold that token in our digital wallet, which is our “soul.” Then, anyone who wanted to verify our bachelor’s degree could simply look at our corresponding SBT, which would be issued by the university that we attended.
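To make those mechanics concrete, here’s a minimal sketch of how an SBT registry might behave under the simplest reading of the idea: a credential is bound permanently to one wallet (the “soul”), issued by an institution, and checkable by anyone. The names and structure are my own illustration, not the paper’s specification and not real Ethereum code:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SoulboundToken:
    issuer: str        # e.g., the university that granted the degree
    holder: str        # the wallet ("soul") the credential is bound to
    credential: str    # e.g., "B.S. Computer Science, 2021"

class SoulRegistry:
    def __init__(self):
        # Maps each soul (wallet address) to the tokens bound to it.
        self._tokens = {}

    def issue(self, issuer: str, holder: str, credential: str) -> SoulboundToken:
        token = SoulboundToken(issuer, holder, credential)
        self._tokens.setdefault(holder, []).append(token)
        return token

    def transfer(self, token: SoulboundToken, new_holder: str):
        # "Soulbound" means non-transferable by design.
        raise PermissionError("Soulbound tokens cannot be transferred")

    def verify(self, holder: str, issuer: str, credential: str) -> bool:
        # Anyone (an employer, for instance) can check a claimed
        # credential against what the issuer bound to that soul.
        return any(t.issuer == issuer and t.credential == credential
                   for t in self._tokens.get(holder, []))

registry = SoulRegistry()
registry.issue("State University", "0xALICE", "B.S. Computer Science, 2021")
print(registry.verify("0xALICE", "State University", "B.S. Computer Science, 2021"))  # True
print(registry.verify("0xALICE", "State University", "M.S. Computer Science, 2023"))  # False
```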

Of course, the same thing would happen for master’s degrees and all other forms of higher education. And this is just the tip of the iceberg…

I have talked about the future of work in these pages before. It’s clear that we are on the cusp of a massive shift. The paradigm of earning a degree in a specific field and then working our entire career in that field is deteriorating.

Instead, we will each work on many different projects in different fields throughout our lives. And that means we will need to constantly train, retrain, and learn new skills.

In most cases, this will happen by completing specialized courses and programs. The days of going back to school full-time to change careers are over. Technology is advancing far too rapidly for that traditional construct to be viable much longer.

And with this increasingly complex landscape, we will need a way to quickly document and verify all those specialized courses and programs that we have completed. That’s what SBTs could do for us.

When a tech startup is looking for developers, it can look at each candidate’s portfolio of SBTs to gauge their skill levels.

We can think of this as a “proof of work” system. It’s a way to prove that we have done the things we say we have done.

I see this as a fantastic idea… if used in the right way.

There is a risk that is deeply concerning, however. Authoritarian governments could co-opt this technology and convert it into a China-style social credit system.

Instead of using it as a proof of work system, governments could use it as a “proof of compliance” system. And then they could cut off access to civil services for people who haven’t done all the things the government says they should do.

For example, there are reports of Chinese citizens having their scores docked because they voiced displeasure with a certain government program or initiative. There are other cases of people being penalized for jaywalking instead of crossing the street at a crosswalk.

So we are walking a very fine line here with this idea. It’s all about how we use the technology.

I would love to hear what readers think about this kind of vision.

Is Vitalik’s “DeSoc” something that you think could be useful? Write to me with your thoughts right here.

This company is set to disrupt DoorDash on last-mile deliveries…

We’ll wrap up today with big news on the autonomous delivery front.

Refraction AI just revealed that fast-food giant Chick-fil-A is deploying a fleet of Refraction’s autonomous delivery vehicles for two of its restaurants in Austin, TX.

And make no mistake about it, this is a direct shot at DoorDash.

We profiled Refraction AI way back in July 2020. At the time, the company was making great strides with its REV-1 robots. These are autonomous three-wheeled vehicles designed for last-mile deliveries.

Here’s a look at one in action:

Refraction’s Autonomous Delivery Vehicles

Source: Refraction AI

We can see that the technology is fantastic. The REV-1 is perfect for deliveries within a three-mile radius of any store or restaurant.

And it’s not just the tech that’s setting Refraction AI up to disrupt DoorDash. Just as important is the business strategy…

DoorDash operates as a middleman. When consumers place an order for delivery, DoorDash matches them up with a delivery driver who will pick up the food and bring it to their doorstep.

This is certainly a valuable service, but it adds costs to both consumers and restaurants. DoorDash takes about 15% of every order in commission. That means less revenue for restaurants and higher prices for consumers.

What’s more, DoorDash deliveries tend to take a while.

DoorDash has to route each order to its pool of delivery drivers, and there’s no guarantee that any of them will be close. And most drivers are juggling multiple deliveries. As a result, DoorDash deliveries can often take 45 minutes to an hour.

That means the food has cooled down by the time consumers get it, which reduces the quality of the customer experience. And consumers tend to blame this delay on the restaurant, not DoorDash.

That’s why Refraction AI’s REV-1 makes so much sense. Restaurants like Chick-fil-A can keep a fleet of vehicles on location so that deliveries can go out immediately. The goal is for consumers to get their food within 10–15 minutes while it is still warm.

In addition, Refraction AI doesn’t take 15% of each order. Instead, it works out a separate licensing deal with restaurants. That model is preferable for restaurants operating at scale like Chick-fil-A.
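To put rough numbers on that difference, here’s a back-of-the-envelope comparison. The 15% commission is the figure mentioned above; the flat per-delivery licensing cost is a purely hypothetical placeholder, since Refraction’s actual contract terms aren’t public:

```python
def revenue_under_commission(order_total: float, commission_rate: float = 0.15) -> float:
    # What the restaurant keeps when a marketplace takes a cut of each order.
    return order_total * (1 - commission_rate)

def revenue_under_licensing(order_total: float, per_delivery_cost: float) -> float:
    # What the restaurant keeps when it pays a flat per-delivery licensing cost.
    return order_total - per_delivery_cost

order = 40.00  # illustrative order size
print(f"Commission model: ${revenue_under_commission(order):.2f} kept")       # $34.00
print(f"Licensing model:  ${revenue_under_licensing(order, 3.00):.2f} kept")  # $37.00 (hypothetical $3 fee)
```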

And consumers don’t need to tip their REV-1 delivery “drivers.” They couldn’t even if they wanted to. This means less cost and friction for consumers.

So Refraction’s business model makes sense from a number of angles. I think it has serious potential. We’ll continue to track this company very closely going forward.

Regards,

Jeff Brown
Editor, The Bleeding Edge


Like what you’re reading? Send your thoughts to [email protected].