• Will 6G’s range be the next tech challenge?
  • How to optimize our warrant coverage…
  • Soon, we’ll use genetic editing to lower our cholesterol…

Dear Reader,

Welcome to our weekly mailbag edition of The Bleeding Edge. All week, you submitted your questions about the biggest trends in technology.

Today, I’ll do my best to answer them.

If you have a question you’d like answered next week, be sure you submit it right here.

But before we turn to our mailbag, I want to make readers aware of a special opportunity coming up…

Right now, we’re seeing a convergence of two forces… and they are impacting a critical sector of small, early-stage technology companies. These stocks are unique because, thanks to the federal government, they have a countdown timer attached to their share price.

And when that timer ticks down to zero… these “timed stocks” can skyrocket – increasing hundreds or even thousands of percent in as little as a day.

And because of this convergence, the profits from these stocks are getting bigger… they’re happening more often… and they’re moving faster than ever before.

This is a space I want all of my readers to know about. That’s why, on Thursday, March 18, at 8 p.m. ET, I will be hosting a special investment summit I’m calling Timed Stocks: Final Countdown.

There, I’ll tell attendees how to identify these timed stocks… and how to start profiting from them right now. I promise this event will be worth your time.

Please go right here to reserve your spot, and then mark March 18 on your calendar.

I look forward to seeing you there.

Now let’s turn to our mailbag questions…

A challenge for 6G…

Let’s begin with a question on the future of sixth-generation networks:

Good evening, Jeff. Although 6G may be the logical progression from 5G, under the currently available technologies, I do not see 6G being a practical alternative with a range of 500 feet.

In my opinion, an alternative solution to the tower needs to be found. For example, a network of low-flying drones. But they would need to be wirelessly powered from base stations. The next technology challenge?

 – Alan H.

Hi, Alan, and thanks for writing in.

To catch up other readers, I wrote recently about how I’ve been researching sixth-generation wireless technology (6G). While the next generation of our networks is still years out, I always make sure to study technology well in advance – before other analysts are even aware of the opportunity.

This enables me to be deeply entrenched in the technology, understand the competitive landscape, and of course know which companies present the best investment opportunities. It’s a key part of how I keep my subscribers ahead of the curve.

And it’s how we’ve achieved gains like 63%, 85%, and 125% on 5G plays in our Near Future Report portfolio and 89%, 187%, and 228% in our Exponential Tech portfolio. (If any readers would like to find out about my top 5G plays right now, go here for the details.)

In my research on 6G, one of the key talking points is the frequencies at which it will operate – namely, terahertz frequencies.

6G deployed over terahertz frequencies will be even faster than the lightning speeds of 5G. We can expect 6G network speeds of at least 10 gigabits per second (Gbps). That’s at least 10 times faster than the speeds at which I have tested 5G around the country on the mmWave frequency bands.

And to achieve this, a network operating at these frequencies will require even more base stations than 5G, because higher-frequency signals attenuate more quickly and cover shorter distances.

With 5G, base stations tend to provide high-performance coverage with a radius of about 700–1,000 feet. With 6G operating at sub-terahertz and terahertz frequencies, base stations will likely have a high-performance range of 300–500 feet.

And since coverage area scales with the square of a base station’s range, that means we’ll need at least twice as many base stations as 5G – and likely closer to four times as many – to cover the same geographical area with 6G.
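For readers who like to see the arithmetic, here is a minimal back-of-the-envelope sketch. It assumes each base station covers a simple, non-overlapping circular area – a deliberate simplification – and the radii are just the illustrative ranges cited above:

```python
import math

def stations_needed(area_sq_ft: float, radius_ft: float) -> int:
    """Rough count of base stations to blanket an area, assuming each
    station covers a non-overlapping circle of the given radius."""
    coverage_per_station = math.pi * radius_ft ** 2
    return math.ceil(area_sq_ft / coverage_per_station)

# Illustrative only: one square mile, using the upper end of each range above.
area = 5280 ** 2                       # square feet in one square mile
five_g = stations_needed(area, 1_000)  # 5G high-performance radius
six_g = stations_needed(area, 500)     # 6G high-performance radius

print(five_g, six_g)                   # 9 vs. 36 -- roughly four times as many
```

The exact counts don’t matter – real deployments overlap coverage and follow streets and buildings – but the square-law relationship is why halving the range roughly quadruples the number of stations.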

As for your question, Alan, I agree that it is hard to imagine doubling the number of 5G base stations, which will already be roughly five to 10 times more numerous than 4G base stations.

Yet 10 years ago, there were some who thought that small cell architecture would never be practical for 5G.

It might be a surprise to learn that in some places, we already have wireless networks with base stations that are within 500 feet of one another. I lived in Japan for the better part of 20 years, and wireless network coverage was extraordinary.

Subway tunnels, underground shopping centers, dense office buildings and complexes all had perfectly contiguous coverage. And the reason was that the network was built out for dense populations.

Countries with dense urban populations have actually already done a lot of that infrastructure development for small cell architectures. It is much more common than we think.

And as we think about overall network coverage, it won’t be the same everywhere. Busy urban areas with dense populations will be built out for full 6G coverage, but it just doesn’t make sense to build out that kind of network in a sparsely populated suburban area. Those areas will likely fall back to 5G network coverage, or even 4G in certain cases.

Your idea of providing network coverage with drones is actually a good one. If we project 10 years into the future, battery technology will have advanced significantly. Assuming that there is enough energy density to power both the drone and the base station it carries, this might be possible for a set period of time.

For example, if a wireless network needed to be deployed over a remote disaster area without coverage, or there was a large pop-up event in an area that didn’t have 6G, a fleet of drones could be deployed to provide coverage for a few hours.

Of course, we’ll have fully developed autonomous technology by then, so network coverage will be remarkably easy to deploy.

The biggest development that we’ll see in 6G, however, is that artificial intelligence will be woven into the fabric of the 6G wireless standards. This will enable these networks to be “intelligent” and capable of managing network traffic dynamically as necessary.

I’m very excited to see how 6G develops over the next few years. And of course, I’ll be sure to keep all my readers here updated on anything I learn…

How to choose the right number of units…

Next, a reader wants to know more about special purpose acquisition corporation (SPAC) investing:

Hi Jeff,

For certain reasons, when I first subscribed to Blank Check Speculator, I didn’t have the time to fully read the manifesto, nor was I able to learn what was necessary in order to invest wisely.

I didn’t understand the breakdown in units and warrants, so I didn’t follow your advice to invest according to that breakdown, meaning I didn’t buy units with the number of warrants in mind.

Since I’m already unevenly invested (i.e., 1 unit = 1/3 warrant) and didn’t buy units in multiples of three, I’ve positioned myself to actually lose warrants, which leads to my question: Can I level out these buys by selling enough units to equalize the warrants?

I hope I’ve made myself clear. I’m new to investing and am learning my way through you. Thank you.

 – Patricia M.

Hello, Patricia, and thanks for writing in. I’m glad you’ve joined us in Blank Check Speculator and are taking an active interest in learning to invest. While I can’t give personalized advice, I can provide some general guidance for the issue you brought up.

But first, let me provide a little background for unfamiliar readers.

In Blank Check Speculator, we are focused on special purpose acquisition corporations (SPACs). This is one of the most exciting opportunities for investors right now.

SPACs – sometimes called “blank check companies” – exist to bring private, early-stage companies public via a “reverse merger.” By investing in SPACs before a reverse merger is announced, we can effectively gain access to pre-IPO shares of promising companies.

In Blank Check Speculator, we invest in SPAC units. Then, typically within 52 days of the SPAC going public, each unit splits into a common share and a fraction of a warrant. By purchasing units, we benefit by receiving what will ultimately become two separate securities.

And that warrant coverage can be a factor in our decision about how many units to buy.

For example, if a unit comes with one-third warrant coverage, that means for every three units we purchase, we will receive one warrant. Only full warrants are issued, however. An investor who buys 10 units would still only receive three warrants.

Patricia, this sounds like the issue you described. Thankfully, there is a simple solution. If you would like to maximize your warrants, you can simply buy or sell units to reach a multiple of three.

If you bought 100 units, for example, you could either sell one of your units or purchase an additional two units. Either way, that would make your total a multiple of three and would result in maximized warrant coverage.
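For anyone who wants to sanity-check their own position, here is a minimal sketch of that arithmetic. The one-third ratio is just the example above – always check your specific SPAC’s terms for its actual warrant coverage:

```python
def warrants_received(units: int, units_per_warrant: int = 3) -> int:
    """Warrants issued for a unit position. Fractional warrants are
    dropped, so only whole multiples of the ratio count."""
    return units // units_per_warrant

def adjustment_to_maximize(units: int, units_per_warrant: int = 3) -> int:
    """Units to buy (positive) or sell (negative) to land on the nearest
    multiple of the ratio, so no partial warrant coverage is wasted."""
    remainder = units % units_per_warrant
    if remainder == 0:
        return 0
    buy = units_per_warrant - remainder   # round up to the next multiple
    sell = remainder                      # or round down to the previous one
    return buy if buy <= sell else -sell

print(warrants_received(10))        # 10 units -> 3 warrants (the 10th unit earns none)
print(adjustment_to_maximize(100))  # -1: sell one unit (buying two also works)
```

Either direction works, of course – the helper simply picks whichever change is smaller.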

And you brought up another excellent point – if any current paid-up subscribers haven’t yet had the chance to read our Blank Check Manifesto, I highly recommend doing so now. In that report, I provide a lot of useful information for understanding our SPAC investments.

And if any readers are interested in joining us, you can go right here to learn more.

The power of genetic editing…

Let’s conclude with some feedback on genetic editing:

Jeff: Love your work, your research and reporting. Excellent and constantly updating us on what is developing technologically. DTIL’s breakthrough in gene editing is indeed impressive and promising. But such power must be carefully employed.

I know it’s impossible to keep up with all the research going on, so let me pass along some things I’ve learned related to LDL (low-density lipoprotein). LDL is actually poorly correlated with cardiovascular risk. Recent research has shown that what matters is oxidized LDL (ox-LDL). This is what leads to plaque formation. In fact, it looks like LDL must be oxidized to begin forming plaques. So ox-LDL has a high correlation with coronary heart disease.

LDL is produced by the body because it is vital to numerous bodily functions and is very important to brain function. So lowering overall LDL could likely be harmful. What seems to matter is lowering the oxidation of LDL and/or finding processes to remove it. This does not lessen the significance of the advance in gene editing. It simply clarifies the need for caution in application and in explanation of the biological processes involved.

 – David E.

Hi, David, and thank you for sharing your thoughts. I am very excited to see what other advances in genetic editing we’ll see over the coming years, and I completely agree on the need to manage these powerful technologies wisely.

As a refresher for readers, we recently wrote about Precision BioSciences (DTIL) in The Bleeding Edge. The company published research showing that its genetic editing therapy lowered LDL cholesterol in monkeys by as much as 56%.

And the most exciting part was that these effects appear to be permanent. Three years later, the monkeys are still seeing the benefits of the therapy.

As you noted, David, there is some nuance to this health issue. While cholesterol has a bad reputation, it does serve necessary functions for us. Our bodies use the waxy substance to build cells, make vitamins and hormones, and help produce bile.

Our livers produce all the cholesterol we need, though. That’s why diet plays a key role in the “high” cholesterol levels our doctors warn us about. Our doctors often simplify this issue by telling us to lower our “bad” LDL cholesterol.

But research over the last 30-plus years has offered some additional context for why LDL cholesterol causes problems.

When LDLs become oxidized, they are damaged. This triggers inflammation and attracts white blood cells called macrophages. These accumulate, sticking to blood vessel walls… and turning into plaque. That leads to the hardening of arteries.

Future research may help us isolate ox-LDLs for treatments. But right now, the best advice for avoiding those negative effects is to lower LDL altogether.

That’s why Precision BioSciences’ research is such good news. Right now, our doctors will tell us to lower our LDLs with medication, diet changes, quitting smoking, and exercise. But some people still struggle to control their cholesterol due to hereditary factors.

In particular, Precision BioSciences aimed its genetic therapy at helping people with familial hypercholesterolemia (FH) – essentially, high cholesterol due to an inherited mutation of a gene that controls how cholesterol gets cleared from the body.

For people with this condition, lifestyle changes aren’t enough. Even children and young adults with FH can suffer heart attacks and strokes.

One day soon, though, anyone with high cholesterol might be able to simply have a genetic editing procedure done… and no longer have to worry.

This is just one demonstration of the potential that genetic editing holds for the field of medicine. And I can’t wait to see the progress we make in the coming decade.

And to your point, it will take the entire industry to set guidelines for using such powerful technology ethically. The same will be true of artificial intelligence. This debate will be one of the biggest struggles of the next decade, and we’re still in the very early days.

That’s all we have time for this week. If you have a question for a future mailbag, you can send it to me right here.

Have a great weekend.

We have so much to look forward to…

Regards,

Jeff Brown
Editor, The Bleeding Edge


Like what you’re reading? Send your thoughts to [email protected].