The Bleeding Edge

Scaling Quantum Computing

2025 was a significant year for quantum technology… and 2026 is poised to carry that momentum, sending the industry into hyperdrive.

Published on
Jan 19, 2026

Managing Editor’s Note: Last October, Jeff put together a series of issues dedicated to the technological trend advancing us towards the next great wave of computational power – quantum computing.

Longtime readers know Jeff’s been tracking the development of quantum computing for years now. But Google’s demonstration of quantum supremacy in late 2024 – more on that today from Jeff – set up 2025 to be a significant year for quantum technology…

Which has set 2026 up to carry that momentum, sending the industry into hyperdrive.

The implications of this technology are profound, and the pace of progress is dizzying. Before 2025, hardly anyone was talking about quantum. Now, it’s all eyes on what’s next for the technology.

As such, it can be tricky to know where to focus our attention when it seems like every other week, we’re getting new major developments… more potential industry leaders are emerging… and the ones we’ve been watching for years are quickly rising to prominence.

Which is why Jeff has put together a presentation focused on the future of quantum computing… the revolutionary potential of quantum technology… and a few companies he has his eye on to profit from the trend. You can go here to learn more.

Then read on for a special featured Quantum Week issue of The Bleeding Edge… 


It all happened suddenly in December 2024…

The state of quantum computing changed overnight.

The inflection point was an announcement…

Google (GOOGL) announced its latest superconducting quantum chip, which it manufactures in-house.

Willow.

Google’s Willow | Source: Google

Most articles focused on the qubits – the fundamental building blocks of quantum computers that can be represented by circuits, atoms, photons, or other forms of exotic matter.

They focused on the jump from Google’s previous superconducting quantum chip of 53 quantum bits (qubits) to Willow’s impressive 105 qubits, as well as the associated leap in computer processing power.

The performance gains behind the headlines were incredible: Willow can perform a computation in under five minutes that would take today’s fastest classical supercomputer 10 septillion years to complete.

Ten septillion is 10²⁵, or a 1 followed by 25 zeros…

10,000,000,000,000,000,000,000,000 years

Imagine that. That’s how long it would take El Capitan – currently the world’s fastest supercomputer, managed by the Department of Energy at the Lawrence Livermore National Laboratory in California – to complete the same calculation.

While the computation is technically possible on classical hardware, the time required is completely unreasonable – far longer than the age of our universe. That gap is what made Willow Google’s demonstration of quantum supremacy.
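To put that number in perspective, a one-line back-of-the-envelope calculation – using the commonly cited ~13.8-billion-year age of the universe as an assumption – shows the scale:

```python
# Compare the quoted classical runtime estimate to the age of the universe.
classical_runtime_years = 10 ** 25        # Google's estimate for the benchmark
age_of_universe_years = 1.38 * 10 ** 10   # ~13.8 billion years

ratio = classical_runtime_years / age_of_universe_years
print(f"{ratio:.1e} universe lifetimes")  # roughly 7.2e14
```

That’s hundreds of trillions of universe lifetimes for a task Willow finishes in under five minutes.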

I’ve long said that the naysayers trying to convince us that Moore’s Law is coming to an end are dead wrong. These ridiculous proclamations started more than a decade ago, and they continue.

Their framework has been completely wrong. Not only has the semiconductor industry continued to shrink the transistor size and improve power efficiency per unit of compute and performance every year, but the industry has also been hard at work on what comes next…

That being quantum computing.

The Warp Drive for Moore’s Law

Quantum computing radically shifts the framework of computational power.

The naysayers proclaimed that the laws of physics would win out and that there was a limit to how fast we could go.

It reminds me of the age-old scientific limit as defined by Einstein’s theory of special relativity. An object with mass, like a spacecraft, would require an infinite amount of energy to reach the speed of light, which makes achieving the speed of light impossible.

In theory, the solution to this limitation is to reframe the problem. Rather than trying to find a way to defy the laws of physics, simply bend them.

A warp drive is designed to bend space and time by compressing the space in front of the spacecraft and expanding the space behind the spacecraft. This will allow a spacecraft to travel at speeds faster than the speed of light, without breaking the laws of physics.

Quantum computing is the warp drive for Moore’s Law. Not only is Moore’s Law not slowing down, but it is taking a quantum leap ahead in computational power.

And unlike a warp drive, it is not theoretical. It’s real and working today.

Quantum computing, however, has one major weakness…

Errors.

Scalable Error Reduction

Quantum computers and their physical qubits are extremely susceptible to errors, which can be caused by the slightest noise in their operating environment.

Changes in temperature, vibrations, electromagnetic interference, or tiny imperfections in the control system – all considered noise – can lead to quantum errors.

The solution to this challenge in quantum computing is quantum error correction (QEC). And a breakthrough in quantum error correction was the real story behind Google’s December 2024 announcement about Willow. It wasn’t just about scaled-up qubits and processing power.

A major part of Google’s superconducting quantum computer is the software that delivered a remarkable breakthrough in quantum error correction: an approach called surface code quantum computing.

The simplest way to explain surface code quantum computing is that the approach groups physical qubits together in a lattice, representing both the physical qubits and logical qubits. Hang in there with me on this one…

  • Physical qubit – In Google’s Willow system, the physical qubit is made from the circuits in the quantum semiconductor. These materials behave like an artificial atom that represents the qubit. And that qubit is capable of superposition, the quantum phenomenon that allows a qubit to be in multiple states at once. This is what gives quantum computers their superpower.
  • Logical qubits – These are enabled by error correction software. They are abstract. We can think of them like error-corrected representations of physical qubits. It takes several physical qubits to represent a single logical qubit. So as a quantum computer scales its physical qubits, it can also increase its logical qubits to ultimately achieve fault-tolerant quantum computing.
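As a rough illustration of superposition – the textbook math, not Google’s hardware – a single qubit can be written as two complex amplitudes whose squared magnitudes give the measurement probabilities:

```python
import math

# A single qubit state |psi> = a|0> + b|1>, held as two amplitudes.
# Superposition means both amplitudes can be nonzero at the same time;
# measuring the qubit yields 0 with probability |a|^2 and 1 with |b|^2.
a = 1 / math.sqrt(2)  # amplitude on |0>
b = 1 / math.sqrt(2)  # amplitude on |1>

p0, p1 = abs(a) ** 2, abs(b) ** 2
print(round(p0, 3), round(p1, 3))  # 0.5 0.5 -- an equal superposition
```

A physical qubit holds exactly this kind of fragile state – which is why the noise sources described above can corrupt it so easily, and why error correction matters.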

Google’s Surface Code Quantum Computing | Source: Google

I know this is a technical subject, so the key point to understand is that Google’s error correction methodology demonstrated that by building larger quantum computing lattices – as shown above – it was able to correct more quantum errors.

This was the breakthrough and the reason that the entire quantum computing industry was reignited. If quantum error correction can be used successfully, there is a clear path towards universal fault-tolerant quantum computing.

Shown another way below, as Google’s Willow increases the number of physical qubits, it dramatically reduces the error probability of its logical qubits. The largest lattice, shown below in blue (7×7), demonstrates the lowest error probability.

Google’s Logical Qubit Performance | Source: Google

The reason that Google’s announcement was so exciting for the industry is that it clearly demonstrated a quantum error correction methodology that can scale, and that with scale comes reduced errors.
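That scaling behavior can be sketched with a toy model. The suppression factor and starting error rate below are assumptions for illustration, not Google’s published figures – the point is that each step up in lattice size divides the logical error rate by a constant factor:

```python
# Toy model of surface-code error suppression: each increase in code
# distance (a bigger lattice: 3x3 -> 5x5 -> 7x7) divides the logical
# error rate by a constant suppression factor. Values are illustrative.
SUPPRESSION = 2.0    # assumed error-suppression factor per distance step
BASE_ERROR = 3e-3    # assumed logical error rate at distance 3

for distance in (3, 5, 7):
    steps = (distance - 3) // 2
    logical_error = BASE_ERROR / SUPPRESSION ** steps
    print(f"distance {distance}: logical error ~ {logical_error:.1e}")
```

The key property is that errors fall exponentially as the lattice grows – more physical qubits buy you a better logical qubit, not a worse one.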

It is a clear path forward.

Quantum Computing Hits Hyperdrive

So, how meaningful is this?

According to Google – which consistently undersells how much progress it is making with its quantum computers – it is only at “Milestone 2,” shown below.

This represents about 100 qubits (i.e., Willow) with a logical qubit error rate of 10⁻², or 0.01.

To put things into perspective, 0.01 may sound really small as an error rate…

But if our classical computers, like the ones that we use every day, had a 0.01 error rate, our computers would be a disaster. They wouldn’t function. Today’s classical computing systems have error rates around 10⁻¹⁸, so infinitesimal that they don’t have any measurable effect on the systems that we use.
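A quick back-of-the-envelope calculation shows why. The 1,000-operation program length below is an assumed figure for illustration:

```python
# With a per-operation error rate p, the probability that an n-step
# computation completes without a single error is (1 - p) ** n.
def success_probability(p: float, n_ops: int) -> float:
    return (1 - p) ** n_ops

# At a 0.01 error rate, even a modest 1,000-operation program almost
# never finishes cleanly; at classical error rates it essentially always does.
print(success_probability(1e-2, 1_000))   # roughly 4e-5
print(success_probability(1e-18, 1_000))  # effectively 1.0
```

At a 0.01 error rate, a 1,000-step computation succeeds only a few times in every 100,000 runs – hence the need for error correction before quantum computers can run long, useful programs.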

Google’s Quantum Computing Roadmap | Source: Google Quantum AI

Google’s next stated milestone is to achieve a long-lived logical qubit with a system of 1,000 physical qubits and a logical qubit error rate of 10⁻⁶, or 0.000001.

We can be sure that Google is hard at work on that milestone and that they are a lot further along than we might believe.

We’ve reached a point of acceleration in quantum computing. And just like artificial intelligence, it is as much a competitive race driven by monetary incentives as it is a matter of national security.

That’s because of the implications of fault-tolerant quantum computing – namely, that a fault-tolerant quantum computer could break any encryption algorithm in use today.

And despite the incredible progress announced by Google last December, the announcement itself was largely symbolic. It received so much attention because it came from Google.

The reality is that several major quantum computing companies have been making just as much progress with their quantum computing technology and error correction methodologies.

These are the same companies that I’ve been researching and writing about for years, like Rigetti Computing (RGTI), IonQ (IONQ), D-Wave (QBTS), Microsoft (MSFT), IBM (IBM), Quantinuum (which was originally Honeywell’s quantum computing division), and a range of other smaller, private quantum computing companies.

The industry has just punched its warp drive, and we’re in for one heck of a wild ride.

Jeff

Jeff Brown
Founder and CEO
