
Introduction

In a previous article, we implemented Shor’s algorithm in Qiskit to solve the Elliptic Curve Discrete Logarithm Problem (ECDLP), the cryptographic basis of Ethereum’s signatures. In this article, we’ll take it a step further: how much quantum hardware would it actually take to break Ethereum using Shor’s algorithm?

We’ll walk through a practical resource estimate, including:

  • how many logical and physical qubits are needed,
  • how many quantum gates, especially T gates, and circuit depth are required,
  • and how these numbers compare to the current state of quantum hardware.

All of this is based on the Microsoft resource estimate for Shor’s algorithm (Roetteler et al.), which inspired our implementation in Qiskit.

By the way, the full circuit for 256-bit ECDLP is far too large to build on Google Colab (unsurprisingly). But if you’re interested in estimating the number of qubits and gates yourself, I prepared a modified version of the previous implementation that lets you build the circuit and measure its size. All you need to do is make a copy of the Colab notebook and set parameters such as EC_MODULUS appropriately (small enough to run on your machine). Let me know how it goes in the comments!

How Many Logical Qubits?

Ethereum uses the secp256k1 curve, which means we’re working over a 256-bit field. To run Shor’s algorithm on this problem, we need to simulate modular arithmetic over that field.

According to Roetteler et al., the number of logical qubits required scales like this:

\[Q_{\text{logical}} = 9n + 2\log_2(n) + 10,\]

for $n=256$:

\[Q_{\text{logical}} = 9 \cdot 256 + 2\log_2(256) + 10 = 2304 + 16 + 10 = 2330.\]

Estimated logical qubits required: 2330
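As a sanity check, the formula is easy to evaluate in a few lines of Python (a quick sketch; the function name is mine, not from the paper):

```python
import math

def logical_qubits(n: int) -> int:
    """Logical qubits for n-bit ECDLP per Roetteler et al.: 9n + 2*log2(n) + 10."""
    return 9 * n + 2 * math.ceil(math.log2(n)) + 10

print(logical_qubits(256))  # 2330
```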

How Many Physical Qubits?

Now things get serious. Physical qubits are noisy, unstable, and prone to error–so one logical qubit must be encoded using many physical qubits through quantum error correction (QEC).

We covered basic QEC (Shor’s 9-qubit code) in this article, but for a realistic estimate, we’ll use a more advanced approach: the surface code.

Surface Code

The surface code is a type of topological quantum error correction. Instead of encoding logical information in one place, it spreads it out non-locally over a 2D grid of physical qubits. This makes the system more resilient: even if some physical qubits become noisy, the logical qubit stays intact.

We discussed a similar concept in this article, where the physical qubit itself is encoded in the topology of Majorana zero modes. Here, by contrast, it is the logical qubit that is encoded in the topology of an array of physical qubits.

Code distance

The number of physical qubits per logical qubit depends on something called the code distance $d$. $d$ is the minimum number of physical qubit errors needed to cause a logical error. Bigger $d$ means more error protection, but also more physical qubits.

The scaling is roughly quadratic:

\[Q_{\text{physical}} = Q_{\text{logical}} \cdot d^2.\]

Based on Gidney and Ekerå (2019), a code distance of $d=27$ is a solid estimate for reaching acceptable error rates on realistic hardware.

So:

\[Q_{\text{physical}} = 2330 \cdot 27^2 = 2330 \cdot 729 = 1,698,570.\]

Estimated physical qubits required: ~1.7 million
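The arithmetic above is simple enough to verify directly (a trivial sketch of the same calculation):

```python
LOGICAL_QUBITS = 2330  # from the previous section
CODE_DISTANCE = 27     # d, the surface code distance

physical_qubits = LOGICAL_QUBITS * CODE_DISTANCE ** 2
print(physical_qubits)  # 1698570, i.e. ~1.7 million
```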

⚠️ Caveat: If physical hardware improves significantly, for example by achieving lower gate error rates, then a smaller code distance may suffice to reach the same logical error rate, reducing the number of required physical qubits. So our estimate with $d=27$ is a conservative but realistic choice based on near-future hardware, not a fundamental lower bound.
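To see how the required distance depends on hardware quality, we can use the standard surface-code heuristic $p_L \approx 0.1\,(p/p_{\text{th}})^{(d+1)/2}$. The threshold $p_{\text{th}} \approx 10^{-2}$, the prefactor, and the per-step error budget below are rough illustrative values from the surface code literature, not numbers taken from the papers cited above:

```python
def required_distance(p_phys: float, p_target: float,
                      p_th: float = 1e-2, prefactor: float = 0.1) -> int:
    """Smallest odd code distance d whose heuristic logical error rate
    prefactor * (p_phys / p_th) ** ((d + 1) / 2) is at or below p_target."""
    d = 3
    while prefactor * (p_phys / p_th) ** ((d + 1) / 2) > p_target:
        d += 2
    return d

# A physical error rate of 1e-3 with a logical error budget of a few times
# 1e-15 per step reproduces d = 27; a 10x better machine needs far less.
print(required_distance(1e-3, 3e-15))  # 27
print(required_distance(1e-4, 3e-15))  # 13
```

This is why the caveat matters: under this heuristic, one order of magnitude in physical error rate roughly halves the code distance, and the physical qubit count scales with $d^2$.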

How Many Quantum Gates?

Besides qubits, we also care about how long the quantum program runs: how many gates are applied and how deep the circuit is. Gate count and circuit depth determine whether the computation can finish within realistic decoherence times.

Based on the Microsoft estimate:

  • T gates: ~1.26 trillion
  • Circuit depth: ~1.23 billion time steps
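To get a feel for what that depth means in wall-clock time, here is a back-of-the-envelope conversion. The 1 µs physical cycle time and the assumption that one logical time step takes about $d$ error-correction cycles are my own rough inputs, not figures from the Microsoft estimate:

```python
DEPTH = 1.23e9        # logical time steps, from the Microsoft estimate
CODE_DISTANCE = 27    # one logical step takes ~d error-correction cycles
CYCLE_TIME_S = 1e-6   # assumed physical cycle time: 1 microsecond

runtime_s = DEPTH * CODE_DISTANCE * CYCLE_TIME_S
print(f"~{runtime_s / 3600:.1f} hours")  # ~9.2 hours
```

Under these assumptions the run itself finishes in under half a day, so circuit depth is demanding but not the showstopper; the qubit count and T-gate budget are.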

That’s a lot. But why do we focus on T gates in particular?

Why T Gates Are the Bottleneck

Quantum gates fall into two main categories:

  • Clifford gates (like $H$, CNOT): easy to implement and error-correct
  • Non-Clifford gates, especially the T gate: harder to implement and much more expensive

Here’s the catch: fault-tolerant quantum computers can’t apply T gates directly. Instead, they rely on special quantum states called magic states, which are injected into the circuit to “simulate” the T gate.

But magic states are noisy when first prepared, so they must go through a process called magic state distillation–essentially a quantum clean-up process that consumes lots of space and time. In fact, magic state distillation is often the most expensive part of the whole computation.

That’s why we track T gate count as the main cost metric. If you can reduce the number of T gates, you save a ton of hardware and runtime.
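A quick ratio makes the pressure on distillation concrete: dividing the T-gate count by the circuit depth gives the average number of magic states that must be delivered per logical time step, and hence roughly how many distillation factories would need to run in parallel (back-of-the-envelope arithmetic, not a figure from the paper):

```python
T_GATES = 1.26e12  # total T gates from the Microsoft estimate
DEPTH = 1.23e9     # logical time steps

t_per_step = T_GATES / DEPTH
print(f"~{t_per_step:.0f} T gates per time step on average")  # ~1024
```

Supplying on the order of a thousand magic states every time step is a large fraction of why the physical footprint is so big.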

Conclusion

Breaking Ethereum’s cryptography with a quantum computer isn’t just hard–it’s absurdly hard with today’s technology. We’re talking about thousands of logical qubits, millions of physical qubits, and trillions of gates. But the numbers also offer a clear benchmark: if quantum hardware does get close to these levels, then things start to get serious.

In the next articles, we’ll compare these estimates with the current state of quantum hardware and ask: how will we know when the danger is real? We’ll look at the key breakthroughs that would act as “canaries” for Ethereum’s cryptography. If one of them happens, it’s time to sound the alarm.

The images in this article were generated by ChatGPT.
