Landscape of Quantum Computing in 2024
In 2021 I wrote a brief explainer of where quantum computing was and where it needed to go, specifically to break crypto. I've updated the chart below to reflect recent progress in quantum computing:
Check the original post for the explanation of the chart. Starting last year I coloured the dots of existing quantum computers to reflect how recent they are (dark grey = very recent, white = ancient history, i.e., before 2017), and connected dots from the same organization(s) with arrows.
The chart itself shows little change from last year. If anything, IBM's chips seem to have moved backwards (fewer qubits, but better fidelity). But this just shows that error rate and qubit count do not capture the full state of quantum computing, since Google's new dot (hardly visible in this chart) represents a huge result: Google created the first logical qubit in the surface code!
A few specific achievements:
- It's beyond threshold: when they increased the number of physical qubits used to make the logical qubit, the logical qubit had fewer errors (see the sketch after this list)
- When they used all the physical qubits they could, the logical qubit performed better than any single physical qubit
- They kept this up for about a million rounds of error correction (roughly 1 second), and decoded the error data in real-time
- Each surface code cycle took roughly 1 microsecond, the cycle time assumed in the estimate that a big enough machine could factor RSA-2048 in 8 hours
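To get a feel for what these numbers mean together, here is a minimal back-of-the-envelope sketch in Python. The suppression factor per distance step and the starting error rate are illustrative assumptions of mine, not Google's exact figures; the ~1 microsecond cycle time and the million rounds come from the bullets above.

```python
# Back-of-the-envelope sketch of "beyond threshold" scaling.
# LAMBDA and P_D3 are illustrative assumptions, not Google's reported values.

LAMBDA = 2.0          # assumed suppression factor per code-distance step (d -> d+2)
P_D3 = 3e-3           # assumed logical error rate per cycle at distance 3
CYCLE_SECONDS = 1e-6  # ~1 microsecond per surface-code cycle (from the post)

def logical_error_per_cycle(distance):
    """Assumed logical error rate per cycle at odd code distance d >= 3."""
    steps = (distance - 3) // 2
    return P_D3 / (LAMBDA ** steps)

for d in (3, 5, 7):
    print(f"distance {d}: ~{logical_error_per_cycle(d):.1e} logical errors per cycle")

# A million rounds of error correction at ~1 microsecond each is about a second.
rounds = 1_000_000
print(f"{rounds:,} rounds take ~{rounds * CYCLE_SECONDS:.0f} s")
```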
When I first read these results, I felt chills of "Oh wow, quantum computing is actually real".
How does this fit into the path to breaking cryptography? Despite my enthusiasm, this is more or less where we should expect to be, and maybe a bit late. Everything they demonstrated is a step we needed to take to even hope to reach the 20-million-qubit machine that could break RSA; there were no unexpected breakthroughs. Think of it like the increases in transistor density of classical chips each year: an impressive feat, but ultimately business-as-usual.
In fact, they note that their machine suffers correlated errors roughly once per hour. Correlated errors generally break quantum error correction, and a factoring run would need to stay error-free for the full 8 hours, so even if they made this same machine 200,000x bigger, it still couldn't break RSA.
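A quick estimate makes this concrete. The once-per-hour rate and the 8-hour runtime come from above; modelling the error bursts as a Poisson process is my own simplification.

```python
import math

# If uncorrectable correlated-error bursts arrive roughly once per hour,
# what are the odds an 8-hour factoring run finishes cleanly?
# (Poisson model is my assumption; the rate and runtime are from the post.)

burst_rate_per_hour = 1.0
run_hours = 8.0

expected_bursts = burst_rate_per_hour * run_hours
p_clean_run = math.exp(-expected_bursts)

print(f"expected correlated-error events in {run_hours:.0f} h: {expected_bursts:.0f}")
print(f"probability of an error-free run: {p_clean_run:.1e}")  # about 3 in 10,000
```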
Another limitation might be the classical co-processor doing the decoding. I can't find the specs on this, but if they need a big classical machine to keep up with the decoding, that's going to cause problems if they try to scale up the runtime and size of the quantum computer.
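I have no real numbers for the decoder, but a crude bandwidth estimate hints at the problem. Everything here is an assumption except the ~1 microsecond cycle time and the 20-million-qubit figure above: I guess roughly one syndrome bit per measured qubit per cycle and that about half the physical qubits are measured each cycle.

```python
# Crude estimate of the raw syndrome data a real-time decoder must consume.
# The per-qubit bit rate and the qubit counts are assumptions for illustration.

CYCLE_SECONDS = 1e-6            # ~1 microsecond per surface-code cycle
BITS_PER_QUBIT_PER_CYCLE = 0.5  # assume roughly half the qubits are measured each cycle

def syndrome_bits_per_second(physical_qubits):
    return physical_qubits * BITS_PER_QUBIT_PER_CYCLE / CYCLE_SECONDS

for n in (100, 20_000_000):  # a present-day chip vs. a hypothetical RSA-scale machine
    print(f"{n:>12,} physical qubits -> ~{syndrome_bits_per_second(n):.1e} syndrome bits/s")
```

Under these assumptions, the RSA-scale machine produces on the order of a terabyte of syndrome data per second, which the classical decoder would have to process with essentially no lag.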
What's next? I still think that logical qubits will be basically useless until we have at least 40 (and probably many more). However, it remains to be seen whether bigger, better quantum computers can do something useful without error correction in the next few years.
Chart methodology
(See the original methodology and download the source data as a CSV here.)
- Current chips:
- Google's data came from the table in their blog post.
- IBM's data came from the live calibration data for "Marrakesh", their 156-qubit chip, which seemed to have the lowest errors in the summary data. I excluded qubits with one- or two-qubit error rates of 1.0, making this effectively a 142-qubit chip (see the filtering sketch below).
- Rigetti's new numbers came from here.
- IonQ's numbers come from this page, based on the fidelity of its two-qubit gates.
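For the IBM filtering step, here is a minimal pandas sketch of what I did conceptually. The filename and column names are placeholders of mine, not IBM's actual export format.

```python
import pandas as pd

# Hypothetical sketch of the filtering described above: drop qubits whose
# reported one- or two-qubit error rate is 1.0, then count what remains.
# File and column names are placeholders, not IBM's real export schema.

df = pd.read_csv("ibm_marrakesh_calibration.csv")  # assumed filename

bad = (df["one_qubit_error"] == 1.0) | (df["two_qubit_error"] == 1.0)
usable = df[~bad]

print(f"{len(df)} qubits reported, {len(usable)} usable after filtering")
```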