Quantum Computing

From qubits to Q-Day — a grounded overview of quantum computing that separates engineering reality from press-release hype.

What Is Quantum Computing?

Quantum computing is a fundamentally different approach to computation that exploits quantum mechanical phenomena — superposition, entanglement, and interference — to process information in ways that classical computers cannot efficiently replicate. Where a classical computer manipulates bits that are definitively 0 or 1, a quantum computer manipulates qubits that can exist in superposition of both states simultaneously.

This is not simply “faster computing.” Quantum computers do not speed up all computations. They offer dramatic advantages for specific problem classes: exponential speedups for simulating quantum systems and factoring large integers, and quadratic speedups for searching unstructured databases and certain combinatorial optimization problems. For most everyday computing tasks — word processing, web browsing, video rendering — classical computers will remain superior and more practical.

The significance of quantum computing for cryptography, materials science, drug discovery, and financial modeling stems from these specific advantages, not from a general speed increase. The challenge has always been building machines reliable enough to realize theoretical quantum advantages on practical problems. This remains an unsolved engineering problem as of 2025, though significant progress is being made.

Qubits vs Classical Bits

A classical bit is a switch: on (1) or off (0). Two bits can represent one of four states (00, 01, 10, 11) but only one at a time. A qubit, by contrast, exists in a superposition of |0⟩ and |1⟩ states, described by complex probability amplitudes. Two qubits in superposition simultaneously carry amplitudes for all four possible states; in general, describing n qubits requires up to 2ⁿ complex amplitudes, which is why classically simulating large quantum systems becomes intractable.

Superposition enables quantum parallelism: a quantum algorithm can process multiple input combinations in a single operation. But reading the result collapses the superposition into a single classical outcome. The art of quantum algorithm design is arranging interference patterns so that correct answers are amplified and wrong answers cancel out.
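The linear algebra behind these ideas is compact enough to sketch directly. The following toy state-vector simulation in Python with NumPy (illustrative only; real quantum hardware is not programmed this way) puts a qubit into superposition and then collapses it by measurement:

    import numpy as np

    # A qubit is a 2-component complex vector: amplitudes for |0> and |1>.
    ket0 = np.array([1, 0], dtype=complex)

    # The Hadamard gate maps |0> to the equal superposition (|0> + |1>) / sqrt(2).
    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
    psi = H @ ket0

    # Born rule: outcome probabilities are the squared amplitude magnitudes.
    probs = np.abs(psi) ** 2                     # [0.5, 0.5]

    # Measurement collapses the superposition to one classical bit.
    outcome = np.random.default_rng().choice([0, 1], p=probs)
    print(psi, probs, outcome)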

Entanglement creates correlations between qubits that have no classical analogue. When two qubits are entangled, measuring one instantly determines the state of the other, regardless of physical separation. This property is essential for quantum error correction and many quantum algorithms. It is not “communication” — no information is transmitted faster than light — but it enables coordination between qubits that classical bits cannot achieve.
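The same toy simulation can illustrate entanglement. The sketch below prepares a Bell state: each qubit on its own looks like a fair coin, yet the two measurement outcomes always agree.

    import numpy as np

    # Two-qubit state: 4 amplitudes, ordered |00>, |01>, |10>, |11>.
    ket00 = np.array([1, 0, 0, 0], dtype=complex)

    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
    I = np.eye(2, dtype=complex)
    CNOT = np.array([[1, 0, 0, 0],               # flips the second qubit
                     [0, 1, 0, 0],               # when the first qubit is 1
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=complex)

    # Hadamard on the first qubit, then CNOT: the Bell state (|00> + |11>) / sqrt(2).
    bell = CNOT @ np.kron(H, I) @ ket00

    # Only |00> and |11> have nonzero probability: perfectly correlated outcomes.
    print(np.abs(bell) ** 2)                     # [0.5, 0.0, 0.0, 0.5]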

Interference is the mechanism that makes quantum algorithms work. Just as wave peaks can reinforce or cancel each other, quantum probability amplitudes interfere constructively (boosting correct answers) or destructively (suppressing wrong ones). Shor’s algorithm and Grover’s algorithm both exploit carefully designed interference patterns to achieve their speedups.
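Interference can be shown with the smallest possible circuit: two Hadamard gates in a row. The two computational paths into |1⟩ carry opposite amplitudes and cancel, so the qubit returns to |0⟩ with certainty. This is a toy version of the cancellation that Shor’s and Grover’s algorithms orchestrate across many qubits.

    import numpy as np

    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
    ket0 = np.array([1, 0], dtype=complex)

    once = H @ ket0          # equal superposition: amplitudes (1/sqrt2, 1/sqrt2)

    # Second Hadamard: the paths into |1> contribute +1/2 and -1/2 and cancel
    # (destructive); the paths into |0> both contribute +1/2 and reinforce.
    twice = H @ once
    print(np.round(twice.real, 10))              # [1. 0.]: back to |0> with certainty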

Key Milestones

The development of quantum computing has progressed through several landmark achievements:

  • 1981: Richard Feynman proposes using quantum systems to simulate quantum physics, planting the seed for quantum computing.
  • 1994: Peter Shor publishes his algorithm for factoring integers and computing discrete logarithms in polynomial time on a quantum computer — the theoretical basis for the quantum threat to cryptography.
  • 1996: Lov Grover publishes his search algorithm, providing quadratic speedup for unstructured database search.
  • 1998: One of the first experimental quantum computations — a 2-qubit NMR system implementing Grover’s algorithm at Oxford.
  • 2019: Google claims “quantum supremacy” with its 53-qubit Sycamore processor, performing a specific sampling task in 200 seconds that it claimed would take a classical supercomputer 10,000 years. The claim was contested by IBM and later classical simulations narrowed the gap.
  • 2023: IBM unveils its 1,121-qubit Condor processor. However, qubit count alone is misleading without considering error rates and connectivity.
  • 2024: Google’s Willow chip demonstrates that increasing qubit count can reduce — rather than increase — error rates, a critical milestone for fault-tolerant quantum computing. The chip achieved below-threshold error correction for the first time in a superconducting system.
  • 2025: Microsoft announces Majorana 1, its topological qubit chip, claiming a new path to hardware-level error resistance. The approach remains early-stage and independently unverified.

Each milestone represents genuine progress, but the distance between current capabilities and cryptographically relevant quantum computing remains substantial. The pattern to watch is not raw qubit counts but logical error rates and the ratio of physical to logical qubits.

Error Correction

Error correction is the central engineering challenge of quantum computing. Qubits are extraordinarily fragile: thermal noise, electromagnetic interference, cosmic rays, and even vibrations from nearby traffic can cause errors. Current physical qubits have error rates on the order of 10⁻³ to 10⁻⁴ per gate operation — roughly one error per thousand to ten thousand operations. Useful quantum algorithms like Shor’s require trillions of gate operations, making raw qubits far too unreliable to run them directly.
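A back-of-envelope calculation shows why. Treating gate errors as independent (a simplification; real noise is structured and correlated), the probability that an N-gate circuit finishes without a single error is roughly (1 − p)^N:

    # Success probability of an error-free run at per-gate error rate p.
    for p in (1e-3, 1e-4):
        for n_gates in (1_000, 1_000_000, 1_000_000_000):
            print(f"p={p:g}, gates={n_gates:>13,}: "
                  f"success ~ {(1 - p) ** n_gates:.3g}")

At p = 10⁻³, even a thousand-gate circuit succeeds only about a third of the time, and a million-gate circuit essentially never does.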

Quantum error correction (QEC) solves this by encoding a single logical qubit across multiple physical qubits. The most studied approach is the surface code, which arranges physical qubits in a two-dimensional grid and performs continuous syndrome measurements to detect and correct errors without disturbing the encoded quantum information.

The overhead is enormous. At current error rates, encoding one logical qubit may require 1,000–10,000 physical qubits using surface codes. Running Shor’s algorithm to break RSA-2048 would require approximately 20 million physical qubits. Breaking secp256k1 (Bitcoin’s elliptic curve) has been estimated at 4–13 million physical qubits, depending on the algorithmic approach and assumed error rates.
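Here is a sketch of where such overhead figures come from, using the textbook surface-code scaling law p_logical ≈ A · (p/p_th)^((d+1)/2). The constants (A ≈ 0.1, threshold p_th ≈ 1%) and the target logical error rate are illustrative assumptions; serious resource estimates also model cycle times, routing, and magic-state distillation.

    # Rough surface-code overhead estimate. All constants are illustrative.
    A, P_TH = 0.1, 1e-2      # fitting constant and assumed threshold (~1%)

    def distance_needed(p_phys, p_logical_target):
        """Smallest odd code distance d that reaches the target logical rate."""
        d = 3
        while A * (p_phys / P_TH) ** ((d + 1) / 2) > p_logical_target:
            d += 2
        return d

    def physical_per_logical(d):
        """A distance-d surface code patch: d*d data + (d*d - 1) ancilla qubits."""
        return 2 * d * d - 1

    for p_phys in (1e-3, 1e-4):
        d = distance_needed(p_phys, 1e-12)   # target fit for ~trillion-op runs
        print(f"p={p_phys:g}: distance {d}, "
              f"~{physical_per_logical(d):,} physical qubits per logical qubit")

Note how sharply the overhead falls as physical error rates improve; better qubits are the main lever for reducing the counts quoted above.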

Google’s 2024 Willow result was significant because it demonstrated, for the first time in a superconducting system, that adding more qubits to an error-correcting code actually reduced the logical error rate — pushing the system below the error-correction threshold. This does not mean error correction is solved, but it validates that the theoretical approach works in practice. The remaining challenge is scaling from a few logical qubits to the thousands required for useful computation.

Leading Platforms

Several hardware approaches are competing to build practical quantum computers, each with distinct trade-offs:

Superconducting Qubits — IBM, Google

The dominant paradigm, used by IBM and Google. Superconducting circuits operate at temperatures near absolute zero (15 millikelvin) in dilution refrigerators. Advantages include fast gate speeds (nanoseconds) and mature fabrication processes adapted from semiconductor manufacturing. Disadvantages include high error rates compared to trapped ions, limited qubit connectivity, and the engineering complexity of extreme cryogenics. IBM’s roadmap targets 100,000+ qubits by 2033 through modular architectures connecting multiple processors.

Trapped Ions — IonQ, Quantinuum

Trapped ion systems use individual atoms (typically ytterbium or barium) held in electromagnetic traps and manipulated with laser beams. They offer the highest gate fidelities of any platform (error rates below 10⁻⁴) and all-to-all qubit connectivity, meaning any qubit can interact with any other without routing overhead. The trade-off is slower gate speeds (microseconds vs nanoseconds) and scaling challenges — moving from tens to hundreds of ions in a single trap is physically difficult. Quantinuum’s H2 processor (56 qubits as of 2024) leads in quantum volume benchmarks.

Photonic Systems — PsiQuantum, Xanadu

Photonic quantum computers use photons (particles of light) as qubits, guided through optical circuits. The appeal is room-temperature operation and potential compatibility with existing fiber-optic infrastructure. PsiQuantum’s approach bets on manufacturing scale: using GlobalFoundries’ semiconductor fabs to produce millions of photonic components. The challenge is that photon loss in optical circuits is difficult to manage, and deterministic two-qubit gates between photons are harder to achieve than in matter-based systems.

Topological Qubits — Microsoft

Microsoft’s approach uses exotic quasiparticles (Majorana fermions) in topological superconductors. The theoretical advantage is that topological qubits are inherently resistant to local noise, potentially requiring far fewer physical qubits per logical qubit. The disadvantage is that the approach is the least mature — Microsoft announced its first topological qubit chip (Majorana 1) in 2025, decades behind superconducting and trapped ion systems. If topological qubits deliver on their theoretical promise, they could dramatically accelerate the path to fault-tolerant quantum computing. This remains a significant “if.”

When Will Quantum Break Encryption?

The moment a quantum computer can break widely used public-key cryptography is sometimes called “Q-Day.” The honest answer: nobody knows when it will happen, but serious institutions are planning as if it is inevitable.

Current quantum computers have approximately 1,000–1,500 physical qubits with error rates around 10⁻³. Breaking RSA-2048 requires approximately 4,000 logical qubits, each potentially requiring thousands of physical qubits for error correction. Breaking Bitcoin’s secp256k1 requires roughly 2,300 logical qubits. At optimistic physical-to-logical ratios of 1,000:1, this means machines of 2–20 million physical qubits — a 1,000–10,000x scale-up from today.
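Spelled out as arithmetic (the logical-qubit counts are the estimates above; the physical-to-logical ratios are illustrative assumptions):

    # Scale-up needed to reach cryptographically relevant machines.
    targets = {"RSA-2048": 4_000, "secp256k1": 2_300}   # approx. logical qubits
    current_physical = 1_500                            # today's largest machines

    for name, logical in targets.items():
        for ratio in (1_000, 5_000):                    # physical per logical
            physical = logical * ratio
            print(f"{name}: {logical:,} logical x {ratio:,}:1 = "
                  f"{physical / 1e6:.1f}M physical "
                  f"(~{physical // current_physical:,}x today)")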

However, three factors could accelerate the timeline beyond linear extrapolation:

  • Algorithmic improvements: New algorithms or optimizations could reduce the qubit requirements for breaking encryption. Research continues to refine the resource estimates downward.
  • Error rate improvements: Better qubits reduce the physical-to-logical overhead. If error rates improve from 10⁻³ to 10⁻⁶, the physical qubit requirement drops dramatically.
  • Topological or novel qubit breakthroughs: A fundamentally more stable qubit type could change the entire scaling equation overnight.

The consensus range remains broad: most experts place Q-Day somewhere between 2035 and 2050. But the “harvest now, decrypt later” threat means that for data with long confidentiality requirements, the effective Q-Day is today — adversaries are already collecting encrypted data for future decryption. This is why governments and standards bodies are pushing for post-quantum cryptography migration now, not when Q-Day arrives.

Current State of the Field

As of early 2025, quantum computing is in what researchers call the “noisy intermediate-scale quantum” (NISQ) era — machines with enough qubits to be interesting but too noisy for fault-tolerant computation. The field is at an inflection point between NISQ experimentation and the beginning of early fault-tolerant quantum computing.

Key indicators of progress:

  • Error correction crossing threshold: Google’s Willow demonstrated below-threshold error correction, a necessary (but not sufficient) milestone for fault-tolerant computing.
  • Qubit counts scaling: IBM, Google, and Atom Computing have demonstrated processors with 1,000+ physical qubits. The focus is shifting from qubit count to qubit quality and logical qubit count.
  • Real-world applications emerging: Quantum computers are beginning to show practical advantages in specific chemistry simulations and optimization problems, though not yet for cryptography.
  • Investment accelerating: Government and private investment in quantum computing exceeds $40 billion globally, with national quantum strategies in the US, EU, China, UK, Japan, and Australia.

The honest assessment: quantum computing is real and progressing, but the hype cycle frequently outpaces the engineering reality. The machines exist, the algorithms work at small scale, and the error correction theory is being validated. What remains is the vast engineering challenge of scaling these systems by orders of magnitude — a challenge that is being pursued by well-funded teams across multiple hardware platforms. Whether this takes ten years or thirty is genuinely uncertain, but the direction is clear.

Further Reading

  • Scott Aaronson, “Quantum Computing Since Democritus” — Accessible introduction to quantum computation theory
  • IBM Quantum Roadmap — Hardware scaling plans through 2033
  • Google Quantum AI blog — Technical updates on Willow and error correction progress
  • “Quantum Error Correction For Dummies” by James Wootton — Intuitive introduction to QEC concepts
  • National Quantum Initiative (quantum.gov) — U.S. government quantum computing strategy and progress reports
