Quantum computing leverages quantum mechanics, the most precisely tested physical theory in science. Since physicist Richard Feynman first proposed the concept in 1981, quantum computers have moved from theory to reality, with multiple functioning prototypes publicly accessible today.

Their realization, even at the smallest scales, is a triumph of both theory and engineering. However, these first-generation machines are rudimentary compared to even consumer-grade classical hardware; for quantum computers to become practically relevant, they must scale. Although we call them "computers," quantum computers differ fundamentally from the classical machine you're using to read this. Understanding the core tenets of quantum mechanics, and in particular how they differ from our classical notions of computation, is critical to understanding both the potential power of a quantum computer and the challenges of building one.

This post is the first in a five-part series that will build a foundational understanding of quantum computing and a methodology for estimating when a cryptographically relevant quantum computer might arrive. That foundation will give us a framework to realistically assess the timeline for Q-Day, and to understand how long we have to prepare.

Fundamental Differences Between Classical and Quantum Computing

While classical computers operate on relatively straightforward logical concepts, quantum computers rely on principles of quantum mechanics that challenge our everyday intuitions about information. Concepts like superposition, entanglement, interference, and the no-cloning theorem give quantum computers radically different properties from classical computers, and thus different capabilities and limitations.

Here are some key facets of quantum mechanics that inherently define a quantum computer:

  • Superposition - In quantum mechanics, particles don’t occupy definite states like classical bits. Instead, they exist in a superposition, or a linear combination of possible states, described by a wavefunction. This wavefunction encodes all possible states of the system it's describing.
[Figure: Quantum Superposition]

Concretely, whereas a classical bit definitively represents either 0 or 1, a qubit can be in a superposition of both simultaneously. The outcome of a measurement follows a probability distribution derived from the wavefunction. In other words, superposition lets a qubit encode a much richer space of states than a classical bit: fully describing a general state of n qubits requires 2^n complex amplitudes, which is what gives quantum computing its exponential potential.

This point is crucial for understanding one of the major challenges in building a quantum computer. In classical computing, measurement is passive, in that reading memory doesn’t change it. But in quantum mechanics, the act of measuring a system collapses a superposition into a definite state. To gain meaningful advantage from a quantum computer, that superposition must be carefully preserved until the right moment.
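The two ideas above, amplitudes and collapse on measurement, can be sketched in a few lines of ordinary Python. This is a hypothetical, illustrative model (not how real quantum hardware is programmed): a qubit is represented by two complex amplitudes, and measurement probabilities come from the Born rule.

```python
import math
import random

# Illustrative model of a single qubit: two amplitudes (alpha, beta)
# with |alpha|^2 + |beta|^2 = 1. Equal superposition of |0> and |1>:
alpha = 1 / math.sqrt(2)
beta = 1 / math.sqrt(2)

# Born rule: the probability of each outcome is the squared magnitude
# of its amplitude.
p0 = abs(alpha) ** 2   # probability of reading 0 (here 0.5)
p1 = abs(beta) ** 2    # probability of reading 1 (here 0.5)

def measure():
    """Simulate a measurement: the superposition collapses to a definite bit."""
    return 0 if random.random() < p0 else 1
```

Before `measure()` is called, the qubit genuinely holds both amplitudes; afterward, only a single classical bit remains, which is why a quantum algorithm must delay measurement until the superposition has done its work.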

  • Entanglement - In quantum mechanics, particles can be entangled, meaning their states become linked in such a way that they must be described as a single system. Even when separated by large distances, the measurement outcome of one particle is correlated with (or even determined by) the state of the other.
[Figure: Quantum Entanglement]

In other words, entanglement is a special kind of superposition that spans multiple particles. It’s one of the key features that enables quantum computers to scale exponentially, but also one of the most fragile to maintain over time and/or distance.
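Extending the single-qubit sketch above, entanglement can be illustrated with the simplest entangled state, a Bell state: a two-qubit superposition whose measurement outcomes are perfectly correlated. Again, this is an illustrative toy model, not real quantum hardware.

```python
import math
import random

# A two-qubit state as four amplitudes over the basis outcomes 00, 01, 10, 11.
# The Bell state (|00> + |11>)/sqrt(2) is entangled: it cannot be factored
# into (state of qubit A) x (state of qubit B).
amp = {"00": 1 / math.sqrt(2), "01": 0.0, "10": 0.0, "11": 1 / math.sqrt(2)}

def measure_pair():
    """Sample a joint outcome according to the Born rule."""
    r, total = random.random(), 0.0
    for outcome, a in amp.items():
        total += abs(a) ** 2
        if r < total:
            return outcome
    return "11"

# Every sample is "00" or "11": measuring one qubit fixes the other,
# no matter how far apart the two qubits are.
samples = {measure_pair() for _ in range(1000)}
```

Each qubit on its own looks like a fair coin, yet the pair never disagrees; that correlation, with no classical "hidden value" behind it, is what makes entanglement a genuinely quantum resource.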

  • Interference – One of the key differences between quantum and classical probabilities lies in the concept of interference. In classical systems, probabilities simply add (for example, flipping two coins gives a 25% chance for each outcome). But in quantum mechanics, amplitudes (the components of the wavefunction) can interfere with each other before measurement. These amplitudes can reinforce (constructive interference) or cancel out (destructive interference), depending on their relative phases.
[Figure: Quantum Interference]

Quantum computers can exploit this phenomenon to "steer" a computation toward correct answers. Instead of merely exploring all paths in parallel, a quantum algorithm is designed so that paths leading to wrong answers interfere destructively and cancel out, while paths leading to correct answers interfere constructively and dominate the final result. Without this ability to amplify correct outcomes and suppress incorrect ones, quantum computing would offer no advantage over classical randomized approaches.

  • No-Cloning Theorem - It is impossible to make an independent copy of an arbitrary unknown quantum state; this is the no-cloning theorem. Intuitively, you cannot read a state out in order to copy it, since measurement collapses superpositions into definite states; more formally, the linearity of quantum mechanics forbids any universal copying operation.

No-cloning makes the implementation of low-level primitives that we take for granted in classical computing (like memory registers) much more complex in practice. Instead, operations like quantum teleportation and entanglement swapping must be used to safely transmit or share quantum information during the course of evaluating a given circuit or program.
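The linearity argument behind no-cloning can be checked numerically. Suppose a hypothetical "cloner" copies the basis states perfectly; linearity then dictates what it must do to a superposition, and the result is an entangled state, not two independent copies. A sketch under these assumptions:

```python
import math

# Suppose U clones basis states: U(|b>|0>) = |b>|b>. On a superposition
# (a|0> + b|1>)|0>, linearity forces U to output a|00> + b|11>,
# but a genuine copy would be (a|0> + b|1>)(a|0> + b|1>).
s = 1 / math.sqrt(2)
a, b = s, s                  # amplitudes of an equal superposition

# What linearity actually gives (amplitudes over 00, 01, 10, 11):
linear_result = {"00": a, "01": 0.0, "10": 0.0, "11": b}

# What a true copy would require:
true_copy = {"00": a * a, "01": a * b, "10": a * b, "11": b * b}

cloning_possible = all(
    abs(linear_result[k] - true_copy[k]) < 1e-9 for k in linear_result
)
# cloning_possible is False: the two states differ whenever both a and b
# are nonzero, so no linear (i.e., quantum) operation can clone arbitrary states.
```

The contradiction holds for every superposition with two nonzero amplitudes, which is why copying must be replaced by operations like teleportation that consume the original state.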

Unparalleled Computational Power vs. Enormous Engineering Complexity

The fundamental properties of quantum mechanics enable a much more powerful computational paradigm. Whereas the resources required to represent a complex system (such as individual molecules in a fluid) might overwhelm even the most powerful classical hardware, quantum computers can harness superposition and entanglement to solve these otherwise intractable problems.

One of the most famous examples is Shor’s algorithm, which can efficiently factor large integers and break widely used cryptographic systems like RSA and ECDSA. It does this by combining superposition to explore many potential factors at once, entanglement to maintain correlations across qubits, and interference to amplify the correct solution while canceling out incorrect ones. What would take classical computers billions of years to compute, a sufficiently large quantum computer could solve in hours.
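At its core, Shor's algorithm reduces factoring to period finding: the quantum computer's only job is to find the period r of f(x) = a^x mod N, after which the factors fall out classically via greatest common divisors. The sketch below finds the period by brute force, which is exactly the step that is exponentially hard classically for large N, using the standard toy example N = 15.

```python
from math import gcd

def find_period(a, N):
    """Smallest r > 0 with a^r = 1 (mod N), found by brute force.
    This is the step a quantum computer performs efficiently."""
    x, value = 1, a % N
    while value != 1:
        x += 1
        value = (value * a) % N
    return x

N, a = 15, 2                    # toy example: factor 15 using base a = 2
r = find_period(a, N)           # r = 4, since 2^4 = 16 = 1 (mod 15)

# When r is even and a^(r/2) != -1 (mod N), gcd extracts the factors:
p = gcd(a ** (r // 2) - 1, N)   # gcd(3, 15) = 3
q = gcd(a ** (r // 2) + 1, N)   # gcd(5, 15) = 5
```

Replacing `find_period` with the quantum subroutine (superposition over all exponents, entanglement with the function values, and interference via the quantum Fourier transform) is what turns this classical skeleton into an efficient attack on RSA-scale moduli.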

However, due to the nature of the wavefunction, this paradigm is inherently probabilistic. Moreover, any measurement, or even an accidental subatomic interaction, can instantly destroy this fragile system. Thus, the theoretical potential of a quantum computer is matched by the daunting engineering challenges involved in practically building one.

In the next post, we'll examine the specific challenges introduced by the quantum computing paradigm, and construct a framework for how to evaluate different real-world approaches for solving them.