Classical computers store information as bits: each one is definitively 0 or 1. Quantum computers store information as qubits — quantum objects whose state is described by a vector in a two-dimensional complex vector space. A qubit can be in state \(|0\rangle\), state \(|1\rangle\), or any superposition \(\alpha|0\rangle + \beta|1\rangle\), where \(\alpha\) and \(\beta\) are complex numbers satisfying \(|\alpha|^2 + |\beta|^2 = 1\).
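Concretely, you can model a qubit in plain Python as a pair of complex amplitudes. A minimal sketch (the particular amplitude values below are just an illustration; any pair works as long as the squared magnitudes sum to 1):

```python
import math

# A qubit state alpha|0> + beta|1> as two complex amplitudes.
# These specific values are illustrative, not special.
alpha = 1 / math.sqrt(2)
beta = 1j / math.sqrt(2)

# Check the normalization condition |alpha|^2 + |beta|^2 = 1.
norm = abs(alpha) ** 2 + abs(beta) ** 2
assert math.isclose(norm, 1.0)
print(round(norm, 12))  # 1.0
```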
This is not "being 0 and 1 at the same time" in any classical sense. The amplitudes \(\alpha\) and \(\beta\) are complex numbers that interfere like waves: they can reinforce (constructive interference) or cancel (destructive interference). A quantum algorithm is an arrangement of operations that makes the amplitudes of wrong answers cancel and the amplitudes of right answers reinforce. That interference is the engine of quantum speedup.
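A minimal sketch of interference, with the Hadamard gate written out by hand (we assume no quantum library here): applying it twice to \(|0\rangle\) makes the two contributions to \(|1\rangle\) cancel exactly.

```python
import math

def hadamard(state):
    """Apply the Hadamard gate to a single-qubit state (a, b)."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

# Start in |0>.
state = (1 + 0j, 0 + 0j)

# One Hadamard creates the equal superposition |+>.
plus = hadamard(state)

# A second Hadamard makes the two contributions to |1> arrive with
# opposite signs: they cancel, and all amplitude returns to |0>.
back = hadamard(plus)
print(round(abs(back[0]), 6), round(abs(back[1]), 6))  # 1.0 0.0
```

That cancellation is destructive interference in its simplest form: the amplitude reaching \(|1\rangle\) via the \(|0\rangle\) branch is \(+\frac{1}{2}\), via the \(|1\rangle\) branch \(-\frac{1}{2}\), and they sum to zero.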
When you measure a qubit, the superposition collapses: you get \(|0\rangle\) with probability \(|\alpha|^2\) or \(|1\rangle\) with probability \(|\beta|^2\). The information encoded in the amplitudes is destroyed. This is why quantum computing isn't just "parallel classical computing" — you can't peek at all the parallel branches. The art is in extracting useful information from the interference pattern before measurement destroys it.
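Simulating measurement classically makes the probabilistic collapse concrete (the amplitudes below are arbitrary illustrative values):

```python
import math
import random

random.seed(0)  # reproducible sampling for this illustration

alpha, beta = 0.6, 0.8j  # a valid state: 0.36 + 0.64 = 1
p0, p1 = abs(alpha) ** 2, abs(beta) ** 2
assert math.isclose(p0 + p1, 1.0)

# Each measurement yields one classical bit; afterwards the state is
# definitively |0> or |1> and the amplitudes are gone.
counts = {0: 0, 1: 0}
for _ in range(10_000):
    outcome = 0 if random.random() < p0 else 1
    counts[outcome] += 1
print(counts)  # roughly 3600 zeros and 6400 ones
```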
A classical computer with \(n\) bits is in exactly one of \(2^n\) possible states at any moment. A quantum computer with \(n\) qubits has a state vector with \(2^n\) complex amplitudes — all of which exist simultaneously and interact with each other. This is an exponentially larger state space, and it's the source of quantum computing's potential power.
But there's a catch: you can only extract \(n\) classical bits from \(n\) qubits through measurement. The exponential state space is working memory, not output. To exploit it, you need algorithms that are carefully designed so that the answer you want has high amplitude when you measure. Random quantum circuits don't outperform classical computers — only specific algorithmic structures do.
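The bookkeeping cost is easy to see in code: a classical simulation of an \(n\)-qubit register must store \(2^n\) amplitudes, yet a measurement returns only an \(n\)-bit index.

```python
# An n-qubit register needs 2**n complex amplitudes to simulate
# classically. n = 10 is an illustrative choice.
n = 10
state = [0j] * (2 ** n)
state[0] = 1 + 0j          # the all-zeros basis state |00...0>
print(len(state))          # 1024 amplitudes for just 10 qubits

# A measurement yields only one basis index, i.e. n classical bits.
index = 5                  # a hypothetical measurement outcome
print(format(index, f"0{n}b"))  # 0000000101
```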
Three properties distinguish quantum from classical computation:
The idea that physics could be used for computation in fundamentally new ways emerged in the early 1980s:
In 1981, Richard Feynman observed that simulating quantum systems on classical computers is exponentially expensive: the state space of \(n\) quantum particles requires \(2^n\) amplitudes. He proposed that a computer built from quantum components could simulate quantum physics efficiently — the first argument for why quantum computers might be useful.
In 1985, David Deutsch formalized the quantum Turing machine and proved the first quantum speedup: the Deutsch algorithm, which determines whether a function is constant or balanced using one query instead of two. The speedup was tiny, but the principle was revolutionary — quantum mechanics allows computations that no classical algorithm can replicate.
In 1994, Peter Shor published his factoring algorithm, showing that a quantum computer could break RSA encryption in polynomial time. No polynomial-time classical factoring algorithm is known; the best classical algorithm runs in sub-exponential but super-polynomial time. This was the result that made quantum computing a national security concern and triggered massive investment.
In 1996, Lov Grover showed that quantum computers can search an unstructured database of \(N\) items in \(O(\sqrt{N})\) time instead of \(O(N)\). A quadratic speedup — less dramatic than Shor's exponential one, but applicable to a vast range of problems.
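To make the mechanics concrete, here is a minimal state-vector sketch of one Grover iteration for \(N = 4\) (the marked index is an arbitrary illustration). For this size, a single iteration boosts the marked amplitude all the way to 1:

```python
# One Grover iteration on N = 4 items (2 qubits), simulated directly
# on the amplitude vector. 'marked' is an arbitrary illustrative index.
N = 4
marked = 2

# Start in the uniform superposition: every amplitude 1/sqrt(N).
amps = [1 / N ** 0.5] * N

for _ in range(1):  # for N = 4, one iteration suffices
    # Oracle: flip the sign of the marked item's amplitude.
    amps[marked] = -amps[marked]
    # Diffusion: reflect every amplitude about the mean.
    mean = sum(amps) / N
    amps = [2 * mean - a for a in amps]

print([round(a, 6) for a in amps])  # [0.0, 0.0, 1.0, 0.0]
```

The sign flip plus reflection-about-the-mean is the whole trick: each iteration rotates amplitude toward the marked item, and after about \(\frac{\pi}{4}\sqrt{N}\) iterations it dominates.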
Also in the 1990s, quantum error correction was developed (Shor, Steane, Calderbank), proving that quantum computation could in principle be made fault-tolerant despite the fragility of quantum states. Without this theoretical result, building practical quantum computers would have seemed hopeless.
In 2019, Google claimed quantum supremacy (now called "quantum advantage"): their 53-qubit Sycamore processor performed a specific sampling task in 200 seconds that they estimated would take a classical supercomputer 10,000 years. The claim was debated, but it marked a milestone.
In 2022, the Nobel Prize in Physics was awarded to Alain Aspect, John Clauser, and Anton Zeilinger for experiments with entangled photons, establishing violations of Bell inequalities — proving that quantum entanglement is real and not explained by hidden classical variables. This experimentally sealed the foundation that quantum computing rests on.
We are currently in the NISQ era (Noisy Intermediate-Scale Quantum) — devices with 50–1000+ qubits that are too noisy for full error correction. The race is to either find useful algorithms for noisy devices or to build fault-tolerant quantum computers with enough logical qubits to run Shor's algorithm on real cryptographic keys.
Three forces make quantum computing a pressing subject for computer scientists today:
Shor's algorithm means that RSA, Diffie-Hellman, and elliptic curve cryptography will all break when large-scale quantum computers exist. NIST finalized its first post-quantum cryptography standards in 2024. The migration is already underway. Understanding why quantum computers threaten classical cryptography requires understanding the quantum algorithms that do it.
Beyond cryptography, quantum algorithms offer potential speedups for: simulating molecules (drug discovery, materials science), optimization (logistics, finance), linear algebra (machine learning kernels), and sampling (statistical physics, Monte Carlo). Whether these advantages survive in practice — with noise, limited connectivity, and finite qubit counts — is an active research question.
Even if you never program a quantum computer, understanding quantum computation deepens your understanding of computation itself. The Church-Turing thesis says all classical models of computation are equivalent. Quantum computers don't violate the Church-Turing thesis (they compute the same functions), but they appear to violate the extended Church-Turing thesis (the belief that classical computers can efficiently simulate any physical process). The "appear" matters: the separation rests on unproven conjectures, such as the classical hardness of factoring. If it holds, the physics of the universe enables computational models that are fundamentally more efficient for certain tasks. That's a profound insight about the nature of computation.
You don't need to understand quantum mechanics to do quantum computing (this is a CS course, not a physics course). But understanding why the math looks the way it does helps build intuition.
Quantum states are vectors in a complex vector space. A qubit lives in \(\mathbb{C}^2\) — a two-dimensional space with complex coordinates. The state \(\alpha|0\rangle + \beta|1\rangle\) is a unit vector in this space; once you discard the unobservable global phase, each single-qubit state corresponds to a point on the surface of a sphere (the Bloch sphere). Operations on qubits are unitary matrices: norm-preserving linear maps that act as rotations of the Bloch sphere. This is why linear algebra is the language of quantum computing.
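A small sketch of the "operations are rotations" point: applying a rotation matrix (here a Y-rotation by an arbitrary angle) to a state leaves the norm, and hence the total probability, at 1.

```python
import math

def apply(u, state):
    """Apply a 2x2 matrix u to a single-qubit state vector (a, b)."""
    (u00, u01), (u10, u11) = u
    a, b = state
    return (u00 * a + u01 * b, u10 * a + u11 * b)

# A rotation about the Y axis; the angle is an illustrative choice.
theta = 0.7
ry = ((math.cos(theta / 2), -math.sin(theta / 2)),
      (math.sin(theta / 2),  math.cos(theta / 2)))

state = (1 + 0j, 0 + 0j)           # |0>
rotated = apply(ry, state)

# Unitaries preserve the norm, so probabilities still sum to 1.
norm = abs(rotated[0]) ** 2 + abs(rotated[1]) ** 2
print(round(norm, 12))  # 1.0
```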
When two qubits interact (e.g., through a CNOT gate), their joint state can become entangled: a state that cannot be written as a product of individual qubit states. The Bell state \(\frac{1}{\sqrt{2}}(|00\rangle + |11\rangle)\) says: "if you measure the first qubit and get 0, the second is guaranteed to be 0; if you get 1, the second is guaranteed to be 1." On its own, that perfect agreement could be mimicked by classical shared randomness; what cannot be mimicked is the pattern of correlations the same state shows when measured in rotated bases, which violates Bell inequalities (Bell's theorem). Entanglement is a correlation stronger than anything classical physics allows, and it's a computational resource that quantum algorithms exploit.
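You can build the Bell state by hand on a four-amplitude state vector: Hadamard on the first qubit, then CNOT. A sketch in plain Python, tracking amplitudes in the order \(|00\rangle, |01\rangle, |10\rangle, |11\rangle\):

```python
import math

s = 1 / math.sqrt(2)

# Two-qubit state as amplitudes for |00>, |01>, |10>, |11>.
state = [1 + 0j, 0j, 0j, 0j]   # start in |00>

# Hadamard on the first qubit mixes pairs of basis states that
# differ in that qubit: |00> -> (|00> + |10>)/sqrt(2).
state = [s * (state[0] + state[2]), s * (state[1] + state[3]),
         s * (state[0] - state[2]), s * (state[1] - state[3])]

# CNOT (first qubit controls the second): swap the |10> and |11>
# amplitudes.
state[2], state[3] = state[3], state[2]

# Result: the Bell state (|00> + |11>)/sqrt(2). Outcomes 01 and 10
# never occur, so the two measured bits always agree.
print([round(abs(a) ** 2, 3) for a in state])  # [0.5, 0.0, 0.0, 0.5]
```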
Measurement is where quantum mechanics gets strange. Before measurement, a qubit's state is a continuous complex vector. After measurement, it's definitively \(|0\rangle\) or \(|1\rangle\), with probabilities given by the squared magnitudes of the amplitudes. The act of measurement is irreversible and probabilistic — it destroys the superposition. This is not a failure of precision; it's a fundamental feature of the theory, confirmed by decades of experiment.
The choice of measurement basis matters. Measuring in the computational basis (\(|0\rangle, |1\rangle\)) gives different results than measuring in the Hadamard basis (\(|+\rangle, |-\rangle\)). The state hasn't changed — but what you learn from it depends on how you look. This basis-dependence is central to quantum algorithms: many speedups come from measuring in the right basis at the right time.
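A sketch of this basis-dependence: the same state \(|+\rangle\) measured in the computational basis is a fair coin, but measured in the Hadamard basis (simulated here by rotating with H first and then measuring computationally) it is deterministic.

```python
import math

s = 1 / math.sqrt(2)
plus = (s, s)  # the |+> state

# Computational basis: squared magnitudes of the amplitudes directly.
p0 = abs(plus[0]) ** 2
p1 = abs(plus[1]) ** 2
print(round(p0, 3), round(p1, 3))  # 0.5 0.5 -- a fair coin

# Hadamard basis: rotate with H, then read off the probabilities.
a, b = plus
rotated = (s * (a + b), s * (a - b))
p_plus = abs(rotated[0]) ** 2
p_minus = abs(rotated[1]) ** 2
print(round(p_plus, 3), round(p_minus, 3))  # 1.0 0.0 -- deterministic
```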
The headline results of quantum computing are algorithmic. Shor's algorithm factors integers in polynomial time (an exponential speedup over the best known classical algorithm). Grover's algorithm searches unstructured data in \(O(\sqrt{N})\) time (a quadratic speedup, provably optimal). The Deutsch-Jozsa algorithm solves a specific promise problem in one query versus \(2^{n-1}+1\) classically. These are not engineering tricks: they exploit the mathematical structure of quantum mechanics. Grover's and Deutsch-Jozsa's separations are provable in the query model, while Shor's exponential speedup rests on the widely held belief that no efficient classical factoring algorithm exists.
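The single-bit Deutsch case is small enough to simulate directly. A sketch using the phase-kickback form of the oracle (the oracle multiplies each branch's amplitude by \((-1)^{f(x)}\)), assuming no quantum library:

```python
import math

def deutsch(f):
    """One-query Deutsch: True if f is constant, False if balanced.

    Simulated via phase kickback: the oracle multiplies each branch's
    amplitude by (-1)**f(x).
    """
    s = 1 / math.sqrt(2)
    # Input qubit in |+>: equal amplitudes over x = 0 and x = 1.
    amps = [s, s]
    # One oracle application, acting on both branches at once.
    amps = [(-1) ** f(x) * a for x, a in enumerate(amps)]
    # A final Hadamard turns the phase information into a bit:
    # measuring |0> with probability 1 means f was constant.
    a0 = s * (amps[0] + amps[1])
    return abs(a0) ** 2 > 0.5

print(deutsch(lambda x: 0))  # True  (constant)
print(deutsch(lambda x: x))  # False (balanced)
```

The point to notice: one oracle application suffices because the superposition carries both inputs, and the final Hadamard converts the relative phase between them into a measurable bit.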
Shor's algorithm threatens all public-key cryptography based on factoring or discrete logarithms. This includes RSA, Diffie-Hellman, DSA, and elliptic curve cryptography. Post-quantum cryptography uses problems believed to be hard even for quantum computers: lattice problems, hash-based signatures, code-based cryptography. Understanding the quantum threat requires understanding the quantum algorithms.
Feynman's original motivation. Simulating a molecule with \(n\) electrons requires tracking \(2^n\) amplitudes classically. A quantum computer can simulate quantum systems in polynomial time — because it is a quantum system. This has implications for drug discovery, materials science, and fundamental physics. It's potentially the first practical application of quantum computers.
Quantum computing introduces the complexity class BQP (bounded-error quantum polynomial time) — the quantum analogue of P. The relationship between BQP and classical classes is partially understood: P \(\subseteq\) BQP \(\subseteq\) PSPACE, but we don't know if BQP contains NP-complete problems. Proving (or disproving) that quantum computers can solve NP-complete problems efficiently would be one of the biggest results in the history of mathematics.
The course builds from the ground up, following a strict dependency chain where each week requires everything that came before:
The thread connecting everything: define the mathematical object (state vectors) → define valid transformations (unitary gates) → understand what information you can extract (measurement) → build algorithms from these primitives (interference → speedup).
Quantum computing has a specific aesthetic that's different from most of CS: