A Different Kind of Machine
The classical computer, in all its forms — from the room-filling machines of the 1940s to the devices carried in every pocket today — is, at its heart, a device for manipulating bits. A bit is the simplest unit of information: a binary digit that is either zero or one. Every calculation a classical computer performs, however complex, reduces ultimately to operations on strings of zeros and ones. The power of modern computing comes from doing this with extraordinary speed and reliability, not from any fundamental departure from this binary logic.
A quantum computer operates on a fundamentally different principle. Its basic unit of information is the qubit — a quantum bit that, unlike its classical counterpart, can exist in a superposition of zero and one simultaneously. Moreover, multiple qubits can be entangled, creating correlations between their states that have no classical analogue. As a consequence, the joint state of n qubits is described by 2^n complex amplitudes, a number that grows exponentially with the number of qubits, and a quantum algorithm can in principle exploit interference across that entire space at once, even though a measurement reveals only a single n-bit outcome.
This is not merely faster classical computation. It is a different kind of computation, capable in principle of solving certain classes of problems that are intractable for even the most powerful classical supercomputers. The differences in capability arise not from hardware speed but from the deep mathematical properties of quantum mechanics — superposition, entanglement, and quantum interference — exploited by carefully designed quantum algorithms.
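Superposition and entanglement can be made concrete with a few lines of ordinary Python, no quantum hardware or libraries assumed. The sketch below writes out a two-qubit Bell state as its four complex amplitudes and computes the measurement probabilities; only the perfectly correlated outcomes ever appear.

```python
import math

# A minimal amplitude-vector sketch: the Bell state (|00> + |11>)/sqrt(2)
# over two qubits. Indices 0..3 correspond to the basis states
# |00>, |01>, |10>, |11>.
amp = [1 / math.sqrt(2), 0.0, 0.0, 1 / math.sqrt(2)]

# Measurement probabilities are the squared magnitudes of the amplitudes;
# they always sum to 1.
probs = [a * a for a in amp]
print(probs)  # only |00> and |11> are ever observed: perfect correlation
```

The entanglement shows up in what is absent: the outcomes |01> and |10> have zero probability, so measuring one qubit fixes the other, a correlation no pair of independent classical coins can reproduce.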
What Quantum Computers Can Do That Classical Ones Cannot
The most celebrated early quantum algorithm was developed by Peter Shor in 1994. Shor's algorithm can factorise large integers in polynomial time, a task for which the best known classical algorithms require super-polynomial time. This is significant because the security of most modern public-key cryptography — including the protocols that protect financial transactions and communications — depends on the computational difficulty of factorising large integers, or of the closely related discrete-logarithm problem, which Shor's algorithm also solves. A sufficiently powerful quantum computer running Shor's algorithm could, in principle, break this encryption.
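The quantum core of Shor's algorithm is period finding; the reduction from factoring to period finding is entirely classical. A sketch of that classical part, with the period found by brute force (feasible only for tiny N, which is exactly the point: the quantum machinery replaces only this one subroutine):

```python
import math

def order(a: int, n: int) -> int:
    """Smallest r > 0 with a**r % n == 1. Brute-force stand-in for the
    quantum period-finding subroutine, which does this efficiently."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_part(n: int, a: int) -> int:
    """Recover a nontrivial factor of n from the period of a mod n,
    assuming a is coprime to n and the period turns out to be usable."""
    r = order(a, n)
    if r % 2 == 1 or pow(a, r // 2, n) == n - 1:
        raise ValueError("unlucky base a; retry with another")
    return math.gcd(pow(a, r // 2) - 1, n)

print(shor_classical_part(15, 7))  # 7 has period 4 mod 15; prints 3
```

For most randomly chosen bases a the gcd step yields a nontrivial factor; a full implementation simply retries on an unlucky base.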
Lov Grover's algorithm, published in 1996, provides a quadratic speedup for searching an unstructured database: a marked item among N can be found in roughly √N quantum steps, where a classical search needs on the order of N. While more modest than the speedup offered by Shor's algorithm, it applies to a broader range of problems, and it represents a genuine quantum advantage over classical search methods.
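Grover's algorithm is small enough to simulate classically with a plain state vector. The sketch below (ordinary Python, no quantum libraries) searches N = 8 items for a single marked index, alternating the oracle's sign flip with the diffusion step (reflection about the mean amplitude) for about (π/4)√N iterations.

```python
import math

def grover(n_items: int, marked: int) -> list[float]:
    """Simulate Grover search; return measurement probabilities."""
    amp = [1 / math.sqrt(n_items)] * n_items       # uniform superposition
    iterations = int(math.pi / 4 * math.sqrt(n_items))
    for _ in range(iterations):
        amp[marked] = -amp[marked]                 # oracle: flip the mark
        mean = sum(amp) / n_items                  # diffusion step:
        amp = [2 * mean - a for a in amp]          # reflect about the mean
    return [a * a for a in amp]                    # squared magnitudes

probs = grover(8, marked=5)
print(round(probs[5], 3))  # 0.945 after just 2 iterations
```

Two iterations concentrate over 94% of the probability on the marked item, against the 1/8 chance of a single classical guess; for large N the √N scaling is where the quadratic advantage lies.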
Beyond these foundational algorithms, quantum computing holds particular promise for simulating quantum systems. This was Richard Feynman's original motivation for proposing quantum computation in 1981. Classical computers struggle to simulate quantum systems efficiently, because the amount of information required to describe a quantum state grows exponentially with the size of the system. A quantum computer can simulate a quantum system using resources that grow only polynomially, potentially enabling the accurate modelling of molecules, materials, and chemical reactions at a level of detail impossible for classical computers.
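The exponential cost Feynman identified is easy to quantify. A sketch, assuming the straightforward classical approach of storing the full state vector at double precision (16 bytes per complex amplitude):

```python
def state_memory_bytes(n_qubits: int) -> int:
    """Memory needed to store the full state vector of n qubits:
    2**n complex amplitudes at 16 bytes each."""
    return 16 * 2 ** n_qubits

# Each added qubit doubles the requirement.
for n in (30, 40, 50):
    gib = state_memory_bytes(n) / 2 ** 30
    print(f"{n} qubits: {gib:,.0f} GiB")
```

Thirty qubits fit in a laptop's memory; forty need a large server; fifty exceed the storage of the biggest supercomputers, which is why even modest quantum processors probe regimes that brute-force classical simulation cannot reach.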
The implications for drug discovery, materials science, and chemistry are profound. Many of the most important problems in these fields — designing more efficient catalysts, discovering new drugs, developing better batteries — require accurate simulation of quantum chemical processes. Quantum computers may enable calculations that are currently years beyond classical reach to be performed in hours or days.
The Current State: Promise and Distance
As of the mid-2020s, quantum computers exist and are operational, but they remain in what researchers call the NISQ era — Noisy Intermediate-Scale Quantum. Current quantum processors range from dozens to roughly a thousand physical qubits, but these qubits are fragile: they are subject to errors caused by interaction with their environment, a phenomenon called decoherence. Maintaining quantum states long enough to perform useful calculations requires extraordinary isolation from thermal and electromagnetic disturbances, typically achieved by cooling the processor to temperatures near absolute zero.
The major technological challenge is quantum error correction: developing methods to protect quantum information from errors without destroying the quantum properties that make quantum computation valuable. Theoretical error correction schemes exist and are well-understood, but implementing them in practice requires many physical qubits to encode each logical qubit, and the overhead is substantial. The transition from NISQ computing to fault-tolerant quantum computing — computing that can run arbitrarily long algorithms with acceptable error rates — remains a major unsolved engineering challenge.
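Quantum error-correcting codes are beyond a short sketch, but the overhead they impose can be illustrated with their classical ancestor, the three-bit repetition code: one logical bit is encoded as three physical bits and decoded by majority vote. (Real quantum codes, such as the surface code, are far subtler, since qubits cannot be copied and phase errors must be corrected too, but the physical-per-logical overhead is the same in spirit.)

```python
import random

def encode(bit: int, copies: int = 3) -> list[int]:
    """One logical bit becomes three physical bits."""
    return [bit] * copies

def noisy(bits: list[int], p: float, rng: random.Random) -> list[int]:
    """Flip each physical bit independently with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

def decode(bits: list[int]) -> int:
    """Majority vote recovers the logical bit."""
    return int(sum(bits) > len(bits) / 2)

rng = random.Random(0)
p, trials = 0.1, 10_000
errors = sum(decode(noisy(encode(1), p, rng)) != 1 for _ in range(trials))
print(errors / trials)  # logical error rate ~ 3p^2, well below p itself
```

The code fails only when two or more of the three bits flip, so a physical error rate p becomes a logical rate of about 3p², at the price of tripling the hardware. Proposed fault-tolerant quantum schemes pay a far steeper price, often hundreds or thousands of physical qubits per logical qubit, which is the overhead the paragraph above refers to.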
This honest assessment of the current state is important. Quantum computing is a genuine technological revolution in the making, but it is not yet mature, and many of the most important applications are years or decades away. The serious seeker does not inflate nascent promise into present reality. The doctrine's demand for proportion — for confidence proportioned to evidence — applies as much to enthusiasm as to scepticism.
Quantum Computing and the Acceleration of Discovery
When fault-tolerant quantum computing arrives, its impact on the pace of discovery could be transformative. The ability to simulate molecular processes accurately — to model how a protein folds, how a drug binds to its target, how a catalyst lowers the activation energy of a chemical reaction — could compress timescales in pharmaceutical and materials research that currently stretch over decades.
In machine learning and artificial intelligence, quantum algorithms may offer advantages in training and optimisation that accelerate the development of AI systems. The intersection of quantum computing and artificial intelligence is an active research frontier, with potential applications in pattern recognition, optimisation, and the analysis of large data sets.
In fundamental physics, quantum computers could enable the simulation of quantum field theories at scales and regimes inaccessible to current methods, potentially shedding light on questions in particle physics and cosmology that classical computation cannot address. The simulation of quantum chromodynamics — the theory of the strong nuclear force — is one candidate application with the potential to deepen understanding of the structure of matter.
For education and learning more broadly, quantum computing will eventually create a demand for new forms of mathematical and scientific literacy. Just as the classical computer created a need for computational thinking that is now a standard component of education at every level, quantum computing will require a workforce conversant with quantum concepts, quantum algorithms, and the distinctive logic of quantum information. This is a transformation in the landscape of learning that the doctrine would recognise as a new kind of Crossing: a collective entry into a domain of knowledge that requires new intellectual formation.
The Moral Stakes of Quantum Computation
As with every powerful technology, quantum computing raises questions that extend beyond the technical. The cryptographic implications are among the most pressing. The prospect of quantum computers capable of breaking current encryption is not a distant science fiction scenario. It is a planning horizon that governments, corporations, and security researchers must address now — through the development and deployment of post-quantum cryptographic standards, which are already under active development by standards bodies including NIST.
Access and distribution are further concerns. Quantum computing infrastructure is capital-intensive and technically demanding. In its early commercial phases, it will be available only to well-resourced institutions and nations. The question of how the benefits of quantum computation — in medicine, materials, and energy — are distributed, and the question of how the competitive advantages it confers are managed between nations, will be among the significant geopolitical challenges of the coming decades.
The doctrine holds that knowledge severed from service is unfinished. The same principle applies to the technologies knowledge produces. The power to compute — including, eventually, quantum power to compute — carries with it the responsibility to use that power in ways that serve the common good, reduce harm, and do not concentrate advantage in ways that deepen existing inequalities. These are not questions for the distant future. They are questions being formed now, in the choices made about investment, access, governance, and purpose.
Knowledge that serves no one remains unfinished.