Overview
Computers today have processing units (typically a CPU or GPU) that run computations. These processing units are made up of billions of transistors, which act as switches that can be in one of two states: allowing current to flow (represented as 1, or "on") or blocking current (represented as 0, or "off"). Say you want to calculate 1 + 4 on a computer. The processor receives the 1 and the 4 each as a series of 0s and 1s (corresponding to transistors being on or off), performs a coded operation, and sends back a series of 0s and 1s - in this example, 0001 + 0100 = 0101, which is 5. This representation of numbers as 0s and 1s is called binary, a base-2 number system in which each digit's position corresponds to a power of two, starting at 2^0 on the right. For clarity, here’s a 4-bit example:
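0101 = 0×8 + 1×4 + 0×2 + 1×1 = 5
so 0001 (which is 1) + 0100 (which is 4) = 0101 (which is 5), matching the addition above.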
The key constraint is that each transistor can only hold a 0 or a 1 at any given time.
The idea behind quantum computing is: instead of bits (0s and 1s), you have qubits (quantum bits). These qubits can be in a 0 or 1 state, but can also exist in a state called superposition, meaning they hold a blend of 0 and 1 simultaneously. Superposition, together with interference and entanglement (more on both below), is what lets a quantum computer work with an enormous number of possibilities at once rather than sequentially, which for certain problems can translate into exponential speedups.
- Classical Bits: 1 bit can be 0 or 1. 2 bits have four possible combinations (00, 01, 10, 11), but a normal computer can only hold one of those at a time. 3 bits have 8 possible combinations (000 through 111), and again a classical system is in only one of those states at any moment.
- Quantum Qubits: 1 qubit can be 0, 1, or both at once. 2 qubits can exist in a superposition of all four combinations (00, 01, 10, 11) simultaneously. 3 qubits can occupy all 8 combinations at once. In general, n qubits can represent 2^n states at the same time.
This exponential growth is the “power of doubling.” For example, 50 qubits can theoretically hold over a quadrillion (1,000 trillion) states at once.
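To see the doubling concretely, here is a minimal sketch in plain Python with numpy (my own illustration, not anything from a quantum framework) that builds the full amplitude vector for n qubits in an equal superposition; the number of amplitudes you would have to track classically doubles with every qubit added.

```python
import numpy as np

def uniform_superposition(n_qubits):
    # One complex amplitude per basis state, so the vector has 2**n entries.
    dim = 2 ** n_qubits
    return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

for n in (1, 2, 3, 10, 20):
    print(f"{n} qubits -> {uniform_superposition(n).size} amplitudes")
# prints 2, 4, 8, 1024, and 1048576 amplitudes respectively
```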
There is one other feature of qubits that is important to mention: entanglement. Two or more qubits can become correlated in such a way that measuring one immediately tells you about the outcome of measuring the others, regardless of the distance between them. This helps quantum algorithms coordinate information across qubits and explore solution spaces more efficiently.
What does this mean for computing? For certain problems, a well-built quantum computer could explore many candidate solutions in parallel, making it suited to problems with a large number of variables, high complexity, or strong correlations. Researchers estimate that a fault-tolerant quantum computer with around 50 error-corrected "logical" qubits could perform specific tasks that would take classical supercomputers an impractical amount of time, largely because a quantum state of that size can no longer be brute-force simulated.
D-Wave summarizes this nicely in their 10-K: “In classical computation, binary information is encoded in bits that can be in a 0 or 1 state. Classical processors manipulate and transform this binary information to run classical algorithms and perform computations. Still, many important and high-value problems remain difficult or out of reach for classical computers, which creates a demand for quantum computing…[quantum] systems contain quantum bits (qubits) that can be in a superposition of both 0 and 1 simultaneously, and support entanglement across many qubits. These properties provide computational tools that enable new algorithms and applications for solving problems that are outside the reach of classical computing systems.”
The impact of quantum on AI today is relatively undefined. AI workloads are built on linear-algebra operations that already run massively in parallel on GPUs and tensor accelerators. Quantum computers could someday accelerate specific sub-tasks (e.g., large-scale optimization or sampling problems), but today they are not a drop-in replacement for classical AI hardware, and their practical value in the context of AI is unproven.
Quantum computing today faces several challenges that limit its scalability:
- Decoherence: Environmental noise (thermal vibrations, electromagnetic fields, etc.) causes qubits to lose their superposition states extremely quickly. Qubits are highly sensitive, and a significant part of experimentation is building low-noise systems that keep qubits coherent long enough to finish a computation.
- Current scale: The best quantum computers have passed 1,000 physical qubits, but these are "noisy" qubits. Many physical qubits - often estimated at hundreds to thousands, depending on error rates - are needed to produce a single error-corrected "logical" qubit.
- Temperature requirements: Most quantum computers operate near absolute zero (millikelvin range), requiring sophisticated cooling systems.
- Limited applicability: Quantum computers only provide advantages for specific problem types.
- Error rates: Current quantum computers are still too error-prone for most practical applications.
The cost, power requirements, technical challenges, and limited problem scope mean quantum computers today complement rather than replace classical computers. However, as we’ll expand on in the last section, there are possibilities for the scope of quantum expanding in the future.
Implementation
There are two main types of quantum computers: gate-based quantum computers and quantum annealers.
Gate-Based Quantum Computers:
Classical computers process information by passing electrical signals (representing 0s and 1s) through logic gates. Common gates include AND, OR, and NOT. For example, an AND gate with two inputs will output 1 only if both inputs are 1; otherwise it outputs 0. These gates perform deterministic logical operations. Quantum computers also use gates, but quantum gates manipulate probability amplitudes rather than definite values.
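A rough way to see the difference is this toy sketch in plain Python/numpy (my own illustration, no quantum SDK involved): a classical gate maps definite bits to a definite bit, while a quantum gate is a matrix acting on a vector of amplitudes.

```python
import numpy as np

# Classical logic gate: definite bits in, a definite bit out.
def AND(a, b):
    return a & b

print(AND(1, 1), AND(1, 0))     # 1 0

# Quantum gate: a matrix acting on a vector of amplitudes.
X = np.array([[0, 1],
              [1, 0]])          # the "quantum NOT" gate: swaps the |0> and |1> amplitudes

ket0 = np.array([1, 0])         # amplitude vector for the |0> state
print(X @ ket0)                 # [0 1], i.e. the |1> state
```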
A quantum computation follows three main steps. First, during initialization, qubits start in a known state, typically |0⟩. Next comes the quantum circuit - a carefully designed sequence of quantum gates that creates superposition and entanglement. Finally, measurement involves observing the qubits, which collapses their quantum state and yields classical 0s or 1s with probabilities determined by the final quantum state.
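Here is what those three steps look like in a hand-rolled, single-qubit simulation (again plain numpy, purely for illustration, not any vendor’s API): initialize |0⟩, run a one-gate “circuit” (a Hadamard), then sample measurements from the squared amplitudes.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1. Initialization: one qubit in the |0> state.
state = np.array([1.0, 0.0], dtype=complex)

# 2. Circuit: a single Hadamard gate puts it into superposition.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
state = H @ state

# 3. Measurement: each outcome's probability is |amplitude|^2.
probs = np.abs(state) ** 2
samples = rng.choice([0, 1], size=1000, p=probs)
print(np.round(probs, 3))        # [0.5 0.5]
print(np.bincount(samples))      # roughly 500 zeros and 500 ones
```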
Quantum algorithms are specific gate sequences designed to solve particular problems. For example, Shor's algorithm (discovered in 1994) factors large numbers exponentially faster than known classical methods, while Grover's algorithm searches unsorted databases with a quadratic speedup. Variational quantum algorithms solve optimization and chemistry problems using hybrid quantum-classical approaches. The key to developing these algorithms is figuring out the precise structure and arrangement of gates.
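To give a flavor of what a “specific gate sequence” means, here is a tiny state-vector simulation of Grover’s algorithm on 2 qubits, searching 4 items for one marked entry (plain numpy written for illustration; a real implementation would use a framework such as Qiskit, and it happens that a single Grover iteration is exactly right for 4 items).

```python
import numpy as np

n = 2                                   # qubits
N = 2 ** n                              # 4 possible items
marked = 3                              # index of the item we are searching for

# Start in an equal superposition over all 4 basis states.
state = np.full(N, 1 / np.sqrt(N))

# Oracle: flip the sign of the marked item's amplitude.
oracle = np.eye(N)
oracle[marked, marked] = -1

# Diffusion operator: reflect all amplitudes about their mean.
s = np.full((N, 1), 1 / np.sqrt(N))
diffusion = 2 * (s @ s.T) - np.eye(N)

# One Grover iteration: oracle, then diffusion.
state = diffusion @ (oracle @ state)

print(np.abs(state) ** 2)               # [0. 0. 0. 1.]: the marked item is measured with probability ~1
```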
The power of quantum computing comes from quantum interference. As qubits pass through the circuit, each gate transforms the amplitudes. These amplitudes can be positive, negative, or complex numbers. When different computational paths lead to the same outcome, their amplitudes add together. Constructive interference occurs when amplitudes have the same phase, increasing probability, while destructive interference happens when amplitudes have opposite phases, decreasing probability. Quantum algorithms are designed so wrong answers interfere destructively while correct answers interfere constructively.
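Interference is easiest to see with a single qubit: apply a Hadamard twice and the two paths leading to |1⟩ carry opposite signs and cancel, while the paths to |0⟩ reinforce, so you get |0⟩ back with certainty. A toy numpy sketch (again, illustration only):

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
ket0 = np.array([1.0, 0.0])

after_one = H @ ket0        # [0.707 0.707]: equal superposition
after_two = H @ after_one   # back to |0>

# The |1> amplitude after the second Hadamard is the sum of two paths:
# +1/2 via the |0> branch and -1/2 via the |1> branch, which cancel
# (destructive interference); the two paths to |0> both contribute +1/2 and add up.
print(np.round(after_two, 3))   # [1. 0.]
```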
Key differences summarized:
- Classical: Bits (0 or 1) → Logic gates → Deterministic output (0 or 1)
- Quantum: Qubits (0 and 1) → Quantum gates transform amplitudes → Measurement gives 0 or 1 with specific probabilities
This is why quantum algorithm design is challenging - you must orchestrate the interference pattern across the entire circuit to amplify the probability of measuring the correct answer. Simply having qubits in superposition isn't enough; the algorithm must create the right interference patterns.
With this more complex system of gates and interference, more errors arise. Quantum states are fragile and can decay quickly due to environmental noise (decoherence). Unlike classical computers, where a bit is clearly 0 or 1, quantum states can drift continuously, making error detection and correction extremely challenging (quantum error correction). Finally, quantum gates themselves have error rates of roughly 0.1-1%.
All of this is to say: the necessary algorithms are complex and error-prone today. Gate-based machines can theoretically run any quantum algorithm, making them more versatile, but they are clearly challenging to scale. The main commercial providers of gate-based quantum computers are IBM, Google, Rigetti, and IonQ. If you’d like to play around with making quantum circuits, IBM has open-source software you can use.
Annealing Quantum Computers:
Quantum annealing is a specialized approach to quantum computing designed specifically for solving optimization problems. The process starts by putting qubits into a quantum superposition where they explore all possible solutions simultaneously. An optimization problem (such as finding the shortest delivery route or the most efficient schedule) is encoded as an "energy landscape" where better solutions have lower energy. The quantum annealer then slowly reduces quantum fluctuations - a process similar to cooling molten metal - allowing the system to naturally settle into the lowest energy state, which represents the optimal solution. Quantum annealing requires less precision than a gate-based circuit, which has allowed some companies, like D-Wave, to commercialize more rapidly.
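To make the “energy landscape” framing concrete, here is a toy example (brute-force classical Python that deliberately ignores the actual annealing physics): a tiny number-partitioning problem written in Ising-style form, where each spin of +1 or -1 assigns a number to one of two groups and lower energy means a more balanced split - the kind of landscape an annealer is built to settle into.

```python
from itertools import product

numbers = [4, 5, 6, 7, 8]   # goal: split into two groups with sums as equal as possible

def energy(spins):
    # Each spin s_i = +1 or -1 assigns numbers[i] to one of two groups;
    # the energy is the squared imbalance between the group sums.
    imbalance = sum(s * n for s, n in zip(spins, numbers))
    return imbalance ** 2

# Brute-force the whole landscape (an annealer instead relaxes toward the minimum).
best = min(product([-1, 1], repeat=len(numbers)), key=energy)
print(best, energy(best))   # (-1, -1, -1, 1, 1) 0 -> groups {4, 5, 6} and {7, 8}, both sum to 15
```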
Unlike gate-based quantum computers that can run any quantum algorithm, quantum annealers can only solve these optimization problems, but they can do so with thousands of qubits. D-Wave's quantum annealers, for instance, have over 5,000 qubits and are used by companies to optimize everything from traffic flow to financial portfolios, though they cannot run algorithms like Shor's for factoring or simulate general quantum systems.
History
The history of quantum computing is important because it clarifies whether the pace of invention has been accelerating or decelerating.
The 1980s was when the idea of quantum computing first circulated. Physicist Richard Feynman proposed building a device whose qubits and gates could directly imitate quantum mechanics, making it efficient at simulating physics. David Deutsch published the idea of a “universal quantum computer”, a theoretical model that can efficiently simulate any finite physical process governed by the laws of quantum mechanics. Nothing was actually built, but the theoretical foundations were laid.
A decade later, in the 1990s, the potential of quantum computing became widely recognized through Shor’s algorithm, and in the late 1990s small-scale experiments began. By 1998, the first 2-qubit quantum logic gates had been demonstrated in labs. In 2001, IBM and Stanford University researchers ran a simple version of Shor’s algorithm on a 7-qubit quantum computer, successfully factoring the number 15.
D-Wave Systems was founded in 1999, but it took over a decade to commercialize their first product: the D-Wave One, released in 2010, a 128-qubit annealer that solved optimization problems. In 2016, IBM put a small 5-qubit quantum processor on the cloud, letting anyone run experiments on it. And in 2019, Google had a major breakthrough, announcing that their 53-qubit Sycamore processor performed a task in about 200 seconds that would take a classical supercomputer thousands of years. So 2010-2020 was a big decade for quantum commercialization.
In the last 5 years, we’ve seen rapid progress. In 2021, IBM unveiled Eagle, a 127-qubit chip, and in 2022 followed with Osprey, a 433-qubit processor. By 2023, quantum processors broke the 1,000-qubit barrier: a startup called Atom Computing announced a 1,180-qubit system, and IBM introduced Condor, a 1,121-qubit chip. However, noise and errors continue to be a problem, and these 1,000+ physical qubits are still noisy rather than error-corrected. Researchers humorously compare today’s quantum machines to the very first classical computers (like the vacuum-tube computers of the 1940s).
A lot of the skepticism around quantum comes from the fact that after almost 45 years we’re still only at roughly 1,000 qubits for gate-based systems and roughly 5,000 qubits for annealers.
Future Potential
I’d like to believe that quantum computing is our solution for solving some of the world's hardest and most critical challenges. Quantum computing hopefuls say classical computers are plateauing and energy requirements cannot keep up. Quantum computing, with the right advancements, can be the key to scientific advancements in climate change, fusion energy, quantitative finance, drug development and discovery, materials science, and artificial intelligence.
My quick summary of quantum computing is: it feels somewhat far off, and if advancements do come sooner, it feels as though they’d be accidentally realized rather than systematic progress. Said simply, quantum computing feels like an “if” rather than a “when”.
Don’t get me wrong, the specific use cases where it works are magical. Today, annealing quantum computers can solve certain highly specific optimization problems in minutes, with a level of accuracy that would take classical computers millions of years to match, and with a fraction of the energy required. The problem is that solving a wider breadth of problems would require gate-based computers, which in turn need significant algorithmic advancements.
After watching MIT courses, reading research journals, reviewing blogs and YouTube videos, and digging into public filings…the academic consensus is that we are, at a minimum, 10+ years away from gate-based quantum computing having a wide enough base of algorithms.
When I ask experts where the bottleneck is, they say everywhere. There are two types of scientists working on quantum computing: experimentalists and theorists. Theorists work on finding quantum analogs of classical algorithms. Experimentalists take those algorithms from theory to practice. It seems as though there are a few practical issues holding back development on both sides:
- Limited experts: There aren’t many people that are experts in quantum physics and/or computing. The size of the scientific pool, relative to other areas like AI, is limited.
- Expensive build-outs: While you can simulate known quantum algorithms on your computer, to actually innovate you need a massive lab with expensive equipment. This limits the level of distributed innovation that can come from outside of a lab environment.
- Bumping against physical limits: Even if you have a lab, you need coherence times long enough to complete a complex computation. This means removing essentially all noise and keeping temperatures extremely low - and from chatting with experts, we’re bumping up against the limits of the physical world.
What about the potential of quantum for linear algebra and its application to AI? One of the grandfathers of quantum computing, David Deutsch, recently stated in an interview:
“The way that I understand it at present, I don't think that quantum computation will contribute to artificial intelligence in the sense of autonomously acting information processing, which performs useful functions. Either in the sense of mindless machines, which I would call artificial intelligence, A.I. Or in the future, artificial general intelligence AGI, which is like doing the thing that humans do, thinking, explanatory knowledge creation. That sort of thing. I don't think that quantum computers will play a central role in either of those things.”
Overall, I have to admit the more I dug in, the more I was disappointed in the findings. It seems like quantum computing is filled with potential, but progress has been slow and there isn’t a clear path to mass algorithm discovery. Unlocking broader commercial use cases would require quantum computers to solve problems faster, more reliably, and more cheaply than classical computers. For this to happen, we need:
- Algorithms: We need more algorithms.
- Reprogrammability: We need quantum computers to be able to run any algorithm, not just a single use case.
- Scale: In order for quantum computers to solve problems out of reach for classical computers, such as modeling molecules with many electrons to enhance drug discovery, they require 1,000+ high-performing, error-corrected qubits.
- Gate Fidelity: We need gates with much higher fidelity - well above 99%, and closer to 99.9% and beyond - so that error-correction overhead stays manageable.
It does feel like we’re in the vacuum tube era of quantum computing. In 50 years I can see the impact being bigger than AI, bigger than classical computers, bigger than anything we’ve seen. We just have a while to go.
In the meantime, there are a few companies to keep an eye on. First, quantum computing, because it is expensive to develop, is a big-company game; Google and IBM lead the pack on that side. From the legacy startup perspective, I’ve been a long-time investor in D-Wave and think its quantum annealing business provides a more near-term solution while gate-based algorithms are developed. From the new startup ecosystem, PsiQuantum has raised >$600 million (in talks to raise another $750 million at a $4 billion pre-money from Nvidia / Blackrock) and is developing fault-tolerant computers using photonic qubits. Quantinuum was formed in 2021 through the merger of Honeywell Quantum Solutions and Cambridge Quantum, and they build high-fidelity trapped-ion hardware; they have raised $300 million at a $5 billion pre-money valuation. These are the two most highly funded US quantum startups. I can’t tell you how real their technology is, and when you look up the top quantum scientists in the world, none of them work at those companies (they do work at universities, Google, and IBM). There is one other startup, QuEra, which has seemingly had less hype but has a stacked team of scientists from Harvard and MIT, and interestingly Google made an investment in them. One of their founders, Mikhail Lukin, has a talk here that he gave at Aspen Physics, which reiterates a lot of themes in this deep-dive.
It is hard to say how the industry will shake out, but as of now, from a public company perspective, my money is (literally) behind Google and D-Wave.
Please share links in the comments section on any research, videos, podcasts, etc that I should look at to better inform my perspective. I’m not an expert, just trying to learn. I’ll edit this post when I come across new information that I think can add clarity.