Quantum Firms Diverge on Where Progress Lies: Hardware or Error Correction
As quantum computing inches toward commercial viability, leading companies are betting on radically different paths to quantum advantage

As the race toward quantum advantage accelerates, industry leaders are coalescing around one shared conviction: today’s quantum systems aren’t yet powerful or reliable enough to outperform classical computers in real-world applications. But there is far less agreement on how to close that gap.
For some, the path forward runs through better qubits and faster gates. Others believe it depends on building smarter software layers that can suppress or correct the quantum errors that plague early hardware. Still others argue that both advances must happen simultaneously and in close collaboration.
Executives from five of the world’s most prominent quantum firms laid out their sharply different strategies at Commercializing Quantum Global 2025 in London on May 13-14. Their remarks revealed a split between those doubling down on physics and hardware improvements and those who say progress hinges just as much on algorithms, error mitigation, and application design.
Pushing Physical Limits
Chris Ballance, co-founder and CEO of Oxford Ionics, believes reducing quantum error rates is the most essential—and transformative—step forward.
“If you can reduce error rate just a bit, everything becomes easier,” he said. Oxford Ionics claims its ion-trap systems have achieved gate errors below 10⁻⁴, substantially outperforming competitors.
With such low error rates, “we can tackle these really meaningful problems with just a few thousand physical qubits,” he said.
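Ballance’s claim can be made concrete with the standard surface-code scaling heuristic, in which the logical error rate falls exponentially with code distance. The back-of-envelope sketch below is illustrative only: the threshold, target error rate, and qubit-count formula are common textbook assumptions, not Oxford Ionics’ published parameters.

```python
# Back-of-envelope: how the physical error rate drives error-correction overhead.
# Standard surface-code heuristic: logical error ~ (p/p_th)**((d+1)/2), with
# ~2*d^2 physical qubits per logical qubit. The threshold p_th = 1e-2 and the
# 1e-12 target below are illustrative assumptions.

def min_code_distance(p_phys, p_target, p_th=1e-2):
    """Smallest odd code distance d such that (p_phys/p_th)**((d+1)/2) <= p_target."""
    d = 3
    while (p_phys / p_th) ** ((d + 1) / 2) > p_target:
        d += 2
    return d

for p in (1e-3, 1e-4):  # a typical gate error rate vs. the sub-1e-4 regime Ballance cites
    d = min_code_distance(p, p_target=1e-12)
    print(f"p={p:.0e}: distance {d}, ~{2 * d * d} physical qubits per logical qubit")
```

Under these assumptions, a tenfold drop in physical error rate cuts the overhead from roughly a thousand physical qubits per logical qubit to a few hundred, which is why lower error rates make “everything easier.”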
Subodh Kulkarni, CEO of Rigetti, offered a hardware-first view rooted in concrete metrics.
For Rigetti, unlocking quantum advantage requires crossing four hardware thresholds: at least 1,000 qubits, two-qubit gate fidelities of 99.9%, gate times under 20 nanoseconds, and real-time error correction.
“If you get those four things, we think we can start doing real-life demonstrations with quantum computing that are superior to classical computers,” Kulkarni said. He estimated Rigetti is four to five years away from reaching those levels.
For photonic quantum computing company ORCA, hardware still matters, but Richard Murray, its CEO and co-founder, warned against overemphasizing physical specs alone.
“If you look at the history of computing, there are countless examples where someone did the hardware really well but failed on software or usability,” Murray said.
ORCA’s approach emphasizes tight integration with high-performance computing infrastructure and practical user environments that support real application development. “It’s also about how customers actually interact with the system,” Murray said.
Software Layers of Trust
While some firms are focused on physical reliability, IBM’s Sarah Sheldon argued that software techniques to handle noise and errors are just as crucial, particularly in the near term.
“We need methods that we can verify that we are getting accurate outputs from our quantum computers,” she said.
IBM is working on error mitigation and detection schemes that can help produce usable results on today’s noisy machines.
“You know, the device benchmarking gives you how good the hardware is,” she said. “But you need to also think about the techniques you’re using on top of the hardware to reduce the effects of the noise.”
Sheldon added that IBM also focuses heavily on application benchmarking to help guide future development.
“We systematically compare quantum and classical state-of-the-art to understand: is quantum a candidate for a solution to a problem?”
Google’s Tom O’Brien also stressed the software side, but from a theoretical vantage point.
With current systems hovering around 10⁻³ to 10⁻⁴ error rates, O’Brien pointed out that running real-world quantum applications would require roughly a billion gates, far more than hardware at those error rates can execute before accumulated errors swamp the result.
“You need to scale up, not to 1,000 qubits, but when you compile everything down to the fault-tolerant level, we expect about the million-qubit mark,” he said. “That’s still about a decade away.”
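The arithmetic behind those numbers is unforgiving: gate errors compound multiplicatively, so a circuit’s success probability decays as roughly (1 − p)^G for G gates at per-gate error rate p. A minimal sketch, with illustrative overhead figures that are not Google’s own resource estimates:

```python
import math

# Why a billion-gate algorithm can't run on uncorrected hardware: errors compound.
# Rough model: success probability ~ (1 - p)**G for G gates at per-gate error p.
G = 1e9  # gate count for a "real-world" application, per O'Brien

for p in (1e-3, 1e-4):
    # log10 of (1-p)**G, computed in log space to avoid floating-point underflow
    log10_success = G * math.log1p(-p) / math.log(10)
    print(f"p={p:.0e}: expected errors ~{p * G:.0e}, success probability ~ 10^{log10_success:.0f}")

# Making p*G << 1 via error correction costs roughly 1,000 physical qubits per
# logical qubit (a common surface-code ballpark); with ~1,000 logical qubits,
# that lands near the million-qubit mark O'Brien cites.
```

At either error rate, the success probability of a raw billion-gate run is indistinguishable from zero, which is why the path to such applications runs through error correction and its roughly thousandfold qubit overhead.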
In the near term, O’Brien said Google is focusing on “early fault-tolerance”—a stage where carefully chosen applications can run on machines with tens of thousands of physical qubits.
“We want applications with around 10⁶ to 10⁷ gates that we can realistically execute in the next five years,” he said. Google has launched a $5 million XPRIZE competition in collaboration with the Swiss research hub GESDA to develop such applications.
Hardware and Software Must Converge
While the industry is split over hardware-versus-software priorities, some panelists insisted that progress depends on tight coordination across the entire stack.
“You don’t want to be building any of this in isolation,” said Murray of ORCA. “Quantum computers are imperfect and will be imperfect for a long time—even with error correction. Application developers need to be deeply familiar with noise models, connectivity constraints, and what the hardware can actually do.”
ORCA’s model integrates hardware and software teams in the same company to enable this co-design approach. “You need very close alignment between the people building the qubits and the people writing the algorithms,” Murray said. “That’s how you get usable systems faster.”
Ballance agreed that this integration is finally starting to accelerate.
“We’ve already seen massive improvements,” he said. “A lot of resource estimates have gone down by six orders of magnitude in the last few years as hardware and software start to converge.”
He added that the shift from whiteboard speculation to real-world experimentation makes the field more attractive to top talent and more productive.
Sheldon emphasized that IBM is working closely with a network of partners to identify good candidate problems for quantum systems. In optimization, for example, IBM helped coordinate a new benchmarking effort across 16 institutions to evaluate 10 types of problems where quantum may already offer an edge at small scales.
Even as quantum firms pursue different technical paths, they converge around a shared realization: to move forward, a full-stack ecosystem must emerge.
“It’s not just about hardware specs,” Sheldon said. “It’s about the techniques, the applications, the people—everything working together.”
Toward Early Quantum Applications
Despite the uncertainties, several panelists expressed optimism that targeted applications could arrive much sooner than universal quantum advantage.
Kulkarni suggested that quantum computing is well-suited for “probabilistic questions,” such as modeling protein folding or forecasting economic risks.
“Deterministic computation—addition, subtraction—will stay with CPUs and GPUs for a long time,” he said. “But probabilistic, analog-type problems are where quantum can shine.”
IBM’s Sheldon pointed to quantum simulation in chemistry and materials science as early front-runners, along with optimization problems. In the longer term, she expects growth in mathematical computing, including linear algebra and differential equation solvers.
O’Brien offered a more cautious take on optimization.
“For general-purpose optimization, we know we can only get a quadratic advantage,” he said. “That won’t be enough to overcome the constant factor gap in a quantum computer.”
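The reasoning echoes published crossover analyses from Google researchers: a quadratic speedup pays off only once the problem is large enough that √N quantum steps, each far slower than a classical step, beat N classical ones. The timings below are illustrative assumptions, not figures O’Brien gave:

```python
# Crossover estimate for a Grover-type quadratic speedup: classical cost ~ N * t_c,
# quantum cost ~ sqrt(N) * t_q, so quantum wins only when sqrt(N) > t_q / t_c.
# Both timings are illustrative assumptions, not figures O'Brien quoted.
t_c = 1e-9  # ~1 ns per classical iteration on a modern CPU core
t_q = 1e-2  # ~10 ms per Grover iteration (each needs many slow fault-tolerant gates)

ratio = t_q / t_c        # per-iteration slowdown: 1e7
crossover_N = ratio**2   # problem size where sqrt(N) * t_q equals N * t_c
runtime_days = crossover_N**0.5 * t_q / 86400

print(f"quantum wins only for N > {crossover_N:.0e}")           # ~1e14
print(f"break-even quantum runtime: ~{runtime_days:.1f} days")  # ~1.2 days
```

And since classical search parallelizes trivially across cores while Grover iterations must run sequentially, the practical break-even point sits even further out than this single-core estimate suggests.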
Still, he highlighted new algorithmic research from Google that shows promise in domains like solving large systems of linear oscillators—cases where classical methods struggle.
Murray, too, emphasized urgency.
“It is imperative that we strive to deliver the first applications as quickly as possible,” he said. ORCA is exploring how quantum systems can enhance generative AI models, an area he believes is ripe for disruption.
“There’s a massive point of contention about when the first application will arrive, but users don’t want to wait decades. They want to see results soon,” he said.
Despite the technical rigor and billion-dollar investments, quantum computing remains a field full of unknowns—and possibilities.
“We’ve spent 10 years learning what quantum computers can’t do,” said O’Brien. “Now that we have systems we can actually run, we’re in a new phase of discovery.”
Sheldon echoed that sentiment, emphasizing the need for more algorithmic creativity and rapid iteration.
“There’s a huge opportunity for more research in applications,” she said.
Ballance added a historical perspective: “Most of the applications we now run on classical computers, no one would have predicted before those computers existed.”
He stressed that as hardware becomes accessible and reliable, the industry must allow experimentation—even failure. “When you can say, ‘This might just work, let’s go test it,’ that’s when things start to really move.”
For now, the field’s defining characteristic is its openness to new ideas, collaborations, and contenders. While quantum firms may disagree on whether hardware or error correction deserves top billing, they agree on one thing: the only way forward is together.