For Significant Improvement over Classical Systems, the Power of Quantum Computers Must Double Every Year
At the 2019 American Physical Society March Meeting, IBM unveiled a new scientific milestone, announcing its highest Quantum Volume to date. Quantum Volume is a measurement procedure, developed by IBM, that determines how powerful a quantum computer is, accounting for gate and measurement errors, device cross talk, device connectivity, and circuit compiler efficiency. It follows that the higher the Quantum Volume, the more complex, real-world problems quantum computers can potentially solve, such as simulating chemistry, modeling financial risk, and optimizing supply chains.
IBM has doubled the power of its quantum computers annually since 2017. IBM first made quantum computers available to the public in May 2016 through its IBM Q Experience quantum cloud service.
IBM’s recently unveiled IBM Q System One quantum computer, with a fourth-generation 20-qubit processor, has produced a Quantum Volume of 16, roughly double that of the current 20-qubit IBM Q Network devices, which have a Quantum Volume of 8.
A variety of factors determine Quantum Volume, including the number of qubits, device connectivity, coherence time, gate and measurement errors, device cross talk, and circuit compiler efficiency.
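The published procedure behind this metric defines Quantum Volume as 2^n, where n is the largest circuit size for which random “square” model circuits (width = depth = n) produce heavy outputs more than two-thirds of the time. A minimal sketch of that final tallying step, assuming hypothetical measured heavy-output probabilities in place of real hardware data:

```python
# Sketch of the Quantum Volume tally: QV = 2**n, where n is the largest
# size such that square model circuits of every size up to n pass the
# heavy-output test (probability > 2/3). The measured probabilities
# below are hypothetical stand-ins, not real device data.

HEAVY_OUTPUT_THRESHOLD = 2.0 / 3.0

def quantum_volume(heavy_output_prob):
    """heavy_output_prob: dict mapping square-circuit size n to the
    measured heavy-output probability for circuits of that size."""
    passing = [n for n, p in sorted(heavy_output_prob.items())
               if p > HEAVY_OUTPUT_THRESHOLD]
    # Find the largest n such that ALL sizes 1..n pass the test.
    largest = 0
    for n in passing:
        if n == largest + 1:
            largest = n
        else:
            break
    return 2 ** largest if largest else 1

# Hypothetical results for a 20-qubit device: sizes 1-4 pass, size 5 fails.
measured = {1: 0.99, 2: 0.95, 3: 0.88, 4: 0.71, 5: 0.60}
print(quantum_volume(measured))  # -> 16
```

Note the consecutive-sizes check: a device that passes at size 3 but fails at size 2 still only earns the Quantum Volume of its largest unbroken run.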
In addition to producing the highest Quantum Volume to date, IBM Q System One’s performance reflects some of the lowest error rates IBM has ever measured: an average two-qubit gate error of less than 2 percent, with the best gate achieving an error rate below 1 percent. Long coherence times and low error rates are required to build a fully functional, large-scale, universal, fault-tolerant quantum computer.
Quantum Volume is a fundamental performance metric that measures progress in the pursuit of Quantum Advantage, the point at which quantum applications deliver a significant, practical benefit beyond what classical computers alone are capable of. IBM Q Network partners are already investigating potential use cases, such as precisely simulating battery-cell chemistry for electric vehicles and delivering a quadratic speedup in derivative pricing. To achieve Quantum Advantage in the 2020s, IBM believes we will need to continue to at least double Quantum Volume every year.
In his 1965 paper “Cramming more components onto integrated circuits,” Gordon Moore postulated that the number of components per integrated function would grow exponentially for classical computers. IBM Q system progress since 2017 shows a similar early growth pattern, supporting the premise that Quantum Volume will need to double every year and presenting a clear roadmap toward achieving Quantum Advantage.
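The doubling premise above is easy to make concrete. Taking the figures reported in this announcement (a Quantum Volume of 8 in 2018 and 16 in 2019) as the anchor, a hypothetical projection function traces the trajectory; the future values are arithmetic extrapolation, not measurements:

```python
def projected_qv(year, base_year=2019, base_qv=16):
    """Quantum Volume under the assumption that it doubles every year,
    anchored to the value of 16 reported for 2019. A projection only."""
    return base_qv * 2 ** (year - base_year)

# Running backward recovers the earlier figure cited above: 8 in 2018.
assert projected_qv(2018) == 8

# Doubling forward through the 2020s:
for year in range(2019, 2026):
    print(year, projected_qv(year))  # 16, 32, 64, ... 1024 by 2025
```

Under this assumption, sustaining the trend for a decade would multiply Quantum Volume by roughly a thousandfold, which is the scale of growth the roadmap implies.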
“Today, we are proposing a roadmap for quantum computing, as our IBM Q team is committed to reaching a point where quantum computation will provide a real impact on science and business,” said Dr. Sarah Sheldon, lead of the IBM Q Quantum Performance team, dedicated to quantum verification at IBM Research. “While we are making scientific breakthroughs and pursuing early use cases for quantum computing, our goal is to continue to drive higher quantum volume to ultimately demonstrate quantum advantage.”
IBM Fellow Dr. Jay Gambetta will present benchmarking data for IBM Q System One and additional IBM Q systems, as well as the importance of Quantum Volume in achieving Quantum Advantage, in his invited APS talk, “Benchmarking NISQ-Era Quantum Processors,” on Friday, March 8. IBM Q scientist Dr. Lev Bishop will go into greater depth on these results during his APS March Meeting talk, X35.00001: “Software and hardware for improved quantum volume of transmon processors,” also on Friday, March 8. More details on IBM’s quantum computer benchmarks can be found on the IBM Research blog.