The nascent quantum industry would benefit from the emergence of a few pioneering technologies.
Development of quantum computers has advanced steadily over the last decade, spurred by the promise of harnessing the unique properties of quantum physics: qubits, or quantum bits, can exist as 0, as 1, or in a superposition of both simultaneously.
Multiple companies now offer quantum applications as a service via cloud platforms such as Amazon Web Services, Google Cloud and Microsoft Azure.
Development is led by established companies and startups. An earlier column on quantum computing surveys the field. Here we provide an overview and perspectives on the status of quantum technologies.
For background, a U.S. Government Accountability Office (GAO) report examines the status and prospects for quantum computing. This post draws heavily on the GAO report.
An excellent overview of quantum computers by our colleague Maurizio Di Paolo Emilio is here.
Multiple technologies are required to deploy quantum computers, making it harder to predict when the technology will be practical. Even as the pace of development accelerates, many experts remain convinced practical quantum computers are still at least a decade away.
Analog vs. gate-based
Physical qubits, or quantum bits, are the basic building block. There are two main quantum computing methods: analog and gate-based quantum computers. The table below summarizes the differences between the two technologies.
Physical qubits include naturally occurring particles and artificial structures. The former includes atoms, trapped ions and photons. Trapped ions and photons are the leading technologies for this segment.
Artificial physical qubits are engineered structures that mimic the quantum behavior of natural particles and can be combined into quantum gates. Quantum gates are similar to logic gates in conventional computers.
This category includes superconducting circuits, quantum dots and crystal defects. An example is a nitrogen atom within a diamond’s carbon lattice, which is called a color center. Superconducting circuits dominate this category.
In designing quantum computers from qubits, technology has been developed to manipulate quantum properties and entangle multiple qubits with one another. These manipulations are accomplished with lasers, microwaves, electric or magnetic fields and other methods. Examples are listed at the bottom of the table above.
Steady progress may soon yield quantum machines with thousands of qubits, approaching 1 million qubits after 2030. Such advances will greatly expand deployment by cloud services providers, academic institutions and corporations.
The next table summarizes the challenges facing quantum developers. The lower section outlines deployment challenges.
Entanglement is a key feature of quantum mechanics, allowing connected qubits to interact. For example, measuring one qubit can reveal information about the other qubits entangled with it.
Superposition is another key characteristic: a qubit exists as a combination of all its possible states simultaneously. Together, entanglement and superposition give quantum computers processing power that traditional binary computers cannot match.
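The measurement correlation described above can be illustrated with a toy state-vector sketch in plain Python. This is not a real quantum SDK, just an illustration: a two-qubit register is four amplitudes, the entangled "Bell state" is an equal superposition of 00 and 11, and sampling measurements with Born-rule probabilities shows that the two qubits always agree.

```python
import random

# Toy state-vector model (illustration only, not a real quantum SDK):
# a two-qubit register is a list of four amplitudes for |00>, |01>, |10>, |11>.
inv_sqrt2 = 2 ** -0.5
bell_state = [inv_sqrt2, 0.0, 0.0, inv_sqrt2]  # (|00> + |11>) / sqrt(2)

def measure(amplitudes):
    """Sample one measurement outcome with Born-rule probabilities |a|^2."""
    probs = [abs(a) ** 2 for a in amplitudes]
    labels = ["00", "01", "10", "11"]
    return random.choices(labels, weights=probs, k=1)[0]

# Entanglement in action: the two bits always agree, so learning one
# qubit's value instantly tells you the other's.
outcomes = {measure(bell_state) for _ in range(1000)}
print(outcomes)  # only "00" and "11" ever occur
```

Sampling a superposition this way also shows why a measurement "collapses" the state: each run yields a single definite outcome, never the superposition itself.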
Maintaining qubit entanglement is another technical challenge. When entanglement is lost, quantum calculations are no longer valid.
There are multiple techniques for maintaining entanglement. Qubit isolation from environmental noise is the first step. Operating qubits at superconducting temperatures reduces environmental noise dramatically. Fault tolerance is another strategy at the system level.
Some quantum technologies have built-in tolerance to environmental noise. The trapped-ion approach appears to outperform superconducting technology in this area.
The entanglement noise problem is often called decoherence. Decoherence occurs when a quantum computer loses information to the surrounding environment because the qubits are never perfectly isolated from their surroundings. Qubits must maintain coherence for quantum machines to operate properly.
Decoherence remains a challenge for quantum implementation due to reliance on the undisturbed evolution of the qubit state. The preservation of coherence, and mitigation of decoherence effects, are related to the concept of quantum error correction. It is generally agreed that error correction is needed for meaningful deployments supporting a range of quantum applications.
Further, quantum information cannot be copied, and measurement disrupts information, preventing implementation of classical error correction techniques. Quantum error correction techniques have been demonstrated but are challenging to implement. Error correction procedures are applied to many error-prone physical qubits. Those quantum procedures are combined with traditional processing techniques to create systems that simulate a robust, stable qubit—known as a logical qubit.
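The redundancy idea behind a logical qubit has a simple classical analog: encode one logical bit in several error-prone physical bits and recover it by majority vote. Real quantum error correction is subtler, since it must detect errors without directly measuring the data qubits, but this sketch shows the basic encode-and-correct structure.

```python
import random

# Classical analog of a repetition code: one logical bit is encoded in
# three error-prone physical bits, and a majority vote recovers it as
# long as at most one bit flips. (Quantum codes must do this without
# directly measuring the data qubits; this only illustrates redundancy.)

def encode(logical_bit):
    return [logical_bit] * 3                     # replicate across three physical bits

def decode(physical_bits):
    return 1 if sum(physical_bits) >= 2 else 0   # majority vote

encoded = encode(1)
encoded[random.randrange(3)] ^= 1                # a single random bit-flip error
assert decode(encoded) == 1                      # the logical value survives
```

Practical quantum codes need far more redundancy than three-to-one, which is why estimates of useful machines run to many physical qubits per logical qubit.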
Current quantum platforms also exhibit slow I/O data rates. Future quantum computers will require faster data rates to support demanding quantum apps. Slow I/O rates would diminish overall utilization rates, and the value of quantum computing would therefore decline in areas such as cloud services.
Minimizing decoherence requires operation close to absolute zero, initially limiting quantum deployments for enterprise IT applications. Quantum technologies that operate near room temperature will help expand deployments.
At least six different quantum technologies are in use or development, with others on the horizon. Technology battles are seldom good for nurturing new industry segments; they create market uncertainty. Potential users often delay deployments until a clear winner emerges. The nascent quantum industry would benefit from the emergence of one or two leading technologies.
The current qubit fabrication infrastructure and supply chain is limited. Thousands of physical qubits will be needed per machine, growing to hundreds of thousands of qubits by 2025. By 2030, state-of-the-art machines may include 1 million or more physical qubits.
Developers investing in manufacturing and supply chains for physical qubits will emerge as leaders in quantum application deployments. Superconducting specialists may have an advantage if they can leverage semiconductor industry fab capacity once current chip shortages recede.
Quantum computers also will require an extensive ecosystem across many software platforms at multiple levels, including quantum algorithms and applications. Software development kits to develop, test and verify quantum applications will be needed. Additional requirements include quantum-centric languages, compilers and other development tools focused on unique and demanding quantum applications. Software to develop quantum applications will run on either PCs or cloud platforms.
Leveraging open source software will help reduce development costs. Hardware abstraction across multiple generations and different quantum technologies will also reduce development time and cost.
Current quantum systems are expensive: The GAO report estimates $10,000 per physical qubit. High costs are expected with immature technologies, especially with complex quantum designs. New versions and very low production volumes will only add to those costs. New investments and growing production volumes will help reverse those trends. More strategic planning and cooperation will also help.
Current applications tend to cluster in a few segments as summarized in the next table, largely drawn from GAO’s assessment.
The characteristics of entanglement and superposition create unique opportunities for quantum applications that would otherwise require enormous execution time, even on supercomputers.
The spectrum of applications is expected to expand as quantum capabilities advance over the next decade. As new applications emerge, users will find new ways to use quantum computers.
Optimization problems also fit well with quantum technology. Optimization means finding the best decision or action for achieving goals. Available algorithms running on quantum computers could improve optimization methods. Examples include investment strategies, minimizing supply chain costs and identifying optimal locations for solar, wind and other sustainable energy installations. Machines with only 50 physical qubits could provide benefits over classical computers for simple optimization problems.
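Why such problems get hard fast can be seen in a toy classical baseline. The sketch below (all site data is made up for illustration) brute-forces a small sustainable-energy siting problem: choose which candidate sites to build to maximize output within a budget. Classical exhaustive search examines all 2^n subsets, and it is exactly this exponential scaling that makes larger instances candidates for quantum optimization methods.

```python
from itertools import product

# Toy siting problem: pick which of n candidate wind-farm sites to build
# to maximize output without exceeding a budget. All numbers below are
# hypothetical. Brute force checks every one of the 2^n subsets.
costs   = [4, 3, 5, 2]   # hypothetical build cost per site
outputs = [7, 4, 8, 3]   # hypothetical energy output per site
budget  = 9

best_output, best_choice = -1, None
for choice in product([0, 1], repeat=len(costs)):   # 2^4 = 16 subsets
    cost = sum(c * x for c, x in zip(costs, choice))
    output = sum(o * x for o, x in zip(outputs, choice))
    if cost <= budget and output > best_output:
        best_output, best_choice = output, choice

print(best_choice, best_output)  # (1, 0, 1, 0) 15 -- build sites 1 and 3
```

At four sites the search is trivial; at 50 sites there are roughly 10^15 subsets, which is where quantum approaches are hoped to help.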
Along with the emergence of AI, quantum technology could be used to accelerate machine learning algorithms used for applications like disease detection via enhanced screening of genetic data.
Quantum computers are currently unable to process large amounts of data required for machine learning applications. The solution may be hybrid machines that solve problems by splitting calculations to match the capabilities of quantum and binary computers. This will require new software and protocols to distribute tasks accordingly.
Quantum computers appear able to factor large numbers in exponentially fewer steps than classical computers. Factoring a number means finding the unique set of prime numbers that multiply together to produce it; for large numbers, this takes a very long time on classical machines.
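The classical difficulty is easy to see in the simplest factoring method, trial division, sketched below. Its cost grows roughly with the square root of the number being factored, so numbers with hundreds of digits are far out of reach classically, whereas Shor's quantum algorithm would need exponentially fewer steps.

```python
def trial_division(n):
    """Factor n by trial division. The loop runs up to sqrt(n), so the
    cost explodes for the hundreds-of-digit numbers used in cryptography,
    which is what quantum factoring algorithms promise to overcome."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)    # whatever remains is prime
    return factors

print(trial_division(15))                # [3, 5]
print(trial_division(3 * 7 * 13 * 101))  # [3, 7, 13, 101]
```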
Encryption algorithms such as Rivest-Shamir-Adleman (RSA) rely on this limitation. Hence, such encryption methods will become vulnerable once quantum computers can quickly factor large numbers. For RSA encryption, this may require machines with more than 1 million physical qubits, including error-correction technology.
Quantum technology could also be applied to test physics theories, uncovering the mysteries of the universe. Additionally, quantum computing applications can be used to analyze data from high-energy physics experiments.
How many qubits?
The number of physical qubits needed to provide a significant improvement over classical computers varies by application.
IBM’s recent quantum announcement provides a glimpse of quantum scaling. Its Eagle quantum processor currently includes 127 qubits, up from 65 on its Hummingbird machine released in 2020, and IBM’s roadmap calls for passing 1,000 qubits in 2023 with continued scaling thereafter.
The table below, based on GAO data, summarizes qubit requirements for different applications.
Available quantum computers include fewer than 100 qubits. Primary applications include developing, testing and advancing quantum technology itself. Some are available on public cloud platforms.
Machines with fewer than 100 physical qubits can solve simple chemistry calculations and may provide an advantage for some optimization problems.
Quantum computers with 1,000 physical qubits could enhance machine learning and optimization problems.
Cloud-based quantum capabilities will remain a leading deployment opportunity at 1,000 qubits. Individual companies will require substantial numbers of such machines. Based on IBM’s projections, this scenario could emerge in 2023.
As the number of physical qubits increases toward 100,000, the application spectrum broadens. For example, machine learning and related AI applications and models will expand.
For IBM to reach the 100,000-qubit milestone in 2030, an annual increase of 90 percent would be needed. Through 2023, IBM’s roadmap implies an annual qubit increase of 158 percent. Hence, 100,000 qubits by 2030 seems a reasonable bet.
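These growth figures can be checked with a quick compound-annual-growth calculation. The 65-qubit Hummingbird figure for 2020 comes from this article; the 1,121-qubit figure for 2023 is IBM's published roadmap target (the Condor processor), used here as an assumption consistent with the 158 percent rate cited above.

```python
# Back-of-the-envelope check of the qubit-scaling claims.
# Assumptions: 65 qubits in 2020 (Hummingbird, per the article) and
# 1,121 qubits in 2023 (IBM's published Condor roadmap target).
def annual_growth(start, end, years):
    """Compound annual growth rate, expressed as a percentage increase."""
    return ((end / start) ** (1 / years) - 1) * 100

past = annual_growth(65, 1_121, 3)         # 2020 -> 2023 roadmap pace
needed = annual_growth(1_121, 100_000, 7)  # 2023 -> 2030 milestone pace
print(round(past), round(needed))          # 158 90
```

Since the pace already on the roadmap (158 percent per year) comfortably exceeds the pace required (90 percent per year), the 2030 milestone looks plausible if scaling holds.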
Beyond that, quantum deployment would take off as many more problems can be solved with greater accuracy. Factoring large numbers or simulating pharmaceutical molecules may require more than 1 million physical qubits. When that happens, current encryption algorithms will no longer be safe.
Quantum computing technology has progressed over the last five years and is poised to advance even further over the next five years. According to the PitchBook financial database, venture funding increased dramatically in 2021 with well over $1 billion invested, exceeding the total for the previous three years.
Quantum will primarily augment current computers, rarely replacing today’s machines. New quantum technology will advance rapidly, and innovative applications will be developed. Hybrid systems consisting of classical and quantum computers will emerge as technology deployment accelerates in a few years.
Chemistry simulations may be an application where quantum computers have the most impact. This includes applications ranging from drug discovery to advances in battery technology.
This article was originally published on EE Times.
Egil Juliussen has over 35 years’ experience in the high-tech and automotive industries. Most recently he was director of research at the automotive technology group of IHS Markit. His latest research was focused on autonomous vehicles and mobility-as-a-service. He was co-founder of Telematics Research Group, which was acquired by iSuppli (IHS acquired iSuppli in 2010); before that he co-founded Future Computing and Computer Industry Almanac. Previously, Dr. Juliussen was with Texas Instruments where he was a strategic and product planner for microprocessors and PCs. He is the author of over 700 papers, reports and conference presentations. He received B.S., M.S., and Ph.D. degrees in electrical engineering from Purdue University, and is a member of SAE and IEEE.