In a remarkable display of scientific ambition, IBM has set its sights on building a quantum computer comprising 100,000 qubits.

In late 2022, IBM reached a new milestone in quantum computing with a 433-qubit processor, setting a record for the largest quantum computing system. Building on this success, IBM has announced an ambitious plan to construct a 100,000-qubit machine within the next decade. The endeavor will be undertaken through a $100 million partnership with the University of Tokyo and the University of Chicago, with the aim of propelling quantum computing into full-scale operation.

The announcement was made at the G7 summit in Hiroshima, Japan, on May 22. The objective of this initiative is to combine the power of quantum computing with classical supercomputers, working in tandem to tackle complex problems that are beyond the capabilities of conventional systems. By leveraging a quantum-centric approach, IBM envisions breakthroughs in various fields such as drug discovery, fertilizer production, battery performance, and more.

Quantum computing harnesses the distinct properties of fundamental particles to store and process information. Superposition allows electrons, atoms, and molecules to exist in multiple energy states simultaneously, while entanglement links the states of particles together. This allows information to be encoded and manipulated in novel ways, making possible computational tasks that were previously unattainable.
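As an illustrative sketch (not anything specific to IBM's hardware), a qubit's state can be modeled as a vector of amplitudes whose squared magnitudes give measurement probabilities; an entangled pair is a joint state that cannot be split into two independent single-qubit states:

```python
import math

# A qubit in an equal superposition of |0> and |1>: two amplitudes.
# Measurement probabilities are the squared magnitudes of the amplitudes.
plus = [1 / math.sqrt(2), 1 / math.sqrt(2)]
probs = [abs(a) ** 2 for a in plus]
print([round(p, 3) for p in probs])  # [0.5, 0.5] -- each outcome equally likely

# A Bell pair: four amplitudes over |00>, |01>, |10>, |11>.
# Only |00> and |11> carry weight, so the two qubits are entangled:
# measuring one determines the outcome of the other.
bell = [1 / math.sqrt(2), 0.0, 0.0, 1 / math.sqrt(2)]
bell_probs = [round(abs(a) ** 2, 3) for a in bell]
print(bell_probs)  # [0.5, 0.0, 0.0, 0.5]
```

The Bell-pair list is the point of the sketch: no pair of single-qubit probability lists multiplies out to those four values, which is what "entangled" means here.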

However, the current limitations of quantum computers stem from their small qubit count and vulnerability to environmental disturbances, known as noise. Researchers are exploring ways to mitigate noise and to apply error-correction techniques. To achieve practical utility, quantum systems will need to scale significantly, dedicating a substantial portion of their qubits to error correction.
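The core idea behind error correction can be sketched with the simplest possible scheme, a three-copy repetition code with majority voting. This is a classical analogy, not a real quantum code (schemes such as the surface code are far more involved), but it shows why redundancy trades qubit count for reliability:

```python
import random

def encode(bit):
    """Encode one logical bit as three physical copies."""
    return [bit, bit, bit]

def apply_noise(physical, flip_prob, rng):
    """Flip each physical bit independently with probability flip_prob."""
    return [b ^ (rng.random() < flip_prob) for b in physical]

def decode(physical):
    """Majority vote recovers the logical bit if at most one copy flipped."""
    return int(sum(physical) >= 2)

rng = random.Random(0)
trials = 100_000
flip_prob = 0.05  # per-bit error rate

# Without encoding, a bit is wrong with probability ~5%.
raw_errors = sum(rng.random() < flip_prob for _ in range(trials))

# With encoding, the logical bit is wrong only if 2+ copies flip (~0.7%).
coded_errors = sum(
    decode(apply_noise(encode(0), flip_prob, rng)) != 0 for _ in range(trials)
)
print(raw_errors / trials, coded_errors / trials)
```

The coded error rate falls roughly as the square of the raw rate, at the cost of three physical bits per logical bit; real quantum codes pay a much steeper overhead, which is why so many of a machine's qubits end up devoted to correction.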

IBM is not alone in pursuing ambitious quantum computing goals. Google aims to reach a million qubits by the end of the decade, with only 10,000 available for computations due to error correction. Maryland-based IonQ targets 1,024 logical qubits, each formed from 13 physical qubits in an error-correcting circuit, by 2028. PsiQuantum, based in Palo Alto, also aspires to develop a million-qubit quantum computer but has not disclosed its timeline or error-correction requirements.
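The overhead implied by these targets is worth making explicit; the ratios below are back-of-the-envelope arithmetic derived only from the figures reported above, not from the companies' published specifications:

```python
# Google: 1,000,000 physical qubits but only 10,000 usable for computation,
# i.e. roughly 100 physical qubits supporting each computational qubit.
google_overhead = 1_000_000 / 10_000
print(google_overhead)  # 100.0

# IonQ: 1,024 logical qubits, each encoded in 13 physical qubits,
# implying on the order of 13,000 physical qubits in total.
ionq_physical_total = 1_024 * 13
print(ionq_physical_total)  # 13312
```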

While the number of physical qubits is often highlighted, factors such as resilience to noise and ease of operation are equally crucial. Additional measures of performance, such as quantum volume and algorithmic qubits, provide a more comprehensive assessment of progress. Over the next decade, advancements in error correction, qubit performance, software-based error mitigation, and variations in qubit types will make tracking this quantum race particularly challenging.

Refining the Hardware:

IBM’s current qubits are based on circuits of superconducting metal, operated at temperatures close to absolute zero. However, according to IBM’s roadmap, the current technology can only scale quantum computers up to around 5,000 qubits, which is considered insufficient for meaningful computation. Achieving powerful quantum computers will require new technologies and innovations.

One critical aspect is developing more energy-efficient control of qubits. Each of IBM’s superconducting qubits currently requires around 65 watts to operate, which would make the energy demands of a 100,000-qubit machine prohibitive. IBM has conducted proof-of-principle experiments using complementary metal oxide semiconductor (CMOS) technology, which has shown the potential to control qubits with much lower power consumption. However, further technological advancements are needed for quantum-centric supercomputing.
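The power problem is easy to quantify from the figure above; this is a rough estimate that assumes the 65-watt-per-qubit control cost would apply uniformly at scale:

```python
watts_per_qubit = 65       # reported power to control one superconducting qubit
target_qubits = 100_000    # IBM's target machine size

# Naive scaling: total control power in megawatts.
total_mw = watts_per_qubit * target_qubits / 1_000_000
print(total_mw)  # 6.5 megawatts for qubit control alone
```

Several megawatts just for control electronics is why low-power CMOS control, rather than simply adding more of today's hardware, is on the roadmap.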


IBM is also focusing on modular chip designs, as it will be impossible to fit a sufficient number of qubits on a single chip. Modularity requires interconnects to transfer quantum information between modules. IBM’s “Kookaburra,” a multichip processor with 1,386 qubits and a quantum communication link, is currently under development and expected to be released in 2025.

Collaboration with universities, such as the University of Tokyo and the University of Chicago, is crucial for the project’s success. These institutions have made significant progress in areas such as components and communication innovations that could contribute to the final product. IBM emphasizes the importance of academic research and expects more industry-academic collaborations in the coming years.

The industry also needs more “quantum computational scientists” who can bridge the gap between physicists developing the machines and developers designing and implementing useful algorithms. Additionally, software that runs on quantum machines will play a vital role, and IBM aims to encourage the development of quantum software libraries to accelerate industry growth.

While there are no guarantees, IBM’s $100 million investment is intended to carry the project to its 100,000-qubit goal. There are, however, risks and potential roadblocks along the way: control systems in particular will need to evolve significantly to support qubits efficiently at that scale.

Despite the challenges, industry experts believe it is necessary to take risks and overcome technical obstacles to advance large-scale quantum computing. IBM’s plan is viewed as reasonable, although the journey may encounter surprises and require significant advancements in control systems.
