New research demonstrates a novel architecture for scaling up superconducting quantum devices

University of Chicago

Researchers at the UChicago Pritzker School of Molecular Engineering (UChicago PME) have realized a new design for a superconducting quantum processor, a potential architecture for the large-scale, durable devices the quantum revolution demands.

Unlike the typical quantum chip design, which lays the information-processing qubits out on a 2-D grid, the team from the Cleland Lab has designed a modular quantum processor built around a reconfigurable router as a central hub. This lets any two qubits connect and entangle, whereas in the grid layout each qubit can only talk to its physically nearest neighbors.

“A quantum computer won’t necessarily compete with a classical computer in things like memory size or CPU size,” said UChicago PME Prof. Andrew Cleland. “Instead, they take advantage of a fundamentally different scaling: Doubling a classical computer’s computational power requires twice as big a CPU, or twice the clock speed. Doubling a quantum computer only requires one additional qubit.”
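
That scaling can be seen in a quick back-of-the-envelope calculation. The short Python sketch below (illustrative only, not from the paper) prints how the dimension of an idealized n-qubit register’s state space doubles with each added qubit:

```python
# Illustrative sketch: state-space growth of an idealized n-qubit register.
# A classical n-bit register holds one of 2**n values at a time, while an
# n-qubit register's state lives in a 2**n-dimensional complex vector space,
# so each additional qubit doubles the space the machine computes over.
for n in range(1, 7):
    print(f"{n} qubits -> state-space dimension {2 ** n}")
```

Going from six to seven qubits, for instance, doubles the dimension from 64 to 128.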

Taking inspiration from classical computers, the design clusters qubits around a central router, similar to how PCs talk to each other through a central network hub. Quantum “switches” can connect and disconnect any qubit within a few nanoseconds, enabling high-fidelity quantum gates and the generation of quantum entanglement, a fundamental resource for quantum computing and communication.
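
To make the two topologies concrete, here is a minimal, hypothetical Python sketch (the names grid_neighbors and Router are ours, not the paper’s) contrasting the at-most-four-neighbor coupling of a square grid with the any-to-any coupling a central router provides:

```python
# Hypothetical sketch of the two connectivity models; not the paper's code.

def grid_neighbors(q, width, height):
    """On a 2-D square lattice, a qubit couples only to its immediate
    north/south/east/west neighbors -- at most four partners."""
    x, y = q
    candidates = [(x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)]
    return [(i, j) for i, j in candidates if 0 <= i < width and 0 <= j < height]

class Router:
    """Star topology: every qubit attaches to a central router, and a
    switch can connect any pair of attached qubits on demand."""
    def __init__(self, qubits):
        self.qubits = set(qubits)

    def connect(self, a, b):
        # In hardware this is a tunable coupler toggled within nanoseconds;
        # here we only check that any registered pair can be linked.
        assert a in self.qubits and b in self.qubits and a != b
        return (a, b)  # stands in for a two-qubit gate between a and b

print(grid_neighbors((1, 1), 3, 3))   # 4 neighbors on a 3x3 grid
router = Router(range(4))
print(router.connect(0, 3))           # any pair couples via the router
```

On the grid, a gate between distant qubits must be decomposed into chains of operations through intermediaries; with the router, any attached pair can be coupled directly.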

“In principle there’s no limit to the number of qubits that can connect via the routers,” said UChicago PME PhD candidate Xuntao Wu. “You can connect more qubits if you want more processing power, as long as they fit in a certain footprint.”

Wu is the first author of a new paper published in Physical Review X that describes this new way of connecting superconducting qubits. The researchers’ new quantum chip is flexible, scalable and as modular as the chips in cellphones and laptops.

“Imagine you have a classical computer that has a motherboard integrating lots of different components, like your CPU or GPU, memory and other elements,” said Wu. “Part of our goal is to transfer this concept to the quantum realm.”

Size and noise

Quantum computers are highly advanced yet delicate devices with the potential to transform fields such as telecommunications, healthcare, clean energy, and cryptography. Two things must happen before quantum computers can tackle these global problems to their fullest potential.

First, they must be scaled to a large enough size while remaining flexible to operate.

“This scaling can offer solutions to computational problems that a classical computer simply cannot hope to solve, like factoring huge numbers and thereby cracking encryption codes,” Cleland said.

Second, they must be fault-tolerant, able to perform massive calculations with few errors, ideally surpassing the processing power of today’s state-of-the-art classical computers. The superconducting qubit platform under development at UChicago PME is one promising approach to building such a quantum computer.

“A typical superconducting processor chip is a square shape with all the quantum bits fabricated on that. It’s a solid-state system on a planar structure,” said co-author Haoxiong Yan, who graduated from UChicago PME in the spring and now works as a quantum engineer for Applied Materials. “If you can imagine a 2-D array, like a square lattice, that’s the topology of typical superconducting quantum processors.”

Limitations in typical design

This typical design imposes several limitations.

First, putting qubits on a grid means each qubit can interact with at most four others – its immediate neighbors to the north, south, east and west. Greater qubit connectivity usually enables a more powerful processor, in terms of both flexibility and component overhead, but the four-neighbor limit is generally considered inherent to the planar design. For practical quantum computing applications, scaling such a device by brute force would likely demand unrealistic resources.

Second, the nearest-neighbor connections in turn limit both the classes of quantum dynamics that can be implemented and the degree of parallelism the processor can achieve.

Finally, fabricating all the qubits on the same planar substrate poses a significant challenge to fabrication yield, as even a small number of failed devices means the processor won’t work.

“To undertake practical quantum computing, we need millions or even billions of qubits and we need to make everything perfectly,” Yan said.
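
The yield concern can be made quantitative with a simple, hypothetical model (the numbers below are assumptions for illustration, not measurements from the paper): if each qubit is fabricated correctly with independent probability p, a monolithic chip that needs all n qubits working succeeds with probability p^n, which collapses rapidly as n grows.

```python
# Hypothetical yield model, not from the paper: independent per-qubit
# fabrication yield p, and a monolithic chip needs all n qubits to work.
p = 0.99   # assumed 99% per-qubit yield, for illustration
for n in (10, 100, 1000):
    print(f"n={n:5d}: monolithic chip yield = {p ** n:.3%}")
# n=10 -> ~90%, n=100 -> ~37%, n=1000 -> ~0.004%.
# A modular processor sidesteps this collapse: components can be
# characterized individually, and only known-good ones are mounted.
```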

Rethinking the chip

To work around these issues, the team rethought the design of the quantum processor. The new processor is modular: different components can be tested and selected in advance, then mounted onto the processor motherboard.

The team’s next steps include scaling the quantum processor to more qubits, finding novel protocols that expand the processor’s capabilities and, potentially, finding ways to link router-connected qubit clusters the way supercomputers link their component processors.

They’re also looking to expand the distance over which they can entangle qubits.

“Right now, the coupling range is sort of medium-range, on the order of millimeters,” Wu said. “So if we’re trying to think of ways to connect remote qubits, then we must explore new ways to integrate other kinds of technologies with our current setup.”


