Google Reports Significant Quantum Advantage with Willow Processor

Recent research from Google, along with teams from MIT, Stanford, and Caltech, has unveiled a significant breakthrough in quantum computing. In two papers published in the journal Nature on October 22, the researchers claimed to have established a verifiable instance of quantum advantage using the Willow quantum processor. This achievement indicates that Willow surpasses current supercomputers in solving a specific problem.

To grasp how a quantum computer operates, consider waves rippling through a pond. When two wave crests converge, they create a larger wave, while a crest and a trough cancel each other out—a phenomenon known as interference. At the quantum level, particles can behave like waves, and their associated “wave functions,” which represent probabilities, can interfere with one another. By managing this interference, scientists can enhance the likelihood of identifying the correct solution to a problem, while diminishing the chances of incorrect ones. This is the fundamental operation of a quantum computer.
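The add-or-cancel rule can be seen in miniature with complex probability amplitudes. This is a toy numerical sketch, not real quantum hardware; the two-amplitude setup is purely illustrative:

```python
import numpy as np

# Two equal-magnitude probability amplitudes (complex numbers in general).
a = 1 / np.sqrt(2)

# In phase: crests align and the amplitudes add (constructive interference).
constructive = a + a
# Out of phase: a crest meets a trough and the amplitudes cancel (destructive).
destructive = a + (-a)

# Relative likelihoods are the squared magnitudes of the summed amplitudes.
print(round(abs(constructive) ** 2, 3))  # 2.0 -> outcome boosted
print(round(abs(destructive) ** 2, 3))   # 0.0 -> outcome suppressed
```

A quantum algorithm steers many such additions at once so that, after normalization, the boosted outcomes are the ones corresponding to correct answers.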

In one of the studies, the research team introduced a quantum algorithm aimed at addressing optimization problems—scenarios where the objective is to identify the best solution from numerous possibilities. Named Decoded Quantum Interferometry (DQI), this algorithm employs a quantum version of the Fourier transform to manipulate the wave-like characteristics of particles used as bits in quantum computing. The design allows for constructive interference of waves corresponding to favorable solutions, enhancing their signal, while waves linked to less desirable solutions interfere destructively and diminish. Consequently, by assessing the final state, the algorithm significantly increases the chances of arriving at a high-quality answer. The researchers demonstrated that for the optimal polynomial intersection problem, the DQI algorithm outperformed any known classical computer in terms of speed.
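DQI itself targets large optimization instances, but its core mechanism, a Fourier transform turning a hidden phase pattern into constructive interference at one index, can be illustrated on a tiny register. The 8-state register and the "good" index k below are illustrative assumptions, not part of the published algorithm (note also that NumPy's FFT uses the opposite sign convention from the usual quantum Fourier transform, which does not matter for this toy):

```python
import numpy as np

N = 8  # number of basis states (the equivalent of 3 qubits)
k = 3  # index of the "good" solution, encoded as a phase pattern

# Uniform superposition whose phases rotate at rate k across the register.
state = np.exp(2j * np.pi * k * np.arange(N) / N) / np.sqrt(N)

# A Fourier transform converts the phase pattern into amplitudes: waves
# matching index k add constructively, while all other indices cancel.
fourier = np.fft.fft(state) / np.sqrt(N)
probs = np.abs(fourier) ** 2

print(int(np.argmax(probs)))   # 3 -> all probability lands on the good index
print(round(probs[k], 6))      # 1.0
```

Measuring the transformed state then returns the favorable index with near certainty, which is the effect DQI engineers at scale.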

The second study focused on how information becomes scrambled within a complex quantum system. For instance, if a drop of dark blue dye is added to a still pool, the initial information is localized. However, as the dye disperses, the information becomes spread out, resulting in a faint, uniform blue tint across the pool. While the original drop seems lost, the information is not gone; it is simply scrambled across many molecules. This analogy applies to quantum systems, where information initially held in one quantum bit becomes distributed throughout other bits as they interact.

To measure this intricately hidden information, the researchers devised a clever experiment. Imagine shouting in a large warehouse. The sound waves disperse, echoing throughout the space. If a friend strikes a bell while your voice is still echoing, that sound wave receives a unique imprint. When the echoes are played back, those imprinted waves do not cancel out perfectly, revealing a faint, jumbled echo that retains information about the original sound. This leftover sound is analogous to an out-of-time-order correlator (OTOC) measurement, which allowed the scientists to gauge how much the information had interacted and spread within the system.
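An OTOC has a standard mathematical form, roughly F(t) = Tr(W(t) V W(t) V) / d for two operators W and V, which equals 1 before scrambling and decays as W(t) spreads. The sketch below is a toy: a random Hermitian matrix stands in for Willow's circuits (an assumption for illustration only), with W and V on opposite ends of a 3-qubit system:

```python
import numpy as np

# Pauli matrices and a Kronecker-product helper for a 3-qubit toy system.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def kron(*ops):
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

# A random Hermitian Hamiltonian stands in for a chaotic quantum circuit.
rng = np.random.default_rng(0)
A = rng.normal(size=(8, 8)) + 1j * rng.normal(size=(8, 8))
H = (A + A.conj().T) / 2
evals, evecs = np.linalg.eigh(H)

W0 = kron(Z, I2, I2)   # "shout": an operator on qubit 0
V = kron(I2, I2, X)    # "bell": an operator on qubit 2

def otoc(t):
    # Heisenberg-evolve W: W(t) = U^dagger W U with U = exp(-i H t).
    U = evecs @ np.diag(np.exp(-1j * evals * t)) @ evecs.conj().T
    Wt = U.conj().T @ W0 @ U
    # Out-of-time-order correlator F(t) = Tr(W(t) V W(t) V) / dim.
    return np.real(np.trace(Wt @ V @ Wt @ V)) / 8

print(round(otoc(0.0), 3))   # 1.0: W and V commute before any scrambling
print(otoc(2.0) < 1.0)       # True: scrambling degrades the correlator
```

The deviation of F(t) from 1 is the "faint, jumbled echo" of the analogy: a residue that certifies how far the initially local information has spread.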

In their second experiment, the researchers worked with circuits so complex that simulating them on the world's second-fastest supercomputer would reportedly take over three years. In contrast, the Willow processor completed the task in approximately two hours. However, while the first paper showcased a quantum algorithm solving a problem much faster than classical computers, a definitive mathematical proof is still required to confirm that no efficient solution exists for classical computers. Further research will be needed to demonstrate that the problem remains difficult for all non-quantum systems. Likewise, although the second study illustrated a quantum computer tackling a complex problem, the next step involves an independent team applying the same method to resolve a genuine unsolved issue in fields such as physics or chemistry.

Despite these advancements, the applications of these findings remain largely prospective, as they are still based on lab-created tasks that have yet to translate into actual scientific breakthroughs. Future progress will rely on improvements in other aspects of quantum computing, including error correction and scaling to thousands of reliable quantum bits, advances that are expected to take several more years.

In 2019, Google's researchers claimed quantum advantage in a different experiment, random circuit sampling, in which the Sycamore processor ran randomly chosen circuits and generated lists of outputs, with the task of producing the outcomes those circuits make most likely. Validating any single answer in random circuit sampling is difficult, however, because correctness rests on statistical distributions. By contrast, the problem tackled in the new studies concerns a scientifically relevant physical quantity, and the results are verifiable: the same problem can be run on classical or other quantum computers, and verification does not depend on statistical patterns.

A potential initial application for these findings lies in Hamiltonian learning, which involves deducing unknown parameters of a physical system by aligning experimental data with simulated results. The principles that underpin the work of this year's Nobel Prize winners in physics are integral to making processors like Willow feasible. One of the laureates, Michel Devoret, serves as the chief scientist for quantum hardware at Google Quantum AI. The recent studies build upon their foundational work by addressing an optimization problem and investigating how information propagates within quantum systems.
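The Hamiltonian-learning idea mentioned above can be sketched in miniature: simulate a system's dynamics for many candidate parameter values and keep the value whose simulation best matches the measured data. The single-qubit model and grid search below are illustrative assumptions, not the teams' actual method:

```python
import numpy as np

# Toy Hamiltonian learning: recover an unknown qubit frequency omega in
# H = (omega / 2) * Z by matching simulated dynamics to measured data.
# For a qubit prepared in |+>, the observable <X(t)> equals cos(omega * t).
true_omega = 1.7                    # hidden parameter (toy ground truth)
times = np.linspace(0, 5, 50)
data = np.cos(true_omega * times)   # stand-in for experimental records

def simulate(omega):
    # Simulated <X(t)> for a candidate value of the unknown parameter.
    return np.cos(omega * times)

# Grid search: choose the omega whose simulation best fits the data.
grid = np.linspace(0.5, 3.0, 2501)
losses = [np.sum((simulate(w) - data) ** 2) for w in grid]
estimate = grid[int(np.argmin(losses))]

print(round(estimate, 2))  # 1.7
```

In a real application the "data" would come from a quantum processor and the simulation from a classical or quantum model, but the fitting loop has the same shape.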