Here’s a look at what exactly Google has pulled off with its new state-of-the-art quantum computing chip called Willow.
Google has announced that it has achieved a major breakthrough in quantum computing, possibly nudging the technology from the conceptual towards the practical.
The tech giant said its new state-of-the-art quantum computing chip, called Willow, solved in under five minutes a computation so complex that it would have taken a supercomputer around 10 septillion (10^25) years to complete.
“The Willow chip is a major step on a journey that began over 10 years ago,” Hartmut Neven, the Google executive who founded and leads Quantum AI, the research team behind the breakthrough, said in a blog post on Monday, December 9.
“We’ve always hypothesised that quantum computers can do something that classical computers cannot do. That is the main objective of quantum computing, but it has been based on theoretical constructions. If Google’s claims are true, it is a demonstration that the hypothesis is correct,” said Debapriya Basu Roy, an assistant professor in the computer science department of the Indian Institute of Technology (IIT) Kanpur.
“We all know the potential of quantum computers, but making such a computer practical enough to solve some real-world problems is significant progress,” he added.
Let’s take a closer look at the essentials of quantum computing, what exactly Google has achieved with its new Willow chip, and whether it stands to gain an edge in the ongoing AI arms race.
What is quantum computing, and what are qubits?
Everything typed into a classical computer, such as words and numbers, gets translated into binary code comprising bits with a value of either 0 (ground state) or 1 (excited state). A qubit, however, leverages the principles of quantum mechanics to exist in a combination of both states simultaneously. For instance, a qubit could have a 25% probability of being measured as 0 and a 75% probability of being measured as 1. This means a single qubit can represent more information than a single classical bit.
As a result, quantum computers can process information in ways that are impossible for classical computers, and are capable of solving problems that traditional machines cannot.
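For readers who want to see the idea concretely, here is a minimal sketch in plain NumPy (not code from any quantum SDK or Google's stack) of the qubit described above, with amplitudes chosen to give the 25/75 split:

```python
import numpy as np

# A qubit's state is a pair of complex amplitudes; the probability of each
# measurement outcome is the squared magnitude of its amplitude (Born rule).
qubit = np.array([np.sqrt(0.25), np.sqrt(0.75)])  # [amplitude of |0>, amplitude of |1>]

probabilities = np.abs(qubit) ** 2
print(probabilities)                # [0.25 0.75]

# Measurement collapses the superposition to a single classical outcome.
samples = np.random.choice([0, 1], size=10_000, p=probabilities)
print(samples.mean())               # ~0.75, matching the 75% chance of reading 1
```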
How are quantum computers different from supercomputers?

With advanced architectures and acceleration techniques such as graphics processing units (GPUs) and multi-core processing, classical supercomputers excel at performing calculations at great speed. However, they are still bound by the constraints of classical computing principles and depend on logic gates such as AND, OR, XOR, and NOT to manipulate classical bits.
Quantum computers, on the other hand, use quantum gates such as the H-gate (Hadamard) and the Pauli gates, which are designed to process qubits and are reversible in nature. “Using these quantum gates, we can develop circuits and algorithms and solve problems that are otherwise impossible to solve,” Roy explained.
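As a toy illustration of the reversibility Roy mentions, the sketch below represents the Hadamard and Pauli-X gates as unitary matrices in NumPy. Because the gates are unitary, applying a gate and then its inverse recovers the original state, unlike a classical AND gate, which destroys information (the inputs cannot be recovered from an output of 0):

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
X = np.array([[0, 1], [1, 0]])                  # Pauli-X (quantum NOT)

ket0 = np.array([1, 0])          # |0>, the ground state

superposition = H @ ket0         # H puts |0> into an equal superposition
print(superposition)             # [0.707 0.707] -> 50/50 measurement odds

print(X @ ket0)                  # Pauli-X flips |0> to |1>: [0 1]

# Reversibility: H is its own inverse, so applying it twice undoes it.
print(H @ superposition)              # back to [1, 0]
print(np.allclose(H @ H, np.eye(2)))  # True -> the gate is reversible
```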
What is Google’s quantum computing chip Willow?
Google said that its new state-of-the-art quantum computing chip was fabricated at a facility in Santa Barbara, California, US. The chip’s components, including single- and two-qubit gates, qubit reset, and readout, have been engineered and integrated so that there is no lag between any two components, as that may adversely impact system performance, the company said.
Errors are considered one of the greatest challenges in quantum computing, as qubits in superposition tend to rapidly exchange information with their environment, making it harder to complete a computation. “Typically the more qubits you use, the more errors will occur, and the system becomes classical,” Google said.
However, with Willow, the company said it successfully drove down errors while scaling up the number of qubits processed by a quantum computer.
It tested ever-larger arrays of physical qubits, scaling up from a grid of 3×3 encoded qubits to a grid of 5×5, and then to a grid of 7×7. “Each time, using our latest advances in quantum error correction, we were able to cut the error rate in half. In other words, we achieved an exponential reduction in the error rate,” Google said. In a first, the quantum error correction in Willow happens in real time, which is crucial because errors can ruin a computation if they are not corrected fast enough.
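A back-of-the-envelope sketch makes the “exponential” wording clear: if each step up in grid size halves the logical error rate, the rate falls by a constant factor per step, which compounds exponentially. The starting rate below is an illustrative placeholder, not a figure from Google’s results:

```python
# Hypothetical starting error rate for the 3x3 array, chosen for illustration.
start_rate = 1e-2

for step, grid in enumerate(["3x3", "5x5", "7x7"]):
    rate = start_rate / (2 ** step)  # halved at every increase in grid size
    print(f"{grid}: logical error rate ~ {rate:.4f}")

# 3x3: 0.0100, 5x5: 0.0050, 7x7: 0.0025 -- halving per step means the error
# shrinks by a constant factor each time, i.e. exponentially in the step count.
```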
Furthermore, the company said it put the Willow chip through the random circuit sampling (RCS) benchmark test to measure its performance.
“Pioneered by our team and now widely used as a standard in the field, RCS is the classically hardest benchmark that can be done on a quantum computer today. It checks whether a quantum computer is doing something that couldn’t be done on a classical computer,” Google said.
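To give a sense of what “random circuit sampling” involves, here is a toy three-qubit version in NumPy: random single-qubit gates interleaved with entangling CNOTs, followed by sampling bitstrings from the circuit’s output distribution. This is only a sketch of the procedure; the real benchmark runs on dozens of qubits at depths that classical simulators cannot handle:

```python
import numpy as np

rng = np.random.default_rng(0)
I2 = np.eye(2)

def random_unitary():
    """Haar-random 2x2 unitary via QR decomposition of a random complex matrix."""
    z = (rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    return q * (np.diagonal(r) / np.abs(np.diagonal(r)))

# CNOT on a pair of adjacent qubits, embedded in the 3-qubit state space.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
entanglers = [np.kron(CNOT, I2),   # CNOT on qubits 0 and 1
              np.kron(I2, CNOT)]   # CNOT on qubits 1 and 2

n_qubits, depth = 3, 4
state = np.zeros(2 ** n_qubits, dtype=complex)
state[0] = 1.0                     # start in |000>

for layer in range(depth):
    # One layer = random single-qubit gates on every qubit, then an entangler.
    singles = np.kron(random_unitary(), np.kron(random_unitary(), random_unitary()))
    state = entanglers[layer % 2] @ (singles @ state)

# Sample bitstrings from the circuit's output distribution (Born rule).
probs = np.abs(state) ** 2
samples = rng.choice(2 ** n_qubits, size=5, p=probs)
print([format(s, "03b") for s in samples])
```

The benchmark’s difficulty comes from the fact that simulating such circuits classically requires tracking a state vector that doubles in size with every added qubit.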
In the RCS benchmark assessment, Google found that Willow surpassed Frontier, one of the world’s most powerful classical supercomputers.
“With 105 qubits, Willow now has best-in-class performance across the two system benchmarks discussed above: quantum error correction and random circuit sampling,” Google said.
How will Google’s quantum computing chip impact AI, encryption?
Providing insight into the broader relationship between AI and quantum computing, Roy explained that the domain of quantum AI involves developing AI algorithms and architectures with quantum computing advantages. “One of the major aspects of developing an AI model is to train it on large amounts of data. In that case, quantum computers can be very helpful because it helps you to compute the data faster,” he said.
Similarly, Google said it is exploring quantum algorithms to scale foundational computational tasks for AI. It further highlighted that quantum computers will be able to gather training data for AI models that is currently inaccessible to classical computers.
When asked if Google would be able to level up its AI play with quantum computing chips, Roy opined that the tech giant may still have a long way to go. “The standard circuits that we use for standard AI models may not work. There need to be some changes to ensure that the AI model can operate on a quantum circuit, which is an active area of research,” he said.
A fully functional quantum computer could also have code-breaking capabilities that would render current forms of online encryption unreliable. RSA, a public-key encryption algorithm with many real-world applications such as digital certificates, digital signatures, virtual private networks, and email encryption, is based on the difficulty of factoring very large numbers, a problem that is hard for classical computers to solve. Bitcoin and other cryptocurrencies, meanwhile, rely on elliptic-curve signatures whose security rests on the related discrete logarithm problem.
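A toy RSA walkthrough with tiny primes shows why the scheme’s security rests on factoring. Real RSA keys use primes hundreds of digits long, and the brute-force trial division below stands in for what Shor’s algorithm would do efficiently at scale:

```python
p, q = 61, 53
n = p * q                      # public modulus: 3233
phi = (p - 1) * (q - 1)        # Euler's totient: 3120
e = 17                         # public exponent, coprime with phi
d = pow(e, -1, phi)            # private exponent: modular inverse of e

message = 65
ciphertext = pow(message, e, n)          # encrypt with the public key (n, e)
print(pow(ciphertext, d, n))             # decrypt with d -> 65

# An attacker who can factor n recovers p and q, and with them the private key.
for candidate in range(2, n):
    if n % candidate == 0:
        p_found, q_found = candidate, n // candidate
        break
d_cracked = pow(e, -1, (p_found - 1) * (q_found - 1))
print(pow(ciphertext, d_cracked, n))     # 65 again -- the encryption is broken
```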
However, in 1994, American mathematician Peter Shor came up with an algorithm showing that a sufficiently capable quantum computer could efficiently solve both the factoring and discrete logarithm problems, threatening the cryptography underlying Bitcoin and other cryptocurrencies as well as any system that uses RSA encryption.
Does Google’s chip stand to weaken RSA encryption? Not quite. Even with its advances in error reduction and scalability, Willow is a 105-qubit chip, and experts have pointed out that breaking RSA encryption would require a vastly larger number of qubits.
“Estimates indicate that compromising Bitcoin’s encryption would necessitate a quantum computer with approximately 13 million qubits to achieve decryption within a 24-hour period,” said Kevin Rose, a tech entrepreneur and former senior product manager at Google.
However, such a scenario is not being dismissed entirely. With the growing focus on quantum computers in recent years, researchers like Roy are working on developing new, post-quantum algorithms that would remain secure against quantum computers.
Google also said that it has demonstrated how to exponentially reduce errors in quantum computers while using more qubits to scale up the technology, a feat that had evaded researchers in the field for the past 30 years.