I won’t pretend
to know much about quantum computing, so I am mostly parroting and copying
relevant quotes in this post. Google Quantum AI was founded in 2012 to build a large-scale
quantum computer in order to “benefit society by advancing scientific
discovery, developing helpful applications, and tackling some of society's
greatest challenges.” Correcting errors has been one major challenge of
quantum computing, and Google claims to have made significant progress on that
front with its new Willow chip. The chip demonstrated the ability to correct
errors “below threshold,” meaning it can drive errors down while scaling up
the number of qubits. Another part of the accomplishment is the speed of the
error correction, which happens in real time. Google Quantum AI founder Hartmut Neven noted:
“As the first system below threshold, this is the most
convincing prototype for a scalable logical qubit built to date. It’s a strong
sign that useful, very large quantum computers can indeed be built. Willow
brings us closer to running practical, commercially-relevant algorithms that
can’t be replicated on conventional computers.”
Neven described
the two major achievements as follows:
· The first is that Willow can reduce errors exponentially as we scale up using more qubits. This cracks a key challenge in quantum error correction that the field has pursued for almost 30 years.
· Second, Willow performed a standard benchmark computation in under five minutes that would take one of today’s fastest supercomputers 10 septillion (that is, 10^25) years — a number that vastly exceeds the age of the Universe.
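For a sense of scale, here is a quick bit of arithmetic using only the figures quoted above (my own unit conversion, not Google’s):

```python
# Unit-conversion check on the figures quoted above (my arithmetic, not Google's).
CLASSICAL_YEARS = 1e25           # quoted supercomputer estimate: 10 septillion years
WILLOW_MINUTES = 5               # quoted Willow runtime: under five minutes
AGE_OF_UNIVERSE_YEARS = 1.38e10  # roughly 13.8 billion years

classical_minutes = CLASSICAL_YEARS * 365.25 * 24 * 60
print(f"speedup factor: roughly {classical_minutes / WILLOW_MINUTES:.0e}")
print(f"multiples of the universe's age: roughly {CLASSICAL_YEARS / AGE_OF_UNIVERSE_YEARS:.0e}")
```

That works out to a speedup factor on the order of 10^30, and a classical runtime estimate hundreds of trillions of times the age of the Universe.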
The researchers created a silicon chip with 105 qubits; the performance metrics for Willow are detailed in Google’s announcement.
The most difficult
benchmark for a quantum computer thus far is random circuit sampling (RCS). Neven
explains:
“On the one hand, we’ve run the RCS benchmark, which
measures performance against classical computers but has no known real-world
applications. On the other hand, we’ve done scientifically interesting
simulations of quantum systems, which have led to new scientific discoveries
but are still within the reach of classical computers. Our goal is to do both
at the same time — to step into the realm of algorithms that are beyond the
reach of classical computers and that are useful for real-world, commercially
relevant problems.”
Google’s quantum computing roadmap lays out six milestones and marks this new result in quantum error correction as milestone 2. It took roughly 12 years to reach milestone 2, so I wonder how long it will take to hit the last four.
At the end of his
blog post, Neven explains why he thinks quantum computing will be revolutionary:
“My colleagues sometimes ask me why I left the burgeoning
field of AI to focus on quantum computing. My answer is that both will prove to
be the most transformational technologies of our time, but advanced AI will
significantly benefit from access to quantum computing. This is why I named our
lab Quantum AI. Quantum algorithms have fundamental scaling laws on their side,
as we’re seeing with RCS. There are similar scaling advantages for many
foundational computational tasks that are essential for AI. So quantum
computation will be indispensable for collecting training data that’s
inaccessible to classical machines, training and optimizing certain learning
architectures, and modeling systems where quantum effects are important. This
includes helping us discover new medicines, designing more efficient batteries
for electric cars, and accelerating progress in fusion and new energy
alternatives. Many of these future game-changing applications won’t be feasible
on classical computers; they’re waiting to be unlocked with quantum computing.”
The milestone in quantum error correction (QEC) was explained by Google Quantum AI researchers in a December paper in Nature; two relevant passages from the abstract stand out:
“Our system maintains below-threshold performance when
decoding in real time, …”
“Our results present device performance that, if scaled,
could realize the operational requirements of large scale fault-tolerant
quantum algorithms.”
Thus, that seems to be the goal this work points toward: “large-scale fault-tolerant quantum algorithms.”
Dan Garisto for Scientific
American explains QEC and its potential implications:
“If physicists could quell quantum errors caused by noise
on a large enough quantum computer, they could perform some computations, such
as exact simulations of molecules, that are intractable for classical computers.”
In a sense, quantum computers can reach into inherently probabilistic computations at a much deeper level than classical supercomputers can. But there are still many issues to overcome, and getting ahead of the errors is an important milestone.
“As you make a bigger and bigger system, you get better
at correcting errors, but you’re also causing more errors,” says Daniel
Gottesman, a quantum information theorist at the University of Maryland, who
was not involved with the study. “When you pass this transition, where you can
correct errors faster than they’re caused, is when making bigger and bigger
systems makes it better.”
Even though the current error rate is still orders of magnitude higher than that of classical computers, that may not matter much, since the errors can be corrected quickly.
“The logical qubit lasted more than twice as long as any
individual qubit it was composed of, and it had a one-in-1,000 chance of error
per cycle of computation. (For comparison, the rate of error in a typical
classical computer is about one in 1,000,000,000,000,000,000—essentially zero.)”
“Really good qubits are the thing that enables quantum error correction,” says Julian Kelly, director of quantum hardware at Google.
Google is not alone in making QEC progress: a few months ago, a Microsoft and Quantinuum collaboration announced an error rate of 1 in 500.
“…using qubits made from ions trapped by lasers, they
could encode 12 logical qubits that had a two-in-1,000 error rate.”
Garisto also has
a more sobering account of the immediate implications:
“Even with advances in error correction, practical
applications for quantum computers are unlikely in the near term. Estimates
vary, but consensus among many researchers is that to solve useful algorithms
or perform robust simulations of chemistry, a quantum computer would need
hundreds of logical qubits with error rates below about one in a million.”
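To get a rough feel for what “hundreds of logical qubits with error rates below about one in a million” implies for hardware, here is a back-of-envelope sketch. The numbers are my own assumptions, not from the Nature paper: a suppression factor of about 2 per surface-code distance step (d to d + 2), a starting point of roughly a 1-in-1,000 logical error rate at distance 7, and the standard surface-code count of about 2d² − 1 physical qubits per logical qubit.

```python
import math

# Back-of-envelope estimate with assumed numbers (not from the Nature paper).
EPS_START = 1e-3    # assumed ~1-in-1,000 logical error rate per cycle today
D_START = 7         # assumed starting surface-code distance
SUPPRESSION = 2.0   # assumed error-suppression factor per distance step (d -> d + 2)
EPS_TARGET = 1e-6   # "below about one in a million," per the quote above

# Distance steps needed so that SUPPRESSION**steps >= EPS_START / EPS_TARGET.
steps = math.ceil(math.log(EPS_START / EPS_TARGET, SUPPRESSION))
d = D_START + 2 * steps

# A distance-d surface-code patch uses about d*d data qubits plus d*d - 1
# measurement qubits, i.e. roughly 2*d*d - 1 physical qubits per logical qubit.
physical_per_logical = 2 * d * d - 1

print(f"distance steps: {steps}, final code distance: {d}")
print(f"physical qubits per logical qubit: about {physical_per_logical}")
print(f"300 logical qubits would need about {300 * physical_per_logical:,} physical qubits")
```

Under these assumptions, the estimate lands at well over a thousand physical qubits per logical qubit and hundreds of thousands overall, which helps explain why Garisto calls near-term practical applications unlikely.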
Willow is a hardware implementation. It has more qubits than its predecessor Sycamore: 105 versus 72. Sycamore, however, was unable to get below threshold on its error rate.
“Willow’s qubits are more robust than Sycamore’s: they maintain their delicate quantum state five times as long and have lower error rates.”
The fact that performance improves as the system grows is the part of the breakthrough that allows for scale-up. Above threshold, more qubits mean more uncorrected errors; below threshold, more qubits mean fewer uncorrected errors. Thus it now seems possible that hundreds of logical qubits with low error rates can be built.
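To make the threshold idea concrete, here is a toy sketch in Python (my own illustration, not Google’s code or data). It uses the textbook surface-code rule of thumb that the logical error rate per cycle scales roughly as (p/p_th)^((d+1)/2), where p is the physical error rate, p_th is the threshold, and d is the code distance:

```python
# Toy model of surface-code scaling (my illustration, not Google's model).
# Rule of thumb: logical error per cycle ~ A * (p / p_th) ** ((d + 1) / 2).
A = 0.1       # assumed prefactor, order of magnitude only
P_TH = 0.01   # assumed threshold of about a 1% physical error rate

def logical_error_rate(p: float, d: int) -> float:
    """Approximate logical error rate per cycle at code distance d."""
    return A * (p / P_TH) ** ((d + 1) / 2)

for p, regime in [(0.005, "below threshold (p < p_th)"),
                  (0.020, "above threshold (p > p_th)")]:
    print(regime)
    for d in (3, 5, 7):
        print(f"  distance {d}: {logical_error_rate(p, d):.1e}")
```

With p below threshold, the printed rates fall as d grows (more qubits, fewer uncorrected errors); above threshold, they climb. Willow’s reported result, with error rates dropping by roughly a factor of two at each increase in code distance from 3 to 5 to 7, is the experimental counterpart of the first regime.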
According to an
article in PC Magazine:
“In 2019, Google previously did a quantum computation in 200 seconds that would have taken the fastest supercomputer at that time 10,000 years. But IBM dismissed its claims, arguing that a classical computer could do the same calculation with greater ‘fidelity’ in 2.5 days and argued that ‘quantum supremacy’ had yet to be achieved.”
My question is when that claimed “quantum supremacy” will come about, and how we would measure it.
It is speculated that quantum computing will eventually threaten Bitcoin, since it apparently has the potential to break Bitcoin’s cryptography. That makes one wonder a bit, since China is far along in both quantum computing and hacking.
References:
Meet Willow, our state-of-the-art quantum chip. Hartmut Neven, Founder and Lead, Google Quantum AI. December 7, 2024.
Elon Musk is wowed by Google’s new quantum chip, which it claims ‘cracks a key challenge’ that’s existed for almost 3 decades. Dave Smith. Fortune.
Google’s Quantum Computer Makes a Major Breakthrough in Error Correction. Dan Garisto. Scientific American. December 9, 2024.
Google’s Quantum Chip Can Do in 5 Minutes What Would Take Other Computers 10 Septillion Years. Kate Irwin. PC Magazine. December 10, 2024.
Google’s ‘Willow’ chip is a big threat to Bitcoin: Eric Jackson. Fox Business. December 6, 2024.
Quantum error correction below the surface code threshold. Google Quantum AI and Collaborators. Nature. December 9, 2024.