A new quantum computer can execute calculations in mere moments that would take the most advanced supercomputers 47 years to process.
This article is misleading about how quantum computing works.
Superposition increases the computing power of a quantum computer exponentially. For example, two qubits can exist in four states simultaneously (00, 01, 10, 11), three qubits in eight states, and so on. This allows quantum computers to process a massive number of possibilities at once.
Quantum computers aren’t faster because they “process” multiple “possibilities” at once. Quantum computers aren’t any faster than regular computers when it comes to general purpose computing. You can exploit some interesting properties about quantum computing to solve certain problems asymptotically faster, like with Shor’s algorithm.
This means that the time to solve a problem scales better as the problem grows. Using Shor’s algorithm, the time to factor an integer N is proportional to (log N)^2 log log N, whereas the fastest known non-quantum algorithm takes time proportional to e^(1.9 (log N)^(1/3) (log log N)^(2/3)). Note that the majority of problems we might want to solve with a computer don’t have any fancy quantum algorithms associated with them, and for those a quantum computer is no faster than a normal one.
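To make the asymptotic difference concrete, here’s a back-of-the-envelope sketch in Python comparing the two growth rates quoted above. The constant factors are invented (real costs depend on hardware and implementation), so only the trend is meaningful:

```python
import math

def shor_cost(n_bits):
    """Shor's algorithm: proportional to (log N)^2 * log log N,
    where log N is roughly the bit length of the number."""
    ln_n = n_bits * math.log(2)
    return ln_n ** 2 * math.log(ln_n)

def gnfs_cost(n_bits):
    """Best known classical factoring (general number field sieve):
    proportional to exp(1.9 * (log N)^(1/3) * (log log N)^(2/3))."""
    ln_n = n_bits * math.log(2)
    return math.exp(1.9 * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

# The classical-to-quantum cost ratio explodes as the numbers grow.
for bits in (64, 256, 1024, 2048):
    print(bits, gnfs_cost(bits) / shor_cost(bits))
```

The ratio grows without bound, which is exactly the “given a large enough problem” point: asymptotically better scaling only pays off once inputs are big enough.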
Given a large enough problem that can be solved with a quantum algorithm, a quantum computer will eventually outperform a non-quantum computer. This does not mean that quantum computers can solve arbitrary problems quickly.
Quantum computers aren’t faster because they “process” multiple “possibilities” at once.
@hetscop I thought eigenstates/qubits were what made quantum computing faster and gave it its additional capabilities. For example, if you had an 8-bit integer represented by a bunch of qubits in a superposition of states, it would have every possible value from 0 to 255 and could be computed with as though it were every possible value at once until it is observed, the probability wave collapses, and a definite value emerges. Is this not the case?
I read up a bit on Shor’s algorithm, and while I don’t fully understand it all it seems to take advantage of superposition of states destructively, which sounds a lot like “processing multiple possibilities at once.”
When we input a superposition through our system and measure the remainder, we get a superposition of all possible powers that result in only that remainder. And this remaining superposition repeats with a period of the power [math didn’t paste nicely]
Just as a classical Fourier transform translates a wave as a function of time into a function of frequency, so too does a Quantum Fourier Transform (QFT), producing as output a superposition whose frequencies correspond to the input.
So, if we input a single state into a QFT, the output will be a superposition of states with varying weights or probabilities that form a wave with the input state as the frequency. And if we input multiple states into a QFT, the output will be a superposition of superpositions - with destructive and constructive interference combining superpositions into one wave. And if our input to the QFT is a superposition with period [math didn’t paste nicely] source
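For what it’s worth, the period-to-frequency step can be illustrated classically: an ordinary discrete Fourier transform of a vector that is nonzero only at multiples of a period r has peaks at multiples of M/r, which is the same structure the QFT exposes. A toy sketch in plain Python (this simulates the arithmetic, not an actual quantum device):

```python
import cmath

M = 16  # dimension of the state space (4 qubits' worth of basis states)
r = 4   # period of the hidden superposition

# Equal amplitude on the states 0, r, 2r, ... and zero elsewhere.
amps = [1.0 if k % r == 0 else 0.0 for k in range(M)]

def dft(xs):
    """Plain discrete Fourier transform, standing in for the QFT."""
    n = len(xs)
    return [sum(x * cmath.exp(-2j * cmath.pi * j * k / n)
                for j, x in enumerate(xs))
            for k in range(n)]

peaks = [k for k, a in enumerate(dft(amps)) if abs(a) > 1e-9]
print(peaks)  # [0, 4, 8, 12] — multiples of M/r
```

Everything off the peaks cancels by destructive interference; measuring after the QFT therefore reveals (a multiple of) M/r, from which the period is recovered.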
If my understanding is inaccurate, can you recommend a good source to understand this? Thanks!
For example, if you had an 8-bit integer represented by a bunch of qubits in a superposition of states, it would have every possible value from 0 to 255 and could be computed with as though it were every possible value at once until it is observed, the probability wave collapses, and a definite value emerges. Is this not the case?
Not really, or at least it’s not a good way of thinking about it. Imagine it more like rigging coin tosses: you don’t have every single configuration at the same time, but rather a joint probability distribution over all bits, which gets altered to produce certain useful distributions.
To get something out, you then make a measurement that returns the correct result with a certain probability (i.e. it’s a probabilistic Turing machine rather than a nondeterministic one). This can be very useful, since sampling from a distribution can sometimes be much nicer than actually solving a problem (e.g. you replace a solver with a simulator of the output).
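The rigged-coin-toss picture can be sketched in a few lines of plain Python: the “state” is just a table of amplitudes, and measurement samples one outcome with probability equal to the squared amplitude. The numbers here are made up for illustration:

```python
import random

# The joint "state" of two bits is a table of amplitudes; algorithm
# design tries to concentrate amplitude on the useful outcomes.
# These values are invented (they are roughly normalised:
# 3 * 0.1**2 + 0.9849**2 ≈ 1).
amps = {"00": 0.1, "01": 0.1, "10": 0.1, "11": 0.9849}

def measure(amplitudes):
    """Measurement returns one outcome, with probability |amplitude|^2."""
    outcomes = list(amplitudes)
    weights = [abs(a) ** 2 for a in amplitudes.values()]
    return random.choices(outcomes, weights=weights)[0]

counts = {k: 0 for k in amps}
for _ in range(10_000):
    counts[measure(amps)] += 1
print(counts)  # "11" dominates because the distribution was rigged that way
```

You never read out “all four values at once”; you only ever get one sample per run, which is why useful algorithms must arrange interference so the right answer carries almost all the probability.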
In traditional computing this can also be done, but that gives you the fundamental problem of sampling from very complex probability distributions, which involves approximating usually intractable integrals. However, there are also massive limitations on the type of things a quantum computer can model this way, since quantum theory is inherently linear (i.e. no climate modelling, regardless of how often people claim they want to do it).
There’s also the question of how many things exist where it is more efficient to build such a distribution and sample from it, rather than having a direct solver.
If you look at the classic quantum algorithms (e.g. https://en.wikipedia.org/wiki/Quantum_algorithm), you can see that there aren’t really that many algorithms out there where it makes sense to use quantum computing (that page is of course not exhaustive, but it gives a pretty good overview). Pretty much all of them are asymptotically barely faster than, or the same speed as, classical ones, and most rely on the problem you are looking at being a black-box one. Remember that one of the largest useful problems ever solved on a quantum computer so far was factoring the number 21, using a specialised version of Shor’s algorithm that only works for that number (since the full Shor’s would need many orders of magnitude more qubits than exist on the entire planet).
There’s also the problem of logical vs physical qubits: in computer science we like to work with “perfect” qubits that are mathematically ideal, i.e. completely noise-free. However, physical qubits are really fragile and couple to pretty much anything and everything, which adds a lot of noise into the system. This problem also gets worse the larger you scale your system.
The latter is a fundamental problem: the whole point of quantum computers is that you can combine random states to “virtually” build a complex distribution before you sample from it. This can be much faster, since the virtual model can capture dependencies that are intractable to work with on a classical system, but that dependency monster also means that any noise in the system is going to negatively affect everything else as you scale up to more qubits.
That’s why people expect real quantum computers to have many orders of magnitude more qubits than you would theoretically need. It also means that you cannot trivially scale up a physical quantum algorithm: a physical Grover search on a list with 10 entries might look very different from one on a list with 11 entries.
This makes quantum computing a nonstarter for many problems where you cannot pay the time it takes to engineer a custom solution.
And even worse: you cannot even test whether your fancy new algorithm works in a simulator, since the stuff you are trying to simulate is specifically the intractable quantum noise (something which, ironically, a quantum computer is excellent at simulating). In general you should be really careful when looking at quantum computing articles, since it’s very easy to build some weird distribution that is basically impossible for a normal computer to work with, but that doesn’t mean it’s something practical. E.g. just starting the quantum computer, “boop” one bit, then waiting for 3 ns will give you a quantum noise distribution that is intractable to simulate with a computer (the same is true if you don’t do anything with the computer: there are literal research teams of top scientists whose job boils down to “what are quantum computers computing if we don’t give them instructions”).
Meanwhile, the progress of classical or e.g. hybrid analog computing is much faster than that of quantum computing, which means that the only people really deeply invested into quantum computing are the ones that cannot afford to miss, just in case there is in fact something:
- finance
- defence
- security
- …
(typed this out yesterday before @ZickZack s excellent answer, but couldn’t post it at the time due to maintenance…)
No, you’ve got it wrong. This is a fairly common misunderstanding, which is perpetuated by a lot of coverage of the topic being sloppy.
You could argue that there is a grain of truth to the idea of processing multiple possibilities at once, but it’s a bit more complicated than that, and the way it’s usually presented leads people to build a bad intuition of how it works. If you do get into the nitty-gritty of Shor’s algorithm, it feels (to me at least) a bit like a weird hack that shouldn’t work at all, or at least shouldn’t be faster than the normal way to compute prime factors. It isn’t a general speedup; it only shows up in certain cases where you can exploit quantum mechanics in clever ways.
Off the top of my head, the SMBC comic about it is actually pretty good. This article makes basically the same points, but a bit more elaborated (note that it was written a while ago, so the part about the current state of quantum computing is outdated). I noticed that Veritasium put out a YouTube video which I haven’t watched, but he is in my experience good at explaining physics and math, so I think there’s a good chance that it’ll hold up. I remember liking this Minute Physics video about Shor’s algorithm too, if you wanna get a better understanding of it.
I should clarify that I’m not a quantum physicist; I’ve just done a couple of internet deep dives on the topic, and I can’t say that I fully understand quantum computing at all. I do think my understanding of it is better than the one in this article and others like it.
It looks like an interesting article, however I couldn’t read it due to 75% of the screen being taken up by ads that could not be closed. I managed to read a paragraph before they covered up pretty much the whole rest of the article.
In a significant leap for the field of quantum computing, Google has reportedly engineered a quantum computer that can execute calculations in mere moments that would take the world’s most advanced supercomputers nearly half a century to process.
The news, reported by the Daily Telegraph, could signify a landmark moment in the evolution of this emerging technology.
Quantum computing, a science that takes advantage of the oddities of quantum physics, remains a fast-moving and somewhat contentious field.
Quantum computers hold immense promise for potentially revolutionizing sectors like climate science and drug discovery. They offer computation speeds far beyond those of their classical counterparts.
Potential drawbacks of quantum computing
However, this advanced technology is not without its potential drawbacks. Quantum computers pose significant challenges for contemporary encryption systems, thus placing them high on the list of national security concerns.
The contentious discussion continues. Critics argue that, despite the impressive milestones, these quantum machines still need to demonstrate more practicality outside of academic research.
Astonishing capabilities of Google’s quantum computer
Google’s latest iteration of its quantum machine, the Sycamore quantum processor, currently holds 70 qubits. This is a substantial leap from the 53 qubits of its earlier version, making the new processor approximately 241 million times more powerful than the previous model.
As each qubit can exist in a state of zero, one, or both simultaneously, storing and processing this level of quantum information is an achievement that even the fastest classical computer cannot match.
The Google team, in a paper published on the arXiv pre-print server, remarked: “Quantum computers hold the promise of executing tasks beyond the capability of classical computers. We estimate the computational cost against improved classical methods and demonstrate that our experiment is beyond the capabilities of existing classical supercomputers.”
Even the currently fastest classical computers, such as the Frontier supercomputer based in Tennessee, cannot rival the potential of quantum computers. These traditional machines operate on the language of binary code, confined to a dual-state reality of zeroes and ones. The quantum paradigm, however, transcends this limitation.
Revolutionary power
It remains uncertain how much Google’s quantum computer cost to create. Regardless, this development certainly holds the promise of transformative computational power.
For instance, according to the Google team, it would take the Frontier supercomputer merely 6.18 seconds to match a calculation from Google’s 53-qubit computer. However, the same machine would take an astonishing 47.2 years to match a computation executed by Google’s latest 70-qubit device.
Quantum Supremacy
Many experts in the field have praised Google’s significant strides. Steve Brierley, chief executive of Cambridge-based quantum company Riverlane, labeled Google’s advancement as a “major milestone.”
He also added: “The squabbling about whether we had reached, or indeed could reach, quantum supremacy is now resolved.”
Similarly, Professor Winfried Hensinger, director of the Sussex Centre for Quantum Technologies, commended Google for solving a specific academic problem that is tough to compute on a conventional computer.
“Their most recent demonstration is yet another powerful demonstration that quantum computers are developing at a steady pace,” said Professor Hensinger.
He stressed that the upcoming critical step would be the creation of quantum computers capable of correcting their inherent operational errors.
While IBM has not yet commented on Google’s recent work, it is clear that this progress in the realm of quantum computing has caught the attention of researchers and companies worldwide. This will open new prospects and competition in the evolution of computational technology. Let the games begin!
More about quantum computing
Quantum computing, a remarkable leap in technological advancement, holds the potential to redefine our computational capacities. Harnessing the strange yet fascinating laws of quantum physics, it could significantly outperform classical computers in solving certain types of problems.
Basics of Quantum Computing
Traditional computers operate based on bits, which can be in a state of either 0 or 1. Quantum computers, on the other hand, operate on quantum bits, known as qubits. Unlike traditional bits, a qubit can exist in both states simultaneously, thanks to a quantum principle called superposition.
Superposition increases the computing power of a quantum computer exponentially. For example, two qubits can exist in four states simultaneously (00, 01, 10, 11), three qubits in eight states, and so on. This allows quantum computers to process a massive number of possibilities at once.
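In code terms, “n qubits, 2^n states” just means the joint state carries one amplitude per basis state. A minimal sketch in plain Python (classical bookkeeping of amplitudes, not a real quantum simulation):

```python
from itertools import product

# Three qubits: the joint state assigns one amplitude to each of the
# 2**3 = 8 basis states 000..111. Here: a uniform superposition.
n = 3
amp = 1 / 2 ** (n / 2)  # 1/sqrt(8) for each basis state
state = {"".join(b): amp for b in product("01", repeat=n)}

print(len(state))  # 8 — the state count doubles with every qubit
print(round(sum(a ** 2 for a in state.values()), 10))  # 1.0 — probabilities sum to one
```

Note the catch discussed in the comments above: a measurement returns only one of those 2^n states, sampled by squared amplitude, so the exponential state space is not directly readable.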
Another key quantum principle quantum computers exploit is entanglement. Entangled qubits are deeply linked: measure one qubit, and the outcome of measuring its entangled partner is correlated with it, no matter the distance. This feature allows quantum computers to process complex computations more efficiently.
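That correlation can be illustrated with a toy sketch in plain Python using a Bell pair (the amplitudes are the standard ones; the `measure` helper is an invented sampler, not a real quantum API):

```python
import random

# A Bell pair: amplitude 1/sqrt(2) on |00> and |11>, zero elsewhere,
# so the two measured bits always agree.
bell = {"00": 2 ** -0.5, "01": 0.0, "10": 0.0, "11": 2 ** -0.5}

def measure(amplitudes):
    """Sample one outcome with probability |amplitude|^2 (toy helper)."""
    outcomes = list(amplitudes)
    weights = [abs(a) ** 2 for a in amplitudes.values()]
    return random.choices(outcomes, weights=weights)[0]

samples = [measure(bell) for _ in range(1000)]
print(all(s in ("00", "11") for s in samples))  # True — the bits always agree
```

The individual outcomes are still random (roughly half 00, half 11); only the correlation between the two qubits is guaranteed, which is why entanglement cannot be used to send signals.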
Applications of Quantum Computers
The unusual characteristics of quantum computing make it ideal for solving complex problems that classical computers struggle with.
Cryptography is a notable area where quantum computing can make a significant difference. The capacity to factor large numbers quickly makes quantum computers a threat to current encryption systems but also opens the door for the development of more secure quantum encryption methods.
In the field of medicine, quantum computing could enable the modeling of complex molecular structures, speeding up drug discovery. Quantum simulations could offer insights into new materials and processes that might take years to discover through experimentation.
Challenges in Quantum Computing
Despite its promising potential, quantum computing is not without challenges. Quantum states are delicate, and maintaining them for a practical length of time—known as quantum coherence—is a significant hurdle. The slightest environmental interference can cause qubits to lose their state, a phenomenon known as decoherence.
Quantum error correction is another daunting challenge. Due to the fragility of qubits, errors are more likely to occur in quantum computations than classical ones. Developing efficient error correction methods that don’t require a prohibitive number of qubits remains a central focus in quantum computing research.
The Future of Quantum Computing
While quantum computing is still in its infancy, the rapid pace of innovation signals a promising future. Tech giants like IBM, Google, and Microsoft, as well as numerous startups, are making significant strides in quantum computing research.
In the coming years, we can expect quantum computers to continue growing in power and reliability. Quantum supremacy—a point where quantum computers surpass classical computers in computational capabilities—may be closer than we think.
Quantum computing represents a thrilling frontier, promising to reshape how we tackle complex problems. As research and development persist, we inch closer to unlocking the full potential of this revolutionary technology.
It’s almost like someone would need a tool to climb over that wall… something that is maybe 11 or 12 feet tall
@MrJameGumb yeah … earth.com isn’t great, but I didn’t notice because I read everything in with adblock and readerview.
I know this is not relevant to the story, but check out Brave browser, which has built-in blocking. The article looked fine for me (just FYI). I use it on mobile and desktop.
@MrJameGumb uBlock Origin (best used on Firefox, as Chrome is limiting ad blockers)