Quote: (12-12-2017 07:55 PM)The Beast1 Wrote:
^^Google has already developed a 49-qubit computer this year and IBM intends to top that in 2018.
Moore's law very much applies to quantum computing. A long way isn't decades, but a few years.
This guy presumes a 1500-qubit machine would topple BTC https://www.quora.com/How-easy-would-it-...ele-Righes
1500 isn't a number that's far out considering the exponential growth of quantum computing.
Exponential growth? I don't know about that.
5-6 years ago they were in the 4-6 qubit range, so they've gained an order of magnitude in half a decade. To be fair, though, they were stuck in that range until about a year ago, so I wouldn't call it exponential growth; there hasn't really been a sustained trend. It was more like decades of academic research stuck on single-digit-qubit systems, then industry got involved and scaled by an order of magnitude. Whether they can keep doing that remains to be seen.
They've still got a long way to go, and as for scaling from a 49-qubit to a 1500-qubit computer, it remains to be seen whether they can keep up with the error correction.
As for Moore's Law, it's based on shrinking the dimensions of CMOS devices: a factor of ~0.7 in both length and width, for a total area reduction of 0.7 * 0.7 ≈ 0.5.
Moore's Law is the trend of doubling the number of devices on a chip as you halve the area of each device.
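The scaling arithmetic is easy to check for yourself (the starting device count below is an illustrative number, not from any real process roadmap):

```python
# Illustrative arithmetic only: shrinking each linear dimension of a device
# by ~0.7 reduces its area by 0.7 * 0.7 ~= 0.49, so roughly twice as many
# devices fit on the same chip area each generation.
shrink = 0.7
area_factor = shrink * shrink  # ~0.49, i.e. about half the area per device
devices = 1_000_000            # assumed starting device count
for generation in range(1, 4):
    devices = round(devices / area_factor)
    print(f"generation {generation}: ~{devices:,} devices")
```

Three generations of a 0.7x linear shrink takes you from one million devices to roughly eight and a half million, which is the familiar "doubling per generation" pattern.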
That doesn't apply to quantum computing; you don't scale qubits like that, partly because qubits don't have to be made of semiconducting materials (there are other approaches). If they wanted to, they could stack 1500 qubits together today. The problem is quantum error correction: every time you add another qubit, you increase the probability of decoherence, which means a qubit loses its information due to external interference (sometimes from the other qubits).
So going from 50 to 51 qubits is a whole lot harder than going from 49 to 50, and going from 49 to 1500 isn't just a few doublings; it's orders of magnitude more challenging.
Also, just to emphasize the Forbes link I posted, as it directly quotes a Google QC expert:
Quote:
But Google’s quantum computing expert John Martinis wants to put their minds at ease.
At a major crypto event at University of California Santa Barbara this week, Martinis talked about why it could take a decade or more to build a quantum computer. “This is really, really hard, way harder than building a classical computer,” he said.
He went on to explain to the packed room at Crypto 2017, a four-day conference sponsored by the International Association for Cryptographic Research, that the main reason building a quantum computer is so tough is because qubits (quantum bits), the counterpart of bits in classical computers, are unstable. And that creates extra work for physicists trying to solve the problem.
So even the QC guy at Google is saying it's a long way from happening.
And finally, cryptography has always been an arms race. Whatever algorithm you have has a shelf life; you're meant to upgrade it as technology catches up and makes it easy to break.
Andreas Antonopoulos talks a bit about how governments actually go about using their advanced cryptography skills: if a government knows a way to crack current algorithms, they don't use it willy-nilly, but wait for opportunities that warrant it, e.g. breaking another country's nuclear codes. You get one shot at it; after that you basically lose your advantage:
https://youtu.be/dkXKpMku5QY
Cryptography is a constant arms race, people are working on cryptographic algorithms that'll make SHA-256 pale in comparison.
From the Quora link you posted:
Quote:
Public-key crypto that is secure against QC does exist, however. Currently, Bitcoin experts tend to favor a cryptosystem based on Lamport signatures. Lamport signatures are very fast to compute, but they have two major downsides:
The signature would be quite large, around 11 kB (169 times larger than now). This would be very bad for Bitcoin's overall scalability, since bandwidth is one of the main limiting factors to Bitcoin's scaling. Advances in scalability such as Segregated Witness (the 11 kB is part of the witness) and Lightning would help.
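The Lamport scheme in that quote is simple enough to sketch. Below is a toy Python version using SHA-256 (illustrative only, not production code): the private key is 256 pairs of random values, the public key is their hashes, and a signature reveals one value per message-digest bit. The revealed values alone come to 256 * 32 bytes = 8 kB, which is the multi-kilobyte ballpark the quote describes.

```python
import hashlib
import secrets

BITS = 256  # one pair of key values per bit of the SHA-256 message digest

def keygen():
    # Private key: two random 32-byte secrets per digest bit.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(BITS)]
    # Public key: the SHA-256 hash of each secret.
    pk = [(hashlib.sha256(a).digest(), hashlib.sha256(b).digest()) for a, b in sk]
    return sk, pk

def _digest_bits(message: bytes):
    d = hashlib.sha256(message).digest()
    return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(BITS)]

def sign(message: bytes, sk):
    # Reveal one of the two secrets for each digest bit.
    return [pair[b] for pair, b in zip(sk, _digest_bits(message))]

def verify(message: bytes, sig, pk) -> bool:
    # Hash each revealed secret and compare against the public key.
    return all(hashlib.sha256(s).digest() == pair[b]
               for s, pair, b in zip(sig, pk, _digest_bits(message)))
```

Note that a Lamport key pair must only ever sign one message; reusing it reveals additional private values and breaks the scheme's security, which is one more operational cost on top of the signature size.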
Both Bitcoin and Ethereum are working hard on the scalability problem. I'll focus on Ethereum for the time being since I know their development a bit better: in the next 2-3 years you'll see several scalability solutions come online, which should enable Lamport signatures to be feasible.
This will be LONG before QCs come into existence as a real risk to cryptocurrencies.
Overall, I'm far from convinced QC is a genuine threat in the next 2-3 years, 5 years, or maybe even 10 years and beyond.
I'm open to changing my mind though, especially in the next few years as QC develops. But not worried at the moment.