I'm just wondering, can we run shors algo yet or no? :)
Two quick points on this:
- D-Wave do not claim to be building a general quantum computer (although they willfully muddy the water on this). Their machines cannot, even in principle, run something like Shor's algorithm. It is not clear whether they will ever be able to run anything better than a competitive classical computer. They are just interesting analog computers that happen to use some quantum effects (almost incidentally).
- Shor's algorithm can only run on an error-corrected quantum computer, and it will almost certainly not be the first "engineeringly useful" algorithm to run on one. A more informative milestone would be "can they run error correction over many rounds of computation". The cutesy examples of a quantum computer running Shor's algorithm on numbers like 21 are just tech demos that have no chance of scaling.
Simulation of quantum systems (drugs, new materials, solar cells, batteries) is a much more plausible near-term application of a quantum computer. If it ever happens it will probably happen many years before Shor's algorithm successfully runs.
> D-Wave do not claim to be building a general quantum computer (although they willfully muddy the water on this)
While currently the machines they are talking about are quantum annealing only, there is some indication they’re working towards a universal QC [1] [2].
> During the Qubits 2021 conference held by D-Wave, it was announced that the company is developing their first universal quantum computers, capable of running Shor's algorithm in addition to other gate-model algorithms such as QAOA and VQE.
As for your other point,
> Simulation of quantum systems (drugs, new materials, solar cells, batteries) is a much more plausible near-term application of a quantum computer. If it ever happens it will probably happen many years before Shor's algorithm successfully runs.
Isn’t this exactly the kind of stuff that annealing can be applied to? [3]
[1] https://www.theregister.com/2021/10/05/dwave_quantum_compute...
> Isn’t this exactly the kind of stuff that annealing can be applied to?
Yes, but there is just no evidence that annealers are better at this kind of stuff than classical algorithms. They are "just" an interesting analog computer that can be applied to this type of problem, without a reason to believe that they will be drastically better (in terms of complexity theory).
On the other hand, circuit-based digital quantum computers and error-corrected adiabatic quantum computers (which are similar but sufficiently different from annealers) can run algorithms that are better than all known classical algorithms at this kind of stuff (modulo the fact that such computers do not exist yet).
The analog computer is a quantum system (of atoms, for example), and your argument rests on the claim that the complexity of simulating a system of atoms is not high, i.e. that nature (the "analog computer") isn't performing anything better from a complexity standpoint.
While I agree that there are pretty efficient algorithms to simulate systems of atoms behaving like Newtonian bodies (like molecular dynamics) it quickly falls apart when you need to include more "quantum like" effects.
The digital quantum computer is itself built out of atoms, so if you could simulate an analog computer (which derives its properties from QM) efficiently, you could then also simulate the digital quantum computer efficiently.
I guess it's a question of semantics - like what properties of nature are an "analog computer" allowed to tap. My point is, I think, that you kind of downplay the capabilities of what an analog computer is capable of by saying it's "just" an analog computer. But any small molecule whizzing around is in some sense a very difficult to simulate analog computer from another point of view.
> your argument rests on the claim that the complexity of simulating a system of atoms is not high and that nature ...
No, that is not what I am going for. I completely agree with the rest of your post, given the premise you are taking.
All I am saying is that, while analog computers are cool and frequently "better" in various ways than digital computers (and it does not matter whether we are talking about quantum or classical ones), analog computers inherently cannot scale. Without error-correcting codes there is always a size limit at which noise destroys the result of your computation, and error-correcting codes exist only for digital quantum and classical computers, not for analog quantum or classical computers.
That is why we use scale models of rivers and gulfs less and less. Even wind tunnels, the last analog computers still in use, are fading in utility.
By the way, in the abstract sense of computational complexity (in the absence of noise and in the presence of infinitely precise measurements), "scalable" classical and quantum analog computers are equally powerful and are both more powerful than "scalable" classical and quantum digital computers. They just cannot exist in principle (while "scalable" quantum digital computers are only "difficult", not in-principle impossible).
Adiabatic quantum computers (the error-corrected version of quantum annealers) on the other hand are a very neat "analog-looking but actually digital" type of computers that are equally powerful to the gate-based model of quantum computers.
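To make the digitization point above concrete: the simplest classical error-correcting code, a three-bit repetition code with majority vote, already pushes the logical error rate from p down to roughly 3p² — something an analog signal, which has no discrete states to vote over, cannot do. A toy sketch in Python (illustrative only):

```python
import random

def noisy_copy(bit, p):
    """Flip the bit with probability p (a simple bit-flip noise channel)."""
    return bit ^ (random.random() < p)

def send_repetition(bit, p):
    """Encode one logical bit as three physical bits, then majority-vote."""
    received = [noisy_copy(bit, p) for _ in range(3)]
    return int(sum(received) >= 2)

random.seed(0)
p = 0.1  # physical error rate
trials = 100_000
raw_errors = sum(noisy_copy(0, p) for _ in range(trials))
coded_errors = sum(send_repetition(0, p) for _ in range(trials))
print(raw_errors / trials)    # ~0.10
print(coded_errors / trials)  # ~0.028, i.e. 3p^2 - 2p^3
```

Quantum error correction is far more involved (you cannot copy a qubit), but the same digital principle — discrete states plus redundancy beating noise — is what annealers lack.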
And modulo the fact that we don’t even know that we can build ones that are actually faster than classical computers (at meaningful qubit sizes).
That is true -- all we can show is sustained exponential improvement in figures of merit like qubit lifetime and coherence, scalability, gate fidelity over the span of 25 years. As long as this exponential pace of improvement does not stop (we are far from fundamental limits) we should be able to build these machines.
Then there is the question of what they can be used for. Their applicability probably will remain niche, as only very special problems are solvable better on a quantum computer than on a classical one.
> can we run shors algo yet or no?
Progress with Shor's algorithm:
year 2001: 15 = 3 × 5 (IBM)
year 2012: 21 = 3 × 7 (University of Bristol)
year 2019: factorization of 35 attempted, failed (IBM)
https://en.wikipedia.org/wiki/Shor%27s_algorithm#Physical_im...
SPOILER ALERT: it's 7 x 5.
The systems made by D-Wave do quantum annealing, not general purpose quantum computation. As such, they are very useful for optimisation problems but they can't run Shor's algorithm.
> they are very useful for optimisation problems
Do you have a source? The HN consensus whenever they've come up in the past is their stuff is useless from both a theoretical and practical perspective, easily outperformed by normal computers, and there's nothing but dishonest marketing going on. If this has changed that seems like a big deal.
Ex: most recent thread I found https://news.ycombinator.com/item?id=34217218
Nothing has changed. https://scottaaronson.blog/?p=431 still applies as much as it ever did.
Wtf, that post is from 2009. 15 years on and this company is still playing the same game and the public is still being taken in by it?
Much more has been written on the topic in the intervening years, including by Aaronson. I'd challenge cwillu's statement that "nothing" has changed, but I'm a D-Wave employee* so I don't tend to wade in. More specifically, I'm not terribly concerned with what "the public" thinks about our computers. Academic and industrial partners are much more reliable judges of quality.
* I do architecture, algorithms and circuit design; not marketing. That said, I'd encourage you to compare our marketing budget against those of IBM, Google, Microsoft, and the other big players in the field -- if the public is to be swayed, it'll follow the splashy marketing.
So long as they talk about the number of “qubits” their machines have without prominently clarifying that they're using a heterodox definition of “qubit” and “quantum computer”, I will continue to maintain that nothing has changed.
There is nothing heterodox about calling our qubits qubits, nor are our qubit counts deceptive. The limitation is on the algorithms that can be performed on our systems; something that we're extremely clear about in our publications. We describe our systems as adiabatic quantum computers, explicitly calling out the difference between our current products and gate-model quantum computers. And, despite not being explicitly designed for gate-model operation, we've recently observed Bell violations using novel control protocols. This wouldn't be possible without "orthodox" qubits. So whatever your beef is, the critique you've leveled at us above is inaccurate to the point of being misleading.
hn's favorite pastime is perpetuating hype.
[dead]
Can they be used to train DNNs?
only if you can formulate it as an optimization problem in under 4,400 bits!
So, for my 430B parameter model, I'll go out on a limb and guess the answer is a "no"....?
It's really a shame that popular media made Shor's algorithm the pinnacle of quantum computing. After 3 years in the industry, I've yet to meet a single scientist, engineer, or organization who's truly interested or does active research in implementing Shor's algorithm. Even Grover's algorithm is more "popular", there are actual demos and small prototypes. But neither are actually the focus of the majority of active development.
It's as if the whole world would believe, for some reason, that the goal of AI is to produce punk songs in Farsi, and every news article about AI would be filled with comments "but can it make Farsi punk yet?"
Not to downplay the importance of Shor's algorithm, but it's just weird.
Maybe that says more about the QC field than pop media though? Shor's algorithm truly implies both an algorithmic and a practical breakthrough (if it's possible to implement, which is no small if). But what other algos are there? It's this, the quantum Fourier transform, and Grover's like you say. What else can be cool and useful? What is the bulk of the active development, except error correction codes? What should someone start with now if they're just entering the QC field?
One of the most interesting fields are not algorithms in this common sense, but optimization problems and reliable simulations of quantum systems. For example, certain car manufacturers are investing in on-premise quantum computers not because they want to break encryption or compute timetables, but because they want to develop novel electric batteries, and it's difficult unless you can simulate materials, chemical interactions, etc. Similar motivations exist in pharma (developing drugs), agriculture (developing additives, pesticides, etc.).
Another field is quantum sensing, from highly scientific (detecting particles/matter/etc.), to highly down-to-earth (e.g. military). Some properties of QCs can be used to improve the precision of quantum clocks.
There are so many exciting fields and applications, where progress is made and will be made before general purpose QC is developed on which you'd run Shor's and Grover's (if ever, in our lifetime). The truth is that QCs are best at niche unique applications related to quantum mechanics, and not so good at general-purpose computing.
Sure, pop media is not the source; it's being fed by industry spokespeople, who are ultimately concerned about funding. Many people in the industry do believe that a quantum winter is coming. But it's a self-destructive cycle: to continue progress we need funding; to get funding we make promises; to make them efficiently, the industry gravitates towards easy-to-blow-your-mind topics like breaking encryption or solving traveling salesman. But chances are we won't have a QC capable of running a practically useful Shor for a decade or two at best, and by then funding may have decreased significantly because, well, "we can't Shor yet".
Examples like PsiQuantum raising almost 1B from partially public sources, and then delaying their ambitious plans does not help anybody. If they fail, the general public and the policymakers would, again, confirm their suspicions that "QCs are a scam".
> After 3 years in the industry, I've yet to meet a single scientist, engineer, or organization who's truly interested or does active research in implementing Shor's algorithm.
This is because the current state of the art is still very far away from programmable quantum computers. There is lots of work ahead on improving the error rate of individual quantum logic gates, before we can even begin to dream about building computers from the gates.
There also seems to be a large gap between how little current or near-future quantum computers can do (= nothing useful), compared to what the general public thinks they will soon be able to do.
We have quantum computers that can run Shor's algorithm; we managed to factor the number 21 (= 3 × 7) with it in 2012, and I am not aware of any improvement to this day.
D-Wave machines have been used to factor larger integers, but not using Shor's algorithm, which they can't run: a D-Wave machine is not a general-purpose quantum computer but a specialized "quantum annealer".
All of these attempts have involved some kind of trickery, and even the largest numbers claimed to have been factored are all ridiculously small compared to what we can do with classical computers. This is why these attempts are not taken very seriously: the preprocessing steps done with classical computers take much more time than what it would take for that classical computer to find the solution.
They used the fact that 21 = 7 x 3 in order to reduce gate-count.
The full-Shor has never been run on a hardware device IIRC.
(Quite funny when you see everyone and their mom hyping QC.)
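For reference, the skeleton of Shor's algorithm: factoring N reduces to finding the multiplicative order r of some a mod N, and only that order-finding step needs a quantum computer (via the quantum Fourier transform); the factors then fall out from a gcd. A sketch with the order found by classical brute force instead (illustrative only — order finding is exactly the part that doesn't scale classically):

```python
from math import gcd

def order(a, n):
    """Smallest r > 0 with a**r == 1 (mod n) -- the step Shor's
    algorithm delegates to the quantum Fourier transform."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    """Classical pre/post-processing of Shor's algorithm for a coprime to n."""
    r = order(a, n)
    if r % 2:
        return None          # bad luck, retry with another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None          # also bad luck
    return gcd(y - 1, n), gcd(y + 1, n)

print(shor_classical(21, 2))  # (7, 3)
```

For 21 with a = 2 the order is 6, so 2**3 = 8 and gcd(7, 21), gcd(9, 21) yield the factors — this is what the hardware demos implement with the trickery mentioned above.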
D-Wave makes Ising machines. They will never run a QFT or Shor's algorithm no matter how big the computer gets.
It's an analog computer for a specific kind of global optimization problem, not a "quantum computer."
Not on a D-Wave processor; it's not a general-purpose quantum computer.
When we start to see bitcoin wallets around the world empty, we'll know.
The value of something is contingent upon someone being willing to pay for it.
If bitcoin gets broken so obviously by a quantum computer, who would buy the coin?
Seems like the least profitable way for someone to announce their new creation. (Unless destroying bitcoin is the whole point, I suppose)
You short it (or better, short or buy puts on correlated stocks like COIN and MSTR). Then empty exchange hot wallets; people will assume they got hacked, par for the course.
If your tech can destroy Bitcoin, then it's already over. It's only a matter of time. May as well capitalize or use it as a marketing stunt.
Appear weak when you are strong.
This. TLAs have probably had this for years and want to keep the illusion up as long as possible, while also projecting uncertainty to their rivals that maybe they have it. Maybe their communications aren't safe....maybe?
From someone actually working on developing this type of hardware: no, there is no entity in the world that is anywhere close to building a device that can run Shor's algorithm. We will have standardised post-quantum algorithms long before there is a machine to run Shor's algorithm.
NSA is always 20 years ahead of anybody else, so ....
I really don’t believe that.
How would we know
>D-Wave
I never understood the value of their product(s). Is someone right now using their computers? How do they make money to keep the company afloat? It feels like whenever I read an article about D-Wave, the details are obfuscated, but that's probably just me being unknowledgeable when it comes to quantum computing.
AFAIK they're still investor-funded and don't make any (noticeable amount of) money yet
They are making sales, like to Los Alamos National Laboratory, but it looks like it's mostly being used for testing if it's valuable.
So what's stopping them from running these chips side by side to add more qubits?
You could add "marketing qubits".
But it won't work for what a large number of qubits is useful for. The point of having a quantum computer is that qubits can be in a superposition, both 0 and 1 at the same time. The consequence is that, for some algorithms, the power of the machine scales exponentially with the number of qubits, while it scales only linearly with the number of classical bits.
But in order to get that exponential speedup, the qubits have to stay coherent, which means you have to combine them to execute the operations with zero disturbance from anywhere else. That is really hard to achieve: a single stray photon can completely destroy your calculation, and the more qubits, the harder it gets. If you don't care about coherence, your qubits become no better than classical bits, and having a quantum computer becomes useless.
A way to think of it is that a 30 qubit chip is a million times better than a 10 qubit chip, but three 10 qubit chips is just 3 times better than a single 10 qubit chip.
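The arithmetic behind those numbers, for the curious: n coherent qubits give a 2**n-dimensional state space, while independent chips only add their dimensions:

```python
coherent_30 = 2 ** 30        # one chip, 30 mutually entangled qubits
coherent_10 = 2 ** 10        # one chip, 10 mutually entangled qubits
three_chips = 3 * 2 ** 10    # three independent 10-qubit chips

print(coherent_30 // coherent_10)  # 1048576: "a million times better"
print(three_chips // coherent_10)  # 3: "just 3 times better"
```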
Or a single stray neutrino? It seems these quantum computers can be used as very sensitive antennas.
A qubit is only a qubit if it's entangled with other qubits. You have to keep the whole state in superposition.
The qubits must have a shared state. I don't think you can entangle separate machines.
How many error-corrected qubits?
Is it basically an array of coupled oscillators with a nuance that the oscillators are quantum and reading or changing their state is nontrivial?
[flagged]
It can make Doom a reality. Iirc, the plot was that UAC used a quantum detector of some sort to find suspicious background noise on Mars. Further research discovered the source of that noise.
Yes, and it can explore the entire map all at once.
Epic comment, fellow Redditor.
Reddit is too new for that type of comment. You've got to go back to slashdot.
What's slashdot
It's like fark but more tech posts
More hot grits, anyway
And, back in the day, Natalie Portman references
wild, that site is older than i am. by a few weeks anyways
Wow, 4,400+ Qubits is impressive, I wonder how they made it stable. Last time I read about the record it was IBM with around 1000 qubits.
D-Wave don't make general quantum computers. They use what is called quantum annealing. It's not comparable to IBM's, and you can't run general quantum algorithms on it.
I'll bite: what quantum things can they run on it? Or, what is its utility as a quantum computing device?
Quantum annealers (like D-Wave's) can solve optimization problems efficiently. But they can't run general quantum algorithms such as Shor's algorithm; that requires a general quantum computer (like what IBM and Google are building).
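For the curious, the native problem an annealer minimizes is an Ising energy: choose spins s_i in {-1, +1} to minimize the sum of h_i*s_i plus J_ij*s_i*s_j. A brute-force toy solver (purely illustrative; not D-Wave's API, which samples this heuristically at scale):

```python
from itertools import product

def ising_ground_state(h, J):
    """Brute-force the minimum-energy spin assignment of an Ising model.
    h: {site: local field}, J: {(site_a, site_b): coupling strength}."""
    sites = sorted(h)
    best = None
    for spins in product((-1, 1), repeat=len(sites)):
        s = dict(zip(sites, spins))
        energy = sum(h[i] * s[i] for i in sites)
        energy += sum(Jij * s[a] * s[b] for (a, b), Jij in J.items())
        if best is None or energy < best[0]:
            best = (energy, s)
    return best

# Antiferromagnetic triangle: no assignment satisfies all three couplings
# (frustration), so the ground state necessarily breaks one bond.
energy, spins = ising_ground_state({0: 0, 1: 0, 2: 0},
                                   {(0, 1): 1, (1, 2): 1, (0, 2): 1})
print(energy)  # -1
```

Brute force is exponential in the number of spins, which is exactly why people hoped annealing hardware would help; the open question is whether it beats good classical heuristics.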
> can solve optimization problems efficiently
can possibly solve some (but without any guarantees on which or if any actually exist)
In fairness, neither can the ones IBM-GOOG are building.
True, but only because we haven't got enough (error corrected) qubits. In principle they could, whereas the dwave device will never be able to.
>optimization problems efficiently
did it reach the point where it can do any useful work faster than a standard computer? Last time I read about it, it was something stupid like "it can simulate its own noise"
Thanks!
It's the quantum version of simulated annealing.
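For anyone unfamiliar with the classical version: simulated annealing flips random bits, always accepts downhill moves, accepts uphill moves with probability exp(-dE/T), and lowers T over time; quantum annealing replaces the thermal hops with quantum tunneling. A minimal sketch (illustrative only; the objective and parameters here are made up for the demo):

```python
import math
import random

def simulated_annealing(energy, n_bits, steps=5000, t0=2.0):
    """Minimize `energy` over bit strings by thermal random search."""
    random.seed(1)
    state = [random.randint(0, 1) for _ in range(n_bits)]
    e = energy(state)
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-9   # linear cooling schedule
        i = random.randrange(n_bits)
        state[i] ^= 1                        # propose a single bit flip
        e_new = energy(state)
        if e_new <= e or random.random() < math.exp((e - e_new) / t):
            e = e_new                        # accept the move
        else:
            state[i] ^= 1                    # reject: undo the flip
    for i in range(n_bits):                  # final greedy polish
        state[i] ^= 1
        e_new = energy(state)
        if e_new < e:
            e = e_new
        else:
            state[i] ^= 1
    return e, state

# Toy objective: count bits disagreeing with an alternating target pattern.
target = [i % 2 for i in range(16)]
e, best = simulated_annealing(
    lambda s: sum(a != b for a, b in zip(s, target)), 16)
print(e)  # 0: the optimum was found
```

This toy landscape has no local minima, so even plain hill climbing would solve it; the temperature schedule only matters on rugged landscapes, which is where the quantum-tunneling variant is hoped to help.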
Thanks for the clarification, that might also explain the score of my comment. I'll read up on quantum annealing.
Imagine a Beowulf cluster of those.
This looks like a final goodbye to good old RSA