
[Serious] Quantum v. Classical computing.
#1
Quantum v. Classical computing.
With all the hubbub regarding quantum computing, as an IT worker with 30 years in the field, I am skeptical of quantum computers ever surpassing classical computers in anything. For instance, IBM around ten years ago got around to factoring the number 15 with a quantum computer.

A Google search just now turned up 1,099,551,473,989 as the largest number factored by a quantum computer so far. I put that into the Factor Calculator app from the Google Play store on my Samsung A71 phone. I was not sure about the computational time and worried it might take hours; I have to leave for work soon! The result came back instantly: 1,048,589 multiplied by 1,048,601.
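For a sense of how little work that classical check involves, here is a minimal trial-division sketch in Python (purely illustrative; I have no idea what algorithm the phone app actually uses):

import math

def trial_division(n):
    # Classical trial division: try every candidate divisor up to sqrt(n).
    # For a ~40-bit semiprime like 1,099,551,473,989 that is about a million
    # iterations, which is effectively instant on a phone or laptop.
    for p in range(2, math.isqrt(n) + 1):
        if n % p == 0:
            return p, n // p
    return n, 1  # n is prime

print(trial_division(1_099_551_473_989))  # -> (1048589, 1048601)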

Of course, I am not a specialist in this area, but, neither am I a futurist. As always, I am open to change, and, especially, in being proved wrong.
Reply
#2
RE: Quantum v. Classical computing.
I believe that quantum computers are superior at simulating quantum systems. They won't be significantly better at solving classical problems, like factoring.

As a physicist, the whole idea of quantum supremacy seems like magic to me. It assumes that the quantum noise in all parts of a system is uncorrelated from part to part, and it requires methods for reducing quantum noise. The exact nature of noise in quantum systems is NOT well understood.
Reply
#3
RE: Quantum v. Classical computing.
Another issue is error detection and correction. As a physicist, do you think that one is even solvable? Factoring large integers is something that can be checked, of course, but other types of simulations presuppose numerical accuracy and stability. If such is not possible with quantum computers, their use would seem rather limited.
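To make the "can be checked" point concrete: verifying a claimed factorization is a single multiplication, no matter how the factors were found. A trivial Python sketch, using the numbers from the first post:

n = 1_099_551_473_989
p, q = 1_048_589, 1_048_601
# Checking the answer is one multiplication, even though finding p and q
# is the hard part for large n.
assert 1 < p < n and 1 < q < n and p * q == n
print("factorization verified")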
Reply
#4
RE: Quantum v. Classical computing.
(March 18, 2022 at 11:27 am)Jehanne Wrote: Another issue is error detection and correction.  As a physicist, do you think that one is even solvable?  Factoring large integers is something that can be checked, of course, but other types of simulations presuppose numerical accuracy and stability.  If such is not possible with quantum computers, their use would seem rather limited.

There are ways to do error correction, but the most important thing is to prevent errors in the first place.  In theory, one can do noise reduction schemes on the quantum system that work mathematically.  I just don't think the math corresponds to reality.

The computational power of a quantum computer grows exponentially with the number of qubits. The problem is that the probability of an error due to noise also grows exponentially. Otherwise, we would have magic.
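As a back-of-the-envelope illustration of that scaling, assuming independent, uncorrelated errors and a made-up per-qubit error rate (a sketch in Python, not a model of any real device):

# Under an independent-noise assumption, the probability that no qubit
# errors during one step is (1 - p) ** n, which decays exponentially in n.
p = 0.01  # hypothetical per-qubit error probability per step
for n in (2, 10, 50, 100, 1000):
    print(n, round((1 - p) ** n, 5))
# At p = 1%, a 100-qubit step comes out clean only ~37% of the time,
# and a 1000-qubit step almost never (~0.004%).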

I forget the theoretical techniques for noise reduction and error correction, but I do know they rely on a decorrelated noise model. The problem is that perturbations within a quantum system have a time evolution, and I am convinced that the noise is not decorrelated over that time. If you want 100-qubit systems to give meaningful results, you will have to wait exponentially longer for the system to reach a noise-free state (and then there is the problem of thermodynamics, where it gets hit by a stray bit of energy from somewhere).
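To show why the decorrelated-noise assumption does so much work, here is a toy classical analogue in Python: a 3-bit repetition code decoded by majority vote. With independent bit flips the logical error rate drops well below the physical rate; with perfectly correlated (all-or-nothing) flips, the code buys you nothing. This is only a classical cartoon of the idea, not a real quantum code.

import random

def logical_error_rate(p, correlated, trials=100_000):
    # Encode one logical bit as three copies, flip bits with probability p,
    # decode by majority vote, and count how often the logical bit ends up wrong.
    errors = 0
    for _ in range(trials):
        if correlated:
            flips = [random.random() < p] * 3                 # all three flip together or not at all
        else:
            flips = [random.random() < p for _ in range(3)]   # each bit flips independently
        errors += sum(flips) >= 2                             # majority flipped => logical error
    return errors / trials

p = 0.05
print("independent noise:", logical_error_rate(p, correlated=False))  # ~3*p**2 ~= 0.007
print("correlated noise :", logical_error_rate(p, correlated=True))   # ~p = 0.05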

Quantum Computing has been very disappointing, and I have been convinced for 30 years that it will not amount to much, except in the areas of simulating quantum systems.  Because it is a quantum system, it can simulate quantum systems.
Reply
#5
RE: Quantum v. Classical computing.
The problem with quantum computers is noise, and hence errors and decoherence between qubits. However, quantum computing can work in a space where an optimal choice of fields can be made from an exponentially large set of fields, a bit like finding an optimal path along a complex light-wave array. To have all the workings of that wave set up so you can find your optimal path on a classical computer, you may have to simulate the whole thing, whereas real-world physics can do such things quickly without simulation.
When dealing with errors, quantum engineers may be able to overlap the solving so that many instances of clues are gathered multiple times, letting the overlapping solutions amalgamate into a single, more optimal, error-free solution.
With this approach to solving, the better your error correction is per solution thread, the more benefit you get.
These exponential fields scale directly with how many qubits you have, so 4 qubits can work with exponentially more fields than 2 qubits.
It's important to note that while IBM and Google do have 56-qubit and similar machines, some of the qubits go more towards error correction than towards actually handling the fields, so at the moment it's hard to get any kind of coherence at all.
Classical computers need a less centralised data flow and should make use of less accurate but more dynamic analogue calculations. Classical computing could also do with going optical.
Reply
#6
RE: Quantum v. Classical computing.
(May 18, 2022 at 3:04 pm)highdimensionman Wrote: The problem with quantum computers is noise, and hence errors and decoherence between qubits. However, quantum computing can work in a space where an optimal choice of fields can be made from an exponentially large set of fields, a bit like finding an optimal path along a complex light-wave array. To have all the workings of that wave set up so you can find your optimal path on a classical computer, you may have to simulate the whole thing, whereas real-world physics can do such things quickly without simulation.
When dealing with errors, quantum engineers may be able to overlap the solving so that many instances of clues are gathered multiple times, letting the overlapping solutions amalgamate into a single, more optimal, error-free solution.
With this approach to solving, the better your error correction is per solution thread, the more benefit you get.
These exponential fields scale directly with how many qubits you have, so 4 qubits can work with exponentially more fields than 2 qubits.
It's important to note that while IBM and Google do have 56-qubit and similar machines, some of the qubits go more towards error correction than towards actually handling the fields, so at the moment it's hard to get any kind of coherence at all.
Classical computers need a less centralised data flow and should make use of less accurate but more dynamic analogue calculations. Classical computing could also do with going optical.

In terms of how qubit processes actually compare to classical computer processes, there are two main possible qubit types in the computational complexity zoo.

The first way, which most groups are working on, uses arrays of qubits in a wave, whereby the wave becomes exponentially more able to handle certain complex problems the more qubits you have. Sometimes you get exponential speed-ups, sometimes lower-order speed-ups. It's quicker, shall we say, than a classical computer for some problems.
#P ("P sharp")-type problems are the sort this kind of computer can possibly tackle; it's not really that good at mimicking a classical computer.
Think highly optimised production efficiency and management of supply and demand, or understanding the full dynamics of something as complex as a caffeine molecule.

The other way is to use stacked qubits. The problems may only be exponential to a power of 4 stacks, but that's, say, 1000 (stacked qubits, so each unit represents a number between 1 and 1000 rather than just 0 or 1) raised to the 4 (the degree of complexity). Working this way you can accurately speed up many classical operations, but only polynomially, with no exponential speed-ups, and many of the other quantum-calculation benefits would be limited.
P-time problems and below are the sort this kind of computer may solve faster, and it could possibly be used to speed up much classical computation.
Think Doom 2040, where you steer your own nightmare on a holographic display with your mind, or an AI chatbot that functions correctly.
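For what the "exponentially more" and "stacked" talk cashes out to in plain numbers, here is a tiny Python sketch of state-space sizes (d = 2 for ordinary qubits, d = 1000 for the thousand-level "stacked" units described above; describing n such units fully on a classical machine takes d**n complex amplitudes):

def state_space(n, d=2):
    # Number of complex amplitudes needed to describe n d-level systems classically.
    return d ** n

print(state_space(2))           # 4 amplitudes for 2 qubits
print(state_space(4))           # 16 for 4 qubits -- the "exponentially more" jump
print(state_space(50))          # ~1.1e15, already past brute-force simulation
print(state_space(4, d=1000))   # 1000**4 = 1e12, the "1000^4" figure above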
Reply
#7
RE: Quantum v. Classical computing.
Question is, "Will a quantum computer beat a classical computer on anything?"
Reply
#8
RE: Quantum v. Classical computing.
(May 20, 2022 at 1:41 pm)Jehanne Wrote: Question is, "Will a quantum computer beat a classical computer on anything?"

Photonic quantum computing can already beat a classical computer at a few very specific tasks.
However, whether a quantum computer will ever do anything particularly useful a lot faster than a classical computer is still an open question.
Reply
#9
RE: Quantum v. Classical computing.
(May 20, 2022 at 4:38 pm)highdimensionman Wrote:
(May 20, 2022 at 1:41 pm)Jehanne Wrote: Question is, "Will a quantum computer beat a classical computer on anything?"

Photonic quantum computing can already beat a classical computer at a few very specific tasks.
However, whether a quantum computer will ever do anything particularly useful a lot faster than a classical computer is still an open question.

Can you provide a reference for photonic quantum computers outperforming a classical one? I can't find anything on this.
Reply
#10
RE: Quantum v. Classical computing.
(May 20, 2022 at 1:41 pm)Jehanne Wrote: Question is, "Will a quantum computer beat a classical computer on anything?"

Both kinds of computer could beat me at chess, but it evens out, since I’m pretty sure I could take them at kickboxing.

Boru
‘But it does me no injury for my neighbour to say there are twenty gods or no gods. It neither picks my pocket nor breaks my leg.’ - Thomas Jefferson
Reply


