The Case Against Quantum Computing

Thread Starter

nsaspook

Joined Aug 27, 2009
13,265
Sorry "DUDE" but you dont have the reasoning ability to ever understand this, and i dont even think you are trying. So quit trolling. Just so you know, i barely even read your long and drawn out posts because you just keep repeating yourself.
Say what you will im done with this topic for now. I have better things to do then argue with you.
Also, i suggest you work on some 50 or 60Hz systems where the frequency is, what? It is constant? Gee what a surprise :)
I can't see one instance of trolling on his part. A large river in Africa is what I see here.
 

Thread Starter

nsaspook

Joined Aug 27, 2009
13,265
So you don't think there is such a thing as a system with only one frequency then, either?
Your point is valid but limited in scope rather than a general rule, much like the specialized problem used to 'prove' Quantum Supremacy in this thread.
There are systems with only one voltage too, where we can use current (like the current rating of a line utility breaker) as a proxy for power. This fact doesn't make Ohm's law obsolete or empower electricians to do electrical engineering.
 
Last edited:

bogosort

Joined Sep 24, 2011
696
You just know that someone is looking to find a loophole in their proof.
https://gilkalai.wordpress.com/2019...t-probably-false-supremacy-claims-google/amp/
I'm sure lots of people are, including those who believe in quantum supremacy. Which is as it should be; otherwise it's not science.

Gil Kalai's main point of skepticism, if I'm understanding him correctly, is that quantum errors scale quadratically in n for an n-qubit computer. In other words, we'll never be able to build in enough fault tolerance to make a working quantum computer. His intuition is that a computationally simple device cannot produce a computationally superior result, which is essentially a restatement of the extended Church-Turing thesis. To his credit, he publicly stated that he'd be satisfied with Google's claim of quantum supremacy if they provided their dataset and a couple of reasonable supplementary results.
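To make the scaling intuition concrete, here's a toy numerical sketch (my own illustration, not Kalai's actual math): if each gate fails independently with some small probability, whole-circuit fidelity decays geometrically with gate count. The depth and error rate below are invented, roughly Sycamore-scale values.

```python
def circuit_fidelity(n_qubits, depth, gate_error):
    """Crude estimate: every one of the ~n*depth gates must succeed
    independently, so fidelity decays geometrically with gate count."""
    return (1.0 - gate_error) ** (n_qubits * depth)

# Illustrative numbers only (assumed, not taken from Kalai or Google):
for n in (10, 30, 53):
    f = circuit_fidelity(n, depth=20, gate_error=0.005)
    print(f"n = {n:2d} qubits: estimated circuit fidelity ~ {f:.4f}")
```

If the depth also grows with n, the gate count grows roughly like n^2, so the decay exponent is quadratic in n -- which is where, on this reading, the fault-tolerance objection bites.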
 

bogosort

Joined Sep 24, 2011
696
Sorry "DUDE" but you dont have the reasoning ability to ever understand this, and i dont even think you are trying. So quit trolling. Just so you know, i barely even read your long and drawn out posts because you just keep repeating yourself.
My "long and drawn out" posts are my sincere attempt to respond to your arguments. If you can't be bothered to even read them, then I'm clearly wasting my time.

Also, I suggest you work on some 50 or 60 Hz systems where the frequency is, what? It is constant? Gee, what a surprise :)
Sigh. You act like I'm an idiot who hasn't heard of 60 Hz systems, but you're the one who seems to believe that the signal coming from the power company is (120)(√2) sin(120πt). Ignoring the crucial fact that pure sinusoids are physically impossible signals, do you not understand that 60 Hz is the nominal frequency, that the only reason we use ω = 120π is because it simplifies our calculations, and that the actual signal contains a time-varying ω and harmonics of all orders?

To sum it all up: you believe that, as a universal fact, the frequency domain is computationally more powerful than the time domain, and your argument is based on the very particular case of an idealized pure sinusoid being fed to an RC filter. When I make the effort to show you how the general case proves that both domains are computationally equivalent, your counterargument is that I don't have the reasoning ability to understand you and/or I must not have worked with 60 Hz power supplies before. And you call me a troll...
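For anyone following along, here's a quick numerical sketch of that point (every magnitude below is invented, purely for illustration): a mains-like signal whose frequency wanders around the 60 Hz nominal and carries small odd harmonics, plus an FFT round trip showing that the time and frequency domains carry exactly the same information.

```python
import numpy as np

fs = 10_000                      # sample rate, Hz
t = np.arange(0, 1.0, 1 / fs)    # one second of "mains"

# 120 V RMS fundamental whose frequency wanders slightly around the
# 60 Hz nominal, plus small 3rd and 5th harmonics (made-up amounts).
f_inst = 60 + 0.05 * np.sin(2 * np.pi * 0.2 * t)   # slow frequency wander
phase = 2 * np.pi * np.cumsum(f_inst) / fs         # integrate f to get phase
v = 120 * np.sqrt(2) * np.sin(phase)
v += 3.0 * np.sin(3 * phase) + 1.5 * np.sin(5 * phase)

# Round trip: FFT then inverse FFT reproduces the waveform to numerical
# precision, i.e. neither domain holds information the other lacks.
V = np.fft.rfft(v)
v_back = np.fft.irfft(V, n=len(v))
print("max round-trip error:", np.max(np.abs(v - v_back)))  # ~1e-12 V
```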
 

Ya’akov

Joined Jan 27, 2019
9,150
Landauer's biggest contribution, not yet fully acknowledged, was his "Information is Physical" paper, which completely rejiggered our understanding of what information actually is. Because of institutional orthodoxy, funding for experiments was impossible to come by, and it was only recently, and only in a Japanese journal, that a simple experiment proving him right was done at my own institution by an *undergrad* researcher in a table-top box, showing that the cost of a bit is a *classical* phenomenon and can't be hand-waved away as something that exists only at the quantum level.

I would commend this paper to *anyone* interested in how this world is composed; it's written very accessibly.

https://yaakov.me/landauer.pdf
 

MrAl

Joined Jun 17, 2014
11,474
Your point is valid but limited in scope rather than a general rule, much like the specialized problem used to 'prove' Quantum Supremacy in this thread.
There are systems with only one voltage too, where we can use current (like the current rating of a line utility breaker) as a proxy for power. This fact doesn't make Ohm's law obsolete or empower electricians to do electrical engineering.
Hi,

Yeah, sorry man, but you just can't get the point. Since you continue to misunderstand, let me give you an example...

"I am not trying to disprove Ohm's Law"

See?
 

MrAl

Joined Jun 17, 2014
11,474
Landauer's biggest contribution, not yet fully acknowledged, was his "Information is Physical" paper, which completely rejiggered our understanding of what information actually is. Because of institutional orthodoxy, funding for experiments was impossible to come by, and it was only recently, and only in a Japanese journal, that a simple experiment proving him right was done at my own institution by an *undergrad* researcher in a table-top box, showing that the cost of a bit is a *classical* phenomenon and can't be hand-waved away as something that exists only at the quantum level.

I would commend this paper to *anyone* interested in how this world is composed; it's written very accessibly.

https://yaakov.me/landauer.pdf

Hi,

The most striking example is entanglement, where there is a distance between the entangled entities. Classical physics cannot explain how the two communicate over vast distances. Seems impossible, but then again electricity seemed impossible before it was discovered.
Little particles moving through a metal? You've got to be kidding? Nothing can move through a metal; it's solid :)
Invisible waves traveling through the air? No way :)
 

Ya’akov

Joined Jan 27, 2019
9,150
Hi,

The most striking example is entanglement, where there is a distance between the entangled entities. Classical physics cannot explain how the two communicate over vast distances. Seems impossible, but then again electricity seemed impossible before it was discovered.
Little particles moving through a metal? You've got to be kidding? Nothing can move through a metal; it's solid :)
Invisible waves traveling through the air? No way :)
I'd encourage you to read the linked paper; it's very interesting and paradigm-breaking. It posits that information doesn't exist unless it has a physical representation, and that there is a definite cost for deleting a bit of information in ordinary computing, a cost that cannot be circumvented.

Fascinating stuff, and thought-provoking. It suggests the stuff of the world and information are not different things, which leads to all sorts of possibilities.
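The cost in question is Landauer's bound, E ≥ k_B·T·ln 2 per erased bit. A back-of-the-envelope computation (room temperature assumed) shows how tiny, yet stubbornly nonzero, it is:

```python
import math

k_B = 1.380649e-23    # Boltzmann constant, J/K (exact SI value)
T = 300.0             # assumed room temperature, K

E_bit = k_B * T * math.log(2)   # Landauer limit per erased bit
print(f"minimum erasure cost: {E_bit:.3e} J/bit")  # ~2.87e-21 J

# For scale: erasing one gigabyte (8e9 bits) right at the limit
print(f"erasing 1 GB: {E_bit * 8e9:.3e} J")        # ~2.3e-11 J
```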
 

Thread Starter

nsaspook

Joined Aug 27, 2009
13,265
Hi,

Yeah, sorry man, but you just can't get the point. Since you continue to misunderstand, let me give you an example...

"I am not trying to disprove Ohm's Law"

See?
It doesn't change the fact that you are wrong about the fundamental concept, not misunderstood.
 

Thread Starter

nsaspook

Joined Aug 27, 2009
13,265
Landauer's biggest contribution, not yet fully acknowledged, was his "Information is Physical" paper, which completely rejiggered our understanding of what information actually is. Because of institutional orthodoxy, funding for experiments was impossible to come by, and it was only recently, and only in a Japanese journal, that a simple experiment proving him right was done at my own institution by an *undergrad* researcher in a table-top box, showing that the cost of a bit is a *classical* phenomenon and can't be hand-waved away as something that exists only at the quantum level.

I would commend this paper to *anyone* interested in how this world is composed; it's written very accessibly.

https://yaakov.me/landauer.pdf
That information is physical is not a new concept. If you took cryptography 101, then Claude Shannon is a familiar name.
There is no free lunch, as QM obeys thermodynamics. That doesn't mean it can't be mechanically more efficient at some thermodynamically limited process.

Entropy in thermodynamics and information theory
https://en.wikipedia.org/wiki/Entropy_in_thermodynamics_and_information_theory
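The mathematical link the article describes is that Shannon's H = -Σ p·log₂ p and the Gibbs entropy S = -k_B·Σ p·ln p are the same functional form up to a constant factor. A minimal sketch of the shared quantity:

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))     # fair coin: 1.0 bit
print(shannon_entropy([0.99, 0.01]))   # heavily biased coin: ~0.081 bits
```

Multiply by k_B·ln 2 and the same number reads as thermodynamic entropy, which is the bridge the Wikipedia article (and Landauer) is about.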


 

Ya’akov

Joined Jan 27, 2019
9,150
That information is physical is not a new concept. If you took cryptography 101, then Claude Shannon is a familiar name.
There is no free lunch, as QM obeys thermodynamics. That doesn't mean it can't be mechanically more efficient at some thermodynamically limited process.

Entropy in thermodynamics and information theory
https://en.wikipedia.org/wiki/Entropy_in_thermodynamics_and_information_theory

Shannon did not address the concepts in Landauer's paper.

It is orthogonal to Shannon.

The fact that the language and math of thermodynamics work for Information Theory wasn't taken to mean that information is physical. In fact, I argued with a couple of prominent IT theorists on this until I found Landauer's paper, and they conceded there was something there. Another of our faculty, *not* an IT guy, was already working on it, unknown to me.

There seems to be a deep reason for entropy existing in both places, and for Hamiltonian phase space applying to both particle and information systems.
 

Ya’akov

Joined Jan 27, 2019
9,150
Shannon did not address the concepts in Landauer's paper.

It is orthogonal to Shannon.

The fact that the language and math of thermodynamics work for Information Theory wasn't taken to mean that information is physical. In fact, I argued with a couple of prominent IT theorists on this until I found Landauer's paper, and they conceded there was something there. Another of our faculty, *not* an IT guy, was already working on it, unknown to me.

There seems to be a deep reason for entropy existing in both places, and for Hamiltonian phase space applying to both particle and information systems.
To put a fine point on it, Shannon channels are CLEARLY physical, but *information* was regarded as abstract until Landauer suggested it is not and always requires a representation to exist.
 

MrAl

Joined Jun 17, 2014
11,474
It doesn't change the fact that you are wrong about the fundamental concept, not misunderstood.
Actually, you're wrong. See how easy it is to say someone is wrong here?
But you really are wrong this time.
If you think I am, then prove it or shut up.
 

MrAl

Joined Jun 17, 2014
11,474
I'd encourage you to read the linked paper; it's very interesting and paradigm-breaking. It posits that information doesn't exist unless it has a physical representation, and that there is a definite cost for deleting a bit of information in ordinary computing, a cost that cannot be circumvented.

Fascinating stuff, and thought-provoking. It suggests the stuff of the world and information are not different things, which leads to all sorts of possibilities.
Hi,

Oh yes, I've looked into stuff like that, for example the idea that the universe is a 3D projection from some massive computer.
 

Thread Starter

nsaspook

Joined Aug 27, 2009
13,265
To put a fine point on it, Shannon channels are CLEARLY physical, but *information* was regarded as abstract until Landauer suggested it is not and always requires a representation to exist.
I believe that Shannon's paper was about information, as channels are simply coding conduits for information.
http://philsci-archive.pitt.edu/10911/1/What_is_Shannon_Information.pdf
It is quite clear that H(S) and H(D) are average amounts of information. Nevertheless, in the literature they are usually termed 'entropies', a strategy that could be explained by the fact that it is a name shorter than 'average amount of information'. However, according to a traditional story, the term 'entropy' was suggested by John von Neumann to Shannon in the following terms: "You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name. In the second place, and more importantly, no one knows what entropy really is, so in a debate you will always have the advantage." (Tribus and McIrvine 1971, p. 180). In Italian it is usually said: "se non è vero, è ben trovato", that is, "if it is not true, it is a good story". In fact, even at present there are still many controversies about the content of the concept of entropy, whose deep implications can be easily compared to those resulting from the debates about the meaning of the term 'information'.
Information was regarded as abstract by mathematicians in formal papers, but not by the engineers who actually build the devices. ;) The only thing that might have a zero cost is losing (erasing) information; information transformation of any sort has an energy cost, because there must be a physical implementation even in the abstract case of a person's thoughts.

Claims.
https://www.nature.com/articles/ncomms12068
Counter-claims.
https://www.researchgate.net/public...cro-Electromechanical_Irreversible_Logic_Gate
 
Last edited:

bogosort

Joined Sep 24, 2011
696
To put a fine point on it, Shannon channels are CLEARLY physical, but *information* was regarded as abstract until Landauer suggested it is not and always requires a representation to exist.
The question of whether something is or is not physical is not as simple as it might seem. The most straightforward interpretation is that a physical thing exists "out there", independent of any human (or other) experience. But if we take this view seriously, then we must abandon the notion that physics is about physical things. Physics is necessarily about our experiences of a thing and says nothing of the thing-in-itself. Thus, physicality implies at least one level of abstraction.

Is energy physical? In the sense that we use it to quantify physical processes, surely it is. We say that energy is (locally) conserved, but it is not a substance or a thing; it is a concept, another level of abstraction. Indeed, we can see a hierarchy of physical abstractions in play when we consider frame invariance. It seems meaningful and significant that the values of some physical properties, like electric charge or the temperature of an object at thermal equilibrium, are invariant, while the values of others, like energy, depend on the frame of reference of the scientist making the measurement.

A related notion is thermodynamic entropy, which relates the number of microscopic states that can be associated with some particular macroscopic state of a system, an idea that sounds perfectly and uncontroversially physical. Yet even this simple definition admits a classic paradox: imagine a sealed box with a removable partition separating the two halves of the box. The two halves each contain a different type of gas at the same temperature, in amounts that result in the same pressure. If we remove the partition and allow the two gases to diffuse, there will be no net change in temperature, pressure, or total energy; however, there will be a change in entropy. In fact, the entropy will increase by a constant amount, completely independent of the nature of the two gases. The surprising fact is that the only thing that matters is that the gases are of different types -- if we perform the same experiment with the two halves containing the same type of gas, lifting the partition will not change the entropy.
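For reference, the constant amount in question is the classic entropy of mixing; sketching the textbook result from memory (not something derived in this thread), for N molecules of each gas initially confined to equal volumes V:

```latex
% Mixing two *different* ideal gases, N molecules of each: every
% molecule's accessible volume doubles, contributing k_B \ln 2.
\Delta S_{\text{mix}} = 2 N k_B \ln 2
% For two samples of the *same* gas, nothing macroscopic changes:
\Delta S_{\text{mix}} = 0
```

In Shannon's units, k_B ln 2 of entropy is exactly one bit per molecule, which is the information-theoretic reading of the whole paradox.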

Why should the number of microstates associated with a particular macrostate depend so specifically on the composition of the constituent parts? It's a hard question that becomes easy if we think about the reversibility of the process. In the case of the same type of gas, restoring the original thermodynamic state is as simple as dropping the partition: there is no macroscopic distinction between the before and after picture as every gas molecule is alike and equally distributed. However, in the case of two different gases, restoring the original partitioned macrostate would require something like Maxwell's demon, i.e., it would require us to do work on each molecule. Which brings us back to physics.

It seems subtle at first, but there is a very real sense in which information (a measure of the number of distinguishable states) is the key player in the physics just described. Whether the entropy of the box increases or not -- which is to say, whether or not work can be extracted -- depends on the distinguishability of the molecules. It has nothing to do with the atomic or molecular properties of the gases involved; in fact, if we were to repeat the experiment with a single type of gas but could somehow color the molecules on the left red and those on the right green, then lifting the partition would increase entropy. We know that work could be done because, to re-partition the mixture, Maxwell's demon would simply sort by color.

As I see it, information is no more or less 'physical' than energy or entropy, though I do feel that there is a hierarchy of abstraction. At the bottom, invariants like charge are the most directly physical; at the top are concepts like force and torque, which are far removed from what's happening at a fundamental level. I'd put information and energy somewhere in between.

A relevant question: under suitable restrictions, is information a physical invariant?
 

Thread Starter

nsaspook

Joined Aug 27, 2009
13,265
https://www.nature.com/articles/s41586-019-1666-5
Abstract
The promise of quantum computers is that certain computational tasks might be executed exponentially faster on a quantum processor than on a classical processor [1]. A fundamental challenge is to build a high-fidelity processor capable of running quantum algorithms in an exponentially large computational space. Here we report the use of a processor with programmable superconducting qubits [2-7] to create quantum states on 53 qubits, corresponding to a computational state-space of dimension 2^53 (about 10^16). Measurements from repeated experiments sample the resulting probability distribution, which we verify using classical simulations. Our Sycamore processor takes about 200 seconds to sample one instance of a quantum circuit a million times -- our benchmarks currently indicate that the equivalent task for a state-of-the-art classical supercomputer would take approximately 10,000 years. This dramatic increase in speed compared to all known classical algorithms is an experimental realization of quantum supremacy [8-14] for this specific computational task, heralding a much-anticipated computing paradigm.

https://www.ibm.com/blogs/research/2019/10/on-quantum-supremacy/
Recent advances in quantum computing have resulted in two 53-qubit processors: one from our group in IBM and a device described by Google in a paper published in the journal Nature. In the paper, it is argued that their device reached “quantum supremacy” and that “a state-of-the-art supercomputer would require approximately 10,000 years to perform the equivalent task.” We argue that an ideal simulation of the same task can be performed on a classical system in 2.5 days and with far greater fidelity. This is in fact a conservative, worst-case estimate, and we expect that with additional refinements the classical cost of the simulation can be further reduced.
...
For the reasons stated above, and since we already have ample evidence that the term “quantum supremacy” is being broadly misinterpreted and causing ever growing amounts of confusion, we urge the community to treat claims that, for the first time, a quantum computer did something that a classical computer cannot with a large dose of skepticism due to the complicated nature of benchmarking an appropriate metric.
 
