The Case Against Quantum Computing

cmartinez

Joined Jan 17, 2007
8,182

The anyon is unique because it keeps a kind of record of where it has been. It was first theorized in the 1970s to exist only in two dimensions and to be a quasiparticle, a collective vibration that behaves as if it were a particle. Swapping anyons leaves a record of the number of swaps, which influences the way they vibrate, making them an attractive way to do quantum computing, but they had never before been found experimentally.
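The "record of swaps" idea can be sketched in a few lines. This is a toy illustration, not Quantinuum's experiment: for non-Abelian anyons, each exchange (braid) of two particles applies a unitary matrix to the shared quantum state, so the final state depends on the order of the swaps. The matrices below are arbitrary non-commuting unitaries, chosen only to show that order matters.

```python
def mat_vec(m, v):
    """Multiply a 2x2 complex matrix by a length-2 state vector."""
    return (m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1])

s = 2 ** -0.5
SWAP_12 = ((1, 0), (0, 1j))   # made-up unitary for braiding particles 1 and 2
SWAP_23 = ((s, s), (s, -s))   # made-up unitary for braiding particles 2 and 3

state = (1 + 0j, 0 + 0j)

a = mat_vec(SWAP_23, mat_vec(SWAP_12, state))  # swap 1-2 first, then 2-3
b = mat_vec(SWAP_12, mat_vec(SWAP_23, state))  # swap 2-3 first, then 1-2

print(a == b)  # False: the final state "remembers" the swap order
```

For Abelian anyons (or ordinary bosons/fermions) the two orders would differ at most by an overall phase; here the states differ outright, which is what makes braiding usable as a computation.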
 

Thread Starter

nsaspook

Joined Aug 27, 2009
12,862
At least there is some balance in the pop-sci article.
However, some researchers claim that Quantinuum has not actually created non-Abelian anyons. They argue that the firm has instead merely simulated them.

“I know they’re very excited about their work and they should be excited, but it is still a simulation,” Jiannis Pachos at the University of Leeds, UK, told New Scientist.

But Dryer claims that the quasiparticle nature of anyons means that a simulation is identical to the real thing.
"a simulation is identical to the real thing" We all know that's not true. ;)
 

cmartinez

Joined Jan 17, 2007
8,182
Same article, but available to non-subscribers:


... as the number of particles in the puzzle increased, methods such as approximation were needed to compute the solution, and the results of both machines agreed. Finally, though, the calculations became so complex that the supercomputer could no longer handle them.

The Eagle quantum computer, though, continued to churn out numbers. Although the team had no independent means to test whether the results were accurate, they agreed with established calculations.

Even though this was the first time a quantum computer with more than 100 qubits had been demonstrated to work correctly, the research team is nowhere near claiming quantum supremacy, the stage at which quantum computers achieve results that are impossible for supercomputers to match.
 

cmartinez

Joined Jan 17, 2007
8,182

Martinis says that the results validate IBM’s short-term strategy, which aims to provide useful computing by mitigating, as opposed to correcting, errors. Over the longer term, IBM and most other companies hope to shift towards quantum error correction, a technique that will require large numbers of additional qubits for each data qubit. (Google’s strategy has focused on refining quantum error-correction techniques.)
 

Thread Starter

nsaspook

Joined Aug 27, 2009
12,862
At least their prediction falls within a two-year span ... and not a fifteen-year one like the fusion tech ... :rolleyes:

Almost every time someone says that, a clever person finds a way (using a classical form of the quantum algorithm) to make the classical computer equal to the job so far.
https://www.science.org/content/art...-can-beat-google-s-quantum-computer-after-all
If the quantum computing era dawned 3 years ago, its rising sun may have ducked behind a cloud. In 2019, Google researchers claimed they had passed a milestone known as quantum supremacy when their quantum computer Sycamore performed in 200 seconds an abstruse calculation they said would tie up a supercomputer for 10,000 years. Now, scientists in China have done the computation in a few hours with ordinary processors. A supercomputer, they say, could beat Sycamore outright.

“I think they’re right that if they had access to a big enough supercomputer, they could have simulated the … task in a matter of seconds,” says Scott Aaronson, a computer scientist at the University of Texas, Austin. The advance takes a bit of the shine off Google’s claim, says Greg Kuperberg, a mathematician at the University of California, Davis. “Getting to 300 feet from the summit is less exciting than getting to the summit.”

Still, the promise of quantum computing remains undimmed, Kuperberg and others say. And Sergio Boixo, principal scientist for Google Quantum AI, said in an email the Google team knew its edge might not hold for very long. “In our 2019 paper, we said that classical algorithms would improve,” he said. But, “we don’t think this classical approach can keep up with quantum circuits in 2022 and beyond.”
 

cmartinez

Joined Jan 17, 2007
8,182

MrAl

Joined Jun 17, 2014
11,281
You should've started your sentence with "So far, almost every time ... "

I truly believe that eventually they'll succeed, and soon. Then the sky will be the limit for serious research and its applications.
...and God knows what else is to follow.

Ok so somebody start the stopwatch...
 
Last edited:

Thread Starter

nsaspook

Joined Aug 27, 2009
12,862
https://phys.org/news/2023-11-limits-quantum-clocks-impossible.amp
Limits for quantum computers: Perfect clocks are impossible, research finds
The research team was able to show that since no clock has an infinite amount of energy available (or generates an infinite amount of entropy), it can never have perfect resolution and perfect precision at the same time. This sets fundamental limits to the possibilities of quantum computers.

Quantum calculation steps are like rotations
In our classical world, perfect arithmetic operations are not a problem. For example, you can use an abacus, in which wooden beads are threaded onto a rod and pushed back and forth. The beads have clear states: each one sits in a very specific place, and if you do nothing, a bead stays exactly where it was.

And whether you move a bead quickly or slowly does not affect the result. But in quantum physics it is more complicated.

"Mathematically speaking, changing a quantum state in a quantum computer corresponds to a rotation in higher dimensions," says Jake Xuereb from the Atomic Institute at the Vienna University of Technology, a member of Marcus Huber's team and first author of the first paper, published in Physical Review Letters. "In order to achieve the desired state in the end, the rotation must be applied for a very specific period of time. Otherwise, you rotate the state either too little or too far."
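The timed-rotation picture in the quote can be put into numbers. A minimal sketch, with illustrative figures of my own (not from the paper): flipping a qubit from |0⟩ to |1⟩ is a rotation by π, driven for a precise duration, and if the clock timing the pulse is off by some fraction, the rotation under- or over-shoots and some population is left behind.

```python
import math

def flip_error(timing_error):
    """Probability of NOT reaching |1> after a pi rotation whose
    duration is off by the given fraction (e.g. 0.01 = 1% too long)."""
    theta = math.pi * (1 + timing_error)  # intended rotation angle is pi
    return math.cos(theta / 2) ** 2       # population remaining in |0>

print(flip_error(0.0))   # perfect timing: essentially zero error
print(flip_error(0.01))  # 1% timing error: small but nonzero error
```

The error grows roughly with the square of the timing error for small deviations, which is why imperfect clock precision sets a floor on how accurate each gate can be.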
 

Thread Starter

nsaspook

Joined Aug 27, 2009
12,862
https://spectrum.ieee.org/quantum-computing-skeptics
Quantum Computing’s Hard, Cold Reality Check
Hype is everywhere, skeptics say, and practical applications are still far away
Troyer and his colleagues compared a single Nvidia A100 GPU against a hypothetical future fault-tolerant quantum computer with 10,000 “logical qubits” and gate times much faster than today’s devices. Troyer says they found that a quantum algorithm with a quadratic speedup would have to run for centuries, or even millennia, before it could outperform a classical one on problems big enough to be useful.
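The quadratic-speedup argument is easy to sketch with back-of-envelope arithmetic. The numbers below (classical throughput, logical-gate time) are my own illustrative guesses, not figures from Troyer's study; the point is that the break-even runtime grows with the square of the gate-speed gap.

```python
import math

# A quadratic speedup means the quantum machine does sqrt(N) steps where
# the classical one does N, but each quantum step is far slower once
# error-correction overhead is included. Assumed (illustrative) numbers:
classical_op = 1e-13  # assumed: GPU-class throughput, ~1e13 ops/second
quantum_op = 1e-4     # assumed: ~100 us per error-corrected logical op

# Quantum wins when sqrt(N) * quantum_op < N * classical_op,
# i.e. N > (quantum_op / classical_op) ** 2.
crossover_n = (quantum_op / classical_op) ** 2
runtime_s = math.sqrt(crossover_n) * quantum_op  # quantum time at break-even

print(f"crossover problem size: {crossover_n:.0e} classical steps")
print(f"quantum runtime at crossover: {runtime_s:.0f} s "
      f"({runtime_s / 86400:.1f} days)")
```

The break-even runtime equals quantum_op² / classical_op, so every factor-of-10 slowdown in logical gates pushes it up 100-fold; with less generous assumptions than these, the crossover stretches from days into the centuries the article mentions.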
 

Thread Starter

nsaspook

Joined Aug 27, 2009
12,862
https://physics.aps.org/articles/v17/13
A Moving Target for Quantum Advantage

Researchers have used quantum computers to solve difficult physics problems. But claims of a quantum “advantage” must wait as ever-improving algorithms boost the performance of classical computers.
Recently, a 127-qubit quantum computer was used to calculate the dynamics of an array of tiny magnets, or spins—a problem that would take an unfathomably long time to solve exactly with a classical computer [1]. The team behind the feat showed that their quantum computation was more accurate than nonexact classical simulations using state-of-the-art approximation methods. But these methods represented only a small handful of those available to classical-computing researchers. Now Joseph Tindall and his colleagues at the Flatiron Institute in New York show that a classical computer using an algorithm based on a so-called tensor network can produce highly accurate solutions to the spin problem with relative ease [2]. The results show that the classical-computing field still has many tricks up its sleeve, making it hard to predict when the quantum-computing field will gain the upper hand.
 

joeyd999

Joined Jun 6, 2011
5,188
@nsaspook, I figured you'd get around to this eventually:


I generally like Dr. Hossenfelder, but she sometimes makes the mistake of assuming all scientists are honorable (as she is, TTBOMK).
 