How ignorant are we, really?

Thread Starter

KL7AJ

Joined Nov 4, 2008
2,229
It was a little under 250 years ago that Antoine Lavoisier dispelled the notion of "phlogiston" in combustion and discovered oxygen, bringing chemistry, overnight, into the modern age. P.S.: Lavoisier was beheaded during the French Revolution (after which time he made very few scientific discoveries, we presume).

Nonetheless....I wonder if our current understanding of chemistry (and the universe) may be just as primitive as that just preceding Lavoisier? Are there any major FUNDAMENTAL discoveries yet to be made, that would make us look like ignorant alchemists? (Actually, the alchemists ALMOST got it right in a lot of areas...close, but not quite).



Could it be that quantum computing will put everything we THINK we know to shame? Food for thought.

 

nsaspook

Joined Aug 27, 2009
13,312
I would give Joseph Priestley and Antoine Lavoisier credit.
https://www.pbssocal.org/programs/t...e-1-out-thin-air-priestley-lavoisier-preview/

No, quantum computing as a technology won't discover anything new, and quantum sensing/communications is just a hype term for something we've been doing for a while with less precision and sensitivity.
https://arxiv.org/pdf/1611.02427.pdf
They open up the possibility of computing types of problems that classical computers can't handle efficiently, like cracking 128-bit number sequences, but there is only a small class of problems where they really excel over classical machines.
Any new fundamental discovery in physics must be compatible with current theory as an approximation of the better theory. Even the flat-earth 'theory' is a good approximation for a small farm in Kansas, so most likely there are future discoveries at extremes of feature size and levels of energy we can't reach today.
 
Last edited:

cmartinez

Joined Jan 17, 2007
8,257
P.S.: Lavoisier was beheaded during the French Revolution (after which time he made very few scientific discoveries, we presume).
And to think that the Reign of Terror in France was the culmination of the so-called Age of Enlightenment ...
 

nsaspook

Joined Aug 27, 2009
13,312
Science knows it doesn't know everything, but you only need to be a dedicated high schooler in physics to see that something like the EM Drive is bunk. It's easy to say that 'pigs can't fly,' but it's a bear to prove it as a scientific fact.

 

cmartinez

Joined Jan 17, 2007
8,257
Science knows it doesn't know everything, but you only need to be a dedicated high schooler in physics to see that something like the EM Drive is bunk. It's easy to say that 'pigs can't fly,' but it's a bear to prove it as a scientific fact.

Now that you mention it, I find the most recent developments in "deepfake" videos a little unsettling.
 

WBahn

Joined Mar 31, 2012
30,075
I wonder if our current understanding of chemistry (and the universe) may be just as primitive as that just preceding Lavoisier?
Even if there are fundamental discoveries that revolutionize our understanding of chemistry, it would still not be a true claim that our current understanding is just as primitive as that just preceding Lavoisier.

Now, is it possible that new discoveries might make our new understanding as much more advanced compared to our current understanding as our current understanding is compared to that just preceding Lavoisier? I certainly can't rule it out, but I think that it is a pretty high bar to clear.
 

cmartinez

Joined Jan 17, 2007
8,257
is it possible that new discoveries might make our new understanding as much more advanced compared to our current understanding as our current understanding is compared to that just preceding Lavoisier?
I, for one, would like to know what dark matter and dark energy are before I go six feet under ... and those are two very real and very high bars to clear.
 

MisterBill2

Joined Jan 23, 2018
18,584
The fact is that a great deal is known about a fair number of things. The second fact is that most research and study has been directed toward learning things that can benefit at least some of us. So while there is a huge amount of material that is completely unstudied, if it has no effect on any aspect of our existence, why waste our resources studying it? Much of astronomy fits into that category, likewise the study of civilizations long dead and gone. Fully understanding DNA and learning how to cure diseases are areas of ignorance where learning will be far more useful. And presently there is enough useful knowledge around that all of us can keep learning daily.
 

nsaspook

Joined Aug 27, 2009
13,312
What actually is true is that we discovered a more accurate, and more general description of our world. In this description, it turns out that classical physics appears as a “simplification” or “approximation” whereby it becomes more and more valid as various parameters approach the common, everyday, terrestrial values. And this is an extremely important point to remember, because since classical physics works under our ordinary situation, any new theory or description must somehow converge and look like the classical physics description under such ordinary conditions. Otherwise, this new theory must show that it produces the same set of results as classical physics for all of our known phenomena that classical physics can already accurately describe.
...
What this implies here is that, if there are more general and more accurate theories beyond QM, SR/GR, then those theories must also show that they can be “simplified” into the mathematical forms of QM and SR/GR. Subsequent, more general theories must show that they can derive the mathematical forms of existing, already-working theories. The inability to do that will be a fatal flaw in any new theory.
Reference https://www.physicsforums.com/insights/classical-physics-is-wrong-fallacy/
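Just to make that convergence concrete, here's a quick numeric sketch of my own (not from the linked article): the relativistic kinetic energy (γ − 1)mc² sits right on top of the classical ½mv² at everyday speeds and only starts to pull away once v becomes a decent fraction of c.

```python
# Quick numeric sketch (my own illustration): relativistic kinetic energy
# (gamma - 1)*m*c^2 converges to the classical (1/2)*m*v^2 at everyday speeds,
# which is the "new theory must reduce to the old one" point made above.
import math

C = 299_792_458.0  # speed of light, m/s

def ke_classical(m, v):
    return 0.5 * m * v**2

def ke_relativistic(m, v):
    beta2 = (v / C)**2
    # gamma - 1 written in a cancellation-free form, accurate for small v/c
    gamma_minus_1 = beta2 / (math.sqrt(1.0 - beta2) * (1.0 + math.sqrt(1.0 - beta2)))
    return gamma_minus_1 * m * C**2

m = 1000.0  # kg, roughly a small car
for v in (30.0, 7.8e3, 3.0e7):  # highway speed, orbital speed, 10% of c
    ratio = ke_relativistic(m, v) / ke_classical(m, v)
    print(f"v = {v:>12.1f} m/s   relativistic / classical KE = {ratio:.9f}")
```

At 30 m/s the two agree to nine decimal places; at 10% of c the relativistic value is already about 0.75% higher, which is exactly the "approximation breaks down at the extremes" behavior.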
 

bogosort

Joined Sep 24, 2011
696
They open up the possibility of computing types of problems that classical computers can't handle efficiently, like cracking 128-bit number sequences, but there is only a small class of problems where they really excel over classical machines.
Indeed, and despite a ton of effort, we still have no idea if quantum computing is actually more powerful than classical computing in that small class of problems. The only thing we know is that there exist a handful of algorithms that (in theory) can beat the best classical algorithms we currently know. It could turn out that we find classical equivalents of Shor's and Grover's algorithms. On the other hand, even if quantum computing is indeed more powerful (P ⊂ BQP is truly a strict inclusion), we may find that we simply can't build a quantum computer with enough q-bits to solve real-world problems (i.e., decoherence scales faster than error correction).
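To put rough numbers on the size of that gap (my own back-of-the-envelope figures, not from any of the papers above): Grover-style unstructured search is only a quadratic speedup, roughly (π/4)·√N oracle queries versus about N/2 classically on average.

```python
# Back-of-the-envelope query counts for unstructured search over N items:
# classical brute force needs ~N/2 oracle queries on average, while Grover's
# algorithm needs about (pi/4)*sqrt(N) -- a quadratic, not exponential, gap.
import math

for bits in (20, 40, 64, 128):
    N = 2**bits
    classical = N / 2
    grover = (math.pi / 4) * math.sqrt(N)
    print(f"{bits:3d}-bit search space: classical ~ {classical:.3e} queries, "
          f"Grover ~ {grover:.3e} iterations")
```

A 128-bit search drops from ~10^38 queries to ~10^19 iterations, which is still astronomically many; only the structured problems (Shor-type) get the exponential win.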

In any case, the relationship between quantum and classical computing is very strange. Given the enormous difference between quantum and classical physics, one would expect that the relationship between quantum and classical computing would go one of two ways: either (1) QC is exactly as powerful as classical computing, perhaps because -- in some information-theoretic way -- all sufficiently complex computation is equivalent; or (2) QC is exponentially more powerful than classical computing in nearly every way. In other words, we'd expect them to be exactly the same or radically different.

Yet, in terms of computational power, it seems that QC is only slightly different from CC. Which is weird! Is there some kind of continuum of information-theoretic power? We can be pretty sure that alternative (2) is not the case: after all, if QC and CC were radically different, then it should be really easy to find tons of examples where QC outperforms CC. But the exact opposite is true: we've known about Shor's and Grover's algorithms since the 1990s, and in all the time since we've only come up with a handful of other quantum supremacy candidates.

So, it seems that the more likely truth is that either QC = CC or QC ≈ CC, which is weird: what small ε of power separates QC from CC? Are there other computational models for varying ε?
 

nsaspook

Joined Aug 27, 2009
13,312
Most of the problems that I see QC being useful for are those that can be converted into a coherent detection (energy as information) problem, much like a DSP system converting a frequency-domain signal into the time domain to recover signals below the noise floor. Classical amplification of signals provides limited improvement because it also amplifies the noise. In a time-domain transform (a QC processing analogy) we can insert an external random noise dithering signal (quantum computing using QM amplitude probabilities) that, over many samples, tends to null the internal random noise component of the signal through destructive interference of the random energy vectors, while reinforcing the non-random, time-coherent information vectors.

This makes QC a powerful tool in secure communications, where the encryption process can be seen as a large noise signal, created by some number of factors and processes, that hides a very small signal component in the total signal. This is useful in cracking public encryption systems where a trap-door function hides things like the prime-factor keys used for key exchange to a symmetric block cipher like AES. A QC system can't crack something like AES-256 with totally random, offline-distributed keys for each block, because then it's a one-time pad (OTP) system that even quantum computing can't crack with random keying material and RC.
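Here's a rough numpy sketch of the classical averaging side of that analogy (my own toy numbers, not a model of any actual QC system): a repetitive tone 20-odd dB below the noise floor comes back out after coherently averaging many frames, because the signal adds as M while the noise only adds as √M.

```python
# Classical coherent-averaging sketch (illustration of the DSP analogy only):
# a repetitive tone ~23 dB below the noise floor is recovered by averaging many
# time-aligned frames -- signal adds coherently (M), noise incoherently (sqrt(M)).
import numpy as np

rng = np.random.default_rng(0)
fs, frame_len, n_frames = 10_000, 1_000, 4_000
t = np.arange(frame_len) / fs
signal = 0.1 * np.sin(2 * np.pi * 50 * t)                       # amplitude 0.1
frames = signal + rng.normal(0.0, 1.0, (n_frames, frame_len))   # noise sigma 1.0

single_snr = 10 * np.log10(np.var(signal) / 1.0)   # noise variance is 1 by construction
avg = frames.mean(axis=0)                          # coherent average over frames
resid = avg - signal
avg_snr = 10 * np.log10(np.var(signal) / np.var(resid))

print(f"SNR of one frame:        {single_snr:6.1f} dB")
print(f"SNR after {n_frames} averages: {avg_snr:6.1f} dB  "
      f"(expected gain ~ {10 * np.log10(n_frames):.1f} dB)")
```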

What's also weird is that quantum (quantum properties) random number sequences are different from classical computer (pseudo) random number generator sequences.
https://arxiv.org/pdf/1004.1521.pdf
In contrast with software-generated randomness (called pseudo-randomness), quantum randomness is provably incomputable, i.e. it is not exactly reproducible by any algorithm. We provide experimental evidence of incomputability — an asymptotic property — of quantum randomness by performing finite tests of randomness inspired by algorithmic information theory.
https://www.technologyreview.com/s/...ntum-processes-generate-truly-random-numbers/

A history of COMSEC systems: some have been cracked with classical computing systems, and a few were compromised by spies who stole keying material. I've used most of the old, obsolete KG, KW, KY and a few KL systems.
https://www.governmentattic.org/18docs/Hist_US_COMSEC_Boak_NSA_1973u.pdf
 
Last edited:

bogosort

Joined Sep 24, 2011
696
Most of the problems that I see QC being useful for are those that can be converted into a coherent detection (energy as information) problem, much like a DSP system converting a frequency-domain signal into the time domain to recover signals below the noise floor. Classical amplification of signals provides limited improvement because it also amplifies the noise. In a time-domain transform (a QC processing analogy) we can insert an external random noise dithering signal (quantum computing using QM amplitude probabilities) that, over many samples, tends to null the internal random noise component of the signal through destructive interference of the random energy vectors, while reinforcing the non-random, time-coherent information vectors.
The trick is getting the problem space configured so that the superposition of non-solutions destructively interferes and cancels out, while the solution constructively interferes and pokes out in, e.g., a quantum Fourier transform. A small class of problems, those with some form of periodicity in their structure, lend themselves naturally to such an attack. Prime numbers have a surprising amount of structure, including the periodicity of modular exponentiation in ℤ/Nℤ. That periodicity is at the heart of the class of one-way functions used by public-key cryptosystems such as RSA, and it is precisely what lets QC find prime factors in polynomial time. On the other hand, something like the traveling salesman problem has no inherent structure that a superposition of interfering amplitudes can exploit.
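To make the periodicity point concrete, here's a purely classical toy sketch of the number-theoretic core that Shor's algorithm exploits (the quantum Fourier transform only finds the period r of a^x mod N faster; everything after that is ordinary number theory). Brute-forcing the period like this is exponential, so it's for illustration only.

```python
# Classical illustration of the periodic structure Shor's algorithm exploits.
# The quantum part only speeds up finding the period r of f(x) = a^x mod N;
# extracting the factors from r afterward is classical number theory.
# Brute-forcing r like this is exponential -- illustration only.
from math import gcd

def find_period(a, N):
    """Smallest r > 0 with a^r = 1 (mod N), by brute force."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factor_via_period(N, a):
    assert gcd(a, N) == 1
    r = find_period(a, N)
    if r % 2:                      # need an even period
        return None
    y = pow(a, r // 2, N)
    if y == N - 1:                 # trivial square root of 1, try another a
        return None
    return sorted({gcd(y - 1, N), gcd(y + 1, N)})

N = 15
for a in (2, 7, 8, 11, 13):
    print(f"N={N}, a={a}: period={find_period(a, N)}, factors={factor_via_period(N, a)}")
```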

The analogy to DSP signal extraction is apt. In the time domain, we can extract a signal with an arbitrarily low SNR if we're willing and able to wait long enough -- just keep averaging. This is like an EXP solution. In digitizing the signal, we add dither to decorrelate the quantization noise from the signal, trading overall SNR for spectral purity. If we oversample the signal, we can get back some of the SNR we lost in dithering by filtering out the extra bandwidth. We're still bounded by the SNR of the analog signal before sampling, but if the spectral energy of the signal is enough to poke over the noise, we get a P solution (in frequency) to an EXP problem (in time). As in QC, the more periodic the signal, the easier this will be.
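And the frequency-domain half of that trade can be sketched the same way (again, a classical toy example with made-up numbers): a tone whose time-domain SNR is well below 0 dB still pokes above the floor in a long FFT, because the noise gets spread across all the bins while the tone's energy lands in one.

```python
# FFT processing-gain sketch (classical illustration): a tone ~23 dB below the
# noise in the time domain still stands out in the spectrum, because a length-L
# FFT spreads the noise over ~L/2 bins while the tone's energy stays in one bin.
import numpy as np

rng = np.random.default_rng(1)
fs, L = 10_000, 65_536
t = np.arange(L) / fs
tone = 0.1 * np.sin(2 * np.pi * 1234.0 * t)       # time-domain SNR ~ -23 dB
x = tone + rng.normal(0.0, 1.0, L)                # broadband noise, sigma 1.0

spectrum = np.abs(np.fft.rfft(x * np.hanning(L)))**2
peak_bin = int(np.argmax(spectrum))
noise_floor = np.median(spectrum)

print(f"time-domain SNR : {10 * np.log10(np.var(tone)):.1f} dB")
print(f"spectral peak at: {peak_bin * fs / L:.1f} Hz (tone is at 1234.0 Hz)")
print(f"peak over floor : {10 * np.log10(spectrum[peak_bin] / noise_floor):.1f} dB")
```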

What's also weird is that quantum (quantum properties) random number sequences are different from classical computer (pseudo) random number generator sequences.
https://arxiv.org/pdf/1004.1521.pdf
Thanks, I'll check this out.
 