What's stopping us?

Thread Starter

Shafty

Joined Apr 25, 2023
170
Once there were components which were very large and heavy (like hard disks and other components). Now so many of them are packed inside a tiny block. But why can only a few processors be put inside a block now? Why not thousands of them? What's stopping us?
 

Ya’akov

Joined Jan 27, 2019
9,170
As the features of integrated circuits get smaller, the leakage current increases. Leakage current is power consumed by the circuits when there should otherwise be zero current flow. It is caused, essentially, by things being so close together that current finds paths other than the ones the designer intends; it represents not only wasted power but excess heat.

Modern processors have heat densities similar to the blast nozzles on rocket engines, which is why they fail catastrophically when they lose their active cooling. Lowering the operating voltage can reduce these problems, but it comes at the cost of speed.

Leakage currents are a serious problem for modern processor and memory design and one of the biggest factors in the rapid approach to the limits of Moore's Law. Until we can find a substrate with better leakage and heat characteristics than silicon, we are near the limit of our ability to scale down feature size to increase density. Materials like graphene and diamond have been considered, with some experimental but little practical success so far.
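To put rough numbers on that heat-density point, here is a quick back-of-envelope sketch in Python. The die area and power figures are assumed round values for illustration, not measurements of any particular chip:

```python
# Rough illustration of why power density is the wall.
# All figures are assumed round numbers, not specs for any real part.

die_area_cm2 = 1.5        # assumed die area (~150 mm^2)
dynamic_power_w = 90.0    # assumed switching (dynamic) power at full load
leakage_power_w = 30.0    # assumed static (leakage) power

total_power_w = dynamic_power_w + leakage_power_w
power_density_w_cm2 = total_power_w / die_area_cm2
leakage_share = leakage_power_w / total_power_w

print(f"Power density: {power_density_w_cm2:.0f} W/cm^2")  # ~80 W/cm^2 averaged over the die
print(f"Leakage share: {leakage_share:.0%}")                # ~25% of the budget doing no useful work

# Local hotspots run several times the die-average density, so without
# active cooling the junction temperature runs away within seconds.
```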
 

SamR

Joined Mar 19, 2019
5,052
In a word, voltage! Which, in turn, determines power. To increase density and limit electron tunneling through the substrate, supply voltage has had to be reduced as feature sizes shrink into the single-digit nanometers.
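A minimal sketch of that voltage-to-power relationship, using the standard CMOS dynamic-power relation P ≈ C·V²·f; the capacitance and frequency values are assumed, purely for illustration:

```python
# Dynamic switching power in CMOS scales roughly as C * V^2 * f.
# Capacitance and frequency below are assumed illustrative values.

c_switched_per_cycle_f = 2.5e-8   # assumed effective switched capacitance (farads)
clock_hz = 4e9                    # assumed clock frequency

for vdd in (1.2, 1.0, 0.8):
    p_dynamic_w = c_switched_per_cycle_f * vdd**2 * clock_hz
    print(f"Vdd = {vdd:.1f} V -> dynamic power ~ {p_dynamic_w:.0f} W")

# Dropping Vdd from 1.2 V to 0.8 V cuts dynamic power to about 44%
# (0.8^2 / 1.2^2), but lower voltage also slows transistor switching,
# which is the speed cost mentioned above.
```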
 

Ya’akov

Joined Jan 27, 2019
9,170
Not every problem is parallelizable. Many are, which is why the increasing number of cores in modern processors speeds up many tasks, but for some classes of problems computers have reached a plateau. Those problems need a faster clock rate, not more cores, and clock rates aren't going up.

There are a few specialized processors that feature high clock speeds, like some from Hitachi, but most development is focused on multithreading and specialist cores (GPU, Neural Engine) rather than on making generalist cores execute faster.
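Here is a small sketch of why more cores stop helping once any of the work is inherently serial. It is just Amdahl's law with an assumed serial fraction:

```python
# Amdahl's law: speedup = 1 / (serial + parallel / n_cores)
# The 5% serial fraction is assumed for illustration.

def amdahl_speedup(serial_fraction: float, n_cores: int) -> float:
    parallel_fraction = 1.0 - serial_fraction
    return 1.0 / (serial_fraction + parallel_fraction / n_cores)

SERIAL = 0.05   # assume 5% of the workload cannot be parallelized

for cores in (1, 4, 16, 64, 1024, 100_000):
    print(f"{cores:>7} cores -> speedup {amdahl_speedup(SERIAL, cores):6.1f}x")

# Even with 100,000 cores the speedup saturates near 1 / 0.05 = 20x.
# For workloads like that, only a faster clock helps, and clocks
# have been roughly flat for years.
```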
 

drjohsmith

Joined Dec 13, 2021
852
Once there were components which were very large and heavy (like hard disks and other components). Now so many of them are packed inside a tiny block. But why can only a few processors be put inside a block now? Why not thousands of them? What's stopping us?
Why not thousands of processors in a "stick"?
Easy answer: a single processor can be many millions or billions of gates.
Look at a die shot of an AMD or Intel CPU; that's a lot of transistors.
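A hedged back-of-envelope version of that point, with assumed round transistor counts (not figures for any specific product), showing what a fixed transistor budget buys you:

```python
# Why "thousands of processors" collides with the transistor budget.
# All counts below are assumed round numbers for illustration.

die_budget_transistors = 20_000_000_000  # assumed transistors available on one die
big_core_transistors = 1_500_000_000     # assumed large desktop-class core,
                                         # including its share of cache/interconnect
tiny_core_transistors = 100_000          # assumed minimal microcontroller-class core

print("Large cores per die:", die_budget_transistors // big_core_transistors)   # ~13
print("Tiny cores per die: ", die_budget_transistors // tiny_core_transistors)  # ~200,000

# You *can* fit thousands of very small cores (GPUs effectively do this),
# but each one is far weaker than a desktop core, and the power, heat,
# and parallelization limits from the earlier posts still apply.
```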
 

MisterBill2

Joined Jan 23, 2018
18,568
To the next level in 'Computer Evolution'
We do not require a "next level" of computers. Besides that, there are already multi-processor ICs available.
Physically, the limitations already posted are all correct.
And one more thing, not posted yet, is that it would not provide any benefit.
At some point additions become bloat, which is a horrible fate common to integrated processor systems.
 

SamR

Joined Mar 19, 2019
5,052
Bloat is common to software as well. I remember when AutoCAD was nothing more than a glorified Paint program residing on a single 360k floppy disk. They kept adding every bell and whistle that anyone thought they might want to use. Today, the average user probably doesn't use 80% of the options available to them and that is true for most large-scale applications.
 

tyro01

Joined May 20, 2021
87
It is likely that the engineers of von Neumann-type computers have just lost their motivation because someone is going to invent a quantum computer.
 

MisterBill2

Joined Jan 23, 2018
18,568
It is likely that the engineers of von Neumann-type computers have just lost their motivation because someone is going to invent a quantum computer.
It would be very interesting to read a detailed technical description of just what mechanism of "quantum computing" is going to deliver the touted wonderfulness. And who is going to write that amazing code that will run a hundred orders of magnitude faster?
 

nsaspook

Joined Aug 27, 2009
13,306
It is likely that the engineers of von Neumann-type computers have just lost their motivation because someone is going to invent a quantum computer.
For some reason, I don't think that's the answer. :rolleyes:
https://www.technologyreview.com/20...nts-to-build-a-100000-qubit-quantum-computer/
One example of what’s needed is much more energy-efficient control of qubits. At the moment, each one of IBM’s superconducting qubits requires around 65 watts to operate. “If I want to do 100,000, that’s a lot of energy: I’m going to need something the size of a building, and a nuclear power plant and a billion dollars, to make one machine,” Gambetta says. “That’s obviously ludicrous. To get from 5,000 to 100,000, we clearly need innovation.”

IBM has already done proof-of-principle experiments showing that integrated circuits based on “complementary metal oxide semiconductor” (CMOS) technology can be installed next to the cold qubits to control them with just tens of milliwatts. Beyond that, he admits, the technology required for quantum-centric supercomputing does not yet exist: that is why academic research is a vital part of the project.
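The scale of the control-electronics problem in that quote is easy to sanity-check. A tiny sketch of the arithmetic, taking the 65 W figure from the article at face value and assuming "tens of milliwatts" means roughly 50 mW:

```python
# Control-electronics power for a 100,000-qubit machine, using the
# figures quoted above (65 W per qubit today; ~50 mW assumed for the
# "tens of milliwatts" cryo-CMOS approach).

n_qubits = 100_000

today_w_per_qubit = 65.0
cryo_cmos_w_per_qubit = 0.05   # assumed reading of "tens of milliwatts"

print(f"Today's approach:  {n_qubits * today_w_per_qubit / 1e6:.1f} MW")      # ~6.5 MW
print(f"Cryo-CMOS control: {n_qubits * cryo_cmos_w_per_qubit / 1e3:.0f} kW")  # ~5 kW

# Megawatts just for qubit control (before dilution refrigerators and
# everything else) is the "building plus a power plant" point Gambetta makes.
```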
https://www.cs.virginia.edu/~robins/The_Limits_of_Quantum_Computers.pdf
“Haggar Physicists Develop ‘Quantum Slacks,’” read a headline in the satirical weekly the Onion. By exploiting a bizarre “Schrödinger’s Pants” duality, the article explained, these non-Newtonian pants could paradoxically behave like formal wear and casual wear at the same time. Onion writers were apparently spoofing the breathless articles about quantum computing that have filled the popular science press for a decade.

In the 26 years since physicist Richard Feynman first proposed the idea of quantum computing, computer scientists have made enormous progress in figuring out what problems quantum computers would be good for. According to our current understanding, they would provide dramatic speedups for a few specific problems—such as breaking the cryptographic codes that are widely used for monetary transactions on the Internet. For other problems, however—such as playing chess, scheduling airline flights and proving theorems—evidence now strongly suggests that quantum computers would suffer from many of the same algorithmic limitations as today's classical computers. These limitations are completely separate from the practical difficulties of building quantum computers,
 

Ya’akov

Joined Jan 27, 2019
9,170
Practical quantum computers will be like dedicated GPUs, Neural Engines, and Crypto Engines: hardware dedicated to a particular task at which it is exceptionally good, acting as a peripheral that the CPU uses to off-load special classes of problems for faster execution.

The CPU will still be there, and most computing will still use it.
 

MisterBill2

Joined Jan 23, 2018
18,568
Presently there is no need for that "next level" of processing, neither in speed nor in complexity. The present level has quite enough capability to handle even the horribly bloated operating systems presently in use. So there would be no benefit in that area.
The other challenges are power, heat, and the fact that as individual features get smaller, the normal defects do not. There is indeed a limit to how few atoms wide a conductor can be reduced to and still give an acceptable production yield. That is currently an issue.
And the other challenge is the cost and complexity of production equipment. The latest devices are already at the limits of what can be produced with a reasonable yield, and each process change has brought about a 10X increase in the cost of the production equipment and the many masks needed to produce the devices.

So the ultimate answer is that maybe it could be done, but why bother, since there is no need nor any benefit to doing it.
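The yield point above can be made concrete with the standard Poisson die-yield model, yield ≈ exp(-D·A), where D is defect density and A is die area; the numbers below are assumed purely for illustration:

```python
import math

# Poisson die-yield model: yield ≈ exp(-defect_density * die_area).
# Defect density and die areas are assumed illustrative values.

defects_per_cm2 = 0.1   # assumed defect density for a mature process

for die_area_cm2 in (1.0, 4.0, 8.0):
    expected_yield = math.exp(-defects_per_cm2 * die_area_cm2)
    print(f"{die_area_cm2:>4.1f} cm^2 die -> expected yield ~ {expected_yield:.0%}")

# ~90%, ~67%, ~45%: making dies ever larger (which cramming "thousands of
# processors" onto one chip would require) drives yield down exponentially,
# which is a big part of why huge monolithic dies cost so much to produce.
```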
 