The End of Moore's Law -- Revisited, Again.

Discussion in 'General Electronics Chat' started by joeyd999, May 4, 2015.

  1. joeyd999

    Thread Starter AAC Fanatic!

    Jun 6, 2011
    2,689
    2,751
    Seems like every year since I've been involved in electronics, someone comes along to predict the demise of Moore's Law. Well, here is this year's contribution:

    http://electronicdesign.com/blog/revisiting-moore-s-law-50-years-later

    So far, there have always been a few brilliant minds devising ways to keep the trend going -- in spite of the naysayers.

    I predict more of the same.
     
  2. dl324

    Distinguished Member

    Mar 30, 2015
    3,250
    626
    I worked in that industry for about 4 decades and am constantly amazed by what those genius types can come up with. For the last few years of my career, I was working in a group that developed design rules and checkers. We were working on 10nm when I retired.

    Not to start dropping names, but Gordon Moore and I bought gas at the same station... He drove a brown 911 at the time; there's no accounting for taste...
     
    cmartinez likes this.
  3. cmartinez

    AAC Fanatic!

    Jan 17, 2007
    3,574
    2,543
    I think that the end of Silicon is the one that is closest at hand... other promising materials, including possibly optical computing, may soon break into the market.
     
  4. dl324

    Distinguished Member

    Mar 30, 2015
    3,250
    626
    Are you involved in the industry? They've been using Ge to strain silicon for over a decade now; first to improve the performance of PMOS, then NMOS. People keep saying we need to move to III-V compounds for more speed, but it's no longer about speed. Carbon is probably the next step, but Si isn't out of gas yet. They've been experimenting with LEDs on Si for a number of years.

    And who, besides Intel, says you have to go to smaller features? Many foundry customers are happy to stay on 45nm. Going to smaller nodes gives you some performance and area benefits, but processing costs go up very quickly; you need a lot of volume to justify the setup costs. And it's very difficult to design on 22nm and smaller nodes. Been there, done that...
     
    Last edited: May 4, 2015
  5. AnalogKid

    Distinguished Member

    Aug 1, 2013
    4,542
    1,251
    Like all laws, the death of Moore's Law only applies in context, in an apples-to-apples comparison. Take magnetic disk drive storage capacity, for example. The encoding of data and the physical recording and recovery of data in a hard drive are two different and mostly independent things, and in both areas a modern terabyte drive bears almost zero resemblance to the devices in place when Gordon Moore first came up with his law. MFM and Reed-Solomon techniques hit the wall long ago, and comparing them to today's technologies in the context of Moore's Law is as meaningless as comparing an 8" floppy disk to a dual-layer DVD-RW. The same goes for solid-state memory devices, CPU "speed," etc. Yes, the end-user effective capacities are advancing steadily, but only through the evolution and extinction of the underlying technologies. So at what point does the comparison become meaningless?

    ak
     
    cmartinez likes this.
  6. joeyd999

    Thread Starter AAC Fanatic!

    Jun 6, 2011
    2,689
    2,751
    That's my whole point! Silicon would not be what it is today had NMOS/PMOS (or RTL/DTL/TTL!) not been discarded in favor of CMOS. CMOS will ultimately be replaced with something else. IMHO, walls are not only meant to be hurdled, but outright demolished by a continuing succession of bright minds with brilliant ideas. This is why I think it is silly to continuously, repeatedly, and incessantly predict the end of progress, regardless of the subject in question.
     
    cmartinez likes this.
  7. dl324

    Distinguished Member

    Mar 30, 2015
    3,250
    626
    Except for the fact that PMOS and NMOS continue to exist in CMOS. You can't have CMOS without a PMOS device and an NMOS device. And bipolar devices are still lurking in the background... So what's been discarded??

    Oh, and FYI, Moore's Law is about the number of transistors doubling every couple years. It has nothing to do with the materials used to manufacture them or how they're manufactured.
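    As a back-of-the-envelope sketch of that doubling (illustrative numbers only; the starting count is the oft-quoted ~2,300 transistors of the Intel 4004, not a figure from this thread):

```python
# Rough Moore's Law projection: transistor count doubles
# approximately every two years.
def transistors(t_years, n0=2300, period=2.0):
    """Projected transistor count t_years after a starting point of n0
    (2300 ~ the Intel 4004 of 1971)."""
    return n0 * 2 ** (t_years / period)

# 40 years of doubling every 2 years is a 2**20 (~million-fold) increase:
print(round(transistors(40)))  # ~2.4 billion transistors
```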
     
  8. wayneh

    Expert

    Sep 9, 2010
    12,145
    3,056
    What if the subject is promoting individual freedom instead of freebies from government? I haven't seen much progress in my lifetime.

    Oops, I thought this was in the Off Topic area. Never mind.
     
  9. joeyd999

    Thread Starter AAC Fanatic!

    Jun 6, 2011
    2,689
    2,751
    Naturally, CMOS is a combination of PMOS and NMOS. I may be ugly, but not stupid.

    First generation MOS LSIs were either PMOS or NMOS...not both on the same die. Heat generated by the passive pull-downs or pull-ups limited the upward scaling of transistor counts. CMOS solved that problem.

    Materials and manufacturing techniques have *everything* to do with scaling up transistor counts -- IMHO, of course.
     
  10. joeyd999

    Thread Starter AAC Fanatic!

    Jun 6, 2011
    2,689
    2,751
    Some would say not having to work for a living is progress. But, that wouldn't be me.

    Edit: Hey, they don't call themselves 'progressives' for nothing...
     
  11. wayneh

    Expert

    Sep 9, 2010
    12,145
    3,056
    Oh absolutely. One political party in this country publicly touted the advantages and "freedom" gained from losing one's job. Our Founders would be so proud.
     
  12. tcmtech

    Well-Known Member

    Nov 4, 2013
    2,039
    1,667
    I'm a computer chip circuitry dummy, so what's wrong with just making the processor packages bigger and/or using lots of smaller independent units to do more work more efficiently? o_O

    As far as I know, nature's answer to organically handling huge amounts of data efficiently is to use massive quantities of parallel processing.
     
  13. wayneh

    Expert

    Sep 9, 2010
    12,145
    3,056
    That is definitely happening. The whole "internet of things" idea is about isolated, parallel brains.

    But more-power-in-a-smaller-container will remain a prize for a long time. When computers are small enough and integrated enough that they can be under the skin instead of in a pocket or on a lap, maybe we'll approach the end of miniaturization. We're a long way from that.
     
  14. joeyd999

    Thread Starter AAC Fanatic!

    Jun 6, 2011
    2,689
    2,751
    Cost is a function of die size and yield. The smaller a chip, the more can fit on a single wafer, and more chips per wafer means a smaller percentage of parts is lost to any given defect. Thus, higher yield and lower cost.

    This is a valid approach. But, again, the more separate parts, the higher the cost and the lower the yield. GPUs have thousands of CPU cores, but all on one die. Can you imagine the size and cost of a GPU constructed from individual discrete cores?
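    The die-size/yield/cost relationship can be sketched with a simple Poisson defect model (a common textbook approximation; the wafer cost and defect density below are made-up illustrative numbers, not data from this thread):

```python
import math

def cost_per_good_die(die_area_mm2, wafer_cost=5000.0,
                      wafer_diameter_mm=300.0, defects_per_mm2=0.001):
    """Estimate cost per working die from die area, using a Poisson
    yield model: yield = exp(-defect_density * die_area)."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    dies_per_wafer = wafer_area / die_area_mm2        # ignores edge loss
    yield_frac = math.exp(-defects_per_mm2 * die_area_mm2)
    return wafer_cost / (dies_per_wafer * yield_frac)

# Halving the die area more than halves the cost per good die,
# because yield improves as the die shrinks:
print(cost_per_good_die(200))  # larger die
print(cost_per_good_die(100))  # smaller die: less than half the cost
```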
     
  15. nsaspook

    AAC Fanatic!

    Aug 27, 2009
    2,910
    2,173
    There are at least two major pushes in the chip market: the top end from people like Intel, and the bottom from people like Microchip and the other controller companies. There is a fantastic amount of breathing room at the bottom, being several generations behind and not needing the horsepower required for non-embedded applications. The Internet of Everything won't be built from the latest technology; it will be built at the micro level from technology Intel discarded 10 years ago.
     
    cmartinez likes this.
  16. cmartinez

    AAC Fanatic!

    Jan 17, 2007
    3,574
    2,543
    No, I'm not, and have never been, involved in the industry. I'm just an amateur who loves to read articles in Scientific American, New Scientist, and such. And for the past ten years (or more) the trend has been to research materials other than silicon to minimize power and maximize throughput, not necessarily to miniaturize things even more... And judging from your comments, I can honestly say that I envy you. What I wouldn't have given to work directly or indirectly in that industry... it would've been fascinating. But my area of expertise is automation, robotics, and mechatronics.
    Anyway, I do know that things sometimes take decades of experimentation and testing before they reach the market. And I'm pretty well aware that the time is almost ripe for silicon-less electronics to start making an incursion.

    And btw, I also think that @nsaspook is right: the IoT will be built atop old, "discarded" technology. And it's going to create a revolution, because it's going to be really cheap.

    My opinion can pretty much be summarized as this:

    Q: What drives the market for smaller, faster and more efficient electronics?
    A: Portable devices and gadgets.

    Q: Is it indispensable that portable devices have enormous computing power?
    A: No. They could work as simple terminals, delegating computing and storage to external devices, like the cloud and such.
    So what's most important right now (IMHO, of course) is solid, reliable and ultra-fast connectivity, even more so than faster and smaller chips.
     
  17. dl324

    Distinguished Member

    Mar 30, 2015
    3,250
    626
    Not to keep beating a dead horse, but you said "Silicon would not be what it is today had NMOS/PMOS (or RTL/DTL/TTL!) not been discarded in favor of CMOS."

    Maybe you didn't mean to say NMOS and PMOS had been discarded...

    And your avatar is still ugly...
     
  18. dl324

    Distinguished Member

    Mar 30, 2015
    3,250
    626
    Generalizations are apt to be wrong. I know Intel will be pushing IOT on 22nm and smaller. Not because it needs to be there, but because they have a lot of capacity that needs to be used.

    At 22nm, Atom processors were already the size of a grain of rice; naturally before packaging...
    Maybe it was at 32nm.
     
  19. joeyd999

    Thread Starter AAC Fanatic!

    Jun 6, 2011
    2,689
    2,751
    I say what I mean, and mean what I say.

    Perhaps you are unaware that there were such things as PMOS logic and NMOS logic, from which actual LSI semiconductors were built. This is the context in which I was writing. Both technologies are now thoroughly obsolete.
     
  20. nsaspook

    AAC Fanatic!

    Aug 27, 2009
    2,910
    2,173
    Intel is a great company that made the industry possible in the PNW, but Intel just no longer makes the products needed for IoT embedded applications at the tip of the spear. The demand for 64 MHz 8-bit controllers seems amazing in the world of the 4 GHz CPU, but something rugged, with extremely low power requirements and low cost, needs to interface with the real world, where a dime saved adds up to millions. The CPU processing power, together with the peripheral set at that price point, is where the controller market wins. The instruction sets are archaic and the software is a bit buggy, but everyone can get the basics down quickly and push a product out the door without spending a mint.
     
    cmartinez and GopherT like this.