What is causing Moore's Law?

Discussion in 'General Electronics Chat' started by jaydnul, Nov 28, 2015.

  1. jaydnul

    Thread Starter Member

    Apr 2, 2015
    88
    0
    Every article I find online just says something like "the possible transistor density increases each 18 months or so", but what is actually allowing the transistor density to increase? What technology advances each 18 months that allows transistors to be smaller?

    Another way to put it is what is stopping us from designing a chip with 10 times the number of transistors every 18 months instead of just 2 times.

    Thanks
     
  2. Papabravo

    Expert

    Feb 24, 2006
    10,140
    1,789
    The process of making an integrated circuit is called "photolithography". It is the reduction in feature size that lets us increase the density of transistors on a wafer of silicon. Once upon a time the numbers associated with a process could be related to actual features on the chip. Since about the 28 nm node, which is near the limit of conventional photolithographic techniques, the process numbers keep getting smaller but they bear little relation to actual features. So the current 14 nm is just marketing bumpf for all I know.

    http://spectrum.ieee.org/semiconductors/devices/the-status-of-moores-law-its-complicated
     
  3. jaydnul

    Thread Starter Member

    Apr 2, 2015
    88
    0
    But what about photolithography is improving?
     
  4. Papabravo

    Expert

    Feb 24, 2006
    10,140
    1,789
    The ability to create very tiny three-dimensional structures.
    Violet light is about 400 nanometers in wavelength and we stopped using visible light some time ago.
    Ultraviolet light goes from about 10 nanometers up to the edge of the visible spectrum and that is where we are today.
    To go further we need to perfect X-ray lithography.
     
  5. jaydnul

    Thread Starter Member

    Apr 2, 2015
    88
    0
    I see, so it's the laser technology essentially?
     
  6. Papabravo

    Expert

    Feb 24, 2006
    10,140
    1,789
    Why do you think lasers are involved? Do you understand how lithography works? It is all about how you make a mask with sufficiently small features so that exposure to a light source and a controlled etching process gives the desired result.
     
  7. dl324

    Distinguished Member

    Mar 30, 2015
    3,242
    619
    A process node (e.g. 22nm) is usually thought of as the smallest feature size in the process, and that's the gate.
    The small-feature layers are mostly patterned with 193nm immersion technology, and it's been that way for several process generations longer than semiconductor manufacturers expected. Whatever you learned in physics, you can forget. Patterning features significantly smaller than the wavelength of the "light" source requires all sorts of optical tricks: phase-shift masks (several flavors), multiple patterning (breaking a "layer", e.g. metal 1, into two or more masks), cut masks, Optical Proximity Correction, etc. I worked on projects down to 14nm and was just starting on 10nm when I retired.

    Moore's Law is still alive, but the time between nodes is stretching out. Instead of 18 months, it's more like 24.

    When EUV makes good on its promise, there will be a 13.5nm light source...
     
    Last edited: Nov 28, 2015
  8. jaydnul

    Thread Starter Member

    Apr 2, 2015
    88
    0
    No, I clearly didn't know how it worked; I thought it was a very precise laser that etched away the photosensitive layer.

    So in making these "tiny structures", what is advancing so consistently that we can predict where the technology will be in a few years? I'm confused because it's not like new research crops up every few years, right on schedule, to make these tiny structures even tinier; that would be very hard to predict.

    So what technology is improving every two or so years that allows us to make tinier structures?
     
  9. wayneh

    Expert

    Sep 9, 2010
    12,103
    3,038
    Unfortunately that path is coming to an end, in the sense that quantum effects limit how small we can draw - and make - things. Transistor density, which Moore's law describes, will be limited by that even while computers continue to get smaller and faster by other means.
     
  10. wayneh

    Expert

    Sep 9, 2010
    12,103
    3,038
    That's why Moore's law is so amazing, that it turned out to be right for so long. It's not really a law, like gravity or evolution, it was based on an observation tempered by experience and a good intuition. It is hard to predict the outcome of ongoing research, but that's what Moore did.
     
  11. WBahn

    Moderator

    Mar 31, 2012
    17,737
    4,789
    That's not an easy question to answer, particularly at this stage of the game.

    Leaving aside a specific technology, the general rule of exponential technological improvement that has governed most of human technology for millennia basically comes down to the following broadly applicable observation -- the technology involved in most human endeavors tends to improve at a fairly constant rate over fairly long spans of time. To some degree this makes intuitive sense -- it's not unreasonable that if a particular industry was able to improve its performance by 10% last year (without anything revolutionary coming along) that it will likely improve its performance by something around 10% this year and 10% next year. That 10% (or whatever it is for a particular industry) figure is the result of the culmination of a lot of factors. Part of it is physically getting machines to perform better than they do today -- those are usually incremental improvements. Another part is people pushing past the limits of current technology -- and again those tend to be incremental improvements. Part of it is cost -- again, incremental improvements.

    If you improve your technology's performance by X% every year, then the performance will double in log(2)/log(1 + X/100) years. For 10% it's a doubling every 7.3 years. For 25% it's a doubling every 3.1 years. For 50% it's 1.7 years (or about 20 months). So what Moore's Law is really saying is that we are generally able to improve the performance of our technology in the semiconductor industry by about 50% each year.
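
    That doubling-time formula is easy to check numerically. A minimal sketch in Python (the function name is mine, not from the thread):

```python
import math

def doubling_time(annual_gain):
    """Years for performance to double, given a fractional annual improvement rate."""
    return math.log(2) / math.log(1 + annual_gain)

# Reproduce the figures quoted above.
for rate in (0.10, 0.25, 0.50):
    print(f"{rate:.0%} per year -> doubles in {doubling_time(rate):.1f} years")
```

    Running it gives 7.3, 3.1, and 1.7 years for 10%, 25%, and 50% annual improvement, matching the numbers in the paragraph.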

    As far as integrated circuitry and Moore's Law go, it is getting more and more complicated. What, exactly, is it that is "doubling"? It used to be transistor count, but increasingly it is a tapestry of things that involve not just how many transistors you can put on a chip or how fast you can clock them, but also how much processing you get out of each transistor, on average, per clock cycle. Have you noticed that processor speeds stalled out many years ago and are actually somewhat lower than they were at the peak of the clock-speed craze? We found that we couldn't push overall clock speeds faster (at least not economically), but we still get a lot more processing done per clock cycle by using increasingly sophisticated superscalar architectures -- in terms of things like more processor cores and instruction set architectures that embody various types of parallelism -- and a bevy of other tricks.
     
  12. WBahn

    Moderator

    Mar 31, 2012
    17,737
    4,789
    Good article! Thanks.

    I don't know that it is entirely marketing bumpf, but it certainly appears that it might be and, if it isn't now, it almost certainly will be.

    We saw the same thing happen with "speed grades" on FPGAs.

    If the nominal node name mapped to some reasonable performance metric along the lines of saying that, "In order to get this level of performance due solely to transistor scaling according to the non-deep-submicron realities, you would have had to get to this node feature size," then I wouldn't have much heartache over it. At least to some degree, I think that is what they are trying to do. But it just begs for marketers to push for better-sounding names whether the performance matches or not.
     
  13. Glenn Holland

    Member

    Dec 26, 2014
    353
    110
    Power density (watts/cm²) is also one of the problems with increasing transistor density, and it also limits the maximum clock speed. Existing chips employ "clock inhibiting" to temporarily reduce CPU cycles and limit CPU heating.

    I always open Task Manager/Performance and monitor CPU use during a given program. Once the CPU goes above 80% for 20 seconds or more, that fan revs up like a jet engine. High CPU use can also cause Windows to become non-responsive and I have to use CTRL + W to close the offending program.

    I've disassembled a few computers and removed the CPU package (like the Pentium) and now I know why it has its own heat sink and a fan.
     
  14. WBahn

    Moderator

    Mar 31, 2012
    17,737
    4,789
    And why very high-end machines are liquid cooled or have outright refrigeration packages on them.

    This is nothing really new -- supercomputers have always been more a challenge in thermal management than processor design. It's just that these issues are now down in the desktop-scale machines.
     
  15. Glenn Holland

    Member

    Dec 26, 2014
    353
    110
    Right now, it's so damn cold in San Francisco that my CPU can go to 90% and stay cool without that annoying fan coming on. :D
     
  16. WBahn

    Moderator

    Mar 31, 2012
    17,737
    4,789
    Since our current temperature involves a minus sign (on either common scale), I think we've got you beat! :D
     
  17. joeyd999

    AAC Fanatic!

    Jun 6, 2011
    2,677
    2,729
    If I may, I'd like to give a short answer to the OP's question:

    Q. What is causing Moore's Law?
    A. Capitalism.

    Live it. Learn it. Love it.
     
  18. joeyd999

    AAC Fanatic!

    Jun 6, 2011
    2,677
    2,729
    Ha ha ha.
     
  19. Glenn Holland

    Member

    Dec 26, 2014
    353
    110
    On a subject remotely related to this topic, one of the senior engineers at Intel was on a radio discussion program, talking about the market forces that drive the quest for higher CPU speed.

    He said that the ability to download and play large video files was the principal market that is driving research and development in the CPU industry. On the news an hour later, there was another report that the porn industry was one of the largest producers of online videos in the world. In fact, California alone has a $15 Billion/year porn industry and that's in addition to worldwide production.

    So we have a very interesting paradigm: The porn industry now has a major influence on R&D in the semiconductor industry. :p
     
  20. WBahn

    Moderator

    Mar 31, 2012
    17,737
    4,789
    I don't buy that -- current processor speeds are plenty capable of downloading and playing HD video. Streaming HD video requires something in the vicinity of 5 to 10 Mbps. Now figure that 8 Mbps is just 1 MB/s and that is nothing compared to just about any computer's processing capabilities. Heck, that's just a few percent of what you can sustain over an old USB2.0 interface. The actual processing of video playback is pretty tame, too, even for highly compressed formats.
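
    The bits-to-bytes conversion behind that estimate takes only a couple of lines (the function name is my own, for illustration):

```python
def mbps_to_megabytes_per_sec(mbps):
    """Convert a bit rate in megabits per second to megabytes per second (8 bits per byte)."""
    return mbps / 8

# An 8 Mbps HD stream is only 1 MB/s of data to move and decode.
print(mbps_to_megabytes_per_sec(8))  # -> 1.0
```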

    It's my understanding that it is (and has been for quite some time) the gaming community that is driving most areas of computer performance, at both the server and the client ends and everything in between. Not only are there ever-increasing demands for content, but the live processing of the interactive environment is pushed to its limits. Given that these are highly competitive applications in which fast response means a greater chance of winning (and with the increasing amount of real-life money at stake in some of these games), it is not too surprising that the demand for high-end machines at premium prices is being driven by this market segment.
     