Closing Comments & Perspective on Moore’s Law

IEEE SV Tech History Chairman Alan J Weissberger wraps up the panel session with comments about Moore’s law and thanks the panel participants.

5 responses to “Closing Comments & Perspective on Moore’s Law”

  1. Ted Hoff, PhD

    Comment from Ted Hoff, PhD, employee #12 at Intel in 1968:
    Gordon always considered his work more of an observation than a law. It did provide an important guideline for determining the optimum complexity of ICs at a given point in time. In the early days of Intel, somewhere around 10% yield was about optimum, and applying Moore’s law helped meet that goal. Assuming 100 die sites per wafer and a $50.00 processed-wafer cost, a 10% yield gives 10 good dice, so each would cost $5.00. Allowing $1.00 to package the die would result in a manufacturing cost of $6.00. Sell it in quantity for $10.00 to $20.00 and you make a nice profit.

    Double the size of the die, and you have fewer than 50 die sites and yields around 1%. Each chip would then cost more like $100, so you would need to charge $200 to $300 for the same profit, and would likely be competing with SSI/MSI designs, which were assumed to sit on a PC board at $1.00 to $2.00 per IC. The higher price would probably discourage applications, so production volumes would be lower and there would be less improvement from the learning curve.

    Given Moore’s law, just wait a few years and that more complex chip would be very manufacturable. In the past few years I have heard stories of perfect wafers: huge wafers of extremely complex chips running at 100% yield.

    You can go the other way as well: assume half the die size and over 200 sites per wafer with about 30% yield. Now the dice cost more like 80 cents each, but if packaging still costs $1.00, the final cost is close to $2.00. That probably comes pretty close to the original total cost for the customer, but based on Moore’s law, it would soon become obsolete.
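    All three of Ted Hoff’s scenarios follow the same cost arithmetic: spread the processed-wafer cost over the good dice, then add packaging. A minimal sketch (the function name and parameter names are mine, not from the comment):

    ```python
    def die_cost(wafer_cost, die_sites, yield_frac, package_cost=1.00):
        """Manufacturing cost of one packaged chip: wafer cost spread
        over the dice that actually work, plus packaging."""
        good_dice = die_sites * yield_frac      # working dice per wafer
        return wafer_cost / good_dice + package_cost

    # The three scenarios from the comment above:
    print(die_cost(50.00, 100, 0.10))   # baseline die: $6.00 per chip
    print(die_cost(50.00, 50, 0.01))    # double-size die: ~$101 per chip
    print(die_cost(50.00, 200, 0.30))   # half-size die: ~$1.83 per chip
    ```

    The nonlinearity is the whole point: doubling die area more than doubles cost per chip because yield collapses, which is why waiting a process generation (per Moore’s law) was the economical way to get a bigger die.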

    The advances in IC technology have helped reduce the cost of computing enormously, but there are still many areas where I think more computing progress might have been made, e.g. natural language processing, language translation, security, and reliability. Just because computer technology advances does not automatically mean computer usage advances in those areas.

    Somewhere, someone needs to devote a lot of time to solving those applications. Regarding IC design, the profits reaped by the semiconductor industry helped motivate the development of ever more capable design tools, and those tools helped reduce engineering costs. Standardization also reduces microprocessor design engineering cost while increasing the range of applications for microprocessors. The effect is to move the engineering burden from microprocessor chip design to software and firmware development.

    Alan, you were correct in noting that Moore’s original observation was made when he was at Fairchild. He considered that it applied to both MOS and bipolar designs. At Intel, most of the progress was made in MOS technology, although the Schottky bipolar design was a significant step in allowing bigger/more complex bipolar chips.

  2. Alan J Weissberger

    Ted and I discussed the negative impact Moore’s law has had on hardware and systems engineering companies that make circuit cards/boards/boxes. Many of them have gone out of business, with many engineers having to change careers. Meanwhile, no start-ups have been funded that would create jobs or work for technical consultants (like yours truly).

    There are three reasons, all consequences of Moore’s law, for the destruction of these engineering companies:

    1. As processors have become faster and more powerful, software has replaced many hardware functions. Video coding/decoding is one example. Dozens of Network Function Virtualization (NFV) virtual appliances are other examples.

    2. Programmable gate arrays have become much denser/more powerful with great improvement in CAD/CAE tools to aid in design, verification & test. That’s collapsed many circuit card logic functions into one or two gate arrays/FPGAs.

    3. More hardware functions have become integrated into a single SoC. Qualcomm and Broadcom offerings are great examples; those two companies are the undisputed leaders in telecom and networking chips, respectively. In years past, the functionality integrated into one such SoC would have taken one or two large circuit boards.

    As a result of the above, far fewer electronics engineers are needed and many equipment companies have disappeared. With the strong push toward disaggregation and commodity hardware, that trend is likely to accelerate, with Chinese/Taiwanese ODMs doing most of the server and bare-metal switch designs using highly integrated silicon. That leaves traditional EEs out of a job with no place to go.

  3. Tom Gardner

    Alan seems to have fallen for the Luddite fallacy: there is no evidence that technological progress causes aggregate job loss. Progress may cause local disruption; the weavers facing the looms of the 19th century may be like yesterday’s Silicon Valley circuit board manufacturers, but overall, progress in semiconductor complexity (Moore’s observation) has led to increased technological employment.

    For example, in spite of the replacement of circuits by software, US EE employment is expected to grow 4% through 2022, less than the US average of 11%, mainly due to the offshoring of manufacturing, which has little to do with progress in semiconductor complexity. FWIW, EE employment in part of Silicon Valley grew by 2% between 2012 and 2013. Furthermore, US 2012 EE employment of about 300k is dwarfed by programming employment, now in excess of 2 million, with most segments projected to grow at more than double the national average. [Data from the US Bureau of Labor Statistics website.] When I got my EE degree, software wasn’t an identifiable occupation.

    If we counted the EEs in the Pacific Rim, there would be further evidence that progress in semiconductor complexity has, if anything, increased job opportunities for all technologists, EEs et al.

    The traditional answer for Silicon Valley’s traditional EE is, as many have done, a career change, perhaps into a software field or perhaps into another field, even out of the Valley. Anyone with the intelligence and persistence to get an EE degree should be capable of such a change.


  4. Ken Pyle

    All, thanks for the comments. As Tom points out, the number of software developers is projected by the Bureau of Labor Statistics to grow by 22%, resulting in 222,600 new jobs (compared to the 12k increase in EEs, for a total projected 318k EEs in 2022).

    One related topic is how the rate of change relates to structural unemployment. With Moore’s law and the associated innovation, the pace of obsolescence has quickened, both for capital and for human skills. The window during which a given set of skills is of value seems shorter than in years past.

    As you point out, Tom, the challenge for an individual is to keep their skills fresh and relevant. Another challenge for the prospective employee is the perception of their skills by potential employers or robo-resume-readers. They may not get the opportunity to try their new skills because they don’t make it through that filter and don’t get the job.

    On the positive side, there are more opportunities for a person to start their own gig via things like crowd-funding. Plus, with the aforementioned integration, so much can be done with so little human and machine capital.

    It seems like tech has become more like the fashion, movie, or even sports business, where you have hits and may have short bursts of income. It probably becomes even more incumbent upon individuals to sock away that income while they can and to continually reinvest in themselves.

    1. Alan J Weissberger

      I agree with Tom’s statement: “The answer for Silicon Valley’s traditional EE is, as many have done, a career change, perhaps into a software field or perhaps in an other field, even out of the Valley.”

      That has worked for some EEs I’ve known, who became high school math/physics teachers, a furnace technician, and, in the case of one with a PhD in EE from Stanford, a mobile app developer in Singapore. Those roles are all way below their capabilities, with much lower earnings.

      For others, like myself and a few contemporaries, it was too late to change careers and compete with 20-somethings who know the latest web software programming languages and tools.

      As Kris Verma noted during the Oct 9th IEEE/CHM meeting, 9 out of every 10 new engineering jobs in Silicon Valley are software related. A professional recruiter recently told me that software engineers with knowledge of web/mobile OSes, programming languages, and open source offerings are in high demand. Conversely, there is zero demand for traditional design, applications, and systems engineers.

      Hence, there’s great dislocation and pain for those trained and experienced as traditional EEs. As an example, a 50-something ASIC designer/project manager was laid off from Cisco a few months ago. He’s been unable to get even a single interview since then and is thinking of moving to Asia.
