Electronics: Change Is In The Air

I have been lucky enough to take a relatively long vacation on the high seas and in Italy to recharge my batteries and celebrate my retirement. The time was very enjoyable, but the retirement is not lasting. There is only so much golf one can play, and so much metaphysics one can study. So I started to look at the EDA and semiconductor industries again, because this is what I love to do. And what I found, having stepped back for almost half a year, may surprise a few.

The Problem

The electronics industry is on the verge of a drastic change, and with it EDA and, to a lesser extent, the semiconductor industry will have to change as well. Since the mid-1970s, designers have relied on a semiconductor industry that has grown according to Moore's law. This so-called law, which is in fact simply a prediction, rests on the industry's ability to shrink transistors on a predictable schedule. In practice, then, every new design confronted only one obstacle: using more transistors in a manner that generated revenue.

Until now there has been only one physical hurdle to overcome: the wavelength of visible light. The shift to UV lithography was accomplished by semiconductor companies without any fundamental impact on the methods designers follow to develop ICs. Of course, complexity brought drastic changes to design and development methods. When circuits became too large to handle in a timely and reliable fashion, EDA provided hardware description languages (HDLs) like Verilog and VHDL. When Verilog proved too hardware-oriented to support very large designs, EDA turned to C and C++ and developed the tools required to translate those descriptions into hardware primitives. But in general the industry has spent billions producing a large number of engineers who understand logic but are almost ignorant of physics.

This deficiency of knowledge has driven up the cost of developing ICs below 90 nm, as second-order parasitic effects seriously affect the behavior of circuits. Crosstalk and leakage, above all, must now be dealt with. Addressing them up front with sound physics and logic design is still rare; designers are still fixing errors instead of avoiding them.

Warning bells are being sounded at the 20 nm process level. Development, design, and manufacture of these devices are proving more costly and less reliable than anticipated. Development costs run into the hundreds of millions of dollars, and yields fall short of expectations. Electronics companies also used to "purchase" manufacturing insurance by relying on a second source for IC fabrication. That practice has been abandoned, not because of regulations or competitiveness among foundries, but because a second source now requires what amounts to a second design.

At the 14 nm process node we are now finding what physicists found years ago: things are different at atomic and near-atomic scales. The industry is approaching the point at which quantum events are likely to occur within an IC. It is fine to read that company X has used EDA vendor Y's tools to produce a working test chip at foundry Z. That means we can survive; it does not mean we can produce and remain in business. A process node will be real only after it reaches a steady yield of 50% or more.
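To put that yield threshold in perspective, here is a minimal sketch using the classic Poisson yield model, Y = exp(-A * D0). The article names no model; this one, and all the numbers in it, are mine for illustration only.

```python
import math

def poisson_yield(die_area_cm2: float, defect_density_per_cm2: float) -> float:
    """Classic Poisson yield model: Y = exp(-A * D0).

    A  : die area in cm^2
    D0 : density of yield-killing defects per cm^2
    """
    return math.exp(-die_area_cm2 * defect_density_per_cm2)

# Hypothetical numbers: a 1 cm^2 die needs the defect density held below
# ln(2) ~ 0.69 defects/cm^2 just to clear the 50% yield bar.
for d0 in (0.3, 0.69, 1.5):
    print(f"D0 = {d0:4.2f}/cm^2 -> yield = {poisson_yield(1.0, d0):.1%}")
```

The point of the arithmetic is how unforgiving the exponent is: on a large die, a small rise in defect density collapses yield well below the level at which a node can sustain a business.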

A Proposed Scenario

There are at least two directions available to us to solve the problem, and both are required.

From a business and organizational point of view, we need the means to build a coalition of tightly integrated partners that covers the entire idea, design, realization, and manufacturing flow. Electronics companies, EDA vendors, foundries, and equipment providers must all be integrated into one producing organization and must understand each other's strengths and weaknesses: what each can provide, and what each must have in order to succeed.

The design of ICs must also change. It is no longer enough to treat a block as a portable sub-circuit that can be instantiated reliably anywhere in the topology. Engineers will have to deal with "neighborhoods": the electrical and physical influence of a transistor or cell on its surroundings changes with the environment and with its behavioral profile. It may be necessary to build two similar circuits at different parts of the die and activate them according to what else is happening on the chip at the time. The age of 97% or so silicon utilization was left behind years ago.
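A toy sketch of that duplicate-and-select idea follows. Every name, number, and the scoring rule are hypothetical; the point is only that the choice of which copy to activate depends on the current state of each copy's neighborhood, not on the block itself.

```python
from dataclasses import dataclass

@dataclass
class BlockInstance:
    """One of two functionally identical copies placed in different die regions."""
    name: str
    region_temp_c: float      # local temperature of the neighborhood
    neighbor_activity: float  # 0.0 (quiet) .. 1.0 (all neighbors switching)

def pick_active_copy(copies: list) -> BlockInstance:
    """Activate the copy whose neighborhood is currently least hostile.

    Toy cost function: hot, busy neighborhoods raise crosstalk and
    leakage risk, so prefer the cooler, quieter region.
    """
    def cost(b: BlockInstance) -> float:
        return 0.6 * (b.region_temp_c / 100.0) + 0.4 * b.neighbor_activity
    return min(copies, key=cost)

copies = [
    BlockInstance("mac_unit_north", region_temp_c=82.0, neighbor_activity=0.9),
    BlockInstance("mac_unit_south", region_temp_c=55.0, neighbor_activity=0.3),
]
print("activating:", pick_active_copy(copies).name)  # -> mac_unit_south
```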

On-board heuristic circuitry may have to be employed to enable power and clocking circuitry in various locations on the chip as a function of its present and predicted state. It may even be necessary to stop and resume operations, while fully preserving the logic state, in order to survive a damaging transitory physical event. A thousandth of a second is a long time in atomic terms.
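Here is a minimal sketch of that stop/retain/resume loop. Everything in it, including the notion of a predicted "hazard level" and the threshold, is my own invention for illustration; the real mechanism would be on-chip hardware, not software.

```python
class RetentionController:
    """Toy model: gate logic off when a transient event is predicted,
    retain the full logic state, and restore it once the event passes."""

    def __init__(self, hazard_threshold: float = 0.8):
        self.hazard_threshold = hazard_threshold
        self.saved_state = None
        self.running = True

    def tick(self, hazard_level: float, logic_state: dict) -> dict:
        """One control step, driven by a (hypothetical) hazard predictor."""
        if self.running and hazard_level >= self.hazard_threshold:
            self.saved_state = dict(logic_state)  # retain full logic state
            self.running = False                  # stop clocks / gate power
        elif not self.running and hazard_level < self.hazard_threshold:
            logic_state = dict(self.saved_state)  # resume exactly where we stopped
            self.running = True
        return logic_state

ctrl = RetentionController()
state = {"pc": 0x1F40, "acc": 7}
for hazard in (0.2, 0.9, 0.95, 0.3):  # predicted hazard over successive steps
    state = ctrl.tick(hazard, state)
    print(f"hazard={hazard:.2f} running={ctrl.running} state={state}")
```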

So while foundries and equipment providers work out how to arrange atoms into the desired structures, designers and EDA tool developers will have to learn to design a circuit not as a static thing that will eventually execute some functions, but as a dynamic entity whose logical and physical properties change as it executes those functions, within the boundaries imposed by the process being used.

Conclusion

Are we structured correctly to address the coming challenges? Are the financial arrangements we use today the correct ones? If not, can they be modified, or do we need to start from scratch? How do we train a new generation of designers who can intuitively foresee and prevent, or avoid, a set of problems the present generation of design engineers has never faced? It will take a very large investment, and significant time, to reach the same efficiency at the atomic level that we have achieved today at the UV/nanometer level.