Coping With Complexity
Seven years ago, on the occasion of Synopsys’ 20th birthday, I talked to the company’s founder, chairman and CEO, Aart de Geus.
“I remember designers saying digital gates will never work,” recalled de Geus. “Now 98 per cent of the functionality in the world is digital, so you can either look at digital design as the most horrible restriction ever put on electronics, or as the most powerful enabler of automation and of dealing with complexity there is. And I, of course, am in the second camp.”
“Our job is harnessing complexity so that the job of our customers is to leverage complexity,” said de Geus. “If we don’t harness it, they cannot add more functionality. If we harness it, there’s a lot more that can be put on chips, 5X to 10X more functionality, that will come out in the next decade.”
I asked him whether the complexity issue could be solved not by EDA but by more universal hardware platforms. Could a genius come up with a hardware platform that could be the basis of any electronic product?
“Yes, and that’s called the microprocessor,” replied de Geus. “So why isn’t everything done with a single microprocessor? Well, the genius wasn’t quite good enough, because the very versatility of the microprocessor, which can do literally everything, means that, for most tasks, it’s incredibly slow if you want to do just that task. FPGAs have been a good example of programmable logic which is very versatile but, compared to a dedicated application, very slow and very costly.”
Are C-programmable, re-configurable, low-power embedded FPGAs a way to solve the complexity/cost issue?
“Yes, sure, everyone would like something that’s very high speed, very low power, very cheap, easy to use, low cost in low volume, low cost in high volume,” replied de Geus. “But you should take sub-sets of these qualifiers and say: OK, for this sub-set, which I can actually deliver against, what’s the market opportunity? Of course, the bigger the sub-set, the bigger the market opportunity. But the bigger the sub-set, the less likely you can actually deliver all these things. The challenge with all of these approaches is the balance between what you can actually deliver against, and what the market opportunity is.”
So de Geus is not looking to breakthroughs in hardware architectures to solve complexity problems.
“The age of all these individual pieces is over, because that’s not the way to get there,” said de Geus. “The natural maturation in any field is, if you want to use a negative word, discipline; if you want to use a positive word, automation.”
Could EDA solve the problem that, while transistors can still be shrunk to increase IC density, improvements in power and performance are no longer automatically delivered by more advanced processes?
“Low power and high speed are the same problem,” replied de Geus. “With every new generation you don’t immediately get something that is faster, because power is the fundamental limiter to switching speed on chips. With oxide thickness now at six or seven atoms, oxides are so thin that if you put too much energy around them you punch straight through. Lowering the voltage, which has been one of the things that has been done in the last couple of decades, is becoming difficult because, if you lower it too much, you have signal-to-noise issues. So, in my opinion, power is the single biggest and toughest restriction on advanced design today.”
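De Geus’s point about voltage is worth unpacking: the standard textbook relation for dynamic switching power, P ≈ αCV²f (activity factor, switched capacitance, supply voltage, clock frequency), scales with the square of the voltage, which is why voltage scaling was such an effective power lever for two decades, and why running out of voltage headroom is so painful. A minimal sketch with illustrative numbers (not figures from the interview):

```python
def dynamic_power(alpha, c_farads, v_volts, f_hertz):
    """Dynamic switching power P = alpha * C * V^2 * f (textbook approximation)."""
    return alpha * c_farads * v_volts ** 2 * f_hertz

# Illustrative values only: 20% activity, 1 nF total switched capacitance, 2 GHz clock.
p_1v0 = dynamic_power(0.2, 1e-9, 1.0, 2e9)  # at 1.0 V supply
p_0v5 = dynamic_power(0.2, 1e-9, 0.5, 2e9)  # at 0.5 V supply

# Halving the supply voltage cuts dynamic power 4x at the same frequency,
# which is exactly why designers keep pushing V down until noise margins give out.
print(p_1v0 / p_0v5)  # → 4.0
```

The quadratic term is the whole story here: frequency and capacitance trade off linearly, but voltage pays back squared, so once signal-to-noise limits stop further voltage reduction, the easiest power lever is gone.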
Could new materials, rather than EDA, provide the answer?
“Materials breakthroughs are always welcome because they do extend lifecycles,” answered de Geus. “All the fabrication guys are trying to improve power consumption from the manufacturing perspective. It doesn’t mean there couldn’t be some breakthrough producing some material that’s treated differently or has different implants. But all the people who work on that say: ‘Don’t count on us to reduce power loss’.”
So de Geus’ vision remains what it has been for 20 years: that design automation provides the answers to the chip industry’s problems. “The nature of advanced design,” he says, “is that the constraints all conflict with each other, and finding the optimum requires the tight integration of many steps, and this, to me, is the whole next generation of EDA.”