According to Dr. Walden C. Rhines, chairman and CEO, Mentor Graphics Corp., verification has to improve and change every year just to keep up with the rapidly changing semiconductor technology. Fortunately, the innovations are running ahead of the technology and there are no fundamental reasons why we cannot adequately verify the most complex chips and systems of the future. He was speaking at the recently held DVCON 2014 in Bangalore, India.
A design engineer’s project time for doing design has fallen by 15 percent from 2007 to 2014, while the engineer’s time for doing verification has seen a 17 percent increase over the same period. At this rate, in about 40 years, all of a designer’s time will be devoted to verification. At the current rate, there is almost no chance of getting even a single-gate design correct on first pass!
Looking at a crossover of verification engineers vs. design engineers, the CAGR for designers is 4.55 percent, while for verification engineers it is 12.62 percent.
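To put a rough number on that crossover, here is a back-of-envelope sketch in Python. Only the two CAGRs come from the data above; the 2007 headcounts are hypothetical, since the talk gave growth rates rather than absolute numbers.

    # Back-of-envelope: when do verification engineers outnumber designers,
    # given the CAGRs quoted above? The starting headcounts are assumptions.
    design_cagr = 0.0455      # design engineers, from the study
    verif_cagr = 0.1262       # verification engineers, from the study

    designers = 100.0         # hypothetical 2007 baseline
    verifiers = 60.0          # hypothetical 2007 baseline

    year = 2007
    while verifiers < designers:
        designers *= 1 + design_cagr
        verifiers *= 1 + verif_cagr
        year += 1
    print(f"With these assumed baselines, verifiers overtake designers around {year}")

With those made-up baselines the crossover lands within about seven years; the point is simply how quickly a 12.62 percent CAGR overtakes a 4.55 percent one.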
On-time completion remains roughly constant. Looking at non-FPGA projects’ schedule completion trends: 67 percent were behind schedule in 2007, 66 percent in 2010, 67 percent in 2012, and 59 percent in 2014. There has also been an increase in the average number of embedded processors per design, from 1.12 to 4.05.
Looking at the macro trends, there has been standardization of verification languages. SystemVerilog is the only verification language growing. Now, interestingly, India leads the world in SystemVerilog adoption. It is also remarkable that the industry converged on IEEE 1800. SystemVerilog is now mainstream.
There has been standardization in base class libraries as well. There was 56 percent UVM growth between 2012 and 2014, and a further 13 percent growth is projected for the next year. Again, India leads the world in UVM adoption.
The second macro trend is standardization of the SoC verification flow. It is emerging from ad hoc approaches to systematic processes. The verification paradox is: a good verification process lets you get the most out of best-in-class verification tools.
The goal of unit-level checking is to verify that the functionality is correct for each IP, while achieving high coverage. Use of advanced verification techniques has also increased from 2007 to 2014.
Next, the goal of connectivity checking is to ensure that the IP blocks are connected correctly, a goal it shares with IP integration and data-path checking.
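The idea behind connectivity checking can be shown with a tiny sketch: compare the connections actually present in the netlist against the connections the integration spec expects. All block, port, and signal names below are invented for illustration.

    # Toy connectivity check: actual top-level connections vs. the specification.
    expected = {
        ("cpu.axi_m", "interconnect.s0"),
        ("interconnect.m0", "ddr_ctrl.axi_s"),
        ("cpu.irq_in", "intc.irq_out"),
    }
    actual = {
        ("cpu.axi_m", "interconnect.s0"),
        ("interconnect.m0", "ddr_ctrl.axi_s"),
        # cpu.irq_in left unconnected, should be flagged
    }

    missing = expected - actual    # specified but absent from the netlist
    extra = actual - expected      # present but never specified
    for src, dst in sorted(missing):
        print(f"MISSING connection: {src} -> {dst}")
    for src, dst in sorted(extra):
        print(f"UNEXPECTED connection: {src} -> {dst}")
    print("Connectivity check", "PASSED" if not (missing or extra) else "FAILED")

In a real flow the expected list would come from an IP-XACT description or an integration spreadsheet, and formal connectivity apps prove the equivalent properties exhaustively.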
The goal of system-level checking is performance, power analysis and SoC functionality. Also, there are SoC ‘features’ that need to be verified.
The third macro trend is coverage and power awareness across all aspects of verification. The Unified Coverage Interoperability Standard (UCIS) was announced at DAC 2012 by Accellera. Standards accelerate EDA innovation!
The fourth trend is active power management. Now, low-power design requires multiple verification approaches. Trends in power management verification include things like Hypervisor/OS control of power management, application-level power management, operation in each system power state, interactions between power domains, hardware power control sequence generation, transitions between system power states, power domain state reset/restoration, and power domain power down/power up.
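One way to picture the ‘transitions between system power states’ and ‘operation in each system power state’ items is as a legal-transition check over an observed trace. The sketch below is plain Python rather than any tool’s format (real flows capture power intent in UPF/CPF and check it with assertions), and the states and trace are invented.

    # Toy check that an observed sequence of system power states only uses
    # legal transitions. The state names and transition table are assumptions.
    legal = {
        "OFF":    {"BOOT"},
        "BOOT":   {"ACTIVE"},
        "ACTIVE": {"IDLE", "SLEEP"},
        "IDLE":   {"ACTIVE", "SLEEP"},
        "SLEEP":  {"BOOT", "OFF"},
    }
    trace = ["OFF", "BOOT", "ACTIVE", "IDLE", "SLEEP", "ACTIVE"]  # last hop is illegal here

    for prev, curr in zip(trace, trace[1:]):
        if curr not in legal.get(prev, set()):
            print(f"Illegal power-state transition: {prev} -> {curr}")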
Macro enablers in verification
Looking at the macro enablers in verification, there are the intelligent test bench, multi-engine verification platforms, and application-specific formal. Intelligent test bench technology accelerates coverage closure. We have also seen the emergence of intelligent software-driven verification.
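The core of an intelligent test bench is a feedback loop: measure which coverage bins have been hit and steer the next stimulus toward the holes. A heavily simplified sketch, with a made-up coverage model of packet lengths:

    import random

    # Coverage model: packet-length bins we want to hit at least once (made-up example).
    bins = {"tiny": range(1, 8), "small": range(8, 64), "large": range(64, 1500)}
    hit = {name: False for name in bins}

    trials = 0
    while not all(hit.values()) and trials < 10_000:
        open_bins = [name for name, seen in hit.items() if not seen]
        target = random.choice(open_bins)            # bias generation toward coverage holes
        length = random.choice(list(bins[target]))   # directed-random stimulus
        # ... drive a packet of 'length' bytes into the design under test here ...
        hit[target] = True
        trials += 1
    print(f"Coverage closed after {trials} generated packets")

Real intelligent-testbench tools do this over constrained-random SystemVerilog/UVM environments and far richer coverage models, but the loop has the same shape.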
Embedded software headcount surges with every node. The slowdown in clock-speed scaling limits simulation performance improvement. Growing at over 30 percent CAGR from 2010-14, emulation is the fastest growing segment of EDA.
As for system-level checking, as design sizes increase, the use of emulation goes up and FPGA prototyping goes down. Modern emulation performance makes virtual debug fast. Virtual stimulus turns the emulator into a server and moves it from the lab to the datacenter, thereby delivering more productivity, flexibility, and reliability. Effective 100MHz embedded software debug makes a virtual prototype behave like real silicon. Now, integrated simulation/emulation/software verification environments have emerged.
Lastly, for application-specific formal, larger designs use more formal. Application-specific formal includes checking clock-domain crossings.
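To make the clock-domain-crossing example concrete, here is a tiny structural sketch: given the clock domain of each signal and a list of crossings known to pass through synchronizers, flag the rest. The signal names, domains, and the synchronizer list are all illustrative.

    # Toy CDC structural check; domain assignments and synchronizer list are assumptions.
    clock_domain = {
        "tx_valid": "clk_fast",
        "rx_ready": "clk_slow",
        "cfg_mode": "clk_slow",
    }
    connections = [("tx_valid", "rx_ready"), ("cfg_mode", "tx_valid")]
    synchronized = {("tx_valid", "rx_ready")}   # crossings that go through a 2-flop sync

    for src, dst in connections:
        if clock_domain[src] != clock_domain[dst] and (src, dst) not in synchronized:
            print(f"Unsynchronized CDC: {src} ({clock_domain[src]}) -> "
                  f"{dst} ({clock_domain[dst]})")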
Cadence Design Systems Inc. recently announced that its Quantus QRC Extraction Solution has been certified for TSMC’s 16nm FinFET process.
So, what’s unique about the Cadence Quantus QRC Extraction Solution?
KT Moore, senior group director – Product Marketing, Digital and Signoff Group, Cadence Design Systems, said: “There are several parasitic challenges that are associated with advanced node designs — especially FinFET – and it’s not just about tighter geometries and new design rules. We can bucket these challenges into two main categories: increasing complexity and modeling challenges.
“The number of process corners is exploding, and for FinFET devices specifically, there is an explosion in the parasitic coupling capacitances and resistances. This increases the design complexity and sizes. The netlist is getting bigger and bigger, and as a result, there is an increase in extraction runtimes for SoC designs and post-layout simulation and characterization runtimes for custom/analog designs.
“Our customers consistently tell us that, for advanced nodes, and especially for FinFET designs, while their extraction runtimes and time-to-signoff are increasing, their actual time-to-market is shrinking, putting an enormous amount of pressure on designers to deliver on-time tapeout. In order to address these market pressures, we have employed the massively parallel technology that was first introduced in our Tempus Timing Signoff Solution and Voltus IC Power Integrity Solution to our next-generation extraction tool, Quantus QRC Extraction Solution.
“Quantus QRC Extraction Solution enables us to deliver up to 5X better performance than competing solutions and allows scalability of up to 100s of CPUs and machines.”
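To picture what ‘massively parallel’ means in practice, here is a deliberately generic sketch of fanning extraction work out across CPUs with Python’s multiprocessing. Partitioning by net and the placeholder extract_net function are assumptions for illustration only, not a description of how Quantus QRC is actually implemented.

    from multiprocessing import Pool

    def extract_net(net_id: int) -> tuple[int, float]:
        # Placeholder for per-net parasitic extraction; returns a dummy capacitance.
        return net_id, 0.001 * (net_id % 7)

    if __name__ == "__main__":
        nets = range(100_000)              # hypothetical design with 100k nets
        with Pool(processes=8) as pool:    # scale 'processes' with available CPUs/machines
            results = pool.map(extract_net, nets, chunksize=1000)
        print(f"Extracted parasitics for {len(results)} nets")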
Support for FinFET features
How is Quantus providing significant enhancements to support FinFET features?
Parasitic extraction is at the forefront with the introduction of any new technology node. For FinFET designs, it’s a bit more challenging due to the introduction of non-planar FinFET devices. There are more layers to be handled, more RC effects that need to be modeled, and an introduction of local interconnects. There are also second- and third-order manufacturing effects that need to be modeled, and all these new features have to be modeled with precise accuracy.
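To see why all those R and C values matter downstream, a standard back-of-envelope model is the Elmore delay of an RC ladder: each resistance carries the charging current of all the capacitance downstream of it. The segment values below are invented purely to show the arithmetic, not taken from any real process.

    # Elmore delay of a small RC ladder (illustrative numbers only).
    r_per_seg = [50.0, 50.0, 50.0]        # ohms per wire segment
    c_per_seg = [2e-15, 2e-15, 2e-15]     # farads per wire segment

    delay = 0.0
    for i, r in enumerate(r_per_seg):
        downstream_c = sum(c_per_seg[i:])  # each R drives all capacitance after it
        delay += r * downstream_c
    print(f"Elmore delay is roughly {delay * 1e12:.2f} ps")

If the extracted R and C values are off, every such delay estimate, and hence timing signoff, is off with them.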
Performance and turnaround times are absolutely important, but if you can’t provide accuracy for these devices — especially in correlation to the foundry golden data — designers would have to over-margin their designs and leave performance on the table.
How can Cadence claim that it has the ‘tightest correlation to foundry golden data at TSMC vs. competing solutions’? And, why 16nm only?
According to Moore, the foundry partner, TSMC, asserts that Quantus QRC Extraction Solution provides best-in-class accuracy, which was referenced in the recent press announcement:
“Cadence Quantus QRC Extraction Solution successfully passed TSMC’s rigorous parasitic extraction certification requirements to achieve best-in-class accuracy against the foundry golden data for FinFET technology.”
FinFET structures present unique challenges since they are non-planar devices, as opposed to their planar CMOS predecessors. We partnered with TSMC from the very beginning to address the modeling challenges, and we’ve seen many complex shapes and structures over the years that we’ve modeled accurately.
“We’re not surprised that TSMC has recognized our best-in-class accuracy because we’re the leader in providing extraction solutions for RF designs. Cadence Quantus QRC Extraction Solution has been certified for TSMC 16nm FinFET, however, it’s important to note that we’ve been certified for all other technology nodes and our QRC techfiles are available to our customers from TSMC today.”
Following Mentor Graphics, Cadence Design Systems Inc. has entered the verification debate. ;) I met Apurva Kalia, VP R&D – System & Verification Group, Cadence Design Systems. In a nutshell, he advised that there needs to be proper verification planning in order to avoid mistakes. First, let’s try to find out the biggest verification mistakes.
Top verification mistakes
Kalia said that the biggest verification mistakes made today are:
* Verification engineers do not define a structured notion of verification completeness.
* Verification planning is not done up front; it is carried out as verification goes along.
* A well-defined reusable verification methodology is not applied.
* Legacy tools continue to be used for verification; new tools and technologies are not adopted.
In that case, why do some companies STILL not know how to verify a chip?
He added: “I would not describe the situation as companies not knowing how to verify a chip. Instead, I think a more accurate description of the problem is that the verification complexity has increased so much that companies do not know how to meet their verification goals.
“For example, the number of cycles needed to verify a current generation processor – as calculated by traditional methods of doing verification – is too prohibitive to be done in any reasonable timeframe using legacy verification methodologies. Hence, new methodologies and tools are needed. Designs today need to be verified together with software. This also requires new tools and methodologies. Companies are not moving fast enough to define, adopt and use these new tools and methodologies thereby leading to challenges in verifying a chip.”
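A hedged back-of-envelope makes the cycle-count point concrete. All of the figures below are assumptions chosen only for the arithmetic; it is the shape of the calculation, not the exact numbers, that matters.

    # How long would a large regression take in RTL simulation vs. emulation?
    # All figures are illustrative assumptions.
    cycles_needed = 5_000 * 10_000_000    # 5,000 tests x 10M cycles per test
    sim_speed_hz = 100                    # assumed full-SoC RTL simulation speed, cycles/s
    emu_speed_hz = 1_000_000              # assumed emulation speed, cycles/s

    sim_days = cycles_needed / sim_speed_hz / 86_400
    emu_days = cycles_needed / emu_speed_hz / 86_400
    print(f"Simulation alone: ~{sim_days:,.0f} days; emulation: ~{emu_days:,.1f} days")

With those assumptions, simulation alone works out to years of wall-clock time while emulation finishes in under a day, which is the gap that new methodologies and engines are meant to close.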
How are companies trying to address the challenges?
Companies are trying to address the challenges in various ways:
* Companies at the cutting edge of designs and verification are indeed trying to adopt structured verification methodologies to address these challenges.
* Smaller companies are trying to address these challenges by outsourcing their verification to experts and by hiring more verification experts.
* Verification acceleration and prototyping solutions are being adopted to get faster verification, allowing companies to do more verification in the same amount of time.
* Verification environment re-use helps to cut down the time required to develop verification environments.
* Key requirements of SoC integration and verification—including functionality, compliance, power, performance, etc.—are hardware/software debug efficiency, multi-language verification, low power, mixed signal, fast time to debug, and execution speed.
Cadence has the widest portfolio of tools to help companies meet verification challenges, including:
Incisive Enterprise Manager, which provides hierarchical verification technology for multiple IPs, interconnects, hardware/software, and plans to improve management productivity and visibility;
The recently launched vManager solution, a verification planning and management solution enabled by client/server technology to address the growing verification closure challenge driven by increasing design size and complexity;
Incisive Enterprise Verifier, which delivers dual power from tightly integrated formal analysis and simulation engines; and
Incisive Enterprise Simulator, which provides the most comprehensive IEEE language support with unique capabilities supporting the intent, abstraction, and convergence needed to speed silicon realization.
Are companies building an infrastructure that gets them a business advantage? Yes, companies are realizing the problems. It is these companies that are the winners in managing today’s design and verification challenges, he said.
When should good verification start?
Kalia noted: “Good verification should start right at the time of the high level architecture of the design. A verification strategy should be defined at that time, and an overall verification plan should be written at that time. This is where a comprehensive solution like Incisive vManager can help companies manage their verification challenges by ensuring that SoC developers have a consistent methodology for design quality enhancements.”
Are folks making a mistake by looking at tools and not at the verification process itself?
He added that the right tools and methodology are needed to resolve today’s verification challenges. Users need to work on defining verification methodologies and, at the same time, look at the tools that are needed to achieve verification goals.
Finally, there’s verification planning! What should be the ‘right’ verification path?
Verification planning needs to include the following (a rough sketch of capturing such a plan as data appears after the list):
* A formal definition of verification goals;
* A formal definition of coverage goals at all levels – starting with code coverage all the way to functional coverage;
* Required resources – human and compute;
* Verification timelines;
* All the verification tools to be used for verification; and
* Minimum and maximum signoff criteria.
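Purely as an illustration of how such a plan can be captured and tracked, here is a minimal sketch. The goals, numbers, and field names are hypothetical and not tied to vManager or any particular tool.

    # Minimal, hypothetical representation of a verification plan plus a signoff check.
    plan = {
        "goals": ["All IP features exercised", "SoC boots through the OS"],
        "coverage_targets": {"code": 100.0, "functional": 95.0},   # percent
        "resources": {"engineers": 6, "sim_licenses": 40},
        "timeline_weeks": 26,
        "tools": ["simulation", "formal", "emulation"],
        "signoff": {"min_functional_cov": 90.0, "max_open_bugs": 0},
    }
    status = {"functional_cov": 92.5, "open_bugs": 3}   # made-up progress snapshot

    met = (status["functional_cov"] >= plan["signoff"]["min_functional_cov"]
           and status["open_bugs"] <= plan["signoff"]["max_open_bugs"])
    print("Signoff criteria met" if met else "Signoff criteria NOT met")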
I was pointed to a piece of news on TV, where the ruling chief minister of an Indian state apparently announced that he could make that state another Silicon Valley! Interesting!!
First, what’s the secret behind Silicon Valley? Well, I am not even qualified enough to state that! However, all I can say is: it is probably a desire to do something very different, and to make the world a better place – that’s possibly the biggest driver in all the entrepreneurs that have come to and out of Silicon Valley in the USA.
If you look up Wikipedia, it says that the term Silicon Valley originally referred to the region’s large number of silicon chip innovators and manufacturers, but eventually came to refer to all high-tech businesses in the area, and is now generally used as a metonym for the American high-technology sector.
So, where exactly is India’s high-tech sector? How many Indian state governments have even tried to foster such a sector? Ok, even if the state governments tried to foster one, where are the entrepreneurs? Ok, an even easier one: how many school dropouts from India, or even small-time entrepreneurs, have made a foray into high-tech?
Right, so where are the silicon chip innovators from India? Sorry, I did not hear a word you said. Can you speak a little louder? It seems there are none! Rather, there has been very little to no development in India, barring the work that is done by the MNCs. Correct?
One friend told me that Bangalore is a place that can be Silicon Valley. Really? How?? With the presence of MNCs, he said! Well, Silicon Valley in the US does not have MNCs from other countries, does it? Let’s see! Some companies with bases in Silicon Valley, listed on Wikipedia, include Adobe, AMD, Apple, Applied Materials, Cisco, Facebook, Google, HP, Intel, Juniper, KLA-Tencor, LSI, Marvell, Maxim, Nvidia, SanDisk, Xilinx, etc.
Now, most of these firms have setups in Bangalore, but isn’t that part of the companies’ expansion plans? Also, I have emails and requests from a whole lot of youngsters asking me: ‘Sir, please advice me which company should I join?’ Very, very few have asked me: ‘Sir, I have this idea. Is it worth exploring?’
Let’s face the truth. We, as a nation, so far, have not been one to take up challenges and do something new. The ones who do, or are inclined to do so, are working in one of the many MNCs – either in India or overseas.
So, how many budding entrepreneurs are there in India, who are willing to take the risk and plunge into serious R&D?
It really takes a lot to even conceive a Silicon Valley. It takes people of great vision to build something of a Silicon Valley, and not the presence of MNCs.
Just look at Hsinchu, in Taiwan, or even Shenzhen, in China. Specifically, look up Shenzhen Hi-Tech Industrial Park and the Hsinchu Science Park to get some ideas.
Early this month, I caught up with Jaswinder Ahuja, corporate VP and MD, Cadence Design Systems India. With the global semiconductor industry having entered the sub-20nm era, there are a lot of things happening, and Cadence is sure to be present.
Performance in sub-20nm era
First, let’s see how the global semiconductor industry is performing after entering the sub-20nm era.
Ahuja replied: “Increased demand for faster, smaller, low-power chips continues to drive the geometry shrink as one of the ways to manage the low-power, higher performance goals in smaller form factors—in other words, PPA is driving the move to advanced node design.
“At Cadence, we are seeing a lot of interest in the wireless space, which includes smartphones, tablets, and consumer devices. In this market, you must support different standards, the device must be really fast, it must have Internet access, and all this must be done at lower power so that it does not drain the battery. We’re also seeing interest for advanced nodes in other segments such as computing and graphics processors.”
When speaking of advanced nodes, let’s also try and find out what Cadence is doing in helping achieve 10X faster power integrity analysis and signoff.
The Cadence Voltus IC Power Integrity Solution is a full-chip, cell-level power signoff tool that provides accurate, fast, and high-capacity analysis and optimization technologies to designers for debugging, verifying, and fixing chip power consumption, IR drop, and electromigration (EM) constraints and violations.
The Voltus solution includes innovative technologies such as massively parallel execution, hierarchical architecture, and physically aware power grid analysis and optimization. Beneficial as a standalone power signoff tool, Voltus IC Power Integrity Solution delivers even more significant productivity gains when used in a highly integrated flow with other key Cadence products, providing the industry’s fastest design closure technology.
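For readers newer to the terminology, IR drop is essentially Ohm’s law applied across the power-delivery network: the current drawn by the cells times the resistance of the grid between them and the supply. A deliberately tiny sketch, with invented numbers and a single series rail instead of a real mesh:

    # Toy static IR-drop estimate along one power rail (illustrative numbers only).
    vdd = 0.8                               # volts at the package bump
    segment_r = [0.05, 0.05, 0.05]          # ohms of rail between successive taps
    tap_current = [0.02, 0.03, 0.01]        # amps drawn at each tap

    voltage = vdd
    for i, r in enumerate(segment_r):
        downstream_i = sum(tap_current[i:])  # current flowing through this segment
        voltage -= downstream_i * r
        print(f"Tap {i}: {voltage:.4f} V (drop {vdd - voltage:.4f} V)")

A signoff tool does this over a full-chip grid with enormous node counts, dynamic current waveforms, and electromigration limits, which is where the capacity and parallelism described below come in.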
Developed with advanced algorithms and a new power integrity analysis engine with massively parallel execution, Voltus IC Power Integrity solution:
* Performs 10X faster than other solutions on the market.
* Supports very large designs—up to one billion instances—with its hierarchical architecture.
* Delivers SPICE-level accuracy.
* Enhances physical implementation quality via physically aware power integrity optimization.
Supported by major foundries and intellectual property (IP) providers, the Voltus IC Power Integrity Solution has been validated and certified on advanced node processes such as 16nm FinFET and included in reference design flows such as those for 3D-IC technology. Backed by Cadence’s rigorous quality control and product release procedures, the Voltus solution delivers best-in-class signoff quality on accuracy and stability for all process nodes and design technologies.
FinFETs to 20nm – are folks benefiting?
It is common news that FinFETs have gone to 20nm and perhaps lower. Therefore, are those folks looking for power reduction now benefiting?
Ahuja replied that FinFETs allow semiconductor and systems companies to continue to develop commercially viable chips for the mobile devices that are dominating the consumer market. FinFETs enable new generations of high-density, high-performance, and ultra-low-power systems on chip (SoCs) for future smart phones, tablets, and other advanced mobile devices. Anyone who adopts FinFET technology will reap the benefits.
Foundry support for FinFETs will begin at 16nm and 14nm. In April of this year, Cadence announced a collaboration with ARM to implement the industry’s first ARM Cortex-A57 processor on TSMC’s 16nm FinFET manufacturing process. At ARM TechCon 2012, Cadence announced a 14nm test chip tapeout using an ARM Cortex-M0 processor and IBM’s FinFET process technology.