According to Dr. Walden C. Rhines, chairman and CEO of Mentor Graphics Corp., verification has to improve and change every year just to keep up with rapidly changing semiconductor technology. Fortunately, the innovations are running ahead of the technology, and there is no fundamental reason why we cannot adequately verify the most complex chips and systems of the future. He was speaking at DVCON 2014, held recently in Bangalore, India.
A design engineer’s share of project time spent on design fell by 15 percent from 2007 to 2014, while time spent on verification rose by 17 percent over the same period. At this rate, in about 40 years, all of a designer’s time will be devoted to verification; at that point, there would be almost no chance of getting even a single-gate design correct on first pass!
Looking at the crossover of verification engineers vs. design engineers, the number of design engineers is growing at a CAGR of 4.55 percent, while verification engineers are growing at a CAGR of 12.62 percent.
On-time completion has remained roughly constant. Among non-FPGA projects, 67 percent ran behind schedule in 2007, 66 percent in 2010, 67 percent in 2012, and 59 percent in 2014. Meanwhile, the average number of embedded processors per design has risen from 1.12 to 4.05.
Looking at the macro trends, there has been standardization of verification languages, and SystemVerilog is the only verification language growing. Interestingly, India now leads the world in SystemVerilog adoption. It is also remarkable that the industry converged on the IEEE 1800 standard. SystemVerilog is now mainstream.
There has been standardization in base class libraries as well. UVM use grew 56 percent between 2012 and 2014, with a further 13 percent growth projected for the next year. Again, India leads the world in UVM adoption.
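To give a flavor of what the UVM base class library standardizes, here is a minimal sketch of a UVM test and environment; the class names (my_env, my_test) are illustrative, not from any particular design.

```systemverilog
// Minimal UVM sketch: the phase machinery, factory, and reporting shown here
// are what the standardized base class library provides. Names are hypothetical.
import uvm_pkg::*;
`include "uvm_macros.svh"

class my_env extends uvm_env;
  `uvm_component_utils(my_env)
  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction
endclass

class my_test extends uvm_test;
  `uvm_component_utils(my_test)
  my_env env;

  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction

  // The UVM factory builds the environment during the standard build phase.
  function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    env = my_env::type_id::create("env", this);
  endfunction

  task run_phase(uvm_phase phase);
    phase.raise_objection(this);
    `uvm_info("TEST", "Running the standardized UVM phase machinery", UVM_LOW)
    phase.drop_objection(this);
  endtask
endclass
```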
The second macro trend is standardization of the SoC verification flow, which is evolving from ad hoc approaches into systematic processes. The verification paradox: a good verification process lets you get the most out of best-in-class verification tools.
The goal of unit-level checking is to verify that the functionality is correct for each IP, while achieving high coverage. Use of advanced verification techniques has also increased from 2007 to 2014.
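As a rough illustration of unit-level checking, the sketch below pairs constrained-random stimulus with a functional covergroup; the packet fields, ranges, and bins are hypothetical.

```systemverilog
// Unit-level checking sketch: constrained-random stimulus plus functional
// coverage to measure progress toward closure. All fields are hypothetical.
class packet;
  rand bit [7:0] addr;
  rand bit [3:0] kind;
  constraint legal_kind { kind inside {[0:9]}; }  // assume kinds 10-15 are reserved

  covergroup cg;
    coverpoint addr { bins low = {[0:127]}; bins high = {[128:255]}; }
    coverpoint kind;
    cross addr, kind;
  endgroup

  function new();
    cg = new();
  endfunction
endclass

module unit_tb;
  initial begin
    packet p = new();
    repeat (1000) begin
      void'(p.randomize());
      p.cg.sample();  // accumulate functional coverage toward closure
    end
  end
endmodule
```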
Next, the goal of connectivity checking is to ensure that the IP blocks are connected correctly, a goal it shares with IP integration and datapath checking.
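A minimal sketch of what an assertion-based connectivity check can look like; the module and signal names are hypothetical.

```systemverilog
// Connectivity-checking sketch: verify that an IP output actually reaches the
// top-level pad it is meant to drive. All names here are hypothetical.
module conn_check (input logic clk, input logic ip_out, input logic pad_in);
  // For a purely structural connection, the two nets must always agree.
  assert property (@(posedge clk) ip_out == pad_in)
    else $error("connectivity broken between IP output and pad");
endmodule
```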
The goal of system-level checking is performance, power analysis and overall SoC functionality. There are also SoC-level ‘features’ that need to be verified.
A third macro trend is the standardization of coverage across all aspects of verification. The Unified Coverage Interoperability Standard (UCIS) was announced by Accellera at DAC 2012. Standards accelerate EDA innovation!
The fourth trend is active power management. Low-power design now requires multiple verification approaches. Trends in power management verification include:
* Hypervisor/OS control of power management
* Application-level power management
* Operation in each system power state
* Interactions between power domains
* Hardware power control sequence generation
* Transitions between system power states
* Power domain state reset/restoration
* Power domain power down/power up
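One item on that list, transitions between system power states, lends itself to a simple assertion-based check; the state encoding and transition rule below are hypothetical.

```systemverilog
// Power-management sketch: check legal transitions between system power states.
// The state encoding and the transition rule here are hypothetical.
typedef enum logic [1:0] {ACTIVE, IDLE, SLEEP, OFF} pwr_state_e;

module pwr_state_check (input logic clk, rst_n, input pwr_state_e state);
  // Assume the design must pass through IDLE on its way to SLEEP or OFF.
  property no_direct_active_to_off;
    @(posedge clk) disable iff (!rst_n)
      (state == ACTIVE) |=> (state inside {ACTIVE, IDLE});
  endproperty
  assert property (no_direct_active_to_off)
    else $error("illegal power-state transition out of ACTIVE");
endmodule
```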
Macro enablers in verification
Looking at the macro enablers in verification, there are the intelligent testbench, multi-engine verification platforms, and application-specific formal. Intelligent testbench technology accelerates coverage closure, and intelligent software-driven verification has also emerged.
Embedded software headcount surges with every node. The slowdown in clock-speed scaling limits improvement in simulation performance. Growing at over 30 percent CAGR from 2010 to 2014, emulation is the fastest growing segment of EDA.
As for system-level checking, as design sizes increase, the use of emulation goes up and FPGA prototyping goes down. Modern emulation performance makes virtual debug fast. Virtual stimulus turns the emulator into a server and moves it from the lab to the datacenter, thereby delivering more productivity, flexibility, and reliability. Effective 100MHz embedded-software debug makes a virtual prototype behave like real silicon. Integrated simulation/emulation/software verification environments have now emerged.
Lastly, for application-specific formal, larger designs use more formal verification. Application-specific formal includes checking clock domain crossings (CDC).
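For illustration, a classic clock-domain-crossing structure that CDC-focused formal tools look for is the two-flop synchronizer; the sketch below is a generic example, not any tool’s template.

```systemverilog
// CDC sketch: a conventional two-flop synchronizer. Application-specific
// formal CDC tools check that every crossing passes through such a structure.
module sync2 (
  input  logic dst_clk,   // destination-domain clock
  input  logic async_in,  // signal arriving from another clock domain
  output logic sync_out   // synchronized version, safe in the dst_clk domain
);
  logic meta;
  always_ff @(posedge dst_clk) begin
    meta     <= async_in;  // first flop may go metastable
    sync_out <= meta;      // second flop filters out the metastability
  end
endmodule
```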
It seems to be the season of verification. The Universal Verification Methodology (UVM 1.2) is being discussed across conferences. Dennis Brophy, director of Strategic Business Development, Mentor Graphics, says that UVM 1.2 release is imminent, and UVM remains a topic of great interest.
Biggest verification mistakes
Before I add Dennis Brophy’s take on UVM 1.2, I discussed the intricacies of verification with Dr. Wally Rhines, chairman and CEO, Mentor Graphics Corp. First, I asked him about the biggest verification mistakes today.
Dr. Rhines said: “The biggest verification mistake made today is poor or incomplete verification planning. This generally results in underestimating the scope of the required verification effort. Furthermore, without proper verification planning, some teams fail to identify which verification technologies and tools are appropriate for their specific design problem.”
Would you agree that many companies STILL do not know how to verify a chip?
Dr. Rhines added: “I would agree that many companies could improve their verification process. But let’s first look at the data. Today, we are seeing that about 1/3 of the industry is able to achieve first silicon success. But what is interesting is that silicon success within our industry has remained constant over the past ten years (that is, the percentage hasn’t become any worse).
“It appears that, while design complexity has increased substantially during this period, the industry is at least keeping up with this added complexity through the adoption of advanced functional verification techniques.
“Many excellent companies view verification strategically (and as an advantage over their competition). These companies have invested in maturing both their verification processes and teams and are quite productive and effective. On the other hand, some companies are struggling to figure out the entire SoC space and its growing complexity and verification challenges.”
How are companies trying to address those?
According to him, the recent Wilson Research Group Functional Verification Study revealed that the industry is maturing its verification processes through the adoption of various advanced functional verification techniques (such as assertion-based verification, constrained-random simulation, coverage-driven techniques, and formal verification). Complexity is generally forcing these companies to take a hard look at their existing processes and improve them.
Getting business advantage
Are companies realizing this and building an infrastructure that gets you business advantage?
He reiterated that, in general, many excellent companies view verification strategically, as an advantage over their competition, and have invested in maturing both their verification processes and teams. On the other hand, some companies are struggling to figure out the entire SoC space and its growing complexity and verification challenges.
When should good verification start?
When should good verification start: after design, or while you are designing and architecting your design environment?
Dr. Rhines noted: “Just like the design team is often involved in discussion during the architecture and micro-architecture planning phase, the verification team should be an integral part of this process. The verification team can help identify architectural aspects of the design that are going to be difficult to verify, which ultimately can impact architectural decisions.”
Are folks mistaken by looking at tools and not at the verification process itself? What can be done to reverse this?
He said: “Tools are important! However, it is equally important to get the most out of the tools and to ensure that the verification solution is an efficient and repeatable process. At Mentor Graphics, we recognize the importance of both. That is why we created the Verification Academy, which focuses on developing skills and maturing an organization’s functional verification processes.”
Given that the ‘right’ verification path is fraught with complexities, what needs to go into verification planning?
Dr. Rhines said: “During verification planning, too many organizations focus first on the “how” aspect of verification versus the “what.” How a team plans to verify its designs is certainly important, but first you must identify exactly what needs to be verified. Otherwise, something is likely to slip through.
“In addition, once you have clearly identified what needs to be verified, it’s an easy task to map the functional verification solutions that will be required to productively accomplish your verification goals. This also identifies what skill sets will need to be developed or acquired to effectively take advantage of the verification solutions that you have identified as necessary for your specific problem.”
How is Mentor addressing this situation?
Mentor Graphics’ Verification Academy was created to help organizations mature their functional verification processes, and verification planning is one of the many courses on offer.
In addition, Mentor Graphics’ consulting arm provides customized solutions to technical challenges on real projects with real schedules. By helping customers successfully integrate advanced functional verification technologies and methodologies into their workflows, Mentor helps ensure they meet their design and business objectives.
Five recommendations for verification
Finally, I asked him for his top five recommendations for verification.
Here are the five recommendations for verification from Dr. Rhines:
* Ensure your organization has implemented an effective verification planning process.
* Understand which verification solutions and technologies are appropriate (and not appropriate) for various classes of designs.
* Develop or acquire the appropriate skills within your organization to take advantage of the verification solutions that are required for your class of design.
* For the SoC class of designs, don’t underestimate the effort required to verify the hardware/software interactions, and ensure you have the appropriate resources to do so.
* For any verification processes you have adopted, make sure you have appropriate metrics in place to help you identify the effectiveness of your process—and identify opportunities for process improvements in terms of efficiency and productivity.
It’s a pleasure to talk to Dr. Walden (Wally) C. Rhines, chairman and CEO, Mentor Graphics Corp. On his way to DAC 2013, where he will be giving a ten-minute “Visionary Talk”, he found time to speak with me. First, I asked him: given that the global semiconductor industry is entering the sub-20nm era, will it continue to be ‘business as usual’, or is it ‘going to be different this time’?
Dr. Rhines said: “Every generation has some differences, even though it usually seems like we’ve seen all this before. The primary change that comes with “sub-20nm” is the change in transistor structure to FinFET. This will give designers a boost toward achieving lower power. However, compared to 28nm, there will be a wafer cost penalty to pay for the additional process complexity that also includes two additional levels of resolution enhancement.”
Impact of new transistor structures
How will the new transistor structures impact design and manufacturing?
According to him, the relatively easy impact on design is related to the simulation of a new device structure; models have already been developed and characterized but will be continuously updated until the processes are stable. More complex are the requirements for place and route and verification; support for “fin grids” and new routing and placement rules has already been implemented by the leading place and route suppliers.
He added: “Most complex is test; FinFET will require transistor-level (or “cell-aware”) design for test to detect failures, rather than just the traditional gate-level stuck-at fault models. Initial results suggest that failure to move to cell-aware ATPG will result in 500 to 1,000 DPM (defective parts per million) being shipped to customers.
“Fortunately, “cell-aware” ATPG design tools have been available for about a year and are easily implemented with no additional EDA cost. Finally, there will be manufacturing challenges but, like all manufacturing challenges, they will be attacked, analyzed and resolved as we ramp up more volume.”
Introducing 450mm wafer handling and new lithography
Is it possible to introduce 450mm wafer handling and new lithography successfully at this point in time?
“Yes, of course,” Dr. Rhines said. “However, there are a limited number of companies that have the volume of demand to justify the investment. The wafer diameter transition decision is always a difficult one for the semiconductor manufacturing equipment companies because it is so costly and it requires a minimum volume of machines for a payback. In this case, it will happen. The base of semiconductor manufacturing equipment companies is becoming very concentrated and most of the large ones need the 450mm capability.”
What will be the impact of transistor variability and other physics issues?
As per Dr. Rhines, the impact should be significant. FinFET, for example, requires controlling the physical characteristics of multiple fins within a narrow range of variability. As geometries shrink, small variations become big percentages. New design challenges are always interesting for engineers, but the problems will be overcome relatively quickly.
Agnisys Inc. was established in 2007 in Massachusetts, USA, with a mission to deliver innovative automation to the semiconductor industry. The company offers affordable VLSI design and verification tools for SoCs, FPGAs and IPs that make the design verification process extremely efficient.
Agnisys’ IDesignSpec is an award-winning engineering tool that allows an IP, chip or system designer to create the register map specification once and automatically generate all possible views from it. Various outputs are possible, such as UVM, OVM, RALF, SystemRDL, IP-XACT, etc. User-defined outputs can be created using Tcl or XSLT scripts. IDesignSpec’s patented technology improves engineers’ productivity and design quality.
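To show the style of UVM output such register automation produces, here is a hand-written sketch of a UVM register model for a single register; the CTRL register and its fields are hypothetical, not actual IDesignSpec output.

```systemverilog
// Sketch of a UVM register model, the style of code a register-automation
// tool generates from a spec. The CTRL register and its fields are hypothetical.
import uvm_pkg::*;
`include "uvm_macros.svh"

class ctrl_reg extends uvm_reg;
  `uvm_object_utils(ctrl_reg)
  rand uvm_reg_field enable;
  rand uvm_reg_field mode;

  function new(string name = "ctrl_reg");
    super.new(name, 32, UVM_NO_COVERAGE);  // a 32-bit register
  endfunction

  virtual function void build();
    enable = uvm_reg_field::type_id::create("enable");
    mode   = uvm_reg_field::type_id::create("mode");
    //             parent, size, lsb, access, volatile, reset, has_reset, is_rand, indiv_access
    enable.configure(this,    1,   0,  "RW",        0,  1'b0,         1,       1,            0);
    mode.configure  (this,    2,   1,  "RW",        0, 2'b00,         1,       1,            0);
  endfunction
endclass
```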
IDesignSpec automates the creation of registers and sequences, guaranteeing higher quality and consistent results across hardware and software teams. As your ASIC or FPGA design specification changes, IDesignSpec automatically adjusts your design and verification code, keeping the critical integration milestones of your design engineering projects synchronized.
Register verification and sequences can consume 40 percent of project time or more when errors cause re-spins of SoC silicon or an increase in the number of FPGA builds. The IDesignSpec family of products is available in various flavors: IDSWord, IDSExcel, IDSOO and IDSBatch.
IDesignSpec: more than a tool for creating register models!
Anupam Bakshi, founder, CEO and chairman, Agnisys, said: “IDesignSpec is more than a tool for creating register models. It is now a complete Executable Design Specification tool. The underlying theme is always to capture the specification in an executable form and generate as much code in the output as possible.”
The latest additions to IDesignSpec are Constraints, Coverage, Interrupts, Sequences, Assertions, Multiple Bus Domains, Special Registers and Parameterization of outputs.
“IDesignSpec offers a simple and intuitive way to specify constraints. These constraints, specified by the user, are used to capture the design intent. This design intent is transformed into code for design, verification and software. Functional Coverage models can be automatically generated from the spec so that once again the intent is captured and converted into appropriate coverage models,” added Bakshi.
Using an add-on function for capturing Sequences, the user can now capture various programming sequences in the spec, which are translated into C++ and UVM sequences. Further, interrupt registers can now be identified by the user, and appropriate RTL can be generated from the spec. Both edge-sensitive and level-sensitive interrupts can be handled, and interrupts from various blocks can be stacked.
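As a rough idea of what a generated UVM register sequence might look like (the register name and init value are hypothetical, not actual tool output):

```systemverilog
// Sketch of a UVM sequence in the style a register-automation tool might
// generate from a programming sequence in the spec. Names are hypothetical.
import uvm_pkg::*;
`include "uvm_macros.svh"

class init_seq extends uvm_sequence;
  `uvm_object_utils(init_seq)
  uvm_reg_block model;  // handle to the generated register model

  function new(string name = "init_seq");
    super.new(name);
  endfunction

  task body();
    uvm_status_e status;
    uvm_reg ctrl = model.get_reg_by_name("ctrl_reg");
    ctrl.write(status, 32'h1);  // e.g., set a hypothetical enable bit
  endtask
endclass
```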
Assertions can be automatically generated from the high-level constraint specification. These assertions can be created within the RTL or in external files, such that they can optionally be bound to the RTL. Unit-level assertions are good for SoC-level verification and debug, and help the user identify issues deep in the simulation hierarchy.
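A minimal sketch of what a constraint-derived assertion in an external file, optionally bound to the RTL, could look like; the block, signal, and legal range are hypothetical.

```systemverilog
// Sketch of a generated assertion kept in an external file. The field name
// and its legal range (0..9) are hypothetical, standing in for a spec constraint.
module gen_assertions (input logic clk, input logic [3:0] mode_field);
  assert property (@(posedge clk) mode_field inside {[0:9]})
    else $error("mode_field outside its specified legal range");
endmodule

// Optional bind, so the RTL itself stays untouched (instance names hypothetical):
// bind my_block gen_assertions u_gen_assertions (.clk(clk), .mode_field(ctrl[3:0]));
```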
The user can now identify one or more bus domains associated with Registers and Blocks, and generate appropriate code from them. Special registers, such as shadow registers and register aliases, are also generated automatically.
Finally, all of the outputs, such as RTL, UVM, etc., can now be parameterized, so that a single master specification can be used to create outputs that are parameterized at elaboration time.
How is IDesignSpec working as chip-level assertion-based verification?
Bakshi said: “It really isn’t an assertion tool! The only assertion that we automatically generate is from the constraints that the user specifies. The user does not need to specify the assertions. We transform the constraints into assertions.”
It is always a pleasure speaking with Dr. Walden (Wally) C. Rhines, chairman and CEO, Mentor Graphics Corp. I met him on the sidelines of the 13th Global Electronics Summit, held at the Chaminade Resort & Spa, Santa Cruz, USA.
Status of global EDA industry
First, I asked Dr. Rhines how the EDA industry was doing. Dr. Rhines said: “The global EDA industry has been doing pretty well. The results have been pretty good for 2012. In general, the EDA industry tends to follow the semiconductor R&D by at least 18 months.”
For the record, the electronic design automation (EDA) industry revenue increased 4.6 percent for Q4 2012 to $1,779.1 million, compared to $1,700.1 million in Q4 2011.
Every region, barring Japan, grew in 2012. The Asia Pacific rim grew the fastest – about 12.5 percent. The Americas was the second fastest region in terms of growth at 7.4 percent, and Europe grew at 6.8 percent. However, Japan decreased by 3 percent in 2012.
In 2012, the segments that grew the fastest within the EDA industry were PCB design and IP. The front-end CAE (computer-aided engineering) group grew faster than the back-end. By product category, CAE grew 9.8 percent. The overall growth for license and maintenance was 7 percent. Among the CAE areas, design entry grew 36 percent and emulation 24 percent.
DFM also grew 28 percent last year. Overall, PCB grew 7.6 percent, while PCB analysis grew 25 percent. IP grew 12.6 percent, while verification IP grew 60 percent. Formal verification and power analysis grew 16 percent each. “That’s actually a little faster than how semiconductor R&D is growing,” added Dr. Rhines.
Status of global semicon industry
On the fortunes of the global semiconductor industry, Dr. Rhines said: “The global semiconductor industry grew very slowly in 2012. Year 2013 should be better. Revenue was also affected by a lot of consolidation in the wireless industry.”
According to him, smartphones should see further growth. “There are big investments in capacities in the 28nm segment. Folks will likely redesign their products over the next few years,” he said. “A lot of firms are waiting for FinFET to go to 20nm. People who need it for power reduction should benefit.”
“A lot of people are concerned about Japan. We believe that Japan can recover due to the Yen,” he added.