
Busy period ahead for Indian semicon, solar! Meanwhile, TI bids for Qimonda's tools!


Yes, looks like it!

First, on August 31, the India Semiconductor Association and UK-TI will be signing an MoU. The next day, September 1, there is a presentation by the Ministry of New and Renewable Energy and key officials on the government of India's policies for the industry!

Next, on September 4, the DIT Secretary, R. Chandrasekhar, and the Additional Secretary will be interacting with semiconductor companies in Bangalore.

Further on, September 16 is the day when the Union Minister for New and Renewable Energy, Dr Farooq Abdullah, will be interacting with a small group of industry leaders at a solar PV conclave in Hyderabad!

That’s quite a lot, within a span of 15-odd days! Must say, this augurs well for the Indian semicon and solar/photovoltaics industry.

Interestingly, a lot of the big events are focusing on solar. So, my hunch is that the Indian solar industry may have some serious announcements to make in the coming weeks. Should that happen, I hope to bring those to you, time permitting.

TI bids for Qimonda’s tools
Oh, by the way, there's news all over the Internet about Texas Instruments (TI) placing a bid of $172.5 million for Qimonda's 300mm production tools from its closed DRAM fab. While this highlights TI's focus on building the world's first 300mm analog fab, I can't stop wondering what would have happened had an Indian investor actually bought Qimonda!

Are EDA tools a commodity?


I had the pleasure of attending the 20th International Conference on VLSI Design and 6th International Conference on Embedded Systems in Bangalore, and had the good fortune of meeting a range of top experts from these fields.

One panel discussion, "Are EDA technology/products becoming a commodity?", particularly caught my attention. Speakers debated whether EDA tools were becoming commoditized, offering little or no differentiation toward project success.

Dr. Anand Anandkumar, managing director, Magma India, also a good friend, elucidated that the semiconductor design industry cannot do a complex SoC without EDA. And if there’s no EDA, there’s no integration. “If you are a commodity, you cannot solve problems!”

EDA is now a key driver for semiconductor design companies in achieving their objective of building ever more complex SoCs. However, the overall market size of the EDA industry [estimated at US $4 billion] remains a fraction of the overall semiconductor market [estimated at US $240 billion].

Dr Anandkumar added there had been various paradigm shifts and problems. The EDA industry was in a way the IP partner with the semiconductor industry. However, he agreed that parts of the tools had been commoditized. The EDA industry had become a prisoner of its own business model.

Nevertheless, newer things have been racing forward, and there is a variety of conflicting problems; understanding those problems could be a way of handling and solving complex designs. What had been largely absent was any taking on of risk: there was little ownership in sharing risks, and that needed to change.

From the perspective of consumer electronics eco-system, available EDA technology is often viewed as not being in sync with the expectations and requirements of various design teams. Claims of productivity and quality of results advantages from EDA teams can seem more like wishful thinking than reality to end users.

Moreover, related issues of quality, interoperability of standard formats, usability and understanding of designer needs are other areas of ongoing concern. These are not necessarily new issues, so what are EDA companies and their customers doing to address them? Has the EDA industry been getting its proportional value out of the semiconductor industry? Would love to hear from you.

Home-grown EDA company provides innovative tools


Bangalore-based Softjin Technologies is an electronic design automation (EDA) company providing unique EDA solutions for the semiconductor industry. According to Kamal Aggarwal, VP-Marketing and Strategy, the company's core capability lies in developing innovative EDA tools for the specific requirements of customers, such as semiconductor companies and other EDA product companies.

Softjin follows a "hybrid" business model, a mix of services and products. It also provides FPGA-based system design services and design methodology services for customers.

Softjin's current products are licensable EDA building blocks that can be used as part of providing an EDA solution to customers in the post-layout and logic-synthesis technology areas.

Following the announcement of India's historic semicon policy, Softjin expects to see more investments in the semiconductor manufacturing space. As the policy provides incentives for setting up semiconductor manufacturing units above a certain size, Aggarwal expects more big-ticket announcements to emerge in the near future.

Watch the Indian semicon space, guys, as the action heats up…

Accelerating EDA innovation through SoC design methodology convergence


According to Dr. Walden C. Rhines, chairman and CEO, Mentor Graphics Corp., verification has to improve and change every year just to keep up with the rapidly changing semiconductor technology. Fortunately, the innovations are running ahead of the technology and there are no fundamental reasons why we cannot adequately verify the most complex chips and systems of the future. He was speaking at the recently held DVCON 2014 in Bangalore, India.

DVCon India 2014.

A design engineer's project time spent on design has reduced by 15 percent from 2007 to 2014, while the engineer's time spent on verification has seen a 17 percent increase over the same period. At this rate, in about 40 years, all of a designer's time will be devoted to verification. At the current rate, there is almost no chance of getting a single-gate design correct on first pass!

Looking at the crossover of verification engineers vs. design engineers, the headcount CAGR for designers is 4.55 percent, while for verification engineers it is 12.62 percent.
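
As a rough back-of-the-envelope illustration of what those growth rates imply, the short Python sketch below shows how quickly a 12.62 percent CAGR overtakes a 4.55 percent one. The starting headcounts are made-up placeholders, not figures from the keynote; only the two CAGRs come from the talk.

```python
# Illustrative only: starting headcounts are hypothetical placeholders.
# Only the growth rates (4.55% and 12.62% CAGR) come from the keynote data.
designers = 100.0       # hypothetical design-engineer headcount in 2007
verifiers = 60.0        # hypothetical verification-engineer headcount in 2007
DESIGN_CAGR = 0.0455
VERIF_CAGR = 0.1262

year = 2007
while verifiers < designers:
    designers *= 1 + DESIGN_CAGR
    verifiers *= 1 + VERIF_CAGR
    year += 1

print(f"With these assumptions, verifiers outnumber designers around {year}.")
```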

Looking at non-FPGA projects' schedule completion trends, on-time completion remains roughly constant: 67 percent of projects were behind schedule in 2007, 66 percent in 2010, 67 percent in 2012, and 59 percent in 2014. There has also been an increase in the average number of embedded processors per design, from 1.12 to 4.05.

Macro trends
Looking at the macro trends, there has been standardization of verification languages. SystemVerilog is the only verification language growing. Now, interestingly, India leads the world in SystemVerilog adoption. It is also remarkable that the industry converged on IEEE 1800. SystemVerilog is now mainstream.

There has been standardization in base-class libraries as well. UVM adoption grew 56 percent between 2012 and 2014, and a further 13 percent growth in UVM is projected for the next year. Again, India leads the world in UVM adoption.

The second macro trend is standardization of the SoC verification flow. It is emerging from ad hoc approaches to systematic processes. The verification paradox is: a good verification process lets you get the most out of best-in-class verification tools.

The goal of unit-level checking is to verify that the functionality is correct for each IP, while achieving high coverage. Use of advanced verification techniques has also increased from 2007 to 2014.

Next, the goal of connectivity checking is to ensure that the IP blocks are connected correctly, a common goal with IP integration and data path checking.
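
As a toy illustration of what such a connectivity check amounts to, one can compare an extracted connection list against the integration spec. The block and port names below are invented for the example; real flows do this on netlists with structural or formal tools.

```python
# Toy connectivity check: compare extracted top-level connections against
# the integration spec. Block/port names here are purely illustrative.
spec_connections = {
    ("cpu.axi_m", "interconnect.s0"),
    ("interconnect.m0", "ddr_ctrl.axi_s"),
    ("cpu.irq", "intc.irq_in0"),
}

extracted_connections = {
    ("cpu.axi_m", "interconnect.s0"),
    ("interconnect.m0", "ddr_ctrl.axi_s"),
    ("cpu.irq", "intc.irq_in1"),   # mis-hooked interrupt line
}

missing = spec_connections - extracted_connections
unexpected = extracted_connections - spec_connections

for conn in sorted(missing):
    print("MISSING   :", " -> ".join(conn))
for conn in sorted(unexpected):
    print("UNEXPECTED:", " -> ".join(conn))
```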

The goal of system-level checking is performance, power analysis and SoC functionality. Also, there are SoC 'features' that need to be verified.

A third macro trend is coverage and power analysis across all aspects of verification. The Unified Coverage Interoperability Standard (UCIS) was announced by Accellera at DAC 2012. Standards accelerate EDA innovation!

The fourth trend is active power management. Now, low-power design requires multiple verification approaches. Trends in power management verification include things like Hypervisor/OS control of power management, application-level power management, operation in each system power state, interactions between power domains, hardware power control sequence generation, transitions between system power states, power domain state reset/restoration, and power domain power down/power up.
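
To make one of those items concrete, "transitions between system power states" boils down to checking that every observed transition is legal per the power spec. Below is a minimal sketch with hypothetical state names and a hypothetical legal-transition table; a real flow would derive these from the UPF/power spec and express the checks as assertions.

```python
# Minimal power-state transition checker. State names and the legal-transition
# table are hypothetical; a real flow would derive them from the UPF/power spec.
LEGAL = {
    "ACTIVE": {"IDLE", "SLEEP"},
    "IDLE":   {"ACTIVE", "SLEEP"},
    "SLEEP":  {"IDLE"},          # must wake through IDLE, not straight to ACTIVE
}

def check_trace(trace):
    """Flag any transition in a simulation/emulation trace that the spec forbids."""
    violations = []
    for current, nxt in zip(trace, trace[1:]):
        if nxt not in LEGAL.get(current, set()):
            violations.append((current, nxt))
    return violations

trace = ["ACTIVE", "IDLE", "SLEEP", "ACTIVE"]   # SLEEP -> ACTIVE is illegal here
print(check_trace(trace))                        # [('SLEEP', 'ACTIVE')]
```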

Macro enablers in verification
Looking at the macro enablers in verification, there is the intelligent test bench, multi-engine verification platforms, and application-specific formal. The intelligent test bench technology accelerates coverage closure. It has also seen the emergence of intelligent software driven verification.
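
A crude way to picture how an "intelligent" testbench accelerates coverage closure: instead of purely random stimulus, bias generation toward bins that are still uncovered. The sketch below is a deliberately simplified model of that idea, not any vendor's actual algorithm.

```python
import random

# Simplified model of coverage closure: 64 coverage bins, each test hits one bin.
BINS = 64

def tests_to_close(pick):
    covered, tests = set(), 0
    while len(covered) < BINS:
        covered.add(pick(covered))
        tests += 1
    return tests

random_pick = lambda covered: random.randrange(BINS)
# "Intelligent" generation: steer stimulus toward bins not yet covered.
smart_pick = lambda covered: random.choice([b for b in range(BINS) if b not in covered])

random.seed(0)
print("purely random :", tests_to_close(random_pick), "tests")
print("coverage-aware:", tests_to_close(smart_pick), "tests")
```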

Embedded software headcount surges with every node. Clock speed scaling slows the simulation performance improvement. Growing at over 30 percent CAGR from 2010-14, emulation is the fastest growing segment of EDA.

As for system-level checking, as design sizes increase, the use of emulation goes up while FPGA prototyping goes down. Modern emulation performance makes virtual debug fast. Virtual stimulus makes the emulator a server and moves it from the lab to the datacenter, thereby delivering more productivity, flexibility and reliability. Effective 100MHz embedded software debug makes a virtual prototype behave like real silicon. Integrated simulation/emulation/software verification environments have now emerged.

Lastly, for application-specific formal, the larger designs use more formal. The application-specific formal includes checking clock domain crossings.

DVCon India 2014 aims to bring Indian design, verification and ESL community closer!

September 11, 2014

DVCon India 2014 has come to Bangalore, India, for the first time. It will be held at the Hotel Park Plaza in Bangalore, on Sept. 25-26. Dr. Wally Rhines, CEO, Mentor Graphics will open the proceedings with his inaugural keynote.

Other keynotes will be from Dr. Mahesh Mehendale, MCU chief technologist, TI; Janick Bergeron, verification fellow, Synopsys; and Vishwas Vaidya, assistant GM, Electronics, Tata Motors.

Gaurav Jalan, SmartPlay, chair of the promotions committee, took time to speak about DVCon India 2014.

Focus of DVCon India 2014

First, what's the focus of DVCon India 2014? According to Jalan, DVCon has been a premier conference in the US, contributing quality tutorials, papers and an excellent platform for networking. DVCon India focuses on filling the void of a vendor-neutral, quality conference in the neighbourhood, one that will grow over time.

The idea is to bring together the hitherto dispersed, yet substantial, design, verification and ESL community and give it a voice. Engineers get a chance to learn solutions to verification problems, share the effectiveness of solutions they have experimented with, understand off-the-shelf solutions available in the market, and meet a vendor-agnostic user fraternity. Moving forward, the expectation is to get users involved as early adopters of upcoming standards and active contributors to them.

Trends in design

Next, what are the trends today in design? Jalan said that while designs continue to march along the lines of Moore's law, there is a lot happening beyond the mere gate count. Defining and developing IPs with a wide range of configuration options, serving a variety of application domains, is a challenge.

SoCs are crossing into multi-billion-gate designs (the A8 in the iPhone 6 is around 2 billion), with a multi-fold increase in complexity due to multiple clock domains, power domains and voltage domains, while still delivering the required performance in different application modes with a sleek footprint.

Trends in verification

Now, let's examine the trends today in verification. While design complexity increases linearly, verification effort jumps exponentially. While UVM has settled the dust to some extent at the IP verification level, a host of challenges still awaits. IP itself is growing in size, straining simulators and encouraging users to move to emulators. And while UVM settled the methodology war, the VIPs available are still not simulator-agnostic, and an emulator-agnostic VIP portfolio is still a distant dream.

SoC verification is still a challenge, not just due to the sheer size but because porting an environment from block level to SoC level is difficult. Test plan definition and development at the SoC level is a challenge in itself; Accellera's Portable Stimulus group is addressing this.

Similarly, coverage collected from different tools is difficult to merge; Accellera's unified coverage group is addressing this. Low power is the norm today, and verifying a power-aware design is quite challenging; UPF is an attempt to standardize this.

Porting an SoC to an emulator to enable hardware acceleration and run use cases is another trend picking up. Teams are now able to boot Android on an SoC even before the silicon arrives. With growing analog content on chip, the onus is on verification engineers to ensure that the digital and analog sides of the chip work together as per spec. Formal apps have picked up to address connectivity tests, register-spec testing, low-power static checks and much more.

Accelerating EDA innovation

So, how will EDA innovation get accelerated? According to Jalan, the semiconductor industry has always seen startups and smaller companies lead innovation. Given the plethora of challenges around, there are multiple opportunities to be addressed, by both the biggies and the start-ups.

The evolution of standards at Accellera is definitely a great step toward bringing the focus onto real innovation in the tools, while providing a platform for the user community to come forward, share challenges and propose alternatives. With a standard baseline defined in collaboration with all partners in the ecosystem, EDA companies can focus on competing on performance, user interface, increased tool capacity and enabling faster time to market.

Forums like DVCon India help grow awareness of the standards promoted by Accellera, while encouraging participants from different organizations and geographies to join and contribute. Apart from tools, areas where EDA innovation should pick up include new IT technologies and platforms such as the cloud and mobile devices.

Next level of verification productivity

Where is the next level of verification productivity likely to come from? To this, Jalan replied that verification productivity improves along several dimensions.

While faster tools with increased capacity come from innovation at the EDA end, standards have played an excellent role as well. UVM has helped displace vendor-specific technologies, improving interoperability, reusability and ramp-up time for engineers. Similarly, on the power-format side, UPF has played an important role in bridging the gaps.

Unified coverage is another aspect that will help in closing early with coverage-driven verification. The IP-XACT and SystemRDL standards further help in packaging IPs and easing hand-off to enable reuse. Similarly, other standards covering ESL, AMS and so on help close the loopholes that hamper productivity.

A new portable stimulus specification, now being developed under Accellera, will help ease test development at different levels, from IP to subsystem to SoC. For faster simulations, the increasing adoption of hardware-acceleration platforms is helping verification engineers improve regression turnaround time.

Formal technologies play an important role in providing mathematical proofs for common verification challenges at an accelerated pace compared to simulation. Finally, events like DVCon enable users to share their experiences and knowledge, encouraging others to try out solutions instead of struggling through the process of discovering or inventing their own.

More Indian start-ups

Finally, do the organizers expect to see more Indian start-ups post this event? Yes, says Jalan. “We even have a special incubation booth that is encouraging young startups to come forth and exhibit at a reduced cost (only $300). We are creating a platform and soon we will see new players in all areas of Semiconductor.

“Also, the Indian government’s push in the semiconductor space will give new startups further incentive to mushroom. These conferences help entrepreneurs to talk to everyone in the community about problems, vet potential solutions and seek blessings from gurus.”


Cadence now realizing EDA 360 vision: Nimish Modi


EDA 360 was an industry vision. It reflected a change in market requirements: application-driven system design. From a Cadence perspective, the company has taken this forward as system design enablement, according to Nimish Modi, senior VP, marketing and business development, Cadence Design Systems Inc.

In Apple's case, the iOS is unique. Cadence feels that the heart of the design is the SoC. Electrical analysis is becoming very important: for instance, how do you optimize before tape-out? Hardware-software convergence presents a huge problem as well. IP plays an important part. Cadence did IP-as-a-service, and it now has an IP strategy.

Today, EDA is about possibility, not productivity. Cadence provides tools and content for semiconductor and systems companies. It is now realizing the EDA 360 vision.

On IP
According to Modi, each IP block is immensely complex. Standards-based or interface IP alone is not enough! Silicon-proven design is the need of the hour. Now, more and more IP blocks are coming together.

FPGA-based prototyping
Cadence offers the Palladium XP, whose primary use is system verification. Software development is becoming a little difficult, so people are providing software prototypes. Palladium compile, turnaround and debug are very fast, best in class. All memory, clocking, partitioning, etc., is now automated.

The capacity of the Protium platform is 100 million gates. It will enable hardware and software developers. The use model for Protium is:
* Hardware folks use it for hardware regression.
* Software folks use it for early software development.

The main value proposition is the faster bring-up time. The Palladium hybrid model, a hybrid of emulation and virtual prototyping, also helps customers overcome the boot problem. Dynamic power analysis is another issue, and the hybrid model helps with that testing as well.

Collaboration with ARM
ARM provides processor IP, and Cadence works closely with ARM, co-optimizing its tools to deliver the best PPA. Physical libraries and tools get optimized, and Cadence's tools are optimized for the ARM architecture. Cadence was also among the first to get access to the ARMv8 models.


Optic2Connect develops software for photonics!


Sean Seah

Optic2Connect will be present at this year’s DAC. I caught up with Sean Seah, project manager, to find out more.

First, what’s the company’s X factor and why? (What is it that makes your offering special and noteworthy – how are you different from competitors)?

Optic2Connect develops software solutions for the photonics industry. The demand to manage high volumes of data in networks, especially with the current smartphone and cloud-computing trend, has increased tremendously. As designs get more complex, simulation tools need to scale in fidelity and accuracy.

Currently, photonic designers, scientists and fabrication engineers adopt an approximation approach, building an equivalent optical model from electrical data and hence losing device-physics detail. At the same time, the process is long, as the model needs to be described block by block, with denser blocks representing a more detailed model. Our competitors are well established in their respective domains, electrical or optical, but only strong within their own fields. However, intimate knowledge of both is essential to fully understand this newer generation of photonic devices; failure to understand them fully results in false results from manufacturing.

With patented know-how, Optic2Connect provides software solutions that solve this pertinent challenge. They accurately map simulations from one domain to another, e.g. electrical to optical. The technology was developed by a team of researchers at A*STAR, Singapore's public research institute, and it overcomes error-prone, detail-heavy simulation setups. We have demonstrated the ability to map without losing any fidelity in the simulation files.

Optic2Connect's IP differs from its competitors' because it simulates directly from the initial device processing, through electrical device performance, to the final high-speed optical eye diagram. This is in stark contrast to the usual method of representing device operation using simplified transfer functions.

Furthermore, the Optic2Connect design flow uses the same reliable tools and processes from the semiconductor industry that are fully compatible with the complementary metal-oxide-semiconductor (CMOS) fabrication process of silicon microelectronics. This design flow uses standard tool libraries, device models (especially for active components such as modulators), and simulation of these components incorporating those models.

How have you been doing this year so far? Seah said: “It has been excellent! We are racing to complete our product prototypes and we secured a contract from an MNC and another from universities.”

What's the future path likely to be? Seah added: "We intend to further validate our prototype with our partners from industry and academia, and to integrate advanced modulation formats into our solutions. We want to offer a fully integrated solution for photonic devices to our customers. Our goal is to offer a one-stop solution for leading integrated-circuit (IC) manufacturers!"

Why this name? You sounded like a telecom company!

Seah said: "We strongly believe the future of communications is via optics, which has the ability to circumvent data bottleneck issues. Optic2Connect is meant to offer connections using optical communications. Our goal is a one-stop solution for optical connections."

How will the solution significantly shorten product time-to-market and reduce development costs of photonics devices?

For complex photonics devices, minute changes to design parameters are significant and could affect loss performance and operating-voltage requirements. One common approach in the industry today is to physically build the variations into multiple devices/runs and test them out. Each run costs in the range of hundreds of thousands of dollars and consumes precious time, especially if the first batch of devices does not meet the required parameters and additional batches are required. This costs both money and time, which in turn is more money.

Hence, Optic2Connect provides an elegant solution: accurate modelling and simulation that accelerate manufacturing prototypes at much lower production cost. Our software solutions provide a 10x improvement in development time and time to market. Further, our cloud solution overcomes the traditional problem of insufficient servers/licenses, especially during periods of peak demand.
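
To put rough numbers on that argument, here is a back-of-the-envelope sketch. All figures are placeholders consistent only with the "hundreds of thousands per run" and "10x" claims above, not actual Optic2Connect data.

```python
# Back-of-the-envelope cost comparison. All numbers are illustrative placeholders.
fab_run_cost = 300_000       # "hundreds of thousands" per fabrication run (USD)
fab_runs_needed = 3          # e.g. first two batches miss the target parameters
sim_license_cost = 50_000    # hypothetical annual software/cloud cost
sim_runs_needed = 1          # one fab run after the design is validated in simulation

build_and_test = fab_run_cost * fab_runs_needed
simulate_first = sim_license_cost + fab_run_cost * sim_runs_needed

print(f"build-and-test approach : ${build_and_test:,}")
print(f"simulate-first approach : ${simulate_first:,}")
```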

Set up strong methodology teams to create better verification infrastructure: Synopsys


Arindam Ghosh

This is the third installment on verification, taken up now by Synopsys. Regarding the biggest verification mistakes today, Arindam Ghosh, director – Global Technical Services, Synopsys India, listed these as:

* Spending no time on verification planning (not documenting what needs to be verified) and focusing more on running simulations or on execution.
* No or very low investment in building better verification environments (based on best/new methodologies and best practices); instead maintaining older verification environments.
* Compromising on verification completeness because of tape out pressures and time-to-market considerations.

Would you agree that many companies STILL do not know how to verify a chip?

He said that it could be true for smaller companies or start-ups, but most of the major semiconductor design engineers know about the better approaches/methodologies to verify their chips. However, they may not be investing in implementing the new methodologies for multiple reasons and may instead continue to follow the traditional flows.

One way to address these mistakes would be to set up strong methodology teams to create a better verification infrastructure for future chips. However, few companies are doing this.

Are companies realizing this and building an infrastructure that gets you business advantage? He added that some companies do realize this and are investing in building a better infrastructure (in terms of better methodology and flows) for verification.

When should good verification start?
When should good verification start — after design; as you are designing and architecting your design environment? Ghosh said that good verification starts as soon as we start designing and architecting the design. Verification leads should start discussing the verification environment components with the lead architect and also start writing the verification plan.

Are folks making a mistake by looking at tools and not at the verification process itself? According to him, tools play a major role in the effectiveness of any verification process, but there is still a lot of scope for methodology improvements beyond the tools.

What needs to go into verification planning, given that the 'right' verification path is fraught with complexities? Ghosh said that there is no single, foolproof recipe for a 'right' verification path. It depends on multiple factors, including whether the design is a new product or a derivative, the design's application, etc. But yes, it is very important to do comprehensive verification planning before starting the verification process.

Synopsys is said to be building the comprehensive, unified and integrated verification environment required for today's revolutionary SoCs, one that would offer a fundamental shift forward in productivity, performance, capacity and functionality. Synopsys' Verification Compiler provides the software capabilities, technologies, methodologies and VIP required for the functional verification of advanced SoC designs in one solution.

Verification Compiler includes:
* Better capacity and compile and runtime performance.
* Next-generation static and formal technology delivering performance improvement and the capacity to analyze a complete SoC (Property checking, LP, CDC, connectivity).
* Comprehensive low power verification solution.
* Verification planning and management.
* Next-generation verification IP and a deep integration between VIP and the simulation engine, which in turn can greatly improve productivity.  The constraint engine is tuned for optimal performance with its VIP library. It has integrated debug solutions for VIP so one can do protocol-level analysis and transaction-based analysis with the rest of the testbench.
* Support for industry standard verification methodologies.
* X-propagation simulation with both RTL and low power simulations.
* Common debug platform with better debug technology and new capabilities: tight integrations with simulation, emulation, testbench, transaction debug, power-aware debug, hw/sw debug, formal, VIP and coverage.

Top five recommendations for verification
What would be Synopsys’ top five recommendations for verification?

* Spend a meaningful amount of time and effort on verification planning before execution.
* Continuously invest in building a better verification infrastructure and methodologies across the company for better productivity.
* Collaborate with EDA companies to develop, evaluate and deploy new technologies and flows, which can bring more productivity to verification processes.
* Nurture fresh talent through regular on and off-the-job trainings (on flows, methodologies, tools, technology).
* Conduct regular reviews of the completed verification projects with the goal of trying to improve the verification process after every tapeout through methodology enhancements.


Cadence: Plan verification to avoid mistakes!


Apurva Kalia

Following Mentor Graphics, Cadence Design Systems Inc. has entered the verification debate. ;) I met Apurva Kalia, VP R&D – System & Verification Group, Cadence Design Systems. In a nutshell, he advised that there needs to be proper verification planning in order to avoid mistakes. First, let's try to find out the biggest verification mistakes.

Top verification mistakes
Kalia said that the biggest verification mistakes made today are:
* Verification engineers do not define a structured notion of verification completeness.
* Verification planning is not done up front and is carried out as verification is going along.
* A well-defined reusable verification methodology is not applied.
* Legacy tools continue to be used for verification; new tools and technologies are not adopted.

In that case, why do some companies STILL not know how to verify a chip?

He added: “I would not describe the situation as companies not knowing how to verify a chip. Instead, I think a more accurate description of the problem is that the verification complexity has increased so much that companies do not know how to meet their verification goals.

“For example, the number of cycles needed to verify a current generation processor – as calculated by traditional methods of doing verification – is too prohibitive to be done in any reasonable timeframe using legacy verification methodologies. Hence, new methodologies and tools are needed. Designs today need to be verified together with software. This also requires new tools and methodologies. Companies are not moving fast enough to define, adopt and use these new tools and methodologies thereby leading to challenges in verifying a chip.”

Addressing challenges
How are companies trying to address the challenges?

Companies are trying to address the challenges in various ways:
* Companies at the cutting edge of designs and verification are indeed trying to adopt structured verification methodologies to address these challenges.

* Smaller companies are trying to address these challenges by outsourcing their verification to experts and by hiring more verification experts.

* Verification acceleration and prototyping solutions are being adopted to get faster verification, allowing companies to do more verification in the same amount of time.

* Verification environment re-use helps to cut down the time required to develop verification environments.

* Key requirements of SoC integration and verification—including functionality, compliance, power, performance, etc.—are hardware/software debug efficiency, multi-language verification, low power, mixed signal, fast time to debug, and execution speed.

Cadence has the widest portfolio of tools to help companies meet verification challenges, including:

Incisive Enterprise Manager, which provides hierarchical verification technology for multiple IPs, interconnects, hardware/software, and plans to improve management productivity and visibility;

The recently launched vManager solution, a verification planning and management solution enabled by client/server technology to address the growing verification closure challenge driven by increasing design size and complexity;

Incisive Enterprise Verifier, which delivers dual power from tightly integrated formal analysis and simulation engines; and

Incisive Enterprise Simulator, which provides the most comprehensive IEEE language support with unique capabilities supporting the intent, abstraction, and convergence needed to speed silicon realization.

Are companies building an infrastructure that gets you business advantage? Yes, companies are realizing the problems. It is these companies that are the winners in managing today’s design and verification challenges, he said.

Good verification
When should good verification start?

Kalia noted: “Good verification should start right at the time of the high level architecture of the design. A verification strategy should be defined at that time, and an overall verification plan should be written at that time. This is where a comprehensive solution like Incisive vManager can help companies manage their verification challenges by ensuring that SoC developers have a consistent methodology for design quality enhancements.”

Are folks making a mistake by looking at tools and not at the verification process itself?

He added that the right tools and methodology are needed to resolve today's verification challenges. Users need to work on defining verification methodologies and, at the same time, look at the tools needed to achieve their verification goals.

Verification planning
Finally, there’s verification planning! What should be the ‘right’ verification path?

Verification planning needs to include the following (a toy sketch of such a plan, captured as data, appears after the list):

* A formal definition of verification goals;
* A formal definition of coverage goals at all levels – starting with code coverage all the way to functional coverage;
* Required resources – human and compute;
* Verification timelines;
* All the verification tools to be used for verification; and
* Minimum and maximum signoff criteria.
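
Purely as an illustration of what those items might look like once written down, here is a toy skeleton of a verification plan captured as data. Every name, number and milestone below is invented for the example; a real plan would live in a planning/management tool rather than a script.

```python
# Toy skeleton of a verification plan following the items listed above.
# Names, numbers and milestones are invented for illustration only.
verification_plan = {
    "verification_goals": ["all IP-level features", "SoC boot", "power sequencing"],
    "coverage_goals": {
        "code_coverage_pct": 100,
        "functional_coverage_pct": 95,
    },
    "resources": {"engineers": 6, "compute_slots": 200},
    "timeline": {"plan_signoff": "2014-10-15", "verification_signoff": "2015-02-01"},
    "tools": ["simulator", "formal", "emulation", "coverage/plan management"],
    "signoff_criteria": {
        "minimum": "no open must-fix bugs, 100% code coverage",
        "maximum": "100% functional coverage, all use cases run on emulation",
    },
}

for section, content in verification_plan.items():
    print(f"{section}: {content}")
```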

Five recommendations for verification: Dr. Wally Rhines


Dr. Wally Rhines

It seems to be the season of verification. The Universal Verification Methodology (UVM 1.2) is being discussed across conferences. Dennis Brophy, director of Strategic Business Development, Mentor Graphics, says that the UVM 1.2 release is imminent, and UVM remains a topic of great interest.

Biggest verification mistakes
Before I add Dennis Brophy's take on UVM 1.2, I discussed the intricacies of verification with Dr. Wally Rhines, chairman and CEO, Mentor Graphics Corp. First, I asked him about the biggest verification mistakes today.

Dr. Rhines said: “The biggest verification mistake made today is poor or incomplete verification planning. This generally results in underestimating the scope of the required verification effort. Furthermore, without proper verification planning, some teams fail to identify which verification technologies and tools are appropriate for their specific design problem.”

Would you agree that many companies STILL do not know how to verify a chip?

Dr. Rhines added: “I would agree that many companies could improve their verification process. But let’s first look at the data. Today, we are seeing that about 1/3 of the industry is able to achieve first silicon success. But what is interesting is that silicon success within our industry has remained constant over the past ten years (that is, the percentage hasn’t become any worse).

“It appears that, while design complexity has increased substantially during this period, the industry is at least keeping up with this added complexity through the adoption of advanced functional verification techniques.

“Many excellent companies view verification strategically (and as an advantage over their competition). These companies have invested in maturing both their verification processes and teams and are quite productive and effective. On the other hand, some companies are struggling to figure out the entire SoC space and its growing complexity and verification challenges.”

How are companies trying to address those?

According to him, the recent Wilson Research Group Functional Verification Study revealed that the industry is maturing its verification processes through the adoption of various advanced functional verification techniques (such as assertion-based verification, constrained-random simulation, coverage-driven techniques, and formal verification).  Complexity is generally forcing these companies to take a hard look at their existing processes and improve them.

Getting business advantage
Are companies realizing this and building an infrastructure that gets you business advantage?

He added that in general, there are many excellent companies out there that view verification strategically and as an advantage over their competition, and they have invested in maturing both their verification processes and teams. On the other hand, some other companies are struggling to figure out the entire SoC space and its growing complexity and verification challenges.

When should good verification start?
When should good verification start — after design; as you are designing and architecting your design environment?

Dr. Rhines noted: “Just like the design team is often involved in discussion during the architecture and micro-architecture planning phase, the verification team should be an integral part of this process. The verification team can help identify architectural aspects of the design that are going to be difficult to verify, which ultimately can impact architectural decisions.”

Are folks mistaken by looking at tools and not at the verification process itself? What can be done to reverse this?

He said: "Tools are important! However, it is also important to get the most out of the tools and ensure that the verification solution is an efficient and repeatable process. At Mentor Graphics, we recognize the importance of both. That is why we created the Verification Academy, which focuses on developing skills and maturing an organization's functional verification processes."

What needs to go into verification planning, given that the 'right' verification path is fraught with complexities?

Dr. Rhines said: “During verification planning, too many organizations focus first on the “how” aspect of verification versus the “what.” How a team plans to verify its designs is certainly important, but first you must identify exactly what needs to be verified. Otherwise, something is likely to slip through.

“In addition, once you have clearly identified what needs to be verified, it’s an easy task to map the functional verification solutions that will be required to productively accomplish your verification goals. This also identifies what skill sets will need to be developed or acquired to effectively take advantage of the verification solutions that you have identified as necessary for your specific problem.”

How is Mentor addressing this situation?

Mentor Graphics’ Verification Academy was created to help organizations mature their functional verification processes—and verification planning is one of the many excellent courses we offer.

In addition, Mentor Graphics’ Consulting provides customized solutions to technical challenges on real projects with real schedules. By helping customers successfully integrate advanced functional verification technologies and methodologies into their work flows, we help ensure they meet their design and business objectives.

Five recommendations for verification
Finally, I asked him, what would be your top five recommendations for verification?

Here are the five recommendations for verification from Dr. Rhines:

* Ensure your organization has implemented an effective verification planning process.

* Understand which verification solutions and technologies are appropriate (and not appropriate) for various classes of designs.

* Develop or acquire the appropriate skills within your organization to take advantage of the verification solutions that are required for your class of design.

* For the SoC class of designs, don’t underestimate the effort required to verify the hardware/software interactions, and ensure you have the appropriate resources to do so.

* For any verification processes you have adopted, make sure you have appropriate metrics in place to help you identify the effectiveness of your process—and identify opportunities for process improvements in terms of efficiency and productivity.
