Search Results

Keyword: ‘level’

Higher levels of abstraction a growth area for EDA

September 1, 2013

Dr. Ajoy Bose

San Jose, USA-based Atrenta’s SpyGlass Predictive Analyzer gives engineers a powerful guidance dashboard that enables efficient verification and optimization of SoC designs early, before expensive and time-consuming traditional EDA tools are deployed. I recently met up with Dr. Ajoy Bose, chairman, president and CEO, Atrenta, to find out more.

I started by asking how Atrenta provides early design analysis for logic designers. He said: “The key ingredient is something we call predictive analysis. That is, we need to analyze a design at a high level of abstraction and predict what will happen when it undergoes detailed implementation. We have a rich library of algorithms that provide highly accurate ‘predictions’, without the time and cost required to actually send a design through detailed implementation.”

There’s a saying: electronic system level (ESL) is where the future of EDA lies. Why? It’s because the lower level of abstraction (detailed implementation) of the EDA market is undergoing commoditization and consolidation. There are fewer solutions, and less differentiation between them. At the upper levels of abstraction (ESL), this is not the case. There still exists ample opportunity to provide new and innovative solutions.

Now, how will this help EDA move up into the embedded software space? According to Dr. Bose, the ability to do true hardware/software co-design is still not a solved problem. Once viable solutions are developed, then EDA will be able to sell to the embedded software engineer. This will be a new market, and new revenue for EDA.

How are SpyGlass and GenSys platforms helping the industry? What problems are those solving? Dr. Ajoy Bose said: “SpyGlass is Atrenta’s platform for RTL signoff. It is used by virtually all SoC design teams to ensure the power, performance and cost of their SoC is as good as it can be prior to handoff to detailed implementation. SpyGlass is also used to select and qualify semiconductor IP – a major challenge for all SoC design teams.

“GenSys provides a way to easily assemble and modify designs at the RTL level of abstraction. As a lot of each SoC is re-used design data, the need to modify this data to fit the new design is very prevalent. GenSys provides an easy, correct-by-construction way to get this job done.”

How does SpyGlass solve RTL design issues, ensuring high-quality RTL with fewer design bugs? He added that it’s the predictive analysis technology. SpyGlass provides accurate and relevant information about what will happen when a design is implemented and tested. By fixing these problems early, at RTL, a much higher quality design is handed off to detailed implementation with fewer bugs and associated schedule challenges.

On another note, I asked him why Apple’s choice of chips is a factor in influencing the global chip industry. The primary reason is their volume and buying power. Apple is something of a “King Maker” when it comes to who manufactures their chips. Apple is also a thought leader and trend setter, so their decisions affect the decisions of others.

Finally, the global semiconductor industry! How is the global semicon industry doing in H1-2013? As per Dr. Bose: “We see strong growth.  Our customers are undertaking many new designs at advanced process technology nodes. We think that this speaks well for future growth of the industry.  At a macro level, the consumer sector will drive a lot of the growth ahead.  For EDA, the higher levels of abstraction is where the growth will be.”

Is global semicon inventory level headed for oversupply in Q3?


Early this month, iSuppli had indicated that semiconductor inventory levels may have headed into oversupply territory in Q3.

It said: “Semiconductor Days Of Inventory (DOI) for chip suppliers are estimated to have climbed to 75.9 days in the third quarter of 2010, up 1.5 days from Q2. DOI in Q3 also was 4.8 percent higher than the seasonally adjusted average for the period.”
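For context, DOI simply expresses inventory value in days of cost of goods sold. Below is a minimal sketch of the arithmetic behind the figures quoted above; the 90-day quarter, the example inventory and COGS values, and the derived Q2/seasonal figures are illustrative assumptions, not iSuppli's methodology.

```python
def days_of_inventory(inventory_value, quarterly_cogs, days_in_quarter=90):
    """Days of Inventory (DOI): days of cost of goods sold that the
    current inventory value would cover."""
    return inventory_value / (quarterly_cogs / days_in_quarter)

# Hypothetical example: $30B of inventory against $36B of quarterly COGS.
print(f"DOI: {days_of_inventory(30e9, 36e9):.1f} days")   # 75.0 days

# Figures derived only from the quote above:
q3_doi = 75.9
q2_doi = q3_doi - 1.5              # up 1.5 days from Q2
seasonal_avg = q3_doi / 1.048      # Q3 was 4.8 percent above the seasonal average
print(f"Q2 DOI: {q2_doi:.1f} days, implied seasonal average: {seasonal_avg:.1f} days")
```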

iSuppli added that the value of inventory had not been this high since the second quarter of 2008, when semiconductor suppliers’ stockpiles peaked at $35.4 billion.

Thanks to Jon Cassell and Debra Jaramilla at iSuppli, I was able to speak with Sharon Stiefel, analyst for semiconductor inventory and manufacturing for iSuppli on this situation.

Is there really an oversupply?

Sharon Stiefel, iSuppli.

I asked Sharon Stiefel that given the growth that 2010 has seen so far, why are semiconductor inventory levels heading into oversupply territory in Q3?

She said that semiconductor inventories, overall, have risen both in terms of DOI and dollars for the past several quarters, but have not yet reached the pre-recession levels last seen in 2008. “The overly lean conditions of 2009 and early 2010 are giving way to inventory levels that are more appropriate for the strong growth experienced in 2010.

“Oversupply in Q3 2010 is not a foregone conclusion, but is possible if companies are not able to match manufacturing run rates with demand as the year winds to a close,” she added.

Which sectors have been witnessing or recording some softness in demand and why?

Stiefel said: “Companies reporting Q3 revenues over the past two weeks have reported a softening in demand, particularly in PC and consumer end markets, attributed to the continued uncertainty in the global economy, leaving consumers unwilling to spend.  A company with more exposure to these sectors has more potential of excessive inventories, versus a company with a more balanced product portfolio.”

Industry needs to moderate inventories
iSuppli’s release also said: ‘The industry will need to moderate inventories at the appropriate time in its growth curve in order to capture current revenue opportunities while they still exist.’ So, when exactly is that appropriate time?

Stiefel noted: “The appropriate time is when sales opportunities exist – projected quarters of growth, rather than revenue contraction. Semiconductor revenues are projected to grow in Q3 2010, contract in Q4 2010 and Q1 2011, and then resume moderate single digit growth for the remainder of 2011.”

DVCon India 2014 aims to bring Indian design, verification and ESL community closer!

September 11, 2014

DVCon India 2014 has come to Bangalore, India, for the first time. It will be held at the Hotel Park Plaza in Bangalore, on Sept. 25-26. Dr. Wally Rhines, CEO, Mentor Graphics will open the proceedings with his inaugural keynote.

Other keynotes will be from Dr. Mahesh Mehendale, MCU chief technologist, TI, Janick Bergeron, verification fellow, Synopsys, and Vishwas Vaidya, assistant GM, Electronics, Tata Motors.

Gaurav Jalan of SmartPlay, chair of the promotions committee, took time to speak about DVCon India 2014.

Focus of DVCon India 2014

First, what’s the focus of DVCon India 2014? According to Jalan, DVCon has been a premiere conference in the US, contributing quality tutorials and papers and an excellent platform for networking. DVCon India focuses on filling the void of a vendor-neutral quality conference in the neighbourhood – one that will grow over time.

The idea is to bring together the hitherto dispersed, yet substantial, design, verification and ESL community and give it a voice. Engineers get a chance to learn solutions to verification problems, share the effectiveness of the solutions they have experimented with, understand off-the-shelf solutions available in the market, and meet the vendor-agnostic user fraternity. Moving forward, the expectation is to get users involved as early adopters of upcoming standards and as active contributors to them.

Trends in design

Next, what are the trends today in design? Jalan said that while designs continue to parade along the lines of Moore’s law, there is a lot happening beyond mere gate count. Defining and developing IPs with a wide range of configuration options, serving a variety of application domains, is a challenge.

SoCs are crossing the multi-billion-gate mark (the A8 in the iPhone 6 has about 2 billion transistors), with a multi-fold increase in complexity due to multiple clock, power and voltage domains, while delivering the required performance in different application modes within a sleek footprint.

Trends in verification

Now, let’s examine the trends today in verification. When design grows linearly, verification jumps exponentially. While UVM has settled the dust to some extent at the IP verification level, a huge set of challenges is still waiting to be addressed. The IP itself is growing in size, limiting the simulator and encouraging users to move to emulators. While UVM settled the methodology war, the VIPs available are still not simulator agnostic, and an emulator-agnostic VIP portfolio is still a distant dream.

SoC verification is still a challenge, not just due to the sheer size, but because porting an environment from block level to SoC level is difficult. Test plan definition and development at the SoC level is itself a challenge. The Portable Stimulus group at Accellera is addressing this.

Similarly, coverage collected from different tools is difficult to merge. The unified coverage group at Accellera is addressing this. Low power today is a norm, and verifying a power-aware design is quite challenging. UPF is an attempt to standardize this.
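To see why merging coverage across tools is painful, consider two simulators exporting the same functional coverage model with different bin-naming conventions: even a naive merge has to normalize names before hit counts can be combined. The sketch below is a simplified illustration only; the report format and normalization rule are assumptions, not the UCIS API or any vendor flow.

```python
# Naive merge of functional coverage exported by two hypothetical tools.
# Real flows rely on the UCIS standard; this only illustrates the problem
# of normalizing bin names before combining hit counts.

def normalize(bin_name: str) -> str:
    # Assumed rule: the tools differ only in case and separator style.
    return bin_name.lower().replace("::", ".").replace("/", ".")

def merge_coverage(*reports: dict) -> dict:
    merged = {}
    for report in reports:
        for bin_name, hits in report.items():
            key = normalize(bin_name)
            merged[key] = merged.get(key, 0) + hits
    return merged

tool_a = {"axi_cov::burst_len.len16": 42, "axi_cov::resp.okay": 310}
tool_b = {"AXI_COV/BURST_LEN/LEN16": 17, "AXI_COV/RESP/SLVERR": 3}

print(merge_coverage(tool_a, tool_b))
# {'axi_cov.burst_len.len16': 59, 'axi_cov.resp.okay': 310, 'axi_cov.resp.slverr': 3}
```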

Porting an SoC to an emulator to enable hardware acceleration, so as to run use cases, is another trend picking up. Teams are now able to boot Android on an SoC even before the silicon arrives. With growing analog content on chip, the onus is on verification engineers to ensure the digital and analog sides of the chip work in conjunction, as per specs. Formal apps have picked up to address connectivity tests, register spec testing, low-power static checks and many more.

Accelerating EDA innovation

So, how will EDA innovation get accelerated? According to Jalan, the semiconductor industry has always witnessed that startups and smaller companies lead the innovation. Given the plethora of challenges around, there are multiple opportunities to be addressed by both the biggies and the start-ups.

The evolution of standards at Accellera is definitely a great step toward bringing the focus onto real innovation in the tools, while providing a platform for the user community to come forward, share challenges and propose alternatives. With a standard baseline defined in collaboration with all partners of the ecosystem, EDA companies can focus on competing on performance, user interface, increased tool capacity and enabling faster time to market.

Forums like DVCon India help grow awareness of the standards promoted by Accellera, while encouraging participants from different organizations and geographies to join and contribute. Apart from tools, areas where EDA innovation could pick up include new IT technologies and platforms – cloud and mobile devices.

Next level of verification productivity

Where is the next level of verification productivity likely to come from? To this, Jalan replied that productivity in verification improves along several dimensions.

While faster tools with increased capacity come from innovation at the EDA end, standards have played an excellent role as well. UVM has helped displace vendor-specific technologies, improving interoperability, quick ramp-up for engineers, and reusability. Similarly, on the power format side, UPF has played an important role in bridging the gaps.

Unified coverage is another aspect that will help close coverage-driven verification earlier. The IP-XACT and SystemRDL standards further help in packaging IPs and easing hand-off to enable reuse. Similarly, other standards on ESL, AMS, etc. help close the loopholes that hold back productivity.

A new portable stimulus specification is now being developed under Accellera that will help ease test development at different levels, from IP to subsystem to SoC. For faster simulations, the increasing adoption of hardware acceleration platforms is helping verification engineers improve regression turnaround time.

Formal technologies play an important role in providing mathematical proofs for common verification challenges at an accelerated pace compared to simulation. Finally, events like DVCon enable users to share their experiences and knowledge, encouraging others to try out solutions instead of struggling with the process of discovering or inventing one.

More Indian start-ups

Finally, do the organizers expect to see more Indian start-ups post this event? Yes, says Jalan. “We even have a special incubation booth that is encouraging young startups to come forth and exhibit at a reduced cost (only $300). We are creating a platform and soon we will see new players in all areas of Semiconductor.

“Also, the Indian government’s push in the semiconductor space will give new startups further incentive to mushroom. These conferences help entrepreneurs to talk to everyone in the community about problems, vet potential solutions and seek blessings from gurus.”

Categories: Semiconductors

Diverse requirements for IoT evolving: Charlie Huang


According to Charlie Huang, senior VP, Worldwide Field Operations and System & Verification Group, Cadence, today we are talking about tremendous data growth. Mobile has been driving the growth of semiconductors, along with medical, industrial, consumer and automotive electronics. Trends are also driving disruptive opportunities — from growth in China to growth in India. He was delivering the keynote on day two at CDNLive 2014 in Bangalore, India.

"We can innovate to build things that are yet to be imagined. Greater things are yet to come for the Indian semicon design opportunities.

"Today, the iPad has become a system of systems. Now, everyone is waiting for the next big thing. People are also talking about the IoT. Everything will get revolutionized by the newer SoCs. Diverse requirements for IoT have been evolving. There are development challenges from all directions. More functions also means that more IP cores need to be integrated and verified. The IP cores per SoC is likely to be 123 in 14nm, from 108 in 20/22nm. The complexity is just unimaginable!

"Eighty percent of SoC development costs come from software, verification and validation. We should now look at innovating software design with SoC design.

Cadence has invested substantially in IP. Its focus is system design enablement, from the end product down to the chip level. System-level design with high-level synthesis is used to shorten the development cycle and get better quality of results (QoR).

Categories: Semiconductors

IoT gathering pace as revolution: Guru Ganesan



By 2020, there will be over 8 billion people on our planet. This will also bring tremendous innovations and challenges. ARM has been connecting intelligence at every level, said Guru Ganesan, president and MD, ARM India.

He was delivering the guest keynote at the recently held CDNLive 2014 event in Bangalore, India.

Newer apps are helping connect with the world. As per Gartner, $27 billion worth of apps were downloaded in 2013. By 2020, this is estimated to rise to $80 billion.

According to Ganesan, consumer trends are driving innovation in embedded apps, including rich user interfaces (UI). ARM is also at the heart of wearable technologies, for example, Smart Glasses from Google. Some examples from India include Lechal from Ducere Technologies, the GOQ Pi remote fitness companion, the Fin+ gesture-based device for navigation and device control from RHLVision, and the Smarty Ring from Chennai, which brings instant smartphone alerts to your fingers.

So, what are the key requirements for wearables? These are video/image, audio, display, software, OS, connectivity and battery life! In 2013, over 1 billion smartphones were shipped. Further, mobile data is expected to grow 12 times over between now and 2018.

In medical electronics, besides humans, the technology has extended to keeping cattle healthy and enabling intelligent agriculture with OnFarm, by using sensors. IoT as a revolution is gathering pace. As per a survey conducted by ARM, 95 percent of users expect to be using IoT over the next three years. Common standards are being developed for interoperability. Similarly, mobility and connectivity are also coming to automotive.

Now, let’s see the development challenges for high-end embedded. Embedded applications today integrate more functions. Consequently, design and verification challenges continue to grow. Further, a lot of smart devices are now generating a lot of data. The question is: how are we using that data?

Ganesan added that by 2020, there will be new challenges in transportation, healthcare, energy and education. Once devices start communicating with each other, we are likely to see the evolution of a smart infrastructure.

Categories: Semiconductors

SEMI materials outlook: Semicon West 2014


Source: SEMI, USA.

At the recently held Semicon West 2014, Daniel P. Tracy, senior director, Industry Research and Statistics, SEMI, presented the SEMI materials outlook. He estimated that semiconductor materials will see unit growth of 6 percent or more. There may be low revenue growth in a large number of segments due to pricing pressures and changes in materials.

For semiconductor equipment, he estimated ~20 percent growth this year, following two years of spending decline. Spending growth of ~11 percent is currently estimated for 2015.

Overall, the year-to-date estimate is positive growth vs. the same period of 2013, for unit and materials shipments, and for equipment billings.

The equipment outlook points to ~18 percent growth in equipment for 2014. Total equipment orders are up ~17 percent year to date.

On the wafer fab materials outlook, monthly silicon area shipments are at an all-time high at the moment. Lithography process chemicals saw a 7 percent sales decline in 2013. The 2014 outlook is downward pressure on ASPs for some chemicals. 193nm resists are approaching $600 million, while ARC has been growing 5-7 percent.

For packaging materials, flip chip is a key growth driver, with flip chip units expected to grow ~25 percent from 2012 to 2017. There are trends toward copper pillar and micro-bumps for TSV. Future flip chip growth in wireless products is driven by form factor and performance. BB (baseband) and AP (application) processors are also moving to flip chip.

There has been growth in WLP shipments. Major applications for WLP are driven by mobile products such as smartphones and tablets. It should grow at a CAGR of ~11 percent in units (2012-2017).

Solder balls were a $280 million market in 2013, and shipments of lead-free solder balls continue to increase. Underfills were $208 million in 2013, including underfills for flip chip and packages. The increased use of underfills for CSPs and WLPs helps them pass the drop test in high-end mobile devices.

Wafer-level dielectrics were a $94 million market in 2013. Materials and structures are likely to enhance board-level reliability performance.

Die-attach materials have over a dozen suppliers; Hitachi Chemical and Henkel account for the major share of the total die-attach market. New players continue to emerge in China and Korea. Stacked-die CSP package applications have been increasing, and industry acceptance of film (flow)-over-wire (FOW) and dicing die-attach film (DDF) technologies is also growing.

 

Semiconductor capital spending outlook 2013-18: Gartner


At Semicon West 2014, Bob Johnson, VP Research, Gartner, presented the Semiconductor Capital Spending Outlook at the SEMI/Gartner Market Symposium on July 7.

First, a look at the semiconductor revenue forecast: it is likely to grow at a 4.3 percent CAGR from 2013 to 2018. Logic continues to dominate, but growth falters. As per the 2013-2018 CAGRs, logic will grow at 3.5 percent, memory at 4.5 percent, and other at 6.3 percent.
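As a reminder, CAGR is just the constant annual growth rate that carries a starting value to an ending value over the period. A quick sketch of the calculation follows; the revenue figures below are invented placeholders, and only the ~4.3 percent rate and the 2013-2018 window come from the talk.

```python
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate over `years` years."""
    return (end_value / start_value) ** (1 / years) - 1

# Placeholder figures for illustration only (not Gartner data):
# a market growing from 100 to about 123.4 over 2013-2018 (5 years)
# corresponds to the quoted ~4.3 percent CAGR.
print(f"{cagr(100.0, 123.4, 5):.1%}")
```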

Bob Johnson

As for the memory forecast, NAND should surpass DRAM. At 2013-2018 CAGRs, DRAM should contract at -1.1 percent, while NAND should grow at 10.8 percent. Smartphones, SSDs and ultramobiles are the applications driving growth through 2018. SSDs are powering the NAND market.

Among ultramobiles, tablets should dominate through 2018. They should also take share from PCs. Next, smartphones have been dominating mobile phones.

Looking at the critical markets for capital investment, smartphones are the largest growth segment, but have been showing signs of saturation; their revenue growth could slow dramatically by 2018. Ultramobiles have the highest overall CAGR, but at the expense of the PC market, and tablets are driving down semiconductor content. Desktop and notebook PCs are a large but declining market that still provides critical revenue to fund logic capex. Lastly, SSDs are driving NAND flash growth, and the move to data centers is driving sustainable growth.

In capital spending, memory is strong, but logic is weak through 2018. The 2014 spending is up 7.1 percent, driven by a strong memory market. Strength in NAND spending will drive future growth. Note that memory oversupply in 2016 could trigger the next cycle. NAND is the capex growth driver in memory spending.

The major semiconductor markets, which justify investment in logic leading edge capacity, are now running out of gas. Ultramobiles are cannibalizing PCs, smartphones are saturating and both are moving to lower cost alternatives. It is increasingly difficult to manufacture complex SoCs successfully at the absolute leading edge. Moore’s Law is slowing down, while costs are going up. Breakthrough technologies (i.e., EUV) are not ready when needed. Much of the intelligence of future applications is moving to the cloud. The data centers’ needs for fast, low power storage solutions are creating sustainable growth for NAND Flash.

The traditional two-year-per-node pace of Moore’s Law will continue to slow down. Only a few high-volume/high-performance applications will be able to justify the costs of 20nm and beyond. Whether this will require new or upgraded capacity is uncertain. 28nm will be a long-lived node as mid-range mobility products demand higher levels of performance. Finally, the cloud will continue to grow in size and influence, creating demand for new NAND flash capacity and technology.

Categories: Semiconductors

How Intel competes in today’s fabless ecosystem


The SEMI/Gartner Market Symposium was held at Semicon West 2014 in San Francisco, on July 7. I am grateful to Ms. Becky Tonnesen, Gartner, and Ms Agnes Cobar, SEMI, for providing me the presentations. Thanks are also due to Ms Deborah Geiger, SEMI.

Dean Freeman, research VP, Gartner, outlined the speakers:

• Sunit Rikhi, VP, Technology and Manufacturing Group, and GM, Intel Custom Foundry, presented on Competing in today’s Fabless Ecosystem.

• Bob Johnson, VP Research, Gartner, presented the Semiconductor Capital Spending Outlook.

• Christian Gregor Dieseldorff, director Market Research, SEMI, presented the SEMI World Fab Forecast: Analysis and Forecast for Fab Spending, Capacity and Technology.

• Sam Wang, VP Research Analyst, Gartner, presented on How Foundries will Compete in a 3D World.

• Jim Walker, VP Research, Gartner, presented on Foundry versus SATS: The Battle for 3D and Wafer Level Supremacy.

• Dr. Dan Tracy, senior director, Industry Research & Statistics, SEMI, presented on Semiconductor Materials Market Outlook.

Let’s start with Sunit Rikhi at Intel.

As a new player in the fabless ecosystem, Intel focuses on:
* The value it brings to the table.
* How it delivers on platforms of capability and services.
* How it leverages the advantages of being inside the world’s leading Integrated Device Manufacturer (IDM).
* How it faces the challenges of being inside the world’s leading IDM.

Intel has leadership in silicon technologies. Transistor performance per watt is the critical enabler for all. Density improvements offset wafer cost trends. Intel currently has a ~3.5-year lead in introducing revolutionary transistor technologies.

In foundry capabilities and services platforms, Intel brings differentiated value on industry standard platforms. 22nm was started in 2011, while 14nm was started in 2013. 10nm will be starting in 2015. To date, 125 prototype designs have been processed.

Intel offers broad capability and services on industry standard platforms. It also has a fuller array of co-optimized, end-to-end services. As for packaging technology, Intel has been building better products through multi-component integration. Intel has also been starting high on the yield learning curve.

Regarding IDM challenges, such as high-mix-low-volume configuration, Intel has been doing configuration optimization in tooling and set-up. It has also been separating priority and planning process for customers. Intel has been providing an effective response for every challenge.

Some of Intel Custom Foundry announced customers include Achronix, Altera, Microsemi, Netronome, Panasonic and Tabula.

Set up strong methodology teams to create better verification infrastructure: Synopsys


Arindam Ghosh

This is the third installment on verification, now taken up by Synopsys. Regarding the biggest verification mistakes today, Arindam Ghosh, director – Global Technical Services, Synopsys India, listed these as:

* Spending no time on verification planning (not documenting what needs to be verified) and focusing more on running simulations or on execution.
* No or very low investment in building better verification environments (based on best/new methodologies and best practices); instead maintaining older verification environments.
* Compromising on verification completeness because of tape out pressures and time-to-market considerations.

Would you agree that many companies STILL do not know how to verify a chip?

He said that it could be true for smaller companies or start-ups, but most of the major semiconductor design engineers know about the better approaches/methodologies to verify their chips. However, they may not be investing in implementing the new methodologies for multiple reasons and may instead continue to follow the traditional flows.

One way to address these mistakes would be to set up strong methodology teams to create a better verification infrastructure for future chips. However, few companies are doing this.

Are companies realizing this and building an infrastructure that gets you business advantage? He added that some companies do realize this and are investing in building a better infrastructure (in terms of better methodology and flows) for verification.

When should good verification start?
When should good verification start — after design, or as you are designing and architecting your design environment? Ghosh said that good verification starts as soon as we start designing and architecting the design. Verification leads should start discussing the verification environment components with the lead architect and also start writing the verification plan.

Are folks making a mistake by looking at tools and not at the verification process itself? According to him, tools play a major role in the effectiveness of any verification process, but we still see a lot of scope for methodology improvements beyond the tools.

What all needs to get into verification planning, as the ‘right’ verification path is fraught with complexities? Ghosh said that there is no single, foolproof recipe for a ‘right’ verification path. It depends on multiple factors, including whether the design is a new product or a derivative, the design application, etc. But yes, it is very important to do comprehensive verification planning before starting the verification process.

Synopsys says that a comprehensive, unified and integrated verification environment is required for today’s revolutionary SoCs and would offer a fundamental shift forward in productivity, performance, capacity and functionality. Synopsys’ Verification Compiler provides the software capabilities, technology, methodologies and VIP required for the functional verification of advanced SoC designs in one solution.

Verification Compiler includes:
* Better capacity and compile and runtime performance.
* Next-generation static and formal technology delivering performance improvement and the capacity to analyze a complete SoC (Property checking, LP, CDC, connectivity).
* Comprehensive low power verification solution.
* Verification planning and management.
* Next-generation verification IP and a deep integration between VIP and the simulation engine, which in turn can greatly improve productivity.  The constraint engine is tuned for optimal performance with its VIP library. It has integrated debug solutions for VIP so one can do protocol-level analysis and transaction-based analysis with the rest of the testbench.
* Support for industry standard verification methodologies.
* X-propagation simulation with both RTL and low power simulations.
* Common debug platform with better debug technology, having new capabilities and tight integrations with simulation, emulation, testbench, transaction debug, power-aware debug, HW/SW debug, formal, VIP and coverage.

Top five recommendations for verification
What would be Synopsys’ top five recommendations for verification?

* Spend a meaningful amount of time and effort on verification planning before execution.
* Continuously invest in building a better verification infrastructure and methodologies across the company for better productivity.
* Collaborate with EDA companies to develop, evaluate and deploy new technologies and flows, which can bring more productivity to verification processes.
* Nurture fresh talent through regular on and off-the-job trainings (on flows, methodologies, tools, technology).
* Conduct regular reviews of the completed verification projects with the goal of trying to improve the verification process after every tapeout through methodology enhancements.

Categories: Semiconductors

Cadence: Plan verification to avoid mistakes!


Apurva Kalia

Following Mentor Graphics, Cadence Design Systems Inc. has entered the verification debate. ;)  I met Apurva Kalia, VP R&D – System & Verification Group, Cadence Design Systems. In a nutshell, he advised that there needs to be proper verification planning in order to avoid mistakes. First, let’s try to find out the biggest verification mistakes.

Top verification mistakes
Kalia said that the biggest verification mistakes made today are:
* Verification engineers do not define a structured notion of verification completeness.
* Verification planning is not done up front and is carried out as verification is going along.
* A well-defined reusable verification methodology is not applied.
* Legacy tools continue to be used for verification; new tools and technologies are not adopted.

In that case, why do some companies STILL not know how to verify a chip?

He added: “I would not describe the situation as companies not knowing how to verify a chip. Instead, I think a more accurate description of the problem is that the verification complexity has increased so much that companies do not know how to meet their verification goals.

“For example, the number of cycles needed to verify a current generation processor – as calculated by traditional methods of doing verification – is too prohibitive to be done in any reasonable timeframe using legacy verification methodologies. Hence, new methodologies and tools are needed. Designs today need to be verified together with software. This also requires new tools and methodologies. Companies are not moving fast enough to define, adopt and use these new tools and methodologies thereby leading to challenges in verifying a chip.”

Addressing challenges
How are companies trying to address the challenges?

Companies are trying to address the challenges in various ways:
* Companies at the cutting edge of designs and verification are indeed trying to adopt structured verification methodologies to address these challenges.

* Smaller companies are trying to address these challenges by outsourcing their verification to experts and by hiring more verification experts.

* Verification acceleration and prototyping solutions are being adopted to get faster verification, which allows companies to do more verification in the same amount of time.

* Verification environment re-use helps to cut down the time required to develop verification environments.

* Key requirements of SoC integration and verification—including functionality, compliance, power, performance, etc.—are hardware/software debug efficiency, multi-language verification, low power, mixed signal, fast time to debug, and execution speed.

Cadence has the widest portfolio of tools to help companies meet verification challenges, including:

Incisive Enterprise Manager, which provides hierarchical verification technology for multiple IPs, interconnects, hardware/software, and plans to improve management productivity and visibility;

The recently launched vManager solution, a verification planning and management solution enabled by client/server technology to address the growing verification closure challenge driven by increasing design size and complexity;

Incisive Enterprise Verifier, which delivers dual power from tightly integrated formal analysis and simulation engines; and

Incisive Enterprise Simulator, which provides the most comprehensive IEEE language support with unique capabilities supporting the intent, abstraction, and convergence needed to speed silicon realization.

Are companies building an infrastructure that gets you business advantage? Yes, companies are realizing the problems. It is these companies that are the winners in managing today’s design and verification challenges, he said.

Good verification
When should good verification start?

Kalia noted: “Good verification should start right at the time of the high level architecture of the design. A verification strategy should be defined at that time, and an overall verification plan should be written at that time. This is where a comprehensive solution like Incisive vManager can help companies manage their verification challenges by ensuring that SoC developers have a consistent methodology for design quality enhancements.”

Are folks making a mistake by looking at tools and not at the verification process itself?

He added that the right tools and methodology are needed to resolve today’s verification challenges. Users need to work on defining verification methodologies and at the same time look at the tools that are needed to achieve verification goals.

Verification planning
Finally, there’s verification planning! What should be the ‘right’ verification path?

Verification planning needs to include the following (a minimal example follows the list):

* A formal definition of verification goals;
* A formal definition of coverage goals at all levels – starting with code coverage all the way to functional coverage;
* Required resources – human and compute;
* Verification timelines;
* All the verification tools to be used for verification; and
* Minimum and maximum signoff criteria.
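As a minimal illustration of how these elements might be captured up front, here is a hypothetical verification plan skeleton expressed as plain data. The field names and values are invented for illustration and do not reflect any particular tool's schema (such as vManager).

```python
# Hypothetical verification plan skeleton covering the elements listed above.
# All names and values are illustrative only.
verification_plan = {
    "verification_goals": ["every feature in spec v1.2 exercised",
                           "no open severity-1 bugs at signoff"],
    "coverage_goals": {
        "code_coverage_pct": 100,       # statement/branch/toggle coverage
        "functional_coverage_pct": 95,  # covergroups and assertions
    },
    "resources": {"engineers": 4, "compute_slots": 200},
    "timeline": {"env_ready": "2014-10-15",
                 "regressions_start": "2014-11-01",
                 "signoff": "2015-01-31"},
    "tools": ["simulation", "formal", "emulation", "coverage merge/reporting"],
    "signoff_criteria": {
        "minimum": "all planned tests passing, coverage goals met",
        "maximum": "plus two weeks of clean soak regressions",
    },
}

for section, details in verification_plan.items():
    print(f"{section}: {details}")
```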
