Archive for the ‘EDA’ Category

Cadence Quantus solution meets 16nm FinFET challenges


Cadence Design Systems Inc. recently announced that its Quantus QRC extraction solution had been certified for TSMC’s 16nm FinFET process.

So, what’s unique about the Cadence Quantus QRC extraction solution?

Quantus

KT Moore, senior group director – Product Marketing, Digital and Signoff Group, Cadence Design Systems, said: “There are several parasitic challenges that are associated with advanced node designs — especially FinFET – and it’s not just about tighter geometries and new design rules. We can bucket these challenges into two main categories: increasing complexity and modeling challenges.

“The number of process corners is exploding, and for FinFET devices specifically, there is an explosion in the parasitic coupling capacitances and resistances. This increases design complexity and design size. The netlist is getting bigger and bigger, and as a result, there is an increase in extraction runtimes for SoC designs, and in post-layout simulation and characterization runtimes for custom/analog designs.

“Our customers consistently tell us that, for advanced nodes, and especially for FinFET designs, while their extraction runtimes and time-to-signoff are increasing, their actual time-to-market is shrinking, putting an enormous amount of pressure on designers to deliver on-time tapeout. In order to address these market pressures, we have applied the massively parallel technology that was first introduced in our Tempus Timing Signoff Solution and Voltus IC Power Integrity Solution to our next-generation extraction tool, the Quantus QRC Extraction Solution.

“Quantus QRC Extraction Solution enables us to deliver up to 5X better performance than competing solutions and allows scalability of up to 100s of CPUs and machines.”
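
To make the “massively parallel” claim concrete, here is a minimal sketch of the general idea: partition the layout into tiles and farm per-tile extraction out to a pool of worker processes. This is purely illustrative Python; the names (Tile, extract_tile) are hypothetical, and it is not Cadence’s actual algorithm.

```python
# Toy divide-and-conquer extraction: NOT Cadence's implementation, just the
# general pattern of scaling extraction work across many CPUs.
from concurrent.futures import ProcessPoolExecutor
from dataclasses import dataclass

@dataclass
class Tile:
    tile_id: int
    num_nets: int  # nets whose geometry falls inside this tile

def extract_tile(tile: Tile) -> dict:
    # Stand-in for the real per-tile solver work. A real tool would also
    # handle halo regions so coupling caps at tile boundaries are not lost.
    return {"tile": tile.tile_id, "rc_elements": tile.num_nets * 12}

if __name__ == "__main__":
    tiles = [Tile(i, 5000) for i in range(64)]
    with ProcessPoolExecutor(max_workers=8) as pool:  # scale to available CPUs
        results = list(pool.map(extract_tile, tiles))
    print(sum(r["rc_elements"] for r in results), "RC elements extracted")
```

The same pattern generalizes from one multi-core machine to a farm of machines once a distributed job scheduler replaces the local process pool.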

Support for FinFET features
How is Quantus providing significant enhancements to support FinFET features?

Parasitic extraction is at the forefront with the introduction of any new technology node. For FinFET designs, it’s a bit more challenging due to the introduction of non-planar FinFET devices. There are more layers to be handled, more RC effects that need to be modeled, and the introduction of local interconnects. There are also secondary- and third-order manufacturing effects that need to be modeled, and all of these new features have to be modeled with high accuracy.

Performance and turnaround time are certainly important, but without accuracy for these devices, especially correlation with the foundry’s golden data, designers would have to over-margin their designs and leave performance on the table.
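
For a rough feel of the R and C being modeled, the sketch below applies the textbook formulas R = ρL/(WT) and parallel-plate C = ε₀εᵣWL/d to a made-up wire geometry. The dimensions are assumptions for illustration only; real extractors also model fringing fields, multi-layer coupling, local interconnect, and the secondary manufacturing effects mentioned above.

```python
# Back-of-envelope parasitics for one wire segment (illustrative numbers only).
RHO_CU = 1.68e-8    # copper resistivity, ohm*m
EPS0 = 8.854e-12    # vacuum permittivity, F/m
EPSR_SIO2 = 3.9     # relative permittivity of SiO2 dielectric

def wire_resistance(length, width, thickness):
    return RHO_CU * length / (width * thickness)          # ohms

def plate_capacitance(length, width, spacing):
    return EPS0 * EPSR_SIO2 * (length * width) / spacing  # farads

# 10 um of a hypothetical minimum-width wire:
L, W, T, D = 10e-6, 32e-9, 60e-9, 32e-9
print(f"R = {wire_resistance(L, W, T):.1f} ohm")          # ~87.5 ohm
print(f"C = {plate_capacitance(L, W, D)*1e15:.2f} fF")    # ~0.35 fF
```

Shrink W, T, and D together and resistance climbs while coupling capacitance per unit length stays stubbornly high, which is why every new node multiplies the parasitic data a signoff tool must carry.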

Best-in-class accuracy
How can Cadence claim that it has the ‘tightest correlation to foundry golden data at TSMC vs. competing solutions’? And, why 16nm only?

According to Moore, the foundry partner, TSMC, asserts that Quantus QRC Extraction Solution provides best-in-class accuracy, which was referenced in the recent press announcement:

“Cadence Quantus QRC Extraction Solution successfully passed TSMC’s rigorous parasitic extraction certification requirements to achieve best-in-class accuracy against the foundry golden data for FinFET technology.”

“FinFET structures present unique challenges since they are non-planar devices, as opposed to their planar predecessors. We partnered with TSMC from the very beginning to address the modeling challenges, and we’ve seen many complex shapes and structures over the years that we’ve modeled accurately.

“We’re not surprised that TSMC has recognized our best-in-class accuracy because we’re the leader in providing extraction solutions for RF designs. Cadence Quantus QRC Extraction Solution has been certified for TSMC 16nm FinFET; however, it’s important to note that we’ve been certified for all other technology nodes, and our QRC techfiles are available to our customers from TSMC today.”

Cadence: Plan verification to avoid mistakes!


Apurva Kalia

Following Mentor Graphics, Cadence Design Systems Inc. has entered the verification debate. ;)  I met Apurva Kalia, VP of R&D – System & Verification Group, Cadence Design Systems. In a nutshell, he advised that there needs to be proper verification planning in order to avoid mistakes. First, let’s try to find out the biggest verification mistakes.

Top verification mistakes
Kalia said that the biggest verification mistakes made today are:
* Verification engineers do not define a structured notion of verification completeness.
* Verification planning is not done up front; instead, it is carried out as verification goes along.
* A well-defined reusable verification methodology is not applied.
* Legacy tools continue to be used for verification; new tools and technologies are not adopted.

In that case, why do some companies STILL not know how to verify a chip?

He added: “I would not describe the situation as companies not knowing how to verify a chip. Instead, I think a more accurate description of the problem is that the verification complexity has increased so much that companies do not know how to meet their verification goals.

“For example, the number of cycles needed to verify a current generation processor – as calculated by traditional methods of doing verification – is too prohibitive to be done in any reasonable timeframe using legacy verification methodologies. Hence, new methodologies and tools are needed. Designs today need to be verified together with software. This also requires new tools and methodologies. Companies are not moving fast enough to define, adopt and use these new tools and methodologies thereby leading to challenges in verifying a chip.”
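
Some rough arithmetic makes the point (the figures below are illustrative assumptions, not Kalia’s numbers):

```python
# Illustrative only: why brute-force simulation of a modern processor is
# prohibitive. Both numbers are assumptions for the sake of the example.
required_cycles = 1e15   # assumed cycle budget for full verification
sim_speed_hz = 1e3       # typical software RTL simulation, cycles/second
cpu_years = required_cycles / sim_speed_hz / (3600 * 24 * 365)
print(f"{cpu_years:,.0f} CPU-years of simulation")  # ~31,710 CPU-years
```

Even a thousand-machine farm would need decades at that rate, which is why emulation, FPGA prototyping, and formal techniques have become part of the standard toolbox.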

Addressing challenges
How are companies trying to address the challenges?

Companies are trying to address the challenges in various ways:
* Companies at the cutting edge of designs and verification are indeed trying to adopt structured verification methodologies to address these challenges.

* Smaller companies are trying to address these challenges by outsourcing their verification to experts and by hiring more verification experts.

* Verification acceleration and prototyping solutions are being adopted for faster verification, allowing companies to do more verification in the same amount of time.

* Verification environment re-use helps to cut down the time required to develop verification environments.

* Key requirements for SoC integration and verification (functionality, compliance, power, performance, etc.) include hardware/software debug efficiency, multi-language verification, low power, mixed signal, fast time to debug, and execution speed.

Cadence has the widest portfolio of tools to help companies meet verification challenges, including:

Incisive Enterprise Manager, which provides hierarchical verification technology for multiple IPs, interconnects, hardware/software, and plans to improve management productivity and visibility;

The recently launched vManager solution, a verification planning and management solution enabled by client/server technology to address the growing verification closure challenge driven by increasing design size and complexity;

Incisive Enterprise Verifier, which delivers dual power from tightly integrated formal analysis and simulation engines; and

Incisive Enterprise Simulator, which provides the most comprehensive IEEE language support with unique capabilities supporting the intent, abstraction, and convergence needed to speed silicon realization.

Are companies building an infrastructure that delivers business advantage? Yes, companies are realizing the problems, and it is these companies that are the winners in managing today’s design and verification challenges, he said.

Good verification
When should good verification start?

Kalia noted: “Good verification should start right at the time of the high level architecture of the design. A verification strategy should be defined at that time, and an overall verification plan should be written at that time. This is where a comprehensive solution like Incisive vManager can help companies manage their verification challenges by ensuring that SoC developers have a consistent methodology for design quality enhancements.”

Are folks making the mistake of looking at tools and not at the verification process itself?

He added that the right tools and methodology are needed to resolve today’s verification challenges. Users need to work on defining verification methodologies and, at the same time, look at the tools that are needed to achieve verification goals.

Verification planning
Finally, there’s verification planning! What should be the ‘right’ verification path?

Verification planning needs to include the following (a rough plan skeleton follows the list):

* A formal definition of verification goals;
* A formal definition of coverage goals at all levels – starting with code coverage all the way to functional coverage;
* Required resources – human and compute;
* Verification timelines;
* All the verification tools to be used for verification; and
* Minimum and maximum signoff criteria.
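
As a minimal sketch of what such a plan might look like in machine-readable form (all field names and values below are hypothetical; in practice this lives in a planning tool such as vManager):

```python
# Hypothetical verification-plan skeleton mirroring the checklist above.
from dataclasses import dataclass, field

@dataclass
class VerificationPlan:
    goals: list = field(default_factory=list)          # formal definition of goals
    coverage: dict = field(default_factory=dict)       # code through functional
    engineers: int = 0                                 # human resources
    cpu_hours: int = 0                                 # compute resources
    milestones: dict = field(default_factory=dict)     # verification timelines
    tools: list = field(default_factory=list)          # simulators, formal, etc.
    signoff_min: dict = field(default_factory=dict)    # minimum signoff criteria
    signoff_max: dict = field(default_factory=dict)    # stretch signoff criteria

plan = VerificationPlan(
    goals=["all bus protocols compliant", "no deadlock across power states"],
    coverage={"code": 1.00, "functional": 0.95},
    engineers=12,
    cpu_hours=500_000,
    milestones={"plan signoff": "week 4", "coverage closure": "week 40"},
    tools=["simulation", "formal", "emulation"],
    signoff_min={"functional_coverage": 0.95, "open_p1_bugs": 0},
    signoff_max={"functional_coverage": 1.00, "open_p2_bugs": 0},
)
print(plan.coverage)
```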

Five recommendations for verification: Dr. Wally Rhines


Dr. Wally Rhines

It seems to be the season of verification. The Universal Verification Methodology (UVM 1.2) is being discussed across conferences. Dennis Brophy, director of Strategic Business Development, Mentor Graphics, says that the UVM 1.2 release is imminent, and UVM remains a topic of great interest.

Biggest verification mistakes
Before I add Dennis Brophy’s take on UVM 1.2, I discussed the intricacies of verification with Dr. Wally Rhines, chairman and CEO, Mentor Graphics Corp. First, I asked him about the biggest verification mistakes today.

Dr. Rhines said: “The biggest verification mistake made today is poor or incomplete verification planning. This generally results in underestimating the scope of the required verification effort. Furthermore, without proper verification planning, some teams fail to identify which verification technologies and tools are appropriate for their specific design problem.”

Would you agree that many companies STILL do not know how to verify a chip?

Dr. Rhines added: “I would agree that many companies could improve their verification process. But let’s first look at the data. Today, we are seeing that about 1/3 of the industry is able to achieve first silicon success. But what is interesting is that silicon success within our industry has remained constant over the past ten years (that is, the percentage hasn’t become any worse).

“It appears that, while design complexity has increased substantially during this period, the industry is at least keeping up with this added complexity through the adoption of advanced functional verification techniques.

“Many excellent companies view verification strategically (and as an advantage over their competition). These companies have invested in maturing both their verification processes and teams and are quite productive and effective. On the other hand, some companies are struggling to figure out the entire SoC space and its growing complexity and verification challenges.”

How are companies trying to address those?

According to him, the recent Wilson Research Group Functional Verification Study revealed that the industry is maturing its verification processes through the adoption of various advanced functional verification techniques (such as assertion-based verification, constrained-random simulation, coverage-driven techniques, and formal verification).  Complexity is generally forcing these companies to take a hard look at their existing processes and improve them.
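
Two of those techniques, constrained-random stimulus and functional coverage, can be illustrated with a toy model. Production flows use SystemVerilog/UVM rather than Python, and everything below is invented for illustration:

```python
# Toy constrained-random generator plus a tiny functional-coverage model.
import random

COVERAGE_BINS = {(op, a) for op in ("read", "write")
                 for a in ("aligned", "unaligned")}
hit = set()

def random_transaction():
    op = random.choice(["read", "write"])
    addr = random.randrange(0, 1 << 16)
    if random.random() < 0.7:      # constraint: bias toward the hard corner
        addr |= 0x1                # force an unaligned address
    return op, addr

for _ in range(1000):
    op, addr = random_transaction()
    hit.add((op, "unaligned" if addr & 0x3 else "aligned"))

print(f"functional coverage: {len(hit) / len(COVERAGE_BINS):.0%}")
```

The coverage number, not the raw cycle count, is what tells a team when constrained-random testing has done its job, which is exactly the coverage-driven discipline the study describes.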

Getting business advantage
Are companies realizing this and building an infrastructure that gets you business advantage?

He added that in general, there are many excellent companies out there that view verification strategically and as an advantage over their competition, and they have invested in maturing both their verification processes and teams. On the other hand, some other companies are struggling to figure out the entire SoC space and its growing complexity and verification challenges.

When should good verification start?
When should good verification start — after design; as you are designing and architecting your design environment?

Dr. Rhines noted: “Just like the design team is often involved in discussion during the architecture and micro-architecture planning phase, the verification team should be an integral part of this process. The verification team can help identify architectural aspects of the design that are going to be difficult to verify, which ultimately can impact architectural decisions.”

Are folks mistaken by looking at tools and not at the verification process itself? What can be done to reverse this?

He said: “Tools are important! However, it is equally important to get the most out of the tools and to ensure that the verification solution is an efficient and repeatable process. At Mentor Graphics, we recognize the importance of both. That is why we created the Verification Academy, which focuses on developing skills and maturing an organization’s functional verification processes.”

What needs to go into verification planning, given that the ‘right’ verification path is fraught with complexities?

Dr. Rhines said: “During verification planning, too many organizations focus first on the “how” aspect of verification versus the “what.” How a team plans to verify its designs is certainly important, but first you must identify exactly what needs to be verified. Otherwise, something is likely to slip through.

“In addition, once you have clearly identified what needs to be verified, it’s an easy task to map the functional verification solutions that will be required to productively accomplish your verification goals. This also identifies what skill sets will need to be developed or acquired to effectively take advantage of the verification solutions that you have identified as necessary for your specific problem.”

How is Mentor addressing this situation?

Mentor Graphics’ Verification Academy was created to help organizations mature their functional verification processes, and verification planning is one of the many courses it offers.

In addition, Mentor Graphics’ Consulting provides customized solutions to technical challenges on real projects with real schedules. By helping customers successfully integrate advanced functional verification technologies and methodologies into their work flows, Mentor helps ensure they meet their design and business objectives.

Five recommendations for verification
Finally, I asked him, what would be your top five recommendations for verification?

Here are the five recommendations for verification from Dr. Rhines:

* Ensure your organization has implemented an effective verification planning process.

* Understand which verification solutions and technologies are appropriate (and not appropriate) for various classes of designs.

* Develop or acquire the appropriate skills within your organization to take advantage of the verification solutions that are required for your class of design.

* For the SoC class of designs, don’t underestimate the effort required to verify the hardware/software interactions, and ensure you have the appropriate resources to do so.

* For any verification processes you have adopted, make sure you have appropriate metrics in place to help you identify the effectiveness of your process—and identify opportunities for process improvements in terms of efficiency and productivity.

Are we at an inflection point in verification?


Are we at an inflection point in verification today? Delivering the guest keynote at the UVM 1.2 day, Vikas Gautam, senior director, Verification Group, Synopsys, said that today, mobile and the Internet of Things are driving growth, and SoCs are naturally becoming even more complex. This is also opening up new verification challenges, such as power efficiency, more software, and shrinking time-to-market. There is a need to shift left to be able to meet time-to-market goals.

The goal is to complete verification as early as possible. There have been breakthrough verification innovations; SystemVerilog brought in a single language. Every 10-15 years, there has been a need to upgrade verification.

Today, many verification technologies are needed, and there is a growing demand for smarter verification: more upfront verification planning, automated setup and re-use with VIP, and the deployment of new technologies and different debug environments. Current flows are limiting smart verification, with disjointed environments spanning many tools and vendors.

Synopsys has introduced the Verification Compiler. You get access to each required technology, as well as next-gen technology. These technologies are natively integrated. All of this enables 3X verification productivity.

Regarding next-generation static and formal platforms, there will be capacity and performance for SoCs, compatibility with implementation products and flows, and a comprehensive set of applications. NLP+X-Prop can help find tough wake-up bugs at RTL. Simulation is tuned for the VIP, giving a ~50 percent runtime improvement.

SystemVerilog has brought in many new changes, and now we have the Verification Compiler. Verdi is an open platform; it offers VIA, a platform for customizing Verdi, which improves debug efficiency.

India’s evolving importance to future of fabless: Dr. Wally Rhines

February 3, 2014

Dr. Wally Rhines

If I remember correctly, sometime in Oct. 2008, S. Janakiraman, then chairman of the India Semiconductor Association, had proclaimed that despite not having fabs, ‘fabless India’ had been shining brightly! Later, in August 2011, I had written an article on whether India was keen on going the fabless way. Today, at the IESA Vision Summit in Bangalore, Dr. Wally Rhines repeated nearly the same lines!

While the number of new fabless startups has declined substantially in the West during the past decade, they are growing in India, said Dr. Walden C. Rhines, chairman and CEO, during his presentation “Next Steps for the Indian Semiconductor Industry” at the ongoing IESA Vision Summit 2014.

India has key capabilities to stimulate the growth of semiconductor companies, including design services companies, design engineering expertise and innovation, returning entrepreneurs, and its educational system. Direct interaction with equipment/systems companies will complete the product development process.

Of the top 50 semicon companies in 2012, 13 are fabless and four are foundries. The global fabless IC market is likely to grow 29 percent in 2013. Fabless IC revenue also continues to grow, reaching about $78.1 billion in 2013. Fabless revenue is highly concentrated, with the top 10 companies likely to account for 64 percent of revenue in 2013. As of 2012, the GSA estimates that there were 1,011 fabless companies.

The semiconductor IP (SIP) market has also been growing and is likely to reach $4,774 million by 2020, growing at a CAGR of 10 percent. The top 10 SIP companies account for 87 percent of the global revenue. Tape-outs at advanced nodes have been growing. However, there are still large opportunities in older technologies.
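
A quick sanity check on those numbers (assuming the 10 percent CAGR is measured from roughly 2013, which the presentation does not state):

```python
# Back-solving the implied 2013 base from the quoted 2020 figure and CAGR.
target_2020 = 4774                      # $M, from the talk
cagr = 0.10
implied_2013 = target_2020 / (1 + cagr) ** 7
print(f"implied 2013 SIP market: ${implied_2013:,.0f}M")  # ~ $2,450M
```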

IoT will transform industry
It is expected that the Internet of Things (IoT) will transform the semiconductor industry. It is said that in the next 10 years, as many as 100 billion objects could be tied together to form a “central nervous system” for the planet and support highly intelligent web-based systems. As of 2013, 1 trillion devices are connected to the network.

Product differentiation alone makes switching analog/mixed-signal suppliers difficult. A change in strategy toward differentiation gradually raises the gross profit margin (GPM) percentage.

India’s evolving importance to future of fabless
Now, India ranks among the top five semiconductor design locations worldwide: the US leads with 507 design locations, followed by China with 472, Taiwan with 256, Israel with 150, and India with 120. Some prominent Indian companies are Ineda, Saankhya Labs, Orca Systems and Signal Chip (all fabless), and DXCorr and SilabTech (both SIP).

India is already a leading source of SIP, accounting for 5.3 percent globally, after the USA at 43 percent and China at 17.3 percent. It now seems that India has been evolving from design services to a fabless powerhouse. India has built a foundation for a fabless future, and it now has worldwide leadership, with some of the most influential design teams in the world.

Presently, there are 1,031 MNC R&D centers in India, and 18 of the top 20 US semiconductor companies have design centers in the country. Twenty European corporations set up engineering R&D centers in India last year. India also has the richest pool of creative engineering resources and educational institutions in the world. The experience level of Indian engineers has been increasing, but it is still a young and creative workforce. There is also a growing pool of angel investors with strong connections to India, both in India and in the West.

So, what are the key ingredients needed to generate a thriving infrastructure? Involvement and expertise with end equipment: superb product definition requires the elimination of functional barriers. He gave some examples of foreign-“flagged” Indian companies that produced early successes. When users and tool developers work in close proximity, “out-of-the-box” architectural innovations revolutionize design verification.

Top five trends likely to rule global semicon industry in 2014


Rich Goldman

What are the top five trends likely to rule the semicon industry in 2014 and why? Rich Goldman, VP, corporate marketing and strategic alliances, Synopsys, had this to say.

FinFETs
FinFETs will be a huge trend through 2014 and beyond. Semiconductor companies will certainly keep us well informed as they progress through FinFET tapeouts and ultimately deliver production FinFET processes.

They will tout the power and speed advantages that their FinFET processes deliver for their customers, and those semiconductor companies early to market with FinFETs will press their advantage by driving and announcing aggressive FinFET roadmaps.

IP and subsystems
As devices grow more complex, integrating third-party IP has become mainstream. Designers recognize as a matter of course that today’s complex designs benefit greatly from integrating third-party IP in such areas as microprocessors and specialized I/Os.

The trend for re-use is beginning to expand upwards to systems of integrated, tested IP so that designers no longer need to redesign well-understood systems, such as memory, audio and sensor systems.

Internet of Things/sensors
Everybody is talking about the Internet of Things for good reason. It is happening, and 2014 will be a year of huge growth for connected things. Sensors will emerge as a big enabler of the Internet of Things, as they connect our real world to computation.

Beyond the mobile juggernaut, new devices such as Google’s (formerly Nest’s) thermostat and smoke detector will enter the market, allowing us to observe and control our surrounding environment remotely.

The mobile phone will continue to subsume and disrupt markets, such as cameras, fitness devices, satellite navigation systems and even flashlights, enabled by sensors such as touch, capacitive pattern, gyroscopic, accelerometers, compasses, altimeters, light, CO, ionization etc. Semiconductor companies positioned to serve the Internet of Things with sensor integration will do well.

Systems companies bringing IC design in-house
Large and successful systems companies wanting to differentiate their solutions are bringing IC specification and/or design in house. Previously, these companies were focused primarily on systems and solutions design and development.

Driven by a belief that they can design the best ICs for their specific needs, today’s large and successful companies such as Google, Microsoft and others are leading this trend, aided by IP reuse.

Advanced designs at both emerging and established process nodes  
While leading-edge semiconductor companies drive forward on emerging process nodes such as 20nm, others are finding success by focusing on established nodes (28nm and above) that deliver required performance at reduced risk. Thus, challenging designs will emerge at both ends of the spectrum.

Part II of this discussion will look at FinFETs below 20nm and 3D ICs.

Atrenta on outlook for EDA in 2014

January 14, 2014

I had interacted with Dr. Ajoy Bose, CEO of Atrenta, some months ago. It was a pleasure to meet up recently with Piyush Sancheti, VP of Marketing. First, I asked him about the outlook for EDA in 2014.

Piyush Sancheti

Outlook for EDA
Piyush Sancheti said: “EDA does not look that attractive from a growth standpoint. However, you cannot do SoC designs without EDA. Right now, EDA’s focus is on implementation. The re-use of IP has been doing the rounds for many years. The drivers for SoCs are mobile and the Internet of Things, and the design cycle for those markets is very short: about three months. The EDA business is shifting to IP re-use, and the focus is now toward design aggregation.

“We will have done roughly 66 percent of our business, net new, with existing customers. There is an industry shift toward doing more on the front end. EDA growth will come from IP-SoC involvement.

“Sub-20nm has challenges. ST says FD-SOI is the way to go. The complexity of the process plays a big role, and the number of chips you put in will also increase. In 14/16nm, we have an investment going on in 3D design, and we are extending our 2D tool into a 3D tool. We are also investing in IP qualification, and we have standardized a set of design rules in RTL. There are about 30 companies in the TSMC ecosystem.

“Our main focus is IP enablement. SoC acceptance is another key aspect; our company focus is IP enablement for SoCs. First, IP qualification ensures that an IP meets guidelines. Second, acceptance: making sure all IPs fit in the blocks. Third, integration. We already have this technology, and it is driving the business.”

3D design
What’s Atrenta’s take on 3D design? Sancheti replied: “The industry has been slow, as 3D designs are not yet at a point of business success. A focus on monolithic 3D-ICs will be a paradigm shift for the semicon industry. For commercial design, 20nm is still mainstream, but 14/16nm does not look mainstream as of now. The process node is not necessarily a driver of innovation. EDA as an industry will remain in single-digit growth.”

How will EDA move into the embedded software space?

Sancheti said: “We’ve looked into that market. But, the price point is significantly lower. Over time, it could be a strategic area for us. Over time, embedded software development and chip design will co-mingle.”

ESL is where the future of EDA lies. Still true? He added that the future of EDA lies in moving up in abstraction; it has to head toward the integration of embedded software and chip development. However, ESL is not the only viable option.

Atrenta has 220 people in India: about 10 in Bangalore and 200 in Noida. Sushil Gupta runs the India operations. It has tie-ups with IIT Delhi and IIT Kharagpur as well. Atrenta sees a lot of scope for work with Indian start-ups.
