
Exar serving high-growth areas with innovative value-added solutions


Louis DiNardo.

Exar Corp., established in 1971, is headquartered in Fremont, USA, and has design centers in Silicon Valley and Hangzhou, China. Louis DiNardo, president and CEO, Exar, said that the company’s strategic model is to serve high-growth markets with innovative value-added solutions. He was speaking at the ongoing 13th Globalpress Electronics Summit in Santa Cruz, USA.

Exar offers high-performance analog mixed-signal as well as data-management solutions. Its current market focus is on networking and storage, industrial and embedded systems, and communications infrastructure, with product emphasis on power management, connectivity and data management.

Power management products include those for analog power management, such as switching regulators, switching controllers, linear regulators and supervisory controllers. For programmable power, Exar focuses on multiple-output synchronous buck controllers.
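For context on what a buck controller actually regulates, the relation below is generic power-electronics arithmetic (not an Exar specification): in an ideal synchronous buck converter in continuous conduction, the output voltage is simply the input scaled by the switching duty cycle.

```latex
% Ideal buck converter, continuous conduction mode (CCM):
\[
V_{\mathrm{out}} = D \, V_{\mathrm{in}}, \qquad D = \frac{V_{\mathrm{out}}}{V_{\mathrm{in}}}
\]
% Example: stepping a 12 V rail down to a 1.2 V core supply requires a
% duty cycle of D = 1.2/12 = 10\%, i.e. the high-side switch conducts
% for 10\% of each switching period.
```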

Products include the Exar Programmable PowerSuite 5.0 design tool. Recently, Calxeda has been powering its servers with Exar’s PowerXR technology.

For data compression and security, Exar offers hardware acceleration and software solutions for compression and decompression, encryption and decryption. These serve high-growth markets such as social networking, the industrial Internet and financial technology.

Exar’s Panther I is a first-generation compression/security engine with a PCIe interface. The Panther II is a second-generation compression and security engine with PCIe and FPGA interfaces.
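To make concrete what a Panther-class engine offloads, here is a minimal software-only sketch of the compress-then-encrypt data path, written in Python with the standard zlib module and the third-party cryptography package. It illustrates the pipeline only; it is not Exar’s API, and the parameters are arbitrary.

```python
# Software-only sketch of the compress-then-encrypt data path that a
# Panther-class card would offload to hardware. Not Exar's API.
import os
import zlib

from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

def pack(data: bytes, key: bytes) -> tuple[bytes, bytes]:
    """Compress, then authenticate-and-encrypt. Returns (nonce, blob)."""
    compressed = zlib.compress(data, level=6)
    nonce = os.urandom(12)                      # 96-bit nonce for AES-GCM
    blob = AESGCM(key).encrypt(nonce, compressed, None)
    return nonce, blob

def unpack(nonce: bytes, blob: bytes, key: bytes) -> bytes:
    """Decrypt-and-verify, then decompress."""
    return zlib.decompress(AESGCM(key).decrypt(nonce, blob, None))

if __name__ == "__main__":
    key = AESGCM.generate_key(bit_length=128)
    original = b"log line\n" * 1000
    nonce, blob = pack(original, key)
    assert unpack(nonce, blob, key) == original
    print(f"{len(original)} bytes -> {len(blob)} bytes on the wire")
```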

Set up strong methodology teams to create better verification infrastructure: Synopsys


Arindam Ghosh

This is the third installment on verification, now taken up by Synopsys. Regarding the biggest verification mistakes today, Arindam Ghosh, director – Global Technical Services, Synopsys India, identified the following:

* Spending no time on verification planning (not documenting what needs to be verified) and focusing more on running simulations or on execution.
* No or very low investment in building better verification environments (based on best/new methodologies and best practices); instead maintaining older verification environments.
* Compromising on verification completeness because of tape out pressures and time-to-market considerations.

Would you agree that many companies STILL do not know how to verify a chip?

He said that it could be true for smaller companies or start-ups, but most of the major semiconductor design engineers know about the better approaches/methodologies to verify their chips. However, they may not be investing in implementing the new methodologies for multiple reasons and may instead continue to follow the traditional flows.

One way to address these mistakes would be to set up strong methodology teams to create a better verification infrastructure for future chips. However, few companies are doing this.

Are companies realizing this and building an infrastructure that gives them a business advantage? He added that some companies do realize this and are investing in building a better infrastructure (in terms of better methodology and flows) for verification.

When should good verification start?
When should good verification start — after design, or as you are designing and architecting your design environment? Ghosh said that good verification starts as soon as the team starts designing and architecting the design. Verification leads should start discussing the verification environment components with the lead architect and also start writing the verification plan.

Are folks making a mistake by looking at tools and not at the verification process itself? According to him, tools play a major role in the effectiveness of any verification process, but there is still a lot of scope for methodology improvements beyond the tools.

What needs to go into verification planning, given that the ‘right’ verification path is fraught with complexities? Ghosh said that there is no single, foolproof recipe for a ‘right’ verification path. It depends on multiple factors, including whether the design is a new product or a derivative, the design application, etc. But yes, it is very important to do comprehensive verification planning before starting the verification process.

Synopsys holds that a comprehensive, unified and integrated verification environment is required for today’s revolutionary SoCs, and would offer a fundamental shift forward in productivity, performance, capacity and functionality. Synopsys’ Verification Compiler provides the software capabilities, technologies, methodologies and VIP required for the functional verification of advanced SoC designs in one solution.

Verification Compiler includes:
* Better capacity and compile and runtime performance.
* Next-generation static and formal technology delivering performance improvement and the capacity to analyze a complete SoC (Property checking, LP, CDC, connectivity).
* Comprehensive low power verification solution.
* Verification planning and management.
* Next-generation verification IP and a deep integration between VIP and the simulation engine, which in turn can greatly improve productivity. The constraint engine is tuned for optimal performance with its VIP library. It has integrated debug solutions for VIP, so one can do protocol-level and transaction-based analysis with the rest of the testbench.
* Support for industry standard verification methodologies.
* X-propagation simulation with both RTL and low power simulations.
* Common debug platform with better debug technology, having new capabilities and tight integrations with simulation, emulation, testbench, transaction debug, power-aware debug, hw/sw debug, formal, VIP and coverage.

Top five recommendations for verification
What would be Synopsys’ top five recommendations for verification?

* Spend a meaningful amount of time and effort on verification planning before execution.
* Continuously invest in building a better verification infrastructure and methodologies across the company for better productivity.
* Collaborate with EDA companies to develop, evaluate and deploy new technologies and flows, which can bring more productivity to verification processes.
* Nurture fresh talent through regular on- and off-the-job training (on flows, methodologies, tools and technology).
* Conduct regular reviews of the completed verification projects with the goal of trying to improve the verification process after every tapeout through methodology enhancements.


Cadence: Plan verification to avoid mistakes!


Apurva Kalia

Following Mentor Graphics, Cadence Design Systems Inc. has entered the verification debate. ;) I met Apurva Kalia, VP R&D – System & Verification Group, Cadence Design Systems. In a nutshell, he advised that there needs to be proper verification planning in order to avoid mistakes. First, let’s try to find out the biggest verification mistakes.

Top verification mistakes
Kalia said that the biggest verification mistakes made today are:
* Verification engineers do not define a structured notion of verification completeness.
* Verification planning is not done up front and is carried out as verification is going along.
* A well-defined reusable verification methodology is not applied.
* Legacy tools continue to be used for verification; new tools and technologies are not adopted.

In that case, why do some companies STILL not know how to verify a chip?

He added: “I would not describe the situation as companies not knowing how to verify a chip. Instead, I think a more accurate description of the problem is that the verification complexity has increased so much that companies do not know how to meet their verification goals.

“For example, the number of cycles needed to verify a current generation processor – as calculated by traditional methods of doing verification – is too prohibitive to be done in any reasonable timeframe using legacy verification methodologies. Hence, new methodologies and tools are needed. Designs today need to be verified together with software. This also requires new tools and methodologies. Companies are not moving fast enough to define, adopt and use these new tools and methodologies thereby leading to challenges in verifying a chip.”
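To see why “prohibitive” is not hyperbole, here is a rough, purely illustrative calculation; the numbers are mine, not Kalia’s.

```latex
% Illustrative only: suppose full-chip RTL simulation of a large SoC
% runs at ~100 cycles/second, and a processor needs ~10^{12} cycles of
% stimulus for reasonable coverage. Then
\[
\frac{10^{12}\ \text{cycles}}{10^{2}\ \text{cycles/s}} = 10^{10}\ \text{s} \approx 317\ \text{years}
\]
% on a single simulator seat -- which is why emulation, acceleration
% and smarter methodologies become mandatory.
```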

Addressing challenges
How are companies trying to address the challenges?

Companies are trying to address the challenges in various ways:
* Companies at the cutting edge of designs and verification are indeed trying to adopt structured verification methodologies to address these challenges.

* Smaller companies are trying to address these challenges by outsourcing their verification to experts and by hiring more verification experts.

* Verification acceleration and prototyping solutions are being adopted to speed up verification, which allows companies to do more verification in the same amount of time.

* Verification environment re-use helps to cut down the time required to develop verification environments.

* Key requirements of SoC integration and verification—including functionality, compliance, power, performance, etc.—are hardware/software debug efficiency, multi-language verification, low power, mixed signal, fast time to debug, and execution speed.

Cadence has the widest portfolio of tools to help companies meet verification challenges, including:

Incisive Enterprise Manager, which provides hierarchical verification technology for multiple IPs, interconnects, hardware/software, and plans to improve management productivity and visibility;

The recently launched vManager solution, a verification planning and management solution enabled by client/server technology to address the growing verification closure challenge driven by increasing design size and complexity;

Incisive Enterprise Verifier, which delivers dual power from tightly integrated formal analysis and simulation engines; and

Incisive Enterprise Simulator, which provides the most comprehensive IEEE language support with unique capabilities supporting the intent, abstraction, and convergence needed to speed silicon realization.

Are companies building an infrastructure that gives them a business advantage? Yes, companies are realizing the problems. It is these companies that are the winners in managing today’s design and verification challenges, he said.

Good verification
When should good verification start?

Kalia noted: “Good verification should start right at the time of the high level architecture of the design. A verification strategy should be defined at that time, and an overall verification plan should be written at that time. This is where a comprehensive solution like Incisive vManager can help companies manage their verification challenges by ensuring that SoC developers have a consistent methodology for design quality enhancements.”

Are folks making a mistake by looking at tools and not at the verification process itself?

He added that the right tools and methodology are needed to resolve today’s verification challenges. Users need to work on defining verification methodologies and, at the same time, look at the tools that are needed to achieve verification goals.

Verification planning
Finally, there’s verification planning! What should be the ‘right’ verification path?

Verification planning needs to include:

* A formal definition of verification goals;
* A formal definition of coverage goals at all levels – starting with code coverage all the way to functional coverage;
* Required resources – human and compute;
* Verification timelines;
* All the verification tools to be used for verification; and
* Minimum and maximum signoff criteria.
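To illustrate how such a plan can be made machine-checkable rather than left as a static document, here is a minimal, hypothetical sketch in Python; the feature names and thresholds are invented for illustration and are not from Cadence.

```python
# Hypothetical verification plan captured as data, so signoff criteria
# can be checked mechanically. Names and numbers are illustrative only.
from dataclasses import dataclass

@dataclass
class CoverageGoal:
    feature: str        # what must be verified
    kind: str           # "code" or "functional"
    minimum: float      # minimum signoff threshold, percent
    target: float       # stretch target, percent

PLAN = [
    CoverageGoal("interrupt controller priorities", "functional", 95.0, 100.0),
    CoverageGoal("DMA descriptor rings",            "functional", 90.0, 100.0),
    CoverageGoal("RTL statement coverage",          "code",       98.0, 100.0),
]

def signoff_ready(measured: dict[str, float]) -> bool:
    """True only if every goal meets its minimum signoff criterion."""
    return all(measured.get(g.feature, 0.0) >= g.minimum for g in PLAN)

# Example: coverage numbers as they might come from a regression report.
print(signoff_ready({
    "interrupt controller priorities": 96.2,
    "DMA descriptor rings": 88.5,       # below minimum -> not ready
    "RTL statement coverage": 99.1,
}))  # -> False
```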

Five recommendations for verification: Dr. Wally Rhines


Dr. Wally Rhines

It seems to be the season of verification. The Universal Verification Methodology (UVM 1.2) is being discussed across conferences. Dennis Brophy, director of Strategic Business Development, Mentor Graphics, says that the UVM 1.2 release is imminent, and UVM remains a topic of great interest.

Biggest verification mistakes
Before I add Dennis Brophy’s take on UVM 1.2, I discussed the intricacies of verification with Dr. Wally Rhines, chairman and CEO, Mentor Graphics Corp. First, I asked him about the biggest verification mistakes today.

Dr. Rhines said: “The biggest verification mistake made today is poor or incomplete verification planning. This generally results in underestimating the scope of the required verification effort. Furthermore, without proper verification planning, some teams fail to identify which verification technologies and tools are appropriate for their specific design problem.”

Would you agree that many companies STILL do not know how to verify a chip?

Dr. Rhines added: “I would agree that many companies could improve their verification process. But let’s first look at the data. Today, we are seeing that about 1/3 of the industry is able to achieve first silicon success. But what is interesting is that silicon success within our industry has remained constant over the past ten years (that is, the percentage hasn’t become any worse).

“It appears that, while design complexity has increased substantially during this period, the industry is at least keeping up with this added complexity through the adoption of advanced functional verification techniques.

“Many excellent companies view verification strategically (and as an advantage over their competition). These companies have invested in maturing both their verification processes and teams and are quite productive and effective. On the other hand, some companies are struggling to figure out the entire SoC space and its growing complexity and verification challenges.”

How are companies trying to address those?

According to him, the recent Wilson Research Group Functional Verification Study revealed that the industry is maturing its verification processes through the adoption of various advanced functional verification techniques (such as assertion-based verification, constrained-random simulation, coverage-driven techniques, and formal verification).  Complexity is generally forcing these companies to take a hard look at their existing processes and improve them.
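As a toy illustration of two of the techniques named above, here is a short sketch (mine, not Mentor tooling) pairing constrained-random stimulus with a simple functional coverage model in Python.

```python
# Toy illustration (not Mentor tooling) of constrained-random stimulus
# plus functional coverage measurement, in plain Python.
import random

COVERAGE_BINS = {(size, kind): False            # functional coverage model
                 for size in ("small", "large")
                 for kind in ("read", "write")}

def random_transaction():
    """Constrained-random: only legal lengths, reads biased 3:1 over writes."""
    length = random.choice([1, 4, 8, 64, 256])           # constraint: legal sizes
    kind = random.choices(["read", "write"], weights=[3, 1])[0]
    return length, kind

for _ in range(200):                                     # run random stimulus
    length, kind = random_transaction()
    COVERAGE_BINS[("small" if length <= 8 else "large", kind)] = True

hit = sum(COVERAGE_BINS.values())
print(f"functional coverage: {100 * hit / len(COVERAGE_BINS):.0f}%")
```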

Getting business advantage
Are companies realizing this and building an infrastructure that gives them a business advantage?

He added that in general, there are many excellent companies out there that view verification strategically and as an advantage over their competition, and they have invested in maturing both their verification processes and teams. On the other hand, some other companies are struggling to figure out the entire SoC space and its growing complexity and verification challenges.

When should good verification start?
When should good verification start — after design, or as you are designing and architecting your design environment?

Dr. Rhines noted: “Just like the design team is often involved in discussion during the architecture and micro-architecture planning phase, the verification team should be an integral part of this process. The verification team can help identify architectural aspects of the design that are going to be difficult to verify, which ultimately can impact architectural decisions.”

Are folks mistaken in looking at tools and not at the verification process itself? What can be done to reverse this?

He said: “Tools are important! However, it is equally important to get the most out of the tools and to ensure that the verification solution is an efficient and repeatable process. At Mentor Graphics, we recognize the importance of both. That is why we created the Verification Academy, which focuses on developing skills and maturing an organization’s functional verification processes.”

What needs to go into verification planning, given that the ‘right’ verification path is fraught with complexities?

Dr. Rhines said: “During verification planning, too many organizations focus first on the ‘how’ aspect of verification versus the ‘what.’ How a team plans to verify its designs is certainly important, but first you must identify exactly what needs to be verified. Otherwise, something is likely to slip through.

“In addition, once you have clearly identified what needs to be verified, it’s an easy task to map the functional verification solutions that will be required to productively accomplish your verification goals. This also identifies what skill sets will need to be developed or acquired to effectively take advantage of the verification solutions that you have identified as necessary for your specific problem.”

How is Mentor addressing this situation?

Mentor Graphics’ Verification Academy was created to help organizations mature their functional verification processes—and verification planning is one of the many courses it offers.

In addition, Mentor Graphics’ Consulting provides customized solutions to technical challenges on real projects with real schedules. By helping customers successfully integrate advanced functional verification technologies and methodologies into their work flows, the consulting group helps ensure they meet their design and business objectives.

Five recommendations for verification
Finally, I asked him, what would be your top five recommendations for verification?

Here are the five recommendations for verification from Dr. Rhines:

* Ensure your organization has implemented an effective verification planning process.

* Understand which verification solutions and technologies are appropriate (and not appropriate) for various classes of designs.

* Develop or acquire the appropriate skills within your organization to take advantage of the verification solutions that are required for your class of design.

* For the SoC class of designs, don’t underestimate the effort required to verify the hardware/software interactions, and ensure you have the appropriate resources to do so.

* For any verification processes you have adopted, make sure you have appropriate metrics in place to help you identify the effectiveness of your process—and identify opportunities for process improvements in terms of efficiency and productivity.

Three things in Indian semicon: Vinay Shenoy


Vinay Shenoy

There have been a variety of announcements made by the Government of India in the last year or so. In the pre-90s period, the country showed just a 1 percent GDP growth rate; it was averse to FDI and had a regulated market. All of this led to deregulation under the late PM, PV Narasimha Rao.

The Indian government was averse to foreign investment, which was opened up around 1994. Since then, we have seen 6-8 percent growth, said Vinay Shenoy, MD, Infineon Technologies (India). He was delivering the keynote at the UVM 1.2 day, being held in Bangalore, India.

Around 1997, India signed the ITA-1 with the WTO, and a lot of electronic items had their import duty reduced to zero. It effectively destroyed the electronics manufacturing industry in India; we were reduced to being users of screwdriver technology. Earlier, the National Computer Policy (1985) and the National Software Policy (1986) had been drafted, as the government of India believed that some opportunities existed. The STPI was also created, as well as 100 percent EoUs. So far, we have been very successful in services, but have a huge deficit in manufacturing.

We made an attempt to kick off semicon manufacturing in 2007, but that didn’t take off for several reasons. It was later revived in 2011-12. Under the latest national policy on electronics, there have been a couple of announcements. One, the setting up of two semicon fabs in India, for which the government is giving a capital grant of nearly 25-27 percent – a financial incentive of about $2 billion.

Two, for electronics manufacturing per se, unless a unit is completely export-oriented (an EoU), the semicon industry will find it difficult to survive. There is the M-SIPS package, which offers a 25 percent capital grant to a wide range of industries.

Three, we have granted some incentives for manufacturing, but how are you going to sell? The government has also proposed a ‘Made in India’ preference, under which 30 percent of products procured will be sourced within India. This will largely apply to government procurement, so that the BoM should be at least 30 percent from India. The preferential market policy applies to all segments except defense.

Skill development is also key. The government has clearly stated that there should be innovation-led manufacturing, and it wants to develop PhDs in selected domains, providing better lab facilities, better professors, etc. Also, young professors seeking to expand can seek funding from the government.

TSMC promotes small IP companies; something similar should be done in India. For semicon, the two fabs in India will likely come up in two to three years’ time. “Look at how you can partner with these fabs. Your interest in the semicon industry will be highly critical. The concern of the industry has been the stability of the tax regime. The government of India has assured 10 years of a stable tax regime. The returns will come in 10-15 years,” added Shenoy.

The government has set up electronics manufacturing clusters (EMCs), which will make it easier for companies to set up within an EMC. The NSDC is tying up with universities to bring in the right skill-sets, and the industry is helping define what skills will be required. The government is funding PhDs to pursue specialization.

Global semicon industry trends in 2014: Analog Devices


Sam Fuller

I recently met Sam Fuller, CTO, Analog Devices, and had an interesting conversation. First, I asked him about the state of the global semicon industry in 2013.

Industry in 2013
He said: “Due to the uncertainties in the global economy over the last couple of years, the global semiconductor industry has seen quite modest growth. Because of the modest growth, there has been a buildup in demand. As the global economies become more robust going forward, we expect to see more growth.”

Industry in 2014
How does Analog Devices see the industry going forward in 2014? What are the five key trends?

He added: “I would talk about the trends more from an ecosystem and applications perspective. Increased capability on a single chip: given all the advances driven by Moore’s law, the capability of a chip has increased considerably in all dimensions, not just performance – be it the horsepower we see in today’s smartphones or the miniaturization and power consumption of the wearable gadgets on show this year at CES.

“In Analog Devices’ case, as we are focused on high-performance signal processing, we can put more of the entire signal chain on a single die. For our customers, the challenge is to provide their customers with a more capable, and hence more complex, product, but with a simpler interface.

“A classic example is our AD9361 chip, a single-chip wideband radio transceiver for Software Defined Radio (SDR). It is a very capable ASSP (Application Specific Standard Product) as well as an RF front end, with a wide operating frequency range of 70 MHz to 6 GHz.

“This chip, coupled with an all-purpose FPGA, can build a very flexible SDR operating across different radio protocols, a wide frequency range and varied bandwidth requirements, all controlled via software configuration. It finds a number of applications in wireless communication infrastructure and small-cell base stations, as well as in a whole range of custom radios in the industrial and aerospace businesses.”
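As a flavor of that “controlled via software configuration” point, the sketch below retunes an AD9361-based radio across several bands from Python, using the open-source pyadi-iio bindings (a community-supported way to drive the chip, assumed here; the URI and settings are placeholders, not a reference design).

```python
# Sketch: retuning an AD9361-based SDR entirely in software, using the
# open-source pyadi-iio bindings (pip install pyadi-iio). The URI and
# radio settings below are placeholders, not a reference configuration.
import adi

sdr = adi.ad9361(uri="ip:192.168.2.1")   # device reachable over libiio

sdr.sample_rate = int(30_720_000)        # LTE-like sample rate
sdr.rx_rf_bandwidth = int(18_000_000)
sdr.gain_control_mode_chan0 = "slow_attack"

# The same hardware covers very different radios just by retuning:
for lo_hz in (433_000_000, 915_000_000, 2_400_000_000):
    sdr.rx_lo = lo_hz                    # anywhere in the 70 MHz-6 GHz range
    samples = sdr.rx()                   # complex baseband samples
    print(f"{lo_hz / 1e6:.0f} MHz: captured {len(samples)} samples")
```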

Now, let’s see the trends for 2014!

More collaboration with customers: There is a greater emphasis on understanding customers’ end applications to provide a complete signal chain, all in a System on a Chip (SoC) or a System in a Package (SiP). The relationship with our customers is changing as we move more towards ASSPs developed with a few lead customers for target markets and target applications. While this has already been ongoing in the consumer industry with PCs and laptops, customers in other vertical markets like healthcare, automotive and industrial are and will be collaborating more with semiconductor companies like Analog Devices to innovate at a solutions level.

More complete products: We have evolved from delivering just the silicon at a component level to delivering more complete products with more advanced packaging, for various 3D chips or multi-die within a package. Our solutions now typically include much more software, which makes it easier to configure or program the chips. It is a solution that combines more advanced silicon, advanced packaging and more appropriate software.

In providing the complete solution, the products become more application-specific; hence the need for more collaboration with customers. For example, there may be one product focused on Software Defined Radio, one for motor control, and one for vital-signs monitoring for consumer health that we have launched recently.

We need a product to be generic enough that multiple customers can use it, but it also needs to be as tailored as possible to customers’ needs in specific market segments. While the availability of complete reference designs has been the norm in the consumer world because of its volume and standardization, other market segments are demanding more complete products, notwithstanding the huge variation in protocols and applications.

Truly global industry: The semiconductor and electronics industry has become truly global, with multiple design sites around the globe collaborating to create products. For Analog Devices, for example, one of our premier design sites is our Bangalore product design center, where we have quite literally developed our most complex and capable chips. At the same time, our customers are also global.

We see large multinational companies like GE, Honeywell, Cisco, Juniper, ABB, Schneider and many of our top strategic customers globally doing substantial system design work in Bangalore, along with a multitude of Indian design houses. Our fastest-growing region is Asia, but we have substantial engagement with customers in North America and Europe. Our competition is global too, which means the industry moves ever faster.

Smarter design tools: The final trend worth talking about is the need for smarter design tools. As our products and our customers’ products become more complex and capable, design tools have to develop rapidly for us to be able to design them.

This cannot be done by brute force, but by designing smarter and better tools. There is a lot of innovation that goes into developing better tool suites, and ever more capable software is needed to cater to a market moving from hundreds of transistors to literally billions of transistors per application.

STMicro intros M24SR dynamic NFC/RFID tag


Amit Sethi

STMicroelectronics recently introduced the M24SR dynamic NFC/RFID tag.

Speaking about the USP of the M24SR, Amit Sethi, Product Marketing manager – Memories and RFID, STMicroelectronics India, said: “The unique selling proposition of the M24SR product is its two interfaces, giving users and applications the ability to program or read its memory using either an RF NFC interface or a wired I2C interface, in an affordable and easy-to-use device for a wide range of applications such as consumer/home appliance, OTP card, healthcare/wellness and industrial/smart meter.”

Let us see how the M24SR is beneficial for a smartphone or other devices, such as audio products.

The M24SR is a dynamic NFC/RFID tag that manages the data exchange between the NFC phone and the microcontroller. The main use cases for data exchange are updating user settings, downloading data logs, and remote programming and servicing. The dynamic tag also enables seamless Bluetooth and Wi-Fi pairing, which is useful in, for example, audio devices.

How is the M24SR different from other products of the same segment?

Sethi said that the key difference is the dual interface: the M24SR memory can be accessed either by a low-power I2C interface or by an ISO 14443 Type A RF interface operating at 13.56 MHz. It also features RF status (MCU wake-up) and RF disable functions to minimize power consumption. In addition, the devices support the NFC Data Exchange Format (NDEF, from the NFC Forum) and a 128-bit password protection mechanism.
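Since NDEF is the common language here, the sketch below parses the header of one short NDEF record as defined by the NFC Forum. It is a generic format illustration in Python, not ST driver code; talking to the M24SR itself additionally requires its own I2C command protocol.

```python
# Generic parser for one short-record (SR=1) NDEF record, per the NFC
# Forum NDEF spec. Illustrates the format only; it is not ST's M24SR
# driver, which wraps reads in the device's own I2C command protocol.
def parse_ndef_record(buf: bytes) -> dict:
    flags = buf[0]
    if not flags & 0x10:                       # SR bit: short record?
        raise ValueError("only short records handled in this sketch")
    type_len = buf[1]
    payload_len = buf[2]                       # 1 byte because SR=1
    offset = 3
    id_len = 0
    if flags & 0x08:                           # IL bit: ID field present
        id_len = buf[offset]; offset += 1
    rec_type = buf[offset:offset + type_len]; offset += type_len
    offset += id_len                           # skip ID if present
    payload = buf[offset:offset + payload_len]
    return {"tnf": flags & 0x07, "type": rec_type, "payload": payload}

# Example: a well-known "U" (URI) record for "http://www." + "st.com"
record = bytes([0xD1, 0x01, 0x07, 0x55, 0x01]) + b"st.com"
print(parse_ndef_record(record))
# {'tnf': 1, 'type': b'U', 'payload': b'\x01st.com'}
```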

The M24SR series is available in EEPROM memory densities from 2 Kbit to 64 Kbit and three package types: SO8, TSSOP8, and UFDFPN8.

What are the contributions of the M24SR toward the Internet of Things?

According to him, the M24SR dynamic NFC/RFID tag’s interactive and zero-power capability simplifies complex communications setups and enables data exchange in home automation, wearable electronics, home appliances, smart meters, wellness devices and more.

Especially with its NFC capability, the M24SR is ideal for applications that wait for something, such as a ticket or an ID, to launch an activity.

Relevance for India
Finally, what’s the relevance of the product for the Indian market?

Sethi added: “Mobile- and NFC-based applications are gaining popularity in India. The M24SR is an easy-to-use and affordable product for the implementation of NFC-based applications in the transportation, entertainment and lifestyle areas.”

As for the go-to-market strategy, the M24SR mass-market launch is planned for the end of February 2014. Some M24SR samples were delivered to key customers during Q4 2013, and design/development is ongoing.

What should India do to boost electronics manufacturing?


The IESA 2014 Vision Summit opened today in Bangalore, with the one key question: what does India need to do to boost electronics manufacturing? Here are some words of wisdom from some industry icons.

SR Patil

SR Patil, Minister for IT-BT, Science and Technology, Karnataka, remarked that, at present, India is not able to find any significant place in the global hardware arena, and is heavily dependent on other countries for imports of electronic goods – computers, chips, mobile phones, and the list goes on.

“If I am right, our import bill for electronic goods surpassed $30 billion last year. It is calculated to reach $42 billion by next year if we don’t initiate sincere measures to boost domestic manufacturing. I have no hesitation in saying that we must learn lessons from small countries such as South Korea, Taiwan and Israel on this count.”

The main objective of the Karnataka ESDM policy is to make the state a preferred destination for ESDM investment, and emerge as the ESDM leader in the country.

Patil said: “We aim to generate around 2.4 lakh jobs and 20 percent of the country’s total ESDM export target of $80 billion by the year 2020. We are preparing a ground for setting up of ESDM clusters – both that of Brownfield and Greenfield.”

As many as eight ESDM companies have registered with the IT-BT Department recently, and they are entitled to various incentives and concessions under the new policy.

Dr. Om Nalamasu, senior VP and CTO, Applied Materials Inc., added that establishing a high-value manufacturing industry such as semiconductor chip fabrication will have a transformative effect on the overall electronics industry in India.

This will have a very strong multiplier effect, resulting in major strides forward in the value generated from all sectors within the semiconductor ecosystem – one of the biggest being the growth of high-tech, high value-add employment opportunities in the country. The historic significance of this approval will be felt for many years to come. Manufacturing in India will soon witness a new frontier.

A strong manufacturing base is critical for high-growth economies. There are successful examples in South East Asia where advanced manufacturing has resulted in strong GDP multipliers. In India, there’s a strong electronics market opportunity, driven by telecom, IT, consumer and industrial electronics; 65 percent of these electronic products are imported today. The disposable income of the growing middle class in India and China will continue to drive electronics market growth.

The point is: all of these words have been spoken over and over again! The first semicon policy was announced in 2007-08, followed by a revised policy in 2010-11. In between, the first Karnataka semicon policy was announced. However, there have been very, very few takers, or none at all! Even the first semicon fab policy announcement came to nothing! Later, last year, there was another announcement regarding two fabs that are said to be coming up.

When will India deliver? One hopes that happens soon!

FinFETs delivering on promise of power reduction: Synopsys


Here is the concluding part of my conversation with Synopsys’ Rich Goldman on the global semiconductor industry.

Rich Goldman

Global semicon in sub 20nm era
How is the global semicon industry performing after entering the sub 20nm era? Rich Goldman, VP, corporate marketing and strategic alliances, Synopsys, said that driving the fastest pace of change in the history of mankind is not for the faint of heart. Keeping up with Moore’s Law has always required significant investment and ingenuity.

“The sub-20nm era brings additional challenges in device structures (namely FinFETs), materials and methodologies. As costs rise, a dwindling number of semiconductor companies can afford to build fabs at the leading edge. Those thriving include foundries, which spread capital expenses over the revenue from many customers, and fabless companies, which leverage foundries’ capital investment rather than risking their own. Thriving, leading-edge IDMs are now the exception.

“Semiconductor companies focused on mobile and the Internet of Things are also thriving as their market quickly expands. Semiconductor companies who dominate their space in such segments as automotive, mil/aero and medical are also doing quite well, while non-leaders find rough waters.”

Performance of FinFETs
Have FinFETs gone below 20nm? And are those looking for power reduction now benefiting?

He added that 20nm was a pivotal point in advanced process development. The 20nm process node’s new set of challenges, including double patterning and very leaky transistors due to short channel effects, negated the benefits of transistor scaling.

To further complicate matters, the migration from 28nm to 20nm lacked the performance and area gains seen with prior generations, making it economically questionable. While planar FET may be nearing the end of its scalable lifespan at 20nm, FinFETs provide a viable alternative for advanced processes at emerging nodes.

The industry’s experience with 20nm paved the way for an easier FinFET transition. FinFET processes are in production today, and many IC design companies are rapidly moving to manufacture their devices on the emerging 16nm and 14nm FinFET-based process geometries due to the compelling power and performance benefits. Numerous test chips have taped out, and results are coming in.

“FinFET is delivering on its promise of power reduction. With 20nm planar FET technologies, leakage current can flow across the channel between the source and the drain, making it very difficult to completely turn the transistor off. FinFETs provide better channel control, allowing very little current to leak when the device is in the ‘off’ state. This enables the use of lower threshold voltages, resulting in better power and performance. FinFET devices also operate at a lower nominal voltage supply, significantly improving dynamic power.”
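To put rough numbers on that last point, recall the standard CMOS dynamic power relation; the figures below are generic arithmetic, not Synopsys measurements.

```latex
% Dynamic (switching) power of CMOS logic:
\[
P_{\mathrm{dyn}} = \alpha \, C \, V_{DD}^{2} \, f
\]
% (activity factor alpha, switched capacitance C, supply V_DD, clock f).
% Because power scales with the square of the supply, lowering the
% nominal V_DD from 0.9 V to 0.8 V gives (0.8/0.9)^2 ~ 0.79, i.e. about
% a 21\% dynamic power saving at the same clock frequency.
```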

Atrenta on outlook for EDA in 2014


I had interacted with Dr. Ajoy Bose, CEO of Atrenta, some months ago. It was a pleasure to meet up recently with Piyush Sancheti, VP of Marketing. First, I asked him about the outlook for EDA in 2014.

Piyush Sancheti

Outlook for EDA
Piyush Sancheti said: “EDA does not look that attractive from a growth standpoint. However, you cannot do SoC designs without EDA. Right now, EDA’s focus is on implementation. The re-use of IP has been doing the rounds for many years. The drivers for SoCs are mobile and the Internet of Things. The design cycle for those markets is very short – about three months. The EDA business is shifting to IP re-use. The focus is now toward design aggregation.

“We will have done roughly 66 percent of business – net new – with existing customers. There is an industry shift toward doing more on the front end. EDA growth will come from IP-SoC involvement.

“Sub-20nm has challenges. ST says FD-SOI is the way to go. The complexity of the process plays a big role, and the number of chips you put in will also increase. At 14/16nm, we have an investment going on in 3D design. We are extending our 2D tool into a 3D tool. We are also investing in IP qualification. We have standardized a set of design rules in RTL. There are about 30 companies in the TSMC ecosystem.

“Our main focus is IP enablement for SoCs. First, IP qualification ensures that an IP meets guidelines. Second, acceptance: making sure all the IPs fit in the blocks. Third, integration. We already have this technology and it is driving the business.”

3D design
What’s Atrenta’s take on 3D design? Sancheti replied: “The industry has been slow, as 3D designs are not yet at a point of business success. A focus on monolithic 3D-ICs will be a paradigm shift for the semicon industry. For commercial design, 20nm is still mainstream, but 14/16nm does not look mainstream as of now. The process node is not necessarily a driver of innovation. EDA as an industry will remain in single-digit growth.”

How will EDA move into the embedded software space?

Sancheti said: “We’ve looked into that market. But, the price point is significantly lower. Over time, it could be a strategic area for us. Over time, embedded software development and chip design will co-mingle.”

ESL is where the future of EDA lies. Still true? He added that the future of EDA lies in moving up the abstraction ladder: it has to head toward the integration of embedded software and chip development. However, ESL is not the only viable option.

Atrenta has 220 people in India – about 10 in Bangalore and 200 in Noida. Sushil Gupta runs the India operations. It has tie-ups with IIT Delhi and IIT Kharagpur as well. Atrenta sees a lot of scope for working with Indian start-ups.
