It is always a pleasure speaking with Dr. Walden (Wally) C. Rhines, chairman and CEO, Mentor Graphics Corp. I met him on the sidelines of the 13th Global Electronics Summit, held at the Chaminade Resort & Spa, Santa Cruz, USA.
Status of global EDA industry
First, I asked Dr. Rhines how the EDA industry was doing. Dr. Rhines said: “The global EDA industry has been doing pretty well. The results have been pretty good for 2012. In general, the EDA industry tends to follow the semiconductor R&D by at least 18 months.”
For the record, the electronic design automation (EDA) industry revenue increased 4.6 percent for Q4 2012 to $1,779.1 million, compared to $1,700.1 million in Q4 2011.
Every region, barring Japan, grew in 2012. The Asia Pacific rim grew the fastest – about 12.5 percent. The Americas was the second fastest region in terms of growth at 7.4 percent, and Europe grew at 6.8 percent. However, Japan decreased by 3 percent in 2012.
In 2012, the fastest-growing segments within the EDA industry were PCB design and IP. The front-end CAE (computer-aided engineering) group grew faster than the back-end. By product category, CAE grew 9.8 percent. Overall growth for license and maintenance was 7 percent. Among the CAE areas, design entry grew 36 percent and emulation 24 percent.
DFM also grew 28 percent last year. Overall, PCB grew 7.6 percent, while PCB analysis grew 25 percent. IP grew 12.6 percent, while verification IP grew 60 percent. Formal verification and power analysis each grew 16 percent. “That’s actually a little faster than how semiconductor R&D is growing,” added Dr. Rhines.
Status of global semicon industry
On the fortunes of the global semiconductor industry, Dr. Rhines said: “The global semiconductor industry grew very slowly in 2012. Year 2013 should be better. Revenue was also affected by a lot of consolidation in the wireless industry.”
According to him, smartphones should see further growth. “There are big investments in capacity in the 28nm segment. Folks will likely redesign their products over the next few years,” he said. “A lot of firms are waiting for FinFET to go to 20nm. People who need it for power reduction should benefit.”
“A lot of people are concerned about Japan. We believe that Japan can recover due to the Yen,” he added.
It is always a pleasure to chat with Dr. Wally (Walden C.) Rhines, chairman and CEO of Mentor Graphics. I chatted with him, trying to understand gigascale design, verification trends, strategy for power-aware verification, SERDES design challenges, migrating to 3D FinFET transistors, and Moore’s Law getting to be “Moore Stress”!
Chip design: gigascale, gigahertz, gigacomplex
First, I asked him to elaborate on how implementation of chip design will evolve, with respect to gigascale design, gigahertz and gigacomplex geometries.
He said: “Thanks to close cooperation among members of the foundry ecosystem, as well as cooperation between IDMs and their suppliers, serious development of design methods and software tools is running two to three generations ahead of volume manufacturing capability. For most applications, ‘gigascale’ power dissipation is a bigger challenge than managing the complexity, but ‘system-level’ power optimization tools will continue to allow rapid progress. Thermal analysis is becoming part of the designer’s toolkit.”
Functional verification is continually challenged by complexity, but there have been, and continue to be, orders-of-magnitude improvements in performance just from the adoption of emulation, intelligent test benches and formal methods, so this will not be a major limitation.
The complexity of new physical design problems will, however, be very challenging. Design problems ranging from basic ESD analysis, made more complex due to multiple power domains, to EMI, electromigration and intra-die variability are now being addressed with new design approaches. Fortunately, programmable electrical rule checking is being widely adopted and will help to minimize the impact of these physical effects.
Is verification keeping up?
How is the innovation in verification keeping up with trends?
Dr. Rhines added that over the past decade, microprocessor clock speeds have leveled out at 3 to 4 GHz and server performance improvement has come mostly from multi-core architectures. Although some innovative approaches have allowed simulators to gain some advantage from multi-core architectures, the speed of simulators hasn’t kept up with the growing complexity of leading edge chips.
Emulators have more than made up the difference: they offer more than four orders of magnitude faster performance than simulators, at about 0.005X the cost per cycle of simulation. The cost of power per year is more than one third the cost of hardware in a large simulation farm today, while emulation offers a 12X savings in power per verification clock cycle. For those who design really complex chips, a combination of emulation and simulation, along with formal methods and intelligent test benches, has become standard.
At the block and subsystem level, high level synthesis is enabling the next move up in design and verification abstraction. Since verification complexity grows at about the square of component count, we have plenty of room to handle larger chips by taking advantage of the four orders of magnitude improvement through emulation plus another three or four orders of magnitude through formal verification techniques, two to three orders of magnitude from intelligent test benches and three orders of magnitude from higher levels of abstraction.
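As a back-of-the-envelope illustration of that headroom argument (my own arithmetic, not a figure from the interview): if verification effort scales roughly as the square of component count, then a combined speedup of 10^k supports a sqrt(10^k)-fold increase in component count at constant verification time.

```python
import math

def verifiable_scale_up(orders_of_magnitude: float) -> float:
    """If verification effort ~ N^2, a 10^k speedup supports a
    sqrt(10^k)-fold increase in component count N."""
    speedup = 10 ** orders_of_magnitude
    return math.sqrt(speedup)

# Orders of magnitude cited above: ~4 (emulation) + ~3 (formal)
# + ~2 (intelligent test benches) + ~3 (higher abstraction).
# Treating them as fully multiplicative is optimistic, but illustrative.
combined = 4 + 3 + 2 + 3  # 12 orders of magnitude
print(f"Component count could grow ~{verifiable_scale_up(combined):,.0f}x")
# → Component count could grow ~1,000,000x
```

Even if the individual gains overlap and only a fraction compounds in practice, the square-root relationship means the available speedups translate into very large sustainable growth in design size.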
By applying multiple engines and multiple abstraction levels to the challenge of verifying chips, the pressure is on to integrate the flow. Easily transitioning and reusing verification efforts from every level—including tests and coverage models, from high level models to RTL and from simulation to emulation—is being enabled through more powerful and adaptable verification IP and high level, graph-based test specification capabilities. These are keys to driving verification reuse to match the level of design reuse.
Powerful verification management solutions enable the collection of coverage information from all engines and abstraction levels, tracking progress against functional specifications and verification plans. Combining verification cycle productivity growth from emulation, formal, simulation and intelligent testing with higher verification abstraction, re-use and process management provides a path forward to economically verifying even the largest, most complex chips on time and within budget.
Good power-aware verification strategy for SoCs
What should be a good power-aware verification strategy for SoCs?
According to him, the most important guideline is to start power-aware design at the highest possible level of system description. The opportunity to reduce system power is typically an order of magnitude greater at the system level than at the RTL level. For most chips today, that means at least the transaction level when the design is still described in C++ or SystemC.
Significant experience and effort should then be invested at the RTL level using synthesis and UPF-enabled simulation. Verification solutions typically automate the generation of correctness checks for power-control sequences and power-state coverage metrics. As SoC power is typically managed by software, the value of a hardware/software co-verification and co-debug solution in simulation and emulation becomes apparent in power-management verification at this level.
As designers proceed to the gate and transistor level, the accuracy of power estimation improves. That is why gate-level analysis and verification of the fully implemented power management architecture is important. Finally, at physical layout, designers traditionally were stuck with whatever power budget was passed down to them. Now, they increasingly have power goals that can be achieved using dozens of physical design techniques built into the place and route tools.
Today, EDA requires specialization. Elaborating on EDA over the past decade, Dr. Walden (Wally) C. Rhines, chairman and CEO of Mentor Graphics, and vice chairman of the EDA Consortium, USA, said that PCB design has been flat despite growth in analysis, DFM and new emerging markets. Front-end design has seen growth from RF/analog design, simulation and analysis. As design methodologies mature, EDA expenditures stop growing. He was speaking at Mentor Graphics’ U2U (User2User) conference in Bangalore, India.
Most of the EDA revenue growth comes from major new design methodologies, such as ESL, DFM, analog/mixed-signal and RF. The PCB design market, including license and maintenance, continues to be flat. The IC layout verification market is pointing to a 2.1 percent CAGR at the end of 2011. The RTL simulation market has been growing at 1.3 percent CAGR for the last decade, and the IC physical implementation market at 3.4 percent CAGR over the same period.
Growth areas in EDA from 2000-2011 include DFM at 28 percent CAGR, formal verification at 12 percent, ESL at 11 percent, and IC/ASIC analysis at 9 percent.
What will generate the next wave of electronic product design challenges, and the future growth of EDA? This would involve solving new problems that are not part of traditional EDA: ‘do what others don’t do!’
Methodology changes that may change EDA
There are five factors that can make this happen. These are:
* Low power design beyond RTL (and even ESL).
* Functional verification beyond simulation.
* Physical verification beyond design for manufacturability.
* Design for test beyond compression.
* System design beyond PCBs.
Low power design at higher levels
Power affects every design stage. Sometimes, designing for low power at system level is required. System level optimization has the biggest impact on power/performance. And, embedded software is a major point of leverage.
Embedded software has an increasing share of the design effort. Here, Mentor’s Nucleus power management framework is key. It has a unique API for power management, enables software engineers to optimize power consumption, and reduces lines of application code. Also, power-aware design optimizes code efficiency.
Functional verification beyond RTL simulation
The verification methodology standards war is over. UVM is expected to grow by 286 percent in the next 12 months. Mentor Graphics’ Questa inFact is the industry’s most advanced testbench automation solution. It enables testbench reuse and accelerates time-to-coverage. Intelligent test benches facilitate a linear transition to multi-processing.
Questa accelerates the hardware/software verification environment. In-circuit emulation has been evolving toward virtual hardware acceleration and embedded software development. Offline debug increases development productivity: a four-hour on-emulator software debug session drops to a 30-minute batch run, and offline debug allows 150 software designers to jumpstart the debug process on source code. Virtual stimulus increases the flexibility of the emulator. As an example, Veloce is 700x more efficient than large simulation farms.
Physical verification beyond design for manufacturability
Calibre PERC is a new approach to circuit verification, while Calibre 3DSTACK is the verification flow for 3D stacked ICs.
It always gives me great pleasure chatting with Dr. Walden (Wally) C. Rhines, chairman and CEO of Mentor Graphics, and vice chairman of the EDA Consortium, USA. 2013 is just around the corner. What lies ahead for the global semiconductor industry is a question on everyone’s lips! How will the EDA industry do next year? For that matter, what should the Indian semiconductor industry look forward to next year?
Three trends for 2013
First, I asked Dr. Wally Rhines regarding the trends in the global semiconductor industry. He cited:
* Growth in communication ICs.
* Growth in the third dimension.
* Accelerated design activity at the leading edge.
Growth in communication ICs: On the macro level, silicon area shipments continue to grow gradually, as do semiconductor unit shipments. However, there’s a major shift in application segments from computing to communications. Communications used to be only one third the size of computing in terms of semiconductor usage.
Communications are expected to surpass computing in terms of semiconductor consumption by 2014 thanks to the rapid growth of wireless applications, the incorporation of computing into communications devices like smart phones and the addition of communications to computing devices like tablet computers.
Growth in the third dimension: Shrinking feature sizes and growing wafer diameters will continue to contribute to the annual 30 percent decrease in the average cost per transistor and average 72 percent unit growth of transistors, but they will do so at a diminished rate. Fortunately, other avenues are emerging that can help sustain the semiconductor industry’s remarkable rate of growth. One largely untapped opportunity is in the third dimension, i.e. growing vertically instead of shrinking in the XY plane.
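As a quick sanity check of how those two rates interact (my own arithmetic, not a figure from the interview), a 30 percent annual drop in cost per transistor combined with 72 percent annual unit growth implies roughly 0.70 × 1.72 − 1, or about 20 percent annual growth in transistor revenue:

```python
# Compounding the two rates cited above (assumed constant year over year):
cost_change = 1 - 0.30   # cost per transistor falls ~30 percent per year
unit_growth = 1 + 0.72   # transistor unit shipments grow ~72 percent per year

# Revenue per year scales with (cost per unit) x (units shipped).
revenue_growth = cost_change * unit_growth - 1
print(f"Implied transistor revenue growth: {revenue_growth:.1%} per year")
# → Implied transistor revenue growth: 20.4% per year
```

The point of the compounding is that even as the cost-per-transistor decline moderates, unit growth can keep overall industry revenue expanding, which is why the vertical dimension matters as a new growth lever.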
DRAM stacks of eight or more die are already possible, although they are still more expensive on a cost per bit basis compared to unstacked devices. Complex packaged systems made up of multiple heterogeneous die, memory stacked on logic and interposers to connect the die are evolving rapidly. Layers in the IC manufacturing process continue to increase as well.
Accelerated design activity at the leading edge: Another interesting trend is the recent surge in capital spending among foundries to add capacity at the leading edge. This wave of spending will result in excess capacity, at least initially, which may force foundries to lower prices to boost demand. In fact, capacity utilization data in the last few months shows a dramatic decline in utilization at 28/32nm and 22nm nodes, suggesting that excess capacity is already happening to an extent.
While differences in 28 and 20nm processes—such as double patterning—create challenges, the existing capital equipment is largely compatible with both processes. Such a high volume of wafers and the large available capacity will lead to increasingly aggressive wafer pricing over time. As a result, cost-effective wafers from foundries will encourage totally new designs that would not have been possible at today’s wafer cost.
Industry outlook 2013
So, how is the outlook for 2013 going to shape up? Dr. Rhines said: “After almost no growth in 2012, most analysts are expecting improvement in semiconductor market growth in the coming year. Currently, the analyst forecasts for the semiconductor industry in 2013 range from 4.2 percent on the low side to 16.6 percent on the high side, with most firms coming in between 6 percent and 10 percent. The average of forecasts among the major semiconductor analyst firms is approximately 8.2 percent.
“However, most semiconductor companies are less optimistic in their published outlooks. This seems to be influenced by the level of uncertainty that exists because of unknown government actions and market conditions in the US, Europe and China.”
Any more consolidations?
It would be interesting to hear Dr. Rhines’ opinion on any further consolidation within the industry. He said: “It is a common misperception that the semiconductor industry is consolidating. A closer look at the data shows that the semiconductor industry has been doing the opposite. It has been DE-consolidating for more than 40 years.
“Take the #1 semiconductor supplier, Intel. Intel’s market share is the same today as it was a decade ago. And, the combined market share of the top five semiconductor suppliers has been slowly declining since the 1960s. Similar trends also apply to the top ten and top 50: both are the same as or lower than they were a decade ago, and indeed decades ago. In fact, the combined market share of the top 50 semiconductor companies has decreased 11 points in the last 12 years.”
It is always a pleasure interacting with Dr. Walden (Wally) C. Rhines, the chairman and CEO, Mentor Graphics, and vice chairman of the EDA Consortium, USA. I started by enquiring about the global semiconductor industry.
Dr. Wally Rhines said: “The absolute size of the semiconductor industry (in terms of total revenue) differs depending on which analyst you ask, because of differences in methodology and the breadth of analysts’ surveys. Current 2012 forecasts include $316 billion from Gartner, $320 billion from IDC, $324.5 billion from IHS iSuppli, $327.2 billion from Semico Research and $339 billion from IC Insights.
“These numbers reflect growth rates from 4 per cent to 9.2 per cent, based on the different analyst-specific 2011 totals. Capital spending forecasts for the three largest semiconductor companies have increased by almost 50 per cent just since the beginning of this year. However, the initial spurt of demand was influenced by the replenishment of computer and disc drive inventories caused by the Thailand flooding. Now that this is largely complete, there is some uncertainty about the second half.
“So, overall it looks like the industry will pass $310 billion this year, but it may not be by very much. The strong capital spending and demand for leading edge capacity should impact the second half, but the bigger impact will probably be in 2013.”
What’s with 28/20nm?
Has 28/20nm semiconductor technology become a major ‘work horse’? What’s going on in that area? At least, this area is now of considerable interest.
Dr. Rhines said that the semiconductor industry’s transition to the 28nm family of technologies, which broadly includes 32nm and 20nm, is a much larger transition than we have experienced for many technology generations.
The world’s 28nm-capable capacity now comprises almost 20 per cent of the total silicon area in production and yet, the silicon foundries are fully loaded with more 28nm demand than they can handle. In fact, high demand for 28/20nm has created a capacity pinch that is currently spurring additional capital expenditure by foundries.
He added: “As yields and throughput mature at 28nm, the major wave of capital investment will provide plentiful foundry capacity at lower cost, stimulating a major wave of design activity. Cost-effective, high yield 28nm foundry capacity will not only drive increasing numbers of new designs but it will also force re-designs of mature products to take advantage of the cost reduction opportunity.”
According to Dr. Walden (Wally) C. Rhines, chairman and CEO, Mentor Graphics Corp., while fabless startups have declined substantially in the West during the past decade, they are growing in India.
Given the time required to grow large fabless companies in the past, India should not be discouraged by current progress. India has key capabilities to stimulate growth of fabless companies, such as:
* Design services companies.
* Design engineering expertise and innovation.
* Returning entrepreneurs.
* Educational system.
Semiconductor frustrations abound! I recall a discussion in mid-2005 where an industry expert mentioned that fabless was the way forward for the Indian industry! Between then and now, fabs were supposed to come up, but they failed. Nevertheless, one must not give up hope!
As of now, there seems to be too much focus on services, multinational company dominance, perceived lack of progress, perceived lag compared to China, lack of foundry infrastructure, and no clear dominant indigenous Indian company.
Of the top 50 semiconductor companies in 2011, 12 are fabless and four are foundries. Fabless IC revenue has been growing at 17 percent CAGR since 1997 and will continue to grow; the fabless segment has also been gaining share in the overall market. However, fabless revenue is said to be highly concentrated. He added that the leading fabless companies specialize, and average ~23 years since formation. Also, VC funding for fabless semiconductor companies has been declining in the West. As for the number of fabless companies, the GSA put it at 1,200 at the end of 2010.
According to Dr. Rhines, the semiconductor IP market would grow to about $3,707 million by 2015, at a CAGR of 14 percent. The leading semicon IP players specialize and average 22 years in business (similar to fabless).
Now, India is said to be among the top five semiconductor design locations worldwide (SIP + fabless + design services). Also, India is a leading source of semicon IP, accounting for 5.3 percent globally. From the looks of it, India seems to have built a foundation for a fabless future. India can well become the next great fabless incubator!
The global semiconductor industry is not consolidating, said Dr. Walden (Wally) Rhines, while making the keynote presentation at the ongoing Mentor Graphics’ U2U conference in Bangalore, India.
According to a survey, the No. 1’s market share has been relatively flat since 1972. The combined share of the top five semiconductor companies has been nearly the same as it was in 1972. The share of the top 10 companies has also been nearly the same, but less than the historical average. If you look at the numbers, it is also evident that Texas Instruments’ (TI) acquisition of National Semiconductor has had negligible impact. Also, the market share of the top 50 semiconductor companies continues to decline, especially in the last decade.
The answer lies in the fact that manufacturing is consolidating, while the semiconductor industry is not! Foundries’ share of semiconductor revenue has increased, even as their share of total IC production has been flat. However, foundry capex has tripled over the last two years. Also, 28nm/32nm capacity has been sold out, and prices are rising. Foundries are likely to ramp and record the highest market share of the 28nm/20nm market. Foundry spending is said to be at an all-time high as a percentage of total capex.
There have been significant design changes. In fact, 28nm has now become the ‘work horse’ technology, with high yields at lower costs. The accelerated design activity has seen redesigns take advantage of smaller-node efficiencies. So, how can you prepare? Perhaps do more design in less time with the same resources, or focus more on low-power devices.
Significant changes are coming in design. Because of 2010/2011 capital expenditures, 28/20nm semiconductor technology will become a major “work horse” compared to previous technology generations. Plenty of wafers will be available from silicon foundries. Yields will be high and costs will be low. As a result, design activity will accelerate beginning in late 2012 to take advantage of the 28nm capability and capacity. Favorable costs and yields will cause semiconductor companies to redesign 180/130/90/65/45nm products into 28/20nm versions while adding functionality. Totally new applications will emerge because of the 28/20nm capability and cost, thus growing the semiconductor market in 2014+.
Implications of plentiful 28/20nm foundry capacity include: 28nm will become “work horse” technology. There will be high yields and low costs, as well as an accelerated design activity. Redesigns will take advantage of smaller node cost efficiencies. New designs will leverage the additional transistors.
In future, more companies will be moving to ESL-based design. Place and route will be more in fashion: 20nm double patterning, as well as DFM and integrated verification. DFM will pave the way for designing for reliability. The metal layer stack has been doubling as device complexity rises. Another factor is reliability checking: initial efforts are being offered in the TSMC AMS flow 2.0, with more to be offered by TSMC for the digital flow in Q4.
Thanks to my friend, Veeresh Shetty at Mentor Graphics, I was able to meet up with Dr. Walden (Wally) C. Rhines, chairman and CEO, Mentor Graphics, as well as with Hanns Windele, VP Mentor Graphics (Europe & India), for a short conversation regarding the global semiconductor industry.
Growth of global semicon industry
First, I sought Dr. Rhines’ views on the growth of the global semiconductor industry. Dr. Rhines said: “Capital investment in the foundries has been quite high. TSMC, GlobalFoundries, Samsung, etc., have doubled their investments. In 2012, some of the foundries will run at a lower percentage of capacity. If that happens, foundry wafer prices might fall. However, equipment prices would not decrease.”
So, what has the industry learned from the previous recession? He said: “Capacity in the semicon industry was relatively tight in Q4 2008. In 2009, we called it an inventory correction. If we had not had a recession, there would have been a capacity shortage.
“Now, companies seem to have caught up. There was large investment in the manufacturing capacity in 2010, and that has continued into 2011. There is more new capacity coming into foundries by 2012. Investment in memory has been modest. However, fabless companies should find more capacity in 2012.”
Hanns Windele added: “The automotive industry was contributing to all of this as well. As of now, 45 percent is consumed by the computer industry, 20 percent by the communications industry, and consumer electronics and automotive account for 5-10 percent, approximately.”
It appears that communications is attached to everything one buys today. Dr. Rhines also reckoned that PC shipments were holding up well, for now.
According to Dr. Walden (Wally) C. Rhines, chairman and CEO, Mentor Graphics Corp., customers pay a premium for differentiated products. Gross profit margin (GPM) percentage is the best measure for the differentiation of a manufactured product. The difficulty of switching suppliers is proportional to differentiation and GPM. He was speaking at the EDA Tech Forum 2011 in Bangalore, India.
As an example, Apple released the Mac Classic to compete with IBM clones and regain market share in the PC industry. However, it did not gain market ascendancy. It was only when Apple introduced the iPod in 2H 2001 that things began changing. Later, it introduced the iPhone in 2H 2007. The rest is, for now, history.
Product differentiation is said to be easiest in new and emerging markets. Apple has since invested in semiconductor design, while Nokia has divested. Apple now reduces power and improves performance through design differentiation.
On the other hand, Nokia has divested its IC design, as it is now difficult to create a differentiated ecosystem even for the leader. As of now, it is using Windows Phone 7 for developing smartphones. The question is: is Android vs. iPhone an analogy to the PC vs. the Mac?
The smartphone market will eventually commoditize. However, this time there has been substantial differentiation. Products provide only temporary differentiation; a company-created infrastructure sustains it, and a third-party ecosystem drives longer-term differentiation.
Now, Apple isn’t the only company with sustainable differentiation. Intel and AMD have also invested in application development, with the differentiation of Intel’s x86 MPUs a prime example.
In mature commodity products, system integration reduces cost and power while increasing performance. For example, Texas Instruments’ calculators are commoditized and selling well. The practice of involving education in product development has helped TI.