
Archive for the ‘global semiconductor industry’ Category

Cadence Quantus solution meets 16nm FinFET challenges


Cadence Design Systems Inc. recently announced that its Quantus QRC Extraction Solution has been certified for TSMC's 16nm FinFET process.

So, what is unique about the Cadence Quantus QRC Extraction Solution?


KT Moore, senior group director – Product Marketing, Digital and Signoff Group, Cadence Design Systems, said: “There are several parasitic challenges that are associated with advanced node designs — especially FinFET – and it’s not just about tighter geometries and new design rules. We can bucket these challenges into two main categories: increasing complexity and modeling challenges.

“The number of process corners is exploding, and for FinFET devices specifically, there is an explosion in the parasitic coupling capacitances and resistances. This increases the design complexity and sizes. The netlist is getting bigger and bigger, and as a result, there is an increase in extraction runtimes for SoC designs and post-layout simulation and characterization runtimes for custom/analog designs.

“Our customers consistently tell us that, for advanced nodes, and especially for FinFET designs, while their extraction runtimes and time-to-signoff are increasing, their actual time-to-market is shrinking, putting an enormous amount of pressure on designers to deliver on-time tapeout. In order to address these market pressures, we have applied the massively parallel technology that was first introduced in our Tempus Timing Signoff Solution and Voltus IC Power Integrity Solution to our next-generation extraction tool, Quantus QRC Extraction Solution.

“Quantus QRC Extraction Solution enables us to deliver up to 5X better performance than competing solutions and allows scalability of up to 100s of CPUs and machines.”
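Cadence has not published the internals of that massively parallel engine, but the general pattern Moore describes (partition the design, extract each partition on a separate CPU or machine, then merge the results) can be sketched conceptually. The Python snippet below is only a hypothetical illustration of that pattern; the chunk-based partitioning, the extract_partition placeholder and the returned R/C values are all invented, not taken from Quantus QRC.

```python
# Conceptual sketch only -- not Cadence's implementation. It illustrates the
# generic pattern behind massively parallel extraction: split the work,
# extract each piece independently on a worker process, then merge.
from concurrent.futures import ProcessPoolExecutor
from itertools import chain

def extract_partition(partition):
    """Placeholder for per-partition RC extraction.

    'partition' is a list of net names; a real extractor would compute
    parasitic resistance and capacitance from the layout geometry.
    """
    return [(net, {"R_ohm": 1.0, "C_fF": 0.5}) for net in partition]

def parallel_extract(nets, n_workers=8, chunk=1000):
    partitions = [nets[i:i + chunk] for i in range(0, len(nets), chunk)]
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        results = pool.map(extract_partition, partitions)
    # Merge the per-partition results into one parasitic table.
    return dict(chain.from_iterable(results))

if __name__ == "__main__":
    nets = [f"net_{i}" for i in range(10_000)]
    parasitics = parallel_extract(nets)
    print(f"{len(parasitics)} nets extracted")
```

In a real extractor, partitions must overlap enough to capture coupling across their boundaries, which is a large part of what makes accurate parallel extraction hard; the sketch only shows why independent partitions scale almost linearly with CPU count.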

Support for FinFET features
How is Quantus providing significant enhancements to support FinFET features?

Parasitic extraction is at the forefront with the introduction of any new technology node. For FinFET designs, it's a bit more challenging due to the introduction of non-planar FinFET devices. There are more layers to be handled, more RC effects that need to be modeled, and the introduction of local interconnects. There are also secondary and third-order manufacturing effects that need to be modeled, and all these new features have to be modeled with high accuracy.

Performance and turnaround times are absolutely important, but if you can't provide accuracy for these devices — especially in correlation with the foundry golden data — designers will have to over-margin their designs and leave performance on the table.

Best-in-class accuracy
How can Cadence claim that it has the ‘tightest correlation to foundry golden data at TSMC vs. competing solutions’? And, why 16nm only?

According to Moore, the foundry partner, TSMC, asserts that Quantus QRC Extraction Solution provides best-in-class accuracy, which was referenced in the recent press announcement:

“Cadence Quantus QRC Extraction Solution successfully passed TSMC’s rigorous parasitic extraction certification requirements to achieve best-in-class accuracy against the foundry golden data for FinFET technology.”

“FinFET structures present unique challenges since they are non-planar devices, as opposed to their planar CMOS predecessors. We partnered with TSMC from the very beginning to address the modeling challenges, and over the years we have accurately modeled many complex shapes and structures.

“We're not surprised that TSMC has recognized our best-in-class accuracy because we're the leader in providing extraction solutions for RF designs. Cadence Quantus QRC Extraction Solution has been certified for TSMC 16nm FinFET; however, it's important to note that we've been certified for all other technology nodes, and our QRC techfiles are available to our customers from TSMC today.”

SEMI materials outlook: Semicon West 2014


Source: SEMI, USA.

At the recently held Semicon West 2014, Daniel P. Tracy, senior director, Industry Research and Statistics, SEMI, presented the SEMI Materials Outlook. He estimated that semiconductor materials will see unit growth of 6 percent or more. There may be low revenue growth in a large number of segments due to pricing pressures and changes in materials.

For semiconductor equipment, he estimated ~20 percent growth this year, following two years of spending decline. Spending growth of ~11 percent is currently estimated for 2015.

Overall, the year-to-date estimate shows positive growth vs. the same period in 2013 for unit and materials shipments, and for equipment billings.

The equipment outlook points to ~18 percent growth in equipment for 2014. Total equipment orders are up ~17 percent year-to-date.

For the wafer fab materials outlook, monthly silicon area shipments are currently at an all-time high. Lithography process chemicals saw a 7 percent sales decline in 2013, and the 2014 outlook is for downward pressure on ASPs for some chemicals. 193nm resists are approaching $600 million, and ARC has been growing 5-7 percent.

For packaging materials, flip chip is a key growth driver, with unit growth of ~25 percent from 2012 to 2017. There are trends toward copper pillar and micro bumps for TSV. Future flip chip growth in wireless products is driven by form factor and performance, and baseband (BB) and application processor (AP) chips are also moving to flip chip.

There has been growth in wafer-level package (WLP) shipments. Major applications for WLP are driven by mobile products such as smartphones and tablets. WLP shipments should grow at a CAGR of ~11 percent in units (2012-2017).

Solder balls were a $280 million market in 2013, and shipments of lead-free solder balls continue to increase. Underfills were $208 million in 2013, including underfills for flip chip and packages. Underfills are increasingly used in CSPs and WLPs to help pass drop tests in high-end mobile devices.

Wafer-level dielectrics were a $94 million market in 2013. These materials and structures are likely to enhance board-level reliability performance.

The die-attach materials market has over a dozen suppliers, with Hitachi Chemical and Henkel accounting for a major share of the total. New players are continuing to emerge in China and Korea. Stacked-die CSP package applications have been increasing, and industry acceptance of film-over-wire (FOW) and dicing die-attach film (DDF) technologies is also growing.

 

Semicon West 2014: SEMI World Fab forecast report



Christian Gregor Dieseldorff, senior analyst, Industry Research & Statistics  Group at SEMI, presented the SEMI World Fab Forecast at the recently held Semicon West 2014, as part of the SEMI/Gartner Market Symposium on July 7.

Scenarios for fab equipment spending growth are 20-25 percent in 2014 and 10-15 percent in 2015. At this time, worldwide fab equipment spending is about the same in 1H14 and 2H14. As for fab construction projects, 2013 was a record year with over $9 billion.

New fabs: construction spending (front-end cleanrooms only)
* 2013: record year with over $9 billion.
* 2014: -22 percent to -27 percent (~$6.6 billion).
* 2015: -22 percent to -30 percent (~$5 billion +/-).

Fab equipment spending, front end (new and used)
* 2014: 20 percent to 25 percent (~$35 billion to $36 billion) – if $35 billion, then third largest on record.
* 2015: 10 percent to 15 percent (~$40 billion) – if $40 billion, then largest on record.

Installed capacity for front-end fabs (without discretes)
* 2014: 2 to 3 percent
* 2015: 3 to 4 percent
* Future outlook beyond 2015: less than 4 percent.

The SEMI World Fab Forecast report tracks 1,148 active and future front-end facilities (R&D to HVM). Also:
* There are 507 companies (R&D to HVM).
* This includes 249 active and future LED and opto facilities.
* There are 60 future facilities starting HVM in 2014 or later.
* Major investments (construction projects and/or equipping): 202 facilities in 2014 and 189 facilities in 2015.

A slowdown of fab closures is expected from 2015 to 2018 for 200mm and 150mm fabs.

How does Intel compete in today's fabless ecosystem?


The SEMI/Gartner Market Symposium was held at Semicon West 2014 in San Francisco on July 7. I am grateful to Ms. Becky Tonnesen, Gartner, and Ms. Agnes Cobar, SEMI, for providing me the presentations. Thanks are also due to Ms. Deborah Geiger, SEMI.

Dean Freeman, research VP, Gartner, outlined the speakers:

• Sunit Rikhi, VP, Technology and Manufacturing Group, and GM, Intel Custom Foundry, Intel, presented on Competing in Today's Fabless Ecosystem.

• Bob Johnson, VP Research, Gartner, presented the Semiconductor Capital Spending Outlook.

• Christian Gregor Dieseldorff, director Market Research, SEMI, presented the SEMI World Fab Forecast: Analysis and Forecast for Fab Spending, Capacity and Technology.

• Sam Wang, VP Research Analyst, Gartner, presented on How Foundries will Compete in a 3D World.

• Jim Walker, VP Research, Gartner, presented on Foundry versus SATS: The Battle for 3D and Wafer Level Supremacy.

• Dr. Dan Tracy, senior director, Industry Research & Statistics, SEMI, presented on Semiconductor Materials Market Outlook.

Let’s start with Sunit Rikhi at Intel.

As a new player in the fabless ecosystem, Intel focuses on:
* The value it brings to the table.
* How it delivers on platforms of capability and services.
* How it leverages the advantages of being inside the world's leading Integrated Device Manufacturer (IDM).
* How it faces the challenges of being inside the world's leading IDM.

Intel has leadership in silicon technologies. Transistor performance per watt is the critical enabler for all. Density improvements offset wafer cost trends. Intel currently has a ~3.5-year lead in introducing revolutionary transistor technologies.

In foundry capabilities and services platforms, Intel brings differentiated value on industry standard platforms. 22nm was started in 2011, while 14nm was started in 2013. 10nm will be starting in 2015. To date, 125 prototype designs have been processed.

Intel offers broad capability and services on industry-standard platforms. It also has a fuller array of co-optimized, end-to-end services. As for packaging technology, Intel has been building better products through multi-component integration. Intel also starts high on the yield learning curve.

Regarding IDM challenges, such as a high-mix, low-volume configuration, Intel has been optimizing its configuration in tooling and set-up. It has also been separating priority and planning processes for customers. Intel has been providing an effective response to every challenge.

Some of Intel Custom Foundry announced customers include Achronix, Altera, Microsemi, Netronome, Panasonic and Tabula.

What’s the future of MEMS?



What does the future hold for MEMS? How can the MEMS industry stay profitable and innovative over the next five years? The MEMS market is still in dynamic growth, with an estimated 12.3 percent CAGR in US$ value over 2013-2019, growing from $11.7 billion in 2013 to $24 billion in 2019.

This growth, principally driven by a huge expansion of consumer products, is mitigated by two main factors. First, due to fierce price-based competition, ASPs are continuously decreasing.

Second, innovation is slow and incremental, as no new devices have been successfully introduced to the market since 2003. Fierce price-based competition is now ongoing, putting extreme pressure on device manufacturers.
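As a quick arithmetic check of the headline figure (a sketch only, using the rounded endpoints quoted above; the cited 12.3 percent presumably comes from more granular year-by-year data):

```python
# CAGR implied by the rounded endpoints above: $11.7B (2013) to $24B (2019).
start, end, years = 11.7, 24.0, 2019 - 2013
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR over {years} years: {cagr:.1%}")  # about 12.7%
```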

Several trends continue to shape the MEMS business:

* Prices in consumer electronics keep decreasing, for example the ASP of MEMS microphones.
* Component size is still decreasing.

However, the successful companies are still the large leaders in distinct MEMS categories, such as STMicroelectronics and Knowles. Maintaining growth in consumer electronics applications remains a challenge.

The market for motion sensors in cell phones and tablets is large and continuously expanding. Discrete sensors continue to decline, but will still be used in some platforms (e.g., gyros for the OIS function). Next, 6- and 9-axis combos should grow rapidly. Because of strong price pressure and an already high adoption rate, the total market will stabilize from 2015.

STMicroelectronics, InvenSense and Bosch are still the leaders in 3-axis gyros and 6-axis IMUs, and it seems difficult for new players to compete and be profitable in this market. The automotive, industrial and medical applications of MEMS are driving growth of the MEMS business. MEMS for automotive will grow from $2.6 billion in 2012 to $3.6 billion in 2018, a 5 percent CAGR.

The MEMS industry is big and growing. Strong market pull is observed for sensors and actuators in cell phones, automotive, medical and industrial applications.

• It is not limited to a few devices. A new wave of MEMS is coming!
• Component and die sizes are still being optimized while combo approaches become mainstream, and several disruptive technology approaches are now in development to keep driving size and price reductions.
• But the MEMS industry has not solved a critical issue: how to increase the chances of new devices entering the market?

– RF switches, autofocus, energy harvesting devices, fuel cells… are examples of devices still under development after over 10 years of effort.
– How can companies be helped to bring new devices to market faster and more safely?

Plunify’s InTime helps FPGA design engineers meet timing and area goals!



Engineers designing FPGA applications face many challenges. Using Plunify’s automation and analysis platform, engineers can run 100 times more builds, analyze a larger set of builds and quickly zoom in on better quality results. Using data analytics and the cloud, Plunify created new capabilities for FPGA design, with InTime being an example.

Kirvy Teo said: What happens when you need to close timing in FPGA design and still can't get it to work? Here is a new way to solve that problem – machine learning and analytics. InTime is expert software that helps FPGA design engineers meet timing and area goals by recommending “strategies”. Strategies are combinations of settings found in the existing FPGA software. With more than 70 settings available in the FPGA software, no sane FPGA design engineer has the time or capacity to understand how they all affect the design outcomes.

One of the common methods now is random brute force using seeds. This is a one-way street: if you get to your desired result, great! If not, you will have wasted a bunch of time running builds, with you none the wiser. Another aspect of running seeds is that the variance of the results is usually not very big, meaning you can't rely on seeds for a design with bad timing scores.

However, using InTime, all builds become part of the data that we use to recommend strategies that can give you better results, using machine learning and predictive analytics. This means you will definitely get a better answer at the end of the day, and we have seen 40 percent performance improvements on designs!
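Plunify has not disclosed how InTime's recommendation engine works internally, but the approach Teo describes (learn from completed builds how tool settings correlate with timing results, then suggest the most promising untried combinations) can be sketched roughly as below. Everything in this Python sketch is hypothetical: the three toy settings, the random "past build" data and the random-forest model are placeholders for illustration, not the actual InTime algorithm or real Xilinx/Altera option names.

```python
# Hypothetical sketch of "recommend strategies from past builds":
# train a regressor on (settings -> worst negative slack) for completed builds,
# then rank untried setting combinations by predicted slack.
import random
from sklearn.ensemble import RandomForestRegressor

SETTINGS = ["opt_mode", "place_effort", "retiming"]  # toy subset of tool knobs
CHOICES = {"opt_mode": [0, 1, 2], "place_effort": [0, 1, 2], "retiming": [0, 1]}

def encode(strategy):
    # Turn a strategy dict into a fixed-order feature vector.
    return [strategy[s] for s in SETTINGS]

# Past builds: each is (strategy dict, measured worst negative slack in ns).
past_builds = [({s: random.choice(CHOICES[s]) for s in SETTINGS},
                random.uniform(-0.8, 0.2)) for _ in range(50)]

X = [encode(s) for s, _ in past_builds]
y = [slack for _, slack in past_builds]
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Candidate strategies not yet run; rank by predicted slack (higher is better).
candidates = [{"opt_mode": o, "place_effort": p, "retiming": r}
              for o in CHOICES["opt_mode"]
              for p in CHOICES["place_effort"]
              for r in CHOICES["retiming"]]
ranked = sorted(candidates, key=lambda s: model.predict([encode(s)])[0],
                reverse=True)
print("Next strategies to try:", ranked[:3])
```

The point of the sketch is the feedback loop: every build, good or bad, becomes training data for the next recommendation, which is what distinguishes this from fire-and-forget seed sweeps.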

How has Plunify been doing this year so far? According to Teo, Plunify did a controlled release to selected customers in the first quarter of 2014, mainly based in China. It is easy to guess who, as we nicknamed them “BCC” – Big Chinese Corporations.

Unsurprisingly, they have different methodologies for solving timing problems, and design guidelines, many of which exist to pre-empt timing problems at later stages of the design. InTime was a great way to help them achieve their performance targets without disrupting their tool flows.

Plunify is announcing the launch of InTime during DAC and will be looking to partner with sales organizations in the US.

What's the future path likely to be? Teo added: “Machine learning and predictive analytics are among the hottest topics, yet we have not seen them used much in chip design. We see a lot of potential in this sector. Beyond what InTime is doing now, there are still many chip design problems that can be solved with similar techniques.

“First, there is a need to determine the types of problems that can be solved with these techniques. Second, we are re-looking at existing design problems and wondering: if I can throw 100 or 1,000 machines at this problem, can I get a better result? Third, how do we get that better result without even running it!

“As you know, we do offer an FPGA cloud platform on Amazon. One of the most surprising observations is that people do not know how to use all that cheap power in the cloud! FPGA design is still confined to a single machine for daily work, like email. Even if I give you 100 machines, you don't know how to check your emails faster! We see the same thing: the only method they know is to run seeds. InTime is what they need to make use of all these resources intelligently.”

Why would FPGA providers take up the solution?

The InTime software works as desktop software that can be installed in internal data centers or on desktops; it is no longer just a cloud play. It works with the in-house FPGA software that the customer already owns. We are helping FPGA providers like Xilinx and Altera by helping their customers with their designs. The appeal to those customers is simple: get better results without touching your RTL code!

Optic2Connect develops software for photonics!



Optic2Connect will be present at this year’s DAC. I caught up with Sean Seah, project manager, to find out more.

First, what’s the company’s X factor and why? (What is it that makes your offering special and noteworthy – how are you different from competitors)?

Optic2Connect develops software solutions for the photonics industry. The demand to manage high volumes of data in networks, especially with the current smartphone and cloud computing trends, has increased tremendously. As designs get more complex, simulation tools need to scale in both fidelity and accuracy.

Currently, photonic designers, scientists, and fabrication engineers adopt an approximate approach, building an equivalent optical model from electrical data and hence losing device-physics details. At the same time, the process is long, as the model needs to be described block by block, with denser blocks representing a more detailed model. Our competitors are well established in their respective domains, electrical or optical, but they are strong only in their own fields. However, intimate knowledge of both is essential to fully understand this newer generation of photonic devices, and failure to understand them fully leads to incorrect results in manufacturing.

With patented know-how, Optic2Connect provides software solutions that SOLVE this pertinent challenge by accurately mapping simulations from one domain to another, e.g., electrical to optical. The technology was developed by a team of researchers at A*STAR, a Singapore public research institute, and overcomes error-prone, detail-oriented simulation setups. We demonstrated the ability to map without losing any fidelity in the simulation files.

Optic2Connect's IP differs from its competitors' because it simulates directly from initial device processing, through electrical device performance, to the final high-speed optical eye diagram. This is in stark contrast to the usual method of representing device operation using simplified transfer functions.

Furthermore, the Optic2Connect design flow uses the same reliable tools and processes from the semiconductor industry that are fully compatible with the Complementary Metal Oxide Semiconductor (CMOS) fabrication process of silicon microelectronics. This design flow uses standard tool libraries, device models (especially for active components such as modulators), and simulation of these components incorporating those models.

How have you been doing this year so far? Seah said: “It has been excellent! We are racing to complete our product prototypes and we secured a contract from an MNC and another from universities.”

What's the future path likely to be? Seah added: “We intend to further validate our prototype with our partners from industry and academia, and to integrate advanced modulation formats into our solutions. We want to offer a fully integrated solution for photonic devices to our customers. Our goal is to offer a one-stop solution for leading integrated-circuit (IC) manufacturers!”

Why this name? You sound like a telecom company!

Seah said: “We strongly believe the future of communications is via optics, which has the ability to circumvent data bottleneck issues. Optic2Connect is meant to offer connectivity using optical communications. Our goal is a one-stop solution for optical connections.”

How will the solution significantly shorten product time-to-market and reduce development costs of photonics devices?

For complex photonics devices, minute changes to design parameters are significant and could affect loss performance and operating voltage requirements. One common approach in the industry today is to physically build the variations into multiple devices/runs and test them out. Each run costs in the range of hundreds of thousands of dollars and consumes precious time, especially if the first batch of devices does not meet the required parameters and additional batches are needed. This costs both money and time, which in turn is more money.

Hence, Optic2Connect provides an elegant alternative: our accurate modelling and simulation solutions accelerate prototyping at much lower production cost. Our software provides a 10x improvement in turnaround time and time to market. Further, our cloud solution overcomes the traditional problem of insufficient servers and licenses, especially during periods of peak demand.

Metro450 Conference 2014 discusses all things 450mm wafers!


Thanks to the Enable450 newsletter, sent out by Malcolm Penn, CEO, Future Horizons, here is a piece on the Metro450 Conference 2014, held earlier this year in Israel.

Metro450 is an Israel-based consortium with the goal of helping metrology companies advance in their fields. The consortium's members include metrology and related companies, as well as academics who support these companies by performing basic research.

The conference was sponsored by the Israeli Chief Scientist Office, by Applied Materials Israel and by Intel. There were several goals for the conference: to provide an opportunity for industry leaders as well as academicians to meet and discuss the latest developments in the world of metrology, to present these advances to audiences which would normally not be privy to such information, and to learn more about the international effort in 450mm wafer technology.

Over 200 people attended this conference from Israeli companies and academia, as well as from Europe and the United States. Israeli companies included Applied Materials, Jordan Valley, Nova, KLA, Zeiss Israel, and others. Academic members included researchers from the leading Israeli universities, including the Technion, Tel-Aviv University and Haifa University. Europe was represented by ENIAC, by large corporations such as ASML, and by SME-based companies. The G450C consortium, based in Albany, N.Y., was also well represented at this conference.

Some of the highlights of the conference included scientific discussions of different metrology methods, and their adjunct requirements, such as improved rapid wafer movement, improved sampling methods and fast computing. Presentations also included an overview of the advances necessary to move the industry forward, optical CD metrology, x-ray metrology, and novel piezo-based wafer movement.

A panel discussed various broad industry trends, including the timeline of 450mm wafers, European programs and the Israeli programs. International speakers discussed the European technology model, risk mitigation of 450 through collaborations, 450 collaborative projects under ENIAC, 450mm wafer movement challenges and metrology challenges beyond 14nm.

This second annual Metro450 conference took place this January at the Technion, Israel.

Five recommendations for verification: Dr. Wally Rhines


It seems to be the season of verification. The Universal Verification Methodology (UVM 1.2) is being discussed across conferences. Dennis Brophy, director of Strategic Business Development, Mentor Graphics, says that the UVM 1.2 release is imminent, and UVM remains a topic of great interest.

Biggest verification mistakes
Before I add Dennis Brophy's take on UVM 1.2, I discussed the intricacies of verification with Dr. Wally Rhines, chairman and CEO, Mentor Graphics Corp. First, I asked him about the biggest verification mistakes today.

Dr. Rhines said: “The biggest verification mistake made today is poor or incomplete verification planning. This generally results in underestimating the scope of the required verification effort. Furthermore, without proper verification planning, some teams fail to identify which verification technologies and tools are appropriate for their specific design problem.”

Would you agree that many companies STILL do not know how to verify a chip?

Dr. Rhines added: “I would agree that many companies could improve their verification process. But let’s first look at the data. Today, we are seeing that about 1/3 of the industry is able to achieve first silicon success. But what is interesting is that silicon success within our industry has remained constant over the past ten years (that is, the percentage hasn’t become any worse).

“It appears that, while design complexity has increased substantially during this period, the industry is at least keeping up with this added complexity through the adoption of advanced functional verification techniques.

“Many excellent companies view verification strategically (and as an advantage over their competition). These companies have invested in maturing both their verification processes and teams and are quite productive and effective. On the other hand, some companies are struggling to figure out the entire SoC space and its growing complexity and verification challenges.”

How are companies trying to address those?

According to him, the recent Wilson Research Group Functional Verification Study revealed that the industry is maturing its verification processes through the adoption of various advanced functional verification techniques (such as assertion-based verification, constrained-random simulation, coverage-driven techniques, and formal verification).  Complexity is generally forcing these companies to take a hard look at their existing processes and improve them.

Getting business advantage
Are companies realizing this and building an infrastructure that gets you business advantage?

He added that in general, there are many excellent companies out there that view verification strategically and as an advantage over their competition, and they have invested in maturing both their verification processes and teams. On the other hand, some other companies are struggling to figure out the entire SoC space and its growing complexity and verification challenges.

When should good verification start?
When should good verification start — after design; as you are designing and architecting your design environment?

Dr. Rhines noted: “Just like the design team is often involved in discussion during the architecture and micro-architecture planning phase, the verification team should be an integral part of this process. The verification team can help identify architectural aspects of the design that are going to be difficult to verify, which ultimately can impact architectural decisions.”

Are folks mistaken by looking at tools and not at the verification process itself? What can be done to reverse this?

He said: “Tools are important! However, it is just as important to get the most out of the tools and ensure that the verification solution is an efficient and repeatable process. At Mentor Graphics, we recognize the importance of both. That is why we created the Verification Academy, which focuses on developing skills and maturing an organization's functional verification processes.”

What needs to go into verification planning, given that the 'right' verification path is fraught with complexities?

Dr. Rhines said: “During verification planning, too many organizations focus first on the “how” aspect of verification versus the “what.” How a team plans to verify its designs is certainly important, but first you must identify exactly what needs to be verified. Otherwise, something is likely to slip through.

“In addition, once you have clearly identified what needs to be verified, it’s an easy task to map the functional verification solutions that will be required to productively accomplish your verification goals. This also identifies what skill sets will need to be developed or acquired to effectively take advantage of the verification solutions that you have identified as necessary for your specific problem.”
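One lightweight way to make the "what before how" discipline concrete is to capture the plan as structured data before any testbench work starts, so that the technique mapping and the implied skill set fall out of it. The sketch below is a generic Python illustration of that idea, not a Mentor flow; the feature names, techniques and closure metrics are invented examples.

```python
# A minimal, hypothetical verification-plan structure: list WHAT must be
# verified first, then map each item to HOW it will be verified and to the
# skills/resources that mapping implies.
verification_plan = [
    {"feature": "AXI write-address channel handshake",
     "what": "no request is dropped or reordered under back-pressure",
     "how": ["constrained-random simulation", "assertion-based verification"],
     "closure_metric": "functional and assertion coverage of all plan items"},
    {"feature": "low-power entry/exit sequencing",
     "what": "retention registers hold state across power-down",
     "how": ["formal verification"],
     "closure_metric": "all power-sequencing properties proven"},
    {"feature": "firmware boot at SoC top level",
     "what": "hardware/software interactions during boot",
     "how": ["emulation with production firmware"],
     "closure_metric": "boot scenarios pass on the emulator"},
]

# The 'how' column then tells you which skills the team must have or acquire.
required_skills = sorted({t for item in verification_plan for t in item["how"]})
print("Skills to develop or acquire:", required_skills)
```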

How is Mentor addressing this situation?

Mentor Graphics’ Verification Academy was created to help organizations mature their functional verification processes—and verification planning is one of the many excellent courses we offer.

In addition, Mentor Graphics’ Consulting provides customized solutions to technical challenges on real projects with real schedules. By helping customers successfully integrate advanced functional verification technologies and methodologies into their work flows, we help ensure they meet their design and business objectives.

Five recommendations for verification
Finally, I asked him, what would be your top five recommendations for verification?

Here are the five recommendations for verification from Dr. Rhines:

* Ensure your organization has implemented an effective verification planning process.

* Understand which verification solutions and technologies are appropriate (and not appropriate) for various classes of designs.

* Develop or acquire the appropriate skills within your organization to take advantage of the verification solutions that are required for your class of design.

* For the SoC class of designs, don’t underestimate the effort required to verify the hardware/software interactions, and ensure you have the appropriate resources to do so.

* For any verification processes you have adopted, make sure you have appropriate metrics in place to help you identify the effectiveness of your process—and identify opportunities for process improvements in terms of efficiency and productivity.

Semicon industry needs to keep delivering value: Anil Gupta



In 2013, the global semiconductor industry touched about $306 billion. Sales had doubled from $100 billion to $200 billion in the six years from 1994 to 2000, driven largely by enterprise sales. It has taken 14 years to move past $300 billion, said Anil Gupta, managing director, Applied Micro Circuits India Pvt Ltd, at the UVM 1.2 day.

This time, consumption of semiconductors is driven not only by the enterprise, but by social networks as well. Out of the $306 billion, logic was approximately $86 billion, memory $67 billion, and micro $58 billion. We, as consumers, are starting to play a huge role.

However, the number of large players seems to be shrinking, and mid-size firms, like Applied Micro, are said to be struggling. Technology is playing an interesting role: there is very significant investment in FinFETs, and it may only get more difficult for all of us. Regardless, all of this is a huge barrier to mid-sized and small companies. Acquisitions are probably the only route, unless you are in software.

In India, we have been worried for a while about whether the situation is a passing phase. We definitely have a role to play. From an expertise perspective, thanks to our background, we have been a poor nation, and for us the job is the primary goal. We need to think: how do we deliver value? We have to try and keep creating value for as long as possible.

As more and more devices appear, many other things are also happening; power is one example. We still have a fair number of years ahead in which there will be opportunities to deliver value.

What's happening between hardware and software? The latter is in demand. Clearly, there is a trend to make hardware a commodity. However, hardware is not going away! Therefore, the opportunity for us to deliver value is huge.

Using the tools to make something is critical, and UVM tools are critical. But somewhere along the way, we seem to stop at that. We definitely need to add value. UVM's aim is to make things re-usable.

Don't lose your focus while doing verification. Think about the block, the subsystem and the top. You need to, and will, discover how valuable it is to find a bug before the tapeout of the chip.
