Following Mentor Graphics, Cadence Design Systems Inc. has entered the verification debate. ;) I met Apurva Kalia, VP R&D – System & Verification Group, Cadence Design Systems. In a nutshell, he advised that proper verification planning is needed to avoid mistakes. First, let’s try to find out the biggest verification mistakes.
Top verification mistakes
Kalia said that the biggest verification mistakes made today are:
* Verification engineers do not define a structured notion of verification completeness.
* Verification planning is not done up front; instead, it is improvised as verification goes along.
* A well-defined reusable verification methodology is not applied.
* Legacy tools continue to be used for verification; new tools and technologies are not adopted.
In that case, why do some companies STILL not know how to verify a chip?
He added: “I would not describe the situation as companies not knowing how to verify a chip. Instead, I think a more accurate description of the problem is that the verification complexity has increased so much that companies do not know how to meet their verification goals.
“For example, the number of cycles needed to verify a current-generation processor – as calculated by traditional verification methods – is prohibitive in any reasonable timeframe using legacy verification methodologies. Hence, new methodologies and tools are needed. Designs today need to be verified together with software, which also requires new tools and methodologies. Companies are not moving fast enough to define, adopt and use these new tools and methodologies, thereby leading to challenges in verifying a chip.”
How are companies trying to address the challenges?
Companies are trying to address the challenges in various ways:
* Companies at the cutting edge of designs and verification are indeed trying to adopt structured verification methodologies to address these challenges.
* Smaller companies are trying to address these challenges by outsourcing their verification to experts and by hiring more verification experts.
* Verification acceleration and prototyping solutions are being adopted for faster verification, allowing companies to do more verification in the same amount of time.
* Verification environment re-use helps to cut down the time required to develop verification environments.
* Key requirements of SoC integration and verification—spanning functionality, compliance, power, and performance—include hardware/software debug efficiency, multi-language verification, low power, mixed signal, fast time to debug, and execution speed.
Cadence has the widest portfolio of tools to help companies meet verification challenges, including:
* Incisive Enterprise Manager, which provides hierarchical verification technology for multiple IPs, interconnects, hardware/software, and plans to improve management productivity and visibility;
* The recently launched vManager solution, a verification planning and management solution enabled by client/server technology to address the growing verification closure challenge driven by increasing design size and complexity;
* Incisive Enterprise Verifier, which delivers dual power from tightly integrated formal analysis and simulation engines; and
* Incisive Enterprise Simulator, which provides the most comprehensive IEEE language support, with unique capabilities supporting the intent, abstraction, and convergence needed to speed silicon realization.
Are companies building an infrastructure that delivers business advantage? Yes, companies are realizing the problems, and it is these companies that are the winners in managing today’s design and verification challenges, he said.
When should good verification start?
Kalia noted: “Good verification should start right at the time of the high level architecture of the design. A verification strategy should be defined at that time, and an overall verification plan should be written at that time. This is where a comprehensive solution like Incisive vManager can help companies manage their verification challenges by ensuring that SoC developers have a consistent methodology for design quality enhancements.”
Are folks making the mistake of looking at tools and not at the verification process itself?
He added that the right tools and methodology are needed to resolve today’s verification challenges. Users need to work on defining verification methodologies and, at the same time, look at the tools that are needed to achieve their verification goals.
Finally, there’s verification planning! What should be the ‘right’ verification path?
Verification planning needs to include:
* A formal definition of verification goals;
* A formal definition of coverage goals at all levels – starting with code coverage all the way to functional coverage;
* Required resources – human and compute;
* Verification timelines;
* All the verification tools to be used for verification; and
* Minimum and maximum signoff criteria.
It seems to be the season of verification. The Universal Verification Methodology (UVM 1.2) is being discussed across conferences. Dennis Brophy, director of Strategic Business Development, Mentor Graphics, says that the UVM 1.2 release is imminent and that UVM remains a topic of great interest.
Biggest verification mistakes
Before I add Dennis Brophy’s take on UVM 1.2, I discussed the intricacies of verification with Dr. Wally Rhines, chairman and CEO, Mentor Graphics Corp. First, I asked him about the biggest verification mistakes today.
Dr. Rhines said: “The biggest verification mistake made today is poor or incomplete verification planning. This generally results in underestimating the scope of the required verification effort. Furthermore, without proper verification planning, some teams fail to identify which verification technologies and tools are appropriate for their specific design problem.”
Would you agree that many companies STILL do not know how to verify a chip?
Dr. Rhines added: “I would agree that many companies could improve their verification process. But let’s first look at the data. Today, we are seeing that about 1/3 of the industry is able to achieve first silicon success. But what is interesting is that silicon success within our industry has remained constant over the past ten years (that is, the percentage hasn’t become any worse).
“It appears that, while design complexity has increased substantially during this period, the industry is at least keeping up with this added complexity through the adoption of advanced functional verification techniques.
“Many excellent companies view verification strategically (and as an advantage over their competition). These companies have invested in maturing both their verification processes and teams and are quite productive and effective. On the other hand, some companies are struggling to figure out the entire SoC space and its growing complexity and verification challenges.”
How are companies trying to address those?
According to him, the recent Wilson Research Group Functional Verification Study revealed that the industry is maturing its verification processes through the adoption of various advanced functional verification techniques (such as assertion-based verification, constrained-random simulation, coverage-driven techniques, and formal verification). Complexity is generally forcing these companies to take a hard look at their existing processes and improve them.
Getting business advantage
Are companies realizing this and building an infrastructure that gets you business advantage?
He added that in general, there are many excellent companies out there that view verification strategically and as an advantage over their competition, and they have invested in maturing both their verification processes and teams. On the other hand, some other companies are struggling to figure out the entire SoC space and its growing complexity and verification challenges.
When should good verification start?
When should good verification start — after design, or as you are designing and architecting your design environment?
Dr. Rhines noted: “Just like the design team is often involved in discussion during the architecture and micro-architecture planning phase, the verification team should be an integral part of this process. The verification team can help identify architectural aspects of the design that are going to be difficult to verify, which ultimately can impact architectural decisions.”
Are folks mistaken by looking at tools and not at the verification process itself? What can be done to reverse this?
He said: “Tools are important! However, getting the most out of the tools, and ensuring that the verification solution is an efficient and repeatable process, is equally important. At Mentor Graphics, we recognize the importance of both. That is why we created the Verification Academy, which focuses on developing skills and maturing an organization’s functional verification processes.”
What needs to go into verification planning, given that the ‘right’ verification path is fraught with complexities?
Dr. Rhines said: “During verification planning, too many organizations focus first on the “how” aspect of verification versus the “what.” How a team plans to verify its designs is certainly important, but first you must identify exactly what needs to be verified. Otherwise, something is likely to slip through.
“In addition, once you have clearly identified what needs to be verified, it’s an easy task to map the functional verification solutions that will be required to productively accomplish your verification goals. This also identifies what skill sets will need to be developed or acquired to effectively take advantage of the verification solutions that you have identified as necessary for your specific problem.”
How is Mentor addressing this situation?
Mentor Graphics’ Verification Academy was created to help organizations mature their functional verification processes—and verification planning is one of the many excellent courses on offer.
In addition, Mentor Graphics’ Consulting provides customized solutions to technical challenges on real projects with real schedules. By helping customers successfully integrate advanced functional verification technologies and methodologies into their workflows, the company helps ensure they meet their design and business objectives.
Five recommendations for verification
Finally, I asked him, what would be your top five recommendations for verification?
Here are the five recommendations for verification from Dr. Rhines:
* Ensure your organization has implemented an effective verification planning process.
* Understand which verification solutions and technologies are appropriate (and not appropriate) for various classes of designs.
* Develop or acquire the appropriate skills within your organization to take advantage of the verification solutions that are required for your class of design.
* For the SoC class of designs, don’t underestimate the effort required to verify the hardware/software interactions, and ensure you have the appropriate resources to do so.
* For any verification processes you have adopted, make sure you have appropriate metrics in place to help you identify the effectiveness of your process—and identify opportunities for process improvements in terms of efficiency and productivity.
In 2013, the global semiconductor industry touched $306 billion or so. Sales had doubled from $100 billion to $200 billion in six years — from 1994 to 2000 — and it was enterprise sales that drove this. It has taken 14 years to move past $300 billion, said Anil Gupta, managing director, Applied Micro Circuits India Pvt Ltd, at the UVM 1.2 day.
This time, consumption of semiconductors is being driven not only by the enterprise, but by social networks as well. Out of the $306 billion, logic was approximately $86 billion, memory was $67 billion, and micro was $58 billion. We, as consumers, are starting to play a huge role.
However, the number of large players seems to be shrinking. Mid-size firms, like Applied Micro, are said to be struggling. Technology is playing an interesting role. There is a very significant investment in FinFETs. It may only get more difficult for all of us. Regardless, all of this is a huge barrier for mid-sized and small companies. Acquisitions are probably the only route, unless you are in software.
In India, we have been worried for a while about whether the situation is just a passing phase. We definitely will have a role to play. From an expertise perspective, thanks to our background, we have been a poor nation. For us, the job is the primary goal. We need to think: how do we deliver value? We have to try and keep creating value for as long as possible.
As more and more devices appear, many other things are happening as well; power is one example. We still have a fair number of years ahead where there will be opportunities to deliver value.
What’s happening between hardware and software? The latter is in demand. Clearly, there is a trend to make hardware a commodity. However, hardware is not going away! Therefore, the opportunity for us to deliver value is huge.
Taking the tools and making something with them is critical. UVM tools are critical. But, somewhere along the way, we seem to stop at that. We definitely need to add value. UVM’s aim is to make things re-usable.
Don’t lose your focus while doing verification. Think about the block, the subsystem, and the top. You will discover how valuable it is to find a bug before the tape-out of the chip.
Are we at an inflection point in verification today? Delivering the guest keynote at the UVM 1.2 day, Vikas Gautam, senior director, Verification Group, Synopsys, said that today, mobile and the Internet of Things are driving growth. Naturally, SoCs are becoming even more complex. This is also opening up new verification challenges, such as power efficiency, more software, and reducing time-to-market. There is a need to shift left to be able to meet time-to-market goals.
The goal is to complete your verification as early as possible. There have been breakthrough verification innovations. SystemVerilog brought in a single language. Every 10-15 years, there has been a need to upgrade verification.
Today, many verification technologies are needed. There is a growing demand for smarter verification. There is a need for more upfront verification planning, automated setup, and re-use with VIP, as well as a need to deploy new technologies and different debug environments. The current flows are limiting smart verification: there are disjointed environments with many tools and vendors.
Synopsys has introduced the Verification Compiler. You get access to each required technology, as well as next-gen technology. These technologies are natively integrated. All of this enables 3X verification productivity.
Regarding next-gen static and formal platforms, there will be capacity and performance for SoCs, compatibility with implementation products and flows, and a comprehensive set of applications. NLP+X-Prop can help find tough wake-up bugs at RTL. Simulation is tuned for the VIP, with a ~50 percent runtime improvement.
SystemVerilog has brought in many new changes. Now, we have the Verification Compiler. Verdi is an open platform; it offers VIA, a platform for customizing Verdi, which improves debug efficiency.
Here is the concluding part of my discussion with Sam Fuller, CTO, Analog Devices. We discussed the technology aspects of Moore’s Law and ‘More than Moore’, among other things.
Are we at the end of Moore’s Law?
First, I asked Fuller that as Gordon Moore suggested – are we about to reach the end of Moore’s Law? What will it mean for personal computing?
Fuller replied: “There is definitely still life left in Moore’s law, but we’re leaving the golden age after the wonderful ride that we have had for the last 40 years. We will continue to make chips denser, but it is becoming difficult to continue to improve the performance as well as lower the power and cost.
“Therefore, as Moore’s law goes forward, more innovation is required with each new generation. As we move from planar CMOS to FinFET (a new multi-gate transistor architecture), and from silicon to more advanced materials, Moore’s law will still have life for the next decade, but we are definitely moving into its final stages.
“For personal computing, there is still a lot of innovation left before we begin to run out of ideas. There will continue to be great advances in smart phones, mobile computing and tablets because software applications are really just beginning to take advantage of the phenomenal power and capacity of today’s semiconductors. The whole concept of ‘Internet of things’ will also throw up plenty of new opportunities.
“As we put more and more sensors in our personal gadgets, in factories, in industries, in infrastructures, in hospitals, and in homes and in vehicles, it will open up a completely new set of applications. The huge amount of data generated out of these sensors and wirelessly connected to the Internet will feed into the big data and analytics. This would create a plethora of application innovations.”
What’s happening with planar?
The planar scaling opportunity – 90nm – 65nm – 45nm – 22nm – 20nm – 14/18nm – is starting to get difficult and probably won’t work at 12nm, for purely physics reasons. What is Analog Devices’ take on this?
Fuller said: “You are right! We have been going from 45nm down to lower nodes; it’ll probably go down to 10nm, but we are beginning to run into some fundamental physics issues here. After all, it’s a relatively finite number of atoms that make up the channels in these transistors. So, you’re going to have to look at innovations beyond simply going down to finer dimensions.
“There are FinFETS and other ways that can help move you into the third dimension. We’re getting to a point where we can put a lot of complexity and a number of functions on a single die. We have moved beyond purely digital design to having more analog and mixed signal components in the same chip. There are also options such as stacked dies and multiple dies.
“Beyond integration on a single chip, Analog Devices leads in advanced packaging technologies for System in a Package (SiP), where sensors, digital, and analog/mixed-signal components are all in a single package, since the individual components would typically use different technology nodes and it might not be practical to integrate them on a single die.
“So, the challenge often gets described as ‘More than Moore’ – going beyond Moore’s law by bringing in capabilities for analog processing as well as digital, and then integrating sensors for temperature, pressure, motion, and a whole range of other sensing to enable the ‘Internet of Things’.
“At Analog Devices, we have the capability in analog as well as digital, and having worked for over 20 years on MEMS devices, we are particularly well positioned as we get into ‘More than Moore’.”
I recently met Sam Fuller, CTO, Analog Devices, and had an interesting conversation. First, I asked him about the state of the global semicon industry in 2013.
Industry in 2013
He said: “Due to the uncertainties in the global economy over the last couple of years, the global semiconductor industry has seen quite modest growth. Because of the modest growth, there has been a buildup in demand. As the global economies become more robust going forward, we expect to see more growth.”
Industry in 2014?
How does Analog Devices see the industry going forward in 2014? What are the five key trends?
He added: “I would talk about the trends more from an eco-system and applications perspective. Increased capability on a single chip: Given all the advances to Moore’s law, the capability of a chip has increased considerably in all dimensions and not just performance, be it the horsepower we see in today’s smartphones or the miniaturization and power consumption of wearable gadgets that were on show this year at CES.
“In Analog Devices’ case, as we are focused on high-performance signal processing, we can put more of the entire signal chain on a single die. For our customers, the challenge is to provide their customers with a more capable product, which means a more complex product, but with a simpler interface.
“A classic example is our AD9361 chip, which is a single-chip wideband radio transceiver for Software Defined Radio (SDR). It is a very capable ASSP (Application Specific Standard Product) as well as an RF front end with a wide operating frequency of 70 MHz to 6 GHz.
“This chip, coupled with an all-purpose FPGA, can build a very flexible SDR operating across different radio protocols, wide frequency ranges, and bandwidth requirements, all controlled via software configuration. It finds a number of applications in wireless communication infrastructure and small-cell base stations, as well as a whole range of custom radios in the industrial and aerospace businesses.”
Now, let’s see the trends for 2014!
More collaboration with customers: There is a greater emphasis on understanding customers’ end applications to provide a complete signal chain, all in a System on a Chip (SoC) or a System in a Package (SiP). The relationship with our customers is changing as we move more towards ASSPs developed with a few lead customers for target markets and target applications. While this has already been ongoing in the consumer industry with PCs and laptops, customers in other vertical markets like healthcare, automotive, and industrial are collaborating, and will collaborate, more with semiconductor companies like Analog Devices to innovate at a solutions level.
More complete products: We have evolved from delivering just the silicon at a component level to delivering more complete products, with more advanced packaging for various 3D chips or multiple dies within a package. Our solutions now typically have much more software, which makes it easier to configure or program the chips. The result is a solution that combines more advanced silicon, advanced packaging, and more appropriate software.
In providing complete solutions, the products become more application-specific; hence the need for more collaboration with customers. For example, there may be one product focused on Software Defined Radio, one for motor control, and one for the vital-signs monitoring for consumer health that we launched recently.
Each product needs to be generic enough that multiple customers can use it, yet as tailored as possible to customers’ needs in specific market segments. While complete reference designs have long been the norm in the consumer world because of its volume and standardization, other market segments are now demanding more complete products, notwithstanding the huge variation in protocols and applications.
Truly global industry: The semiconductor and electronics industry has become truly global, with multiple design sites around the globe collaborating to create products. For example, one of Analog Devices’ premier design sites is our Bangalore product design center, where we have quite literally developed our most complex and capable chips. At the same time, our customers are also global.
We see large multinational companies like GE, Honeywell, Cisco, Juniper, ABB, and Schneider, and many of our top strategic customers globally, doing substantial system design work in Bangalore, along with a multitude of Indian design houses. Our fastest-growing region is Asia, but we have substantial engagement with customers in North America and Europe. And our competition is also global, which means the industry moves ever faster.
Smarter design tools: The final trend worth talking about is the need for smarter design tools. As our products and our customers’ products become more complex and capable, design tools have to develop rapidly so that we can design them.
This cannot be done by brute force, but by designing smarter and better tools. There is a lot of innovation that goes into developing better tool suites, and ever more capable software that caters to a market moving from hundreds of transistors to literally billions of transistors per application.
NXP Semiconductors N.V. recently released the LPC1500 microcontroller series, optimized for fast, easy, and high-precision motor control.
So, what’s unique about the new LPC family? First, the LPC1500 was designed to simplify motor control for the masses. It has the flexibility to drive various types of motors, such as ACIM, PMSM, and BLDC, and it can also drive multiple motors simultaneously.
That’s not all! The hardware interconnection between the SCTimer/PWM, ADCs, and comparators allows the motor to be driven with little CPU intervention. The family also comes with the free LPCXpresso IDE and free field-oriented control (FOC) firmware for sensored and sensorless motors, which reduces cost and improves time to market.
Looking at the unique features and benefits, the Switch Matrix allows any function to be routed out to any pin, making schematic capture and board layout simpler and faster. The SCTimer/PWM block is unique to NXP.
Among its benefits, it can run independently of the CPU and generate extremely precise PWM waveforms for quiet, smooth, efficient motor drive. The two 2Msps, 12-bit, 12-channel ADCs can sample phase currents simultaneously to determine precise motor position and speed. There are four comparators for fast system shutdown upon fault detection.
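To make this division of labor concrete, here is a minimal C sketch of how such a CPU-light drive loop is typically structured. The HAL names (sct_pwm_*, adc_*, comparator_*) are hypothetical placeholders standing in for the SCTimer/PWM, ADC, and comparator blocks described above, not NXP’s actual LPCXpresso/LPCOpen API.

```c
/*
 * Minimal sketch of a CPU-light motor-drive loop on an LPC1500-class MCU.
 * All HAL names below are hypothetical placeholders, not NXP's actual API;
 * on real silicon they would program the SCTimer/PWM, ADC, and comparator
 * registers described above.
 */
#include <stdbool.h>
#include <stdint.h>

/* --- stub HAL, hypothetical names -------------------------------------- */
static void     sct_pwm_start(void) {}                   /* PWM free-runs without the CPU */
static void     sct_pwm_set_duty(int ph, uint16_t d) { (void)ph; (void)d; }
static void     sct_pwm_emergency_stop(void) {}          /* kill all PWM outputs */
static void     adc_trigger_from_pwm(void) {}            /* chain ADC sampling to PWM events */
static uint16_t adc_read_phase_current(int ph) { (void)ph; return 2048; }
static bool     comparator_fault(void) { return false; } /* overcurrent comparator tripped? */

/* Runs once per PWM period -- the only periodic CPU work. */
static void adc_sequence_done_isr(void)
{
    if (comparator_fault()) {         /* comparators catch faults in hardware */
        sct_pwm_emergency_stop();
        return;
    }
    uint16_t ia = adc_read_phase_current(0);  /* both phases sampled simultaneously */
    uint16_t ib = adc_read_phase_current(1);

    /* Run the FOC current loop here (see the Clarke/Park sketch below),
     * then write back fresh duty cycles; the SCTimer keeps the PWM precise. */
    (void)ia; (void)ib;
    sct_pwm_set_duty(0, 0 /* duty computed by the FOC loop */);
}

int main(void)
{
    adc_trigger_from_pwm();   /* ADCs sample on PWM events, no CPU involved */
    sct_pwm_start();
    adc_sequence_done_isr();  /* on hardware this fires from the vector table */
    return 0;                 /* real firmware would loop forever here */
}
```

The point the sketch illustrates is that PWM generation and current sampling run entirely in hardware; the CPU shows up only once per control period, which is what frees its bandwidth for the rest of the system.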
The LPC1500 is suitable for large appliances, HVAC, building automation, factory automation, industrial pumps and generators, digital power, remote sensing, etc.
How will the LPC1500 aid embedded engineers? According to NXP, it saves time to market through the free FOC firmware and GUI tuning tool. It also saves system cost by needing only one system MCU: an HVAC system, for example, typically has one MCU for fan control and another for the compressor, and the LPC1500 can control both.
The LPC1500 feature set makes it ideal for sensorless motor control, removing the need for sensored motors and allowing customers to switch to cheaper sensorless ones. As the SCTimer/PWM can run independently of the CPU, the freed-up CPU bandwidth can be used to control other parts of the system; for example, the LPC1500 can run both the control board and the motor board in a washing machine.
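For readers unfamiliar with FOC, the core of any such firmware is a pair of textbook coordinate transforms – Clarke and Park – that map the measured phase currents into a rotor-aligned frame where the torque- and flux-producing components can each be regulated by a simple PI controller. The sketch below shows just this standard math (assuming balanced phases, ia + ib + ic = 0); it is generic textbook code, not NXP’s firmware.

```c
/* Textbook amplitude-invariant Clarke and Park transforms: the math at the
 * heart of field-oriented control. Assumes a balanced motor (ia+ib+ic = 0). */
#include <math.h>
#include <stdio.h>

typedef struct { float alpha, beta; } ab_t;  /* stationary stator frame */
typedef struct { float d, q; } dq_t;         /* rotating rotor frame    */

/* Clarke: three phase currents -> two orthogonal stator-frame components. */
static ab_t clarke(float ia, float ib)
{
    ab_t s = { ia, (ia + 2.0f * ib) / sqrtf(3.0f) };
    return s;
}

/* Park: rotate into the rotor frame; theta is the electrical rotor angle. */
static dq_t park(ab_t s, float theta)
{
    float c = cosf(theta), sn = sinf(theta);
    dq_t r = {  s.alpha * c  + s.beta * sn,   /* i_d: flux-producing   */
               -s.alpha * sn + s.beta * c };  /* i_q: torque-producing */
    return r;
}

int main(void)
{
    /* Example: phase currents at one instant, rotor at 30 deg electrical. */
    dq_t i = park(clarke(1.0f, -0.5f), 3.14159265f / 6.0f);
    printf("id=%f iq=%f\n", (double)i.d, (double)i.q);  /* PI loops regulate these */
    return 0;
}
```

In a sensorless design, theta is estimated (typically from the back-EMF) rather than read from a position sensor – exactly the hard part that ready-made FOC firmware and a GUI tuning tool are meant to take off the engineer’s plate.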
NXP is currently working with customers to understand their future requirements and developing the roadmap to match their needs.
STMicroelectronics recently introduced the M24SR dynamic NFC/RFID tag.
Speaking about the USP of the M24SR, Amit Sethi, Product Marketing manager – Memories and RFID, STMicroelectronics India, said: “The unique selling proposition of the M24SR product is its two interfaces, giving users and applications the ability to program or read its memory using either an RF NFC interface or a wired I2C interface, in an affordable and easy-to-use device for a wide range of applications such as consumer/home appliance, OTP card, healthcare/wellness and industrial/smart meter.”
Let us see how the M24SR is beneficial for smartphone or any other audio device.
The M24SR is a dynamic NFC/RFID tag that manages the data exchange between the NFC phone and the microcontroller. The main use cases for data exchange are updating user settings, downloading data logs, and remote programming and servicing. The dynamic tag also enables seamless Bluetooth and Wi-Fi pairing, which is useful in, for example, audio devices.
How is the M24SR different from other products of the same segment?
Sethi said that the key difference is the dual interface: the M24SR memory can be accessed either by a low-power I2C interface or by an ISO14443A RF interface operating at 13.56MHz. It also features RF status (MCU wake-up) and RF disable functions to minimize power consumption. In addition, the devices support the NFC Data Exchange Format (NDEF, from the NFC Forum) and a 128-bit password protection mechanism.
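From the host MCU’s side, that dual-interface arrangement can be pictured with the short C sketch below. The helper names (m24sr_read_ndef, gpo_isr) and the interrupt wiring are hypothetical placeholders, not ST’s actual driver API – on the real part, I2C access goes through ISO 7816-style commands and the 7-bit address should be checked against the datasheet – but the shape of the exchange is as described: the phone writes over RF, the GPO wake-up pin nudges the MCU, and the MCU reads the shared memory over I2C.

```c
/* Sketch of the M24SR dual-interface usage pattern from the host MCU's side.
 * Helper names are hypothetical placeholders, not ST's actual driver API. */
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

#define M24SR_I2C_ADDR 0x56u  /* 7-bit I2C address: an assumption, check the datasheet */

static bool gpo_pending = false;  /* latched by the GPO (RF status / wake-up) interrupt */

static void gpo_isr(void) { gpo_pending = true; }

/* Hypothetical driver call: reads the NDEF file over I2C into buf.
 * The real part expects ISO 7816-style commands documented by ST. */
static size_t m24sr_read_ndef(uint8_t addr, uint8_t *buf, size_t max)
{
    (void)addr; (void)buf; (void)max;
    return 0;  /* stubbed out for this sketch */
}

static void poll_tag(void)
{
    uint8_t ndef[64];

    if (!gpo_pending)
        return;                /* MCU sleeps until an NFC phone taps the tag */
    gpo_pending = false;

    /* RF and I2C arbitrate access to the same EEPROM, so the MCU simply
     * reads whatever the phone just wrote: settings, pairing info, etc. */
    size_t n = m24sr_read_ndef(M24SR_I2C_ADDR, ndef, sizeof ndef);
    printf("read %zu NDEF bytes\n", n);
}

int main(void)
{
    gpo_isr();   /* simulate a phone tap raising the GPO line */
    poll_tag();
    return 0;
}
```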
The M24SR series is available in EEPROM memory densities from 2 Kbit to 64 Kbit and three package types: SO8, TSSOP8, and UFDFPN8.
What are the contributions of M24SR toward the Internet of Things?
According to him, the M24SR dynamic NFC/RFID tag’s interactive and zero-power capability simplifies complex communications setups and enables data exchange across home automation, wearable electronics, home appliances, smart meters, wellness devices, etc.
Especially with the NFC capability, the M24SR is ideal for applications that wait for something, such as a ticket or an ID, to launch an activity.
Relevance for India
Finally, what’s the relevance of the product for the Indian market?
Sethi added: “Mobile and NFC-based applications are gaining popularity in India. The M24SR is an easy-to-use and affordable product for the implementation of NFC-based applications in the transportation, entertainment, and lifestyle areas.”
As for the go-to-market strategy, the M24SR mass-market launch is planned for the end of February 2014. Some M24SR samples were delivered to key customers during Q4 2013, and design/development is ongoing.
Following a host of forecasts for 2014, it is now the turn of Applied Materials with its forecast for the year. First, I asked Om Nalamasu, senior VP and CTO, Applied Materials, about the outlook for the global semicon industry in 2014.
Semicon outlook 2014
He said that Gartner expects the semiconductor industry to grow in mid-single digits to over $330 billion in 2014.
“In our industry – the semiconductor wafer fab equipment sector – we are at the beginning of major technology transitions, driven by FinFET and 3D NAND. Based on a wide range of analyst projections, wafer fab equipment investment is expected to be up 10-20 percent in 2014. We expect to see a year-over-year increase in foundry, NAND, and DRAM investment, with logic and other spending flat to down.”
Five trends for 2014
Next, what are the top five trends likely to rule the industry in 2014?
Nalamasu said that the key trends continuing to drive technology in 2014 and beyond include 3D transistors, 3D NAND, and 3D packaging; 3D remains a central theme. In logic, foundries will ramp to 20nm production and begin the early stages of the transition to 3D FinFET transistors.
With respect to 3D NAND, some products will be commercially available, but most memory manufacturers plan to cross over from planar NAND to vertical NAND starting this year. In wafer-level packaging, critical mechanical and electrical characterization work is bringing the manufacturability of 3D-integrated stacked chips closer to reality.
These device architecture inflections require significant advances in precision materials engineering. This spans such critical steps as precision film deposition, precision materials removal, materials modification and interface engineering. Smaller features and atomic-level thin films also make interface engineering and process integration more critical than ever.
Driving these technology innovations are mobility applications, which need high-performance, low-power semiconductors. Smartphones, smart watches, tablets, and wearable gadgets continue to propel industry growth. Our customers are engaged in a fierce battle for mobility leadership as they race to be first to market with new products that improve the performance, battery life, form factor, and user experience of mobile devices.
How is the global semiconductor industry managing the move to the sub 20nm era?
He said that extensive R&D work is underway to move the industry into the sub-20nm realm. For the 1x nodes, more complex architectures and structures as well as new higher performance materials will be required.
Some specific areas where changes and technology innovations are needed include new hard mask and channel materials, selective material deposition and removal, patterning, inspection, and advanced interface engineering. For the memory space, different memory architectures like MRAM are being explored.
FinFETs in 20nm!
By the way, have FinFETs gone to 20nm? Are those looking for power reduction now benefiting?
FinFET transistors are in production in the most advanced 2x designs by a leading IDM, while the foundries are in limited R&D production. In addition to the disruptive 3D architecture, FinFET transistors incorporate new materials such as high-k metal gate (HKMG) that help to drastically reduce power leakage.
Based on public statements, HKMG FinFET designs are expected to deliver more than a 20 percent improvement in speed and a 30 percent reduction in power consumption compared to 28nm devices. These are significant advantages for mobile applications.
Status of 3D ICs
Finally, what’s the status with 3D ICs? How is Applied helping with true 3D stacking integration?
Nalamasu replied that vertically stacked 3D ICs are expected to enter into production first for niche applications. This is due primarily to the higher cost associated with building 3D wafer-level-packaged (WLP) devices. While such applications are limited today, Applied Materials expects greater utilization and demand to grow in the future.
Applied is an industry leader in WLP, having spearheaded the industry’s development of through-silicon via (TSV) technology. Applied offers a suite of systems that enable customers to implement a variety of packaging techniques, from bumping to redistribution layer (RDL) to TSV. Because of its work in this area, Applied is strongly positioned to support customers as they begin to adopt this technology.
To manufacture a robust integrated 3D stack, several fundamental innovations are needed. These include improving defect density and developing new materials such as low warpage laminates and less hygroscopic dielectrics.
Another essential requirement is supporting finer copper line/spacing. Important considerations here are maintaining good adhesion while watching out for corrosion. Finally, for creating the necessary smaller vias, the industry needs high quality laser etching to replace mechanical drilling techniques.