
Archive for the ‘global semiconductor industry’ Category

Optic2Connect develops software for photonics!


Sean Seah

Optic2Connect will be present at this year’s DAC. I caught up with Sean Seah, project manager, to find out more.

First, what’s the company’s X factor and why? (What is it that makes your offering special and noteworthy – how are you different from competitors)?

Optic2Connect develops software solutions for the photonics industry. The demand to manage high volumes of data in networks, especially with the current smartphone and cloud-computing trends, has increased tremendously. As designs get more complex, simulation tools need to scale in fidelity and accuracy.

Currently, photonic designers, scientists, and fabrication engineers adopt an approximated approach, building an equivalent optical model from electrical data and hence losing device-physics detail. The process is also slow, because the model must be described block by block, with denser blocks representing a more detailed model. Our competitors are well established in their respective domains, electrical or optical, but only in their own fields; intimate knowledge of both is essential to fully understand this newer generation of photonic devices. An incomplete understanding leads to misleading results in manufacturing.

With patented know-how, Optic2Connect provides software solutions that SOLVE this pertinent challenge: they accurately map simulations from one domain to another, e.g. electrical to optical. The technology was developed by a team of researchers at A*STAR, Singapore’s public research agency. It overcomes error-prone, detail-oriented simulation setups, and we have demonstrated the ability to map between domains without losing any fidelity in the simulation files.

Optic2Connect’s IP differs from its competitors’ because it simulates directly from device processing, through electrical device performance, to the final high-speed optical eye diagram. This is in stark contrast to the usual method of representing device operation with simplified transfer functions.
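To make the contrast concrete, here is a minimal sketch, in Python, of the conventional “simplified transfer function” approach described above (and not Optic2Connect’s patented flow): an electrical drive waveform is pushed through an idealized Mach-Zehnder modulator curve to obtain a normalized optical output. The switching voltage, drive pattern and filtering below are illustrative assumptions.

```python
import numpy as np

V_PI = 3.0          # assumed modulator switching voltage (volts)
SAMPLES_PER_BIT = 32

def mzm_transfer(v_drive, v_pi=V_PI):
    """Idealized Mach-Zehnder power transfer: T(V) = cos^2(pi * V / (2 * V_pi))."""
    return np.cos(np.pi * v_drive / (2.0 * v_pi)) ** 2

# Illustrative NRZ electrical drive: bits mapped to 0 V / V_pi levels, then
# crudely low-pass filtered to mimic finite driver bandwidth.
bits = np.random.randint(0, 2, 64)
drive = np.repeat(bits * V_PI, SAMPLES_PER_BIT).astype(float)
drive = np.convolve(drive, np.ones(8) / 8.0, mode="same")

optical_power = mzm_transfer(drive)   # normalized optical output, 0..1
print("extinction ratio estimate:",
      optical_power.max() / max(optical_power.min(), 1e-6))
```

A device-level flow of the kind Optic2Connect describes would instead carry the fabricated device physics through to this eye-level view, rather than starting from an idealized curve.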

Furthermore, the Optic2Connect design flow uses the same reliable tools and processes from the semiconductor industry that are fully compatible with the Complementary Metal Oxide Semiconductor (CMOS) fabrication process of silicon microelectronics. This design flow uses standard tool libraries, device models (especially for active components such as modulators), and simulation of these components incorporating those models.

How have you been doing this year so far? Seah said: “It has been excellent! We are racing to complete our product prototypes and we secured a contract from an MNC and another from universities.”

What’s the future path likely to be? Seah added: “We intend to further validate our prototype with our partners from industry and academia, and to integrate advanced modulation formats into our solutions. We want to offer a fully integrated solution for photonic devices to our customers. Our goal is to offer a one-stop solution for leading integrated-circuit (IC) manufacturers!”

Why this name? You sound like a telecom company!

Seah said: “We strongly believe the future of communications is via optics, which has the ability to circumvent data bottleneck issues. Optic2Connect is meant to offer connectivity using optical communications. Our goal is a one-stop solution for optical connections.”

How will the solution significantly shorten product time-to-market and reduce development costs of photonics devices?

For complex photonics devices, minute changes to design parameters are significant and can affect loss performance and operating-voltage requirements. One common approach in the industry today is to physically build the variations into multiple devices or runs and test them out. Each run costs in the range of hundreds of thousands of dollars and consumes precious time, especially if the first batch of devices does not meet the required parameters and additional batches are needed. This costs both money and time, which in turn is more money.

Hence, Optic2Connect offers an elegant alternative: our accurate modelling and simulation solutions accelerate manufacturing prototypes at much lower production cost. Our software provides roughly a 10x reduction in development time and time to market. Further, our cloud solution overcomes the traditional problem of insufficient servers and licenses, especially during periods of peak demand.
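To make the cost trade-off concrete, here is a back-of-envelope sketch in Python. Every figure in it (per-run cost, cycle times, iteration counts) is an assumption chosen only to echo the “hundreds of thousands per run” point above; these are not Optic2Connect numbers, and the function names are hypothetical.

```python
FAB_RUN_COST_USD = 250_000    # assumed: "hundreds of thousands" per fabrication run
FAB_RUN_WEEKS = 12            # assumed cycle time per fabrication run
SIM_ITERATION_WEEKS = 1.2     # assumed time to sweep one design variant in software

def silicon_first(iterations: int):
    """Cost (USD) and time (weeks) if every design variant goes to the fab."""
    return iterations * FAB_RUN_COST_USD, iterations * FAB_RUN_WEEKS

def simulate_first(iterations: int):
    """Cost and time if variants are screened in simulation and only the final
    candidate is fabricated once."""
    return FAB_RUN_COST_USD, iterations * SIM_ITERATION_WEEKS + FAB_RUN_WEEKS

for n in (2, 3, 5):
    c1, t1 = silicon_first(n)
    c2, t2 = simulate_first(n)
    print(f"{n} design iterations: silicon-first ${c1:,.0f} / {t1:.0f} wk, "
          f"simulate-first ${c2:,.0f} / {t2:.1f} wk")
```

Under these assumed numbers, screening even a couple of variants in software before a single fab run saves hundreds of thousands of dollars and months of schedule, which is the argument the company is making.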

Metro450 Conference 2014 discusses all things 450mm wafers!


Thanks to the Enable450 newsletter, sent out by Malcolm Penn, CEO, Future Horizons, here is a piece on the Metro450 Conference 2014, held earlier this year in Israel.

Metro450 is an Israel-based consortium with the goal of helping the metrology companies advance in their fields. The consortium’s members include metrology and related companies, as well as academics who support these companies by performing basic research.

The conference was sponsored by the Israeli Chief Scientist Office, by Applied Materials Israel and by Intel. There were several goals for the conference: to provide an opportunity for industry leaders as well as academicians to meet and discuss the latest developments in the world of metrology, to present these advances to audiences which would normally not be privy to such information, and to learn more about the international effort in 450mm wafer technology.

Over 200 people attended this conference from Israeli companies and academia, as well as from Europe and the United States. Israeli companies included Applied Materials, Jordan Valley, Nova, KLA, Zeiss Israel, and others. Academic members included researchers from the leading Israeli universities, including the Technion, Tel-Aviv University and Haifa University. European participants included ENIAC, large corporations such as ASML, as well as SME-based companies. The G450C consortium, based in Albany, N.Y., was also well represented at this conference.

Some of the highlights of the conference included scientific discussions of different metrology methods, and their adjunct requirements, such as improved rapid wafer movement, improved sampling methods and fast computing. Presentations also included an overview of the advances necessary to move the industry forward, optical CD metrology, x-ray metrology, and novel piezo-based wafer movement.

A panel discussed various broad industry trends, including the timeline of 450mm wafers, European programs and the Israeli programs. International speakers discussed the European technology model, risk mitigation of 450 through collaborations, 450 collaborative projects under ENIAC, 450mm wafer movement challenges and metrology challenges beyond 14nm.

The second annual Metro450 conference took place this January at the Technion in Israel.

Five recommendations for verification: Dr. Wally Rhines


Dr. Wally Rhines

It seems to be the season of verification. The Universal Verification Methodology (UVM 1.2) is being discussed across conferences. Dennis Brophy, director of Strategic Business Development, Mentor Graphics, says that the UVM 1.2 release is imminent, and UVM remains a topic of great interest.

Biggest verification mistakes
Before I add Dennis Brophy’s take on UVM 1.2, I discussed the intricacies of verification with Dr. Wally Rhines, chairman and CEO, Mentor Graphics Corp. First, I asked him about the biggest verification mistakes being made today.

Dr. Rhines said: “The biggest verification mistake made today is poor or incomplete verification planning. This generally results in underestimating the scope of the required verification effort. Furthermore, without proper verification planning, some teams fail to identify which verification technologies and tools are appropriate for their specific design problem.”

Would you agree that many companies STILL do not know how to verify a chip?

Dr. Rhines added: “I would agree that many companies could improve their verification process. But let’s first look at the data. Today, we are seeing that about 1/3 of the industry is able to achieve first silicon success. But what is interesting is that silicon success within our industry has remained constant over the past ten years (that is, the percentage hasn’t become any worse).

“It appears that, while design complexity has increased substantially during this period, the industry is at least keeping up with this added complexity through the adoption of advanced functional verification techniques.

“Many excellent companies view verification strategically (and as an advantage over their competition). These companies have invested in maturing both their verification processes and teams and are quite productive and effective. On the other hand, some companies are struggling to figure out the entire SoC space and its growing complexity and verification challenges.”

How are companies trying to address those?

According to him, the recent Wilson Research Group Functional Verification Study revealed that the industry is maturing its verification processes through the adoption of various advanced functional verification techniques (such as assertion-based verification, constrained-random simulation, coverage-driven techniques, and formal verification).  Complexity is generally forcing these companies to take a hard look at their existing processes and improve them.
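For readers less familiar with those terms, the toy sketch below illustrates the constrained-random and coverage-driven ideas in a deliberately simplified, language-agnostic way; in practice this lives in SystemVerilog/UVM testbenches rather than Python, and the transaction fields, constraints and coverage bins here are invented for illustration.

```python
import random

OPCODES = ["READ", "WRITE", "FLUSH"]
BURST_LENGTHS = [1, 4, 8, 16]

def random_transaction():
    """Constrained-random stimulus: fields are random but obey constraints
    (here, a FLUSH never uses a burst longer than 1)."""
    op = random.choice(OPCODES)
    burst = 1 if op == "FLUSH" else random.choice(BURST_LENGTHS)
    return op, burst

# Coverage-driven loop: keep generating stimulus until every legal
# (opcode, burst) bin has been hit at least once.
goal = {(op, b) for op in OPCODES for b in BURST_LENGTHS
        if not (op == "FLUSH" and b > 1)}
hit = set()
count = 0
while hit != goal:
    hit.add(random_transaction())
    count += 1

print(f"covered {len(hit)} functional-coverage bins in {count} random transactions")
```

Assertion-based and formal techniques, also cited in the study, are not shown in this sketch.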

Getting business advantage
Are companies realizing this and building an infrastructure that gets you business advantage?

He added that in general, there are many excellent companies out there that view verification strategically and as an advantage over their competition, and they have invested in maturing both their verification processes and teams. On the other hand, some other companies are struggling to figure out the entire SoC space and its growing complexity and verification challenges.

When should good verification start?
When should good verification start: after design, or as you are designing and architecting your design environment?

Dr. Rhines noted: “Just like the design team is often involved in discussion during the architecture and micro-architecture planning phase, the verification team should be an integral part of this process. The verification team can help identify architectural aspects of the design that are going to be difficult to verify, which ultimately can impact architectural decisions.”

Are folks mistaken by looking at tools and not at the verification process itself? What can be done to reverse this?

He said: “Tools are important! However, getting the most out of the tools, and ensuring that the verification solution is an efficient and repeatable process, is equally important. At Mentor Graphics, we recognize the importance of both. That is why we created the Verification Academy, which focuses on developing skills and maturing an organization’s functional verification processes.”

What needs to go into verification planning, given that the ‘right’ verification path is fraught with complexities?

Dr. Rhines said: “During verification planning, too many organizations focus first on the “how” aspect of verification versus the “what.” How a team plans to verify its designs is certainly important, but first you must identify exactly what needs to be verified. Otherwise, something is likely to slip through.

“In addition, once you have clearly identified what needs to be verified, it’s an easy task to map the functional verification solutions that will be required to productively accomplish your verification goals. This also identifies what skill sets will need to be developed or acquired to effectively take advantage of the verification solutions that you have identified as necessary for your specific problem.”

How is Mentor addressing this situation?

Mentor Graphics’ Verification Academy was created to help organizations mature their functional verification processes—and verification planning is one of the many excellent courses we offer.

In addition, Mentor Graphics’ Consulting provides customized solutions to technical challenges on real projects with real schedules. By helping customers successfully integrate advanced functional verification technologies and methodologies into their work flows, we help ensure they meet their design and business objectives.

Five recommendations for verification
Finally, I asked him, what would be your top five recommendations for verification?

Here are the five recommendations for verification from Dr. Rhines:

* Ensure your organization has implemented an effective verification planning process.

* Understand which verification solutions and technologies are appropriate (and not appropriate) for various classes of designs.

* Develop or acquire the appropriate skills within your organization to take advantage of the verification solutions that are required for your class of design.

* For the SoC class of designs, don’t underestimate the effort required to verify the hardware/software interactions, and ensure you have the appropriate resources to do so.

* For any verification processes you have adopted, make sure you have appropriate metrics in place to help you identify the effectiveness of your process, and to identify opportunities for process improvements in terms of efficiency and productivity. (A toy illustration of such a metric follows below.)
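As a toy illustration of that last recommendation (and not a Mentor Graphics flow), the sketch below tracks weekly functional-coverage progress and flags when closure has stalled; the weekly numbers and the stall threshold are invented for the example.

```python
# Invented weekly functional-coverage results, in percent covered.
weekly_coverage = [22.0, 41.5, 58.0, 69.0, 74.5, 76.0, 76.5]

STALL_THRESHOLD = 2.0  # assumed: under 2 points of progress per week counts as a stall

for week in range(1, len(weekly_coverage)):
    gain = weekly_coverage[week] - weekly_coverage[week - 1]
    status = "stalled: review the plan" if gain < STALL_THRESHOLD else "on track"
    print(f"week {week}: {weekly_coverage[week]:.1f}% (+{gain:.1f}) {status}")
```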

Semicon industry needs to keep delivering value: Anil Gupta

April 3, 2014

Anil Gupta

In 2013, the global semiconductor industry had touched $306 billion or so. Sales had doubled from $100 billion to $200 billion in six years — from 1994 to 2000. It was enterprise sales that was driving this. It has taken 14 years to move past $300 billion, said Anil Gupta, managing director, Applied Micro Circuits India Pvt Ltd, at the UVM 1.2 day.

This time, consumption of semiconductors is not only around enterprise, but social networks as well. Out of the $306 billion, logic was approximately $86 billion, memory was $67 billion, and micro was $58 billion. We, as consumers, are starting to play a huge role.

However, the number of large players seems to be shrinking. Mid-size firms, like Applied Micro, are said to be struggling. Technology is playing an interesting role: there is very significant investment in FinFETs, and it may only get more difficult for all of us. Regardless, all of this is a huge barrier for mid-sized to small companies. Acquisitions are probably the only route, unless you are in software.

In India, we have been worried for a while about whether this situation is just a passing phase. We definitely have a role to play. From an expertise perspective, thanks to our background, we have been a poor nation, and for us the job is the primary goal. We need to think: how do we deliver value? We have to try and keep creating value for as long as possible.

As more and more devices appear, many other things are happening as well; power is one example. We still have a fair number of years ahead in which there will be opportunities to deliver value.

What’s happening between hardware and software? The latter is in demand. Clearly, there is a trend to make hardware a commodity. However, hardware is not going away! Therefore, the opportunity for us to deliver value is huge.

Taking the tools and making something with them is critical. UVM tools are critical. But, somewhere along the way, we seem to stop at that. We definitely need to add value. UVM’s aim is to make things re-usable.

Don’t lose your focus while doing verification. Think about the block, the subsystem and the top. You need to, and will, discover how valuable it is to find a bug before the tape-out of the chip.

Are we at an inflection point in verification?

April 2, 2014

Are we at an inflection point in verification today? Delivering the guest keynote at the UVM 1.2 day, Vikas Gautam, senior director, Verification Group, Synopsys, said that today, mobile and the Internet of Things are driving growth. Naturally, SoCs are becoming even more complex. This is also opening up new verification challenges, such as power efficiency, more software, and reducing time-to-market. There is a need to shift left to be able to meet time-to-market goals.

The goal is to complete your verification as early as possible. There have been breakthrough verification innovations. SystemVerilog brought in a single language. Every 10-15 years, there has been a need to upgrade verification.

Today, many verification technologies are needed. There is a growing demand for smarter verification. There is a need for more upfront verification planning, for automated setup and re-use with VIP, and for deploying new technologies and different debug environments. The current flows are limiting smart verification, with disjointed environments spanning many tools and vendors.

Synopsys has introduced the Verification Compiler. You get access to each required technology, as well as next-gen technology. These technologies are natively integrated. All of this enables 3X verification productivity.

Regarding next-gen static and formal platforms, there will be capacity and performance for SoCs. They should be compatible with implementation products and flows. There is a comprehensive set of applications. NLP+X-Prop can help find tough wake-up bugs at RTL. Simulation is tuned for the VIP. There is a ~50 percent runtime improvement.

SystemVerilog has brought in many new changes. Now, we have the Verification Compiler. Verdi is an open platform. It offers VIA, a platform for customizing Verdi. VIA improves debug efficiency.

Three things in Indian semicon: Vinay Shenoy

April 2, 2014

Vinay Shenoy

There have been a variety of announcements made by the Government of India in the last year or so. In the pre-90s period, the country showed just a 1 percent GDP growth rate. It was averse to FDI and had a regulated market. All of this led to deregulation under the late PM, PV Narasimha Rao.

The Indian government was averse to foreign investment, which was opened up around 1994. Since then, we have seen 6-8 percent growth, said Vinay Shenoy, MD, Infineon Technologies (India). He was delivering the keynote at the UVM 1.2 day, being held in Bangalore, India.

Around 1997, India signed the ITA-1 with the WTO. Many electronic items had their import duty reduced to zero, which effectively destroyed the electronics manufacturing industry in India; we were reduced to being users of screwdriver technology. In 1985, the National Computer Policy, and in 1986, the National Software Policy, were drafted. The government of India believed that some opportunities existed. The STPI was also created, as well as 100 percent export-oriented units (EoUs). So far, we have been very successful in services, but have a huge deficit in manufacturing.

We made an attempt to kick off semicon manufacturing in 2007, but it didn’t take off for several reasons. It was later revived in 2011-12. Under the latest national policy on electronics, there have been a couple of announcements. One is the setting up of two semicon fabs in India. The government is giving a capital grant of nearly 25-27 percent, and has provided a financial incentive of about $2 billion.

Two, for electronics manufacturing per se, unless a unit is completely export-oriented, the semicon industry will find it difficult to survive. There is the M-SIPS package, which offers a 25 percent capital grant to a wide range of industries.

Three, we have granted some incentives for manufacturing, but how are you going to sell? The government has also proposed ‘Made in India’ preferential market access, where 30 percent of the products will be used within India. This will largely apply to government procurement, so that the BoM should be at least 30 percent from India. The preferential market policy applies to all segments except defense.

Skill development is also key. The government has clearly stated that there should be innovation-led manufacturing, and it wants to develop PhDs in selected domains. It intends to provide better lab facilities, better professors, etc. Also, young professors seeking to expand their work can seek funding from the government.

TSMC promotes small IP companies; something similar should be done in India. The two semicon fabs in India will likely come up in two to three years’ time. “Look at how you can partner with these fabs. Your interest in the semicon industry will be highly critical. The concern of the industry has been the stability of the tax regime, and the government of India has assured a stable tax regime for 10 years. The returns will come in 10-15 years,” added Shenoy.

The government has set up electronics manufacturing clusters (EMCs), which will make it easier for companies to set up within them. The NSDC is tying up with universities to bring in skill-sets, and the industry is also defining what skills will be required. The government is funding PhDs to pursue specialization.

Are we about to reach the end of Moore’s Law?


Sam Fuller

Here is the concluding part of my discussion with Sam Fuller, CTO, Analog Devices. We discussed the technology aspects of Moore’s Law and ‘More than Moore’, among other things.

Are we at the end of Moore’s Law?
First, I asked Fuller: as Gordon Moore himself has suggested, are we about to reach the end of Moore’s Law? What will it mean for personal computing?

Fuller replied: “There is definitely still life left in Moore’s law, but we’re leaving the golden age after the wonderful ride that we have had for the last 40 years. We will continue to make chips denser, but it is becoming difficult to continue to improve the performance as well as lower the power and cost.

“Therefore, as Moore’s law goes forward, more innovation is required with each new generation. As we move from planar CMOS to FinFET (a new multi-gate transistor architecture), and from silicon to more advanced materials, Moore’s law will still have life for the next decade, but we are definitely moving into its final stages.

“For personal computing, there is still a lot of innovation left before we begin to run out of ideas. There will continue to be great advances in smart phones, mobile computing and tablets because software applications are really just beginning to take advantage of the phenomenal power and capacity of today’s semiconductors. The whole concept of ‘Internet of things’ will also throw up plenty of new opportunities.

“As we put more and more sensors in our personal gadgets, in factories, in industries, in infrastructures, in hospitals, and in homes and in vehicles, it will open up a completely new set of applications. The huge amount of data generated out of these sensors and wirelessly connected to the Internet will feed into the big data and analytics. This would create a plethora of application innovations.”

What’s happening in the plane?
The in-plane scaling opportunity (90nm, 65nm, 45nm, 22nm, 20nm, 14/18nm) is starting to get difficult and probably won’t work at 12nm, for purely physics reasons. What is Analog Devices’ take on this?

Fuller said: “You are right! We have been going from 45 nm down to lower nodes, it’ll probably go down to 10 nm, but we are beginning to run into some fundamental physics issues here. After all, it’s a relatively finite number of atoms that make up the channels in these transistors. So, you’re going to have to look at innovations beyond simply going down to finer dimensions.

“There are FinFETs and other approaches that can help move you into the third dimension. We’re getting to a point where we can put a lot of complexity and a number of functions on a single die. We have moved beyond purely digital design to having more analog and mixed-signal components in the same chip. There are also options such as stacked dies and multiple dies.

“Beyond integration on a single chip, Analog Devices leads in advanced packaging technologies for System in a Package (SiP), where sensors, digital and analog/mixed-signal components are all in a single package, since the individual components would typically use different technology nodes and it might not be practical to do such integration on a single die.

“So, the challenge often gets described as “More than Moore”, which is going beyond Moore’s law, bringing those capabilities to do analog processing as well as digital and then integrating sensors for temperature sensing, pressure sensing, motion sensing and a whole range of sensors integrated for enabling the ‘Internet of Things’.

“At Analog Devices, we have the capability in analog as well as digital, and having worked for over 20 years on MEMS devices, we are particularly well positioned as we get into ‘More than Moore’.”
