Great! That’s just what we needed! As though software piracy weren’t enough, there is now an article about EDA software piracy!
According to the article, the anti-piracy committee of the Electronic Design Automation Consortium (EDAC) estimates that 30-40 percent of all EDA software use is via pirated licenses. That’s a huge number!
What are the chief reasons for EDA software piracy? Surely, it can’t be attributed to the Far East countries alone, and certainly not just to China and Taiwan, or perhaps India, for that matter.
Everyone in the semiconductor industry knows that EDA software is required to design chips, and that hefty license fees are involved, which companies have to pay.
Designing a chip is a very complex activity, and it requires EDA software. EDA firms send sales teams all over the country. Why, some EDA vendors are even known to form alliances with technical colleges and universities, offering their EDA software to such institutes at a very low cost.
Back in 2006, John Tanner wrote an article in Chip Design, stating: EDA tools shouldn’t cost more than the design engineer!
However, how many of these EDA licenses are properly used? And have the EDA vendors who go out to the technical institutes ever studied any particular institute’s usage of their tools?
The recently held Design Automation Conference (DAC) showered praise on itself for a double-digit rise in attendance. Was there any mention of EDA piracy in all of that? No way! And if not, why not?
The reasons are clear: the EDA industry already earns sizeable revenue from the global usage of EDA software. EDA firms are busy trying to keep up with the latest process nodes and develop the requisite tools. New products are constantly being developed, so product R&D is a continuous affair! And in all of this race, EDA firms are continuously looking to keep their revenues running high, lest there be an industry climb-down!
Where, then, is the incentive for EDA firms to even check, let alone control, piracy?
An industry friend had this to say regarding EDA software piracy: “It is the inability to use certain ‘tool modules’ only at certain times. For example, an IP company may want to run PrimeTime (Synopsys) just a few times to ensure an IP’s timing worthiness before releasing it, and doesn’t need it after that. However, it is not possible to get such a short-term license.” Cost and unethical practices by stakeholders were some other reasons EDA users cited.
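The gap my friend describes is the lack of time-boxed licenses. As a rough illustration only, a short-term checkout could be modeled as a token with a start time and a duration; the field names below are hypothetical and do not reflect any real license manager’s format:

```python
import time

def license_valid(token, now=None):
    """Check a hypothetical time-boxed license token.

    'feature', 'start' (epoch seconds), and 'duration_s' are invented
    fields standing in for a real license manager's checkout record.
    """
    now = time.time() if now is None else now
    return token["start"] <= now < token["start"] + token["duration_s"]
```

With such a model, a vendor could sell, say, a few hours of PrimeTime-style signoff runs instead of a full-year seat.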
Regarding the status in India especially, the difference from, say, China isn’t that much. Another user said it is not such a prevalent, ‘worrisome’ aspect, yet. Yet another EDA user said that EDA piracy exists more in the sense of ‘unauthorized’ usage than ‘unpaid’ usage: not using a tool for what it is supposed to be used for. For instance, using academic licenses for ‘commercial developments’, etc.
That leads to the key question: can EDA software piracy be curtailed to some extent? One user feels that yes, it can; perhaps Microsoft-type ‘detection’ technologies could be used. However, another said that the expense of policing piracy could end up being more than the actual losses for the EDA companies. Hence, they are probably not quite doing it!
According to Patrick Maccartee, director of product management, and James Ready, CTO, MontaVista, MontaVista virtualization can indeed be realized. The benefits to developers are clear: lowered complexity, flexibility in development, high performance, and the first Linux configured for dataplane performance.
These were the conclusions from the seminar on Beyond Virtualization: The MontaVista Approach to Multi-core SoC Resource Allocation and Control, which I attended as an invited guest.
Use cases for virtualization in the IT world include server consolidation, tackling underutilization, and management of numerous OSs and dependent applications.
Hardware considerations include very uniform server hardware platforms, especially I/O, and extensive processor support for virtualization. There also exists a huge, uniform market for virtualization, with numerous successful companies of very large scale.
Embedded is different yet again. Embedded devices are already highly optimized, especially in terms of size, power consumption, CPU utilization, etc. No layer of software makes a processor go faster. So far, embedded virtualization is not a big market.
Multi-core does not automatically imply an RTOS for the data plane, hypervisors/virtualization, or multiple OSs. In this scenario, what’s useful for embedded virtualization? The answer is the MontaVista virtualization architecture.
In conjunction with the Mobile Marketing Association Forum (MMA Forum) APAC event held this April 13-15, I had the opportunity to interact with Anand Chandrasekaran, director of Product Management, Openwave Systems Inc., which also did a global launch of its product, Analytics Express, at the event.
Managing data traffic challenges
Despite vendors’ claims to have solved the growing data traffic challenges, those challenges still remain. How can Openwave really help manage this?
According to Anand Chandrasekaran, a fundamental shift has occurred in the industry. He said: “The demand for mobile data that we planned for years ago is finally here – only it’s bigger than everyone predicted. The proliferation of new devices like the iPhone and HTC Incredible, along with vastly improved user experiences and unlimited data plans (to date), has caused a tremendous and unprecedented surge in mobile data demand – AT&T disclosed this year that 3 percent of its users consume 40 percent of its bandwidth resources. This increase in traffic and the competitive pressure to keep data plans flat are squeezing service providers’ margins.”
Not all service providers have the financial strength to simply throw money at the problem, nor does that guarantee a sustainable solution. Service providers need to take a more holistic approach in developing solutions that will maximize available bandwidth while being able to monetize this surge of mobile data traffic.
An effective way for mobile service providers to handle the approaching data tsunami is to deploy context-aware traffic mediation software that sits in the data path, empowering them with a full view of their network, their subscribers’ profiles and the mobile devices in use. Context-aware traffic mediation enables service providers to monitor, manage and monetize traffic by creating and delivering smart policy-driven services.
According to him, Openwave’s Traffic Mediation solution runs on an open, IP-access platform that acts as a single control point for traffic management and provides services such as content adaptation, web and media optimization, network security, smart policy control, and dynamic charging and campaigning.
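To make the idea of context-aware, policy-driven mediation concrete, here is a minimal sketch in Python. It is not Openwave’s implementation; the context fields and action names are invented for illustration:

```python
# Hypothetical sketch of context-aware, policy-driven traffic mediation:
# the mediation layer inspects each request's context (subscriber plan,
# usage so far, traffic type) and returns an action for the data path.

def choose_action(context):
    """Return a traffic-management action for one request context."""
    plan = context["plan"]            # e.g. "unlimited" or "capped"
    used_mb = context["used_mb"]      # data consumed this billing cycle
    cap_mb = context.get("cap_mb")    # None for unlimited plans

    if plan == "capped" and cap_mb is not None and used_mb >= cap_mb:
        # Over the cap: monetize with a paid top-up offer
        # instead of hard-blocking the subscriber.
        return "offer_topup"
    if context.get("video", False):
        # Heavy media traffic gets optimized (adaptation/compression).
        return "optimize_media"
    return "pass_through"
```

A real deployment would evaluate many such policies per flow, but the shape is the same: monitor the context, pick a policy action, and either manage or monetize the traffic.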
Ever wondered why you are unable to update your anti-virus software? Chances are that you are running a pirated version of the Windows OS! That itself opens your system and network up to cyber threats and other attacks! Yes, I know! Prices of software, at least the relevant ones, aren’t that low! However, you don’t have much choice, do you?
Now, if you were using original software on your computer, you wouldn’t really face this problem! It means you are managing your asset, in this case, the software. To run your heavy, feature-rich programs, you need robust software that is sometimes (or nearly all the time) expensive!
Okay! Imagine an enterprise using counterfeit or pirated software. Will it face problems? Surely, very serious ones, in both the short and long terms. Now, what if a public sector undertaking were using pirated software? That would directly impact the services it offers, in most cases, e-governance services.
Keeping all of this in mind, the Business Software Alliance (BSA) has launched a SAM (software asset management) program for public sector undertakings (PSUs) based in the Indian state of Karnataka. This is the first SAM initiative for PSUs, run in partnership with the Centre of e-Governance, Government of Karnataka.
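At its core, SAM is about reconciling what is installed against what is actually licensed. A minimal sketch of that reconciliation step, with made-up product names, might look like this:

```python
from collections import Counter

def find_license_gaps(installed, entitlements):
    """Compare installed copies against purchased entitlements.

    installed: list of product names, one entry per installation found
    entitlements: dict mapping product -> number of licenses owned
    Returns the under-licensed products and their seat shortfall.
    """
    counts = Counter(installed)
    return {
        product: counts[product] - entitlements.get(product, 0)
        for product in counts
        if counts[product] > entitlements.get(product, 0)
    }
```

Any product appearing in the result is a compliance gap that a PSU would need to close, either by uninstalling copies or buying more seats.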
Dr. D.S. Ravindran, CEO, Centre of e-Governance, Government of Karnataka, said: “In the last year, the IT spend by the government was close to Rs. 300 crore. All PSUs are now also coming up with their own IT needs and it is important to adopt this standard of SAM with good IT governance practices in order to enable the state of Karnataka to be more productive and cost-efficient.”
Dr. Ravindran added: “We have connected 30,000 offices across the state through the wide area network (WAN). We have done over Rs. 22,000 crores worth of procurement over our e-governance platform.”
This piece of information was sent to me by a friend, Ms Tahira Amjad. Thanks a lot!
Skyway Software is a US-based software development company that provides technology and processes to IT organizations to simplify their software delivery systems, often reducing application development and deployment schedules by 30 percent or more.
The company’s flagship product, Skyway Builder, is an open-source, model-centric JEE application development and deployment tool for delivering RIAs (Rich Internet applications) and Web Services to the Spring Framework.
Unlike any other development tool, Skyway Builder provides comprehensive modeling capabilities at three distinct application layers — Web/UI layer, service layer, and data layer — and fully functional solutions may be delivered easily to a wide variety of open-source and commercial infrastructures.
In order to introduce Skyway software to developers in India — a market Skyway Software considers ‘the leading force in software development’ — it is sponsoring the SkywayCup, a multinational challenge for software developers to create a new, viable and working application using Skyway Software’s Skyway Builder Community Edition (CE), or to create a logical and workable extension to Skyway Builder itself.
The contest will give away prize money of more than US$40,000, and full information on the contest can be found at http://www.skywaycup.in
It’s the Skyway Cup!
Rich applications that benefit and delight end users in today’s enterprises, built by developers like you, will be a critical component of the Skyway Software vision.
So that you can help us realize our vision as quickly as possible, we have created the Skyway Cup, which will provide almost $50,000 in awards — no strings attached — for great solutions built using Skyway Builder.
By creating the Skyway Cup, we want to showcase how members of our Skyway Community are using Skyway Builder to build Rich Internet Applications, as well as demonstrate how our members are using Skyway Builder to serve their specific needs.
Following Mentor Graphics, Cadence Design Systems Inc. has entered the verification debate. ;) I met Apurva Kalia, VP of R&D, System & Verification Group, Cadence Design Systems. In a nutshell, he advised that there needs to be proper verification planning in order to avoid mistakes. First, let’s try to find out the biggest verification mistakes.
Top verification mistakes
Kalia said that the biggest verification mistakes made today are:
* Verification engineers do not define a structured notion of verification completeness.
* Verification planning is not done up front and is carried out as verification is going along.
* A well-defined reusable verification methodology is not applied.
* Legacy tools continue to be used for verification; new tools and technologies are not adopted.
In that case, why do some companies STILL not know how to verify a chip?
He added: “I would not describe the situation as companies not knowing how to verify a chip. Instead, I think a more accurate description of the problem is that the verification complexity has increased so much that companies do not know how to meet their verification goals.
“For example, the number of cycles needed to verify a current generation processor – as calculated by traditional methods of doing verification – is too prohibitive to be done in any reasonable timeframe using legacy verification methodologies. Hence, new methodologies and tools are needed. Designs today need to be verified together with software. This also requires new tools and methodologies. Companies are not moving fast enough to define, adopt and use these new tools and methodologies thereby leading to challenges in verifying a chip.”
How are companies trying to address the challenges?
Companies are trying to address the challenges in various ways:
* Companies at the cutting edge of designs and verification are indeed trying to adopt structured verification methodologies to address these challenges.
* Smaller companies are trying to address these challenges by outsourcing their verification to experts and by hiring more verification experts.
* Verification acceleration and prototyping solutions are being adopted to speed up verification, allowing companies to do more verification in the same amount of time.
* Verification environment re-use helps to cut down the time required to develop verification environments.
* Key requirements of SoC integration and verification—spanning functionality, compliance, power, performance, etc.—include hardware/software debug efficiency, multi-language verification, low power, mixed-signal support, fast time to debug, and execution speed.
Cadence has the widest portfolio of tools to help companies meet verification challenges, including:
Incisive Enterprise Manager, which provides hierarchical verification technology for multiple IPs, interconnects, hardware/software, and plans to improve management productivity and visibility;
The recently launched vManager solution, a verification planning and management solution enabled by client/server technology to address the growing verification closure challenge driven by increasing design size and complexity;
Incisive Enterprise Verifier, which delivers dual power from tightly integrated formal analysis and simulation engines; and
Incisive Enterprise Simulator, which provides the most comprehensive IEEE language support with unique capabilities supporting the intent, abstraction, and convergence needed to speed silicon realization.
Are companies building an infrastructure that gets them business advantage? Yes, companies are realizing the problems. It is these companies that are the winners in managing today’s design and verification challenges, he said.
When should good verification start?
Kalia noted: “Good verification should start right at the time of the high level architecture of the design. A verification strategy should be defined at that time, and an overall verification plan should be written at that time. This is where a comprehensive solution like Incisive vManager can help companies manage their verification challenges by ensuring that SoC developers have a consistent methodology for design quality enhancements.”
Are folks making the mistake of looking at tools and not at the verification process itself?
He added that the right tools and methodology are needed to resolve today’s verification challenges. Users need to work on defining verification methodologies and, at the same time, look at the tools needed to achieve their verification goals.
Finally, there’s verification planning! What should be the ‘right’ verification path?
Verification planning needs to include:
* A formal definition of verification goals;
* A formal definition of coverage goals at all levels – starting with code coverage all the way to functional coverage;
* Required resources – human and compute;
* Verification timelines;
* All the verification tools to be used for verification; and
* Minimum and maximum signoff criteria.
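The last two items, coverage goals and signoff criteria, lend themselves to a simple mechanical check. As a rough sketch (the metric names and thresholds below are invented, not Cadence’s), a plan’s minimum signoff bar could be checked like this:

```python
def signoff_ready(metrics, min_criteria):
    """Check whether every coverage metric meets its minimum signoff bar.

    metrics: dict mapping metric name -> achieved percentage
    min_criteria: dict mapping metric name -> required minimum percentage
    Returns (ready, shortfalls); shortfalls maps each failing metric
    to its (achieved, required) pair.
    """
    shortfalls = {
        name: (metrics.get(name, 0.0), required)
        for name, required in min_criteria.items()
        if metrics.get(name, 0.0) < required
    }
    return (not shortfalls, shortfalls)
```

Writing the criteria down this explicitly, up front, is exactly what turns “we ran a lot of simulations” into a structured notion of verification completeness.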