In the past, technologies were developed in the software world that languished until they were taken up by the hardware community. There they were refined and polished, and became fully integrated into the hardware development and verification flow. That was followed by attempts to migrate methodologies, such as object-oriented programming, which is the basis for most verification languages and methodologies in use today, including SystemVerilog and UVM.

One such attempt was SystemC, created as a unified hardware/software language. That goal may not have been accomplished, but SystemC is seeing widespread adoption in virtual prototypes. Not all of these technology migrations have been as successful as originally envisioned, but they still have had a significant impact.

In the past couple of years, there has been a growing chorus of people who are seriously looking at the migration of some software development methodologies to hardware. Agile Development, continuous integration, and buddy programming are examples. There is also a growing dependence between hardware and software. It has become difficult to talk about power management without talking about the interface between the capabilities provided in hardware and the policies defined in software.

Frank Schirrmeister, group director for product marketing of the System Development Suite at Cadence, reminisces about similar attempts 20 years ago. “I published a paper in 1994 titled, ‘Transferring software engineering methods to VLSI-design: a statistical approach.’” The so-called software crisis had started in 1968, when projects ran over time and budget, resulting in inefficient software of low quality that did not meet requirements. “I talked to Thomas DeMarco, and to Fred Brooks, author of ‘The Mythical Man Month,’ and their response was stunning: ‘Please don’t mess with hardware methods. Hardware is perfect as chips are going out without bugs.’”

There are many people in the hardware world who are scared by the concept of adopting more software engineering practices. “It was a huge wakeup call to Chrysler when the Jeep got hacked, which resulted in 1.4 million recalls,” says Andreas Kuehlmann, senior VP and GM of the Software Integrity Group at Synopsys. “The security team is scared because they know they have problems with their software. Much of this software is what they get from their suppliers. The supply chain has no software quality control. All of the things that have been put in place on the hardware side are not there for software.”

It is not clear that the hardware side actually does have full control over the security aspects, but what about productivity? “The 20 IBS reports showed the skyrocketing costs of software development,” points out Harry Foster, chief scientist at Mentor Graphics. “As we move down to each node, the headcount between 90nm and 16/14nm showed a 17% increase in required software engineers.”

The software world is ripe for productivity improvements. “When looking at the number of transistors per engineer from 1985 to present, we have grown five orders of magnitude. Clearly we are being productive on the design side. We have not seen a huge rise in the number of design engineers. Most software development is so primitive, with manual development, manual testing, and almost no test automation.” And yet the industry appears to be complacent about these problems. “It is an immature process.”

There are increasing problems at the interfaces. This can be fixed either by having better specifications, such that the teams can work independently, or by finding ways to improve communication between teams so problems can be found and resolved earlier. Both of these require significant changes to the existing design and development practices on both sides of the fence. “Specification is a problem and we have an opportunity in this area,” notes Foster.
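The five-orders-of-magnitude claim about transistors per engineer quoted earlier can be made concrete with a back-of-the-envelope calculation. A minimal sketch, assuming “present” means roughly 2016 (the era of the quotes; the article does not give an end year):

```python
# Sanity check of the "five orders of magnitude" hardware-productivity claim:
# if transistors per engineer grew 100,000x between 1985 and an assumed
# "present" of 2016, what compound annual growth rate does that imply?

growth_factor = 1e5        # five orders of magnitude
years = 2016 - 1985        # 31 years (2016 is an assumption, not from the article)

cagr = growth_factor ** (1 / years) - 1
print(f"Implied annual productivity growth: {cagr:.1%}")
# prints: Implied annual productivity growth: 45.0%
```

In other words, the quoted figure implies design productivity compounding at roughly 45% per year, which underlines why the contrast with manual, largely unautomated software practice is so stark.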