Acceptance of Formal Methods: Lessons from Hardware Design

by

David L. Dill (Stanford University) and John Rushby (SRI International)

From the 15-page "roundtable" on formal methods in the April 1996 issue of the IEEE magazine Computer (Vol. 29, No. 4, pp. 16-30). Organized by Hossein Saiedian and titled "An Invitation to Formal Methods," the roundtable features short articles by Hossein Saiedian, Jonathan P. Bowen, Ricky W. Butler, David L. Dill, Robert L. Glass, David Gries, Anthony Hall, Michael G. Hinchey, C. Michael Holloway, Daniel Jackson, Cliff B. Jones, Michael J. Lutz, David L. Parnas, John Rushby, Jeannette Wing, and Pamela Zave.

Despite years of research, the overall impact of formal methods on mainstream software design has been disappointing. By contrast, formal methods are beginning to make real inroads in commercial hardware design. This penetration is the result of sustained progress in automated hardware verification methods, an increasing accumulation of success stories from using formal techniques, and a growing consensus among hardware designers that traditional validation techniques are not keeping up with the increasing complexity of designs. For example, validation of a new microprocessor design typically requires as much manpower as the design itself, and the size of validation teams continues to grow. This manpower is employed in writing test cases for simulations that run for months on acres of high-powered workstations.

In particular, the notorious FDIV bug in the Intel Pentium processor has galvanized verification efforts, not because it was the first or most serious bug in a processor design, but because it was easily repeatable and because its cost was quantified (at over $400 million).

Hence, hardware design companies are increasingly looking to new techniques, including formal verification, to supplement and sometimes replace conventional validation methods. Indeed, many companies, including industry leaders such as AT&T, Cadence, Hewlett-Packard, IBM, Intel, LSI Logic, Motorola, Rockwell, Texas Instruments, and Silicon Graphics, have created formal verification groups to help with ongoing designs. In many cases, these groups began by demonstrating the effectiveness of formal verification by finding subtle design errors that had been overlooked by months of simulation.

Why have formal methods been more successful for hardware than for software? We believe that the overriding reason is that applications of formal methods to hardware have become cost-effective.

The decision to use a new methodology is driven by ECONOMICS: do the benefits of the new method exceed the costs of converting to it and using it by a sufficient margin to justify the risks of doing so? The benefits may include an improved product (e.g., fewer errors), but those most keenly desired are reduced validation costs and reduced time-to-market (for the same product quality). The chief impediments to applying traditional formal methods are that the costs are thought to be high (e.g., large amounts of highly skilled labor) or even unacceptable (a potential increase in time-to-market), while the benefits are uncertain (a possible increase in product quality). Formal hardware verification has become attractive because it has focused on reducing the cost and time required for validation rather than on the pursuit of perfection.

Of course, hardware has some intrinsic advantages over software as a target for formal methods. In general, hardware has no pointers, no potentially unbounded loops or recursion, and no dynamically created processes, so its verification problem is more tractable. Furthermore, hardware is based on a relatively small number of major design elements, so that investment in mastering the formal treatment of, say, pipelining or cache coherence can pay off over many applications. And the cost of fabricating hardware is much greater than that of software, so the financial incentive to reduce design errors is correspondingly greater.

However, we believe there are some lessons and principles from hardware verification that can be transferred to the software world. Some of these are listed below.

PROVIDE POWERFUL TOOLS
Technology is the primary source of increased productivity in most areas, and especially in this one. In particular, tools that use formal specifications as the starting point for mechanized formal CALCULATIONS are the primary source of cost-effective applications of formal methods. This is exactly analogous to the use of mathematical modeling and calculation in other engineering disciplines. Without tools to deliver tangible benefits, formal specifications are just documentation, and there is little incentive for engineers to construct them or to keep them up to date as the design evolves.

For hardware, a spectrum of tools has evolved to perform formal calculations at different levels of the design hierarchy, with different benefits and costs.

At the lowest level are tools that check Boolean equivalence of combinational circuits (useful for checking manual circuit optimizations). Techniques based on Ordered Binary Decision Diagrams (BDDs) can check large circuits quite efficiently and are now incorporated in commercial CAD tools.

At a higher level, designs can often be represented as interacting finite state machines, and tools that systematically explore the combined state space can check that certain desired properties always hold, or that undesired circumstances never arise. Tools based on explicit state enumeration can explore many millions of states in a few hours; tools that represent the state space symbolically (using BDDs) can sometimes explore vast numbers of states (e.g., 10^100) in the same time, and can check richer properties (e.g., those specified in a temporal logic, in which case the technique is called "temporal logic model checking").

At the highest levels, or when very complex properties or very large (or infinite) state spaces are involved, highly automated theorem proving methods can be used to compare implementations with specifications. These methods combine decision procedures for propositional calculus, equality, and linear arithmetic with rewriting and induction. In all cases, the tools concerned are highly engineered so that they can deal with very large formulas and require little or no user interaction when applied in familiar domains.
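
To make the lowest rung of this spectrum concrete, here is a minimal Python sketch of combinational equivalence checking (our illustration, not taken from any of the tools above; the function names are invented). It compares a hand-optimized gate-level full adder against an arithmetic reference by brute-force enumeration of inputs; production equivalence checkers replace this exponential enumeration with BDDs, under which each circuit becomes a canonical decision graph and equivalence reduces to comparing the two graphs directly.

    from itertools import product

    def full_adder_spec(a, b, cin):
        # Arithmetic reference: the "golden" specification.
        total = a + b + cin
        return total & 1, total >> 1            # (sum, carry-out)

    def full_adder_impl(a, b, cin):
        # Hand-optimized gate-level version to be checked.
        p = a ^ b                               # propagate signal
        return p ^ cin, (a & b) | (p & cin)     # (sum, carry-out)

    # Brute-force Boolean equivalence check: feasible only for small
    # circuits; real tools use BDDs or SAT to handle wide inputs.
    for a, b, cin in product((0, 1), repeat=3):
        assert full_adder_spec(a, b, cin) == full_adder_impl(a, b, cin), \
            f"mismatch at inputs {(a, b, cin)}"
    print("circuits are equivalent on all 8 input vectors")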

USE VERIFICATION TO FIND BUGS
A tool that simply blesses a design at the end of a laborious process is not nearly as impressive to engineers as a tool that finds a bug. Finding bugs is computationally easier than proving correctness, and a potential cost can be attached to every bug found, making the payoff from formal verification easy to see. Traditional validation methods are already used primarily as bug-finders, so formal methods are very attractive if they find bugs that traditional methods miss--a much more achievable goal than trying to guarantee correctness.

Shortcuts can be taken when formal verification is used for finding bugs rather than proving correctness. For example, a system can be "down-scaled"--the number or sizes of its components drastically reduced. A directory-based cache-coherence protocol, for instance, can be checked with just four processors, one cache line, and two data values; such a down-scaled description still has many millions of states, but is within reach of state exploration and model checking methods (a miniature sketch of the approach appears below). These methods check the reduced system COMPLETELY; in contrast, simulation checks the full system very incompletely. Both techniques find some bugs and miss others, but the formal methods often detect bugs that simulation does not. For example, cache-coherence bugs have been found in the IEEE standard FutureBus+ and Scalable Coherent Interface (SCI) protocols in just this way. Some researchers are now applying these techniques to software.
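
As an illustration (a toy model constructed for this discussion, far smaller even than the down-scaled protocols described above), the following Python sketch enumerates every reachable state of a three-cache, single-line MSI-style protocol and checks a coherence invariant on each. Explicit-state tools such as Murphi work in essentially this way, though with far richer modeling languages and heavily engineered state storage.

    from collections import deque

    N = 3                       # number of caches (down-scaled)
    INIT = ("I",) * N           # every cache starts Invalid

    def successors(state):
        """All states reachable in one atomic protocol step."""
        for i in range(N):
            # Read miss: any Modified copy is downgraded to Shared,
            # then cache i obtains a Shared copy.
            if state[i] == "I":
                nxt = tuple("S" if s == "M" else s for s in state)
                yield nxt[:i] + ("S",) + nxt[i + 1:]
            # Write: all other copies are invalidated, i becomes Modified.
            yield tuple("M" if j == i else "I" for j in range(N))
            # Eviction: cache i silently drops its copy.
            if state[i] != "I":
                yield state[:i] + ("I",) + state[i + 1:]

    def coherent(state):
        # Invariant: a Modified copy must be the ONLY valid copy.
        if "M" in state:
            return sum(s != "I" for s in state) == 1
        return True

    # Breadth-first exploration of the full (down-scaled) state space.
    seen, frontier = {INIT}, deque([INIT])
    while frontier:
        state = frontier.popleft()
        assert coherent(state), f"coherence violated in {state}"
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    print(f"explored {len(seen)} states; invariant holds in all of them")

Deleting the invalidation from the write rule--a classic protocol bug--causes the checker to report a violating state almost instantly, which is exactly the bug-finding payoff described above.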

FORMAL TECHNIQUES MUST BE TARGETED
In hardware, experience shows that control-dominated circuits are much harder to debug than data paths. So effort has gone into developing formal verification techniques for protocols and controllers rather than for data paths. Targeting maximizes the potential payoff of formal methods by solving problems that are not handled by other means. Notice that the targeted problems often concern the HARDEST challenges in design: cache coherence, pipeline (and now superscalar) correctness, and floating point arithmetic. For software, correspondingly difficult and worthwhile challenges include those where local design decisions have complex global consequences, such as the fault-tolerance and real-time properties of concurrent distributed systems.
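
To suggest what "pipeline correctness" means formally, here is a deliberately tiny Python sketch in the spirit of the flushing-based correspondence arguments used in pipelined processor verification (as in Burch and Dill's work). An implementation with one instruction in flight is compared against its sequential specification by checking that flushing and stepping commute; the machine, its program, and all names here are invented for illustration.

    from itertools import product

    PROG = [("ADD", 0, 1), ("ADD", 1, 1), ("ADD", 1, 0)]   # toy program
    MOD = 4                                                # 2-bit registers

    def execute(instr, regs):
        """ISA semantics of one instruction: r[dst] += r[src]."""
        _, dst, src = instr
        regs = list(regs)
        regs[dst] = (regs[dst] + regs[src]) % MOD
        return tuple(regs)

    def spec_step(pc, regs):
        """Sequential specification: execute, then advance the pc."""
        return (pc + 1) % len(PROG), execute(PROG[pc], regs)

    def impl_step(pc, regs, latch):
        """Two-stage pipeline: retire the latched instruction while
        fetching the next one into the latch."""
        regs = execute(latch, regs) if latch else regs
        return (pc + 1) % len(PROG), regs, PROG[pc]

    def flush(pc, regs, latch):
        """Abstraction: drain the pipeline to obtain a spec state."""
        return pc, execute(latch, regs) if latch else regs

    # Commutative diagram: flushing after an implementation step must
    # equal a specification step after flushing, for every state.
    for pc, latch in product(range(len(PROG)), [None] + PROG):
        for regs in product(range(MOD), repeat=2):
            assert flush(*impl_step(pc, regs, latch)) == \
                spec_step(*flush(pc, regs, latch)), (pc, regs, latch)
    print("pipeline matches its ISA specification on all states")

In a real processor the state space is far too large to enumerate, which is why this style of argument is carried out with decision procedures and theorem provers rather than loops.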

RESEARCHERS SHOULD APPLY THEIR WORK TO REAL PROBLEMS
Our research priorities are completely different from what they would have been had we not exercised our ideas on realistic problems. Such efforts have frequently raised interesting new theoretical problems and have highlighted the need for improvements in tools.

Of course, applying verification strategies to real problems is also crucial for building credibility. There is now a long string of success stories from academia and industry in which finite-state verification techniques have been applied to hardware and protocols. A few documented examples include the protocol bugs mentioned above in FutureBus+ (found using symbolic model checking with a version of CMU's SMV system) and SCI (found using explicit state enumeration with Stanford's Murphi verifier), and formal verification of the microarchitecture and microcode of the Collins AAMP5 and AAMP-FV avionics processors (using theorem proving with SRI's PVS system). Several groups have also demonstrated the ability to detect bugs in the quotient-prediction tables of SRT division algorithms (similar to the Pentium FDIV bug), and some have been able to verify specific SRT circuits and tables. There have also been many unpublicized examples of problems found by industrial formal verification groups--successes that have helped those groups build credibility among designers and managers in their companies.

In conclusion, we attribute the growing acceptance of formal methods in commercial hardware design to the power and effectiveness of the tools that have been developed, to the pragmatic character of the ways in which those tools have been applied, and to the overall cost-effectiveness and utility that has been demonstrated. We believe that formal methods can achieve similar success in selected software applications by following the same principles.