My research centers on the mathematical modeling and analysis of computer hardware and software, i.e., computing as applied mathematics. "Computer science" might be an apt description of this endeavor—unfortunately, that term has been irretrievably abused.
Practical modeling of real computer systems often requires greater attention to detail than is typical of traditional mathematical domains. A variety of so-called "formal methods", based on special-purpose notations and mechanical reasoning tools, have been developed to ease this burden. These methods have been applied successfully across the computer industry, but their potential benefits remain largely unexploited. The emphasis has been on automatic tools that are easy to use but lack the versatility of interactive theorem proving. It is unfortunate that the value of a formal method is measured more often by its accessibility to the untrained user than by its ultimate utility as an aid to mathematical reasoning.
The effective systematic use of formal methods will require a broader view of the role of mathematics in the science of computing, which I envision as patterned after the classical physical sciences. Thus, I try to use formal methods where appropriate to support, rather than to replace, mathematical rigor.
Most of the papers listed below describe the formal analysis and ultimate proof of correctness of (hardware and software) designs that were created by other people. In some cases, this analysis exposed fatal flaws in the original designs. Some day, I hope to do something creative, but for now, this is the thing that I enjoy most and do best: finding fault in the efforts of others. (Hey, I am certainly not the first to build a career around a personality disorder. Mike Tyson and Mel Tillis come to mind. And what was the name of that French nightclub performer whose act was based on a profound case of chronic flatulence?)
"Nature is written in that great book which ever lies before our eyes -- I mean the universe -- but we cannot understand it if we do not first learn the language and grasp the symbols in which it is written. The book is written in the mathematical language, and the symbols are triangles, circles and other geometrical figures, without whose help it is impossible to comprehend a single word of it; without which one wanders in vain through a dark labyrinth." -- Galileo
My first exposure to mechanical theorem proving was at the University of Texas in 1983, when Bob Boyer and J Moore showed me their prover, Nqthm. At their suggestion (which came with a conditional offer of a master's degree), I used Nqthm to check a formal proof of Wilson's Theorem, in response to a 1961 challenge by John McCarthy, and proceeded to develop an Nqthm library of elementary number theory, including
The primary significance of this last result, of course, was that it finally put to rest any suspicion that the 197 previously published proofs of this theorem were all flawed. By the way, it also put me in some good company.
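Wilson's Theorem, by the way, states that n > 1 is prime exactly when (n-1)! is congruent to -1 modulo n, which makes it easy to check numerically. Here is a minimal sketch (the function name is my own invention, not anything from the Nqthm or ACL2 libraries):

```python
def wilson_is_prime(n):
    # Wilson's Theorem: n > 1 is prime iff (n-1)! ≡ -1 (mod n).
    # Accumulate the factorial modulo n to keep the numbers small.
    if n < 2:
        return False
    fact = 1
    for k in range(2, n):
        fact = (fact * k) % n
    return fact == n - 1

print([n for n in range(2, 30) if wilson_is_prime(n)])
# [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```

As a primality test this is hopelessly inefficient, of course; the interest of the theorem here was as a target for formal proof, not as an algorithm.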
More recently, I have repeated these proofs with the ACL2 prover. Generally, it is difficult to attach any practical significance to work of this sort. However, Piergiorgio Bertoli showed me a real application for
From UT, I went to the Artificial Intelligence Program of Microelectronics and Computer Technology Corp. (MCC), where I developed
This was largely a waste of time, although it did introduce me to logic programming and eventually led to
I later joined MCC's Formal Methods Project, where I got back to proving theorems and did some work on concurrent program verification:
In 1991, I went to Computational Logic, Inc. (CLI) to work on fault-tolerant hardware verification. This led to an analysis of the semantics of VHDL:
The application of interest was an asynchronous system of communicating processors:
When CLI folded in 1995, I went to Advanced Micro Devices, Inc. (AMD), where I worked primarily on floating-point design verification. Here are some papers on our register-transfer logic (RTL) verification methodology:
One aspect of this methodology is the translation of RTL to ACL2, which led me to an investigation of the semantics of Verilog and, in particular, an analysis of IEEE Standard 1364-2001:
Here are some applications:
Here is my rant on the accepted standard of correctness for floating-point designs:
An outgrowth of my work is an evolving ACL2 library of floating-point arithmetic, which has been made publicly available through the ACL2 Web site in the hope that it will be useful to others who are interested in floating-point verification. The following (long) paper is both an exposition of the underlying theory and a user's manual for the library:
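To give a flavor of the bit-level facts such a library must formalize, here is a small sketch (in Python, with helper names of my own choosing) that decodes an IEEE-754 double into its sign, exponent, and fraction fields and reconstructs the value in the normal case:

```python
import struct

def decode_double(x):
    # Unpack the IEEE-754 double-precision encoding of x into its
    # (sign, biased exponent, 52-bit fraction) fields.
    bits = struct.unpack('<Q', struct.pack('<d', x))[0]
    sign = bits >> 63
    exp  = (bits >> 52) & 0x7FF
    frac = bits & ((1 << 52) - 1)
    return sign, exp, frac

def reconstruct(sign, exp, frac):
    # Rebuild a normal double from its fields:
    # (-1)^sign * (1 + frac/2^52) * 2^(exp - 1023).
    return (-1.0) ** sign * (1 + frac / 2**52) * 2.0 ** (exp - 1023)

s, e, f = decode_double(3.14)
assert reconstruct(s, e, f) == 3.14
```

The library's theorems are of this general character, relating bit-vector encodings to the rational values they denote, but stated and proved formally in ACL2 rather than tested on examples.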
At CAV 2007, I gave an invited talk describing my experience at AMD:
I also worked at AMD on a formal model of the x86 instruction set architecture. Here is a note on
In 2012, I joined Intel, where I've continued my work on floating-point verification. John O'Leary and I have developed a formal functional programming language for the purpose of modeling arithmetic algorithms to be implemented in RTL:
One application of MASC (in progress) is a formal proof of correctness of a hardware implementation of the elliptic-curve Diffie-Hellman key exchange function known as Curve25519. The first step in this project was a formalization of
The next step was
A by-product of the latter was an analysis of
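The correctness property at the heart of any Diffie-Hellman exchange is that the two parties arrive at the same shared secret. A toy sketch over an ordinary modular-exponentiation group illustrates the point (the real design works in the elliptic-curve group of Curve25519; the modulus and base below are purely illustrative):

```python
import secrets

# Toy Diffie-Hellman over the multiplicative group mod a small prime.
p = 2**13 - 1                      # a small Mersenne prime (illustrative only)
g = 3                              # a fixed public base

a = secrets.randbelow(p - 2) + 1   # Alice's private exponent
b = secrets.randbelow(p - 2) + 1   # Bob's private exponent

A = pow(g, a, p)                   # Alice's public value
B = pow(g, b, p)                   # Bob's public value

# Each party combines its own secret with the other's public value;
# both obtain g^(a*b) mod p.
assert pow(B, a, p) == pow(A, b, p)
```

The formal proof establishes the analogous identity for the elliptic-curve group operations as actually computed by the hardware.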
My Erdős-Bacon number, as documented here, is 5.