Conference Paper

How Good Can a Resolution Based SAT-solver Be?


Abstract

We introduce a parameterized class M(p) of unsatisfiable formulas that specify equivalence checking of Boolean circuits. If the parameter p is fixed, a formula of M(p) can be solved in general resolution in a linear number of resolutions. On the other hand, even though there is a polynomial time deterministic algorithm that solves formulas from M(p), the order of the polynomial is a monotone increasing function of parameter p. We give reasons why resolution based SAT-algorithms should have poor performance on this very “easy” class of formulas and provide experimental evidence that this is indeed the case.
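For context, equivalence checking is typically reduced to SAT by building a "miter": the two circuits share their inputs, their outputs are XORed, and the XOR output is asserted true, so the resulting CNF is unsatisfiable exactly when the circuits agree on every input. The sketch below is a generic Tseitin/miter encoding for a toy pair of circuits, not the authors' class M(p):

```python
from itertools import product

# Tseitin clauses: each helper returns CNF clauses encoding out <-> gate(inputs).
def AND(out, a, b):  return [(-out, a), (-out, b), (out, -a, -b)]
def OR(out, a, b):   return [(out, -a), (out, -b), (-out, a, b)]
def NOT(out, a):     return [(-out, -a), (out, a)]
def XOR(out, a, b):  return [(-out, a, b), (-out, -a, -b), (out, -a, b), (out, a, -b)]

# N1: g1 = a AND b.   N2: g2 = NOT(NOT a OR NOT b)  (De Morgan, so N1 == N2).
# Variables: a=1, b=2, g1=3, na=4, nb=5, o=6, g2=7, miter output m=8.
clauses = (AND(3, 1, 2) + NOT(4, 1) + NOT(5, 2) + OR(6, 4, 5)
           + NOT(7, 6) + XOR(8, 3, 7) + [(8,)])   # unit clause: outputs differ

def satisfiable(clauses, n_vars):
    """Brute-force check over all 2^n_vars assignments (fine for a toy)."""
    for bits in product((False, True), repeat=n_vars):
        if all(any(bits[abs(l) - 1] ^ (l < 0) for l in c) for c in clauses):
            return True
    return False

print(satisfiable(clauses, 8))   # False: the miter is unsatisfiable, N1 == N2
```

Dropping the final unit clause leaves an ordinary (satisfiable) Tseitin encoding; it is the asserted XOR that turns equivalence into unsatisfiability.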


... In this report we continue developing the theory of equivalence checking and logic synthesis of circuits with a common specification (CS) started in [5][6][7]. A CS S of combinational circuits N 1 and N 2 is just a circuit of multi-valued gates (further referred to as blocks) such that N 1 and N 2 are different implementations of S. Figure 1 gives an example. ...
... (The most efficient equivalence checkers heavily rely on the existence of functionally equivalent internal points.) Fortunately, in [5][6] it was shown that if a CS S of N 1 and N 2 is known, there is an efficient procedure for EC of N 1 and N 2 . The most advanced version of this procedure was given in [7]. ...
... Is there an efficient procedure for EC of N 1 , N 2 if S is unknown (i.e., we do not know the partitions Spec(N 1 ) and Spec(N 2 ) representing S)? In [5][6] it was conjectured that in that case EC of N 1 , N 2 is hard for any deterministic algorithm. The new (and equivalent) definition of CS given in this report allows one to get a better perspective on the problems one has to solve when checking N 1 , N 2 for equivalence. ...
Conference Paper
In this report we develop a theory of equivalence checking and logic synthesis of circuits with a common specification (CS). We show that two combinational circuits N1, N2 have a CS iff they can be partitioned into subcircuits that are connected "in the same way" and are toggle equivalent. This fact allows one to represent a specification of a circuit implicitly as a partitioning into subcircuits. We give an efficient procedure for checking if circuits N1, N2 have the same predefined specification. As a "by-product", this procedure checks N1 and N2 for functional equivalence. We show how, given a circuit N1 with a predefined specification, one can efficiently build a circuit N2 satisfying the same specification. We give experimental evidence that equivalence checking of N1, N2 is hard if their CS is unknown. We also show experimentally that one can eliminate logic redundancy of circuit N1 by building a circuit N2 that is toggle equivalent to N1.
... On the other hand, an HLLS procedure performs "non-local" synthesis transformations, so verification of the correctness of such transformations may pose a problem. This problem was addressed in [3], [4] where it was shown that if two Boolean circuits have a common specification (CS), their equivalence checking is "easy". Informally, circuits N 1 and N 2 have a CS if they can be considered as two different implementations of a circuit S of multi-valued gates further referred to as blocks. ...
... (S is called a CS of N 1 and N 2 ). In [3], [4] it was proven that given a CS S of circuits N 1 and N 2 , there is an equivalence checking procedure whose complexity is linear in the number of blocks of S and exponential in the granularity of S (the "size" of the largest block of S). An example of circuits N 1 , N 2 having a common specification of three blocks is shown in Fig. 1. ...
Article
In this paper we develop a method of logic synthesis that preserves high-level structure of the circuit to be synthesized. This method is based on the fact that two combinational circuits implementing the same "high-level" specification can be efficiently checked for equivalence. Hence, logic transformations preserving a predefined specification can be made efficiently. We introduce the notion of toggle equivalence of Boolean functions and show that toggle equivalence can be used for making gate level transformations that preserve a predefined specification. We describe a practical procedure for checking toggle equivalence of two Boolean circuits and give experimental data about its performance.
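Toggle equivalence, as used in the abstract above, relates functions that "toggle together": f and g are toggle equivalent if, for every pair of input vectors x, y, f(x) ≠ f(y) exactly when g(x) ≠ g(y). A brute-force sketch of this check (my reading of the definition; the practical procedure described in the paper avoids this exponential enumeration):

```python
from itertools import product

def toggle_equivalent(f, g, n):
    """f, g: functions from n-bit tuples to output tuples.
    Toggle equivalent iff for every input pair x, y,
    f changes value (f(x) != f(y)) exactly when g does."""
    inputs = list(product((0, 1), repeat=n))
    return all((f(x) != f(y)) == (g(x) != g(y))
               for x in inputs for y in inputs)

# Example: g is f with its output inverted, so it toggles at the same pairs.
f = lambda x: (x[0] & x[1],)
g = lambda x: (1 - (x[0] & x[1]),)
print(toggle_equivalent(f, g, 2))   # True
```

Note that f and g here are not functionally equivalent, yet they are toggle equivalent; this looseness is what gives the synthesis transformations room to move while still preserving the specification.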
... It is observed that equivalence checking becomes easier when the Boolean circuits are of similar structure [4]. The major problem arises when verifying digital circuits with asymmetric structures. ...
Preprint
Full-text available
The use of a Boolean Satisfiability (SAT) solver for hardware verification incurs exponential run-time in several instances. In this work we propose an efficient quantum SAT (qSAT) solver for equivalence checking of Boolean circuits employing Grover's algorithm. The Exclusive-Sum-of-Products based generation of the Conjunctive Normal Form equivalent clauses demands fewer qubits and minimizes the gates and depth of the quantum circuit interpretation. The consideration of reference circuits for verification, affecting Grover's iterations and quantum resources, is also presented as a case study. Experimental results are presented assessing the benefits of the proposed verification approach using the open-source Qiskit platform and an IBM quantum computer.
... Correctness verification is a well-studied field in the classical domain [26,35,34] but unfortunately not all methods directly carry over to quantum computing because the state of n quantum bits is generally represented as 2 n complex values [40]. Due to the reversibility of quantum circuits, verifying equivalence of circuits C 1 , C 2 is reducible to checking if the circuit C 1 · C −1 2 , i.e., C 1 followed by the inverse of C 2 , is equivalent to the identity circuit, i.e., a circuit that implements an operator that does not modify the inputs. ...
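The reduction in the excerpt above is easy to state concretely for small circuits: build the unitary of C 1 followed by the inverse (conjugate transpose) of C 2 and test whether the product is the identity up to a global phase. A dense-matrix sketch for single-qubit circuits (the gate constants and function names are illustrative, not from the cited work):

```python
import numpy as np

# Example single-qubit gates.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

def circuit_unitary(gates):
    """Multiply gate matrices in application order (first gate acts first)."""
    U = np.eye(2)
    for g in gates:
        U = g @ U
    return U

def equivalent(c1, c2, tol=1e-9):
    """C1 ~ C2 iff C1 composed with the inverse of C2 is the identity
    up to a global phase factor."""
    M = circuit_unitary(c1) @ circuit_unitary(c2).conj().T
    phase = M[0, 0]
    return (abs(abs(phase) - 1) < tol
            and np.allclose(M, phase * np.eye(len(M)), atol=tol))

# H X H = Z is a standard identity, so these two circuits are equivalent:
print(equivalent([H, X, H], [Z]))   # True
```

The dense-matrix form is exponential in qubit count, which is exactly why the specialized (Clifford, SAT-based, path-integral) methods discussed below matter at scale.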
Preprint
Checking whether two quantum circuits are equivalent is important for the design and optimization of quantum-computer applications with real-world devices. We consider quantum circuits consisting of Clifford gates, a practically relevant subset of all quantum operations which is large enough to exhibit quantum features such as entanglement and forms the basis of, for example, quantum-error correction and many quantum-network applications. We present a deterministic algorithm that is based on a folklore mathematical result and demonstrate that it is capable of outperforming the previously considered state-of-the-art method. In particular, given two Clifford circuits as sequences of single- and two-qubit Clifford gates, the algorithm checks their equivalence in O(n · m) time in the number of qubits n and number of elementary Clifford gates m. Using the performant Stim simulator as backend, our implementation checks equivalence of quantum circuits with 1,000 qubits (and a circuit depth of 10,000 gates) in ~22 seconds and circuits with 100,000 qubits (depth 10) in ~15 minutes, outperforming the existing SAT-based and path-integral-based approaches by orders of magnitude. This approach shows that the correctness of application-relevant subsets of quantum operations can be verified up to large circuits in practice.
... Proving the equivalence of two combinational circuits is a common task. Most current approaches address it by proving the extensional equivalence of the circuits, i.e., by checking the equivalence of an exponential number of input-output pairs either directly or via an encoding to SAT [8,6,15]. We instead propose to apply our recent work [1] on type isomorphisms and their equivalences to this problem domain. ...
Article
The use of a Boolean Satisfiability (SAT) solver for hardware verification incurs exponential run-time in several instances. In this work we propose an efficient quantum SAT (qSAT) solver for equivalence checking of Boolean circuits employing Grover's algorithm. The Exclusive-Sum-of-Products (ESOP)-based generation of the Conjunctive Normal Form (CNF) equivalent clauses demands fewer qubits and minimizes the gates and depth of the quantum circuit interpretation. The consideration of reference circuits for verification, affecting Grover's iterations and quantum resources, is also presented as a case study. Experimental results are presented assessing the benefits of the proposed verification approach using the open-source Qiskit platform and an IBM quantum computer.
Conference Paper
Checking whether two quantum circuits are equivalent is important for the design and optimization of quantum-computer applications with real-world devices. We consider quantum circuits consisting of Clifford gates, a practically relevant subset of all quantum operations which is large enough to exhibit quantum features such as entanglement and forms the basis of, for example, quantum-error correction and many quantum-network applications. We present a deterministic algorithm that is based on a folklore mathematical result and demonstrate that it is capable of outperforming the previously considered state-of-the-art method. In particular, given two Clifford circuits as sequences of single- and two-qubit Clifford gates, the algorithm checks their equivalence in O(n · m) time in the number of qubits n and number of elementary Clifford gates m. Using the performant Stim simulator as backend, our implementation checks equivalence of quantum circuits with 1,000 qubits (and a circuit depth of 10,000 gates) in ~22 s and circuits with 100,000 qubits (depth 10) in ~15 min, outperforming the existing SAT-based and path-integral-based approaches by orders of magnitude. This approach shows that the correctness of application-relevant subsets of quantum operations can be verified up to large circuits in practice.
Chapter
This chapter discusses advances in SAT algorithm design, including the use of SAT algorithms as theory drivers, classic implementations of SAT solvers, and some theoretical aspects of SAT. Some applications to which SAT solvers have been successfully applied are also presented. The intention is to assist someone interested in applying SAT technology in solving some stubborn class of combinatorial problems.
Chapter
This chapter consists of two parts. In the first part we show that resolution based SAT-solvers cannot be scalable on real-life formulas unless some extra information about formula structure is known. In the second part we introduce a new way of satisfiability testing that may be used for designing more efficient and “intelligent” SAT-algorithms that will be able to take into account formula structure.
Conference Paper
We consider the problem of equivalence checking of circuits N 1,N 2 with a common specification (CS). We show that circuits N 1 and N 2 have a CS iff they can be partitioned into toggle equivalent subcircuits that are connected “in the same way”. Based on this result, we formulate a procedure for checking equivalence of circuits N 1 and N 2 with specifications S 1 and S 2. This procedure not only checks equivalence of N 1 and N 2 but also verifies that S 1 and S 2 are identical. The complexity of this procedure is linear in specification size and exponential in the value of a specification parameter. Previously we considered specifications parameterized by the size of the largest subcircuit (specification granularity). In this paper we give a more general parameterization based on specification “width”.
Conference Paper
Full-text available
We describe a SAT-solver, BerkMin, that inherits such features of GRASP, SATO, and Chaff as clause recording, fast BCP, restarts, and conflict-clause "aging". At the same time, BerkMin introduces a new decision-making procedure and a new method of clause database management. We experimentally compare BerkMin with Chaff, the leader among SAT-solvers used in the EDA domain. Experiments show that our solver is more robust than Chaff. BerkMin solved all the instances we used in experiments, including very large CNFs from a microprocessor verification benchmark suite. On the other hand, Chaff was not able to complete some instances even with a timeout limit of 16 hours.
Article
The question of the minimum complexity of derivation of a given formula in classical propositional calculus is considered in this article, and it is proved that estimates of complexity may vary considerably among the various forms of propositional calculus. The forms of propositional calculus used in the present article are somewhat unusual, but the results obtained for them can, in principle, be extended to the usual forms of propositional calculus.
Article
We prove that, for infinitely many disjunctive normal form propositional calculus tautologies ξ, the length of the shortest resolution proof of ξ cannot be bounded by any polynomial of the length of ξ. The tautologies we use were introduced by Cook and Reckhow (1979) and encode the pigeonhole principle. Extended resolution can furnish polynomial length proofs of these formulas.
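The pigeonhole formulas referred to here are simple to generate: a variable x(i, j) says pigeon i sits in hole j; one clause per pigeon says it sits somewhere, and one clause per hole and pigeon pair forbids sharing. A sketch using DIMACS-style signed-integer clauses:

```python
from itertools import combinations

def pigeonhole_cnf(n):
    """CNF for PHP_n: n+1 pigeons into n holes (unsatisfiable for n >= 1).
    var(i, j) is the DIMACS-style index of 'pigeon i sits in hole j'."""
    var = lambda i, j: i * n + j + 1                  # 0-based i, j
    # Each pigeon is placed in some hole.
    clauses = [[var(i, j) for j in range(n)] for i in range(n + 1)]
    # No hole holds two pigeons.
    for j in range(n):
        for i, k in combinations(range(n + 1), 2):
            clauses.append([-var(i, j), -var(k, j)])
    return clauses

cnf = pigeonhole_cnf(3)
print(len(cnf))   # 4 placement clauses + 3 * C(4,2) = 22 clauses
```

Despite their tiny size, these formulas require exponentially long resolution refutations, which is what makes them the standard hard benchmark for resolution-based solvers.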
Conference Paper
A description is given of SIS, an interactive tool for synthesis and optimization of sequential circuits. Given a state transition table or a logic-level description of a sequential circuit, SIS produces an optimized net-list in the target technology while preserving the sequential input-output behavior. Many different programs and algorithms have been integrated into SIS, allowing the user to choose among a variety of techniques at each stage of the process. It is built on top of MISII and includes all (combinational) optimization techniques therein as well as many enhancements. SIS serves as both a framework within which various algorithms can be tested and compared and as a tool for automatic synthesis and optimization of sequential circuits
Conference Paper
The automatizability of Resolution and tree-like Resolution was analyzed in the context of propositional proof systems. It was shown that neither Resolution nor tree-like Resolution is automatizable unless the class W[P] from the hierarchy of parameterized problems is fixed-parameter tractable by randomized algorithms with one-sided error. Both systems possess efficient interpolation, so their non-automatizability could not be proved. The usefulness of proof-search heuristics and automated theorem proving was also analyzed.
Conference Paper
Boolean satisfiability is probably the most studied of the combinatorial optimization/search problems. Significant effort has been devoted to trying to provide practical solutions to this problem for problem instances encountered in a range of applications in electronic design automation (EDA), as well as in artificial intelligence (AI). This study has culminated in the development of several SAT packages, both proprietary and in the public domain (e.g. GRASP, SATO) which find significant use in both research and industry. Most existing complete solvers are variants of the Davis-Putnam (DP) search algorithm. In this paper we describe the development of a new complete solver, Chaff, which achieves significant performance gains through careful engineering of all aspects of the search, especially a particularly efficient implementation of Boolean constraint propagation (BCP) and a novel low-overhead decision strategy. Chaff has been able to obtain one to two orders of magnitude performance improvement on difficult SAT benchmarks in comparison with other solvers (DP or otherwise), including GRASP and SATO.
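The two-watched-literals scheme at the heart of Chaff's BCP keeps two non-false literals under watch in every clause, so a clause is revisited only when one of its watched literals becomes false. A compact sketch of the idea (nothing like Chaff's engineered implementation, and without decision levels or backtracking):

```python
from collections import defaultdict, deque

def value(lit, assign):
    v = assign.get(abs(lit))
    return None if v is None else (v if lit > 0 else not v)

def bcp(clauses, assign):
    """Unit propagation with two watched literals.
    clauses: lists of non-zero ints; assign: dict var -> bool (mutated).
    Returns True when propagation completes, False on conflict."""
    watched = {ci: c[:2] for ci, c in enumerate(clauses) if len(c) >= 2}
    watchers = defaultdict(list)              # literal -> clauses watching it
    for ci, pair in watched.items():
        for l in pair:
            watchers[l].append(ci)
    queue = deque(c[0] for c in clauses if len(c) == 1)
    while queue:
        lit = queue.popleft()
        if value(lit, assign) is False:
            return False                      # implied literal contradicts assignment
        if value(lit, assign) is True:
            continue
        assign[abs(lit)] = lit > 0
        neg = -lit                            # this literal just became false
        for ci in list(watchers[neg]):
            c, (w1, w2) = clauses[ci], watched[ci]
            other = w2 if w1 == neg else w1
            if value(other, assign) is True:
                continue                      # clause already satisfied
            repl = next((l for l in c if l != other and l != neg
                         and value(l, assign) is not False), None)
            if repl is not None:              # move the watch to a non-false literal
                watched[ci] = [other, repl]
                watchers[neg].remove(ci)
                watchers[repl].append(ci)
            elif value(other, assign) is None:
                queue.append(other)           # clause became unit: imply `other`
            else:
                return False                  # both watches false: conflict
    return True

# (a or b), (not a), (not b or c): the unit clause forces b, then c.
assign = {}
ok = bcp([[1, 2], [-1], [-2, 3]], assign)
print(ok, assign)   # True {1: False, 2: True, 3: True}
```

The payoff is that nothing needs to be updated when a variable is *un*assigned on backtracking, which is a large part of why this scheme made Chaff's BCP so fast.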
Conference Paper
The problem of checking equality of Boolean functions can be solved successfully using existing techniques for only a limited range of examples. We extend the range by using a test generator and the divide-and-conquer paradigm.
Article
This paper introduces GRASP (Generic seaRch Algorithm for the Satisfiability Problem), a new search algorithm for Propositional Satisfiability (SAT). GRASP incorporates several search-pruning techniques that proved to be quite powerful on a wide variety of SAT problems. Some of these techniques are specific to SAT, whereas others are similar in spirit to approaches in other fields of Artificial Intelligence. GRASP is premised on the inevitability of conflicts during the search and its most distinguishing feature is the augmentation of basic backtracking search with a powerful conflict analysis procedure. Analyzing conflicts to determine their causes enables GRASP to backtrack nonchronologically to earlier levels in the search tree, potentially pruning large portions of the search space. In addition, by “recording” the causes of conflicts, GRASP can recognize and preempt the occurrence of similar conflicts later on in the search. Finally, straightforward bookkeeping of the causality chains leading up to conflicts allows GRASP to identify assignments that are necessary for a solution to be found. Experimental results obtained from a large number of benchmarks indicate that application of the proposed conflict analysis techniques to SAT algorithms can be extremely effective for a large number of representative classes of SAT instances
Article
We present the best known separation between tree-like and general resolution, improving on the recent exp(n^ε) separation of [BEGJ98].
Article
In this paper we demonstrate that both of these phenomena can occur. In particular, we show that using an even more expensive form of reasoning than that performed by 2clsVER can achieve a better tradeoff between reasoning time and search reduction, and thus significantly better performance. In some cases, we will also see that the net rate of nodes per second can be improved due to the additional simplification being done at the top of the search tree. In the sequel we explain the reasoning that we employ and show some empirical results.
Article
Boolean Satisfiability is probably the most studied of combinatorial optimization/search problems. Significant effort has been devoted to trying to provide practical solutions to this problem for problem instances encountered in a range of applications in Electronic Design Automation (EDA), as well as in Artificial Intelligence (AI). This study has culminated in the development of several SAT packages, both proprietary and in the public domain (e.g. GRASP, SATO) which find significant use in both research and industry. Most existing complete solvers are variants of the Davis-Putnam (DP) search algorithm. In this paper we describe the development of a new complete solver, Chaff, which achieves significant performance gains through careful engineering of all aspects of the search -- especially a particularly efficient implementation of Boolean constraint propagation (BCP) and a novel low overhead decision strategy. Chaff has been able to obtain one to two orders of magnitude performance improvement on difficult SAT benchmarks in comparison with other solvers (DP or otherwise), including GRASP and SATO.
Article
The interpolation method has been one of the main tools for proving lower bounds for propositional proof systems. Loosely speaking, if one can prove that a particular proof system has the feasible interpolation property, then a generic reduction can (usually) be applied to prove lower bounds for the proof system, sometimes assuming a (usually modest) complexity-theoretic assumption. In this paper, we show that this method cannot be used to obtain lower bounds for Frege systems, or even for TC^0-Frege systems. More specifically, we show that unless factoring (of Blum integers) is feasible, neither Frege nor TC^0-Frege has the feasible interpolation property. In order to carry out our argument, we show how to carry out proofs of many elementary axioms/theorems of arithmetic in polynomial-size TC^0-Frege. As a corollary, we obtain that TC^0-Frege, as well as any proof system that polynomially simulates it, is not automatizable (under the assumption that factoring of Blum integers is not feasible).