$\begingroup$

It is known that many logic problems (e.g. the satisfiability problems of several modal logics) are undecidable. There are also many undecidable problems in the theory of algorithms, e.g. in combinatorial optimization. Yet in practice, heuristics and approximation algorithms work well for such problems.

So one can expect that approximate algorithms would be suitable for logic problems as well. However, the only research trend along these lines I have managed to find is the max-SAT problem, and its development was most active in the nineties.

Are there some other active research trends, workshops, keywords, good references for the use and development of approximate methods for modal logics, logic programming and so on?

If automated reasoning is expected to gain prominence in future applications of computer science, then one will have to go beyond the undecidability constraints of logics, and approximate methods or heuristics seem a natural path to follow, don't they?

$\endgroup$
  • $\begingroup$ "The only research trend along these lines I have managed to find is the max-SAT problem and its development was active in the nineties." Actually, MAXSAT solvers are improving significantly these days: maxsat.udl.cat/12/solvers/index.html $\endgroup$ Commented Aug 28, 2013 at 4:07
  • $\begingroup$ After some study, I am now inclined to change my mind. Finite model theory seems the most promising field for AI and applied logic. Logics based on infinite model theory may be aesthetically nice, but they lack two important connections with reality: 1) practical applications are always restricted by bounded resources (e.g. the list of variables should be bounded); 2) the physical world is necessarily bounded and more likely discrete as well (e.g. a fundamental length and so on). So now I don't see the use of infinite model theories. THEY are the approximations. $\endgroup$
    – TomR
    Commented Nov 24, 2013 at 16:26
  • $\begingroup$ Another trend is "connection science" or neuro-symbolic integration, where logic is used for stating the problem, providing the input, and reading the output of the computation, but the computation itself is performed by a neural network. There is some debate about how powerful NNs can be (e.g. some suggest they can break the Turing limit only when real numbers are used as weights, though even that is debatable; it is an open question whether real numbers exist in nature at all), but it is clear that there are prospects for using heuristic methods in logic, and integration is one way. $\endgroup$
    – TomR
    Commented Dec 22, 2013 at 20:02

1 Answer

$\begingroup$

The motivation you state for dealing with undecidability applies to decidable but hard problems as well. If you have a problem that is NP-hard or PSPACE-hard, you will typically have to use some form of approximation (in the broad sense of the term) to find a solution.

It is useful to distinguish between different notions of approximation.

  • Numeric approximation: applies when you have a metric on the solution space and you can measure distance from the solution. This is the case for problems like #SAT, where you count solutions and can design an approximation algorithm that returns a value within $\varepsilon$ of the solution.
  • Probabilistic approximation: when you use a randomized algorithm, you give up certainty about the solution you will obtain on any particular run. You can move between randomness in the algorithm and randomness in the solution space by considering a probability distribution over solutions. The approximation here is that with some probability greater than $\delta$, you will obtain a correct solution.

There are $(\varepsilon,\delta)$-algorithms for problems like ALLSAT which combine the two perspectives above. In the specific case of undecidable problems, or problems where there is no obvious metric and we do not want to use a randomized algorithm, there is a question of what notion of approximation to use. This problem arises specifically in the analysis of programs in optimizing compilers and in program verification. Determining complex properties of programs is either undecidable or computationally expensive, so some form of approximation is required.
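To make the numeric and probabilistic notions concrete, here is a minimal Monte Carlo sketch (my own illustration, not from any of the references here; I use the common convention of additive error $\varepsilon$ with success probability at least $1-\delta$, and a Hoeffding-style sample bound) for estimating the fraction of satisfying assignments of a Boolean formula:

```python
import math
import random

def estimate_sat_fraction(formula, num_vars, eps, delta, rng=random):
    """Monte Carlo estimate of the fraction of satisfying assignments.

    By the Hoeffding bound, n >= ln(2/delta) / (2 * eps**2) samples
    suffice for the estimate to be within eps of the true fraction
    with probability at least 1 - delta.
    """
    n = math.ceil(math.log(2 / delta) / (2 * eps ** 2))
    hits = 0
    for _ in range(n):
        # Sample a uniformly random truth assignment.
        assignment = [rng.random() < 0.5 for _ in range(num_vars)]
        if formula(assignment):
            hits += 1
    return hits / n

# (x0 or x1): 3 of the 4 assignments satisfy it, so the true fraction is 0.75.
frac = estimate_sat_fraction(lambda a: a[0] or a[1], 2, eps=0.05, delta=0.01)
```

The point of the sketch is only the shape of the guarantee: the answer is wrong on some runs, but the probability of a large error is bounded in advance.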

Here is an example of a different notion of approximation. Suppose you perform a computation like multiplying two large numbers and want to check whether the multiplication was correct. There are many heuristic techniques used in practice to check correctness without repeating the calculation. You can check that the signs were multiplied to give the right sign. You can check whether the numbers have the right parity (even/odd properties). You can use a more sophisticated check like casting out nines. All these techniques share a common property: they can tell you that you made a mistake, but they cannot guarantee that you got the right answer. This can be viewed as a logical approximation, because you may be able to prove that the original calculation is wrong, but you may not be able to prove that it is correct.
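These one-sided checks are simple enough to state as code. A minimal sketch (the function name and example numbers are my own, not from any reference):

```python
def may_be_correct(a, b, claimed):
    """One-sided checks for the claim a * b == claimed.

    Returns False only if some check refutes the claim; returning True
    does NOT guarantee the product is correct (a logical approximation).
    """
    # Sign check: the product's sign is determined by the factors' signs.
    sign = lambda n: (n > 0) - (n < 0)
    if sign(a) * sign(b) != sign(claimed):
        return False
    # Parity check: the product is even iff at least one factor is even.
    if (a % 2) * (b % 2) != claimed % 2:
        return False
    # Casting out nines: digit sums (i.e. residues mod 9) must agree.
    if (abs(a) % 9) * (abs(b) % 9) % 9 != abs(claimed) % 9:
        return False
    return True

# 1234 * 5678 == 7006652, so the correct claim passes:
may_be_correct(1234, 5678, 7006652)   # True
# An off-by-one claim fails the parity check:
may_be_correct(1234, 5678, 7006651)   # False
# A claim off by 18 preserves sign, parity, and the residue mod 9,
# so it passes every check despite being wrong:
may_be_correct(1234, 5678, 7006670)   # True
```

The last call is the important one: a wrong answer slips through, which is exactly the one-sidedness described above.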

All the checks I have mentioned above are examples of a technique called abstract interpretation. Abstract interpretation makes completely rigorous a notion of logical approximation distinct from numeric and probabilistic approximations. The problem I described with analysis of a single calculation extends to the more complex case of analysis of a program. The literature on abstract interpretation has developed techniques and frameworks for approximate, logical reasoning about programs, and more recently about logics. The following references might be useful.
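To give the flavour of an abstract interpretation in code, here is a toy parity domain (this sketch is mine and deliberately minimal; the constant names and helper functions are not from the literature): concrete integers are abstracted to their parity, and arithmetic is re-executed soundly on the abstract values.

```python
# Abstract parity domain, a four-element lattice: BOT < {EVEN, ODD} < TOP.
BOT, EVEN, ODD, TOP = "bot", "even", "odd", "top"

def alpha(n):
    """Abstraction: map a concrete integer to its parity."""
    return EVEN if n % 2 == 0 else ODD

def abs_add(x, y):
    """Abstract addition: a sound over-approximation of + on parities."""
    if BOT in (x, y):
        return BOT
    if TOP in (x, y):
        return TOP
    return EVEN if x == y else ODD

def abs_mul(x, y):
    """Abstract multiplication on parities."""
    if BOT in (x, y):
        return BOT
    if EVEN in (x, y):
        return EVEN          # even * anything is even, even if the other
                             # operand is unknown (TOP)
    if x == ODD and y == ODD:
        return ODD
    return TOP

# Soundness on samples: the abstract result covers the concrete one.
for a in range(-3, 4):
    for b in range(-3, 4):
        assert abs_add(alpha(a), alpha(b)) == alpha(a + b)
        assert abs_mul(alpha(a), alpha(b)) == alpha(a * b)
```

The parity check on a multiplication from the previous example is exactly this domain in action: compute `abs_mul(alpha(a), alpha(b))` and compare it with `alpha(claimed)`.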

  1. Abstract Interpretation in a Nutshell by Patrick Cousot, which is a simple overview.
  2. Overview of Abstraction by Patrick Cousot, as part of his course. There is a very nice example of abstraction for determining properties of a bouquet of flowers. The bouquet analogy includes fixed points and can be made completely mathematically precise.
  3. Course on Abstract Interpretation by Patrick Cousot, if you want all the depth and details.
  4. Abstract interpretation and application to logic programs, Patrick Cousot and Radhia Cousot, 1992. Applies to logic programs, as per your request. The initial section also formalises the casting out nines procedure as an abstract interpretation.

All of this has typically been applied to reasoning about computer programs. There has been fairly recent work applying ideas from abstract interpretation to decision procedures for logics. The focus has not been modal logic but satisfiability in propositional logic and quantifier-free first-order theories. (Since I have worked in this area, one of the papers below is mine.)

  1. A Generalization of Stålmarck's Method by Aditya Thakur and Thomas Reps, 2012. Generalises Stålmarck's method to problems in program analysis.
  2. The reduced product of abstract domains and the combination of decision procedures, Patrick Cousot, Radhia Cousot, and Laurent Mauborgne, 2011. This paper studies the Nelson-Oppen technique for combining decision procedures and shows that it can also be used for incomplete combinations, which is particularly interesting if you have undecidable problems.
  3. Satisfiability Solvers are Static Analysers, my paper with Leopold Haller and Daniel Kroening, 2012. Applies the lattice-based approximation view to characterise existing solvers. You could also look at my slides on the topic, instead.

Now none of the papers above answers your specific question about attacking satisfiability problems which are undecidable. What these papers do is take an approximation-oriented view of logical problems that is not numeric or probabilistic. This view has been applied extensively to reason about programs and I believe it addresses exactly what you are asking.

To apply it to modal logic, I would suggest starting from the algebraic semantics of Jónsson and Tarski, or the later semantics of Lemmon and Scott. This is because abstract interpretation is formulated in terms of lattices and monotone functions, so Boolean algebras with operators are a convenient semantics to work with. If you want to start with Kripke frames, you can apply Jónsson and Tarski's duality theorem (which some may call Stone duality) and derive the algebraic representation. Thereafter, you can apply the theorems of abstract interpretation for logical approximation.
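Since abstract interpretation is formulated in terms of lattices and monotone functions, its core fixed-point computation is small enough to sketch. Here is a minimal Kleene iteration (my own illustration, assuming a finite powerset lattice so the ascending chain is guaranteed to terminate):

```python
def least_fixpoint(f, bottom, leq):
    """Kleene iteration: bottom, f(bottom), f(f(bottom)), ... until stable.

    Assumes f is monotone w.r.t. leq and that the ascending chain is
    finite; iteration then terminates at a fixed point of f, the least
    one above the starting element.
    """
    x = bottom
    while True:
        y = f(x)
        if leq(y, x) and leq(x, y):  # y equals x in the lattice order
            return x
        x = y

# Toy example: reachability in {1..5} as a least fixed point on the
# powerset lattice ordered by subset inclusion.
def step(s):
    return s | {1} | {n + 1 for n in s if n < 5}

reachable = least_fixpoint(step, set(), lambda a, b: a <= b)
# reachable == {1, 2, 3, 4, 5}
```

In an analysis of programs or of modal formulas, `step` would be the abstract transformer over the chosen domain; the iteration scheme itself does not change.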

$\endgroup$
