Caleb Stanford


ProvableP vs ProvableNP


logic complexity-theory math

Introduction and Background

The P versus NP question asks: is it possible to determine whether a Boolean formula is satisfiable, in time polynomial in the length of the formula? Determining whether a Boolean formula is satisfiable is called the SAT problem, and programs which solve the problem are called SAT solvers. SAT solvers are used in many modern programming tools (usually, for automatically solving some system of constraints), so the SAT problem is of great practical importance today. And it is often suggested that a solution to “P versus NP” would resolve whether there exists any efficient algorithm at all for SAT.
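To make the problem concrete, here is a minimal brute-force satisfiability check in Python (a toy sketch of my own, not how real SAT solvers work): it simply tries all 2^n assignments of the n variables, and the question is, in essence, whether this kind of exponential search can always be avoided.

```python
from itertools import product

# A formula in CNF: a list of clauses; each clause is a list of literals,
# where literal +i means variable i and -i means "not variable i".
# Example: (x1 or not x2) and (x2 or x3)  ==>  [[1, -2], [2, 3]]

def brute_force_sat(clauses, num_vars):
    """Return a satisfying assignment (as a dict) or None.

    Tries all 2**num_vars assignments -- exponential time, which is
    exactly the blow-up an efficient SAT algorithm would have to avoid.
    """
    for bits in product([False, True], repeat=num_vars):
        assignment = {i + 1: bits[i] for i in range(num_vars)}
        if all(
            any(assignment[abs(lit)] == (lit > 0) for lit in clause)
            for clause in clauses
        ):
            return assignment
    return None

print(brute_force_sat([[1, -2], [2, 3]], num_vars=3))
# e.g. {1: False, 2: False, 3: True}
```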

There are a number of reasons why this suggestion could be wrong, or at least, a number of ways to argue against it. But here’s a particular thought: what if P = NP (that is, SAT can be solved in polynomial time), but the polynomial-time algorithm is beyond our ability to find – and in particular, beyond our ability to prove correct? What if there exists some magic algorithm that happens to work, but there doesn’t exist any proof that it works? Or perhaps there doesn’t exist any proof that it runs in polynomial time? Such a scenario would be an unsatisfying resolution to P versus NP, in a way: because practically speaking, we generally only trust algorithms that we can prove correct.

The question I’m trying to motivate is: does ProvableP = ProvableNP? That is, what happens if we restrict our attention to algorithms which can be proven correct? Is this question answerable, or is it just as hard as P versus NP?

Summary

It turns out that ProvableP = ProvableNP if and only if P = NP. I’ll define what I mean by ProvableP and ProvableNP. This post is adapted from a question and answer I made on Computer Science StackExchange.

The ProvableP versus ProvableNP question

Fix a logic ℒ that is strong enough to encode statements about Turing machines. By this I mean the same requirements as Gödel’s second incompleteness theorem; but for the purposes of this question, just assume the following:

  • ℒ is consistent (it doesn’t prove False); and

  • ℒ implies the Peano arithmetic axioms of ℕ (PA).

Then define:

  • ProvableP: the set of languages L such that there exist a deterministic Turing machine M and a polynomial p for which ℒ proves “M runs in time p(n) on inputs of length n, and M accepts exactly the elements of L”.

  • ProvableNP: defined the same way, except that M may be a nondeterministic Turing machine.

There’s an issue with this definition: how do we encode languages (of which there are uncountably many) as finite objects in order for the logic to have something to say about them? In particular, how do we encode formulas for ProvableP and ProvableNP? Let us assume that languages are subsets of the natural numbers ℕ, and that any L in ProvableP or ProvableNP must be given by an arbitrary formula in the symbols of ℒ, i.e. it is any definable subset of ℕ.
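With those conventions in place, one way to write the definition out in symbols is the following (this is my own phrasing of the definition above, writing \mathcal{L} for the logic ℒ; ProvableNP is identical except that M ranges over nondeterministic machines):

```latex
\mathrm{ProvableP} \;=\; \Bigl\{\, L \subseteq \mathbb{N} \;\Bigm|\;
  \exists\, \text{deterministic TM } M,\ \exists\, \text{polynomial } p:\;
  \mathcal{L} \vdash \text{``}M \text{ halts within } p(n) \text{ steps on every
  input of length } n \text{ and accepts exactly the elements of } L\text{''}
\,\Bigr\}
```

where L is presented by a defining formula, as just described.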

From the definition we have that ProvableP ⊆ ProvableNP. But does ProvableNP ⊆ ProvableP? How does this relate to the original P vs NP question?

The Answer

Obligatory warning: to my knowledge, this proof has not been rigorously checked by someone else :) But we now argue that ProvableP = ProvableNP if and only if P = NP.

For the forward direction, assume ProvableP = ProvableNP, and consider any language L in NP. Note that L is accepted by a nondeterministic Turing machine M, such that there exists a polynomial p which bounds the longest execution branch of M on an input string of length n. Then let M′ be a nondeterministic Turing machine with the following description: first, count the input length n; second, calculate p(n); and finally, run M for at most p(n) steps nondeterministically. If M doesn’t halt within p(n) steps (this never actually occurs, but M′ doesn’t know that for sure), M′ rejects.
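The step-counting wrapper in this construction is easy to sketch in code. The Python sketch below is purely illustrative: initial_config, step, and is_accepting are hypothetical hooks standing in for the machine M, and p is the polynomial bound; the wrapper explores every branch for at most p(n) steps and rejects anything that has not accepted by then, just as M′ does.

```python
def clocked_nondet_accepts(input_string, initial_config, step, is_accepting, p):
    """Sketch of the clocked machine M': explore every branch of the
    (hypothetical) nondeterministic machine M for at most p(n) steps,
    where n is the input length. Accept iff some branch reaches an
    accepting configuration within the bound; everything else is rejected.
    """
    bound = p(len(input_string))
    # All configurations reachable in exactly i steps, for i = 0, 1, ...
    frontier = [initial_config(input_string)]
    for _ in range(bound + 1):
        if any(is_accepting(config) for config in frontier):
            return True
        # `step(config)` returns the finitely many successor configurations;
        # halted configurations simply have no successors.
        frontier = [succ for config in frontier for succ in step(config)]
    # Step budget exhausted: reject, as M' does on branches that run too long.
    return False
```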

Now observe that ℒ (more specifically, PA) proves that M′ runs in time polynomial in n. Moreover, ℒ can describe “the language of strings accepted by M′” by a formula φ (roughly, “there exists a sequence of configurations such that the sequence is a run of M′ on the input and ends in an accept state”). Then we have that ℒ proves that M′ runs in polynomial time and accepts exactly the language defined by φ. Therefore, L(M′) ∈ ProvableNP. But we know (even if ℒ doesn’t) that L(M′) = L by construction. So L ∈ ProvableNP. But ProvableNP = ProvableP ⊆ P, so L ∈ P. Since L was an arbitrary language in NP, this gives NP ⊆ P, and hence P = NP.
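Spelled out a little more (this is my own rough rendering, with q a polynomial bounding the running time of M′, and with the quantifier over runs coded as usual by a quantifier over sequence numbers in PA), the formula φ looks something like:

```latex
\varphi(x) \;\equiv\; \exists\, c_0, c_1, \ldots, c_k \ \bigl(
    k \le q(|x|)
    \;\wedge\; c_0 \text{ is the initial configuration of } M' \text{ on } x
    \;\wedge\; \textstyle\bigwedge_{i<k} \text{``}c_{i+1} \text{ follows from }
        c_i \text{ by one step of } M'\text{''}
    \;\wedge\; c_k \text{ is an accepting configuration} \bigr)
```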

The backward direction is a similar trick. Assume P = NP and that L ∈ ProvableNP. Then L ∈ NP, so L ∈ P. From here, we know there is some Turing machine M and polynomial p such that M runs in time p(n) and L(M) = L. Let M′ be a deterministic Turing machine which, on an input of length n, first calculates p(n), and then runs M for at most p(n) steps. If M doesn’t halt, M′ returns a default value, say reject.
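As before, the construction is just a step-counting wrapper, now around a deterministic machine and with a default answer instead of a reject branch. The hooks (initial_config, step, is_halted, is_accepting) are hypothetical stand-ins for M, as in the earlier sketch.

```python
def clocked_decide(input_string, initial_config, step, is_halted, is_accepting,
                   p, default=False):
    """Sketch of the deterministic clocked machine M': run the (hypothetical)
    machine M for at most p(n) steps; if it has not halted by then, return a
    default answer (this never actually occurs, but M' cannot know that).
    """
    config = initial_config(input_string)
    for _ in range(p(len(input_string))):
        if is_halted(config):
            return is_accepting(config)
        config = step(config)   # deterministic: exactly one successor
    return default
```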

Then similarly to before, ℒ proves that M′ halts in time p(n) and therefore that M′ decides, in polynomial time, the language described by the corresponding formula φ. It follows that L(M′) ∈ ProvableP, but we know (even if ℒ doesn’t) that L(M′) = L. Thus, L ∈ ProvableP, so ProvableNP ⊆ ProvableP and therefore ProvableP = ProvableNP.
