
An asymptotic lower bound such as exponential hardness is generally thought to imply that a problem is "inherently difficult". Encryption that is "inherently difficult" to break is thought to be secure.

However, an asymptotic lower bound does not rule out the possibility that a huge but finite class of problem instances is easy (e.g., all instances of size less than $10^{1000}$).

Is there any reason to think that cryptography being based on asymptotic lower bounds would confer any particular level of security? Do security experts consider such possibilities, or are they simply ignored?

An example is the use of trap-door functions based on the decomposition of large numbers into their prime factors. This was at one point thought to be inherently difficult (I think exponential hardness was the conjecture), but now many believe there may be a polynomial-time algorithm (as there is for primality testing). No one seems to care very much about the lack of an exponential lower bound.

I believe that other trap-door functions have been proposed that are thought to be NP-hard (see the related question), and some may even have a proven lower bound. My question is more fundamental: does it matter what the asymptotic lower bound is? If not, is the practical security of any cryptographic scheme related to any asymptotic complexity result at all?

  • Welcome! Not quite a duplicate, but very related: this question. In order to improve the question, please give concrete examples where you think the issue is ignored. You don't want to be fighting windmills!
    – Raphael
    Commented Apr 11, 2012 at 20:18

1 Answer


I'll try to give a partial answer, since I'm not fully aware of how this issue is considered by the entire crypto-community (maybe repost on crypto.SE?).

I would say there are two "kinds" of cryptographers: theoretical and practical. I won't try to draw a sharp line between them (every practical cryptographer is also a bit of a theoretician), but for theoretical cryptography this issue doesn't really matter: for any desired security level there will be an instance size that provides it, and that is usually all we care about.
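To make that concrete, here is roughly the shape a standard asymptotic security definition takes (a generic textbook formulation, not tied to any particular scheme): a scheme is secure if every probabilistic polynomial-time (PPT) adversary $A$ has only negligible advantage in the security parameter $n$,

$$\Pr[A \text{ breaks the scheme at parameter } n] \le \varepsilon(n), \qquad \text{where } \varepsilon \text{ is negligible: } \forall c>0\ \exists n_0\ \forall n>n_0,\ \varepsilon(n) < n^{-c}.$$

Note that the guarantee only kicks in "for all sufficiently large $n$"; it says nothing about any particular finite instance size, which is exactly the gap the question points out.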

Practical cryptographers care a lot about the issue you mention. For a given security parameter (say $2^{1024}$), cryptographers try to come up with the most efficient protocols, shaving constants as much as possible. Look, for instance, at NIST's AES and SHA-3 competitions: the algorithms were required to be both secure and efficient. The issue here is that the notion of security is somewhat different from the "theoretical" one, and sometimes is not really asymptotic.
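As an illustration of that non-asymptotic style of reasoning, here is a back-of-the-envelope sketch (my own toy code, not taken from any standard) of how key sizes for factoring-based schemes get compared: plug concrete sizes into the heuristic running time of the General Number Field Sieve, $L_N[1/3,(64/9)^{1/3}]$, and look at the resulting work factors. The absolute numbers should not be trusted (the $o(1)$ term and real-world constants are ignored); only the relative growth is informative.

```python
import math

def gnfs_bits_of_work(modulus_bits: int) -> float:
    """Approximate log2 of the GNFS cost for factoring a modulus of the given
    bit length, using the heuristic L_N[1/3, c] = exp(c * (ln N)^(1/3) * (ln ln N)^(2/3))
    with c = (64/9)^(1/3).  Ignores the o(1) term, so treat results as rough."""
    ln_n = modulus_bits * math.log(2)            # ln(N) for a modulus_bits-bit N
    c = (64 / 9) ** (1 / 3)
    return c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3) / math.log(2)

# This is the flavor of concrete comparison practical cryptographers make when
# choosing parameters -- no asymptotic lower bound is involved anywhere.
for bits in (512, 1024, 2048, 3072):
    print(f"{bits}-bit modulus: roughly 2^{gnfs_bits_of_work(bits):.0f} work")
```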

A concrete example I can think of is the discrete log or the DDH assumption (the security of many cryptographic schemes is based on these assumptions). We assume that for suitable groups $G$, computing the log requires time super-polynomial in $\log|G|$ (brute force takes $O(|G|)$ time, and the best generic algorithms take about $\sqrt{|G|}$). We can't be sure: it might be that $\text{P}=\text{NP}$ and this problem can be solved in time polynomial in $\log |G|$. Cryptographers DO care about the actual time it takes to compute the log. More precisely, they care a lot about special cases for which computing the log is easy. In many papers you will see the sentence "Let $G$ be a group in which DDH is hard". See this survey for families of groups for which DDH is believed to be intractable (unless P=NP).
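To make both halves of that tangible, here is a minimal sketch (standard textbook baby-step giant-step, my own toy code rather than anything from the linked survey) showing that the generic cost of a discrete log is about $\sqrt{|G|}$ group operations, and that in some groups the problem collapses entirely:

```python
from math import isqrt

def bsgs(g, h, p):
    """Baby-step giant-step: find x with g^x = h (mod p), for prime p.
    Uses roughly sqrt(p) group operations -- exponential in the bit length of p,
    yet instantaneous for small p, which is exactly the asymptotic-vs-concrete gap."""
    m = isqrt(p) + 1
    baby = {pow(g, j, p): j for j in range(m)}   # baby steps: g^j for j < m
    step = pow(g, -m, p)                         # g^(-m) mod p (modular inverse)
    gamma = h
    for i in range(m):                           # giant steps: h * g^(-i*m)
        if gamma in baby:
            return i * m + baby[gamma]
        gamma = gamma * step % p
    return None

# Toy instance: trivially solvable, no asymptotic statement applies here.
p, g, x = 101, 2, 57
print(bsgs(g, pow(g, x, p), p))                  # -> 57

# Contrast with an "easy" group: in (Z_p, +) the "discrete log" of h = x*g (mod p)
# is just h * g^{-1} mod p, computable in polynomial time.  This is the kind of
# special case that "let G be a group in which DDH is hard" is meant to rule out.
h_additive = x * g % p
print(h_additive * pow(g, -1, p) % p)            # -> 57, recovered instantly
```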

  • This answer is not terribly satisfying to me, perhaps because I am not expert enough to figure out how it addresses my question. Admittedly, I have not studied complexity theory for some 25 years, but I do understand many of the underlying concepts. Having looked up some of the linked references, it seems that the complexity characterizations used are asymptotic, so I still can't figure out how they can give usable guarantees over finite classes of instances.
    – Micah Beck
    Commented Apr 16, 2012 at 17:09
