
From my understanding, the complexity of the algorithm is O(number of inputs * number of bits per input). The number of bits needed in binary notation is obviously smaller than the number of digits needed in unary notation. Why, then, is the complexity exponential for binary input but polynomial for unary input?

EDIT: I think the binary version is still faster in absolute terms; the unary version only appears faster (polynomial) because of how the input size is measured. If x unary digits are used, then n has to be less than or equal to x, so the running time is polynomial in the input length, while in the binary case it is obviously O(n * 2^(number of bits)). There might be something wrong in my explanation; feel free to add an answer to correct it.
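To make the encoding-length comparison concrete, here is a small Python sketch I am adding purely for illustration (it is not part of the original question): the same value n takes about log2(n) + 1 bits in binary but n digits in unary, so a loop that runs n times is exponential in the first length and linear in the second.

n = 1000
binary_length = n.bit_length()       # bits in the binary encoding: 10 for n = 1000
unary_length = n                     # one tally mark per unit: 1000 "digits"
print(binary_length, unary_length)   # prints: 10 1000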

  • Landau notation is a pretty poor way to say whether something is fast or slow. See also What is an efficient algorithm?
    – adrianN
    Commented Jul 19, 2016 at 7:51
  • Google "pseudopolynomial".
    – Raphael
    Commented Jul 19, 2016 at 10:33

1 Answer


Consider the following (silly) function cow, which accepts a number $n$:

def cow(n):
    k = 0
    for i in range(n):   # n iterations, one unit of work each
        k = k + 1
    return k

The loop takes $n$ steps, so the time needed for the function to complete on input $n$ is $\Theta(n)$.

However, the size of the input is not $n$ but rather $\log_2 n$ because computers represent numbers in binary. That is, if $b = \log_2 n$ is the size of the input, then the complexity is $\Theta(2^b)$ (because $n = 2^{\log_2 n}$), which is an exponential algorithm when measured in size of the input $b$. If we measure in terms of $n$ then it is a linear algorithm.
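To see the exponential growth concretely, here is a rough timing sketch of my own (using Python's time.perf_counter and the cow function defined above): adding one bit to the input doubles $n$ and therefore roughly doubles the running time.

import time

# Assumes the cow(n) function defined above.
for b in range(20, 25):
    n = 2 ** b                                   # an input that is b bits long
    start = time.perf_counter()
    cow(n)
    elapsed = time.perf_counter() - start
    print(f"{b} bits -> n = {n}, time = {elapsed:.4f} s")

Each extra bit roughly doubles the measured time, which is exactly what $\Theta(2^b)$ predicts.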

If we represented numbers in unary then we would have $b = n$, i.e., it would take $n$ "units" (not "bits" anymore) to represent $n$. In this case the complexity would be $\Theta(b)$, a linear algorithm.
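As a sketch of what the unary version might look like (the function name cow_unary is my own invention): the input is a string of $n$ tally marks, so the input length $b$ equals $n$ and the loop takes $\Theta(b)$ time, linear in the input size.

def cow_unary(marks):
    # 'marks' is the unary encoding of n, e.g. "1111" for n = 4
    k = 0
    for _ in marks:      # one iteration per unary digit: Theta(len(marks)) = Theta(n)
        k = k + 1
    return k

print(cow_unary("1" * 10))   # prints 10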

  • $b = \log_{2}(n) + 1$ is the number of bits, e.g. $\log_{2}(8) = 3$ but clearly you need 4 bits to represent 8. Then the worst-case value $n = 2^{b} - 1$ is the largest number of iterations the for-loop can run for an input of that length. Commented Mar 7, 2020 at 15:27
  • Are you pointing out that I made an off-by-one error, or something else? If it is an off-by-one error, then I'd point out that you made an error, too, because $\log_2(n) + 1$ is not an integer. I purposely did not want to pollute the answer with stuff like rounding up $\log_2(n) + 1$ because the answer is trying to explain a very basic misconception, not count the exact number of bits needed. Commented Mar 7, 2020 at 16:15
  • I found your answer very useful, so I am sorry if the tone was snarky. When trying it out on paper I realized that writing a comment to elaborate on what I found might help someone who thinks like me. Very good point about the log not giving an integer; I had not thought about that. Commented Mar 7, 2020 at 17:45
  • Sure, no problem there, but I couldn't tell from your comment what it was about. Commented Mar 7, 2020 at 19:43
