
I'm looking for the name of the family of algorithms whose exact running time depends only on the size of the input. An example of such an algorithm is naive $O(n^3)$ matrix multiplication: if we know $n$, we know exactly how many operations it will execute and how many cycles it will take to complete (ignoring real-world effects like cache misses, context switching, etc.). An algorithm that is not in this family is quicksort (worst case $O(n^2)$), since its number of operations depends on the contents of the list being sorted. Division algorithms whose running time depends only on the number of bits of the operands would be in this family, but Euclid's algorithm would not, because it finishes faster for some numbers than for others.
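For concreteness, here is a small Python sketch (the helper name is my own) of why Euclid's algorithm fails the criterion: two pairs of operands with the same bit lengths can take very different numbers of steps.

```python
# Sketch: Euclid's algorithm is *not* in the family, because its step
# count depends on the values of the operands, not just their bit lengths.
def gcd_steps(a, b):
    steps = 0
    while b:
        a, b = b, a % b
        steps += 1
    return steps

# Both pairs consist of an 11-bit and a 10-bit operand,
# yet the step counts differ:
print(gcd_steps(1597, 987))  # 15 steps (consecutive Fibonacci numbers)
print(gcd_steps(1024, 512))  # 1 step
```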

In other words, suppose the algorithm is implemented in assembly and the CPU executes one instruction per cycle. Then the value of the program counter is a function $pc(t, n)$ of the time step $t$ and the problem size $n$ alone, and the total number of cycles needed to compute the solution is known in advance.
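To illustrate the "known in advance" property (a sketch of my own, not a standard library facility), instrumenting naive matrix multiplication confirms that its operation count is a fixed function of $n$:

```python
import random

# Sketch: naive matrix multiplication is in the family; the inner-loop
# body runs exactly n**3 times for any pair of n x n inputs.
def matmul_ops(A, B):
    n = len(A)
    C = [[0] * n for _ in range(n)]
    ops = 0
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]
                ops += 1
    return C, ops

n = 5
A = [[random.random() for _ in range(n)] for _ in range(n)]
B = [[random.random() for _ in range(n)] for _ in range(n)]
_, ops = matmul_ops(A, B)
assert ops == n ** 3  # holds regardless of the matrix entries
```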

  • I guess oblivious algorithms form an important subclass of the things you are interested in (see Computational Complexity: A Modern Approach, Remark 1.10, p. 1.11 (21)).
    – Dmitry
    Commented Oct 12, 2022 at 21:16
  • Oblivious may be the word I'm looking for. Though most Google hits for "oblivious algorithms" are for "cache-oblivious algorithms", which is something different. Commented Oct 12, 2022 at 23:43
  • "Though most Google hits for 'oblivious algorithms' are for 'cache-oblivious algorithms'" I know :) Now that you mention it, I'm not actually sure whether "oblivious algorithms" is a standard name. I'm only sure that "oblivious TM" is a standard name: en.wikipedia.org/wiki/…
    – Dmitry
    Commented Oct 13, 2022 at 0:14
  • Are you talking about Big O vs Big Theta?
    – Neil
    Commented Oct 13, 2022 at 5:50
  • The family I'm thinking of is a subset of the $\Theta$-bound algorithms. Tell me the size of the matrices, the number of bits of the operands, or whatever, and I can tell you the exact number of RAM operations required. Commented Oct 13, 2022 at 13:58

1 Answer


ETA: adjusted to "size-based constant-time algorithms". Indeed, as pointed out, "constant time" typically refers to constant time on all inputs. In worst-case and average-case analysis, however, time is typically measured per size $n$, so in that context the algorithms whose behaviour does not change across inputs of a given size are (size-based) constant time. The traditional definition of constant time would be an "overall" constant time (one constant for all sizes).

It is reasonable to refer to such algorithms as "(size-based) constant-time algorithms". Their worst-case and average-case times (measured, for instance, in the number of comparisons) are both equal to the constant time $c(n)$ that the algorithm takes to execute on inputs of size $n$. This includes bubble sort but not insertion sort (when measuring time as the number of comparisons executed by the algorithm on lists of a fixed size $n$).
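As a quick illustration (my own sketch, counting comparisons only): a naive bubble sort without the early-exit optimization always performs exactly $n(n-1)/2$ comparisons, while insertion sort's count varies with the input.

```python
# Naive bubble sort (no early exit): comparison count depends only on n.
def bubble_sort_comparisons(xs):
    xs = list(xs)
    n = len(xs)
    comparisons = 0
    for i in range(n - 1):
        for j in range(n - 1 - i):
            comparisons += 1          # always executed: n(n-1)/2 in total
            if xs[j] > xs[j + 1]:
                xs[j], xs[j + 1] = xs[j + 1], xs[j]
    return comparisons

# Insertion sort: comparison count depends on the data.
def insertion_sort_comparisons(xs):
    xs = list(xs)
    comparisons = 0
    for i in range(1, len(xs)):
        j = i
        while j > 0:
            comparisons += 1
            if xs[j - 1] > xs[j]:
                xs[j - 1], xs[j] = xs[j], xs[j - 1]
                j -= 1
            else:
                break
    return comparisons

print(bubble_sort_comparisons([1, 2, 3, 4]))     # 6 = 4*3/2
print(bubble_sort_comparisons([4, 3, 2, 1]))     # 6, same for any input of size 4
print(insertion_sort_comparisons([1, 2, 3, 4]))  # 3 (already sorted)
print(insertion_sort_comparisons([4, 3, 2, 1]))  # 6 (reversed)
```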

  • Don't know about others, but if someone tells me that $A$ is a "constant-time algorithm", I will 100% understand it as $A$ having $O(1)$ running time. Wikipedia seems to agree: en.wikipedia.org/wiki/Time_complexity#Constant_time
    – Dmitry
    Commented Oct 12, 2022 at 21:21
  • @Dmitry OK, fair enough. The suggestion would be my own terminology. I meant constant time per size $n$, which is a reasonable definition in the context of size-based approaches (worst-case and average-case analysis), but it is indeed confusing with respect to the traditional definition of an overall constant time (one constant for all sizes). Commented Oct 13, 2022 at 8:05
