
While studying algorithm analysis I found that there is something called a tight bound, and that there is some mathematical formula to support it. Given:

Mergesort takes $\Theta(n \log n)$ compares to sort $n$ elements.

What insight can we get from the above statement?


2 Answers


The insight you get from that statement is that the number of comparisons merge sort uses behaves like a constant times $n\log n$. That's exactly what the statement means. In many cases the running time is captured by the number of comparisons, so the running time of merge sort also behaves like a constant times $n\log n$. You can use this to estimate how long merge sort will take on a given number of elements (though you'll have to empirically estimate the constants involved). You can also use this estimate to compare merge sort to other algorithms, which might use asymptotically more comparisons.
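For a concrete way to do that empirical estimation, here is a Python sketch; the function name `merge_sort_count` is my own, and the random-permutation inputs measure typical rather than worst-case behavior:

```python
import math
import random

def merge_sort_count(a):
    """Sort a list, returning (sorted_list, number_of_comparisons)."""
    if len(a) <= 1:
        return a, 0
    mid = len(a) // 2
    left, c_left = merge_sort_count(a[:mid])
    right, c_right = merge_sort_count(a[mid:])
    merged, c = [], c_left + c_right
    i = j = 0
    while i < len(left) and j < len(right):
        c += 1                      # one comparison per merge step
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])         # leftovers need no comparisons
    merged.extend(right[j:])
    return merged, c

for n in (2**10, 2**14, 2**18):
    data = list(range(n))
    random.shuffle(data)
    _, c = merge_sort_count(data)
    print(n, round(c / (n * math.log2(n)), 3))  # ratio settles near a constant
```

On random inputs the printed ratio hovers a little below 1 and is fairly stable across sizes, which is the kind of constant you would then plug into a back-of-the-envelope running-time estimate.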

  • What is the significance of $\Theta$ here then? We can simply say that the running time is directly proportional to $n\log n$, or that the running time is $n\log n$.
    – vivek
    Commented Mar 6, 2016 at 18:54
  • That's more or less what $\Theta$ is trying to express. There is a technical difference – $\Theta(n\log n)$ doesn't mean that the number of comparisons is roughly $Cn\log n$ for some positive constant $C$, but rather that the number of comparisons $M$ is bounded as $An\log n \leq M \leq Bn\log n$ for some positive constants $A,B$ (a concrete check appears after these comments). So it might be that the constant fluctuates, but typically that's not the case. Commented Mar 6, 2016 at 18:56
  • Like we are told in many algorithm books, the constant factor depends on many things: the processor, the language, etc. Basically, our aim in introducing Big-Oh is to simplify the analysis of the running time of the function. Please let me know if I am on the right path.
    – vivek
    Commented Mar 6, 2016 at 18:59
  • In principle you're right, but the number of comparisons is not processor dependent. You're right about running time analysis, though. Commented Mar 6, 2016 at 19:00
  • Also, if we think deeply, $\Theta$ gives us a more accurate running time analysis. For example, we can say that the above algorithm has to make at least on the order of $n\log n$ compares, and no more than on the order of $n\log n$ compares.
    – vivek
    Commented Mar 7, 2016 at 5:01
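To make the sandwich bound $An\log n \leq M \leq Bn\log n$ from the comments above concrete, here is a small self-contained check. The constants $A = 0.8$ and $B = 1$ are my own assumptions, chosen to work for random inputs; nothing in the discussion prescribes them.

```python
import math
import random

def count_compares(a):
    """Number of comparisons merge sort makes on the list a."""
    if len(a) <= 1:
        return 0
    mid = len(a) // 2
    c = count_compares(a[:mid]) + count_compares(a[mid:])
    left, right = sorted(a[:mid]), sorted(a[mid:])  # halves as the recursion leaves them
    i = j = 0
    while i < len(left) and j < len(right):
        c += 1
        if left[i] <= right[j]:
            i += 1
        else:
            j += 1
    return c

A, B = 0.8, 1.0  # assumed constants, for illustration only
for n in (2**12, 2**16):
    data = list(range(n))
    random.shuffle(data)
    M = count_compares(data)
    print(n, A * n * math.log2(n) <= M <= B * n * math.log2(n))  # True for large n
```

Any fixed pair of constants will only start to work past some threshold $n_0$, which is exactly the role $n_0$ plays in the formal definition unfolded in the answer below.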

Mergesort takes $\Theta(n \log n)$ compares to sort $n$ elements [in the worst case].

Just unfold definitions. Step one:

Mergesort takes $O(n \log n)$ and $\Omega(n \log n)$ compares to sort $n$ elements [in the worst case].

Step two (and three, I guess, since I recombine the unfolded definitions of $O$ and $\Omega$):

For some $n_0 \in \mathbb{N}$ and $a,b \in \mathbb{R}_{>0}$, the number of comparisons $C_n$ Mergesort uses to sort $n$ elements [in the worst case] is bounded by $a \cdot n \log n \leq C_n \leq b \cdot n \log n$ for all $n \geq n_0$.

That's it. Would I call that insight? Hardly.

Adding the justification for why we ignore constant factors in the first place and that comparisons are a dominant operation in Mergesort, I guess the intended statement is something like this:

If you implemented this Mergesort algorithm on a machine [that fits the RAM model and can compare all input numbers in constant time], its running-time cost would be roughly proportional to $n \log n$ for large enough $n$.

Note that neither the proportionality constant nor the sense of "roughly" and "large enough" are specified by the formal definitions. It may very well be that the algorithm is utterly unusable for all instances relevant to you. Hence, statements about algorithm performance using Landau notation are, for practical purposes, mostly vacuous -- formally speaking.

Implicitly, "we know" that "usually" constants are small and lower-order terms behave "nicely" (see e.g. here for a more precise result on Mergesort) so comparing Landau bounds is a somewhat meaningful criterion for choosing algorithms in practice. It has serious limitations, though, and it is important to keep those in mind.
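To illustrate what such a more precise result can look like, here is the standard worst-case comparison recurrence for top-down Mergesort (a well-known result, not something derived in this thread): $C_1 = 0$ and $C_n = C_{\lceil n/2 \rceil} + C_{\lfloor n/2 \rfloor} + (n - 1)$. A few lines of Python tabulate how slowly $C_n / (n \log_2 n)$ approaches its limit:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def worst_case_compares(n):
    """Worst-case comparisons of top-down mergesort:
    C(1) = 0, C(n) = C(ceil(n/2)) + C(floor(n/2)) + (n - 1)."""
    if n <= 1:
        return 0
    return worst_case_compares((n + 1) // 2) + worst_case_compares(n // 2) + (n - 1)

for n in (10**3, 10**6, 10**9):
    ratio = worst_case_compares(n) / (n * math.log2(n))
    print(n, round(ratio, 4))  # creeps toward 1; the linear term is why convergence is slow
```

The leading constant turns out to be $1$, but the negative linear lower-order term is large enough to matter visibly at practical input sizes – the "serious limitations" caveat in numbers.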

  • So, basically we can say that the running time of mergesort will be roughly $n\log_2 n$ up to a constant factor. It's not preferred basically because of the extra memory needed, but that's a different story I think.
    – vivek
    Commented Mar 16, 2016 at 15:17
  • I have been thinking deeply about running time and suddenly I got an aha moment. The worst case, best case and average case are not related to the asymptotic notation at all. We can say that the worst case running time is $\Theta(\log n)$ and that's fine, because the latter is a more generic concept, which is well explained by you in various posts.
    – vivek
    Commented Mar 16, 2016 at 18:35
