
Suppose we have two sorting algorithms, one taking $O(n\log n)$ time and the other $O(n^2)$ time. What can we say about them? Is it always better to choose the $O(n\log n)$ algorithm if the size $n$ is not given? Or can we say that on average $O(n\log n)$ outperforms $O(n^2)$? If I want to make one of these algorithms the default sorting algorithm of my system, should I always choose $O(n\log n)$ over $O(n^2)$?

Please give some input.


2 Answers


No. Suppose sorting algorithm $A$ takes $1000\,n \log(n)$ steps and algorithm $B$ takes $n^2$ steps, and we need to sort $1000$ elements. Then $A$ takes $1000 \cdot 1000 \cdot \log(1000) = 3000000$ steps (reading $\log$ as base $10$), but $B$ takes $n^2 = 1000^2 = 1000000$ steps.

However, $A$ is eventually better. For example, for $n=10000$, $A$ takes $1000 \cdot 10000 \cdot 4 = 40000000$ steps while $B$ takes $10000^2 = 100000000$ steps.
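To see where the crossover happens under these hypothetical cost models, one can simply tabulate both step counts. Here $\log$ is read as base $10$, matching the arithmetic above; the constants are the ones assumed in this answer, not properties of any real algorithm:

```python
import math

def steps_A(n):
    """Cost model for A: 1000 * n * log10(n) (log-linear, large constant)."""
    return 1000 * n * math.log10(n)

def steps_B(n):
    """Cost model for B: n^2 (quadratic, constant 1)."""
    return n * n

for n in (100, 1000, 10000, 100000):
    faster = "A" if steps_A(n) < steps_B(n) else "B"
    print(f"n={n:>6}: A={steps_A(n):>14.0f}  B={steps_B(n):>14.0f}  faster: {faster}")
```

Under these particular constants, $B$ wins at $n=1000$ but $A$ wins at $n=10000$; the break-even point falls somewhere between $n=3000$ and $n=4000$.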

  • There's also the matter of convenience. Although quicksort is asymptotically faster, I'd use insertion sort if I only need to sort an array of modest size, on account of it being a bit simpler. Also of note is that some of the $O(n\log n)$ algorithms switch to an $O(n^2)$ algorithm as soon as the subarrays they are dealing with become sufficiently small. Commented May 16, 2016 at 13:55
  • If I want to make one of these algorithms the default sorting algorithm of my system, should I always choose $O(n\log n)$ over $O(n^2)$? A generic answer would be great.
    – ViX28
    Commented May 16, 2016 at 15:31
  • @Vi, okay, practical advice: implement your best quadratic algorithm and your best log-linear algorithm. Test them on arrays of increasing size and find the break-even point, where one becomes slower than the other. Now, you know how to write an if statement, right? Commented May 17, 2016 at 4:38
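The advice in that last comment can be sketched directly: pick a threshold by benchmarking, then branch on it. This is a minimal illustration, not a tuned implementation; the threshold of 32 below is purely illustrative, since the real break-even point depends on your machine and your two implementations:

```python
def insertion_sort(a):
    """Simple quadratic sort; typically fast on small inputs."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

def merge_sort(a, threshold=32):
    """Log-linear merge sort that falls back to insertion sort
    below the (illustrative) break-even threshold."""
    if len(a) <= threshold:  # the "if statement" from the comment
        return insertion_sort(a)
    mid = len(a) // 2
    left = merge_sort(a[:mid], threshold)
    right = merge_sort(a[mid:], threshold)
    # Merge the two sorted halves.
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]
```

This hybrid structure is, in spirit, what several library sorts do: a recursive log-linear driver that hands small subarrays to a quadratic routine.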

What can we say about them? Is it always better to choose $n \log n$ if the size $n$ is not given? Or can we say that on average $n \log n$ outperforms $n^2$?

Strictly speaking, no to both questions. The only thing we can say for sure is that an $n \log n$ algorithm outperforms an $n^2$ algorithm for sufficiently large $n$. In practice, the common $n \log n$ algorithms have small enough constant factors that an $n^2$ algorithm can be quicker only for very small $n$ (and for very small $n$, it usually doesn't matter which algorithm is used).

There are other considerations, though. Sorting algorithms have average-case and worst-case running times. For example, quicksort has average time $O(n \log n)$, but its worst-case time is $O(n^2)$, and with a naive pivot choice the worst case occurs when the array being sorted is already almost ordered. There are also secondary effects such as memory requirements (merge sort is a good algorithm, but it requires $O(n)$ extra space) and the relative cost of comparisons (say, strings over a four-letter alphabet take longer to compare the more similar they are to each other).

Generally, using quicksort everywhere can be considered a good default unless proven otherwise.
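As a hedge against the almost-sorted worst case mentioned above, a common variant picks the pivot at random, which makes the expected running time $O(n \log n)$ on every input. This is a minimal sketch of that idea, not a production implementation (real library sorts partition in place, cut over to insertion sort for small subarrays, and so on):

```python
import random

def quicksort(a):
    """Quicksort with a random pivot: no fixed input (e.g. an already
    sorted array) can consistently trigger bad pivot choices."""
    if len(a) <= 1:
        return a
    pivot = random.choice(a)
    less    = [x for x in a if x < pivot]
    equal   = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return quicksort(less) + equal + quicksort(greater)
```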

  • Of course, there are strategies that lessen the brittleness of quicksort (e.g., random pivot selection); nevertheless, your advice is sound. Commented May 16, 2016 at 14:05
  • Sorry, but I did not mean to compare average cases. I meant: on average, is it true that $n\log n$ is better than $n^2$?
    – ViX28
    Commented May 16, 2016 at 15:28
  • @ViX28 Average of what? To my knowledge, $n^2$ algorithms are rarely used in pure form, mostly as improvements to recursive algorithms (you use a recursive sort to split the problem into smaller subproblems and call either the same algorithm or an $n^2$ one when, say, $n<32$). For your overall program, the sorting algorithm (usually) only matters when you sort large arrays, and you don't want to use an $n^2$ algorithm there. Commented May 16, 2016 at 15:38
