What can we say about it? Is it always better to choose $n \log n$ if the size $n$ is not given? Or can we say that, on average, $n \log n$ outperforms $n^2$?
Strictly speaking, no to both questions. The only thing we can say for sure is that an $n \log n$ algorithm outperforms an $n^2$ algorithm for sufficiently large $n$. In practice, common $n \log n$ sorting algorithms have constant factors low enough that an $n^2$ algorithm can be quicker only for very small $n$ (and for very small $n$, it usually doesn't matter which algorithm is used).
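To make the "sufficiently large $n$" point concrete, here is a minimal sketch (the function names and the comparison-counting approach are my own) that counts element comparisons for a textbook $O(n^2)$ insertion sort versus an $O(n \log n)$ merge sort. Comparison counts only approximate real running time, which also depends on constant factors and cache behavior, but the difference in growth rate shows up clearly:

```python
import random

def insertion_sort(a):
    """O(n^2) sort; returns (sorted list, comparison count)."""
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0:
            comparisons += 1
            if a[j] > key:
                a[j + 1] = a[j]   # shift larger elements right
                j -= 1
            else:
                break
        a[j + 1] = key
    return a, comparisons

def merge_sort(a):
    """O(n log n) sort; returns (sorted list, comparison count)."""
    if len(a) <= 1:
        return list(a), 0
    mid = len(a) // 2
    left, cl = merge_sort(a[:mid])
    right, cr = merge_sort(a[mid:])
    merged, i, j, c = [], 0, 0, cl + cr
    while i < len(left) and j < len(right):
        c += 1
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:]); merged.extend(right[j:])
    return merged, c

random.seed(0)
for n in (8, 1024):
    data = [random.random() for _ in range(n)]
    _, ci = insertion_sort(data)
    _, cm = merge_sort(data)
    print(n, ci, cm)   # at n=8 the counts are comparable; at n=1024 they differ widely
```

On random input of size 8 the two counts are in the same ballpark, while at size 1024 insertion sort performs on the order of $n^2/4 \approx 260{,}000$ comparisons against merge sort's roughly $n \log_2 n \approx 10{,}000$.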
There are other considerations, though. Sorting algorithms have both average-case and worst-case running times. For example, quicksort runs in $O(n \log n)$ time on average, but $O(n^2)$ in the worst case, and with a naive pivot choice (such as always picking the first element) the worst case occurs when the array being sorted is already ordered or nearly so. There are also lesser effects such as memory requirements (merge sort is a good algorithm, but it requires $O(n)$ extra space) and the relative cost of comparisons (say, strings over a four-letter alphabet take longer to compare the closer they are to each other).
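The quicksort worst case can be demonstrated directly. Below is a sketch (naming is my own) of a quicksort that always picks the first element as pivot; on an already-sorted array of $n$ elements every partition is maximally unbalanced, so it performs exactly $n(n-1)/2$ pivot comparisons, while a shuffled array of the same size needs far fewer:

```python
import random

def quicksort_first_pivot(a):
    """Quicksort that always uses the first element as pivot.
    Returns (sorted list, number of pivot comparisons)."""
    comparisons = 0

    def qs(lst):
        nonlocal comparisons
        if len(lst) <= 1:
            return list(lst)
        pivot, rest = lst[0], lst[1:]
        comparisons += len(rest)   # one partition pass compares each element to the pivot
        left = [x for x in rest if x < pivot]
        right = [x for x in rest if x >= pivot]
        return qs(left) + [pivot] + qs(right)

    return qs(a), comparisons

n = 200
_, c_sorted = quicksort_first_pivot(list(range(n)))   # already-ordered input: worst case
random.seed(0)
shuffled = list(range(n))
random.shuffle(shuffled)
_, c_random = quicksort_first_pivot(shuffled)
print(c_sorted, c_random)   # sorted input needs n(n-1)/2 = 19900 comparisons
```

Real implementations avoid this by choosing the pivot randomly or by median-of-three, which makes an adversarial ordering unlikely in practice.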
Generally, using quicksort everywhere can be considered a good default unless profiling proves otherwise.