We know that the worst-case time complexity of any comparison-based sorting algorithm is $\Omega(n\log n)$. Is there a lower bound on the worst-case running time of sorting algorithms of any type, not just comparison-based ones? That is, a lower bound on the sorting problem itself.
-
$\begingroup$ How do you define the problem exactly? Note that the most general version of sorting actually requires comparison-based algorithms, because you don't know anything about the elements besides how they compare. If you restrict your problem to words over a finite alphabet, fractions, etc., then the problem changes and you can do more things. The problem with a "general lower bound" for all these widely different problems is that there is none! For example, sorting a particular kind of input with the guarantee that all elements are equal takes 0 operations... $\endgroup$ – Bernardo Subercaseaux, Jul 28, 2021 at 17:41
-
$\begingroup$ @BernardoSubercaseaux So if the general case requires comparison-based algorithms, the answer is $\Omega(n\log n)$. What I meant was the most general case. And about 0ドル$ operations: that is the best case, not the worst case, for an input of size $n$. $\endgroup$ – Emad, Aug 8, 2021 at 6:33
-
$\begingroup$ As I mentioned, there's no well-defined "most general case". The 0-operations case I mentioned is indeed worst-case: the Turing machine that does nothing takes 0 operations on any input of size $n$ for that problem, as all inputs come already sorted. $\endgroup$ – Bernardo Subercaseaux, Aug 9, 2021 at 1:10
1 Answer
Suppose the input size is $n$. The lower bound for any algorithm (comparison-based or not) that solves the sorting problem is $$\Omega(n)$$ because the algorithm must read the input at least once.
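This $\Omega(n)$ bound is in fact tight for restricted inputs: counting sort, a standard non-comparison sort for integers in a known range $[0, k)$, runs in $O(n + k)$ time, which is $O(n)$ when $k = O(n)$. A minimal sketch (function name and interface are my own choices, not from the thread):

```python
def counting_sort(arr, k):
    """Sort integers drawn from range(k) without comparing elements.

    Runs in O(n + k): one pass over the input (this pass alone
    shows why Omega(n) is unavoidable) plus one pass over the counts.
    """
    counts = [0] * k
    for x in arr:               # read every element exactly once
        counts[x] += 1
    out = []
    for value, count in enumerate(counts):
        out.extend([value] * count)   # emit each value as many times as it occurred
    return out
```

For example, `counting_sort([3, 1, 2, 1, 0], 4)` returns `[0, 1, 1, 2, 3]` without ever comparing two input elements to each other.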
-
$\begingroup$ I guess the book CLRS normally doesn't take input-reading time or array-element access time into account in its procedures; i.e., it assumes we already have the array and neglects element-access time as well. That's what I think, but I might be wrong. Still, your point is good to note; I'll edit my question. What I emphasize is the algorithm itself rather than reading and accessing. $\endgroup$ – Emad, Jul 28, 2021 at 9:19
-
$\begingroup$ The lower bound $\Omega(n)$ is tight, because there are non-comparison-based algorithms that work in linear time. Note that reading the input efficiently is a critical concept; I suggest you look at Problem 4-2 (Parameter-passing costs) in your book. $\endgroup$ – ErroR, Jul 28, 2021 at 15:16