The notion of best-case running time is somewhat ambiguous to me. According to Wikipedia, the definition of best-case performance is:

The term best-case performance is used in computer science to describe the way an algorithm behaves under optimal conditions. For example, the best case for a simple linear search on a list occurs when the desired element is the first element of the list.

According to this definition, the best-case running time for BST insertion should be $O(1)$ [consider that we are inserting at the root node]. But different resources say different things: some claim that it is $O(\log n)$ [perfectly balanced tree], while others claim that it is $O(1)$. Which one should I believe?
1 Answer
Both! That is, if you are lenient with what "best case" means. The best case is the input (respectively the run¹) of an algorithm which minimises runtime (for a given size). I suspect that some of your sources refer to (asymptotically) optimal BST implementations; either they, or you, are mixing up the notions.
These are the facts:
For any (reasonable) binary search tree implementation, the best-case insertion time is certainly $O(1)$ (for all sizes): all existing nodes are in the root's right subtree, and the one to be inserted belongs in the left.
An optimal binary search tree implementation has worst-case insertion time in $\Theta(\log n)$; it is height-balanced (examples include AVL and red-black trees).
¹ That's equivalent for deterministic algorithms; for nondeterministic ones you consider runs.
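The first fact above can be made concrete with a small sketch. This is a hypothetical, minimal (unbalanced) BST insertion in Python that counts comparisons; the node layout and names are illustrative, not from the question. Building a right-leaning chain and then inserting a key smaller than the root shows the $O(1)$ best case, while extending the chain shows the linear worst case:

```python
# Minimal unbalanced BST with an insertion that counts key comparisons.
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def insert(root, key):
    """Insert key into the BST rooted at root; return (root, comparisons)."""
    comparisons = 0
    if root is None:
        return Node(key), comparisons
    node = root
    while True:
        comparisons += 1
        if key < node.key:
            if node.left is None:
                node.left = Node(key)
                return root, comparisons
            node = node.left
        else:
            if node.right is None:
                node.right = Node(key)
                return root, comparisons
            node = node.right

# All nodes end up in the root's right subtree (a right-leaning chain).
root = None
for k in [10, 20, 30, 40]:
    root, _ = insert(root, k)

# Best case: the new key belongs directly left of the root -> 1 comparison.
_, best = insert(root, 5)
# Worst case: the new key walks the whole chain -> one comparison per node.
_, worst = insert(root, 50)
print(best, worst)  # 1 4
```

Note that the best case is $O(1)$ regardless of the tree's size, which is exactly why it differs from the $\Theta(\log n)$ bound that applies to the worst case of a height-balanced tree.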
Comment: Yes, you are right; probably they are referring to the perfectly balanced tree as the best case. – systemsfault, Sep 24, 2012 at 22:32