While being somewhat familiar with intuitively determining the complexity of algorithms, I'm a bit lost on how to actually calculate it.
For the following code, any idea how I could determine the complexity?
    list = [...]
    start = list[0]
    end = null
    remove list[0] from list
    while (list.length > 0) {
        for (i = 0; i < list.length; i++) {
            if (list[i] is immediately before start or immediately after end) {
                link list[i] to start or end (populate end if null)
                remove list[i] from list
            }
        }
    }
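For concreteness, here is one possible runnable version of the pseudocode above. It is only a sketch: it assumes the elements are integers and that "immediately before/after" means a value differing by exactly 1 from the current start/end of the chain, and it initializes `end` to `start` instead of `null` to keep the linking logic in one place.

```javascript
// Hypothetical concrete version of the pseudocode, under the assumption
// that elements are integers forming one contiguous run of values.
function linkSort(items) {
  const list = [...items];
  let start = list.shift(); // list[0] seeds the chain
  let end = start;          // chain of length 1, so start === end
  const chain = [start];
  while (list.length > 0) {
    for (let i = 0; i < list.length; i++) {
      if (list[i] === start - 1) {      // immediately before start
        start = list[i];
        chain.unshift(list[i]);
        list.splice(i, 1);
        i--; // re-check this index after the removal shifts elements left
      } else if (list[i] === end + 1) { // immediately after end
        end = list[i];
        chain.push(list[i]);
        list.splice(i, 1);
        i--;
      }
    }
  }
  return chain;
}
```

With a valid dataset such as `[3, 1, 2, 5, 4]` this terminates with the fully linked chain `[1, 2, 3, 4, 5]`.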
This assumes a valid dataset (a continuous chain of elements that can always be fully linked). It has also been simplified for illustration purposes.
So, in the best case it's O(n): if the list is already ordered, a single pass is enough to process every element and pop it out.
What I can't determine is the worst case, since on every `while` iteration the dataset shrinks by at least 1 element (usually 2 or more), given that the dataset is assumed to always be valid. So it's clearly less than O(n^2) (I think). All ideas are welcome.
Thanks!
UPDATE: After graphing it out, it seems to be O(n·log_k(n)) where k = n^(2/(n+1)). Does this count as O(n·log(n))? It's unclear to me.
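A quick sanity check on that particular base (assuming the fit k = n^(2/(n+1)) is exact): applying the change-of-base rule, the n-dependence of the base cancels the logarithm entirely:

```latex
\log_k(n) = \frac{\ln n}{\ln k}
          = \frac{\ln n}{\tfrac{2}{n+1}\,\ln n}
          = \frac{n+1}{2}
\quad\Longrightarrow\quad
n \cdot \log_k(n) = \frac{n(n+1)}{2}
```

So that empirical fit is not an O(n·log(n)) curve at all; it is exactly the quadratic n(n+1)/2.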
Big O notation aims to provide an upper bound for the growth rate of a function, so you have to account for the worst case scenario.
In the worst-case scenario I would assume that each iteration shrinks the dataset by only 1 element, so the number of operations you'd perform is bounded by n + (n-1) + (n-2) + ... + 2 + 1, which equals n(n+1)/2, which is O(n^2).
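That bound is easy to verify numerically. The sketch below (hypothetical helper names) tallies the comparisons under the pessimistic assumption that pass k scans all remaining elements but removes only one, and checks the result against the closed form:

```javascript
// Worst-case comparison count: each pass scans everything that is left
// but removes a single element, so the scans have sizes n, n-1, ..., 1.
function worstCaseComparisons(n) {
  let total = 0;
  for (let remaining = n; remaining >= 1; remaining--) {
    total += remaining; // one full scan of the remaining elements
  }
  return total;
}

// Closed form of the same sum: n(n+1)/2, i.e. O(n^2).
const closedForm = n => (n * (n + 1)) / 2;
```

For example, `worstCaseComparisons(10)` gives 55, matching `closedForm(10)`, and doubling n roughly quadruples the total, the signature of quadratic growth.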
(n²+n)/2 is dominated by the n² term, and thus it is indeed O(n²). What big O notation describes is how the time the algorithm takes to run scales with the input it receives, and in this case it scales quadratically even though the constant factor is < 1.