I'm working on an algorithm that takes an array of $N$ values and, for each position, iterates through the rest of the array to its right. So the first value checks $N-1$ positions, the next one checks $N-2$, and so on.
I'm a bit confused about the time complexity of my algorithm. I remembered that Selection Sort does a similar operation, and I checked that it is considered $O(N^2)$ in both the best and worst case. Why is that? If both algorithms check fewer and fewer positions at each new position, I imagined the complexity would be somewhere between $O(N)$ and $O(N \log N)$, not $O(N^2)$.
Another question: if I ran my algorithm on only half of the array, what would the complexity be? I mean starting at the middle position and only scanning to the right, for example, so the middle position would check only $N/2-1$ positions, the next only $N/2-2$, etc. I guess it would be $O((N/2)^2)$, which is $O(N^2/4)$ and therefore still $O(N^2)$?
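In case it helps, here is a minimal Python sketch of the loop structure I'm describing, with a comparison counter standing in for whatever work is done on each pair (the function name and the counter are just placeholders):

```python
def pairwise_scan(values):
    """For each position, visit every position to its right."""
    n = len(values)
    comparisons = 0
    for i in range(n):             # each position...
        for j in range(i + 1, n):  # ...checks everything to its right
            comparisons += 1       # placeholder for the real per-pair work
    return comparisons
```

For an array of 10 values this visits $9+8+\cdots+1 = 45$ pairs.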
1 Answer
You have to count. Intuition is helpful, but you must actually calculate:
$$ 1+2+\cdots+(N-1) = \frac{N(N-1)}{2} \in \mathcal{O}(N^2)$$ by the Gaussian summation formula.
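You can check the closed form numerically, for example in Python:

```python
# Verify that 1 + 2 + ... + (N-1) equals N(N-1)/2 for many values of N.
for n in range(1, 500):
    assert sum(range(1, n)) == n * (n - 1) // 2
```

The sum grows like $N^2/2$, and the constant factor $1/2$ does not change the asymptotic class.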
In asymptotic notation, constant factors are discarded. For example, $\mathcal{O}(N/k) = \mathcal{O}(N)$ as long as $k$ is a constant, so your $\mathcal{O}(N^2/4)$ for the half-array version is indeed still $\mathcal{O}(N^2)$.
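A direct count for your second question confirms this (a sketch assuming even $N$; `half_scan_count` is a hypothetical name). Starting at the middle and scanning right performs $(N/2-1)+(N/2-2)+\cdots+1 = \frac{(N/2)(N/2-1)}{2}$ comparisons, roughly $N^2/8$:

```python
def half_scan_count(n):
    """Count comparisons when only positions n//2 .. n-1 scan rightward."""
    half = n // 2
    count = 0
    for i in range(half, n):       # start at the middle position
        for j in range(i + 1, n):  # scan to the right only
            count += 1
    return count

n = 1000
# Matches the closed form (N/2)(N/2 - 1)/2 for even N.
assert half_scan_count(n) == (n // 2) * (n // 2 - 1) // 2
```

The constant shrinks from $1/2$ to about $1/8$, but the growth rate is still quadratic, which is exactly why the complexity class stays $\mathcal{O}(N^2)$.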