
I'm working on an algorithm that takes an array of $N$ values and iterates through each value; for each one it then iterates through the rest of the array to its right. So the first value checks $N-1$ positions, the next one checks $N-2$, and so on.

I'm a bit confused about the time complexity of my algorithm. I remembered that Selection Sort does a similar operation, and I checked that it is considered $O(N^2)$ in both the best and worst case. Why is that? Since both algorithms check fewer and fewer positions at each step, I imagined the complexity would be somewhere between $O(N)$ and $O(N \log N)$, not $O(N^2)$.

Another question: if I ran my algorithm on only half of the array, what would the complexity be? I mean starting at the middle position and only running the algorithm toward the right side, so the middle position would check only $N/2-1$ positions, the next only $N/2-2$, and so on. I guess it would be $O((N/2)^2)$, which is $O(N^2/4)$ and therefore still $O(N^2)$?

asked Oct 12, 2018 at 17:18

1 Answer


You have to count. Intuition is useful, but you must calculate the actual number of steps.

$$ 1+2+\cdots+(N-1) = \frac{N(N-1)}{2} \in \mathcal{O}(N^2)$$ by the Gauss summation formula.
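You can verify the count directly by running the pattern from the question and tallying the inner-loop steps (a minimal Python sketch; the function name is made up here):

```python
def count_comparisons(n):
    """Tally inner-loop steps of the 'scan everything to the right' pattern."""
    count = 0
    for i in range(n):
        for j in range(i + 1, n):  # the rest of the array, to the right of i
            count += 1
    return count

# The tally matches the closed form N*(N-1)/2 exactly.
N = 1000
assert count_comparisons(N) == N * (N - 1) // 2
```

The outer element at index $i$ contributes $N-1-i$ inner steps, which is exactly the sum $1+2+\cdots+(N-1)$ in reverse.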

In asymptotic notation, constant factors are dropped. For example, $\mathcal{O}(N/k) = \mathcal{O}(N)$ as long as $k$ is a constant.
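The same counting answers the half-array question: restricting the scan to the right half shrinks the constant factor but not the growth rate. A sketch under the setup described in the question (start at the middle, scan rightward only; helper names are made up):

```python
def count_half(n):
    """Inner-loop steps when starting at the middle and scanning rightward."""
    count = 0
    for i in range(n // 2, n):
        for j in range(i + 1, n):
            count += 1
    return count

def count_full(n):
    return n * (n - 1) // 2

# Half-array count is (N/2)(N/2 - 1)/2, roughly N^2/8: a quarter of the
# full count N(N-1)/2 ~ N^2/2, but still quadratic in N.
N = 10_000
print(count_half(N) / count_full(N))  # close to 1/4
```

So the half-array version does about a quarter of the work, yet remains $\mathcal{O}(N^2)$, just as dropping the constant predicts.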

answered Oct 12, 2018 at 18:35
