I need to find the time complexity of the following simple algorithm.
Calculate the time complexity of the following algorithm:
void f(int n) { int i, x = 0; for (i = 1; i <= n; i++) if (i % 2 == 1) x++; }
It is obvious that the time complexity of this algorithm is $\Theta(n),ドル but how do I prove it formally?
I know that the loop will do $n$ iterations, so the total number of operations will be $n \times \text{(number of operations inside the loop)}$.
Here is the tricky part: each iteration performs one operation for the comparison, and one additional operation only if $i$ is odd.
So is it something like this: $\sum_{i=1}^{n}\left(1+\frac{1}{2}\right)$? I'm not sure; it seems strange and I don't know how to continue.
What is the best way to write it formally?
Also, I would like to ask whether there are books with many examples of analyses of simple algorithms like this, together with proofs of their complexity. The most recommended book seems to be CLRS, but I didn't find many examples of this kind there, and from what I understand this can be a very complex topic (depending on the series you need to evaluate in the end).
Thank you.
3 Answers
You're doing the right thing. There's no magic that you're missing.
$$ \begin{aligned} \sum_{i=1}^n \left(1 + \frac{1}{2}\right) &= \sum_{i=1}^n \frac{3}{2}\\ &= \underbrace{\frac{3}{2} + \dots + \frac{3}{2}}_{n \text{ times}}\\ &= \frac{3}{2}n \in O(n) \end{aligned} $$
One important thing to understand here is that the asymptotic analysis happens with respect to the parameters that are of interest to you. Here, for example, we are interested in the complexity with respect to the input $n$.
Typically, we assume that arithmetic operations (like modulo, addition, etc.) take a constant amount of time. In a real system, one operation might indeed be more expensive than another. But the important detail is that the time required by these operations does not depend on the input parameter $n$: it could be 1ドル,ドル 100ドル,ドル or 10ドル^{\#\text{stars}},ドル but it is still a constant as long as it does not scale with the parameter of interest. So, in summary, when we compute the asymptotic complexity of the function with respect to $n,ドル we can consider that all these operations take constant time.
Based on the above, the code inside the for-loop also takes a constant time. Whether the condition is true or not is irrelevant. If it is false it will take some constant time, and if it is true it will take again some constant time (maybe more than before, but still constant). Hence, overall, the time for one iteration is $\Theta(1)$.
Then, as you said, the overall complexity will be $\Theta(n) \cdot \Theta(1) = \Theta(n)$.
The thing to remember with Big-O is that it's an upper bound. So there are times you can assume things are worse than they actually are.
You know that inside the loop, at most two operations are performed each time. So forget about $\sum_{i=1}^{n}(1+\frac{1}{2})$. You know that $\sum_{i=1}^{n}(1+\frac{1}{2}) \leq \sum_{i=1}^{n}(2),ドル and you can apply sum rules to see that $\sum_{i=1}^{n}(2) = 2n$.
So now you know the number of operations performed inside the loop is at most 2ドルn,ドル and we evaluate the loop condition $n+1$ times (it fails on the last check).
2ドルn$ is in the set $O(n)$; that is, once you drop the constants, you can see that the overall running time is $O(n)$.
If you wanted to be picky, you could do the opposite to get a lower bound on the running time by assuming you only ever do one operation inside the loop, thus showing that the running time is $\Theta(n),ドル i.e. $O(n)$ is a tight upper bound.
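Writing out the sandwich argument above as one worked calculation (with $T(n)$ denoting the operation count under the convention of one unit per comparison and one per increment):

$$
n \;=\; \sum_{i=1}^{n} 1
\;\le\; T(n) \;=\; \sum_{i=1}^{n} 1 \;+\; \sum_{\substack{1 \le i \le n \\ i \text{ odd}}} 1
\;=\; n + \left\lceil \frac{n}{2} \right\rceil
\;\le\; \sum_{i=1}^{n} 2 \;=\; 2n,
$$

so $T(n)$ is bounded between $n$ and 2ドルn,ドル and therefore $T(n) \in \Theta(n)$.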
- $O(-)$ isn't an upper bound, as you show by one of your examples. 2ドルn \in O(n),ドル but $n$ is not an upper bound for 2ドルn$. – David Richerby, Mar 12, 2015 at 22:00
- Sorry, the $O$ represents the fact that it's an upper bound. The function inside is the "order" of the upper bound, i.e. how big it is, i.e. drop the constants and smaller terms. The point I was trying to make is that $O(2n) = O(1.5n) = O(n)$: you can increase the constants without losing anything, and you can increase the inner function without losing anything, though your bound is no longer tight. The above code runs in $O(n^2)$ time but not $\Theta(n^2)$ time. – Joey Eremondi, Mar 12, 2015 at 22:18
for (i = n; i <= n; i++)
Is that a typo? Because as it sits, it only goes through the loop once (unless n is INT_MAX).