I'm trying to determine the time complexity of adding $n$ numbers that each have a bit length of $n\log n$. I'm confused because I've sometimes seen the addition of two $n$-bit numbers described as requiring $O(n)$ time, and in other places $O(1)$ time, especially in cases where multiplication is involved in other steps.
So I know that my addition problem takes at least linear time, i.e. $\Omega(n)$, since there are $n$ numbers. But is the time complexity instead $O(n^2\log n)$? Since the length of the numbers (in binary) is comparable to the number of numbers to be added, I suspect the complexity could be greater. Thanks.
2 Answers
Time complexity depends on the model you are working with. For example, on a single-work-tape Turing machine (with a read-only input tape), you must at least read through the whole input. Since the input length is $n^2\log n$ bits ($n$ numbers of $n\log n$ bits each), the machine has to spend at least that much time.
On the other hand, if you are working with a RAM model in which each input number can be loaded into a register and adding two register contents counts as a single step, then the time complexity is $O(n)$.
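To make the gap concrete, here is a minimal sketch (my own illustration, not from either answer) that counts abstract "steps" for the asker's exact problem under the two cost models, assuming a 64-bit machine word:

```python
import math

def cost_unit_ram(n):
    """Unit-cost RAM: each addition of two numbers is one step,
    regardless of bit length. Summing n numbers takes n - 1 additions."""
    return n - 1  # O(n)

def cost_word_ram(n, word_bits=64):
    """Word-level cost: adding two m-bit numbers touches ceil(m/word_bits)
    machine words, so each addition costs that many steps."""
    m = n * math.ceil(math.log2(n))        # bit length of each number
    words = math.ceil(m / word_bits)       # limbs per number
    return (n - 1) * words                 # O(n^2 log n / word_bits)

print(cost_unit_ram(1024))   # 1023
print(cost_word_ram(1024))   # 163680 -- about 160x more "work"
```

Both counts describe the same computation; only the accounting of what a "step" is changes.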
There are other models as well (I just know these two basic ones), but the point remains: you must first fix the model you work with before talking about time or space complexity.
Another question you might find useful: What is the difference between RAM and TM?
You usually assume that primitive operations can be performed in constant time, but there can be disagreement about what counts as a primitive operation.
Many programmers will assume that an $n$-bit number is stored in $\lceil n/64\rceil$ machine words, so adding two such numbers takes about $n/64$ steps times a small constant, i.e. $O(n)$.
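This word-by-word view can be sketched directly: a number is a little-endian list of 64-bit limbs, and addition ripples a carry across them, one constant-time step per limb. (The helper names here are my own, chosen for illustration.)

```python
MASK = (1 << 64) - 1  # one 64-bit machine word

def add_limbs(a, b):
    """Add two numbers given as equal-length lists of 64-bit limbs.
    Cost: one constant-time step per limb, i.e. O(n/64) for n-bit inputs."""
    out, carry = [], 0
    for x, y in zip(a, b):
        s = x + y + carry
        out.append(s & MASK)   # low 64 bits of this limb sum
        carry = s >> 64        # carry into the next limb
    if carry:
        out.append(carry)
    return out

def to_limbs(x, count):
    """Split a nonnegative int into `count` little-endian 64-bit limbs."""
    return [(x >> (64 * i)) & MASK for i in range(count)]

def from_limbs(limbs):
    """Reassemble an int from little-endian 64-bit limbs."""
    return sum(l << (64 * i) for i, l in enumerate(limbs))

a, b = (1 << 200) - 1, 12345              # a 200-bit number plus a small one
r = add_limbs(to_limbs(a, 4), to_limbs(b, 4))
assert from_limbs(r) == a + b
```

The loop length, not the limb arithmetic, is where the $O(n)$ comes from.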
Others use a computer that can add two 64-bit numbers as a primitive operation and assume that $n \le 64$ always holds, so addition always takes constant time. Or, being generous, they assume $n \le 128$, which still gives a (slightly larger) constant. Either way, they count addition as $O(1)$.
Now in reality... you can build hardware of size $O(n \log n)$ that adds two $n$-bit numbers in $O(\log n)$ time. So it's partly a matter of hardware cost.
You can figure out the time for your $(n\log n)$-bit numbers: adding two of them in such hardware takes $O(\log(n\log n)) = O(\log n)$ time, and summing all $n$ of them in a balanced tree of adders multiplies that by another factor of about $\log n$, at excessive hardware cost.
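The $O(\log n)$ addition depth comes from parallel-prefix carry computation. Here is a hedged sketch of the Kogge-Stone style recurrence, simulated sequentially in software; each iteration of the `while` loop corresponds to one hardware layer, so the layer count, not the Python running time, is the quantity of interest. (Bit lists are little-endian; the final carry-out is discarded, so the result is the sum modulo $2^m$.)

```python
def prefix_add(a_bits, b_bits):
    """Add two m-bit numbers (little-endian bit lists) using a
    Kogge-Stone style parallel-prefix carry network. Returns the sum
    bits (mod 2^m) and the number of prefix layers, ceil(log2(m))."""
    m = len(a_bits)
    g = [a & b for a, b in zip(a_bits, b_bits)]  # carry generate
    p = [a ^ b for a, b in zip(a_bits, b_bits)]  # carry propagate
    d, layers = 1, 0
    while d < m:
        # One hardware layer: all positions update in parallel using
        # the values from the previous layer (old g, old p).
        g = [g[i] | (p[i] & g[i - d]) if i >= d else g[i] for i in range(m)]
        p = [p[i] & p[i - d] if i >= d else p[i] for i in range(m)]
        d *= 2
        layers += 1
    carries = [0] + g[:-1]        # carry into each bit position
    s = [a ^ b ^ c for a, b, c in zip(a_bits, b_bits, carries)]
    return s, layers

# 13 + 11 = 24, which is 8 mod 16 for 4-bit operands; 2 layers = log2(4).
print(prefix_add([1, 0, 1, 1], [1, 1, 0, 1]))  # ([0, 0, 0, 1], 2)
```

For $m = n\log n$ bits the layer count is $\lceil\log_2(n\log n)\rceil = O(\log n)$, matching the bound above.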