Complexity

Joao asked me a question once and I found it tricky. I'm going to post the question here, along with the answer I gave him after some thinking and discussing, but feel free to disagree and discuss some more (I just reread my answer and I'm trying to convince myself all over again) =)

What is the complexity of these two programs?

– Finding the mean of N random numbers.
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    int N = 100;                    /* how many random numbers to average */
    long s = 0;
    for (int i = 0; i < N; i++) {
        s += rand();
    }
    double mean = (double)s / N;    /* cast to avoid integer division */
    printf("%f\n", mean);
    return 0;
}

– Determining if N is prime.
#include <stdio.h>

int main(void) {
    int N = 101;                   /* the number to test */
    int i = 2;                     /* start at 2: every number is divisible by 1 */
    for (; i < N; i++) {
        if (N % i == 0) break;     /* found a divisor, so N is not prime */
    }
    printf("%d\n", i);             /* prints N itself when N is prime */
    return 0;
}

There are some things you should think about when you look at those two programs. I agree that, if you consider just the number N, you can say their complexity is O(N). But that is not a valid measure of complexity. Why? Because in complexity theory, linearity (or any growth rate) is measured with respect to the *size* of your input. In both cases the input is a single number, and depending on which number you pass in, the execution time varies. So, since that is the case for both programs above, the size of your input should really be measured by the number of bits needed to represent that number. The larger the number, the more bits are needed to represent it, and the more time your algorithm spends. The actual _value_ of the number (1, 10 or 100000) does not represent its size computationally, so you can't use it to say your program is O(n). (I hope that's clear…)
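To make the "size in bits" idea concrete, here is a small C sketch (my own illustration, not part of Joao's original question): it computes the bit length b of a number, which is the quantity the complexity should be measured against.

#include <stdio.h>

/* Returns how many bits are needed to represent n (the real input size). */
int bit_length(unsigned long n) {
    int b = 0;
    while (n > 0) {
        n >>= 1;   /* drop the lowest bit */
        b++;
    }
    return b;
}

int main(void) {
    unsigned long examples[] = {1, 10, 100000, 1000000000};
    for (int k = 0; k < 4; k++)
        printf("N = %10lu  ->  b = %2d bits\n", examples[k], bit_length(examples[k]));
    return 0;
}

Running it shows that 1, 10 and 100000 take 1, 4 and 17 bits respectively: going from 10 to 100000 is a ten-thousand-fold jump in value, but only 13 extra bits of input.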

You see, the point is not whether this happens to be an algorithm for factoring numbers or not; the main point here is that the running time of both programs depends on the value of the input. So, counting in bits, we can say that:

– Your program executes about N steps.
– The number of bits needed to represent N is b ≈ log2(N).
– Therefore: N ≈ 2^b.

Which means that your program's complexity is actually O(2^b), which is exponential in the size of the input.
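To see how harsh that exponential really is, here is one more small sketch (again my own illustration, assuming the loop body costs roughly constant time): it prints how many loop iterations correspond to an input of b bits. Every extra bit of input roughly doubles the work.

#include <stdio.h>

int main(void) {
    /* A b-bit input N can be as large as 2^b - 1, so the N-step loops
     * above can run on the order of 2^b times. */
    for (int b = 1; b <= 30; b++) {
        long steps = 1L << b;    /* 2^b */
        printf("b = %2d bits  ->  up to ~%ld iterations\n", b, steps);
    }
    return 0;
}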