Big O notation (Algorithms) - java

Hi, I am new to Big O notation and am having trouble specifying the big O for the following code. Could someone kindly explain how to work it out?
int sum = 1;
for (int count = n; count > 0; count /= 2) {
    sum = sum * count;
}

Since count is divided by 2 on each iteration, the loop runs Θ(log n) times (count goes from n down to 0). Hence it is in O(log(n)) as well.

In the first line, the code runs once, so its complexity is O(1).
In the second line, the for loop runs while count > 0, and count is divided by 2 on every pass. With integer division this is not an infinite loop: once count reaches 1, the next division gives 1/2 = 0 and the loop exits. Starting from n, it takes floor(log2(n)) + 1 halvings to reach 0, so the loop body runs O(log n) times. If n <= 0, the loop body never runs at all and the whole snippet is O(1).

count gets halved on each iteration. If n = 64:
step1 => count = 64
step2 => count = 32
step3 => count = 16
...
Therefore, its worst-case scenario has an O(log n) time complexity.
In this case, however, your loop's body has a constant number of operations; therefore, the best, worst, and average cases are the same, and:
Ω(log n) = Θ(log n) = O(log n)
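To make the Θ(log n) behaviour concrete, here is a minimal runnable sketch (the class name and the iteration counter are my additions, not part of the question) that instruments the original loop:

public class HalvingLoop {
    public static void main(String[] args) {
        int n = 64;
        long sum = 1;       // long, so the running product overflows later than an int would
        int iterations = 0; // counts how many times the body runs
        for (int count = n; count > 0; count /= 2) {
            sum = sum * count;
            iterations++;
        }
        // count takes the values 64, 32, 16, 8, 4, 2, 1 -> 7 iterations,
        // i.e. floor(log2(64)) + 1.
        System.out.println(sum + " after " + iterations + " iterations");
    }
}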

Related

How to calculate Big O time complexity for while loops

I am having trouble understanding how while loops affect the Big O time complexity.
For example, how would I calculate the time complexity for the code below?
Since it has a for loop that traverses each element in the array and two while loops nested inside it, my initial thought was O(n^3) for the time complexity, but I do not think that is right.
HashMap<Integer, Boolean> ht = new HashMap<>();
for (int j : array) {
    if (ht.get(j)) continue;
    int left = j - 1;
    // check if hashtable contains number
    while (ht.containsKey(left)) {
        // do something
        left--;
    }
    int right = j + 1;
    // check if hashtable contains number
    while (ht.containsKey(right)) {
        // do something
        right++;
    }
    int diff = right - left;
    if (max < diff) {
        // do something
    }
}
There is best case, average case, and worst case.
I'm going to have to assume there is something that constrains the two while loops so that neither iterates more than n times, where n is the number of elements in the array.
In the best case, you have O(n). That is because if(ht.get(j)) is always true, the continue path is always taken. Neither while loop is executed.
For the worst case, if(ht.get(j)) is always false, the while loops will be executed. Also, in the worst case, each while loop will have n passes. [1] The net result is 2 * n for both inner loops multiplied by n for the outer loop: (2 * n) * n. That would give you time complexity of O(n^2). [2]
The lookup time could potentially be a factor. A hash table lookup usually runs in constant time: O(1). That's the best case. But, the worst case is O(n). This happens when all entries have the same hash code. If that happens, it could potentially change your worst case to O(n^3).
[1] I suspect that in the worst case, the number of passes of the first while loop plus the number of passes of the second while loop is actually n or close to it. But that doesn't change the result.
[2] In Big O, we choose the term that grows the fastest and ignore the coefficients. So, in this example, we drop the 2 in 2*n*n.
Assuming there are m and n entries in your HashMap and array, respectively.
Since you have n elements for the for loop, the complexity can be written as n * complexity_inside_for.
Inside the for loop, you have two consecutive (not nested) while loops, each contributing a complexity of m, since in the worst case each needs to go through all entries in your HashMap. Therefore, complexity_inside_for = m + m = 2m.
So overall, the time complexity is n * 2m. However, as m and n approach infinity, the constant 2 doesn't matter, because it is not a function of m and/or n, and it can be discarded. This gives a big-O time complexity of O(m*n).
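As a sanity check of the O(m*n) bound, here is a self-contained sketch of the worst case. It assumes a hypothetical fill (mine, not from the question) in which every array value is a key in the map and every ht.get(j) returns false, so the continue is never taken:

import java.util.HashMap;
import java.util.Map;

public class WhileLoopCost {
    public static void main(String[] args) {
        int n = 1000;
        Map<Integer, Boolean> ht = new HashMap<>();
        int[] array = new int[n];
        for (int i = 0; i < n; i++) {
            array[i] = i;
            ht.put(i, false);   // worst case: the continue is never taken
        }

        long passes = 0;        // counts iterations of the two while loops
        for (int j : array) {
            if (ht.get(j)) continue;
            int left = j - 1;
            while (ht.containsKey(left)) { passes++; left--; }
            int right = j + 1;
            while (ht.containsKey(right)) { passes++; right++; }
        }
        // Each of the n outer iterations scans roughly all n keys:
        // passes == n * (n - 1), on the order of n^2.
        System.out.println(passes);
    }
}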
For one loop nested inside another, the time complexity works like this: O(n^2).
In each iteration of i, the inner loop is executed n times. The time complexity of a loop is equal to the number of times the innermost statement is executed.
So for your case that would be O(n^2) + O(n).
You can find more explanation here:
Time-complexity

Complexity of An Algorithm(Nested loops)

I'm given code for an algorithm as such:
1   sum = 0;
2   for (i = 0; i < n; i++)
3       for (j = 1; j <= n; j *= 3)
4           sum++;
I'm told this algorithm runs in O(n log n), where the log is base 3.
So I get that the second line runs n times, and since line 3 is independent of line 2, I would multiply the two to get the Big O. However, I'm not sure how the answer comes out to n log n (log base 3). Is there a guaranteed way to figure this out every time? It seems like with nested loops, a different case can occur each time.
What you have been told is correct. The algorithm runs in O(n log(n)). The third line, for(j=1; j<= n; j*=3), runs in O(log3(n)) because j is multiplied by 3 each time. To see it more clearly, solve this problem: how many times do you need to multiply 1 by 3 to get n? That is, 3^x = n, and the solution is x = log3(n).
Yes. The algorithm runs in n log(n) time, where the log is base 3.
The easiest way to calculate complexity is to count the number of operations done.
The outer for loop runs n times. Now let's calculate how many times the inner for loop runs for each n. For n = 1, 2 the inner loop runs once. For n = 3, 4, 5, 6, 7, 8 it runs twice. And so on...
That means the inner for loop runs in logarithmic time (log(n)) for each n.
So n * log(n) is the total complexity.
In the second loop you have j *= 3, which means you can divide n by 3 log3(n) times before j exceeds n. That gives you the O(log n) complexity.
Since your first loop is O(n), you have O(n log n) in the end.
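To verify the n * log3(n) count empirically, here is a small sketch (the class name is mine) that runs the loops and compares the total with n times the number of inner passes:

public class Base3Loop {
    public static void main(String[] args) {
        int n = 81;
        long sum = 0;
        for (int i = 0; i < n; i++)
            for (int j = 1; j <= n; j *= 3)
                sum++;

        int innerPasses = 0;    // floor(log3(n)) + 1 for n >= 1
        for (int j = 1; j <= n; j *= 3)
            innerPasses++;

        // For n = 81, j takes 1, 3, 9, 27, 81 -> 5 passes,
        // so sum == 81 * 5 == 405.
        System.out.println(sum + " == " + (long) n * innerPasses);
    }
}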

Big O notation (The complexity) of the following code?

I was just wondering: what is the big O for the code below?
I am thinking O(n). What do you guys think? Thank you for your help!
for (w = Length; w >= 0; w = w / 2) {
    for (i = Length; i >= 0; --i) {
        if (randomNumber() == 4)
            return;
    }
}
Since you are asking for the Big O notation, which here means the worst-case time complexity, the answer is:
O(n log n), because the outer loop halves w and so executes O(log n) times, while the inner loop executes O(n) times on each outer pass.
This looks very much like a class assignment, so I won't answer it outright, but will just give you some pointers (homework should not be done by copying the assignment to the web ;) ). Also, the assignment is incomplete; I hope your teacher/lecturer did not give it like this.
The missing information is:
Are you looking for worst case runtime or average case runtime? Big-O can be used for both. [originally I included best case runtime, but this is done with big omega, as Jerry pointed out in the comments]
Another missing piece is the datatype of the variables. If they are doubles, it takes much longer for w = w/2 to reach 0 than with integers.
Worst-case runtime:
The inner loop has --i, so it is executed Length + 1 times (i runs from Length down to 0). This gives you O(n) for the inner loop.
This already shows that your estimate is wrong. It has to be the number of executions of the outer loop TIMES the number of executions of the inner loop, so it must be more than linear (unless the outer loop has constant number of executions).
The outer loop has w = w/2, so, in terms of Length, how long will it take for w to reach 0? That gives you how often the outer loop is executed and, by multiplication, the total number of executions. (One caveat: as written the condition is w >= 0, and with integer division w sticks at 0 once it gets there, so the loop as given can only exit through the return; the analysis assumes the intended condition is w > 0.)
Then there is this randomNumber(). As I said, I am assuming worst-case analysis, and the worst case is clearly that it never returns 4, so we can ignore this return.
Average-case runtime:
The analysis for the loops does not change. For randomNumber(), we would need to estimate how many calls it takes until the probability of NOT having seen a 4 is sufficiently small. However, I do not have enough information about randomNumber() to do this.
Best-case runtime [should be big omega, not big o]:
In the best case, randomNumber() returns 4 on the first call. So the best case runtime is constant, O(1).
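For what it's worth, here is a worst-case counting sketch under two assumptions of mine: the variables are ints, and the outer condition is w > 0 rather than w >= 0 (as noted above, with w >= 0 and integer division the loop can only exit through the return):

public class HalvingOuterLoop {
    public static void main(String[] args) {
        int length = 1024;
        long iterations = 0;
        // Assumed fix: w > 0 instead of w >= 0, so the loop terminates
        // even if randomNumber() never returns 4 (the worst case).
        for (int w = length; w > 0; w = w / 2) {
            for (int i = length; i >= 0; --i) {
                iterations++;   // stands in for the randomNumber() check
            }
        }
        // (floor(log2(1024)) + 1) outer passes * (1024 + 1) inner passes
        System.out.println(iterations);     // prints 11275 = 11 * 1025
    }
}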

What is the time complexity for this algorithm?

public static void Comp(int n)
{
    int count = 0;
    for (int i = 0; i < n; i++)
    {
        for (int j = 0; j < n; j++)
        {
            for (int k = 1; k < n; k *= 2)
            {
                count++;
            }
        }
    }
    System.out.println(count);
}
Does anyone know what the time complexity is?
And what is the Big O?
Could you please explain it to me, step by step?
Whoever gave you this problem is almost certainly looking for the answer n^2 log(n), for reasons explained by others.
However the question doesn't really make any sense. If n > 2^30, k will overflow, making the inner loop infinite.
Even if we treat this problem as being completely theoretical, and assume n, k and count aren't Java ints, but some theoretical integer type, the answer n^2 log n assumes that the operations ++ and *= have constant time complexity, no matter how many bits are needed to represent the integers. This assumption isn't really valid.
Update
It has been pointed out to me in the comments below that, based on the way the hardware works, it is reasonable to assume that ++, *=2 and < all have constant time complexity, no matter how many bits are required. This invalidates the third paragraph of my answer.
In theory this is O(n^2 * log(n)).
Each of the two outer loops is O(n) and the inner one is O(log(n)), because log base 2 of n is the number of times you have to divide n by 2 to get 1.
Also, this is a tight bound, i.e. the code is also Θ(n^2 * log(n)).
The time complexity is O(n^2 log n). Why? Each for loop is a function of n, and you multiply by n for each loop, except the inner one, which grows as log n. Why? Because on each iteration k is multiplied by 2. Think of merge sort or binary search trees.
details
for the first two loops: each is a summation of 1 over the n values that i (or j) takes from 0 to n-1, so together the first two loops give n * n = n^2 = O(n^2)
for the k loop, k grows as 1, 2, 4, 8, 16, 32, ..., so after x passes k = 2^x. The loop stops once 2^x reaches n; taking the log of both sides gives x = log n
Again, not clear? The values of k form a geometric series: the sum from k = m to n of a^k is (a^m - a^(n+1)) / (1 - a). If we set m = 0 and a = 2, we get (1 - 2^(n+1)) / (1 - 2) = 2^(n+1) - 1. Why is a = 2? Because that is the ratio for which the series yields 2, 4, 8, 16, ..., 2^k.
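If you want to check the n^2 log n claim empirically, here is a minimal sketch (the class name is mine) that runs the loops from Comp for a small n and compares count with n * n * ceil(log2(n)):

public class CompCheck {
    public static void main(String[] args) {
        int n = 1000;           // keep n well below 2^30 so k never overflows
        long count = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                for (int k = 1; k < n; k *= 2)
                    count++;

        // For n >= 2 the k loop runs ceil(log2(n)) times:
        // here k takes 1, 2, 4, ..., 512 -> 10 passes.
        int log2Ceil = 32 - Integer.numberOfLeadingZeros(n - 1);
        System.out.println(count + " == " + (long) n * n * log2Ceil);
    }
}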

Big O for 3 nested loops

A Big O notation question... What is the Big O for the following code:
for (int i = n; i > 0; i = i / 2) {
    for (int j = 0; j < n; j++) {
        for (int k = 0; k < n; k++) {
            count++;
        }
    }
}
My Thoughts:
So breaking it down, I think the outer loop is O(log2(n)), and each of the inner loops is O(n), which would result in O(n^2 * log2(n)). Question #1: is that correct?
Question #2:
When combining nested loops, is it always as simple as multiplying the Big O of each loop?
When loop counters do not depend on one another, it's always possible to work from the inside outward.
The innermost loop always takes time O(n), because it loops n times regardless of the values of j and i.
When the second loop runs, it runs for O(n) iterations, on each iteration doing O(n) work to run the innermost loop. This takes time O(n^2).
Finally, when the outer loop runs, it does O(n^2) work per iteration. It also runs for O(log n) iterations, since it runs equal to the number of times you have to divide n by two before you reach 1. Consequently, the total work is O(n^2 log n).
In general, you cannot just multiply loops together, since their bounds might depend on one another. In this case, though, since there is no dependency, the runtimes can just be multiplied. Hopefully the above reasoning sheds some light on why this is - it's because if you work from the inside out thinking about how much work each loop does and how many times it does it, the runtimes end up getting multiplied together.
Hope this helps!
Yes, this is correct: the outer loop is log N, the other two are N each, for a total of O(N^2 * log N).
In the simple cases, yes. In more complex cases, when loop indexes start at numbers indicated by other indexes, the calculations are more complex.
To answer this slightly (note: slightly) more formally, say T(n) is the time (or number of operations) required to complete the algorithm. Then, for the outer loop, T(n) = log n*T2(n), where T2(n) is the number of operations inside the loop (ignoring any constants). Similarly, T2(n) = n*T3(n) = n*n.
Then, use the following theorem:
If f1(n) = O(g1(n)) and f2(n) = O(g2(n)), then f1(n)×f2(n) = O(g1(n)×g2(n))
(source and proof)
This leaves us with T(n) = O(n^2 log n).
"Combining nested loops" is just an application of this theorem. The trouble can be in figuring out exactly how many operations each loop uses, which in this case is simple.
You can proceed formally using Sigma notation, faithfully imitating your loops: the outer counter takes floor(log2(n)) + 1 values, and for each of them the two inner sums contribute sum_{j=0}^{n-1} sum_{k=0}^{n-1} 1 = n^2, for a total of (floor(log2(n)) + 1) * n^2 = O(n^2 log n).
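As an empirical cross-check of the sums above, here is a short sketch (names mine) comparing the actual count with (floor(log2(n)) + 1) * n * n:

public class ThreeNestedLoops {
    public static void main(String[] args) {
        int n = 256;
        long count = 0;
        for (int i = n; i > 0; i = i / 2)
            for (int j = 0; j < n; j++)
                for (int k = 0; k < n; k++)
                    count++;

        // The outer loop runs for i = 256, 128, ..., 1,
        // i.e. floor(log2(n)) + 1 = 9 passes.
        int outer = 32 - Integer.numberOfLeadingZeros(n);
        System.out.println(count + " == " + (long) outer * n * n);
    }
}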
