A Big O notation question...What is the Big O for the following code:
for (int i = n; i > 0; i = i / 2) {
    for (int j = 0; j < n; j++) {
        for (int k = 0; k < n; k++) {
            count++;
        }
    }
}
My Thoughts:
So breaking it down, I think the outside loop is O(log2(n)), and each of the inner loops is O(n), which would result in O(n^2 * log2(n)). Question #1: is that correct?
Question #2:
When combining nested loops, is it always as simple as multiplying the Big O of each loop?
When loop counters do not depend on one another, it's always possible to work from the inside outward.
The innermost loop always takes time O(n), because it loops n times regardless of the values of j and i.
When the second loop runs, it runs for O(n) iterations, on each iteration doing O(n) work to run the innermost loop. This takes time O(n^2).
Finally, when the outer loop runs, it does O(n^2) work per iteration. It also runs for O(log n) iterations, since its iteration count equals the number of times you have to divide n by two before you reach 1. Consequently, the total work is O(n^2 log n).
In general, you cannot just multiply loops together, since their bounds might depend on one another. In this case, though, since there is no dependency, the runtimes can just be multiplied. Hopefully the above reasoning sheds some light on why this is - it's because if you work from the inside out thinking about how much work each loop does and how many times it does it, the runtimes end up getting multiplied together.
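If you want to convince yourself empirically, here is a quick sketch of mine (the class name and the closed form are my own additions, not part of the question) that counts the iterations and compares them against n^2 * (floor(log2 n) + 1):

public class LoopCount {
    public static void main(String[] args) {
        int n = 1024;
        long count = 0;
        // The three loops from the question, verbatim.
        for (int i = n; i > 0; i = i / 2)
            for (int j = 0; j < n; j++)
                for (int k = 0; k < n; k++)
                    count++;
        // The outer loop runs floor(log2 n) + 1 times; each run costs n^2.
        long predicted = (long) n * n * (63 - Long.numberOfLeadingZeros(n) + 1);
        System.out.println(count + " == " + predicted); // both are 11534336
    }
}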
Hope this helps!
Yes, this is correct: the outer loop is logN, the other two are N each, for a total of O(N^2 * logN).
In simple cases, yes. In more complex cases, where one loop's bounds depend on another loop's index, the calculations are more involved.
To answer this slightly (note: slightly) more formally, say T(n) is the time (or number of operations) required to complete the algorithm. Then, for the outer loop, T(n) = log n * T_2(n), where T_2(n) is the number of operations inside that loop (ignoring any constants). Similarly, T_2(n) = n * T_3(n) = n * n.
Then, use the following theorem:
If f1(n) = O(g1(n)) and f2(n) = O(g2(n)), then f1(n)×f2(n) = O(g1(n)×g2(n))
This leaves us with T(n) = O(n^2 log n).
"Combining nested loops" is just an application of this theorem. The trouble can be in figuring out exactly how many operations each loop uses, which in this case is simple.
You can proceed formally using Sigma Notation, to faithfully imitate your loops:
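A plain-text sketch of what that triple sum might look like (my own reconstruction, indexing the outer loop by how many times n has been halved):

sum_{i=0}^{floor(log2 n)} sum_{j=0}^{n-1} sum_{k=0}^{n-1} 1
    = (floor(log2 n) + 1) * n^2
    = O(n^2 log n)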
I am having trouble understanding how while loops affect the Big O time complexity.
For example, how would I calculate the time complexity for the code below?
Since it has a for loop that traverses each element in the array, with two while loops nested inside it, my initial thought was O(n^3) for the time complexity, but I do not think that is right.
HashMap<Integer, Boolean> ht = new HashMap<>();
int max = 0; // assumed declared somewhere in the original; it is used below
for (int j : array) {
    if (ht.get(j)) continue; // note: throws if j isn't in the map; assumed populated elsewhere
    int left = j - 1;
    // check if hashtable contains number
    while (ht.containsKey(left)) {
        // do something
        left--;
    }
    int right = j + 1;
    // check if hashtable contains number
    while (ht.containsKey(right)) {
        // do something
        right++;
    }
    int diff = right - left;
    if (max < diff) {
        // do something
    }
}
There is best case, average case, and worst case.
I'm going to have to assume there is something that constrains the two while loops so that neither iterates more than n times, where n is the number of elements in the array.
In the best case, you have O(n). That is because if(ht.get(j)) is always true, the continue path is always taken. Neither while loop is executed.
For the worst case, if(ht.get(j)) is always false, the while loops will be executed. Also, in the worst case, each while loop will have n passes. [1] The net result is 2 * n for both inner loops multiplied by n for the outer loop: (2 * n) * n. That would give you time complexity of O(n^2). [2]
The lookup time could potentially be a factor. A hash table lookup usually runs in constant time: O(1). That's the best case. But, the worst case is O(n). This happens when all entries have the same hash code. If that happens, it could potentially change your worst case to O(n^3).
[1] I suspect that in the worst case, the number of passes of the first while loop plus the number of passes of the second while loop is actually n, or close to it. But that doesn't change the result.
[2] In Big O, we choose the term that grows the fastest and ignore the coefficients. So, in this example, we drop the 2 in 2*n*n.
Assuming there are m and n entries in your HashMap and array, respectively.
Since you have n elements for the for loop, the complexity can be written as n * complexity_inside_for.
Inside the for loop, you have two consecutive (not nested) while loops, each contributing a complexity of m, since in the worst case it'll need to go through all entries in your HashMap. Therefore, complexity_inside_for = m + m = 2m.
So overall, the time complexity is n * 2m. However, as m and n approach infinity, the factor 2 doesn't matter because it is not a function of m and/or n and can be discarded. This gives a big-O time complexity of O(m*n).
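If you want to see that bound concretely, here is a small instrumented sketch of mine (the class name, the sample array, and the pass counter are my own additions; the map is pre-filled so the while loops actually run):

import java.util.HashMap;

public class WhilePassCounter {
    public static void main(String[] args) {
        int[] array = {3, 1, 2, 5, 4, 9, 8};
        HashMap<Integer, Boolean> ht = new HashMap<>();
        for (int j : array) ht.put(j, false); // here m == n

        long passes = 0;
        for (int j : array) {
            int left = j - 1;
            while (ht.containsKey(left)) { passes++; left--; }
            int right = j + 1;
            while (ht.containsKey(right)) { passes++; right++; }
        }
        // Each scan walks at most all m keys, so passes <= n * 2m:
        // O(m*n), not O(n^3).
        System.out.println(passes); // 22 for this input
    }
}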
For one nested loop the time complexity works like this: O(n^2). In each iteration of i, the inner loop is executed n times. The time complexity of a loop equals the number of times its innermost statement is executed.
So for your case that would be O(n^2) + O(n).
You can find more explanation in material on time complexity.
I was wondering what the big o notation is for the following (java)code:
while (n > 0) {
    while (n > 0) {
        n--;
    }
}
If I use n = 10 it will do one iteration of the outer loop and 10 of the inner loop, so a total of 11 iterations, right?
If I use n = 100 it will do one iteration of the outer loop and 100 of the inner loop, so a total of 101 iterations, right?
This is the point where I got stuck. I think the notation is O(n), simply because the number of iterations is almost equal to n.
But I don't know how to prove it.
I am not that much into math, so a clear explanation would be appreciated.
Informally speaking, for positive arguments, the outer loop takes exactly one iteration, as in the inner loop n is decreased to zero. The inner loop takes exactly n iterations, so its runtime complexity is O(n). In total, although the termination condition of the outer loop syntactically depends on n, it is in fact independent of n. The overall complexity can be seen as O(n+c), where c is a constant representing the execution of the outer loop. However, O(n+c) equals O(n).
What probably puzzles you is that in your terminology, you speak of a number of 101 iterations of a loop, where you in fact refer to two different loops.
It's O(n), because the outer loop runs one single time. When the inner loop finishes, then the condition for the outer loop is also false. Therefore the outer loop is not important to the O notation.
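A quick empirical check (my own sketch, with invented names) makes this visible by counting how often each loop body runs:

public class NestedWhile {
    public static void main(String[] args) {
        int n = 100;
        int outer = 0, inner = 0;
        while (n > 0) {
            outer++;
            while (n > 0) {
                inner++;
                n--;
            }
        }
        System.out.println(outer + " outer, " + inner + " inner"); // 1 outer, 100 inner
    }
}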
Yes, it's O(n). Mathematical proofs for even simple algorithms are not easy, however.
What you could do is apply weakest precondition to formally analyze this.
See this https://en.wikipedia.org/wiki/Predicate_transformer_semantics#While_loop
Informally, it's easy to see that after the inner while, n <= 0 must be true, regardless of what happens inside the inner loop (the inner while only exits when its condition n > 0 fails). And if n <= 0 then the outer while will end. As this happens every single time after the inner while (regardless of its contents), the outer loop never executes more than once.
Weakest precondition can be used to prove this more formally, but if you apply it to bigger problems your head will definitely start to ache. It's educational though.
I'm given code for an algorithm as such:
1 sum = 0;
2 for (i = 0; i < n; i++)
3     for (j = 1; j <= n; j *= 3)
4         sum++;
I'm told this algorithm runs in O(nlogn) where log is in base 3.
So I get that the second line runs n times, and since line 3 is independent of line 2, I would have to multiply the two to get the Big O. However, I'm not sure how the answer is nlogn (log in base 3). Is there a guaranteed way to figure this out every time? It seems like with nested loops, a different case can occur each time.
What you have been told is correct. The algorithm runs in O(nlog(n)). The third line, for(j=1; j<= n; j*=3), runs in O(log3(n)) because you multiply j by 3 each time. To see it more clearly, solve this problem: how many times do you need to multiply 1 by 3 to get n? That is 3^x = n, and the solution is x = log3(n).
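As a quick check, here is a tiny sketch (names are mine) that counts the inner loop's iterations and compares them with log base 3 of n:

public class LogBase3 {
    public static void main(String[] args) {
        int n = 81;
        int iterations = 0;
        for (int j = 1; j <= n; j *= 3) iterations++;
        System.out.println(iterations);                // 5: j = 1, 3, 9, 27, 81
        System.out.println(Math.log(n) / Math.log(3)); // ~4.0, so iterations grow as log3(n)
    }
}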
Yes. The algorithm runs in nlog(n) time, where log is base 3.
The easiest way to calculate complexity is to count the number of operations done.
The outer for loop runs n times. Now let's calculate how many times the inner for loop runs for each n. For n=1,2 the inner for loop runs 1 time. For n=3,4,5,6,7,8 it runs 2 times. And so on...
That means that the inner for loop runs in logarithmic time (log(n)) for each n.
So n*log(n) will be the total complexity.
In the second loop you have j *= 3, which means you can divide n by 3 log3(n) times. That gives you the O(logn) complexity.
Since your first loop is O(n), you have O(nlogn) in the end.
public static void Comp(int n)
{
    int count = 0;
    for (int i = 0; i < n; i++)
    {
        for (int j = 0; j < n; j++)
        {
            for (int k = 1; k < n; k *= 2)
            {
                count++;
            }
        }
    }
    System.out.println(count);
}
Does anyone know what the time complexity is?
And what is the Big O?
Please, can you explain this to me, step by step?
Whoever gave you this problem is almost certainly looking for the answer n^2 log(n), for reasons explained by others.
However the question doesn't really make any sense. If n > 2^30, k will overflow, making the inner loop infinite.
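To illustrate that (a sketch of mine; the class name is invented), you can watch k wrap around without running the loop itself:

public class OverflowDemo {
    public static void main(String[] args) {
        int k = 1 << 30;        // 2^30 == 1073741824
        k *= 2;                 // wraps to Integer.MIN_VALUE
        System.out.println(k);  // -2147483648
        k *= 2;                 // -2^32 wraps to 0, and 0 * 2 stays 0
        System.out.println(k);  // 0, so "k < n" stays true for any positive n
    }
}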
Even if we treat this problem as being completely theoretical, and assume n, k and count aren't Java ints, but some theoretical integer type, the answer n^2 log n assumes that the operations ++ and *= have constant time complexity, no matter how many bits are needed to represent the integers. This assumption isn't really valid.
Update
It has been pointed out to me in the comments below that, based on the way the hardware works, it is reasonable to assume that ++, *=2 and < all have constant time complexity, no matter how many bits are required. This invalidates the third paragraph of my answer.
In theory this is O(n^2 * log(n)).
Each of the two outer loops is O(n) and the inner one is O(log(n)), because log base 2 of n is the number of times you have to divide n by 2 to get 1.
Also, this is a tight bound, i.e. the code is also Θ(n^2 * log(n)).
The time complexity is O(n^2 log n). Why? Each for-loop is a function of n, and you have to multiply by n for each for-loop, except the inner loop, which grows as log n. Why? Because on each iteration k is multiplied by 2. Think of merge sort or binary search trees.
Details
For the first two loops: a summation of 1 for i from 0 to n-1, which is n, and likewise for j, so the first two loops give n * n = n^2 = O(n^2).
For the k loop, k grows as 1, 2, 4, 8, 16, 32, ..., so after m iterations k = 2^m. The loop stops when 2^m reaches n; take the log of both sides and you get m = log n.
Again, not clear? The values of k form a geometric series: sum_{k=m}^{n-1} a^k = (a^m - a^n)/(1 - a). So if we set m = 0 and a = 2, we get (1 - 2^n)/(-1) = 2^n - 1. Why is a = 2? Because that is the value of a for which the series yields 2, 4, 8, 16, ..., 2^k.
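As a sanity check of the closed form (my own sketch; assuming n is a power of two so that log2(n) is exact):

public class CompCheck {
    public static void main(String[] args) {
        int n = 1024;
        long count = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                for (int k = 1; k < n; k *= 2)
                    count++;
        System.out.println(count);             // 10485760
        System.out.println((long) n * n * 10); // n^2 * log2(n) = 1024 * 1024 * 10
    }
}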
What I have done before
Asymptotic analysis of three nested for loops
I was solving for the time complexity of this algorithm.
Seeing that the outer loop runs n times and inner loop 1 runs i times, I applied the summation and got the two outer loops' complexity to be n(n+1)/2.
Then the inner loop executes j times, equal to the summation of j from j = 0 to j = n(n+1)/2. This yields a total complexity of O(n^4).
The problem
Asymptotic analysis of three nested for loops
Seems like my answer is wrong. Where did I make the mistake?
You counted the wrong thing. The inner loop executes j times, and j is always less than n. Therefore, for each of your n(n-1)/2 times the inner loop starts, the body of the inner loop will be executed less than n times, which means the total number of times the loop executes is at most n(n-1)/2 * O(n), which is at most O(n^3).

What I think you did was double-counting. You tried to use the "summation" of j, which is the total number of times the for (int j... loop executes. But that information is already contained in the computation you already made, n(n-1)/2; using that information again and multiplying it with n(n-1)/2 is double-counting.
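Assuming the linked code has the usual shape (for (i = 0; i < n; i++), then for (j = 0; j < i; j++), then for (k = 0; k < j; k++)), the correct count comes from a single triple sum, not from multiplying two summations:

sum_{i=0}^{n-1} sum_{j=0}^{i-1} sum_{k=0}^{j-1} 1
    = sum_{i=0}^{n-1} sum_{j=0}^{i-1} j
    = sum_{i=0}^{n-1} i(i-1)/2
    = O(n^3)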