Time complexity for a triple for loop - java

What I have done before
Asymptotic analysis of three nested for loops
I was working out the time complexity of this algorithm.
Seeing that the outer loop runs n times and the first inner loop runs i times, I applied the summation and got the complexity of the two outer loops to be n(n + 1)/2.
Then the innermost loop executes j times, which I took to equal the summation of j from j = 0 to j = n(n + 1)/2. This yields a total complexity of O(n^4).
The problem
Asymptotic analysis of three nested for loops
It seems my answer is wrong. Where did I make the mistake?

You counted the wrong thing. The inner loop executes j times, and j is always less than n. Therefore, for each of the n(n-1)/2 times the inner loop starts, its body is executed fewer than n times, which means the total number of executions is at most n(n-1)/2 * O(n), which is at most O(n^3).
What I think you did was double-count. You tried to use the "summation" of j, which is the total number of times the for (int j...) loop executes. But that information is already contained in the n(n-1)/2 you computed; using it again and multiplying it by n(n-1)/2 counts the same work twice.
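The linked code isn't reproduced above, so as an illustration, here is a sketch in the shape this answer assumes (i bounded by n, j by i, k by j; the class and method names are mine, not the original code). Counting the innermost executions shows the cubic growth directly:

public class TripleLoopCount {
    // Assumed reconstruction of the triple loop; counts how many
    // times the innermost body runs.
    static long count(int n) {
        long ops = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < i; j++)
                for (int k = 0; k < j; k++)
                    ops++; // innermost body
        return ops;
    }

    public static void main(String[] args) {
        for (int n : new int[]{100, 200, 400}) {
            long ops = count(n);
            // ops/n^3 stays near 1/6, so growth is cubic; if the loop
            // were O(n^4), this ratio would keep growing with n.
            System.out.printf("n=%d ops=%d ops/n^3=%.4f%n",
                    n, ops, ops / Math.pow(n, 3));
        }
    }
}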

Related

Big O notation explanation nested while loop

I was wondering what the big o notation is for the following (java)code:
while (n > 0) {
    while (n > 0) {
        n--;
    }
}
If I use n = 10, it will do one iteration of the outer loop and 10 of the inner loop. So a total of 11 iterations, right?
If I use n = 100, it will do one iteration of the outer loop and 100 of the inner loop. So a total of 101 iterations, right?
But this is the point where I got stuck: I think the notation is O(n), simply because the number of iterations is almost equal to n, but I don't know how to prove it.
I am not that much into math, so a clear explanation would be appreciated.
Informally speaking, for positive arguments the outer loop takes exactly one iteration, since the inner loop decreases n to zero. The inner loop takes exactly n iterations, so its runtime complexity is O(n). Although the termination condition of the outer loop syntactically depends on n, it is in fact independent of n. The overall complexity can be seen as O(n + c), where c is a constant representing the single execution of the outer loop; and O(n + c) equals O(n).
What probably puzzles you is that you speak of 101 iterations of a loop, when you are in fact referring to two different loops.
It's O(n), because the outer loop runs one single time. When the inner loop finishes, then the condition for the outer loop is also false. Therefore the outer loop is not important to the O notation.
Yes, it's O(n). Mathematical proofs for even simple algorithms are not easy, however.
What you could do is apply the weakest-precondition calculus to analyze this formally.
See https://en.wikipedia.org/wiki/Predicate_transformer_semantics#While_loop
Informally, it's easy to see that after the inner while, n <= 0 must hold (the negation of its loop condition), regardless of what happens inside the inner loop. And if n <= 0, then the outer while will end. Since this happens every single time after the inner while (regardless of its contents), the outer loop never executes more than once.
Weakest preconditions can be used to prove this more formally, but if you apply them to bigger problems your head will definitely start to ache. It's educational, though.
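If the formal route feels heavy, an instrumented run shows the same thing empirically. This is just the loop from the question with illustrative counters added:

public class WhileLoopCount {
    public static void main(String[] args) {
        int n = 100;               // try 10, 1000, ...
        long outer = 0, inner = 0; // counters added for illustration
        while (n > 0) {
            outer++;
            while (n > 0) {
                n--;
                inner++;
            }
        }
        // Prints outer=1 inner=100: one outer pass, n inner passes.
        System.out.println("outer=" + outer + " inner=" + inner);
    }
}

Whatever positive n you start with, outer stays 1 and inner equals n, matching the O(n) argument above.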

Complexity of An Algorithm (Nested loops)

I'm given code for an algorithm as such:
1 sum = 0;
2 for (i = 0; i < n; i++)
3     for (j = 1; j <= n; j *= 3)
4         sum++;
I'm told this algorithm runs in O(n log n), where the log is base 3.
So I get that line 2 runs n times, and since line 3 is independent of line 2, I would have to multiply the two to get the Big O. However, I'm not sure how the answer comes out to n log n (log base 3). Is there a guaranteed way to figure this out every time? It seems like with nested loops, a different case can occur each time.
What you have been told is correct: the algorithm runs in O(n log(n)). The third line, for(j=1; j<= n; j*=3), runs in O(log3(n)) because j is multiplied by 3 each time. To see it more clearly, solve this problem: how many times do you need to multiply 1 by 3 to get n? That is, 3^x = n, and the solution is x = log3(n).
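As a quick check (a sketch, not part of the original exercise), you can count the j-loop's iterations and print log3(n) next to them:

public class LogBase3 {
    public static void main(String[] args) {
        for (int n : new int[]{9, 81, 6561, 1_000_000}) {
            int iters = 0;
            for (int j = 1; j <= n; j *= 3)
                iters++; // one iteration per tripling of j
            double log3 = Math.log(n) / Math.log(3);
            // iters equals floor(log3(n)) + 1, i.e., Theta(log n).
            System.out.printf("n=%d iterations=%d log3(n)=%.2f%n",
                    n, iters, log3);
        }
    }
}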
Yes, the algorithm runs in n log(n) time, where the log is base 3.
The easiest way to calculate complexity is to count the number of operations performed.
The outer for loop runs n times. Now let's calculate how many times the inner for loop runs for a given n: for n = 1, 2 it runs 1 time; for n = 3, 4, 5, 6, 7, 8 it runs 2 times; and so on.
That means the inner for loop runs in logarithmic time, log(n), for each iteration of the outer loop.
So n * log(n) is the total complexity.
In the second loop you have j *= 3, which means j can be tripled (equivalently, n can be divided by 3) log3(n) times before j exceeds n. That gives you the O(log n) factor.
Since your first loop is O(n), you have O(n log n) in the end.

Count amount of times loop runs (Big O)

I'm given a pseudocode statement as such:
function testFunc(B)
    for j = 1 to B.length - 1
        for i = 1 to B.length - j
            if (B[i-1] > B[i])
                swap B[i-1] and B[i]
And I'm told to show that this algorithm runs in O(n^2) time.
So I know that the first for loop runs n times, because I believe the bound is inclusive. I'm not sure about the rest of the lines, though; would the second for loop run n-2 times? Any help would be much appreciated.
The inner loop runs a decreasing number of times. Look at a concrete example: if B.length were 10, the body of the inner loop would be executed 9 times, then 8 times, and so on, down to 1 time.
Using Gauss' formula,
n(n + 1) / 2
with n = 9, you can see that the inner body would be executed 9(9 + 1)/2 = 45 times in that example.
So it follows that for an array of length n, it runs (n - 1)n / 2 times. This is equivalent to:
1/2 n^2 - 1/2 n
In terms of Big-O, the coefficients and the lower-order terms are ignored, so this is equivalent to O(n^2).
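As a sanity check, here is one possible Java translation of the pseudocode (assuming B is an int array and that "to" is inclusive) that counts how often the inner body is reached:

public class SwapCount {
    // Returns how many times the comparison/swap site is reached.
    static long countInnerBody(int[] b) {
        long ops = 0;
        for (int j = 1; j <= b.length - 1; j++) {
            for (int i = 1; i <= b.length - j; i++) {
                ops++; // the if/swap in the pseudocode
                if (b[i - 1] > b[i]) {
                    int tmp = b[i - 1];
                    b[i - 1] = b[i];
                    b[i] = tmp;
                }
            }
        }
        return ops;
    }

    public static void main(String[] args) {
        int n = 10;
        long ops = countInnerBody(new int[n]);
        // Prints "45 vs 45" for n = 10, matching (n - 1)n / 2.
        System.out.println(ops + " vs " + (n - 1) * n / 2);
    }
}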
If N = B.length, then outer loop runs N-1 times, and inner loop runs (N-1)+...+3+2+1 times, for a total of (N-1) * (N/2) = N^2/2 - N/2 times, which means O(n^2).
Let's say that B.length is 5. The outer loop will thus run 4 times. On the first iteration through the outer loop, the inner loop will run 4 times; on the second iteration the inner loop will run 3 times; 2 times for the third; and 1 time for the fourth.
Let's lay the results out geometrically:
AAAA
AAA
AA
A
Each A represents getting to the conditional/swap inside the nested loop, and you want to know, in general, how many A's there are.
An easy way to count them is to double the triangle shape to produce a rectangle:
AAAAB
AAABB
AABBB
ABBBB
and you can quickly see that for a triangle whose side has length N-1 (with N = B.length), there are N*(N-1)/2 A's, because the A's make up half of the (N-1)-by-N rectangle. Carrying out the multiplication, and ignoring the factor of 1/2 (because big-O doesn't care about constants), we see that there are O(N^2) A's.

Space Complexity Recurrence

So this is a 2 part question.
I have some code that asks for the time complexity, and it consists of 3 for loops (nested):
public void use_space(int n) {
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            for (int k = 0; k < N; k++)
                ; // loop body
    // and at the end of the code, it makes recursive calls to the function:
    use_space(n/2);
    use_space(n/2);
}
So what I derived for the time complexity was the recurrence T(n) = 2T(n/2) + n^3. The reason: there are 2 recursive calls to the function, each on an input of size n/2, and the nested for loops take n^3 time (3 loops).
Is this correct?
And then for the space complexity, I got S(n) = S(n/2) + n
Hope someone can clarify and tell me if this is wrong/explain. All help would be greatly appreciated.
Your time complexity is correct. You can use the master theorem to simplify it to Theta(n^3).
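As an aside, you can evaluate the recurrence numerically and watch T(n)/n^3 settle near a constant, which is what Theta(n^3) predicts; the base case T(1) = 1 below is an assumption:

public class RecurrenceCheck {
    // T(n) = 2T(n/2) + n^3, with an assumed base case T(1) = 1.
    static double t(long n) {
        if (n <= 1) return 1;
        return 2 * t(n / 2) + Math.pow(n, 3);
    }

    public static void main(String[] args) {
        for (long n = 16; n <= 16_384; n *= 4) {
            // For exact powers of two the ratio approaches
            // 1/(1 - 1/4) = 4/3, the constant hidden by Theta(n^3).
            System.out.printf("n=%d T(n)/n^3=%.4f%n",
                    n, t(n) / Math.pow(n, 3));
        }
    }
}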
But your space complexity seems to be (a little) incorrect.
For every call of use_space(n) you have to save three numbers: i, j and k. These numbers are at most of size n (if n == N; I think you mixed them up), so each needs log n bits. Because the space is freed when use_space finishes, you only ever have to account for one additional call, use_space(n/2), at a time.
So you get the space complexity S(n) = S(n/2) + 3 log n.
Improvement: you could free i, j and k after the three loops end, so you do not need 3 log n AND S(n/2) at the same time, but first 3 log n and only afterwards S(n/2), which in turn is first 3 log(n/2) and then S(n/4), and so on. You then only need the maximum of the space in use at any one time, which is 3 log n.
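To make the one-live-call-at-a-time point concrete, here is a runnable sketch with hypothetical depth counters; it uses n consistently where the original mixed n and N, and adds an assumed base case:

public class SpaceSketch {
    static int depth = 0, maxDepth = 0; // illustrative instrumentation

    static void useSpace(int n) {
        if (n <= 0) return; // assumed base case; the original omits one
        depth++;
        maxDepth = Math.max(maxDepth, depth);
        long sum = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                for (int k = 0; k < n; k++)
                    sum++; // the Theta(n^3) work
        // The second call starts only after the first returns, so at
        // most one child frame is live at a time.
        useSpace(n / 2);
        useSpace(n / 2);
        depth--;
    }

    public static void main(String[] args) {
        useSpace(64);
        // Prints 7 for n = 64, i.e., log2(64) + 1: logarithmic depth.
        System.out.println("max recursion depth: " + maxDepth);
    }
}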

Big O for 3 nested loops

A Big O notation question...What is the Big O for the following code:
for (int i = n; i > 0; i = i / 2) {
    for (int j = 0; j < n; j++) {
        for (int k = 0; k < n; k++) {
            count++;
        }
    }
}
My Thoughts:
So breaking it down, I think the outer loop is O(log2(n)), and each of the inner loops is O(n), which would result in O(n^2 * log2(n)). Question #1: is that correct?
Question #2:
when combining nested loops, is it always as simple as multiplying the Big O of each loop?
When loop counters do not depend on one another, it's always possible to work from the inside outward.
The innermost loop always takes time O(n), because it loops n times regardless of the values of j and i.
When the second loop runs, it runs for O(n) iterations, on each iteration doing O(n) work to run the innermost loop. This takes time O(n^2).
Finally, when the outer loop runs, it does O(n^2) work per iteration. It also runs for O(log n) iterations, since the number of iterations equals the number of times you have to divide n by two before reaching 1. Consequently, the total work is O(n^2 log n).
In general, you cannot just multiply loops together, since their bounds might depend on one another. In this case, though, since there is no dependency, the runtimes can just be multiplied. Hopefully the above reasoning sheds some light on why this is - it's because if you work from the inside out thinking about how much work each loop does and how many times it does it, the runtimes end up getting multiplied together.
Hope this helps!
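To double-check the multiplication empirically, the following sketch runs the loops from the question and compares the count against n^2 * (floor(log2 n) + 1):

public class NestedLoopCheck {
    public static void main(String[] args) {
        int n = 64;
        long count = 0;
        for (int i = n; i > 0; i = i / 2)   // floor(log2 n) + 1 iterations
            for (int j = 0; j < n; j++)
                for (int k = 0; k < n; k++)
                    count++;
        int log2n = 31 - Integer.numberOfLeadingZeros(n); // floor(log2 n)
        long predicted = (long) n * n * (log2n + 1);
        // Prints "28672 vs 28672" for n = 64: 4096 * 7.
        System.out.println(count + " vs " + predicted);
    }
}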
Yes, this is correct: the outer loop is log N, the other two are N each, for a total of O(N^2 log N).
In the simple cases, yes. In more complex cases, when loop indexes start at numbers indicated by other indexes, the calculations are more complex.
To answer this slightly (note: slightly) more formally, say T(n) is the time (or number of operations) required to complete the algorithm. Then, for the outer loop, T(n) = log n * T_2(n), where T_2(n) is the number of operations inside the loop (ignoring any constants). Similarly, T_2(n) = n * T_3(n) = n * n.
Then, use the following theorem:
If f1(n) = O(g1(n)) and f2(n) = O(g2(n)), then f1(n)×f2(n) = O(g1(n)×g2(n))
(source and proof)
This leaves us with T(n) = O(n^2 log n).
"Combining nested loops" is just an application of this theorem. The trouble can be in figuring out exactly how many operations each loop uses, which in this case is simple.
You can proceed formally using sigma notation, faithfully imitating your loops:
sum over i in {n, n/2, n/4, ..., 1} of [ sum_{j=0}^{n-1} sum_{k=0}^{n-1} 1 ]
  = (floor(log2 n) + 1) * n * n
  = O(n^2 log n)
