Count the number of times a loop runs (Big O) - Java

I'm given a pseudocode statement as such:
function testFunc(B)
    for j = 1 to B.length-1
        for i = 1 to B.length-j
            if (B[i-1] > B[i])
                swap B[i-1] and B[i]
And I'm told to show that this algorithm runs in O(n^2) time.
So I know that the first for loop runs n times, because I believe it's inclusive. I'm not sure about the rest of the lines, though. Would the second for loop run n-2 times? Any help would be much appreciated.

The inner loop runs a decreasing number of times. Look at a concrete example: if B.length were 10, the body of the inner loop would be executed 9 times on the first pass, then 8 times, and so on, down to 1 time.
Using Gauss' formula for the sum 1 + 2 + ... + m:
m(m + 1) / 2
you can see that the inner code would be executed 45 times in that example (9(9 + 1)/2 = 45).
So, it follows that for an array of length n, it runs (n-1)n/2 times. This is equivalent to:
1/2 n^2 - 1/2 n
In terms of Big-O, the coefficients and the lower-order terms are ignored, so this is equivalent to O(n^2).

If N = B.length, then the outer loop runs N-1 times, and the inner loop runs (N-1) + ... + 3 + 2 + 1 times, for a total of (N-1) * (N/2) = N^2/2 - N/2 times, which means O(N^2).

Let's say that B.length is 5. The outer loop will thus run 4 times. On the first iteration through the outer loop, the inner loop will run 4 times; on the second iteration the inner loop will run 3 times; 2 times for the third iteration; and 1 time for the fourth.
Let's lay the results out geometrically:
AAAA
AAA
AA
A
Each A represents getting to the conditional/swap inside the nested loop, and you want to know, in general, how many A's there are.
An easy way to count them is to double the triangle shape to produce a rectangle:
AAAAB
AAABB
AABBB
ABBBB
and you can quickly see that for a triangle whose side has length N-1 (where N = B.length), there are N*(N-1)/2 A's, because the A's make up half of the (N-1)-by-N rectangle. Carrying out the multiplication, and ignoring the scale factor of 1/2 (because big-O doesn't care about constants), we see that there are O(N^2) A's.
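If you want to check the count empirically, here is a minimal Java sketch (the class and variable names are my own) that runs the same two loops as the pseudocode and counts how many times the comparison in the inner body executes:

public class CountComparisons {
    public static void main(String[] args) {
        int[] b = {5, 1, 4, 2, 8, 3, 7, 6, 0, 9}; // N = 10
        int n = b.length;
        long count = 0;
        for (int j = 1; j <= n - 1; j++) {        // outer loop: N-1 passes
            for (int i = 1; i <= n - j; i++) {    // inner loop: N-j iterations
                count++;                          // one comparison per iteration
                if (b[i - 1] > b[i]) {            // swap B[i-1] and B[i]
                    int tmp = b[i - 1];
                    b[i - 1] = b[i];
                    b[i] = tmp;
                }
            }
        }
        System.out.println(count); // prints 45 = 10*9/2 = N*(N-1)/2
    }
}

The printed count matches N*(N-1)/2, and doubling the array length roughly quadruples it, which is exactly the O(N^2) behavior described above.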

Related

Confusion regarding while loop in for loop for time complexity

I'm learning about time complexity and I understand the gist of it, but there's one thing that's really confusing me: the time complexity of a while loop inside a for loop.
This is the code I am analyzing:
sum := 0
for i := 1 to n
    j := 1
    while j ≤ i
        sum += j
        j *= 5
    end
end
I've given this a shot and made the following table, breaking it down line by line:
CODE            | COST                               | # OF TIMES | TIME COMPLEXITY
sum := 0        | 1                                  | 1          | 1
for i := 1 to n | int i = 1 (1), i <= n (1), i++ (1) | 1, n+1, n  | 2n+1
j := 1          | 1                                  | n          | n
while j ≤ i     | j ≤ i (1)                          | ?          | ?
sum += j        | 1                                  | ?          | ?
j *= 5          | 1                                  | ?          | ?
end             | 0                                  | 1          | 0
end             | 0                                  | 1          | 0
I understand how the time complexity works for the for loop, but when I get to the while loop I'm confused.
I know that assignments have cost of 1 and comparisons have a cost of 1.
If the while loop were written as:
sum := 0
j := 1
while j <= n
    sum += j
    j *= 5
end
Since j is multiplied by 5 each time (j *= 5), its time complexity would be log base 5 of n.
But the while loop in the code uses j ≤ i, which is throwing me off.
If someone could help me with the cost and # of times for the individual lines of the while loop, I would really appreciate it.
fyi: this isn't an assignment for school or anything like that, I'm genuinely trying to understand this concept for myself.
Your loop is equivalent to the sum of log5(i) for all i from 1 to n. The logarithm base doesn't matter; it's a constant, so I'll just say log from now on. This summation is asymptotically equal to n log n. Think about it this way: the graph of log is really, really flat, so most of the higher values in your summation look the same to the log function. This is only an intuition, not a formal proof, but it's sufficient. If you need a formal proof, check out this post.
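If you want to see that asymptotic behaviour concretely, here is a small Java sketch (the harness and variable names are mine) that runs the original loops, counts the actual while-loop iterations, and compares the total against n * log5(n):

public class WhileInForCount {
    public static void main(String[] args) {
        int n = 1_000_000;
        long sum = 0;
        long count = 0;
        for (int i = 1; i <= n; i++) {
            int j = 1;
            while (j <= i) {   // runs floor(log5(i)) + 1 times
                sum += j;
                count++;
                j *= 5;
            }
        }
        double nLog5n = n * (Math.log(n) / Math.log(5)); // n * log5(n)
        System.out.println(count + " vs " + nLog5n);     // same order of magnitude
    }
}

The two printed numbers stay within a constant factor of each other as n grows, which is what O(n log n) means here.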

Complexity of an Algorithm (Nested Loops)

I'm given code for an algorithm as such:
1 sum = 0;
2 for (i = 0; i < n; i++)
3     for (j = 1; j <= n; j *= 3)
4         sum++;
I'm told this algorithm runs in O(n log n), where the log is base 3.
So I get that line 2 runs n times, and since line 3 is independent of line 2, I would have to multiply the two to get the Big O. However, I'm not sure how the answer is n log n (log base 3). Is there a guaranteed way to figure this out every time? It seems like with nested loops, a different case can occur each time.
What you have been told is correct. The algorithm runs in O(n log(n)). The third line, for(j=1; j<=n; j*=3), runs in O(log3(n)) because j is multiplied by 3 each time. To see it more clearly, solve the problem: how many times do you need to multiply 1 by 3 to get n? That is, 3^x = n, and the solution is x = log3(n).
Yes, the algorithm runs in n log(n) time, where the log is base 3.
The easiest way to calculate complexity is to count the number of operations done.
The outer for loop runs n times. Now let's calculate how many times the inner for loop runs for each n: for n = 1, 2 the inner loop runs once; for n = 3, 4, 5, 6, 7, 8 it runs twice; and so on...
That means the inner for loop runs in logarithmic time (log(n)) for each n.
So n*log(n) will be the total complexity.
In the second loop you have j *= 3, which means you can divide n by 3 log3(n) times. That gives you the O(log n) complexity.
Since your first loop is O(n), you end up with O(n log n).
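As a quick sanity check, here is a short Java sketch (the harness names are mine) that counts the iterations of the loop from line 3 for a few values of n and prints log3(n) next to the count:

public class LogBase3Count {
    public static void main(String[] args) {
        for (int n : new int[] {2, 8, 9, 26, 27, 1000}) {
            int count = 0;
            for (int j = 1; j <= n; j *= 3) { // the loop from line 3
                count++;
            }
            double log3n = Math.log(n) / Math.log(3);
            // count is floor(log3(n)) + 1
            System.out.println("n=" + n + "  iterations=" + count + "  log3(n)=" + log3n);
        }
    }
}

For n = 1, 2 it prints 1 iteration, for n = 3..8 it prints 2, matching the case analysis above.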

Space Complexity Recurrence

So this is a 2 part question.
I have some code for which I'm asked the time complexity; it consists of 3 nested for loops:
public void use_space(int n) {
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            for (int k = 0; k < N; k++)
                ; // empty loop body
    // and at the end of the code, it makes two recursive calls to the function
    use_space(n/2);
    use_space(n/2);
}
So what I derived for this time complexity recurrence was: T(n) = 2T(n/2) + n^3. The reason I got that was that there are 2 recursive calls to the function, each on an input of size n/2, and the nested for loops take n^3 time (3 loops).
Is this correct?
And then for the space complexity, I got S(n) = S(n/2) + n
I hope someone can clarify whether this is wrong and explain why. All help would be greatly appreciated.
Your time complexity is correct. You can use the master theorem to simplify it to Theta(n^3).
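For completeness, here is the master-theorem step sketched out (using the standard statement of the theorem): in T(n) = 2T(n/2) + n^3 we have a = 2, b = 2 and f(n) = n^3. Since n^(log_2 2) = n^1 grows strictly slower than n^3, the work at the root dominates (case 3 of the theorem), so T(n) = Theta(n^3).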
But your space complexity seems to be (a little) incorrect.
For every call of use_space(n) you have to store three numbers, i, j and k. These numbers are at most the size of n (or of N; I think you mixed the two up), so each needs log n bits. Because the space is freed when a call to use_space finishes, at any time you only have to account for one additional call, use_space(n/2).
So you get the space complexity S(n) = S(n/2) + 3 log n.
Improvement: you could free i, j and k after the three loops end, so you do not need 3 log n AND S(n/2) at the same time, but first 3 log n and after that S(n/2), which in turn is first 3 log(n/2) and then S(n/4), and so on. Then you only need the maximum space used at any one time, which is 3 log n.
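A quick sketch of the arithmetic behind that improvement (my own unrolling, under the answer's assumptions): without it, S(n) = S(n/2) + 3 log n unrolls to 3(log n + log(n/2) + log(n/4) + ... + 1), which is Theta(log^2 n). With i, j and k freed before the recursive call, those terms are needed one after another rather than simultaneously, so the peak is just the largest of them, 3 log n, i.e. Theta(log n).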

Time complexity for a triple for loop

What I have done before
Asymptotic analysis of three nested for loops
I was solving for the time complexity of this algorithm.
Seeing that the outer loop runs n times and inner loop 1 runs i times, I applied the summation and got the two outer loops' complexity to be n(n+1)/2.
Then the inner loop executes j times, equal to the summation of j for j from 0 to n(n+1)/2. This yields a total complexity of O(n^4).
The problem
Asymptotic analysis of three nested for loops
Seems like my answer is wrong. Where did I make the mistake?
You counted the wrong thing. The inner loop executes j times, and j is always less than n. Therefore, for each of the n(n-1)/2 times the inner loop starts, the body of the inner loop will be executed fewer than n times, which means the total number of times it executes is at most n(n-1)/2 * O(n), which is at most O(n^3).
What I think you did was double-counting. You tried to use the "summation" of j, which is the total number of times the for (int j... loop executes. But that information is already contained in the computation you already made, n(n-1)/2; using that information again and multiplying it by n(n-1)/2 is double-counting.
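To make the counting concrete, here is a hedged reconstruction in Java (the exact loop bounds come from the description above, not from the linked question, so treat them as an assumption): it counts how often the innermost body runs and compares the count against n^3:

public class TripleLoopCount {
    public static void main(String[] args) {
        int n = 500;
        long count = 0;
        for (int i = 0; i < n; i++)          // outer loop: n iterations
            for (int j = 0; j < i; j++)      // middle loop: i iterations
                for (int k = 0; k < j; k++)  // inner loop: j iterations
                    count++;
        // count = n(n-1)(n-2)/6, about n^3/6: cubic, not n^4
        System.out.println(count + " vs n^3/6 = " + (long) n * n * n / 6);
    }
}

The count tracks n^3/6 as n grows, confirming that the dependent bounds only change the constant factor, not the exponent.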

Big O for 3 nested loops

A Big O notation question... What is the Big O of the following code:
for (int i = n; i > 0; i = i / 2) {
    for (int j = 0; j < n; j++) {
        for (int k = 0; k < n; k++) {
            count++;
        }
    }
}
My Thoughts:
So breaking it down, I think the outer loop is O(log2(n)) and each of the inner loops is O(n), which would result in O(n^2 * log2(n)). Question #1: is that correct?
Question #2:
When combining nested loops, is it always just as simple as multiplying the Big O of each loop?
When loop counters do not depend on one another, it's always possible to work from the inside outward.
The innermost loop always takes time O(n), because it loops n times regardless of the values of j and i.
When the second loop runs, it runs for O(n) iterations, on each iteration doing O(n) work to run the innermost loop. This takes time O(n^2).
Finally, when the outer loop runs, it does O(n^2) work per iteration. It also runs for O(log n) iterations, since it runs once for each time n can be divided by two before reaching 1. Consequently, the total work is O(n^2 log n).
In general, you cannot just multiply loops together, since their bounds might depend on one another. In this case, though, since there is no dependency, the runtimes can simply be multiplied. Hopefully the above reasoning sheds some light on why this is: if you work from the inside out, thinking about how much work each loop does and how many times it does it, the runtimes end up getting multiplied together.
Hope this helps!
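If you want to verify this empirically, here is a small Java harness (my own wrapper around the code from the question) that counts the increments and compares them to n^2 * (floor(log2 n) + 1):

public class HalvingLoopCount {
    public static void main(String[] args) {
        int n = 1024;
        long count = 0;
        for (int i = n; i > 0; i = i / 2)      // floor(log2 n) + 1 iterations
            for (int j = 0; j < n; j++)
                for (int k = 0; k < n; k++)
                    count++;
        int log2n = 31 - Integer.numberOfLeadingZeros(n); // floor(log2 n)
        long predicted = (long) n * n * (log2n + 1);
        System.out.println(count + " vs " + predicted);   // both 11,534,336 for n = 1024
    }
}

For n = 1024 the outer loop runs 11 times (1024, 512, ..., 1), so both numbers come out to 11 * 1024^2, matching the O(n^2 log n) bound.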
Yes, this is correct: the outer loop is O(log N), the other two are O(N) each, for a total of O(N^2 log N).
In the simple cases, yes. In more complex cases, when loop bounds depend on other loop indexes, the calculations are more involved.
To answer this slightly (note: slightly) more formally, say T(n) is the time (or number of operations) required to complete the algorithm. Then, for the outer loop, T(n) = (log n) * T2(n), where T2(n) is the number of operations inside the loop (ignoring any constants). Similarly, T2(n) = n * T3(n) = n * n.
Then, use the following theorem:
If f1(n) = O(g1(n)) and f2(n) = O(g2(n)), then f1(n)×f2(n) = O(g1(n)×g2(n))
(source and proof)
This leaves us with T(n) = O(n^2 log n).
"Combining nested loops" is just an application of this theorem. The trouble can be in figuring out exactly how many operations each loop uses, which in this case is simple.
You can proceed formally using Sigma notation, to faithfully imitate your loops. The outer variable i takes the values n, n/2, n/4, ..., 1, which is floor(log2(n)) + 1 values in total, so counting the innermost increments gives:
sum over those values of i of ( sum_{j=0}^{n-1} sum_{k=0}^{n-1} 1 ) = (floor(log2(n)) + 1) * n * n = O(n^2 log n)
