Finding the big O complexity for the following code - java

int sum = 0;
for (int i = 1; i < n; i++)
    for (int j = 1; j < i*i; j++)
        for (int k = 1; k < j; k++)
            if (j % i == 1)
                sum++;
The first statement takes constant time.
The first for loop runs n times.
The second for loop runs n^2 times.
The third for loop runs n times.
The if statement takes constant time.
The final statement takes constant time.
Thus the final complexity is n x n^2 x n x n = O(n^4).
Is my understanding and final answer correct?

Not quite. Because none of the loops has braces, all three are nested: the k loop runs inside the j loop, which runs inside the i loop. The third loop is O(n^2), not O(n), since k < j and j ranges up to i*i, which can be as large as n*n. Nested loops multiply their counts, so the total is O(n) * O(n^2) * O(n^2) = O(n^5). Loops in sequence, by contrast, add: a sequence costing O(1 + n^3 + n^2) would simplify to the highest-order term O(n^3), since with very large n that is the only term that matters.
See this post for a more detailed answer, there are more resources there too.
int sum = 0;                    O(1)
for(int i=1;i<n;i++)            O(n)
    for(int j=1;j<i*i;j++)      O(n^2)
        for(int k=1;k<j;k++)    O(n^2)
            if (j % i==1)       O(1)
                sum++;          O(1)
= O(1 + n * n^2 * n^2)
= O(n^5)
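A quick way to sanity-check this (not from the original answers; the class and method names below are illustrative) is to count how often the innermost statement executes as n grows: for three fully nested loops the count should grow like the product of the loop bounds, so doubling n should multiply the count by roughly 2^5 = 32.

```java
public class NestedLoopCount {

    // Count how many times the innermost statement runs for a given n.
    static long count(int n) {
        long ops = 0;
        for (int i = 1; i < n; i++)
            for (int j = 1; j < i * i; j++)
                for (int k = 1; k < j; k++)
                    ops++; // the if-test would run here too; it changes sum, not the count
        return ops;
    }

    public static void main(String[] args) {
        System.out.println(count(20));
        System.out.println(count(40));
        System.out.println((double) count(40) / count(20)); // roughly 2^5
    }
}
```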


Trying to understand the reason for this time complexity

I am trying to calculate the time complexity of these two algorithms. The book I am referring to specifies these time complexities of each.
A) Algorithm A: O(nlogn)
int i = n;
while (i > 0)
{
    for (int j = 0; j < n; j++)
        System.out.println("*");
    i = i / 2;
}
B) Algorithm B: O(n)
while (n > 0)
{
    for (int j = 0; j < n; j++)
        System.out.println("*");
    n = n / 2;
}
I can see how Algorithm A is O(n log n): the for loop is O(n) and the while loop is O(log n). However, I am failing to see how Algorithm B has a time complexity of O(n); I was expecting it to be O(n log n) as well. Any help would be appreciated.
Let's look at Algorithm B from a mathematical standpoint.
The number of * printed by the first pass of the loop is n. The number printed by each subsequent pass is at most half that of the previous one, giving the series:
n + n/2 + n/4 + n/8 + ...
If this were an infinite series, its sum could be represented by the formula n/(1 - r), where r is the ratio between terms. In this case r is 1/2, so the infinite series sums to 2n.
Your algorithm certainly won't go on forever, and each time it loops it may print fewer than half the stars of the previous pass (integer division rounds down). The number of stars printed is therefore at most 2n.
Because constant factors drop out of time complexity, the algorithm is O(n).
This is similar in spirit to amortized analysis, which averages the cost of operations in a loop even when some individual operations are relatively expensive. Please see this question and the Wikipedia page for a more thorough explanation of amortized complexity.
Algorithm B prints half as many stars at every iteration. Assume n=10, then:
n=10 -> 10*
n=5 -> 5*
n=2 -> 2*
n=1 -> 1*
In total 18 * are printed. In general you will print n + n/2 + n/4 + ... + n/(2^i) stars. What is the value of i? It is the number of halvings needed for n to reach 0; in other terms, roughly the exponent to which 2 must be raised to produce n: log_2(n). The sum is therefore
n + n/2 + n/4 + ... + n/2^(log_2 n) < 2n
which can be approximated to O(n).
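To make the contrast concrete, here is a counting version of both algorithms (counters instead of printing; the harness is mine, not part of the answers):

```java
public class StarCount {

    // Algorithm A: i halves, but the inner loop bound stays n.
    static long countA(int n) {
        long stars = 0;
        int i = n;
        while (i > 0) {
            for (int j = 0; j < n; j++)
                stars++; // stands in for System.out.println("*")
            i = i / 2;
        }
        return stars; // n stars per pass, about log2(n)+1 passes -> O(n log n)
    }

    // Algorithm B: the inner loop bound itself halves each pass.
    static long countB(int n) {
        long stars = 0;
        while (n > 0) {
            for (int j = 0; j < n; j++)
                stars++;
            n = n / 2;
        }
        return stars; // n + n/2 + n/4 + ... < 2n stars -> O(n)
    }

    public static void main(String[] args) {
        System.out.println(countA(10)); // 4 passes of 10 stars each
        System.out.println(countB(10)); // 10 + 5 + 2 + 1
    }
}
```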

What is the Big O complexity of this algorithm

I can't seem to figure out the time complexity of this algorithm. I know the while loop executes O(log n) times because n keeps halving, but I'm not sure how to represent the time complexity of the inner for loop.
while (n > 0) {
    for (int j = 0; j < n; j++) {
        System.out.println("*");
    }
    n = n / 2;
}
Each iteration of the while loop divides n by 2.
Therefore, the first for loop runs n times, the second runs n/2 times, the third n/4 times, and so on.
For an arbitrarily large n, the total work would be:
n + n/2 + n/4 + n/8 + n/16 + n/32 + ... = 2n
Of course, n is an integer, and programmatically the division reaches 0 after enough repetitions, so the series is finite.
Therefore, the time complexity of the algorithm is O(2N) = O(N).
We can figure out the complexity of both loops by inspection. For an input of n = 8, the inner loop prints this many stars in total:
8 + 4 + 2 + 1 = 15 ~ 2^4
For n = 16:
16 + 8 + 4 + 2 + 1 = 31 ~ 2^5
The total is therefore
2^(log_2(N) + 1) - 1 <= 2 * 2^(log_2(N)) = 2N
We can see that the number of print statements is roughly twice the value of the input N, so the overall number of print operations is O(N), linear in the value of the input.
Let n₀ be the start value of n.
At the i-th iteration of the while loop (counting from 0), n = n₀ * ½ⁱ.
The cost of one iteration of the while loop is n (because of the for loop).
Hence the global cost is the sum over i of n₀ * ½ⁱ, i.e. n₀ times the sum over i of ½ⁱ.
That sum is a geometric series, and is less than 2.
Hence the global cost is O(2·n₀) = O(n₀).
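The geometric bound can also be checked numerically (this harness is my own sketch): the total inner-loop work never reaches 2·n₀, whatever the starting value.

```java
public class GeometricBound {

    // Total inner-loop iterations of the halving algorithm for start value n.
    static long work(int n) {
        long total = 0;
        while (n > 0) {
            total += n; // the for loop runs n times on this pass
            n = n / 2;
        }
        return total;
    }

    public static void main(String[] args) {
        for (int n = 1; n <= 1_000_000; n *= 10)
            System.out.println(n + " -> " + work(n) + " (bound: " + 2L * n + ")");
    }
}
```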

What is the Big-O complexity of nested for loops of increasing length

I'm new to Big O notation and I am a little confused about how the following code plays into it. Here n is just the length of an array, and as you can see the two inner for loops iterate n^2 and 4*n times respectively. How does that play into the Big O notation? At first glance I thought it was n^3, but I am unsure.
for (int i = 0; i < n; i++) {
    for (int j = 0; j < n*n; j++)
        sum += data[i] * data[j];
    for (int j = 0; j < 4*n; j++)
        sum += data[i] + data[j];
}
O(n * (c1 * n^2 + c2 * 4 * n))
= O(n * (n^2 + n))
= O(n * n^2)
= O(n^3)
because addition (+), assignment (=), multiplication (*) and array access ([i]) all cost constant time.
So you are right.
The first loop iterates n times. For each iteration of the first loop, the second loop iterates n*n times, so the total number of iterations of the second loop is n*(n*n). Likewise, for each iteration of the first loop, the third loop iterates 4*n times, so its total is n*(4*n).
Hence the complexity is O(n*n*n + n*4*n) -> O(n^3 + n^2) -> O(n^3).
You are right!
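Counting iterations directly confirms the add-then-multiply reasoning (the harness below is mine; it tallies iterations instead of touching an array, so no data array is needed):

```java
public class SequentialLoops {

    // Count iterations of the two inner loops, which run in sequence inside the outer loop.
    static long count(int n) {
        long ops = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n * n; j++)
                ops++; // first inner loop: n^2 iterations per outer pass
            for (int j = 0; j < 4 * n; j++)
                ops++; // second inner loop: 4n iterations per outer pass
        }
        return ops; // n * (n^2 + 4n) = n^3 + 4n^2
    }

    public static void main(String[] args) {
        System.out.println(count(10));  // 10 * (100 + 40)
        System.out.println(count(100)); // the n^3 term dominates
    }
}
```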

Why am I getting different time complexities in these two different nested for-loops?

Ok, so I know two nested for-loops that each increment by 1 give a quadratic time complexity. I was curious whether changing one loop to advance by multiplying by 2 would give O(n log n) instead of O(n^2), and likewise if I swapped which loop does the doubling.
In each inner loop I have a variable counting how many times the loop executes. The array size is 2^20, i.e. 1,048,576. I was expecting both methods to have the same complexity of n log n (20 * 1,048,576), but only Algorithm 2 gets close to that count; Algorithm 1's count is about n * 2.
To my understanding one loop is O(n) and the other is O(log n), so together they should be O(n log n), and swapping them should give O(log n * n), which is the same thing.
int[] arr = new int[1048576];

// Algorithm 1
int counter1 = 0;
for (int i = 1; i < arr.length; i++) {
    for (int j = i; j < arr.length; j *= 2) {
        counter1++;
    }
}
System.out.println(counter1);

// Algorithm 2
int counter2 = 0;
for (int i = 1; i < arr.length; i *= 2) {
    for (int j = i; j < arr.length; j++) {
        counter2++;
    }
}
System.out.println(counter2);

// counter1: 2097130 (n * 2)
// counter2: 19922945 (n log n)
Simple math. Note first that repeated doubling takes about 20 steps to go from 1 to roughly 1 million (2^20 = 1,048,576).
Algorithm 1 runs roughly 1 million j-loops, but each of those loops takes at most about 20 j-steps to complete. Algorithm 2's inner loop (especially the first time) runs on the order of a million j-steps, whereas Algorithm 1's runs at most 20 steps, a million times. However, for Algorithm 1 you also need to account for the decay: by the time you hit i=2 you are down to 19 j-steps per inner loop, by i=4 down to 18, and so on. This decay kills the momentum, and the last ~500,000 i-steps increment the counter only once each. That is why counter1 ends up near 2 million rather than 20 million.
Algorithm 2 runs a million j-steps on the first pass alone, followed by (N - i) j-steps on each later pass (i = 2, then 4, then 8, and so on). You are running close to a million steps on almost every one of its passes.
Let's count the number of passes in each loop of the second algorithm. Take N = arr.length.
First the outer loop: i ranges from 1 to N and is multiplied by 2 each time, which makes log(N) iterations (logarithms base 2, N a power of 2).
Then, in the inner loop, j ranges from i to N and is incremented by 1 each time, which makes (N - i) iterations.
Now let k = log(i). The total number of times counter2 is incremented is the sum of (N - 2^k) for k = 0 to log(N) - 1.
The sum of 2^k for k = 0 to log(N) - 1 is a geometric sum that adds up to 2^log(N) - 1 = N - 1.
Therefore the total for the second algorithm is N*log(N) - N + 1, which is O(N*log(N)). For N = 2^20 that is 20*2^20 - 2^20 + 1 = 19,922,945, exactly the counter2 you measured.
The first algorithm behaves differently: its inner loop takes about log(N/i) doubling steps for each i, and the sum of log(N/i) over i = 1 .. N-1 is about 2N, which is O(N). That matches counter1 ≈ 2,097,130. Swapping which loop doubles really does change the complexity, from O(N log N) down to O(N).
So the two algorithms do not have the same complexity, even though each combines a linear loop with a doubling loop. The counters confirm it: Algorithm 1 increments about 2n times, which is O(n), while Algorithm 2 increments about n*log(n) - n times, which is O(n log n). The difference comes from the inner loop's starting point depending on i: in Algorithm 1 the doubling inner loop starts at i, so for most values of i it finishes in just a few steps, while in Algorithm 2 the linear inner loop runs nearly n steps for most of the log(n) outer passes.
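The closed forms can be verified at a smaller power of two as well (the harness and names are mine; m below is the exponent with n = 2^m):

```java
public class SwappedLoops {

    // Algorithm 1: linear outer loop, doubling inner loop starting at i.
    static long counter1(int n) {
        long c = 0;
        for (int i = 1; i < n; i++)
            for (long j = i; j < n; j *= 2)
                c++;
        return c; // equals 2n - m - 2 when n = 2^m
    }

    // Algorithm 2: doubling outer loop, linear inner loop starting at i.
    static long counter2(int n) {
        long c = 0;
        for (int i = 1; i < n; i *= 2)
            for (int j = i; j < n; j++)
                c++;
        return c; // equals n*m - n + 1 when n = 2^m
    }

    public static void main(String[] args) {
        int n = 1 << 16; // 65,536 = 2^16
        System.out.println(counter1(n)); // about 2n -> O(n)
        System.out.println(counter2(n)); // about n*log2(n) -> O(n log n)
    }
}
```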

Estimating the order of growth of the running time of an algorithm

I am following an online course and I can't understand well how to estimate the order of growth of an algorithm. Here is an example:
What is the order of growth of the worst-case running time of the following code fragment, as a function of N?
int sum = 0;
for (int i = 1; i <= 4*N; i = i*4)
    for (int j = 0; j < i; j++)
        sum++;
Can anyone explain to me how to get it?
Just calculate how many times the statement sum++; is executed:
1 + 4 + 16 + 64 + ... + 4N
This is a geometric progression with common ratio 4 whose last term is 4N (take N a power of 4, since i runs through the powers of 4 up to 4N).
Sum of series = (4 * 4N - 1)/(4 - 1) = (16N - 1)/3.
For order of growth we neglect the constant factors, hence the complexity is O(N).
This is fairly straightforward:
The outer loop runs i through 4^0, 4^1, ..., 4^c where 4^c = 4N, i.e. c = log(N) + 1 (logarithm base 4, N a power of 4).
Let x be the outer-loop iteration number, counting from 0. The inner loop iterates 4^x times.
Hence the total runtime is Sum(4^x) for x in [0..c]. This is a geometric series, and its sum is easily calculated using the standard formula:
Sum(4^x) for x in [0..c] = (4^(c+1) - 1)/(4 - 1) = (4 * 4^c - 1)/3 = (16N - 1)/3.
The total is hence N times a constant factor, i.e. O(N).
While finding the order of an algorithm we count the total number of steps the algorithm goes through.
Here the innermost loop runs a number of times equal to the current value of i.
Let i go through the values i1, i2, i3, ..., in. The total number of steps in the algorithm is then i1 + i2 + i3 + ... + in.
The values i1, i2, i3, ..., in are 1, 4, 16, ..., 4N: a GP with first term a = 1, common ratio r = 4 and last term L = 4N. The complexity of this algorithm is the sum of this GP:
SUM = 1 + 4 + 16 + ... + 4N
The sum of a GP with n terms a, ar, ar^2, ..., ar^(n-1) is a(r^n - 1)/(r - 1) = (L*r - a)/(r - 1), where L is the last term.
Here SUM = (4N * 4 - 1)/3,
which is approximately 1.33 times the last term L:
SUM ≈ 1.33 * 4N
This is a linear function of N, so the number of steps is linear in N and the complexity of the algorithm is O(N).
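A direct count confirms the linear growth when N is a power of 4 (the counting harness is my own):

```java
public class GeometricSeries {

    // Count executions of sum++ for a given N.
    static long count(int N) {
        long sum = 0;
        for (long i = 1; i <= 4L * N; i = i * 4)
            for (long j = 0; j < i; j++)
                sum++;
        return sum; // 1 + 4 + 16 + ... + 4N = (16N - 1)/3 when N is a power of 4
    }

    public static void main(String[] args) {
        System.out.println(count(16));   // 1 + 4 + 16 + 64
        System.out.println(count(1024)); // about 5.33 * 1024
    }
}
```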
