What is the Big O complexity of this algorithm - java

I can't seem to figure out the time complexity of this algorithm. I know that the while loop executes O(log n) times because n keeps halving, but I'm not sure how to represent the time complexity of the inner for loop.
while (n > 0) {
    for (int j = 0; j < n; j++) {
        System.out.println("*");
    }
    n = n / 2;
}

Each iteration of the while loop divides n by 2.
Therefore, the first for loop runs n times, the second runs n/2 times, the third runs n/4 times, and so on.
If we set n to an arbitrarily large value, then the total work is:
n + n/2 + n/4 + n/8 + n/16 + n/32 + ... = 2n
Of course, n is an integer, and programmatically the division reaches 0 after enough repetitions, so the actual total is at most 2n.
Therefore, the time complexity of the algorithm is O(2n) = O(n).
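As a quick sanity check, here is a minimal sketch of my own (not from the original post; the class name HalvingLoopCount is just illustrative) that counts the print operations instead of printing, and compares the total against the 2n bound:

public class HalvingLoopCount {
    // Counts how many times the inner loop body of the halving loop would run.
    static long countPrints(long n) {
        long count = 0;
        while (n > 0) {
            for (long j = 0; j < n; j++) {
                count++;            // stands in for System.out.println("*")
            }
            n = n / 2;
        }
        return count;
    }

    public static void main(String[] args) {
        for (long n : new long[]{8, 16, 1000, 1000000}) {
            System.out.println("n=" + n + " prints=" + countPrints(n) + " bound 2n=" + (2 * n));
        }
    }
}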

We can figure out the complexity of the two loops together by inspection. For an input of n = 8, the inner loop prints this many stars across the while-loop iterations:
8 + 4 + 2 + 1 = 15 ~ 2^4
For n = 16:
16 + 8 + 4 + 2 + 1 = 31 ~ 2^5
The complexity here is:
O(2^(⌊log_2(N)⌋ + 1)) = O(2 · 2^(log_2(N))) = O(2N)
= O(N)
We can see that the total number of print statements is roughly twice the value of the input N. So the overall number of print operations is O(N), which is linear in the value of the input N.

Let n₀ be the starting value of n.
At the i-th iteration of the while loop (counting from 0), n ≈ n₀ · ½ⁱ.
The cost of one iteration of the while loop is n (because of the for loop).
Hence the total cost is the sum over all i of n₀ · ½ⁱ,
or n₀ times the sum over all i of ½ⁱ.
That sum is a geometric series, and it is less than 2.
Hence the total cost is O(2·n₀) = O(n₀).

Related

Trying to understand the reason for this time complexity

I am trying to calculate the time complexity of these two algorithms. The book I am referring to specifies the time complexity of each.
A) Algorithm A: O(n log n)
int i = n;
while (i > 0)
{
    for (int j = 0; j < n; j++)
        System.out.println("*");
    i = i / 2;
}
B) Algorithm B: O(n)
while (n > 0)
{
    for (int j = 0; j < n; j++)
        System.out.println("*");
    n = n / 2;
}
I can see how Algorithm A is O(n log n): the for loop is O(n) and the while loop is O(log n). However, I am failing to see how Algorithm B has a time complexity of O(n). I was expecting it to be O(n log n) as well. Any help would be appreciated.
Let's look at Algorithm B from a mathematical standpoint.
The number of * printed in the first pass is n. The number of * printed in each subsequent pass is at most half that of the previous pass. That leads to the series:
n + n/2 + n/4 + n/8 + ...
If this were an infinite series, its sum could be computed with the formula a/(1 - r), where a is the first term and r is the ratio between terms. Here a = n and r = 1/2, so the infinite series has a sum of 2n.
Your algorithm certainly won't go on forever, and each time it loops it may print fewer than half the stars of the previous pass. The number of stars printed is therefore at most 2n.
Because constant factors drop out of time complexity, the algorithm is O(n).
This kind of analysis is related to amortized complexity, which averages the cost of operations in a loop even when some individual operations are relatively expensive. Please see this question and the Wikipedia page for a more thorough explanation of amortized complexity.
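To make the contrast with Algorithm A concrete, here is a small sketch of my own (countA and countB are illustrative names, not from the question) that counts the print operations of both algorithms for the same n:

public class CompareAB {
    // Algorithm A: the inner loop bound stays n, so each pass does n work.
    static long countA(int n) {
        long ops = 0;
        int i = n;
        while (i > 0) {
            for (int j = 0; j < n; j++) ops++;
            i = i / 2;
        }
        return ops;
    }

    // Algorithm B: the inner loop bound shrinks with n, so the passes add up to about 2n.
    static long countB(int n) {
        long ops = 0;
        while (n > 0) {
            for (int j = 0; j < n; j++) ops++;
            n = n / 2;
        }
        return ops;
    }

    public static void main(String[] args) {
        int n = 1 << 20;
        System.out.println("A: " + countA(n)); // 21 * n here, i.e. roughly n * log2(n)
        System.out.println("B: " + countB(n)); // 2n - 1, i.e. O(n)
    }
}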
Algorithm B prints roughly half as many stars on each iteration as on the previous one. Assume n = 10; then:
n=10 -> 10*
n=5 -> 5*
n=2 -> 2*
n=1 -> 1*
In total, 18 stars are printed. You will print n + n/2 + n/4 + ... + n/(2^i) stars. What is the value of i? It is the number of halvings needed for n to reach 0, in other words roughly the exponent to which 2 must be raised to produce n: log_2(n). The sum is therefore n·(1 + 1/2 + 1/4 + ... + 1/2^i) < 2n,
which is O(n).

Why am I getting different time complexities in these two different nested for-loops?

Ok, so I know that two nested for-loops, each incrementing by 1, give quadratic time complexity. Then I was curious to see whether, if I changed the update of one of the loops to multiply by 2 instead, I would get O(n log n) instead of O(n^2), and likewise if I did it to the other loop.
In each inner loop I have a variable to count how many times the loop executes. The array is of size 2^20, so 1,048,576. I'm thinking both methods should have the same complexity of n log n (20 * 1,048,576). But only Algorithm 2 gets close to that count, and Algorithm 1 comes out at about 2 * n.
To my understanding, one loop is O(n) and the other is O(log n), so together they should be O(n log n); and if I switch them, I should get O(log n · n), which would be the same thing.
int[] arr = new int[1048576];

// Algorithm 1
int counter1 = 0;
for (int i = 1; i < arr.length; i++) {
    for (int j = i; j < arr.length; j *= 2) {
        counter1++;
    }
}
System.out.println(counter1);

// Algorithm 2
int counter2 = 0;
for (int i = 1; i < arr.length; i *= 2) {
    for (int j = i; j < arr.length; j++) {
        counter2++;
    }
}
System.out.println(counter2);

// counter1: 2097130 (n * 2)
// counter2: 19922945 (n log n)
Simple math. Assume for now that j *= 2 takes about 20 steps to go from 1 to roughly 1 million.
Algorithm 1 runs roughly 1 million j-loops, but each of those loops takes at most about 20 j-steps to complete. The inner loop of Algorithm 2 (especially the first time) runs on the order of a million steps, whereas the other runs at most 20 steps, a million times over. However, you also need to account for the decay, especially at the start: by the time you've hit i = 2, you're already down to 19 j-steps; by i = 4, down to 18, and so on. This early decay essentially "kills" the momentum of the step count; the last ~500,000 i-steps increment the counter only once each.
Algorithm 2 runs roughly 1 million j-steps on the first pass alone, followed by another (N - i) j-steps on each subsequent pass (i = 2, then 4, then 8, etc.). You're running roughly a million steps each time.
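Here is a quick sketch of my own (not from the answer) that measures that decay directly: it prints the number of j-steps the inner loop of Algorithm 1 takes for the first few values of i, plus the overall total:

public class InnerLoopDecay {
    public static void main(String[] args) {
        int n = 1 << 20;                        // same size as arr.length in the question
        long total = 0;
        for (int i = 1; i < n; i++) {
            int steps = 0;
            for (long j = i; j < n; j *= 2) {   // doubling inner loop of Algorithm 1
                steps++;
            }
            total += steps;
            if (i <= 4) {
                System.out.println("i=" + i + " inner steps=" + steps); // 20, 19, 19, 18
            }
        }
        System.out.println("total=" + total + " (2n=" + (2L * n) + ")"); // 2097130 vs 2097152
    }
}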
Let's count the number of passes in each loop for the second algorithm. Let's take N = arr.length.
First, the outermost loop: i ranges from 1 to N and is multiplied by 2 each time, which makes log_2(N) iterations.
Then, in the innermost loop, j ranges from i to N and is incremented by 1 each time, which makes (N - i) iterations.
Now let k = log_2(i), so i = 2^k. The total number of times counter2 is incremented is the sum of (N - 2^k) for k = 0 to log_2(N) - 1.
The sum of 2^k for k = 0 to log_2(N) - 1 is a geometric sum that adds up to 2^(log_2(N)) - 1, i.e. N - 1.
Therefore the total count for the second algorithm is N·log_2(N) - N + 1, which is O(N log N); for N = 2^20 that is 19,922,945, exactly the printed counter2.
The first algorithm is different: for each i its inner loop takes about log_2(N/i) doubling steps, and summing that over i = 1 .. N - 1 gives roughly 2N, which is O(N). That is why counter1 comes out near 2·N while counter2 comes out near N·log_2(N) - N.
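As a quick check of that closed form (a sketch of my own; the variable names are illustrative), plugging in N = 2^20 reproduces the printed counter2:

public class ClosedFormCheck {
    public static void main(String[] args) {
        long n = 1L << 20;                      // arr.length from the question
        long log2n = 20;                        // log2(n)
        long predicted = n * log2n - (n - 1);   // N*log2(N) - N + 1
        System.out.println(predicted);          // 19922945, matching counter2
    }
}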
Both solutions you posted have exactly the same complexity; I assume you forgot to swap the variables in the inner for loop's initialization.
O(n^2) with a constant factor.
Big-O notation works like this:
n/2 * n is the time your loop needs (either of the two you posted)
-> 1/2 * n * n = 1/2 * n^2
1/2 is the factor.
The complexity is polynomial: n^2.

Big O of nested for loop

int y = 1;
for (int x = 1; x <= n+2; x++)
    for (int w = n; w > 0; w--)
        y = y + 1;
I'm a little confused about determining the Big O of the above code. If the outermost loop were for (int x = 1; x <= n; x++), then the Big O of the loop would be O(n^2), because the outermost loop would iterate n times and the innermost loop would also iterate n times.
However, given that the outermost loop iterates n+2 times, would that change the Big O, or does the rule that additive constants don't matter apply? Lastly, would it change anything if the innermost loop were to iterate n+2 times instead of n?
Thank you!
The outer loop runs n + 2 times, and the inner loop runs n times, so the code block runs (n + 2) * n times, which is n * n + 2 * n times. With increasing values of n, the 2 * n becomes insignificant, so you're left with n * n, giving you the answer: O(n^2).
Long-ish answer short, the additive constants don't matter.
Suppose we did count the constants. Then, the inner loop is executed
(n+2)(n) = n^2 + 2n
times. This is still O(n^2), since the squared term takes precedence over the linear term.
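If you want to see the (n+2)·n count directly, here is a tiny sketch of my own (not part of the original answer) that counts the increments for a few values of n:

public class NestedLoopCount {
    public static void main(String[] args) {
        for (int n : new int[]{10, 100, 1000}) {
            long y = 0;
            for (int x = 1; x <= n + 2; x++) {
                for (int w = n; w > 0; w--) {
                    y = y + 1;
                }
            }
            // y = (n + 2) * n = n^2 + 2n, and y / n^2 approaches 1 as n grows
            System.out.println("n=" + n + " count=" + y + " n^2=" + ((long) n * n));
        }
    }
}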
n and n+2 are of the same order of magnitude, so this code runs in O(n^2),
even if the inner loop also ran n + 2 times.
for (int x = 1; x <= n+2; x++)
The outer loop runs (n+2) times.
for (int w = n; w > 0; w--)
The inner loop runs (n) times.
((n+2) * n) => n^2 + 2n => O(n^2), because we keep the larger term.
The reason is that for large values of n, the 2n term is insignificant compared to n^2, so we drop it.
You can read here for more explanation: Big O Analysis

Estimating the order of growth of running time of an algorithm

I am following an online course and I can't quite understand how to estimate the order of growth of an algorithm. Here's an example:
What is the order of growth of the worst case running time of the following code fragment
as a function of N?
int sum = 0;
for (int i = 1; i <= 4*N; i = i*4)
    for (int j = 0; j < i; j++)
        sum++;
Can anyone explain to me how to get it?
Just calculate how many times the statement sum++; is executed.
Number of executions = 1 + 4 + 16 + 64 + ... + 4N
This is a geometric progression with common ratio 4. If the number of terms in this series is k, then the last term is 4^(k-1) = 4N.
Sum of the series = (4^k - 1)/(4 - 1) = (4·4N - 1)/3 = (16N - 1)/3.
In order of growth we neglect the constant factors.
Hence the complexity is O(N).
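Here is a small sketch of my own (the class name GrowthCheck is illustrative) that counts the sum++ executions for a few values of N that are powers of 4, showing the total stays within a constant factor (about 16/3) of N:

public class GrowthCheck {
    public static void main(String[] args) {
        for (long N : new long[]{1024, 16384, 262144}) {   // powers of 4
            long sum = 0;
            for (long i = 1; i <= 4 * N; i = i * 4) {
                for (long j = 0; j < i; j++) {
                    sum++;
                }
            }
            // For N a power of 4 the total is (16N - 1)/3, i.e. about 5.33 * N
            System.out.println("N=" + N + " increments=" + sum + " ratio=" + ((double) sum / N));
        }
    }
}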
This is fairly straightforward:
There are log(N) + 1 iterations of the outer loop (logarithm is base 4).
Let x be the outer loop iteration number. The inner loop iterates 4^x times.
Hence the total runtime is Sum(4^x) for x in [0..c], where c is log(N). This is a geometric series, and its sum is easily calculated using the formula from the wiki:
Sum(4^x) for x in [0..c] = (4^(c+1) - 1)/(4 - 1) ≈ (4/3)·4^c. Now c is log(N) in base 4, hence 4^c = N. The total is therefore N times a constant factor, i.e. O(N).
To find the order of an algorithm, we count the total number of steps the algorithm goes through.
Here the innermost loop performs a number of steps equal to the current value of i.
Let i go through the values i1, i2, i3, ..., ik.
So the total number of steps in the algorithm is i1 + i2 + i3 + ... + ik.
Here the values i1, i2, i3, ..., ik are 1, 4, 16, ..., 4N, which is a GP with first term a = 1 and last term
equal to 4N. So the complexity of this algorithm is the sum of all terms in this GP.
SUM = 1 + 4 + 16 + ... + 4N
The sum of a GP with k terms a, ar, ar^2, ..., ar^(k-1) is a(r^k - 1)/(r - 1) = (L·r - a)/(r - 1),
where L is the last term.
Here, in our case, SUM = (4·4N - 1)/3 = (16N - 1)/3,
which is approximately 1.33 times the last term L:
SUM ≈ 1.33 · 4N
which is a linear function of N.
Thus the number of steps is a linear function of N, and
so the complexity of the algorithm is of order N, i.e. O(N).

Determining as a function of n how often the statement incrementing the variable count is performed

Ok, so I'm new to analyzing algorithms and would really appreciate any helpful tips on how to go about this. I am trying to determine how many times count is incremented as a function of n. I have run it in an IDE, and for values 1-7 the output is 1, 3, 6, 10, 15, 21, 28. I'm just unsure how to write this as a function of n. Thanks.
The loop is as follows:
for (int i = 1; i <= n; i++) {
    for (int j = 1; j <= i; j++) {
        count++;
    }
}
The aim of this type of exercise is to teach you how to analyze the loop on paper instead of running it on a machine. But let's look at the pattern:
The outer loop will run a total of n times.
The inner loop will run between 1 and n times, depending on what i is at the time. But you know that on average it will run (n+1)/2 times.
Thus count = n(n+1)/2, which is O(n^2).
See arithmetic series
Update: As requested, here is why the inner loop averages (n+1)/2 iterations:
The outer loop increments i from 1 to n. So on each iteration of the outer loop, the inner loop will "loop" one more time than it did previously.
Thus the inner loop will iterate this number of times:
1 + 2 + 3 + ... + n
So we can do something clever and pair the terms up:
n with 1: (n + 1) = n + 1
n-1 with 2: (n - 1) + 2 = n + 1
n-2 with 3: (n - 2) + 3 = n + 1
...
And since we paired these up, we have n/2 such pairs:
So the sum of 1 + 2 + ... + n is ( (n+1) * (n/2) ).
So the average is ( (n+1) * (n/2) ) / n = (n+1)/2
(Visualize it as you are writing 1 + 2 + 3 + ... + n on a long strip of paper, and you fold it in half.)
I would also highly recommend reading the famous story about Carl Friedrich Gauss so you'll always remember how to sum an arithmetic series =)
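A quick sketch of my own (not from the answer) that checks the closed form against the outputs you observed (1, 3, 6, 10, 15, 21, 28):

public class TriangularCount {
    public static void main(String[] args) {
        for (int n = 1; n <= 7; n++) {
            int count = 0;
            for (int i = 1; i <= n; i++) {
                for (int j = 1; j <= i; j++) {
                    count++;
                }
            }
            // count equals the closed form n*(n+1)/2
            System.out.println("n=" + n + " count=" + count + " n(n+1)/2=" + (n * (n + 1) / 2));
        }
    }
}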
1
1+2 = 3
1+2+3 = 6
1+2+3+4 = 10
1+2+3+4+5 = 15
Is it only me who sees the pattern? :-)
Here you go:
count = n * (n + 1) / 2
