int sum = 0;
for (int n = N; n > 0; n /= 2)
    for (int i = 0; i < n; i++)
        sum++;
int sum = 0;
for (int i = 1; i < N; i *= 2)
    for (int j = 0; j < i; j++)
        sum++;
int sum = 0;
for (int i = 1; i < N; i *= 2)
    for (int j = 0; j < N; j++)
        sum++;
I have been struggling with this for a long time. I am only a second-year student, but I still can't work out the complexity of an algorithm. How do I calculate it? I feel very incompetent because I never seem to get it!
For example, is the complexity of a for loop always N? How do I know? Can you recommend any resources I can read? Any videos?
Your first and second examples are the same in terms of time complexity: both are O(N). Why? Let us compute. In the first example, the inner loop runs N times, then N/2 times, then N/4, and so on down to 1. So the total work is N + N/2 + N/4 + ... + 1, and the sum of this geometric progression is about 2N - 1. Hence the time complexity of the first case is O(N).
In the second example, the inner loop runs 1 time, then 2 times, then 4 times, and so on up to about N. So the total work is 1 + 2 + 4 + ... + N, and the sum of this geometric progression is 2^(log(N)+1) - 1, which is roughly 2N. Hence the time complexity of the second case is also O(N).
In the third example, the outer loop runs log(N) times and the inner loop runs N times on every pass; since the two bounds are independent, the required time complexity is O(N log N). (All calculations are approximate and all log bases are 2.)
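If it helps to see this concretely, here is a small counting sketch (my own addition, not part of the original answer) that tallies how many times sum++ executes in each of the three snippets; the first count comes out near 2N, the second near N, and the third near N * log2(N):
public class LoopCounts {
    public static void main(String[] args) {
        int N = 1 << 20; // illustrative size; any large power of two works

        long c1 = 0; // example 1: outer bound halves each pass
        for (int n = N; n > 0; n /= 2)
            for (int i = 0; i < n; i++)
                c1++;

        long c2 = 0; // example 2: outer variable doubles, inner runs i times
        for (int i = 1; i < N; i *= 2)
            for (int j = 0; j < i; j++)
                c2++;

        long c3 = 0; // example 3: outer variable doubles, inner always runs N times
        for (int i = 1; i < N; i *= 2)
            for (int j = 0; j < N; j++)
                c3++;

        // c1 is about 2N and c2 about N (both linear); c3 is about N * log2(N)
        System.out.println(c1 + " " + c2 + " " + c3 + " (2N = " + (2L * N) + ")");
    }
}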
In general, to find the time complexity of a for loop, look at how many times the loop variable takes a value, i.e. how many iterations the loop body executes.
To learn about time complexity, check out the HackerEarth material, and every time you write an algorithm, try to work out its time complexity; that is the best way to learn it. Also look at the Master theorem for recurrence relations, but learn the basics first.
Resource
https://www.geeksforgeeks.org/analysis-of-algorithms-set-4-analysis-of-loops/
Explanation
As a general idea, the complexity of a loop is the number of times its body runs. So a loop such as for (int i = 0; i < 10; i++) is O(n) with n = 10. If there are multiple loops that are not nested, the complexity of the code is that of the largest one. If the loops are nested, the iteration counts multiply: two loops that each run N times give O(N^2); in your third example above, the outer loop actually runs about log N times while the inner loop runs N times, giving O(N log N). (This is a general idea and not the precise definition!)
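For instance, here is a minimal sketch of the multiply rule (my own illustration; it assumes N is already declared):
int count = 0;
for (int i = 0; i < N; i++)       // outer loop: N iterations
    for (int j = 0; j < N; j++)   // inner loop: N iterations per outer pass
        count++;                  // the body runs N * N times in total -> O(N^2)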
This previous question may be helpful because there are a few different approaches to calculate the complexity of an algorithm, and quite a few good resources.
As for your example, the complexity of a for loop may not always be N.
For example, the following snippet is linear (time complexity O(N)) because i steps sequentially through every value from 0 to N:
for (int i = 0; i < N; i++) {
sum++;
}
Whereas this snippet is logarithmic in its time complexity, because i doesn't step through every value up to N but instead doubles on each iteration (note that i has to start at 1, and the update has to be i *= 2, for the doubling to make progress):
for (int i = 1; i < N; i *= 2) {
    sum++;
}
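To see the logarithmic growth concretely, you can count the iterations (a small sketch of my own; it assumes N is already defined):
int iterations = 0;
for (int i = 1; i < N; i *= 2)  // i doubles, so it reaches N in about log2(N) steps
    iterations++;
// for N = 1_000_000 this prints 20, i.e. roughly log2(N)
System.out.println(iterations);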
Related
I am very new to competitive programming and to Big O notation.
public void function(int n) {
    for (int i = n; i > 0; i /= 3) {
        for (int j = 0; j < i; j++) {
            System.out.println("Hello");
        }
    }
}
This is the algorithm.
As far as I understand, time complexity describes how the running time grows with the number of inputs.
So let's take an example: say n is 10.
The outer loop runs log n times and the inner loop runs i times,
so the inner loop's count depends on i, not directly on n.
That is where I get confused about how the time complexity is calculated.
I think it is O(log n); please correct me if I am wrong.
Will it be O(log n), O(n log n), or O(n^2)?
Please help me out with this.
Thank you.
I will try to explain it in the simplest terms possible.
The outer loop simply runs log(n) times, with base 3,
since i decreases by a factor of 3 every time. The total work done is:
n + n/3 + n/9 + n/27 + ... + n/3^(log n)
and the tail n/3 + ... + n/3^(log n) will always be less than n.
For example, let n = 100.
Then 100 + 100/3 + 100/9 + 100/27 + ... = 100 + (33.3 + 11.1 + 3.7 + ...),
and we can clearly see that the terms in the bracket add up to less than 100.
So the total time complexity of the overall solution is O(n).
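Written out as a geometric series (my own worked form of the same bound):
n + \frac{n}{3} + \frac{n}{9} + \dots + \frac{n}{3^{\log_3 n}}
  \;<\; n \sum_{k=0}^{\infty} \frac{1}{3^k}
  \;=\; \frac{n}{1 - \tfrac{1}{3}}
  \;=\; \frac{3n}{2}
  \;=\; O(n)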
Note that the variant with i starting at 0 and the update i *= 3 would actually never terminate, because i would stay 0, so for that form the running time is unbounded.
Assuming you meant for (int i = 1; ...) instead, it is O(n):
The outer loop is clearly O(log_3 n), because we keep multiplying i by 3.
The inner loop is entered O(log_3 n) times, with iteration counts 1 + 3 + 9 + 27 + ... + 3^(log_3 n). This is a geometric progression whose sum is roughly (3/2) * 3^(log_3 n), and since 3^(log_3 n) = n by the log rules, the inner loop does O(n) work across all its iterations. So the total complexity is O(n).
For your code:
for (int i = n; i > 0; i /= 3) {
    for (int j = 0; j < i; j++) {
        System.out.println("Hello");
    }
}
The inner loop variable j depends on the outer loop variable i, so the inner loop is what decides the complexity of your algorithm.
j runs n times on the first pass, n/3 times on the second, and so on, so the total work is
n + n/3 + n/9 + n/27 + ...
which results in O(n).
This is a great question! It's a tricky one that takes a little more thought to analyse.
As correctly stated in some of the other answers, the outer loop:
for(int i = n; i > 0; i/=3)
will run log(n) times; specifically log_3(n) times, but in big O notation we don't usually worry about the base, so log(n) is fine.
Now the nested loop is a bit trickier:
for(int j = 0; j < i; j++){
At first glance you might think this is a simple log(n) loop, but let's look a little further.
On the first iteration it runs n times, since the value of i is n. On the next iteration it runs n/3 times, then n/9, n/27, n/81, and so on.
If we sum this series, it is clear that it totals less than 2n.
Therefore we can conclude this algorithm has a complexity of O(n).
In your code snippet:
for (int i = 1; i < n; i *= 3) {   // i must start at 1 for the multiplication to make progress
    for (int j = 0; j < i; j++) {
        System.out.println("Hello");
    }
}
The outer loop in i is O(log_3(n)), because multiplying i by 3 on each pass means it needs only about log_3(n) steps to reach n. The inner loop in j iterates i times, so across all passes the inner iterations form the geometric series 1 + 3 + 9 + ... + n, which sums to roughly (3/2)n. The overall complexity is therefore:
O(n)
Hi, I was solving this problem, but I'm stuck choosing the right answer, so I'm hoping to get some help from you guys.
Here is the code:
for (int i = 0; i < n; i++) {            // loop1: from 0 to n-1
    int total = 0;
    for (int j = 0; j < n; j++)          // loop2: from 0 to n-1
        for (int k = 0; k <= j; k++)     // loop3: from 0 to j
            total += first[k];
    if (second[i] == total) count++;
}
return count;
From the code above, loop1 and loop2 each run n times, which gives n^2 so far,
but the problem is loop3: I don't want to say it also runs n times, since its limit is j, not n.
In the worst case, how should I work out the complexity? Will it be ((n+1)(n+2))/2 for loop3 only, or ((n+1)(n+2))/2 for loop2 and loop3 together?
My final answers are:
first case: n * n * (((n+1)(n+2))/2) = O(n^4)
second case: n * (((n+1)(n+2))/2) = O(n^3)
Which one is correct? Or did I get both wrong?
P.S. Please note that loop3 goes from 0 to j, not 0 to j-1.
It is not clear from the question what the different cases you are concerned about are. When discussing big O, we only care about the most significant part of the cost. Here j in the third loop can go up to n, as permitted by the second loop, so there is no problem substituting n for it when estimating the total. Since there are three loops in all, each directly or indirectly dependent on n, the complexity is O(n^3). Also, you can ignore constants, so n-1 can simply be treated as n.
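As a worked count (my own addition, matching the loops above):
\sum_{i=0}^{n-1} \sum_{j=0}^{n-1} \sum_{k=0}^{j} 1
  \;=\; n \sum_{j=0}^{n-1} (j+1)
  \;=\; n \cdot \frac{n(n+1)}{2}
  \;=\; O(n^3)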
See
https://stackoverflow.com/a/487278/945214
Big-O for Eight Year Olds?
I have just begun learning about Big O Notation and honestly I don't think I have the hang of it, and I am not quite sure how to determine the O() performance by just looking at for loops. I have listed a few examples and then some of the answers that I think are correct! Please let me know if they are wrong and any explanations would be greatly appreciated!
for (int i = 0; i < 1000; i++) {
    count++;
}
I believe this would be O(n), because nothing happens inside the loop except a constant-time operation. We iterate 'n' times, or in this case 1000?
for (int i = 0; i < n; i++) {
    for (int j = 0; j < n; j++)
        count++;
}
Would this one be O(n^2), because the loops are nested and each iterates n times, i.e. n*n?
for (int i = 0; i < n; i++) {
    for (int j = i; j < n; j++)
        count++;
}
Is this one another O(n^2) but in the worst case? Or is this O(n log n)?
Big-O notation is meant to be a guide to how the runtime grows as the size of the input grows.
For the first example, the loop runs exactly 1000 times no matter what n is, thus it is O(1000) = O(1) time.
For the second example, the nested loop runs n times for every time the outer loop runs, which runs n times. This is a total of n*n = n^2 operations. Thus the number of operations increases proportional to the square of n, thus we say it is O(n^2).
For the third example, it is still O(n^2). This is because Big-O notation ignores constants in the exact formula of the time complexity. If you calculate the number of times j runs as i increases, you get this pattern.
i: 0 1 2 3 ...
j: n n-1 n-2 n-3 ...
In total, the number of operations is around (1/2)n^2. Since Big-O notation ignores constant factors, this is still O(n^2).
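Written out (my own worked form of that count):
n + (n-1) + (n-2) + \dots + 1 \;=\; \frac{n(n+1)}{2} \;\approx\; \tfrac{1}{2} n^2 \;=\; O(n^2)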
I am trying to figure out the run time of the following algorithm.
I argue it is O(n) because the inner loop does not depend on the outer loop.
So we could have O(n) + O(n) = O(2n), which equals O(n).
Is this correct? I'm not sure my logic is right, and I cannot figure out how to analyze it correctly.
The algorithm finds, for each element of the list, the index of the nearest larger element to its left.
Thanks!
// requires: import java.util.ArrayDeque;
public static void main(String[] args) {
    int[] a = {4, 3, 2, 10, 4, 8, 9, 1};
    int[] p = new int[a.length];
    ArrayDeque<Integer> previousIndex = new ArrayDeque<Integer>();
    for (int i = 0; i < a.length; i++) {
        // pop every stored index whose value is not larger than a[i]
        while (!previousIndex.isEmpty() && a[previousIndex.peek()] <= a[i]) {
            previousIndex.pop();
        }
        if (previousIndex.isEmpty()) {
            p[i] = 0;
        } else {
            p[i] = previousIndex.peek();
        }
        previousIndex.push(i);
    }
    for (int i = 0; i < p.length; i++) {
        System.out.println(p[i]);
    }
}
This is O(N): even though you have a loop within a loop, the total number of times the inner loop body executes can never exceed the total number of times that
previousIndex.push(i);
is called, which is a.length (i.e. N).
To work out the order, you generally look at the worst case. You are correct that the nested loop is the cause for concern here:
for(int i = 0; i < a.length ; i++){
This is immediately order N
while (!previousIndex.isEmpty() && a[previousIndex.peek()] <= a[i]){
This could potentially also run nearly N times in a single pass.
So a naive bound is N*N, or N^2.
You do have to keep in mind what happens across iterations, though: each index is pushed exactly once and can be popped at most once, so the while loop cannot do N work on every pass, and the total comes back down to O(N) (see the next answer for the amortized argument).
In fact, you are fine and have an O(N) algorithm - but it's harder to prove than most. This assumes, though, that .isEmpty(), .peek() etc. on the ArrayDeque are all constant-time operations. Consult the documentation to be sure.
The key is that your processing of the deque in the inner loop is destructive:
while (!previousIndex.isEmpty() && a[previousIndex.peek()] <= a[i]){
previousIndex.pop();
}
This removes an element from previousIndex each time, and can only run when there is something to remove. Therefore, the total number of times the while loop body can run, across all indices, is at most the number of times that something is pushed onto the deque. And since that happens at only one point, at the end of the first for loop, once per element, the number of items pushed is O(N).
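If you want to see this empirically, here is a small instrumented sketch of the same idea (my own addition; the class name and the push/pop counters are just for the sketch): totalPops can never exceed totalPushes, and totalPushes equals a.length.
import java.util.ArrayDeque;

public class PreviousGreaterCount {
    public static void main(String[] args) {
        int[] a = {4, 3, 2, 10, 4, 8, 9, 1};
        ArrayDeque<Integer> previousIndex = new ArrayDeque<>();
        int totalPushes = 0, totalPops = 0;

        for (int i = 0; i < a.length; i++) {
            while (!previousIndex.isEmpty() && a[previousIndex.peek()] <= a[i]) {
                previousIndex.pop();
                totalPops++;      // each pop removes something pushed earlier
            }
            previousIndex.push(i);
            totalPushes++;        // exactly one push per element
        }

        // totalPushes == a.length, and totalPops <= totalPushes,
        // so the combined work of both loops is O(N)
        System.out.println("pushes=" + totalPushes + ", pops=" + totalPops);
    }
}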
I'm trying to figure out the running time of the code below.
If add and trimToSize are both O(N), the body of the loop would take about 2N time, and since the loop runs N times, the whole program would run in N*(2N) time... O(N^2)?
ArrayList a = new ArrayList();
for (int i = 0; i < N; i++) {
    a.add(i);
    a.trimToSize();
}
Yes. Note, though, that ArrayList#add is usually O(1); it only costs O(n) in the occasional case when the internal storage array has to be grown. It is the trimToSize call on every iteration that forces O(n) work each time.
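A rough worked bound for why add is cheap on average (my own addition, assuming the capacity grows by a constant factor whenever it fills up; the exact factor doesn't change the argument): over N appends, the occasional array copies move about
1 + 2 + 4 + \dots + N \;<\; 2N
elements in total, so the amortized cost per add is O(1).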
If you want to optimize your code, do it as follows:
ArrayList a = new ArrayList(N); // reserve space for N elements
for (int i = 0; i < N; i++) {
a.add(i); // O(1)
}
// no need for trimToSize
This now runs in only O(N)!
You are correct, it would be O(n^2). The for loop executes N times, and like you said, add and trimToSize take O(n) time, so it would be:
N * (N + N) = N * (2N) = 2 * N^2
but the constant factor of 2 does not matter for big-O notation because the n^2 is the dominating part of the function. Therefore, it is O(n^2).