I'm trying to figure out the running time of the code below.
If add and trimToSize are both O(n), the body of the loop would run in 2N time, and since the loop runs N times, the whole program would run in N*(2N) time?... O(n^2)?
ArrayList<Integer> a = new ArrayList<>();
for (int i = 0; i < N; i++) {
    a.add(i);
    a.trimToSize();
}
Yes. But note that ArrayList#add on its own is usually O(1), except for the occasional call where the internal storage array has to be grown; it is the trimToSize call, which copies the backing array, that makes every iteration O(n).
If you want to optimize your code, do it as follows:
ArrayList<Integer> a = new ArrayList<>(N); // reserve space for N elements
for (int i = 0; i < N; i++) {
    a.add(i); // O(1)
}
// no need for trimToSize
This now runs in only O(n)!
You are correct, it would be O(n^2). The for loop executes N times, and like you said, add and trimToSize take O(n) time, so it would be:
N * (N + N) = N * (2N) = 2 * N^2
but the constant factor of 2 does not matter in big-O notation, because the N^2 term is the dominating part of the function. Therefore, it is O(n^2).
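To make the difference concrete, here is a rough timing sketch of my own (the class and method names are made up, and this is not a rigorous benchmark) that times the trimToSize-in-the-loop version against the pre-sized version. As N doubles, the first column should grow roughly 4x (quadratic) while the second grows roughly 2x (linear).

import java.util.ArrayList;

public class GrowthSketch {
    // calls trimToSize() every iteration: O(n) work per add, O(n^2) overall
    static long withTrim(int n) {
        long start = System.nanoTime();
        ArrayList<Integer> a = new ArrayList<>();
        for (int i = 0; i < n; i++) {
            a.add(i);
            a.trimToSize(); // copies the backing array every time
        }
        return System.nanoTime() - start;
    }

    // reserves capacity up front: amortized O(1) per add, O(n) overall
    static long preSized(int n) {
        long start = System.nanoTime();
        ArrayList<Integer> a = new ArrayList<>(n);
        for (int i = 0; i < n; i++) {
            a.add(i);
        }
        return System.nanoTime() - start;
    }

    public static void main(String[] args) {
        for (int n = 5_000; n <= 40_000; n *= 2) {
            System.out.printf("n=%d  withTrim=%dms  preSized=%dms%n",
                    n, withTrim(n) / 1_000_000, preSized(n) / 1_000_000);
        }
    }
}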
Working on the following problem:
Given a string s, find the length of the longest substring without repeating characters.
I'm using this brute force solution:
public class Solution {
    public int lengthOfLongestSubstring(String s) {
        int n = s.length();
        int res = 0;
        for (int i = 0; i < n; i++) {
            for (int j = i; j < n; j++) {
                if (checkRepetition(s, i, j)) {
                    res = Math.max(res, j - i + 1);
                }
            }
        }
        return res;
    }

    private boolean checkRepetition(String s, int start, int end) {
        int[] chars = new int[128];
        for (int i = start; i <= end; i++) {
            char c = s.charAt(i);
            chars[c]++;
            if (chars[c] > 1) {
                return false;
            }
        }
        return true;
    }
}
The big-O analysis is as follows:
O( Σ_{i=0}^{n-1} Σ_{j=i+1}^{n} (j - i) ) = O(n^3)
I understand that three nested iterations would result in a time complexity O(n^3).
I only see two sigma operators at the start of the formula; could someone enlighten me on where the third iteration comes into play?
The first sum from i=0 to n-1 corresponds to the outer for loop of lengthOfLongestSubstring, which you can see iterates from i=0 to n-1.
The second sum, from j = i+1 to n, corresponds to the second for loop (the formula starts j at i+1 rather than i because there's no need to check length-0 substrings).
Generally, we would expect this particular double for loop structure to produce O(n^2) algorithms and a third for loop (from k=j+1 to n) to lead to O(n^3) ones. However, this general rule (k for loops iterating through all k-tuples of indices producing O(n^k) algorithms) is only the case when the work done inside the innermost for loop is constant. This is because having k for loops structured in this way produces O(n^k) total iterations, but you need to multiply the total number of iterations by the work done in each iteration to get the overall complexity.
From this idea, we can see that the reason lengthOfLongestSubstring is O(n^3) is that the work done inside the body of the second for loop is not constant, but rather O(n). checkRepetition(s, i, j) iterates from i to j, taking O(j-i) time (hence the expression inside the second term of the sum). O(j-i) time is O(n) time in the worst case, because i could be as low as 0 and j as high as n, and of course O(n-0) = O(n) (it's not too hard to show that checkRepetition is O(n) in the average case as well).
As mentioned by a commenter, having a linear operation inside the body of your second for loop has the same practical effect, in terms of complexity, as having a third for loop, which would probably be easier to see as being O(n^3) (you could even paste the body of checkRepetition, including its for loop, into lengthOfLongestSubstring in place of the call and see the same result, as sketched below). But the basic idea is that doing O(n) work for each of the O(n^2) iterations of the 2 for loops means the total complexity is O(n)*O(n^2) = O(n^3).
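For concreteness, here is my own sketch of that inlining (the class name InlinedSolution and the main method are mine, added only so the snippet compiles and runs):

public class InlinedSolution {
    // checkRepetition pasted into the double loop so all three nested loops are visible
    public int lengthOfLongestSubstring(String s) {
        int n = s.length();
        int res = 0;
        for (int i = 0; i < n; i++) {              // O(n) iterations
            for (int j = i; j < n; j++) {          // O(n) iterations each
                // formerly checkRepetition(s, i, j): O(j - i) work, the "third loop"
                int[] chars = new int[128];
                boolean unique = true;
                for (int k = i; k <= j; k++) {
                    if (++chars[s.charAt(k)] > 1) {
                        unique = false;
                        break;
                    }
                }
                if (unique) {
                    res = Math.max(res, j - i + 1);
                }
            }
        }
        return res;
    }

    public static void main(String[] args) {
        System.out.println(new InlinedSolution().lengthOfLongestSubstring("abcabcbb")); // prints 3
    }
}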
Since these 2 loops iterate a different number of times, what would the time complexity be?
int middleindex = items.length / 2;
int index = 0;
while (index < middleindex) {
    System.out.println(items[index]);
    index++;
}

for (int i = 0; i < 100; i++) {
    System.out.println("Hi");
}
The first loop will take time proportional to the number of items:
If you have n items, it will take some time t.
If you have 10 * n items, it will take about 10 * t.
So the computation time is linearly related to the number of items: it is O(n).
The second loop isn't related to the number of items; it always runs 100 times, which is constant time: O(1). Taken together, the snippet is O(n) + O(1) = O(n).
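If you want to see this empirically, here is a quick sketch of my own (class name made up; it counts iterations instead of printing, to keep the output readable) showing the first loop's iteration count growing with the array size while the second loop's stays fixed at 100:

public class LoopCounts {
    public static void main(String[] args) {
        for (int n = 1_000; n <= 8_000; n *= 2) {
            int[] items = new int[n];

            int firstLoop = 0;
            int middleindex = items.length / 2;
            for (int index = 0; index < middleindex; index++) {
                firstLoop++;          // grows with n: O(n)
            }

            int secondLoop = 0;
            for (int i = 0; i < 100; i++) {
                secondLoop++;         // always 100: O(1)
            }

            System.out.println("n=" + n + "  first=" + firstLoop + "  second=" + secondLoop);
        }
    }
}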
This question already has answers here: How can I find the time complexity of an algorithm?
// Example 1
int sum = 0;
for (int n = N; n > 0; n /= 2)
    for (int i = 0; i < n; i++)
        sum++;

// Example 2
int sum = 0;
for (int i = 1; i < N; i *= 2)
    for (int j = 0; j < i; j++)
        sum++;

// Example 3
int sum = 0;
for (int i = 1; i < N; i *= 2)
    for (int j = 0; j < N; j++)
        sum++;
I have been struggling with this for a long time. I am a second-year student and I still can't calculate the complexity of an algorithm. How can I calculate it? I feel very incompetent because I never seem to get it!
For example, is the complexity of a for loop always N? How can I tell? Can you recommend any resources I can read? Any videos?
Well, your first and second examples are the same in terms of time complexity: both are O(N). Why? Let us compute. In the first example, the inner loop runs N times, then N/2 times, then N/4, and so on down to 1. So the total work is O(N + N/2 + N/4 + ... + 1), and the sum of this geometric series is at most 2N - 1. So the time complexity of the first case is O(N).
In the second example, the inner loop runs 1 time, then 2 times, then 4, and so on up to (roughly) N. So the total work is O(1 + 2 + 4 + ... + N), and the sum of this geometric series is also at most 2N - 1. So the time complexity of the second case is also O(N).
In the third example, the outer loop runs log(N) times and the inner loop runs N times on every pass (its bound does not depend on i), so the required time complexity is O(N log N). (All calculations are approximate and all log bases are 2.)
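If it helps to sanity-check the math, here is a small sketch of my own (the class name is made up) that simply runs the three snippets and prints how many times sum was incremented for a few values of N; the first two counts stay below 2N, while the third tracks N * log2(N):

public class LoopCountCheck {
    public static void main(String[] args) {
        for (int N = 1 << 10; N <= 1 << 14; N <<= 2) {
            // Example 1: inner bound halves each outer pass -> at most ~2N total
            long sum1 = 0;
            for (int n = N; n > 0; n /= 2)
                for (int i = 0; i < n; i++)
                    sum1++;

            // Example 2: inner bound doubles each outer pass -> at most ~2N total
            long sum2 = 0;
            for (int i = 1; i < N; i *= 2)
                for (int j = 0; j < i; j++)
                    sum2++;

            // Example 3: log2(N) outer passes, N inner iterations each -> ~N*log2(N)
            long sum3 = 0;
            for (int i = 1; i < N; i *= 2)
                for (int j = 0; j < N; j++)
                    sum3++;

            System.out.printf("N=%d  ex1=%d  ex2=%d  ex3=%d  N*log2(N)=%.0f%n",
                    N, sum1, sum2, sum3, N * (Math.log(N) / Math.log(2)));
        }
    }
}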
Well, to work out the time complexity of a for loop, count how many times the loop variable "i" is assigned a value, i.e. how many times the body actually runs (this can be the same as, or different from, the loop bound).
To learn about time complexity, check out the HackerEarth material, and every time you write an algorithm, try to calculate its time complexity; that is the best way to learn it. Also look at the Master theorem for recurrence relations, but learn the basics first.
Resource
https://www.geeksforgeeks.org/analysis-of-algorithms-set-4-analysis-of-loops/
Explanation
A general idea: the complexity of a loop is the number of times its body runs. So a loop for (int i = 0; i < n; i++) runs n times and is of complexity O(n). If there are multiple loops that are not nested, the complexity of the code is the largest of the individual complexities. If the loops are nested, the iteration counts get multiplied; in the 3rd example you have shown above, the outer loop runs log(N) times and the inner loop runs N times, making the complexity O(N log N). (This is a general idea and not a precise definition!)
This previous question may be helpful, because it covers a few different approaches to calculating the complexity of an algorithm, along with quite a few good resources.
As for your example, the complexity of a for loop may not always be N.
For example, the following code snippet is linear (time complexity O(N)) because it steps sequentially through every value from i = 0 up to i = N - 1:
for (int i = 0; i < N; i++) {
    sum++;
}
Whereas this code snippet is logarithmic in its time complexity, because i doesn't step through every value from 0 to N; instead it starts at 1 and is multiplied by 2 on each iteration:
for (int i = 1; i < N; i *= 2) {
    sum++;
}
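As a quick, informal check (the class name below is mine), counting iterations shows the difference directly: for N = 1,000,000 the linear loop runs 1,000,000 times while the doubling loop runs only 20 times, which is about log2(N):

public class LinearVsLogarithmic {
    public static void main(String[] args) {
        int N = 1_000_000;

        int linearSteps = 0;
        for (int i = 0; i < N; i++) {
            linearSteps++;            // runs N times
        }

        int logSteps = 0;
        for (int i = 1; i < N; i *= 2) {
            logSteps++;               // runs about log2(N) times
        }

        System.out.println("linear: " + linearSteps);  // 1000000
        System.out.println("log:    " + logSteps);     // 20
    }
}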
I have just begun learning about Big O Notation and honestly I don't think I have the hang of it, and I am not quite sure how to determine the O() performance by just looking at for loops. I have listed a few examples and then some of the answers that I think are correct! Please let me know if they are wrong and any explanations would be greatly appreciated!
for (int i = 0; i < 1000; i++) {
    count++;
}
I believe this would be O(n), because nothing else is going on in the for loop except a constant-time increment. We iterate 'n' times, or in this case 1000?
for (int i = 0; i < n; i++) {
    for (int j = 0; j < n; j++) {
        count++;
    }
}
Would this one be O(n^2), because the loops are nested and each iterates n times, so n*n?
for (int i = 0; i < n; i++) {
    for (int j = i; j < n; j++) {
        count++;
    }
}
Is this one another O(n^2) but in the worst case? Or is this O(n log n)?
Big-O notation is supposed to be a guide to help the user understand how the runtime increases with input.
For the first example, the loop runs exactly 1000 times no matter what n is, thus it is O(1000) = O(1) time.
For the second example, the nested loop runs n times for every time the outer loop runs, which runs n times. This is a total of n*n = n^2 operations. Thus the number of operations increases proportional to the square of n, thus we say it is O(n^2).
For the third example, it is still O(n^2). This is because Big-O notation ignores constant factors in the exact formula for the running time. If you count how many times the inner loop runs for each value of i, you get this pattern:
i:                0    1    2    3   ...
inner iterations: n   n-1  n-2  n-3  ...
In total, the number of operations is n + (n-1) + ... + 1 = n(n+1)/2, which is around (1/2)n^2. Since Big-O notation ignores constants, this is still O(n^2).
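A quick way to convince yourself (this sketch and its class name are mine) is to count the iterations of the third example and compare against the closed form n(n+1)/2:

public class TriangularCount {
    public static void main(String[] args) {
        int n = 1_000;

        long count = 0;
        for (int i = 0; i < n; i++) {
            for (int j = i; j < n; j++) {
                count++;                       // inner loop runs n - i times
            }
        }

        long formula = (long) n * (n + 1) / 2; // n + (n-1) + ... + 1
        System.out.println("counted: " + count);   // 500500
        System.out.println("formula: " + formula); // 500500
    }
}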
I am trying to figure out the run time of the following algorithm.
I argue it is O(n) because the inner loop does not depend on the outer loop.
So we could have O(n) + O(n) = O(2n) which equals O(n)
Is this correct? I'm not sure my logic is right and I cannot figure out how to analyze it correctly.
The algorithm finds, for each element of a list, the nearest element to its left that is larger than it.
Thanks!
import java.util.ArrayDeque;

public class Main { // enclosing class name not shown in the original snippet
    public static void main(String[] args) {
        int[] a = {4, 3, 2, 10, 4, 8, 9, 1};
        int[] p = new int[a.length];
        ArrayDeque<Integer> previousIndex = new ArrayDeque<Integer>();

        for (int i = 0; i < a.length; i++) {
            // pop indices of elements that are not larger than a[i]
            while (!previousIndex.isEmpty() && a[previousIndex.peek()] <= a[i]) {
                previousIndex.pop();
            }
            if (previousIndex.isEmpty()) {
                p[i] = 0;
            } else {
                p[i] = previousIndex.peek();
            }
            previousIndex.push(i);
        }

        for (int i = 0; i < p.length; i++) {
            System.out.println(p[i]);
        }
    }
}
This is O(N): even though you have a loop within a loop, the total number of times the inner loop body can execute is never more than the total number of times that
previousIndex.push(i);
is called, which is a.length (or N).
To work out the order, you are really looking at the worst case. You are correct that the nested loop is the cause for concern here:
for(int i = 0; i < a.length ; i++){
This is immediately order N
while (!previousIndex.isEmpty() && a[previousIndex.peek()] <= a[i]){
This could potentially also go nearly N times.
So the final order is N*N or N^2
You do have to keep in mind the usual case though. If it is likely that the while loop will in fact exit after only a couple of iterations you could get back down to O(N).
In fact, you are fine and have an O(N) algorithm - but it's harder to prove than most. This assumes, though, that .isEmpty(), .peek() etc. on the ArrayDeque are all constant-time operations. Consult the documentation to be sure.
The key is that your processing of the deque in the inner loop is destructive:
while (!previousIndex.isEmpty() && a[previousIndex.peek()] <= a[i]){
previousIndex.pop();
}
This removes an element from previousIndex each time, and it can only run when there is an element to remove. Therefore, the total number of times the while loop body can run, across all indices, is at most the number of times that something is pushed onto the deque. And since pushes happen at only one point - at the end of the first for loop's body - the number of items pushed is N, so the total work is O(N).
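To see this amortized argument numerically, here is a sketch of my own (the class name and the random test data are made up) that instruments the loop with push and pop counters; for any input, the pops never exceed the pushes, and the pushes equal N:

import java.util.ArrayDeque;
import java.util.Random;

public class AmortizedCheck {
    public static void main(String[] args) {
        int[] a = new Random(42).ints(100_000, 0, 1_000_000).toArray();
        int[] p = new int[a.length];
        ArrayDeque<Integer> previousIndex = new ArrayDeque<>();

        long pushes = 0, pops = 0;
        for (int i = 0; i < a.length; i++) {
            while (!previousIndex.isEmpty() && a[previousIndex.peek()] <= a[i]) {
                previousIndex.pop();
                pops++;                 // each pop removes a previously pushed index
            }
            p[i] = previousIndex.isEmpty() ? 0 : previousIndex.peek();
            previousIndex.push(i);
            pushes++;                   // exactly one push per outer iteration
        }

        System.out.println("N      = " + a.length);
        System.out.println("pushes = " + pushes);  // equals N
        System.out.println("pops   = " + pops);    // never exceeds pushes
    }
}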