Run time of algorithm - Java

I am trying to figure out the run time of the following algorithm.
I argue it is O(n) because the inner loop does not depend on the outer loop.
So we could have O(n) + O(n) = O(2n), which equals O(n).
Is this correct? I'm not sure my logic holds, and I cannot figure out how to analyze it correctly.
The algorithm finds, for each element of a list, the nearest larger element to its left.
Thanks!
import java.util.ArrayDeque;

public class Main {
    public static void main(String[] args) {
        int[] a = {4, 3, 2, 10, 4, 8, 9, 1};
        int[] p = new int[a.length];
        ArrayDeque<Integer> previousIndex = new ArrayDeque<Integer>();
        for (int i = 0; i < a.length; i++) {
            // pop indices of elements no larger than a[i]
            while (!previousIndex.isEmpty() && a[previousIndex.peek()] <= a[i]) {
                previousIndex.pop();
            }
            if (previousIndex.isEmpty()) {
                p[i] = 0;
            } else {
                p[i] = previousIndex.peek();
            }
            previousIndex.push(i);
        }
        for (int i = 0; i < p.length; i++) {
            System.out.println(p[i]);
        }
    }
}

This is O(N): even though you have a loop within a loop, the total number of times the inner loop executes can never exceed the total number of times that
previousIndex.push(i);
is called, which is a.length (or N).

To work out the order, you normally look at the worst case. You are correct that the nested loop is the cause for concern here:
for (int i = 0; i < a.length; i++) {
This is immediately order N.
while (!previousIndex.isEmpty() && a[previousIndex.peek()] <= a[i]) {
This could potentially run nearly N times on a single outer iteration, so the naive bound is N * N, or N^2.
That bound is not tight, though. Each index is pushed exactly once and can be popped at most once, so the total work done by the while loop across the entire run is bounded by N, and amortized analysis brings the overall cost back down to O(N).

In fact, you are fine and have an O(N) algorithm - but it's harder to prove than most. This assumes, though, that .isEmpty(), .peek() etc. on the ArrayDeque are all constant-time operations. Consult the documentation to be sure.
The key is that your processing of the deque in the inner loop is destructive:
while (!previousIndex.isEmpty() && a[previousIndex.peek()] <= a[i]) {
    previousIndex.pop();
}
This removes an element from previousIndex each time, and can only run when there is one to remove. Therefore, the total number of times the while loop can run, across all indices, is bounded by the number of times something is pushed onto the deque. And since this happens at only one point - the end of the body of the first for loop - we can see that the number of items pushed is O(N).
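One way to see the amortized O(N) bound concretely is to instrument the loop and count pushes and pops; the class, method, and counter names below are mine, added for illustration:

```java
import java.util.ArrayDeque;

public class AmortizedCount {
    // Returns {pushes, pops} for the previous-larger-element scan of a.
    static int[] count(int[] a) {
        ArrayDeque<Integer> previousIndex = new ArrayDeque<>();
        int pushes = 0, pops = 0;
        for (int i = 0; i < a.length; i++) {
            while (!previousIndex.isEmpty() && a[previousIndex.peek()] <= a[i]) {
                previousIndex.pop();
                pops++;          // each pop removes an element pushed earlier
            }
            previousIndex.push(i);
            pushes++;            // exactly one push per outer iteration
        }
        return new int[]{pushes, pops};
    }

    public static void main(String[] args) {
        int[] c = count(new int[]{4, 3, 2, 10, 4, 8, 9, 1});
        // Every pop is paired with an earlier push, so pops <= pushes == a.length.
        System.out.println(c[0] + " pushes, " + c[1] + " pops");
    }
}
```

For the sample array, 8 indices are pushed and only 5 are ever popped; no input can make the pop count exceed the push count.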

Related

How do I calculate the time complexity of while loops with nested if conditions?

This method merges two sorted lists, and I want to know its time complexity if the length of list a is n and the length of list b is m. I am confused by while loops because they also act somewhat like if statements (they run only while the condition is true), meaning they aren't necessarily executed at all, so how can I compute their time complexity?
ArrayList<Integer> union(ArrayList<Integer> a, ArrayList<Integer> b) {
    ArrayList<Integer> res = new ArrayList<>();
    int i = 0, j = 0;
    while (i < a.size() && j < b.size()) {
        if (a.get(i) < b.get(j)) {
            res.add(a.get(i));
            i++;
        } else if (a.get(i) > b.get(j)) {
            res.add(b.get(j));
            j++;
        } else {
            res.add(b.get(j));
            i++;
            j++;
        }
    }
    while (i < a.size()) {
        res.add(a.get(i));
        i++;
    }
    while (j < b.size()) {
        res.add(b.get(j));
        j++;
    }
    return res;
}
Well, each iteration of each while loop increments either i or j (or both) by 1.
i can grow from 0 to n - 1.
j can grow from 0 to m - 1.
Hence the total number of iterations is bounded by n + m.
Since each iteration of any of the 3 loops does a constant amount of work (both ArrayList's get and add take constant time, amortized in add's case),
the total time complexity is O(n + m).
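That n + m bound can be checked by counting the loop passes directly; the sketch below (class and method names are mine) repeats the same index movements as union and tallies them:

```java
import java.util.ArrayList;
import java.util.Arrays;

public class MergeIterationCount {
    // Counts iterations of the three while loops in the merge above.
    static long iterations(ArrayList<Integer> a, ArrayList<Integer> b) {
        long count = 0;
        int i = 0, j = 0;
        while (i < a.size() && j < b.size()) {
            count++;
            if (a.get(i) < b.get(j)) i++;
            else if (a.get(i) > b.get(j)) j++;
            else { i++; j++; }
        }
        while (i < a.size()) { count++; i++; }
        while (j < b.size()) { count++; j++; }
        return count; // never exceeds n + m: every pass advances i or j
    }

    public static void main(String[] args) {
        ArrayList<Integer> a = new ArrayList<>(Arrays.asList(1, 3, 5, 7));
        ArrayList<Integer> b = new ArrayList<>(Arrays.asList(2, 4));
        System.out.println(iterations(a, b)); // at most 4 + 2 = 6
    }
}
```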
BTW, is this method supposed to eliminate duplicates?
If so, it only eliminates duplicates for elements that appear in both input lists; if one list already contains duplicates, they won't be eliminated.
If it's not supposed to eliminate duplicates, it has a bug: when a.get(i) == b.get(j), both should be added to the output list (since both i and j are incremented in that case).
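A small driver makes that duplicate behaviour visible; the class name UnionDemo and the inlined copy of union below are mine:

```java
import java.util.ArrayList;
import java.util.Arrays;

public class UnionDemo {
    static ArrayList<Integer> union(ArrayList<Integer> a, ArrayList<Integer> b) {
        ArrayList<Integer> res = new ArrayList<>();
        int i = 0, j = 0;
        while (i < a.size() && j < b.size()) {
            if (a.get(i) < b.get(j))      res.add(a.get(i++));
            else if (a.get(i) > b.get(j)) res.add(b.get(j++));
            else { res.add(b.get(j)); i++; j++; } // equal: added only once
        }
        while (i < a.size()) res.add(a.get(i++));
        while (j < b.size()) res.add(b.get(j++));
        return res;
    }

    public static void main(String[] args) {
        // 2 appears in both lists; 5 is duplicated within a itself.
        ArrayList<Integer> a = new ArrayList<>(Arrays.asList(1, 2, 5, 5));
        ArrayList<Integer> b = new ArrayList<>(Arrays.asList(2, 3));
        System.out.println(union(a, b)); // prints [1, 2, 3, 5, 5]
    }
}
```

The cross-list duplicate 2 is collapsed to a single entry, while the in-list duplicate 5 survives twice, exactly as described above.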

Not understanding big O notation: O(∑_{i=0}^{n−1} ∑_{j=i+1}^{n} (j−i)) = O(∑_{i=0}^{n−1} (1+n−i)(n−i)/2) = O(n³)

Working on the following problem:
Given a string s, find the length of the longest substring without repeating characters.
I'm using this brute force solution:
public class Solution {
    public int lengthOfLongestSubstring(String s) {
        int n = s.length();
        int res = 0;
        for (int i = 0; i < n; i++) {
            for (int j = i; j < n; j++) {
                if (checkRepetition(s, i, j)) {
                    res = Math.max(res, j - i + 1);
                }
            }
        }
        return res;
    }

    private boolean checkRepetition(String s, int start, int end) {
        int[] chars = new int[128];
        for (int i = start; i <= end; i++) {
            char c = s.charAt(i);
            chars[c]++;
            if (chars[c] > 1) {
                return false;
            }
        }
        return true;
    }
}
The big O notation is as follows:
I understand that three nested iterations would result in a time complexity O(n^3).
I only see two sigma operators at the start of the formula; could someone enlighten me on where the third iteration comes into play?
The first sum from i=0 to n-1 corresponds to the outer for loop of lengthOfLongestSubstring, which you can see iterates from i=0 to n-1.
The second sum, from j = i+1 to n, corresponds to the second for loop (the formula starts j at i+1 rather than i because there's no need to check length-0 substrings).
Generally, we would expect this particular double for loop structure to produce O(n^2) algorithms and a third for loop (from k=j+1 to n) to lead to O(n^3) ones. However, this general rule (k for loops iterating through all k-tuples of indices producing O(n^k) algorithms) is only the case when the work done inside the innermost for loop is constant. This is because having k for loops structured in this way produces O(n^k) total iterations, but you need to multiply the total number of iterations by the work done in each iteration to get the overall complexity.
From this idea, we can see that the reason lengthOfLongestSubstring is O(n^3) is that the work done inside the body of the second for loop is not constant, but rather O(n). checkRepetition(s, i, j) iterates from i to j, taking j-i time (hence the expression inside the second sum). O(j-i) time is O(n) time in the worst case, because i could be as low as 0 and j as high as n, and of course O(n-0) = O(n) (it's not too hard to show that checkRepetition is O(n) in the average case as well).
As a commenter mentioned, having a linear operation inside the body of the second for loop has the same practical effect on complexity as having a third for loop, which would probably be easier to see as O(n^3) (you could even imagine the body of checkRepetition, including its for loop, pasted into lengthOfLongestSubstring in place to see the same result). But the basic idea is that doing O(n) work for each of the O(n^2) iterations of the two for loops means the total complexity is O(n) * O(n^2) = O(n^3).
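One way to make the O(n) * O(n^2) = O(n^3) argument tangible is to count the characters checkRepetition actually examines. The counter and harness below are mine; the string of distinct characters is chosen so the check never returns early:

```java
public class CubicCount {
    static long reads; // total characters examined across all checkRepetition calls

    static boolean checkRepetition(String s, int start, int end) {
        int[] chars = new int[128];
        for (int i = start; i <= end; i++) {
            reads++;
            char c = s.charAt(i);
            if (++chars[c] > 1) return false;
        }
        return true;
    }

    // Runs the brute-force double loop on a string of n distinct characters
    // and returns the total character-read count.
    static long countFor(int n) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < n; i++) sb.append((char) ('0' + i)); // n distinct chars
        String s = sb.toString();
        reads = 0;
        for (int i = 0; i < n; i++)
            for (int j = i; j < n; j++)
                checkRepetition(s, i, j);
        return reads; // equals n(n+1)(n+2)/6 when all characters are distinct
    }

    public static void main(String[] args) {
        System.out.println(countFor(16) + " " + countFor(32)); // 816 5984
    }
}
```

Doubling n from 16 to 32 multiplies the count by roughly 8, the signature of cubic growth.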

What will be the time complexity of this algorithm

I am very new to competitive programming and to Big O notation.
public void function(int n) {
    for (int i = n; i > 0; i /= 3) {
        for (int j = 0; j < i; j++) {
            System.out.println("Hello");
        }
    }
}
This is the algorithm.
As far as I know, time complexity describes how the run time is affected by the size of the input.
So take an example where n is 10:
the outer loop runs log n times and the inner loop runs i times;
that is, the inner loop's count depends on i, not directly on n.
So I'm a bit confused about how the time complexity is calculated here.
I think it is O(log n); please correct me if I am wrong.
Will it be O(log n), O(n log n), or O(n^2)?
Please help me out with this.
Thank you.
I will try to explain it in the simplest terms possible.
The outer loop simply runs log base 3 of n times, since i decreases by a factor of 3 each time.
The total work done by the inner loop is:
n + n/3 + n/9 + n/27 + ... + n/3^(log_3 n)
and the terms after the first, n/3 + ... + n/3^(log_3 n), always sum to less than n.
For example, let n = 100:
100 + 100/3 + 100/9 + 100/27 + ... = 100 + (33.3 + 11.1 + 3.7 + ...)
and we can clearly see that the terms in the bracket sum to less than 100.
The total time complexity of the overall solution will be O(n).
(Had the loop been written for (int i = 0; ...; i *= 3), it would never terminate, since i would stay 0 forever; that version is effectively O(+oo).)
Assuming for (int i = 1; i < n; i *= 3) instead, it is still O(n):
the outer loop is clearly O(log_3 n), since we keep multiplying by 3, and the inner loop body executes 1 + 3 + 9 + 27 + ... + 3^(log_3 n) times in total. That is a geometric progression summing to approximately (3/2) * 3^(log_3 n), which by the log rules is O(n), so the inner loop takes O(n) across all iterations and the total complexity is O(n).
For your code:
for (int i = n; i > 0; i /= 3) {
    for (int j = 0; j < i; j++) {
        System.out.println("Hello");
    }
}
The inner loop variable j depends on the outer loop variable i, so the inner loop is what decides the complexity of your algorithm.
Since j runs n times in the first outer iteration, n/3 times in the second, and so on, your total work can be calculated as
n + n/3 + n/9 + n/27 + ...
resulting in O(n).
So this is a great question! It's a tricky one that takes a little more thinking to analyse.
As correctly stated in some of the other answers, the outer loop:
for (int i = n; i > 0; i /= 3)
will run log(n) times. Specifically log_3(n) times, but in big O notation we don't usually worry about the base, so log(n) is fine.
Now the nested loop is a bit trickier:
for (int j = 0; j < i; j++) {
On first glance you may think this is a simple log(n) loop, but let's look a little further.
On the first iteration it will run n times, since the value of i is n. On the next iteration it will run n/3 times. Then n/9, n/27, n/81, and so on.
If we sum this series, it is clear to see it totals less than 2n.
Therefore we can conclude this algorithm has a complexity of O(n).
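The "less than 2n" claim (in fact the sum stays below 1.5n, since the ratio is 1/3) is easy to verify by counting iterations; the class and method names here are mine:

```java
public class GeometricCount {
    // Counts total inner-loop iterations of the i/=3 nested loop for a given n.
    static long innerIterations(int n) {
        long count = 0;
        for (int i = n; i > 0; i /= 3)
            for (int j = 0; j < i; j++)
                count++; // one per "Hello" in the original code
        return count;
    }

    public static void main(String[] args) {
        for (int n : new int[]{10, 100, 1000, 1_000_000}) {
            System.out.println(n + " -> " + innerIterations(n)); // stays below 1.5 * n
        }
    }
}
```

For n = 10, 100, and 1000 the counts are 14, 148, and 1498, each below 1.5n.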
In your code snippet:
for (int i = n; i > 0; i /= 3) {
    for (int j = 0; j < i; j++) {
        System.out.println("Hello");
    }
}
The outer loop in i is O(log_3(n)), because each pass divides i by 3, so it takes about log_3(n) passes for i to drop from n to 0. This is logarithmic behavior (log_3 in this case). The inner loop in j, however, iterates i times, not log(n) times, so you cannot simply multiply two logarithmic factors. Summing the inner iteration counts over all passes of the outer loop gives n + n/3 + n/9 + ..., a geometric series bounded by (3/2)n, so we arrive at:
O(n)

How to determine Big O performance by looking at for loops?

I have just begun learning about Big O notation, and honestly I don't think I have the hang of it; I am not quite sure how to determine the O() performance just by looking at for loops. I have listed a few examples and then some of the answers that I think are correct! Please let me know if they are wrong; any explanations would be greatly appreciated!
for (int i = 0; i < 1000; i++) {
    count++;
}
I believe this would be O(n), because nothing else is going on in the loop except a constant-time increment. We iterate 'n' times, or in this case 1000?
for (int i = 0; i < n; i++) {
    for (int j = 0; j < n; j++) {
        count++;
    }
}
Would this one be O(n^2), because the loops are nested and it iterates n times twice, n * n?
for (int i = 0; i < n; i++) {
    for (int j = i; j < n; j++) {
        count++;
    }
}
Is this one another O(n^2) but in the worst case? Or is this O(n log n)?
Big-O notation is supposed to be a guide to help the user understand how the runtime increases with input.
For the first example, the loop runs exactly 1000 times no matter what n is, thus it is O(1000) = O(1) time.
For the second example, the nested loop runs n times for every time the outer loop runs, which runs n times. This is a total of n*n = n^2 operations. Thus the number of operations increases proportional to the square of n, thus we say it is O(n^2).
For the third example, it is still O(n^2). This is because Big-O notation ignores constants in the exact formula of the time complexity. If you calculate the number of times j's loop runs as i increases, you get this pattern:
i: 0    1      2      3      ...
j: n    n - 1  n - 2  n - 3  ...
In total, the number of operations is around (1/2)n^2. Since Big-O notation ignores constant factors, this is still O(n^2).
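The triangular pattern above can be confirmed by counting the increments directly; this small sketch (names are mine) tallies the third example:

```java
public class TriangularCount {
    // Counts how many times count++ runs in the j-starts-at-i nested loop.
    static long countOps(int n) {
        long count = 0;
        for (int i = 0; i < n; i++)
            for (int j = i; j < n; j++)
                count++;
        return count; // n + (n-1) + ... + 1 = n(n+1)/2
    }

    public static void main(String[] args) {
        System.out.println(countOps(1000)); // 500500, roughly 1000^2 / 2
    }
}
```

countOps(n) always equals n(n+1)/2, which grows like n^2/2, hence O(n^2).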

Time Complexity of Simple Algo

I'm trying to figure out the running time of the code below.
If add and trimToSize are both O(n), the body of the loop runs in 2N time, and since the loop itself runs N times, the whole program runs in N * (2N) time... O(n^2)?
ArrayList<Integer> a = new ArrayList<>();
for (int i = 0; i < N; i++) {
    a.add(i);
    a.trimToSize();
}
Yes. But ArrayList#add is usually O(1), except when the internal storage array has to be grown.
If you want to optimize your code, do it as follows:
ArrayList<Integer> a = new ArrayList<>(N); // reserve space for N elements
for (int i = 0; i < N; i++) {
    a.add(i); // O(1)
}
// no need for trimToSize
This now runs in only O(n)!
You are correct: it would be O(n^2). The for loop executes N times, and, as you said, add and trimToSize take O(n) time, so it would be:
N * (N + N) = N * (2N) = 2 * N^2
but the constant factor of 2 does not matter in big-O notation, because the n^2 term dominates the function. Therefore, it is O(n^2).
