This question already has answers here:
Big O, how do you calculate/approximate it?
(24 answers)
Closed 2 years ago.
I would like to know the complexity of this algorithm; I am a bit confused by the notation.
for(int i=0; i<N; ++i)
for(int j=M-1; j>=i; --j)
++x
The answers are:
O(min(N,M))
O(N*M)
O(N+M)
O(max(N,M))
Short answer: O(N*M)
Explanation:
First, the outer loop executes N times no matter what.
The inner loop executes M times on the first iteration of the outer loop, M-1 times on the second, and so on. This is an arithmetic progression, whose sum is n*(a1+an)/2, so the average number of inner iterations is (a1+an)/2, roughly (M+1)/2, which is O(M).
Multiplying O(N) by O(M) gives you O(N*M).
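A quick way to sanity-check this is to count the iterations directly and compare them with N*M. A minimal sketch (the class and method names are just for illustration):
public class LoopCount {
    // Count how often ++x runs for given N and M.
    static long countIterations(int N, int M) {
        long x = 0;
        for (int i = 0; i < N; ++i)
            for (int j = M - 1; j >= i; --j)
                ++x;
        return x;
    }

    public static void main(String[] args) {
        // The count grows roughly like N*M (exactly N*M - N*(N-1)/2 when N <= M).
        System.out.println(countIterations(10, 100));   // 955
        System.out.println(countIterations(100, 1000)); // 95050
    }
}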
This question already has answers here:
Big O, how do you calculate/approximate it?
(24 answers)
Closed 9 months ago.
I was trying to find a way to sort arrays with the smallest time complexity possible. I was thinking that the following is O(n); however, that seems unlikely to me because the best existing sorting methods are O(n log n). My question is: what is the big-O complexity of this method in Java, and how is it calculated?
public static void ArrSort(int[] arr) {
    int temp;
    for (int j = 0; j < arr.length - 1; j++) {   // j+1 must stay inside the array
        if (arr[j] > arr[j + 1]) {
            // swap the out-of-order pair
            temp = arr[j];
            arr[j] = arr[j + 1];
            arr[j + 1] = temp;
            // step back so the previous pair is re-checked (j++ runs next)
            if (j == 0)
                j = -1;
            else
                j = j - 2;
        }
    }
}
Its time complexity is NOT O(n) at all. Because the algorithm manipulates the loop index j, it contains only a single loop and therefore looks like O(n), but the loop body does not run just n times: every swap moves j backwards, so in the average case (and in the worst case, e.g. a reversed array) the body runs on the order of n² times. It is an O(n²) sorting algorithm.
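One way to see this is to instrument the loop and count how many times its body runs. On a reversed array (the worst case for this method) the count grows roughly quadratically. A minimal sketch using the corrected loop bound from the question's code (the class and method names are just for illustration):
public class SortCount {
    // Same sorting logic as in the question, but counting loop-body executions.
    static long arrSortCounted(int[] arr) {
        long steps = 0;
        int temp;
        for (int j = 0; j < arr.length - 1; j++) {
            steps++;
            if (arr[j] > arr[j + 1]) {
                temp = arr[j];
                arr[j] = arr[j + 1];
                arr[j + 1] = temp;
                j = (j == 0) ? -1 : j - 2;
            }
        }
        return steps;
    }

    public static void main(String[] args) {
        for (int n : new int[]{100, 200, 400}) {
            int[] arr = new int[n];
            for (int i = 0; i < n; i++) arr[i] = n - i; // reversed = worst case
            // Doubling n roughly quadruples the step count: quadratic growth.
            System.out.println("n=" + n + "  steps=" + arrSortCounted(arr));
        }
    }
}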
This question already has answers here:
Computing Time T(n) and Big-O with an infinite loop
(3 answers)
Closed 3 years ago.
What will be the big-O notation for the following algorithm, which multiplies n inside the loop?
void testing(int n) {
    for (int i = 0; i < n; i++) {
        n = n * 2;
        System.out.println("hi" + n);
    }
}
I'll try to be as rigorous as possible for my answer.
EDIT: I forgot to say that we assume every operation (comparison, assignment, multiplication) has complexity O(1).
In short, this algorithm does not terminate in most cases, so complexity is not defined for it.
Complexity is an upper bound on the cost C of an algorithm: stating O(n) complexity means C ≤ k*n for some constant k > 0. A non-terminating algorithm has an infinite cost, and no bound of that form can hold for it.
Now, let's look at why your algorithm is non-terminating: on each iteration we continue while i < n, yet on each iteration n is multiplied by 2. When the loop condition is checked, the relation between i and n is n = n0*2^i, where n0 is the initial value of n, so i can never catch up with n.
Therefore, your algorithm only terminates when n0 ≤ 0, and in that case it never enters the loop at all.
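To see the relation n = n0*2^i concretely, you can instrument the loop with a safety cap so that the demo stops. A minimal sketch (the cap of 10 iterations and the class name are just for illustration):
public class GrowthDemo {
    public static void main(String[] args) {
        int n = 5;                              // n0 = 5
        for (int i = 0; i < n && i < 10; i++) { // cap at 10 iterations so the demo stops
            n = n * 2;
            // i grows by 1 per iteration while n doubles, so i < n never becomes false.
            System.out.println("i=" + i + "  n=" + n);
        }
    }
}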
I tried running your code in my IDE and I found that it is an infinite loop.
Algorithm complexity is only defined for algorithms, which by (the most often accepted) definition must terminate. When a program doesn't terminate, it is not an algorithm. So it has no "algorithmic time complexity".
This question already has answers here:
What is a plain English explanation of "Big O" notation?
(43 answers)
Big-O for Eight Year Olds? [duplicate]
(25 answers)
What does O(log n) mean exactly?
(32 answers)
Big O, how do you calculate/approximate it?
(24 answers)
How can I find the time complexity of an algorithm?
(10 answers)
Closed 3 years ago.
If an algorithm executes a statement n/2 times, how come its complexity is O(n)? The video explains that it is because of the degree of the polynomial. Please explain.
for (int i = 0; i < n; i = i + 2) {
    System.out.println(n); // this statement is printed n/2 times
}
f(n) = n/2 then O(n)
In simple words, although the statement will be printed n/2 times, it still holds a linear relationship with n.
For n=10, it will print 5 times.
For n=50, it will print 25 times.
For n=100, it will print 50 times.
Notice the linear relationship. The factor 1/2 is just a constant multiplied by n. It is a linear relationship, and O(n) signifies a linear relationship and doesn't care about the constant (which is 1/2 in this case). Even f(n) = n/3 would have been O(n).
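You can verify that linear relationship by counting the iterations for a few values of n. A minimal sketch (the class name is just for illustration):
public class HalfCount {
    public static void main(String[] args) {
        for (int n : new int[]{10, 50, 100}) {
            int count = 0;
            for (int i = 0; i < n; i = i + 2) {
                count++; // stands in for the print statement
            }
            // count is n/2: doubling n doubles count, which is exactly a linear relationship.
            System.out.println("n=" + n + "  count=" + count);
        }
    }
}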
Yes, as Aoerz already said, to understand your problem, you should understand what the O notation means.
In a math way:
O(f(n)) = { g(n) : ∃ c > 0, n0 ≥ 0 such that g(n) ≤ c*f(n) ∀ n ≥ n0 }
so g(n) ∈ O(f(n)) if g(n) ≤ c*f(n) for some constant c, for all n beyond a certain n0.
To put it in an easy way, think of n as a really big number. How much do all the other factors matter? What is the only main factor that really matters?
Example:
f(n) = n^3 + 300*n + 5 --> f(n) ∈ O(n^3) (try it with n = 100 and you'll see that the n^3 term already dominates)
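A quick numerical check of that example shows how the n^3 term swamps the others as n grows. A minimal sketch (the class name is just for illustration):
public class DominantTerm {
    public static void main(String[] args) {
        for (long n : new long[]{10, 100, 1000}) {
            long f = n * n * n + 300 * n + 5;   // f(n) = n^3 + 300n + 5
            long cube = n * n * n;
            // The ratio f(n)/n^3 approaches 1 as n grows: only the n^3 term matters.
            System.out.printf("n=%d  f(n)=%d  f(n)/n^3=%.4f%n", n, f, (double) f / cube);
        }
    }
}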
I'm given code for an algorithm as such:
1 sum =0;
2 for(i=0; i<n; i++)
3 for(j=1; j<= n; j*=3)
4 sum++;
I'm told this algorithm runs in O(nlogn) where log is in base 3.
So I get that the second line runs n times, and since line 3 is independent of line 2 I would have to multiply the two to get the big O. However, I'm not sure how the answer is n log n (log in base 3). Is there a guaranteed way to figure this out every time? It seems like with nested loops, a different case can occur each time.
What you have been told is correct. The algorithm runs in O(n log(n)). The third line, for(j=1; j<= n; j*=3), runs in O(log3(n)) because j is multiplied by 3 each time. To see it more clearly, solve this problem: how many times do you need to multiply 1 by 3 to reach n? That is, 3^x = n, and the solution is x = log3(n).
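To make that concrete, you can count how many times 1 has to be multiplied by 3 before the result exceeds n. A minimal sketch (the class name is just for illustration):
public class Log3Count {
    public static void main(String[] args) {
        int n = 1000;
        int x = 0;
        // Multiply j by 3 until it exceeds n: x ends up as roughly log3(n).
        for (long j = 1; j <= n; j *= 3)
            x++;
        System.out.println(x + " multiplications, log3(" + n + ") = "
                + Math.log(n) / Math.log(3));
    }
}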
Yes. The algorithm runs in n*log(n) time, where the log is base 3.
The easiest way to calculate complexity is to calculate number of operations done.
The outer for loop runs n times. Now let's work out how many times the inner for loop runs for a given n. For n = 1, 2 the inner loop runs once. For n = 3, 4, 5, 6, 7, 8 it runs twice. And so on...
That means the inner for loop runs a logarithmic number of times (log(n)) for each n.
So n*log(n) will be total complexity.
In the second loop you have j *= 3, which means n can be divided by 3 about log3(n) times before j exceeds it. That gives the inner loop its O(log n) complexity.
Since your first loop is O(n), you get O(n log n) in the end.
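For a rough check, the following sketch counts the sum++ operations for the full nested loop and compares the total with n*log3(n); the class name is just for illustration:
public class NestedCount {
    public static void main(String[] args) {
        for (int n : new int[]{100, 1000, 10000}) {
            long sum = 0;
            for (int i = 0; i < n; i++)
                for (long j = 1; j <= n; j *= 3)
                    sum++;
            // The inner loop runs floor(log3(n)) + 1 times, so sum is about n * log3(n).
            double predicted = n * (Math.log(n) / Math.log(3));
            System.out.println("n=" + n + "  sum=" + sum + "  n*log3(n) ~ " + Math.round(predicted));
        }
    }
}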
public static void Comp(int n)
{
int count=0;
for(int i=0;i<n;i++)
{
for(int j=0;j<n;j++)
{
for(int k=1;k<n;k*=2)
{
count++;
}
}
}
System.out.println(count);
}
Does anyone know what the time complexity is?
And what is the big O()?
Please can you explain this to me, step by step?
Whoever gave you this problem is almost certainly looking for the answer n^2 log(n), for reasons explained by others.
However the question doesn't really make any sense. If n > 2^30, k will overflow, making the inner loop infinite.
Even if we treat this problem as being completely theoretical, and assume n, k and count aren't Java ints, but some theoretical integer type, the answer n^2 log n assumes that the operations ++ and *= have constant time complexity, no matter how many bits are needed to represent the integers. This assumption isn't really valid.
Update
It has been pointed out to me in the comments below that, based on the way the hardware works, it is reasonable to assume that ++, *=2 and < all have constant time complexity, no matter how many bits are required. This invalidates the third paragraph of my answer.
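To see that overflow concretely, here is a small sketch of what happens to k once it passes 2^30 (the class name is just for illustration):
public class OverflowDemo {
    public static void main(String[] args) {
        int k = 1 << 30;           // 1073741824 = 2^30
        System.out.println(k);     // 1073741824
        k *= 2;
        System.out.println(k);     // -2147483648 (integer overflow)
        k *= 2;
        System.out.println(k);     // 0, and 0*2 stays 0, so k < n never becomes false
    }
}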
In theory this is O(n^2 * log(n)).
Each of the two outer loops is O(n) and the inner one is O(log(n)), because log base 2 of n is the number of times you have to divide n by 2 to get to 1.
Also, this is a tight bound, i.e. the code is also Θ(n^2 * log(n)).
The time complexity is O(n^2 log n). Why? Each for loop is a function of n, and each contributes a factor of n, except the inner loop, which grows as log n. Why? Because on each of its iterations k is multiplied by 2. Think of merge sort or binary search trees.
Details:
For the first two loops: a summation of 1 for i (and j) from 0 to n-1, which is n, so the first two loops together give n*n = n^2 = O(n^2).
For the k loop, k grows as 1, 2, 4, 8, 16, 32, ..., so after x iterations k = 2^x, and the loop stops when 2^x reaches n. Take the log of both sides and you get x = log n.
Still not clear? The values of k form a geometric series with ratio a = 2. Using the sum formula Σ (from j = m to n-1) of a^j = (a^m − a^n)/(1 − a), setting m = 0 and a = 2 gives (1 − 2^n)/(1 − 2) = 2^n − 1. Why is a = 2? Because that is the value of a for which the series yields 2, 4, 8, 16, ..., 2^k.
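For a concrete check, you can run the same loops for a few values of n and compare the count with n^2*log2(n). A minimal sketch, assuming n stays small enough that k never overflows (the class and method names are just for illustration):
public class CompCount {
    // Same loops as Comp(n), but returning the count instead of printing it.
    static long comp(int n) {
        long count = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                for (long k = 1; k < n; k *= 2)
                    count++;
        return count;
    }

    public static void main(String[] args) {
        for (int n : new int[]{64, 256, 1024}) {
            // The inner loop runs about log2(n) times, so count is roughly n^2 * log2(n).
            long predicted = Math.round((double) n * n * (Math.log(n) / Math.log(2)));
            System.out.println("n=" + n + "  count=" + comp(n) + "  n^2*log2(n) ~ " + predicted);
        }
    }
}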