Why does Insertion Sort not need the swap operation? - java

So I am currently learning Java and I was asking myself: why doesn't the insertion sort method need to use the swap operation? As far as I understood, elements get swapped, so wouldn't it be useful to use the swap operation in this sorting algorithm?
As I said, I am new to this, but I try to understand the background of these algorithms and why they are the way they actually are.
Would be happy for some insights :)
B.

Wikipedia's article for Insertion sort states
Each iteration, insertion sort removes one element from the input
data, finds the location it belongs within the sorted list, and
inserts it there. It repeats until no input elements remain. [...] If
smaller, it finds the correct position within the sorted list, shifts
all the larger values up to make a space, and inserts into that
correct position.
You can consider this shift as an extreme swap. What actually happens is that the value is stored in a placeholder and compared against the other values. If those values are larger, they are simply shifted, i.e., moved into the next position in the list/array. The placeholder's value is then put into the position from which the last element was shifted.

Insertion Sort does not perform swapping. It performs insertions by shifting elements in a sequential list to make room for the element that is being inserted.
That is why it is an O(N^2) algorithm: for each element out of N, there can be O(N) shifts.
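A minimal sketch of that shifting version in Java (this is the textbook pattern, not code from the question):

public class InsertionSortShift {
    static void insertionSort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int key = a[i];              // the element being inserted into the sorted prefix
            int j = i - 1;
            // Shift every larger element one slot to the right: 1 assignment per element.
            while (j >= 0 && a[j] > key) {
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = key;              // drop the saved element into the gap
        }
    }

    public static void main(String[] args) {
        int[] a = {2, 3, 1};
        insertionSort(a);
        System.out.println(java.util.Arrays.toString(a));  // [1, 2, 3]
    }
}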

So, you could do insertion sort by swapping.
But is that the best way to do it? You should think of what a swap is:
temp = a;
a = b;
b = temp;
There are 3 assignments that take place for a single swap.
E.g. [2, 3, 1]
If the above list is to be sorted, you could (1) swap 3 and 1, then (2) swap 1 and 2,
for a total of 6 assignments.
Now, instead of swapping, if you just shift 2 and 3 one place to the right (1 assignment each) and then put 1 in array[0], you end up with just 3 assignments instead of the 6 you would do with swapping.
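For comparison, here is a hedged sketch of the swap-based variant; it sorts correctly, but its inner loop pays 3 assignments per step where the shifting version above pays only 1:

public class InsertionSortSwap {
    static void insertionSortWithSwaps(int[] a) {
        for (int i = 1; i < a.length; i++) {
            // Bubble the new element leftwards with swaps until it is in place.
            for (int j = i; j > 0 && a[j - 1] > a[j]; j--) {
                int temp = a[j - 1];   // 3 assignments per swap
                a[j - 1] = a[j];
                a[j] = temp;
            }
        }
    }

    public static void main(String[] args) {
        int[] a = {2, 3, 1};
        insertionSortWithSwaps(a);
        System.out.println(java.util.Arrays.toString(a));  // [1, 2, 3]
    }
}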

Related

Insertion Time complexity on Java arrays

I have an array of 10 elements: Integer[] arr = new Integer[10];
What is the time complexity when I add the 5th element, like arr[5]=999;? I strongly believe that the time complexity of insertion into an array is O(1), but some literature says that for insertion into arrays (not an ArrayList) the time complexity is O(N) because shifting occurs. How can there be any shifting? It is not a dynamic array.
Here are the sources which I believe are wrong:
Big O Notation Arrays vs. Linked List insertions
O(1) accurately describes inserting at the end of the array. However,
if you're inserting into the middle of an array, you have to shift all
the elements after that element, so the complexity for insertion in
that case is O(n) for arrays. End appending also discounts the case
where you'd have to resize an array if it's full.
https://iq.opengenus.org/time-complexity-of-array/
Inserting and deleting elements take linear time depending on the
implementation. If we want to insert an element at a specific index,
we need to skip all elements from that index to the right by one
position. This takes linear time O(N).
https://www.log2base2.com/data-structures/array/insert-element-particular-index-array.html
If we want to insert an element to index 0, then we need to shift all
the elements to right.
For example, if we have 5 elements in the array and need to insert an
element in arr[0], we need to shift all those 5 elements one position
to the right.
There's a misunderstanding:
arr[5]=999; is a direct assignment, not an 'insertion' where shifting would occur, so yes: it's O(1)
Insertion, on the other hand, consists of the following steps:
if the array is full, create a new array
and copy the data below the index (<) to positions starting at 0
copy the data at and above the index (>=) to index + 1
assign the value arr[index]=newValue;
So insertion, because it has to copy/move n - index elements on each insert, is considered O(n), because big-O notation is designed to showcase the worst case.
Now, with MMX registers, parallel pipelines on GPUs, and the like, we can speed this up by another big (constant) factor, but the problem itself remains in the O(n) category.
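A rough sketch of those steps in Java (the method name and growth factor are illustrative, not from the question):

import java.util.Arrays;

public class ArrayInsert {
    // Inserts newValue at 'index', shifting everything from 'index' onwards to the right.
    // 'count' is how many slots of 'arr' are actually in use.
    static int[] insertAt(int[] arr, int count, int index, int newValue) {
        int[] target = arr;
        if (count == arr.length) {
            target = Arrays.copyOf(arr, arr.length * 2);   // array is full: allocate and copy
        }
        // Shift the data at and above 'index' one slot to the right: O(n - index).
        System.arraycopy(arr, index, target, index + 1, count - index);
        target[index] = newValue;                          // the single O(1) assignment
        return target;
    }

    public static void main(String[] args) {
        int[] arr = {1, 2, 3, 4, 6, 7};
        arr = insertAt(arr, 6, 4, 5);
        System.out.println(Arrays.toString(arr));          // [1, 2, 3, 4, 5, 6, 7, 0, 0, 0, 0, 0]
    }
}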
Not to argue with the other answers, but I think the OP is asking about a direct assignment on an array: arr[5]=999;. This operation will assign the value 999 to index 5 of the array (if the array has length 6 or more). It will not do any shifting or allocation, and for this operation the complexity is definitely O(1).
There are two use cases for inserting data into an array:
In-place replacement
Shifting the data
In the first case (in-place replacement), we generally have code like arr[5] = 99. For this use case, we are not going to shift the data at index 5 to index 6 and so on. Hence, the time complexity of such an operation is O(1).
In the second case (shifting the data), to insert data at index 5 we first shift the data at index 5 to index 6, the data at index 6 to index 7, and so on. In this case, the time complexity of the operation is O(N).
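The two cases side by side (a small illustrative snippet, not code from the question; note the shifting version drops the last element because the array cannot grow):

public class AssignVsInsert {
    public static void main(String[] args) {
        int[] arr = new int[10];

        // Case 1, in-place replacement: one assignment, O(1).
        arr[5] = 99;

        // Case 2, shifting insertion at index 5: every element from index 5 onwards
        // moves one slot to the right first (the last one falls off), O(N).
        System.arraycopy(arr, 5, arr, 6, arr.length - 6);
        arr[5] = 99;
    }
}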

Best algorithm to group and add in a list of following pattern

Let us consider that we have objects in a list:
listOfObjects = [a,b,ob,ob,c,ob,c,ob,c,ob,ob,c,ob]
We have to group them as:
[ob,ob,c,ob,c,ob] from index 2 to 7
[ob,ob,c,ob] from index 9 to 12
i.e., a group starts where we have two 'ob's together (as at index 2), and ends before a 'c' that is followed by two 'ob's (as at index 8, where the 'c' is followed by two 'ob's), or when the list ends.
So what would be the best algorithm to get the above (in Java)?
I assume "best algorithm" according to you is that which is optimal in terms of time complexity.
You can do this task by simple one traversal with keeping track of next 3 elements (of course taking care that you don't go out of list size) and ending the group by checking the strategy you said. If there are no 3 elements next the current element, you simply end your group (as you specified in your strategy)
So the time complexity of this algorithm will be O(n). It will not be possible to get better than this.
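A minimal single-pass sketch along those lines (assuming the objects can be compared as the strings "ob" and "c"; this follows the rule as I read it from the example above):

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class Grouper {
    // Returns each group as an index range {start, end}, inclusive.
    static List<int[]> findGroups(List<String> items) {
        List<int[]> groups = new ArrayList<>();
        int start = -1;                                   // -1 means "not inside a group"
        for (int i = 0; i < items.size(); i++) {
            if (start == -1) {
                // A group starts where two "ob" appear next to each other.
                if (i + 1 < items.size()
                        && items.get(i).equals("ob") && items.get(i + 1).equals("ob")) {
                    start = i;
                }
            } else if (items.get(i).equals("c")
                    && i + 2 < items.size()
                    && items.get(i + 1).equals("ob") && items.get(i + 2).equals("ob")) {
                // A group ends just before a "c" that is followed by two "ob".
                groups.add(new int[]{start, i - 1});
                start = -1;
            }
        }
        if (start != -1) {                                // the list ended while inside a group
            groups.add(new int[]{start, items.size() - 1});
        }
        return groups;
    }

    public static void main(String[] args) {
        List<String> list = Arrays.asList(
                "a", "b", "ob", "ob", "c", "ob", "c", "ob", "c", "ob", "ob", "c", "ob");
        for (int[] g : findGroups(list)) {
            System.out.println(g[0] + " to " + g[1]);     // prints "2 to 7" and "9 to 12"
        }
    }
}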
I think a Stack is a suitable data structure.
It will work once you put 'ob' on the stack.
You also need a 'count' variable.

Merging and sorting 2 sorted arrays with Big O, looking for clarification

This is schoolwork. I am not looking for code help, but since my teacher isn't helping, I came here.
I am asked to merge and sort two sorted arrays following two cases:
When sizes of the two arrays are equal
When sizes of the two arrays are different
Now I have done case 2, which also handles case 1 :/ I just don't get how I could write code for case 1 or how it could differ from case 2. The array length doesn't seem connected to the problem, or I am not understanding correctly.
Then I am asked to compute the Big O.
I am not looking for code here. If anyone by any chance understands what my teacher is really asking, please give me hints to solve it.
It is very good to learn instead of copying.
As you suggest, there is no difference between case 1 and case 2, but the worst case of the algorithm depends on your solution. So I will describe my solution (no code) and give you its worst case.
In both cases, the arrays must end with infinity, so add an infinity sentinel to each of them. Then iterate over the elements of both arrays; at each step, pick the one which is smaller and put it in your result array (the merge of the two arrays).
With this solution, you can calculate the worst case easily. We iterate over both arrays once, and we add an infinity to each of them; if their lengths are n and m, then our worst and best case is O(m + n) (you do m + n + 2 - 1 comparisons, and -1 because you never compare the ends of both arrays, I mean the infinities).
But why add infinity at the end of each array, when for that we must make a copy of the array with one more slot? It is one way to do it, and its worst case is O(m + n) for copying the arrays too. But there is another solution: you can compare until you reach the end of one array, and then append the rest of the other array, which has not been fully compared, to the end of your result array. With infinity, however, that happens automatically.
I hope this helped you. If there is something wrong, please comment.
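A minimal sketch of that sentinel idea, using Integer.MAX_VALUE as the "infinity" (which assumes it never appears in the input):

import java.util.Arrays;

public class SentinelMerge {
    static int[] merge(int[] a, int[] b) {
        int[] left = Arrays.copyOf(a, a.length + 1);
        int[] right = Arrays.copyOf(b, b.length + 1);
        left[a.length] = Integer.MAX_VALUE;    // sentinel at the end of each copy
        right[b.length] = Integer.MAX_VALUE;

        int[] result = new int[a.length + b.length];
        int i = 0, j = 0;
        for (int k = 0; k < result.length; k++) {
            // The sentinels guarantee neither index runs past its array,
            // so no explicit "is this array exhausted?" check is needed.
            if (left[i] <= right[j]) {
                result[k] = left[i++];
            } else {
                result[k] = right[j++];
            }
        }
        return result;
    }

    public static void main(String[] args) {
        int[] merged = merge(new int[]{1, 3, 5}, new int[]{2, 4, 6, 8});
        System.out.println(Arrays.toString(merged));   // [1, 2, 3, 4, 5, 6, 8]
    }
}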
Merging two sorted arrays is a linear complexity operation. In terms of Big-O notation it is O(m+n), where m and n are the lengths of the two sorted arrays.
So when you say the array length doesn't connect with the problem, your understanding is correct. Irrespective of the lengths of the two sorted arrays, merging them involves taking an element from each sorted array, comparing them, copying the appropriate one to the new array (depending on whether you want the merged array in ascending or descending order), and incrementing the counter of the array from which you copied the element.
Another way to approach this question is to look at each array as having a head and a tail, and solving the problem recursively. This way, we can use a base case, two arrays of size 1, to sort through the entirety of the two arrays m and n. Since both arrays are already sorted, simply compare the two heads of each array and add the element that comes first to your newly-created merged array, and move to the next element in that array. Your function will call itself again after adding the element. This will keep happening until one of the two arrays is empty. Now, you can simply add what is left of the nonempty array to the end of your merged array, and you are done.
I'm not sure if your professor will allow you to use recursive calls, but this method could make the coding much easier. Runtime would still be O(m+n), as you are basically iterating through both arrays once.
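In case it helps to see the shape of it, here is a hedged sketch of that recursive head/tail merge (using index positions instead of physically splitting the arrays):

import java.util.ArrayList;
import java.util.List;

public class RecursiveMerge {
    static void merge(int[] a, int i, int[] b, int j, List<Integer> out) {
        if (i == a.length) {                                    // base case: a is exhausted,
            for (int k = j; k < b.length; k++) out.add(b[k]);   // append what is left of b
            return;
        }
        if (j == b.length) {                                    // base case: b is exhausted
            for (int k = i; k < a.length; k++) out.add(a[k]);
            return;
        }
        if (a[i] <= b[j]) {                                     // take whichever head is smaller...
            out.add(a[i]);
            merge(a, i + 1, b, j, out);                         // ...and recurse on the remainder
        } else {
            out.add(b[j]);
            merge(a, i, b, j + 1, out);
        }
    }

    public static void main(String[] args) {
        List<Integer> out = new ArrayList<>();
        merge(new int[]{1, 4, 7}, 0, new int[]{2, 3, 9}, 0, out);
        System.out.println(out);                                // [1, 2, 3, 4, 7, 9]
    }
}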
Hope this helps.

Best way to go about removing middle element from list or arraylist in java?

I need to sequentially remove the middle element from sorted data. What would be the best way to do that? Since the LinkedList operation takes n/2 steps to reach and remove the element, an ArrayList would be faster. But an ArrayList, on the other hand, then takes time to shift all the elements to the left, which isn't efficient either. Would other data structures be of use, maybe?
Removing the middle element is composed of two parts:
Finding the middle element
Deleting that element
An ArrayList is O(1) at random access, so the 1st step is fast for an array, while a LinkedList is O(1) at deletion (given the node), so the 2nd step is easy for a list.
What you want is best of both worlds.
IMO this is easily achievable if you write a custom LinkedList (or extend an existing one). You'll need an extra middle reference variable which will:
move to next upon insertion if the size becomes odd.
move to prev upon deletion if the size becomes odd.
You can also use even in both cases, but they must be the same (either both even or both odd).
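A rough sketch of such a structure, assuming the convention that "middle" means index size/2 (0-based); the class and its names are illustrative, not an existing library type:

import java.util.NoSuchElementException;

// A doubly linked list that keeps a reference to its middle node,
// so removing the middle costs O(1) instead of O(n/2).
class MiddleTrackingList<T> {
    private static final class Node<T> {
        T value;
        Node<T> prev, next;
        Node(T value) { this.value = value; }
    }

    private Node<T> head, tail, mid;
    private int size;

    // Append to the end: O(1).
    public void addLast(T value) {
        Node<T> node = new Node<>(value);
        if (tail == null) {
            head = tail = mid = node;
        } else {
            tail.next = node;
            node.prev = tail;
            tail = node;
        }
        size++;
        if (size % 2 == 0) {
            mid = mid.next;        // middle index size/2 moves forward when the size becomes even
        }
    }

    // Remove and return the middle element: O(1).
    public T removeMiddle() {
        if (mid == null) throw new NoSuchElementException("list is empty");
        Node<T> removed = mid;
        mid = (size % 2 == 0) ? removed.prev : removed.next;  // re-aim before unlinking
        if (removed.prev != null) removed.prev.next = removed.next; else head = removed.next;
        if (removed.next != null) removed.next.prev = removed.prev; else tail = removed.prev;
        size--;
        return removed.value;
    }

    public int size() { return size; }

    public static void main(String[] args) {
        MiddleTrackingList<Integer> list = new MiddleTrackingList<>();
        for (int x : new int[]{1, 2, 3, 4, 5}) list.addLast(x);
        System.out.println(list.removeMiddle());   // 3
        System.out.println(list.removeMiddle());   // 4 (middle of [1, 2, 4, 5] under this convention)
    }
}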
Might a TreeSet fit the bill? It offers removal in O(log n) by key.

Fastest way to add a value in the middle of a sorted array - Java

I have a sorted array, let's say D = {1,2,3,4,5,6}, and I want to add the number 5 in the middle. I can do that by inserting the value 5 in the middle and moving the other values one step to the right.
The problem is that I have an array of length 1000 and I need to do that operation 10,000 times, so I need a faster way.
What options do I have? Can I use LinkedLists for better performance?
That depends on how you add said numbers. If only in ascending or descending order - then yes, LinkedList will do the trick, but only if you keep the node reference in between inserts.
If you're adding numbers in arbitrary order, you may want to deconstruct your array, add the new entries and reconstruct it again. This way you can use a data structure that's good at adding and removing entries while maintaining "sortedness". You have to relax one of your assumptions however.
Option 1
Assuming you don't need constant time random access while adding numbers:
Use a binary sorted tree.
The downside: while you're adding, you cannot read or reference an element by its position, not easily at least. Best case scenario: you're using a tree that keeps track of how many elements its left subtree has and can get the ith element in log(n) time. You can still get pretty good performance if you're just iterating through the elements, though.
Total runtime is down to n * log(n) from n^2. Random access is log(n).
Option 2
Assuming you don't need the elements sorted while you're adding them.
Use a normal array, but add elements to the end of it, then sort it all when you're done.
Total runtime: n * log(n). Random access is O(1), however elements are not sorted.
Option 3
(This is kinda cheating, but...)
If you have a limited number of values, then employing the idea of BucketSort will help you achieve great performance. Essentially - you would replace your array with a sorted map.
Runtime is O(n), random access is O(1), but it's only applicable to a very small number of situations.
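One concrete reading of Option 3, assuming the values are small non-negative integers (the bound of 1000 used here is purely illustrative); a sorted map of counts is the fallback when they are not:

public class BucketCounts {
    public static void main(String[] args) {
        // The value itself indexes a bucket that counts how many times it occurs.
        int[] counts = new int[1001];
        for (int x : new int[]{1, 2, 3, 4, 6}) counts[x]++;      // O(1) per insert
        counts[5]++;                                             // "insert 5 in the middle" is just a counter bump

        // Walking the buckets in order reproduces the sorted sequence, duplicates included.
        for (int value = 0; value < counts.length; value++) {
            for (int i = 0; i < counts[value]; i++) {
                System.out.print(value + " ");                   // 1 2 3 4 5 6
            }
        }
        System.out.println();
    }
}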
TL;DR
Getting arbitrary values, quick adding, and constant-time positional access, all while maintaining sortedness, is difficult. I don't know of any such structure. You have to relax some assumption to leave room for optimizations.
A LinkedList will probably not help you very much, if at all. Basically you are exchanging the cost of shifting every value on insert with the cost of having to traverse each node in order to reach the insertion point.
This traversal cost will also need to be paid whenever accessing each node. A LinkedList shines as a queue, but if you need to access the internal nodes individually it's not a great choice.
In your case, you want a sorted tree of some sort. A BST (binary search tree, also referred to as a sorted binary tree) is one of the simplest types and is probably a good place to start.
A good option is a TreeSet, which is likely functionally equivalent to how you were using an array, if you simply need to keep track of a set of sorted numbers.
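For example (a small illustrative snippet; note that a TreeSet silently ignores duplicates, so if duplicate values must be kept, a count map like the bucket idea above fits better):

import java.util.TreeSet;

public class SortedNumbers {
    public static void main(String[] args) {
        TreeSet<Integer> d = new TreeSet<>();
        for (int x : new int[]{1, 2, 3, 4, 6}) d.add(x);   // O(log n) per insert
        d.add(5);                                          // lands "in the middle" automatically
        System.out.println(d);                             // [1, 2, 3, 4, 5, 6]
    }
}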
