Exception in thread "main" java.lang.StackOverflowError
at Search.mergeSort(Search.java:41)
at Search.mergeSort(Search.java:43)
at Search.mergeSort(Search.java:43)
at Search.mergeSort(Search.java:43)
at Search.mergeSort(Search.java:43)
I keep getting this error when I try to run my program. My program is supposed to take string input from a file and sort it using this algorithm. Any ideas? Problematic lines from code:
public static void mergeSort(String[] word, int p, int r){
int q;
if(p<r){
q=p+r/2;
mergeSort(word,p,q);
mergeSort(word, q+1,r);
merge(word, p, q, r);
}
}
EDIT
These two functions sort the String array by dividing the array in half, sorting each half separately, and merging them together. The int q is the halfway point, and the sub-arrays being evaluated are word[p] to word[q] and word[q+1] to word[r]. Here's the merge function:
public static void merge(String[] word, int p, int q, int r){
int n1 = q-p+1;
int n2 = r-q;
String[] L = new String[n1];
String[] R = new String[n2];
int i, j, k;
for(i=0; i<n1; i++) L[i] = word[p+i];
for(j=0; j<n2; j++) R[j] = word[q+j+1];
i=0; j=0;
for(k=p; k<=r; k++){
if(i<n1 && j<n2){
if(L[i].compareTo(R[j])<0){
word[k] = L[i];
i++;
}else{
word[k] = R[j];
j++;
}
}else if(i<n1){
word[k] = L[i];
i++;
}else if(j<n2){
word[k] = R[j];
j++;
}
}
}
Walk through with a debugger. You'll see exactly how it's leading to an infinite recursion. IDEs (Eclipse, IntelliJ) have them built in.
The problem is that your calculation of q is incorrect. It is supposed to be the halfway point between p and r, and the way to calculate that is:
q = p + (r - p) / 2;
or
q = (p + r) / 2;
But you've written:
q = p + r / 2;
which is equivalent to
q = p + (r / 2);
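Plugging either corrected calculation back into the method gives, for example (a sketch using the overflow-safe form):
public static void mergeSort(String[] word, int p, int r) {
    if (p < r) {
        int q = p + (r - p) / 2;    // midpoint of p..r, written to avoid int overflow
        mergeSort(word, p, q);      // sort left half  word[p..q]
        mergeSort(word, q + 1, r);  // sort right half word[q+1..r]
        merge(word, p, q, r);       // merge the two sorted halves
    }
}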
When writing recursive methods, the recursion must be guaranteed to reach a base case, otherwise you'll get infinite recursion like you are right now.
Your mergeSort does have a base-case check (it does nothing when p >= r), but because q is computed incorrectly, the sub-range passed to mergeSort(word, p, q) does not always shrink (q can even end up beyond r), so the recursion never bottoms out.
Related
I'm solving the QuickSort assignment in the Algorithms class by Stanford, using the median rule to select the pivot element. The input is the numbers from 1 to 10000 and the output is the number of comparisons.
My functions are as follows:
public static int noOfComp = 0;
public static void quick_sort(int[] a, int p, int r){
if(p<r) {
noOfComp+= r-p;
int mid = partition(a, p, r);
quick_sort(a, p, mid-1);
quick_sort(a, mid+1, r);
}
}
public static int median(int a[],int p, int r){
int firstPos = p;
int len = r-p+1;
int lastPos = r;
int midPos = len%2==0 ? p + (len)/2-1: p + (len)/2 ;
int first = a[firstPos];
int middle = a[midPos];
int last = a[lastPos];
if (first <= middle) {
if (middle <= last) {
// first - middle - last
return midPos;
} else if (first <= last) {
// first - last - middle
return lastPos;
}
// last - first - middle
return firstPos;
}
if (first <= last) {
// middle - first - last
return firstPos;
} else if (middle <= last) {
// middle - last - first
return lastPos;
}
// last - middle - first
return midPos;
}
public static int partition(int[] a, int p, int r){
int chosen = median(a,p,r);
swap(a, p, chosen);
int pivot = a[p];
int i = p;
for (int j = p+1; j < a.length; j++) {
if (a[j] < pivot) {
i++;
swap(a, i, j);
}
}
swap(a, i,p);
return i;
}
//main
public static void main(String[] args) throws Throwable{
int i=0;
Scanner in = new Scanner(new File("C:\\Users\\Uzumaki Naruto\\Documents\\QuickSort.txt"));
while(in.hasNext()){
i++;
in.next();
}
int[] a = new int[i];
i=0;
Scanner in2 = new Scanner(new File("C:\\Users\\Uzumaki Naruto\\Documents\\QuickSort.txt"));
while(in2.hasNext()){
a[i++] = in2.nextInt();
}
quick_sort(a, 0, a.length-1);
System.out.println("Number of comparisons : " + noOfComp);
}
The answer to the question seems to be around 128k, but my algorithm outputs 132k. I've read the code a number of times but am unable to find the error.
Indeed, I also get an average count of around 132k with your code, executed on randomly shuffled arrays of unique numbers. I did not find any mistake in the algorithm except for the following one, but it does not influence your count result (you would get the same count with the corrected code):
The loop in partition has a bad exit condition:
for (int j = p+1; j < a.length; j++) {
It should be:
for (int j = p+1; j <= r; j++) {
The following is not an error, but you can rewrite
int len = r-p+1;
int midPos = len%2==0 ? p + (len)/2-1: p + (len)/2 ;
as:
int midPos = p + (r-p)/2;
But: You did not count the comparisons made in the function median, and this should normally be done, otherwise an algorithm cannot be fairly compared with another (variant). So that results in 2 or 3 more comparisons per call of partition. This increases the average count to around 148k!
Here it says that:
the expected number of comparisons needed to sort n elements with random pivot selection is 1.386 n.log(n). Median-of-three pivoting brings this down to ≈ 1.188 n.log(n).
The thing is that for n = 10 000, taking log base 2, 1.188 n·log(n) ≈ 1.188 · 10 000 · 13.29 ≈ 158k, so your algorithm seems to do fewer comparisons than this estimate, at least for this particular case of n.
I do see a way to reduce that number again.
Reducing the number of comparisons
The main idea is to profit from the comparisons you make in the function median by already putting the lowest and highest of the three inspected values in the right partition, so they do not need to be treated further by the loop in the function partition.
To give an example, if you have an array like this:
5, 1, 2, 9, 3
Then median will compare 5, 2 and 3 and choose 3 as pivot value. The function could now be extended to also put the three investigated elements in the right order, without extra comparisons, to get this:
2, 1, 3*, 9, 5
And then the pivot element would not have to be swapped to the start of the array, but to the second slot, because we already have decided that the leftmost element belongs to the lower partition:
2, 3*, 1, 9, 5
And now the main partition loop can concentrate on this sub-array, because the last element is also known to belong to the upper partition:
2, 3*, [1, 9], 5
At the end of the loop the final swap will be with the second element instead of the first:
2, 1, 3*, 9, 5
This will reduce the number of comparisons in the main loop by 2.
In this variant, the median function will always return the index of the second slot, after making a few swaps in the array:
public static int median(int a[],int p, int r){
int m = p + (r-p)/2;
// actually sort the three elements:
noOfComp++;
if (a[r] < a[m]) {
swap(a, r, m);
}
if (p < m) { // more than 2 elements
noOfComp++;
if (a[m] < a[p]) {
swap(a, m, p);
noOfComp++;
if (a[r] < a[m]) {
swap(a, r, m);
}
}
// put the middle element (pivot) in second slot
swap(a, m, p+1);
}
return p+1;
}
And partition will look like this:
public static int partition(int[] a, int p, int r){
int k = median(a, p, r); // always returns p+1 as pivot's index
int i = k; // (k..i] is lower partition
for (int j = p+2; j < r; j++) { // positions p and r can be excluded
if (a[j] < a[k]) {
i++;
swap(a, i, j);
}
}
swap(a, i, k); // place pivot between partitions
return i;
}
In quick_sort the count of comparisons will be two less:
noOfComp += r-p-2;
With the above adjustments the number of comparisons goes down from 148k to 135k on average.
So I am afraid that although the actual number of comparisons has been reduced this way, it still does not match the 128k.
Other ideas
I tried using insertion sort when the array became small, but it did not yield much of an improvement. Another idea is to improve the search for the median by looking at more elements, but only if the array is not too small, as the cost of looking for one must be small compared to the partitioning effort.
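For reference, the small-array cutoff usually looks something like the sketch below; the CUTOFF value of 10 and the insertionSort helper are assumptions of mine, and for a fair total the insertion-sort comparisons would have to be counted as well.
private static final int CUTOFF = 10;  // assumed threshold, worth tuning

public static void quick_sort(int[] a, int p, int r) {
    if (r - p + 1 <= CUTOFF) {
        insertionSort(a, p, r);     // its comparisons should also be counted for fairness
        return;
    }
    noOfComp += r - p;              // or r-p-2 with the modified median/partition above
    int mid = partition(a, p, r);
    quick_sort(a, p, mid - 1);
    quick_sort(a, mid + 1, r);
}

private static void insertionSort(int[] a, int p, int r) {
    for (int i = p + 1; i <= r; i++) {
        int v = a[i];
        int j = i - 1;
        while (j >= p && a[j] > v) {   // shift larger elements one slot to the right
            a[j + 1] = a[j];
            j--;
        }
        a[j + 1] = v;
    }
}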
But the assignment may not allow for all this tweaking.
During a 45 minute technical interview with Google, I was asked a Leaper Graph problem.
I wrote working code, but later was declined the job offer because I lacked Data structure knowledge. I'm wondering what I could have done better.
The problem was as follows:
"Given an N-sized board, and told that a piece can jump i positions horizontally (left or right) and j positions vertically (up or down) (i.e., sort of like a knight in chess), can the leaper reach every spot on the board?"
I wrote the following algorithm. It recursively marks every spot on the board that can be reached; if any spot was not visited, at least one field remains false and the function returns false.
static boolean reachable(int i, int j, int n) {
boolean grid[][] = new boolean[n][n];
reachableHelper(0, 0, grid, i, j, n - 1);
for (int x = 0; x < n; x++) {
for (int y = 0; y < n; y++) {
if (!grid[x][y]) {
return false;
}
}
}
return true;
}
static void reachableHelper(int x, int y, boolean[][] grid, int i, int j, int max) {
if (x > max || y > max || x < 0 || y < 0 || grid[x][y]) {
return;
}
grid[x][y] = true;
int i2 = i;
int j2 = j;
for (int a = 0; a < 2; a++) {
for (int b = 0; b < 2; b++) {
reachableHelper(x + i2, y + j2, grid, i, j, max);
reachableHelper(x + j2, y + i2, grid, i, j, max);
i2 = -i2;
}
j2 = -j2;
}
}
Now, later it was pointed out that the optimal solution would be to implement Donald Knuth's co-prime implementation:
http://arxiv.org/pdf/math/9411240v1.pdf
Is this something that one should be able to figure out in a 45-minute technical interview?
Besides the above, is there anything I could have done better?
edit:
- I enquired about starting position. I was told starting at 0,0 is fine.
edit2
Based on feedback, I wrote a while-loop with queue approach.
The recursive approach runs into a stack-overflow when n = 85.
However, the while-loop-with-queue method below works up to ~n = 30,000 (after that it runs into heap issues, with memory use exceeding gigabytes). If you know how to optimize further, please let me know.
static boolean isReachableLoop(int i, int j, int n) {
boolean [][] grid = new boolean [n][n];
LinkedList<Point> queue = new LinkedList<Point>();
queue.add(new Point(0,0)); // starting position.
int nodesVisited = 0;
while (queue.size() != 0) {
Point pos = queue.removeFirst();
if (pos.x >= 0 && pos.y >= 0 && pos.x < n && pos.y < n) {
if (!grid[pos.x][pos.y]) {
grid[pos.x][pos.y] = true;
nodesVisited++;
int i2 = i;
int j2 = j;
for (int a = 0; a < 2; a++) {
for (int b = 0; b < 2; b++) {
queue.add(new Point(pos.x+i2, pos.y+j2));
queue.add(new Point(pos.x+j2, pos.y+i2));
i2 = -i2;
}
j2 = -j2;
}
}
}
}
if (nodesVisited == (n * n)) {
return true;
} else {
return false;
}
}
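One further idea along those lines (a sketch, not benchmarked at large n): mark a cell as visited when it is enqueued rather than when it is dequeued, so each cell enters the queue at most once and the queue cannot balloon with duplicates; using a small int[] per cell instead of Point also avoids the java.awt dependency.
import java.util.ArrayDeque;

static boolean isReachableLoop(int i, int j, int n) {
    boolean[][] visited = new boolean[n][n];
    ArrayDeque<int[]> queue = new ArrayDeque<>();
    visited[0][0] = true;                  // mark on enqueue, not on dequeue
    queue.add(new int[] {0, 0});
    int seen = 0;
    int[][] moves = {{i, j}, {i, -j}, {-i, j}, {-i, -j},
                     {j, i}, {j, -i}, {-j, i}, {-j, -i}};
    while (!queue.isEmpty()) {
        int[] pos = queue.poll();
        seen++;
        for (int[] m : moves) {
            int x = pos[0] + m[0], y = pos[1] + m[1];
            if (x >= 0 && y >= 0 && x < n && y < n && !visited[x][y]) {
                visited[x][y] = true;      // each cell is enqueued at most once
                queue.add(new int[] {x, y});
            }
        }
    }
    return seen == n * n;
}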
I ask a lot of interview questions like this. I don't think you would be expected to figure out the coprime method during the interview, but I would have docked you for using O(n^2) stack space -- especially since you passed all those parameters to each recursive call instead of using an object.
I would have asked you about that, and expected you to come up with a BFS or DFS using a stack or queue on the heap. If you failed on that, I might have a complaint like "lacked data structure knowledge".
I would also have asked questions to make sure you knew what you were doing when you allocated that 2D array.
If you were really good, I would ask you if you can use the symmetry of the problem to reduce your search space. You really only have to search a j*j-sized grid (assuming j >= i).
It's important to remember that the interviewer isn't just looking at your answer. He's looking at the way you solve problems and what tools you have in your brain that you can bring to bear on a solution.
Edit: thinking about this some more, there are lots of incremental steps on the way to the coprime method that you might also come up with. Nobody will expect that, but it would be impressive!
I'm sorry, I feel like I'm missing something.
If you can only go up or down by i and left or right by j, then a square (x,y) is reachable from a starting square (a,b) only if there are integers m and n so that
a + m*i = x
b + n*j = y
That is, reachable would be false for any square board where n > 1 (except in the trivial case i = j = 1).
If you meant more like a knight in chess, and you can go up/down by i and left/right by j OR up/down by j and left/right by i, you can use the same technique. It just becomes 2 equations to solve:
a + m * i + n * j = x
b + o * i + p * j = y
If there are no integers m, n, o and p that satisfy those equations, you can't reach that point.
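To make that concrete: m*i + n*j can take exactly the values that are multiples of gcd(i, j) (Bézout's identity), so a quick necessary test, ignoring board boundaries, could look like the sketch below (the method names are made up for illustration, and it assumes i and j are not both 0):
// Necessary (not sufficient) condition: both offsets must be multiples of gcd(i, j).
static boolean offsetsSolvable(int a, int b, int x, int y, int i, int j) {
    int g = gcd(i, j);
    return (x - a) % g == 0 && (y - b) % g == 0;
}

static int gcd(int p, int q) {
    return q == 0 ? Math.abs(p) : gcd(q, p % q);
}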
While implementing improvements to quicksort partitioning, I tried to use Tukey's ninther to find the pivot (borrowing almost everything from Sedgewick's implementation in QuickX.java).
My code below gives different results each time the array of integers is shuffled.
import java.util.Random;
public class TukeysNintherDemo{
public static int tukeysNinther(Comparable[] a,int lo,int hi){
int N = hi - lo + 1;
int mid = lo + N/2;
int delta = N/8;
int m1 = median3a(a,lo,lo+delta,lo+2*delta);
int m2 = median3a(a,mid-delta,mid,mid+delta);
int m3 = median3a(a,hi-2*delta,hi-delta,hi);
int tn = median3a(a,m1,m2,m3);
return tn;
}
// return the index of the median element among a[i], a[j], and a[k]
private static int median3a(Comparable[] a, int i, int j, int k) {
return (less(a[i], a[j]) ?
(less(a[j], a[k]) ? j : less(a[i], a[k]) ? k : i) :
(less(a[k], a[j]) ? j : less(a[k], a[i]) ? k : i));
}
private static boolean less(Comparable x,Comparable y){
return x.compareTo(y) < 0;
}
public static void shuffle(Object[] a) {
Random random = new Random(System.currentTimeMillis());
int N = a.length;
for (int i = 0; i < N; i++) {
int r = i + random.nextInt(N-i); // between i and N-1
Object temp = a[i];
a[i] = a[r];
a[r] = temp;
}
}
public static void show(Comparable[] a){
int N = a.length;
if(N > 20){
System.out.format("a[0]= %d\n", a[0]);
System.out.format("a[%d]= %d\n",N-1, a[N-1]);
}else{
for(int i=0;i<N;i++){
System.out.print(a[i]+",");
}
}
System.out.println();
}
public static void main(String[] args) {
Integer[] a = new Integer[]{17,15,14,13,19,12,11,16,18};
System.out.print("data= ");
show(a);
int tn = tukeysNinther(a,0,a.length-1);
System.out.println("ninther="+a[tn]);
}
}
Running this a couple of times gives
data= 11,14,12,16,18,19,17,15,13,
ninther=15
data= 14,13,17,16,18,19,11,15,12,
ninther=14
data= 16,17,12,19,18,13,14,11,15,
ninther=16
Will Tukey's ninther give different values for different shufflings of the same dataset? When I tried to find the median of medians by hand, I found that the calculations in the code above are correct, which means that the same dataset yields different results, unlike the true median of the dataset. Is this the proper behaviour? Can someone with more knowledge of statistics comment?
Tukey's ninther examines 9 items and calculates the median using only those.
For different random shuffles, you may very well get a different Tukey's ninther, because different items may be examined. After all, you always examine the same array slots, but a different shuffle may have put different items in those slots. (In your 9-element example, mid = 4 and delta = 1, so the ninther is built from median3a over slots (0,1,2), (3,4,5) and (6,7,8); each shuffle fills those slots with different values.)
The key here is that Tukey's ninther is not the median of the given array. It is an attempted approximation of the median, made with very little effort: we only have to read 9 items and make at most 12 comparisons to get it. This is much faster than finding the actual median, and it has a smaller chance of producing an undesirable pivot than the 'median of three'. Note that the chance still exists.
Does this answer your question?
On a side note, does anybody know if quicksort using Tukey's ninther still requires shuffling? I'm assuming yes, but I'm not certain.
Hey, I seem to be having a problem trying to implement some Java quicksort code over an array of 10,000 random numbers. I have a text file containing the numbers, which are placed into an array; the array is then passed to the sorting algorithm to be sorted. My aim is to time how long the sorting takes, increasing the number of elements sorted each time using the timing loop I have. But for some reason this code gives me a curved graph instead of a straight linear line. I know the timing loop and array code work fine, so there seems to be a problem with the sorting code, but I can't seem to find anything! Any help is greatly appreciated, thanks!
import java.io.*;
import java.util.*;
public class Quicksort {
public static void main(String args[]) throws IOException {
//Import the random integer text file into an integer array
File fil = new File("randomASC.txt");
FileReader inputFil = new FileReader(fil);
int [] myarray = new int [10000];
Scanner in = new Scanner(inputFil);
for(int q = 0; q < myarray.length; q++)
{
myarray[q] = in.nextInt();
}
in.close();
for (int n = 100; n < 10000; n += 100) {
long total = 0;
for (int r = 0; r < 10; ++r) {
long start = System.nanoTime ();
quickSort(myarray,0,n-1);
total += System.nanoTime() - start;
}
System.out.println (n + "," + (double)total / 10.0);
}
}
public static void quickSort(int[] a, int p, int r)
{
if(p<r)
{
int q=partition(a,p,r);
quickSort(a,p,q);
quickSort(a,q+1,r);
}
}
private static int partition(int[] a, int p, int r) {
int x = a[p];
int i = p-1 ;
int j = r+1 ;
while (true) {
i++;
while ( i< r && a[i] < x)
i++;
j--;
while (j>p && a[j] > x)
j--;
if (i < j)
swap(a, i, j);
else
return j;
}
}
private static void swap(int[] a, int i, int j) {
// TODO Auto-generated method stub
int temp = a[i];
a[i] = a[j];
a[j] = temp;
}
}
Only the first iteration of the inner loop actually sorts the array that you've read from the file. All the subsequent iterations are applied to the already-sorted array.
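One way to keep every timed run honest is to sort a fresh copy of the unsorted data each time, for example (a sketch; Arrays is already covered by your import java.util.*):
for (int n = 100; n < 10000; n += 100) {
    long total = 0;
    for (int r = 0; r < 10; ++r) {
        int[] copy = Arrays.copyOf(myarray, n);   // fresh, still-unsorted prefix for this run
        long start = System.nanoTime();
        quickSort(copy, 0, n - 1);
        total += System.nanoTime() - start;
    }
    System.out.println(n + "," + (double) total / 10.0);
}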
But for some reason using this code gives me a curved graph instead of a straight linear line.
If you mean that the run time grows non-linearly in n, that's to be expected since quicksort is not a linear-time algorithm (no comparison sort is).
Your performance graph looks like a nice quadratic function.
You're getting quadratic rather than O(n log(n)) time due to your choice of pivot: since most of the time you're calling your function on a sorted array, your method of choosing the pivot means you're hitting the worst case every single time.
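If you also want to avoid the sorted-input worst case itself, one common remedy (a sketch, reusing your swap helper; Random is covered by import java.util.*) is to move a randomly chosen element into the pivot slot before partitioning:
private static final Random RAND = new Random();

private static int partition(int[] a, int p, int r) {
    swap(a, p, p + RAND.nextInt(r - p + 1));  // random pivot defuses sorted input
    int x = a[p];
    int i = p - 1;
    int j = r + 1;
    while (true) {
        i++;
        while (i < r && a[i] < x) i++;
        j--;
        while (j > p && a[j] > x) j--;
        if (i < j) swap(a, i, j);
        else return j;
    }
}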
As part of a school project, I need to write a function that will take an integer N and return a two-dimensional array of every permutation of the array {0, 1, ..., N-1}. The declaration would look like public static int[][] permutations(int N).
The algorithm described at http://www.usna.edu/Users/math/wdj/book/node156.html is how I've decided to implement this.
I wrestled for quite a while with arrays and arrays of ArrayLists and ArrayLists of ArrayLists, but so far I've been frustrated, especially trying to convert a 2d ArrayList to a 2d array.
So I wrote it in JavaScript. This works:
function allPermutations(N) {
// base case
if (N == 2) return [[0,1], [1,0]];
else {
// start with all permutations of previous degree
var permutations = allPermutations(N-1);
// copy each permutation N times
for (var i = permutations.length*N-1; i >= 0; i--) {
if (i % N == 0) continue;
permutations.splice(Math.floor(i/N), 0, permutations[Math.floor(i/N)].slice(0));
}
// "weave" next number in
for (var i = 0, j = N-1, d = -1; i < permutations.length; i++) {
// insert number N-1 at index j
permutations[i].splice(j, 0, N-1);
// index j is N-1, N-2, N-3, ... , 1, 0; then 0, 1, 2, ... N-1; then N-1, N-2, etc.
j += d;
// at beginning or end of the row, switch weave direction
if (j < 0 || j >= N) {
d *= -1;
j += d;
}
}
return permutations;
}
}
So what's the best strategy to port that to Java? Can I do it with just primitive arrays? Do I need an array of ArrayLists? Or an ArrayList of ArrayLists? Or is there some other data type that's better? Whatever I use, I need to be able to convert it back into an array of primitive arrays.
Maybe's there a better algorithm that would simplify this for me...
Thank you in advance for your advice!
As you know the number of permutations beforehand (it's N!) and also you want/have to return an int[][] I would go for an array directly. You can declare it right at the beginning with correct dimensions and return it at the end. Thus you don't have to worry about converting it afterwards at all.
Since you pretty much had it completed on your own in JavaScript, I'll go ahead and give you the Java code for implementing Steinhaus' permutation algorithm. I basically just ported your code to Java, leaving as much of it the same as I could, including comments.
I tested it up to N = 7. I tried to have it calculate N = 8, but it's been running for almost 10 minutes already on a 2 GHz Intel Core 2 Duo processor, and still going, lol.
I'm sure if you really worked at it you could speed this up significantly, but even then you're probably only going to be able to squeeze maybe a couple more N-values out of it, unless of course you have access to a supercomputer ;-).
Warning - this code is correct, NOT robust. If you need it robust, which you usually don't for homework assignments, then that is an exercise left to you. I would also recommend implementing it using Java Collections, simply because it would be a great way to learn the ins and outs of the Collections API.
There's several "helper" methods included, including one to print a 2d array. Enjoy!
Update: N = 8 took 25 minutes, 38 seconds.
Edit: Fixed N == 1 and N == 2.
public class Test
{
public static void main (String[] args)
{
printArray (allPermutations (8));
}
public static int[][] allPermutations (int N)
{
// base case
if (N == 2)
{
return new int[][] {{1, 2}, {2, 1}};
}
else if (N > 2)
{
// start with all permutations of previous degree
int[][] permutations = allPermutations (N - 1);
for (int i = 0; i < factorial (N); i += N)
{
// copy each permutation N - 1 times
for (int j = 0; j < N - 1; ++j)
{
// similar to javascript's array.splice
permutations = insertRow (permutations, i, permutations [i]);
}
}
// "weave" next number in
for (int i = 0, j = N - 1, d = -1; i < permutations.length; ++i)
{
// insert number N at index j
// similar to javascript's array.splice
permutations = insertColumn (permutations, i, j, N);
// index j is N-1, N-2, N-3, ... , 1, 0; then 0, 1, 2, ... N-1; then N-1, N-2, etc.
j += d;
// at beginning or end of the row, switch weave direction
if (j < 0 || j > N - 1)
{
d *= -1;
j += d;
}
}
return permutations;
}
else
{
throw new IllegalArgumentException ("N must be >= 2");
}
}
private static void arrayDeepCopy (int[][] src, int srcRow, int[][] dest,
int destRow, int numOfRows)
{
for (int row = 0; row < numOfRows; ++row)
{
System.arraycopy (src [srcRow + row], 0, dest [destRow + row], 0,
src[row].length);
}
}
public static int factorial (int n)
{
return n == 1 ? 1 : n * factorial (n - 1);
}
private static int[][] insertColumn (int[][] src, int rowIndex,
int columnIndex, int columnValue)
{
int[][] dest = new int[src.length][0];
for (int i = 0; i < dest.length; ++i)
{
dest [i] = new int [src[i].length];
}
arrayDeepCopy (src, 0, dest, 0, src.length);
int numOfColumns = src[rowIndex].length;
int[] rowWithExtraColumn = new int [numOfColumns + 1];
System.arraycopy (src [rowIndex], 0, rowWithExtraColumn, 0, columnIndex);
System.arraycopy (src [rowIndex], columnIndex, rowWithExtraColumn,
columnIndex + 1, numOfColumns - columnIndex);
rowWithExtraColumn [columnIndex] = columnValue;
dest [rowIndex] = rowWithExtraColumn;
return dest;
}
private static int[][] insertRow (int[][] src, int rowIndex,
int[] rowElements)
{
int srcRows = src.length;
int srcCols = rowElements.length;
int[][] dest = new int [srcRows + 1][srcCols];
arrayDeepCopy (src, 0, dest, 0, rowIndex);
arrayDeepCopy (src, rowIndex, dest, rowIndex + 1, src.length - rowIndex);
System.arraycopy (rowElements, 0, dest [rowIndex], 0, rowElements.length);
return dest;
}
public static void printArray (int[][] array)
{
for (int row = 0; row < array.length; ++row)
{
for (int col = 0; col < array[row].length; ++col)
{
System.out.print (array [row][col] + " ");
}
System.out.print ("\n");
}
System.out.print ("\n");
}
}
Java arrays are not resizable (you cannot change their length). For a direct translation of this recursive algorithm you probably want to use the List interface (and probably the LinkedList implementation, as you want to put numbers in the middle). That is, List<List<Integer>>.
Beware: the factorial grows rapidly. For N = 13 there are 13! permutations, which is 6 227 020 800. But I guess you only need to run it for small values.
The algorithm above is quite complex, my solution would be:
create List<int[]> to hold all permutations
create one array of size N and fill it with identity ({1,2,3,...,N})
program function that in place creates next permutation in lexicographical ordering
repeat this until you get the identity again:
put a copy of the array at the end of the list
call the method to get next permutation.
If your program just needs to output all permutations, I would avoid storing them and just print them right away.
The algorithm to compute the next permutation can be found on the internet. Here, for example.
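For illustration, here is a minimal in-place sketch of that next-permutation step (it assumes a plain 0-based int[] and returns false once the last, i.e. descending, permutation has been reached):
static boolean nextPermutation(int[] a) {
    int k = a.length - 2;
    while (k >= 0 && a[k] >= a[k + 1]) k--;       // rightmost ascent
    if (k < 0) return false;                      // already the last permutation
    int l = a.length - 1;
    while (a[l] <= a[k]) l--;                     // rightmost element larger than a[k]
    int tmp = a[k]; a[k] = a[l]; a[l] = tmp;      // swap them
    for (int i = k + 1, j = a.length - 1; i < j; i++, j--) {
        tmp = a[i]; a[i] = a[j]; a[j] = tmp;      // reverse the tail
    }
    return true;
}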
Use whatever you want, arrays or lists, but don't convert them - it just makes it harder. I can't tell what's better, probably I'd go for ArrayList<int[]>, since the outer List allows me to add the permutation easily and the inner array is good enough. That's just a matter of taste (but normally prefer lists, since they're much more flexible).
As per Howard's advice, I decided I didn't want to use anything but the primitive array type. The algorithm I initially picked was a pain to implement in Java, so thanks to stalker's advice, I went with the lexicographic-ordered algorithm described at Wikipedia. Here's what I ended up with:
public static int[][] generatePermutations(int N) {
int[][] a = new int[factorial(N)][N];
for (int i = 0; i < N; i++) a[0][i] = i;
for (int i = 1; i < a.length; i++) {
a[i] = Arrays.copyOf(a[i-1], N);
int k, l;
for (k = N - 2; a[i][k] >= a[i][k+1]; k--);
for (l = N - 1; a[i][k] >= a[i][l]; l--);
swap(a[i], k, l);
for (int j = 1; k+j < N-j; j++) swap(a[i], k+j, N-j);
}
return a;
}
private static void swap(int[] is, int k, int l) {
int tmp_k = is[k];
int tmp_l = is[l];
is[k] = tmp_l;
is[l] = tmp_k;
}