Java OutOfMemory Heap Space

Hi everyone. I am working on a project. Along the way, I ran into an "OutOfMemory" error with this line of code...
this.A = new int[n][n];
...for n = 10000;
I tried fixing it by adding the -Xmx2048m option, but when I add it, all of my other running programs stop responding. Any other suggestions for my case? Thanks a lot.

You have to calculate the space that array needs. If it is over 2048M then you'll receive an OutOfMemoryError.
In your case you try to allocate an array of 10000 x 10000, which is 100,000,000 elements.
A primitive int occupies 4 bytes. This means that the whole array takes
100,000,000 * 4 bytes, which is about 0.37 GB of space.
So it seems that there is something else in your program that causes the error. For example, if you try to allocate multiple arrays in a loop, you can run out of memory very quickly.
It can also be a problem if your machine does not have 2048 MB of memory available.
It is also possible that before using -Xmx2048m you had, for example, -Xmx512m, which might be too small for your array.
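Before allocating, you can estimate the requirement up front. A minimal sketch (the ~16 bytes of per-row array-object overhead is an assumption that varies by JVM):
int n = 10000;
long dataBytes = (long) n * n * 4L;  // 4 bytes per int element
long rowHeaders = (long) n * 16L;    // assumed per-row array object overhead, JVM-dependent
System.out.println("Roughly " + (dataBytes + rowHeaders) / (1024 * 1024) + " MB needed");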

Use these options to increase the heap size in Java:
-Xms<size> set initial Java heap size
-Xmx<size> set maximum Java heap size
-Xss<size> set java thread stack size
java -Xms16m -Xmx64m ClassName

Let's see...
An int occupies 32 bits (4 bytes) in memory. Your 2-dimensional array thus requires
10000 * 10000 * 4 bytes = 400,000,000 bytes ≈ 381.47 MB
(plus storage for the array objects)
Let's verify that with a simple test:
int n = 10000;
int[][] foo = new int[n][n];
System.out.println("USED MEMORY: " + (Runtime.getRuntime().totalMemory() - Runtime.getRuntime().freeMemory()) / 1024 / 1024);
Result: 384 MB (on my machine, your results may vary).
Running out of memory in the heap happens when a requested block of memory cannot be allocated, not necessarily when the memory is full.
As you allocate memory in blocks of 40,000 bytes (one row of the array), you can run out of memory when no contiguous free block of that size is available (hundreds of free blocks smaller than 40,000 bytes won't help here). That means you can easily run out of memory when the free memory on the heap is fragmented into small blocks.
If you happen to allocate that memory multiple times in a row, try reusing the allocated memory instead of allocating it again (to prevent fragmentation to a certain degree).
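A minimal sketch of that reuse idea, assuming the same dimensions are needed on every pass (n and iterations stand in for whatever your program actually uses):
int[][] buffer = new int[n][n];  // allocate once, outside the loop
for (int iter = 0; iter < iterations; iter++) {
    for (int[] row : buffer) {
        java.util.Arrays.fill(row, 0);  // reset contents in place instead of a fresh new int[n][n]
    }
    // ... fill and use buffer for this pass ...
}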
Alternatively, try using smaller arrays (even if it means using more of them).

Related

Why is the maximum size of a Java array Integer.MAX_VALUE/7?

I am a little surprised to see that on my machine the maximum size of an array is Integer.MAX_VALUE/7.
I know that arrays are indexed by integers, so the array size cannot be greater than Integer.MAX_VALUE. I also read some Stack Overflow discussions where I found that it varies by JVM, and that a few bytes (5-8) are used by the JVM.
Even in that case, the maximum value should be Integer.MAX_VALUE-8.
Any value in between Integer.MAX_VALUE-2 and Integer.MAX_VALUE/7 gives me the error: Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
int[] arr = new int[Integer.MAX_VALUE/7];
This is the largest size I can give an array on my machine. Are there specific reasons for this?
Update:
I am running the code from Eclipse, in which the default heap size is 1024MB. Below are more details from my environment:
System.out.println(Runtime.getRuntime().totalMemory()/(1024*3));
System.out.println(Runtime.getRuntime().freeMemory()/(1024*3));
System.out.println(Runtime.getRuntime().maxMemory()/(1024*3));
give the output:
40618
40195
594773
As mentioned already by cloudworker, the real limits for arrays are explained here: Do Java arrays have a maximum size?
In your case, 1 GB is just not enough heap space for an array that huge.
I do not know what exact processes run in the JVM, but from what I am able to count:
Integer.MAX_VALUE ≈ 2 billion
int = 4 bytes
2 billion * 4 bytes = 8 billion bytes ≈ 8 GB of memory
With 1 GB of heap space you should be able to fit roughly MAX_VALUE/8 elements. (I think the reason you can actually get more than MAX_VALUE/8 is some optimization in the JVM.)
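A rough way to check that bound on your own machine; a sketch only, since the real headroom is smaller (the JVM itself and the rest of the program also live on the heap):
long maxHeap = Runtime.getRuntime().maxMemory();  // bytes, as set by -Xmx
long maxIntElements = maxHeap / 4;                // 4 bytes per int element, ignoring array overhead
System.out.println("Theoretical upper bound on int[] length: " + maxIntElements);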

How much memory does the JVM need to allocate a character array?

Consider the following one-line program:
public static void main(String[] args) throws Exception {
    // 134217728 * 2 bytes / 1024 / 1024 = 256M
    char[] array = new char[134217728];
}
How much memory does the JVM need to allocate this 256M character array?
It turns out the answer is -Xmx384m. Now let's try a 512M character array...
// 268435456 * 2 bytes / 1024 / 1024 = 512M
char[] array = new char[268435456];
The answer appears to be -Xmx769m.
Running through a few examples, for a character array of size m megabytes the JVM needs at minimum 1.5m megabytes of memory to allocate it. This seems like a lot; can anyone explain what is happening here?
I believe you're observing the way that the Oracle JVM allocates memory.
In particular, the whole array has to fit into one "generation". By default, the old generation is twice the size of the new generation - which means for every 3MB you add in total, you only get 2MB of extra space in the old generation. If you change the ratio, you can allocate the char array in a smaller total size. For example, this works for me with your 512MB array:
java -Xmx530M -XX:NewRatio=50 Test
As an aside, you see exactly the same effect with a byte array, and then you don't need to worry about doubling the length of the array to get the size in bytes. (There's the small constant overhead for the class reference and the length, but obviously that's trivial.)
The environment needs a little bit of space for itself, and as noted in one comment it depends on the JVM and compiler used (OpenJDK or Oracle, Java 6/7/8). All in all, the requirement should be on the order of (Character.SIZE / 8) * array.length bytes, i.e. 2 bytes per char.
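To verify the raw 2-bytes-per-char figure itself, a measurement in the style of the earlier int example can help (a sketch; GC timing makes the number approximate, so treat it as a ballpark):
Runtime rt = Runtime.getRuntime();
long before = rt.totalMemory() - rt.freeMemory();
char[] array = new char[134217728];  // 128M chars, expected to use ~256 MB
long after = rt.totalMemory() - rt.freeMemory();
System.out.println("USED: " + (after - before) / 1024 / 1024 + " MB");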

Java OutOfMemoryError when allocating byte[] within max heap

I'm trying to figure out why I am getting an OOM error even though the byte array I am initializing plus the currently used memory is less than the max heap size (1000MB).
Right before the array is initialized I'm using 373MB with 117 free. When I try to initialize the array that takes up 371MB I get an error. The strange thing is that the error persists until I allocate 1.2G or more for the JVM.
373 + 371 is 744, so I should still have 256MB free; this is driving me nuts.
In a second case, using 920MB with 117 free, initializing a 918MB array requires at least 2800MB.
Is this somehow part of how Java functions? If so, is there a workaround so that something simple like an array copy can be done in less than 3n memory?
(memory numbers are from Runtime and max heap size is set with -Xmx)
test.java:
byte[] iv = new byte[32];
byte[] key = new byte[32];
new SecureRandom().nextBytes(iv);
new SecureRandom().nextBytes(key);
byte[] plaintext = FileUtils.readFileToByteArray(new File("sampleFile"));
EncryptionResult out = ExperimentalCrypto.doSHE(plaintext, key, iv);
ExperimentalCrypto.java:
public static byte[] doSHE(byte[] input, byte[] key, byte[] iv) {
    if (input.length % 32 != 0) {
        int length = input.length;
        byte[] temp = null;
        System.out.println((input.length / 32 + 1) * 32 / (1024 * 1024));
        temp = new byte[(input.length / 32 + 1) * 32]; // encounter error here
Typical JVM implementations split the Java heap into several parts dedicated to objects with a certain lifetime. Allocations of larger arrays typically bypass the stages for younger objects as these areas are usually smaller and to avoid unnecessary copying. So they will end up in the “old generation” space for which a size of ⅔ is not unusual. As you are using JVisualVM I recommend installing the plugin Visual GC which can show you a live view of the different memory areas and their fill state.
You can use the -XX:MaxPermSize=… and -XX:MaxNewSize=… startup options to reduce the sizes of the areas for the young and permanent generation and thus indirectly raise the fraction of the old generation’s area where your array will be allocated.
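For example, an invocation along these lines (illustrative sizes only; note that -XX:MaxPermSize was removed in Java 8, and YourMainClass stands in for your actual class):
java -Xmx1000m -XX:MaxNewSize=64m -XX:MaxPermSize=64m YourMainClass
With the young and permanent generations capped, a larger fraction of the 1000MB heap is left for the old generation where the big array must fit.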

Java char array seems to need more than 2 bytes per char

When I run the following program (running with "java -Xmx151M -cp . com.some.package.xmlfun.Main"):
package com.some.package.xmlfun;

public class Main {
    public static void main(String[] args) {
        char[] chars = new char[50 * 1024 * 1024];
    }
}
I need to increase the maximum memory to at least 151M (-Xmx151M). Accordingly, when I increase the array size, the limit needs to be increased:
50 * 1024 * 1024 -> -Xmx151M
100 * 1024 * 1024 -> -Xmx301M
150 * 1024 * 1024 -> -Xmx451M
Why does it look like Java needs 3 bytes per char, instead of the 2 bytes the documentation suggests?
Also, when I similarly create an array of long it seems to need 12 bytes per long instead of 8, and with int it needs 6 bytes instead of 4. Generally, it looks like it needs array_size * element_size * 1.5.
Compiling with - javac com\some\package\xmlfun\*.java
Running with - java -Xmx151M -cp . com.some.package.xmlfun.Main
I guess what you are seeing can be easily explained by how the heap in the JVM is organized.
When you pass the parameter -Xmx to the JVM, you are defining what the maximum heap size should be. However, it is not directly related to the maximum size of an array that you can allocate.
In the JVM, the garbage collector is responsible for allocating memory for objects and for cleaning up dead objects. It is the garbage collector that decides how it organizes the heap.
You usually have something called Eden space, then two survivor spaces and finally the tenured generation. All of these are inside the heap, and the GC divides the maximum heap among them. For more details on these memory pools, check this brilliant answer: https://stackoverflow.com/a/1262474/150339
I don't know what the default values are, and they might indeed depend on your system. I've just checked (using sudo jmap PID) how the memory pools divide the heap in an application I run on a system running 64-bit Ubuntu and Oracle's Java 7. The machine has 1.7GB of RAM.
In that configuration, I only pass -Xmx to the JVM, and the GC divides the heap as follows:
about 27% for the Eden space
about 3% for each of the survivor spaces
about 67% for the tenured generation.
If you have a similar distribution, it would mean that the largest contiguous block of your 151MB is in the tenured generation, at about 100MB (67% of 151MB). Since an array is a contiguous block of memory, and an object simply cannot span multiple memory pools, it explains the behaviour you are seeing.
You could try playing with the garbage collector parameters. Check the garbage collector parameters over here: http://www.oracle.com/technetwork/java/javase/tech/vmoptions-jsp-140102.html
Your results seem pretty reasonable to me.
In the Java HotSpot VM, the heap is divided into a "new generation" and an "old generation", and the array must fit entirely in one of them. The default value of the new/old generation size ratio is 2 (which actually means old/new = 2).
So with some simple math it can be shown that a 151MB heap has a 50.33MB new generation and a 100.67MB old generation, while a 150MB heap has exactly 100MB of old generation. Your array plus everything else (such as args) will exhaust that 100MB, hence the OutOfMemoryError.
I tried to run with
java -Xms150m -Xmx150m -XX:+PrintGCDetails Main > c.txt
And from c.txt
(...)
Heap
 PSYoungGen      total 44800K, used 3072K (addresses...)
  eden space 38400K, 8% used (...)
  from space 6400K, 0% used (...)
  to   space 6400K, 0% used (...)
 ParOldGen       total 102400K, used 217K (...)
  object space 102400K, 0% used (...)
 PSPermGen       total 21248K, used 2411K (...)
  object space 21248K, 11% used (...)
The sizes do not exactly match my calculations, but they are close.
If you look at the size of the data (for example with Visual GC), you see that the size of the array is indeed 2 bytes per char.
The problem here is that the JVM tries to fit the whole array in the old generation of the heap, and the size of this generation is constrained by the ratio of new/old generation sizes.
Running with -XX:NewRatio=5 will correct the problem (the default value is 2): with a ratio of 5, the old generation gets 5/6 of the heap, about 125MB of the 151MB, which is enough room for the 100MB array.
I will try to build on Bruno's answer. I tried this code just now:
public static void main(String[] args) throws IOException {
    char[] chars = new char[50 * 1024 * 1024];
    System.out.println(Runtime.getRuntime().freeMemory());
    System.out.println(Runtime.getRuntime().totalMemory());
    System.out.println(Runtime.getRuntime().maxMemory());
}
And the output was:
38156248
143654912
143654912
So roughly 36 MB of the 137 MB total were left free for other purposes of the JVM; the other ~100 MB is almost exactly what the 50M-char array needs. My best guess for the free part would be the new generation space.

Java throwing out of memory exception before it's really out of memory?

I wish to make a large int array that very nearly fills all of the memory available to the JVM. Take this code, for instance:
final int numBuffers = (int) ((runtime.freeMemory() - 200000L) / BUFFER_SIZE); // runtime is Runtime.getRuntime(); BUFFER_SIZE is a constant defined elsewhere
System.out.println(runtime.freeMemory());
System.out.println(numBuffers * (BUFFER_SIZE / 4) * 4);
buffers = new int[numBuffers * (BUFFER_SIZE / 4)];
When run with a heap size of 10M, this throws an OutOfMemoryError, despite the output from the printlns being:
9487176
9273344
I realise the array is going to have some overhead, but not 200k, surely? Why does Java fail to allocate memory for something it claims to have enough space for? I have to set the subtracted constant to around 4M before Java will run this (by which time the printlns look more like:
9487176
5472256
)
Even more bewilderingly, if I replace buffers with a 2D array:
buffers = new int[numBuffers][BUFFER_SIZE / 4];
Then it runs without complaint using the 200k subtraction shown above, even though the number of integers being stored is the same in both arrays (and wouldn't the overhead of a 2D array be larger than that of a 1D array, since it has all those references to other arrays to store?).
Any ideas?
The VM will divide the heap memory into different areas (mainly for the garbage collector), so you will run out of memory when you attempt to allocate a single object of nearly the entire heap size.
Also, some memory will already have been used up by the JRE. 200k is nothing with today's memory sizes, and a 10M heap is almost unrealistically small for most applications.
The actual overhead of an array is relatively small: on a 32-bit VM it's 12 bytes IIRC (plus what gets wasted if the size is less than the minimal granularity, which is AFAIK 8 bytes). So in the worst case you have something like 19 bytes of overhead per array.
Note that Java has no true 2D (multi-dimensional) arrays; it implements them internally as arrays of arrays.
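To make that concrete with the numbers above (assuming the 12-byte header and 8-byte granularity just mentioned):
new int[5]: 12 bytes header + 5 * 4 = 20 bytes data = 32 bytes total, already a multiple of 8
new int[6]: 12 bytes header + 6 * 4 = 24 bytes data = 36 bytes, padded up to 40 (4 bytes wasted)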
In the 2D case, you are allocating more, smaller objects. The memory manager is objecting to the single large object taking up most of the heap. Why this is objectionable is a detail of the garbage collection scheme; it's probably because it can move the smaller objects between generations, while the heap won't accommodate moving the single large object around.
This might be due to memory fragmentation and the JVM's inability to allocate an array of that size given the current heap.
Imagine your heap is 10 x long:
xxxxxxxxxx
Then, you allocate an object 0 somewhere. This makes your heap look like:
xxxxxxx0xx
Now you can no longer allocate a block of 10 x spaces. You cannot even allocate 8 xs, despite the fact that 9 xs of memory are available.
The fact is that an array of arrays does not suffer from the same problem because it's not contiguous.
EDIT: Please note that the above is a very simplistic view of the problem. When in need of space in the heap, Java's garbage collector will try to collect as much memory as it can and, if really, really necessary, try to compact the heap. However, some objects might not be movable or collectible, creating heap fragmentation and putting you in the above situation.
There are also many other factors that you have to consider, some of which include: memory leaks either in the VM (not very likely) or your application (also not likely for a simple scenario), unreliability of using Runtime.freeMemory() (the GC might run right after the call and the available free memory could change), implementation details of each particular JVM, etc.
The point is, as a rule of thumb, don't always expect to have the full amount of Runtime.freeMemory() available to your application.
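In code, that rule of thumb might look like the following sketch (the 20% margin is an arbitrary safety factor, not a JVM constant; BUFFER_SIZE is the constant from the question):
Runtime runtime = Runtime.getRuntime();
long usable = (long) (runtime.freeMemory() * 0.8);       // keep ~20% headroom for the JVM and GC
int numBuffers = (int) (usable / BUFFER_SIZE);
int[][] buffers = new int[numBuffers][BUFFER_SIZE / 4];  // array of arrays avoids one huge contiguous block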
