Java OutOfMemoryError when allocating byte[] within max heap

I'm trying to figure out why I am getting an OOM error even though the byte array I am initializing plus the currently used memory is less than the max heap size (1000MB).
Right before the array is initialized I'm using 373MB with 117MB free. When I try to initialize the array, which takes up 371MB, I get the error. The strange thing is that the error persists until I give the JVM 1.2GB or more.
373 + 371 is 744, so I should still have 256MB free; this is driving me nuts.
In a second case, with 920MB used and 117MB free, initializing a 918MB array requires a heap of at least 2800MB.
Is this just part of how Java works? If so, is there a workaround so that something as simple as an array copy operation can be done in less than 3n memory?
(memory numbers are from Runtime and max heap size is set with -Xmx)
test.java:
// needs: java.security.SecureRandom, java.io.File, org.apache.commons.io.FileUtils
byte[] iv = new byte[32];
byte[] key = new byte[32];
new SecureRandom().nextBytes(iv);
new SecureRandom().nextBytes(key);
byte[] plaintext = FileUtils.readFileToByteArray(new File("sampleFile"));
EncryptionResult out = ExperimentalCrypto.doSHE(plaintext, key, iv);
ExperimentalCrypto.java:
public static byte[] doSHE(byte[] input, byte[] key, byte[] iv) {
    if (input.length % 32 != 0) {
        // pad the input up to the next multiple of 32 bytes
        byte[] temp;
        System.out.println((input.length / 32 + 1) * 32 / (1024 * 1024)); // new size in MB
        temp = new byte[(input.length / 32 + 1) * 32]; // OutOfMemoryError thrown here
        // (rest of the method is omitted in the question)

Typical JVM implementations split the Java heap into several areas dedicated to objects of a certain lifetime. Allocations of larger arrays typically bypass the areas for younger objects, since those areas are usually smaller and skipping them avoids unnecessary copying. So the array ends up in the “old generation” space, which commonly gets only about ⅔ of the heap; that is why a 371MB allocation can fail even when far more than 371MB of the 1000MB heap is nominally free. Since you are using JVisualVM, I recommend installing the Visual GC plugin, which shows a live view of the different memory areas and their fill state.
You can use the -XX:MaxPermSize=… and -XX:MaxNewSize=… startup options to reduce the sizes of the permanent and young generations, and thus indirectly raise the fraction of the heap given to the old generation, where your array will be allocated.
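A hypothetical way to observe this (LargeAlloc is an illustrative class, not from the question; the sizes are examples only):
public class LargeAlloc {
    public static void main(String[] args) {
        // a single large array, similar in spirit to the failing allocation
        byte[] big = new byte[700 * 1024 * 1024]; // 700 MB
        System.out.println("allocated " + big.length / (1024 * 1024) + " MB");
    }
}
Running java -Xmx1000m LargeAlloc may fail, while java -Xmx1000m -XX:MaxNewSize=64m LargeAlloc shrinks the young generation and gives the old generation a larger share of the same 1000MB heap.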

Related

How much memory does the JVM need to allocate a character array?

Consider the following one-line program:
public static void main(String[] args) throws Exception {
// 134217728 * 2 bytes / 1024 / 1024 = 256M
char[] array = new char[134217728];
}
How much memory does the JVM need to allocate this 256M character array?
It turns out the answer is -Xmx384m. Now let's try a 512M character array...
// 268435456 * 2 bytes / 1024 / 1024 = 512M
char[] array = new char[268435456];
The answer appears to be -Xmx769m.
Running through a few examples, it looks like for a character array of size m megabytes the JVM needs at minimum about 1.5m megabytes of memory to allocate the array. This seems like a lot; can anyone explain what is happening here?
I believe you're observing the way that the Oracle JVM allocates memory.
In particular, the whole array has to fit into one "generation". By default, the old generation is twice the size of the new generation - which means for every 3MB you add in total, you only get 2MB of extra space in the old generation. If you change the ratio, you can allocate the char array in a smaller total size. For example, this works for me with your 512MB array:
java -Xmx530M -XX:NewRatio=50 Test
As an aside, you see exactly the same effect with a byte array, and then you don't need to worry about doubling the length of the array to get the size in bytes. (There's the small constant overhead for the class reference and the length, but obviously that's trivial.)
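For instance, the byte-array version of the same experiment would be (a sketch; the element count is doubled so the array still holds 512M of data):
public static void main(String[] args) throws Exception {
    // 536870912 * 1 byte / 1024 / 1024 = 512M
    byte[] array = new byte[536870912];
}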
The environment needs a little bit of space for itself, and as noted in a comment it also depends on the JVM and compiler used (OpenJDK or Oracle, Java 6/7/8). All in all, the estimate should be based on the raw data size: 2 bytes per element, i.e. array.length * Character.SIZE / 8 (note that Character.SIZE is given in bits).
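As a back-of-the-envelope check (assuming array is the char[] from the question):
// rough lower bound: 2 bytes per char; Character.SIZE is given in bits
long dataBytes = (long) array.length * (Character.SIZE / 8);
System.out.println(dataBytes / (1024 * 1024) + " MB of raw char data");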

Java OutOfMemory Heap Space

Hi everyone. I am working on a project. Along the way, I got an "OutOfMemory" problem with this line of code...
this.A = new int[n][n];
...for n = 10000;
I tried fixing it by adding the -Xmx2048m option, but when I do, all of my other running programs stop responding. Any other suggestions for my case? Thanks a lot.
You have to calculate the space the array needs. If it is over 2048M then you'll receive an OutOfMemoryError.
In your case you are trying to allocate an array of 10,000 x 10,000, which is 100,000,000 elements.
A primitive int occupies 4 bytes. This means that the whole array takes
100,000,000 * 4 bytes, which is about 0.37 GB of space.
So it seems that something else in your program causes the error. For example, if you try to allocate multiple such arrays in a loop, you can run out of memory very quickly.
It can be a problem if your hardware does not have 2048M of memory.
It is also possible that before using -Xmx2048m you had for example -Xmx512m which might be too small for your array.
Use this to increase the heap size in Java:
-Xms<size> set initial Java heap size
-Xmx<size> set maximum Java heap size
-Xss<size> set Java thread stack size
java -Xms16m -Xmx64m ClassName
Let's see...
An int occupies 32 bits, or 4 bytes, in memory. Your 2-dimensional array thus requires
10000 * 10000 * 4 bytes = 400,000,000 bytes ≈ 381.47 MB
(plus storage for the array objects)
Let's verify that with a simple test:
int n = 10000;
int[][] foo = new int[n][n];
System.out.println("USED MEMORY: " + (Runtime.getRuntime().totalMemory() - Runtime.getRuntime().freeMemory()) / 1024 / 1024);
Result: 384 MB (on my machine, your results may vary).
Running out of memory in the heap happens when a requested block of memory cannot be allocated, not necessarily when the memory is full.
As you allocate memory in blocks of 40,000 bytes (each row of the array is 10,000 ints), you can run out of memory when no contiguous free block of that size is available; hundreds of free blocks smaller than 40,000 bytes won't help here. That means you can easily run out of memory when the free heap memory is fragmented into small blocks.
If you happen to allocate that memory multiple times in a row, try reusing the allocated memory instead of allocating it again (to prevent fragmentation to a certain degree).
Alternatively, try using smaller arrays (even if it means using more of them).
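A sketch of that reuse pattern (the run count is arbitrary):
int n = 10000;
int[][] a = new int[n][n];             // allocate the big array once
for (int run = 0; run < 10; run++) {
    for (int[] row : a) {
        java.util.Arrays.fill(row, 0); // reset rows instead of re-allocating
    }
    // ... work with a ...
}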

Java -Xmx command not giving me as much memory as I expected

I am running the following code using the java -Xmx60g command.
Each array should be around 8.5GB, for a total of 17GB. The machine has 64GB total, with 63GB "free." It prints DONE DECLARING RIVER HANDS 1, indicating that it finished declaring the first array.
But I get Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
during the declaration of the second array. I am confused, because my understanding is that -Xmx60g should give the JVM a 60GB heap, while the arrays only use 17GB.
Thanks in advance for your help!
long NUM_RIVER_HANDS = 2428287420L;
int half_river = (int)(NUM_RIVER_HANDS/2);
byte[][] river_hands_1 = new byte[half_river][7];
System.out.println("DONE DECLARING RIVER HANDS 1");
byte[][] river_hands_2 = new byte[(int)(NUM_RIVER_HANDS - half_river)][7];
System.out.println("DONE DECLARING RIVER HANDS 2");
In the first allocation, you're creating 1214143710 arrays, each of which is 7 bytes plus the object overhead. If we assume a per-object overhead for an array of 16 bytes (which is reasonable, if not conservative) then that means two thirds of your space is being wasted. If we assume 24 bytes for each array in total, that's ~29GB just for the first allocation... and if there's more overhead than that, pushing it to 32 bytes for each array, for example, it's ~38GB, at which point it's not surprising that you can't do it twice :)
Just change the way you're using the arrays to make this much more efficient:
// Names changed to follow Java naming conventions
byte[][] riverHands1 = new byte[7][halfRiver];
byte[][] riverHands2 = new byte[7][(int)(NUM_RIVER_HANDS - halfRiver)];
You'll need to change how you use the arrays as well, but now you have a total of 14 byte[] objects, each of which is very large... but you'll have hardly any wasted memory due to object overheads.
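Concretely, every access then swaps its indices; hand and field here are illustrative variables, not names from the question:
int hand = 42, field = 3;            // illustrative indices
// before: byte b = river_hands_1[hand][field];
byte b = riverHands1[field][hand];   // after transposing the dimensions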

In Java, how to allocate given amount of memory, and hold it until program exit?

I am doing some experiments on memory. The first problem I've met is how to allocate a given amount of memory at runtime, say 500MB. I need the program's process to hold it until the program exits.
I guess there may be several ways to achieve this; I'd prefer a simple but practical one.
Well, Java hides memory management from you, so there are two answers to your question:
Create data structures of the size you are going to need and hold a reference to them in some thread until the program exits, because once data on the heap is no longer referenced from an active thread it becomes garbage-collectable. On a 32-bit system, 500MB is roughly enough for an int array of 125,000,000 cells, or 125 int arrays of 1,000,000 cells.
If you just want the memory allocated and available, but not filled up, then start the virtual machine with -Xms512m. This makes the VM allocate 512M of memory for your program on startup, but the memory stays empty (merely reserved) until you need it (see point 1). -Xmx sets the maximum amount of memory your program may allocate.
public static void main(String[] args) {
    final byte[] x = new byte[500 * 1024];        // 500 KB
    final byte[] y = new byte[500 * 1024 * 1024]; // 500 MB
    ...
    // using x and y here keeps them reachable, so the GC cannot reclaim them
    System.out.println(x.length + y.length);
}
jmalloc lets you do it, but I wouldn't recommend it unless you're truly an expert. You're giving up something that's central to Java - garbage collection. You might as well be writing C.
Java NIO allocates byte buffers off heap this way. I think this is where Oracle is going for memory mapping JARs and getting rid of perm gen, too.
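A minimal sketch of that NIO approach (HoldDirectMemory is a hypothetical name; the allocation is bounded by -XX:MaxDirectMemorySize rather than -Xmx):
import java.nio.ByteBuffer;

public class HoldDirectMemory {
    public static void main(String[] args) throws InterruptedException {
        // 500 MB allocated outside the Java heap
        ByteBuffer buffer = ByteBuffer.allocateDirect(500 * 1024 * 1024);
        System.out.println("holding " + buffer.capacity() + " bytes off heap");
        Thread.sleep(Long.MAX_VALUE); // keep the process (and the buffer) alive
    }
}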

Java throwing out of memory exception before it's really out of memory?

I wish to make a large int array that very nearly fills all of the memory available to the JVM. Take this code, for instance:
final Runtime runtime = Runtime.getRuntime();
final int BUFFER_SIZE = 4096; // hypothetical value; the question does not state it
final int numBuffers = (int) ((runtime.freeMemory() - 200000L) / (BUFFER_SIZE));
System.out.println(runtime.freeMemory());
System.out.println(numBuffers * (BUFFER_SIZE / 4) * 4);
buffers = new int[numBuffers * (BUFFER_SIZE / 4)]; // buffers is an int[] field
When run with a heap size of 10M, this throws an OutOfMemoryError, despite the output from the printlns being:
9487176
9273344
I realise the array is going to have some overhead, but not 200k, surely? Why does Java fail to allocate memory for something it claims to have enough space for? I have to set the subtracted constant to around 4M before Java will run this, by which point the printlns look more like:
9487176
5472256
Even more bewilderingly, if I replace buffers with a 2D array:
buffers = new int[numBuffers][BUFFER_SIZE / 4];
Then it runs without complaint using the 200k subtraction shown above, even though the amount of integers being stored is the same in both arrays. (And wouldn't the overheads on a 2D array be larger than those of a 1D array, since it has all those references to the row arrays to store?)
Any ideas?
The VM will divide the heap memory into different areas (mainly for the garbage collector), so you will run out of memory when you attempt to allocate a single object of nearly the entire heap size.
Also, some memory will already have been used up by the JRE. 200k is nothing with today's memory sizes, and a 10M heap is almost unrealistically small for most applications.
The actual overhead of an array is relatively small: on a 32-bit VM it's 12 bytes IIRC (plus up to 7 bytes of padding if the size is not a multiple of the minimal granularity, which is AFAIK 8 bytes). So in the worst case you have something like 19 bytes of overhead per array.
Note that Java has no 2D (multi-dimensional) arrays, it implements this internally as an array of arrays.
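This is easy to demonstrate; the rows of a Java "2D" array are independent objects:
int[][] ragged = new int[3][]; // only the outer array exists so far
ragged[0] = new int[10];
ragged[1] = new int[20];       // rows need not have equal lengths
ragged[2] = ragged[0];         // two rows can even be the same object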
In the 2D case, you are allocating more, smaller objects. The memory manager objects to the single large object taking up most of the heap. Why this is objectionable is a detail of the garbage collection scheme; it's probably because the collector can move the smaller objects between generations, while the heap won't accommodate moving the single large object around.
This might be due to memory fragmentation and the JVM's inability to allocate an array of that size given the current heap.
Imagine your heap is 10 x long:
xxxxxxxxxx
Then, you allocate an object 0 somewhere. This makes your heap look like:
xxxxxxx0xx
Now you can no longer allocate a block of 10 xs. You cannot even allocate 8 xs, even though 9 xs of memory are free.
An array of arrays does not suffer from the same problem, because it does not need to be contiguous.
EDIT: Please note that the above is a very simplistic view of the problem. When in need of space in the heap, Java's garbage collector will try to collect as much memory as it can and, if really, really necessary, try to compact the heap. However, some objects might not be movable or collectible, creating heap fragmentation and putting you in the above situation.
There are also many other factors that you have to consider, some of which include: memory leaks either in the VM (not very likely) or your application (also not likely for a simple scenario), unreliability of using Runtime.freeMemory() (the GC might run right after the call and the available free memory could change), implementation details of each particular JVM, etc.
The point is, as a rule of thumb, don't always expect to have the full amount of Runtime.freeMemory() available to your application.
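If you do need to size an allocation from runtime statistics, a more defensive sketch uses maxMemory() as well and keeps a safety margin (the 20% headroom is an arbitrary choice):
Runtime rt = Runtime.getRuntime();
long used = rt.totalMemory() - rt.freeMemory();
long available = rt.maxMemory() - used;  // an upper bound, not a guarantee
long budget = available * 8 / 10;        // keep ~20% headroom
int[] buffers = new int[(int) Math.min(Integer.MAX_VALUE, budget / 4)];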
