How to set the maximum memory usage for JVM? - java

I want to limit the maximum memory used by the JVM. Note, this is not just the heap, I want to limit the total memory used by this process.

Use the arguments -Xms<memory> -Xmx<memory>. Append M or G to the number to indicate megabytes or gigabytes respectively. -Xms sets the minimum (initial) heap size and -Xmx the maximum heap size.
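For example, a typical invocation (the class name and values here are purely illustrative) would look like:
java -Xms256m -Xmx1024m MyApp
This starts the heap at 256 MB and lets it grow to at most 1 GB.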

You shouldn't have to worry about the stack leaking memory (it is highly uncommon). The only time you can have the stack get out of control is with infinite (or really deep) recursion.
This is just the heap. Sorry, didn't read your question fully at first.
You need to run the JVM with the following command line argument.
-Xmx<amount of memory>
Example:
-Xmx1024m
That will allow a maximum of 1 GB of heap for the JVM.

If you want to limit the memory for the JVM process as a whole (not just the heap size), use:
ulimit -v
To get an idea of the difference between JVM process memory and heap memory, take a look at this excellent article:
http://blogs.vmware.com/apps/2011/06/taking-a-closer-look-at-sizing-the-java-process.html
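As a rough sketch of how that would be used (the numbers are illustrative; ulimit -v typically takes the limit in kilobytes and applies to the whole process's virtual address space):
ulimit -v 4194304
java -Xmx2g MyApp
As the next answer points out, don't expect the JVM to fail gracefully when it hits this limit.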

The answer above is kind of correct: you can't gracefully control how much native memory a Java process allocates. It depends on what your application is doing.
That said, depending on the platform, you may be able to use some mechanism, ulimit for example, to limit the size of a Java process (or any other process).
Just don't expect it to fail gracefully if it hits that limit. Native memory allocation failures are much harder to handle than allocation failures on the java heap. There's a fairly good chance the application will crash but depending on how critical it is to the system to keep the process size down that might still suit you.

The native (direct) memory limit can be increased with -XX:MaxDirectMemorySize=256M
(the default is 128M)
I've never used it. Maybe you'll find it useful.
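For what it's worth, a minimal sketch of what hitting that limit looks like (assuming the flag caps direct ByteBuffer allocations; the class name and sizes are just illustrative):

import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.List;

// Run with -XX:MaxDirectMemorySize=256M; the loop should eventually fail with
// an OutOfMemoryError ("Direct buffer memory") instead of exhausting the Java heap.
public class DirectMemoryDemo {
    public static void main(String[] args) {
        List<ByteBuffer> buffers = new ArrayList<>();
        while (true) {
            buffers.add(ByteBuffer.allocateDirect(16 * 1024 * 1024)); // 16 MB off-heap
            System.out.println("Allocated " + (buffers.size() * 16) + " MB of direct memory");
        }
    }
}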

Related

How to gracefully tell Java about total memory limits?

I have troubles with Java memory consumption.
I'd like to say to Java something like this: "you have 8GB of memory, please use it, and only it. Only if you really can't put all your resources in this memory pool, then fail with OOM".
I know, there are default parameters like -Xmx - they limit only the heap. There are also plenty of other parameters, I know. The problems with these parameters are:
They don't do what I need. I don't want to limit the heap size to 6GB (and trust that native memory won't take more than 2GB). I want to limit all the memory (heap, native, whatever), and do that effectively, not just set "-Xmx1GB" to be safe.
There are too many different parameters related to memory, and I don't know how to configure all of them to achieve the goal.
So, I don't want to go there and care about heap, perm and whatever types of memory. My high-level expectation is: since there is only 8GB, and some static memory is needed - take the static memory from the 8GB, and carefully split the remaining memory between other dynamic memory entities.
Also, ulimit and similar things don't work. I don't want to kill the Java process once it consumes more memory than expected. I want Java to do its best not to reach the limit in the first place, and only if it really, really can't, kill the process.
And I'm OK to define even 100 java parameters, why not. :) But then I need assistance with the full list of needed parameters (for, say, Java 8).
Have you tried -XX:MetaspaceSize?
Is this what you need?
Please, read this article: http://karunsubramanian.com/websphere/one-important-change-in-memory-management-in-java-8/
Keep in mind that this is only valid for Java 8.
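If that is what you need, passing it would look something like this (values illustrative; -XX:MaxMetaspaceSize is the companion flag that actually caps metaspace, while -XX:MetaspaceSize only sets the initial threshold):
java -XX:MetaspaceSize=128m -XX:MaxMetaspaceSize=256m MyApp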
AFAIK, there is no java command line parameter or set of parameters that will do that.
Your best bet (IMO) is to set the max heap size and the max metaspace size and hope that other things are going to be pretty static / predictable for your application. (It won't cover the size of the JVM binary and it probably won't cover native libraries, memory mapped files, stacks and so on.)
In a comment you said:
So I'm forced to have a significant amount of memory unused to be safe.
I think you are worrying about the wrong thing here. Assuming that you are not constrained by address space or swap space limitations, memory that is never used doesn't matter.
If a page of your address space is not used, the OS will (in the long term) swap it out, and give the physical RAM page to something else.
Pages in the heap won't be in that situation in a typical Java application. (Address space pages will cycle between in-use and free as the GC moves objects within and between "spaces".)
However, the flip-side is that a GC needs the total heap size to be significantly larger than the sum of the live objects. If too much of the heap is occupied with reachable objects, the interval between garbage collection runs decreases, and your GC ergonomics suffer. In the worst case, a JVM can grind to a halt as the time spent in the GC tends to 100%. Ugly. The GC overhead limit mechanism prevents this, but that just means that your JVM gets an OOME sooner.
So, in the normal heap case, a better way to think about it is that you need to keep a portion of memory "unused" so that the GC can operate efficiently.
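To make that concrete, a plausible starting point (all values illustrative, covering only heap, metaspace and per-thread stacks, not the other native allocations mentioned above) might be:
java -Xmx6g -XX:MaxMetaspaceSize=512m -Xss1m MyApp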

Is it good to set the max and min JVM heap size the same?

Currently in our testing environment the max and min JVM heap size are set to the same value, basically as much as the dedicated server machine will allow for our application. Is this the best configuration for performance or would giving the JVM a range be better?
Peter's answer is correct in that -Xms is allocated at startup and the heap will grow up to -Xmx (max heap size), but it's a little misleading in how he has worded his answer. (Sorry Peter, I know you know this stuff cold.)
Setting ms == mx effectively turns off this behavior. While this used to be a good idea in older JVMs, it is no longer the case. Growing and shrinking the heap allows the JVM to adapt to increases in pressure on memory yet reduce pause time by shrinking the heap when memory pressure is reduced. Sometimes this behavior doesn't give you the performance benefits you'd expect and in those cases it's best to set mx == ms.
An OOME is thrown when more than 98% of time is spent collecting and the collections cannot recover more than 2% of the heap. If you are not at the max heap size then the JVM will simply grow the heap so that you're no longer within those boundaries. You cannot have an OutOfMemoryError on startup unless your heap hits the max heap size and meets the other conditions that define an OutOfMemoryError.
For the comments that have come in since I posted: I don't know what the JMonitor blog entry is showing, but this is from the PSYoung collector.
size_t desired_size = MAX2(MIN2(eden_plus_survivors, gen_size_limit()), min_gen_size());
I could do more digging, but I'd bet I'd find code that serves the same purpose in the ParNew, PSOldGen and CMS Tenured implementations. In fact it's unlikely that CMS would be able to return memory unless there has been a Concurrent Mode Failure. In the case of a CMF the serial collector will run, and that should include a compaction, after which the top of the heap would most likely be clean and therefore eligible to be deallocated.
The main reason to set -Xms is if you need a certain heap size on startup. (It prevents OutOfMemoryErrors from happening on startup.) As mentioned above, if you need the startup heap to match the max heap, that is when you would match them. Otherwise you don't really need it; it just asks the application to take up more memory than it may ultimately need. Watching your memory use over time (profiling) while load testing and using your application should give you a good feel for what you need to set them to. But it isn't the worst thing to set them to the same value on startup. For a lot of our apps, I actually start out with something like 128, 256, or 512 for min (startup) and one gigabyte for max (this is for non application server applications).
Just found this question on Stack Overflow which may also be helpful: side-effect-for-increasing-maxpermsize-and-max-heap-size. Worth a look.
AFAIK, setting both to the same size does away with the additional step of heap resizing, which might be in your favour if you pretty much know how much heap you are going to use. Also, having a large heap size reduces GC invocations to the point that they happen very few times. In my current project (risk analysis of trades), our risk engines have both Xmx and Xms set to the same, pretty large value (around 8 GiB). This ensures that even after an entire day of invoking the engines, almost no GC takes place.
Also, I found an interesting discussion here.
Definitely yes for a server app. What's the point of having so much memory but not using it?
(No it doesn't save electricity if you don't use a memory cell)
The JVM loves memory. For a given app, the more memory the JVM has, the less GC it performs. The best part is that more objects will die young and fewer will be tenured.
Especially during server startup, the load is even higher than normal. It's brain dead to give the server a small amount of memory to work with at this stage.
From what I see here at http://java-monitor.com/forum/showthread.php?t=427
the JVM under test begins with the Xms setting, but WILL deallocate memory it doesn't need, and it will take it up to the Xmx mark when it needs it.
Unless you need a chunk of memory dedicated to a big memory consumer initially, there's not much of a point in setting a high Xms=Xmx. It looks like deallocation and allocation occur even with Xms=Xmx.
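If you want to see this for yourself, here is a minimal sketch (the class name, sizes and timings are arbitrary; run it with different -Xms/-Xmx values) that prints the committed heap size while churning through short-lived garbage:

public class HeapResizeWatch {
    public static void main(String[] args) throws InterruptedException {
        Runtime rt = Runtime.getRuntime();
        for (int i = 0; i < 60; i++) {
            byte[] garbage = new byte[20 * 1024 * 1024]; // 20 MB that becomes unreachable immediately
            System.out.printf("committed heap: %d MB, max heap: %d MB%n",
                    rt.totalMemory() / (1024 * 1024), rt.maxMemory() / (1024 * 1024));
            Thread.sleep(200);
        }
    }
}

Whether the committed size actually shrinks back down depends on the collector in use, as discussed above.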

Does -Xmx stop automatic GC till maximum memory is consumed?

My question is simple. I have an application that specifies the "-Xmx3G" command line option. Does this mean that no garbage collection will take place in the application till all (or, say, 80%) of the 3 GB of memory is consumed? Any further reading material would be appreciated as well.
No. A minor GC can occur even before the minimum memory (-ms) has been reached. The JVM reserves the maximum memory (-mx) on startup. However, you can get full collections before this size is reached.
No. A simple test would demonstrate that!
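For example, a minimal sketch of such a test (run with something like -Xmx3g; the class name and allocation size are arbitrary) shows the collection count climbing long before the heap gets anywhere near the maximum:

import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;

public class GcBeforeMaxDemo {
    public static void main(String[] args) throws InterruptedException {
        for (int i = 0; i < 100; i++) {
            byte[] garbage = new byte[10 * 1024 * 1024]; // 10 MB, immediately unreachable
            long used = Runtime.getRuntime().totalMemory() - Runtime.getRuntime().freeMemory();
            long collections = 0;
            for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
                collections += gc.getCollectionCount();
            }
            System.out.printf("used: %d MB, collections so far: %d%n", used / (1024 * 1024), collections);
            Thread.sleep(50);
        }
    }
}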

How to get just free heap size (not together w stack/method mem) in Java?

I want to calculate the heap usage for my app. I would like to get a percentage value of the heap size only.
How do I get the value in code for the current running app?
EDIT
There was an upvoted answer that was NOT complete/correct. The values returned by those methods include stack and method area too, and I need to monitor only heap size.
With that code I got a HeapError exception when I reached 43%, so I can't use those methods to monitor just the heap:
Runtime.getRuntime().totalMemory()
dbyme's answer is not accurate - these Runtime calls give you the amount of memory used by the JVM, but this memory does not consist only of the heap; there is also the stack and the method area, for example.
This information is exposed over the JMX management interface. If you simply want to look at it, JConsole or visualvm (part of the JDK, installed in JAVA_HOME/bin) can display nice graphs of a JVM's memory usage, optionally broken down into the various memory pools.
This interface can also be accessed programmatically; see MemoryMXBean.
MemoryMXBean bean = ManagementFactory.getMemoryMXBean();
bean.getHeapMemoryUsage().getUsed();
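Putting that together, a minimal sketch that reports heap-only usage as a percentage (it assumes the JMX heap figures are what you want to monitor; getMax() can be -1 if the maximum is undefined, hence the fallback):

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryUsage;

public class HeapPercent {
    public static void main(String[] args) {
        MemoryUsage heap = ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
        long max = heap.getMax() > 0 ? heap.getMax() : heap.getCommitted(); // fall back if max is undefined
        double percentUsed = 100.0 * heap.getUsed() / max;
        System.out.printf("Heap used: %.1f%% (%d of %d bytes)%n", percentUsed, heap.getUsed(), max);
    }
}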
There really is no good answer, since how much heap memory the JVM has free is not the same as how much heap memory the operating system has free, which are both not the same as how much heap memory can be assigned to your application.
This is because the JVM and OS heaps are different. When the JVM runs out of memory, it may run garbage-collection, defragment its own heap, or request more memory from the OS. Since unused non-garbage-collected objects still exist, but are technically "free", they make the concept of free memory a bit fuzzy.
Also, heap memory fragments; how/when/if memory is defragmented is up to the implementation of the JVM/OS. For example, the OS-heap may have 100MB of free memory, but due to fragmentation, the largest available contiguous space may be 2MB. Thus, if the JVM requests 3MB, it may get an out-of-memory error, even though 100MB are still available. It is not possible for the JVM to know ahead of time that the OS won't be able to allocate that 3MB.

Obtaining memory available to JVM at runtime

I'm trying to sort a bunch of data such that the size of the data input to the program can be larger than the memory available to the JVM, and handling that requires an external sort, which is much slower than Quicksort.
Is there any way of obtaining the memory available to the JVM at runtime, such that I could use in-place sorting as much as possible and only switch to Mergesort when the data input is too large?
Check out these methods on the java.lang.Runtime class:
freeMemory
totalMemory
maxMemory
Example
Runtime rt = Runtime.getRuntime();
System.err.println(String.format("Free: %d bytes, Total: %d bytes, Max: %d bytes",
rt.freeMemory(), rt.totalMemory(), rt.maxMemory()));
Also note that if the total amount of memory is being exhausted you can always start the JVM with a larger amount of heap allocated using the -Xmx JVM argument; e.g.
java -Xmx256M MyClass
You can use the Runtime class to get the amount of memory available.
Runtime r = Runtime.getRuntime();
System.out.println(r.totalMemory());
There are various other memory details you can get from the Runtime object - see Runtime class.
In theory yes, using Runtime.getRuntime().maxMemory().
In practice, there are some problem you need to address:
You need to figure out how many application objects are going to fit in a given number of bytes of memory. AFAIK, there is no simple / efficient way to do this within a running application.
You don't want to try to use all available heap space. If you push your percentage heap residency too high, you risk making the GC horribly inefficient.
The maxMemory() method only tells you how big the heap is in virtual memory. The physical size can also be a factor (especially if physical size << virtual size), and there's no portable way to figure that out.
If I was trying to implement this application, I think I'd probably just make the in-memory sort size a configuration parameter or command-line option.
Using the methods of Runtime, as the others suggested, is fine, as long as you take some things into consideration:
1) freeMemory() is a lower bound on the actual available memory, because memory that is unreferenced and ready for GC is considered as used. Running System.gc() before the call may return a more accurate result.
2) totalMemory() can change - it only indicates the current total heap size, and the heap can be expanded/shrunk by the JVM at runtime, depending on its usage. You can use maxMemory() to get the actual maximum.
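With those caveats in mind, here is a rough sketch of the kind of check the question is after (the class name, threshold and safety margin are purely illustrative, not a recommendation):

public class SortStrategy {
    // Estimate whether an input of the given size could plausibly be sorted in memory.
    static boolean fitsInMemory(long estimatedInputBytes) {
        Runtime rt = Runtime.getRuntime();
        // Room the heap can still grow into, plus memory that is currently free.
        long potentiallyAvailable = rt.maxMemory() - rt.totalMemory() + rt.freeMemory();
        // Keep a generous margin so the GC has space to work (see the caveats above).
        return estimatedInputBytes < potentiallyAvailable / 2;
    }

    public static void main(String[] args) {
        long inputBytes = 500L * 1024 * 1024; // pretend the input is about 500 MB
        System.out.println(fitsInMemory(inputBytes) ? "in-memory sort" : "external merge sort");
    }
}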
