Estimating maximum safe JVM heap size in 64-bit Java

In the course of profiling a 64-bit Java app that's having some issues, I notice that the profiler itself (YourKit) is using truly colossal amounts of memory. What I've got in the YourKit launch script is:
JAVA_HEAP_LIMIT="-Xmx3072m -XX:PermSize=256m -XX:MaxPermSize=768m"
Naively, assuming some overhead, this would lead me to guess that YourKit is going to use a max of something maybe a bit over four GB. However, what I actually see in ps is:
USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND
dmoles 31379 4.4 68.2 14440032 8321396 ? Sl 11:47 10:42 java -Xmx3072m -XX:PermSize=256m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -Dyjp.probe.table.length.limit=20000 -Xbootclasspath/a:/home/dmoles/Applications/yjp-9.5.6/bin/../lib/tools.jar -jar /home/dmoles/Applications/yjp-9.5.6/bin/../lib/yjp.jar
That's a virtual size of nearly 14 GB and a resident size of nearly 8 GB -- nearly 3x the Java heap.
Now, I've got enough memory on my dev box to run this, but going back to the original memory problem I'm trying to diagnose: How do I know how much Java heap I have to play with?
Clearly, if the customer has, say, 16 GB physical RAM, it's not a great idea for me to tell them to set -Xmx to 16 GB.
So what is a reasonable number? 12 GB? 8 GB?
And how do I estimate it?

Clearly, if the customer has, say, 16 GB physical RAM, it's not a great idea for me to tell them to set -Xmx to 16 GB.
If the customer is running nothing else significant on the machine, then setting the heap size to 16 GB isn't necessarily a bad idea. It depends on what the application is doing.
So what is a reasonable number? 12 GB? 8 GB?
The ideal number would be to have "JVM max heap + JVM non-heap overheads + OS + other active applications' working sets + buffer cache working set" add up to the amount of physical memory. But the problem is that none of those components (apart from the max heap size) can be pinned down without detailed measurements on the customer's machine, taken while the application is running on the actual problem workload.
And how do I estimate it?
The bottom line is that you can't. The best you can do is to guess ... and be conservative.
An alternative approach is to estimate how much heap the application actually needs for the problem it is trying to solve. Then add an extra 50 or 100 percent to give the GC room to work efficiently. (And then tune ...)
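As a rough sketch of that sizing rule (the live-set figure here is a made-up placeholder; substitute a real measurement from your own profiling):

```shell
# Measured live set of the application, in MB (hypothetical figure --
# get the real number from a profiler or from jstat after a full GC).
LIVE_MB=2048

# Add 100% headroom so the GC has room to work efficiently.
XMX_MB=$((LIVE_MB * 2))

echo "-Xmx${XMX_MB}m"
```

With a 2048 MB live set this prints `-Xmx4096m`; using 50% headroom instead would give `-Xmx3072m`. Then tune from there.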

Related

GWT Compiler: Error: Could not create the Java Virtual Machine

I hope someone can help me with this problem.
I was running the application on 64-bit Windows and it was working fine, but I changed computers and now it's running on 32-bit.
Could this error be caused by that, or is it some configuration issue?
EDIT: Could not reserve enough space for object heap
Thanks
It is possible that IntelliJ's available memory space is not large enough to allocate 1024 MB to your process. Try reducing the JVM heap size for your process, or increasing the IntelliJ JVM memory size.
EDIT:
I just learned that the process will not run inside the memory space of the IntelliJ JVM, but in its own. Also, a quick web search led me to understand that the maximum heap that can be allocated on a 32-bit machine is limited to around 4 GB in theory, subject to the availability of enough contiguous memory; in practice it generally gets limited to around 1.2 GB. That might be the problem in your case.
It could be that your -Xmx setting, which worked on 64-bit, is larger than the one supported on 32-bit. You could provide more details about the problem too.

Application memory issue with mac

I'm facing a problem with a Java application I built in JavaFX. On Windows it consumes only 2-3% CPU and around 50 to 80 MB of memory. But on Mac the same application initially starts with 50 MB of memory, continuously increases to 1 GB, and uses over 90% CPU. I found this by checking the Mac task manager. When I use a Java profiler to look for memory leaks, the profiler shows memory usage the same as on Windows (not more than 100 MB).
I am confused with this behaviour in Mac.
Has anyone encountered this problem before, or am I doing something wrong with my application?
Lots of things are possible, but I suspect this: depending on the memory size and CPU count, the JVM may run in server mode, which causes memory management to be different. Use the -server option to force server mode on both machines and compare again.
You can also take heap dumps (jmap -dump) to see what is taking up so much memory, and thread dumps (kill -3) to see what is taking up so much CPU.
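A sketch of those commands (the PID is a placeholder; substitute whatever `ps` reports for the Java process):

```shell
PID=12345   # hypothetical; substitute the real Java process id

# Heap dump of live objects only, for loading into a profiler or Eclipse MAT:
jmap -dump:live,format=b,file=heap.hprof $PID

# Thread dump: the stack traces go to the JVM's stdout/stderr...
kill -3 $PID

# ...or capture the same information directly to a file:
jstack $PID > threads.txt
```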

Java high memory usage

I have a problem with a Java app. Yesterday, when I deployed it for a test run, we noticed that our machine started swapping, even though this is not a real monster app, if you know what I mean.
Anyway, I checked the results of top and saw that it eats around 100 MB of memory (RES in top). I tried to profile memory and check if there is a memory leak, but I couldn't find one. There was an unclosed PreparedStatement, which I fixed, but it didn't change much.
I tried setting the min and max heap size (some said that min heap size is not required), and it didn't make any difference.
This is how I run it now:
#!/bin/sh
$JAVA_HOME/bin/java -Dcom.sun.management.jmxremote -Dcom.sun.management.jmxremote.port=9025 -Dcom.sun.management.jmxremote.ssl=false -Dcom.sun.management.jmxremote.authenticate=false -XX:MaxPermSize=40m -Xmx32M -cp ./jarName.jar uk.co.app.App app.properties
Here is the result of top:
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
16703 root 20 0 316m 109m 6048 S 0.0 20.8 0:14.80 java
The thing I don't understand is that I configured max PermSize and max heap size, which add up to 72 MB. That is enough; the app runs well. So why is it still eating 109 MB of memory, and what is eating it up? It is a 37 MB difference, which is quite a high ratio (34%).
I don't think this is a memory leak, because max heap size is set and there is no OutOfMemoryError or anything.
One interesting thing may be that I made a heap dump with VisualVM and then checked it with Eclipse MAT, and it said that there is a possible leak in a classloader.
This is what it says:
The classloader/component "sun.misc.Launcher$AppClassLoader # 0x87efa40" occupies 9,807,664 (64.90%) bytes. The memory is accumulated in one instance of "short[][]" loaded by "".
I cannot make much of this, but may be useful.
Thanks for your help in advance.
EDIT
I found this one; maybe there is nothing I can do...
Tomcat memory consumption is more than heap + permgen space
Java's memory includes:
the heap space
the perm gen space
thread stack areas
shared libraries, including that of the JVM (will be shared)
the direct memory size
the memory mapped file size (will be shared)
There are likely to be others which are for internal use.
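A back-of-the-envelope sum over those components shows how quickly they pass the heap + permgen figure. The heap and permgen numbers below come from the question's launch flags; the code cache, thread count, and stack size are assumptions for illustration only:

```shell
HEAP_MB=32        # -Xmx32M (from the launch script)
PERMGEN_MB=40     # -XX:MaxPermSize=40m (from the launch script)
CODE_CACHE_MB=32  # assumed JIT code cache size
THREADS=20        # assumed thread count
STACK_KB=512      # assumed per-thread stack size (-Xss)

STACKS_MB=$((THREADS * STACK_KB / 1024))
TOTAL_MB=$((HEAP_MB + PERMGEN_MB + CODE_CACHE_MB + STACKS_MB))

echo "${TOTAL_MB} MB before shared libraries and direct buffers"
```

With these assumed numbers the total already reaches 114 MB, past the observed 109 MB RES, before counting shared libraries at all.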
Given that 37 MB of PC memory is worth about 20 cents, I wouldn't worry about it too much. ;)
Did you try using JConsole to profile the application? http://docs.oracle.com/javase/1.5.0/docs/guide/management/jconsole.html
Otherwise you can also use the JProfiler trial version to profile the application:
http://www.ej-technologies.com/products/jprofiler/overview.html
However, the first step in checking high memory usage should be to check whether you are using collections of objects in your application, like arrays, maps, sets, lists, etc. If so, check whether they keep references to objects (even ones no longer used).

How much -XX:MaxPermSize can I specify for 4 GB and 8 GB RAM, and what is the calculation for this?

How much -XX:MaxPermSize can I specify for 4 GB and 8 GB RAM? Here are the other details of my system:
OS: Windows XP (32-bit)
RAM: 4 GB
java_opt: -Xms1536m -Xmx1536m (set as an environment variable)
Tomcat version: 6.0.26
I have another system with 8 GB RAM, with the other details exactly the same, except the OS is 64-bit Windows 7.
Along with this, also let me know what the maximum value for the -Xmx parameter can be on both systems.
It would be great if somebody could tell me the calculation used to arrive at the figure, so that we don't have to memorize it but can logically calculate it based on the RAM the system has.
I have seen people getting permgen or heap errors, and everybody keeps playing with these parameters until they arrive at a figure that resolves the issue.
According to IBM's documentation, an application should run with heap usage between a minimum of 40% and a maximum of 70% of the maximum heap size.
Refer to this information
Why can't I get a larger heap with the 32-bit JVM?
Max amount of memory per java process in windows?
Sizing the Java heap
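The 40-70% guideline above can be turned into a range for -Xmx once you know the application's live data size. The 1024 MB figure here is a made-up example, not a measurement:

```shell
LIVE_MB=1024   # hypothetical steady-state live data size

# The heap should never be more than 70% full -> lower bound for -Xmx:
MIN_XMX=$((LIVE_MB * 100 / 70))
# The heap should never be less than 40% full -> upper bound for -Xmx:
MAX_XMX=$((LIVE_MB * 100 / 40))

echo "set -Xmx between ${MIN_XMX} MB and ${MAX_XMX} MB"
```

For a 1024 MB live set this gives a range of roughly 1462 to 2560 MB, which is where the tuning-by-trial that people do usually ends up anyway.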

32-bit Java on 64-bit OS: is there a limit to number of JVMs?

I have a Solaris sparc (64-bit) server, which has 16 GB of memory. There are a lot of small Java processes running on it, but today I got the "Could not reserve enough space for object heap" error when trying to launch a new one. I was surprised, since there was still more than 4GB free on the server. The new process was able to successfully launch after some of the other processes were shut down; the system had definitely hit a ceiling of some kind.
After searching the web for an explanation, I began to wonder if it was somehow related to the fact that I'm using the 32-bit JVM (none of the java processes on this server require very much memory).
I believe the default max memory pool is 64MB, and I was running close to 64 of these processes. So that would be 4GB all told ... right at the 32-bit limit. But I don't understand why or how any of these processes would be affected by the others. If I'm right, then in order to run more of these processes I'll either have to tune the max heap to be lower than the default, or else switch to using the 64-bit JVM (which may mean raising the max heap to be higher than the default for these processes). I'm not opposed to either of these, but I don't want to waste time and it's still a shot in the dark right now.
Can anyone explain why it might work this way? Or am I completely mistaken?
If I am right about the explanation, then there is probably documentation on this: I'd very much like to find it. (I'm running Sun's JDK 6 update 17 if that matters.)
Edit: I was completely mistaken. The answers below confirmed my gut instinct that there's no reason why I shouldn't be able to run as many JVMs as I can hold. A little while later I got an error on the same server trying to run a non-java process: "fork: not enough space". So there's some other limit I'm encountering that is not java-specific. I'll have to figure out what it is (no, it's not swap space). Over to serverfault I go, most likely.
I believe the default max memory pool is 64MB, and I was running close to 64 of these processes. So that would be 4GB all told ... right at the 32-bit limit.
No. The 32-bit limit is per process (at least on a 64-bit OS). But the default maximum heap is not fixed at 64MB:
initial heap size: Larger of 1/64th of the machine's physical memory on the machine or some reasonable minimum.
maximum heap size: Smaller of 1/4th of the physical memory or 1GB.
Note: The boundaries and fractions given for the heap size are correct for J2SE 5.0. They are likely to be different in subsequent releases as computers get more powerful.
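Plugging the 16 GB server from the question into those J2SE 5.0 defaults:

```shell
PHYS_MB=16384   # 16 GB of physical memory

INIT_MB=$((PHYS_MB / 64))               # initial heap: 1/64th of physical memory
MAX_MB=$((PHYS_MB / 4))                 # maximum heap: 1/4th of physical memory...
[ "$MAX_MB" -gt 1024 ] && MAX_MB=1024   # ...capped at 1GB

echo "default initial heap ${INIT_MB} MB, default max heap ${MAX_MB} MB"
```

So on that machine each JVM defaults to a 256 MB initial heap and a 1 GB maximum heap, not 64 MB, which is why the per-process figures in the question don't add up.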
I suspect the memory is fragmented. Check also Tools to view/solve Windows XP memory fragmentation for a confirmation that memory fragmentation can cause such errors.