Check Java JVM Memory Limit on Windows XP

I want to see how much memory is allocated to the Java JVM on my Windows XP installation. I'm running an executable jar whose main class makes some JNI calls to a C library via a DLL loaded with System.loadLibrary("SampleJni"). Some calls work and some do not. Whenever more than one String parameter is passed I get a system dump. If I just pass one String, one int, two ints, etc., there are no crashes. The machine only has 0.99 GB of RAM, so I'm thinking the JVM can't allocate the needed memory.

Use JConsole to check the memory used by your program. JConsole comes with the JDK, so you already have it. This figure won't include memory used by your JNI C code, but it will tell you what memory Java is using. The more likely culprit, though, is that your JNI mapping isn't correct when using multiple parameters.

I've run JVMs (Java 6) on machines with less memory than that. IIRC the default heap for the JVM on Windows was 64 MB, but that may have changed. Even if it did, it should be enough to start up. You'd also see OutOfMemoryErrors if this were the case, rather than hard crashes.
There are various methods in java.lang.Runtime that will let you inspect how much memory you have.
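For example, a minimal sketch of printing those numbers from inside the program (the labels are only illustrative):
long max = Runtime.getRuntime().maxMemory();      // upper limit the JVM will try to use (-Xmx)
long total = Runtime.getRuntime().totalMemory();  // memory currently reserved by the JVM
long free = Runtime.getRuntime().freeMemory();    // unused portion of totalMemory()
System.out.println("max=" + max + " total=" + total + " used=" + (total - free));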
The likely cause is the JNI interface. It's very easy to crash the JVM if the JNI code isn't 100% correct.

Related

Measure peak memory consumption (of a Java Application) at runtime?

I have to run a couple of Java services on my machine to obtain a certain dev environment (and get my non-Java-related work done):
java -Xmx400m -jar foo-app/target/foo-app-SNAPSHOT.jar
java -Xmx250m -jar bar-app/target/bar-app-SNAPSHOT.jar
...
To not run out of memory, I need to limit the memory usage. The default (512m, AFAIK) is too high for my machine, so I lowered them somewhat (on a wild-guess basis). Except for one, where I learned the hard way (crashes, even freezes, and thankfully some .pid error files left behind in the project folder...) that I had better settle a little higher:
java -Xmx800m -jar doo-app/target/doo-app-SNAPSHOT.jar
Question: is there a way to track the memory usage of a certain app over time?
By some Java command-line parameter, or even with ps -ae, htop or similar? (Thus not fiddling in the source itself, remapping garbage collectors, etc.)
I see plenty of numbers, but figuring out which ones belong to which running Java project, and what would roughly indicate a proper peak memory consumption (in a -Xmx___m sense)... I have no idea.
I work under Ubuntu-MATE 16.04, x64.
The best way to analyze memory consumption is a profiler. Your JDK ships with the jvisualvm profiler, which is absolutely sufficient for this task. A (lengthy) tutorial can be found here: https://engineering.talkdesk.com/ninjas-guide-to-getting-started-with-visualvm-f8bff061f7e7
Other approaches are basically shotgun-style: reduce the -Xmx, then generate load in the system and see if it runs OOM. If you do NOT have a straight control flow, you have no way to predict the used memory.
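If adding a few lines to the app itself is acceptable, the java.lang.management API can also report the peak usage of each memory pool since JVM start, which gives a rough basis for choosing an -Xmx value. A minimal sketch (pool names vary by JVM and garbage collector):
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryPoolMXBean;

public class PeakUsage {
    public static void main(String[] args) {
        // ... exercise the application here, then report the high-water marks ...
        for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
            if (pool.isValid()) {
                // getPeakUsage() is the highest recorded usage of this pool since startup
                System.out.println(pool.getName() + " peak used: "
                        + pool.getPeakUsage().getUsed() + " bytes");
            }
        }
    }
}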

Getting contents of main memory in java

It's not problem-specific, but is it possible to get a copy of the current memory state, as in just get whatever is there in the main memory? I mean, is there any way we can get an image of the RAM in Java?
I am editing my question. So here is a screenshot of my Windows 7 Task Manager.
@Peter I see that current memory usage is 3.27 GB. So, can I get that whole thing into some read-only memory so that when I restart my OS it resumes where I left off, as in whatever my last memory snapshot was?
Yes, it's called a heap dump.
jmap -dump:format=b,file=heap.hprof {pid}
dumps the heap to a file.
You can use jvisualvm to analyse the heap dump.
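As an aside, on HotSpot JVMs (Java 7+) a heap dump can also be triggered from inside the process via the com.sun.management API, which is sometimes handier than running jmap externally. A minimal sketch (the output file name is arbitrary, and the file must not already exist):
import com.sun.management.HotSpotDiagnosticMXBean;
import java.lang.management.ManagementFactory;

public class DumpHeap {
    public static void main(String[] args) throws Exception {
        HotSpotDiagnosticMXBean diag =
                ManagementFactory.getPlatformMXBean(HotSpotDiagnosticMXBean.class);
        // "true" means dump only reachable (live) objects
        diag.dumpHeap("heap.hprof", true);
    }
}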
It depends on what you mean by "main memory".
The JRE is designed:
- to insulate your Java program from the OS around it, and vice versa;
- to insulate Java objects from the implementation details of other objects.
With "ordinary" Java, you only get to see what classes and objects expose through public methods.
However, Java has its Reflection API, and with that, if your JRE is configured to allow it, you can break through these boundaries, and look deeper into the classes and objects within the JRE.
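For instance, a minimal sketch of reading a private field via reflection, assuming the runtime's access checks allow setAccessible (the class and field here are made up for illustration):
import java.lang.reflect.Field;

public class ReflectPeek {
    static class Secret {
        private int hidden = 42;   // not exposed through any public method
    }

    public static void main(String[] args) throws Exception {
        Secret s = new Secret();
        Field f = Secret.class.getDeclaredField("hidden");
        f.setAccessible(true);               // bypass the normal access check
        System.out.println(f.getInt(s));     // prints 42
    }
}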
In Oracle HotSpot, you can start the JRE with the Java Serviceability Agent - http://openjdk.java.net/groups/hotspot/docs/Serviceability.html - which gives you access through an API to much more detail of the Java heap. But you're still restricted to memory claimed by the JRE process and allocated to its heap.
One further possibility is to write a native library using JNI. There are C API calls that allow you to browse the OS address space, though you need to be root (or the equivalent on your OS) to see other processes' address space. You could write that C code and use JNI to call it from Java.

How to Use posix_spawn() in Java

I've inherited a legacy application that uses ProcessBuilder.start() to execute a script on a Solaris 10 server.
Unfortunately, this script call fails due to a memory issue, as documented here
Oracle's recommendation is to use posix_spawn() since, under the covers, ProcessBuilder.start() is using fork/exec.
I have been unable to find any examples of using posix_spawn() in Java (e.g., how to call "myScript.sh"), or even which packages are required.
Could you please point me to a simple example of how to use posix_spawn() in Java?
Recent versions of Java 7 and 8 support posix_spawn internally.
Enable it with the command-line option
-Djdk.lang.Process.launchMechanism=POSIX_SPAWN
or at runtime with
System.setProperty("jdk.lang.Process.launchMechanism", "POSIX_SPAWN");
I'm a little confused as to which Java version/OS combinations have this enabled by default, but I'm sure you could test and find out pretty quickly whether setting this option makes a difference.
For reference, to go back to the old fork method simply use
-Djdk.lang.Process.launchMechanism=fork
To prove whether this option is respected in your JVM version use
-Djdk.lang.Process.launchMechanism=dummy
and you will get an error next time you exec. This way you know the JVM is receiving this option.
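A quick sanity check, for example, is to run something trivial through ProcessBuilder and see whether it fails (assuming the dummy setting above is in effect):
public class LaunchMechanismCheck {
    public static void main(String[] args) throws Exception {
        // With -Djdk.lang.Process.launchMechanism=dummy this start() call should fail,
        // confirming the JVM actually read the option; with POSIX_SPAWN it should succeed.
        Process p = new ProcessBuilder("echo", "hello").inheritIO().start();
        System.out.println("exit code: " + p.waitFor());
    }
}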
An alternative, which does not require JNI, is to create a separate "process spawner" application. I would probably have this application expose an RMI interface, and create a wrapper object that is a drop-in replacement for ProcessBuilder.
You might also want to consider having this "spawner" application be the thing that starts your legacy application.
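A rough sketch of what that remote interface might look like (the names here are hypothetical; the implementation behind it would call ProcessBuilder inside the small spawner process):
import java.rmi.Remote;
import java.rmi.RemoteException;
import java.util.List;

// Hypothetical contract for the separate "spawner" process; the client-side wrapper
// would forward its start() calls here instead of forking from the large JVM.
public interface RemoteSpawner extends Remote {
    // launches the command in the spawner process and returns its exit code
    int run(List<String> command) throws RemoteException;
}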
You will need to familiarize yourself with JNI first. Learn how to call out into a native routine from Java code. Once you do, you can look at this example and see if it helps with your issue. Of particular interest to you is:
/* spawnedArgs is a NULL-terminated argv-style array, e.g. {"/bin/sh", "myScript.sh", NULL} */
if ((RC = posix_spawn(&pid, spawnedArgs[0], NULL, NULL, spawnedArgs, NULL)) != 0) {
    printf("Error while executing posix_spawn(), return code from posix_spawn() = %d\n", RC);
}
A much simpler solution would be to keep your code unchanged and simply add more virtual memory to your server.
i.e.:
mkfile 2g /somewhere/swap-1
swap -a /somewhere/swap-1
Edit: To clarify, as the link in the question is now broken:
the question is about a system running out of virtual memory because the JVM is being forked. E.g., assuming the JVM uses 2 GB of virtual memory, an extra 2 GB of virtual memory must be reserved for the fork to succeed on Solaris. There is no paging involved here, just memory reservation. Unlike the Linux kernel, which by default overcommits memory, Solaris makes sure allocated memory is backed by either RAM or swap. As there is not enough swap available, the fork fails. Enlarging the swap allows the fork to succeed without any performance impact. Just after the fork, the exec "unreserves" this 2 GB and reverts to a situation identical to the posix_spawn one.
See also this page for an explanation about memory allocation under Solaris and other OSes.

OutOfMemoryError when calling the main method of a jar

I have a Java app that imports another jar as a library and calls its main method as shown below. But someApp is a very large process and constantly throws an OutOfMemoryError. No matter what I set my Java app's heap size to, someApp does not seem to share the allocated memory.
try {
    someApp.main(args);
} catch (Exception ex) {
    // note: OutOfMemoryError is an Error, not an Exception, so this catch will not see it
}
How do I get someApp to allocate more heap space? Can I use ProcessBuilder? What do I do?
Thanks.
As it stands at the moment, you're merely calling a class from another application within your own Java process. This is exactly the same as what you'd do to call a "library method" (there is no technical difference; you're simply invoking a method on an object of a class that can be resolved by your classloader).
So right now, someApp is running in the same JVM as your own application and will share its maximum heap size. This can be increased with the JVM argument -Xmx (e.g. -Xmx2048m for a 2 GB max heap), though it sounds like you're doing this already without success.
It would be possible to launch someApp in a separate Java process, which would allow you to configure separate JVM arguments and thus give it a separate heap size.
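A minimal sketch of launching it that way (the jar name and heap size are placeholders):
import java.io.IOException;

public class SpawnSomeApp {
    public static void main(String[] args) throws IOException, InterruptedException {
        // launches someApp in its own JVM with its own 2 GB heap
        Process p = new ProcessBuilder("java", "-Xmx2048m", "-jar", "someApp.jar")
                .inheritIO()
                .start();
        System.out.println("someApp exited with " + p.waitFor());
    }
}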
However, I don't think this is going to help much. If you're unable to get this application to run in the same JVM, regardless of your heap limit, there's nothing to suggest it would work in a different JVM. For example, if you're running with a 2.5 GB heap and still running out of memory, running your own app with a 0.5 GB heap and spawning a separate JVM with a 2 GB heap will not solve the problem, as something is still running out of memory. (In fact, separate memory pools make an OOME slightly more likely, since there are two distinct chunks of free space, whereas in the former case both applications can benefit from the same pool of free space.)
I suggest you verify that your heap sizes really are being picked up (connecting via JMX using JConsole or JVisualVM will quickly show you how big the max heap size is). If you're really still running out of memory with large heaps, it sounds like someApp has a memory leak (or a requirement for an even larger heap). Capturing a heap dump in that case, with the JVM argument -XX:+HeapDumpOnOutOfMemoryError, will let you inspect the heap with an external tool and determine what's filling the memory.
Hopefully you've simply failed to increase the heap size correctly, because if the application really is failing with a large heap, there are no simple solutions.
Unless someApp itself is building a new process, it will already be running in the same process as your calling code, so it should be affected by whatever heap configuration you've set when starting up the JVM.
Have you kept track of how much memory the process is actually taking?
This doesn't make sense, unless you're running into OS limitations on how much memory you can allocate to a single Java process on your OS (see Java maximum memory on Windows XP)
Short of that, the way you're invoking someApp, it acts as a regular library. The main method acts like any other method.
Have you tried debugging the OutOfMemoryError? There may be something obscure that the app doesn't like about being invoked from your application...
If the jar you are importing is authored by you and could be more efficient, then modify it. It sounds like the problem you are having is loading in a shoddy package. If this is a 3rd-party package and you are allowed to modify it, poke around in the code, find where there might be limitations, change it, and rebuild it.

Memory footprint issues with JAVA, JNI, and C application

I have a piece of an application that is written in C; it spawns a JVM and uses JNI to interact with a Java application. My memory footprint via Process Explorer gets up to 1 GB and then it runs out of memory. As far as I know it should be able to get up to 2 GB. One thing I believe is that the memory the JVM is using isn't visible in Process Explorer. My -Xmx is set to 256 MB; I added some statements to watch the Java-side memory, and it is peaking at 256 MB, GC is doing its job, and it is all good on that side. So my question is: where is the other 700+ MB being consumed? Anyone out there a Java/JNI/C memory expert?
There could be a leak in the JNI code.
Remember to use (*jni)->DeleteLocalRef() for any object references you get once you are done with them. If you use any native C buffers to create new Java objects, make sure you free them off once the object is created. Check the JNI Specification for further guidelines.
Depending on the VM you are using you might be able to turn on JNI checking. For example, on the IBM JDK you can specify "-Xcheck:jni".
Try a test app in C that doesn't spawn the JVM but instead tries to allocate more and more memory. See whether the test app can reach the 2 GB barrier.
The C and JNI code can allocate memory as well (malloc/free/new/etc.), and that is outside of the VM's 256 MB. -Xmx only restricts what the VM will allocate for its own heap. Depending on what you're allocating in the C code and what else is loaded in memory, you may or may not be able to get up to 2 GB.
If you say that it's the Windows process that runs out of memory, as opposed to the JVM, then my initial guess is that you probably invoke some (of your own) native methods from the JVM and those native methods leak memory. So, I concur with @John Gardner here.
Well, thanks to all of your help, especially @alexander, I have discovered that all the extra memory that isn't visible via Process Explorer is being used by the Java heap. In fact, via other tests that I have run, the JVM's memory consumption is included in what I see from Process Explorer. So the heap is taking large amounts of memory; I will have to do some more research about that and maybe ask a separate question.
Write a C test harness and use valgrind/alleyoop to check for leakage in your C code, and similarly use the java jvisualvm tool.
