wasting memory to speed up jvm - java

On Linux & Mac, is there a way to pre-cache the JVM - either in RAM, or a state of it, so that when I start a Java program, it starts as fast as C/C++ programs?
I am willing to waste memory to make this happen.

No. Unfortunately :(
On second thought: the reason Java programs start faster on Windows these days is that a background process (Java Quickstart) aggressively keeps the runtime library files in the memory cache, which apparently helps immensely. I do not know if this approach has been ported to Linux.

Would that not load the JVM binary and libs into memory so that they can be shared?
Yes, but only in the same JVM instance. So you have to load your application into this instance, as servlet containers do.
The whole bottleneck of the JVM invocation is class loading; that is the reason for the Java Quickstart that Thorbjørn mentioned.
So you can put the class libraries on faster media (a RAM disk), which will probably speed up your (first) startup. I once installed NetBeans + the JDK on a RAM disk and it started really fast, but once running it was just as fast as when loaded from disk.
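To get a feel for how much class loading happens before any application code runs, you can ask the JVM itself via its ClassLoadingMXBean. A minimal sketch using the standard java.lang.management API (the class name is invented for this illustration):

import java.lang.management.ManagementFactory;

public class StartupClasses {
    public static void main(String[] args) {
        // Number of classes this JVM has already loaded before main()
        // executed a single line of application logic; on a typical
        // HotSpot JVM this is already in the hundreds.
        long loaded = ManagementFactory.getClassLoadingMXBean()
                                       .getTotalLoadedClassCount();
        System.out.println("Classes loaded at startup: " + loaded);
    }
}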

Related

How is JVM instance created per application?

I understand that each Java process runs in its own JVM. For example, when I run jcmd on my machine, I see
21730 sun.tools.jcmd.JCmd
77558 /usr/local/opt/jenkins-lts/libexec/jenkins.war --httpListenAddress=127.0.0.1 --httpPort=8080
99974
99983 org.jetbrains.jps.cmdline.Launcher /Applications/IntelliJ IDEA.app/Contents/lib/asm-all-7.0.1.jar:/Applications/IntelliJ IDEA.app/Contents/lib/lz4-java-1.6.0.jar:/Applications/IntelliJ IDEA.app/Contents/plugins/java/lib/aether-connector-basic-1.1.0.jar:/Applications/IntelliJ IDEA.app/Contents/plugins/java/lib/plexus-utils-3.0.22.jar:/Applications/IntelliJ IDEA.app/Contents/plugins/java/lib/aether-api-1.1.0.jar:/Applications/IntelliJ IDEA.app/Contents/plugins/java/lib/javac2.jar:/Applications/IntelliJ IDEA.app/Contents/lib/util.jar:/Applications/IntelliJ IDEA.app/Contents/lib/platform-api.jar:/Applications/IntelliJ IDEA.app/Contents/lib/qdox-2.0-M10.jar:/Applications/IntelliJ IDEA.app/Contents/lib/jna.jar:/Applications/IntelliJ IDEA.app/Contents/lib/trove4j.jar:/Applications/IntelliJ IDEA.app/Contents/lib/nanoxml-2.2.3.jar:/Applications/IntelliJ IDEA.app/Contents/lib/jdom.jar:/Applications/IntelliJ IDEA.app/Contents/lib/netty-common-4.1.41.Final.jar:/Applications/IntelliJ IDEA.app/Contents/plugins/java/lib/aet
How is the JVM created per app? Like, what happens when I start Jenkins with java -jar jenkins.war? Does some process copy over JVM stuff from the JRE folder and initialize an instance of the JVM?
When you start a program like java, the operating system creates a "process". A process is the representation of a live, running program. The process concept is what allows you to run several copies of a program at the same time. Each process has its own private memory space and system resources like open files or network connections. Each process can load a different set of dynamically linked libraries. With Java, much of the JVM is implemented in shared libraries, which the launcher program java loads at run time.
The details are OS dependent and become complicated fast.
One of the things that happens when the process is started is that the executable file is mapped into memory. The CPU cannot execute instructions that are on disk or other external storage, so the program "text" has to be copied from disk into main memory first. Mapping the file into memory simplifies this and makes it more efficient: if the CPU needs to access a memory location that is not actually in RAM, the memory management unit (MMU) issues a "page fault". The page fault causes the data to be loaded into RAM. This is more efficient than simply copying the whole program text into RAM (not all of the text is needed all the time) and also simplifies the overall system (the virtual memory machinery is already needed for other OS features).
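You can see the same mechanism from Java code: java.nio lets a program map a file into memory, and the bytes are only paged in when touched. A minimal sketch (the file name is just a command-line argument; nothing is assumed about its contents):

import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

public class MapDemo {
    public static void main(String[] args) throws Exception {
        try (RandomAccessFile file = new RandomAccessFile(args[0], "r");
             FileChannel channel = file.getChannel()) {
            // map() returns immediately; no file data is read yet.
            long len = Math.min(channel.size(), 4096); // keep the demo small
            MappedByteBuffer buf =
                channel.map(FileChannel.MapMode.READ_ONLY, 0, len);
            // This first access may trigger a page fault, which makes the
            // OS load the containing page from disk into RAM.
            System.out.println("First byte: " + buf.get(0));
        }
    }
}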

Measure peak memory consumption (of a Java Application) at runtime?

I have to run a couple of Java services on my machine to obtain a certain dev environment (and get my non-Java-related work done):
java -Xmx400m -jar foo-app/target/foo-app-SNAPSHOT.jar
java -Xmx250m -jar bar-app/target/bar-app-SNAPSHOT.jar
...
To not run out of memory, I need to limit the memory usage. The default (512m, AFAIK) is too high for my machine, so I lowered the limits somewhat (on a wild-guess basis). Except for one, where I learned the hard way (crashes, even freezes, and thankfully some .pid error files left behind in the project folder...) that I had better settle a little higher:
java -Xmx800m -jar doo-app/target/doo-app-SNAPSHOT.jar
Question: is there a way to track the memory usage of a certain app over time?
By some java command-line parameter, or even with ps -ae, htop or similar? (That is, without fiddling in the source itself, swapping garbage collectors, etc.)
I see plenty of numbers, but I have no idea which belong to which running Java project, or what would give me a rough indication of the proper peak memory consumption (in a -Xmx___m sense).
I work under Ubuntu-MATE 16.04, x64.
The best way to analyze memory consumption is a profiler. The JDK ships with the jvisualvm profiler, which is absolutely sufficient for this task. A (lengthy) tutorial can be found here: https://engineering.talkdesk.com/ninjas-guide-to-getting-started-with-visualvm-f8bff061f7e7
Other approaches are basically shotgun-style: reduce the -Xmx, then generate load in the system and see if it runs out of memory. If you do not have a straight control flow, you have no way to predict the memory used.
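If a little fiddling in the source is acceptable after all, the JVM already tracks the peak usage of every memory pool, so a shutdown hook can print the peaks when a service exits. A minimal sketch using the standard java.lang.management API (PeakMemoryHook and install() are invented names for this illustration):

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryPoolMXBean;

public class PeakMemoryHook {
    // Call once at startup; prints each pool's peak usage at JVM exit.
    public static void install() {
        Runtime.getRuntime().addShutdownHook(new Thread(() -> {
            for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
                System.err.printf("%-30s peak used: %,d bytes%n",
                        pool.getName(), pool.getPeakUsage().getUsed());
            }
        }));
    }
}

Summing the peaks of the heap pools gives a rough lower bound for a sane -Xmx. Entirely from the outside, jstat -gc <pid> 1000 (shipped with the JDK) samples the heap usage of a running JVM once per second.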

Check Java JVM Memory Limit on Windows XP

I want to see how much memory is allocated to my Java JVM on my Windows XP installation. I'm running an executable jar whose main class makes some JNI calls to a C library via a DLL that is loaded using System.loadLibrary("SampleJni"). Some calls are working and some are not. Whenever more than one String parameter is passed, I get a system dump. If I just have one String, one int, two ints, etc., there are no crashes. The machine only has 0.99 GB of RAM, so I'm thinking the JVM can't allocate the needed memory.
Use jconsole to check the memory used by your program. jconsole comes with the JDK, so you already have it. This figure won't include memory used by your JNI C code, but it will tell you what memory Java is using. The more likely culprit is that the JNI mapping isn't correct when using multiple parameters.
I've run JVMs (Java 6) on machines with less memory than that. IIRC the default heap for the JVM on Windows was 64MB, but that may have changed. Even if it did, it should be enough to start up. You'd also see OutOfMemoryErrors if this were the case, rather than hard crashes.
There are various methods in java.lang.Runtime that will let you inspect how much memory you have; see the sketch below.
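For instance, a minimal self-contained sketch of those Runtime methods (the class name is invented):

public class HeapInfo {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long used = rt.totalMemory() - rt.freeMemory(); // heap currently in use
        long max  = rt.maxMemory();                     // the -Xmx ceiling
        System.out.printf("Heap used: %,d of max %,d bytes%n", used, max);
    }
}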
The likely cause is the JNI interface. It's very easy to crash the JVM if the JNI code isn't 100% correct.
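To illustrate the kind of mismatch that would fit the symptoms above (the class and library names come from the question; the method name process is invented):

public class SampleJni {
    static { System.loadLibrary("SampleJni"); }

    // With two String parameters, the generated C prototype must be:
    //   JNIEXPORT void JNICALL Java_SampleJni_process
    //       (JNIEnv *env, jobject obj, jstring a, jstring b);
    // If the C side was written for a single jstring, the second argument
    // is read from garbage memory and the VM can crash exactly as
    // described: multi-String calls fail while single-parameter calls work.
    public native void process(String a, String b);
}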

Grails application hogging too much memory

Tomcat 5.5.x and 6.0.x
Grails 1.6.x
Java 1.6.x
OS CentOS 5.x (64bit)
VPS server with 384M of memory
JAVA_OPTS: tried many combinations, including the following:
export JAVA_OPTS='-Xms128M -Xmx512M -XX:MaxPermSize=1024m'
export JAVA_OPTS='-server -Xms128M -Xmx128M -XX:MaxPermSize=256M'
(As advised by http://www.grails.org/Deployment)
I have created a blank Grails application, i.e. simply by running the command grails create-app, and then WARed it.
I am running Tomcat on a VPS Server
When I simply start the Tomcat server, with no apps deployed, the free memory is about 236M and the used memory is about 156M.
When I deploy my "blank" application, the memory consumption spikes to 360M, and finally the Tomcat instance is killed as soon as it takes up all the free memory.
As you have seen, my app is as light as it can be.
Not sure why the memory consumption is as high as it is.
I am actually troubleshooting a real application, but have narrowed down to this scenario which is easier to share and explain.
UPDATE
I tested the same "blank" application on my local Tomcat 5.5.x on Windows and it worked fine
The memory consumption of the Java process shot from 32M to 107M, but it did not crash and remained within acceptable limits.
So the hunt for an answer continues... I wonder if something is wrong with my Linux box. Not sure what, though...
UPDATE 2
Also see this http://www.grails.org/Grails+Test+On+Virtual+Server
It confirms my belief that my simple blank app should work on my configuration.
It is a false economy to try to run a long-running Java-based application in the minimum possible memory. The garbage collector, and hence the application, will run much more efficiently if it has plenty of regular heap memory. Give an application too little heap and it will spend too much time garbage collecting.
(This may seem a bit counter-intuitive, but trust me: the effect is predictable in theory and observable in practice.)
EDIT
In practical terms, I'd suggest the following approach:
Start by running Tomcat + Grails with as much memory as you can possibly give it so that you have something that runs. (Set the permgen size to the default ... unless you have clear evidence that Tomcat + Grails are exhausting permgen.)
Run the app for a bit to get it to a steady state and figure out what its average working set is. You should be able to figure that out with a memory profiler, or by examining the GC logging (see the example after these steps).
Then set the Java heap size to be (say) twice the measured working set size or more. (This is the point I was trying to make above.)
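For the GC logging mentioned above, HotSpot JVMs of that era (Java 6/7) accept flags along these lines; the heap sizes and the .war name are placeholders:

java -Xms128M -Xmx256M -verbose:gc -XX:+PrintGCDetails -Xloggc:gc.log -jar myapp.war

Each line of gc.log shows the heap occupancy before and after a collection; the post-GC occupancy in the steady state approximates the working set.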
Actually, there is another possible cause for your problems. Even though you are telling Java to use a heap of a given size, it may be that it is unable to do this. When the JVM requests memory from the OS, there are a couple of situations where the OS will refuse.
If the machine (real or virtual) that you are running the OS on does not have any more unallocated "real" memory, and the OS's swap space is fully allocated, it will have to refuse requests for more memory.
It is also possible (though unlikely) that per-process memory limits are in force. That would cause the OS to refuse requests beyond that limit.
Finally, note that Java uses more virtual memory than can be accounted for by simply adding the stack, heap and permgen numbers together. There is the memory used by the executable + DLLs, memory used for I/O buffers, and possibly other things.
384MB is pretty small. I'm running a small Grails app in a 512MB VPS at enjoyvps.net (not affiliated in any way, just a happy customer) and it's been running for months at just under 200MB. I'm running a 32-bit Linux and JDK, though; there is no sense wasting all that memory on 64-bit pointers if you don't have access to much memory anyway.
Can you try deploying a Tomcat monitoring webapp, e.g. psiprobe, and see where the memory is being used?

Tools to view/solve Windows XP memory fragmentation

We have a Java program that requires a large amount of heap space - we start it with (among other command-line arguments) the argument -Xmx1500m, which specifies a maximum heap of 1500 MB. When this program is started on a freshly rebooted Windows XP box, it will start and run without issues. But if the program has run several times, the computer has been up for a while, etc., when it tries to start I get this error:
Error occurred during initialization of VM
Could not reserve enough space for object heap
Could not create the Java virtual machine.
I suspect that Windows itself is suffering from memory fragmentation, but I don't know how to confirm this suspicion. At the time this happens, Task Manager and Sysinternals procexp report 2000MB of free memory. I have looked at this question related to internal fragmentation.
So the first question is: how do I confirm my suspicion?
The second question is: if my suspicions are correct, does anyone know of any tools to solve this problem? I've looked around quite a bit, but I haven't found anything that helps, other than periodic reboots of the machine.
PS: changing operating systems is also not currently a viable option.
Agree with Torlack: a lot of this is because other DLLs are getting loaded into certain spots, breaking up the contiguous memory you can get for the VM in one big chunk.
You can do some work on WinXP, if you have more than 3G of memory, to get some of the Windows stuff moved around; look up PAE here:
http://www.microsoft.com/whdc/system/platform/server/PAE/PAEdrv.mspx
Your best bet, if you really need more than 1.2G of memory for your Java app, is to look at 64-bit Windows, Linux or OS X. If you're using any kind of native libraries with your app, you'll have to recompile them for 64-bit, but it's going to be a lot easier than trying to rebase DLLs and such to maximize the memory you can get on 32-bit Windows.
Another option would be to split your program up into multiple VMs and have them communicate with each other via RMI or messaging or something; a sketch follows. That way each VM can hold some subset of the memory you need. Without knowing what your app does, I'm not sure this will help in any way, though...
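A minimal sketch of the RMI variant, assuming the memory-heavy state can be put behind a simple lookup interface (IndexService, IndexServer and the "index" registry name are all invented for this illustration):

import java.rmi.Remote;
import java.rmi.RemoteException;
import java.rmi.registry.LocateRegistry;
import java.rmi.registry.Registry;
import java.rmi.server.UnicastRemoteObject;

// One JVM holds the large in-memory state and serves it remotely.
interface IndexService extends Remote {
    String lookup(String key) throws RemoteException;
}

public class IndexServer implements IndexService {
    public String lookup(String key) {
        return "value-for-" + key; // stand-in for the real memory-heavy lookup
    }

    public static void main(String[] args) throws Exception {
        // Export the server object and register it under a well-known name.
        IndexService stub =
            (IndexService) UnicastRemoteObject.exportObject(new IndexServer(), 0);
        Registry registry = LocateRegistry.createRegistry(1099);
        registry.rebind("index", stub);
        System.out.println("Index server ready");
    }
}

A client JVM, started with its own (smaller) -Xmx, would then call LocateRegistry.getRegistry("localhost", 1099), look up "index", and invoke lookup() without ever holding the big data set itself.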
Unless you are running out of page file space, this issue isn't that the computer is running out of memory. The whole point of virtual memory is to allow the processes to use more virtual memory than is physically available.
Not knowing how the JVM handles the heap, it is a bit hard to say exactly what the problem is, but one of the common issues is that there isn't enough contiguous free address space available in your process to allow the heap to be extended. Why this would be a problem after the machine has been running a while is a bit confusing.
I've been working on a similar problem at work. I have found that running the program under WinDBG and using the "!address" and "!address -summary" commands has been invaluable in tracking down why a process's virtual address space has become fragmented. You can also try running the program after a reboot and using the "!address" command to take a picture of the address space, then do the same when the program no longer runs. This might clue you in on the problem. Something as simple as an extra DLL getting loaded might cause it.
I suspect that the problem is Windows memory fragmentation. There is another question here on Stack Overflow called Java Maximum Memory on Windows XP that mentions using Process Explorer to look at where DLLs are mapped into memory, and then addressing the problem by rebasing the DLLs so that they load into memory in a more compact way.
Using Minimem (http://minimem.kerkia.net/) for that application might fix your problem. However, I'm not sure this is the answer you are looking for. I hope it helps.
Maybe you should consider starting the program once, reserving the memory, and not ending the VM after each run. Look into different GC options and release your objects.
Use vmmap from Microsoft's Sysinternals tools to view the fragmentation of the virtual address space and identify what is breaking up the space.
