I have a Java app that imports another jar as a library and calls its main method as shown below. But someApp is a very large process and constantly throws an OutOfMemoryError. No matter what I set my Java app's heap size to, someApp does not seem to share the allocated memory.
try {
    someApp.main(args);
} catch (Exception ex) {
    ex.printStackTrace(); // at minimum, don't swallow the failure silently
}
How do I get someApp to allocate more heap space? Can I use ProcessBuilder? What do I do?
Thanks.
As it stands at the moment, you're merely calling a class from another application within your own Java process. This is exactly the same as you'd do for calling a "library method" (the term has no technical difference, you're simply invoking a method on an object of a class that can be resolved by your classloader).
So right now, someApp is running in the same JVM as your own application, and will share its maximum heap size. This can be increased with the JVM argument -Xmx (e.g. -Xmx2048m for a 2GB max heap), though it sounds like you're doing this already without success.
It would be possible to launch someApp in a separate Java process, which would allow you to configure separate JVM arguments and thus give it a separate heap size.
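If you do go that route, ProcessBuilder (which you asked about) is the usual tool. A minimal sketch, where the jar path, main class, and heap size are placeholder assumptions rather than details from your setup:

import java.io.IOException;

public class SomeAppLauncher {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Launch someApp in its own JVM with its own -Xmx.
        // "someApp.jar" and "com.example.SomeApp" are made-up names.
        ProcessBuilder pb = new ProcessBuilder(
                "java", "-Xmx2048m", "-cp", "someApp.jar", "com.example.SomeApp");
        pb.inheritIO();                   // forward the child's stdout/stderr to ours
        Process process = pb.start();
        int exitCode = process.waitFor(); // block until someApp exits
        System.out.println("someApp exited with code " + exitCode);
    }
}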
However, I don't think this is going to help much. If you're unable to get this application to run in the same JVM, regardless of your heap limit, there's nothing that would suggest it would work in a different JVM. For example, if you're running with a 2.5GB heap and still running out of memory, running your own app with a 0.5GB heap and spawning a separate JVM with a 2GB heap will not solve the problem, as something is still running out of memory. (In fact, separate memory pools make an OOME slightly more likely, since there are two distinct chunks of free space, whereas in the former case both applications can benefit from the same pool of free space.)
I suggest you verify that your heap sizes really are being picked up (connecting via JMX using JConsole or JVisualVM will quickly let you see how big the max heap size is). If you're really still running out of memory with large heaps, it sounds like someApp has a memory leak (or a requirement for an even larger heap). Capturing a heap dump in this case, with the JVM argument -XX:+HeapDumpOnOutOfMemoryError, will allow you to inspect the heap with an external tool and determine what's filling the memory.
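For example, a launch line combining an explicit heap size with a dump on failure might look like this (the jar name and dump path are placeholders):

java -Xmx2048m -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/app.hprof -jar yourApp.jar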
Hopefully you've simply failed to increase the heap size correctly; if the application really is failing with a large heap, there are no simple solutions.
Unless someApp itself is building a new process, this will already be in the same process as your calling code, so it should be affected by whatever heap configuration you've set when starting up the JVM.
Have you kept track of how much memory the process is actually taking?
This doesn't make sense, unless you're running into OS limitations on how much memory you can allocate to a single Java process on your OS (see Java maximum memory on Windows XP)
Short of that, the way you're invoking someApp, it acts as a regular library. The main method acts like any other method.
Have you tried debugging the OutOfMemoryError? There may be something obscure that the app doesn't like about being invoked from your application...
If the jar you are importing is authored by you and could be more efficient, then modify it. It sounds like the problem you are having is caused by loading a shoddy package. If this is a 3rd-party package and you are allowed to modify it, poke around in the code, find where there might be limitations, change it, and rebuild it.
I have a 1.5 GB XML file, and I use a DOM Java parser (after running into this problem I know DOM is not a good tool for big data, but I am still curious about the problem below). My issue is "OutOfMemoryError: Java heap space". Based on the existing answers, I changed the eclipse.ini -Xms and -Xmx sizes both to 8096m, and I show the heap status in the Eclipse window to monitor how much heap space has been used. But when I run this code, it only uses "80m/8096m" and then throws the out-of-memory error. I wonder why, when there is clearly huge space unused, i.e. "8096m - 80m", I still run out of memory.
I wonder why, when there is clearly huge space unused, i.e. "8096m - 80m", I still run out of memory.
You are misinterpreting something.¹ And (IMO) the most likely thing is that you are confusing the memory used by your IDE (Eclipse) with the memory used to run your application.
Unless you are doing something strange (and inadvisable), your application runs in a separate JVM whose heap parameters are independent of the IDE's heap parameters. Unsurprisingly (to me), the actual memory usage of your Eclipse IDE process is small ... because that's not where your application was running.
Tweaking the heap parameters for Eclipse makes no difference to the application JVM's heap parameters! What you actually need to do is set the heap parameters in the Eclipse launch configuration for your application, within Eclipse. Alternatively, launch the application by hand from the command prompt.
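For example, a hand launch might look like this (the classpath and main class are placeholders for whatever your project uses):

java -Xmx8096m -cp bin MyXmlParser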
But in this case, I doubt that it will work. I would be very surprised if you could represent a 1.5 GB XML file as a DOM in an 8 GB Java heap. The overheads of that form of representation are too large.
¹ If the Eclipse IDE itself were running out of memory, it would get horribly slow, and then crash. Been there. Seen that. It typically happens if you are developing or examining a very large code-base.
My application currently consumes quite a lot of memory because it is running physics simulations. The issue is that, consistently, at the 51st simulation Java will throw an error, usually a heap-space OutOfMemoryError (my program eventually runs thousands of simulations).
Is there any way I can not just increase the heap space, but also modify my program so that the heap space is cleared after every run, so that I can run an arbitrary number of simulations?
Edit: Thanks guys. Turns out the simulator software wasn't clearing the information after every run and I had those runs all stored in an ArrayList.
There is no way to increase the maximum heap size programmatically, since it is fixed when the Java Virtual Machine is started.
However, you can use this command
java -Xmx1024M YourClass
to set the maximum heap size to 1024 MB,
or you can set both a minimum and a maximum:
java -Xms256m -Xmx1024m YourClassNameHere
If you are using a lot of memory and facing memory leaks, then you might want to check if you are using a large number of ArrayLists or HashMaps with many elements each.
An ArrayList is implemented as a dynamic array. The source code from Sun/Oracle shows that when a new element is inserted into a full ArrayList, a new array of 1.5 times the size of the original is created and the elements copied over. What this means is that you could be wasting up to 50% of the space in each ArrayList you use, unless you call its trimToSize method. Or better still, if you know the number of elements you are going to insert beforehand, call the constructor with the initial capacity as its argument.
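A small sketch of both options (the element count here is arbitrary):

import java.util.ArrayList;

public class CapacityDemo {
    public static void main(String[] args) {
        // Known size up front: no intermediate growth, no wasted slack.
        ArrayList<Integer> sized = new ArrayList<Integer>(1000000);
        for (int i = 0; i < 1000000; i++) {
            sized.add(i);
        }

        // Unknown size: let it grow, then trim the backing array afterwards.
        ArrayList<Integer> grown = new ArrayList<Integer>();
        for (int i = 0; i < 1000000; i++) {
            grown.add(i);
        }
        grown.trimToSize(); // shrinks the internal array to the element count
    }
}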
I did not examine the source code for HashMap very carefully, but at first glance it appears that the array length in each HashMap must be a power of two, making it another implementation of a dynamic array. Note that HashSet is essentially a wrapper around HashMap.
There are a variety of tools that you can use to help diagnose this problem. The JDK includes JVisualVM, which will allow you to attach to your running process and show what objects might be growing out of control. Netbeans has a wrapper around it that works fairly well. Eclipse has the Eclipse Memory Analyzer, which is the one I use most often; it just seems to handle large dump files a bit better. There's also a command-line option, -XX:+HeapDumpOnOutOfMemoryError, that will give you a file that is basically a snapshot of your process memory when your program crashed. You can use any of the above-mentioned tools to look at it; it can really help a lot when diagnosing this sort of problem.
Depending on how hard the program is working, it may be a simple case of the JVM not knowing when a good time to garbage collect might be; you might also look into the parallel garbage collection options.
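If you want to experiment with that, the collector can be selected explicitly on the command line; for example (the class name is a placeholder):

java -XX:+UseParallelGC -Xmx1024m YourSimulationClass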
I also faced the same problem. I resolved it by rebuilding with the following steps:
Right-click on the project and select Run As -> Run Configurations.
Select your project as the base directory. In place of goals, give eclipse:eclipse install.
In the second tab, give -Xmx1024m as the VM arguments.
I would like to add that this problem is similar to common Java memory leaks.
When the JVM garbage collector is unable to clear the "waste" memory of your Java / Java EE application over time, OutOfMemoryError: Java heap space will be the outcome.
It is important to perform a proper diagnostic first:
Enable -verbose:gc. This will allow you to understand the memory growth pattern over time.
Generate and analyze a JVM Heap Dump. This will allow you to understand your application memory footprint and pinpoint the source of the memory leak(s).
You can also use Java profilers and runtime memory leak analyzers such as Plumbr to help you with this task.
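As a concrete starting point, the first two steps map onto JVM flags, and a dump can also be taken from an already-running process with jmap (the class name and pid are placeholders):

java -verbose:gc -XX:+HeapDumpOnOutOfMemoryError YourClass
jmap -dump:format=b,file=heap.hprof <pid>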
Try adding -Xmx for more memory ( java -Xmx1024M YourClass ), and don't forget to stop referencing variables you don't need any more (memory leaks).
Are you keeping references to variables that you no longer need (e.g. data from the previous simulations)? If so, you have a memory leak. You just need to find where that is happening and make sure that you remove the references to the variables when they are no longer needed (this would automatically happen if they go out of scope).
If you actually need all that data from previous simulations in memory, you need to increase the heap size or change your algorithm.
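A minimal sketch of the pattern behind the edit above, with made-up method names; the point is that each result becomes unreachable after its run instead of accumulating in a long-lived list:

public class SimulationRunner {
    public static void main(String[] args) {
        for (int run = 0; run < 10000; run++) {
            double[] result = simulate(); // hypothetical simulation step
            process(result);              // consume the result right away
            // result goes out of scope here, so the GC can reclaim it.
            // Adding it to a long-lived ArrayList instead would pin every
            // run in memory and eventually exhaust the heap.
        }
    }

    private static double[] simulate() { return new double[1024 * 1024]; }
    private static void process(double[] r) { /* use the result */ }
}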
Java is supposed to clear the heap space for you when all of the objects are no longer referenced. It won't generally release it back to the OS, though; it will keep that memory for its own internal reuse. Maybe check to see if you have some arrays which are not being cleared or something.
No. The heap is cleared by the garbage collector whenever it feels like it. You can ask it to run (with System.gc()) but it is not guaranteed to run.
First try increasing the memory by setting -Xmx256m
It seems that the JVM uses some fixed amount of memory. At least I have often seen parameters -Xmx (for the maximum size) and -Xms (for the initial size) which suggest that.
I got the feeling that Java applications don't handle memory very well. Some things I have noticed:
Even some very small sample demo applications load huge amounts of memory. Maybe this is because of the Java library which is loaded. But why is it needed to load the library for each Java instance? (It seems that way because multiple small applications linearly take more memory. See here for some details where I describe this problem.) Or why is it done that way?
Big Java applications like Eclipse often crash with some OutOfMemory exception. This was always strange because there was still plenty of memory available on my system. Often, they consume more and more memory over runtime. I'm not sure if they have some memory leaks or if this is because of fragmentation in the memory pool -- I got the feeling that the latter is the case.
The Java library seems to require much more memory than similarly powerful libraries like Qt, for example. Why is this? (To compare, start some Qt applications and look at their memory usage, then start some Java apps.)
Why doesn't it just use the underlying system techniques like malloc and free? Or if they don't like the libc implementation, they could use jemalloc (as in FreeBSD and Firefox), which seems to be quite good. I am quite sure that this would perform better than the JVM memory pool. And not only perform better, but also require less memory, especially for small applications.
Addition: Has somebody tried that already? I would be very interested in an LLVM-based JIT compiler for Java which just uses malloc/free for memory handling.
Or maybe this also differs from JVM implementation to implementation? I have used mostly the Sun JVM.
(Also note: I'm not directly speaking about the GC here. The GC is only responsible for calculating which objects can be deleted and for initiating the freeing; the actual freeing is a different subsystem. As far as I know, it is its own memory pool implementation, not just a call to free.)
Edit: A very related question: Why does the (Sun) JVM have a fixed upper limit for memory usage? Or to put it differently: Why does JVM handle memory allocations differently than native applications?
You need to keep in mind that the Garbage Collector does a lot more than just collecting unreachable objects. It also optimizes the heap space and keeps track of exactly where there is memory available to allocate for the creation of new objects.
Knowing immediately where there is free memory makes the allocation of new objects into the young generation efficient, and prevents the need to run back and forth to the underlying OS. The JIT compiler also optimizes such allocations away from the JVM layer, according to Sun's Jon Masamitsu:
Fast-path allocation does not call into the JVM to allocate an object. The JIT compilers know how to allocate out of the young generation and code for an allocation is generated in-line for object allocation. The interpreter also knows how to do the allocation without making a call to the VM.
Note that the JVM goes to great lengths to try to get large contiguous memory blocks as well, which likely have their own performance benefits (See "The Cost of Missing the Cache"). I imagine calls to malloc (or the alternatives) have a limited likelihood of providing contiguous memory across calls, but maybe I missed something there.
Additionally, by maintaining the memory itself, the Garbage Collector can make allocation optimizations based on usage and access patterns. Now, I have no idea to what extent it does this, but given that there's a registered Sun patent for this concept, I imagine they've done something with it.
Keeping these memory blocks allocated also provides a safeguard for the Java program. Since the garbage collection is hidden from the programmer, they can't tell the JVM "No, keep that memory; I'm done with these objects, but I'll need the space for new ones." By keeping the memory, the GC doesn't risk giving up memory it won't be able to get back. Naturally, you can always get an OutOfMemoryError either way, but it seems more reasonable not to needlessly give memory back to the operating system every time you're done with an object, since you already went to the trouble to get it for yourself.
All of that aside, I'll try to directly address a few of your comments:
Often, they consume more and more memory over runtime.
Assuming that this isn't just what the program is doing (for whatever reason; maybe it has a leak, maybe it has to keep track of an increasing amount of data), I imagine that it has to do with the free heap space ratio defaults set by the (Sun/Oracle) JVM. The default value for -XX:MinHeapFreeRatio is 40%, while -XX:MaxHeapFreeRatio is 70%. This means that any time less than 40% of the heap space is free, the heap will be resized by claiming more memory from the operating system (provided that this won't exceed -Xmx). Conversely, it will only* free heap memory back to the operating system if more than 70% of the heap is free.
Consider what happens if I run a memory-intensive operation in Eclipse; profiling, for example. My memory consumption will shoot up, resizing the heap (likely multiple times) along the way. Once I'm done, the memory requirement falls back down, but it likely won't drop so far that 70% of the heap is free. That means that there's now a lot of underutilized space allocated that the JVM has no intention of releasing. This is a major drawback, but you might be able to work around it by customizing the percentages to your situation (see the sketch after the footnote below). To get a better picture of this, you really should profile your application so you can see the utilized versus allocated heap space. I personally use YourKit, but there are many good alternatives to choose from.
*I don't know if this is actually the only time and how this is observed from the perspective of the OS, but the documentation says it's the "maximum percentage of heap free after GC to avoid shrinking," which seems to suggest that.
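A sketch of customizing those ratios, with values picked purely for illustration:

java -XX:MinHeapFreeRatio=20 -XX:MaxHeapFreeRatio=40 -Xmx2048m YourClass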
Even some very small sample demo applications load huge amounts of memory.
I guess this depends on what kind of applications they are. I feel that Java GUI applications run memory-heavy, but I don't have any evidence one way or another. Did you have a specific example that we could look at?
But why is it needed to load the library for each Java instance?
Well, how would you handle loading multiple Java applications if not creating new JVM processes? The isolation of the processes is a good thing, which means independent loading. I don't think that's so uncommon for processes in general, though.
As a final note, the slow start times you asked about in another question likely come from several initial heap reallocations necessary to get to the baseline application memory requirement (due to -Xms and -XX:MinHeapFreeRatio), depending on what the default values are with your JVM.
Java runs inside a Virtual Machine, which constrains many parts of its behavior. Note the term "Virtual Machine." It is literally running as though the machine is a separate entity, and the underlying machine/OS are simply resources. The -Xmx value is defining the maximum amount of memory that the VM will have, while the -Xms defines the starting memory available to the application.
The VM is a product of the binary being system agnostic - this was a solution used to allow the byte code to execute wherever. This is similar to an emulator - say for old gaming systems. It is emulating the "machine" that the game runs on.
The reason you run into an OutOfMemoryError is that the Virtual Machine has hit the -Xmx limit - it has literally run out of memory.
As far as smaller programs go, they will often require a larger percentage of their memory for the VM. Also, Java has default starting -Xmx and -Xms values (I don't remember what they are right now) that it will always start with. The overhead of the VM and the libraries becomes much less noticeable when you start to build and run "real" applications.
The memory argument related to Qt and the like is true, but is not the whole story. While Java uses more memory than some of those libraries, those are compiled for specific architectures. It has been a while since I have used Qt or similar libraries, but I remember the memory management not being very robust, and memory leaks are still common today in C/C++ programs. The nice thing about garbage collection is that it removes many of the common "gotchas" that cause memory leaks. (Note: not all of them. It is still very possible to leak memory in Java, just a bit harder.)
Hope this helps clear up some of the confusion you may have been having.
To answer a portion of your question:
Java at start-up allocates a "heap" of memory, or a fixed-size block (the -Xms parameter). It doesn't actually use all this memory right off the bat, but it tells the OS "I want this much memory". Then as you create objects and do work in the Java environment, it puts the created objects into this heap of pre-allocated memory. If that block of memory gets full, it will request a little more memory from the OS, up until the "max heap size" (the -Xmx parameter) is reached.
Once that max size is reached, Java will no longer request more RAM from the OS, even if there is a lot free. If you try to create more objects and there is no heap space left, you will get an OutOfMemoryError.
Now if you are looking at Windows Task Manager or something like that, you'll see "java.exe" using X megs of memory. That sort of corresponds to the amount of memory that it has requested for the heap, not really the amount of memory inside the heap that's used.
In other words, I could write the application:
class myfirstjavaprog
{
    public static void main(String[] args)
    {
        System.out.println("Hello World!");
    }
}
Which would basically take very little memory. But if I ran it with the cmd line:
java -Xms1024m myfirstjavaprog
then on startup Java will immediately ask the OS for 1,024 MB of RAM, and that's what will show in Windows Task Manager. In actuality, that RAM isn't being used, but Java reserved it for later use.
Conversely, if I had an app that tried to create a 10,000-byte array:
class myfirstjavaprog
{
    public static void main(String[] args)
    {
        byte[] myArray = new byte[10000];
    }
}
but ran it with the command line:
java -Xms100 -Xmx100 myfirstjavaprog
then Java could only allocate up to 100 bytes of memory. Since a 10,000-byte array won't fit into a 100-byte heap, that would throw an OutOfMemoryError, even though the OS has plenty of RAM. (In practice the JVM enforces a minimum heap size and would refuse to start with a heap that small, but the principle holds.)
I hope that makes sense...
Edit:
Going back to "why Java uses so much memory": why do you think it's using a lot of memory? If you are looking at what the OS reports, then that isn't what it's actually using, it's only what it has reserved for use. If you want to know what Java has actually used, then you can do a heap dump and explore every object in the heap and see how much memory it's using.
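A quick way to see the reserved-versus-used distinction from inside the program is the Runtime API; a small sketch:

public class HeapStats {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long max = rt.maxMemory();              // the -Xmx ceiling
        long reserved = rt.totalMemory();       // claimed from the OS so far
        long used = reserved - rt.freeMemory(); // actually occupied by objects
        System.out.println("max=" + (max >> 20) + "M"
                + " reserved=" + (reserved >> 20) + "M"
                + " used=" + (used >> 20) + "M");
    }
}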
To answer "why doesn't it just let the OS handle it?", well, I guess that is just a fundamental Java question for those that designed it. The way I look at it: Java runs in the JVM, which is a virtual machine. If you create a VMWare instance or just about any other "virtualization" of a system, you usually have to specify how much memory that virtual system will/can consume. I consider the JVM to be similar. Also, this abstracted memory model lets the JVMs for different OSes all act in a similar way. So for example Linux and Windows have different RAM allocation models, but the JVM can abstract that away and follow the same memory usage for the different OSes.
Java does use malloc and free, or at least the implementations of the JVM may. But since Java tracks allocations and garbage-collects unreachable objects, those alone are definitely not enough.
As for the rest of your text, I'm not sure if there's a question there.
Even some very small sample demo applications load huge amounts of memory. Maybe this is because of the Java library which is loaded. But why is it needed to load the library for each Java instance? (It seems that way because multiple small applications linearly take more memory. See here for some details where I describe this problem.) Or why is it done that way?
That's likely due to the overhead of starting and running the JVM.
Big Java applications like Eclipse often crash with some OutOfMemory exception. This was always strange because there was still plenty of memory available on my system. Often, they consume more and more memory over runtime. I'm not sure if they have some memory leaks or if this is because of fragmentation in the memory pool -- I got the feeling that the latter is the case.
I'm not entirely sure what you mean by "often crash," as I don't think this has happened to me in quite a long time. If it is, it's likely due to the "maximum size" setting you mentioned earlier.
Your main question asking why Java doesn't use malloc and free comes down to a matter of target market. Java was designed to eliminate the headache of memory management from the developer. Java's garbage collector does a reasonably good job of freeing up memory when it can be freed, but Java isn't meant to rival C++ in situations with memory restrictions. Java does what it was intended to do (remove developer level memory management) well, and the JVM picks up the responsibility well enough that it's good enough for most applications.
The limits are a deliberate design decision from Sun. I've seen at least two other JVMs which do not have this design - the Microsoft one and the IBM one for their non-PC AS/400 systems. Both grow as needed, using as much memory as needed.
Java doesn't use a fixed amount of memory; it always stays in the range from -Xms to -Xmx.
If Eclipse crashes with an OutOfMemoryError, then it required more memory than granted by -Xmx (a configuration issue).
Java cannot simply use malloc/free for object creation, since its memory handling is quite different due to garbage collection (GC). GC automatically removes unused objects, which is a benefit compared to being responsible for memory management yourself.
For details on this complex topic, see Tuning Garbage Collection.
I have one main class that contains 5 buttons, each linked to a program/package. Each package runs a JMF program that captures images from a webcam, and it also loads about 15 images from file.
The 1st program to load (regardless of which button I press) always runs correctly. But when I run a program after the 1st program ends, java.lang.OutOfMemoryError: Java heap space occurs.
I'm not sure if Java can't handle all of our images or if it has something to do with the JMF image capture.
Maybe you should give more memory to your JVM (-Xmx512m on the command line could be a good start); then, if that solves the problem, investigate why your program consumes so much memory.
The use of Sun diagnostic tools like JVisualVM could be helpful.
Increase the Java maximum memory and re-run. If you still see OOMs, you may have a leak. To increase the max memory, append -Xmx<new heap size>m to your command line.
Example:
java -Xmx1024m Foo
How much memory are you giving to your JVM? You can give it more using the following: -Xmx1024m (for 1GB, adjust as necessary)
This assumes that you don't have some memory leak in your program. I don't know anything about JMF, this is just general advice for Out of Memory errors.
JVMs run with a limited amount of maximum memory available to them. This is a little counterintuitive and trips a lot of people up (I can't think of many similar environments).
You can increase the max memory the JVM takes by specifying
java -Xmx128m ...
or similar. If you know in advance that you're going to consume that amount of memory, use
java -Xms128m ...
to specify the memory that the JVM will allocate at startup. Note the -Xms vs -Xmx !
Check whether you still have some references around which prevent the first package/program from being garbage-collected.
When the launcher has detected that the first program has ended, set all references to the first program (and to any objects retrieved from it) to null, to allow the JVM to reclaim the memory and have it ready for the second launch.
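A minimal sketch of that idea, with made-up names for the launcher field and callback:

public class Launcher {
    private Object currentProgram; // hypothetical handle to the running package

    void onProgramFinished() {
        // Drop the reference so the GC can reclaim the whole program,
        // including the captured webcam frames and loaded images.
        currentProgram = null;
    }
}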
Java uses a 64 MB heap by default. An alternative to the other suggestions (increasing the heap space to 512M or 1024M) is to start separate JVMs for the controller and the 5 applications. Then if one of your JMF applications crashes (due to insufficient memory), the controller and the other apps are still running.
(This will only work if the applications and the controller are completely decoupled - otherwise, just increase the heap size and dispose of all media as soon as you don't need it anymore, to prevent memory leaks.)
I have a piece of an application that is written in C; it spawns a JVM and uses JNI to interact with a Java application. My memory footprint via Process Explorer gets up to 1 GB and runs out of memory. Now as far as I know it should be able to get up to 2 GB. One thing I believe is that the memory the JVM is using isn't visible in Process Explorer. My -Xmx is set to 256M; I added some statements to watch the Java-side memory, and it is peaking at 256 with GC doing its job, so it is all good on that side. So my question is, where is the other 700+ MB being consumed? Anyone out there a Java/JNI/C memory expert?
There could be a leak in the JNI code.
Remember to use (*env)->DeleteLocalRef(env, ref) for any object references you get once you are done with them. If you use any native C buffers to create new Java objects, make sure you free them once the object is created. Check the JNI Specification for further guidelines.
Depending on the VM you are using you might be able to turn on JNI checking. For example, on the IBM JDK you can specify "-Xcheck:jni".
Try a test app in C that doesn't spawn the JVM but instead tries to allocate more and more memory. See whether the test app can reach the 2 GB barrier.
The C and JNI code can allocate memory as well (malloc/free/new/etc.), which is outside of the VM's 256M. -Xmx only restricts what the VM will allocate for its own heap. Depending on what you're allocating in the C code, and what other things are loaded in memory, you may or may not be able to get up to 2 GB.
If you say that it's the Windows process that runs out of memory as opposed to the JVM, then my initial guess is that you probably invoke some (your own) native methods from the JVM and those native methods leak memory. So, I concur with @John Gardner here.
Well, thanks to all of your help, especially @alexander's, I have discovered that all the extra memory that isn't visible via Process Explorer is being used by the Java heap. In fact, via other tests that I have run, the JVM's memory consumption is included in what I see from Process Explorer. So the heap is taking large amounts of memory; I will have to do some more research about that and maybe ask a separate question.
Write a C test harness and use valgrind/alleyoop to check for leakage in your C code, and similarly use the Java jvisualvm tool.