Caused by: java.lang.OutOfMemoryError: Java heap space - java

MY GOAL:
I want to run my application for 1000 users.
NOW:
I am trying to run it for 100 users. During the run, I do some processing for each user that takes a minimum of one hour per user, so I'm using one thread per user.
ERROR:
Caused by: java.lang.OutOfMemoryError: Java heap space
I've tried to figure out what this means, but I'm not really sure how to resolve it.
Can anybody help me?

This error means that your program needs more memory than your JVM allowed it to use!
Therefore you pretty much have two options:
Increase the memory your program is allowed to use with the -Xmx option (for instance, -Xmx1024m for 1024 MB)
Modify your program so that it needs less memory, by using smaller data structures and getting rid of objects once they are no longer needed anywhere in your program
As Peter Lawrey pointed out, using a profiler to see what your program is doing in such situations is generally a good idea.
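If you are not sure which maximum the JVM is actually running with, you can print it from inside the program. A minimal sketch using the standard Runtime API (the class name is just an example):
public class MaxHeapCheck {
    public static void main(String[] args) {
        // Prints the maximum heap size the JVM will attempt to use, in megabytes.
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}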

Use a producer/consumer pattern with a limited number of worker threads.
100+ threads is ridiculous; no wonder your application is blowing up. Each thread needs its own stack on top of whatever heap its work uses.
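As a rough illustration of the idea, a minimal sketch using a fixed-size ExecutorService (the pool size, class name and method name are assumptions, not your actual code):
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class UserProcessing {
    public static void main(String[] args) {
        // A small, fixed pool of worker threads consumes the queued user tasks,
        // instead of creating one thread (and one stack) per user.
        ExecutorService pool = Executors.newFixedThreadPool(8);
        for (int userId = 1; userId <= 100; userId++) {
            final int id = userId;
            pool.submit(() -> processUser(id));    // long-running work per user
        }
        pool.shutdown();    // accept no new tasks; the workers drain the queue
    }

    private static void processUser(int id) {
        // placeholder for the hour-long per-user processing
    }
}
With a bounded pool the memory footprint stays roughly constant no matter how many users are queued; the trade-off is that users are processed a few at a time rather than all at once.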

You haven't provided any information indicating that your problem is different from all the other StackOverflow answers about this error. Either:
You are using too much memory and you need to use a memory profiler to reduce it.
You are setting the maximum memory too low and you need to increase the maximum memory with -mx or -Xmx
I suspect that since you want 1000 users running processes which take an hour each, you may need more resources than you have (1000 cores, perhaps?). I suggest you work out how much hardware you need based on the CPU, memory, disk IO and network IO required to run a smaller number of users at an acceptable level, e.g. 20 users, and multiply that by 50.

You can try increasing the JVM heap space when you launch your application, for example setting it to 2 GB with -Xmx2g. If you're running 32-bit Java, I think 2 GB is as high as you can go, but if you have a 64-bit JVM you should be able to go higher.
Edit:
Example: java -Xmx2g MyApp

I check two areas when there is an out-of-memory error:
Is the memory allocated to the JVM sufficient? If not, increase it using -Xmx.
Check the code thoroughly; more than 90% of the time I have found the cause to be some loop or recursion that never terminates under a boundary condition.
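For example, a loop whose exit condition is never met under a boundary input will eventually fill the heap. A deliberately broken sketch (made up for illustration, not real code):
import java.util.ArrayList;
import java.util.List;

public class BoundaryLoopBug {
    public static void main(String[] args) {
        List<int[]> chunks = new ArrayList<>();
        int remaining = 0;                // boundary case: nothing left to process
        while (remaining != -1) {         // bug: -1 is never produced, so the loop never terminates
            chunks.add(new int[1024]);    // each iteration retains more memory until the heap is exhausted
        }
    }
}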

Limit total memory consumption of Java process (in Cloud Foundry)

Related to these two questions:
How to set the maximum memory usage for JVM?
What would cause a java process to greatly exceed the Xmx or Xss limit?
I run a Java application on Cloud Foundry and need to make sure that the allocated memory is not exceeded. Otherwise, and this is the current issue, the process is killed by Cloud Foundry monitoring mechanisms (Linux CGROUP).
The Java Buildpack automatically sets sane values for -Xmx and -Xss. By tuning the arguments and configuring the (maximum) number of expected threads, I'm pretty sure that the memory consumed by the Java process should be less than the upper limit which I assigned to my Cloud Foundry application.
However, I still experience Cloud Foundry "out of memory" errors (NOT the Java OOM error!):
index: 3, reason: CRASHED, exit_description: out of memory, exit_status: 255
I experimented with the MALLOC_ARENA_MAX setting. Setting the value to 1 or 2 leads to slow startups. With MALLOC_ARENA_MAX=4 I still saw an error as described above, so this is no solution for my problem.
Currently I test with very tight memory settings so that the problem is easier to reproduce. However, even with this, I have to wait about 20-25 minutes for the error to occur.
Which arguments and/or environment variables do I have to specify to ensure that my Java process never exceeds a certain memory limit? Crashing with a Java OOM Error is acceptable if the application actually needs more memory.
Further information regarding MALLOC_ARENA_MAX:
https://github.com/cloudfoundry/java-buildpack/pull/160
https://www.infobright.com/index.php/malloc_arena_max/#.VmgdprgrJaQ
https://www.ibm.com/developerworks/community/blogs/kevgrig/entry/linux_glibc_2_10_rhel_6_malloc_may_show_excessive_virtual_memory_usage?lang=en
EDIT: A possible explanation is this: http://www.evanjones.ca/java-bytebuffer-leak.html. As I currently see the OOM issue when making lots of outgoing HTTP/REST requests, these buffers might be to blame.
Unfortunately, there is no way to definitively enforce a memory limit on the JVM. Most of the memory regions are configurable (-Xmx, -Xss, -XX:MaxPermSize, -XX:MaxMetaspaceSize, etc.) but the one you can't control is native memory. Native memory contains a whole host of things, from memory-mapped files to native libraries to JNI code. The best you can do is profile your application, find out where the memory growth is occurring, and either fix the growth or give yourself enough breathing room to survive it.
Certainly unsatisfying, but in the end not much different from other languages and runtimes that have no control over their memory footprint.
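That said, the regions that are configurable can be capped explicitly, which at least narrows down where the growth can come from. A hedged example command line (the sizes are illustrative assumptions, not recommendations):
java -Xmx512m -Xss256k -XX:MaxMetaspaceSize=128m -XX:MaxDirectMemorySize=64m -XX:ReservedCodeCacheSize=64m -jar app.jar
-XX:MaxDirectMemorySize limits NIO direct buffers, which matters if the ByteBuffer growth mentioned in the question is the culprit; memory-mapped files and allocations made by native libraries still fall outside all of these limits.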

Does the JVM store memory in the system? If so, how to clear it?

I am running an application using NetBeans and in the project properties I have set the Max JVM heap space to 1 GiB.
But the application still crashes with an Out of Memory error.
Does the JVM have memory stored in the system? If so, how do I clear that memory?
You'll want to analyse your code with a profiler - Netbeans has a good one. This will show you where the memory is tied up in your application, and should give you an idea as to where the problem lies.
The JVM will garbage collect objects as much as it can before it runs out of memory, so chances are you're holding onto references long after you no longer need them. Either that, or your application is genuinely one that requires a lot of memory - but I'd say it's far more likely to be a bug in your code, especially if the issue only crops up after running the application for a long period of time.
I do not fully understand all details of your question, but I guess the important part is understandable.
The OutOfMemoryError (not an exception) is thrown if the memory allocated to your JVM does not suffice for the objects created in your program. In your case it might help to increase the available heap space to more than 1 GByte. If you think 1 GByte is enough, you may have a memory leak (which, in Java, most likely means that you have references to objects that you do not need anymore - maybe in some sort of cache?).
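A common shape for that kind of leak is an unbounded, statically reachable cache. A minimal, made-up sketch (the class and field names are hypothetical):
import java.util.HashMap;
import java.util.Map;

public class ResultCache {
    // Entries are added but never evicted, so the map stays reachable for the
    // lifetime of the JVM and grows until the heap is exhausted.
    private static final Map<String, byte[]> CACHE = new HashMap<>();

    public static void put(String key, byte[] value) {
        CACHE.put(key, value);
    }

    public static byte[] get(String key) {
        return CACHE.get(key);
    }
}
If a cache like this is needed, bound its size or evict old entries; the garbage collector cannot help, because everything in the map is still reachable.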
Java reserves virtual memory for its maximum heap size on startup. As the program uses this memory, more main memory is allocated to it by the OS. Under UNIX this appears as resident memory. While Java programs can swap to disk, garbage collection performs extremely badly if part of the heap is swapped out, and it can result in the whole machine locking up or having to be rebooted. If your program is not doing this, you can be sure it is entirely in main memory.
Depending on what your application does it might need 1 GB, 10 GB or 100 GB or more. If you cannot increase the maximum memory size, you can use a memory profiler to help you find ways to reduce consumption. Start with VisualVM as it is built in and free and does a decent job. If this is not enough, try a commercial profiler such as YourKit for which you can get a free evaluation license (usually works long enough to fix your problem ;)
The garbage collector automatically cleans out memory as required and may be doing this every few seconds, or even more than once per second. If it is, this could be slowing down your application, so you should consider increasing the maximum size or reducing consumption.
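If you want to see how often and how hard the collector is working, GC logging is a cheap first check. A hedged example (the flag names depend on the JVM version; -Xlog:gc exists in JDK 9 and later, older JVMs use -verbose:gc and -XX:+PrintGCDetails):
java -Xlog:gc -Xmx1g MyApp                              (JDK 9 and later)
java -verbose:gc -XX:+PrintGCDetails -Xmx1g MyApp       (JDK 8 and earlier)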
As mentioned by @camobap, the reason for the OutOfMemoryError was that the PermGen size was set very low. The issue is now resolved.
Thank you all for the answers and comments.
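For anyone who hits the same thing: on Java 7 and earlier the permanent generation limit is raised with -XX:MaxPermSize, for example (the 256m value here is only an illustrative assumption):
java -XX:MaxPermSize=256m -Xmx1024m MyApp
On Java 8 and later the permanent generation was replaced by Metaspace, which is capped with -XX:MaxMetaspaceSize instead.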
The JVM doesn't allocate the full 1 GiB up front in the way I think you are assuming. Java allocates memory dynamically and garbage-collects it; every time it allocates, it checks whether enough heap space is left, and if not it throws an OutOfMemoryError. I am guessing that somewhere in your code (because it would be near impossible to write code that declares that many individual variables) you have an array or ArrayList that takes up all the memory. In the case of an array, you probably have a variable that sets its size, and some calculation made it far too large. In the case of an ArrayList, you might have a loop that runs too many iterations while adding elements to it.
Check your code for the above errors and you should be good.

Java : Memory utilization issue (sudden spike observed)

I was observing memory utilization for my application / service.
I am running the same load, and through JConsole I could see that memory usage was ranging between 1.5 and 1.7 GB (visible in the image). Suddenly I noticed that memory usage goes up for a few seconds, even though nothing has changed in terms of the use case (same load).
I need to know why memory usage suddenly goes up. Nothing in my setup has changed that would cause it.
Is there a problem with my GC parameters?
Your thoughts are appreciated.
The GC parameters I am using are:
export GC1_OPTS="-XX:+UseConcMarkSweepGC -XX:+UseParNewGC -XX:CMSInitiatingOccupancyFraction=50 -XX:+CMSPermGenSweepingEnabled -XX:+CMSClassUnloadingEnabled -XX:+CMSParallelRemarkEnabled -XX:+UseAdaptiveGCBoundary"
export GC2_OPTS="-XX:+ExplicitGCInvokesConcurrent"
You need to know what it is doing in that period. Your load may not have changed, but you may find that a task which normally uses a small amount of memory occasionally uses a large amount. I would use a memory profiler on your application to see how memory is being used and which code is responsible.
You are missing an important part of the question: "What is the problem?"
You see that the memory used increases. That could be a bug in your application code that is triggered somehow, but it is not a problem from a garbage collection point of view.
If you can reproduce this, I would take two heap dumps, one at 1.7 GB and one later at 2.5 GB of heap.
Then you can use Eclipse MAT and its delta mode to compare the dumps. You will see what extra objects you have, and you can find out whether it is a problem or not.
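If it helps, a heap dump of the running process can be taken from the command line, assuming a standard Oracle/OpenJDK JDK with jmap available (the PID and file names are placeholders):
jmap -dump:live,format=b,file=heap-at-1.7g.hprof <pid>
jmap -dump:live,format=b,file=heap-at-2.5g.hprof <pid>
Open both files in Eclipse MAT and compare the histograms to see which object types account for the growth; note that the live option triggers a full GC before dumping.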
With regards to your GC settings, I would get rid of "-XX:+CMSPermGenSweepingEnabled -XX:+CMSClassUnloadingEnabled" because they usually cause long pause times that are not really needed. Also, "-XX:+UseAdaptiveGCBoundary" raises the question of what your motivation was for using this parameter; I personally would not.

Limiting memory usage for Solr on Jetty

I have a memory-limited environment and I'm running Solr on Jetty with the following command:
java -jar -Xmx64M -Xmn32M -Xss512K start.jar
But the total memory consumption of the Solr instance (or Jetty) seems to be much higher than the heap limit I provide. The output of ps is:
ps -u buradayiz -o rss,etime,pid,command
155164 01:37:40 21989 java -jar -Xmx64M -Xmn32M -Xss512K start.jar
As you can see, the RSS is over 150 MB. How can I avoid this situation? I just want to get a plain OutOfMemoryError when Solr/Jetty uses more memory than I allow.
I understand that there may be a difference between the heap limit I provide and the actual memory usage, but a factor of two (actually 2.5) seems like a lot to me. I must be missing something.
Thanks.
There are a number of factors that contribute to memory usage beyond the heap specification.
A major one in your situation is the permanent generation. It is used to load classes for all the dependencies required to run the application, plus a few other things. There is not much getting around a certain minimum for a given application, due to the classes it has to load. You likely need around 64M (perhaps more) to run Solr on Jetty.
You can specify a maximum size to prevent the permanent generation from growing beyond that, e.g. add -XX:MaxPermSize=64M to your command line. It is unlikely to help much, though, and it might even break things if more is required; usually almost all of it is used by classes that you genuinely need.
Another contributor to memory usage beyond the heap is the stack size per thread. Each thread in your case will consume 512K. You can probably specify 256K safely, although you probably don't have enough threads running for it to matter much.
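Putting those two suggestions together, a hedged variant of the start command might look like this (the MaxPermSize value is a guess, and that flag only applies to Java 7 and earlier; on Java 8+ the equivalent is -XX:MaxMetaspaceSize):
java -Xmx64M -Xmn32M -Xss256K -XX:MaxPermSize=64M -jar start.jar
Even with all of these set, RSS will still sit noticeably above 64 MB because of the JVM's own code, thread stacks and native allocations.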
I have the same problem, trying to run it in a limited environment (max 400 MB RAM/VM size). This solution seems to get it running, at least.

OutOfMemoryError

I have one main class that contains 5 buttons, each linked to a program/package. Each package runs a JMF program that captures images from a webcam, and it also loads about 15 images from file.
The first program to load (regardless of which button I press) always runs correctly. But when I run a program after the first program ends, java.lang.OutOfMemoryError: Java heap space occurs.
I'm not sure if Java can't handle all of our images or if it has something to do with the JMF image capture.
Maybe you should give more memory to your JVM (-Xmx512m on the command line could be a good start);
then, if that solves the problem, investigate why your program consumes so much memory.
Sun diagnostic tools like jvisualvm could be helpful.
Increase the Java maximum memory and re-run. If you still see OOMs, you may have a leak. To increase the maximum memory, append -Xmx<new heap size>m to your command line.
Example:
java -Xmx1024m Foo
How much memory are you giving to your JVM? You can give it more using the following: -Xmx1024m (for 1GB, adjust as necessary)
This assumes that you don't have some memory leak in your program. I don't know anything about JMF, this is just general advice for Out of Memory errors.
JVMs run with a fixed upper limit on the memory available to them. This is a little counterintuitive and trips a lot of people up (I can't think of many similar environments).
You can increase the max memory the JVM takes by specifying
java -Xmx128m ...
or similar. If you know in advance that you're going to consume that amount of memory, use
java -Xms128m ...
to specify the memory that the JVM will allocate at startup. Note the -Xms vs -Xmx !
Check whether you still have references around that prevent the first package/program from being garbage-collected.
When the launcher detects that the first program has ended, set all references to the first program, and to any objects retrieved from it, to null so that the JVM can reclaim the memory and have it ready for the second launch.
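In the launcher that might look roughly like this; the field names and the placeholder type are assumptions about your code, not real JMF API:
public class Launcher {
    private Object firstProgram;     // placeholder: stands in for whatever object represents the first program
    private Object[] loadedImages;   // placeholder: the images the launcher kept references to

    void onFirstProgramFinished() {
        firstProgram = null;     // drop the reference so the program and its buffers become unreachable
        loadedImages = null;     // same for the loaded images
        // The garbage collector can now reclaim this memory before the second program starts.
    }
}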
Java uses 64 MB of heap space by default. An alternative to the other suggestions (increasing the heap space to 512M or 1024M) is to start separate JVMs for the controller and the 5 applications. Then if one of your JMF applications crashes (due to insufficient memory), the controller and the other apps are still running.
(This will only work if the applications and the controller are completely decoupled; otherwise, just increase the heap size and dispose of all media as soon as you no longer need it, to prevent memory leaks.)
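If you go the separate-JVM route, a hedged sketch of how the controller could launch one of the applications in its own process (the class name, classpath and heap size are placeholders):
import java.io.IOException;

public class AppLauncher {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Start the JMF application in a separate JVM with its own heap,
        // so an OutOfMemoryError there cannot take the controller down.
        Process p = new ProcessBuilder("java", "-Xmx256m", "-cp", "app.jar", "com.example.CameraApp")
                .inheritIO()
                .start();
        int exitCode = p.waitFor();
        System.out.println("Sub-application exited with code " + exitCode);
    }
}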
