Avoid "GC overhead limit exceeded" error [duplicate] - java

This question already has answers here:
Error java.lang.OutOfMemoryError: GC overhead limit exceeded
(22 answers)
Closed 6 months ago.
After a lot of effort I can't seem to overcome the problem of getting a
GC overhead limit exceeded
error in my Java program.
It occurs inside a large method that does heavy string manipulation, builds many lists of objects, and makes a number of database accesses.
I have tried the following:
after the use of each ArrayList, I have added: list=new ArrayList<>(); list=null;
for the strings, instead of having e.g. 50 appends (str+="....") I try to have one append with the total text
after each DB access I close the statements and the resultSets.
This method is called from main like this:
for (int i = 0; i < L; i++) {
    cns = new Console(i);
    cns.processData(); // this is the method
    cns = null;
}
When this loop gets executed 1 or 2 times, everything is ok. For L>=3 it's almost certain that I will get the garbage collector error.
Shouldn't the fact that I set cns=null after each execution of the method force the GC to free everything from the previous execution?
Should I also delete all private attributes of the object before setting it to null? Maybe putting a Thread.sleep() could force the GC after each loop?

There's actually no reason to set cns to null at the end of each loop. You're setting it to a new Console() at the beginning of the loop anyway - if anything could be freed by setting it to null, it's also going to be freed by setting it to a new object.
You can try a System.gc(); call to suggest the system do a garbage collection, but I don't know if that would help you or make it worse. The system IS already attempting garbage collection - if it wasn't, you wouldn't get this error.
You don't show us exactly how you're building your Strings, but keep in mind that += isn't the only culprit: concatenating with + inside a loop (for example s = s + header + row + footer; on every iteration) creates just as much garbage as spreading it over several += lines. If that's an issue, StringBuilder may be your friend.
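For illustration, here is a rough sketch of the difference (the class, method and data below are made up, not taken from the question):

import java.util.Arrays;
import java.util.List;

public class StringBuildingSketch {
    static String concatInLoop(List<String> parts) {
        String result = "";
        for (String part : parts) {
            result += part;      // each += allocates a new String and copies everything so far
        }
        return result;
    }

    static String buildWithBuilder(List<String> parts) {
        StringBuilder sb = new StringBuilder();
        for (String part : parts) {
            sb.append(part);     // appends into one growing buffer; far less garbage
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        List<String> parts = Arrays.asList("Hello ", "to the", " world");
        System.out.println(concatInLoop(parts).equals(buildWithBuilder(parts))); // prints: true
    }
}

The += version builds a fresh String on every iteration; the StringBuilder version reuses one buffer and only allocates the final String at the end.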
You can read the answers at Error java.lang.OutOfMemoryError: GC overhead limit exceeded for some other suggestions on how to avoid this error. It seems that for some people it's triggered when you're almost, but not quite, out of memory. So increasing the amount of memory available to Java may (or may not) help.

Basically, a "GC overhead limit exceeded" error is a symptom of having too much reachable data. The heap is filling up with things that cannot be garbage collected ... 'cos they are not garbage! The JVM is running the GC again and again in an attempt to make space. Eventually, it decides that too much time is being spent garbage collecting, and it gives up. This is usually the correct thing to do.
The following ideas from your question (and the other answer) are NOT solutions.
Forcing the GC to run by calling System.gc() won't help. The GC is already running too often.
Assigning null to cns won't help. It immediately gets something else assigned to it. Besides, there is no evidence that the Console object is occupying much memory.
(Note that the constructor for the java.io.Console class is not public, so your example is not meaningful as written. Perhaps you are actually calling System.console()? Or perhaps this is a different Console class?)
Clearing private attributes of an object before setting it to null is unlikely to make any difference. If an object is not reachable, then the values of its attributes are irrelevant. The GC won't even look at them.
Calling Thread.sleep() will make no difference. The GC runs when it thinks it needs to.
The real problem is ... something that we can't determine from the evidence that you have provided. Why is there so much reachable data?
In general terms, the two most likely explanations are as follows:
Your application (or some library you are using) is accumulating more and more objects in some data structure that survives beyond a single iteration of the for loop. In short, you have a memory leak.
To solve this, you need to find the storage leak; see How to find a Java Memory Leak. (The leak could be related to database connections, statements or resultsets not being closed, but I doubt it. The GC should find and close these resources if they have become unreachable.)
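As an aside, the simplest way to guarantee that statements and result sets are closed, even when an exception is thrown, is try-with-resources. This is only a sketch assuming plain JDBC; the URL, query and column names are placeholders, not taken from the question:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

public class JdbcCloseSketch {
    static List<String> loadNames(String jdbcUrl) throws SQLException {
        List<String> names = new ArrayList<>();
        // Each resource is closed automatically, in reverse order, even if an exception is thrown.
        try (Connection con = DriverManager.getConnection(jdbcUrl);
             PreparedStatement ps = con.prepareStatement("SELECT name FROM example_table");
             ResultSet rs = ps.executeQuery()) {
            while (rs.next()) {
                names.add(rs.getString("name"));
            }
        }
        return names;
    }
}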
Your application simply needs more memory. For example, if a single call to processData needs more memory than is available, you will get an OOME no matter what you try to get the GC to do. It cannot delete reachable objects, and it obviously cannot find enough garbage fast enough.
To solve this, first see if there are ways to modify the program so that it needs less (reachable) memory. Here are a couple of ideas:
If you are building a huge string to represent the output before writing it to an OutputStream, Writer or similar, you would save memory by writing directly to the output sink instead (see the sketch after these ideas).
In some cases, consider using StringBuilder rather than String concatenation when assembling large strings, particularly when the concatenation happens in a loop.
However, note that 1) in Java 8 and earlier javac already emits StringBuilder sequences for you for concatenation expressions, and 2) in Java 9+, javac emits invokedynamic code that is better than using StringBuilder; see
JDK 9/JEP 280: String Concatenations Will Never Be the Same
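To illustrate the first idea above (streaming instead of building one huge string), here is a minimal sketch; the class, method and path names are hypothetical:

import java.io.IOException;
import java.io.Writer;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;

public class StreamingOutputSketch {
    // Instead of accumulating one huge String and writing it at the end,
    // write each piece as soon as it is ready; only a small buffer stays in memory.
    static void writeReport(List<String> lines, String path) throws IOException {
        try (Writer out = Files.newBufferedWriter(Paths.get(path))) {
            for (String line : lines) {
                out.write(line);
                out.write(System.lineSeparator());
            }
        }
    }
}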
If that doesn't help, increase the JVM's heap size (for example with the -Xmx option) or split the problem into smaller problems. Some problems just need lots of memory to solve.

Related

About Java heap space while using Collection objects

I'm using collection objects (ArrayList and HashMap, mainly).
My program runs 24*7. Sometimes it throws an OutOfMemoryError: Java heap space.
I have already given the JVM 1 GB of heap space.
My question is whether I should use global collection objects or local objects in each method?
(I have to process almost 1,000,000 records per day, continuously, 24*7.)
You could also set the heap space to 2 GB and see if the problem still occurs. That's the poor man's memory-leak detection process. Otherwise use a profiler like VisualVM and check for memory leaks.
You can use a source code quality tool like Sonar.
You can also use the Eclipse Memory Analysis tool. It enables you to analyse a heap dump and figure out which objects are using the most memory. It can analyse production heap dumps with hundreds of millions of objects, quickly calculate the retained sizes of objects, show you what is preventing the garbage collector from collecting objects, and run a report to automatically extract leak suspects.
I always use it to fix OutOfMemory exceptions.
All the answers were really helpful:
1. When the program has to run 24*7, use local variables within each method.
2. The program must be thread-safe (if threads are used).
3. Use connection pooling, because creating and destroying a connection on every iteration of a long-running loop is a big performance issue; create 10 or 15 connections at the beginning and check each connection out and back in (see the sketch after this list).
4. Use a memory-analysis tool to analyse the heap dump and figure out which objects are using the most memory.
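For point 3, the sketch below shows a deliberately minimal check-out/check-in pool built only on JDK and JDBC classes; in a real application you would more likely use an established pooling library. The JDBC URL is a placeholder:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class SimpleConnectionPool {
    private final BlockingQueue<Connection> idle;

    // Open a fixed number of connections up front instead of opening one per request.
    public SimpleConnectionPool(String jdbcUrl, int size) throws SQLException {
        idle = new ArrayBlockingQueue<>(size);
        for (int i = 0; i < size; i++) {
            idle.add(DriverManager.getConnection(jdbcUrl));
        }
    }

    // Check out: blocks until a connection becomes free.
    public Connection checkOut() throws InterruptedException {
        return idle.take();
    }

    // Check in: hand the connection back for reuse instead of closing it.
    public void checkIn(Connection con) {
        idle.offer(con);
    }
}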
Use local variables unless a value genuinely has to be shared across methods, and set global variables to null once their values are no longer needed (but be sure nothing else still uses them before doing so).
Objects that are no longer referenced can be garbage collected, which helps you avoid memory errors. Also review your code for infinite loops while iterating over collections, arrays etc.

How can I delete a specific object? [duplicate]

This question already has answers here:
How to force garbage collection in Java?
(25 answers)
Closed 8 years ago.
How can I manually delete a specific object before the garbage collector would ever collect it?
For example, I want to delete the requestToken object. How can I do that?
The short answer is that you can't, and that you don't need to. The GC will reclaim the memory when it needs to ... and there is no reason to interfere with that.
The only situation I can think of for needing to delete an object sooner is when the object contains information that needs to be erased ... for information security reasons. (The classic example is when you are processing a password provided by the user and you are worried that it might leak via a core dump or something.) In that case, you need to implement a method on your object for erasing the object's fields by overwriting them. But this requires careful design; e.g. to make sure that you find and erase all traces of the information.
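For example, a common pattern (sketched here with made-up names, not taken from the question) is to accept the password as a char[] rather than a String, so the caller can overwrite it as soon as it has been used:

import java.util.Arrays;

public class PasswordSketch {
    static void authenticate(char[] password) {
        try {
            // ... use the password here, e.g. pass it to an API that accepts char[] ...
        } finally {
            // Overwrite the characters so the secret does not linger on the heap
            // until the GC gets around to it; an immutable String could not be cleared like this.
            Arrays.fill(password, '\0');
        }
    }
}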
It is also sometimes necessary to "help" the GC a bit to avoid potential memory leaks. A classic example of this is the ArrayList class, which uses a Java array to represent the list content. The array is often larger than the list's logical size, and the elements of the array could contain pointers to objects that have been removed from the list. The ArrayList class deals with this by assigning null to these elements.
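The same slot-nulling idea, reduced to a tiny stack along the lines of the well-known Effective Java example (fixed capacity and no growth logic, purely for illustration):

import java.util.EmptyStackException;

public class SlotNullingStack {
    private final Object[] elements = new Object[16]; // fixed capacity, no growth, for brevity
    private int size = 0;

    public void push(Object e) {
        elements[size++] = e;
    }

    public Object pop() {
        if (size == 0) {
            throw new EmptyStackException();
        }
        Object result = elements[--size];
        elements[size] = null; // drop the stale reference so the popped object can be collected
        return result;
    }
}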
Note that neither of these examples involve actually deleting objects. In both cases, the problem / issue is addressed another way.
It is also worth noting that calling System.gc() is usually a bad idea:
It is not guaranteed to do anything at all.
In most situations, it won't do anything that wouldn't happen anyway.
In most situations, it is inefficient. The JVM is in a better position than application code to know the ergonomically most efficient time to run the GC. (Read this for a first-principles explanation of the ergonomics.)
The only cases where running the GC in production code is advisable are when you are trying to manage GC pauses, and you know that a pause is acceptable at a particular point in time. (For example, when you are changing levels in an interactive game ... )
You cannot delete an object; you can only try to make it eligible for garbage collection in its next cycle. The best you can do is set the reference to null and try calling System.gc().
Note: a System.gc() call will only request that the JVM run the garbage collector; it cannot force it to.

toStringBuilder causing issues

I am running a multithreaded import which runs for around 1-2 hours.
In the import, before putting data into the table, I am checking:
if (debug.isEnabled())
    logger.debug("Object=" + MyObject);
where MyObject uses the ToStringBuilder in the toString method.
java.lang.OutOfMemoryError: GC overhead limit exceeded
at java.util.Arrays.copyOfRange(Arrays.java:2694)
at java.lang.String.<init>(String.java:203)
at java.lang.StringBuffer.toString(StringBuffer.java:561)
at org.apache.commons.lang3.builder.ToStringBuilder.toString(ToStringBuilder.java:1063)
I am thinking that ToStringBuilder is causing this issue. Am I correct? If yes, what are ways to fix this?
Not necessarily. All that error means is that you're almost out of heap space, and the garbage collector is giving up on trying to reclaim space because it has run too much without reclaiming enough space. The fact that it happened at that point in the code doesn't necessarily mean anything. It could be that something entirely different ate up the space, but that call kicked off the GC one more time, when it finally gave up. You'd need to take a heap dump and look at it in a profiler like YourKit or VisualVM to see what's really going on.
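For reference, a heap dump can be captured by starting the JVM with -XX:+HeapDumpOnOutOfMemoryError, or, if you want one on demand, from inside the application via the HotSpot diagnostic MXBean. The sketch below takes the second route; the file path is just a placeholder:

import java.io.IOException;
import java.lang.management.ManagementFactory;

import com.sun.management.HotSpotDiagnosticMXBean;

public class HeapDumpSketch {
    // Writes a dump of live objects to the given file (recent JDKs require the
    // file name to end in ".hprof", e.g. "app-heap.hprof").
    static void dumpHeap(String file) throws IOException {
        HotSpotDiagnosticMXBean bean = ManagementFactory.newPlatformMXBeanProxy(
                ManagementFactory.getPlatformMBeanServer(),
                "com.sun.management:type=HotSpotDiagnostic",
                HotSpotDiagnosticMXBean.class);
        bean.dumpHeap(file, true); // true = only objects that are still reachable
    }
}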
Your object constructs too large a string in memory, probably within the ToStringBuilder-based toString() method.
To avoid this, try logging the data row by row as you import it into your database, like this:
if (debug.isEnabled()) {
    // since you are importing rows, it must be a collection/array
    logger.debug("Object=");
    for (Row r : MyObject.getRows()) {
        logger.debug(r);
    }
}
Or, in the rare case that you really do have a big object to log, have the object log itself by streaming its content to the log, rather than creating one overwhelming string.

After Text was deleted from JTextArea, heap is not empty

I have an application with an AWT GUI, and I use a JTextArea for logging output. If I erase the text with setText(null), removeAll() or setText("") and then run the garbage collector with System.gc(), I notice that the whole text is still in memory. How can I really delete the text?
I'm not very familiar with profilers; here is what I see in the memory dump after setText(null):
Please have a read on: How Garbage Collection works in Java.
As per the docs System.gc():
Calling the gc method suggests that the Java Virtual Machine expend effort toward recycling unused objects in order to make the memory they currently occupy available for quick reuse. When control returns from the method call, the Java Virtual Machine has made a best effort to reclaim space from all discarded objects
NB - suggests. This means that the garbage collector is only asked to do a clean-up, not forced to; it may ignore your request entirely, so we cannot know when the garbage will be collected, only that it will be collected in time.
NB - discarded objects: this refers to all objects that are not static/final, nor in use/referenced by any other instances/classes/fields/variables etc.
Here is also an interesting question I found on the topic:
Why is it a bad practice to call System.gc?
with the top answer going along the lines of:
The reason everyone always says to avoid System.gc() is that it is a
pretty good indicator of fundamentally broken code. Any code that
depends on it for correctness is certainly broken; any that rely on it
for performance are most likely broken
and further, there has even been a bug submitted about the poor phrasing of the documentation:
http://bugs.sun.com/view_bug.do?bug_id=6668279
As @DavidK notes, System.gc() is not a useful way to examine this. Using the mechanism described here, most profilers can force garbage collection in a way that, subject to some limitations, is a useful debugging tool.
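If you want a rough in-code check of whether something really has become unreachable, rather than eyeballing a profiler, a WeakReference can be used. This is only a sketch, and System.gc() remains no more than a hint:

import java.lang.ref.WeakReference;

public class ReachabilityCheckSketch {
    public static void main(String[] args) throws InterruptedException {
        Object payload = new byte[10_000_000];        // stand-in for the text/data in question
        WeakReference<Object> ref = new WeakReference<>(payload);

        payload = null;                               // drop the last strong reference
        System.gc();                                  // only a hint, but usually honoured here
        Thread.sleep(100);

        // If the referent has been collected, get() returns null.
        System.out.println("collected = " + (ref.get() == null));
    }
}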
If there are any String objects holding this content in your client program, set them to null as well.
Also, you don't need to explicitly call the System.gc() method. The JVM garbage collects orphaned objects whenever it needs more memory to allocate for other objects.
You only need to worry if you see an OutOfMemoryError, continuously increasing heap usage, etc.

frequent garbage collection java web app

I have a web app that serializes a java bean into xml or json according to the user request.
I am facing a mind-bending problem: when I put a little bit of load on it, it quickly uses all allocated memory and reaches maximum capacity. I then observe the full GC working really hard every 20-40 seconds.
It doesn't look like a memory leak issue... but I am not quite sure how to troubleshoot this.
The bean that is serialized to xml/json has reference to other beans and those to others. I use json-lib and jaxb to serialize the beans.
The YourKit memory profiler is telling me that char[] is the most memory-consuming type of live object...
Any insight is appreciated.
There are two possibilities: you've got a memory leak, or your webapp is just generating lots of garbage.
The brute-force way to tell if you've got a memory leak is to run it for a long time and see if it falls over with an OOME. Or turn on GC logging and see if the amount of memory still in use after each garbage collection trends steadily upwards over time.
Whether or not you have a memory leak, you can probably improve performance (reduce the percentage GC time) by increasing the max heap size. The fact that your webapp is seeing lots of full GCs suggests to me that it needs more heap. (This is just a bandaid solution if you have a memory leak.)
If it turns out that you are not suffering from a memory leak, then you should take a look at why your application is generating so much garbage. It could be down to the way that you are doing the XML and JSON serialization.
Why do you think you have a problem? GC is a natural and normal thing to happen. We have customers that GC every second (for less than 100ms duration), and that's fine as long as memory keeps getting reclaimed.
GCing every 20-40 seconds isn't a problem IMO - as long as it doesn't take a large % of that 20-40s. Most major commercial JVMs aim to keep GC in the 5-10% of time range (so 1-4 seconds of that 20-40s). Posting more data in the form of the GC logs might help, and I'd also suggest tools like GCMV would help you visualize and get recommendations on what your GC profile looks like.
It's impossible to diagnose this without a lot more information - code and GC logs - but my guess would be that you're reading data in as large strings, then chopping out little bits with substring(). When you do that, the substring string is made using the same underlying character array as the parent string, and so as long as it's alive, will keep that array in memory. That means code like this:
String big = ...; // imagine a string of one million characters
String small = big.substring(0, 1);
big = null;
Will still keep the huge string's character data in memory. If this is the case, then you can address it by forcing the small strings to use fresh, smaller, character arrays by constructing new instances:
small = new String(small);
But like I said, this is just a guess. (Note also that this substring sharing only applies to older JVMs; from Java 7u6 onwards, substring() copies the characters into a new array.)
I'm not sure how much of it is in your code and how much might be in the tools you are using, but there are some key things to watch for.
One of the worst is if you constantly add to strings in loops. A simple "hello"+"world" is no problem at all; the compiler is actually very smart about that. But if you do it in a loop it will constantly reallocate the string. Use StringBuilder where you can.
There are profilers for Java that should quickly point you to where the allocations are taking place. Just fool around with a profiler for a while while your Java app is running and you will probably be able to reduce your GCs to virtually nothing, unless the problem is inside your libraries; even then you may figure out some way to fix it.
Things you allocate and then free quickly don't require time in the GC phase--it's pretty much free. Be sure you aren't keeping Strings around longer than you need them. Bring them in, process them and return to your previous state before returning from your request handler.
You should attach YourKit and record allocations (e.g. every 10th allocation, including all large ones). They have a step-by-step guide on diagnosing excessive GC:
http://www.yourkit.com/docs/90/help/excessive_gc.jsp
To me that sounds like you are trying to serialize a recursive (or at least very deep, almost recursive) object graph with an encoder which is not prepared for it.
Java's native XML API is really "noisy" and generally wasteful in terms of resources, which means that if your requests and XML/JSON generation cycles are short-lived, the GC will have a lot to clean up.
I have debugged a very similar case and found this out the hard way; the only way I could somewhat improve the situation without major refactoring was explicitly calling the GC, together with the appropriate VM flags, which turn System.gc() from a no-op call into a maybe-op call.
I would start by inspecting my running application to see what was being created on the heap.
HPROF can collect this information for you, which you can then analyse using HAT.
To debug issues with memory allocations, InMemProfiler can be used at the command line. Collected object allocations can be tracked and collected objects can be split into buckets based on their lifetimes.
In trace mode this tool can be used to identify the source of memory allocations.
