Java Heap Space: Out of Memory - no garbage collection? - java

I have limited knowledge when it comes to the JVM and Heap Space so I'm trying to understand some behavior that we're seeing.
There's a situation where a user can request data that exceeds our heap space and therefore causes an OOM: Java Heap Space. Okay, that makes sense and we understand why that's happening.
After the OOM occurs, I notice that the memory reported as used is roughly the heap size plus what our application normally runs at. However, after the OOM it never appears to be garbage collected back down to normal levels; it just stays high. Is that expected? Apologies for such a simple question; I wasn't sure where else to find this information.
Thanks in advance.
Edit:
Here's what the code boils down to: we (mistakenly) allow a user to fetch too large a range, and returning the resulting List of objects causes the OOM error (at least that's my theory).
What in this code would cause an object to not get cleaned up?
public List<Class> getClassesByRange(Timestamp startDate, Timestamp endDate) {
    final Session session = this.sessionManager.openNewSession();
    try {
        final Criteria criteria = session.createCriteria(Class.class);
        criteria.addOrder(Order.desc("sorting"));
        criteria.setFetchMode("someObject", FetchMode.JOIN);
        criteria.setResultTransformer(CriteriaSpecification.DISTINCT_ROOT_ENTITY);
        criteria.add(Restrictions.ge("createDateTime", startDate));
        criteria.add(Restrictions.le("createDateTime", endDate));
        return criteria.list();
    } finally {
        session.close();
    }
}

Related

Drop part of a List<> when encountering OutOfMemoryException

I'm writing a program that is supposed to continually push generated data into a List sensorQueue. The side effect is that I will eventually run out of memory. When that happens, I'd like to drop part of the list, in this example the first (older) half. I imagine that if I encounter an OutOfMemoryException, I won't be able to just use sensorQueue = sensorQueue.subList(sensorQueue.size() / 2, sensorQueue.size());, so I came here looking for an answer.
My code:
public static void pushSensorData(String sensorData) {
    try {
        sensorQueue.add(parsePacket(sensorData));
    } catch (OutOfMemoryError e) {
        System.out.println("Backlog full");
        // TODO: Cut the sensorQueue in half to make room
    }
    System.out.println(sensorQueue.size());
}
Is there an easy way to detect an impending OutOfMemoryException then?
You can use something like the snippet below to determine the maximum and currently used heap memory. Using that information you can decide on the next action in your program, e.g. reduce the list's size or drop some elements.
final int MEGABYTE = (1024*1024);
MemoryMXBean memoryBean = ManagementFactory.getMemoryMXBean();
MemoryUsage heapUsage = memoryBean.getHeapMemoryUsage();
long maxMemory = heapUsage.getMax() / MEGABYTE;
long usedMemory = heapUsage.getUsed() / MEGABYTE;
Hope this helps!
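Building on that idea, here is a minimal sketch of how such a check could drive the trimming before an OutOfMemoryError ever occurs. The class name and the 80% threshold are illustrative assumptions, not from the question:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryUsage;
import java.util.ArrayList;
import java.util.List;

public class HeadroomCheck {
    // Trim the backlog once the heap is more than 80% full (the threshold is an arbitrary choice).
    static final double THRESHOLD = 0.8;
    static final List<String> sensorQueue = new ArrayList<>();

    static boolean heapAlmostFull() {
        MemoryUsage heap = ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
        // getMax() can be -1 if the maximum is undefined; treat that as "unknown, assume fine".
        long max = heap.getMax();
        return max > 0 && (double) heap.getUsed() / max > THRESHOLD;
    }

    public static void pushSensorData(String sensorData) {
        if (heapAlmostFull()) {
            // Drop the older half proactively instead of waiting for OutOfMemoryError.
            sensorQueue.subList(0, sensorQueue.size() / 2).clear();
        }
        sensorQueue.add(sensorData);
    }

    public static void main(String[] args) {
        for (int i = 0; i < 10; i++) {
            pushSensorData("packet-" + i);
        }
        System.out.println(sensorQueue.size());
    }
}
```

The check runs before each add, so the drop happens while there is still enough heap left to do it safely.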
The problem with subList is that it creates a sublist view while keeping the original list in memory. However, ArrayList and other extensions of AbstractList have removeRange(int fromIndex, int toIndex), which removes elements from the current list and so doesn't require additional memory (note that removeRange is protected, so it is only callable from a subclass).
For other List implementations there is the similar remove(int index), which you can call repeatedly for the same purpose.
I think your idea is severely flawed (sorry).
There is no OutOfMemoryException; there is only OutOfMemoryError! Why is that important? Because an error leaves the app in an unstable state. I'm not sure that claim holds for errors in general, but it definitely holds for OutOfMemoryError, because there is no guarantee that you will be able to catch it: the memory can run out anywhere, so the OutOfMemoryError may be thrown from JDK code outside your try-catch block. So your catch is pointless.
And what is the reason for this anyway? How many messages do you want in the list? Say each message is 1MB and your heap is 1000MB. Ignoring all other classes, your heap size then dictates that your list can hold up to 1000 messages, right? Wouldn't it be easier to make the heap sufficiently big for your desired number of messages, and specify the message count in a simpler, integral form? If your answer is "no", remember that you still cannot catch OutOfMemoryError reliably, so I'd advise that your answer should rather be "yes".
If you really need to consume everything possible, then checking memory usage as a percentage, as #fabsas recommended, could be a way. But I'd go with the integral definition, which is easier to manage: your list will contain up to N messages.
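That integral approach can be sketched as follows (the class name, capacity, and eviction policy are illustrative assumptions, not from the question):

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class BoundedBacklog {
    private final int maxMessages;
    private final Deque<String> queue = new ArrayDeque<>();

    public BoundedBacklog(int maxMessages) {
        this.maxMessages = maxMessages;
    }

    public void push(String message) {
        // Evict the oldest entry once the cap is reached, so memory use stays
        // bounded without ever having to catch OutOfMemoryError.
        if (queue.size() == maxMessages) {
            queue.pollFirst();
        }
        queue.addLast(message);
    }

    public String oldest() {
        return queue.peekFirst();
    }

    public int size() {
        return queue.size();
    }
}
```

With the cap expressed as a message count, heap sizing becomes a separate decision: pick N, then give the JVM enough -Xmx to hold N messages comfortably.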
You can drop a range of elements from an ArrayList using subList:
list.subList(from, to).clear();
Where from is the first index of the range to be removed and to is the index just past the last. In your case, you can do something like:
list.subList(0, sensorQueue.size() / 2).clear();
Note that subList returns a view backed by the original list, so clearing the view removes those elements from the original list itself.
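As a self-contained illustration:

```java
import java.util.ArrayList;
import java.util.List;

public class SubListClear {
    public static void main(String[] args) {
        List<Integer> list = new ArrayList<>();
        for (int i = 0; i < 10; i++) {
            list.add(i);
        }

        // Clearing the view removes the first half from the backing list in place,
        // without allocating a copy of the surviving elements.
        list.subList(0, list.size() / 2).clear();

        System.out.println(list); // [5, 6, 7, 8, 9]
    }
}
```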

Repeated replace calls lead to java.lang.OutOfMemoryError

I am mass-processing very large files. I am calling the following method on each URI in each line:
public String shortenUri(String uri) {
    uri = uri
            .replace("http://www.lemon-model.net/lemon#", "lemon:")
            .replace("http://babelnet.org/rdf/", "bn:")
            .replace("http://purl.org/dc/", "dc:")
            .replace("http://www.w3.org/1999/02/22-rdf-syntax-ns#", "rdf:");
    return uri;
}
Strangely, this leads to the following error:
Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded
at java.util.regex.Pattern$BnM.optimize(Pattern.java:5411)
at java.util.regex.Pattern.compile(Pattern.java:1711)
at java.util.regex.Pattern.<init>(Pattern.java:1351)
at java.util.regex.Pattern.compile(Pattern.java:1054)
at java.lang.String.replace(String.java:2239)
at XYZ.shortenUri(XYZ.java:217)
I did increase -Xms and -Xmx, but it did not help. Strangely, I could also not observe increased memory usage when monitoring the process. Any suggestions on improving the performance and memory consumption here?
A quote from Oracle:
Excessive GC Time and OutOfMemoryError
The parallel collector will throw an OutOfMemoryError if too much time is being spent in garbage collection: if more than 98% of the total time is spent in garbage collection and less than 2% of the heap is recovered, an OutOfMemoryError will be thrown. This feature is designed to prevent applications from running for an extended period of time while making little or no progress because the heap is too small. If necessary, this feature can be disabled by adding the option -XX:-UseGCOverheadLimit to the command line.
The first thing you could try is to increase the heap size even more, for example to a few GB with -Xmx4G.
Another option is to prevent the creation of so many temporary objects by not using the replace method. Instead, you could compile the Pattern objects once and reuse them (see below).
The third option I see is to disable this feature altogether with -XX:-UseGCOverheadLimit.
private static final Pattern PURL_PATTERN = Pattern.compile("http://purl.org/dc/");
// other patterns

public static String shortenUri(String uri) {
    // other matchers
    Matcher matcher = PURL_PATTERN.matcher(uri);
    return matcher.replaceAll("dc:");
}
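Filled out with all four prefixes from the question, a runnable version might look like this (the prefix-to-namespace mapping is taken directly from the original replace calls; the class name is made up):

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Pattern;

public class UriShortener {
    // Compile each prefix pattern once, instead of on every String.replace call.
    private static final Map<Pattern, String> PREFIXES = new LinkedHashMap<>();
    static {
        PREFIXES.put(Pattern.compile("http://www\\.lemon-model\\.net/lemon#"), "lemon:");
        PREFIXES.put(Pattern.compile("http://babelnet\\.org/rdf/"), "bn:");
        PREFIXES.put(Pattern.compile("http://purl\\.org/dc/"), "dc:");
        PREFIXES.put(Pattern.compile("http://www\\.w3\\.org/1999/02/22-rdf-syntax-ns#"), "rdf:");
    }

    public static String shortenUri(String uri) {
        // Apply each precompiled pattern in turn; most URIs match at most one.
        for (Map.Entry<Pattern, String> e : PREFIXES.entrySet()) {
            uri = e.getKey().matcher(uri).replaceAll(e.getValue());
        }
        return uri;
    }
}
```

For example, "http://purl.org/dc/terms/title" becomes "dc:terms/title" without compiling a single pattern per call.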

Measure memory usage of a certain datastructure

I'm trying to measure the memory usage of my own datastructure in my Tomcat Java EE application at various levels of usage.
To measure the memory usage I have tried two strategies:
Runtime freeMemory and totalMemory:
System.gc(); // called about 20 times
long start = Runtime.getRuntime().freeMemory();
useMyDataStructure();
long end = Runtime.getRuntime().freeMemory();
System.out.println(start - end);
MemoryPoolMXBean.getPeakUsage():
long before = Runtime.getRuntime().totalMemory() - Runtime.getRuntime().freeMemory();
List<MemoryPoolMXBean> memorymxbeans = ManagementFactory.getMemoryPoolMXBeans();
for (MemoryPoolMXBean memorybean : memorymxbeans) {
    memorybean.resetPeakUsage();
}
useMyDataStructure();
for (MemoryPoolMXBean memorybean : memorymxbeans) {
    MemoryUsage peak = memorybean.getPeakUsage();
    System.out.println(memorybean.getName() + ": " + (peak.getUsed() - before));
}
Method 1 does not output reliable data at all; the numbers are useless.
Method 2 outputs negative values. Besides, its getName() shows that it reports Code Cache, PS Eden Space, PS Survivor Space and PS Old Gen separately.
How can I acquire somewhat consistent memory usage numbers before and after my useMyDataStructure() call in Java? I do not wish to use VisualVM; I prefer to capture the number in a long and write it to a file myself.
Thanks in advance.
edit 1:
useMyDataStructure in the above examples was an attempt to simplify the code. Here is what's really there:
int key = generateKey();
MyOwnObject obj = makeAnObject();
MyContainerClass.getSingleton().addToHashMap(key, obj);
So in essence I'm really trying to measure how much memory the HashMap<Integer, MyOwnObject> in MyContainerClass takes. I will use this memory measurement to perform an experiment where I fill up both the HashMap and MyOwnObject instances.
First of all, sizing objects in Java is non-trivial (as explained very well here).
If you wish to know the size of a particular object, there are at least two open-source libraries that will do the math for you: java.sizeof and javabi-sizeof.
Now, as for your specific test: System.gc() is mostly ignored by modern (HotSpot) JVMs, no matter how many times you call it. Also, is it possible that your useMyDataStructure() method does not retain a reference to the object(s) it creates? In that case, measuring free memory after calling it is no good, as any allocated objects may already have been cleared out.
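Those two caveats suggest a sketch like the following: sample used heap via MemoryMXBean, call System.gc() a few times before each sample (it is only a hint), and hold a strong reference to the structure across the second sample. The array size and class name are illustrative, and the result is an estimate, not an exact size:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;

public class RetainedSizeEstimate {
    public static void main(String[] args) {
        MemoryMXBean bean = ManagementFactory.getMemoryMXBean();

        // System.gc() is only a hint; calling it a few times before each sample
        // improves the odds that garbage has actually been collected.
        for (int i = 0; i < 5; i++) System.gc();
        long before = bean.getHeapMemoryUsage().getUsed();

        // Keep a strong reference, otherwise the allocation may be collected
        // before the second sample and the measurement comes out near zero.
        int[] data = new int[1_000_000]; // roughly 4 MB of payload

        for (int i = 0; i < 5; i++) System.gc();
        long after = bean.getHeapMemoryUsage().getUsed();

        System.out.println("estimated bytes: " + (after - before));
        if (data.length == 0) throw new AssertionError(); // keep 'data' reachable
    }
}
```

The same before/after sampling can wrap the addToHashMap call from the edit to estimate what the map retains, as long as the map (and its singleton owner) stays strongly reachable.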
You could try https://github.com/jbellis/jamm; it works great for me.

Does Immutability of Strings in Java cause Out Of Memory

I have written a simple Java program that reads a million rows from the Database and writes them to a File.
The max memory that this program can use is 512M.
I frequently notice that this program runs Out Of Memory for more than 500K rows.
Since the program is very simple, it is easy to verify that it doesn't have a memory leak. The way the program works is that it fetches a thousand rows from the database, writes them to a file using streams, and then fetches the next thousand rows. The size of each row varies, but none of the rows is huge. In a heap dump taken while the program is running, the older strings are easily seen on the heap. These strings are unreachable, which means they are waiting to be garbage collected. I also believe the GC doesn't necessarily run during the execution of this program, which leaves strings in the heap longer than they should be.
I think the solution would be to use long char arrays (or a StringBuffer) instead of String objects to store the lines returned by the DB. The assumption is that I can overwrite the contents of a char array, so the same char array can be reused across iterations without allocating new space each time.
Pseudocode:
Create an array of arrays using new char[1000][1000];
Fill the thousand rows from the DB into the array.
Write the array to the file.
Reuse the same array for the next thousand rows.
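That pseudocode might be sketched like this (the class name, batch size, and row length are illustrative; rows longer than the buffer are simply truncated):

```java
import java.io.IOException;
import java.io.Writer;

public class BufferedRowWriter {
    // One reusable buffer: each batch of rows overwrites the previous batch,
    // so no new char storage is allocated per iteration.
    private final char[][] buffer = new char[1000][1000];
    private final int[] lengths = new int[1000];

    // Copy a batch of rows into the fixed buffer, recording each row's length.
    public void fillBatch(String[] rows) {
        for (int i = 0; i < rows.length; i++) {
            int len = Math.min(rows[i].length(), buffer[i].length);
            rows[i].getChars(0, len, buffer[i], 0);
            lengths[i] = len;
        }
    }

    // Write the first rowCount buffered rows, one per line.
    public void writeBatch(Writer out, int rowCount) throws IOException {
        for (int i = 0; i < rowCount; i++) {
            out.write(buffer[i], 0, lengths[i]);
            out.write('\n');
        }
    }
}
```

Note that fetching rows from JDBC still creates String objects on the way in, so this only avoids allocations on the buffering and writing side.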
If the above pseudocode fixes my problem, then in reality the immutable nature of the String class hurts the Java programmer: there is no direct way to reclaim the space used by a String, even when the String is no longer in use.
Are there any better alternatives to this problem ?
P.S.: I didn't do a static analysis alone; I used the YourKit profiler to examine a heap dump. The dump clearly says 96% of the Strings have no GC roots, which means they are waiting to be garbage collected. Also, I don't use substring in my code.
Immutability of the class String has absolutely nothing to do with OutOfMemoryError. Immutability means that it cannot ever change, only that.
If you run out of memory, it is simply because the garbage collector was unable to find any garbage to collect.
In practice, it is likely that you are holding references to way too many Strings in memory (for instance, do you have any kind of collection holding strings, such as List, Set, Map?). You must destroy these references to allow the garbage collector to do its job and free up some memory.
The simple answer to this question is 'no'. I suspect you're hanging onto references longer than you think.
Are you closing those streams properly? Are you intern()ing those strings? That would result in a permanent copy being made of the string if one doesn't already exist, taking up PermGen space (which isn't collected). Are you taking substring() of a larger string? In older JVMs (prior to Java 7u6), strings used the flyweight pattern and would share a character array when created using substring(). See here for more details.
You suggest that garbage collection isn't running. The option -verbose:gc will log the garbage collections and you can see immediately what's going on.
The only thing about strings that can cause an OutOfMemoryError is if you retain small sections of a much larger string. If you are doing this, it should be obvious from a heap dump.
When you take a heap dump, I suggest you look only at live objects, in which case any retained objects you don't need are most likely a bug in your code.

java while loop memory leak

I used a while loop to fetch message from Amazon SQS. Partial code is as follows:
ReceiveMessageRequest receiveMessageRequest = new ReceiveMessageRequest(myQueueUrl);
while (true) {
List<Message> messages = sqs.receiveMessage(receiveMessageRequest).getMessages();
if (messages.size() > 0) {
MemcachedClient c = new MemcachedClient(new BinaryConnectionFactory(), AddrUtil.getAddresses(memAddress));
for (Message message : messages) {
// get message from aws sqs
String messageid = message.getBody();
String messageRecieptHandle = message.getReceiptHandle();
sqs.deleteMessage(new DeleteMessageRequest(myQueueUrl, messageRecieptHandle));
// get details info from memcache
String result = null;
String key = null;
key = "message-"+messageid;
result = c.get(key);
}
c.shutdown();
}
}
Will it cause a memory leak in this case?
I checked using "ps aux". What I found is that the RSS (resident set size, the non-swapped physical memory that a task used) is growing slowly.
You can't evaluate whether your Java application has a memory leak simply based on the RSS of the process. Most JVMs are pretty greedy, they would rather take more memory from the OS than spend a lot of work on Garbage Collection.
That said, your while loop doesn't seem to have any obvious memory "leaks" either, but that depends on what some of the method calls do (they aren't included above). If you are storing things in static variables, that can be a cause for concern, but if the only references are within the scope of the loop, you're probably fine.
The simplest way to know if you have a memory leak in a certain area of code is to rigorously exercise that code within a single run of your application (potentially set with a relatively low maximum heap size). If you get an OutOfMemoryError, you probably have a memory leak.
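The point about static variables can be demonstrated with a small sketch (names and sizes are made up):

```java
import java.util.ArrayList;
import java.util.List;

public class LeakDemo {
    // A static collection outlives every loop iteration: everything added here
    // stays reachable, so the garbage collector can never reclaim it.
    static final List<byte[]> retained = new ArrayList<>();

    static void leakyIteration() {
        retained.add(new byte[1024]);
    }

    static void cleanIteration() {
        // This reference dies when the method returns, so the allocation
        // becomes garbage immediately and the loop can run indefinitely.
        byte[] scoped = new byte[1024];
        if (scoped.length == 0) throw new AssertionError();
    }

    public static void main(String[] args) {
        for (int i = 0; i < 1000; i++) {
            leakyIteration();  // retained.size() grows without bound
            cleanIteration();  // leaves nothing behind
        }
        System.out.println(retained.size());
    }
}
```

Run the leaky variant with a small -Xmx and a large iteration count and it will hit OutOfMemoryError; the clean variant never will.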
Sorry, but I don't see any code here that removes messages from the messages list. Did you clear the message list? Also, if the DeleteMessageRequest removed the message from that list, you would be modifying the very list you are iterating over.
Also, you can get better memory usage statistics with the VisualVM tool, which is part of the JDK now.
