What is the effect of "System.gc()" in J2ME? - java

I'm developing a mobile application in J2ME and I'm running into an out-of-memory error. Please give me ideas on how to get rid of this kind of error/exception, and on garbage collection and memory management in J2ME.
I also have a doubt: what is the effect of System.gc() in J2ME?
What is the difference between System.gc() and Runtime.getRuntime().gc() in J2ME/Java?
Thanks & Regards,

Calling System.gc() will not fix an OutOfMemoryError. An OOME is only thrown after the system has made a "best effort" attempt to release memory by garbage collecting (and by other means) ... and failed to free enough memory to continue.
The way to fix OOME errors is to find out what is using all of the memory and try to do something about it.
Possible problems that can lead to OOMEs include:
Memory leaks; i.e. something in your app is causing lots of objects to remain "reachable" after they are no longer required.
Memory hungry data structures or algorithms.
Not enough memory to run the app with that input data.
Your first step to solving this problem should be to use a profiler to see if there are any significant leaks, and to find out more generally what data structures are using all of the memory.
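To make the "leak" idea concrete, here is a minimal sketch (in desktop Java for clarity; on CLDC-based J2ME you would use java.util.Vector instead of generics and ArrayList) of the most common pattern: a long-lived collection that only ever grows. The class and method names are made up for the example.

```java
import java.util.ArrayList;
import java.util.List;

public class LeakExample {
    // Everything added here stays strongly reachable for the lifetime of
    // the class, so the garbage collector can never reclaim it.
    private static final List<byte[]> CACHE = new ArrayList<>();

    static void handleRequest() {
        // Each call "leaks" 1 KB: the reference is kept but never used again.
        CACHE.add(new byte[1024]);
    }

    static int leakedCount() {
        return CACHE.size();
    }

    public static void main(String[] args) {
        for (int i = 0; i < 1000; i++) {
            handleRequest();
        }
        // About 1 MB is now pinned: calling System.gc() cannot free it,
        // because every array is still reachable through CACHE.
        System.out.println("leaked entries: " + leakedCount());
    }
}
```

A profiler will show such a collection as a steadily growing "retained size"; the fix is to remove entries when they are no longer needed, not to call System.gc().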

Runs the garbage collector.
Calling the gc method suggests that the Java Virtual Machine expend
effort toward recycling unused objects in order to make the memory
they currently occupy available for quick reuse. When control returns
from the method call, the Java Virtual Machine has made a best effort
to reclaim space from all discarded objects.
The call System.gc() is effectively equivalent to the call:
Runtime.getRuntime().gc()
-> http://download.oracle.com/javase/6/docs/api/java/lang/System.html#gc%28%29

System.gc() and Runtime.getRuntime().gc() are equivalent. They suggest a garbage collection, but there is no guarantee that this will actually happen.
So, don't rely on it, and in fact, it is very rare that you want to call this at all.
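A small sketch of what "suggestion" means in practice: both calls below issue the same hint, and the free-memory figure afterwards may be larger, smaller, or unchanged, entirely at the JVM's discretion.

```java
public class GcHint {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long before = rt.freeMemory();

        // These two calls are equivalent hints; neither guarantees that a
        // collection actually runs, nor that any memory is reclaimed.
        System.gc();
        rt.gc();

        long after = rt.freeMemory();
        // No assertion is possible about the relation of before and after.
        System.out.println("free before: " + before + ", after: " + after);
    }
}
```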

Related

How do I get all objects in RAM for a Java application

My Java application consumes more and more RAM, and old objects stay in memory.
I need a tool to get all objects still in memory and to find out how to remove them.
I need a tool for performance testing too.
Have you tried forcing the Garbage Collector to run with System.gc()? If that's not properly clearing out your items, it is likely you have a lot of memory leaks.
I recommend running a profiler on your application to start finding where you have left objects that are no longer used but cannot be released by the Garbage Collector due to lingering references.

Java System.GC() and Memory Leakage

I have read up and searched through the forums about leaking memory and gradually increasing RAM. I tried calling the System.GC() method every 60 seconds in my program and it seems to be working, given that my RAM usage drops on every call. Why is it a good idea not to use this method? Every post I have read seemed to vaguely explain why the method does not free up memory, yet my program seems to say otherwise. Some even said the method does nothing at all but suggest that the Garbage Collector clean itself up. NOTE: I know my leak is not from static methods, because I removed them from my entire project and the RAM still increased. I would post my code, but it is rather large, so I doubt anyone is up to reading it.
Thanks for the help.
As you stated, System.gc() is just a suggestion. It's not guaranteed to force the garbage collection, though in practice it frequently does.
The Java garbage collection runs on its own periodically. If you see that your memory is increasing over time and you're not reclaiming it, then you have a memory leak. Calling System.gc() won't fix that. If your memory is leaking, eventually there will be nothing to collect.
In general, you shouldn't need to force GC. As I mentioned, the GC will run on its own. You can tweak its behavior - http://www.oracle.com/technetwork/java/javase/gc-tuning-6-140523.html.
The original problem comes from a memory leak.
The symptom is:
* because of the memory leak, there is not enough memory space
* so the JVM tries to GC again and again
* but there is still not enough memory, so it GCs again and again
So System.GC(), or GC tuning of that kind, is not helpful.
To fix this problem, you have to find out where the memory leak is.
The JVM has tools that dump the current memory footprint (a heap dump).
You can find the leakage point by using one of these.
For more information, please refer to this - http://www.oracle.com/technetwork/java/javase/memleaks-137499.html
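As an illustration of taking a heap dump programmatically (this uses the HotSpot-specific diagnostic MXBean, so it applies to desktop HotSpot/OpenJDK, not to a J2ME phone VM; the file name is just an example):

```java
import com.sun.management.HotSpotDiagnosticMXBean;
import java.io.File;
import java.io.IOException;
import java.lang.management.ManagementFactory;

public class HeapDumper {
    /**
     * Writes a heap dump to the given path. The path must end in ".hprof"
     * and must not already exist; live=true dumps only reachable objects.
     */
    static void dumpHeap(String path, boolean live) throws IOException {
        HotSpotDiagnosticMXBean bean =
            ManagementFactory.getPlatformMXBean(HotSpotDiagnosticMXBean.class);
        bean.dumpHeap(path, live);
    }

    public static void main(String[] args) throws IOException {
        File out = new File("heap.hprof");
        out.delete();                 // dumpHeap refuses to overwrite a file
        dumpHeap(out.getPath(), true);
        System.out.println("dump size: " + out.length() + " bytes");
        out.delete();
    }
}
```

The resulting .hprof file can be opened in VisualVM or Eclipse MAT to see which objects retain the most memory.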

Memory Leakage due to HashMap

I am facing a memory leak because of a HashMap in my code. When I first log in to the application this HashMap is populated, and I use the Map to cache some data.
I use this cached data in several places in my application.
Its size grows continuously after login, even when nothing is running in the application.
The size only decreases when the garbage collector is called automatically, or when I call it myself.
But after that it starts increasing again. It is a memory leak for sure, but how can I avoid it?
My profiler also shows ResultSet.getString() and Statement.execute() as the hotspots for memory allocation; these methods are used to populate the cache.
Is the memory leak there because of these methods? I close the DB connection in a finally block.
Why is it still showing me these methods?
As the comments above explain, this does not sound like a memory leak.
In a java application, the JVM will create objects and use up memory. As time goes on, some of the objects will go out of scope (become eligible for garbage-collection) but until the next collection happens, they will still be in the heap, 'using up memory'. This is not a problem, it is how java works. When the JVM decides it needs to free up memory, it will run a collection and the used memory should drop.
Should you care about what you are seeing? I can think of two reasons why you should and one why you shouldn't. If your garbage collections free up enough memory for the application to keep running, the collections do not affect performance, and you are a busy person with other things to do, then I see no reason why you should care.
If however, you are worried that you do not understand how the application works in detail, or you have a reason why "so much memory" is an issue (you will want to run the application with even more data in future, or will want to run it with less heap assigned in future), then you may want to investigate.
If the memory is being used up when the application is doing nothing, then I would focus on that. What is it really doing when it is doing 'nothing'? I bet it's doing 'something'.
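If the cache does turn out to be the culprit, one common fix is to bound it so it cannot grow without limit. A minimal sketch using LinkedHashMap's eviction hook (the class name and the cap of 2 in the usage example are made up; a real cap would be sized to your heap):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class BoundedCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;

    public BoundedCache(int maxEntries) {
        // accessOrder = true makes iteration order least-recently-used
        // first, so the "eldest" entry is the LRU one.
        super(16, 0.75f, true);
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Returning true tells LinkedHashMap to drop the eldest entry,
        // so the cache can never exceed maxEntries.
        return size() > maxEntries;
    }
}
```

Alternatively, a WeakHashMap lets entries disappear once their keys are no longer referenced elsewhere, which trades predictability for automatic cleanup.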

Tune Java GC, so that it would immediately throw OOME, rather than slow down indeterminately

I've noticed that sometimes, when memory is nearly exhausted, the GC tries to complete at any cost in performance (causing the program to nearly freeze, sometimes for multiple minutes), rather than just throwing an OOME (OutOfMemoryError) immediately.
Is there a way to tune the GC concerning this aspect?
Slowing down the program to nearly zero-speed makes it unresponsive. In certain cases it would be better to have a response "I'm dead" rather than no response at all.
Something like what you're after is built into recent JVMs.
If you:
are using Hotspot VM from (at least) Java 6
are using the Parallel or Concurrent garbage collectors
have the option UseGCOverheadLimit enabled (it's on by default with those collectors, so more specifically if you haven't disabled it)
then you will get an OOM before actually running out of memory: if more than 98% of recent time has been spent in GC for recovery of <2% of the heap size, you'll get a preemptive OOM.
Tuning these parameters (the 98% in particular) sounds like it would be useful to you, however there is no way as far as I'm aware to tune those thresholds.
However, check that you qualify under the three points above; if you're not using those collectors with that flag, this may help your situation.
It's worth reading the HotSpot JVM tuning guide, which can be a big help with this stuff.
I am not aware of any way to configure the Java garbage collector in the manner you describe.
One way might be for your application to proactively monitor the amount of free memory, e.g. using Runtime.freeMemory(), and declare the "I'm dead" condition if that drops below a certain threshold and can't be rectified with a forced garbage collection cycle.
The idea is to pick the value for the threshold that's large enough for the process to never get into the situation you describe.
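A minimal sketch of that idea, with a made-up 10% threshold (the class name and threshold are assumptions for the example, not a recommendation):

```java
public class MemoryWatchdog {
    // Hypothetical threshold: declare trouble below 10% of max heap free.
    private static final double MIN_FREE_FRACTION = 0.10;

    /**
     * Returns true if free memory stays below the threshold even after a
     * forced collection, i.e. the "I'm dead" condition described above.
     */
    static boolean lowMemory() {
        Runtime rt = Runtime.getRuntime();
        if (freeFraction(rt) >= MIN_FREE_FRACTION) {
            return false;
        }
        rt.gc();  // forced collection attempt; still only a hint to the JVM
        return freeFraction(rt) < MIN_FREE_FRACTION;
    }

    private static double freeFraction(Runtime rt) {
        // Memory the JVM could still use: unallocated heap + free heap.
        long potentiallyFree =
            rt.maxMemory() - rt.totalMemory() + rt.freeMemory();
        return (double) potentiallyFree / rt.maxMemory();
    }

    public static void main(String[] args) {
        System.out.println("low memory: " + lowMemory());
    }
}
```

Note that freeMemory() alone is misleading, since the heap usually has not grown to its -Xmx limit yet; the calculation above accounts for the still-unallocated headroom.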
I strongly advise against this; Java trying to GC rather than immediately throwing an OutOfMemoryError makes far more sense - don't make your application fall over unless every alternative has been exhausted.
If your application is running out of memory, you should be increasing your max heap size or looking at its performance in terms of memory allocation and seeing if it can be optimised.
Some things to look at would be:
Use weak references in places where your objects would not be required if not referenced anywhere else.
Don't allocate larger objects than you need (i.e. storing a huge array of 100 objects when you are only going to need access to three of them over the array's lifecycle), or use a long datatype when you only need to store eight values.
Don't hold onto references to objects longer than you would need!
Edit: I think you misunderstand my point. If you accidentally leave a live reference to an object that no longer needs to be used, it will obviously not be garbage collected. This has nothing to do with nulling just in case - a typical example would be using a large object for a specific purpose, but when it goes out of scope it is not GC'ed because a live reference has accidentally been left elsewhere, somewhere you don't know about, causing a leak. A typical example of this is a hashtable lookup, which can be solved with weak references, as the entry becomes eligible for GC when it is only weakly reachable.
Regardless, these are just general ideas off the top of my head on how to improve performance with memory allocation. The point I am trying to make is that asking how to throw an OutOfMemoryError more quickly, rather than letting the Java GC try its best to free up space on the heap, is not a great idea IMO. Optimize your application instead.
Well, it turns out there has been a solution since Java 8u92:
-XX:+ExitOnOutOfMemoryError
When you enable this option, the JVM exits on the first occurrence of an out-of-memory error. It can be used if you prefer restarting an instance of the JVM rather than handling out of memory errors.
-XX:+CrashOnOutOfMemoryError
If this option is enabled, when an out-of-memory error occurs, the JVM crashes and produces text and binary crash files (if core files are enabled).
A good idea is to combine one of the above options with the good old -XX:+HeapDumpOnOutOfMemoryError
I tested these options, they actually work as expected!
Links
See the feature description
See List of changes in that Java release

Force full garbage collection when memory occupation goes beyond a certain threshold

I have a server application that, in rare occasions, can allocate large chunks of memory.
It's not a memory leak, as these chunks can be claimed back by the garbage collector by executing a full garbage collection. Normal garbage collection frees amounts of memory that are too small: it is not adequate in this context.
The garbage collector executes these full GCs when it deems appropriate, namely when the memory footprint of the application nears the allotted maximum specified with -Xmx.
That would be ok, if it wasn't for the fact that these problematic memory allocations come in bursts, and can cause OutOfMemoryErrors due to the fact that the jvm is not able to perform a GC quickly enough to free the required memory. If I manually call System.gc() beforehand, I can prevent this situation.
Anyway, I'd prefer not having to monitor my jvm's memory allocation myself (or insert memory management into my application's logic); it would be nice if there was a way to run the virtual machine with a memory threshold, over which full GCs would be executed automatically, in order to release very early the memory I'm going to need.
Long story short: I need a way (a command line option?) to configure the jvm in order to release early a good amount of memory (i.e. perform a full GC) when memory occupation reaches a certain threshold, I don't care if this slows my application down every once in a while.
All I've found till now are ways to modify the size of the generations, but that's not what I need (at least not directly).
I'd appreciate your suggestions,
Silvio
P.S. I'm working on a way to avoid large allocations, but it could require a long time and meanwhile my app needs a little stability
UPDATE: analyzing the app with jvisualvm, I can see that the problem is in the old generation
From here (this is a 1.4.2 page, but the same option should exist in all Sun JVMs):
assuming you're using the CMS garbage collector (which I believe the server turns on by default), the option you want is
-XX:CMSInitiatingOccupancyFraction=<percent>
where <percent> is the percentage of memory in use that will trigger a full GC.
Insert standard disclaimers here that messing with GC parameters can give you severe performance problems, varies wildly by machine, etc.
When you allocate large objects that do not fit into the young generation, they are immediately allocated in the tenured generation space. This space is only GC'ed when a full-GC is run which you try to force.
However I am not sure this would solve your problem. You say "JVM is not able to perform a GC quickly enough". Even if your allocations come in bursts, each allocation will cause the VM to check if it has enough space available to do it. If not - and if the object is too large for the young generation - it will cause a full GC which should "stop the world", thereby preventing new allocations from taking place in the first place. Once the GC is complete, your new object will be allocated.
If shortly after that the second large allocation is requested in your burst, it will do the same thing again. Depending on whether the initial object is still needed, it will either be able to succeed in GC'ing it, thereby making room for the next allocation, or fail if the first instance is still referenced.
You say "I need a way [...] to release early a good amount of memory (i.e. perform a full GC) when memory occupation reaches a certain threshold". This by definition can only succeed, if that "good amount of memory" is not referenced by anything in your application anymore.
From what I understand here, you might have a race condition which you might sometimes avoid by interspersing manual GC requests. In general you should never have to worry about these things - from my experience an OutOfMemoryError only occurs if there are in fact too many allocations to be fit into the heap concurrently. In all other situations the "only" problem should be a performance degradation (which might become extreme, depending on the circumstances, but this is a different problem).
I suggest you do further analysis of the exact problem to rule this out. I recommend the VisualVM tool that comes with Java 6. Start it and install the VisualGC plugin. This will allow you to see the different memory generations and their sizes. Also there is a plethora of GC related logging options, depending on which VM you use. Some options have been mentioned in other answers.
The other options for choosing which GC to use and how to tweak thresholds should not matter in your case, because they all depend on enough memory being available to contain all the objects that your application needs at any given time. These options can be helpful if you have performance problems related to heavy GC activity, but I fear they will not lead to a solution in your particular case.
Once you are more confident in what is actually happening, finding a solution will become easier.
Do you know which of the garbage collection pools are growing too large, i.e. eden vs. survivor space? (Try the JVM option -Xloggc:<file>, which logs GC status to a file with timestamps.) When you know this, you should be able to tweak the size of the affected pool with one of the options mentioned here: hotspot options for Java 1.4
I know that page is for the 1.4 JVM; I can't seem to find the same -X options in my current 1.6 install's help output, unless setting those individual pool sizes is a non-standard feature!
The JVM is only supposed to throw an OutOfMemoryError after it has attempted to release memory via garbage collection (according to both the API docs for OutOfMemoryError and the JVM specification). Therefore your attempts to force garbage collection shouldn't make any difference. So there might be something more significant going on here - either a problem with your program not properly clearing references or, less likely, a JVM bug.
There's a very detailed explanation of how GC works here and it lists parameters to control memory available to different memory pools/generations.
Try using the -server option. It will enable the parallel GC, and you will see some performance increase if you use a multi-core processor.
Have you tried playing with G1 gc? It should be available in 1.6.0u14 onwards.
