Java - Allocated space not reduced

I'm developing a Java application which sometimes does some heavy work.
When that happens, it uses more RAM than usual, so the allocated memory space of the app grows.
My question is: why is the allocated space not reduced once the work is finished?
Using a profiler, I can see that, for example, 70 MB is allocated but only 5 MB is used!
It looks like the allocated space can only grow, never shrink.
Thanks

Usually the JVM is very reluctant to free memory it has allocated. You can configure it to free memory more aggressively, though. Try passing these settings to the JVM when you start your program:
-XX:GCTimeRatio=5
-XX:AdaptiveSizeDecrementScaleFactor=1
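For example, a full invocation might look like this (the jar name is illustrative):
java -XX:GCTimeRatio=5 -XX:AdaptiveSizeDecrementScaleFactor=1 -jar myapp.jar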

The JVM decides when to release memory back to the operating system. In my experience with Windows XP, this almost never happens. Occasionally I've seen memory released back when the Command Prompt (or Swing window) is minimized. I believe the JVM on Linux is better at returning memory.

Generally there can be two reasons.
Your program may have a memory management problem. If, for example, you store objects in a collection and never remove them, they will never be garbage collected. If this is the case, you have a bug that should be found and fixed.
Alternatively, your code may be fine but the GC still does not remove objects that are no longer used. The reason is that the GC has a life of its own and makes its own decisions: if, for example, it thinks it has enough memory, it does not remove unused objects until memory usage reaches some threshold.
To determine which case applies here, try calling System.gc(), either programmatically or from your profiler (profilers usually have a button that runs the GC). If the unused objects are removed after forcing a GC run, your code is OK. Otherwise, try to locate the bug; the profiler you are already using should help.
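For example, a minimal in-application check (a rough sketch; the numbers are approximate and System.gc() is only a hint):

class GcCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long before = rt.totalMemory() - rt.freeMemory();
        System.gc(); // merely suggests a collection; the JVM may ignore it
        long after = rt.totalMemory() - rt.freeMemory();
        System.out.println("Used before GC: " + (before >> 20) + " MB");
        System.out.println("Used after GC:  " + (after >> 20) + " MB");
    }
}

If "used after" drops close to the size of your genuinely live data, the objects were collectable and your code is probably fine.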

Related

Java is not able to collect garbage in time

I have a problem where the JVM is not able to perform GC in time and the application freezes. The "solution" is to connect to the application using JConsole and suggest that the JVM run a garbage collection. Needless to say, this is very poor behavior for an application. Is there some option to suggest that the JVM perform GC sooner or more often? Or perhaps some other real solution to this problem?
The problem appears to be not a lack of memory but that the GC is not able to collect in time, before new data is sent to the application. The GC appears to start collecting too late. If it is prompted early enough via the System.gc() button in JConsole, the problem does not occur.
The young generation is collected by 'PS Scavenge', which is a parallel collector.
The old generation is collected by 'PS MarkSweep', which is a parallel mark-and-sweep collector.
You should check for memory leaks.
I'm pretty sure you won't get an OutOfMemoryError unless there is no memory to be released and no more memory available.
There is System.gc() that does exactly what you described: It suggests to the JVM that a garbage collection should take place. (There are also command-line arguments for the JVM that can serve as directives for the memory manager.)
However, if you're running out of memory during an allocation, it typically means that the JVM did attempt a garbage collection first and it failed to release the necessary memory. In that case, you probably have memory leaks (in the sense of keeping unnecessary references) and you should get a memory profiler to check that. This is important because if you have memory leaks, then more frequent garbage collections will not solve your problem - except that maybe they will postpone its manifestation, giving you a false sense of security.
From the Java specification:
OutOfMemoryError: The Java Virtual Machine implementation has run out of either virtual or physical memory, and the automatic storage manager was unable to reclaim enough memory to satisfy an object creation request.
You can deploy JavaMelody on your server and add your application to it; it will give you a detailed report of your memory leaks and memory usage. With this you will be able to optimize your system and correct your code.
I'd guess that either your application requires more memory to run efficiently - try tuning your JVM by setting parameters like -Xms512M -Xmx1024M.
Or,
there is a memory leak which is exhausting the memory.
You should check the memory consumption pattern of your application, e.g. how much memory it occupies when processing heavily versus when idle.
If you observe a constant rise in memory peaks, it could point to a possible memory leak.
One of the best threads on memory leak issues is How to find a Java Memory Leak
Another good one is http://www.ibm.com/developerworks/library/j-leaks/
Additionally,
you may receive an OOME if you're loading a lot of classes (say, all classes present in your rt.jar). Since loaded classes reside in PermGen rather than heap memory, you may also want to increase your PermGen size using the -XX:MaxPermSize switch.
And, of course, you're free to choose a garbage collector – ParallelGC, ConcMarkSweepGC (CMS) or G1GC (G1).
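For instance, a possible combination of these switches (size illustrative; -XX:MaxPermSize applies to HotSpot before Java 8):
java -XX:MaxPermSize=256m -XX:+UseConcMarkSweepGC MyApp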
Please be aware that there are APIs in Java that may cause memory leaks by themselves (without any programmer error), e.g. java.lang.String#substring() (see here)
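To illustrate the substring case: on JDKs before 7u6, String.substring() shared the parent string's backing char array, so a tiny substring could pin a huge string in memory. A sketch of the old behavior and the usual workaround:

class SubstringPin {
    static String token;

    public static void main(String[] args) {
        String huge = new String(new char[10000000]); // ~20 MB of char data
        // Pre-7u6: 'token' still referenced the 20 MB backing array.
        token = huge.substring(0, 4);
        // Workaround on those JDKs: copy into a right-sized string.
        token = new String(huge.substring(0, 4));
    }
}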
If your application freezes, but gets unfrozen by a forced GC, then your problem is very probably not the memory, but some other resource leak, which is alleviated by running finalizers on dead objects. Properly written code must never rely on finalizers to do the cleanup, so try to find any unclosed resources in your application.
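For the unclosed-resource case, Java 7's try-with-resources closes resources deterministically instead of leaving cleanup to finalizers; a minimal sketch (file path illustrative):

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

class ReadFirstLine {
    static String firstLine(String path) throws IOException {
        // The reader is closed when the block exits, even on exception,
        // so the file handle never waits for a finalizer to run.
        try (BufferedReader in = new BufferedReader(new FileReader(path))) {
            return in.readLine();
        }
    }
}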
You can start the JVM with more memory:
java -Xms512M -Xmx1024M
will start the JVM with 512 MB of memory, allowing it to grow to a gigabyte.
You can use System.gc() to suggest to the VM to run the garbage collector. There is no guarantee that it will run immediately.
I doubt that will help, but it might work. Another thing you could look at is increasing the maximum heap size of the JVM. You can do this with the command-line argument -Xmx512m. This would give 512 megabytes of heap instead of the default 128.
You can use JConsole to view the memory usage of your application. This can help to see how the memory usage develops which is useful in detecting memory leaks.

Does the JVM store memory in the system? If so, how do I clear it?

I am running an application in NetBeans, and in the project properties I have set the max JVM heap space to 1 GiB.
But still, the application crashes with an Out of Memory error.
Does the JVM have memory stored in the system? If so, how do I clear that memory?
You'll want to analyse your code with a profiler - NetBeans has a good one. This will show you where the memory is tied up in your application, and should give you an idea as to where the problem lies.
The JVM will garbage collect objects as much as it can before it runs out of memory, so chances are you're holding onto references long after you no longer need them. Either that, or your application is genuinely one that requires a lot of memory - but I'd say it's far more likely to be a bug in your code, especially if the issue only crops up after running the application for a long period of time.
I do not fully understand all the details of your question, but I think the important part is clear.
The OutOfMemoryError (not an exception) is thrown if the memory allocated to your JVM does not suffice for the objects created in your program. In your case it might help to increase the available heap space beyond 1 GiB. If you think 1 GiB should be enough, you may have a memory leak (which, in Java, most likely means that you are holding references to objects that you no longer need - maybe in some sort of cache?).
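A classic form of such a leak is an unbounded static cache; a minimal sketch (names and sizes are illustrative):

import java.util.HashMap;
import java.util.Map;

class LeakyCache {
    // Entries are never evicted, so everything ever cached stays
    // strongly reachable and can never be garbage collected.
    private static final Map<String, byte[]> CACHE =
            new HashMap<String, byte[]>();

    static byte[] load(String key) {
        byte[] value = CACHE.get(key);
        if (value == null) {
            value = new byte[1024 * 1024]; // stand-in for real data
            CACHE.put(key, value);
        }
        return value;
    }
    // Fixes: evict entries explicitly, bound the cache size, or hold
    // the values through soft/weak references so the GC can reclaim them.
}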
Java reserves virtual memory for its maximum heap size on startup. As the program uses this memory, more main memory is allocated to it by the OS. Under UNIX this appears as resident memory. While Java programs can swap to disk, garbage collection performs extremely badly if part of the heap is swapped out, and it can result in the whole machine locking up or having to be rebooted. If your program is not doing this, you can be sure it is entirely in main memory.
Depending on what your application does it might need 1 GB, 10 GB or 100 GB or more. If you cannot increase the maximum memory size, you can use a memory profiler to help you find ways to reduce consumption. Start with VisualVM as it is built in and free and does a decent job. If this is not enough, try a commercial profiler such as YourKit for which you can get a free evaluation license (usually works long enough to fix your problem ;)
The garbage collector automatically cleans out the memory as required and may be doing this every few seconds, or even more than once per second. If it is this could be slowing down your application, so you should consider increasing the maximum size or reducing consumption.
As pointed out by @camobap, the reason for the OutOfMemoryError was that the PermGen size was set very low. The issue is now resolved.
Thank you all for the answers and comments.
The Java runtime doesn't allocate 1 GiB up front in the way I think you're imagining. Java dynamically allocates the memory it needs and garbage collects it too; every time it allocates memory it checks whether it has enough space to do so, and if not, it crashes. I am guessing that somewhere in your code - because it would be nearly impossible to declare that many variables by hand - you have an array or ArrayList that takes up all the memory. In the case of an array, you probably have a variable setting its size and did some calculation that made it too large. In the case of an ArrayList, you may have a loop that runs too many iterations while adding elements.
Check your code for the above errors and you should be good.

Java: why does it use a fixed amount of memory? Or how does it manage memory?

It seems that the JVM uses some fixed amount of memory. At least I have often seen parameters -Xmx (for the maximum size) and -Xms (for the initial size) which suggest that.
I got the feeling that Java applications don't handle memory very well. Some things I have noticed:
Even some very small sample demo applications load huge amounts of memory. Maybe this is because of the Java library which is loaded. But why does the library need to be loaded for each Java instance? (It seems that way because multiple small applications take linearly more memory. See here for some details where I describe this problem.) Or why is it done that way?
Big Java applications like Eclipse often crash with some OutOfMemory exception. This was always strange because there was still plenty of memory available on my system. Often, they consume more and more memory over runtime. I'm not sure if they have some memory leaks or if this is because of fragmentation in the memory pool -- I got the feeling that the latter is the case.
The Java library seems to require much more memory than similarly powerful libraries like Qt, for example. Why is this? (To compare, start some Qt applications and look at their memory usage, then start some Java apps.)
Why doesn't it just use the underlying system facilities like malloc and free? Or if they don't like the libc implementation, they could use jemalloc (as in FreeBSD and Firefox), which seems to be quite good. I am quite sure this would perform better than the JVM memory pool. And not only perform better, but also require less memory, especially for small applications.
Addition: Has somebody tried that already? I would be very interested in an LLVM-based JIT compiler for Java which just uses malloc/free for memory handling.
Or maybe this also differs from JVM implementation to implementation? I have used mostly the Sun JVM.
(Also note: I'm not directly talking about the GC here. The GC is only responsible for determining which objects can be deleted and for initiating the freeing; the actual freeing is a different subsystem. Afaik, it is its own memory pool implementation, not just a call to free.)
Edit: A very related question: Why does the (Sun) JVM have a fixed upper limit for memory usage? Or to put it differently: why does the JVM handle memory allocations differently than native applications?
You need to keep in mind that the Garbage Collector does a lot more than just collecting unreachable objects. It also optimizes the heap space and keeps track of exactly where there is memory available to allocate for the creation of new objects.
Knowing immediately where there is free memory makes the allocation of new objects into the young generation efficient, and prevents the need to run back and forth to the underlying OS. The JIT compiler also optimizes such allocations away from the JVM layer, according to Sun's Jon Masamitsu:
Fast-path allocation does not call into the JVM to allocate an object. The JIT compilers know how to allocate out of the young generation and code for an allocation is generated in-line for object allocation. The interpreter also knows how to do the allocation without making a call to the VM.
Note that the JVM goes to great lengths to try to get large contiguous memory blocks as well, which likely have their own performance benefits (See "The Cost of Missing the Cache"). I imagine calls to malloc (or the alternatives) have a limited likelihood of providing contiguous memory across calls, but maybe I missed something there.
Additionally, by maintaining the memory itself, the Garbage Collector can make allocation optimizations based on usage and access patterns. Now, I have no idea to what extent it does this, but given that there's a registered Sun patent for this concept, I imagine they've done something with it.
Keeping these memory blocks allocated also provides a safeguard for the Java program. Since the garbage collection is hidden from the programmer, they can't tell the JVM "No, keep that memory; I'm done with these objects, but I'll need the space for new ones." By keeping the memory, the GC doesn't risk giving up memory it won't be able to get back. Naturally, you can always get an OutOfMemoryException either way, but it seems more reasonable not to needlessly give memory back to the operating system every time you're done with an object, since you already went to the trouble to get it for yourself.
All of that aside, I'll try to directly address a few of your comments:
Often, they consume more and more memory over runtime.
Assuming that this isn't just what the program is doing (for whatever reason, maybe it has a leak, maybe it has to keep track of an increasing amount of data), I imagine that it has to do with the free heap space ratio defaults set by the (Sun/Oracle) JVM. The default value for -XX:MinHeapFreeRatio is 40%, while -XX:MaxHeapFreeRatio is 70%. This means that whenever less than 40% of the heap is free, the heap will be resized by claiming more memory from the operating system (provided that this won't exceed -Xmx). Conversely, it will only* release heap memory back to the operating system if more than 70% is free.
Consider what happens if I run a memory-intensive operation in Eclipse; profiling, for example. My memory consumption will shoot up, resizing the heap (likely multiple times) along the way. Once I'm done, the memory requirement falls back down, but it likely won't drop so far that 70% of the heap is free. That means that there's now a lot of underutilized space allocated that the JVM has no intention of releasing. This is a major drawback, but you might be able to work around it by customizing the percentages to your situation. To get a better picture of this, you really should profile your application so you can see the utilized versus allocated heap space. I personally use YourKit, but there are many good alternatives to choose from.
*I don't know if this is actually the only time and how this is observed from the perspective of the OS, but the documentation says it's the "maximum percentage of heap free after GC to avoid shrinking," which seems to suggest that.
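For example, to make the JVM return memory to the OS more eagerly, you could lower both ratios (values illustrative):
java -XX:MinHeapFreeRatio=20 -XX:MaxHeapFreeRatio=40 MyApp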
Even some very small sample demo applications load huge amounts of memory.
I guess this depends on what kind of applications they are. I feel that Java GUI applications run memory-heavy, but I don't have any evidence one way or another. Did you have a specific example that we could look at?
But why is it needed to load the library for each Java instance?
Well, how would you handle loading multiple Java applications if not by creating new JVM processes? The isolation between processes is a good thing, and it implies independent loading. I don't think that's so uncommon for processes in general, though.
As a final note, the slow start times you asked about in another question likely come from several initial heap reallocations necessary to reach the baseline application memory requirement (due to -Xms and -XX:MinHeapFreeRatio), depending on the default values of your JVM.
Java runs inside a Virtual Machine, which constrains many parts of its behavior. Note the term "Virtual Machine." It is literally running as though the machine is a separate entity, and the underlying machine/OS are simply resources. The -Xmx value is defining the maximum amount of memory that the VM will have, while the -Xms defines the starting memory available to the application.
The VM is a product of the binary being system agnostic - this was a solution used to allow the byte code to execute wherever. This is similar to an emulator - say for old gaming systems. It is emulating the "machine" that the game runs on.
The reason why you run into an OutOfMemoryException is because the Virtual Machine has hit the -Xmx limit - it has literally run out of memory.
As far as smaller programs go, they will often require a larger percentage of their memory for the VM. Also, Java has default starting -Xmx and -Xms values (I don't remember exactly what they are right now) that it will always start with. The overhead of the VM and the libraries becomes much less noticeable when you start to build and run "real" applications.
The memory argument regarding Qt and the like is true, but it is not the whole story. While Java uses more memory than some of those libraries, those are compiled for specific architectures. It has been a while since I used Qt or similar libraries, but I remember their memory management not being very robust, and memory leaks are still common today in C/C++ programs. The nice thing about garbage collection is that it removes many of the common "gotchas" that cause memory leaks. (Note: not all of them. It is still very possible to leak memory in Java, just a bit harder.)
Hope this helps clear up some of the confusion you may have been having.
To answer a portion of your question:
At start-up, Java allocates a "heap" of memory, a fixed-size block (the -Xms parameter). It doesn't actually use all of this memory right off the bat, but it tells the OS "I want this much memory". Then, as you create objects and do work in the Java environment, it puts the created objects into this heap of pre-allocated memory. If that block of memory fills up, it requests a little more from the OS, up until the "max heap size" (the -Xmx parameter) is reached.
Once that max size is reached, Java will no longer request more RAM from the OS, even if there is plenty free. If you try to create more objects and there is no heap space left, you will get an OutOfMemory exception.
Now, if you are looking at Windows Task Manager or something like that, you'll see "java.exe" using X megs of memory. That sort-of corresponds to the amount of memory it has requested for the heap, not the amount of memory actually in use inside the heap.
In other words, I could write the application:
class myfirstjavaprog
{
    public static void main(String args[])
    {
        System.out.println("Hello World!");
    }
}
Which would take very little memory. But if I ran it with the command line:
java.exe -Xms1024M myfirstjavaprog
then on startup Java would immediately ask the OS for 1,024 MB of RAM, and that's what would show in Windows Task Manager. In actuality, that RAM isn't being used, but Java has reserved it for later use.
Conversely, if I had an app that tried to create a 10,000 byte large array:
class myfirstjavaprog
{
    public static void main(String args[])
    {
        byte[] myArray = new byte[10000];
    }
}
but ran it with the command line:
java.exe -Xms100 -Xmx100 myfirstjavaprog
then Java could only allocate up to 100 bytes of memory. (In practice the JVM refuses to start with a heap that tiny, but it illustrates the point.) Since a 10,000-byte array won't fit into a 100-byte heap, that would throw an OutOfMemory exception, even though the OS has plenty of RAM.
I hope that makes sense...
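If you want to see the reserved-versus-used distinction from inside a program, the Runtime API exposes both numbers; a small sketch:

class HeapStats {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        // totalMemory = heap currently reserved from the OS,
        // freeMemory  = unused part of that reservation,
        // maxMemory   = the -Xmx ceiling.
        long used = rt.totalMemory() - rt.freeMemory();
        System.out.println("Reserved: " + (rt.totalMemory() >> 20) + " MB");
        System.out.println("Used:     " + (used >> 20) + " MB");
        System.out.println("Max:      " + (rt.maxMemory() >> 20) + " MB");
    }
}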
Edit:
Going back to "why does Java use so much memory": if you are looking at what the OS reports, that isn't what Java is actually using, it's only what it has reserved for use. If you want to know what Java has actually used, you can do a heap dump and explore every object in the heap and see how much memory it uses.
To answer "why doesn't it just let the OS handle it?", well I guess that is just a fundamental Java question for those that designed it. The way I look at it; Java runs in the JVM, which is a virtual machine. If you create a VMWare instance or just about any other "virtualization" of a system, you usually have to specify how much memory that virtual system will/can consume. I consider the JVM to be similar. Also, this abstracted memory model lets the JVM's for different OSes all act in a similar way. So for example Linux and Windows have different RAM allocation models, but the JVM can abstract that away and follow the same memory usage for the different OSes.
Java does use malloc and free, or at least the implementations of the JVM may. But since Java tracks allocations and garbage collects unreachable objects, they are definitely not enough.
As for the rest of your text, I'm not sure if there's a question there.
Even some very small sample demo applications load huge amounts of memory. Maybe this is because of the Java library which is loaded. But why is it needed to load the library for each Java instance? (It seems that way because multiple small applications linearly take more memory. See here for some details where I describe this problem.) Or why is it done that way?
That's likely due to the overhead of starting and running the JVM.
Big Java applications like Eclipse often crash with some OutOfMemory exception. This was always strange because there was still plenty of memory available on my system. Often, they consume more and more memory over runtime. I'm not sure if they have some memory leaks or if this is because of fragmentation in the memory pool -- I got the feeling that the latter is the case.
I'm not entirely sure what you mean by "often crash," as I don't think this has happened to me in quite a long time. If it is, it's likely due to the "maximum size" setting you mentioned earlier.
Your main question asking why Java doesn't use malloc and free comes down to a matter of target market. Java was designed to eliminate the headache of memory management from the developer. Java's garbage collector does a reasonably good job of freeing up memory when it can be freed, but Java isn't meant to rival C++ in situations with memory restrictions. Java does what it was intended to do (remove developer level memory management) well, and the JVM picks up the responsibility well enough that it's good enough for most applications.
The limits are a deliberate design decision by Sun. I've seen at least two other JVMs which do not have this design - the Microsoft one and the IBM one for their non-PC AS/400 systems. Both grow as needed, using as much memory as required.
Java doesn't use a fixed amount of memory; it always stays in the range from -Xms to -Xmx.
If Eclipse crashes with an OutOfMemoryError, then it required more memory than granted by -Xmx (a configuration issue).
Java need not use malloc/free (for object creation) since its memory handling is quite different due to garbage collection (GC). The GC automatically removes unused objects, which is a benefit compared to being responsible for memory management yourself.
For details on this complex topic, see Tuning Garbage Collection.

Java memory usage increases when the app is used, but doesn't decrease when it is idle

I have a Java application that uses a lot of memory when in use, but when the program is not being used, the memory usage doesn't go down.
Is there a way to force Java to release this memory? This memory is not needed at that time; I can understand reserving a small amount, but Java reserves all the memory it ever uses. It does reuse this memory later, but there must be a way to force Java to release it when it's not needed.
System.gc() is not working.
As pointed out in the comments, it's not certain that, even though the garbage collector disposes of objects, it gives the memory back to the system.
Perhaps Tuning Garbage Collection Outline provides the solution to your problem:
By default the JVM grows or shrinks the heap at each GC to keep the ratio of free space to live objects at each collection within a specified range.
-XX:MinHeapFreeRatio - when the percentage of free space in a generation falls below this value, the generation is expanded to meet this percentage. Default is 40.
-XX:MaxHeapFreeRatio - when the percentage of free space in a generation exceeds this value, the generation shrinks to meet this value. Default is 70.
Otherwise, if you suspect that you're leaking references, the way to figure out how, what and where objects are leaked is to monitor the heap in JVisualVM (a tool bundled with the standard JDK). Through this program you can perform a heap dump and get a histogram of object memory consumption.
What memory do you mean? If it is RAM (as opposed to the amount of used heap space of the Java VM itself), then this might be normal: allocating memory is a relatively expensive operation, so once the JVM has some, it is quite reluctant to give it back even if it is not needed at the time.
Have you considered using a memory profiler? If you don't have access to one, you can start by capturing a series of jmap -histo <pid> outputs and writing a script to diff them.
System.gc() makes no guarantee about freeing any memory when run. See Why is it bad practice to call System.gc()?
Try tweaking the -Xmx JVM argument down if it is set to a large value, and take a look in JConsole to see what's going on with memory usage and GC activity. Normally you'd see a sawtooth pattern.
You might also want to use a profiler to see where the memory is being used and to identify any leaks.
One of two things is happening:
1) Your application is leaking references. Are you sure you aren't hanging on to objects after you no longer need them? If you are, Java must keep them in memory.
2) Java's working just fine. You get no benefit from memory that you aren't using.

Force full garbage collection when memory occupation goes beyond a certain threshold

I have a server application that, on rare occasions, can allocate large chunks of memory.
It's not a memory leak, as these chunks can be claimed back by the garbage collector by executing a full garbage collection. Normal garbage collection frees amounts of memory that are too small: it is not adequate in this context.
The garbage collector executes these full GCs when it deems appropriate, namely when the memory footprint of the application nears the allotted maximum specified with -Xmx.
That would be OK, if it weren't for the fact that these problematic memory allocations come in bursts and can cause OutOfMemoryErrors because the JVM is not able to perform a GC quickly enough to free the required memory. If I manually call System.gc() beforehand, I can prevent this situation.
Anyway, I'd prefer not having to monitor my JVM's memory allocation myself (or insert memory management into my application's logic); it would be nice if there were a way to run the virtual machine with a memory threshold, above which full GCs would be executed automatically, in order to release well in advance the memory I'm going to need.
Long story short: I need a way (a command-line option?) to configure the JVM so that it releases a good amount of memory early (i.e. performs a full GC) when memory occupation reaches a certain threshold. I don't care if this slows my application down every once in a while.
All I've found till now are ways to modify the size of the generations, but that's not what I need (at least not directly).
I'd appreciate your suggestions,
Silvio
P.S. I'm working on a way to avoid large allocations, but it could take a long time, and meanwhile my app needs a little stability
UPDATE: analyzing the app with JVisualVM, I can see that the problem is in the old generation
From here (this is a 1.4.2 page, but the same option should exist in all Sun JVMs):
assuming you're using the CMS garbage collector (which I believe the server VM turns on by default), the option you want is
-XX:CMSInitiatingOccupancyFraction=<percent>
where <percent> is the percentage of memory in use that will trigger a full GC.
Insert standard disclaimers here that messing with GC parameters can give you severe performance problems, varies wildly by machine, etc.
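For example, a possible invocation (percentage illustrative; adding -XX:+UseCMSInitiatingOccupancyOnly keeps the JVM from adaptively overriding the threshold):
java -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=60 -XX:+UseCMSInitiatingOccupancyOnly MyServer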
When you allocate large objects that do not fit into the young generation, they are immediately allocated in the tenured generation space. This space is only GC'ed when a full GC is run, which is what you are trying to force.
However I am not sure this would solve your problem. You say "JVM is not able to perform a GC quickly enough". Even if your allocations come in bursts, each allocation will cause the VM to check if it has enough space available to do it. If not - and if the object is too large for the young generation - it will cause a full GC which should "stop the world", thereby preventing new allocations from taking place in the first place. Once the GC is complete, your new object will be allocated.
If shortly after that the second large allocation is requested in your burst, it will do the same thing again. Depending on whether the initial object is still needed, it will either be able to succeed in GC'ing it, thereby making room for the next allocation, or fail if the first instance is still referenced.
You say "I need a way [...] to release early a good amount of memory (i.e. perform a full GC) when memory occupation reaches a certain threshold". This by definition can only succeed, if that "good amount of memory" is not referenced by anything in your application anymore.
From what I understand here, you might have a race condition which you might sometimes avoid by interspersing manual GC requests. In general you should never have to worry about these things - from my experience an OutOfMemoryError only occurs if there are in fact too many allocations to be fit into the heap concurrently. In all other situations the "only" problem should be a performance degradation (which might become extreme, depending on the circumstances, but this is a different problem).
I suggest you do further analysis of the exact problem to rule this out. I recommend the VisualVM tool that comes with Java 6. Start it and install the VisualGC plugin. This will allow you to see the different memory generations and their sizes. Also there is a plethora of GC related logging options, depending on which VM you use. Some options have been mentioned in other answers.
The other options for choosing which GC to use and how to tweak thresholds should not matter in your case, because they all depend on enough memory being available to contain all the objects that your application needs at any given time. These options can be helpful if you have performance problems related to heavy GC activity, but I fear they will not lead to a solution in your particular case.
Once you are more confident in what is actually happening, finding a solution will become easier.
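If you do decide to trigger collections yourself, the java.lang.management API can at least do the monitoring for you via usage-threshold notifications; a minimal sketch (the 80% threshold is illustrative, and calling System.gc() remains only a request):

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryNotificationInfo;
import java.lang.management.MemoryPoolMXBean;
import java.lang.management.MemoryType;
import javax.management.Notification;
import javax.management.NotificationEmitter;
import javax.management.NotificationListener;

class GcOnThreshold {
    static void install() {
        // Ask each heap pool to emit a notification when it is ~80% full.
        for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
            long max = pool.getUsage().getMax();
            if (pool.getType() == MemoryType.HEAP
                    && pool.isUsageThresholdSupported() && max > 0) {
                pool.setUsageThreshold((long) (max * 0.8));
            }
        }
        // The platform MemoryMXBean is also a NotificationEmitter.
        NotificationEmitter emitter =
                (NotificationEmitter) ManagementFactory.getMemoryMXBean();
        emitter.addNotificationListener(new NotificationListener() {
            public void handleNotification(Notification n, Object handback) {
                if (MemoryNotificationInfo.MEMORY_THRESHOLD_EXCEEDED
                        .equals(n.getType())) {
                    System.gc(); // still only a request, not a guarantee
                }
            }
        }, null, null);
    }
}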
Do you know which of the garbage collection pools is growing too large, i.e. eden vs. survivor space? (Try the JVM option -Xloggc:<file>, which logs GC status to a file with timestamps.) Once you know this, you should be able to tweak the size of the affected pool with one of the options mentioned here: hotspot options for Java 1.4
I know that page is for the 1.4 JVM; I can't seem to find the same -X options in my current 1.6 install's help output, unless setting those individual pool sizes is a non-standard feature!
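For example, a 1.6-era HotSpot logging invocation might look like this (file name illustrative):
java -Xloggc:gc.log -XX:+PrintGCDetails -XX:+PrintGCTimeStamps MyApp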
The JVM is only supposed to throw an OutOfMemoryError after it has attempted to release memory via garbage collection (according to both the API docs for OutOfMemoryError and the JVM specification). Therefore your attempts to force garbage collection shouldn't make any difference. So there might be something more significant going on here - either a problem with your program not properly clearing references or, less likely, a JVM bug.
There's a very detailed explanation of how GC works here and it lists parameters to control memory available to different memory pools/generations.
Try the -server option. It enables the parallel GC, and you may see some performance increase if you use a multi-core processor.
Have you tried playing with the G1 GC? It should be available from 1.6.0_14 onwards.
