I built a sample program to demonstrate a memory leak in Java.
import java.lang.management.ManagementFactory;

public class MemoryLeakTest {
    static int depth = 0;
    int number = 0;
    MemoryLeakTest mobj;

    MemoryLeakTest() {
        number = depth;
        if (depth < 6500) {
            depth++;
            mobj = new MemoryLeakTest(); // each object constructs the next, building a chain
        }
    }

    protected void finalize() {
        System.out.println(number + " released.");
    }

    public static void main(String[] args) {
        try {
            System.out.println(ManagementFactory.getMemoryMXBean().getHeapMemoryUsage());
            System.out.println("Free Memory in starting " + Runtime.getRuntime().freeMemory());
            MemoryLeakTest testObj = new MemoryLeakTest();
            System.out.println("Free Memory in end " + Runtime.getRuntime().freeMemory());
            System.out.println(ManagementFactory.getMemoryMXBean().getHeapMemoryUsage());
        } catch (Exception exp) {
        } finally {
            System.out.println("Free Memory " + Runtime.getRuntime().freeMemory());
            System.out.println(ManagementFactory.getMemoryMXBean().getHeapMemoryUsage());
        }
    }
}
I ran it several times, changing the value of N in if(depth < N). Here are the results:
When N is 1000:
init = 16777216(16384K) used = 288808(282K) committed = 16252928(15872K) max = 259522560(253440K)
Free Memory in starting 15964120
Free Memory in end 15964120
init = 16777216(16384K) used = 288808(282K) committed = 16252928(15872K) max = 259522560(253440K)
Free Memory 15964120
init = 16777216(16384K) used = 288808(282K) committed = 16252928(15872K) max = 259522560(253440K)
When N is 1500:
init = 16777216(16384K) used = 288808(282K) committed = 16252928(15872K) max = 259522560(253440K)
Free Memory in starting 15964120
Free Memory in end 15964120
init = 16777216(16384K) used = 288808(282K) committed = 16252928(15872K) max = 259522560(253440K)
Free Memory 15873528
init = 16777216(16384K) used = 379400(370K) committed = 16252928(15872K) max = 259522560(253440K)
When N is 6000:
init = 16777216(16384K) used = 288808(282K) committed = 16252928(15872K) max = 259522560(253440K)
Free Memory in starting 15964120
Free Memory in end 15692784
init = 16777216(16384K) used = 560144(547K) committed = 16252928(15872K) max = 259522560(253440K)
Free Memory 15692784
init = 16777216(16384K) used = 560144(547K) committed = 16252928(15872K) max = 259522560(253440K)
When N is 6500 (Exception in thread "main" java.lang.StackOverflowError):
init = 16777216(16384K) used = 288808(282K) committed = 16252928(15872K) max = 259522560(253440K)
Free Memory in starting 15964120
Free Memory in end 15676656
init = 16777216(16384K) used = 576272(562K) committed = 16252928(15872K) max = 259522560(253440K)
My questions are:
It is not calling finalize(). Is this a memory leak?
There is no change in free memory up to N=1000, but when N=1500 there are 2 different values for used memory at the end of the program, i.e. 282K and 370K. Why is that?
When N=6500, the JVM generates an error. So why are the last 2 statements of try{} executed?
Your program won't "leak", as Java will take care of anything "dangling" out there. That's the benefit of a garbage-collected language.
But what you do have is a StackOverflowError. Basically, the stack (the chain of functions you're in, and how deep it goes) is much, MUCH smaller than the heap. The heap is more or less the size of main memory; each thread's stack is far smaller. You're hitting that limit with your "depth" recursion.
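As an aside, the per-thread stack size is a JVM knob, so you can move the overflow point around. These invocations just illustrate the -Xss flag (the values are arbitrary):

java -Xss256k MemoryLeakTest
java -Xss8m MemoryLeakTest

A smaller stack makes the StackOverflowError appear at a smaller depth; a larger one pushes it further out.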
If you want to test "leaks" (or the idea that you won't have any eventually), try something more like this:
import java.lang.management.ManagementFactory;

public class MemoryLeakTest {
    int number = 0;
    public MemoryLeakTest mobj;

    MemoryLeakTest(int num) {
        number = num;
    }

    protected void finalize() {
        System.out.println(number + " released.");
    }

    public static void main(String[] args) {
        try {
            System.out.println(ManagementFactory.getMemoryMXBean().getHeapMemoryUsage());
            System.out.println("Free Memory in starting " + Runtime.getRuntime().freeMemory());
            MemoryLeakTest first = new MemoryLeakTest(0); // keep a reference to the head of the chain
            MemoryLeakTest current = first;
            // parse the first argument for the number of objects to create
            for (int i = 1; i < Integer.parseInt(args[0]); i++) {
                current.mobj = new MemoryLeakTest(i);
                current = current.mobj;
            }
            System.out.println("Free Memory in end " + Runtime.getRuntime().freeMemory());
            System.out.println(ManagementFactory.getMemoryMXBean().getHeapMemoryUsage());
        } catch (Exception exp) {
        } finally {
            System.out.println("Free Memory " + Runtime.getRuntime().freeMemory());
            System.out.println(ManagementFactory.getMemoryMXBean().getHeapMemoryUsage());
        }
    }
}
That will give you a "chain" of objects all in memory until first goes out of scope.
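For example, assuming the Integer.parseInt version above compiles as shown, you could build a million-object chain with:

java MemoryLeakTest 1000000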
It is not calling finalize(). Is this a memory leak?
No, there is no memory leak. You always keep an accessible reference to your testObj object, and that's the reason why finalize will never be called in your application.
All your application does is create a huge object graph.
Here you can find an explanation of how to create a real memory leak in Java.
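For illustration, a minimal sketch of that kind of "real" leak (class and field names are made up for the example): a static collection keeps everything added to it reachable for the lifetime of the class, so the GC can never reclaim it.

import java.util.ArrayList;
import java.util.List;

public class StaticListLeak {
    // Static roots live as long as the class itself, so everything
    // added here stays reachable and can never be collected.
    private static final List<byte[]> CACHE = new ArrayList<>();

    public static void main(String[] args) {
        while (true) {
            CACHE.add(new byte[1_000_000]); // grows until OutOfMemoryError
        }
    }
}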
It is not calling finalize(). Is this a memory leak?
The finalize() method is not guaranteed to be called. It runs when the garbage collector collects the given object, but objects are not guaranteed to be collected before execution ends.
There is no change in free memory up to N=1000, but when N=1500 there are 2 different values for used memory at the end of the program, i.e. 282K and 370K. Why is that?
I think it depends on when the garbage collector happens to run.
When N=6500, the JVM generates an error. So why are the last 2 statements of try{} executed?
This is because you're not catching the error: StackOverflowError inherits from Error, which is not part of the Exception branch of the hierarchy but rather a sibling of Exception, so catch(Exception exp) does not match it (and your catch block is empty anyway). The last two statements of your try are not executed, because the error has already been thrown.
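As a self-contained sketch of that hierarchy point (class and method names are made up for the example):

public class CatchDemo {

    static void recurse() {
        recurse(); // recurse until the stack overflows
    }

    public static void main(String[] args) {
        try {
            recurse();
        } catch (Exception e) {
            // Never reached: StackOverflowError extends Error, which is a
            // sibling of Exception under Throwable, not a subclass of it.
            System.out.println("caught Exception");
        } catch (StackOverflowError e) {
            // This clause does match, because it names the Error type itself.
            System.out.println("caught StackOverflowError");
        }
    }
}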
In summary, you didn't produce a memory leak. Memory leaks happen in Java when you keep references to objects that are no longer needed but remain reachable (directly or indirectly) from the execution flow, for example when you store objects in a long-lived collection, or in singletons.
The garbage collector itself is smart enough to free object graphs that are not reachable from the program at all.
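A quick sketch of that (the finalize() override exists only to make the collection visible; whether and when it prints is still up to the GC):

public class CycleCollected {

    CycleCollected partner;

    protected void finalize() {
        System.out.println("cycle collected");
    }

    public static void main(String[] args) throws InterruptedException {
        CycleCollected a = new CycleCollected();
        CycleCollected b = new CycleCollected();
        a.partner = b; // a and b now reference each other
        b.partner = a;
        a = null;      // drop the only external references;
        b = null;      // the whole cycle is now unreachable
        System.gc();   // request (not force) a collection
        Thread.sleep(500);
    }
}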
I hope that makes it clear.
Most of the answers have already explained the difference between a StackOverflowError and a memory leak.
There is no change in free memory up to N=1000, but when N=1500 there are 2 different values for used memory at the end of the program, i.e. 282K and 370K. Why is that?
It is because every time you create a new object, the previous one becomes unreachable (no references, or its reference overwritten) and hence can be freed if required.
The simplest example so far to make the JVM run out of memory (not leak):
public class PrintSeries {
    private static final String COMMA = ",";
    private StringBuilder buildStream;

    public static void main(String[] args) {
        System.out.println(new PrintSeries().convert(10));
        System.out.println(new PrintSeries().convert(1000000000));
    }

    private String convert(int n) {
        buildStream = new StringBuilder();
        while (n > 1) {
            buildStream.append(n-- + COMMA);
        }
        buildStream.append(n);
        return buildStream.toString();
    }
}
Output:
10,9,8,7,6,5,4,3,2,1
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:2882)
at java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:100)
at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:390)
at java.lang.StringBuilder.append(StringBuilder.java:119)
at com.cctest.algotest.string.PrintSeries.convert(PrintSeries.java:17)
at com.cctest.algotest.string.PrintSeries.main(PrintSeries.java:10)
This is not evidence of a memory leak. The program is throwing StackOverflowError, not OutOfMemoryError. In fact, what is going on is that the constructor is calling itself recursively, and when the number of recursive calls exceeds some large number (between 6,000 and 6,500), you run out of stack space.
It is not calling finalize(). Is this a memory leak?
No. The finalize() method is most likely not being called because the GC has not run. And it has not run because you haven't filled the heap. And even if that is not the real explanation, there is no guarantee that the finalize() method will ever be called. The only absolute guarantee you have is that finalize() will be called before the object's memory is reused by the JVM.
There is no change in free memory up to N=1000, but when N=1500 there are 2 different values for used memory at the end of the program, i.e. 282K and 370K. Why is that?
I'm not sure why that happens, but I don't think it indicates anything significant. (There are all sorts of things that happen under the hood in a JVM that can be sources of non-determinacy in things like memory allocation and usage patterns.)
When N=6500, the JVM generates an error. So why are the last 2 statements of try{} executed?
The statements in the finally block are always executed, unless the JVM terminates abruptly. When the StackOverflowError is thrown, it propagates like any other exception and can be caught and recovered from (in some cases).
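A short sketch of both points (the depth counter is only there to show the recovery):

public class FinallyDemo {

    static int depth = 0;

    static void recurse() {
        depth++;
        recurse();
    }

    public static void main(String[] args) {
        try {
            recurse();
        } catch (StackOverflowError e) {
            System.out.println("recovered at depth " + depth);
        } finally {
            System.out.println("finally always runs");
        }
    }
}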
Related
I always thought that a while (true) {...any code...} would always result in an out-of-memory error.
But as I go deeper into Java, it seems that might not be the case.
I'm not able to confirm it, but if we have a while(true) loop that only does calculations, we should not expect an out-of-memory error, only very poor CPU performance, right?
On the other hand, if we are always requiring more memory, an out-of-memory error is to be expected.
I have 3 cases below:
1. Calculations only (I think no memory is being allocated under the hood).
2. An ever-increasing ArrayList, which looks like an obvious out-of-memory error.
3. Always instantiating an ArrayList with the new keyword. I don't know if it causes an out-of-memory error, because of the garbage collector.
I'm not testing on my PC because I only have one; I hope someone has the knowledge.
Code
import java.util.*;

public class HelloLeak {
    // Calculations only; memory leak?
    public static void outofmemo1() {
        long i = 0;
        while (true) {
            i = i * i;
        }
    }

    // Adding an infinite number of objects; memory leak confirmed.
    public static void outofmemo2() {
        int i = 0;
        List<Integer> l = new ArrayList<>();
        while (true) {
            l.add(i);
        }
    }

    // Creating an infinite number of ArrayList objects: will the garbage
    // collector clear the unused objects, or will we get a memory leak?
    public static void outofmemo3() {
        List<Integer> l = new ArrayList<>();
        while (true) {
            l = new ArrayList<>();
        }
    }

    public static void main(String[] args) {
        outofmemo1();
        //outofmemo2();
        //outofmemo3();
    }
}
1. Will do absolutely nothing except end up in an endless loop.
2. Will crash with an OutOfMemoryError, because you always add a new element to the list until the heap is filled.
3. Will behave like 1, but with memory usage spiking up to, for example, 2 GB; then the GC will run, see that there are unused objects, and remove them. After that it will spike again, and so on.
Does throwing OutOfMemoryError trigger the heap dump, or does memory actually need to be exhausted?
In other words, will a heap dump be produced if I:
throw new java.lang.OutOfMemoryError();
and have set
-XX:+HeapDumpOnOutOfMemoryError
Is this universally true for all JVMs, or is this likely to be vendor-specific?
Why: I want to simulate OOME for testing purposes, and would prefer to have a one-line way of doing this. Just throwing the Error seems logical.
Because the documentation doesn't say it does, and the behavior may or may not be vendor-specific, I would just create a large object to force a real OOME.
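For example, something like this (a sketch: the single allocation is far larger than any normal -Xmx, so the JVM itself raises the "Java heap space" error, which is the condition the heap-dump flag reacts to):

public class ForceOome {
    public static void main(String[] args) {
        // Roughly a 2 GB request; assumes -Xmx is set well below that.
        byte[] filler = new byte[Integer.MAX_VALUE - 8];
        System.out.println(filler.length); // never reached
    }
}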
I used this simple Runnable to spawn a Thread causing an OOME when I needed to:
// (Assumes java.util.List and java.util.ArrayList are imported
// in the enclosing class.)
private static class OOMRunnable implements Runnable {

    private static final int ALLOCATE_STEP_SIZE = 1_000_000;

    @Override
    public void run() {
        long bytesUsed = 0L;
        List<long[]> eatingMemory = new ArrayList<>();
        while (true) {
            eatingMemory.add(new long[ALLOCATE_STEP_SIZE]);
            bytesUsed += Long.BYTES * ALLOCATE_STEP_SIZE;
            System.out.printf("%d MB allocated%n", bytesUsed / 1_000_000);
        }
    }
}
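For completeness, a typical way to drive it (assuming the class above is in scope; with a small heap such as -Xmx64m the error arrives quickly):

new Thread(new OOMRunnable()).start();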
I was trying to understand SoftReferences in Java, which (as I understood it) ensure that softly referenced objects are cleared before a StackOverflowError is thrown.
import java.lang.ref.SoftReference;

public class Temp {
    public static void main(String[] args) {
        Temp temp2 = new Temp();
        SoftReference<Temp> sr = new SoftReference<Temp>(temp2);
        temp2 = null; // only the soft reference still points to the object
        Temp temp = new Temp();
        temp.infinite(sr);
    }

    public void infinite(SoftReference sr) {
        try {
            infinite(sr); // recurse until the stack overflows
        } catch (StackOverflowError ex) {
            System.out.println(sr.get());
            System.out.println(sr.isEnqueued());
        }
    }
}
However, the outcome of the above was:
test.Temp@7852e922
false
Can someone explain to me why the object was not cleared by the GC? How can I make it work?
It looks like you may have some confusion between StackOverflowError and OutOfMemoryError. They are different: StackOverflowError happens when there is no space left in the call stack, while OutOfMemoryError occurs when the JVM is unable to allocate heap space for a new object. Your code leads to a stack overflow, which means the stack memory is full, not the heap. I believe there is still plenty of heap space available to keep your SoftReference's referent, which is why the object was not GC'd.
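To see the SoftReference actually cleared, one option (a sketch, not guaranteed to behave identically on every JVM) is to put the heap under pressure instead of the stack, since soft references are cleared before the JVM throws OutOfMemoryError:

import java.lang.ref.SoftReference;
import java.util.ArrayList;
import java.util.List;

public class SoftUnderPressure {
    public static void main(String[] args) {
        SoftReference<Object> sr = new SoftReference<>(new Object());
        List<byte[]> pressure = new ArrayList<>();
        try {
            while (true) {
                pressure.add(new byte[10_000_000]); // fill the heap
            }
        } catch (OutOfMemoryError e) {
            pressure.clear();             // release some heap so printing works
            System.out.println(sr.get()); // expected: null, cleared before the OOME
        }
    }
}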
We have a service that is being monitored via JMX. The JVM heap usage is growing, and even major collections are not able to remove the garbage. Inspecting the heap shows garbage consisting of RMI-related references (mostly, if not all, related to class loaders). The only way to alleviate the issue is to issue an explicit gc call through JMX (which removes all the accumulated garbage). Our GC-related options are:
-XX:+UseParNewGC
-XX:+UseConcMarkSweepGC
-XX:+CMSParallelRemarkEnabled
-XX:SurvivorRatio=8
-XX:MaxTenuringThreshold=1
-XX:CMSInitiatingOccupancyFraction=75
-XX:+UseCMSInitiatingOccupancyOnly
And we have not touched either DisableExplicitGC or sun.rmi.dgc.server.gcInterval.
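For reference, those knobs are spelled like this on the command line (the interval values shown are just the common one-hour default):

-XX:+DisableExplicitGC
-Dsun.rmi.dgc.server.gcInterval=3600000
-Dsun.rmi.dgc.client.gcInterval=3600000

Note that DisableExplicitGC turns System.gc() into a no-op, so it would also silence both the RMI daemon's periodic collection and our explicit gc calls through JMX.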
I believe the problem is supposed to be addressed by the code in sun.misc.GC.Daemon:
public void run() {
    for (;;) {
        long l;
        synchronized (lock) {
            l = latencyTarget;
            if (l == NO_TARGET) {
                /* No latency target, so exit */
                GC.daemon = null;
                return;
            }
            long d = maxObjectInspectionAge();
            if (d >= l) {
                /* Do a full collection. There is a remote possibility
                 * that a full collection will occur between the time
                 * we sample the inspection age and the time the GC
                 * actually starts, but this is sufficiently unlikely
                 * that it doesn't seem worth the more expensive JVM
                 * interface that would be required.
                 */
                System.gc();
                d = 0;
            }
            /* Wait for the latency period to expire,
             * or for notification that the period has changed
             */
            try {
                lock.wait(l - d);
            } catch (InterruptedException x) {
                continue;
            }
        }
    }
}
For some reason the above System.gc() is not being invoked (verified by looking at the GC logs). Does anyone have a suggestion as to how to address the issue?
While investigating "Why ThreadPoolExecutor behaves differently when running Java program in Eclipse and from command line?", I wrote a test that throws a very strange OutOfMemoryError (max mem = 256 MB):
class A {
    byte[] buf = new byte[150_000_000];

    protected void finalize() {
        int i = 1;
    }
}
A a1 = new A();
a1 = null;
A a2 = new A();
Comment out int i = 1; and the test works. As far as I understand, when finalize() is empty, HotSpot simply ignores it. But how can just one practically empty finalize() invocation break the GC / JVM?
But how can just one empty finalize invocation break the GC / JVM?
When there's a finalizer, objects survive one more round of garbage collection than they would otherwise (as the object itself has to be kept alive until it's finalized). Therefore if you have a large object with a finalizer, that will naturally lead to an OutOfMemoryError occurring in situations when it wouldn't without a finalizer.
In this code:
A a1 = new A();
a1 = null;
A a2 = new A();
... the GC will trigger on the last line in order to try to find enough memory to allocate the second A. Unfortunately, it can't garbage collect the first A (and the array it refers to) because the finalizer hasn't run yet. It doesn't wait until the finalizer completes, then try to garbage collect again - it just throws OutOfMemoryError.