SoftReference is not getting cleared by Java GC - java

I was trying to understand SoftReferences in Java, which (as I understood it) guarantee that softly referenced objects are cleared before a StackOverflowError is thrown.
import java.lang.ref.SoftReference;

public class Temp {
    public static void main(String[] args) {
        Temp temp2 = new Temp();
        SoftReference<Temp> sr = new SoftReference<Temp>(temp2);
        temp2 = null;
        Temp temp = new Temp();
        temp.infinite(sr);
    }

    public void infinite(SoftReference sr) {
        try {
            infinite(sr);
        } catch (StackOverflowError ex) {
            System.out.println(sr.get());
            System.out.println(sr.isEnqueued());
        }
    }
}
However, the output of the above was:
test.Temp@7852e922
false
Can someone explain why the object was not cleared by the GC? How can I make it work?

It looks like you are confusing StackOverflowError with OutOfMemoryError; they are different. A StackOverflowError happens when there is no space left on the call stack, while an OutOfMemoryError occurs when the JVM is unable to allocate heap space for a new object. Your code exhausts the stack, not the heap, so there is still plenty of heap available and the GC has no reason to clear your SoftReference.
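To make the distinction concrete, here is a minimal sketch (class and method names are mine, not from the question) showing that a soft reference survives a StackOverflowError, since stack exhaustion puts no pressure on the heap:

```java
import java.lang.ref.SoftReference;

public class StackVsHeap {
    // Recurse until the call stack is exhausted.
    static int recurse(int depth) {
        return recurse(depth + 1);
    }

    // Returns true if the soft reference is still intact after a StackOverflowError.
    static boolean softRefSurvivesStackOverflow() {
        SoftReference<Object> ref = new SoftReference<>(new Object());
        try {
            recurse(0);
        } catch (StackOverflowError expected) {
            // The stack is gone, but the heap was never under pressure,
            // so the collector had no reason to clear the reference.
        }
        return ref.get() != null;
    }

    public static void main(String[] args) {
        System.out.println("referent survived: " + softRefSurvivesStackOverflow());
    }
}
```

To actually see the reference cleared, you have to exhaust the heap instead, as the answers further down demonstrate.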

Related

Does While true loop always cause Out of memory error?

I always thought that a while (true) {...any code...} loop would always result in an out of memory error.
But as I go deeper into Java, it seems it might not be like that.
I'm not able to confirm it, but if we have a while-true loop that only does calculations, we should not expect an out of memory error, only very poor CPU performance, right?
On the other hand, if we keep requiring more memory, an out of memory error is expected.
I have 3 cases below:
1. calculations only (I think no memory is being allocated under the hood)
2. an ever-growing ArrayList, which looks like an obvious out of memory error
3. always instantiating an ArrayList with the new keyword; I don't know if it causes an out of memory error, because of the garbage collector.
I'm not testing on my PC because I only have one; I hope someone has the answer.
Code
import java.util.*;

public class HelloLeak {
    // calculations only, MemoryLeak?
    public static void outofmemo1() {
        long i = 0;
        while (true) {
            i = i * i;
        }
    }

    // adding an infinite number of objects, memory leak confirmed.
    public static void outofmemo2() {
        int i = 0;
        List<Integer> l = new ArrayList<>();
        while (true) {
            l.add(i);
        }
    }

    // Creating an infinite number of ArrayList objects: will the garbage
    // collector clear the unused objects, or will we get a memory leak?
    public static void outofmemo3() {
        List<Integer> l = new ArrayList<>();
        while (true) {
            l = new ArrayList<>();
        }
    }

    public static void main(String[] args) {
        outofmemo1();
        //outofmemo2();
        //outofmemo3();
    }
}
1. Will do absolutely nothing except spin in an endless loop.
2. Will crash with an OutOfMemoryError, because you keep adding new elements to the list until the heap is full.
3. Will behave like case 1, but heap usage may spike up to, for example, 2 GB; then the GC runs, sees the unreachable lists, and removes them. After that it spikes again, and so on.
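Case 3 can be made observable with a bounded variant (the iteration count is arbitrary, chosen by me for illustration): millions of throwaway lists are allocated, yet the program finishes without an OutOfMemoryError, because each list becomes unreachable as soon as the variable is reassigned:

```java
import java.util.ArrayList;
import java.util.List;

public class ThrowawayLists {
    // Allocates many short-lived lists; each becomes garbage on the next
    // iteration, so the GC can reclaim them continuously.
    static boolean allocateAndSurvive(int iterations) {
        List<Integer> l = new ArrayList<>();
        for (int i = 0; i < iterations; i++) {
            l = new ArrayList<>();
            l.add(i); // touch the list so the allocation is not trivially dead
        }
        return l.size() == 1;
    }

    public static void main(String[] args) {
        System.out.println("survived: " + allocateAndSurvive(5_000_000));
    }
}
```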

Android: Can a memory leak happen on the same thread?

I am new to handling memory leak situations, but one thing I have noticed is that all the examples showing memory leaks keep the activity context on a different thread.
So I need to know whether a memory leak can also happen when an object reference is held on the same thread, because the activity reference is stored somewhere in other classes.
Thanks in advance!
A memory leak is a situation where objects that are no longer used remain in the heap, but the garbage collector is unable to remove them, so they are unnecessarily retained.
Memory leaks can happen on the same thread as well, for example if a method stores data in a static variable that it does not need in subsequent calls.
E.g. in the code below we store generated numbers in a static list even though we do not need them in subsequent calls:
public class MemoryLeak {
    public static List<Double> list = new ArrayList<>();

    public void doSomething() {
        for (int i = 0; i < 10000000; i++) {
            list.add(Math.random());
        }
        Log.info("Debug Point 2");
    }

    public static void main(String[] args) {
        Log.info("Debug Point 1");
        new MemoryLeak().doSomething();
        Log.info("Debug Point 3");
    }
}

can java 8 lambdas cause memory leaks?

Well, I found this code in a blog and wanted to understand why it would cause a memory leak, if it has the potential to cause one at all.
class Test {
    public static void main(String[] args) {
        Runnable runnable = new EnterpriseBean()
            .runnable();
        runnable.run(); // Breakpoint here
    }
}

@ImportantDeclaration
@NoMoreXML({
    @CoolNewValidationStuff("Annotations"),
    @CoolNewValidationStuff("Rock")
})
class EnterpriseBean {
    Object[] enterpriseStateObject =
        new Object[100_000_000];

    Runnable runnable() {
        return () -> {
            System.out.println("Hello from: " + this);
        };
    }
}
The provided code does not have a memory leak, and the blog entry from which it is drawn does not say otherwise. What it says is that the object returned by EnterpriseBean.runnable() has much (much) larger state than you might naively expect, and that that state cannot be garbage collected before the Runnable itself is.
However, there is nothing in that code that would prevent the Runnable from eventually being collected, and at that time all the extra state will be eligible for collection, too.
So no, the code is not an example of a memory leak, and does not suggest a means to produce one.
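To illustrate the underlying point, here is a hedged sketch (names are mine, and the array is shrunk from the question's 100,000,000 elements): a lambda that mentions `this` captures the whole bean, while one built from a static method captures only what it needs, so the bean can be collected independently of the Runnable. On HotSpot the captured state shows up as synthetic fields on the lambda's generated class, which the sketch inspects via reflection; that detail is implementation-specific.

```java
import java.util.Arrays;

public class LambdaCapture {
    static class Bean {
        Object[] state = new Object[1_000]; // stand-in for the large array

        // Mentions "this": the Runnable keeps the whole Bean reachable.
        Runnable capturing() {
            return () -> System.out.println("Hello from: " + this);
        }

        // Captures only the message: the Bean itself is not retained.
        static Runnable detached(String message) {
            return () -> System.out.println(message);
        }
    }

    // True if some field of the runnable's class holds a Bean,
    // i.e. the lambda captured the enclosing instance (HotSpot-specific check).
    static boolean holdsBean(Runnable r) {
        return Arrays.stream(r.getClass().getDeclaredFields())
                     .anyMatch(f -> f.getType() == Bean.class);
    }

    public static void main(String[] args) {
        System.out.println("capturing holds bean: " + holdsBean(new Bean().capturing()));
        System.out.println("detached holds bean:  " + holdsBean(Bean.detached("Hello")));
    }
}
```

If the Runnable in the blog's example were long-lived (stored in a queue, a cache, a listener list), the capturing form would keep the 100-million-element array alive with it; the detached form would not.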

I was expecting outOfMemory but here I get stackOverFlow in java

package com.atul;

public class StackOverFlow {
    public StackOverFlow() {
        callStackOverFlow();
    }

    public void callStackOverFlow() {
        StackOverFlow st = new StackOverFlow();
    }

    public static void main(String[] args) {
        StackOverFlow st2 = new StackOverFlow();
    }
}
In the above program I was trying to get an OutOfMemoryError, but I get a StackOverflowError instead. As far as I know, all objects are created on the heap. Here we are recursing through the constructor, yet I still get the StackOverflowError.
Why?
You run out of stack (which has a maximum depth of around 10,000 frames in simple cases) long before you run out of heap memory. This is because every thread has its own stack, so each stack must be much smaller than the shared heap.
If you want to run out of memory, you need to use up the heap faster:
public class OutOfMemoryMain {
    byte[] bytes = new byte[100 * 1024 * 1024];
    OutOfMemoryMain main = new OutOfMemoryMain();

    public static void main(String... args) {
        new OutOfMemoryMain();
    }
}
The stack size in the JVM is limited (per-thread) and configurable via -Xss.
If you want to generate an OOM, I would suggest looping infinitely, instantiating a new object on each iteration, and storing it in a collection (otherwise the garbage collector will destroy each instance).
Before the heap fills with objects and the program aborts with an out of memory error, you run out of stack, which stores the method calls; hence you get the StackOverflowError.
The OutOfMemoryError would only come once your objects had filled up the heap space.

How to make the java system release Soft References?

I'm going to use a SoftReference-based cache (a pretty simple thing in itself). However, I came across a problem when writing a test for it.
The objective of the test is to check that the cache requests the previously cached object from the server again after a memory cleanup occurs.
Here I hit the problem of how to make the system release softly referenced objects. Calling System.gc() is not enough, because soft references will not be released until memory is low. I'm running this unit test on a PC, so the VM's memory budget can be pretty large.
================== Added later ==============================
Thank you all who took care to answer!
After considering all the pros and cons, I've decided to go the brute-force way, as advised by nanda and jarnbjo. It appeared, however, that the JVM is not that dumb: it won't even attempt garbage collection if you ask for a single block bigger than the VM's entire memory budget. So I modified the code like this:
/* Force releasing SoftReferences */
try {
    final List<long[]> memhog = new LinkedList<long[]>();
    while (true) {
        memhog.add(new long[102400]);
    }
} catch (final OutOfMemoryError e) {
    /* At this point all SoftReferences have been released - GUARANTEED. */
}
/* continue the test here */
This piece of code forces the JVM to flush all SoftReferences. And it's very fast to do.
It's working better than the Integer.MAX_VALUE approach, since here the JVM really tries to allocate that much memory.
try {
    Object[] ignored = new Object[(int) Runtime.getRuntime().maxMemory()];
} catch (OutOfMemoryError e) {
    // Ignore
}
I now use this bit of code everywhere I need to unit test code using SoftReferences.
Update: this approach will indeed only work with less than 2 GB of max memory.
Also, one needs to be very careful with SoftReferences: it is easy to accidentally keep a hard reference that negates the effect of the SoftReference.
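A sketch of that pitfall (using a WeakReference so that a plain System.gc() suffices for the demo; soft references behave the same way, just only under memory pressure): as long as a strong reference to the referent is live, the collector is not allowed to clear the reference object.

```java
import java.lang.ref.WeakReference;

public class AccidentalStrongRef {
    static boolean referentPinnedByStrongRef() {
        Object payload = new Object();           // strong reference, kept on purpose
        WeakReference<Object> ref = new WeakReference<>(payload);
        System.gc();                             // the collector must not clear ref
                                                 // while payload is strongly reachable
        boolean stillThere = ref.get() != null;
        System.out.println(payload);             // keep payload strongly reachable until here
        return stillThere;
    }

    public static void main(String[] args) {
        System.out.println("pinned: " + referentPinnedByStrongRef());
    }
}
```

Drop the strong reference (set payload to null before the gc call) and the referent becomes eligible for clearing; that is the behavior a SoftReference test needs to provoke with real memory pressure.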
Here is a simple test that shows it working every time on OSX. Would be interested in knowing if JVM's behavior is the same on Linux and Windows.
for (int i = 0; i < 1000; i++) {
    SoftReference<Object> softReference = new SoftReference<Object>(new Object());
    if (null == softReference.get()) {
        throw new IllegalStateException("Reference should NOT be null");
    }
    try {
        Object[] ignored = new Object[(int) Runtime.getRuntime().maxMemory()];
    } catch (OutOfMemoryError e) {
        // Ignore
    }
    if (null != softReference.get()) {
        throw new IllegalStateException("Reference should be null");
    }
    System.out.println("It worked!");
}
An improvement that will work for more than 2 GB of max memory. It loops until an OutOfMemoryError occurs.
@Test
public void shouldNotHoldReferencesToObject() {
    final SoftReference<T> reference = new SoftReference<T>( ... );

    // Sanity check
    assertThat(reference.get(), not(equalTo(null)));

    // Force an OoM
    try {
        final ArrayList<Object[]> allocations = new ArrayList<Object[]>();
        int size;
        while ((size = Math.min(Math.abs((int) Runtime.getRuntime().freeMemory()), Integer.MAX_VALUE)) > 0)
            allocations.add(new Object[size]);
    } catch (OutOfMemoryError e) {
        // great!
    }

    // Verify object has been garbage collected
    assertThat(reference.get(), equalTo(null));
}
1. Set the -Xmx parameter to a very small value.
2. Prepare your soft reference.
3. Create as many objects as possible. Ask for the object every time, until it is requested from the server again.
This is my small test. Modify it as you need.
@Test
public void testSoftReference() throws Exception {
    Set<Object[]> s = new HashSet<Object[]>();
    SoftReference<Object> sr = new SoftReference<Object>(new Object());
    int i = 0;
    while (true) {
        try {
            s.add(new Object[1000]);
        } catch (OutOfMemoryError e) {
            // ignore
        }
        if (sr.get() == null) {
            System.out.println("Soft reference is cleared. Success!");
            break;
        }
        i++;
        System.out.println("Soft reference is not yet cleared. Iteration " + i);
    }
}
You could explicitly clear the soft reference in your test, and thereby simulate that it has been released.
This avoids any complicated test setup that is memory- and garbage-collection-dependent.
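A minimal sketch of that idea using SoftReference.clear(), which is part of the java.lang.ref API and drops the referent just as the collector would, without any heap gymnastics (class and method names are mine):

```java
import java.lang.ref.SoftReference;

public class ClearedRefTest {
    // Simulates the collector clearing the cache entry.
    static boolean referentGoneAfterClear() {
        SoftReference<String> ref = new SoftReference<>("cached value");
        ref.clear();          // same observable effect on get() as a GC-initiated clear
        return ref.get() == null;
    }

    public static void main(String[] args) {
        System.out.println("cleared: " + referentGoneAfterClear());
    }
}
```

One caveat: unlike a GC-initiated clear, calling clear() does not enqueue the reference on its ReferenceQueue, so tests that exercise queue processing still need real memory pressure.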
Instead of a long-running loop (as suggested by nanda), it's probably faster and easier to simply create a huge primitive array to allocate more memory than is available to the VM, then catch and ignore the OutOfMemoryError:
try {
    long[] foo = new long[Integer.MAX_VALUE];
} catch (OutOfMemoryError e) {
    // ignore
}
This will clear all weak and soft references, unless your VM has more than 16 GB of heap available (a long[Integer.MAX_VALUE] array needs roughly 8 bytes times 2^31).
