Used memory doesn't change after getting an additional object from memcached - Java

I dump all memory (jmap -histo) after I load data from memcached, and then I load the same data again (the data is loaded into another instance), but the used memory didn't change.
No GC was done (I allocated a heap with a 2g new size and checked with jstat and other tools that no GC ran).
The second heap contains two instances of the type that I load from memcached.
I compare the heap after the second load to the heap after the first load.
The second heap has more instances of every class type than the first heap, except for [I, where the second heap has fewer than the first. The sum of the gaps (in bytes) across the other classes matches the gap for [I.
I looked at the bytecode and didn't see anything suspicious. Any idea?
public static void main(String[] args) throws Throwable {
    Object a1 = mc.get("userClient");
    // -- get dump
    Object a11 = mc.get("userClient");
    // -- get dump
    mc.shutdown();
}
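For reference, the heap dumps marked above can also be taken programmatically instead of running jmap by hand. A minimal sketch, assuming a HotSpot-based JVM (HotSpotDiagnosticMXBean lives in com.sun.management and is not available on every JVM); the class and file names are illustrative:

import com.sun.management.HotSpotDiagnosticMXBean;
import java.lang.management.ManagementFactory;

public class HeapDumper {
    // Writes an .hprof snapshot that jmap/VisualVM/Eclipse MAT can open.
    public static void dump(String file) throws Exception {
        HotSpotDiagnosticMXBean bean =
                ManagementFactory.getPlatformMXBean(HotSpotDiagnosticMXBean.class);
        bean.dumpHeap(file, true); // true = include only live (reachable) objects
    }
}

Calling something like HeapDumper.dump("after-first-get.hprof") at each "-- get dump" point would produce comparable snapshots.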

If you wish to see the size of an object on the heap, turn off TLABs (-XX:-UseTLAB) and monitor the change in Runtime.freeMemory() while creating an object (a heap dump is better, because freeMemory() is only an approximation). This technique will only work for objects added to the heap, so local variables that live on the stack won't show up.
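As a rough illustration of the freeMemory() technique described above (not the poster's exact code), a minimal sketch:

// Run with: java -XX:-UseTLAB ObjectSizeProbe
public class ObjectSizeProbe {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long before = rt.totalMemory() - rt.freeMemory();
        Object o = new Object();                        // the allocation being measured
        long after = rt.totalMemory() - rt.freeMemory();
        // freeMemory() is only an approximation, so treat the delta as a rough estimate;
        // a heap dump (jmap -histo) gives exact per-class numbers.
        System.out.println("approx. delta: " + (after - before) + " bytes for " + o);
    }
}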

Related

Java's varying available heap size

I just found out that there are some libraries to compute the shallow size of a Java object, so I thought I could also write this in a very simple way. Here is what I tried:
Start the program with some Xmx, say A.
Create objects of the type whose size you want to calculate (say type T) and store them in a list so that the GC can't clean them up.
When we hit OOM, let the code handle it and empty the list.
Now check the number of objects of type T we allocated. Let this be n.
Do a binary search to find the delta needed in order to successfully allocate n+1 objects.
Here is the code I tried:
import java.util.ArrayList;

public class test {
    public static void main(String[] a) {
        ArrayList<Integer> l = new ArrayList<>();
        int i = 0;
        try {
            while (true) {
                l.add(new Integer(1)); // new Integer(...) forces a fresh object (bypasses the boxing cache)
                i++;
            }
        } catch (Throwable e) {
            // expected: OutOfMemoryError once the heap fills up
        } finally {
            l.clear();                 // release the list so the println below can allocate
            System.out.println(i);
        }
    }
}
But I noticed that the number of objects allocated in each run for the same Xmx varies. Why is this? Is there anything randomized inside the JVM?
But I noticed that the number of objects allocated in each run for the same Xmx varies. Why is this?
Some events in the JVM are non-deterministic, and this can affect garbage collector behavior.
But there could be factors in play that are resulting in variable numbers of (your) objects being created before you fill up the heap. These include:
Not all of the objects in the heap will be the ArrayList and Integer objects that you are explicitly creating. There will be Object[] objects that get created when you resize the ArrayList, various objects generated by your println calls ... and other things that happen under the hood.
Heap resizing behavior. The heap is not immediately sized to -Xmx size. The JVM typically starts with a smaller heap size and expands it on demand. By the time you get an OOME, the JVM has most likely expanded the heap to the max permitted, but the sequence of expansions is potentially sensitive to ... various factors including some that may be non-deterministic.
Heap generations. A typical Java GC uses an old space and a new space. The old space contains long-lived objects. New objects are allocated into the new space ... unless they are very large. The actual distribution of the objects can affect when GC runs occur, and when the JVM decides that the heap is full.
JIT compilation. At certain points in the execution of your application, the JVM will (typically) decide to JIT compile your code. When this happens, extra objects get allocated.
Is there anything randomized inside the JVM?
It is unlikely to be explicit randomization affecting this benchmark. There is sufficient non-determinism at various levels (i.e. in the hardware, the OS and the JVM) to explain the inconsistent results you are seeing.
In short: I wouldn't expect your benchmark to give consistent results for the number of objects that can be created.
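If the goal is just a shallow size rather than a benchmark, the Instrumentation API avoids this non-determinism entirely. A hedged sketch (the class name is illustrative; it must be packaged in a jar with a Premain-Class manifest entry and attached with -javaagent):

import java.lang.instrument.Instrumentation;

public class SizeAgent {
    private static volatile Instrumentation inst;

    // Called by the JVM before main() when the jar is attached with -javaagent.
    public static void premain(String agentArgs, Instrumentation i) {
        inst = i;
    }

    public static long shallowSizeOf(Object o) {
        return inst.getObjectSize(o); // implementation-specific value, but stable within a run
    }
}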

How much memory is a Stack object allocated upon creation?

I was wondering: when defining a new stack through the Stack class
Stack stack = new Stack();
how much memory is allocated to it? It cannot depend on the number of objects N (like arrays and lists, for example) because it is initialized without any data about the number of objects that will be placed in it.
However, it also doesn't make a lot of sense for it to have a fixed amount of memory like an int or double, for example, because you constantly place objects in it.
Does the push command increase the memory allocation of the stack?
I assume it is placed in the 'heap' memory?
Thanks!
I'm speaking from C#, so bear with me. Whenever you allocate memory for a local variable, it gets allocated on the stack; the heap is for things like objects, where a reference to the object is allocated along with the actual object itself, and the object reference is then used by the garbage collector to figure out which objects need to be cleaned up and which don't.
In this case, I believe you are allocating the object on the heap, because all a "stack" object is, is a FILO data structure.
Stacks in Java only store primitives that exist within a local scope, so the stack size in Java is usually pretty small. The size, however, depends on several factors and is variable at runtime: the initial size, for example, is typically calculated based on how much memory the compiler thinks it will need to run, and as it grows it will increase in size (I think Windows, for example, grows the stack by pages of 256 bytes of memory, but don't hold me to that).
In your case, since you are asking about the initial size of an uninitialized stack object, the size is the size of the Stack object itself, and it changes as you add elements to it.
Hope that helps.
Stack extends Vector, and Stack() calls Vector() implicitly, which uses a default initial capacity of 10.
Stack inherits from Vector. The default constructor of Vector initializes an array with size 10.
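Stack's capacity can actually be observed directly, because it inherits Vector's public capacity() method. A small sketch (the doubling described in the comment is the JDK's java.util.Vector behavior with the default capacityIncrement of 0):

import java.util.Stack;

public class StackCapacityDemo {
    public static void main(String[] args) {
        Stack<Integer> stack = new Stack<>();
        // capacity() is inherited from Vector and reports the backing array length.
        System.out.println("initial capacity: " + stack.capacity());          // 10

        for (int i = 0; i < 11; i++) {
            stack.push(i);
        }
        // With the default capacityIncrement of 0, Vector doubles the backing
        // array when the 11th element no longer fits.
        System.out.println("capacity after 11 pushes: " + stack.capacity());  // 20
    }
}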

Memory area for static methods in Java [duplicate]

For example:
class A {
    static int i = 0;
    static int j;

    static void method() {
        // static k = 0; can't use static for local variables, only final is permitted
        // static int L;
    }
}
Where will these variables be stored in Java, in heap or in stack memory? How are they stored?
Static methods (in fact all methods) as well as static variables are stored in the PermGen section of the heap, since they are part of the reflection data (class-related data, not instance-related). As of Java 8, PermGen has been replaced by Metaspace, and as per JEP 122 it only holds meta-data, while static fields are stored in the heap.
Note that this mostly applies to Oracle's HotSpot JVM and others based on it. However, not every JVM has a PermGen or Metaspace; Eclipse OpenJ9, for example, does not.
Update for clarification:
Note that only the variables and their technical values (primitives or references) are stored in PermGen space.
If your static variable is a reference to an object that object itself is stored in the normal sections of the heap (young/old generation or survivor space). Those objects (unless they are internal objects like classes etc.) are not stored in PermGen space.
Example:
static int i = 1; //the value 1 is stored in the PermGen section
static Object o = new SomeObject(); //the reference(pointer/memory address) is stored in the PermGen section, the object itself is not.
A word on garbage collection:
Do not rely on finalize() as it's not guaranteed to run. It is totally up to the JVM to decide when to run the garbage collector and what to collect, even if an object is eligible for garbage collection.
Of course you can set a static variable to null and thus remove the reference to the object on the heap but that doesn't mean the garbage collector will collect it (even if there are no more references).
Additionally, finalize() is run only once, so you have to make sure it doesn't throw exceptions or otherwise prevent the object from being collected. If you halt finalization through some exception, finalize() won't be invoked on the same object a second time.
A final note: how code, runtime data etc. are stored depends on the JVM which is used, i.e. HotSpot might do it differently than JRockit and this might even differ between versions of the same JVM. The above is based on HotSpot for Java 5 and 6 (those are basically the same) since at the time of answering I'd say that most people used those JVMs. Due to major changes in the memory model as of Java 8, the statements above might not be true for Java 8 HotSpot - and I didn't check the changes of Java 7 HotSpot, so I guess the above is still true for that version, but I'm not sure here.
Prior to Java 8:
The static variables were stored in the PermGen space (also called the method area).
PermGen space is also known as the Method Area.
PermGen space was used to store three things:
Class level data (meta-data)
interned strings
static variables
From Java 8 onwards
The static variables are stored in the heap itself. From Java 8 onwards, the PermGen space has been removed and a new space named Metaspace has been introduced, which is no longer part of the heap, unlike the previous PermGen space. Metaspace lives in native memory (memory provided by the OS to a particular application for its own usage) and it now only stores the class meta-data.
The interned strings and static variables have been moved into the heap itself.
For official information refer to JEP 122: Remove the Permanent Generation.
Class variables (static variables) are stored as part of the Class object associated with that class. This Class object can only be created by the JVM and is stored in the permanent generation.
Also, some have answered that it is stored in a non-heap area called the Method Area. Even this answer is not wrong. It is simply debatable whether the PermGen area is part of the heap or not; obviously perceptions differ from person to person. In my opinion, since we specify heap space and PermGen space with different JVM arguments, it is reasonable to treat them differently.
Another way to see it
Memory pools are created by JVM memory managers during runtime. A memory pool may belong to either heap or non-heap memory. A run-time constant pool is a per-class or per-interface run-time representation of the constant_pool table in a class file. Each run-time constant pool is allocated from the Java Virtual Machine's method area, and static variables are stored in this Method Area.
Also, this non-heap area is nothing but the PermGen area; actually, the Method Area is part of PermGen. (Reference)
This is a question with a simple answer and a long-winded answer.
The simple answer is the heap. Classes and all of the data applying to classes (not instance data) is stored in the Permanent Generation section of the heap.
The long answer is already on stack overflow:
There is a thorough description of memory and garbage collection in the JVM as well as an answer that talks more concisely about it.
It is stored in the heap referenced by the class definition. If you think about it, it has nothing to do with stack because there is no scope.
In addition to Thomas's answer, static variables are stored in a non-heap area which is called the Method Area.
As static variables are class-level variables, they are stored in the "permanent generation" of heap memory.
Please look into this for more details of the JVM. Hope this helps.
static variables are stored in the heap
When we create a static variable or method, it is stored in a special area of the heap: PermGen (Permanent Generation), where it lies alongside all the data applying to classes (non-instance data). Starting from Java 8, PermGen became Metaspace. The difference is that Metaspace is an auto-growing space, while PermGen has a fixed maximum size, and this space is shared among all of the instances. Plus, Metaspace is part of native memory and not JVM memory.
You can look into this for more details.
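To see which of these pools your own JVM actually exposes, the standard MemoryPoolMXBean API can list them. A small sketch (pool names such as "Metaspace" or "PS Perm Gen" are HotSpot-specific and vary by JVM and version):

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryPoolMXBean;

public class MetaspaceProbe {
    public static void main(String[] args) {
        // On a Java 8+ HotSpot JVM one of the non-heap pools is "Metaspace";
        // on Java 7 and earlier you would see a permanent-generation pool instead.
        for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
            System.out.printf("%-25s type=%s used=%d bytes%n",
                    pool.getName(), pool.getType(), pool.getUsage().getUsed());
        }
    }
}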
In a real-world project we know the requirements in advance and need to create variables and methods inside the class. Based on those requirements, we need to decide whether to create:
Local variables (created and accessed within a block, method, or constructor),
Static variables,
Instance variables (every object has its own copy).
The static keyword is used with a variable that will be the same for a particular class across all objects.
For example, in Selenium we declare the WebDriver as static so we do not need to create a WebDriver again and again for every test case:
static WebDriver driver;
(Parallel execution will cause problems with this, but that's another case.)
Real-world scenario: if India is a class, then the flag and the currency would be the same for every Indian, so we might make them static.
Another example: we always declare utility methods as static because they will be used in different test cases.
Static data was stored in the CMA (PermGen space), which had a fixed size; this changed to Metaspace after Java 8, which now grows dynamically.
As of Java 8, PermGen space is obsolete. Static methods, primitives and reference variables are stored in the Java Metaspace. The actual objects reside in the Java heap. Since static methods never go out of reference, they are never garbage collected, from either Metaspace or the heap.

Does this Java example cause a memory leak?

I have a simple example. The example loads an ArrayList<Integer> from a file f containing 10000000 random integers.
doLog("Test 2");
{
FileInputStream fis = new FileInputStream(f);
ObjectInputStream ois = new ObjectInputStream(fis);
List<Integer> l = (List<Integer>) ois.readObject();
ois.close();
fis.close();
doLog("Test 2.1");
//l = null;
doLog("Test 2.2");
}
doLog("Test 2.3");
System.gc();
doLog("Test 2.4");
When the l = null; line is present (uncommented), I get this log:
Test 2 Used Mem = 492 KB Total Mem = 123 MB
Test 2.1 Used Mem = 44 MB Total Mem = 123 MB
Test 2.2 Used Mem = 44 MB Total Mem = 123 MB
Test 2.3 Used Mem = 44 MB Total Mem = 123 MB
Test 2.4 Used Mem = 493 KB Total Mem = 123 MB
But when I remove it, I get this log instead.
Test 2 Used Mem = 492 KB Total Mem = 123 MB
Test 2.1 Used Mem = 44 MB Total Mem = 123 MB
Test 2.2 Used Mem = 44 MB Total Mem = 123 MB
Test 2.3 Used Mem = 44 MB Total Mem = 123 MB
Test 2.4 Used Mem = 44 MB Total Mem = 123 MB
Used Memory is calculated by: runTime.totalMemory() - runTime.freeMemory()
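The doLog helper itself is not shown in the question; a hypothetical reconstruction, assuming it just prints the value computed above (the real helper apparently picks KB or MB adaptively), might look like this:

public class MemLog {
    // Hypothetical stand-in for the question's doLog(...) helper.
    static void doLog(String label) {
        Runtime rt = Runtime.getRuntime();
        long usedKb  = (rt.totalMemory() - rt.freeMemory()) / 1024;
        long totalMb = rt.totalMemory() / (1024 * 1024);
        System.out.println(label + " Used Mem = " + usedKb + " KB Total Mem = " + totalMb + " MB");
    }

    public static void main(String[] args) {
        doLog("Test");
    }
}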
Question: In the case where l = null; is present, is there a memory leak?
l is inaccessible, so why can't it be freed?
There is no memory leak in the above code.
As soon as you leave the code block enclosed in {}, the variable l falls out of scope, and the List is a candidate for garbage collection, regardless of whether you set it to null first or not.
However, after the code block and until the return of the method, the List is in a state called invisible. While this is true, the JVM is unlikely to automatically null out the reference and collect the List's memory. Therefore, explicitly setting l = null can help the JVM collect the memory before you do your memory calculations. Otherwise, it will happen automatically when the method returns.
You will probably get different results for different runs of your code, since you never know exactly when the garbage collector will run. You can suggest that you think it should run using System.gc() (and it might even collect the invisible List even without setting l = null), but there are no promises. It is stated in the javadoc for System.gc():
Calling the gc method suggests that the Java Virtual Machine expend effort toward recycling unused objects in order to make the memory they currently occupy available for quick reuse. When control returns from the method call, the Java Virtual Machine has made a best effort to reclaim space from all discarded objects.
I think there's a bit of semantics issue here. "Memory leak" generally means having some data stored in memory by a program (piece of software, etc) and getting that program into a state where it can no longer access that in-memory data to clean it up, thus getting into a situation where that memory cannot be claimed for future use. This, as far as I can tell, is the general definition.
A real-world use of the term "memory leak" is usually in reference to programming languages where it's up to the developer to manually allocate memory for the data that he intends to place on the heap. Such languages are C, C++, Objective-C (*), etc. For example, the "malloc" command or the "new" operator both allocate memory for an instance of a class that will be placed in the heap memory space. In such languages, a pointer needs to be kept to those thusly allocated instances if we later on want to clean up the memory used by them (when they're no longer needed). Continuing the above example, a pointer referencing an instance that has been created on the heap using "new" can later on be "removed" from memory by using the "delete" command and passing it the pointer as a parameter.
Thus, for such languages, a memory leak usually means having data placed on the heap and subsequently either:
arriving at a state where there's no longer a pointer to that data
or
forgetting/ignoring to manually "de-allocate" that on-the-heap data (via its pointer)
Now, in the context of such a definition of "memory leak", this can pretty much never happen with Java. Technically, in Java it's the Garbage Collector's task to decide when heap-allocated instances are no longer referenced or fall out of scope and to clean them up. There's no equivalent of the C++ "delete" command in Java that would even allow the developer to manually "de-allocate" instances/data from the heap. Even setting all the pointers to an instance to null will not immediately free up that instance's memory; it will only make the instance "garbage collectable", leaving it to the Garbage Collector thread(s) to clean it up when it makes its sweeps.
Now, one other thing that can happen in Java is never letting go of pointers to certain instances, even though they will no longer be needed after a given point, or giving certain instances a scope that's too big for what they are used for. This way, they will hang around in memory longer than needed (or forever, where forever means until the JVM process is killed) and thus not be collected by the Garbage Collector even though, from a functional standpoint, they should be cleaned up. This can lead to behaviour similar to a "memory leak" in the broader sense, where "memory leak" simply stands for "having stuff in memory when it's no longer needed and having no way to clean it up".
Now, as you can see, "memory leak" is somewhat vague, but from what I can see your example doesn't contain a memory leak (even the version where you don't set l = null). All your variables are in a tight scope delimited by the curly-brace block; they are used inside that block and will fall out of scope when the block ends, so they'll be garbage collected "properly" (from the functional standpoint of your program). As @Keppil states, setting the pointer to null gives the GC a better hint as to when to clean up its corresponding instance, but even if you never set it to null, your code will not (unnecessarily) hang on to instances, so there's no memory leak there.
A typical example of a Java memory leak is code deployed into a Java EE application server that spawns threads outside the control of said application server (imagine a servlet that starts a Quartz job). If the application is deployed and undeployed multiple times, it's possible that some of the threads will not be killed at undeploy time but will be (re)started at deploy time, leaving them, and any instances they might have created, hanging uselessly in memory.
(*) The later versions of Objective-C also give the possibility of having heap memory managed automatically, in a fashion similar to Java's garbage collection mechanism.
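A hedged sketch of the redeployment leak described above, reduced to plain Java: a component that starts a worker thread on deploy but never stops it on undeploy. The class and method names are illustrative, not tied to any particular framework.

public class JobStarter {
    private Thread worker;

    public void onDeploy() {
        worker = new Thread(() -> {
            while (true) {                       // never checks for interruption
                doScheduledWork();
                sleepQuietly(60_000);
            }
        }, "quartz-like-job");
        worker.setDaemon(false);
        worker.start();                          // keeps the old classloader reachable
    }

    public void onUndeploy() {
        // Missing: worker.interrupt() plus a cooperative shutdown check in the loop.
        // Without it, every redeploy leaves one more thread (and its classes) behind.
    }

    private void doScheduledWork() { /* ... */ }

    private static void sleepQuietly(long millis) {
        try { Thread.sleep(millis); } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}

Each redeploy that calls onDeploy() without a working onUndeploy() leaves one more live thread behind, and a live thread keeps its classes and its classloader reachable.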
The real answer is that unless the code is JIT'd, all local variables are 'reachable' within the method body.
Moreover, the curly brackets do absolutely nothing in the bytecode. They exist only at the source level; the JVM is absolutely unaware of them. Setting l to null effectively frees the reference up off the stack, so it's GC'd for real. Happy stuff.
If you used another method instead of an inline block, everything would have passed without any surprises (see the sketch below).
If the code is JIT'd and the JVM compiler has built reaching-definitions (also this), most likely setting l = null would have no effect and the memory would be freed in either case.
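A minimal sketch of that "use another method" suggestion, assuming the same serialized list file as in the question (the class and file names here are illustrative):

import java.io.File;
import java.io.FileInputStream;
import java.io.ObjectInputStream;
import java.util.List;

public class LoadInMethod {
    // Reading the list inside its own method keeps the reference in this method's
    // stack frame; once the method returns, nothing reachable points at the list,
    // so no explicit l = null is needed before System.gc().
    static void loadAndMeasure(File f) throws Exception {
        try (ObjectInputStream ois = new ObjectInputStream(new FileInputStream(f))) {
            @SuppressWarnings("unchecked")
            List<Integer> l = (List<Integer>) ois.readObject();
            System.out.println("loaded " + l.size() + " integers");
        }
    }

    public static void main(String[] args) throws Exception {
        File f = new File("ints.bin"); // hypothetical file of serialized integers
        loadAndMeasure(f);
        System.gc();                   // the list is unreachable here and can be reclaimed
    }
}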
Question: In case of removing l = null; (i.e. not having that line of code), is this a memory leak?
No, but it facilitates the GC in reclaiming the memory if you follow this "pattern".

Actual memory allocation by JVM and how do they differ?

This might seem like a lot of questions, but they are all interrelated. I'm a little confused: where is the heap space allocated and where is the stack memory located? If both are present in main memory, then why is it said that stack memory is easier to access, and why can't we allocate objects in stack memory? Since classes are stored in PermGen, where is this space allocated, how does it differ from heap space, and where are constant strings stored?
"Where are the heap and stack allocated?" The accepted answer to this question covers this. Each thread gets its own stack and they all share one heap. The operating system controls the exact memory locations of the stacks and heap items and it varies.
"Why is stack memory easier to access" Each thread has its own stack, so there are fewer concurrency issues. The stack and heap are both eligible for caching in the L1, L2, and L3 portions of the memory hierarchy, so I disagree with Daniel's answer here. Really I would not say that one kind of memory is particularly easier to access than the other.
"Why can't we allocated objects in stack memory?" This is a design decision taken by the JVM. In other languages like C/C++ you can allocate objects on the stack. Once you return from the function that allocated that stack frame such objects are lost. A common source of errors in C/C++ programs is sharing a pointer to such a stack allocated object. I bet that's why the JVM designers made this choice, though I am not sure.
The PermGen is another piece of the heap. Constant strings are stored here for the lifetime of the JVM. It is garbage collected just like the rest of the heap.
If both are present in main memory then why it is said that stack memory is easier to access
There's speed of access and speed of allocation. Stack allocation (as in alloca) is fast because there's no need to search for an unused block of memory. But Java doesn't allow stack allocation, unless you count the allocation of new stack frames.
Accessing stack memory is fast because it tends to be cached. Not only are locals near one another, they are also stored very compactly.
and why can't we allocate objects in stack memory ?
This would be useful, but dangerous. A user could allocate an object on the stack, create references to it from permanent objects, and then try to access the object after the associated stack frame is gone.
It's safe to store primitives on the stack because we can't create references to them from elsewhere.
Since classes are stored in PermGen where is this space allocated and how does it differ from heap space and where are constant strings stored ?
PermGen is just another heap space. String literals are stored in the literal pool, which is just a table in memory which is allocated when a class is loaded.
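As a small illustration of the literal pool behaviour described above (nothing here beyond standard java.lang.String semantics):

public class StringPoolDemo {
    public static void main(String[] args) {
        String a = "hello";                  // literal: placed in the string pool at class load
        String b = "hello";                  // the same literal resolves to the same pooled instance
        String c = new String("hello");      // explicit new: a distinct object on the ordinary heap
        System.out.println(a == b);              // true  - both refer to the pooled instance
        System.out.println(a == c);              // false - c is a separate object
        System.out.println(a == c.intern());     // true  - intern() returns the pooled instance
    }
}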
