I am working on Spring Web MVC and recently encountered java.lang.OutOfMemoryError: Java heap space.
So I was reading about it, and the major mistake I am making is that I am not dereferencing used objects, so the GC cannot reclaim a lot of memory.
Now the question is when to dereference them.
Here is the basic flow:
From the front end, the user sends a request.
The server calls a library with the user's request.
The library returns a big array of results.
The server forwards it to the front end.
Now, up to this point I cannot dereference the results array, as I still need the result objects. Am I correct?
So when the user sends a new request, should I clear the results array and call the library with the new request?
Also, I used -XX:+HeapDumpOnOutOfMemoryError to get a dump file, but I don't see the dump file in the project folder, even though the log says a dump file was created. Has anyone run into this?
Generally the solution for this type of problem is:
(Obvious) Increase the maximum heap size using -Xmx, or get bigger hardware. This approach might not be scalable, but it can provide a short-term solution to the problem.
Ask yourself whether you really need such a big chunk. If not, try requesting smaller chunks instead to conserve heap. Make sure you are not holding a reference to an object any longer than you should; as soon as you know a variable is no longer needed, set it to null so the object can be garbage collected.
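A minimal sketch of that last point, assuming a request-handling shape like yours; "library", "fetchChunk" and "writeToResponse" are made-up placeholder names, not anything from your actual code:

// Ask the library for a slice of results instead of everything at once,
// and drop the reference as soon as the response has been written.
Result[] results = library.fetchChunk(request, 0, 500);   // hypothetical chunked call
writeToResponse(results);
results = null;   // drop the reference once the response is written

For a local variable that goes out of scope when the handler returns, the nulling is mostly documentation; it matters most when the reference lives in a field of a long-lived bean or a session.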
It will be very difficult to help you without an SSCCE (http://sscce.org/). The JVM does the GC for you, and if your objects have well-defined scope they should get GCed automatically.
I would recommend you start by increasing the heap memory and then figure out the following:
What is the scope of the Result array (global, or restricted to the method-call hierarchy, i.e., passed up through the invocation stack to the front end)?
If it has global scope, is it part of a singleton instance or created per request? What objects are stored in the Result array, and are they referenced anywhere else in your code?
You can use jhat to get the object reference graph and find out who else is referencing the objects (DISCLAIMER: it gets messy if the objects stored in the array contain references to other objects, which is usually the case).
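Two practical notes on the tooling, assuming a HotSpot JDK (the path, file name and PID below are just placeholders): the OutOfMemoryError dump is written to the JVM process's working directory by default, which is often not your project folder, and -XX:HeapDumpPath lets you pick the location explicitly. A dump can then be captured and browsed with the stock jmap/jhat tools:

-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/dumps

jmap -dump:live,format=b,file=heap.hprof <pid>
jhat heap.hprof     (then browse http://localhost:7000)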
Better ways to identify objects not getting garbage collected?
Related
We have some legacy code, written in the early 2000s, that was running fine until now; once we added more users to the system it started throwing OutOfMemory exceptions daily.
We are going to redo the project from the ground up, but we want to make the best use of the existing code base.
The code has thousands (yes, not the best design) of HashMaps that are only modified in a loop at creation time and after that remain read-only.
When I run Memory Analyzer, it says that 15% of the heap consists of unused space in HashMaps.
For these instances, if we use reflection to make the internal entry table accessible and resize it so its capacity matches what is actually needed, will iterators fail?
After initial construction these maps are read-only caches. We are running the code on Java 6, and a few instances on Java 8; we won't be upgrading the JVM version, since in another 10-12 months the new code will be in production and will replace the current one.
I know we could give a better initial size when creating each HashMap, but that is a lot of work compared to one function that resizes a map based on its current size. Also, most of the maps are part of libraries (whose source code I do not have access to), and these do not expose a constructor that accepts a size; they only call the default constructor of their internal HashMap.
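To make the "how much slack is there" part concrete, here is roughly the kind of check I have in mind; it is only a sketch that peeks at HotSpot's internal "table" field via reflection (which happens to exist on the Java 6 and 8 builds we run, but is not a public API), and it only measures the waste rather than fixing it:

import java.lang.reflect.Field;
import java.util.HashMap;

public class MapSlack {
    // Prints live entries vs. allocated buckets for one map.
    static void reportSlack(String name, HashMap<?, ?> map) throws Exception {
        Field tableField = HashMap.class.getDeclaredField("table");  // internal bucket array
        tableField.setAccessible(true);
        Object[] table = (Object[]) tableField.get(map);
        int buckets = (table == null) ? 0 : table.length;
        System.out.println(name + ": entries=" + map.size() + ", buckets=" + buckets);
    }
}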
Can someone tell me what the size of a DLFileEntry record is in memory? I'm holding a List of DLFileEntries and I want to be sure that my portlet won't have memory issues after it is deployed on a server operating with a large number of records. Or can someone point me to a guide on how to obtain this information? Thank you.
You could run a quick test or look at the source and identify members. You're probably referring to the binary data, which is not directly stored in an object of this class. However, it will most likely be cached, so yes, there's some memory overhead. Do you actually need the binary data or will you just hold placeholders without accessing the binary data for all documents that you're holding in memory?
(Note: the source code I'm linking is the current master branch; check the version that you're actually using and figure out if something changed. As you don't give the version, I'll leave this task to you. Also, you might want to check the superclasses, though I didn't find anything suspicious in master.)
If I have a List<Object>, would it be possible to run some method on each Object to see how much memory each is consuming? I know nothing about each Object; it may be an entire video file loaded onto the heap or just a two-byte string. I ultimately would like to know which objects to drop first before running out of memory.
I think Runtime.totalMemory() shows the memory currently used by the JVM, but I want to see the memory used by a single object.
SoftReference looks kind of like what you need. Create a list of soft references to your objects; if those objects are not referenced anywhere else and you run out of memory, the JVM will delete some of them. I don't know how smart the algorithm for choosing what to delete is, but it could well remove those that would free the most memory.
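A minimal sketch of that idea, assuming "objects" is your existing List<Object>; note that the original strong references have to be dropped, otherwise the soft references can never be cleared:

import java.lang.ref.SoftReference;
import java.util.ArrayList;
import java.util.List;

List<SoftReference<Object>> cache = new ArrayList<SoftReference<Object>>();
for (Object o : objects) {
    cache.add(new SoftReference<Object>(o));
}
objects.clear();   // drop the strong references so the JVM is free to reclaim entries

// Later, when an entry is needed:
Object value = cache.get(0).get();
if (value == null) {
    // cleared under memory pressure; reload or recompute it
}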
If you are in a container you can use JConsole: http://java.sun.com/developer/technicalArticles/J2SE/jconsole.html
The JDK has shipped heap dump utilities since 1.5... Are you in a container or in Eclipse? Also, why do you have a List of Objects?
There is no clean way to do it. You can create a dummy OutputStream that does nothing but count the number of bytes written; by serializing your object graph to such a stream you can get a rough estimate of its size.
I would not advise doing this in a production system. I personally did it once, for experimenting and making estimates.
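A sketch of that counting stream, under the assumption that the objects in question implement Serializable; the class and method names here are just illustrative:

import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.OutputStream;
import java.io.Serializable;

// Writes nothing, just counts how many bytes pass through.
class CountingOutputStream extends OutputStream {
    long count;
    @Override public void write(int b) { count++; }
    @Override public void write(byte[] b, int off, int len) { count += len; }
}

class SizeEstimator {
    static long serializedSize(Serializable obj) throws IOException {
        CountingOutputStream counter = new CountingOutputStream();
        ObjectOutputStream out = new ObjectOutputStream(counter);
        out.writeObject(obj);
        out.flush();
        return counter.count;   // a rough proxy for the object graph, not its exact heap footprint
    }
}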
Actually, another possible tactic is just to make a crap load of instances of the class you want to check (like a million of them, in an array).
The sheer number of objects should negate the overhead (as in, the overhead of everything else will be much smaller than your crap load of objects).
You will want to run this in isolation, of course (i.e., from a plain public static void main()).
I will admit you will need lots of memory for this test.
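Roughly what that experiment could look like; MyClass below is a throwaway placeholder for whatever class you actually want to size, and System.gc() is only a hint to the collector, so treat the result as an estimate:

public class BulkSizeEstimate {
    static class MyClass { int a; long b; }   // placeholder for the class under test

    public static void main(String[] args) {
        final int COUNT = 1000000;
        Object[] hold = new Object[COUNT];     // keeps every instance reachable during measurement
        Runtime rt = Runtime.getRuntime();

        System.gc();
        long before = rt.totalMemory() - rt.freeMemory();
        for (int i = 0; i < COUNT; i++) {
            hold[i] = new MyClass();
        }
        System.gc();
        long after = rt.totalMemory() - rt.freeMemory();

        System.out.println("approx. bytes per instance: " + (after - before) / (double) COUNT);
    }
}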
Something you could do is make a Map<Object, Long> which maps each object to its memory size.
To measure the size of a particular object, you have to do it at instantiation: record the JVM's memory use before building the object and again after, and take the difference between the two; that is roughly the size of the object in memory. Then add the Object and the Long to your map. From there you should be able to loop through all of the keys in the map and find the object using the largest amount of space.
I am not sure there is a way to do it per object after you already have your List<Object>... I hope this is helpful!
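A sketch of that bookkeeping, using totalMemory() minus freeMemory() as the "memory in use" figure and with the obvious caveat that other allocation and GC activity will skew the per-object numbers; buildNextObject() is a made-up stand-in for however each object currently gets created:

import java.util.IdentityHashMap;
import java.util.Map;

Map<Object, Long> sizes = new IdentityHashMap<Object, Long>();
Runtime rt = Runtime.getRuntime();

long before = rt.totalMemory() - rt.freeMemory();
Object obj = buildNextObject();                       // hypothetical factory call
long after = rt.totalMemory() - rt.freeMemory();
sizes.put(obj, after - before);

// Later: walk the map and drop the entries with the largest recorded sizes first.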
I am making an app for Android; in my Activity I need to load an array of about 10000 strings. Loading it from the database was slow, so I decided to put it directly into one .java file (as a private field). I have about 20 of these classes containing string arrays, and my question is: are all the classes loaded into memory after my application starts? If so, the Activity in which I need these strings would load quickly, but the application as a whole would have a slow start...
Is there another way to very quickly load a 10000-string array from a file?
UPDATE:
Why do I need these strings? My Android app allows you to find "journeys" in Prague's public transit - you choose a departure stop and an arrival stop and it finds your journey (have a look here). My app has a suggestions feature - you enter the letter "c" as your departure stop and a suggestions ListView appears with stops starting with "c". For these suggestions I need the strings. Fetching the suggestions from the database is slow (about 400 ms on a G1).
First, 400ms to perform a simple database query is really slow. So slow that I'd suspect that there is some problem in your database schema (e.g. indices) or your database connection configuration.
But if you are serious about not using a database, there are a couple of alternatives to what you are currently doing:
Arrange that the classes containing the arrays are lazily loaded as required, using Class.forName(...). If you implement it right, it should be possible for the garbage collector to reclaim the classes after they have been loaded and the strings have been added to your primary data structure.
Turn the 10000 Strings into a flat file and put the file into your app's JAR file. Then use Class.getResourceAsStream(...) to open the file and read it into the in-memory array (see the sketch after this list).
As above, but using an indexed file and replacing the array with a data structure that lets you read the Strings from the file lazily. (This will be a bit more complicated, but if you are worried about the memory consumed by the 10000 Strings, it will help address that.)
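A rough sketch of the second option, assuming one stop name per line; the resource name "stops.txt" and the class MyActivity are made up for the example:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;

List<String> stops = new ArrayList<String>(10000);
BufferedReader reader = new BufferedReader(
        new InputStreamReader(MyActivity.class.getResourceAsStream("/stops.txt"), "UTF-8"));
try {
    String line;
    while ((line = reader.readLine()) != null) {
        stops.add(line);
    }
} finally {
    reader.close();
}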
A class is loaded only when it is first referenced.
Though you need an array of 10000, you may not need all of it at once. This is where the concept of paging comes in. This link indicates that paging is often done on Android. Initially keep only a small part of the array in memory; as you need more, load it and unload any previously loaded data you no longer want.
For example, in any table the user may see at most 50 records at once; then he will have to scroll (assuming his screen is not the size of an IMAX movie theatre). When he scrolls, load the next chunk of data and unload any data that is now invisible to the user.
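A very rough sketch of that load/unload cycle; loadRows() is a hypothetical helper that fetches rows [offset, offset + PAGE_SIZE) from the database or a file:

private static final int PAGE_SIZE = 50;
private final List<String> visibleRows = new ArrayList<String>();

void showPage(int offset) {
    visibleRows.clear();                              // unload what the user scrolled away from
    visibleRows.addAll(loadRows(offset, PAGE_SIZE));
    // hand visibleRows to the adapter / view layer here
}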
When is a type loaded? This is a surprisingly tricky question to answer. This is due in large part to the significant flexibility afforded, by the JVM spec, to JVM implementations. Loading must be performed before linking, and linking must be performed before initialization. The VM spec does stipulate the timing of initialization. It strictly requires that a type be initialized on its first active use (see Appendix A for a list of what constitutes an "active use"). This means that loading (and linking) of a type MUST be performed at or before that type's first active use.
From http://www.developer.com/java/other/article.php/2248831/Java-Class-Loading-The-Basics.htm
I don't think you will be happy maintaining 10K Strings hardcoded in Java files.
Rather, check whether you are using the right database for your problem and whether your indices are set correctly. A wrong index can cause really poor performance.
Additionally, you should limit the number of results returned by the query, but make sure you don't fetch the entries one by one.
If nothing fits, you can still preload the Strings from the database at startup.
You could preload, say, 10 entries for each character. When a character is keyed in, you can preload the entries for that character followed by another, and so on.
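A small sketch of that preloading idea; queryStops() is a hypothetical database lookup returning at most "limit" stop names starting with the given prefix:

private final Map<String, List<String>> suggestionCache = new HashMap<String, List<String>>();

List<String> suggestionsFor(String prefix) {
    List<String> hits = suggestionCache.get(prefix);
    if (hits == null) {
        hits = queryStops(prefix, 10);     // hit the database only the first time a prefix is typed
        suggestionCache.put(prefix, hits);
    }
    return hits;
}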
Is there any way in Java to delete data (e.g., a variable value or an object) and be sure it can't be recovered from memory? Does assigning null to a variable in Java delete the value from memory? Any ideas? Answers applicable to other languages are also acceptable.
Due to the wonders of virtual memory, it is nearly impossible to delete something from memory in a completely irretrievable manner. Your best bet is to zero out the value fields; however:
This does not mean that an old (unzeroed) copy of the object won't be left on an unused swap page, which could persist across reboots.
Nor does it stop someone from attaching a debugger to your application and poking around before the object gets zeroed, or from crashing the VM and poking around in the heap dump.
Store sensitive data in an array, then "zero" it out as soon as possible.
Any data in RAM can be copied to the disk by a virtual memory system. Data in RAM (or a core dump) can also be inspected by debugging tools. To minimize the chance of this happening, you should strive for the following:
keep the time window during which a secret is present in memory as short as possible
be careful about IO pipelines (e.g., BufferedInputStream) that internally buffer data
keep references to the secret on the stack and out of the heap
don't use immutable types, like String, to hold secrets
The cryptographic APIs in Java use this approach, and any APIs you create should support it too. For example, KeyStore.load allows a caller to clear the password char[] when the call completes, as does the KeySpec used for password-based encryption.
Ideally, you would use a finally block to zero the array, like this:
KeyStore ks = KeyStore.getInstance(KeyStore.getDefaultType());
InputStream is = …
char[] pw = System.console().readPassword();
try {
    ks.load(is, pw);
} finally {
    Arrays.fill(pw, '\0');   // wipe the password as soon as the load has finished
}
Nothing gets deleted; it's just a matter of whether the data is still accessible to the application or not.
Once it is inaccessible, the space becomes a candidate for reuse when the need arises, and it will eventually be overwritten.
In the case of direct memory access, something is always there to read, but it might be junk and won't make sense.
Setting your object to null doesn't mean that the object is removed from memory. The virtual machine will flag the object as ready for garbage collection if there are no more references to it. Depending on your code, it might still be referenced even though you have set your variable to null, in which case it will not be removed. (Essentially, if you expect it to be garbage collected and it is not, you have a memory leak!)
Once it is flagged as ready for collection, you have no control over when the garbage collector will actually remove it. You can mess around with garbage collection strategies, but I wouldn't advise it.
Profile your application, look at the object and its id, and you can see what is referencing it. Java provides VisualVM with 1.6.0_07 and above, or you can use NetBeans.
As zacherates said, zero out the sensitive fields of your Object before removing references to it. Note that you can't zero out the contents of a String, so use char arrays and zero each element.
Nope, not unless you have direct access to the hardware. There is a chance the variable will be cached somewhere; sensitive data can even end up in swap :) If you're concerned only about RAM, you can play with the garbage collector. In high-level languages you usually don't have direct access to memory, so it's not possible to control this aspect. For example, in .NET there is a class SecureString which uses interop and direct memory access.
I would think that your best bet (that isn't complex) is to use a char[] and then overwrite each position in the array. The other comments about it possibly being copied elsewhere in memory still apply.
Primitive data (byte, char, int, double) and arrays of them (byte[], ...) are erasable by writing new random content into them.
Object data has to be sanitized by overwriting its primitive fields; setting a variable to null just makes the object available for GC, not immediately dead. A dump of the VM will still contain it for anyone to see.
Immutable data such as a String cannot be overwritten in any way; any modification just makes a copy. You should avoid keeping sensitive data in such objects.
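A minimal sketch of the two points above; readPassword() and deriveKey() are hypothetical stand-ins for wherever the secret actually comes from:

import java.security.SecureRandom;
import java.util.Arrays;

char[] password = readPassword();           // hypothetical source of the secret
byte[] key = deriveKey(password);           // hypothetical
try {
    // ... use the secret ...
} finally {
    Arrays.fill(password, '\0');            // wipe the char data in place
    new SecureRandom().nextBytes(key);      // or Arrays.fill(key, (byte) 0)
}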
P.S. If we are talking about passwords, it's better to use cryptographically strong hash functions (MD5, SHA-1, ...) and never, ever work with passwords in clear text.
If you're thinking about securing password/key management, you could write some JNI code that uses platform-specific API to store the keys in a secure way and not leak the data into the memory managed by the JVM. For example, you could store the keys in a page locked in physical memory and could prevent the IO bus from accessing the memory.
EDIT: To comment on some of the previous answers: the JVM may relocate your objects in memory without erasing their previous locations, so even char[], byte, int and the other "erasable" data types aren't an answer if you really want to make sure that no sensitive information is stored in the memory managed by the JVM or swapped onto the hard drive.
Making something totally and completely irretrievable is almost impossible in this day and age.
When you normally delete something, the only thing that happens is that the first spot in memory is emptied. That first spot used to contain the information about how far the memory had been reserved for that program or something else.
But all the other info is still there until it's overwritten by something else.
I suggest either TinyShredder, or using CCleaner set to the Gutmann pass.