I'm trying to sort data where the size of the input to the program can be larger than the memory available to the JVM; handling that requires an external sort, which is much slower than Quicksort.
Is there any way of obtaining the memory available to the JVM at runtime, so that I could use in-place sorting as much as possible and only switch to Mergesort when the input is too large?
Check out these methods on the java.lang.Runtime class:
freeMemory
totalMemory
maxMemory
Example
Runtime rt = Runtime.getRuntime();
System.err.println(String.format("Free: %d bytes, Total: %d bytes, Max: %d bytes",
rt.freeMemory(), rt.totalMemory(), rt.maxMemory()));
Also note that if the total amount of memory is being exhausted you can always start the JVM with a larger amount of heap allocated using the -Xmx JVM argument; e.g.
java -Xmx256M MyClass
You can use the Runtime class to get the amount of memory available.
Runtime r = Runtime.getRuntime();
System.out.println(r.totalMemory());
There are various other memory details you can get from the Runtime object - see Runtime class.
In theory yes, using Runtime.getRuntime().maxMemory().
In practice, there are some problems you need to address:
You need to figure out how many application objects are going to fit in a given number of bytes of memory. AFAIK, there is no simple / efficient way to do this within a running application.
You don't want to try to use all available heap space. If you push your percentage heap residency too high, you risk making the GC horribly inefficient.
The maxMemory() method only tells you how big the heap is in virtual memory. The physical size can also be a factor (especially if physical size << virtual size), and there's no portable way to figure that out.
If I was trying to implement this application, I think I'd probably just make the in-memory sort size a configuration parameter or command-line option.
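A minimal sketch of that idea, assuming a hypothetical -DinMemorySortBytes system property for the threshold (the property name and the "half the headroom" heuristic are made up for this example, not part of any standard API):
static boolean fitsInMemory(long estimatedInputBytes) {
    // Hypothetical override, e.g. java -DinMemorySortBytes=268435456 MySorter
    Long configured = Long.getLong("inMemorySortBytes");
    if (configured != null && configured > 0) {
        return estimatedInputBytes <= configured;
    }
    // Otherwise fall back to a conservative estimate based on the heap.
    Runtime rt = Runtime.getRuntime();
    long used = rt.totalMemory() - rt.freeMemory();
    long headroom = rt.maxMemory() - used;
    return estimatedInputBytes <= headroom / 2;   // leave the GC some breathing room
}
Callers would then use Quicksort when fitsInMemory(...) returns true and fall back to the external Mergesort otherwise.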
Using the methods of Runtime, as the others suggested, is fine, as long as you take some things into consideration:
1) freeMemory() is a lower bound on the actual available memory, because memory that is unreferenced and ready for GC is considered as used. Running System.gc() before the call may return a more accurate result.
2) totalMemory() can change - it only indicates the current total heap size, and the heap can expand/shrink by the JVM during runtime, depending on its usage. You can use maxMemory() to get the actual maximum.
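Putting the two points together, a rough sketch for estimating how much heap the application can still use (it is only an approximation, for the reasons above):
Runtime rt = Runtime.getRuntime();
long used = rt.totalMemory() - rt.freeMemory();   // currently occupied, including not-yet-collected garbage
long available = rt.maxMemory() - used;           // roughly how far the heap can still grow
System.out.println("Approximately " + available + " bytes of heap still available");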
Related
I'm having trouble with Java memory consumption.
I'd like to say to Java something like this: "you have 8GB of memory, please use it, and only it. Only if you really can't put all your resources in this memory pool, then fail with OOM".
I know there are parameters like -Xmx, but they limit only the heap. There are also plenty of other parameters, I know. The problems with these parameters are:
They aren't relevant. I don't want to limit the heap size to 6GB (and trust that native memory won't take more than 2GB). I do want to limit all the memory (heap, native, whatever). And do that effectively, not just saying "-Xmx1GB" - to be safe.
There are too many different parameters related to memory, and I don't know how to configure all of them to achieve the goal.
So, I don't want to go there and care about heap, perm and whatever types of memory. My high-level expectation is: since there is only 8GB, and some static memory is needed - take the static memory from the 8GB, and carefully split the remaining memory between other dynamic memory entities.
Also, ulimit and similar things don't work for me. I don't want to kill the Java process once it consumes more memory than expected. I want Java to do its best not to reach the limit in the first place, and only if it really, really can't, kill the process.
And I'm OK to define even 100 java parameters, why not. :) But then I need assistance with the full list of needed parameters (for, say, Java 8).
Have you tried -XX:MetaspaceSize?
Is this what you need?
Please, read this article: http://karunsubramanian.com/websphere/one-important-change-in-memory-management-in-java-8/
Keep in mind that this is only valid for Java 8.
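For example (the sizes are placeholders, and MyApp stands in for your main class):
java -XX:MetaspaceSize=128m -XX:MaxMetaspaceSize=512m MyApp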
AFAIK, there is no java command line parameter or set of parameters that will do that.
Your best bet (IMO) is to set the max heap size and the max metaspace size and hope that other things are going to be pretty static / predictable for your application. (It won't cover the size of the JVM binary and it probably won't cover native libraries, memory mapped files, stacks and so on.)
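For example (the values and the MyApp class name are placeholders; -Xss additionally caps each thread's stack):
java -Xmx6g -XX:MaxMetaspaceSize=512m -Xss1m MyApp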
In a comment you said:
So I'm forced to have a significant amount of memory unused to be safe.
I think you are worrying about the wrong thing here. Assuming that you are not constrained by address space or swap space limitations, memory that is never used doesn't matter.
If a page of your address space is not used, the OS will (in the long term) swap it out, and give the physical RAM page to something else.
Pages in the heap won't be in that situation in a typical Java application. (Address space pages will cycle between in-use and free as the GC moves objects within and between "spaces".)
However, the flip-side is that a GC needs the total heap size to be significantly larger than the sum of the live objects. If too much of the heap is occupied with reachable objects, the interval between garbage collection runs decreases, and your GC ergonomics suffer. In the worst case, a JVM can grind to a halt as the time spent in the GC tends to 100%. Ugly. The GC overhead limit mechanism prevents this, but that just means that your JVM gets an OOME sooner.
So, in the normal heap case, a better way to think about it is that you need to keep a portion of memory "unused" so that the GC can operate efficiently.
I have an application that causes an OutOfMemoryError, so I try to debug it using Runtime.getRuntime().freeMemory(). Here is what I get:
freeMemory=48792216
## Reading real sentences file...map size=4709. freeMemory=57056656
## Reading full sentences file...map size=28360. freeMemory=42028760
freeMemory=42028760
## Reading suffix array files of main corpus ...array size=513762 freeMemory=90063112
## Reading reverse suffix array files... array size=513762. freeMemory=64449240
I'm trying to understand the behaviour of freeMemory. It starts with 48 MB, then - after I read a large file - it jumps UP to 57 MB, then down again to 42 MB, then - after I read a very large file (513762 elements) - it jumps UP to 90 MB, then down again to 64 MB.
What happens here? How can I make sense of these numbers?
Java memory is a bit tricky. Your program runs inside the JVM, the JVM runs inside the OS, and the OS uses your computer's resources. When your program needs memory, the JVM first checks whether it has already obtained memory from the OS that is currently unused; if there isn't enough, the JVM will ask the OS and, if possible, obtain some more.
From time to time, the JVM will look around for memory that is not used anymore, and will free it. Depending on a (huge) number of factors, the JVM can also give that memory back to the OS, so that other programs can use it.
This means that, at any given moment, you have a certain quantity of memory the JVM has obtained from the OS, and a certain amount the JVM is currently using.
At any given point, the JVM may refuse to acquire more memory because it has been instructed to do so, or the OS may deny the JVM access to more memory, either because it too was instructed to do so, or simply because there is no more free RAM.
When you run your program on your computer, you are probably not giving any limit to the JVM, so you can use plenty of RAM. When running on Google Apps, there could be some limits imposed on the JVM by Google's operators, so the available memory may be less.
Runtime.freeMemory will tell you how much of the RAM the JVM has obtained from the OS is currently free.
When you allocate a big object, say one MB, the JVM may request more RAM from the OS, say 5 MB, resulting in freeMemory being 4 MB more than before, which is counterintuitive. Allocating another MB will probably shrink free memory as expected, but later the JVM could decide to release some memory to the OS, and freeMemory will shrink again for no apparent reason.
Using totalMemory and maxMemory in combination with freeMemory you can have a better insight of your current RAM limits and consumption.
To understand why you are consuming more RAM than you would expect, you should use a memory profiler. A simple but effective one is packaged with VisualVM, a tool usually already installed with the JDK. There you'll be able to see what is using RAM in your program and why that memory cannot be reclaimed by the JVM.
(Note, the memory system of the JVM is by far more complicated than this, but I hope that this simplification can help you understand more than a complete and complicated picture.)
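To see this behaviour concretely, here is a small demo you could run locally (the 50 MB figure is arbitrary) that prints the three Runtime numbers before and after a large allocation:
Runtime rt = Runtime.getRuntime();
System.out.printf("before: free=%d total=%d max=%d%n",
        rt.freeMemory(), rt.totalMemory(), rt.maxMemory());
byte[] big = new byte[50 * 1024 * 1024];   // ~50 MB
System.out.printf("after:  free=%d total=%d max=%d%n",
        rt.freeMemory(), rt.totalMemory(), rt.maxMemory());
System.out.println("allocated " + big.length + " bytes");
// "free" can move up or down here: the JVM may have grown the heap (totalMemory)
// by more than the 50 MB that was actually requested.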
It's not terribly clear or user friendly. If you look at the Runtime API you see three different memory-related calls:
freeMemory - Returns the amount of free memory in the Java Virtual Machine. Calling the gc method may result in increasing the value returned by freeMemory.
totalMemory - Returns the total amount of memory in the Java virtual machine. The value returned by this method may vary over time, depending on the host environment.
maxMemory - Returns the maximum amount of memory that the Java virtual machine will attempt to use.
When you start up the JVM, you can set the initial heap size (-Xms) as well as the max heap size (-Xmx). e.g. java -Xms100m -Xmx200m starts with a heap of 100m, will grow the heap as more space is needed up to 200m, and will fail with an OutOfMemoryError if it needs to grow beyond that. So there's a ceiling, which gives you maxMemory().
The memory currently available in the JVM is somewhere between your starting and max. Somewhere. That's your totalMemory(). freeMemory() is how much is free out of that total.
To add to the confusion, see what they say about gc - "Calling the gc method may result in increasing the value returned by freeMemory." This implies that uncollected garbage is not included in free memory.
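So if you want a number closer to "free after collecting garbage", one option (still imprecise, and generally discouraged outside of debugging) is to request a GC first:
Runtime rt = Runtime.getRuntime();
System.out.println("free before gc: " + rt.freeMemory());
System.gc();   // only a hint; the JVM is free to ignore it
System.out.println("free after gc:  " + rt.freeMemory());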
OK, based on your comments I wrote this function, which prints a summary of memory measures:
static String memory() {
    final int unit = 1000000; // bytes per (decimal) MB
    Runtime rt = Runtime.getRuntime();
    long usedMemory = rt.totalMemory() - rt.freeMemory();
    long availableMemory = rt.maxMemory() - usedMemory;
    return "Memory: free=" + rt.freeMemory() / unit
            + " total=" + rt.totalMemory() / unit
            + " max=" + rt.maxMemory() / unit
            + " used=" + usedMemory / unit
            + " available=" + availableMemory / unit;
}
It seems that the best measures for how much my program is using are usedMemory, and the complementary availableMemory. They increase/decrease monotonically when I use more memory:
Memory: free=61 total=62 max=922 used=0 available=921
Memory: free=46 total=62 max=922 used=15 available=906
Memory: free=46 total=62 max=922 used=15 available=876
Memory: free=44 total=118 max=922 used=73 available=877
Memory: free=97 total=189 max=922 used=92 available=825
Try running your app against something like http://download.oracle.com/javase/1.5.0/docs/guide/management/jconsole.html.
It comes with the JVM (or certainly used to) and is invaluable in terms of monitoring what is happening inside the JVM during the execution of an application.
It'll provide more useful insight into what is going on with your memory than your debug statements.
Also, if you are really keen, you can learn a bit more about tuning garbage collection via something like:
http://www.oracle.com/technetwork/java/gc-tuning-5-138395.html
This is pretty in depth, but it is good to get an insight into the various generations of memory in the JVM and how objects are retained in these generations. If you are seeing that objects are being retained in old gen and old gen is continually increasing, then this could be an indicator of a leak.
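If you want to watch the collections and generations yourself, a typical set of logging flags from that pre-Java 9 era looks like this (MyClass is a placeholder for your main class):
java -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps MyClass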
For debugging why data is being retained and not collected, you can't go past profilers. Check out JProfiler or Yourkit.
Best of luck.
I have written some code and I want to pass in lists of different sizes, but when the size of my list goes over 1024 it throws the exception below. How can I handle it?
size, running time for x
2,184073
3,98308
5,617257
9,481714379
17,55230
33,64505
65,41094
129,65120
257,102555
513,197511
1025,465897
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at OBSTclasses.MemoizedVersion.<init>(MemoizedVersion.java:33)
at OBSTclasses.Coordinator.main(Coordinator.java:102)
Java Result: 1
The line that throws this exception is:
minAverageTimeArray = new double[array.size()][array.size()];
thanks
You'll have to increase the Java VM's heap space, see here: http://hausheer.osola.com/docs/5
As malfy's answer mentions, if one encounters an OutOfMemoryError, aside from finding a way to use less memory, increasing the heap space by telling the JVM to allocate more memory to the heap is one way to handle the situation.
In general, one should not perform error handling against an Error such as an OutOfMemoryError. An Error, as opposed to an Exception, is a condition thrown by the JVM which notes that a problem fatal to the JVM has occurred, which is something that can't truly be "handled" by the program itself.
From the Java API Specification for the Error class:
An Error is a subclass of Throwable that indicates serious problems that a reasonable application should not try to catch. Most such errors are abnormal conditions.
So, to answer the question concisely, you should not be error handling the OutOfMemoryError, but find ways to avoid that Error from occurring in the first place.
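If you would rather avoid triggering the Error at all, here is a hedged sketch (reusing the array and minAverageTimeArray names from the question) that estimates the allocation up front and bails out, or lets you switch strategy, when it clearly won't fit:
int n = array.size();
long needed = (long) n * n * 8;   // 8 bytes per double; ignores per-row array overhead
Runtime rt = Runtime.getRuntime();
long available = rt.maxMemory() - (rt.totalMemory() - rt.freeMemory());
if (needed > available / 2) {     // keep some headroom for the GC
    // too big for the current heap: shrink the input, use a sparser structure, or raise -Xmx
    throw new IllegalArgumentException("An input of size " + n + " needs roughly " + needed + " bytes");
}
minAverageTimeArray = new double[n][n];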
It sounds like you need to increase the maximum amount of memory that your java process is allowed to use. Add a parameter like -Xmx512m to your java invocation, where 512m means 512 megabytes.
A possible reason for an OutOfMemoryError could be a memory leak.
Solution:
Increase the heap size using the following options.
Usage: java -Xms<initial heap size> -Xmx<maximum heap size>
Defaults are: java -Xms32m -Xmx128m
Other values might be: java -Xms128m -Xmx512m
-Xms - initial heap size
-Xmx - maximum heap size
m = megabytes
As coobird said, never handle an Error in Java.
You can use MAT (Memory Analyzer - http://www.eclipse.org/mat/) to check whether you really have a memory leak or whether the heap is simply too small for your JVM. In the case of a memory leak you can optimize the memory footprint using the results from MAT. Otherwise you can increase the heap size as many others have already mentioned.
Yep, it's your heap space alright. By default Java allocates 128MB to the heap on most platforms. You should consider the maximum size of list that you're willing to process, and therefore how much memory you need. Think about it this way: a variable of type double in Java is usually 8 bytes long. If your lists are 1024 items long, then your 2D array will need 8 * 1024 * 1024 bytes, or 8MB, of heap space just for the array itself. Now, if your lists double in length, you'll need four times as much heap (32MB), and if they double again (4096 items) you'll need all 128MB of your heap space! This is, of course, ignoring all the heap used by other objects created by your program.
So, you have a few answers. As others have said, the easiest is to increase the maximum heap a JVM instance will use. You should also consider reducing the amount of memory your program needs at any one time. Are there intermediate summations or averages you can compute without needing to store all of your data? Or can you eliminate the need to store your data as both a list and an array? Can you lower the precision of your data - will floats or integers be as accurate as doubles for your application?
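For instance, if single precision is acceptable for your running times, the array from the question could be declared as float[][] instead of double[][], halving its footprint (a sketch; only do this if the rest of the code tolerates the reduced precision):
// 4 bytes per element instead of 8: a 1024 x 1024 matrix drops from ~8 MB to ~4 MB
float[][] minAverageTimeArray = new float[array.size()][array.size()];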
I want to calculate the heap usage for my app. I would like to get a percentage value of the heap size only.
How do I get the value in code for the current running app?
EDIT
There was an upvoted answer that was NOT complete/correct. The values returned by those methods include stack and method area too, and I need to monitor only heap size.
With that code I got a heap error when I reached 43%, so I can't use those methods to monitor just the heap:
Runtime.getRuntime().totalMemory()
dbyme's answer is not accurate - these Runtime calls give you the amount of memory used by the JVM, but this memory does not consist only of the heap; there is also the stack and the method area, for example.
This information is exposed over the JMX management interface. If you simply want to look at it, JConsole or visualvm (part of the JDK, installed in JAVA_HOME/bin) can display nice graphs of a JVM's memory usage, optionally broken down into the various memory pools.
This interface can also be accessed programmatically; see MemoryMXBean.
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
MemoryMXBean bean = ManagementFactory.getMemoryMXBean();
long usedHeapBytes = bean.getHeapMemoryUsage().getUsed();
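From there, a heap-only usage percentage (what the question asks for) can be derived as follows; a minimal sketch, keeping in mind that getMax() returns -1 when no maximum is defined:
MemoryUsage heap = bean.getHeapMemoryUsage();   // also needs: import java.lang.management.MemoryUsage;
long max = heap.getMax() > 0 ? heap.getMax() : heap.getCommitted();
double heapUsedPercent = 100.0 * heap.getUsed() / max;
System.out.printf("Heap used: %.1f%%%n", heapUsedPercent);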
There really is no good answer, since how much heap memory the JVM has free is not the same as how much heap memory the operating system has free, which are both not the same as how much heap memory can be assigned to your application.
This is because the JVM and OS heaps are different. When the JVM runs out of memory, it may run garbage-collection, defragment its own heap, or request more memory from the OS. Since unused non-garbage-collected objects still exist, but are technically "free", they make the concept of free memory a bit fuzzy.
Also, heap memory fragments; how/when/if memory is defragmented is up to the implementation of the JVM/OS. For example, the OS-heap may have 100MB of free memory, but due to fragmentation, the largest available contiguous space may be 2MB. Thus, if the JVM requests 3MB, it may get an out-of-memory error, even though 100MB are still available. It is not possible for the JVM to know ahead of time that the OS won't be able to allocate that 3MB.
I want to limit the maximum memory used by the JVM. Note, this is not just the heap, I want to limit the total memory used by this process.
Use the arguments -Xms<memory> -Xmx<memory>. Use M or G after the number to indicate megabytes or gigabytes respectively. -Xms indicates the minimum and -Xmx the maximum heap size.
You shouldn't have to worry about the stack leaking memory (it is highly uncommon). The only time you can have the stack get out of control is with infinite (or really deep) recursion.
This is just the heap. Sorry, didn't read your question fully at first.
You need to run the JVM with the following command line argument.
-Xmx<amount of memory>
Example:
-Xmx1024m
That will allow a max of 1GB of memory for the JVM.
If you want to limit the memory of the whole JVM process (not just the heap size), you can use ulimit:
ulimit -v <max virtual memory, in kilobytes>
To get an idea of the difference between JVM memory and heap memory, take a look at this excellent article:
http://blogs.vmware.com/apps/2011/06/taking-a-closer-look-at-sizing-the-java-process.html
The answer above is kind of correct: you can't gracefully control how much native memory a Java process allocates. It depends on what your application is doing.
That said, depending on the platform, you may be able to use some mechanism, ulimit for example, to limit the size of a Java process (or any other process).
Just don't expect it to fail gracefully if it hits that limit. Native memory allocation failures are much harder to handle than allocation failures on the java heap. There's a fairly good chance the application will crash but depending on how critical it is to the system to keep the process size down that might still suit you.
The native heap can be increased with -XX:MaxDirectMemorySize=256M
(the default is 128)
I've never used it. Maybe you'll find it useful.