I was looking at the following article:
Increase heap size in Java
Now I have a program that needs about 5 GB of memory, and even after doing what the article says (increasing the heap size by passing -Xmx5g in the arguments field), I am still getting
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
My system is Windows 7 (64-bit) with 8 GB of RAM. Am I doing something wrong? If so, how do I get a 5 GB heap, or is it simply not feasible for my system to handle?
Note: I have to do calculations on a 2D matrix of size 25K*25K in which all values are non-zero, so I cannot use a sparse matrix either.
OutOfMemoryError is thrown when the JVM does not have enough memory for the objects being allocated. If you defined a 5 GB heap, this almost certainly means you have some kind of memory leak. For example, I can write very simple code that will cause an OutOfMemoryError in any environment:
import java.util.LinkedList;
import java.util.List;

// Elements are added forever and never removed, so the heap keeps growing until it is exhausted.
List<String> list = new LinkedList<>();
while (true) {
    list.add("a");
}
Run this code and wait a few seconds; an OutOfMemoryError will be thrown, because strings are added to the list and it is never cleared.
I believe that something similar happens with your application.
I understand that your case is not as trivial as my example, so you will probably have to use a profiler to debug it and understand the cause of your memory leak.
EDIT:
I've just seen that you are working with a 25K*25K matrix, which means you have 625M cells. You have not mentioned the element type of the matrix, but if it is int, which occupies 4 bytes, you need 625M*4 = 2500 MB = 2.5 GB of memory, so 5 GB should be enough.
Please try to analyze what else happens in your program and where your memory is spent.
5 GB / (25K * 25K) ≈ 8 bytes.
Generously assuming that your program uses no memory except for that matrix, each matrix element can take no more than 8 bytes.
You should calculate at least approximate memory requirements to check whether it is even possible to handle a problem of this size on your hardware. For example, if you need a 2D array of M x N double values, then you need at least 8*M*N bytes of memory.
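As a quick sanity check, a minimal sketch along those lines (the 25K*25K size is taken from the question above; everything else the program allocates comes on top of this):

long m = 25000L, n = 25000L;
long matrixBytes = 8L * m * n;                        // dense double matrix: ~5 GB
long maxHeap = Runtime.getRuntime().maxMemory();      // what -Xmx actually gave the JVM
System.out.printf("matrix needs ~%d MB, max heap is %d MB%n",
        matrixBytes / (1024 * 1024), maxHeap / (1024 * 1024));
if (matrixBytes > maxHeap) {
    System.out.println("The matrix alone will not fit in this heap.");
}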
Related
I am trying to read a text file and store every line in an ArrayList.
However, the text file is too long (about 2,000,000 lines) and the error java.lang.OutOfMemoryError occurs.
How do I know if the ArrayList is full, and how can I then create another ArrayList to store the remaining data automatically?
Sorry for my poor English.
Thanks for your help.
2 million lines is far below the maximum size of a Java Collection (Integer.MAX_VALUE, roughly 2 billion entries), so the list itself is not the limit.
You are more likely getting a heap-space OutOfMemoryError. You can do either of the following:
Increase your JVM maximum heap memory allocation.
java -Xmx4g
4g = 4GB.
The default maximum heap size is half of the physical memory up to a physical memory size of 192 megabytes and otherwise one fourth of the physical memory up to a physical memory size of 1 gigabyte.
http://www.oracle.com/technetwork/java/javase/6u18-142093.html
As konsolas recommends, read the file line by line, store the results in a file, and clear the variable.
Hope it helps!
This depends on what you are planning to do with the file. You're definitely not going to be able to store all of it in memory, as shown by your error.
Depending on what you're trying to do with the file, processing it in blocks and then saving it would be a better idea; a sketch of this approach follows the list. For example:
Read the first 1000 lines of the file
Process these lines/save into another file, etc.
Read the next 1000 lines
etc.
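A minimal sketch of that block-by-block approach (the file names are just placeholders, and writing each block straight to the output file stands in for whatever processing you actually need):

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.io.PrintWriter;
import java.util.ArrayList;
import java.util.List;

// Read the input in blocks of 1000 lines so only one block is in memory at a time.
try (BufferedReader reader = new BufferedReader(new FileReader("input.txt"));
     PrintWriter writer = new PrintWriter("output.txt")) {
    List<String> block = new ArrayList<>(1000);
    String line;
    while ((line = reader.readLine()) != null) {
        block.add(line);
        if (block.size() == 1000) {
            for (String s : block) {
                writer.println(s);   // stand-in for real per-block processing
            }
            block.clear();           // drop the lines so they can be garbage collected
        }
    }
    for (String s : block) {         // write out the final partial block
        writer.println(s);
    }
} catch (IOException e) {
    e.printStackTrace();
}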
An ArrayList can theoretically hold 2,147,483,647 items (Integer.MAX_VALUE).
As the other answers suggest, your problem is that you run out of memory before your ArrayList is full. If you still don't have enough memory after increasing the heap size, BigArrayList may solve your problem. It functions like a normal ArrayList and automatically handles swapping data between disk and memory. Note that the library currently supports a limited number of operations, which may or may not be sufficient for you.
I am getting:
java.lang.OutOfMemoryError : Java heap space
Caused by: java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:2894)
at java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:117)
at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:407)
at java.lang.StringBuilder.append(StringBuilder.java:136)
Ultimately you always have a finite maximum amount of heap to use, no matter what platform you are running on. On 32-bit Windows this is around 2 GB (not specifically heap, but the total amount of memory per process). It just happens that Java makes the default smaller (presumably so that the programmer can't create programs with runaway memory allocation without running into this problem and having to examine exactly what they are doing).
Given this, there are several approaches you could take: either determine how much memory you need, or reduce the amount of memory you are using. One common mistake with garbage-collected languages such as Java or C# is to keep references to objects that you no longer use, or to allocate many objects when you could reuse them instead. As long as objects have a reference to them, they will continue to use heap space, because the garbage collector will not delete them.
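A small, contrived illustration of that mistake (all the names here are made up for the example):

import java.util.ArrayList;
import java.util.List;

class ChunkProcessor {
    // Every chunk ever handled stays reachable from this long-lived list,
    // so the garbage collector can never reclaim it.
    private final List<byte[]> history = new ArrayList<byte[]>();

    void handle(byte[] chunk) {
        history.add(chunk);   // reference kept forever: a leak if nothing ever clears it
    }

    // Fix: drop the references once a batch is finished and the data is no longer needed.
    void afterBatch() {
        history.clear();      // the chunks become eligible for garbage collection
    }
}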
In this case you can use a Java memory profiler to determine what methods in your program are allocating large numbers of objects, and then determine if there is a way to make sure they are no longer referenced, or to not allocate them in the first place. One option which I have used in the past is "JMP": http://www.khelekore.org/jmp/.
If you determine that you are allocating these objects for a reason and you need to keep around references (depending on what you are doing this might be the case), you will just need to increase the max heap size when you start the program. However, once you do the memory profiling and understand how your objects are getting allocated you should have a better idea about how much memory you need.
In general if you can't guarantee that your program will run in some finite amount of memory (perhaps depending on input size) you will always run into this problem. Only after exhausting all of this will you need to look into caching objects out to disk etc. At this point you should have a very good reason to say "I need Xgb of memory" for something and you can't work around it by improving your algorithms or memory allocation patterns. Generally this will only usually be the case for algorithms operating on large datasets (like a database or some scientific analysis program) and then techniques like caching and memory mapped IO become useful.
The OutOfMemoryError is usually caused by the VM not having enough memory to run your project. Did you run it directly from the command line, or did you use an IDE?
For example, try running your program with the -Xmx1G option, which allocates 1 GB of heap memory to it; you can of course adjust the value to your needs. The G stands for gigabytes and m stands for megabytes.
You should give the heap a bigger size for it to work.
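For example, from the command line (the jar name is only a placeholder):

java -Xmx1G -jar myprogram.jar

If you launch the program from an IDE instead, the same flag goes into the run configuration's VM arguments.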
I am running a memory intensive application. Some info about the environment:
64 bit debian
13 GB of RAM
64 bit JVM (I output System.getProperty("sun.arch.data.model") when my program runs, and it says "64")
Here is the exact command I am issuing:
java -Xmx9000m -jar "ale.jar" testconfig
I have run the program with same exact data, config, etc. on several other systems, and I know that the JVM uses (at its peak) 6 GB of memory on those systems. However, I am getting an OutOfMemory error. Furthermore, during the execution of the program, the system never drops below 8.5 GB of free memory.
When I output Runtime.getRuntime().maxMemory() during execution, I get the value 3044540416, i.e. ~ 3 GB.
I don't know whether it is relevant, but this is a Google Compute Engine instance.
The only explanation I can think of is that there may be some sort of system restriction on the maximum amount of memory that a single process may use.
-Xmx will only set the maximum assigned memory. Use -Xms to specify the minimum. Setting them to the same value will make the memory footprint static.
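For example, reusing the command from the question and fixing both the minimum and the maximum at 9000 MB:

java -Xms9000m -Xmx9000m -jar "ale.jar" testconfig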
The only explanation I can think of is that there may be some sort of system restriction on the maximum amount of memory that a single process may use.
That is one possible explanation.
Another one is that you are attempting to allocate a really large array. The largest possible arrays are 2^31 - 1 elements, but the actual size depends on the element size:
byte[] or boolean[] ... 2 GB
char[] or short[] ... 4 GB
int[] ... 8 GB
long[] or Object[] ... 16 GB
If you allocate a really large array, the GC needs to find a contiguous region of free memory of the required size. Depending on the array size, and how the heap space is split into spaces, it may be able to find considerably less contiguous space than you think.
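As a rough illustration of that arithmetic (the element count is arbitrary; a long element is 8 bytes, as is an Object[] slot on a 64-bit JVM without compressed references):

long elements = 500000000L;          // half a billion entries
long needed = elements * 8L;         // long[]: ~4 GB that must be one contiguous free block
System.out.println("array alone needs ~" + (needed / (1024 * 1024)) + " MB of contiguous heap");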
A third possibility is that you are getting OOMEs because the GC is hitting the GC Overhead limit for time spent running the GC.
Some of these theories could be confirmed or dismissed if you showed us the stacktrace ...
I have written some code and I want to pass it lists of different sizes, but when the size of my list goes over 1024, it throws the exception below. How can I handle it?
size, running time for x
2,184073
3,98308
5,617257
9,481714379
17,55230
33,64505
65,41094
129,65120
257,102555
513,197511
1025,465897
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at OBSTclasses.MemoizedVersion.<init>(MemoizedVersion.java:33)
at OBSTclasses.Coordinator.main(Coordinator.java:102)
Java Result: 1
and the line that throws this exception is:
minAverageTimeArray = new double[array.size()][array.size()];
thanks
You'll have to increase the Java VM's heap space, see here: http://hausheer.osola.com/docs/5
As malfy's answer mentions, if one encounters an OutOfMemoryError, aside from finding a way to use less memory, increasing the heap space by telling the JVM to allocate more memory to the heap is one way to handle the situation.
In general, one should not perform error handling against an Error such as an OutOfMemoryError. An Error, as opposed to an Exception, is a condition thrown by the JVM indicating that a problem fatal to the JVM has occurred, something that can't truly be "handled" by the program itself.
From the Java API Specification for the Error class:
An Error is a subclass of Throwable that indicates serious problems that a reasonable application should not try to catch. Most such errors are abnormal conditions.
So, to answer the question concisely, you should not be error handling the OutOfMemoryError, but find ways to avoid that Error from occurring in the first place.
It sounds like you need to increase the maximum amount of memory that your java process is allowed to use. Add a parameter like -Xmx512m to your java invocation, where 512m means 512 megabytes.
A possible reason for an OutOfMemoryError could be a memory leak.
Solution:
Increase the heap size by using the following command
Usage: java -Xms<initial heap size> -Xmx<maximum heap size>
Defaults are: java -Xms32m -Xmx128m
Other values might be: java -Xms128m -Xmx512m
-Xms - initial heap size
-Xmx - maximum heap size
m - megabytes
As coobird said, never handle the Error in Java.
You can use MAT (Memory Analyzer - http://www.eclipse.org/mat/) to check whether you really have a memory leak or whether the heap is simply too small for the JVM. In the case of a memory leak, you can optimize the memory footprint using MAT's results. Otherwise, you can increase the heap size, as already mentioned above by many others.
Yep, it's your heapspace alright. By default Java allocates 128MB to the heap on most platforms. You should consider the maximum size of list that you're willing to process, and therefore how much memory you need. Think about it this way: a variable of type double in Java is usually 8 bytes long. If your lists are 1024 items long, then your 2D array will need 8 * 1024 * 1024 bytes, or 8MB, of heap space just for the array itself. Now, if your lists double in length, you'll need four times as much heap (32MB), and if they double again (4096 items) you'll need all 128MB of your heap space! This is, of course, ignoring all the heap used by other objects created by your program.
So, you have a few answers. As others have said, the easiest is to increase the maximum heap a JVM instance will use. You should also consider reducing the amount of memory your program needs at any one time. Are there intermediate summations or averages you can compute without needing to store all of your data? Or can you eliminate the need to store your data as both a list and an array? Can you lower the precision of your data - will floats or integers be as accurate as doubles for your application?
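For example, a running average can be computed without keeping all the values in memory (a minimal sketch; the file name is just a placeholder):

import java.io.File;
import java.io.FileNotFoundException;
import java.util.Scanner;

// Keeps only a sum and a count instead of every value, so memory use stays constant
// no matter how long the input is.
double sum = 0.0;
long count = 0;
try (Scanner in = new Scanner(new File("data.txt"))) {
    while (in.hasNextDouble()) {
        sum += in.nextDouble();
        count++;
    }
} catch (FileNotFoundException e) {
    e.printStackTrace();
}
double average = (count == 0) ? 0.0 : sum / count;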
I am trying to insert about 50,000 objects (and therefore 50,000 keys) into a java.util.HashMap<java.awt.Point, Segment>. However, I keep getting an OutOfMemory exception. (Segment is my own class - very lightweight - one String field and 3 int fields.)
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at java.util.HashMap.resize(HashMap.java:508)
at java.util.HashMap.addEntry(HashMap.java:799)
at java.util.HashMap.put(HashMap.java:431)
at bus.tools.UpdateMap.putSegment(UpdateMap.java:168)
This seems quite ridiculous since I see that there is plenty of memory available on the machine - both in free RAM and HD space for virtual memory.
Is it possible Java is running with some stringent memory requirements? Can I increase these?
Is there some weird limitation with HashMap? Am I going to have to implement my own? Are there any other classes worth looking at?
(I am running Java 5 under OS X 10.5 on an Intel machine with 2GB RAM.)
You can increase the maximum heap size by passing -Xmx128m (where 128 is the number of megabytes) to java. I can't remember the default size, but it strikes me that it was something rather small.
You can programmatically check how much memory is available by using the Runtime class.
// Get current size of heap in bytes
long heapSize = Runtime.getRuntime().totalMemory();
// Get maximum size of heap in bytes. The heap cannot grow beyond this size.
// Any attempt to grow beyond it will result in an OutOfMemoryError.
long heapMaxSize = Runtime.getRuntime().maxMemory();
// Get amount of free memory within the heap in bytes. This size will increase
// after garbage collection and decrease as new objects are created.
long heapFreeSize = Runtime.getRuntime().freeMemory();
(Example from Java Developers Almanac)
This is also partially addressed in Frequently Asked Questions About the Java HotSpot VM, and in the Java 6 GC Tuning page.
Some people are suggesting changing the parameters of the HashMap to tighten up the memory requirements. I would suggest to measure instead of guessing; it might be something else causing the OOME. In particular, I'd suggest using either NetBeans Profiler or VisualVM (which comes with Java 6, but I see you're stuck with Java 5).
Another thing to try if you know the number of objects beforehand is to use the HashMap(int initialCapacity, float loadFactor) constructor instead of the default no-arg one, which uses defaults of (16, 0.75). If the number of elements in your HashMap exceeds (capacity * loadFactor), then the underlying array in the HashMap will be resized to the next power of 2 and the table will be rehashed. This array also requires a contiguous area of memory, so if, for example, you're doubling from a 32768-element to a 65536-element array, you'll need a 256 kB chunk of free memory. To avoid the extra allocation and rehashing penalties, just use a larger hash table from the start. It'll also decrease the chance that you won't have a contiguous area of memory large enough to fit the map.
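A minimal example of presizing for the roughly 50,000 entries from the question (66700 is just 50000 / 0.75 rounded up; Point and Segment are the key and value types from the question):

import java.awt.Point;
import java.util.HashMap;
import java.util.Map;

// Asking for capacity 66700 makes HashMap pick the next power of two (131072),
// whose threshold (131072 * 0.75 = 98304) comfortably exceeds 50,000 entries,
// so no resize or rehash happens while the segments are inserted.
Map<Point, Segment> segments = new HashMap<Point, Segment>(66700, 0.75f);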
The implementations are backed by arrays usually. Arrays are fixed size blocks of memory. The hashmap implementation starts by storing data in one of these arrays at a given capacity, say 100 objects.
If it fills up the array and you keep adding objects, the map needs to secretly increase its array size. Since arrays are fixed in size, it does this by creating an entirely new, slightly larger array in memory alongside the current one. This is referred to as growing the array. Then all the items from the old array are copied into the new array, and the old array is dereferenced in the hope that it will be garbage collected and the memory freed at some point.
Usually the code that increases the capacity of the map by copying items into a larger array is the cause of such a problem. There are "dumb" implementations and smart ones that use a growth factor (together with a load factor) to determine the size of the new array based on the size of the old one. Some implementations hide these parameters and some do not, so you cannot always set them. The problem is that when you cannot set them, the implementation chooses some default growth factor, like 2, so the new array is twice the size of the old. Now your supposedly 50k map has a backing array of 100k.
Look to see if you can raise the load factor to 1.0 or so. A higher load factor causes more hash map collisions, which hurts performance, but it keeps the backing table smaller, and you are hitting a memory bottleneck and need to make that trade.
Use this constructor:
(http://java.sun.com/javase/6/docs/api/java/util/HashMap.html#HashMap(int, float))
You probably need to set the flag -Xmx512m or some larger number when starting java. I think 64mb is the default.
Edited to add:
After you figure out how much memory your objects are actually using with a profiler, you may want to look into weak references or soft references to make sure you're not accidentally holding some of your memory hostage from the garbage collector when you're no longer using them.
Also might want to take a look at this:
http://java.sun.com/docs/hotspot/gc/
Implicit in these answers is that Java has a fixed limit on its memory and doesn't grow beyond the configured maximum heap size. This is unlike, say, C, where a process is constrained only by the machine on which it is run.
By default, the JVM uses a limited heap space. The limit is JVM implementation-dependent, and it's not clear which JVM you are using. On OSes other than Windows, a 32-bit Sun JVM on a machine with 2 GB or more will use a default maximum heap size of 1/4 of the physical memory, or 512 MB in your case. However, the default for a "client"-mode JVM is only a 64 MB maximum heap size, which may be what you've run into. Other vendors' JVMs may select different defaults.
Of course, you can specify the heap limit explicitly with the -Xmx<NN>m option to java, where <NN> is the number of megabytes for the heap.
As a rough guess, your hash table should only be using about 16 Mb, so there must be some other large objects on the heap. If you could use a Comparable key in a TreeMap, that would save some memory.
See "Ergonomics in the 5.0 JVM" for more details.
The Java heap space is limited by default, but that still sounds extreme (though how big are your 50000 segments?)
I am suspecting that you have some other problem, like the buckets in the map growing too big because everything gets assigned to the same "slot" (which also affects performance, of course). However, that seems unlikely if your points are uniformly distributed.
I'm wondering though why you're using a HashMap rather than a TreeMap? Even though points are two dimensional, you could subclass them with a compare function and then do log(n) lookups.
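A sketch of that idea using a Comparator instead of subclassing (ordering by x and then y is an arbitrary but consistent total order; Segment is the value type from the question):

import java.awt.Point;
import java.util.Comparator;
import java.util.Map;
import java.util.TreeMap;

// Orders points by x, breaking ties by y, so TreeMap can keep them sorted
// and answer lookups in O(log n) without hash buckets.
Comparator<Point> byXThenY = new Comparator<Point>() {
    public int compare(Point a, Point b) {
        if (a.x != b.x) {
            return (a.x < b.x) ? -1 : 1;
        }
        return (a.y < b.y) ? -1 : ((a.y > b.y) ? 1 : 0);
    }
};
Map<Point, Segment> segments = new TreeMap<Point, Segment>(byXThenY);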
Random thought: the hash buckets associated with HashMap are not particularly memory efficient. You may want to try TreeMap as an alternative and see if it still provides sufficient performance.