I am trying to set the Xmx parameter when starting up a program. If I set it to 1408M, the JRE starts up fine. If I set it to 1536M, I get
"Could not create the java virtual machine".
I understand that the JVM is trying to reserve a contiguous block of memory, but the machine I'm running on has 16 GB of RAM and 13 GB of that is currently free. The program is running out of heap space and crashing on me. Is there anything I can do to fix this?
Use a 64-bit JVM. The 32-bit JVM is limited (depending on the OS) to at most about 3 GB (on Linux I have a limit of about 1.5 GB).
32-bit JVMs are limited to roughly 1.5 GB of heap space because of address-space constraints and the memory the process needs for everything other than the heap. On Windows, 2 GB of address space is assigned to the process, and roughly 0.5 GB of that is used for non-heap memory. With the /3GB boot option on Windows Server, or on Linux, you can address up to 3 or 4 GB, respectively.
Otherwise use a 64-bit JVM.
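If you are not sure which JVM a given machine actually ends up running, a quick check along these lines can help (a minimal sketch; sun.arch.data.model is HotSpot-specific and may be absent on other JVMs):

    // Minimal check of which JVM you are actually running and what heap it got.
    public class JvmInfo {
        public static void main(String[] args) {
            // Architecture the JVM itself was built for (e.g. "x86" vs "amd64").
            System.out.println("JVM arch:      " + System.getProperty("os.arch"));
            // HotSpot-specific property; may be null on other JVMs.
            System.out.println("Data model:    " + System.getProperty("sun.arch.data.model") + "-bit");
            // The heap ceiling the running JVM actually accepted (roughly the effective -Xmx).
            System.out.println("Max heap (MB): " + Runtime.getRuntime().maxMemory() / (1024 * 1024));
        }
    }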
Related
I have an application that on launch requests a specific amount of RAM using the following command.
java -Xms512m -Xmx985m -jar someJarfile.jar
This command fails to run on my computer with 8.0GB of RAM because it can not create an object heap of the specified size. If I lower the max range to something below 700MB it works fine.
What is even stranger is that even a simple java -Xmx768m -version fails when the -Xmx flag value exceeds 700m. I am trying to run it with Java 1.7u67 32-bit (that is what the jar was built with), and I have also tried newer versions of Java 1.7 and even Java 1.8. I would understand if the max heap were higher and I were using 32-bit, but it is not above the ~1.4 GB cap of 32-bit Java.
Is there a configuration parameter that I am missing somewhere, or some sort of software that may be interfering? It does not make sense to me why I cannot allocate a 700 MB heap on a machine with 8.0 GB of RAM.
I should also note that there are no other processes running that are taking up all of my RAM. It is a fresh install of Windows 7.
While 700 MB is pretty low, it is not surprising.
The 32-bit emulation layer (WOW64) in Windows works the same way as 32-bit Windows XP, with all its limitations. It means you lose 2 GB of a potential 4 GB of address space to the OS, and the libraries already loaded into the process use up more of the virtual address space. If your program uses shared libraries, or off-heap storage such as direct memory and memory-mapped files, that also takes virtual memory away from the heap. Effectively you are limited to about 1.4 GB of virtual memory for the heap, no matter how much memory you actually have.
The simple way around this is to use the 64-bit JVM, which runs in your 64-bit OS. It is also limited, but to 192 TB of virtual memory on Windows.
You should try using a 64-bit Java runtime. It is probably the case that there is no 985 MB contiguous chunk of memory free within the 32-bit address space of the process (the 32-bit address space is 4 GB). When you use a 64-bit Java runtime, Java allocates the memory within the 64-bit address space, where a free block that large is much more likely to be available.
It doesn't matter that your JAR file was built using a 32 bit version.
The answer to your question may lie in the fact that Windows tries and fails to find a contiguous block of memory that is large enough: see http://javarevisited.blogspot.nl/2013/04/what-is-maximum-heap-size-for-32-bit-64-JVM-Java-memory.html. (Though this suggests that other processes are hogging memory, which seems to be contradicted by your last remark.)
I'm struggling with Java heap space settings. The default Java on Windows is the 32-bit client regardless of OS version (that's what Oracle recommends to all users). It appears to set max heap size to 256 MB by default, and that is too little for me.
I use a custom launcher to start the application. I would like it to use more memory on computers with plenty of RAM, and default to -Xmx512m on those with less. As far as I'm aware, the only way is the static -Xmx setting (which has to be set at launch).
I have a user who has 8 GB RAM, 64-bit Windows and 32-bit Java 7. Maximum memory visible to the JVM is 4G (as returned by querying OperatingSystemMXBean). I understand why, no issue.
For some reason my application is unable to start for this user with -Xmx1300m, even though he has 2.3G free memory. He closed some applications (having 5G free memory), and still it would not launch. The error reported to me was:
error occurred during init of vm
could not reserve enough space for object heap
What's going on? Could it be that the 32-bit JVM is only able to address the "first" 4G of memory and has to have a 1300M block available within those first 4 gigabytes?
How can I solve this problem, except for asking everyone to install 64-bit Java (which is unlikely to be acceptable)?
EDIT: In case it matters, it is a fat Swing client, not an applet.
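For reference, the "maximum memory visible to the JVM" figure mentioned above can be read roughly like this (a sketch; the cast assumes the HotSpot-specific com.sun.management extension of OperatingSystemMXBean):

    import java.lang.management.ManagementFactory;

    // Sketch of the physical-memory query referred to above.
    // The cast only works on HotSpot/Oracle JVMs, which expose the
    // com.sun.management extension of OperatingSystemMXBean.
    public class PhysicalMemory {
        public static void main(String[] args) {
            com.sun.management.OperatingSystemMXBean os =
                    (com.sun.management.OperatingSystemMXBean)
                            ManagementFactory.getOperatingSystemMXBean();
            System.out.println("Physical memory visible to the JVM: "
                    + os.getTotalPhysicalMemorySize() / (1024 * 1024) + " MB");
        }
    }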
It is not a question of memory but a question of address space.
Of those 4 GB (2^32 bytes) theoretically addressable by a 32-bit process, the OS kernel needs a part of the address space (which the process obviously cannot touch).
And when you use Java, the address space of the java process itself is split further between the heap, the permgen, native resources, the JVM itself, etc.
You are using a 64-bit OS: run a 64-bit JVM. Your bytecode (i.e. all your jars) will run just the same. There is simply no reason to be using a 32-bit JVM!
Why won't it work?
As others have mentioned, this particular user's computer most likely does not have a large enough contiguous block of free memory for the JVM in a 32-bit address space. The maximum 32-bit heap space is system-dependent (note that both the OS and the exact JVM version make a difference) but is usually around 1100-1600 MB on Windows. For example, on my 64-bit Windows 7 system, these are my maximum -Xmx sizes for the specific 32-bit JVM versions I have installed:
Java 7: between 1100m and 1200m
Java 6: between 1400m and 1500m
Java 5: between 1500m and 1600m
The remaining memory allocated to the process is used by the OS (or emulator, in the case of a 32-bit process on a 64-bit host), the JVM itself, and other structures used by the JVM.
Recommended solution: bundle a 64-bit JVM with your application
If you cannot get the client to install a 64-bit JVM, bundle one with your application. The 64-bit address space will have a contiguous block of memory larger than 1300 MB free, even if there is not necessarily a large enough contiguous block of physical memory available.
If your software is a standalone application, it's a piece of cake to bundle the JVM. If the launcher is an applet, you might have to have your applet check for the 64-bit JVM in a known location (and download it if necessary) before launching your application.
Don't forget about dependencies
If your application uses 32-bit native libraries, make sure you can also get 64-bit versions of those native libraries.
What if you can't bundle a JVM or have 32-bit native dependencies?
There's really no reason why you shouldn't be able to bundle a JVM with your application, but you might have some 32-bit native dependencies that haven't been ported to 64-bit, in which case it won't matter whether you bundle a JVM because you're still stuck with 32-bit. In that case, you can make your launcher perform a binary search to determine the maximum heap size by repeatedly executing java -Xmx####m -version and parsing the output (where #### is the Xmx value, of course); a sketch of this probing approach appears after the error listings below. Once you've found the maximum Xmx, you can launch your program with that value. (Note: a slightly safer option would be to simply try to run your program and check for the heap space error, decreasing Xmx after each failed attempt to launch your program.)
You'll need to use a smaller Xmx value if you get an error message like one of the following:
Java 7:
Error occurred during initialization of VM
Could not reserve enough space for object heap
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.
Java 6:
Error occurred during initialization of VM
Could not reserve enough space for object heap
Could not create the Java virtual machine.
Java 5:
Error occurred during initialization of VM
Could not reserve enough space for object heap
Could not create the Java virtual machine.
But if you get something like the following, you know you've either found the maximum or you can try a larger Xmx value:
Java(TM) SE Runtime Environment (build 1.7.0_21-b11)
Java HotSpot(TM) Client VM (build 23.21-b01, mixed mode)
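As a rough illustration of the probing approach described above, here is a minimal sketch. It assumes a java launcher on the PATH and relies only on the child's exit code (non-zero when the heap cannot be reserved); the search bounds and the 16 MB granularity are arbitrary choices:

    import java.io.IOException;

    // Sketch: binary-search the largest -Xmx (in MB) that the installed JVM accepts,
    // by repeatedly probing "java -Xmx<n>m -version".
    public class MaxHeapProbe {

        // Returns true if the JVM starts successfully with the given -Xmx (in MB).
        static boolean heapWorks(int mb) throws IOException, InterruptedException {
            Process p = new ProcessBuilder("java", "-Xmx" + mb + "m", "-version")
                    .redirectErrorStream(true)
                    .start();
            // Drain the (small) output so the child can never block on a full pipe.
            byte[] buf = new byte[4096];
            while (p.getInputStream().read(buf) != -1) { /* discard */ }
            return p.waitFor() == 0;
        }

        public static void main(String[] args) throws Exception {
            int lo = 256;            // assumed to always work
            int hi = 4096;           // assumed to fail on a 32-bit JVM
            while (hi - lo > 16) {   // stop at 16 MB granularity
                int mid = (lo + hi) / 2;
                if (heapWorks(mid)) {
                    lo = mid;        // mid works, search higher
                } else {
                    hi = mid;        // mid fails, search lower
                }
            }
            System.out.println("Largest working -Xmx is roughly " + lo + "m");
        }
    }

In a real launcher you would probably invoke the bundled or installed java.exe by its full path rather than relying on the PATH.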
I agree with the previous answers. Just a couple of additional comments. Yes, 32 bits can theoretically address 4 GB, but in Windows the top half of that is reserved for the operating system and the bottom half is for applications. Since Windows considers Java a "user" program, not part of the OS, the best you could ever do is 2 GB, and in practice it's a lot less than that. 1.2 GB sounds about right.
However, when running in 32-bit mode I would actually recommend dropping down to 1024M. If you absolutely max out the heap space, you may run into a more serious problem where you run out of "native memory". If you have never experienced that before, it's a real treat: instead of getting a nice Java stack trace, the whole JVM immediately crashes.
And I agree with everyone else that you need to bite the bullet and enhance your application to support the 64-bit JVM. In my case we have a service wrapper so we needed to redistribute both JVMs and then a 32-bit service wrapper and a 64-bit service wrapper. The user could then register either the 32-bit or 64-bit version as needed.
A 32-bit JVM on Windows XP is limited to about 1.2 - 1.4 GB of contiguous memory. Even on a 64-bit OS, the 32-bit emulation layer works the same way as Windows XP for compatibility, i.e. it has the same limitations.
If you want to use more memory, run the 64-bit JVM. Unless you depend on 32-bit DLLs, there is little reason not to.
The application I work on requires as much memory as I can give it. Through trial and error I have found that under Windows the most I can reliably assign to a 32-bit JVM is about 1200 MB. It varies slightly, but I've never known it to drop below that. Under Linux running OpenJDK I can sometimes assign 1300 MB. There are many reasons for this limit, but from what I've read one of the main issues stopping the JVM from acquiring a larger heap is the requirement that the heap be one contiguous block of memory.
As you are on a 64-bit machine running a 64-bit operating system, I'd strongly recommend just switching to a 64-bit JVM. You can then assign essentially unlimited amounts of memory. My experimentation, however, indicates that beyond about 10 GB of memory it's a serious case of diminishing returns, as the JVM doesn't seem to use it very well and performance suffers. I believe Java 8 will manage large amounts of memory better.
I've always been able to allocate 1400 megabytes for Java SE running on 32-bit Windows XP (Java 1.4, 1.5 and 1.6).
java -Xmx1400m ...
Today I tried the same option on a new Windows XP machine using Java 1.5_16 and 1.6.0_07 and got the error:
Error occurred during initialization of VM
Could not reserve enough space for object heap
Could not create the Java virtual machine.
Through trial and error it seems 1200 megabytes is the most I can allocate on this machine.
Any ideas why one machine would allow 1400 and another only 1200?
Edit: The machine has 4GB of RAM with about 3.5GB that Windows can recognize.
Keep in mind that Windows has virtual memory management and the JVM only needs memory that is contiguous in its address space, so other programs running on the system shouldn't necessarily impact your heap size. What will get in your way are DLLs that get loaded into your address space. Unfortunately, optimizations in Windows that minimize the relocation of DLLs during linking make it more likely you'll have a fragmented address space. Aside from the usual stuff, things likely to cut into your address space include security software, CBT software, spyware and other forms of malware. Likely causes of the variance are different security patches, C runtime versions, etc. Device drivers and other kernel bits have their own address space (the other 2 GB of the 4 GB 32-bit space).
You could try going through the DLL bindings in your JVM process and look at rebasing your DLLs into a more compact address space. Not fun, but if you are desperate...
Alternatively, you can just switch to 64-bit Windows and a 64-bit JVM. Despite what others have suggested, while it will chew up more RAM, you will have much more contiguous virtual address space, and allocating 2GB contiguously would be trivial.
This has to do with contiguous memory.
Here's some info I found online for somebody asking that before, supposedly from a "VM god":
The reason we need a contiguous memory region for the heap is that we have a bunch of side data structures that are indexed by (scaled) offsets from the start of the heap. For example, we track object reference updates with a "card mark array" that has one byte for each 512 bytes of heap. When we store a reference in the heap we have to mark the corresponding byte in the card mark array. We right shift the destination address of the store and use that to index the card mark array. Fun addressing arithmetic games you can't do in Java that you get to (have to :-) play in C++.

Usually we don't have trouble getting modest contiguous regions (up to about 1.5GB on Windohs, up to about 3.8GB on Solaris. YMMV.). On Windohs, the problem is mostly that there are some libraries that get loaded before the JVM starts up that break up the address space. Using the /3GB switch won't rebase those libraries, so they are still a problem for us.

We know how to make chunked heaps, but there would be some overhead to using them. We have more requests for faster storage management than we do for larger heaps in the 32-bit JVM. If you really want large heaps, switch to the 64-bit JVM. We still need contiguous memory, but it's much easier to get in a 64-bit address space.
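As a toy illustration of the indexing described in the quote (not HotSpot's actual code; the heap base and size here are made up), the card-mark arithmetic looks roughly like this:

    // Toy model of the card mark array: one byte per 512 bytes of heap,
    // indexed by right-shifting the offset from the start of the heap.
    public class CardMarkDemo {
        static final int CARD_SHIFT = 9;                  // 2^9 = 512 bytes per card
        static final long HEAP_BASE = 0x20000000L;        // hypothetical heap start
        static final long HEAP_SIZE = 64L * 1024 * 1024;  // hypothetical 64 MB heap

        static final byte[] cardTable = new byte[(int) (HEAP_SIZE >> CARD_SHIFT)];

        // A reference store at this heap address dirties the covering card.
        static void markCard(long storeAddress) {
            int cardIndex = (int) ((storeAddress - HEAP_BASE) >> CARD_SHIFT);
            cardTable[cardIndex] = 1; // dirty
        }

        public static void main(String[] args) {
            markCard(HEAP_BASE + 12345);
            System.out.println("Dirtied card #" + (12345 >> CARD_SHIFT)); // card 24
        }
    }

A single flat array covering the whole heap is exactly why the heap has to be one contiguous range of addresses.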
The Java heap size limits for Windows are:
maximum possible heap size on 32-bit Java: 1.8 GB
recommended heap size limit on 32-bit Java: 1.5 GB (or 1.8 GB with /3GB option)
This doesn't help you getting a bigger Java heap, but now you know you can't go beyond these values.
Oracle JRockit, which can handle a non-contiguous heap, can have a Java heap size of 2.85 GB on Windows 2003/XP with the /3GB switch. It seems that fragmentation can have quite an impact on how large a Java heap can be.
The JVM needs contiguous memory, and depending on what else is running, what was running before, and how Windows has managed its memory, you may be able to get up to 1.4 GB of contiguous memory. I think 64-bit Windows will allow larger heaps.
Sun's JVM needs contiguous memory, so the maximum amount of available heap is dictated by memory fragmentation. Driver DLLs in particular tend to fragment the address space when they load at their predefined base addresses. So your hardware and its drivers determine how much memory you can get.
Two sources for this with statements from Sun engineers: forum blog
Maybe another JVM? Have you tried Harmony? I think they planned to allow non-contiguous memory.
I think it has more to do with how Windows is configured as hinted by this response:
Java -Xmx Option
Some more testing: I was able to allocate 1300 MB on an old Windows XP machine with only 768 MB of physical RAM (plus virtual memory). On my 2 GB RAM machine I can only get 1220 MB. On various other corporate machines (with older Windows XP) I was able to get 1400 MB. The machine with the 1220 MB limit is pretty new (just purchased from Dell), so maybe it has newer (and more bloated) Windows and DLLs (it's running Windows XP Pro Version 2002 SP2).
I got this error message when running a java program from a (limited memory) virtuozzo VPS. I had not specified any memory arguments, and found I had to explicitly set a small amount as the default must have been too high. E.g. -Xmx32m (obviously needs to be tuned depending on the program you run).
Just putting this here in case anyone else gets the above error message without specifying a large amount of memory like the questioner did.
Sun's JDK/JRE needs a contiguous block of memory if you allocate a huge heap.
The OS and the programs loaded early tend to allocate bits and pieces of the address space during loading, which fragments it. If a large enough contiguous block is NOT available, the Sun JDK cannot use it. JRockit from BEA (acquired by Oracle) can allocate the heap from non-contiguous pieces.
Everyone seems to be answering about contiguous memory, but have neglected to acknowledge a more pressing issue.
Even with 100% contiguous memory allocation, you can't have a 2 GiB heap size on a 32-bit Windows OS (*by default). This is because 32-bit Windows processes cannot address more than 2 GiB of space.
The Java process will contain perm gen (pre Java 8), stack size per thread, JVM / library overhead (which pretty much increases with each build) all in addition to the heap.
Furthermore, JVM flags and their default values change between versions. Just run the following and you'll get some idea:
java -XX:+PrintFlagsFinal -version
Lots of the options affect how memory is divided in and out of the heap, leaving you with more or less of that 2 GiB to play with...
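If you'd rather check a specific flag from inside a running program, here is a small sketch using the HotSpot-specific diagnostic MXBean (requires Java 7+ for getPlatformMXBean, HotSpot only; the flag names are just examples):

    import java.lang.management.ManagementFactory;
    import com.sun.management.HotSpotDiagnosticMXBean;

    // Sketch: print a couple of HotSpot flag values from inside the running JVM.
    public class ShowVmFlags {
        public static void main(String[] args) {
            HotSpotDiagnosticMXBean hotspot =
                    ManagementFactory.getPlatformMXBean(HotSpotDiagnosticMXBean.class);
            for (String flag : new String[] {"MaxHeapSize", "ThreadStackSize"}) {
                System.out.println(flag + " = " + hotspot.getVMOption(flag).getValue());
            }
        }
    }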
To reuse portions of this answer of mine (about Tomcat, but applies to any Java process):
The Windows OS limits the memory allocation of a 32-bit process to 2 GiB in total (by default).

[You will only be able] to allocate around 1.5 GiB heap space because there is also other memory allocated to the process (the JVM / library overhead, perm gen space etc.).

Why does 32-bit Windows impose a 2 GB process address space limit, but 64-bit Windows impose a 4 GB limit?

Other modern operating systems [cough Linux] allow 32-bit processes to use all (or most) of the 4 GiB addressable space.

That said, 64-bit Windows OSes can be configured to increase the limit of 32-bit processes to 4 GiB (3 GiB on 32-bit):

http://msdn.microsoft.com/en-us/library/windows/desktop/aa366778(v=vs.85).aspx
Here is how to increase the paging size:
right-click on My Computer -> Properties -> Advanced
in the Performance section, click Settings
click the Advanced tab
in the Virtual memory section, click Change. It will show your current paging size.
select a drive where HDD space is available
provide the initial size and max size, e.g. initial size 0 MB and max size 4000 MB (as much as you will require)
There are numerous ways to change the heap size, e.g.:
File -> Settings -> Build, Execution, Deployment -> Compiler: here you will find the heap size.
File -> Settings -> Build, Execution, Deployment -> Compiler -> Android: here you will also find a heap size setting; you can use this one if you are facing the same issue in an Android project.
What worked for me was:
Set the appropriate JAVA_HOME path in case your Java got updated.
Create a new system variable (Computer -> Properties -> Advanced settings -> create new system variable) with
name: _JAVA_OPTIONS value: -Xmx750m
FYI:
you can find the default VM options for IntelliJ under
Help -> Edit Custom VM Options; in that file you will see the min and max heap size.
First, using a page file when you have 4 GB of RAM is useless. Windows can't access more than 4 GB (actually, less because of memory holes), so the page file is not used.
Second, the address space is split in two: half for the kernel, half for user mode. If you need more RAM for your applications, use the /3GB option in boot.ini and make sure java.exe is marked as "large address aware" (google for more info).
Third, I think you can't allocate the full 2 GB of address space because Java uses some memory internally (for threads, the JIT compiler, VM initialization, etc.). Use the /3GB switch for more.
I have a Tomcat instance running on a Windows 2008 Server with 4GB of RAM. The server is dedicated to this one application, so I would quite like to be able to dedicate most of the RAM to Tomcat. My Tomcat setup currently has the following java options:
-Xms256m
-Xmx1600m
I'd like to increase the amount of RAM, preferably up to about 3GB (obviously I know how to do that, just increase the -Xmx value). However, Tomcat refuses to start up if I increase the maximum heap space beyond 1600MB. Several websites that I have read say that Tomcat cannot use more than 40% of the available RAM, which seems consistent with what I'm seeing.
Is there a way of increasing the proportion of memory that Tomcat can use, so that I can give it more of the available RAM?
Your issue was probably OS related, not Tomcat / Java. The Windows OS limits the memory allocation of a 32-bit process to 2 GiB in total (by default).
The reason why it only allowed you to allocate around 1.5 GiB heap space is because there is also other memory allocated to the process (the JVM / library overhead, perm gen space etc.).
Why does 32-bit Windows impose a 2 GB process address space limit, but 64-bit Windows impose a 4GB limit?
Other modern operating systems [cough Linux] allow 32-bit processes to use all (or most) of the 4 GiB addressable space.
That said, 64-bit Windows OS's can be configured to increase the limit of 32-bit processes to 4 GiB (3 GiB on 32-bit):
http://msdn.microsoft.com/en-us/library/windows/desktop/aa366778(v=vs.85).aspx
However, as you've rightly done, the best solution is to use a 64-bit JVM with your 64-bit OS. Terabyte heaps anyone:
Max memory for 64bit Java :D
Despite having a 64-bit server, I only had 32-bit Java/Tomcat installed. I uninstalled Java and Tomcat and installed the 64-bit versions, and everything worked fine. It seems that the issue was that 32-bit Java can only address about 1.5 GB.
I've got a problem on my Solaris servers. When I launch a Sun Java process with restricted memory, it takes more than twice the resources.
For example, I have 64 GB of memory on my servers. One is on Linux, the others are on Solaris. I run the same software on all servers (only Java).
When the servers start, they use between 400 MB and 1.2 GB of RAM. I then launch my Java processes (generally between 4 and 16 GB per Java process) and I can't run more than 32 GB in total as defined by the Xms and Xmx values. I get this kind of error:
> /java -d64 -Xms8G -Xmx8G -version
Error occurred during initialization of VM
Could not reserve enough space for object heap
As we can see here, a lot of memory is reserved, and it's caused by the Java processes:
> swap -s
total: 22303112k bytes allocated + 33845592k reserved = 56148704k used, 704828k available
As soon as I kill them one by one, I recover the reserved space and can launch others. But in fact I can't use more than half of my memory.
Does anybody know how to resolve this problem?
Thanks
I believe the issue is that Linux overcommits memory allocations, while Solaris makes sure that whatever you allocate fits in virtual memory (RAM plus swap).
If you think that is an advantage of Linux, you might reconsider the next time the Linux OOM killer randomly kills your mission-critical application at the worst possible moment.
To fix the issue, just add more swap space to Solaris.
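To see the difference between what the JVM has committed up front and what it actually uses, here is a small sketch with the standard MemoryMXBean. With -Xms equal to -Xmx, "committed" covers the whole heap from startup, which lines up with the large "reserved" figure that swap -s shows in the question above:

    import java.lang.management.ManagementFactory;
    import java.lang.management.MemoryUsage;

    // Sketch: print init / used / committed / max heap figures for this JVM.
    public class HeapFigures {
        public static void main(String[] args) {
            MemoryUsage heap = ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
            long mb = 1024 * 1024;
            System.out.println("init:      " + heap.getInit() / mb + " MB");
            System.out.println("used:      " + heap.getUsed() / mb + " MB");
            System.out.println("committed: " + heap.getCommitted() / mb + " MB");
            System.out.println("max:       " + heap.getMax() / mb + " MB");
        }
    }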