Error: Could not create the Java Virtual Machine - java

I have visited all the existing questions related to mine, but I still have the problem. Everything is installed correctly, and I am using the newest NetBeans version. When I run my program I get this error:
Error: Could not create the Java Virtual Machine.
Error occurred during initialization of VM
Could not reserve enough space for object heap
Error: A fatal exception has occurred. Program will exit.
And my netbeans.conf is:
# ${HOME} will be replaced by JVM user.home system property
netbeans_default_userdir="${HOME}/.netbeans/7.1.2"
# Options used by NetBeans launcher by default, can be overridden by explicit
# command line switches:
netbeans_default_options="-J-client -J-Xss2m -J-Xms16m -J-XX:PermSize=16m -J-Dapple.laf.useScreenMenuBar=true -J-Dapple.awt.graphics.UseQuartz=true -J-Dsun.java2d.noddraw=true -J-Dsun.zip.disableMemoryMapping=true"
# Note that default -Xmx and -XX:MaxPermSize are selected for you automatically.
# You can find these values in var/log/messages.log file in your userdir.
# The automatically selected value can be overridden by specifying -J-Xmx or
# -J-XX:MaxPermSize= here or on the command line.
# If you specify the heap size (-Xmx) explicitly, you may also want to enable
# Concurrent Mark & Sweep garbage collector. In such case add the following
# options to the netbeans_default_options:
# -J-XX:+UseConcMarkSweepGC -J-XX:+CMSClassUnloadingEnabled -J-XX:+CMSPermGenSweepingEnabled
# (see http://wiki.netbeans.org/FaqGCPauses)
# Default location of JDK, can be overridden by using --jdkhome <dir>:
netbeans_jdkhome="C:\Arquivos de programas\Java\jdk1.7.0_07"
# Additional module clusters, using ${path.separator} (';' on Windows or ':' on Unix):
#netbeans_extraclusters="/absolute/path/to/cluster1:/absolute/path/to/cluster2"
# If you have some problems with detect of proxy settings, you may want to enable
# detect the proxy settings provided by JDK5 or higher.
# In such case add -J-Djava.net.useSystemProxies=true to the netbeans_default_options.
What do I have to do? I've been trying to resolve this error all day. My system memory is 3 GB.

Could not reserve enough space for object heap
This almost always means that your -Xmx is too high for the machine. There is a message above:
# Note that default -Xmx and -XX:MaxPermSize are selected for you automatically
Try providing an explicit value and start small. Note that -Xms must be less than or equal to -Xmx.
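For example, a minimal sketch of such an explicit setting in netbeans.conf (the 256m below is only an illustrative starting point; raise it gradually once NetBeans starts reliably):
netbeans_default_options="-J-client -J-Xss2m -J-Xms16m -J-Xmx256m -J-XX:PermSize=16m -J-Dapple.laf.useScreenMenuBar=true -J-Dapple.awt.graphics.UseQuartz=true -J-Dsun.java2d.noddraw=true -J-Dsun.zip.disableMemoryMapping=true"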

I got the same error while starting Netbeans
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.
I tried restarting NetBeans many times, but the same error kept repeating. Later I found it was happening because another application that uses the JVM was already running, which turned out to be a Tomcat server. I terminated Tomcat, tried starting NetBeans again, and it was fine. So look for any other application that uses the JVM.

We have a couple of solutions for the above problem, all for the error "Error: Could not create the Java Virtual Machine."
Solution 1: Re-install all the components, i.e. install the entire software again.
Solution 2: The maximum heap size varies based on the machine architecture (32-bit or 64-bit), the JVM bit size (32-bit or 64-bit JVM), and the operating system.
On a 32-bit machine the theoretical limit of the maximum heap size is 4 GB, but it varies from operating system to operating system; for example, on 32-bit Windows XP the maximum heap size is limited to about 1.5 GB for various reasons, while on a 64-bit Solaris machine, even with a 32-bit JVM, you can get around 3.5 GB. So check what your platform allows before setting -Xmx.
Solution 3: Another syntax issue worth noting when specifying heap space is using the wrong unit or adding a space, as shown in the examples below:
Correct: java -Xmx1500M
Incorrect: java -Xmx1500MB (MB is not a valid suffix; use M or m)
Incorrect: java -Xmx 1400M (no space is allowed between -Xmx and the value)

-J-Xss2m -J-Xms16m -J-XX:PermSize=16m
Those are insufficient values. Try increasing them from 16 MB to 512 MB (or 1 GB).
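As an illustration of that suggestion (the values are only a sketch, not a recommendation for every machine), the edited line in netbeans.conf might look like:
netbeans_default_options="-J-client -J-Xss2m -J-Xms512m -J-XX:PermSize=256m -J-Dapple.laf.useScreenMenuBar=true -J-Dapple.awt.graphics.UseQuartz=true -J-Dsun.java2d.noddraw=true -J-Dsun.zip.disableMemoryMapping=true"
Keep in mind the other answer above: on a 3 GB machine, values that are too large can themselves trigger "Could not reserve enough space for object heap".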

Related

Java Wrapper OutofMemory

I have a program running under a Java wrapper, as the application has to run as a Windows service. However, after running the application for a couple of weeks I encounter: Exception in thread "Thread-5" java.lang.OutOfMemoryError: Java heap space.
The memory values are commented out. Does this mean there is no limit to the memory this application is allowed to use?
I'm also checking the code for possible memory leaks, but is there any way to modify the config in order to find the cause of, or prevent, the out-of-memory errors?
#********************************************************************
# Wrapper Java Properties
#********************************************************************
# Java Application
# Locate the java binary on the system PATH:
#wrapper.java.command=%JAVA_HOME%\bin\java
# Specify a specific java binary:
set.JAVA_HOME=%JAVA_HOME%
wrapper.java.command=%JAVA_HOME%\bin\java
# Tell the Wrapper to log the full generated Java command line.
#wrapper.java.command.loglevel=INFO
# Java Main class. This class must implement the WrapperListener interface
# or guarantee that the WrapperManager class is initialized. Helper
# classes are provided to do this for you. See the Integration section
# of the documentation for details.
wrapper.java.mainclass=org.tanukisoftware.wrapper.WrapperSimpleApp
# Java Classpath (include wrapper.jar) Add class path elements as
# needed starting from 1
wrapper.java.classpath.1=../lib/wrapper.jar
wrapper.java.classpath.2=%JAVA_HOME%\lib\tools.jar
wrapper.java.classpath.3=C:\daifuku\wms\tomcat\webapps\wms\WEB-INF\classes
wrapper.java.classpath.4=C:\daifuku\wms\tomcat\webapps\wms\WEB-INF\lib\*.jar
wrapper.java.classpath.5=C:\daifuku\wms\tomcat\lib\comm.jar
wrapper.java.classpath.6=C:\daifuku\wms\tomcat\lib\servlet-api.jar
wrapper.java.classpath.7=C:\daifuku\wms\tomcat\lib\jsp-api.jar
# Java Library Path (location of Wrapper.DLL or libwrapper.so)
wrapper.java.library.path.1=%JAVA_HOME%\jre\lib
# Java Bits. On applicable platforms, tells the JVM to run in 32 or 64-bit mode.
wrapper.java.additional.auto_bits=TRUE
# Java Additional Parameters
wrapper.java.additional.1=
# Initial Java Heap Size (in MB)
#wrapper.java.initmemory=64
# Maximum Java Heap Size (in MB)
#wrapper.java.maxmemory=512
# Application parameters. Add parameters as needed starting from 1
wrapper.app.parameter.1=XXX
Many Thanks!
Does this mean there is no limit of memory this application is allowed to use?
I believe that it will use the default value configured for your system: How is the default Java heap size determined?
Since you mentioned that the problem appears after running your application for a few weeks, most likely you have a memory leak. I advise you to make a heap dump of your Java wrapper process using JVisualVM, and afterwards analyze the dump using the Eclipse Memory Analyzer (MAT): https://www.eclipse.org/mat/.
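If you would rather capture the dump automatically at the moment the OutOfMemoryError occurs, the wrapper configuration shown above already has a mechanism for extra JVM flags. A sketch (the index numbers and the dump path are assumptions; keep the wrapper.java.additional.N indexes consecutive and point the path at a directory that exists):
# Hypothetical example: write a heap dump whenever an OutOfMemoryError is thrown
wrapper.java.additional.2=-XX:+HeapDumpOnOutOfMemoryError
wrapper.java.additional.3=-XX:HeapDumpPath=C:\daifuku\wms\logs
The resulting .hprof file can then be opened in JVisualVM or MAT.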
So you're basically asking which heap parameters Java actually uses when you don't specify the init and max memory (which, I believe, translate to the well-known -Xms and -Xmx).
In general it's system-dependent and the algorithm has also changed many times, so to be sure you should check on your system:
java -XX:+PrintFlagsFinal -version | grep HeapSize    # or run your application with that flag if you wish
# or on Windows:
java -XX:+PrintFlagsFinal -version | findstr HeapSize
Then check for:
InitialHeapSize
MaxHeapSize
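The output will look something like this (the numbers are purely illustrative; yours will differ, and a few other *HeapSize* flags will also be listed):
    uintx InitialHeapSize    := 134217728     {product}
    uintx MaxHeapSize        := 2147483648    {product}
The values are in bytes, so divide by 1024*1024 to get megabytes.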

OpenJ9 tomcat won't start with high -Xmx heap option

I have a Spring app running in a Tomcat 9.0.6 on Linux 64. Because it needs a lot of memory, I would like to try the OpenJ9 JVM which is supposedly more efficient in that regard (current heap limit with Hotspot: -Xmx128G).
I installed the 64-bit adoptopenjdk-8-jdk-openj9:
/usr/lib/jvm/adoptopenjdk-8-jdk-openj9/bin/java -version
openjdk version "1.8.0_212"
OpenJDK Runtime Environment (build 1.8.0_212-b04)
Eclipse OpenJ9 VM (build openj9-0.14.2, JRE 1.8.0 Linux amd64-64-Bit Compressed References 20190521_315 (JIT enabled, AOT enable
OpenJ9 - 4b1df46fe
OMR - b56045d2
JCL - a8c217d402 based on jdk8u212-b04)
Starting the tomcat causes the following error:
This JVM package only includes the '-Xcompressedrefs' configuration. Please run the VM without specifying the '-Xnocompressedrefs' option or by specifying the '-Xcompressedrefs' option.
After I set this option I get the following error:
JVMJ9GC028E Option too large: '-Xmx'
JVMJ9VM015W Initialization error for library j9gc29(2): Failed to initialize
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.
Documentation isn't that clear, but I found this:
https://www.ibm.com/support/knowledgecenter/SSYKE2_8.0.0/com.ibm.java.vm.80.doc/docs/mm_gc_compressed_refs.html
Compressed references are used by default on a 64-bit IBM SDK when the value of -Xmx, which sets the maximum Java heap size, is in the correct range. On AIX®, Linux and Windows systems, the default range is 0 - 57 GB. For larger heap sizes, you can try to use compressed references by explicitly setting -Xcompressedrefs. However, larger heap sizes might result in an out of memory condition at run time because the VM requires some memory at low addresses. You might be able to resolve an out of memory condition in low addresses by using the -Xmcrs option.
So basically, at least this build of the JDK only supports compressed refs, and in order to use them I must set -Xcompressedrefs manually, since my Xmx is above the range where they are enabled by default. But that fails because my OS has already allocated too much of the memory below 4 GB, some of which is needed to use compressed refs. Since I can never guarantee that this won't be the case, is there any way I can use OpenJ9 without compressedrefs? And will that even yield the benefits in terms of memory consumption? Or is there any way I can use compressedrefs with very high Xmx settings?
I also tried setting this option, but it didn't help: https://www.ibm.com/support/knowledgecenter/SSYKE2_8.0.0/openj9/xmcrs/index.html?view=embed
How do I find the correct size for it? 1G and 64m failed. Even if I find the correct setting, how would this value guarantee that the OS hasn't already allocated all the lower memory addresses?
The limit for using the compressed refs JVM is 57 GB, and you can't run it if the -Xnocompressedrefs option is specified.
The 57G division is documented here: https://www.eclipse.org/openj9/docs/xcompressedrefs/
The -Xnocompressedrefs problem is mentioned in the release notes: https://github.com/eclipse/openj9/blob/master/doc/release-notes/0.15/0.15.md
With a reference to: https://github.com/eclipse/openj9/issues/479
Creating a single JVM that supports both is covered by: https://github.com/eclipse/openj9/issues/643
https://github.com/eclipse/openj9/pull/7505
(With thanks to the help from the Eclipse OpenJ9 slack community, especially to Peter Shipton)
I found this build which allows noncompressedrefs and thus solves my issues: https://adoptopenjdk.net/releases.html?variant=openjdk8&jvmVariant=openj9#linuxxl
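For reference, Tomcat reads extra JVM options from bin/setenv.sh, so whichever route you take the flags can be set there. A sketch with illustrative sizes only (the first line assumes the standard compressed-refs build and a heap within the ~57 GB limit; the commented line assumes the large-heap 'xl' build linked above):
# bin/setenv.sh -- illustrative values, adjust to your heap requirements
export CATALINA_OPTS="$CATALINA_OPTS -Xcompressedrefs -Xmx56g -Xmcrs512m"
# with the large-heap ('xl') build instead:
# export CATALINA_OPTS="$CATALINA_OPTS -Xnocompressedrefs -Xmx128g"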

Error occurred during initialization of VM. Could not reserve enough space for object heap [duplicate]

I am getting the following exception repeatedly each time I try to run the program.
Error occurred during initialization of VM
Could not reserve enough space for object heap
Could not create the Java virtual machine.
I tried to increase my virtual memory (paging file size) and RAM size, but to no avail.
How can I eliminate this error?
Run the JVM with -XX:MaxHeapSize=512m (or any larger size you need), or -Xmx512m for short.
This can also be caused by setting something too large on a 32-bit HotSpot vm, for example:
-Xms1536m -Xmx1536m
where this might/would work:
-Xms1336m -Xmx1336m
Here is how to fix it:
Go to Start -> Control Panel -> System -> Advanced (tab) -> Environment Variables -> System Variables -> New:
Variable name: _JAVA_OPTIONS
Variable value: -Xmx512M
Variable name: Path
Variable value: %PATH%;C:\Program Files\Java\jre6\bin;F:\JDK\bin;
Change this to your appropriate path.
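Alternatively, the same variable can be set from a command prompt, which persists it for future sessions (a sketch; the value is illustrative):
setx _JAVA_OPTIONS "-Xmx512M"
Note that setx does not affect the current window; open a new command prompt afterwards.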
I ran into this when using javac, and it doesn't seem to pick up the command-line options:
-bash-3.2$ javac -Xmx256M HelloWorldApp.java
Error occurred during initialization of VM
Could not reserve enough space for object heap
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.
so the solution here is to set _JAVA_OPTIONS:
-bash-3.2$ export _JAVA_OPTIONS="-Xmx256M"
-bash-3.2$ javac HelloWorldApp.java
Picked up _JAVA_OPTIONS: -Xmx256M
And this compiles fine.
This happens to me on machines with a lot of RAM but with low memory ulimits. Java decides to allocate a big heap because it detects the RAM in the machine, but it's not allowed to allocate it because of the ulimits.
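You can check the limits the JVM will run under before blaming the heap settings; for example, on Linux (the virtual/address-space limit is usually the relevant one):
ulimit -v    # maximum virtual memory, in KB, for processes started from this shell
ulimit -a    # all limits
If -v reports a low value, either raise the limit (for example in /etc/security/limits.conf) or pass a smaller -Xmx so the heap fits under it.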
32-bit Java requires contiguous free space in memory to run. If you specify a large heap size, there may not be so much contiguous free space in memory even if you have much more free space available than necessary.
Installing a 64-bit version of Java helps in these cases; the contiguous memory requirement only applies to 32-bit Java.
Combined with -Xmx512M, use -d64 to make sure you're running a 64-bit VM. On a 64-bit machine I thought for sure I was running a 64-bit virtual machine, but no. After installing 64-bit Java, the -d64 option works and -Xmx allows much larger memory sizes.
java -d64 -Xmx512M mypackage.Test
Open the gradle.properties file in the android folder.
Replace this line:
org.gradle.jvmargs=-Xmx1536M
with:
org.gradle.jvmargs=-Xmx512m
Explanation:
Max limit from Gradle document:
If the requested build environment does not specify a maximum heap size, the Daemon will use up to 512MB of heap.
I got the same error and resolved it by configuring run.conf.bat.
Run the JVM with the configuration in run.conf.bat in JBoss 5.x.
If the amount of free memory you are requesting in that statement is not available, then make the changes in run.conf.bat:
set "JAVA_OPTS=-Xms512m -Xmx512m -XX:MaxPermSize=256m"
I had similar issues. I had installed a 32-bit version of Java on a 64-bit machine.
By uninstalling that version and installing a 64-bit version of Java, I was able to resolve the issue.
I know there are a lot of answers here already, but none of them helped me. In the end I opened the file /etc/elasticsearch/jvm.options and changed:
-Xms2G
-Xmx2G
to
-Xms256M
-Xmx256M
That solved it for me. Hopefully this helps someone else here.
Suppose your class is called Test in package mypackage. Run your code like this:
java -Xmx1024m mypackage.Test
This will reserve 1024 MB of heap space for your code. If you want 512 MB, you can use:
java -Xmx512m mypackage.Test
Use the m (or M) suffix, as in 1024m, 512m, etc.
Sometimes, this error indicates that physical memory and swap on the server actually are fully utilized!
I was seeing this problem recently on a server running RedHat Enterprise Linux 5.7 with 48 GB of RAM. I found that even just running
java -version
caused the same error, which established that the problem was not specific to my application.
Running
cat /proc/meminfo
reported that MemFree and SwapFree were both well under 1% of the MemTotal and SwapTotal values, respectively:
MemTotal: 49300620 kB
MemFree: 146376 kB
...
SwapTotal: 4192956 kB
SwapFree: 1364 kB
Stopping a few other running applications on the machine brought the free memory figures up somewhat:
MemTotal: 49300620 kB
MemFree: 2908664 kB
...
SwapTotal: 4192956 kB
SwapFree: 1016052 kB
At this point, a new instance of Java would start up okay, and I was able to run my application.
(Obviously, for me, this was just a temporary solution; I still have an outstanding task to do a more thorough examination of the processes running on that machine to see if there's something that can be done to reduce the nominal memory utilization levels, without having to resort to stopping applications.)
Error:
For the error "Error occurred during initialization of VM: Could not reserve enough space for object heap" in JBoss.
Root cause:
Improper/insufficient memory allocation for the JVM, as shown below,
e.g. JAVA_OPTS="-Xms1303m -Xmx1303m -XX:MaxPermSize=256m" in jboss-eap-6.2\bin\standalone.conf, or "JAVA_OPTS=-Xms1G -Xmx1G -XX:MaxPermSize=256M" in jboss-eap-6.2\bin\standalone.conf.bat, which are simply the JVM memory allocation pool parameters.
Resolution:
Adjust the heap size to something your machine can actually reserve. To change the heap size,
go to jboss-eap-6.2\bin\standalone.conf.bat or jboss-eap-6.2\bin\standalone.conf
and change it to JAVA_OPTS="-Xms256m -Xmx512m -XX:MaxPermSize=256m", where -Xms is the minimum heap size and -Xmx is the maximum heap size.
Usually it is not recommended to use the same size for min and max.
If you are running your application from Eclipse:
Double-click on the server.
Select 'open launch configuration'; you will be redirected to the 'Edit launch configuration properties' window.
In this window, go to the '(x)=Arguments' tab.
Under VM arguments, define your heap size as shown below:
"-Dprogram.name=JBossTools: JBoss EAP 6.1+ Runtime Server" -server -Xms256m -Xmx512m -XX:MaxPermSize=256m -Dorg.jboss.resolver.warning=true
I recently faced this issue. I have 3 Java applications that start with a 1024m or 1280m heap size.
Java looks at the available space in swap, and if there is not enough memory available, the JVM exits.
To resolve the issue, I had to end several programs that had a large amount of virtual memory allocated.
I was running on x86-64 linux with a 64-bit jvm.
I had the right memory settings, but in my case a 64-bit IntelliJ was using a 32-bit JVM. Once I switched to a 64-bit VM, the error was gone.
If you're running a 32-bit JVM, changing the heap size to a smaller value would probably help. You can do this by passing args to java directly or through environment variables, like the following:
java -Xms128M -Xmx512M
JAVA_OPTS="-Xms128M -Xmx512M"
For a 64-bit JVM, a bigger heap size like -Xms512M -Xmx1536M should work.
Run java -version (or java -d32 / java -d64 with Java 7) to check which version you're running.
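The bitness is also visible in the java -version output; an illustrative example (build details elided):
$ java -version
java version "1.7.0_80"
Java(TM) SE Runtime Environment (build ...)
Java HotSpot(TM) 64-Bit Server VM (build ..., mixed mode)
If the last line contains "64-Bit Server VM" you are on a 64-bit JVM; a 32-bit JVM prints "Client VM" or "Server VM" without the 64-Bit marker.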
Assuming you have enough free memory and you setup you JVM arguments correctly, you might have a problem of memory fragmentation. Check Java maximum memory on Windows XP.
Anyway, here is how to fix it:
Go to Start->Control Panel->System->Advanced(tab)->Environment Variables->System Variables->New:
Variable name: _JAVA_OPTIONS
Variable value: -Xmx512M
OR
Change the Ant call as shown below:
<exec ...>
    <arg value="-J-Xmx512m" />
</exec>
It worked for me.
Error occurred during initialization of VM
Could not reserve enough space for 1572864KB object heap
I changed the memory value in the settings.gradle file from 1536 to 512 and it helped.
Go to Start->Control Panel->System->Advanced(tab)->Environment Variables->System Variables->New:
Variable name: _JAVA_OPTIONS
Variable value: -Xmx512M
In case you are running a Java program:
- Run your program in a terminal using the correct command; on Linux that would be java -jar myprogram.jar, and add -Xms256m -Xmx512m before the -jar switch, for instance: java -Xms256m -Xmx512m -jar myprogram.jar
In case you are running a .sh script (Linux, Mac?) or a .bat script (Windows), open the script, look for the Java options if they are present, and increase the memory.
If all of the above doesn't work, check your processes (Ctrl+Alt+Delete on Windows, ps aux on Linux/Mac) and kill the processes that use a lot of memory and are not necessary for your operating system, then try to re-run your program.
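On Linux/Mac, a quick way to see which processes are holding the most memory (a sketch; the exact columns vary between systems):
ps aux --sort=-%mem | head -n 10    # top 10 processes by memory usage (Linux procps)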
In CASSANDRA_HOME/bin/cassandra.bat you will find the following configuration:
REM JVM Opts we'll use in legacy run or installation
set JAVA_OPTS=-ea^
 -javaagent:"%CASSANDRA_HOME%\lib\jamm-0.3.0.jar"^
 -Xms2G^
 -Xmx2G^
You can reduce 2G to a smaller number, e.g. 1G or even less, and it should work.
The same applies if you are running on a Unix box; change the .sh file appropriately.
I got the same error, and it was resolved when I deleted the temp files using %temp% and restarted Eclipse.
Sometimes this is related to the kernel's memory overcommit setting:
$ sysctl vm.overcommit_memory
vm.overcommit_memory = 2
If you set it back to the default:
$ sysctl -w vm.overcommit_memory=0
it should work.
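To make the setting survive a reboot, you can persist it in /etc/sysctl.conf and reload (run as root; a sketch):
echo "vm.overcommit_memory = 0" >> /etc/sysctl.conf
sysctl -p    # reload settings from /etc/sysctl.conf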
Replace -Xmx2G with -Xmx512M (or another size that fits your machine) in the cassandra.bat file in the Cassandra bin directory.
In my case I couldn't increase org.gradle.jvmargs=-Xmx... in gradle.properties beyond 1 GB. It didn't work because I had two Java installations on my machine, one 32-bit (which Gradle was using) and the other 64-bit. I resolved the problem by adding a JAVA_HOME environment variable pointing to the 64-bit Java.
No need to do anything else; just change the POM file as shown below:
<configuration>
<maxmemory>1024M</maxmemory>
</configuration>

IBM Heap Dump Analyzer | Out of memory

I am running on a machine with 7 GB of RAM and I have a heap dump file of size 1.8 GB. I am using 64-bit Java 8 on a 64-bit machine.
When I try to open the .phd file in the Heap Dump Analyzer tool, it throws an out-of-memory error. I am setting the Java VM args for the Heap Analyzer tool as below:
java -Xmx4g -XX:-UseGCOverheadLimit
but I am still unable to open the file. Please let me know how I can overcome this.
This happens because the default heap size is smaller than what is needed to load a dump of that size. To resolve it, you need to set the VM args -Xms and -Xmx to appropriate values; below is what worked for me:
"<JAVA_PATH>\Java.exe" -Xms256m -Xmx6144m -jar <HEAP_ANALYSER_NAME>.jar
I hope that helps, I know it is a bit late response :)
I faced the same issue multiple times. I noticed that the analyzer runs better on Linux. On Windows it needs a very large amount of memory most of the time, and surprisingly I did not see any apparent direct correlation between the heap dump size and the Xmx size the analyzer requires.
You can either try on Linux if that is an option, or increase the xmx size further.
I installed JDK 1.8 along with JRE 1.8 and changed the Java Runtime Environment Settings: Java Control Panel --> Java --> View --> User (Runtime Parameters set to -Xms256m -Xmx6144m), and enabled both the JRE and JDK 1.8 versions.
This finally worked :) Give it a try with 64-bit JDK 1.8 on Windows.

Jenkins is failing to start a 32-bit JVM for a job

I'm running Jenkins 1.557. I have a job that I need to be built with a 32-bit version of JDK 1.6_u45. I have that version properly configured in my job's JDK setting. However, when I attempt to run the job, I get the following error.
Error occurred during initialization of VM
Could not reserve enough space for object heap
Could not create the Java virtual machine.
If I switch the job's JDK setting to a 64-bit version, the JVM is able to be created and it runs as normal. The server has 8 GB of RAM available, and I've even attempted to pass string parameters of JAVA_OPTS=-Xms512m -Xmx1024m and ANT_OPTS=-Xms512m -Xmx1024m to the build, but to no avail.
Please note this is not a duplicate of Could not reserve enough space for object heap. If I attempt to build the project at a regular command line (Windows environment variable JAVA_HOME pointing to the same 32-bit JDK installation as the Jenkins attempt), the project builds. This is seemingly a Jenkins specific issue.
My guess is somewhere in Jenkins (or in some hidden Jenkins config file) the JVM heap size is being set too large for the 32-bit JVM, but I can't seem to pinpoint where that is being set. I've checked the jenkins.xml in JENKINS_HOME but the heap size is not being set in the arguments tag.
Answer
Try a lower max heap (-Xmx) value, such as -Xmx900m or -Xmx800m and see if this solves the problem.
From my experience, Jenkins honors your ANT_OPTS environment variable and does not mess with it. I use Jenkins Freestyle Jobs that launch Ant personally, and I've always set ANT_OPTS, MAVEN_OPTS, ... separately from Jenkins; it has never changed anything.
Better yet, start with a much lower value like -Xmx512m (I would use ANT_OPTS, which Ant actually uses for this, and not bother with JAVA_OPTS). If it still fails to initialize, OK, then maybe I'll entertain the idea that Jenkins is doing something. If not, there's your answer.
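One way to rule Jenkins out entirely is to set ANT_OPTS directly inside the job's build step, so you can see exactly what the Ant process receives. A sketch for an "Execute Windows batch command" step (the heap values and build file name are placeholders):
set ANT_OPTS=-Xms256m -Xmx800m
ant -f build.xml
REM build.xml and the sizes above are illustrative; use your job's actual build file and targets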
At the root, I believe this is the same problem as the duplicate question you linked, it just reproduces in more limited circumstances. More details below.
Background
Just yesterday, on a coworker's machine I saw -Xmx1024m fail in a standard command window with the same message with 32-bit Java. Just because it works in one situation does not mean it will always work.
On Windows, the 2GB maximum address space per 32-bit process severely limits the maximum heap size you can set in Java, since Java requires that the entire object heap be allocated in one contiguous block. Especially in modern versions of Windows that use ASLR (Address Space Layout Randomization), you simply can't be guaranteed large heap sizes for 32-bit processes...even 1024m can sometimes be too large since in Java the heap must be contiguous. Picture a horizontal line from 0 to 2GB, and then a [1GB] chunk taking up 50% of the width. Now insert 50 random DLLs into that 2GB horizontal line in random locations...now try to fit your [1GB] chunk without hitting any of them.
Not exact, here's my poor man's diagram of the address space:
0 [________________________________________] 2GB
_ is unallocated, available, | is occupied
Now with DLLs:
0 [__|_______|___________________|___|_____] 2GB
You need to fit this (including edges) into that address space:
[__________________]
Maybe it barely squeaks in...now let's add one more blip:
0 [__|_______|_____________|_____|___|_____] 2GB
[__________________]
Suddenly it won't fit.
It's possible there is an extra DLL being loaded by Jenkins that is fragmenting your address space just slightly more so that 1024m fails under Jenkins but not in a standalone window. Since your goal is to run it under Jenkins, I don't see a clear solution to that other than to reduce your max heap size since your goal is to run a 32-bit build. In the Windows XP days, it was common to get -Xmx1300m or so to work, but apparently even -Xmx1024m is a stretch on Windows 7 and Windows 8 (in some cases, anyway). It really seems like the most likely case is...you're trying to set the heap too big for 32-bit.
Verification
If this really isn't the problem, or if you don't believe me, you can verify what Java memory settings your 64-bit version of the build is actually using (namely because it has to actually start for you to see the settings while it's running). Since your other build is failing to even start, I'm not sure you can use this method there. Whether Jenkins is doing something or not, and whether you tell your job to use a 32-bit JDK or a 64-bit JDK, if it's reading ANT_OPTS it should be getting the same end-result -Xmx value from that environment variable for both builds (the one that works (64-bit), and the one that fails). You can use a utility included with the JDK to do this, called jconsole. From the bin directory of your JDK installation, run 'jconsole'. Or, if you have %JAVA_HOME%\bin in your PATH, you should be able to directly launch jconsole.
This will start a graphical client allowing you to select from any Process IDs (PIDs) that have a JVM running in them, this list should be pretty short in most cases. Select your Ant process and connect to it. Switch to the VM Information tab, and you will see the heap settings and other VM arguments that the JVM is using.
You will see a "VM Arguments" section, which should include your -Xms and -Xmx settings, but also "Maximum Heap Size", which will probably display in kilobytes.
Bonus knowledge, but not directly relevant since you've stated Java 6. If this were Java 7 or later, you could use:
jcmd
to obtain the PID, then:
jcmd <PID> VM.arguments
to see the VM arguments for the Java process with the PID you specified (jcmd is another utility that comes with the JDK). This, for me at least, displays the raw byte value, so you'll need to translate it in your head (it won't show -Xmx1024m; it will show -XX:MaxHeapSize=1073741824).
