How to set G1GC in Play application - java

I'm trying to run a Play application on Debian, running on Java 8, but I want to change the default garbage collector to G1 by passing -XX:+UseG1GC.
My OS details:
Linux version 3.16.0-4-amd64 (debian-kernel@lists.debian.org) (gcc version 4.8.4 (Debian 4.8.4-1) ) #1 SMP Debian 3.16.43-2+deb8u2 (2017-06-26)
I've tried multiple option combinations and none appear to work.
My command is something like:
bin/playapp -mem 1024
And I have tried to change it to:
bin/playapp -XX:+UseG1GC -mem 1024
And...
bin/playapp -J-XX:+UseG1GC -mem 1024
I've even removed the mem option to see if it would work without it in both of the above scenarios, and neither works.
Anyone know how to set the G1GC garbage collector for a Play app running on Java 8?
UPDATE:
I should add, for context, that it is run via supervisorctl, where the command in the config is:
command=/home/mdmuser/playapp/bin/playapp -mem 1024
I tried using -J-XX:+UseG1GC directly from the command line and it seems to work, but it doesn't work when running via the supervisorctl configuration.

The issue was not actually the syntax at all. When I moved to the G1GC garbage collector, I needed to allocate less memory to the JVM in my virtual machine. I reduced the memory from 1 GB to 512 MB, and it then worked fine using the following syntax:
command=/home/mdmuser/playapp/bin/playapp -J-XX:+UseG1GC -mem 512
Apologies for any time wasted.
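For reference, a fuller supervisord stanza might look like the sketch below. The program name and command are taken from the question; the remaining keys are illustrative assumptions, not part of the original setup:

```ini
[program:playapp]
; JVM flags pass through the Play launcher with -J; -mem sets both -Xms and -Xmx
command=/home/mdmuser/playapp/bin/playapp -J-XX:+UseG1GC -mem 512
directory=/home/mdmuser/playapp   ; assumed working directory
autostart=true
autorestart=true
</imports>
```

After editing the config, reload it with `supervisorctl reread` followed by `supervisorctl update`.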

Related

Error occurred during initialization of VM. Could not reserve enough space for object heap [duplicate]

I am getting the following exception repeatedly each time I try to run the program.
Error occurred during initialization of VM
Could not reserve enough space for object heap
Could not create the Java virtual machine.
I tried to increase my virtual memory (page size) and RAM size, but to no avail.
How can I eliminate this error?
Run the JVM with -XX:MaxHeapSize=512m (or any larger value you need), or -Xmx512m for short.
This can also be caused by setting something too large on a 32-bit HotSpot VM, for example:
-Xms1536m -Xmx1536m
whereas this might work:
-Xms1336m -Xmx1336m
Here is how to fix it:
Go to Start -> Control Panel -> System -> Advanced (tab) -> Environment Variables -> System Variables -> New:
Variable name: _JAVA_OPTIONS
Variable value: -Xmx512M
Variable name: Path
Variable value: %PATH%;C:\Program Files\Java\jre6\bin;F:\JDK\bin;
Change this to your appropriate path.
I ran into this when using javac, and it doesn't seem to pick up the command-line option:
-bash-3.2$ javac -Xmx256M HelloWorldApp.java
Error occurred during initialization of VM
Could not reserve enough space for object heap
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.
so the solution here is to set _JAVA_OPTIONS:
-bash-3.2$ export _JAVA_OPTIONS="-Xmx256M"
-bash-3.2$ javac HelloWorldApp.java
Picked up _JAVA_OPTIONS: -Xmx256M
And this compiles fine.
This happens to me on machines with a lot of RAM but with lower memory ulimits. Java decides to allocate a big heap because it detects the RAM in the machine, but it's not allowed to allocate it because of the ulimits.
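One way to check for this (on Linux or macOS; a quick sketch, not a full diagnosis) is to compare the shell's virtual-memory limit against the heap the JVM is asked to reserve:

```shell
# Print the per-process virtual memory limit in kB ("unlimited" if unset).
# If this is lower than the address space the JVM tries to reserve,
# startup fails with "Could not reserve enough space for object heap".
ulimit -v
```

If the limit is low, either raise it (e.g. in /etc/security/limits.conf, or the service manager's settings) or lower -Xmx to fit under it.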
32-bit Java requires contiguous free space in memory to run. If you specify a large heap size, there may not be that much contiguous free space in memory, even if you have much more free space available than necessary.
Installing a 64-bit version of Java helps in these cases; the contiguous memory requirement only applies to 32-bit Java.
Combined with -Xmx512M, use -d64 to make sure you're running a 64-bit VM. On a 64-bit machine I thought for sure I was running a 64-bit virtual machine, but no. After installing 64-bit Java, the -d64 option works and -Xmx allows much larger memory sizes.
java -d64 -Xmx512M mypackage.Test
Open the gradle.properties file in the android folder.
Replace this line:
org.gradle.jvmargs=-Xmx1536M
with:
org.gradle.jvmargs=-Xmx512m
Explanation:
Maximum limit from the Gradle documentation:
If the requested build environment does not specify a maximum heap size, the Daemon will use up to 512MB of heap.
I got the same error and resolved it by configuring run.conf.bat (JBoss 5.x).
If there is less free memory available than you are requesting, make the changes in run.conf.bat:
set "JAVA_OPTS=-Xms512m -Xmx512m -XX:MaxPermSize=256m"
I had similar issues. I had installed the 32-bit version of Java on a 64-bit machine. Uninstalling that version and installing the 64-bit version of Java resolved the issue.
I know there are a lot of answers here already, but none of them helped me. In the end I opened the file /etc/elasticsearch/jvm.options and changed:
-Xms2G
-Xmx2G
to
-Xms256M
-Xmx256M
That solved it for me. Hopefully this helps someone else here.
Suppose your class is called Test in package mypackage. Run your code like this:
java -Xmx1024m mypackage.Test
This will reserve 1024 MB of heap space for your code. If you want 512 MB, you can use:
java -Xmx512m mypackage.Test
Use a lowercase m in 1024m, 512m, etc.
Sometimes, this error indicates that physical memory and swap on the server actually are fully utilized!
I was seeing this problem recently on a server running RedHat Enterprise Linux 5.7 with 48 GB of RAM. I found that even just running
java -version
caused the same error, which established that the problem was not specific to my application.
Running
cat /proc/meminfo
reported that MemFree and SwapFree were both well under 1% of the MemTotal and SwapTotal values, respectively:
MemTotal: 49300620 kB
MemFree: 146376 kB
...
SwapTotal: 4192956 kB
SwapFree: 1364 kB
Stopping a few other running applications on the machine brought the free memory figures up somewhat:
MemTotal: 49300620 kB
MemFree: 2908664 kB
...
SwapTotal: 4192956 kB
SwapFree: 1016052 kB
At this point, a new instance of Java would start up okay, and I was able to run my application.
(Obviously, for me, this was just a temporary solution; I still have an outstanding task to do a more thorough examination of the processes running on that machine to see if there's something that can be done to reduce the nominal memory utilization levels, without having to resort to stopping applications.)
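The free-memory check above can be condensed into one line (Linux only; a sketch that just reformats what cat /proc/meminfo already shows):

```shell
# Report MemFree and SwapFree as percentages of their totals,
# reading the same fields shown in the /proc/meminfo listings above.
awk '/^MemTotal/ {mt=$2} /^MemFree/ {mf=$2}
     /^SwapTotal/ {st=$2} /^SwapFree/ {sf=$2}
     END {printf "MemFree: %.1f%%  SwapFree: %.1f%%\n",
          100*mf/mt, (st ? 100*sf/st : 0)}' /proc/meminfo
```

Values well under a few percent on both lines suggest the box itself is out of memory, as in the case described above.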
Error:
"Error occurred during initialization of VM: could not reserve enough space for object heap" (JBoss)
Root cause:
Improper/insufficient memory allocation to the JVM, as mentioned below.
e.g. JAVA_OPTS="-Xms1303m -Xmx1303m -XX:MaxPermSize=256m" in jboss-eap-6.2\bin\standalone.conf, or "JAVA_OPTS=-Xms1G -Xmx1G -XX:MaxPermSize=256M" in jboss-eap-6.2\bin\standalone.conf.bat. These are the JVM memory allocation pool parameters.
Resolution:
Adjust the heap size. To change the heap size,
go to jboss-eap-6.2\bin\standalone.conf.bat or jboss-eap-6.2\bin\standalone.conf
and change JAVA_OPTS="-Xms256m -Xmx512m -XX:MaxPermSize=256m", where -Xms is the minimum heap size and -Xmx is the maximum heap size.
It is usually not recommended to use the same size for min and max.
If you are running your application from eclipse,
Double click on the server
select 'open launch configuration'; you will be redirected to the 'Edit launch configuration properties' window.
In this window, go to the '(x)=Arguments' tab.
In VM Arguments, define your heap size as mentioned below:
"-Dprogram.name=JBossTools: JBoss EAP 6.1+ Runtime Server" -server -Xms256m -Xmx512m -XX:MaxPermSize=256m -Dorg.jboss.resolver.warning=true
I recently faced this issue. I have 3 Java applications that start with 1024m or 1280m heap sizes.
Java looks at the available space in swap, and if there is not enough memory available, the JVM exits.
To resolve the issue, I had to end several programs that had a large amount of virtual memory allocated.
I was running on x86-64 Linux with a 64-bit JVM.
I had the right memory settings, but I was using a 64-bit IntelliJ with a 32-bit JVM. Once I switched to a 64-bit VM, the error was gone.
If you're running a 32-bit JVM, changing the heap size to a smaller value would probably help. You can do this by passing arguments to java directly or through environment variables, like the following:
java -Xms128M -Xmx512M
JAVA_OPTS="-Xms128M -Xmx512M"
For a 64-bit JVM, a bigger heap size like -Xms512M -Xmx1536M should work.
Run java -version, or java -d32 / java -d64 (for Java 7), to check which version you're running.
Assuming you have enough free memory and you set up your JVM arguments correctly, you might have a problem of memory fragmentation. Check Java maximum memory on Windows XP.
Anyway, here is how to fix it:
Go to Start->Control Panel->System->Advanced(tab)->Environment Variables->System Variables->New:
Variable name: _JAVA_OPTIONS
Variable value: -Xmx512M
OR
Change the ant call as shown as below.
<exec>
    <arg value="-J-Xmx512m"/>
</exec>
It worked for me.
Error occurred during initialization of VM
Could not reserve enough space for 1572864KB object heap
I changed the memory value in the settings.gradle file from 1536 to 512, and it helped.
In case you are running a Java program:
- run your program in a terminal using the correct command; for Linux it would be 'java -jar myprogram.jar', and add -Xms256m -Xmx512m before -jar, for instance: 'java -Xms256m -Xmx512m -jar myprogram.jar'
In case you are running a .sh script (Linux, Mac?) or a .bat script (Windows), open the script, look for the Java options if they are present, and increase the memory.
If all of the above doesn't work, check your processes (Ctrl+Alt+Delete on Windows, ps aux on Linux/Mac) and kill the processes which use a lot of memory and are not necessary for your operating system. Then try to re-run your program.
In CASSANDRA_HOME/bin/cassandra.bat you will find the following configuration:
REM JVM Opts we'll use in legacy run or installation
set JAVA_OPTS=-ea^
-javaagent:"%CASSANDRA_HOME%\lib\jamm-0.3.0.jar"^
-Xms2G^
-Xmx2G^
You can reduce 2G to a smaller number, e.g. 1G or even less, and it should work.
The same applies if you are running on a Unix box; change the .sh file appropriately.
I got the same error, and it was resolved when I deleted temp files using %temp% and restarted Eclipse.
Sometimes it is related to the overcommit setting:
$ sysctl vm.overcommit_memory
vm.overcommit_memory = 2
If you set it to:
$ sysctl vm.overcommit_memory=0
It should work.
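To inspect the current mode without changing it (Linux; as background: 0 is heuristic overcommit, 1 always allows, and 2 is strict accounting, the mode under which large heap reservations are most likely to be refused):

```shell
# Read the current overcommit policy; equivalent to `sysctl vm.overcommit_memory`.
cat /proc/sys/vm/overcommit_memory
```

Note that setting it with sysctl requires root, and the change does not persist across reboots unless added to /etc/sysctl.conf.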
Replace -Xmx2G with -Xmx512M, or any other size that fits, in the cassandra.bat file in the Cassandra bin directory.
In my case I couldn't increase org.gradle.jvmargs=-Xmx... in gradle.properties beyond 1 GB. It didn't work because I had two Java installations on my machine, one 32-bit (which Gradle was using) and the other 64-bit. I resolved the problem by adding a JAVA_HOME environment variable pointing to the 64-bit Java.
No need to do anything else; just change the <maxmemory> setting in the POM file like below:
<configuration>
<maxmemory>1024M</maxmemory>
</configuration>

IBM Heap Dump Analyzer | Out of memory

I am running on a 7 GB RAM machine, and I have a heap dump file of size 1.8 GB. I am using 64-bit Java 8 and running on a 64-bit machine.
When I try to open the .phd file with the Heap Dump Analyzer tool, it throws an out-of-memory error. I am setting the Java VM args for the Heap Analyzer tool as below:
java -Xmx4g -XX:-UseGCOverheadLimit
but I am still unable to open the file. Please let me know how I can overcome this.
This happens because the default heap size is smaller than what is needed to load the dump. To resolve this, you need to set the VM args -Xms and -Xmx to the right values; below is what worked for me:
"<JAVA_PATH>\Java.exe" -Xms256m -Xmx6144m -jar <HEAP_ANALYSER_NAME>.jar
I hope that helps; I know it is a bit of a late response :)
I faced the same issue multiple times. I noticed that the analyzer runs better on Linux. On Windows it needs a very large amount of memory most of the time, and surprisingly I did not see any apparent direct correlation between the heap dump size and the -Xmx size required by the analyzer.
You can either try on Linux, if that is an option, or increase the -Xmx size further.
I installed JDK 1.8 along with JRE 1.8, changed the Java Runtime Environment settings (Java Control Panel --> Java --> View --> User, setting the Runtime Parameters to -Xms256m -Xmx6144m), and enabled both the JRE and JDK 1.8 versions.
This finally worked :) Give it a try: JDK 1.8 64-bit on Windows.

Eclipse java application heap size cannot exceed 4G

I need to run an application with -Xmx12g, but I cannot get 12g in Eclipse.
I can run it fine from the terminal directly with java -Xmx12g ..., which shows me the max memory as 12G from this command:
Runtime.getRuntime().maxMemory();
Running the same thing in Eclipse, as runtime VM parameters, I get 4G max. I tried maxing out values in eclipse.ini (which should not affect my Java application, right?), with no change.
I have 16 GB of RAM; my friend has 64 GB and can run it fine, but I can't get more than 4G with the same settings. I'm not getting any error or anything.
64 bit os, 64 bit vm
Eclipse -> Preferences -> Java -> Installed JREs. There is a default VM arguments section for each JRE, which was causing everything to run with 4G for me, even though I tried to override -Xmx in the Run Configuration.

Jenkins is failing to start a 32-bit JVM for a job

I'm running Jenkins 1.557. I have a job that needs to be built with a 32-bit version of JDK 1.6_u45. I have that version properly configured in my job's JDK setting. However, when I attempt to run the job, I get the following error.
Error occurred during initialization of VM
Could not reserve enough space for object heap
Could not create the Java virtual machine.
If I switch the job's JDK setting to a 64-bit version, the JVM is able to be created and runs as normal. The server has 8 GB of RAM available, and I've even attempted to pass string parameters of JAVA_OPTS=-Xms512m -Xmx1024m and ANT_OPTS=-Xms512m -Xmx1024m to the build, but to no avail.
Please note this is not a duplicate of Could not reserve enough space for object heap. If I attempt to build the project at a regular command line (Windows environment variable JAVA_HOME pointing to the same 32-bit JDK installation as the Jenkins attempt), the project builds. This is seemingly a Jenkins specific issue.
My guess is somewhere in Jenkins (or in some hidden Jenkins config file) the JVM heap size is being set too large for the 32-bit JVM, but I can't seem to pinpoint where that is being set. I've checked the jenkins.xml in JENKINS_HOME but the heap size is not being set in the arguments tag.
Answer
Try a lower max heap (-Xmx) value, such as -Xmx900m or -Xmx800m and see if this solves the problem.
From my experience, Jenkins honors your ANT_OPTS environment variable and does not mess with it. I use Jenkins Freestyle Jobs that launch Ant, and I've always set ANT_OPTS, MAVEN_OPTS, etc. separately from Jenkins; it has never changed anything.
Better yet, start with a much lower value like -Xmx512m (I would use ANT_OPTS, which Ant uses for this, and not bother with JAVA_OPTS). If it still fails to initialize, OK, then maybe I'll entertain the idea that Jenkins is doing something. If not, there's your answer.
At the root, I believe this is the same problem as the duplicate question you linked, it just reproduces in more limited circumstances. More details below.
Background
Just yesterday, on a coworker's machine I saw -Xmx1024m fail in a standard command window with the same message with 32-bit Java. Just because it works in one situation does not mean it will always work.
On Windows, the 2GB maximum address space per 32-bit process severely limits the maximum heap size you can set in Java, since Java requires that the entire object heap be allocated in one contiguous block. Especially in modern versions of Windows that use ASLR (Address Space Layout Randomization), you simply can't be guaranteed large heap sizes for 32-bit processes; even 1024m can sometimes be too large, since in Java the heap must be contiguous. Picture a horizontal line from 0 to 2GB, with a [1GB] chunk taking up 50% of the width. Now insert 50 random DLLs into that 2GB line at random locations, and try to fit your [1GB] chunk without hitting any of them.
Not exact, here's my poor man's diagram of the address space:
0 [________________________________________] 2GB
_ is unallocated, available, | is occupied
Now with DLLs:
0 [__|_______|___________________|___|_____] 2GB
You need to fit this (including edges) into that address space:
[__________________]
Maybe it barely squeaks in... now let's add one more blip:
0 [__|_______|_____________|_____|___|_____] 2GB
[__________________]
Suddenly it won't fit.
It's possible there is an extra DLL being loaded by Jenkins that is fragmenting your address space just slightly more, so that 1024m fails under Jenkins but not in a standalone window. Since your goal is to run a 32-bit build under Jenkins, I don't see a clear solution other than to reduce your max heap size. In the Windows XP days, it was common to get -Xmx1300m or so to work, but apparently even -Xmx1024m is a stretch on Windows 7 and Windows 8 (in some cases, anyway). It really seems like the most likely case is... you're trying to set the heap too big for 32-bit.
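The constraint in the diagrams above can be sketched numerically. The gap size here is a purely hypothetical value for illustration; the real largest gap depends on which DLLs loaded and where ASLR placed them:

```shell
# A 32-bit heap must fit in ONE contiguous gap of the ~2 GB address space.
# Suppose loaded DLLs fragment it so the largest remaining gap is 900 MB:
largest_gap_mb=900
for heap_mb in 512 800 1024; do
  if [ "$heap_mb" -le "$largest_gap_mb" ]; then
    echo "-Xmx${heap_mb}m fits"
  else
    echo "-Xmx${heap_mb}m does not fit"
  fi
done
```

The point is that total free memory is irrelevant; only the largest contiguous gap matters, and one extra DLL can shrink it below your -Xmx.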
Verification
If this really isn't the problem, or if you don't believe me, you can verify what Java memory settings the 64-bit version of your build is actually using (namely because it has to actually start before you can see the settings while it's running). Since your other build is failing to even start, I'm not sure you can use this method there. Whether Jenkins is doing something or not, and whether you tell your job to use a 32-bit or 64-bit JDK, if it's reading ANT_OPTS it should be getting the same end-result -Xmx value from that environment variable for both builds (the one that works, 64-bit, and the one that fails). You can use a utility included with the JDK to do this, called jconsole. From the bin directory of your JDK installation, run 'jconsole'. Or, if you have %JAVA_HOME%\bin in your PATH, you should be able to launch jconsole directly.
This will start a graphical client allowing you to select from any Process IDs (PIDs) that have a JVM running in them; this list should be pretty short in most cases. Select your Ant process and connect to it. Switch to the VM Information tab, and you will see the heap settings and other VM arguments that the JVM is using.
You will see a "VM Arguments" section, which should include your -Xms and -Xmx settings, but also "Maximum Heap Size", which will probably display in kilobytes.
Bonus knowledge, but not directly relevant since you've stated Java 6. If this were Java 7 or later, you could use:
jcmd
to obtain the PID, then:
jcmd <PID> VM.arguments
to see the VM arguments for the Java process with the PID you specified (jcmd is another utility that comes with the JDK). This, for me at least, displays the raw bytes value, so you'll need to translate it in your head: it won't show -Xmx1024m, it will show -XX:MaxHeapSize=1073741824.
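The translation is just two divisions by 1024; a quick sketch, using the example value above:

```shell
# Convert the raw MaxHeapSize value reported by jcmd back to -Xmx form.
bytes=1073741824
mb=$((bytes / 1024 / 1024))
echo "-Xmx${mb}m"   # prints -Xmx1024m
```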

“Error occurred during initialization of VM; Could not reserve enough space for object heap” using -Xmx3G

First of all, I have a box with 8 GB of RAM, so I doubt total memory is the issue.
This application is running fine on machines with 6gb or less.
I am trying to reserve 3GB of space using -Xmx3G under "VM Arguments" in Run Configurations in Eclipse.
Every time I try to reserve more than 1500mb, I get this error:
“Error occurred during initialization of VM; Could not reserve enough space for object heap” using -Xmx3G
What is going on here?
Could it be that you're using a 32-bit jvm on that machine?
Here is how to fix it:
Go to Start -> Control Panel -> System -> Advanced (tab) -> Environment Variables -> System Variables -> New:
Variable name: _JAVA_OPTIONS
Variable value: -Xmx512M
Variable name: Path
Variable value: %PATH%;C:\Program Files\Java\jre6\bin;F:\JDK\bin;
Change this to your appropriate path.
This is actually not an Eclipse-specific issue; it's a general
Java-on-Windows issue. It's because of how the JVM allocates memory on
Windows; it insists on allocating a contiguous chunk of memory, which
often Windows can't provide, even if there are enough separate chunks to
satisfy the allocation request.
There are utilities that will try to help Windows "defrag" its memory,
which would, in theory, help this situation; but I've not really tried
them in earnest so can't speak to their effectiveness.
One thing that I've heard sometimes that might help is to reboot Windows
and, before starting any other apps, launch the Java app that needs the
big chunk of memory. If you're lucky, Windows won't have fragmented its
memory space yet and Java will get the contiguous block that it asks for.
Somewhere out on the interwebs there are more technical explanations and
analyses of this issue, but I don't have any references handy.
I did find this, though, which looks helpful: https://stackoverflow.com/a/497757/639520
First, the 32-bit JRE can't use more than ~1.5 GB of RAM, so if you want more, use a 64-bit JRE.
Second, when a new JVM starts, the system sums the -Xmx values of all the JVMs that are running and checks whether there is enough memory left to run the new one at its own -Xmx; if there is not enough, the error occurs.
I was using Liferay with a Tomcat server from the Eclipse IDE, and I was stuck with this same error on server start-up.
Double-click on the server in Eclipse; it opens the Server Overview page.
I updated the memory arguments from -Xmx1024m -XX:MaxPermSize=256m to -Xmx512m -XX:MaxPermSize=256m.
Then it worked for me.
Make sure that Eclipse is actually running the same JVM you think it's running. If you ever use Java in your web browser, you likely have a 32-bit version floating around too, which might be taking precedence if it was installed or updated lately.
To be absolutely sure, I recommend adding these two lines to your eclipse.ini file at the top:
-vm
C:/Java/jdk1.6.0_27/bin
...where C:/Java/jdk1.6.0_27/bin is, on my machine, the location of the JVM I know is 64-bit. Be sure to include the bin folder there.
(As a bonus, on Windows 7, this also allows you to actually "pin the tab", which is why I had to do this for my own usage.)
This is an issue of heap size. Edit your .bat (batch) file: it might be setting a heap size of 1024. Change it to 512, and it should work.
Just put a # symbol in front of org.gradle.jvmargs=-Xmx1536m in gradle.properties:
# org.gradle.jvmargs=-Xmx1536m
I also had the same problem while using an Eclipse that was 32-bit while the JVM it used was 64-bit.
When I pointed Eclipse to a 32-bit JVM, it worked.
I know that I am a bit late, but here comes my answer:
I just installed the Java online version from Oracle (not the offline 64-bit one).
After adding the JAVA_HOME environment variable, it just worked!
Hope I could help :)
Probably you are passing the wrong options anyway.
I got a similar error with supporting error log:
Java HotSpot(TM) Client VM warning: ignoring option PermSize=32M; support was removed in 8.0
Java HotSpot(TM) Client VM warning: ignoring option MaxPermSize=128M; support was removed in 8.0
In my case, the software did not support Java 8 yet (the script was using old JVM arguments), but I had Java 8 by default.
One of the reasons for this issue is that no memory is available for Tomcat to start. Try closing unwanted running software in Windows, then restart Eclipse and Tomcat.
The solution is simple; no need to go deep into this issue.
If you are running on a 64-bit machine, follow the steps below:
Uninstall 32-bit Java first (check C:\Program Files (x86) for its existence)
Install the newer 64-bit JDK kit (it includes the JRE)
Set the environment path (to avoid conflicts if you have two different 64-bit JREs)
Check by typing the javac command in a command prompt.
Restart / Done
You can have two different Javas installed, but don't forget to set the path.
Please set JAVA_OPTS=-Xms256m -Xmx512m in your environment variables; it should solve the issue. It worked for me.
Find out whether you are using a 32-bit or 64-bit version of Java. To find out, use the command
java -version
The third line of the output should tell you whether it is 32-bit or 64-bit.
If it is 32-bit, uninstall it and install a 64-bit version.
