I've been trying to automate the creation of our development environment by combining batch files and WLST, but I am struggling to change the memory the WebLogic server will start with.
Currently we are manually changing the memory settings in the <DOMAIN_HOME>/bin/setDomainEnv.cmd script, but this is a workaround. It should be possible to do it automatically without much effort.
Setting the Domain
The script that sets up the domain is pretty simple:
set JAVA_HOME=C:\Program Files\Java\jdk1.6.0_45
set MW_HOME=C:\dev\wls1036_dev
set DOMAIN_HOME=C:\dev\domain
rem prepare the WebLogic environment
cd %MW_HOME%
call configure.cmd
rem start the server in the empty domain directory so WebLogic creates a default domain there on first boot
mkdir %DOMAIN_HOME%
cd %DOMAIN_HOME%
%JAVA_HOME%\bin\java.exe -Xmx1024m -XX:MaxPermSize=256m -Dweblogic.management.username=weblogic -Dweblogic.management.password=welcome1 weblogic.Server
I've tried setting variables such as MEM_ARGS and JAVA_OPTIONS in this script, but none of them are carried over into the final domain configuration; the server always ends up starting with a 512m heap and 128m PermGen, which is not enough.
WLST memory start args
We are using Eclipse, and it calls startWebLogic.cmd as the start script; this is the standard configuration.
I tried the following WLST script. It does set the server start arguments, but WebLogic does not use those properties and still starts with too little memory.
edit()
startEdit()
cd('/Servers/myserver/ServerStart/myserver')
cmo.setArguments('-Xmx1024m -XX:MaxPermSize=256m')
activate()
Any ideas?
You can use a trick to pick up the ServerStart arguments:
Write a simple offline WLST script that reads the arguments from config.xml:
getArguments.py
import sys
# argv[1] = domain directory, argv[2] = admin server name
readDomain(sys.argv[1])
cd('Server/%s/ServerStart/NO_NAME_0' % sys.argv[2])
# write the arguments to a file the start script can read back in
argsFile = open('arguments.txt', 'w')
print >>argsFile, cmo.arguments
argsFile.close()
closeDomain()
Then add the call to the domain-root startWebLogic.cmd (the one that delegates to bin\startWebLogic.cmd), for example:
startWebLogic.cmd
...
set DOMAIN_HOME=%~dp0
rem dump the configured ServerStart arguments into arguments.txt
path\to\wlst.cmd getArguments.py %DOMAIN_HOME% admin_server_name
rem read them back and hand them to the real start script via EXTRA_JAVA_PROPERTIES
set /p EXTRA_JAVA_PROPERTIES=<arguments.txt
call "%DOMAIN_HOME%\bin\startWebLogic.cmd" %*
There is no easy way to set these values when starting WebLogic from Eclipse. Eclipse calls the batch script and, at least in the current version, does not let you pass dynamic parameters.
We solved it by making the setDomainEnv.cmd file part of our versioned configuration:
Copy the setDomainEnv.cmd file to your version control configuration.
Edit whatever you need (memory settings, etc.).
Copy the file back, e.g. copy custom\setDomainEnv.cmd %DOMAIN_HOME%\bin /y, when running your development environment configuration script.
Now every time you configure your development environment, the memory values are set without manual intervention.
You will have to revisit the copied file when updating WebLogic, so you don't end up with an outdated version of it.
Related
I am setting an environment variable on my machine using export MY_KEY=foo and trying to read it in the JVM with System.getenv("MY_KEY"). It returns null, but running echo $MY_KEY prints foo in the terminal.
I have tried restarting the IDE; it still doesn't work.
The environment variable is only available to subprocesses of the shell that exported it. Did you start your IDE from that shell?
If you want the variable to be available all the time, you need to add it to the /etc/profile file or create an extra file in /etc/profile.d; it depends on your operating system.
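For instance, a minimal sketch on Linux (the file name, the value, and the your-ide launcher are only placeholders for illustration):
# make MY_KEY available to future login shells via /etc/profile.d
echo 'export MY_KEY=foo' | sudo tee /etc/profile.d/my_key.sh
# pick it up in the current shell and start the IDE from that same shell,
# so the IDE process inherits the variable
. /etc/profile.d/my_key.sh
your-ide &
After logging out and back in, anything you launch will see MY_KEY without sourcing the file manually.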
We are working on a JRuby on Rails app and want to set JRUBY_OPTS or JAVA_OPTS.
I can easily use export to set them on my machine, but I'd like these options to persist with the application.
Our application runs on Tomcat in production, where we can set the options in a config file.
Putting export JAVA_OPTS in an initializer file didn't work; it seems those options would only be set after the application has already started.
You cannot "persist" such options within the app.
Just as you exported them locally, you can export JAVA_OPTS before Tomcat starts: Tomcat is a JVM process that reads those options when the VM is created. Your app, being deployed inside it, has no control over most of the settings JAVA_OPTS covers (memory, stack size, etc.).
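For example, a minimal sketch in the shell or init script that launches Tomcat (the /opt/tomcat path is just an assumption; adjust to your install):
# export the options before starting Tomcat so the JVM is created with them
export JAVA_OPTS="-Xmx1024m -XX:MaxPermSize=256m"
/opt/tomcat/bin/startup.sh
Anything your app needs at the JVM level has to be in place before this point; exporting from inside the running application is too late.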
How do I increase the memory used by my WebLogic (Java) server? When starting the server from Eclipse it prints the message JAVA Memory arguments: -Xms256m -Xmx512m -XX:MaxPermSize=256m, and I can't work out where it is taking those values from. After some time the WebLogic server fails because of low PermGen space.
I added startup arguments from the console, but that has no effect. Can you help me find where it is taking the memory values from?
When you configure a "Server" in Eclipse for WebLogic, you select a domain directory (for local). That domain directory contains the startup scripts that Eclipse will use to start the WebLogic Server. These are the same scripts that you would use if you started the server if you did it without Eclipse. Inside the domain directory is a folder called "bin". In the "bin" directory, locate the setDomainEnv file (.sh for unix, or .cmd for Windows). In that file, alter the memory settings to suite your needs.
Based on the error message you mentioned in your question, I would increase both the PermSize and MaxPermSize settings to 512m. For PermSize and MaxPermSize, there are two locations each by default in a simple WLS installation, one for 32-bit, and another for 64-bit. It won't hurt to change them both. But if you know which JVM architecture you are running, you can change the one that applies to your environment.
You will have a setDomainEnv.cmd/setDomainEnv.sh file under your server's bin folder. This file contains
set MEM_MAX_PERM_SIZE_64BIT=-XX:MaxPermSize=512m
set MEM_MAX_PERM_SIZE_32BIT=-XX:MaxPermSize=512m
and the max and min memory values as:
if "%JAVA_VENDOR%"=="Sun" (
set WLS_MEM_ARGS_64BIT=-Xms256m -Xmx512m
set WLS_MEM_ARGS_32BIT=-Xms256m -Xmx512m
) else (
set WLS_MEM_ARGS_64BIT=-Xms512m -Xmx512m
set WLS_MEM_ARGS_32BIT=-Xms512m -Xmx512m
)
You can update the values inside it.
In addition to the previous two answers, which are correct (modifying setDomainEnv and potentially wl_server\common\bin\commEnv), you can also configure servers individually if you are starting them with the Node Manager.
In the admin console navigate to:
Servers -> <server name> -> Server Start tab -> Arguments
Here you can set the JVM args you want for that particular server rather than making a blanket change for all servers in setDomainEnv.
It looks like the Eclipse plugin sets its own USER_MEM_ARGS variable, so it has to be overridden using the following approach:
Open the following file:
$WL_HOME/user_projects/domains/mydomain/bin/setDomainEnv.sh
and add the following line to it (e.g. after the help description):
USER_MEM_ARGS=$ECLIPSE_MEM_ARGS
Then open the WebLogic server properties (double-click the server in the Servers view) and click "Open launch configuration". In the window that opens, select "Environment" and add a new variable ECLIPSE_MEM_ARGS with the memory parameters as its value, e.g.:
ECLIPSE_MEM_ARGS=-Xms1024m -Xmx2048m -XX:MaxPermSize=512m
Save and check that the server picks up the new configuration.
P.S. Using this approach you can change/add memory parameters directly from Eclipse.
P.P.S. On Windows you should use the setDomainEnv.cmd file, and the added line should be:
set USER_MEM_ARGS=%ECLIPSE_MEM_ARGS%
For WebLogic 12:
Environment->Servers->[your_server]->Configuration/Server Start -> Arguments:
-Xms1024m -Xmx2048m
Restart the server:
Environment->Servers->[your_server]->Control/"Start/Stop" -> Suspend and then Start.
Check your memory:
Environment->Servers->[your_server]->Monitoring/Performance.
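If you also have shell access to the machine running the server, a quick sanity check (purely illustrative) is to look at the flags the server JVM was actually started with:
# the server runs as a java process with weblogic.Server on its command line;
# the [w] trick keeps grep from matching itself
ps -ef | grep '[w]eblogic.Server'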
I have a web project with a VM argument called -Dfolder.
I use that argument in my applicationContext like this:
<value>file:${FNET_CORE_CONFIG}/conf/${folder}/jdbc.properties</value>
In Eclipse, for testing, I use a Run Configuration to set the value like this:
-Dfolder=Dev
Now I want to test my webapp on Apache Tomcat, so I need to set/pass the folder VM argument.
How do I do that?
Do I have to use setenv.sh? How? Can someone give me an example?
Thanks, and sorry for my English.
I don't know what version of Tomcat you are using, but in Tomcat 7 you can specify the CATALINA_OPTS variable in catalina.sh and it will be passed to the JVM.
But maybe setting an environment variable isn't the best way to achieve your goal. It might be better to create a separate app.properties file and include it in the applicationContext like this:
<context:property-placeholder location="classpath*:app.properties" />
And the solution for catalina.sh:
# CATALINA_OPTS (Optional) Java runtime options used when the "start",
# "run" or "debug" command is executed.
# Include here and not in JAVA_OPTS all options, that should
# only be used by Tomcat itself, not by the stop process,
# the version command etc.
# Examples are heap size, GC logging, JMX ports etc.
example:
CATALINA_OPTS="-Dfolder=Dev"
EDIT:
For Windows it should be something like set CATALINA_OPTS="-Dfolder=Dev"
EDIT:
In a Spring configuration you can reference a system property simply as ${propertyname}. You can also include a file with property definitions via context:property-placeholder, and all properties defined in that file become available in the config as well.
For example, suppose you have a base set of properties in config.properties and a set of files with DB connection settings (DEV.properties, UAT.properties, PROD.properties). How can you include different properties for different environments? It can be done in many ways; for example, set a system property in catalina.bat:
set CATALINA_OPTS="-Dbuild=DEV"
and in applicationConfig.xml
<context:property-placeholder location="classpath*:${build}.properties, classpath*:config.properties" />
Or you can create different build configurations and include only one of the properties files (DEV, UAT or PROD) in the final WAR for each build configuration. In applicationConfig set something like:
<context:property-placeholder location="classpath*:*.properties" />
Go to $CATALINA_HOME/bin and edit the setenv.sh file, adding the parameters with their values.
If you want to pass multiple parameters, separate them with spaces.
E.g.
cd /opt/tomcat/bin/
sudo nano setenv.sh
edit the line
CATALINA_OPTS="${CATALINA_OPTS}"
to
CATALINA_OPTS="${CATALINA_OPTS} -Dparam=value -Dparam2=value2"
restart tomcat:
service tomcat restart
you should now be able to see the arguments passed to tomcat when you run:
ps aux | grep tomcat
I made it work on Windows by creating a setenv.bat file in the same directory as catalina.bat and startup.bat (as recommended in catalina.bat),
and putting the following contents in the .bat:
set CATALINA_OPTS="-DyourVariableName=yourValue"
That's all. I like this way as it looks pretty clean.
I need to pass a couple of JVM arguments to the JVM which Tomcat is running in so that my application can pick them up.
I want to follow the process outlined in this article to pick up environment variables.
How would I go about doing this?
UPDATE
Sorry, this is running under Windows (7 on my dev machine, 2003 on the client server).
Windows:
In your Tomcat /bin folder, you should have a tomcat5w.exe admin app (or in later versions tomcat6w.exe, tomcat8w.exe, etc.). Go to the Java tab and add the args in the "Java Options:" box.
Note that when you add new args, you need to add them as NEW LINES in that box (above or below any others there), not as additional arguments in front of or behind the values on one of the existing lines.
Linux / UNIX:
On *nix, changes to the setenv.sh file should be picked up:
export JAVA_OPTS="-server -Xms2g -Xmx4g -XX:PermSize=64m -XX:MaxPermSize=256m $JAVA_OPTS"
Don't touch catalina.sh or other files in bin
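setenv.sh is the one file in bin you are meant to add yourself; it is not shipped by default. A minimal sketch of creating it, assuming a standard CATALINA_HOME layout:
# create bin/setenv.sh -- catalina.sh sources it on every start if it exists
cat > "$CATALINA_HOME/bin/setenv.sh" <<'EOF'
export JAVA_OPTS="-server -Xms2g -Xmx4g -XX:PermSize=64m -XX:MaxPermSize=256m $JAVA_OPTS"
EOF
Quoting the EOF marker keeps $JAVA_OPTS literal in the file, so it is expanded when Tomcat starts rather than when the file is written.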
You need to edit the Windows service. There are three ways to do this:
Start Tomcat5w with //MS//ServiceName to get an icon in the system tray, which gives you quick access to the configuration of the service.
Open the service manager in the "Control Panel". There is an entry for Tomcat.
In the editor, there is a tab where you can add additional JVM parameters.
The third way (which I prefer) is to write a script which edits the service config for you. This way, you can save the config somewhere for backup. See the docs for how to do that (hint: use tomcat5 //US//...).