I'm using log4j2 to log messages to a system via JMS/MQ. I used IBM MQ Explorer to create a .bindings file in my local C:\JNDI-Directory folder. This works great when running locally, but when I deploy to WebSphere, I would like to be able to bundle my .bindings file with my EAR. The appender in my log4j2.xml looks like:
<JMS name="jmsQueue"
     destinationBindingName="AuditDest"
     factoryName="com.sun.jndi.fscontext.RefFSContextFactory"
     providerURL="file:/C:/JNDI-Directory"
     factoryBindingName="JMSConnectionFactory">
    <PatternLayout pattern="%m"/>
</JMS>
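For context, this configuration is roughly the JNDI equivalent of doing the following by hand (the class name below is just for illustration; the lookup names come straight from the appender config):
import java.util.Hashtable;
import javax.jms.ConnectionFactory;
import javax.jms.Destination;
import javax.naming.Context;
import javax.naming.InitialContext;
import javax.naming.NamingException;

public class BindingsLookupDemo {
    public static void main(String[] args) throws NamingException {
        Hashtable<String, String> env = new Hashtable<>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.fscontext.RefFSContextFactory");
        env.put(Context.PROVIDER_URL, "file:/C:/JNDI-Directory");
        Context ctx = new InitialContext(env);

        // The appender resolves these two names from the .bindings file in that directory
        ConnectionFactory factory = (ConnectionFactory) ctx.lookup("JMSConnectionFactory");
        Destination destination = (Destination) ctx.lookup("AuditDest");
        System.out.println("Resolved " + factory + " and " + destination);
    }
}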
My deployment environment is WebSphere Application Server running on AIX. I will be deploying to several environments (Dev, Test, QA, Stage, Production), and in my build I would like to use an environment-specific .bindings file derived from the one I created locally, since only the queue manager name and IP change for each environment.
So I thought I could put the .bindings file in a properties folder and, during the build for a given environment, copy dev.bindings to .bindings inside the WAR or EAR archive. However, I'm not certain how to construct the providerURL string. If I just use "file:/properties/" on my local machine, it gets interpreted as C:\properties instead of looking in my WAR or EAR MANIFEST.MF Class-Path.
Is it possible to do this, or will I have to ask our WAS admins to create a .bindings file for each server? I'd like to avoid that so I can control things more directly; requesting changes in our WAS environment requires a ticket, which adds lead time, etc.
Thanks!!!
I have a properties file called app.properties that is correctly put in src/main/resources/app.properties.
In the WAR file it is correctly located in /WEB-INF/classes.
In standalone mode in my local environment (Windows) and in standalone mode on a Linux test server, the WAR file starts up correctly and reads my properties file.
In JBoss domain mode on another Linux server, using the exact same WAR file, I get a "file not found" error for app.properties, even though it is indeed there.
Other than domain mode, the difference between the two servers is that on the first test server JBoss is installed under root and runs as root, while the other server runs as a user that has read and execute access.
I've thoroughly debugged the code with print statements and I'm 99% sure it is not a code issue. Any ideas what in JBoss domain mode may be causing the problem of not being able to read the properties file on the classpath?
Thanks in advance.
Relevant parts of the code:
MutablePropertySources sources = new MutablePropertySources();
sources.addLast(getEncryptablePropertiesSource(new ClassPathResource("app.properties")));
Partial method:
private EncryptablePropertiesPropertySource getEncryptablePropertiesSource(Resource propsResource) throws IOException {
    // don't use file system resource because property files may be in a jar
    System.out.println(">>>> in getEncryptablePropertiesSource filename is :");
    System.out.println(propsResource.getFilename());
    System.out.print(">>>> URL is: ");
    System.out.println(propsResource.getURL());
The last System.out.println statement (the propsResource.getURL() call) throws the error in the second test environment; it does not cause problems in any other environment.
If your ClassPathResource is the class from Spring:
public class ClassPathResource extends AbstractFileResolvingResource
Resource implementation for class path resources. Uses either a given ClassLoader or a given Class for loading resources. Supports resolution as java.io.File if the class path resource resides in the file system, but not for resources in a JAR. Always supports resolution as URL.
Therefore I don't think you can use it in your case.
Have you tried using one of the following methods?
ClassLoader.getResource(String name)
ClassLoader.getResourceAsStream(String name)
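For example, a minimal sketch of that approach (the class name is just illustrative; the resource name matches the one in your code):
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public final class PropsLoader {

    // Loads app.properties via the context class loader. This works whether the
    // file sits in WEB-INF/classes or inside a nested jar, because it never
    // tries to resolve the resource to a java.io.File.
    public static Properties loadAppProperties() throws IOException {
        Properties props = new Properties();
        try (InputStream in = Thread.currentThread().getContextClassLoader()
                .getResourceAsStream("app.properties")) {
            if (in == null) {
                throw new IOException("app.properties not found on the classpath");
            }
            props.load(in);
        }
        return props;
    }
}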
I am facing a slightly strange issue while deploying a web service packaged as a WAR file.
If I deploy the application via the NetBeans IDE, it goes under the \standalone\deployments directory.
However, if I deploy the WAR file from the Admin Console, it always ends up under the \standalone\tmp directory.
Please advise on this issue. The deployment should go under the \standalone\deployments directory only.
The deployment should go under the \standalone\deployments directory only
That is not quite right.
It is not an issue; that is simply how it works.
The standalone/deployments folder exists only for the "hot deployment" functionality, which is available only in standalone mode.
That is what NetBeans uses. You can do the same just by copying the EAR or WAR into standalone/deployments and the server will pick it up (the default scan interval is 5 seconds).
The Admin Console or CLI, however, is the only (and the standard) way to deploy an application to a domain. In domain mode the deployments folder is not used and there is no deployment scanner.
So when you use the console, deployment goes the common way: the application is deployed as it would be on a domain, regardless of whether the server is a domain or standalone server.
Updated / follow-up:
In general it is better to keep .properties file(s) out of the deployment, in a separate location. That is the main idea behind them: being able to change properties without rebuilding and redeploying the application. Properties are usually different in different environments (DEV/UAT/PROD).
So there are the two most popular solutions (both sketched below):
1. Store the properties in a separate location, add that location to the classpath, and access them through the ClassLoader.getResourceAsStream() mechanism.
2. Store the properties in a separate location, pass that location in via a system (-D) property, and access them as a file. For JBoss you can place your .properties under the configuration directory; there is already a JBoss variable for it, something like jboss.server.config.dir (you can find the exact name in the Admin Console).
But of course sometimes you still need to access resources inside the WAR/EAR; in that situation it is pretty much the same as the first solution above.
Just be sure your .properties file(s) are accessible to the ClassLoader (on the classpath) and load them via ClassLoader.getResourceAsStream() (or, if you use Spring, reference them as "classpath:" rather than "file:").
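A minimal sketch of both solutions (the class name and the app.config.dir property name are only examples; jboss.server.config.dir is the property JBoss itself sets for the configuration directory in standalone mode):
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Properties;

public final class ExternalConfig {

    // Solution 1: the file is on the classpath (works inside or outside the WAR/EAR)
    public static Properties fromClasspath(String name) throws IOException {
        Properties props = new Properties();
        try (InputStream in = Thread.currentThread().getContextClassLoader()
                .getResourceAsStream(name)) {
            if (in == null) {
                throw new IOException(name + " not found on the classpath");
            }
            props.load(in);
        }
        return props;
    }

    // Solution 2: the directory is passed in via a -D system property, e.g.
    // -Dapp.config.dir=/opt/config, or taken from JBoss's own configuration
    // directory property.
    public static Properties fromConfigDir(String name) throws IOException {
        Path dir = Paths.get(System.getProperty("app.config.dir",
                System.getProperty("jboss.server.config.dir", ".")));
        Properties props = new Properties();
        try (InputStream in = Files.newInputStream(dir.resolve(name))) {
            props.load(in);
        }
        return props;
    }
}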
I'd like to create a standalone desktop application from my Java/Spring web application. I created an MSI installer that copies all required files to C:\Program Files (x86)\App, but Tomcat doesn't have permission to write to its own folder. How can I configure Tomcat so that it writes all app-specific data to another folder? I would rather not install my app to C:\App or the user directory.
java.io.FileNotFoundException: C:\Program Files (x86)\App\tomcat\logs\catalina.2016-06-18.log (Access denied)
By setting the environment variable CATALINA_BASE to another directory in your Tomcat start script, you can configure Tomcat to read and write its working data and configuration from/to another location. If CATALINA_BASE is set, Tomcat will use the folders %CATALINA_BASE%/bin, %CATALINA_BASE%/conf, %CATALINA_BASE%/logs, %CATALINA_BASE%/temp, etc. for the current instance of Tomcat. This is described in more detail in the "Advanced Configuration - Multiple Tomcat Instances" section of the RUNNING.txt file in Tomcat's root folder.
If you are planning to ship Tomcat with your application and put all Tomcat files into some user-choosable folder, you should set CATALINA_HOME to this folder. Tomcat will then use it as the base directory for everything.
However, since you mentioned you want to ship a standalone application based on Spring, I think you should seriously take a look at Spring Boot. It lets you ship a single fat jar containing all of your application's dependencies (including Tomcat), and the application can be started simply by executing the jar file.
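For illustration, a minimal Spring Boot entry point could look like this (assuming the spring-boot-starter-web dependency, which brings in the embedded Tomcat):
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class DesktopApp {

    public static void main(String[] args) {
        // Starts the embedded Tomcat; no external Tomcat installation
        // (and no write access to Program Files) is required.
        SpringApplication.run(DesktopApp.class, args);
    }
}
Packaged with the Spring Boot Maven or Gradle plugin, it can then be started with java -jar app.jar.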
If it's just about the log and temp files, you can set the Java system properties java.util.logging.config.file and java.io.tmpdir in the setenv.bat file under %CATALINA_BASE%/bin to make Tomcat use a custom logging configuration and a different temp dir, respectively. That is, the file would look something like this:
set "CATALINA_OPTS=-Djava.util.logging.config.file=C:\path\to\log\config.properties"
set "CATALINA_OPTS=%CATALINA_OPTS% -Djava.io.tmpdir=C:\path\to\temp\dir"
Find and update all occurrences of "$CATALINA_BASE"/logs/catalina.out in the catalina.sh script to point to a custom path.
We're developing a big J2EE e-sales solution. It has a lot of integrations: CMS, ERP, mail server, etc. All these systems are divided into test and production environments.
We need to deploy our application to our test servers with the test configuration, and when it is deployed to our production servers it should use the production configuration. How do we make our application select the correct properties?
The thing we've tried so far is this:
All our property files contain both test and production properties:
test.mvxapi.server = SERV100TS
test.mvxapi.username = user
test.mvxapi.password = password
test.mvxapi.port = 6006
test.mvxapi.cono = 600
mvxapi.server = SERV10001
mvxapi.username = user
mvxapi.password = password
mvxapi.port = 6001
mvxapi.cono = 100
The util that reads these properties has a switch, isTest(), which prefixes the key with "test.":
public String getProperty(String property)
{
    // prefix is "test." when isTest() is true, otherwise ""
    return properties.getProperty(prefix + property);
}
The switch is set by another property, which is created by our build server. When the EAR is built, the script for our production servers injects "isProduction=true" (as input to build.xml) into system.properties.
<propertyfile file="${buildDir}/system.properties">
    <entry key="isProduction" value="${systemType}"/>
</propertyfile>
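For clarity, the util then roughly ends up doing this (simplified; the class name and loading details are illustrative):
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public class ConfigUtil {

    private final Properties properties = new Properties();
    private final String prefix;

    public ConfigUtil() throws IOException {
        // system.properties (written by the build) and the application
        // property files are loaded from the classpath
        try (InputStream in = ConfigUtil.class.getResourceAsStream("/system.properties")) {
            if (in != null) {
                properties.load(in);
            }
        }
        // ... the other property files are loaded the same way ...
        prefix = isTest() ? "test." : "";
    }

    public boolean isTest() {
        return !Boolean.parseBoolean(properties.getProperty("isProduction"));
    }

    public String getProperty(String property) {
        return properties.getProperty(prefix + property);
    }
}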
I'm not sure this is the best way to do it. If for some reason "isProduction=false" is wrongly committed to our production environment, all hell breaks loose.
I've read that people keep properties locally on the server, but we really don't want files spread around. We have a cluster of production servers, and making sure every server has the right property file doesn't seem fail-safe.
What you want to avoid is having the config file inside the EAR: the problem with that is that you need different EARs for different environments, and changing the config file also requires a rebuild.
Instead, deploy the same EAR to every server but configure each server with a different URL resource. In other words, add a JNDI URL resource to every server you deploy to that points to the config file for that environment. If you have read-only SVN access to your repo, you can create the config files in the SVN repo, or in any repository you can access via a URL. The nice thing here is that all your configuration is centralized, which makes it easy to manage.
What I've done (with some Spring customization) is make that JNDI URL resource optional: if it's there, the app uses it; if not, it doesn't. The app starts up either way, so even when no JNDI resource is available (in a development environment, for example) the app still works.
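A rough sketch of that pattern (the JNDI name url/appConfig and the fallback resource name are only examples):
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.util.Properties;
import javax.naming.InitialContext;
import javax.naming.NamingException;

public final class CentralConfig {

    public static Properties load() throws IOException {
        Properties props = new Properties();
        try {
            // URL resource bound on the server, e.g. pointing at the repo-hosted config file
            URL configUrl = (URL) new InitialContext().lookup("java:comp/env/url/appConfig");
            try (InputStream in = configUrl.openStream()) {
                props.load(in);
            }
        } catch (NamingException e) {
            // No URL resource bound (e.g. a developer machine): fall back to a bundled file
            try (InputStream in = CentralConfig.class.getResourceAsStream("/default-config.properties")) {
                if (in != null) {
                    props.load(in);
                }
            }
        }
        return props;
    }
}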
You deploy an EAR? Then put the properties needed in JNDI.
I can't say whether this is the best way, but what we do is include a client jar and a server jar that house the properties. We then include those jars in the EAR file, so during our build process we include the appropriate (QA, TEST, PROD) jars for the environment we are deploying to.
The downside is that we have to manage three sets of environment jars, and the build team has to be careful not to deploy the incorrect one. In fact, it happened once that a PROD jar was deployed to our QA environment and QA data was getting put into production... yes, that sucked and was a major mess to clean up.
I will be watching this discussion because I often wonder how we can make this process better/safer. Great Post +1
In a previous J2EE project we did exactly that. The build process (an ant script) put together the right config files and added them to a specific jar, which was then put into the EAR file for the production, test, training, QA, etc. environments.
The file name of the EAR file contained the name of the target environment, so it was basically impossible to deploy a file to the wrong environment. If we built for target 156p2 (factory 156, production env. 2), this would be part of the file name of the EAR file and ant would include config_156p2.xml. If the target was incorrect, the EAR file's name would be wrong and as a last failsafe the guy who deployed it would notice.
The build file had to contain one ant target per environment to start the build, each of which set a property that told ant which config file to include.
The only difference between the EAR files would then be the config files. Everything else was identical. There is a possibility, of course, that someone might have written a wrong value to a config file for a certain environment. However, in practice this never happened in several years, even with some pretty junior developers and about fifteen target environments (different test, QA, training and production servers in different countries).
We have three folders for this purpose in our projects; each one contains configuration files (the filenames are the same across the folders):
personal: contains paths to test db, server, etc
test: contains paths to the servers shared with my colleagues
production: contains... well you guessed
When I build my project I add the appropriate profile to the IntelliJ IDEA project build, in the desired module. This basically means I am adding a different folder to the project structure, but because the filenames are the same, only the property values change.
Very old post, but I'm still responding in case someone checks it. In each application server you can set system properties, e.g.:
WildFly Management Console --> Configuration --> System Properties
There I add a variable SERVER_ENVIRONMENT with the value DEV/UAT/PROD.
In my Java code I use:
System.getProperty("SERVER_ENVIRONMENT")
which gives me the value from the server.
As #Alberto-Zaccagni said, you can have separate folders with properties files that exist only in the respective environment. Your code checks for the existence of a folder, starting with PROD, then UAT, then DEV, and when it finds a path that exists it uses the properties files there, as in the sketch below.
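A small sketch of that idea (the config root /opt/app/config and the file name are only examples):
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Properties;

public final class EnvironmentConfig {

    public static Properties load() throws IOException {
        // Prefer the server-level system property when it has been set
        String env = System.getProperty("SERVER_ENVIRONMENT");
        if (env == null) {
            // Otherwise probe for the environment folders: PROD first, then UAT, then DEV
            for (String candidate : new String[] {"PROD", "UAT", "DEV"}) {
                if (Files.isDirectory(Paths.get("/opt/app/config", candidate))) {
                    env = candidate;
                    break;
                }
            }
        }
        if (env == null) {
            throw new IllegalStateException("No environment could be detected");
        }
        Properties props = new Properties();
        Path file = Paths.get("/opt/app/config", env, "app.properties");
        try (InputStream in = Files.newInputStream(file)) {
            props.load(in);
        }
        return props;
    }
}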