Logging to separate .log files using Log4j - Java

I'm working on a multi-module Maven web project.
Let's say the structure looks something like this:
Project
    Module1
    Module2
    Persist
        log4j.properties
I managed to log all the log entries into one file by placing the log4j.properties file into the Persist module. So far, this is clear.
Now there are some modules whose logs I would like to separate into other files.
Just adding a new appender (testAppender in the example) doesn't work for me, because then I don't get the path of the .java file the log entry was written from.
If I write it like this:
Logger log = Logger.getLogger("testAppender");
I get something like this:
2016-06-06 15:00:00,032 [INFO ] Start rule activation. (testAppender)[__ejb-thread-pool3]
And this is what I want:
2016-06-06 15:00:00,032 [INFO ] Start rule activation. (Module1.src.main.java.somepkg.MyClass)[__ejb-thread-pool3]
Where MyClass is the .java file.
I also tried adding a completely new (independent) log4j.properties file, just like the one in the Persist module but pointing to a different .log file, to the Maven modules whose logs I want to separate (Module1 and Module2), but then only the entries from the Maven tests get logged - which is another problem I have, but one thing at a time.
Is there a way to add a new appender that separates the log entries by the module they're coming from and writes them to their own .log file?
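For context, log4j 1.x routes events by logger name, so one common way to get both the originating class in %c and a per-module file is to keep Logger.getLogger(MyClass.class) and attach a dedicated appender to a package-level logger. A minimal sketch, assuming a hypothetical package name com.example.module1:

import java.io.IOException;

import org.apache.log4j.Logger;
import org.apache.log4j.PatternLayout;
import org.apache.log4j.RollingFileAppender;

public class Module1LogSetup {

    // Call once at startup: every class under com.example.module1 (hypothetical
    // package) inherits this appender through the logger hierarchy, while %c
    // still prints the class that issued the log call.
    public static void configure() throws IOException {
        Logger module1Logger = Logger.getLogger("com.example.module1");
        module1Logger.addAppender(new RollingFileAppender(
                new PatternLayout("%d [%-5p] %m (%c)[%t]%n"), "module1.log", true));
        module1Logger.setAdditivity(false); // keep these entries out of the shared log
    }
}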

In the jar module, exclude the file from the jar:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>2.3.1</version>
<configuration>
<excludes>
<exclude>log4j.xml</exclude>
</excludes>
</configuration>
</plugin>
Use the build-helper plugin to attach the log4j.xml to the build as a separate artifact:
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>build-helper-maven-plugin</artifactId>
<version>1.5</version>
<executions>
<execution>
<id>attach-artifacts</id>
<phase>package</phase>
<goals>
<goal>attach-artifact</goal>
</goals>
<configuration>
<artifacts>
<artifact>
<file>${project.build.outputDirectory}/log4j.xml</file>
<type>xml</type>
<classifier>log4j</classifier>
</artifact>
</artifacts>
</configuration>
</execution>
</executions>
</plugin>
Now in your war artifact, copy the xml to the output directory:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<executions>
<execution>
<id>copy</id>
<phase>prepare-package</phase>
<goals>
<goal>copy</goal>
</goals>
<configuration>
<artifactItems>
<artifactItem>
<groupId>${project.groupId}</groupId>
<artifactId>your.jar.project.artifactId</artifactId>
<version>${project.version}</version>
<type>xml</type>
<classifier>log4j</classifier>
<outputDirectory>${project.build.outputDirectory}</outputDirectory>
<destFileName>log4j.xml</destFileName>
</artifactItem>
</artifactItems>
</configuration>
</execution>
</executions>
</plugin>
But of course it would be easier to just put the file in [web-artifact]/src/main/resources in the first place :-)

So, I found a solution to this problem.
I made a class that takes a logger object and extracts from it the information I need to separate the entries.
I initialize the logger manually, more or less.
Hope this will help someone!
In every class where the Logger is used, I also call the initializer.
public class ClassA {

    // must be static so the static initializer below can reference it
    private static final Logger LOGGER = Logger.getLogger(ClassA.class);

    static {
        LOGGER.addAppender(LogConfig.init(LOGGER));
    }

    // some code...
}
And here is what I did in the LogConfig class:
import java.io.IOException;

import org.apache.log4j.Logger;
import org.apache.log4j.PatternLayout;
import org.apache.log4j.RollingFileAppender;

public class LogConfig {

    public static RollingFileAppender init(Logger logger) {
        String logInfo = logger.getClass().toString();
        String logLevel = getLogLevel(logInfo); // extracted for completeness, not used below
        String logClazz = getLogClazz(logInfo); // extracted for completeness, not used below
        String logModule = getModule(logInfo);
        PatternLayout layout = new PatternLayout("%d [%-5p] %m (%c)[%t]%n");
        try {
            // compare strings with equals(), not ==
            if ("presentation".equals(logModule)) {
                return new RollingFileAppender(layout, "PathToLogFile_1.log", true);
            } else if ("business".equals(logModule)) {
                return new RollingFileAppender(layout, "PathToLogFile_2.log", true);
            }
            return new RollingFileAppender(layout, "PathToLogFile_3.log", true);
        } catch (IOException e) {
            e.printStackTrace();
        }
        return null;
    }

    private static String getLogClazz(String logInfo) {
        return logInfo.substring(logInfo.indexOf("("), logInfo.indexOf(")") + 1);
    }

    private static String getLogLevel(String logInfo) {
        return logInfo.substring(24, 31);
    }

    private static String getModule(String logInfo) {
        logInfo = logInfo.substring(logInfo.indexOf("(") + 15, logInfo.length());
        return logInfo.substring(0, logInfo.indexOf("."));
    }
}
If you ever use this code, notice that you will have to adjust the three methods that return the class name, log level and module name, depending on how you have named your packages, classes, modules, etc.
The LOGGER.getClass().toString() call
returned a string that looks something like this:
[timestamp] (moduleName.packageName.src.java.yourClassName);
With the String methods (substring, indexOf) you can easily extract the information you need.

Related

JOOQ custom code generator with maven mojos

I have a custom code generator extending JavaGenerator, and it would be very useful if the user of this generator could specify additional information, like a list of column names to which the custom generator applies.
My first thought was to add configuration options to the dependency via Mojos.
However, this does not seem to work properly, because during the build cycle two separate instances are created: one by Maven and one by jOOQ.
This is my custom generator:
@Mojo(name = "generation")
public class CustomGenerator extends JavaGenerator implements org.apache.maven.plugin.Mojo, ContextEnabled {

    @Parameter(property = "tableNames")
    private List<String> tableNames;

    private static final Logger log = LoggerFactory.getLogger(CustomGenerator.class);

    @Override
    protected void generateSchema(SchemaDefinition schema) {
        // custom code generation based on the variable "tableNames"
    }

    @Override
    public void execute() throws MojoExecutionException, MojoFailureException {
        // called when Maven instantiates this class
    }

    // bunch of empty methods I do not care about but which have to be there because
    // I cannot let this class also extend AbstractMojo
    @Override
    public void setPluginContext(Map map) {
    }

    @Override
    public Map getPluginContext() {
        return null;
    }

    @Override
    public void setLog(Log log) {
    }

    @Override
    public Log getLog() {
        return null;
    }
}
And this is the POM of the project where I use the generator and want to supply the additional information:
<plugin>
<groupId>org.jooq</groupId>
<artifactId>jooq-codegen-maven</artifactId>
<version>3.15.4</version>
<configuration>
<generator>
<name>org.example.CustomGenerator</name>
</generator>
</configuration>
</plugin>
<plugin>
<groupId>org.example</groupId>
<artifactId>customgenerator</artifactId>
<version>1.0</version>
<configuration>
<tableNames>
<tableName>test_table</tableName>
</tableNames>
</configuration>
<executions>
<execution>
<goals>
<goal>generation</goal>
</goals>
</execution>
</executions>
</plugin>
...
<dependency>
<groupId>org.example</groupId>
<artifactId>customgenerator</artifactId>
<version>1.0</version>
</dependency>
If there are any other methods to supply custom information to the generator, please let me know.
Using database properties
Maybe, much simpler: use the properties available on the <database> element, as follows:
<configuration>
<generator>
<database>
<properties>
<property>
<key>some_key</key>
<value>some_value</value>
</property>
</properties>
</database>
</generator>
</configuration>
There's no specification of what you place as key/value in those properties. They've been added for purposes like yours, e.g. to add custom configuration to advanced <database> implementations like:
DDLDatabase
XMLDatabase
LiquibaseDatabase
JPADatabase
See those sections for how the properties are used by each of those databases.
Your custom JavaGenerator logic can then access the properties via:
Properties properties = definition.getDatabase().getProperties();
Where definition is any object that is being generated, e.g. SchemaDefinition in your code.
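For example, a minimal sketch of the generator side (package names match jOOQ 3.15.x; some_key is just the placeholder from the snippet above):

import java.util.Properties;

import org.jooq.codegen.JavaGenerator;
import org.jooq.meta.SchemaDefinition;

public class CustomGenerator extends JavaGenerator {

    @Override
    protected void generateSchema(SchemaDefinition schema) {
        // properties configured under <generator><database><properties> in the POM
        Properties properties = schema.getDatabase().getProperties();
        String value = properties.getProperty("some_key"); // placeholder key
        // ... drive the custom generation from "value", then fall back to the default
        super.generateSchema(schema);
    }
}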
Simplest solution
Of course, setting system properties is always a pragmatic option.
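For completeness, a tiny sketch of that route; the property name is made up and would be passed on the Maven command line, e.g. mvn generate-sources -Dcodegen.tables=test_table:

// inside the custom generator: read a value supplied via -D on the command line
String tables = System.getProperty("codegen.tables", ""); // hypothetical property name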

Setting CventSessionHeader CVENT API WSDL

I am generating Java classes from the CVENT WSDL file using a maven plugin (see the sample below from my POM file). The code generates successfully.
I then call the code below (the start and end dates passed into the getUpdated call are parameters to my method).
When I run / debug, it connects successfully, but the getUpdated call fails:
Fault from server: INVALID_CVENT_HEADER_VALUE
In examples online, I can see that I need to set the header on the session - but I don't see any method in V200611Soap that allows me to set it.
Anyone with experience of this, or any sample code?
Thanks in advance.
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>jaxws-maven-plugin</artifactId>
<version>1.12</version>
<configuration>
<wsdlUrls>
<wsdlUrl>https://api.cvent.com/soap/V200611.ASMX?WSDL</wsdlUrl>
</wsdlUrls>
<keep>true</keep>
<sourceDestDir>${basedir}/target/generated/src/main/java</sourceDestDir>
</configuration>
<executions>
<execution>
<goals>
<goal>wsimport</goal>
</goals>
</execution>
</executions>
</plugin>
V200611 aV200611 = new V200611();
V200611Soap soap = aV200611.getV200611Soap();
String accountNumber = "xxxxxx";
String userName = "xxxxxx";
String password = "xxxxxx";
LoginResult loginResult = soap.login(accountNumber, userName, password);
CventSessionHeader header = new CventSessionHeader();
header.setCventSessionValue(loginResult.getCventSessionHeader());
GetUpdatedResult getUpdatedResult = soap.getUpdated(CvObjectType.TRAVEL, startDateXMLGregorianCalendar, endDateXmlGregorianCalendar);
I fixed it by switching to the CXF code-generation plugin
and adding the wsdlOption
<extendedSoapHeaders>true</extendedSoapHeaders>
which puts the implicit arguments (declared in the wsdl:binding but not in the wsdl:port) into the generated API classes.
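For reference, a sketch of what that cxf-codegen-plugin configuration could look like (plugin version omitted; treat the exact element layout as an assumption to verify against the CXF documentation):

<plugin>
  <groupId>org.apache.cxf</groupId>
  <artifactId>cxf-codegen-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <goal>wsdl2java</goal>
      </goals>
      <configuration>
        <wsdlOptions>
          <wsdlOption>
            <wsdl>https://api.cvent.com/soap/V200611.ASMX?WSDL</wsdl>
            <!-- generate parameters for implicit headers such as CventSessionHeader -->
            <extendedSoapHeaders>true</extendedSoapHeaders>
          </wsdlOption>
        </wsdlOptions>
      </configuration>
    </execution>
  </executions>
</plugin>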

Get absolute filepath in maven-plugin using Apache Maven File Management API

I am currently trying to write a Maven plugin which should be able to create/process some resource files during the "generate-resources" phase. Everything works fine, but during the process my plugin needs to read some other files as input, so I decided to use the Apache Maven File Management API to specify the input file paths. I set everything up as described in the "In a MOJO" example.
<plugin>
<groupId>my.groupId</groupId>
<artifactId>my-maven-plugin</artifactId>
<version>0.0.1-SNAPSHOT</version>
<executions>
<execution>
<goals>
<goal>mygoal</goal>
</goals>
<phase>generate-resources</phase>
</execution>
</executions>
<configuration>
<fileset>
<directory>${basedir}/src/main/resources</directory>
<includes>
<include>**/*.xml</include>
</includes>
</fileset>
</configuration>
</plugin>
But I am not able to retrieve the absolute file path of the files:
public void execute() throws MojoExecutionException {
FileSetManager fileSetManager = new FileSetManager();
for (String includedFile : fileSetManager.getIncludedFiles(fileset)) {
getLog().info(includedFile);
}
}
...as the result is just the filename like:
[INFO] --- my-maven-plugin:0.0.1-SNAPSHOT:mygoal (default) ---
[INFO] some-file-A.xml
[INFO] some-file-B.xml
I am also not able to concatenate the fileset.directory with the filename because FileSetManager does not contain a method to retrieve the fileset.directory value.
So how can I retrieve the absolute file path of the includes?
I found out that fileset.getDirectory() does the trick.
public void execute() throws MojoExecutionException {
FileSetManager fileSetManager = new FileSetManager();
for (String includedFile : fileSetManager.getIncludedFiles(fileset)) {
getLog().info(fileset.getDirectory() + File.separator + includedFile);
}
}
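A small variation (not from the original answer) that resolves each include to a java.io.File instead of concatenating strings:

public void execute() throws MojoExecutionException {
    FileSetManager fileSetManager = new FileSetManager();
    for (String includedFile : fileSetManager.getIncludedFiles(fileset)) {
        // java.io.File resolves the relative include against the fileset's base directory
        java.io.File resolved = new java.io.File(fileset.getDirectory(), includedFile);
        getLog().info(resolved.getAbsolutePath());
    }
}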

jaxb - add elements to list (maven-plugin)

I generated my classes with jaxb and now I need to populate some list. What's the best way to do that?
pom.xml:
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>jaxb2-maven-plugin</artifactId>
<executions>
<execution>
<goals>
<goal>xjc</goal>
</goals>
</execution>
</executions>
<configuration>
<schemaDirectory>${basedir}/src/main/resources/META-INF/xsd</schemaDirectory>
<packageName>be.structure</packageName>
<outputDirectory>${basedir}/target/generated/java</outputDirectory>
</configuration>
</plugin>
The generated class where the list is located:
@XmlAccessorType(XmlAccessType.FIELD)
@XmlType(name = "configuration", propOrder = {
"professions"
})
public class Configuration {
protected List<Profession> professions;
public List<Profession> getProfessions() {
if (professions == null) {
professions = new ArrayList<Profession>();
}
return this.professions;
}
}
But as you can see, there is no "addProfession" or "setProfessions()" method. I know there is a way, but I can't really remember it...
getProfessions().add(profession) should do the trick if the underlying list is mutable. But normally you wouldn't change the contents of JAXB instances, since JAXB populates the objects for you based on the XML data it reads - if you change those lists, they no longer represent the XML.
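That said, if you are building the object tree yourself, a minimal usage sketch looks like this (Profession is the generated class; its fields depend on the XSD):

// populate the generated list in place; the getter exposes the live list,
// so add() is the intended way to mutate it
Configuration config = new Configuration();
Profession profession = new Profession();
config.getProfessions().add(profession);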

Maven CLI gives a syntax error but Eclipse is OK. Is the Java syntax correct? Really strange

I'm having a very strange issue in my Java application. I'm building it with Maven and developing it in the Eclipse IDE. This might be a bit of a lengthy explanation, but please stick with it to the end, because the problem is really strange and I have no clue what the cause could be.
Here's an example of code I'm writing:
Suppose we have a "Handler" interface. It can handle a specific object type:
public interface Handler<T> {
public void handle(T obj);
}
Now let's say we want to have Handler chaining. We could do it like this:
public class HandlerChain<T> implements Handler<T> {
private Handler<? super T> h;
@Override
public void handle(T obj) {
//h can handle T objects
h.handle(obj);
}
private HandlerChain(Handler<? super T> h) {
super();
this.h = h;
}
//syntax sugar to start the chain
public static <T> HandlerChain<T> start(Handler<? super T> h){
return new HandlerChain<T>(h);
}
//add another handler to the chain
public HandlerChain<T> next(final Handler<? super T> handler){
return new HandlerChain<T>(new Handler<T>() {
@Override
public void handle(T obj) {
h.handle(obj);
handler.handle(obj);
}
});
}
}
Now let's make some handler factories for, say, String handlers:
public class Handlers {
public static Handler<String> h1(){
return new Handler<String>(){
@Override
public void handle(String obj) {
// do something
}
};
}
public static Handler<String> h2(){
return new Handler<String>(){
@Override
public void handle(String obj) {
// do something
}
};
}
}
So finally we make a class that handles some Strings using the two handlers in a chain:
public class Test {
public void doHandle(String obj){
HandlerChain.start(Handlers.h1()).next(Handlers.h2()).handle(obj);
}
}
So, to me there seemed to be nothing wrong with this code. The Eclipse IDE didn't mind either. It even ran it correctly. But when I tried to compile this code with Maven from the CLI, I got an error:
Test.java:[7,50] next(Handler<? super java.lang.Object>) in HandlerChain<java.lang.Object> cannot be applied to (Handler<java.lang.String>)
Has anyone stumbled upon similar problems? I would really like to know whether this kind of syntax is valid in Java, or whether this is some strange compiler bug due to a bad setup or something. To repeat: Eclipse compiles AND runs this code correctly, but the Maven CLI cannot compile it.
Finally, here are my maven-compiler-plugin settings in pom.xml. They might be relevant to the whole issue.
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.0</version>
<executions>
<execution>
<id>default-testCompile</id>
<phase>test-compile</phase>
<goals>
<goal>testCompile</goal>
</goals>
<configuration>
<encoding>UTF-8</encoding>
<source>1.6</source>
<target>1.6</target>
</configuration>
</execution>
<execution>
<id>default-compile</id>
<phase>compile</phase>
<goals>
<goal>compile</goal>
</goals>
<configuration>
<encoding>UTF-8</encoding>
<source>1.6</source>
<target>1.6</target>
</configuration>
</execution>
</executions>
<configuration>
<encoding>UTF-8</encoding>
<source>1.6</source>
<target>1.6</target>
</configuration>
</plugin>
Thanks in advance!
Had anyone stumbled upon similar problems?
Yes, and also when doing weird and unusual (probably incorrect) stuff with generic types.
The problem was that the Eclipse compiler doesn't report a compilation error on those strange constructs, while the standard javac from the Sun JDK complains about type erasure (this was with JDK 1.6; if I remember correctly, Eclipse reports only a warning).
My solution was to set up Maven to use the Eclipse compiler. The other (better) option was to fix the code, but since that was quite a complex task and since I didn't have any issue at runtime, I chose the "quick and dirty" first option.
Here is how to set up the compiler:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>${compiler.source}</source>
<target>${compiler.target}</target>
<encoding>${source.encoding}</encoding>
<fork>false</fork>
<compilerId>jdt</compilerId>
</configuration>
<dependencies>
<dependency>
<groupId>org.eclipse.tycho</groupId>
<artifactId>tycho-compiler-jdt</artifactId>
<version>0.13.0</version>
</dependency>
</dependencies>
</plugin>
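As an aside, the "fix the code" option mentioned above usually comes down to giving javac the type argument it fails to infer; a hedged sketch, not from the original posts:

public class Test {
    public void doHandle(String obj) {
        // the explicit type witness <String> keeps javac 6 from inferring Object for T
        HandlerChain.<String>start(Handlers.h1()).next(Handlers.h2()).handle(obj);
    }
}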
