Log4j : Creating/Modifying appenders at runtime, log file recreated and not appended - java

I want to create and enable an appender for a particular method call, myMethod(), whose log output is supposed to go to a file at "logFilePath".
I do not want to include this appender in the XML configuration file, so I decided to create it at runtime.
First, I tried modifying the logger properties at runtime and then calling activateOptions, e.g. setting the level to DEBUG before the call and to OFF in the finally block, so that output is logged only while the method is in use. That didn't work.
My problem now is that the appender recreates the file every time instead of appending to it, even though setAppend is true.
I am not very familiar with log4j, so please feel free to suggest an alternative approach.
The following is sample code to explain what I am trying.
private static FileAppender createNewAppender(String logFilePath) {
    FileAppender appender = new FileAppender();
    appender.setName("MyFileAppender");
    appender.setLayout(new PatternLayout("%d %-5p [%c{1}] %m%n"));
    appender.setFile(logFilePath);
    appender.setAppend(true);
    appender.setThreshold(Level.INFO);
    appender.activateOptions();
    Logger.getRootLogger().addAppender(appender);
    return appender;
}

private static void removeAppender() {
    Logger.getRootLogger().removeAppender(fileAppender); // or remove by name, "MyFileAppender"
}
I call the above methods in the following way:
private static FileAppender fileAppender = null;

private static void myMethod(String logFilePath) {
    try {
        fileAppender = createNewAppender(logFilePath); // note: the path must be passed in here
        someOperation();
    } finally {
        removeAppender();
        fileAppender = null;
    }
}

Very easy: just create a method and add this (note that the three-argument FileAppender constructor throws IOException, so declare or catch it):
String targetLog = "where ever you want your log";
FileAppender apndr = new FileAppender(new PatternLayout("%d %-5p [%c{1}] %m%n"), targetLog, true);
logger.addAppender(apndr);
logger.setLevel(Level.ALL);
Then in any method where you need to log, just do this:
logger.error("your error here");
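Putting that answer together with the question's structure, here is a minimal self-contained sketch (the class name, file path, and the someOperation() placeholder are mine, not from the original): the third constructor argument is the append flag, which is what keeps the file from being truncated between calls.

import java.io.IOException;
import org.apache.log4j.FileAppender;
import org.apache.log4j.Level;
import org.apache.log4j.Logger;
import org.apache.log4j.PatternLayout;

public class ScopedFileLogging {
    private static final Logger ROOT = Logger.getRootLogger();

    private static void myMethod(String logFilePath) throws IOException {
        // This constructor opens the file immediately; the third argument is "append".
        FileAppender appender = new FileAppender(
                new PatternLayout("%d %-5p [%c{1}] %m%n"), logFilePath, true);
        appender.setName("MyFileAppender");
        appender.setThreshold(Level.INFO);
        ROOT.addAppender(appender);
        try {
            ROOT.info("someOperation() starting"); // goes to logFilePath
            // someOperation();
        } finally {
            ROOT.removeAppender(appender); // detach so later logging skips this file
            appender.close();              // flush and release the file handle
        }
    }

    public static void main(String[] args) throws IOException {
        myMethod("my-method.log");
        myMethod("my-method.log"); // the second call appends instead of recreating the file
    }
}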

I do the following from Scala (basically the same):
I set my root logging level to TRACE but set the threshold on my global appenders to INFO.
# Root logger option
log4j.rootLogger=TRACE, file, stdout
# log messages to a log file
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.File=./log.log
log4j.appender.file.MaxFileSize=100MB
log4j.appender.file.MaxBackupIndex=1
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{HH:mm:ss} %m%n
log4j.appender.file.Threshold=INFO
# log messages to stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target=System.out
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{HH:mm:ss} %m%n
log4j.appender.stdout.Threshold=INFO
Then in the class I want to log:
private def set_debug_level(debug: String) {
  import org.apache.log4j._

  def create_appender(level: Level) {
    val console_appender = new ConsoleAppender()
    val pattern = "%d %p [%c,%C{1}] %m%n"
    console_appender.setLayout(new PatternLayout(pattern))
    console_appender.setThreshold(level)
    console_appender.activateOptions()
    Logger.getRootLogger().addAppender(console_appender)
  }

  debug match {
    case "TRACE" => create_appender(Level.TRACE)
    case "DEBUG" => create_appender(Level.DEBUG)
    case _ => // just ignore other levels
  }
}
So basically: because the root logger already allows TRACE, setting the threshold of my new appender to TRACE or DEBUG means it will actually receive and append those messages. If I set the root logger to a higher level instead, messages below that level would never be logged at all.

Related

log4j2 with pax-logging: can't use values from StructuredDataMessage

I'm using pax-logging-api along with pax-logging-log4j2 for logging from my OSGi bundles. I would like to utilize Log4J2's StructuredDataMessage (using EventLogger) to write some messages to a database. However, I'm unable to read the values I put into the StructuredDataMessage from the appenders when using Pax Logging.
The following works in a non-OSGi project using the Log4J2 libraries directly:
log4j2.properties:
appender.console.type = Console
appender.console.name = STDOUT
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = %m%n
appender.event.type = Console
appender.event.name = event
appender.event.layout.type = PatternLayout
appender.event.layout.pattern = %marker ${sd:id} ${sd:testKey} %n %m%n
rootLogger.level = debug
rootLogger.appenderRef.console.ref = STDOUT
logger.event.name = EventLogger
logger.event.level = debug
logger.event.appenderRef.console.ref = event
logger.event.additivity = false
Test.java:
public class Test {
    private static final Logger LOGGER = LogManager.getLogger(Test.class);

    public static void main(String[] args) {
        StructuredDataMessage msg = new StructuredDataMessage("1", "message", "event");
        msg.put("testKey", "testValue");
        LOGGER.info(msg);
        EventLogger.logEvent(msg);
    }
}
Output:
1 testValue event [1 testKey="testValue"] message
EVENT 1 testValue
event [1 testKey="testValue"] message
Note that the event appender properly dereferenced the sd keys from the StructuredDataMessage.
However, the following does not work in OSGi with pax-logging:
org.ops4j.pax.logging.cfg:
log4j2.appender.console.type = Console
log4j2.appender.console.name = STDOUT
log4j2.appender.console.layout.type = PatternLayout
log4j2.appender.console.layout.pattern = %m%n
log4j2.appender.event.type = Console
log4j2.appender.event.name = event
log4j2.appender.event.layout.type = PatternLayout
log4j2.appender.event.layout.pattern = %marker \$\\\{sd:id\} \$\\\{sd:testKey\} %n %m%n
log4j2.rootLogger.level = debug
log4j2.rootLogger.appenderRef.console.ref = STDOUT
log4j2.logger.event.name = EventLogger
log4j2.logger.event.level = debug
log4j2.logger.event.appenderRef.console.ref = event
log4j2.logger.event.additivity = false
Test.java:
public class Test implements BundleActivator {
    private static final Logger LOGGER = LogManager.getLogger(Test.class);

    @Override
    public void start(BundleContext context) throws Exception {
        StructuredDataMessage msg = new StructuredDataMessage("1", "message", "event");
        msg.put("testKey", "testValue");
        LOGGER.info(msg);
        EventLogger.logEvent(msg, Level.INFO);
    }

    @Override
    public void stop(BundleContext context) throws Exception {
    }
}
Output:
event [1 testKey="testValue"] message
EVENT ${sd:id} ${sd:testKey}
event [1 testKey="testValue"] message
Is there a trick to getting this to work in pax-logging? I am able to access values from the MDC using \$\\\{ctx:key\} when applicable, so I'm assuming the syntax is similar. I've also tried using the lookups in patterns for RoutingAppender, FileAppender, etc. to no avail.
Thanks in advance!
Edit: I'm using the latest version of pax-logging-api and pax-logging-log4j2 (1.11.3)
OK, it's not yet a definitive answer - a comment is simply too short to describe what happens.
The stack trace with your call is:
"pipe-restart 238#10666" prio=5 tid=0xc3 nid=NA runnable
java.lang.Thread.State: RUNNABLE
at org.ops4j.pax.logging.log4j2.internal.PaxLoggerImpl.doLog0(PaxLoggerImpl.java:354)
at org.ops4j.pax.logging.log4j2.internal.PaxLoggerImpl.doLog(PaxLoggerImpl.java:337)
at org.ops4j.pax.logging.log4j2.internal.PaxLoggerImpl.inform(PaxLoggerImpl.java:233)
at org.ops4j.pax.logging.internal.TrackingLogger.inform(TrackingLogger.java:209)
at org.ops4j.pax.logging.log4jv2.Log4jv2Logger.logMessage(Log4jv2Logger.java:162)
at org.apache.logging.log4j.spi.AbstractLogger.log(AbstractLogger.java:2102)
at org.apache.logging.log4j.spi.AbstractLogger.tryLogMessage(AbstractLogger.java:2190)
at org.apache.logging.log4j.spi.AbstractLogger.logMessageTrackRecursion(AbstractLogger.java:2144)
at org.apache.logging.log4j.spi.AbstractLogger.logMessageSafely(AbstractLogger.java:2127)
at org.apache.logging.log4j.spi.AbstractLogger.logIfEnabled(AbstractLogger.java:1828)
at org.apache.logging.log4j.EventLogger.logEvent(EventLogger.java:56)
at grgr.test.ActivatorLogging.start(ActivatorLogging.java:39)
...
org.ops4j.pax.logging.log4jv2.Log4jv2Logger.logMessage() is the bridge between the logging facade and the logging backend.
Remember - with pax-logging you can, say, use the Commons Logging facade with a Log4J1 backend, or the Log4j2 facade (which is what you're doing) with, e.g., a Logback backend.
That's why org.ops4j.pax.logging.log4jv2.Log4jv2Logger.logMessage() does this:
} else if (level.intLevel() >= Level.INFO.intLevel()) {
m_delegate.inform(paxMarker, message.getFormattedMessage(), t, fqcn);
and your structured message is flattened into the String event [1 testKey="testValue"] message.
Only then are the configured appenders called - and the appender whose layout extracts structured data can't find any, because the structured message has already been converted to a plain String.
These 3 lines:
at org.ops4j.pax.logging.log4j2.internal.PaxLoggerImpl.inform(PaxLoggerImpl.java:233)
at org.ops4j.pax.logging.internal.TrackingLogger.inform(TrackingLogger.java:209)
at org.ops4j.pax.logging.log4jv2.Log4jv2Logger.logMessage(Log4jv2Logger.java:162)
cross from pax-logging-api (facade) through TrackingLogger (bridge) to pax-logging-log4j2 (backend), losing the structured information along the way.
I've created https://ops4j1.jira.com/browse/PAXLOGGING-302 and hope to do something about it soon.
EDIT1
The key is that in org.apache.logging.log4j.core.lookup.StructuredDataLookup#lookup(), this condition is true:
if (event == null || !(event.getMessage() instanceof StructuredDataMessage)) {
    return null;
}
EDIT2
I've just fixed https://ops4j1.jira.com/browse/PAXLOGGING-302 and this test proves it works:
Logger logger = LogManager.getLogger("my.logger");
logger.info(new StructuredDataMessage("1", "hello!", "typeX").with("key1", "sd1"));
logger.info(new StringMapMessage().with("key1", "map1"));
List<String> lines = readLines();
assertTrue(lines.contains("my.logger/org.ops4j.pax.logging.it.Log4J2MessagesIntegrationTest typeX/sd1 [INFO] typeX [1 key1=\"sd1\"] hello!"));
assertTrue(lines.contains("my.logger/org.ops4j.pax.logging.it.Log4J2MessagesIntegrationTest ${sd:type}/map1 [INFO] key1=\"map1\""));
the configuration is:
log4j2.appender.console.type = Console
log4j2.appender.console.name = console
log4j2.appender.console.layout.type = PatternLayout
log4j2.appender.console.layout.pattern = %c/%C ${sd:type}/${map:key1} [%p] %m%n
log4j2.rootLogger.level = info
log4j2.rootLogger.appenderRef.file.ref = console
(you have to use the escape sequences you've used, if configuring via etc/org.ops4j.pax.logging.cfg in Karaf).

error when start up tomcat with log4j: log4j:ERROR Could not find value for key log4j.appender

I have two issues with log4j, and maybe one causes the other.
When I try to start up my Tomcat I get this error:
log4j:ERROR Could not find value for key log4j.appender.rollingFile
I also cannot insert the log lines into the DB.
log4j.properties file:
log4j.rootLogger=INFO, S, rollingFile, sql
log4j.appender.S =org.apache.log4j.ConsoleAppender
log4j.appender.S.layout =org.apache.log4j.PatternLayout
log4j.appender.S.layout.ConversionPattern = %d{dd MMM yyyy HH:mm:ss,SSS} %c{1} [%p] %m%n
log4j.appender.rollingFile = org.apache.log4j.DailyRollingFileAppender
log4j.appender.rollingFile.File =C:\\log.log
log4j.appender.rollingFile.Append = true
log4j.appender.rollingFile.MaxFileSize=2000KB
log4j.appender.rollingFile.MaxBackupIndex=19
log4j.appender.rollingFile.Threshold = ALL
log4j.appender.rollingFile.DatePattern = '.'yyy-MM-dd
log4j.appender.rollingFile.layout = org.apache.log4j.PatternLayout
log4j.appender.rollingFile.layout.ConversionPattern = %d{dd MMM yyyy HH:mm:ss,SSS} %c{1} [%p] %m%n
log4j.appender.sql=org.apache.log4j.jdbc.JDBCAppender
log4j.appender.sql.URL=jdbc:jtds:sqlserver://devsql/DBName
log4j.appender.sql.driver=net.sourceforge.jtds.jdbc.Driver
log4j.appender.sql.user=user_name
log4j.appender.sql.password=password
log4j.appender.sql.sql=INSERT INTO Log (logTypeId,logServiceId,hostId,logDateTime,logTitle,logModule,logDesc,logLineNumber,logStackTrace,userId) VALUES('%p','Project','172.0.12.123',now(),'%c{1}','%c{2}',N'%m','%L','%throwable{200}','%X{userId}')
log4j.appender.sql.layout=org.apache.log4j.PatternLayout
My class is:
import org.apache.log4j.Logger;

public class BasicApi {
    private static final Logger LOG = Logger.getLogger(BasicApi.class.getName());

    public ResponseEntity<String> runApi(InputBo input) {
        LOG.info("!!!!!!!!!!!!!trying1!!!!!!!!!!!!!!");
        LOG.debug("!!!!!!!!!!!!!trying2!!!!!!!!!!!!!!");
    }
}
the error is:
log4j:ERROR Could not find value for key log4j.appender.rollingFile
log4j:ERROR Could not instantiate appender named "rollingFile".
An important detail: this code is in a Maven JAR project that another project pulls in as a dependency.
Can you please help me figure out what's wrong with this code? Thanks

Can't see the prints from the map tasks

I have set mapred-site.xml [1] and log4j.properties [2] to show debug logs, and I have put System.out.println calls in my map class [3], but I don't see them getting printed. I have looked at the Hadoop logs and at the task logs, but I see nothing from the map tasks. I don't understand why this is happening. Any help?
[1] mapred-site.xml
~/Programs/hadoop$ cat etc/hadoop/mapred-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
(...)
<property> <name>mapreduce.map.log.level</name> <value>DEBUG</value> </property>
</configuration>
[2] log4j.properties
hadoop.root.logger=DEBUG,console
hadoop.log.dir=.
hadoop.log.file=hadoop.log
# Define the root logger to the system property "hadoop.root.logger".
log4j.rootLogger=${hadoop.root.logger}, EventCounter
# Logging Threshold
log4j.threshold=ALL
# Null Appender
log4j.appender.NullAppender=org.apache.log4j.varia.NullAppender
#
# Rolling File Appender - cap space usage at 5gb.
#
hadoop.log.maxfilesize=256MB
hadoop.log.maxbackupindex=20
log4j.appender.RFA=org.apache.log4j.RollingFileAppender
log4j.appender.RFA.File=${hadoop.log.dir}/${hadoop.log.file}
log4j.appender.RFA.MaxFileSize=${hadoop.log.maxfilesize}
log4j.appender.RFA.MaxBackupIndex=${hadoop.log.maxbackupindex}
log4j.appender.RFA.layout=org.apache.log4j.PatternLayout
# Pattern format: Date LogLevel LoggerName LogMessage
log4j.appender.RFA.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
# Debugging Pattern format
#log4j.appender.RFA.layout.ConversionPattern=%d{ISO8601} %-5p %c{2} (%F:%M(%L)) - %m%n
#
# Daily Rolling File Appender
#
log4j.appender.DRFA=org.apache.log4j.DailyRollingFileAppender
log4j.appender.DRFA.File=${hadoop.log.dir}/${hadoop.log.file}
# Rollover at midnight
log4j.appender.DRFA.DatePattern=.yyyy-MM-dd
# 30-day backup
#log4j.appender.DRFA.MaxBackupIndex=30
log4j.appender.DRFA.layout=org.apache.log4j.PatternLayout
# Pattern format: Date LogLevel LoggerName LogMessage
log4j.appender.DRFA.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
# Debugging Pattern format
#log4j.appender.DRFA.layout.ConversionPattern=%d{ISO8601} %-5p %c{2} (%F:%M(%L)) - %m%n
#
# console
# Add "console" to rootlogger above if you want to use this
#
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{2}: %m%n
#
# TaskLog Appender
#
#Default values
hadoop.tasklog.taskid=null
hadoop.tasklog.iscleanup=false
hadoop.tasklog.noKeepSplits=4
hadoop.tasklog.totalLogFileSize=100
hadoop.tasklog.purgeLogSplits=true
hadoop.tasklog.logsRetainHours=12
log4j.appender.TLA=org.apache.hadoop.mapred.TaskLogAppender
log4j.appender.TLA.taskId=${hadoop.tasklog.taskid}
log4j.appender.TLA.isCleanup=${hadoop.tasklog.iscleanup}
log4j.appender.TLA.totalLogFileSize=${hadoop.tasklog.totalLogFileSize}
log4j.appender.TLA.layout=org.apache.log4j.PatternLayout
log4j.appender.TLA.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
#
# HDFS block state change log from block manager
#
# Uncomment the following to suppress normal block state change
# messages from BlockManager in NameNode.
#log4j.logger.BlockStateChange=WARN
#
#Security appender
#
hadoop.security.logger=INFO,NullAppender
hadoop.security.log.maxfilesize=256MB
hadoop.security.log.maxbackupindex=20
log4j.category.SecurityLogger=${hadoop.security.logger}
hadoop.security.log.file=SecurityAuth-${user.name}.audit
log4j.appender.RFAS=org.apache.log4j.RollingFileAppender
log4j.appender.RFAS.File=${hadoop.log.dir}/${hadoop.security.log.file}
log4j.appender.RFAS.layout=org.apache.log4j.PatternLayout
log4j.appender.RFAS.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
log4j.appender.RFAS.MaxFileSize=${hadoop.security.log.maxfilesize}
log4j.appender.RFAS.MaxBackupIndex=${hadoop.security.log.maxbackupindex}
#
# Daily Rolling Security appender
#
log4j.appender.DRFAS=org.apache.log4j.DailyRollingFileAppender
log4j.appender.DRFAS.File=${hadoop.log.dir}/${hadoop.security.log.file}
log4j.appender.DRFAS.layout=org.apache.log4j.PatternLayout
log4j.appender.DRFAS.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
log4j.appender.DRFAS.DatePattern=.yyyy-MM-dd
#
# hadoop configuration logging
#
# Uncomment the following line to turn off configuration deprecation warnings.
# log4j.logger.org.apache.hadoop.conf.Configuration.deprecation=WARN
#
# hdfs audit logging
#
hdfs.audit.logger=INFO,NullAppender
hdfs.audit.log.maxfilesize=256MB
hdfs.audit.log.maxbackupindex=20
log4j.logger.org.apache.hadoop.hdfs.server.namenode.FSNamesystem.audit=${hdfs.audit.logger}
log4j.additivity.org.apache.hadoop.hdfs.server.namenode.FSNamesystem.audit=false
log4j.appender.RFAAUDIT=org.apache.log4j.RollingFileAppender
log4j.appender.RFAAUDIT.File=${hadoop.log.dir}/hdfs-audit.log
log4j.appender.RFAAUDIT.layout=org.apache.log4j.PatternLayout
log4j.appender.RFAAUDIT.layout.ConversionPattern=%d{ISO8601} %p %c{2}: %m%n
log4j.appender.RFAAUDIT.MaxFileSize=${hdfs.audit.log.maxfilesize}
log4j.appender.RFAAUDIT.MaxBackupIndex=${hdfs.audit.log.maxbackupindex}
#
# mapred audit logging
#
mapred.audit.logger=INFO,NullAppender
mapred.audit.log.maxfilesize=256MB
mapred.audit.log.maxbackupindex=20
log4j.logger.org.apache.hadoop.mapred.AuditLogger=${mapred.audit.logger}
log4j.additivity.org.apache.hadoop.mapred.AuditLogger=false
log4j.appender.MRAUDIT=org.apache.log4j.RollingFileAppender
log4j.appender.MRAUDIT.File=${hadoop.log.dir}/mapred-audit.log
log4j.appender.MRAUDIT.layout=org.apache.log4j.PatternLayout
log4j.appender.MRAUDIT.layout.ConversionPattern=%d{ISO8601} %p %c{2}: %m%n
log4j.appender.MRAUDIT.MaxFileSize=${mapred.audit.log.maxfilesize}
log4j.appender.MRAUDIT.MaxBackupIndex=${mapred.audit.log.maxbackupindex}
# Custom Logging levels
log4j.logger.org.apache.hadoop=DEBUG
# Jets3t library
log4j.logger.org.jets3t.service.impl.rest.httpclient.RestS3Service=ERROR
# AWS SDK & S3A FileSystem
log4j.logger.com.amazonaws=ERROR
log4j.logger.com.amazonaws.http.AmazonHttpClient=ERROR
log4j.logger.org.apache.hadoop.fs.s3a.S3AFileSystem=WARN
#
# Event Counter Appender
# Sends counts of logging messages at different severity levels to Hadoop Metrics.
#
log4j.appender.EventCounter=org.apache.hadoop.log.metrics.EventCounter
#
# Job Summary Appender
#
# Use following logger to send summary to separate file defined by
# hadoop.mapreduce.jobsummary.log.file :
# hadoop.mapreduce.jobsummary.logger=DEBUG,JSA
#
hadoop.mapreduce.jobsummary.logger=${hadoop.root.logger}
hadoop.mapreduce.jobsummary.log.file=hadoop-mapreduce.jobsummary.log
hadoop.mapreduce.jobsummary.log.maxfilesize=256MB
hadoop.mapreduce.jobsummary.log.maxbackupindex=20
log4j.appender.JSA=org.apache.log4j.RollingFileAppender
log4j.appender.JSA.File=${hadoop.log.dir}/${hadoop.mapreduce.jobsummary.log.file}
log4j.appender.JSA.MaxFileSize=${hadoop.mapreduce.jobsummary.log.maxfilesize}
log4j.appender.JSA.MaxBackupIndex=${hadoop.mapreduce.jobsummary.log.maxbackupindex}
log4j.appender.JSA.layout=org.apache.log4j.PatternLayout
log4j.appender.JSA.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{2}: %m%n
log4j.logger.org.apache.hadoop.mapred.JobInProgress$JobSummary=${hadoop.mapreduce.jobsummary.logger}
log4j.additivity.org.apache.hadoop.mapred.JobInProgress$JobSummary=false
log4j.logger.org.apache.hadoop.yarn.server.resourcemanager.RMAppManager$ApplicationSummary=${yarn.server.resourcemanager.appsummary.logger}
log4j.additivity.org.apache.hadoop.yarn.server.resourcemanager.RMAppManager$ApplicationSummary=false
log4j.appender.RMSUMMARY=org.apache.log4j.RollingFileAppender
log4j.appender.RMSUMMARY.File=${hadoop.log.dir}/${yarn.server.resourcemanager.appsummary.log.file}
log4j.appender.RMSUMMARY.MaxFileSize=256MB
log4j.appender.RMSUMMARY.MaxBackupIndex=20
log4j.appender.RMSUMMARY.layout=org.apache.log4j.PatternLayout
log4j.appender.RMSUMMARY.layout.ConversionPattern=%d{ISO8601} %p %c{2}: %m%n
[3] My map class:
static abstract class SampleMapBase extends Mapper<Text, Text, Text, Text> {
    private long total;
    private long kept = 0;
    private float keep;

    protected void setKeep(float keep) {
        this.keep = keep;
    }

    protected void emit(Text key, Text val, OutputCollector<Text, Text> out)
            throws IOException {
        System.out.println("Key: " + key.toString() + " Value: " + val.toString());
        System.out.println("Key: " + key.getClass() + " Value: " + val.getClass());
        ++total;
        while ((float) kept / total < keep) {
            ++kept;
            out.collect(key, val);
        }
    }
}

public static class MySampleMapper extends SampleMapBase {
    String inputFile;

    public void setup(Context context) {
        this.inputFile = ((FileSplit) context.getInputSplit()).getPath().toString();
        setKeep(context.getConfiguration().getFloat("hadoop.sort.map.keep.percent", (float) 100.0) / (float) 100.0);
    }

    public void map(Text key, Text val, OutputCollector<Text, Text> output, Reporter reporter)
            throws IOException {
        emit(key, val, output);
        System.out.println("EMIT");
    }

    public static void main(String[] args) throws Exception {
        GenericOptionsParser parser = new GenericOptionsParser(new Configuration(), args);
        String[] otherArgs = parser.getRemainingArgs();
        if (otherArgs.length < 2) {
            System.err.println("Usage: webdatascan -Dfile.path=<path> [<in>...] <out>");
            System.exit(2);
        }
        System.setProperty("file.path", parser.getConfiguration().get("file.path"));
        List<String> remoteHosts = new ArrayList<String>(); // directory where the map output is remotely
        remoteHosts.add(Inet4Address.getLocalHost().getHostAddress());

        // first map tasks
        JobConf conf = MyWebDataScan.addJob(1, true, true);
        Job job = new Job(conf, conf.getJobName());
        job.setJarByClass(MyWebDataScan.class);
        job.setInputFormatClass(org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat.class);
        job.setOutputFormatClass(org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat.class);
        job.setMapperClass(MySampleMapper.class);
        job.setReducerClass(MySampleReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);

        Path[] inputPaths = new Path[otherArgs.length - 1];
        for (int i = 0; i < otherArgs.length - 1; ++i) {
            inputPaths[i] = new Path(otherArgs[i]);
        }
        Path outputPath = new Path(otherArgs[otherArgs.length - 1]);
        FileInputFormat.setInputPaths(conf, inputPaths);
        FileOutputFormat.setOutputPath(conf, outputPath);
        job.getConfiguration().set("mapred.output.dir", outputPath.toString());
        JobClient.runJob(job);
    }
You have mixed up the old API and the new API. Your input and output format classes come from the new API, but your map function is written against the old API. If you have imported org.apache.hadoop.mapreduce.Mapper, then the map function should have a signature like this:
private static class RecordMapper extends Mapper<LongWritable, Text, Text, LongWritable> {
    public void map(LongWritable lineOffset, Text record, Context output)
            throws IOException, InterruptedException {
        output.write(new Text("Count"), new LongWritable(1));
    }
}
Since your map function looks like the following instead, it will never be invoked; the framework falls back to the default (identity) map function, as explained in this video:
public void map(Text key, Text val, OutputCollector<Text, Text> output, Reporter reporter)
        throws IOException
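For concreteness, here is a hedged sketch of what the asker's mapper could look like once moved fully to the new API (keeping the Text/Text types from the question and dropping the sampling logic for brevity):

import java.io.IOException;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class MySampleMapper extends Mapper<Text, Text, Text, Text> {
    @Override
    protected void map(Text key, Text val, Context context)
            throws IOException, InterruptedException {
        // With the new-API signature this override IS invoked,
        // so the println shows up in the task's stdout log.
        System.out.println("Key: " + key + " Value: " + val);
        context.write(key, val);
    }
}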

Set messages pattern in Log4J

I have this Log4J configuration, which writes messages to a log file:
log = Logger.getLogger(LogMessages.class.getName());
BasicConfigurator.configure(); // Basic configuration for Log4J 1.x

ConsoleAppender console = new ConsoleAppender(); // create appender
// configure the appender
String PATTERN = "%d{DATE} [%p|%c|%C{1}] %m%n";
console.setLayout(new PatternLayout(PATTERN));
console.setThreshold(Level.FATAL);
console.activateOptions();
// add appender to any Logger (here is root)
Logger.getRootLogger().addAppender(console);

DailyRollingFileAppender fa = new DailyRollingFileAppender();
fa.setName("FileLogger");
fa.setFile("log" + File.separator + "messages.log");
fa.setDatePattern("'.'yyyy-MM-dd");
fa.setLayout(new PatternLayout("%d %-5p [%c{1}] %m%n"));
fa.setThreshold(Level.DEBUG);
fa.setAppend(true);
fa.activateOptions();
// add appender to any Logger (here is root)
Logger.getRootLogger().addAppender(fa); // repeat with all other desired appenders
This is the generated output:
2013-12-26 10:19:27,501 WARN [LogMessages] test_message
I would like to generate message like this:
2013-12-26 10:19:27 WARN test_message
How I can remove [LogMessages] from the messages?
For the file logging, change the setLayout invocation to:
fa.setLayout(new PatternLayout("%d{yyyy-MM-dd HH:mm:ss} %-5p %m%n"));
For the console logging, use the same pattern:
String PATTERN = "%d{yyyy-MM-dd HH:mm:ss} %-5p %m%n";
Dropping %c{1} removes the [LogMessages] logger name, and the explicit date format trims the milliseconds, which matches your desired output.

log4j log problem

I am writing a servlet.
I have several classes, and I want the logs of some of them to be separated from each other.
Here is the log4j configuration file:
log4j.rootLogger=INFO, CONSOLE, SearchPdfBill, Scheduler
# CONSOLE is set to be a ConsoleAppender.
log4j.appender.CONSOLE=org.apache.log4j.ConsoleAppender
log4j.appender.CONSOLE.layout=org.apache.log4j.PatternLayout
log4j.appender.CONSOLE.layout.ConversionPattern=<%p> %c{1}.%t %d{HH:mm:ss} - %m%n
# LOGFILE is set to be a file
log4j.appender.SearchPdfBill=org.apache.log4j.DailyRollingFileAppender
log4j.appender.SearchPdfBill.File = /bps/app/BpsPdfBill/BpsPdfBill.ear/BpsPdfBill.war/WEB-INF/logs/BpsPdfBill.log
#log4j.appender.SearchPdfBill.File = E:\\Workspace\\Eclipse_Workspace\\BpsPdfBill\\log\\BpsPdfBill.log
log4j.appender.SearchPdfBill.Append = true
log4j.appender.SearchPdfBill.DatePattern = '.'yyyy-MM-dd
log4j.appender.SearchPdfBill.layout=org.apache.log4j.PatternLayout
log4j.appender.SearchPdfBill.layout.ConversionPattern=<%p> %c{1}.%t %d{HH:mm:ss} - %m%n
# LOGFILE is set to be a file
log4j.appender.Scheduler=org.apache.log4j.DailyRollingFileAppender
log4j.appender.Scheduler.File = /bps/app/BpsPdfBill/BpsPdfBill.ear/BpsPdfBill.war/WEB-INF/logs/Schedule.log
#log4j.appender.Scheduler.File = E:\\Workspace\\Eclipse_Workspace\\BpsPdfBill\\log\\BpsPdfBill.log
log4j.appender.Scheduler.Append = true
log4j.appender.Scheduler.DatePattern = '.'yyyy-MM-dd
log4j.appender.Scheduler.layout=org.apache.log4j.PatternLayout
log4j.appender.Scheduler.layout.ConversionPattern=<%p> %c{1}.%t %d{HH:mm:ss} - %m%n
I set up a logger here:
String logDir = conf.getInitParameter("log_file_path");
if (logDir == null) {
    initErrMsg = "Param - log_file_path cannot be empty";
    throw new ServletException(initErrMsg);
}
if ((logger = Logger.getLogger(SearchPdfBill.class)) != null) {
    //writeLog("Initializing log4j.");
    conf.getServletContext().log("Log4j initialized.");
} else {
    conf.getServletContext().log("Cannot initialize log4j properly.");
}
And another logger here:
logDir = sc.getInitParameter("log_file_path");
if (logDir == null) {
    initErrMsg = "Param - log_file_path cannot be empty";
    try {
        throw new ServletException(initErrMsg);
    } catch (ServletException e) {
        // TODO Auto-generated catch block
        conditionalWriteLog(logEnabled, e.getMessage());
    }
}
if ((logger = Logger.getLogger(Scheduler.class)) != null) {
    //writeLog("Initializing log4j.");
    conditionalWriteLog(logEnabled, "Log4j initialized.");
} else {
    conditionalWriteLog(logEnabled, "Cannot initialize log4j properly.");
}
However, it ends up that the two loggers log the same thing: every message is written to both files identically. Why?
I think the configuration file is probably wrong, but I don't know where. Can somebody help me correct it?
You need to define those two loggers in your configuration file. Are those objects within a package? That is important to the way you will configure them. Say the package is com.gunbuster:
log4j.category.com.gunbuster.SearchPdfBill=INFO, SearchPdfBill
log4j.additivity.com.gunbuster.SearchPdfBill=false
log4j.category.com.gunbuster.Scheduler=INFO, Scheduler
log4j.additivity.com.gunbuster.Scheduler=false
The additivity setting prevents those loggers from also sending their output to the rootLogger's CONSOLE appender. Also, you should make the first line:
log4j.rootLogger=INFO, CONSOLE
so that the rootLogger does not write entries into those files.
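Putting the answer together, the relevant parts of the corrected configuration would look something like this (assuming the hypothetical com.gunbuster package from above; the existing CONSOLE, SearchPdfBill, and Scheduler appender definitions stay as they are):
log4j.rootLogger=INFO, CONSOLE

# Per-class loggers, each bound to its own appender only
log4j.category.com.gunbuster.SearchPdfBill=INFO, SearchPdfBill
log4j.additivity.com.gunbuster.SearchPdfBill=false
log4j.category.com.gunbuster.Scheduler=INFO, Scheduler
log4j.additivity.com.gunbuster.Scheduler=false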
