I am trying to deserialize an incoming PUT request with a JSON request body using the org.codehaus.jackson package, and I receive the error message "The request sent by the client was syntactically incorrect." How can I get more detailed log/error messages in my Pivotal tc Server logs, e.g. in catalina.log?
I have added the following line to logging.properties:
org.codehaus.level = FINEST
But no messages from org.codehaus are displayed in my log, although the error message is displayed on the web page. Maybe Jackson does not support Java logging and I should configure Log4j or another similar logging facility?
My Jackson version is 1.9.13, and I am using Pivotal tc Server from Spring Tool Suite (3.8).
From what you say, it seems you are trying to change Tomcat's logging.properties.
That is usually a bad idea, as you may want different logging for the different web apps deployed in the same Tomcat server.
What you should do instead is configure log4j in your project.
Java projects usually define a "resources" folder. Add a file there called
log4j.properties
with the following content:
log4j.rootLogger=ERROR,stdout
# Logger for jackson lib
log4j.logger.org.codehaus=TRACE
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%p\t%d{ISO8601}\t%r\t%c\t[%t]\t%m%n
This is adapted from the default log4j configuration and logs to standard output, which Tomcat redirects to the catalina.out log file.
You may want to read the documentation at https://docs.oracle.com/cd/E29578_01/webhelp/cas_webcrawler/src/cwcg_config_log4j_file.html
which explains how to redirect the logging to a different file and how to use rolling appenders so you keep some history.
Hopefully this will work!
The request sent by the client was syntactically incorrect
^^^ This message is created by the servlet layer processing the client request, not by the Jackson mapper.
You can find the related log messages under the Spring MVC logger name: org.springframework.web.
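For example, with the log4j.properties approach from the other answer, you could raise that logger's level in the same file (the logger name is from Spring MVC; the DEBUG level is a suggestion and can be tuned):

```properties
# Surface the cause behind "The request sent by the client was
# syntactically incorrect" (request binding / message conversion errors)
log4j.logger.org.springframework.web=DEBUG
```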
I am running a Java application in GKE and monitoring logs in the Logs Explorer. The application writes logs to stdout, and as far as I understand, the GKE agent parses them and sends them to the Logs Explorer. What I found is that the Logs Explorer shows WARN and ERROR messages with severity INFO.
I figured out that I cannot change the default log parser, so I configured Logback to emit Java logs in a JSON format suitable for GCP (I used the implementation from this answer); here is an example:
{"message":"2022-02-17 12:42:05.000 [QuartzScheduler_Worker-8] DEBUG some debug message","timestamp":{"seconds":1645101725,"nanos":0},"thread":"QuartzScheduler_Worker-8","severity":"DEBUG"}
{"message":"2022-02-17 12:42:05.008 [QuartzScheduler_Worker-8] INFO some info message","timestamp":{"seconds":1645101725,"nanos":8000000},"thread":"QuartzScheduler_Worker-8","severity":"INFO"}
{"message":"2022-02-17 12:42:05.009 [QuartzScheduler_Worker-8] ERROR some error message","timestamp":{"seconds":1645101725,"nanos":9000000},"thread":"QuartzScheduler_Worker-8","severity":"ERROR"}
But it didn't help at all.
Please point out where I am wrong with the JSON format, or whether I need to configure something additional on the GCP side. I have checked the official documentation on the structured-log JSON format, and I don't understand what I am missing.
According to the documentation (link 1 & link 2):
Severities: By default, logs written to the standard output are on the INFO level and logs written to the standard error are on the ERROR level. Structured logs can include a severity field, which defines the log's severity.
If you're using Google Kubernetes Engine or the App Engine flexible environment, you can write structured logs as JSON objects serialized on a single line to stdout or stderr. The Logging agent then sends the structured logs to Cloud Logging as the jsonPayload of the LogEntry structure.
If the manual implementation is not working, you may try to:
Directly send logs to the Cloud Logging API
Use this official Java Logback library (note: it is currently a work in progress)
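To illustrate the documented behavior, here is a minimal sketch (not using Logback) of the shape of a structured line: one JSON object per stdout line, where "severity" and "message" are the field names the Logging agent recognizes. The class name is illustrative, and real messages would need JSON escaping, which is omitted here for brevity:

```java
public class StructuredLog {
    // Build one structured log line. "severity" and "message" are the
    // special field names Cloud Logging picks up from jsonPayload;
    // other fields (timestamp, thread, ...) are optional.
    static String format(String severity, String message) {
        // NOTE: no JSON escaping - illustrative only; "message" must not
        // contain quotes or backslashes in this sketch.
        return String.format(
            "{\"severity\":\"%s\",\"message\":\"%s\"}", severity, message);
    }

    public static void main(String[] args) {
        // One serialized JSON object per line on stdout is what the
        // GKE logging agent expects for structured logs.
        System.out.println(format("ERROR", "some error message"));
        System.out.println(format("DEBUG", "some debug message"));
    }
}
```

If lines like these still show up as INFO, the likely cause is that the agent is not treating stdout as structured JSON at all (e.g. a prefix before the `{` breaks parsing), rather than the field names being wrong.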
I am a Java newbie, so please excuse any incorrect terminology.
My Java application uses SLF4J, which I have configured to use the SimpleLogger implementation and to redirect log messages to a file.
The application uses the Jersey framework, which depends on Grizzly, which, as far as I can tell, uses the standard java.util.logging framework.
When I run the application, the output from my log statements appears in the log file as expected, but the output from Grizzly's log statements appears in the console in red.
Ideally, I would like to redirect the Grizzly log output to my file. Is this possible?
Using this answer I was able to redirect the Grizzly output to a different file, and I can turn it off altogether with:
Logger.getLogger("").setLevel(Level.OFF);
A similar question was asked here, but the only answer provided is not helpful and was not accepted.
Can anyone help?
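One plain-JDK sketch of the redirect described above: replace the JUL root logger's default console handler with a FileHandler, which every java.util.logging logger (including Grizzly's) inherits. The class and file names are illustrative; the more integrated alternative is the jul-to-slf4j bridge (SLF4JBridgeHandler), which routes JUL records into SLF4J itself, but that requires the extra jul-to-slf4j dependency:

```java
import java.io.IOException;
import java.util.logging.FileHandler;
import java.util.logging.Handler;
import java.util.logging.Logger;
import java.util.logging.SimpleFormatter;

public class RedirectJul {
    // Route all java.util.logging output (including Grizzly's) to a file.
    public static void redirectToFile(String fileName) {
        Logger root = Logger.getLogger("");        // the JUL root logger
        for (Handler h : root.getHandlers()) {
            root.removeHandler(h);                 // drop the default console handler (the red output)
        }
        try {
            FileHandler fh = new FileHandler(fileName, true);
            fh.setFormatter(new SimpleFormatter());
            root.addHandler(fh);                   // all JUL loggers now inherit this handler
        } catch (IOException e) {
            throw new RuntimeException("could not open log file", e);
        }
    }

    public static void main(String[] args) {
        // Call this once, early in startup, before Grizzly logs anything.
        redirectToFile("grizzly.log");
        Logger.getLogger("org.glassfish.grizzly").info("now written to grizzly.log");
    }
}
```

This writes Grizzly's output to its own file rather than merging it with the SimpleLogger file; merging the two streams is exactly what the jul-to-slf4j bridge is for.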
I have a web application which uses log4j, with slf4j as a facade for logging. Logging is configured not from a log4j.properties file, but from Java code.
Application is running on three different machines using Websphere Application Server.
The issue is that on two instances logging works as expected, but on the third one nothing is written to my log file. Output goes to SystemErr.log instead (with messages at ERROR and INFO levels).
Initially I thought it was something in the WebSphere server configuration, but later I found this link http://www.webagesolutions.com/knowledgebase/waskb/waskb004/ which says this situation can occur when log4j.properties cannot be read.
But I am not using a properties file, and the configuration from Java code works fine on the other two instances.
Any ideas what the issue could be?
Thank you in advance!
Please make sure that no alternative SLF4J binding (such as slf4j-simple) exists on the classpath.
I am developing a client API with a specific requirement to log the client-API-specific messages to a separate file. That part is pretty straightforward: I created an appender and associated it with a logger specific to my package.
Now the question is:
What happens if the client application has its own log4j.xml? How will my appender and logger work in that environment?
The log4j initialization process handles only one configuration file, so that file should contain all logging configuration that should be active. You probably have to define a specific configuration for the client application that contains both your logging configuration and the client application's.
The client application then has to be configured to use this file for initialization. This is done by setting the log4j.configuration system property, as described in the log4j manual (assuming you are using log4j 1).
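For example, on the client application's launch command (the path and jar name are placeholders):

```shell
# log4j 1.x reads this system property at startup; note the file: URL syntax
java -Dlog4j.configuration=file:/path/to/combined-log4j.properties -jar client-app.jar
```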
I am monitoring more than one Tomcat log file using the TailListener API in Java. Now I want to determine which Tomcat instance the log messages are coming from.
Is it possible to add the log file name to the log messages?
Thanks in advance.
You can use log4j instead of the standard java.util.logging that Tomcat uses.
That will allow you to set a log file per Tomcat instance, and much more.
See Tomcat Log4J Logging.
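A hedged sketch of what that could look like in log4j 1.x properties form: one file per instance, with an instance name stamped into every line. The tomcat.instance system property is an assumption here, something you would set per instance yourself (e.g. -Dtomcat.instance=tomcat1 in each instance's setenv script); log4j 1.x substitutes ${...} system properties when it parses the configuration:

```properties
# One log file per Tomcat instance, named after the instance
log4j.rootLogger=INFO, file
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.File=${catalina.base}/logs/${tomcat.instance}.log
log4j.appender.file.MaxFileSize=10MB
log4j.appender.file.MaxBackupIndex=5
log4j.appender.file.layout=org.apache.log4j.PatternLayout
# The [${tomcat.instance}] prefix identifies the source in merged/tailed output
log4j.appender.file.layout.ConversionPattern=[${tomcat.instance}] %p %d{ISO8601} %c - %m%n
```

With this, the tailed lines themselves carry the instance name, so the TailListener code does not need to track which file a line came from.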