Weird output in catalina.out (Tomcat 9) - java

I'm using Tomcat 9 as a server, with catalina.out as the destination for my logger output (System.out and System.err). Every time I open and refresh the catalina.out file, it shows weird output (as in the picture below), and that output keeps growing to more than a million characters, which makes catalina.out very slow to open. Once all of this weird content has loaded, the logger output I actually need appears at the bottom.
Opening catalina.out in Notepad++ gives the output below:
I expect the log not to contain this weird, annoying output.

Guess you need some formatting here; that garbage is a run of NUL bytes. Just follow these steps:
Open your file in Notepad++.
Press Ctrl+A (select all).
Press Ctrl+H (replace). In 'Find what', type '\x00'.
Leave 'Replace with' blank and set 'Search Mode' to 'Extended'.
Then click 'Replace All'.
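If the file is too large for Notepad++ to handle comfortably, the same cleanup can be done from a shell. A minimal sketch, assuming a Unix-like environment (it writes the cleaned copy to a new file rather than editing in place):
# strip NUL bytes (\000) from the log into a clean copy
tr -d '\000' < catalina.out > catalina.clean.out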

I think the encoding of the log is wrong.
You should check the logging configuration of Apache Tomcat
(see http://tomcat.apache.org/tomcat-9.0-doc/logging.html).
Hope it helps.
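If it really is an encoding problem, the handler encoding can be set in conf/logging.properties. A minimal sketch, assuming Tomcat 9's stock handler names (the 1catalina. prefix comes from the default configuration):
# force UTF-8 output for the catalina log and the console
1catalina.org.apache.juli.AsyncFileHandler.encoding = UTF-8
java.util.logging.ConsoleHandler.encoding = UTF-8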

I just found out that the weird output only happens when I delete all of the startup logger output from catalina.out. Most likely Tomcat still has the file open and keeps writing at its old offset, so the gap left by the deleted content gets filled with NUL bytes.
Example of the startup logger output:
.
.
{some logger}
08-Aug-2019 15:15:22.692 INFO [main] org.apache.catalina.startup.Catalina.start Server startup in [33,192] milliseconds
If I just leave it in place, the weird output doesn't come back.
So I will close this issue.

Related

Change jena-fuseki logging level

There's very little information on jena-fuseki logging in the Apache documentation; I found just this: https://jena.apache.org/documentation/fuseki2/fuseki-logging.html
My jena-fuseki is running in an rkt container, started by the command
./fuseki-server ...
I need to change the logging level from INFO to at least WARN, or perhaps ERROR.
I tried this on a test server (i.e. outside the rkt container) by changing all occurrences of "INFO" to "WARN", first in the file <directory of fuseki-server>/webapp/log4j2.properties and then in the file <directory of fuseki-server>/log4j2.properties, but it did not help.
So, how should I change the logging level?
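For illustration, such an edit in log4j2.properties would look roughly like the sketch below; the logger name is an assumption based on a typical Fuseki setup, not taken from the question:
# raise the root level so only warnings and errors get through
rootLogger.level = WARN
# Fuseki's own logger (assumed name)
logger.fuseki.name = org.apache.jena.fuseki.Fuseki
logger.fuseki.level = WARN
If the bundled files are being ignored, pointing the JVM at an explicit configuration with the standard log4j2 system property -Dlog4j.configurationFile=/path/to/log4j2.properties is another option.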

JMeter can't read custom search_paths directory

I want JMeter to find a jar in lib/ext/custom.
In my jmeter.properties:
search_paths=lib/ext/custom
When I run the test, I get this output:
2019-06-25 10:21:54,792 INFO o.a.j.JMeter: search_paths=lib/ext/custom
2019-06-25 10:21:54,792 WARN o.a.j.JMeter: Can't read lib/ext/custom
Does anyone have an idea why it can't read that directory? It has the same owner as all the other directories/files and the same permissions as lib/ext itself.
I turned the root logger up to DEBUG but got no information beyond the log messages above.
The answer was simple: it could not read the directory because it could not find it; the value I provided wasn't correct.
search_paths=../lib/ext/custom
It needed that step back up to resolve the directory correctly, presumably because relative entries in search_paths are resolved against the directory JMeter is launched from rather than against JMETER_HOME. Using the full path also worked.
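For example, an absolute entry sidesteps the relative-path question entirely (the /opt/jmeter prefix here is just an assumed install location):
search_paths=/opt/jmeter/lib/ext/custom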

Java: Using JVM argument -XX:ErrorFile and appending the logs to an existing log file without the PID

I have following configuration for my service
exec java -Djava.io.tmpdir=$tmpdir -Djava.library.path="Some_Path" \
    -Xmx"$heapsize"m -XX:+UseConcMarkSweepGC -XX:OnOutOfMemoryError="Do something, may be restart" \
    -XX:ErrorFile=/var/log/service/myService/"myServiceCrash".log -jar .jar
I am not able to append the crash logs to the same file; instead, a new file with a new PID in its name is created every time.
Requirement: dump the crash logs into the same file.
This is expected behavior. The first time, the JVM writes to the file given in -XX:ErrorFile=; once that file exists, it won't be overwritten, and you get the default error file (hs_err_pid<pid>.log) instead.
Ideally there would be some way to surface the fact that the file creation failed, but that can't be done as part of the error-handling code.
Please check the evaluation here: https://bugs.openjdk.java.net/browse/JDK-8189672
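Note that -XX:ErrorFile supports %p (expanded to the PID) and %% substitutions, so if a unique file per crash is acceptable, you can at least keep the dumps in your own log directory with a predictable pattern; a sketch:
-XX:ErrorFile=/var/log/service/myService/myServiceCrash_%p.log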

tomcat jdbc SlowQueryReport interceptors - log in a separate file

I am trying to log all my slow queries in a separate file. So far I have the following Tomcat context configuration:
<Context>
  <Resource name="jdbc/paymentDB" auth="Container" type="org.apache.tomcat.jdbc.pool.DataSource"
            driverClassName="...oracle..."
            ...
            jdbcInterceptors="org.apache.tomcat.jdbc.pool.interceptor.QueryTimeoutInterceptor(queryTimeout=2);org.apache.tomcat.jdbc.pool.interceptor.SlowQueryReport(threshold=1000,maxQueries=200)" />
</Context>
This works as long as I do not configure any other logger, and it prints to the console. One thing I should add is that I run this test in IntelliJ IDEA using the default IDE configuration.
The next thing I wanted to do was log to a separate file. So I opened logging.properties and made the following changes:
handlers = ..., 5slowqueries.org.apache.juli.FileHandler, ...
.handlers =..., 5slowqueries.org.apache.juli.FileHandler, ...
5slowqueries.org.apache.juli.FileHandler.level = ALL
5slowqueries.org.apache.juli.FileHandler.directory = ${catalina.base}/logs
5slowqueries.org.apache.juli.FileHandler.prefix = slow-queries.
org.apache.tomcat.jdbc.pool.interceptor.SlowQueryReport.level = ALL
org.apache.tomcat.jdbc.pool.interceptor.SlowQueryReport.handlers = 5slowqueries.org.apache.juli.FileHandler
The problem is that when I execute the same slow queries that were printed to the console earlier, this time, with this configuration, no slow-queries.* file is created. (I ran this from IntelliJ IDEA.)
I can't figure out how to make this work. Maybe it has something to do with the IDE? I noticed that IDEA has a Logs tab in its Run/Debug Configurations; I tried playing with those options too, but didn't have any luck.
I found the problem. It was the IDE. When IDEA starts the server, it prints something like this to the console:
Using CATALINA_BASE: "C:\Users\..."
Using CATALINA_HOME: "C:\..."
(and a few other variables)
By default, if not changed, the logs are created in CATALINA_BASE/logs, so the slow-queries.* files end up under the per-project CATALINA_BASE that IDEA manages rather than under the Tomcat installation's logs directory.

Solr 5.1: Solr is creating way too many log files

I'm dealing with a problem where Solr 5.1 is creating way too many log files. Every time Solr is restarted, and periodically throughout the week, Solr creates the following files, and I need it to stop:
Files of the type solr_gc_xxxxxxxx_xxxx, where the x's stand for the date and some kind of identifying number. These contain garbage-collection information.
Files of the type solr_log_xxxxxxxx_xxxx, with the same naming scheme. These contain the same kind of information you'd find in solr.log.
One file of the type solr-[port]-console.log. It always contains only the following text: WARNING: System properties and/or JVM args set. Consider using --dry-run or --exec
In one week I racked up nearly thirty files of types 1 and 2!
Even worse, files of types 1 and 2 don't seem to respect my log4j.rootLogger setting and instead are filled with INFO-level material.
Here are the relevant parts of my log4j.properties file:
# Logging level
solr.log=logs
log4j.rootLogger=WARN, file
#- size rotation with log cleanup.
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.MaxFileSize=100MB
log4j.appender.file.File=${solr.log}/solr.log
log4j.appender.file.MaxBackupIndex=0
What I want to do is the following:
Create only solr.log + one backup file. solr.log should be periodically overwritten.
Not create any other log file.
What can I do to accomplish this?
So after some time, I figured out how to fix this.
To recap: Solr kept creating a whole bunch of files matching the solr_log_* and solr_gc_* patterns on startup and periodically throughout the day. Eventually I had some pretty serious space issues because of the endless amount of logs Solr likes to create.
Navigate to /path/to/solr/bin and locate the solr script, which runs at startup. Open the file, look for the following, and comment out the mv line:
# backup the log files before starting
if [ -f "$SOLR_LOGS_DIR/solr.log" ]; then
  if $verbose ; then
    echo "Backing up $SOLR_LOGS_DIR/solr.log"
  fi
  mv "$SOLR_LOGS_DIR/solr.log" "$SOLR_LOGS_DIR/solr_log_$(date +"%Y%m%d_%H%M")"
fi
Or remove it entirely, if you like. You could also try not using the -f flag, but here at my shop we like it.
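After the edit, that line is simply commented out:
# mv "$SOLR_LOGS_DIR/solr.log" "$SOLR_LOGS_DIR/solr_log_$(date +"%Y%m%d_%H%M")"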
This will retain solr.log, but Solr won't make any more backups. If you want daily backups, I recommend configuring a TimeBasedRollingPolicy or, better yet, a DailyRollingFileAppender in the log4j.properties file, which can be found under /path/to/solr/server/resources; see the sketch below.
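A daily-rotation setup might look roughly like this; a sketch assuming the log4j 1.2 that ships with Solr 5.x, with an illustrative layout pattern:
# roll solr.log once a day, keeping the date in the rotated file's suffix
log4j.rootLogger=WARN, file
log4j.appender.file=org.apache.log4j.DailyRollingFileAppender
log4j.appender.file.File=${solr.log}/solr.log
log4j.appender.file.DatePattern='.'yyyy-MM-dd
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{ISO8601} %-5p %c - %m%n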
If you want, you can also comment out the mv line for the Solr garbage-collection logs, which will leave you with solr_gc.log only.
If, like me, you have other ways of monitoring GC for Solr, you can turn off GC logging completely.
In the same directory as the solr script, open solr.in.sh (Mac/Linux only; I believe solr.in.cmd is the Windows equivalent) and comment out this block:
# Enable verbose GC logging
GC_LOG_OPTS="-verbose:gc -XX:+PrintHeapAtGC -XX:+PrintGCDetails \
  -XX:+PrintGCDateStamps -XX:+PrintGCTimeStamps -XX:+PrintTenuringDistribution -XX:+PrintGCApplicationStoppedTime"
You will need to restart Solr.
