In a spring-boot application I can specify a custom log file with
java -jar spring-boot-app.jar --logging.file=/home/ubuntu/spring-boot-app.log
But if I don't specify one, where does it go?
I couldn't find it in any of the following folders:
/tmp/
/var/log/
~/
I do not have spring-boot-starter-logging or any additional logging dependencies.
I was hoping to have something similar to catalina.out since the default configuration runs an embedded Tomcat:
INFO 10374 --- [main] s.b.c.e.t.TomcatEmbeddedServletContainer : Tomcat initialized with port(s): 8100 (http)
Spring Boot uses Commons Logging for all internal logging, but leaves the underlying log implementation open.
Default configurations are provided for Java Util Logging, Log4J, Log4J2 and Logback. In each case loggers are pre-configured to use console output with optional file output also available.
From the Spring Boot logging documentation.
The default log configuration will echo messages to the console as they are written.
So unless you explicitly specify a file as you described, the output stays in the console.
By default Spring Boot does not output logs to any file. If you want logs written to a file (in addition to the console output), then you should set either the logging.file or the logging.path property (not both).
In application.properties, just set:
logging.file=/home/ubuntu/spring-boot-app.log
This will create a spring-boot-app.log file under /home/ubuntu.
By default, Spring Boot logs only to the console and does not write log files. If you want to write log files in addition to the console output, you need to set a logging.file or logging.path property (for example, in your application.properties).
For example, the following in your application.properties will write the log to /home/user/my.log:
logging.file = /home/user/my.log
(As noted above, set either logging.file or logging.path, not both; when both are set, logging.file takes precedence and logging.path is ignored.)
Related
I have a number of questions here (and I'm new to Spring Boot).
Our project's existing codebase uses YAML, but there is no .properties file that I can see anywhere. The usual Spring Boot tutorials I have read use a .properties file.
(1) Is it possible to use application.yml as replacement for application.properties?
(2) Where is the default directory/file where the Spring Boot's embedded Tomcat server dumps its logs?
I need to modify the configuration such that we have custom directory to dump embedded Tomcat server logs. According to here, it should be
server.tomcat.basedir=my-tomcat
server.tomcat.accesslog.enabled=true
server.tomcat.accesslog.pattern=%t %a "%r" %s (%D ms)
That is how it's done when using the application.properties approach.
(3) Assuming YAML can replace the .properties file entirely, how can I do the above configuration in YAML? Do I need to edit anything in the Java source files for the YAML configuration to take effect (that is, for the Tomcat logs to go into the specific directory that I want)?
Yes. Completely.
Spring Boot doesn't log to a file by default (as far as I know).
Whether you use a YAML or a properties file, Spring Boot uses this configuration to bootstrap the application. The properties below are from here:
# LOGGING
logging.config= # Location of the logging configuration file. For instance `classpath:logback.xml` for Logback
logging.exception-conversion-word=%wEx # Conversion word used when logging exceptions.
logging.file= # Log file name. For instance `myapp.log`
logging.level.*= # Log levels severity mapping. For instance `logging.level.org.springframework=DEBUG`
logging.path= # Location of the log file. For instance `/var/log`
logging.pattern.console= # Appender pattern for output to the console. Only supported with the default logback setup.
logging.pattern.file= # Appender pattern for output to the file. Only supported with the default logback setup.
logging.pattern.level= # Appender pattern for log level (default %5p). Only supported with the default logback setup.
logging.register-shutdown-hook=false # Register a shutdown hook for the logging system when it is initialized.
server.tomcat.accept-count= # Maximum queue length for incoming connection requests when all possible request processing threads are in use.
server.tomcat.accesslog.buffered=true # Buffer output such that it is only flushed periodically.
server.tomcat.accesslog.directory=logs # Directory in which log files are created. Can be relative to the tomcat base dir or absolute.
server.tomcat.accesslog.enabled=false # Enable access log.
server.tomcat.accesslog.file-date-format=.yyyy-MM-dd # Date format to place in log file name.
server.tomcat.accesslog.pattern=common # Format pattern for access logs.
server.tomcat.accesslog.prefix=access_log # Log file name prefix.
server.tomcat.accesslog.rename-on-rotate=false # Defer inclusion of the date stamp in the file name until rotate time.
server.tomcat.accesslog.request-attributes-enabled=false # Set request attributes for IP address, Hostname, protocol and port used for the request.
server.tomcat.accesslog.rotate=true # Enable access log rotation.
server.tomcat.accesslog.suffix=.log # Log file name suffix.
server.tomcat.additional-tld-skip-patterns= # Comma-separated list of additional patterns that match jars to ignore for TLD scanning.
server.tomcat.background-processor-delay=30 # Delay in seconds between the invocation of backgroundProcess methods.
server.tomcat.basedir= # Tomcat base directory. If not specified a temporary directory will be used.
server.tomcat.internal-proxies=10\\.\\d{1,3}\\.\\d{1,3}\\.\\d{1,3}|\\
192\\.168\\.\\d{1,3}\\.\\d{1,3}|\\
169\\.254\\.\\d{1,3}\\.\\d{1,3}|\\
127\\.\\d{1,3}\\.\\d{1,3}\\.\\d{1,3}|\\
172\\.1[6-9]{1}\\.\\d{1,3}\\.\\d{1,3}|\\
172\\.2[0-9]{1}\\.\\d{1,3}\\.\\d{1,3}|\\
172\\.3[0-1]{1}\\.\\d{1,3}\\.\\d{1,3} # regular expression matching trusted IP addresses.
server.tomcat.max-connections= # Maximum number of connections that the server will accept and process at any given time.
server.tomcat.max-http-post-size=0 # Maximum size in bytes of the HTTP post content.
server.tomcat.max-threads=0 # Maximum amount of worker threads.
server.tomcat.min-spare-threads=0 # Minimum amount of worker threads.
server.tomcat.port-header=X-Forwarded-Port # Name of the HTTP header used to override the original port value.
server.tomcat.protocol-header= # Header that holds the incoming protocol, usually named "X-Forwarded-Proto".
server.tomcat.protocol-header-https-value=https # Value of the protocol header that indicates that the incoming request uses SSL.
server.tomcat.redirect-context-root= # Whether requests to the context root should be redirected by appending a / to the path.
server.tomcat.remote-ip-header= # Name of the http header from which the remote ip is extracted. For instance `X-FORWARDED-FOR`
server.tomcat.uri-encoding=UTF-8 # Character encoding to use to decode the URI.
Yes. You can change these to YAML. All you need to do is replace the properties file with a yml file. You could even have both in your workspace (Spring looks at both application.properties and application.yml).
You can do the conversion manually, or even with plugins.
A simple line like
logging.level.netpl.com = DEBUG
changes to
logging:
  level:
    netpl.com: DEBUG
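Applying the same conversion to the Tomcat access-log settings from the question, the YAML would look roughly like this (a sketch; the my-tomcat base directory and the pattern are taken from the question itself):
server:
  tomcat:
    basedir: my-tomcat
    accesslog:
      enabled: true
      pattern: '%t %a "%r" %s (%D ms)'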
I am trying to run a Spark job on an EMR cluster.
In my spark-submit I have added configs to read from log4j.properties:
--files log4j.properties --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:/log4j.properties"
Also I have added
log4j.rootLogger=INFO, file
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.File=/log/test.log
log4j.appender.file.MaxFileSize=10MB
log4j.appender.file.MaxBackupIndex=10
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %5p %c{7} - %m%n
in my log4j configurations.
Anyhow, I see the logs in the console, but I don't see the log file being generated. What am I doing wrong here?
Quoting spark-submit --help:
--files FILES Comma-separated list of files to be placed in the working directory of each executor. File paths of these files in executors can be accessed via SparkFiles.get(fileName).
That doesn't say much about what to do with the FILES if you cannot use SparkFiles.get(fileName) (which you cannot for log4j).
Quoting SparkFiles.get's scaladoc:
Get the absolute path of a file added through SparkContext.addFile().
That does not give you much either, but it suggests having a look at the source code of SparkFiles.get:
def get(filename: String): String =
new File(getRootDirectory(), filename).getAbsolutePath()
The nice thing about it is that getRootDirectory() uses an optional property or just the current working directory:
def getRootDirectory(): String =
SparkEnv.get.driverTmpDir.getOrElse(".")
That gives us something to work on, doesn't it?
On the driver, the so-called driverTmpDir directory should be easy to find in the Environment tab of the web UI (under Spark Properties for the spark.files property, or under Classpath Entries marked as "Added By User" in the Source column).
On executors, I'd assume a local directory, so rather than using file:/log4j.properties I'd use
-Dlog4j.configuration=file://./log4j.properties
or
-Dlog4j.configuration=file:log4j.properties
Note the dot to specify the local working directory (in the first option) or no leading / (in the latter).
Don't forget about spark.driver.extraJavaOptions to set the Java options for the driver if that's something you haven't thought about yet. You've been focusing on executors only so far.
You may want to add -Dlog4j.debug=true to spark.executor.extraJavaOptions; it is supposed to print the locations log4j uses to find log4j.properties.
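Putting those suggestions together, the relevant part of the spark-submit command might look like this (a sketch that reuses the question's file name; I have not verified it on EMR):
--files log4j.properties \
--conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:log4j.properties" \
--conf "spark.executor.extraJavaOptions=-Dlog4j.debug=true -Dlog4j.configuration=file:log4j.properties"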
I have not checked this on an EMR or YARN cluster myself, but I believe it may have given you some hints on where to find the answer. Fingers crossed!
With a Spark 2.2.0 standalone cluster, the executor JVM is started first, and only then does Spark distribute the application jar and --files.
Which means passing
spark.executor.extraJavaOptions=-Dlog4j.configuration=file:log4j-spark.xml
does not make sense, as this file does not exist yet (it has not been downloaded) at executor JVM launch time, when log4j is initialized.
If you pass
spark.executor.extraJavaOptions=-Dlog4j.debug -Dlog4j.configuration=file:log4j-spark.xml
you will find, at the beginning of the executor's stderr, a failed attempt to load the log4j config file:
log4j:ERROR Could not parse url [file:log4j-spark.xml].
java.io.FileNotFoundException: log4j-spark.xml (No such file or directory)
...
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
And a bit later, the download of --files from the driver is logged:
18/07/18 17:24:12 INFO Utils: Fetching spark://123.123.123.123:35171/files/log4j-spark.xml to /ca/tmp-spark/spark-49815375-3f02-456a-94cd-8099a0add073/executor-7df1c819-ffb7-4ef9-b473-4a2f7747237a/spark-0b50a7b9-ca68-4abc-a05f-59df471f2d16/fetchFileTemp5898748096473125894.tmp
18/07/18 17:24:12 INFO Utils: Copying /ca/tmp-spark/spark-49815375-3f02-456a-94cd-8099a0add073/executor-7df1c819-ffb7-4ef9-b473-4a2f7747237a/spark-0b50a7b9-ca68-4abc-a05f-59df471f2d16/-18631083971531927447443_cache to /opt/spark-2.2.0-bin-hadoop2.7/work/app-20180718172407-0225/2/./log4j-spark.xml
It may work differently with YARN or another cluster manager, but with a standalone cluster it seems there is no way to specify your own logging configuration for executors on spark-submit.
You can dynamically reconfigure log4j in your job code (override the log4j configuration programmatically, e.g. the file location for a FileAppender), but you would need to do it carefully in some mapPartitions lambda that is executed in the executor's JVM, as sketched below. Or maybe you can dedicate the first stage of your job to it. All of that sucks, though...
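A minimal sketch of that idea, assuming log4j 1.x on the classpath; the appender settings mirror the question's log4j.properties, while the helper name is made up:

import org.apache.log4j.{Level, Logger, PatternLayout, RollingFileAppender}
import org.apache.spark.rdd.RDD
import scala.reflect.ClassTag

// Adds a rolling file appender to the root logger inside each executor JVM, once per JVM.
def withExecutorFileLogging[T: ClassTag](rdd: RDD[T]): RDD[T] =
  rdd.mapPartitions { iter =>
    val root = Logger.getRootLogger
    if (root.getAppender("file") == null) {
      val appender = new RollingFileAppender(
        new PatternLayout("%d{yyyy-MM-dd HH:mm:ss.SSS} %5p %c{7} - %m%n"),
        "/log/test.log") // path taken from the question's log4j.properties
      appender.setName("file")
      appender.setMaxFileSize("10MB")
      appender.setMaxBackupIndex(10)
      root.addAppender(appender)
      root.setLevel(Level.INFO)
    }
    iter
  }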
Here is the complete command I used to run my uber-jar in EMR; I see log files generated on the driver and executor nodes.
spark-submit --class com.myapp.cloud.app.UPApp --master yarn --deploy-mode client --driver-memory 4g --executor-memory 2g --executor-cores 8 --files log4j.properties --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:log4j.properties" --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:log4j.properties -Dlog4j.debug=true" --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:log4j.properties" --conf "spark.eventLog.dir=/mnt/var/log/" uber-up-0.0.1.jar
where log4j.properties is in my local filesystem.
We have different config servers per environment. Each spring boot application should target its corresponding config server. I have tried to achieve this by setting profiles in the bootstrap.properties file, e.g.:
spring.application.name=app-name
spring.cloud.config.uri=http://default-config-server.com
---
spring.profiles=dev
spring.cloud.config.uri=http://dev-config-server.com
---
spring.profiles=stage
spring.cloud.config.uri=http://stage-config-server.com
---
spring.profiles=prod
spring.cloud.config.uri=http://prod-config-server.com
And then I set the command-line argument -Dspring.profiles.active=dev, but the loaded config server is always the last one set in the file (i.e. the prod config server would be loaded with the above settings, and if prod were removed, stage would be loaded).
Is it possible to set bootstrap profiles for the cloud config server? I followed this example but can't seem to get it working. For what it's worth, these profiles work great to load the correct config (i.e. app-name-dev.properties will load if the dev profile is active), but aren't being pulled from the proper config server.
Specifying different profiles in a single file is only supported for YAML files and doesn't apply to property files. For property files, specify an environment-specific bootstrap-[profile].properties to override properties from the default bootstrap.properties.
So in your case you would have 4 files: bootstrap.properties, bootstrap-prod.properties, bootstrap-stage.properties and bootstrap-dev.properties.
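For example, bootstrap-dev.properties would then contain only the dev-specific override (the URI is taken from the question):
spring.cloud.config.uri=http://dev-config-server.com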
However, instead of that you could also provide only the default bootstrap.properties and, when starting the application, override the property by passing -Dspring.cloud.config.uri=<desired-uri> to the JVM (note that the -D option must come before -jar):
java -Dspring.cloud.config.uri=<desired-url> -jar <your-app>.jar
This will take precedence over the default configured values.
I solved a similar problem with an environment variable in Docker.
bootstrap.yml
spring:
  application:
    name: dummy_service
  cloud:
    config:
      uri: ${CONFIG_SERVER_URL:http://localhost:8888/}
      enabled: true
  profiles:
    active: ${SPR_PROFILE:dev}
Dockerfile
ENV CONFIG_SERVER_URL=""
ENV SPR_PROFILE=""
Docker-compose.yml
version: '3'
services:
  dummy:
    image: xxx/xxx:latest
    restart: always
    environment:
      - SPR_PROFILE=docker
      - CONFIG_SERVER_URL=http://configserver:8888/
    ports:
      - 8080:8080
    depends_on:
      - postgres
      - configserver
      - discovery
@LarryW (I cannot reply on the same comment):
I guess the advantage of explicitly adding the property is that it allows you to specify a default value (in this case "dev") in case the environment variable is not set.
I'm using the security manager in my Tomcat.
I need all the console logs to be written to the tomcat.log file.
For this I'm using this command:
catalina start -security > tomcat.log
But the log does not go to tomcat.log; it is written to the console only.
How can I write this log into a separate file?
Check out your catalina.out file. By default, logs are redirected to the $CATALINA_BASE/logs/catalina.out file. To override the default configuration, you can set the Tomcat environment variable CATALINA_OUT in setenv.sh to the full path of the file to which stdout and stderr will be redirected.
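For example, a minimal setenv.sh could look like this (the target path below is just an example):
# $CATALINA_BASE/bin/setenv.sh
CATALINA_OUT=/home/user/logs/tomcat.log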
While running the application I am not able to set jndi.properties and log4j.properties.
Actually, I have to set the following properties, but I do not know where to write them, whether in a file or somewhere else. If in a file, what should the file name and extension be, and where should I keep it in the application?
jndi.properties:
java.naming.factory.initial=org.jnp.interfaces.NamingContextFactory
java.naming.factory.url.pkgs=org.jboss.naming:org.jnp.interfaces
java.naming.provider.url=localhost:1099
log4j.properties:
# Set root category priority to INFO and its only appender to CONSOLE.
log4j.rootCategory=INFO, CONSOLE
# CONSOLE is set to be a ConsoleAppender using a PatternLayout.
log4j.appender.CONSOLE=org.apache.log4j.ConsoleAppender
log4j.appender.CONSOLE.Threshold=INFO
log4j.appender.CONSOLE.layout=org.apache.log4j.PatternLayout
log4j.appender.CONSOLE.layout.ConversionPattern=- %m%n
You can set the default JBoss JNDI properties in the jboss\server\default\conf\jndi.properties file.
I can't tell exactly what you want to do or which JBoss version you use, but for the log4j options in JBoss 4.2.3 go to:
jboss\server\default\conf\jboss-log4j.xml
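For reference, the CONSOLE appender from the question would look roughly like this inside jboss-log4j.xml (a sketch using standard log4j 1.x XML configuration elements):

<appender name="CONSOLE" class="org.apache.log4j.ConsoleAppender">
  <param name="Threshold" value="INFO"/>
  <layout class="org.apache.log4j.PatternLayout">
    <param name="ConversionPattern" value="- %m%n"/>
  </layout>
</appender>

<root>
  <priority value="INFO"/>
  <appender-ref ref="CONSOLE"/>
</root>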