I'm looking for a way to conditionally remove functionality from my Spring Boot app during the build, whether that is a Maven plugin that strips out the actual Java classes or packages, or one that can set a variable that cannot later be changed environmentally.
My Spring Boot app has a bunch of reports that can be generated, but they shouldn't all be available in every environment it is deployed in. I would like an easy build config file where I can set which reports are enabled and disabled before the build. I need them hard-set at build time, so that someone with control of the environment can't just go in, update an associated property value, and have them become enabled again.
I was looking for a plugin that would remove classes or packages, or that could hardcode the variable value into the compiled class files, but I couldn't find anything. I figure there must be some way to fix configuration at build time, but I have been unsuccessful in finding a way to set properties at build that cannot be overridden.
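For illustration, this is roughly the kind of thing I'm after (a sketch only; it assumes the maven-compiler-plugin's excludes parameter, and the profile id and report package are hypothetical):

<profile>
    <id>strip-sales-report</id>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <configuration>
                    <excludes>
                        <!-- hypothetical report package, left out of compilation entirely -->
                        <exclude>**/reports/sales/**</exclude>
                    </excludes>
                </configuration>
            </plugin>
        </plugins>
    </build>
</profile>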
I have a Java Swing project that will end up as a fat client. There is no JEE involved: no application server or servlet container, nor any runtime container other than a plain JVM.
I am using Maven to manage dependencies and to build the application.
I am using the Logback API for logging purposes.
I have two build profiles, one for building a developer version and one for building the final version.
Is it possible to set values in the logback.xml file using a Maven property? For example, I have the <root level="${log.level}"> tag in my logback.xml, and I want to define the value of ${log.level} in my Maven build profile.
It is certainly possible, with Maven resource filtering:
https://maven.apache.org/plugins/maven-resources-plugin/examples/filter.html
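As a minimal sketch (the profile ids and log levels are just example values), enable filtering on the resources containing logback.xml and set the property per profile in the pom:

<build>
    <resources>
        <resource>
            <directory>src/main/resources</directory>
            <!-- replaces ${log.level} in logback.xml at build time -->
            <filtering>true</filtering>
        </resource>
    </resources>
</build>
<profiles>
    <profile>
        <id>dev</id>
        <properties>
            <log.level>DEBUG</log.level>
        </properties>
    </profile>
    <profile>
        <id>release</id>
        <properties>
            <log.level>WARN</log.level>
        </properties>
    </profile>
</profiles>

Building with mvn package -Pdev would then bake DEBUG into the packaged logback.xml.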
I would strongly recommend not doing it this way, though. Managing two binaries just for the sake of having different properties seems like overkill.
I would recommend some setting on the target host (an env variable, or a parameter to your program) that lets you select the log level at runtime.
Passing the log level, the name or path of the logging config, or the name of the target environment (dev/prod) are all acceptable solutions.
Let's say we build a Java SDK that different projects can consume by adding it as a jar on the classpath or as a dependency in their Maven pom.xml or Gradle file. Logs from the SDK are not visible at runtime when other projects consume this library. I tried with SLF4J, and none of the logs are visible at runtime when it is used by the other projects. Should I go with Log4j2? If yes, should I provide a log4j configuration/properties file in my SDK? Will the properties/configuration be picked up at runtime from the consumer libraries? What are the best practices for this? Can you please advise?
Best practice #1: never include the logging configuration file in your jar.
Applications using your library will likely want to provide their own logging configuration. If one of the jars contains a different configuration it becomes a “fun” guessing game for these applications to figure out why their configuration isn’t working. (And depending on the classpath order, their configuration may sometimes take precedence over the configuration in the library, so sometimes it will work, just to make it extra challenging...)
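As a minimal sketch of the usual pattern (the class name here is hypothetical): the library depends only on slf4j-api and ships no binding and no config file; the consuming application supplies both, and its configuration decides which SDK logs are visible:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class SdkClient {
    // Coded against the SLF4J API only; the consumer chooses the backend (Logback, Log4j2, ...)
    private static final Logger log = LoggerFactory.getLogger(SdkClient.class);

    public void doWork() {
        // Visible only if the consumer's logging configuration enables it for this logger
        log.debug("doing work");
    }
}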
I'm not sure if this is the best place to post such a question, but here it is. I'm a test automation engineer who works primarily with backend, Spring Boot command-line apps. My tests, at a high level, are designed to ensure that any type of data thrown at the app will be handled correctly. We are a Java shop.
As with any "good" testing practice, I am treating the app like a black box, in that I do not pull in the model objects to run my tests. I simply supply the app with data and execute a command-line script (run.sh) that takes my data and processes it. My tests consist mostly of JDBC (to interact with the database) and a slew of ArrayList utilities that I have put together to sort result sets and get specific db information.
Thus far, I have been deploying my tests as a JAR. I bundle everything up and deploy it to the environment with a script that executes the tests. The tests do not run when the app is run; though they live inside the project, they are a separate entity with separate launcher classes. However, I am finding that managing dependencies in a JAR is a real headache. Is there a better way to deploy automation/integration tests for command-line apps?
I'm pulling in the Maven Shade plugin to bundle all of my dependencies into a "God JAR", but that isn't helping me resolve the issues that occur when I attempt to execute the JAR. I get multiple bean instantiation errors relating to the app itself, not my tests. For this reason, I pull in the app model, and the app itself, as dependencies. When I ran the tests initially, they worked just fine; deployed to the environment, they continued to work correctly. Fast forward a couple of months and a few changes to the app, and now it's a dependency nightmare when I build the new JAR.
TLDR: I'm having trouble managing dependencies in a Maven-project integration-tests JAR. Is there a better way to deploy automation/integration tests for command-line apps where dependency management is easier?
(Note: I'm relatively new to this world, so pardon me if the question seems a bit vague).
I think the error happens when you use the Shade plugin to re-package the Spring Boot jar. The way Spring Boot works is to add dependencies into the jar as nested jars and to configure its own class loader (via the jar's manifest) that is capable of reading classes from jar files inside the jar file. The standard Java class loader does not do this; that's probably why the Shade plugin misses some jars (probably the ones embedded in the Spring Boot uber jar).
What I would try is to create a test version of the Spring Boot app that contains the test classes in the compile scope and a dependency on the original Spring Boot jar. You don't need the uber jar for that dependency, so you may have to add a classifier to the original app's Spring Boot plugin configuration to keep the plain jar available (by default it is replaced by the repackaged one). Then use the Spring Boot plugin to package the test version, referring to the original app via the dependency and classifier above.
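Roughly, in the original app's pom (a sketch; the classifier name is just a convention):

<plugin>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-maven-plugin</artifactId>
    <configuration>
        <!-- the repackaged uber jar gets the "exec" classifier,
             so the plain jar is still installed under the default name -->
        <classifier>exec</classifier>
    </configuration>
</plugin>

The test project can then depend on the plain jar (no classifier), while the runnable uber jar is published alongside it, e.g. as myapp-1.0-exec.jar.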
I am migrating my Spring Boot application from version 1.5.7 to 2.0.0 and I noticed that it no longer takes mail properties from ENV variables for some reason.
I am using JavaMailSender and have the following properties in my application.properties file:
spring.mail.host=smtp.example.com
spring.mail.username=username
spring.mail.password=password
spring.mail.port=587
spring.mail.properties.mail.smtp.auth=true
spring.mail.properties.mail.smtp.starttls.enable=true
spring.mail.defaultEncoding=UTF-8
These are there just to mock the mail properties in tests. I am injecting the real ones using the same keys as ENV variables: spring.mail.host=smtp.google.com, etc.
But when I try to send the email, I see that it is still using smtp.example.com. I thought that ENV variables had higher priority than values from application.properties. Did something change? Everything worked fine in Spring Boot 1.5.7.
EDIT:
The following command works so it is definitely some problem with Eclipse:
SPRING_PROFILES_ACTIVE=development SPRING_MAIL_HOST=smtp.gmail.com SPRING_MAIL_USERNAME=xxx SPRING_MAIL_PASSWORD=xxx ./gradlew clean bootRun
What I don't understand is why the exact same configuration works when I switch back to 1.5.7. It is also strange that when passing env variables via the Eclipse run configuration, it works for the profile. So some env variables are applied and some are not...
I was able to recreate this issue. I created a Spring Boot app with 1.5.x and injected environment variables from Eclipse. When I migrated to a 2.x release, the environment variables were no longer injected.
On further analysis, I found an interesting thread in which one of the Spring Boot developers commented on this behavior.
Hence my conclusion: when using a 2.x release, some component within spring-boot-parent makes the Spring Boot Maven plugin fork the app and run it in a separate JVM, so the environment variables are not passed along.
That also answers why the profile value is picked up from the environment section: the profile flag is always passed as an argument, irrespective of whether the app runs in the Maven JVM or a new one.
To confirm this, you can add the config entries to the VM arguments tab instead.
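For example (hypothetical values, passed as -D system properties rather than environment variables):

-Dspring.mail.host=smtp.gmail.com -Dspring.mail.username=xxx -Dspring.mail.password=xxx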
You will now be able to see the new values passed to Spring Boot.
I don't know much about your configuration, but if the project structure is okay with correct dependencies, the application.properties exists under src/main/resources, and your startup class is annotated with @SpringBootApplication, it should work fine.
You can test whether the application reads your properties file by injecting a String field with the @Value annotation into any class and logging or printing it:
@Value("${spring.mail.host}")
private String host;
First, make sure your IDE is running on Java 8 or a later version.
With Spring Boot 2.0, many configuration properties were renamed/removed and developers need to update their application.properties/application.yml accordingly. To help you with that, Spring Boot ships a new spring-boot-properties-migrator module. Once added as a dependency to your project, this will not only analyze your application’s environment and print diagnostics at startup, but also temporarily migrate properties at runtime for you. This is a must have during your application migration:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-properties-migrator</artifactId>
    <scope>runtime</scope>
</dependency>
Or, with Gradle: runtime("org.springframework.boot:spring-boot-properties-migrator")
Note: once you're done with the migration, please make sure to remove this module from your project's dependencies.
For more information, see the Spring Boot 2.x migration guide.
The unit test is in Java, using Maven to build and run. I can use either local jars "outside" Maven or a local Maven repository.
However, I'd like to figure out a way to do this automatically, without the cycle of changing the pom.xml, running, then changing the pom.xml back.
Is there any other way besides the above, or besides creating pom.xml files that differ only in the specific library version?
(I'm using IntelliJ if that's of any use)
You could also read the version of your lib from a property, like:
<dependency>
    <groupId>javax.faces</groupId>
    <artifactId>jsf-api</artifactId>
    <version>${jsp.api.version}</version>
</dependency>
You can set this property in several ways, e.g. by loading a build-specific properties file.
You could even specify it as a parameter when running the build. On the command line, this is something like
mvn -Djsp.api.version=1.8 install
I don't know how to specify such a property when running Maven from inside IntelliJ, but I'm sure it's possible.
This approach gives you full flexibility to freely specify the lib version for each build. But if you only have a limited number of versions to choose from, using profiles is probably the better way to go: just define a profile for each version number and then always tell Maven which profile to use when you run a build.
What you are looking for is profiles in Maven, combined with properties.
You use a property to specify the version number, use profiles to define what the property's value actually is, and then specify which profile to use when you run Maven.
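A minimal sketch (the profile ids and version numbers are just examples, reusing the jsp.api.version property from the answer above):

<profiles>
    <profile>
        <id>jsf-1.2</id>
        <properties>
            <jsp.api.version>1.2</jsp.api.version>
        </properties>
    </profile>
    <profile>
        <id>jsf-2.0</id>
        <properties>
            <jsp.api.version>2.0</jsp.api.version>
        </properties>
    </profile>
</profiles>

Then mvn install -Pjsf-2.0 builds against the version defined in that profile.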