Eclipse does not pass mail properties to Spring Boot after migration

I am migrating my Spring Boot application from version 1.5.7 to 2.0.0 and I noticed that it no longer picks up mail properties from ENV variables for some reason.
I am using JavaMailSender and have the following properties in my application.properties file:
spring.mail.host=smtp.example.com
spring.mail.username=username
spring.mail.password=password
spring.mail.port=587
spring.mail.properties.mail.smtp.auth=true
spring.mail.properties.mail.smtp.starttls.enable=true
spring.mail.defaultEncoding=UTF-8
This is there just to mock the mail properties in tests. I am injecting the real ones using the same keys as ENV variables: spring.mail.host=smtp.google.com, etc.
But when I try to send the email, I see that it is still using smtp.example.com. I thought that ENV variables had higher priority than values from application.properties. Did something change? Everything worked fine in Spring Boot 1.5.7.
EDIT:
The following command works, so it is definitely some problem with Eclipse:
SPRING_PROFILES_ACTIVE=development SPRING_MAIL_HOST=smtp.gmail.com SPRING_MAIL_USERNAME=xxx SPRING_MAIL_PASSWORD=xxx ./gradlew clean bootRun
What I don't understand is why the exact same configuration works when I switch back to 1.5.7. It is also strange that when passing env variables via the Eclipse run configuration, it works for the profile. So some env variables are applied and some are not...
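(For reference: Spring Boot's relaxed binding is what maps the canonical property keys above to their environment-variable form, by dropping dashes, turning dots into underscores, and upper-casing. A rough plain-Java sketch of that mapping, not Boot's actual implementation — the class and method names are made up:)

```java
public class RelaxedBindingSketch {
    // Approximates how Spring Boot derives the environment-variable
    // name for a property key: remove dashes, replace dots with
    // underscores, upper-case the result.
    static String toEnvVarName(String propertyKey) {
        return propertyKey.replace("-", "")
                          .replace(".", "_")
                          .toUpperCase();
    }

    public static void main(String[] args) {
        System.out.println(toEnvVarName("spring.mail.host"));
        System.out.println(toEnvVarName("spring.mail.default-encoding"));
    }
}
```

So `SPRING_MAIL_HOST` in the environment should indeed shadow `spring.mail.host` from application.properties — which is why the behavior above looks like the variables never reach the JVM at all.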

I was able to recreate this issue. I created a Spring Boot app with 1.5.x and injected environment variables from Eclipse. After migrating to a 2.x release, the environment variables are no longer injected.
On further analysis, I found this interesting thread.
One of the Spring Boot developers made this comment.
Hence my conclusion: with a 2.x release, one of the components within spring-boot-parent makes the Spring Boot Maven plugin fork and run the application in a separate JVM. Thus, the environment variables are not passed along.
That also answers why the profile value is picked up from the environment section: the profile flag is always passed as an argument, irrespective of whether the app runs in the Maven JVM or a new one.
To confirm this, you can add the config entries to the VM arguments tab of the run configuration instead (as -D system properties).
You will now see the new values passed to Spring Boot.
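To see for yourself that -D entries survive the fork while environment variables may not, a minimal plain-Java check (the class and method names are illustrative, the property keys mirror the question):

```java
public class JvmArgCheck {
    // Returns the value of a -D system property, falling back to the
    // matching environment variable (SPRING_MAIL_HOST style) if unset.
    static String resolve(String propertyKey) {
        String fromJvmArgs = System.getProperty(propertyKey);
        if (fromJvmArgs != null) {
            return fromJvmArgs;
        }
        String envName = propertyKey.replace(".", "_").toUpperCase();
        return System.getenv(envName);
    }

    public static void main(String[] args) {
        // Run with e.g.: java -Dspring.mail.host=smtp.gmail.com JvmArgCheck
        System.out.println("spring.mail.host = " + resolve("spring.mail.host"));
    }
}
```

Running this with the value in the VM arguments tab prints it; running it with the value only in the Eclipse environment tab of a forked launch does not.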

I don't know much about your configuration, but if the project structure is okay with correct dependencies, the application.properties file exists under src/main/resources, and your startup class is annotated with @SpringBootApplication, it should work fine.
You can test whether the application reads your properties file by injecting a String field with the @Value annotation into any class and logging or printing it:
@Value("${spring.mail.host}")
private String host;

First, make sure your IDE is running on Java 8 or a later version.
With Spring Boot 2.0, many configuration properties were renamed or removed, and developers need to update their application.properties/application.yml accordingly. To help you with that, Spring Boot ships a new spring-boot-properties-migrator module. Once added as a dependency to your project, this will not only analyze your application’s environment and print diagnostics at startup, but also temporarily migrate properties at runtime for you. This is a must-have during your application migration:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-properties-migrator</artifactId>
    <scope>runtime</scope>
</dependency>
or, with Gradle:
runtime("org.springframework.boot:spring-boot-properties-migrator")
Note Once you’re done with the migration, please make sure to remove this module from your project’s dependencies.
For more information, follow this link:
Spring Boot 2.x migration guide

Related

What is the relationship between Spring Boot and the Maven pom.xml file?

I use Maven to build Spring Boot applications. I know that the Maven Spring Boot dependencies contain annotations (such as @AutoConfiguration or @Condition*). At what point in the SpringApplication.run method in the main class does it actually read from the pom.xml file? I'm stepping through the code line by line and I can't figure out at what point Spring Boot interacts with the pom.xml file.
@EnableAutoConfiguration enables auto-configuration of the application context based on jar files on the classpath and user-defined beans, so presumably the pom.xml eventually gets added as a source and the annotations in the dependencies are read while auto-configuration is running, but I can't figure out where or when it does so.
SpringApplication.run doesn't read the pom.xml file, it works at a higher level of abstraction. The pom.xml file is used by your IDE or by the mvn command line application to download dependencies and configure your applications classpath.
By the time that SpringApplication.run is called, the classpath is fully configured but SpringApplication itself isn't aware about how this was done.
Auto-configuration occurs by searching all jars on the classpath for files named META-INF/spring/org.springframework.boot.autoconfigure.AutoConfiguration.imports (if you're using a recent version). Once these files have been found, the classes listed within them are loaded and used for configuration.
If you want to step through the code, you can set a breakpoint at AutoConfigurationImportSelector.getCandidateConfigurations(...) and see the auto-configuration candidate classes get found. The exact auto-configurations that get applied will depend on the @Condition... annotations on them.
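That classpath scan can be imitated in plain Java. The resource path below is the real one used by recent Boot versions; the helper class itself is illustrative only:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.Collections;
import java.util.Enumeration;
import java.util.List;

public class AutoConfigScan {
    // Reads every AutoConfiguration.imports file visible on the
    // classpath and returns the auto-configuration class names listed.
    static List<String> candidateAutoConfigurations() throws Exception {
        String resource =
            "META-INF/spring/org.springframework.boot.autoconfigure.AutoConfiguration.imports";
        List<String> names = new ArrayList<>();
        Enumeration<URL> urls =
            Thread.currentThread().getContextClassLoader().getResources(resource);
        while (urls.hasMoreElements()) {
            try (BufferedReader reader = new BufferedReader(
                    new InputStreamReader(urls.nextElement().openStream(),
                                          StandardCharsets.UTF_8))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    line = line.trim();
                    // Skip blanks and comment lines, keep class names.
                    if (!line.isEmpty() && !line.startsWith("#")) {
                        names.add(line);
                    }
                }
            }
        }
        return Collections.unmodifiableList(names);
    }

    public static void main(String[] args) throws Exception {
        // With no Boot jars on the classpath this prints an empty list;
        // with spring-boot-autoconfigure present it lists the candidates.
        System.out.println(candidateAutoConfigurations());
    }
}
```

Nothing here touches pom.xml: by this point Maven's job is done and only the jars it placed on the classpath matter.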

Best Way to Remove Functionality from Java Spring Boot App at Build

I'm looking for a way to conditionally remove functionality from my Spring Boot app during the build, whether that is a Maven plugin that strips out the actual Java classes or packages, or one that can set a variable that cannot later be changed environmentally.
My Spring Boot app has a bunch of reports that can be generated, but they shouldn't all be available in every environment they are deployed in. I would like an easy build config file where I can set which reports are enabled and disabled before the build. I need them hard-set at build time, so someone with control of the environment can't just go in, update an associated property value, and have them become enabled again.
I was looking for some plugin that would remove classes or packages, or that could hardcode the variable value into the compiled class files, but couldn't find anything. I figure there must be some way to make configuration stick at build time. I have been unsuccessful in finding a way to set properties at build that cannot be overridden.

What's the difference between -Drun.profiles and -Dspring.profiles.active on Spring?

I'm trying to understand the difference in Spring between -Drun.profiles and -Dspring.profiles.active.
Another answer on SO does not explain much about the difference.
In my tests, both of them can be used to select a profile:
mvn spring-boot:run -Drun.profiles=prod
or
mvn spring-boot:run -Dspring.profiles.active=prod
So, what's the difference?
spring.profiles.active is one of the properties that Spring Boot applications support out of the box. It's used to specify, at the level of the Spring Boot application, which profiles should be active.
Spring Boot supports many different properties; a full list can be found here.
Now, you won't find run.profiles among these properties, because it's just a property that the Spring Boot Maven plugin supports (and yes, it 'translates' it to the list of profiles to be used as well, so the two may look similar). The point is that -Drun.profiles will only work if you start the Spring Boot application with the Maven plugin.
In production, however, the chances are that there won't be Maven at all, and the application will run as-is (as a big jar) or even packed as a Docker image or something. So for non-Maven-plugin usage you should use spring.profiles.active.
One last point: even under Maven, spring.profiles.active can be used, but it doesn't work out of the box. You should pass the parameter like this:
mvn spring-boot:run -Drun.jvmArguments="-Dspring.profiles.active=production"
See this item on GitHub.
Hope this clarifies the difference between the two.
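Whichever route the value takes, what ultimately reaches the application is a comma-separated list under the key spring.profiles.active. A plain-Java sketch of reading and splitting it (illustrative only, not Boot's actual parsing code):

```java
import java.util.Arrays;
import java.util.List;

public class ActiveProfiles {
    // Splits the spring.profiles.active system property into individual
    // profile names, defaulting to "default" when it is unset.
    static List<String> activeProfiles() {
        String raw = System.getProperty("spring.profiles.active", "default");
        return Arrays.asList(raw.split("\\s*,\\s*"));
    }

    public static void main(String[] args) {
        // Run with e.g.: java -Dspring.profiles.active=prod,metrics ActiveProfiles
        System.out.println(activeProfiles());
    }
}
```

This also shows why -Drun.jvmArguments is needed with the Maven plugin: the -D must land on the forked application JVM, not on Maven's own JVM.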

Build Management With Maven and HK2

So far, I have created a web application using JAX-RS (Jersey) and Maven for build and dependency management, but for this question I'm not sure that matters. I'm using HK2 as the DI framework. It works fine and I can package the application as a WAR which can be deployed to a Tomcat server (both locally and remotely).
The application is configured using Jersey's ResourceConfig, where I also configure my AbstractBinder (for HK2) to bind my @Inject points to concrete instances. So far so good. Now I want to use Jetty (or Grizzly) as an embedded server for local development (via mvn jetty:run), and automate the build of the WAR for remote deployment. I want to use different classes (injected by HK2) depending on the environment (e.g. a fake email sender on the test server), and this is where I'm stuck. How do I specify which environment I'm running in, and how do I specify which classes to use for each environment?
Maybe my problem is in my understanding of how all this works (examples of actual build setups would be warmly welcomed). Normally I just use an AbstractFactory, which I inject into my main method. My guess at how this should be done:
I should create a properties / XML file for each environment, where I specify which implementations and properties should be used.
When running or building, I should specify which environment I'm running in (for instance mvn build -ENVIRONMENT)
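The guessed second step — choosing implementations from an environment flag — can be sketched in plain Java, leaving the HK2 wiring itself out. All names below (EmailSender, app.environment, etc.) are made up for illustration; in a real setup the choice would live inside the AbstractBinder:

```java
public class EnvironmentWiring {
    interface EmailSender {
        String describe();
    }

    static final class SmtpEmailSender implements EmailSender {
        public String describe() { return "real SMTP sender"; }
    }

    static final class FakeEmailSender implements EmailSender {
        public String describe() { return "fake sender (logs only)"; }
    }

    // Picks the implementation for the given environment name; an HK2
    // AbstractBinder would bind the chosen class to the interface here.
    static EmailSender senderFor(String environment) {
        if ("test".equals(environment)) {
            return new FakeEmailSender();
        }
        return new SmtpEmailSender();
    }

    public static void main(String[] args) {
        // The environment flag could arrive as -Dapp.environment=test
        // from the build, matching the question's per-environment idea.
        String env = System.getProperty("app.environment", "production");
        System.out.println(senderFor(env).describe());
    }
}
```

The per-environment properties file from the first step would then only need to carry the flag (or the implementation class names) that this selection reads.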

java.lang.IllegalArgumentException: Jetty ALPN/NPN has not been properly configured

I am getting java.lang.IllegalArgumentException: Jetty ALPN/NPN has not been properly configured while using gRPC (Google Pub/Sub) to publish/consume messages from Kafka.
Try adding a runtime dependency on netty-tcnative-boringssl-static. See gRPC's SECURITY.md. Note that the necessary version of netty-tcnative changes over time; you should look at the version of the document for your particular release (e.g., this is for 1.2.0).
Finally, I went back to the boot classpath approach. I prefixed the jetty-alpn jar to the boot classpath, and it now works fine in Cloud Foundry.
Adding the ALPN client JAR which matches my JDK version fixed this issue for me. In Eclipse, you need to set up the jar as a bootstrap entry for the Tomcat server.
You can find more info about it here: https://medium.com/@Parithi/jetty-alpn-npn-has-not-been-properly-configured-solution-418417ee6502
As suggested by Google, using a Jetty container instead of Tomcat works, but in our production environment applications are deployed on a Tomcat container, and of course I need it to work on Tomcat in production.
While debugging the gRPC code, I found that the Guava version was causing the issue. Updating Guava to 18.0 (some classes are missing in previous versions) solved the problem, but the deployment then failed in CF.
Customizing embed-tomcat-core works fine consistently, but again, the team says no to a custom Tomcat container.
java -jar apm-asset-xxxx.jar works fine locally, but needs a custom command for CF start, and I didn't have the luxury of changing the CF start process.
Finally, the trick: get the class loader to use the tcnative-boring-ssl library instead of the tomcat-core library at runtime, by providing the following dependencies in pom.xml. For the past 3 days, this solution has been working in CF.
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
    <exclusions>
        <exclusion>
            <groupId>org.hibernate</groupId>
            <artifactId>*</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.apache.tomcat.embed</groupId>
            <artifactId>tomcat-embed-core</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.apache.tomcat.embed</groupId>
    <artifactId>tomcat-embed-core</artifactId>
    <scope>provided</scope>
</dependency>
I used the Maven manifest plugin to promote the tcnative library to the top of the classloader order.
In the POM, try to place the gRPC dependency before the Spring Boot dependency (the order of dependencies matters). I did that and the issue was solved. For example:
<dependency>
    <groupId>com.google.cloud</groupId>
    <artifactId>google-cloud-language</artifactId>
    <version>0.13.0-beta</version>
</dependency>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>