I have an AWS Lambda RequestHandler class which is invoked directly by AWS. Eventually I need to get it working with Spring Boot, because I need it to be able to retrieve data from a Spring Cloud configuration server.
The problem is that the code works when I run it locally from my own dev environment, but fails to inject config values when deployed on AWS.
@Configuration
@EnableAutoConfiguration
@ComponentScan("my.package")
public class MyClass implements com.amazonaws.services.lambda.runtime.RequestHandler<I, O> {

    public O handleRequest(I input, Context context) {
        ApplicationContext applicationContext = new SpringApplicationBuilder()
                .main(getClass())
                .showBanner(false)
                .web(false)
                .sources(getClass())
                .addCommandLineProperties(false)
                .build()
                .run();

        log.info(applicationContext.getBean(SomeConfigClass.class).foo);
        // prints cloud-injected value when running from local dev env
        //
        // prints "${path.to.value}" literal when running from AWS
        // even though Spring Boot starts successfully without errors

        return null; // placeholder return; response construction not shown
    }
}
@Configuration
public class SomeConfigClass {

    @Value("${path.to.value}")
    public String foo;
}
src/main/resources/bootstrap.yml:
spring:
  application:
    name: my_service
  cloud:
    config:
      uri: http://my.server
      failFast: true
      profile: localdev
What have I tried:
using regular Spring MVC, but this doesn't have integration with @Value injection/Spring Cloud.
using @PropertySource - but found out it doesn't support .yml files
verified to ensure the config server is serving requests to any IP address (there's no IP address filtering)
running curl to ensure the value is brought back
verified to ensure that the .jar actually contains bootstrap.yml at the jar root
verified to ensure that the .jar actually contains the Spring Boot classes. FWIW I'm using the Maven Shade plugin, which packages the project into a fat .jar with all dependencies.
Note: AWS Lambda does not support environment variables, and therefore I can not set anything like spring.application.name (neither as an environment variable nor as a -D parameter). Nor can I control the underlying classes which actually launch MyClass - this is completely transparent to the end user. I just package the jar and provide the entry point (class name); the rest is taken care of.
Is there anything I could have missed? Any way I could debug this better?
After a bit of debugging I have determined that the issue is with using the Maven Shade plugin. Spring Boot looks inside its autoconfigure jar for a META-INF/spring.factories file; see here for some information on this. In order to package a Spring Boot jar correctly you need to use the Spring Boot Maven Plugin and set it up to run its repackage goal during the Maven package phase. The reason it works in your local IDE is that you are not running the Shade-packaged jar. The Spring Boot plugin does some special magic to get things in the right spot that the Shade plugin is unaware of.
I was able to create some sample code that initially was not injecting values but works now that I used the correct plugin. See this GitHub repo to check out what I have done.
I did not connect it with Spring Cloud but now that the rest of the Spring Boot injection is working I think it should be straightforward.
As I mentioned in the comments you may want to consider just a simple REST call to get the cloud configuration and inject it yourself to save on the overhead of loading a Spring application with every request.
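If you go that route, note that Spring Cloud Config Server exposes the resolved configuration over plain HTTP at /{application}/{profile}, so a hand-rolled call is straightforward. Here is a minimal sketch, reusing the host, application name and profile from the bootstrap.yml above (the JSON parsing is left out and would need a real JSON library):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.stream.Collectors;

public class ConfigFetcher {

    // Fetches the raw JSON document for application "my_service", profile "localdev"
    // directly from the config server, without starting a Spring context.
    public static String fetchConfig() throws Exception {
        URL url = new URL("http://my.server/my_service/localdev");
        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        connection.setRequestMethod("GET");
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(connection.getInputStream(), StandardCharsets.UTF_8))) {
            return reader.lines().collect(Collectors.joining("\n"));
        } finally {
            connection.disconnect();
        }
    }
}

The response contains the property sources in precedence order; extracting path.to.value from it and caching the result across invocations avoids bootstrapping a full Spring application on every Lambda request.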
UPDATE: For Spring Boot 1.4.x you must provide this configuration in the Spring Boot plugin:
<configuration>
<layout>MODULE</layout>
</configuration>
If you do not, then by default the plugin's new behavior is to put all of your classes and jars under BOOT-INF, as the intent is for the jar to be executable and to be loaded by Spring Boot's launcher. I found this out while working on adding a warning for the situation that was encountered here. See https://github.com/spring-projects/spring-boot/issues/5465 for reference.
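Putting the pieces together, the plugin section would look roughly like this (a sketch rather than a copy of the linked repo; the <layout>MODULE</layout> element is only needed on Spring Boot 1.4.x as explained above):

<plugin>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-maven-plugin</artifactId>
    <configuration>
        <!-- Spring Boot 1.4.x only: keep classes out of BOOT-INF -->
        <layout>MODULE</layout>
    </configuration>
    <executions>
        <execution>
            <goals>
                <!-- repackage runs during the package phase and nests dependency jars,
                     so each META-INF/spring.factories stays intact -->
                <goal>repackage</goal>
            </goals>
        </execution>
    </executions>
</plugin>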
Related
There is a Spring Boot 2 app with such a structure:
parent-module
  module-1
    src
      main
        java
        resources
          - application.yml
  module-2
    src
      main
        java
        resources
          - application.yml
Also, module-1 depends on module-2, as specified in the pom.xml dependencies section.
The problem is that when I specify some properties in module-2's application.yml, they are not visible in module-1's components (via the @Value annotation).
As was answered here, it seems that module-1's application.yml overrides module-2's application.yml. There is a workaround - if I use the name application.yaml in module-2 everything works fine, but I'm going to add more modules and, in the end, it's a dirty hack.
What am I doing wrong? Should such a hierarchy of property files be specified somehow?
I will be happy to provide more details if needed.
Thank you!
Spring Boot is a runtime framework. I understand that your modules are not Spring Boot applications by themselves (you can't declare a dependency on a Spring Boot application packaged with the Spring Boot Maven plugin, because it produces an artifact that is not really a JAR from Java's standpoint, even though it has a *.jar extension).
If so, they're probably regular jars, so you should have a "special" module that assembles the application. This special module lists both module-1 and module-2 in its <dependencies> section and contains the spring-boot-maven-plugin definition in its build section (assuming you're using Maven), as sketched below. But in that case you shouldn't really have more than one application.yml - it would be misleading. Instead, put the application.yml in the src/main/resources of that "special" module.
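A rough sketch of what that assembler module's pom.xml could look like (the groupId, version and the "app" artifactId are invented for illustration):

<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>com.example</groupId>
        <artifactId>parent-module</artifactId>
        <version>1.0.0</version>
    </parent>
    <artifactId>app</artifactId>

    <dependencies>
        <dependency>
            <groupId>com.example</groupId>
            <artifactId>module-1</artifactId>
            <version>${project.version}</version>
        </dependency>
        <dependency>
            <groupId>com.example</groupId>
            <artifactId>module-2</artifactId>
            <version>${project.version}</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <!-- only this module is repackaged into an executable Spring Boot jar;
                 the single application.yml lives in this module's src/main/resources -->
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
                <executions>
                    <execution>
                        <goals>
                            <goal>repackage</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>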
If you really have to for whatever reason work with multiple application.yaml files, make sure you've read this thread
I know this is already a well-aged post.
I just came across the same issue, and the best solution I found was to import the module-specific configurations with the spring.config.import directive, as described here.
In this case you still keep the module-specific configuration in property or YAML files within that specific module, and you do not get too many unwanted dependencies in your project setup.
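For example (assuming Spring Boot 2.4 or later; the file name below is my own choice), module-1's application.yml could import module-2's configuration like this:

# module-1/src/main/resources/application.yml
spring:
  config:
    import: "optional:classpath:module-2-config.yml"

module-2 then ships its properties in src/main/resources/module-2-config.yml, so the file name no longer clashes with module-1's application.yml.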
application.yml is, as the name indicates, an application-level file, not a module-level file.
It is the build script that assembles the final application, e.g. the .war file, that needs to include an application.yml file, if any.
If modules need properties, and cannot rely on the defaults, e.g. using the : syntax in @Value("${prop.name:default}"), they need to provide a module-level property file using @PropertySource("classpath:/path/to/module-2.properties").
Note: By default, @PropertySource doesn't load YAML files (see the official documentation), but Spring Boot can be enhanced to support it. See @PropertySource with YAML Files in Spring Boot | Baeldung.
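That enhancement usually boils down to a small PropertySourceFactory; a minimal sketch (the class name is my own, the Spring types are standard):

import java.io.IOException;
import java.util.Properties;

import org.springframework.beans.factory.config.YamlPropertiesFactoryBean;
import org.springframework.core.env.PropertiesPropertySource;
import org.springframework.core.env.PropertySource;
import org.springframework.core.io.support.EncodedResource;
import org.springframework.core.io.support.PropertySourceFactory;

public class YamlPropertySourceFactory implements PropertySourceFactory {

    @Override
    public PropertySource<?> createPropertySource(String name, EncodedResource resource) throws IOException {
        // Delegate the YAML parsing to Spring's YamlPropertiesFactoryBean
        YamlPropertiesFactoryBean factory = new YamlPropertiesFactoryBean();
        factory.setResources(resource.getResource());
        Properties properties = factory.getObject();
        return new PropertiesPropertySource(resource.getResource().getFilename(), properties);
    }
}

It is then referenced from the module-level configuration, e.g. @PropertySource(value = "classpath:/path/to/module-2.yml", factory = YamlPropertySourceFactory.class).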
Alternative: Have the application-level build script (the one building the .war file) merge multiple module-level build scripts into a unified application.yml file.
I'm working with Spring Boot 2.2.9.RELEASE. I have the main app and a plain starter (which just uses spring-actuator functionality) with some properties in its some-starter/src/main/resources/application.properties file:
management.server.port=9000
management.server.ssl.enabled=false
management.endpoints.enabled-by-default=false
management.endpoint.health.enabled=true
management.endpoints.web.base-path=/
management.endpoints.web.path-mapping.health=health
I've imported the starter into my main app and I believe that the health check endpoint should work on port 9000 with the path /health (something like localhost:9000/health).
But it doesn't. However, it works if I put the same properties in my main app's main-app/src/main/resources/application.properties.
Is it a problem with property overriding in Spring Boot? Should I configure my resources via something like the maven-resources-plugin?
When application.properties is loaded from the classpath, the first one on the classpath is loaded and any others on the classpath are ignored. In your case, the file in main-app/src/main/resources/application.properties will appear on the classpath before the application.properties in the jar of some-starter.
As its name suggests, application.properties is intended for configuring your application and it shouldn't be used in a starter. You should either configure all of the properties in your application, or you could update your starter to include an EnvironmentPostProcessor that is registered via spring.factories and adds some default properties to the Environment.
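A rough sketch of that second option (the package and class names are my own; the property values are the ones from the question):

package com.example.somestarter;

import java.util.HashMap;
import java.util.Map;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.env.EnvironmentPostProcessor;
import org.springframework.core.env.ConfigurableEnvironment;
import org.springframework.core.env.MapPropertySource;

public class StarterDefaultsPostProcessor implements EnvironmentPostProcessor {

    @Override
    public void postProcessEnvironment(ConfigurableEnvironment environment, SpringApplication application) {
        Map<String, Object> defaults = new HashMap<>();
        defaults.put("management.server.port", "9000");
        defaults.put("management.server.ssl.enabled", "false");
        defaults.put("management.endpoints.enabled-by-default", "false");
        defaults.put("management.endpoint.health.enabled", "true");
        defaults.put("management.endpoints.web.base-path", "/");
        defaults.put("management.endpoints.web.path-mapping.health", "health");
        // addLast so anything the application itself configures still wins
        environment.getPropertySources().addLast(new MapPropertySource("some-starter-defaults", defaults));
    }
}

In the starter it would be registered in META-INF/spring.factories:
org.springframework.boot.env.EnvironmentPostProcessor=com.example.somestarter.StarterDefaultsPostProcessor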
The situation is that I have two API projects; API A makes HTTP requests to API B. Both APIs are deployed to a development and a production environment.
What I want to achieve is the following: build the project based on a specific profile (dev or prod) so that the code uses the right base URL to talk to the correct API in the correct environment.
So if I build API A with the prod flag, I want it to use the specific URL to make HTTP requests to the API B that is deployed in its own prod environment.
It looks like you're referring to Maven profiles; however, you should probably check out Spring profiles instead. The key change in concept:
You're not supposed to build different artifacts for different environments.
Instead, create a Spring profile in service A:
application-dev.properties:
url.addr=dev-service-host:1234
application-prod.properties:
url.addr=prod-service-b-host:4321
Then run the application with the --spring.profiles.active=dev (or prod) flag.
Spring Boot will load the correct definitions automatically because the dev/prod profile name matches the suffix of the properties file.
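To make that concrete, here is a minimal sketch of how service A might consume the property (the class and method names are my own; url.addr comes from the snippets above):

import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;

@Component
public class ApiBClient {

    // Resolved from application-dev.properties or application-prod.properties,
    // depending on the active profile.
    private final String baseUrl;

    public ApiBClient(@Value("${url.addr}") String baseUrl) {
        this.baseUrl = baseUrl;
    }

    public String healthUrl() {
        return "http://" + baseUrl + "/health"; // build request URLs against the injected host
    }
}

The same artifact is then started with java -jar service-a.jar --spring.profiles.active=prod in production, and with dev everywhere else.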
You can define the active Spring Boot profile as:
spring.profiles.active=prod
You should also have profile-specific .properties files in resources:
in application-dev.properties you should have api.b.url={api_b_url_on_dev_environment}
in application-prod.properties you should have api.b.url={api_b_url_on_prod_environment}
Or, if you don't want to recompile your application after changing properties, you may use external .properties files.
In order for them to be included during the app's deployment, do the following:
in some config directory, add application-dev.properties and application-prod.properties
deploy your app with the following properties: --spring.profiles.active=dev and --spring.config.additional-location=config/application.properties
This way the external profile-specific properties are included at deployment time, and they take precedence over the .properties files packaged inside the jar.
I have a simple Maven module (not a Spring Boot application) in which I have placed my application.properties file.
I have 6-7 Spring Boot applications and I don't want to have an application.properties file in each and every application directory. I'd prefer it to be in one single place (an external Maven module).
I am adding the maven module as a dependency in each of those Spring Boot application poms.
But when I run those applications, they are not able to auto-detect the application.properties file, because it comes from a dependency jar and is not physically present in each of their application directories.
Is there any way to make this possible? I would like to avoid having properties files in 6-7 different locations, because that becomes tough to manage and handle.
Thank you in advance!
Consider using Spring Cloud Config, which provides server- and client-side support for externalized configuration in a distributed system. It requires a small initial effort, but it is very useful in the long term. The config server manages configuration files (.properties or .yml), and you can still use a different config per profile (e.g. application-test.properties, application-prod.properties etc.). Your application has a higher priority, so you can always override properties coming from the config server if needed. Another useful feature is that the config server can use a Git repository, so you can easily version all your configuration files. It also supports encryption - any sensitive data can be encrypted so that only your application knows how to decrypt it.
Config server
The config server is nothing more than a simple Spring Boot application that can be implemented as:
@SpringBootApplication
@EnableConfigServer
public class ConfigServer {

    public static void main(String[] args) {
        SpringApplication.run(ConfigServer.class, args);
    }
}
with a simple application.properties file included:
server.port: 8888
spring.cloud.config.server.git.uri: file://${user.home}/config-repo
and this dependency in pom.xml:
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-config-server</artifactId>
</dependency>
Config client
On the client side you add a dependency to your pom.xml (or its equivalent in build.gradle if you use Gradle):
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-config-client</artifactId>
</dependency>
and all you have to do is add the config server's URL to your application.properties (or application.yml if you use YAML instead):
spring.cloud.config.uri: http://myconfigserver.com
Config files structure
Now let's say you have set up a Git repository for your configuration files. Let's assume that your applications are named horus, venus, mercury etc. and you have 3 different profiles: dev, test and prod. You also have some configuration that is common to all applications. In this case your configuration file structure would look like this (I will use properties files here, but it applies to YAML as well):
application.properties - common config for all apps no matter what profile they use
application-dev.properties - common config for all apps running with dev profile
application-test.properties - common config for all apps running with test profile
application-prod.properties - common config for all apps running with prod profile
horus.properties - horus app config, common for all profiles
horus-dev.properties - horus app config for dev profile only
horus-test.properties - horus app config for test profile only
horus-prod.properties - horus app config for prod profile only
etc.
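Once it is running, the config server serves these files over HTTP; the endpoint patterns below are the documented ones (the host name is just an example):

/{application}/{profile}[/{label}]     e.g. http://myconfigserver.com/horus/dev
/{application}-{profile}.properties    e.g. http://myconfigserver.com/horus-dev.properties
/{application}-{profile}.yml           e.g. http://myconfigserver.com/horus-dev.yml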
There are some additional options that can be set (like encryption and the connection strategy - fail fast or ignore), and everything is well described in the official documentation: https://cloud.spring.io/spring-cloud-config/. I hope this helps you make a good choice for managing your configuration in a distributed application environment. The config server is a solution that was invented to solve exactly this problem.
While Szymon Stepniak's answer is certainly the "by-the-book" Spring Boot answer, I understand your situation, and I have even tried to do what you are trying to do myself. Indeed, you can't define application.properties in other "external" modules.
Here is how I've solved it:
Create a configuration class in the "common" module.
Create a property file in src/main/resources. It shouldn't be named application.properties; it's better to give it a unique name (at least this is how I've done it, so let's assume the file is called application-common.properties).
Use the @PropertySources annotation to point to that property file and load it with the configuration class.
Here is an example:
package com.myapp.common;

@Configuration
@PropertySources({
        @PropertySource("classpath:application-common.properties")
})
public class MyConfiguration {
    // you don't really have to define beans
}
Now, if you want this configuration to be loaded automatically just because the dependency is declared in your Spring Boot module's build, I've found it best to utilize Spring factories:
Create the file src/main/resources/META-INF/spring.factories
Place the following into this file:
org.springframework.boot.autoconfigure.EnableAutoConfiguration=\
com.myapp.common.MyConfiguration
This may be an impossible task, but here goes...
Is it possible to register a Spring bean by (ONLY) adding a jar to the classpath of a Spring Boot application?
Scenario: I would like to create a non-intrusive plugin jar which, when added to a Spring Boot project's classpath, will automatically be picked up and provide a service (e.g. via a RestController).
Constraints
I don't want to change or reconfigure the existing spring-boot application (i.e. no additional scan paths or bean config).
I don't have any knowledge of the target spring-boot application's package structure/scan paths.
I guess I was hoping that by default Spring scans its own package structure (i.e. org.springframework.**, looking for the presence of database libs, etc.) and I could piggy-back off that - I haven't had any luck (so far).
I've setup an example project in github, to further clarify/illustrate my example and attempts.
Solution Addendum
The bit that got it working was to add the following file, which points to an @Configuration config file...
plugin-poc\src\main\resources\META-INF\spring.factories
org.springframework.boot.autoconfigure.EnableAutoConfiguration=org.thirdpartyplugin.PluginConfiguration
I think in such cases you would add a Spring auto-configuration class that is annotated with @ConditionalOnClass, so that it is only evaluated if the given class is on the classpath. This class can register the bean and would only be processed if the condition evaluates to true.
Here is the relevant part of the Spring Boot documentation: Creating your own auto-configuration
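A minimal sketch of that idea (the marker class name and the PluginService type are invented for illustration; the real PluginConfiguration from the question would look similar):

package org.thirdpartyplugin;

import org.springframework.boot.autoconfigure.condition.ConditionalOnClass;
import org.springframework.boot.autoconfigure.condition.ConditionalOnMissingBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@ConditionalOnClass(name = "com.example.SomeLibraryClass") // hypothetical class whose presence triggers the plugin
public class PluginAutoConfiguration {

    @Bean
    @ConditionalOnMissingBean // back off if the host application defines its own bean
    public PluginService pluginService() {
        return new PluginService();
    }

    // Trivial placeholder for whatever service the plugin jar actually provides
    public static class PluginService {
    }
}

Like the solution addendum above, this class still has to be listed under org.springframework.boot.autoconfigure.EnableAutoConfiguration in META-INF/spring.factories so Spring Boot finds it without any component scanning of the host application.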