I'm trying to run my code on the local App Engine simulator through the command mvn appengine:run. There is no error; it just can't find any of the RestControllers (e.g.: No file found for: /setiaalam/amenities).
Also, the Spring Boot logo is not displayed at startup, so I suspect I need to specify the servlet initializer for it? It runs fine in my own Apache Tomcat Eclipse environment, but only when I 'run' the main class directly.
To be more specific, I'm not creating any custom servlet; I just want to migrate the app to Google Cloud App Engine Standard. Although there is no error, there is no Spring Boot startup logo at all. Trying to access any of the GET APIs that work locally using Postman always returns 404. There is no issue when accessing them from the previous Apache Tomcat localhost.
Yes, I'm following the GitHub guideline here:
Link to Github for Spring boot Google Appengine Standard
It didn't mention anything on modifying the web.xml.
Am I missing something here?
Code (The main app):
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.data.jpa.repository.config.EnableJpaAuditing;

@SpringBootApplication
@EnableJpaAuditing
public class SetiaAlamApplication {

    public static void main(String[] args) {
        SpringApplication.run(SetiaAlamApplication.class, args);
    }
}
Code (1 of the controllers):
@RestController
@RequestMapping("/setiaalam")
public class AmenityController {

    @Autowired
    AmenityDAO amenityDAO;

    // Service to get all amenities
    @GetMapping("/amenities")
    public List<Amenity> getAllAmenities() {
        return amenityDAO.findAll();
    }
}
Code (The needed SpringBootServletInitializer):
public class ServletInitializer extends SpringBootServletInitializer {

    @Override
    protected SpringApplicationBuilder configure(SpringApplicationBuilder application) {
        return application.sources(SetiaAlamApplication.class);
    }
}
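For completeness: App Engine Standard does not require a web.xml for this setup, but it does require an appengine-web.xml in the webapp directory. A minimal sketch, assuming the Java 8 standard runtime that the guide targets:

```xml
<!-- src/main/webapp/WEB-INF/appengine-web.xml -->
<appengine-web-app xmlns="http://appengine.google.com/ns/1.0">
  <runtime>java8</runtime>
  <threadsafe>true</threadsafe>
</appengine-web-app>
```

If this file is missing, the local devserver may start without errors yet serve nothing, which matches the 404 symptom described above.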
The application.properties:
# Spring DATASOURCE (DataSourceAutoConfiguration & DataSourceProperties)
spring.datasource.url = jdbc:mysql://ipaddress:3306/setiaalam?useSSL=false&autoReconnect=true&failOverReadOnly=false&maxReconnects=10
spring.datasource.username = hidden
spring.datasource.password = hidden

# Hibernate Properties
# The SQL dialect makes Hibernate generate better SQL for the chosen database
spring.jpa.properties.hibernate.dialect = org.hibernate.dialect.MySQL5Dialect

# Hibernate ddl auto (create, create-drop, validate, update)
spring.jpa.hibernate.ddl-auto = update
OK, issue fixed. How did I fix it?
Step 1
Download the latest Eclipse.
Step 2
This is followed by adding the Spring Tools feature from the Eclipse Marketplace. Please take note: if you can't find the Eclipse Marketplace, you need to add that first. A simple Google search will guide you on how to do it.
Step 3
Then follow up by adding Google Cloud Tools (from the Eclipse Marketplace).
Step 4
Once added, create a new Spring project (through Eclipse this time; previously I didn't, but used spring.io instead).
Step 5
Based on the generated single main application, change the needed configuration in the pom.xml according to the Google guide for Google App Engine Standard.
Step 6
Once done, create a simple controller that returns just a single String. Deploy locally (on Jetty, of course); once it is working, move all the classes over to the new project one by one.
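The single-String smoke-test controller from this step can be sketched like this (class and path names are illustrative, not from the original project):

```java
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

// Hypothetical smoke-test controller: if GET /ping returns the text below
// on the local Jetty devserver, the servlet wiring is correct and the
// real controllers can be migrated over one by one.
@RestController
public class PingController {

    @GetMapping("/ping")
    public String ping() {
        return "App Engine OK";
    }
}
```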
Important Notes
Of course, if your project, like mine, requires extra dependencies such as the Google API client or Hibernate, you need to add them manually to the pom.xml.
Also, you will likely encounter the Eclipse issue 'Unable to change Dynamic Web Module facet'. I followed the simple guide of editing the project file org.eclipse.wst.common.project.facet.core.xml accordingly, setting jst.web to version 3.0.
Thanks
Related
When using Testcontainers to start a Vault container, the port exposed by the container is randomly selected at startup.
@Container
static VaultContainer<?> vaultContainer = new VaultContainer<>("vault:1.7.2")
        .withVaultToken(TOKEN)
        .withInitCommand("secrets enable --path foo kv-v2")
        .withInitCommand("kv put foo/app bar=foo");
Using a @DynamicPropertySource to override properties
@DynamicPropertySource
static void addProperties(DynamicPropertyRegistry registry) {
    registry.add("spring.cloud.vault.host", () -> vaultContainer.getHost());
    registry.add("spring.cloud.vault.port", () -> vaultContainer.getFirstMappedPort());
    registry.add("spring.cloud.vault.uri", () -> "http://" + vaultContainer.getHost() + ":" + vaultContainer.getFirstMappedPort());
    registry.add("spring.cloud.vault.token", () -> TOKEN);
}
does not work since Spring Cloud Vault does not seem to "see" the added properties.
The issue is present in Spring Boot 2.5.1 and Spring Cloud Vault Config 3.0.3.
A small project showing the issue can be found on GitHub.
Am I doing something wrong or is there an alternative way to override the configuration?
When using Spring Vault with a @VaultPropertySource instead of Spring Cloud Vault, things work as expected.
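For reference, the working Spring Vault variant mentioned above can be sketched like this (the secret path and class name are illustrative, and a VaultTemplate/endpoint configuration is assumed to exist elsewhere in the context):

```java
import org.springframework.context.annotation.Configuration;
import org.springframework.vault.annotation.VaultPropertySource;

// Hypothetical configuration: reads the key/value pairs stored under the
// given Vault path and exposes them as ordinary Spring properties, so they
// are resolvable via @Value at normal context-refresh time rather than
// during the early bootstrap phase that Spring Cloud Vault uses.
@Configuration
@VaultPropertySource("secret/application")
public class VaultConfig {
}
```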
According to:
https://github.com/spring-cloud/spring-cloud-vault/issues/602#event-4926845049
it's a spring-framework issue.
I'm trying to run a Spring Boot app on a Tomcat server that includes a datasource able to communicate with a vault and change DB credentials at runtime. The only change I've made in this code is adding some properties that are necessary to communicate with the vault, and changing the datasource configuration to include these vault changes.
I get the following error during startup:
Description:
Cannot determine embedded database driver class for database type NONE
However, in my application.properties file I DO have the driver class specified...
spring.datasource.hikari.driverClassName=com.ibm.db2.jcc.DB2Driver
and in the pom file I have the correct dependency, so the driver is in fact included in the classpath... I even see the jar in IntelliJ's 'External Libraries' drop-down.
Again, I've not made many changes besides adding additional properties for our vault and changing the code inside our datasource config to use the vault.
I've compared my changes against another module in which I did the exact same thing, and didn't have this issue at all there.
Does anyone have any ideas as to why this is happening, or suggestions on what I can try?
I've tried including an @Import annotation on my @Configuration class, which points to the vault configuration. I've tried adding a @ComponentScan on my application class to try and really get it to look at the config and properties properly.
If any further detail is required please just let me know. Thanks in advance for any and all help that can be offered.
I faced the same problem and resolved it with the annotation below:
@SpringBootApplication(exclude = {DataSourceAutoConfiguration.class, HibernateJpaAutoConfiguration.class})
Please provide a code snippet so I can debug it.
I am working on an application deployed to Cloud Foundry. Internally it has 3 custom dependencies developed by our team.
All 3 dependencies are boot projects and have their own @Configuration.
Dependency 1 interacts with Couchbase. Its source is a boot project.
Dependency 2 interacts with FluentD for logging. Its source is a boot project.
Dependency 3 interacts with an external REST service. Its source is a boot project.
Dependency 4 contains all 3 dependencies above and also a few utils classes and constants.
I use dependency 4 in multiple web applications which have a WebMVC implementation.
Everything works fine on my local machine. But when I push this web application to the cloud, sometimes the libraries get executed before the web application, which crashes my app intermittently. The good thing is the app recovers in a few seconds.
I made the changes below in my libraries (jars/dependencies) and tried again on the cloud. After these changes the crash ratio dropped, but unfortunately it still crashes sometimes and I can see the dependencies being executed before the application.
Added bootRepackage.enabled = false and bootRepackage.withJarTask = jar in the library's build.gradle
Removed the following from the library and added it to my web application:
springBoot {
    mainClass = "com.java.Application"
    executable = true
}
Removed @SpringBootApplication from the libraries (dependencies/jars); it's now only in my web application.
I do not know whether these are the only steps needed to make a boot dependency non-executable, or whether I have to do something else. Please let me know if I am missing something.
Here is a sample of the application class of one of my dependencies.
import org.springframework.context.annotation.ComponentScan;

@ComponentScan
public class LoggingApplication {
}
Sample of the web application's main class.
@SpringBootApplication
@EnableWebMvc
@Import(LoggingApplication.class)
public class Application {

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}
You didn't show a stacktrace or provide any specifics of the "crash". So, looking into my crystal ball, this sounds like you are doing some work during the wiring phase of the Spring IoC container. This work should be moved into @PostConstruct handlers, so that you can be sure it is executed after the Spring context is fully created and initialized.
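A minimal sketch of that suggestion, with illustrative names (the class stands in for whatever work one of the library beans does at startup):

```java
import javax.annotation.PostConstruct;
import org.springframework.stereotype.Component;

// Hypothetical library bean. The constructor stays cheap; anything that
// touches other beans, external services, or injected configuration is
// deferred to the @PostConstruct callback, which Spring invokes only after
// this bean's dependencies have been injected.
@Component
public class FluentDLogger {

    public FluentDLogger() {
        // Field assignments only; no external calls here.
    }

    @PostConstruct
    public void connect() {
        // Safe point to open connections or read injected properties:
        // dependency injection for this bean has completed.
    }
}
```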
I have an AWS lambda RequestHandler class which is invoked directly by AWS. Eventually I need to get it working with Spring Boot because I need it to be able to retrieve data from Spring Cloud configuration server.
The problem is that the code works if I run it locally from my own dev environment but fails to inject config values when deployed on AWS.
@Configuration
@EnableAutoConfiguration
@ComponentScan("my.package")
public class MyClass implements com.amazonaws.services.lambda.runtime.RequestHandler<I, O> {

    public O handleRequest(I input, Context context) {
        ApplicationContext applicationContext = new SpringApplicationBuilder()
                .main(getClass())
                .showBanner(false)
                .web(false)
                .sources(getClass())
                .addCommandLineProperties(false)
                .build()
                .run();

        log.info(applicationContext.getBean(SomeConfigClass.class).foo);
        // prints cloud-injected value when running from local dev env
        //
        // prints "${path.to.value}" literal when running from AWS
        // even though Spring Boot starts successfully without errors
    }
}
@Configuration
public class SomeConfigClass {

    @Value("${path.to.value}")
    public String foo;
}
src/main/resources/bootstrap.yml:
spring:
  application:
    name: my_service
  cloud:
    config:
      uri: http://my.server
      failFast: true
      profile: localdev
What have I tried:
using regular Spring MVC, but this doesn't have integration with @Value injection/Spring Cloud.
using @PropertySource - but found out it doesn't support .yml files
verified to ensure the config server is serving requests to any IP address (there's no IP address filtering)
running curl to ensure the value is brought back
verified to ensure that .jar actually contains bootstrap.yml at jar root
verified to ensure that .jar actually contains Spring Boot classes. FWIW I'm using Maven shade plugin which packages the project into a fat .jar with all dependencies.
Note: AWS Lambda does not support environment variables, and therefore I cannot set anything like spring.application.name (neither as an environment variable nor as a -D parameter). Nor can I control the underlying classes which actually launch MyClass - this is completely transparent to the end user. I just package the jar and provide the entry point (class name); the rest is taken care of.
Is there anything I could have missed? Any way I could debug this better?
After a bit of debugging I have determined that the issue is with using the Maven Shade plugin. Spring Boot looks in its autoconfigure jar for a META-INF/spring.factories file; see here for some information on this. In order to package a Spring Boot jar correctly you need to use the Spring Boot Maven Plugin and set it up to run during the Maven repackage phase. The reason it works in your local IDE is that you are not running the Shade-packaged jar. They do some special magic in their plugin to get things in the right spot that the Shade plugin is unaware of.
I was able to create some sample code that initially was not injecting values but works now that I used the correct plugin. See this GitHub repo to check out what I have done.
I did not connect it with Spring Cloud but now that the rest of the Spring Boot injection is working I think it should be straightforward.
As I mentioned in the comments you may want to consider just a simple REST call to get the cloud configuration and inject it yourself to save on the overhead of loading a Spring application with every request.
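For reference, the repackage setup described above is the standard spring-boot-maven-plugin configuration in the pom.xml (the version is inherited when using the Spring Boot parent POM):

```xml
<plugin>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-maven-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <!-- Rewrites the jar after "package" so that dependencies and
             META-INF/spring.factories end up where Spring Boot expects. -->
        <goal>repackage</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```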
UPDATE: For Spring Boot 1.4.x you must provide this configuration in the Spring Boot plugin:
<configuration>
<layout>MODULE</layout>
</configuration>
If you do not, then by default the new behavior of the plugin is to put all of your jars under BOOT-INF, as the intent is for the jar to be executable and have the bootstrap process load it. I found this out while adding a warning for the situation encountered here. See https://github.com/spring-projects/spring-boot/issues/5465 for reference.
Is there any function to return the current working environment of a Play Framework application?
I tried the following but it doesn't seem to work correctly:
String environment = play.api.Play.Mode
NOTE: I don't want to use the isDev()/isProd() stuff; I want to be able to create custom environments.
Play Framework 2.x supports only 3 modes: Prod, Dev and Test. The first is used for production. The second provides development additions like hot-reloading of just-edited classes. The last one is like the second, but with test libraries.
Play 1.x also had an ID, which could be used to define a different environment, for instance staging or an instance of a distributed server.
Play 2.x sadly doesn't support IDs anymore, but you can achieve the same effect manually.
Suppose you want to run your application in 'staging' mode.
First you need to put a configuration file alongside the basic configuration file, but named application.staging.conf.
The second step is to add to Global.scala the code responsible for managing configuration files, something like this:
import java.io.File
import play.api._
import com.typesafe.config.ConfigFactory

object Global extends GlobalSettings {
  override def onLoadConfig(config: Configuration, path: File, classloader: ClassLoader, mode: Mode.Mode): Configuration = {
    val env = System.getProperty("environment")
    val envConfig = config ++ Configuration(ConfigFactory.load(s"application.${env}.conf"))
    super.onLoadConfig(envConfig, path, classloader, mode)
  }
}
As you can see, it reads the environment value and looks for the corresponding configuration file.
The last step is telling Play which environment it should use. The best way is via the start command:
activator run -Denvironment=staging
That should work.
In Java it is play.Application.isDev(), isProd() or isTest().
https://playframework.com/documentation/2.2.x/api/java/index.html