Hello, I am new to Maven in general, so I'd like to apologise in advance if I get things wrong.
I have a Maven project with Spring in which I also access a MySQL database. Often the MySQL database is not accessible, so I use a local database for testing, and I swap between the two by changing the active profile.
I set up the profiles by putting this in my pom.xml:
<profiles>
<profile>
<id>Local</id>
<dependencies>
<dependency>
<groupId>org.hsqldb</groupId>
<artifactId>hsqldb</artifactId>
<version>2.3.3</version>
<classifier>jdk5</classifier>
</dependency>
</dependencies>
<properties>
<jdbc.url>jdbc:hsqldb:file:databaseName</jdbc.url>
<jdbc.username>a</jdbc.username>
<jdbc.password></jdbc.password>
<jdbc.driver>org.hsqldb.jdbcDriver</jdbc.driver>
</properties>
</profile>
<profile>
<id>MySQL</id>
<dependencies>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>5.1.38</version>
</dependency>
</dependencies>
<properties>
<jdbc.url>jdbc:mysql://mysql.website.ac.uk:3306</jdbc.url>
<jdbc.username>user</jdbc.username>
<jdbc.password>1234</jdbc.password>
<jdbc.driver>com.mysql.jdbc.Driver</jdbc.driver>
</properties>
</profile>
</profiles>
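For reference, I activate one profile or the other on the command line with Maven's -P flag, something like:
mvn clean install -PLocal
mvn clean install -PMySQL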
When I try to access the property values through, for example, ${jdbc.url}, they are not replaced with the actual values defined in the profile, and I get errors. I suspect it might have something to do with Spring Boot.
Apart from the specific Spring Boot problem, I think you are missing the Maven resource filtering configuration.
Maven properties are not automatically propagated to Spring unless you put them into a separate properties file, which must then be managed by Spring (usually via a PropertyPlaceholderConfigurer).
But without Maven filtering turned on, those properties are not resolved. If you do not have many items under your src/main/resources folder, you can just enable filtering globally with:
<build>
<resources>
<resource>
<directory>src/main/resources</directory>
<filtering>true</filtering>
</resource>
</resources>
</build>
Then the placeholders in your properties file will be replaced with the Maven values.
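For example, a file such as src/main/resources/database.properties (the file name is just an illustration) would end up with the values from the active profile after filtering:
jdbc.url=${jdbc.url}
jdbc.username=${jdbc.username}
jdbc.password=${jdbc.password}
jdbc.driver=${jdbc.driver}
Spring can then load the filtered file via the PropertyPlaceholderConfigurer mentioned above.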
If, on the other hand, you have several resources in src/main/resources, you should check the documentation more closely to avoid corrupting binary files.
I think that Spring Boot simply wraps up this whole process automatically in your application properties files.
You specified the properties in a Maven profile, so there is a workaround. Annotations such as @Value("${name}") refer to externalized configuration, e.g. from application.yml. You should use this file to load the properties you want; #property_name# loads the property from your Maven profile.
jdbc:
  url: #jdbc.url#
  username: #jdbc.username#
and so on...
Then these properties will be accessible in your code.
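For example, a field in any Spring bean can then pick them up (a minimal sketch; the field name is arbitrary):
@Value("${jdbc.url}")
private String jdbcUrl;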
Maven properties are not put into environment variables or JVM system properties that you can access from Spring.
However, you can usually instruct a Maven plugin to do it for you.
For example, here is the configuration of the maven-failsafe-plugin I use to set up a test DB:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-failsafe-plugin</artifactId>
<configuration>
<includes>
<include>**/*Integration.java</include>
</includes>
<environmentVariables>
<mysql.version>${mysql.version}</mysql.version>
<test.mysql.username>${test.mysql.username}</test.mysql.username>
<test.mysql.password>${test.mysql.password}</test.mysql.password>
<test.mysql.port>${test.mysql.port}</test.mysql.port>
<test.mysql.db>${test.mysql.db}</test.mysql.db>
</environmentVariables>
</configuration>
<executions>
<execution>
<goals>
<goal>integration-test</goal>
<goal>verify</goal>
</goals>
</execution>
</executions>
</plugin>
How do/should Maven plugins behave with regard to the order in which they process configuration options? I would expect properties passed via the CLI to overwrite those defined in a <configuration> block in the pom.xml.
Here's an example.
pom.xml
<plugin>
<groupId>group</groupId>
<artifactId>artifact</artifactId>
<version>1.2.3</version>
<executions>
...
</executions>
<configuration>
<url>foo.com</url>
</configuration>
</plugin>
CLI
mvn group:artifact:1.2.3:doit -Dmymojo.url=bar.com
I am currently debugging a plugin (not mine) that gives precedence to the url value defined in the POM rather than the one passed on the CLI. Is that how mojos are supposed to behave, i.e. is it a Maven feature rather than a plugin bug? I didn't find anything about it in the reference guide.
As per https://issues.apache.org/jira/browse/MNG-4979 this works as designed. I find it counter-intuitive and don't find the reasons given in MNG-4979 convincing.
If your setup allows you to modify the pom.xml, you can work around this behavior as suggested by JF Meier (and by the issue above):
<properties>
<mymojo.url>foo.bar</mymojo.url>
</properties>
<plugin>
<groupId>group</groupId>
<artifactId>artifact</artifactId>
<version>1.2.3</version>
<executions>
...
</executions>
<configuration>
<url>${mymojo.url}</url>
</configuration>
</plugin>
Through the command line, you set a property named url. This would override an entry <url>foo.com</url> in the <properties> section of the POM, but not a value hard-coded in the plugin's <configuration>.
Many plugins allow you to set configuration entries through properties, but these properties do not automatically have the same name as the configuration entry. In the documentation, this is usually called the user property. For examples, look e.g. at
https://maven.apache.org/plugins/maven-dependency-plugin/get-mojo.html
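For instance, the get goal documented there declares an artifact parameter whose user property is artifact, so it can be set from the command line roughly like this (the coordinates are just an illustration):
mvn dependency:get -Dartifact=junit:junit:4.13.2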
I am learning Liquibase and Spring Boot so I've created a simple project with Spring Initializr.
In the pom.xml file I've added:
<plugin>
<groupId>org.liquibase</groupId>
<artifactId>liquibase-maven-plugin</artifactId>
<version>3.4.1</version>
<configuration>
<propertyFile>src/main/resources/application.properties</propertyFile>
</configuration>
</plugin>
I've specified application.properties as the property file so that all the configuration of my application can live in a single file.
When I run any liquibase-maven-plugin goal from IntelliJ I get various errors. Here's an example from running the changelogSync goal:
[ERROR] Failed to execute goal org.liquibase:liquibase-maven-plugin:3.4.1:changelogSync (default-cli) on project simpleTest: The changeLogFile must be specified
If I add the right keys to application.properties I am able to make it work.
For example, I've found that liquibase-maven-plugin will not read the spring.datasource.url property; it will only read the url property.
For this reason my application.properties has to look something like this:
environment = JUnit
spring.datasource.url = jdbc:h2:file:./target/test
spring.datasource.driver-class-name = org.h2.Driver
spring.datasource.username = sa
spring.datasource.password = sa
spring.liquibase.change-log = classpath:/db/changelog/db.changelog-master.yaml
spring.h2.console.enabled = true
spring.h2.console.path = /h2-console
# Keys needed for liquibase maven plugin
url = jdbc:h2:file:./target/test
username = sa
password = sa
If I follow this pattern I'll end up with several keys that have slightly different names but the same values in my application.properties, which is clearly ugly and inefficient.
What is an efficient and maintainable way to configure and use the Liquibase Maven plugin in Spring Boot?
Edit after the answer received from Amith Kumar:
environment=JUnit
spring.datasource.url=jdbc:h2:file:./target/glossary-test
spring.datasource.driver-class-name=org.h2.Driver
spring.datasource.username=sa
spring.datasource.password=sa
spring.liquibase.change-log=classpath:/db/changelog/db.changelog-master.yaml
spring.h2.console.enabled=true
spring.h2.console.path=/h2-console
url=${spring.datasource.url}
changeLogFile=${spring.liquibase.change-log}
username=${spring.datasource.username}
password=${spring.datasource.password}
Error after the edit:
[ERROR] Failed to execute goal org.liquibase:liquibase-maven-plugin:3.4.1:dropAll (default-cli) on project test: Error setting up or running Liquibase: liquibase.exception.DatabaseException: java.lang.RuntimeException: Cannot find database driver: Driver class was not specified and could not be determined from the url (${spring.datasource.url}) -> [Help 1]
The Liquibase Maven plugin supports configuration injection through the pom.xml.
So you can use the properties-maven-plugin to read your properties from application.properties (or the yaml-properties-maven-plugin if you are using application.yml) and then inject them into the Liquibase configuration.
Example:
<plugin>
<groupId>it.ozimov</groupId>
<artifactId>yaml-properties-maven-plugin</artifactId>
<version>1.1.3</version>
<executions>
<execution>
<phase>initialize</phase>
<goals>
<goal>read-project-properties</goal>
</goals>
<configuration>
<files>
<file>src/main/resources/application.yml</file>
</files>
</configuration>
</execution>
</executions>
</plugin>
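If you are on application.properties rather than YAML, the equivalent setup with the org.codehaus.mojo properties-maven-plugin looks roughly like this (the version shown is an assumption; check for the latest):
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>properties-maven-plugin</artifactId>
    <version>1.1.0</version>
    <executions>
        <execution>
            <phase>initialize</phase>
            <goals>
                <goal>read-project-properties</goal>
            </goals>
            <configuration>
                <files>
                    <file>src/main/resources/application.properties</file>
                </files>
            </configuration>
        </execution>
    </executions>
</plugin>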
Now you can inject these properties into the Liquibase configuration:
<plugin>
<groupId>org.liquibase</groupId>
<artifactId>liquibase-maven-plugin</artifactId>
<version>3.8.1</version>
<configuration>
<changeLogFile>src/main/resources/db/changelog/db.changelog-master.yaml</changeLogFile>
<driver>${spring.datasource.driverClassName}</driver>
<url>${spring.datasource.url}</url>
<username>${spring.datasource.username}</username>
<password>${spring.datasource.password}</password>
<promptOnNonLocalDatabase>false</promptOnNonLocalDatabase>
<databaseChangeLogTableName>DATABASECHANGELOG</databaseChangeLogTableName>
<databaseChangeLogLockTableName>DATABASECHANGELOGLOCK</databaseChangeLogLockTableName>
</configuration>
<dependencies>
<dependency>
<groupId>javax.xml.bind</groupId>
<artifactId>jaxb-api</artifactId>
<version>2.3.0</version>
</dependency>
</dependencies>
</plugin>
I also needed to set the logicalFilePath to ensure that the changelog path inferred by the Spring Boot integration and by the Maven plugin were the same.
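For illustration, with an XML changelog that is an attribute on the root element; the same attribute can also be set per changeSet, and a YAML changelog has an equivalent key (the path value below is an assumption):
<databaseChangeLog
    xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
        http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.8.xsd"
    logicalFilePath="db/changelog/db.changelog-master.yaml">
    <!-- changeSets go here -->
</databaseChangeLog>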
application.properties settings are very fast for getting an application up and running, but they are not the best solution in terms of flexibility.
My advice is to configure a datasource using @Configuration (example here),
and then configure Liquibase by passing it the datasource defined above, as follows:
@Configuration
public class LiquibaseConfigurer {

    @Autowired
    @Qualifier("primaryDataSource")
    private DataSource oltpDataSource;

    @Bean
    @DependsOn
    public SpringLiquibase liquibase() {
        SpringLiquibase liquibase = new SpringLiquibase();
        liquibase.setChangeLog("classpath:liquibase/liquibase-changelog.xml");
        liquibase.setDataSource(oltpDataSource);
        return liquibase;
    }
}
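For completeness, a minimal sketch of the datasource configuration referred to above (the bean name primaryDataSource matches the qualifier; the jdbc.* property names are assumptions):
import javax.sql.DataSource;

import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.DriverManagerDataSource;

@Configuration
public class DataSourceConfigurer {

    // Plain Spring JDBC datasource; swap in a pooled implementation for production use.
    @Bean(name = "primaryDataSource")
    public DataSource primaryDataSource(
            @Value("${jdbc.url}") String url,
            @Value("${jdbc.username}") String username,
            @Value("${jdbc.password}") String password,
            @Value("${jdbc.driver}") String driverClassName) {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(driverClassName);
        dataSource.setUrl(url);
        dataSource.setUsername(username);
        dataSource.setPassword(password);
        return dataSource;
    }
}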
In this case you just need the liquibase-core dependency, as follows:
<dependency>
<groupId>org.liquibase</groupId>
<artifactId>liquibase-core</artifactId>
</dependency>
A simpler alternative is to configure Liquibase outside the application, with no Maven plugin at all.
Download the library, or install it with a package manager, and launch the command line with all the settings:
liquibase --driver=org.h2.Driver \
--classpath=/path/to/h2/driver.jar \
--changeLogFile=/db/changelog/db.changelog-master.yaml \
--url="jdbc:h2:file:./target/glossary-test" \
--username=sa \
--password=sa \
--logLevel=debug \
migrate
Anyway, the problem you have now is because you've written this:
url=${spring.datasource.url}
I don't know where you found this syntax, but the plugin does not resolve it; replace it with the actual connection URL:
url=jdbc:h2:file:./target/test
and do the same for the other settings.
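For example, the plugin-facing keys would carry the literal values (a sketch; the driver and changelog path are taken from the values given earlier in the question):
url=jdbc:h2:file:./target/test
username=sa
password=sa
driver=org.h2.Driver
changeLogFile=src/main/resources/db/changelog/db.changelog-master.yaml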
This is a very common occurrence in many projects.
When you use multiple plugins/libraries, each one expects certain properties from the environment config, with key names defined in its own nomenclature.
There is no standardization for this.
To avoid supplying the same values to multiple properties, which is error prone, you are recommended to use references:
# Keys needed for liquibase maven plugin
url=${spring.datasource.url}
UPDATE
I noticed you are encountering this exception when running the Liquibase Maven plugin, which of course runs outside of the Spring context. The solution I provided earlier works within the Spring context, that is, once the application has spun up.
For this scenario, use Maven's resource filtering feature. Your command then changes to
mvn resources:resources liquibase:generateChangeLog
so that the filtered properties are in place before the Liquibase goal runs. Your setup will look like this:
src/main/filters/filter.properties
db.url=jdbc:h2:file:./target/glossary-test
db.username=sa
db.password=sa
db.driver=org.h2.Driver
db.lb.changeLogFile=classpath:/db/changelog/db.changelog-master.yaml
application.properties
spring.datasource.url=#db.url#
spring.datasource.username=#db.username#
spring.datasource.password=#db.password#
spring.datasource.driver-class-name=#db.driver#
url=#db.url#
username=#db.username#
password=#db.password#
driver=#db.driver#
changeLogFile=#db.lb.changeLogFile#
pom.xml
<build>
......
<plugins>
......
<plugin>
<groupId>org.liquibase</groupId>
<artifactId>liquibase-maven-plugin</artifactId>
<version>3.6.3</version>
<configuration>
<propertyFile>target/classes/application.properties</propertyFile>
</configuration>
</plugin>
</plugins>
<filters>
<filter>src/main/filters/filter.properties</filter>
</filters>
<resources>
<resource>
<directory>src/main/resources</directory>
<filtering>true</filtering>
</resource>
</resources>
</build>
Please refer to my GitHub project for a complete working solution. Look at the filter.properties file, where the common properties are defined, and then at how they are referenced in the application.properties file.
NOTE: Since this is a Spring project, you can't use ${propertyName} as the Maven filter placeholder, because that syntax is reserved as Spring's own property placeholder; use #propertyName# instead. For a non-Spring project, ${propertyName} works out of the box.
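If the # delimiter is not already configured in your build, the maven-resources-plugin can be told to use it along these lines (a sketch; the version is omitted on the assumption that your parent POM manages it):
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-resources-plugin</artifactId>
    <configuration>
        <delimiters>
            <delimiter>#</delimiter>
        </delimiters>
        <useDefaultDelimiters>false</useDefaultDelimiters>
    </configuration>
</plugin>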
Is there a specific recommended approach to including the spring-boot parent POM in projects that already have a required parent POM?
What do you recommend for projects that need to extend an organizational parent? This is extremely common, and applies even to many of the projects published to Maven Central, depending on the feeder repos they come from. Most of the build configuration is related to creating executable JARs (e.g. running embedded Tomcat/Jetty). There are ways to structure things so that you get all the dependencies without extending from a parent (similar to composition vs. inheritance), but you can't get the build configuration that way.
So is it preferable to include all of the spring-boot parent POM inside the required parent POM, or to simply have a POM dependency within the project POM file?
Are there other options?
TIA,
Scott
You can use the spring-boot-starter-parent like a "bom" (cf. Spring and Jersey and other projects that support this feature now) and include it only in the dependency management section with scope=import. That way you get a lot of the benefits of using it (i.e. dependency management) without replacing the settings in your actual parent.
The two main other things it does are to:
define a load of properties for quickly setting the versions of dependencies that you want to override, and
configure some plugins with default configuration (principally the Spring Boot Maven plugin). Those are the things you will have to do manually if you use your own parent.
Example provided in the Spring Boot documentation:
<dependencyManagement>
<dependencies>
<dependency>
<!-- Import dependency management from Spring Boot -->
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-dependencies</artifactId>
<version>2.1.3.RELEASE</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
Update 2022-05-29, using 1.5.9.RELEASE.
I have the full code and a runnable example here: https://github.com/surasint/surasint-examples/tree/master/spring-boot-jdbi/9_spring-boot-no-parent (see README.txt for how to try it).
You need this as a basis:
<dependencyManagement>
<dependencies>
<dependency>
<!-- Import dependency management from Spring Boot -->
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-dependencies</artifactId>
<version>${springframework.boot.version}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
But that is not enough; you also need to explicitly define the goal for the spring-boot-maven-plugin (if you use Spring Boot as the parent, you do not have to define this explicitly):
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<version>${springframework.boot.version}</version>
<executions>
<execution>
<goals>
<goal>repackage</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
Otherwise you cannot build an executable jar or war.
That is still not all: if you are using JSP, you need to have this:
<properties>
<failOnMissingWebXml>false</failOnMissingWebXml>
</properties>
Otherwise, you will get this error message:
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-war-plugin:2.2:war (default-war) on project spring-boot-09: Error assembling WAR: webxml attribute is required (or pre-existing WEB-INF/web.xml if executing in update mode) -> [Help 1]
And no, this is still not enough if you are using a Maven profile and resource filtering with Spring Boot with "#" instead of "${}" (like this example: https://www.surasint.com/spring-boot-maven-resource-filter/). Then you need to explicitly add this to your build section:
<resources>
<resource>
<directory>src/main/resources</directory>
<filtering>true</filtering>
</resource>
</resources>
And this to your plugins section:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-resources-plugin</artifactId>
<version>2.7</version>
<configuration>
<delimiters>
<delimiter>#</delimiter>
</delimiters>
<useDefaultDelimiters>false</useDefaultDelimiters>
</configuration>
</plugin>
See the example in the link https://www.surasint.com/spring-boot-with-no-parent-example/.
As per Surasin Tancharoen's answer, you may also want to define the Maven Surefire plugin
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>${maven-surefire-plugin.version}</version>
</plugin>
and possibly include the Failsafe plugin
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-failsafe-plugin</artifactId>
<version>${maven-failsafe-plugin.version}</version>
<executions>
<execution>
<goals>
<goal>integration-test</goal>
<goal>verify</goal>
</goals>
</execution>
</executions>
</plugin>
I am a beginner with Spring and Maven.
In the Spring Framework, I want to manage a separate version control flow:
1. The main project, which already exists.
2. A module for partial use, packaged by Maven as a war.
The two projects should be treated separately when pushing and pulling.
But files of the two projects may be present in the same folders.
How can I do this?
This is actually related to the version control system you currently use. Git, for example, supports submodules. You can create a Maven module directly in your root project folder and define it as a git submodule, as sketched below. That way the two projects have separate git histories and can be maintained separately.
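A rough sketch of setting that up (the repository URL and folder name are placeholders):
git submodule add https://example.com/your-org/partial-module.git partial-module
git commit -m "Add partial-module as a git submodule"
Anyone cloning the main project can then pull the module in with git clone --recurse-submodules.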
I believe what you need might be achievable by using 'war overlays', as documented here.
To summarize, you specify the 'child' project as a dependency in the 'main' project:
...
<dependencies>
<dependency>
<groupId>com.example.projects</groupId>
<artifactId>documentedprojectdependency</artifactId>
<version>1.0-SNAPSHOT</version>
<type>war</type>
<scope>runtime</scope>
</dependency>
...
</dependencies>
...
And you define the overlay in the maven-war-plugin's configuration:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<version>2.6</version>
<configuration>
<overlays>
<overlay>
<id>my-webapp-index.jsp</id>
<groupId>com.example.projects</groupId>
<artifactId>my-webapp</artifactId>
<includes>
<include>index.jsp</include>
</includes>
</overlay>
<overlay>
<!-- empty groupId/artifactId represents the current build -->
</overlay>
</overlays>
</configuration>
</plugin>
I have an application which I am working on which will, of course, eventually need development and production environments. The key concern is the database connection settings, which I obviously want to keep separate.
What is the correct process for separating these settings so that I don't deploy to production with the dev settings and, vice versa, don't deploy to dev with the production settings?
I was thinking of a properties file prefixed with prod or dev, but I am not sure how robust that approach would be.
Thanks
To extend the question a little further:
Based on one of the comments, I have looked into the Maven solution. As we are using OpenShift, there is already some pre-generated pom.xml code, which looks as follows:
<profiles>
<profile>
<!-- When built in OpenShift the openshift profile will be used when invoking
mvn. -->
<!-- Use this profile for any OpenShift specific customization your app
will need. -->
<!-- By default that is to put the resulting archive into the deployments
folder. -->
<!-- http://maven.apache.org/guides/mini/guide-building-for-different-environments.html -->
<id>openshift</id>
<build>
<finalName>TomcatHotOrNot</finalName>
<plugins>
<plugin>
<artifactId>maven-war-plugin</artifactId>
<version>2.2</version>
<configuration>
<outputDirectory>deployments</outputDirectory>
<warName>TomcatHotOrNot</warName>
</configuration>
</plugin>
</plugins>
</build>
</profile>
</profiles>
So, based on the answer provided in Using Maven for multiple deployment environments (production/development), how would this need to be adapted to follow the solution there?
Thanks