How can I configure Maven Liquibase plugin in Spring Boot? - java

I am learning Liquibase and Spring Boot so I've created a simple project with Spring Initializr.
In the POM.xml file I've added:
<plugin>
<groupId>org.liquibase</groupId>
<artifactId>liquibase-maven-plugin</artifactId>
<version>3.4.1</version>
<configuration>
<propertyFile>src/main/resources/application.properties</propertyFile>
</configuration>
</plugin>
I've pointed the propertyFile at application.properties so that all of my application's configuration can live in a single file.
When I run any liquibase-maven-plugin goal from IntelliJ I get various errors; here is an example from running the changelogSync goal:
[ERROR] Failed to execute goal org.liquibase:liquibase-maven-plugin:3.4.1:changelogSync (default-cli) on project simpleTest: The changeLogFile must be specified
If I add the right keys in the application.properties I am able to make it work.
For example I've found that liquibase-maven-plugin will not read the spring.datasource.url property but it will only read the url property.
For this reason my application.properties has to look something like this:
environment = JUnit
spring.datasource.url = jdbc:h2:file:./target/test
spring.datasource.driver-class-name = org.h2.Driver
spring.datasource.username = sa
spring.datasource.password = sa
spring.liquibase.change-log = classpath:/db/changelog/db.changelog-master.yaml
spring.h2.console.enabled = true
spring.h2.console.path = /h2-console
# Keys needed for liquibase maven plugin
url = jdbc:h2:file:./target/test
username = sa
password = sa
If I follow this pattern I'll end up with several keys in application.properties that have slightly different names but the same values, which is clearly ugly and hard to maintain.
What is an efficient and maintainable way to configure and use Liquibase Maven Plugin in Spring Boot?
Edit after the answer received from Amith Kumar:
environment=JUnit
spring.datasource.url=jdbc:h2:file:./target/glossary-test
spring.datasource.driver-class-name=org.h2.Driver
spring.datasource.username=sa
spring.datasource.password=sa
spring.liquibase.change-log=classpath:/db/changelog/db.changelog-master.yaml
spring.h2.console.enabled=true
spring.h2.console.path=/h2-console
url=${spring.datasource.url}
changeLogFile=${spring.liquibase.change-log}
username=${spring.datasource.username}
password=${spring.datasource.password}
Error after the edit:
[ERROR] Failed to execute goal org.liquibase:liquibase-maven-plugin:3.4.1:dropAll (default-cli) on project test: Error setting up or running Liquibase: liquibase.exception.DatabaseException: java.lang.RuntimeException: Cannot find database driver: Driver class was not specified and could not be determined from the url (${spring.datasource.url}) -> [Help 1]

The Liquibase Maven plugin supports configuration injection through the pom.xml.
So you can use the properties-maven-plugin to read your properties from application.properties (or the yaml-properties-maven-plugin if you are using application.yml) and then inject them into the Liquibase configuration.
Example:
<plugin>
<groupId>it.ozimov</groupId>
<artifactId>yaml-properties-maven-plugin</artifactId>
<version>1.1.3</version>
<executions>
<execution>
<phase>initialize</phase>
<goals>
<goal>read-project-properties</goal>
</goals>
<configuration>
<files>
<file>src/main/resources/application.yml</file>
</files>
</configuration>
</execution>
</executions>
</plugin>
Now you can inject these properties in liquibase configuration:
<plugin>
<groupId>org.liquibase</groupId>
<artifactId>liquibase-maven-plugin</artifactId>
<version>3.8.1</version>
<configuration>
<changeLogFile>src/main/resources/db/changelog/db.changelog-master.yaml</changeLogFile>
<driver>${spring.datasource.driverClassName}</driver>
<url>${spring.datasource.url}</url>
<username>${spring.datasource.username}</username>
<password>${spring.datasource.password}</password>
<promptOnNonLocalDatabase>false</promptOnNonLocalDatabase>
<databaseChangeLogTableName>DATABASECHANGELOG</databaseChangeLogTableName>
<databaseChangeLogLockTableName>DATABASECHANGELOGLOCK</databaseChangeLogLockTableName>
</configuration>
<dependencies>
<dependency>
<groupId>javax.xml.bind</groupId>
<artifactId>jaxb-api</artifactId>
<version>2.3.0</version>
</dependency>
</dependencies>
</plugin>
I also needed to set the logicalFilePath to ensure that the changelog path inferred by the Spring Boot integration and by the Maven plugin were the same.

application.properties settings are a quick way to get an application up and running, but they are not the best solution in terms of flexibility.
My advice is to configure a datasource using @Configuration (example here).
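A minimal sketch of such a configurer, assuming Spring Boot 2's DataSourceBuilder, the H2 settings from the question, and a bean name of primaryDataSource to match the @Qualifier used below:

import javax.sql.DataSource;
import org.springframework.boot.jdbc.DataSourceBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Sketch only: exposes the datasource under the name the Liquibase configurer expects.
@Configuration
public class DataSourceConfigurer {

    @Bean(name = "primaryDataSource")
    public DataSource primaryDataSource() {
        // connection settings taken from the question's application.properties
        return DataSourceBuilder.create()
                .driverClassName("org.h2.Driver")
                .url("jdbc:h2:file:./target/test")
                .username("sa")
                .password("sa")
                .build();
    }
}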
Then configure Liquibase, passing in the datasource defined above, as follows:
@Configuration
public class LiquibaseConfigurer {

    @Autowired
    @Qualifier("primaryDataSource")
    private DataSource oltpDataSource;

    @Bean
    @DependsOn
    public SpringLiquibase liquibase() {
        SpringLiquibase liquibase = new SpringLiquibase();
        liquibase.setChangeLog("classpath:liquibase/liquibase-changelog.xml");
        liquibase.setDataSource(oltpDataSource);
        return liquibase;
    }
}
In this case you just need the liquibase-core dependency, as follows:
<dependency>
<groupId>org.liquibase</groupId>
<artifactId>liquibase-core</artifactId>
</dependency>
A simpler alternative is to run Liquibase outside the application, with no Maven plugin at all.
Download the library, or install it with a package manager, and launch it from the command line with all the settings:
liquibase --driver=org.h2.Driver \
--classpath=/path/to/h2/driver.jar \
--changeLogFile=/db/changelog/db.changelog-master.yaml \
--url="jdbc:h2:file:./target/glossary-test" \
--username=sa \
--password=sa \
--logLevel=debug \
migrate
Anyway the problem you have now is because you've written this:
url=${spring.datasource.url}
I don't know where you found this syntax, but the plugin does not resolve Spring placeholders; put in the actual connection URL and replace the line with the following:
url=jdbc:h2:file:./target/test
and do the same for the other settings.

This is a very common situation in many projects.
When you use multiple plugins/libraries, each one expects certain properties from the environment configuration, with key names defined in its own nomenclature.
There is no standardization for this.
To avoid providing the same values under multiple property names, which is error prone, the recommendation is to use references:
# Keys needed for liquibase maven plugin
url=${spring.datasource.url}
UPDATE
I noticed you are encountering the exception when running the Liquibase Maven plugin, which of course runs outside the Spring context. The solution I provided earlier works within the Spring context, that is, once the application has spun up.
For this scenario, use Maven's resource filtering feature. Your command then changes to
mvn liquibase:generateChangeLog resources:resources
and your setup will look like this:
src/main/filters/filter.properties
db.url=jdbc:h2:file:./target/glossary-test
db.username=sa
db.password=sa
db.driver=org.h2.Driver
db.lb.changeLogFile=classpath:/db/changelog/db.changelog-master.yaml
application.properties
spring.datasource.url=#db.url#
spring.datasource.username=#db.username#
spring.datasource.password=#db.password#
spring.datasource.driver-class-name=#db.driver#
url=#db.url#
username=#db.username#
password=#db.password#
driver=#db.driver#
changeLogFile=#db.lb.changeLogFile#
pom.xml
<build>
......
<plugins>
......
<plugin>
<groupId>org.liquibase</groupId>
<artifactId>liquibase-maven-plugin</artifactId>
<version>3.6.3</version>
<configuration>
<propertyFile>target/classes/application.properties</propertyFile>
</configuration>
</plugin>
</plugins>
<filters>
<filter>src/main/filters/filter.properties</filter>
</filters>
<resources>
<resource>
<directory>src/main/resources</directory>
<filtering>true</filtering>
</resource>
</resources>
</build>
Please refer to my GitHub project for a complete working solution. Look at the filter.properties file, where the common properties are defined, and then at how the same values are referenced in the application.properties file.
NOTE: Since this is a Spring project, you can't use ${propertyName} in the Maven filter file, as that is the reserved property placeholder syntax for Spring; use #propertyName# instead. For a non-Spring project, ${propertyName} works out of the box.

Related

maven-javadoc-plugin: How to update the module path dynamically

I have a Java (11.0.7) Maven (3.0.6) multi-module project that contains the following module declarations:
<modules>
<module>jdrum-commons</module>
<module>jdrum-datastore-base</module>
<module>jdrum-datastore-simple</module>
<module>jdrum</module>
</modules>
Each of these Maven modules contains a module-info that defines the necessary requirements and exports to restrict access and visibility.
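For illustration only, a module descriptor for jdrum-datastore-simple could look roughly like the following; the requires and exports clauses shown here are assumptions, not the project's actual ones.

// module-info.java, hypothetical sketch (module name taken from the Javadoc options below)
module jdrum.datastore.simple {
    requires jdrum.datastore.base;          // assumed dependency on the base datastore module
    exports at.rovo.drum.datastore.simple;  // assumed exported package
}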
As such, jdrum-datastore-simple has some test utility classes that I reuse in jdrum's tests. By configuring the surefire plugin in jdrum's config via the code snippet below I am able to package the whole project without any issues.
<build>
<plugins>
<plugin>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<argLine>
<!-- Allow the unnamed module access to the tests at test-time -->
--add-opens jdrum/at.rovo.drum.impl=ALL-UNNAMED
--illegal-access=deny
</argLine>
</configuration>
</plugin>
</plugins>
</build>
Within the parent POM I've also configured the generation of a report via the site goal, which also generates the Javadoc of the respective projects. The configuration for the JAR containing the Javadoc and the configuration for the Javadoc generation as part of the report are the same and look like this:
<!-- Generate Javadoc while reporting -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-javadoc-plugin</artifactId>
<version>3.2.0</version>
<inherited>true</inherited>
<configuration>
<verbose>true</verbose>
<source>${maven.compiler.source}</source>
<show>protected</show>
<failOnWarnings>false</failOnWarnings>
<release>${maven.compiler.release}</release>
<stylesheet>java</stylesheet>
</configuration>
<reportSets>
<reportSet>
<id>html</id>
<reports>
<report>javadoc</report>
</reports>
</reportSet>
</reportSets>
</plugin>
The Javadoc generation as part of the package step, which produces project-version-javadoc.jar as output, succeeds because both the jdrum-datastore-simple dependency and its tests are only included at test time:
<!-- Test data store to use for testing -->
<dependency>
<groupId>at.rovo</groupId>
<artifactId>jdrum-datastore-simple</artifactId>
<version>${project.parent.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>at.rovo</groupId>
<artifactId>jdrum-datastore-simple</artifactId>
<version>${project.parent.version}</version>
<scope>test</scope>
<type>test-jar</type>
</dependency>
If I change the scope from test to compile or provided, the Javadoc generation also fails with an error such as
Exit code: 1 - javadoc: error - The code being documented uses packages in the unnamed module, but the packages defined in https://github.com/RovoMe/JDrum/jdrum-datastore-simple/apidocs/ are in named modules.
The issue here, as far as I understand the problem, is that the jdrum-datastore-simple module is not added to Javadoc's module path. The next logical step was therefore to add that module to the configuration like this:
<reporting>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-javadoc-plugin</artifactId>
<configuration>
<additionalOptions>
<option>--add-modules</option>
<option>jdrum.datastore.simple</option>
</additionalOptions>
</configuration>
</plugin>
</plugins>
</reporting>
This adds the jdrum-datastore-simple module to the Javadoc configuration string, which can be seen in the jdrum/target/site/apidocs/options file that now contains an
...
--add-modules
jdrum.datastore.simple
...
entry. On further analysis of the generated options file it is apparent that the module path is missing a reference to the actual JAR file, and hence the Javadoc generation, and with it the Maven process, fails because Javadoc cannot locate the defined module. If I update that options file, add the path to the missing JAR file, and then only run mvn package site, the whole process succeeds and all is fine (as does a direct invocation of the javadoc.bat located in the target/site/apidocs folder).
Now, in order to make the whole process more dynamic, I wanted to add to or update the module path. However, the maven-javadoc-plugin does not directly allow this. Therefore I came up with adding a further maven-javadoc-plugin option of --module-path and a further option entry that contains the whole path. By the whole path I mean the path to every single dependency, not only the path to jdrum-datastore-simple. This also works, but because the paths to the respective JAR files are hardcoded, the project is now not usable by other users unless they have the same system and path structure I used. To fix this I replaced the respective path segments with the ${settings.localRepository} and ${project.parent.basedir} properties for the respective modules in the module path. Unfortunately Javadoc is rather picky about the path structure it accepts, and it turns out that on my Windows machine Maven returns a path structure starting with C:\Users\..., which Javadoc can't handle. If the path structure looks like C:/Users/..., however, Javadoc is fine with the values.
On further research I stumbled upon this thread, which suggests using Maven's build-helper-maven-plugin to define new properties for e.g. the M2 repository and using its built-in regex capability to replace \ characters with /. However, adding a configuration such as
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>build-helper-maven-plugin</artifactId>
<version>3.1.0</version>
<executions>
<execution>
<id>replace-local-repo-characters</id>
<goals>
<goal>regex-property</goal>
</goals>
<configuration>
<name>tag.m2repo</name>
<value>${settings.localRepository}</value>
<regex>\\</regex>
<replacement>/</replacement>
<failIfNoMatch>false</failIfNoMatch>
</configuration>
</execution>
<execution>
<id>replace-local-path-characters</id>
<goals>
<goal>regex-property</goal>
</goals>
<configuration>
<name>tag.basedir</name>
<value>${project.parent.basedir}</value>
<regex>\\</regex>
<replacement>/</replacement>
<failIfNoMatch>false</failIfNoMatch>
</configuration>
</execution>
</executions>
</plugin>
and using the introduced properties instead does not work at all, as Maven complains about an invalid value. If I use $\{settings.localRepository} Maven accepts the value, but in the final options file it is not the actual value of settings.localRepository that is substituted but the provided string itself, and I end up with something like $/{settings.localRepository}/org/slf4j/..., which Javadoc can't resolve, so it still misses the correct location of the jdrum-datastore-simple dependency.
So, how can I add the path to the missing dependency to the maven-javadoc-plugin's module path as defined in the generated options file, so that Maven is actually able to generate the whole report?
It seems that with Java 11 update 9 (maybe also with update 8; not tested) the maven-javadoc-plugin is able to correctly generate the Javadoc for multi-module projects without the need to alter the module path.
For those interested in what the actual Maven POMs look like:
Parent POM
POM for a shared module
POM for a sharing and consuming module
POM for the consuming module

Maven plugin CLI parameter overwriting <configuration> in pom.xml

How do (or should) Maven plugins behave with regard to the order in which they process configuration options? I would expect that properties passed via the CLI overwrite those defined in a <configuration> block in the pom.xml.
Here's an example.
pom.xml
<plugin>
<groupId>group</groupId>
<artifactId>artifact</artifactId>
<version>1.2.3</version>
<executions>
...
</executions>
<configuration>
<url>foo.com</url>
</configuration>
</plugin>
CLI
mvn group:artifact:1.2.3:doit -Dmymojo.url=bar.com
I am currently debugging a plugin (not mine) that gives precedence to the url value defined in the POM rather than the one passed on the CLI. Is that how mojos are supposed to behave, i.e. is this a Maven feature rather than a plugin bug? I didn't find anything about it in the reference guide.
As per https://issues.apache.org/jira/browse/MNG-4979 this works as designed. I find it counter-intuitive and don't find the reasons given in MNG-4979 convincing.
If your setup allows you to modify the pom.xml, you can work around this behavior as suggested by JF Meier (and in the issue above):
<properties>
<mymojo.url>foo.bar</mymojo.url>
</properties>
<plugin>
<groupId>group</groupId>
<artifactId>artifact</artifactId>
<version>1.2.3</version>
<executions>
...
</executions>
<configuration>
<url>${mymojo.url}</url>
</configuration>
</plugin>
Through the command line, you set a property named url. This would override an entry <url>foo.com</url> in the <properties> section of the POM, but not a value set directly in a plugin's <configuration> block.
Many plugins allow configuration entries to be set through properties, but these properties do not automatically have the same name as the configuration element. In the documentation this is usually called the user property. To see examples, look e.g. at
https://maven.apache.org/plugins/maven-dependency-plugin/get-mojo.html
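To make the relationship concrete, here is a hypothetical mojo (not the actual plugin from the question) showing how a user property is typically declared; the property attribute names the user property, while a value set in the POM's <configuration> is injected into the field directly, which is why -Dmymojo.url only takes effect when the POM leaves <url> unset.

import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugins.annotations.Mojo;
import org.apache.maven.plugins.annotations.Parameter;

// Hypothetical mojo: "mymojo.url" is the user property, <url> in <configuration>
// binds to the field name, and defaultValue applies when neither is given.
@Mojo(name = "doit")
public class DoItMojo extends AbstractMojo {

    @Parameter(property = "mymojo.url", defaultValue = "foo.com")
    private String url;

    @Override
    public void execute() {
        getLog().info("url = " + url);
    }
}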

Rename application.properties in spring boot, also for use with spring boot test

I have a project that uses Spring Boot. I need to rename the application.properties file to something else (i.e. othername.properties). The reason I need to rename it is because we have a bunch of other scripts that rely on that naming convention.
How can I tell Spring Boot to use othername.properties instead of the default application.properties? The documentation says that I can pass an argument to the java command when running the application, but that doesn't help when I run a Maven install, because there are Spring integration/unit tests to be run as well (which rely on the db settings in application.properties).
Right now I have tried to set a system property in the maven-surefire-plugin like so:
<build>
<pluginManagement>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<systemPropertyVariables>
<spring.config.name>othername.properties</spring.config.name>
</systemPropertyVariables>
</configuration>
</plugin>
</plugins>
</pluginManagement>
</build>
When I add that in my pom.xml, it seems that Spring Boot isn't picking up that file, and also isn't reading from application.properties (because the db settings are in application.properties as well as othername.properties right now, and it fails, so I know it isn't being read).
The othername.properties file is in my project's resources folder; should it go somewhere else? Also, how would I specify a relative path to that file if it were to live somewhere else?
Didn't try it, but I think it should be
<systemPropertyVariables> <spring.config.name>othername</spring.config.name> </systemPropertyVariables>
i.e. without the .properties extension.
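A further option, not part of the answer above and sketched here under the assumption that you control the main class, is to register spring.config.name as a default property when the application is bootstrapped, so it applies both to normal runs and to tests that start the application through its main class:

import java.util.Properties;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

// Hypothetical main class: the renamed config file is registered as a default
// property, which the environment exposes before the config files are loaded.
@SpringBootApplication
public class DemoApplication {

    public static void main(String[] args) {
        SpringApplication app = new SpringApplication(DemoApplication.class);
        Properties defaults = new Properties();
        defaults.setProperty("spring.config.name", "othername");
        app.setDefaultProperties(defaults);
        app.run(args);
    }
}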

Try to use flyway for multiple DB instances

Running the Maven flyway-plugin
mvn flyway:migrate
with this configuration:
<plugin>
<groupId>org.flywaydb</groupId>
<artifactId>flyway-maven-plugin</artifactId>
<version>4.0.3</version>
<configuration>
<driver>com.mysql.jdbc</driver>
<url>jdbc:mysql://localhost:3306/schema2?createDatabaseIfNotExist=true</url>
<user>root</user>
<password>root</password>
</configuration>
</plugin>
I tried to create a number of executions as in this solution:
How to use Flyway configuration to handle multiple databases
Starting from one execution:
<plugin>
<groupId>org.flywaydb</groupId>
<artifactId>flyway-maven-plugin</artifactId>
<version>4.0.3</version>
<executions>
<execution>
<id>migrate-database</id>
<phase>compile</phase>
<goals>
<goal>migrate</goal>
</goals>
<configuration>
<driver>com.mysql.jdbc</driver>
<url>jdbc:mysql://localhost:3306/schema2?createDatabaseIfNotExist=true</url>
<user>root</user>
<password>root</password>
</configuration>
</execution>
</executions>
</plugin>
I see this exception:
[ERROR] Failed to execute goal org.flywaydb:flyway-maven-plugin:4.0.3:migrate (default-cli) on project UrbanLife: org.flywaydb.core.api.FlywayException: Unable to connect to the database. Configure the url, user and password! -> [Help 1]
It looks like Flyway can't see the configuration inside the <execution> element.
(Interestingly, it works in the link I mentioned above.)
Please help me set up a multi-DB Flyway integration via Maven.
When you have multiple (or just one) <execution> entries in your Maven plugin configuration and are attempting to run a specific execution from the command line, you need to reference it by its execution id, like so in your case:
mvn flyway:migrate#migrate-database
see also: How to execute maven plugin execution directly from command line?
Lastly, if you want a specific execution to be the default, you can give it the execution id of default-cli as explained in these maven docs. Then you can simply run mvn flyway:migrate.
The fully qualified name of the driver is incorrect; it should be com.mysql.jdbc.Driver, or you could simply remove it, as it will be auto-detected from the URL.
You will also need to add the driver as a dependency of the plugin by adding
<plugin>
...
<dependencies>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>5.1.39</version>
</dependency>
</dependencies>
</plugin>

Can't access Maven properties with Spring

Hello, I am new to Maven in general, so I'd like to apologise in advance if I get things wrong.
I have a Maven project with Spring in which I also access a MySQL database. Often the MySQL database is not reachable, so I use a local database for testing; I swap between the two by changing the active profile.
I set up the profiles by putting this in my POM.xml:
<profiles>
<profile>
<id>Local</id>
<dependencies>
<dependency>
<groupId>org.hsqldb</groupId>
<artifactId>hsqldb</artifactId>
<version>2.3.3</version>
<classifier>jdk5</classifier>
</dependency>
</dependencies>
<properties>
<jdbc.url>jdbc:hsqldb:file:databaseName</jdbc.url>
<jdbc.username>a</jdbc.username>
<jdbc.password></jdbc.password>
<jdbc.driver>org.hsqldb.jdbcDriver</jdbc.driver>
</properties>
</profile>
<profile>
<id>MySQL</id>
<dependencies>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>5.1.38</version>
</dependency>
</dependencies>
<properties>
<jdbc.url>jdbc:mysql://mysql.website.ac.uk:3306</jdbc.url>
<jdbc.username>user</jdbc.username>
<jdbc.password>1234</jdbc.password>
<jdbc.driver>com.mysql.jdbc.Driver</jdbc.driver>
</properties>
</profile>
</profiles>
When I try to access the property values, for example through ${jdbc.url}, they are not replaced with the actual values defined in the profile and I get errors. I suspect it might have something to do with Spring Boot.
Apart from the specific Spring Boot problem, I think you are missing the Maven resource filtering configuration.
Maven properties are not automatically propagated to Spring unless you put them into a separate properties file, which must then be managed by Spring (usually via a PropertyPlaceholderConfigurer).
But without Maven filtering turned on, the placeholders are not resolved at all. Usually, if you do not have many items under your src/main/resources folder, you can just enable filtering globally with
<resource>
<directory>src/main/resources</directory>
<filtering>true</filtering>
</resource>
and the placeholders in your properties file will be replaced with the Maven values.
If, on the other hand, you have several resources in src/main/resources, you should check the documentation more carefully to avoid corrupting binary files.
I think Spring Boot simply wraps this whole process up automatically for your application properties files.
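As a concrete illustration of the separate, Spring-managed properties file mentioned above, a minimal sketch could look like this; the file name jdbc.properties and the class name are assumptions, and PropertySourcesPlaceholderConfigurer is used as the modern replacement for PropertyPlaceholderConfigurer:

import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.PropertySource;
import org.springframework.context.support.PropertySourcesPlaceholderConfigurer;

// Sketch only: jdbc.properties is a filtered resource whose placeholders Maven
// has already replaced with the values from the active profile.
@Configuration
@PropertySource("classpath:jdbc.properties")
public class JdbcPropertiesConfig {

    // resolves against the properties loaded above
    @Value("${jdbc.url}")
    private String jdbcUrl;

    // required so that ${...} placeholders are resolved by Spring
    @Bean
    public static PropertySourcesPlaceholderConfigurer propertySourcesPlaceholderConfigurer() {
        return new PropertySourcesPlaceholderConfigurer();
    }
}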
You specified the properties in a profile, so there is a workaround. Annotations such as @Value("${name}") reference externalized configuration, e.g. from application.yml. You should use this file to load the properties you want; #property_name# loads a property from your Maven profile.
jdbc:
url: #jdbc.url#
username: #jdbc.username#
and so on...
Then these properties will be accessible in your code.
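For example, a binding class along these lines (the class name is hypothetical) would pick up the filtered jdbc.* values from application.yml:

import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.stereotype.Component;

// Hypothetical holder: Spring Boot binds jdbc.url and jdbc.username from
// application.yml onto these fields once Maven filtering has replaced #...#.
@Component
@ConfigurationProperties(prefix = "jdbc")
public class JdbcProperties {

    private String url;
    private String username;

    public String getUrl() { return url; }
    public void setUrl(String url) { this.url = url; }

    public String getUsername() { return username; }
    public void setUsername(String username) { this.username = username; }
}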
Maven properties are not put into environment variables or JVM system properties that you can access from Spring.
However, you can usually instruct a Maven plugin to do it for you.
For example, here is the maven-failsafe-plugin configuration I use to set up a test database:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-failsafe-plugin</artifactId>
<configuration>
<includes>**/*Integration.java</includes>
<environmentVariables>
<mysql.version>${mysql.version}</mysql.version>
<test.mysql.username>${test.mysql.username}</test.mysql.username>
<test.mysql.password>${test.mysql.password}</test.mysql.password>
<test.mysql.port>${test.mysql.port}</test.mysql.port>
<test.mysql.db>${test.mysql.db}</test.mysql.db>
</environmentVariables>
</configuration>
<executions>
<execution>
<goals>
<goal>integration-test</goal>
<goal>verify</goal>
</goals>
</execution>
</executions>
</plugin>
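On the test side those values arrive as plain environment variables in the forked JVM; a hypothetical helper for the integration tests could read them like this:

// Hypothetical helper: reads the values that the failsafe configuration above
// exported as environment variables for the forked test JVM.
public class TestDatabaseSettings {

    static String username() {
        return System.getenv("test.mysql.username");
    }

    static String password() {
        return System.getenv("test.mysql.password");
    }

    static String jdbcUrl() {
        return "jdbc:mysql://localhost:" + System.getenv("test.mysql.port")
                + "/" + System.getenv("test.mysql.db");
    }
}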
