I want Quarkus to execute a flyway migration based on some SQL scripts that I have stored in a separate repository, then packaged into a jar file and published to a private Nexus instance.
I believe I can point to a specific location using this application property:
quarkus.flyway.locations=/some/path
But given this dependency:
<dependency>
    <groupId>com.myorganisartion.db</groupId>
    <artifactId>myschema</artifactId>
    <version>18.0.0</version>
</dependency>
What would the value of the flyway.locations property be?
Assume that the jar contains just one folder, called myschema, which contains the .sql files.
Thanks in advance!
I've tried googling around and looking into the Quarkus example app, but no luck.
I see that you can reference the classpath in the property value, but I'm not sure what to put after that, or why.
It seems I had forgotten to reference the named datasource in the property key:
quarkus.flyway.myschema.locations=classpath:myschema/baseline,classpath:myschema/migrations
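For comparison, pointing the default (unnamed) datasource at the same jar would look like the line below; the baseline and migrations sub-folders are just how this particular jar happens to be laid out, so adjust the paths to whatever the jar actually contains:
quarkus.flyway.locations=classpath:myschema/baseline,classpath:myschema/migrations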
I am trying to build a custom Liquibase docker image (based on the official liquibase/liquibase:4.3.5 image) for running database migrations in Kubernetes.
I am using some custom types for the database, implemented with the @DataTypeInfo annotation and by extending existing Liquibase data types such as liquibase.datatype.core.VarcharType (class discovery is implemented using the META-INF/services/liquibase.datatype.LiquibaseDataType mechanism introduced in Liquibase 4+).
These extensions live in their own Maven module called "schema-impl", which produces a schema-impl.jar. Everything worked fine while the migrations ran as part of the app startup process, but now we want this to be done by a dedicated Docker image.
The only information in the Liquibase documentation regarding this topic is the "Drivers and extensions" section of this document. Following it, I added schema-impl.jar to the /liquibase/classpath directory during the image build and also modified liquibase.docker.properties to add this jar file explicitly to the classpath property:
classpath: /liquibase/changelog:/liquibase/classpath:/liquibase/classpath/schema-impl.jar
liquibase.headless: true
However, when I try to run my changesets with the docker image, I am always getting an error because it cannot find the custom type definition:
liquibase.exception.DatabaseException: ERROR: type "my-string" does not exist
Any help would be really appreciated. Thanks in advance.
OK, I found it. Basically the problem was that I needed to include the classpath in the entrypoint command, not in the liquibase.docker.properties file (which seems to be useless for this use case), like this:
--classpath=/liquibase/changelog:/liquibase/classpath/schema-impl.jar
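To put this in context, here is a rough sketch of how the extension jar and the entrypoint arguments fit together; the image tag, changelog file name and connection flags are assumptions, only the --classpath argument comes from the fix above:
# Dockerfile: bake the extension jar into the custom image
FROM liquibase/liquibase:4.3.5
COPY schema-impl/target/schema-impl.jar /liquibase/classpath/schema-impl.jar
# Example invocation (e.g. as the container command of a Kubernetes Job)
docker run --rm -v "$(pwd)/changelog:/liquibase/changelog" my-custom-liquibase:4.3.5 \
  --classpath=/liquibase/changelog:/liquibase/classpath/schema-impl.jar \
  --changeLogFile=db.changelog-master.xml \
  --url="$JDBC_URL" --username="$DB_USER" --password="$DB_PASS" \
  update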
There is a Spring Boot 2 app with such a structure:
parent-module
module-1
src
main
java
resources
- application.yml
module-2
src
main
java
resources
- application.yml
Also, module-1 depends on module-2, as specified in the pom.xml dependencies section.
The problem is that when I specify some properties in module-2's application.yml, they are not visible in module-1's components (via the @Value annotation).
As was answered here, it seems that module-1's application.yml overrides module-2's application.yml. There is a workaround: if I use the name application.yaml in module-2, everything works fine, but I'm going to add more modules, so in the end it's a dirty hack.
What am I doing wrong? Should such a hierarchy of property files be specified somehow?
I will be happy to provide more details if needed.
Thank you!
Spring Boot is a runtime framework. I understand that your modules are not Spring Boot applications by themselves (you can't declare a dependency on a Spring Boot application packaged with the spring-boot-maven-plugin, because it produces an artifact that is not really a JAR from Java's standpoint, although it does have a *.jar extension).
If so, they're probably regular jars, so you should have a "special" module that assembles the application. This special module lists both module-1 and module-2 in its <dependencies> section and should declare the spring-boot-maven-plugin in its build section (assuming you're using Maven). But in that case you shouldn't really have more than one application.yml - it would be misleading. Instead, put the application.yml in the src/main/resources of that "special" module.
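A rough sketch of what that assembling module's pom.xml could contain (group ids and versions are placeholders):
<dependencies>
    <dependency>
        <groupId>com.example</groupId>
        <artifactId>module-1</artifactId>
        <version>${project.version}</version>
    </dependency>
    <dependency>
        <groupId>com.example</groupId>
        <artifactId>module-2</artifactId>
        <version>${project.version}</version>
    </dependency>
</dependencies>
<build>
    <plugins>
        <!-- repackages this module into the runnable Spring Boot artifact -->
        <plugin>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-maven-plugin</artifactId>
        </plugin>
    </plugins>
</build>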
If, for whatever reason, you really have to work with multiple application.yaml files, make sure you've read this thread.
I know, this is already a well-aged post.
I just came across the same issue, and the best solution I found was to import the module-specific configurations with the spring.config.import directive as described here.
In this case you still keep your module-specific configuration in properties or YAML files within that specific module and do not add too many unwanted dependencies to your project setup.
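For instance, the application-level application.yml can pull in a module's own file like the snippet below; the file name is made up, and spring.config.import requires Spring Boot 2.4 or newer:
# application.yml of the assembled application
spring:
  config:
    import:
      - classpath:module-2-config.yml   # shipped inside module-2's jar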
application.yml is, as the name indicates, an application-level file, not a module-level file.
It is the build script that assembles the final application, e.g. the .war file, that needs to include an application.yml file, if any.
If modules need properties and cannot rely on the defaults, e.g. using the : syntax in @Value("${prop.name:default}"), they need to provide a module-level property file using @PropertySource("classpath:/path/to/module-2.properties").
Note: By default, @PropertySource doesn't load YAML files (see the official documentation), but Spring Boot can be enhanced to support it. See @PropertySource with YAML Files in Spring Boot | Baeldung.
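A minimal sketch of such a module-level configuration class (package, file and property names are made up for illustration):
package com.example.module2;

import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.PropertySource;

// Lives in module-2 and registers module-2.properties from that module's src/main/resources
@Configuration
@PropertySource("classpath:module-2.properties")
public class Module2Properties {

    // Falls back to "default" if the property is not defined anywhere
    @Value("${module2.some-prop:default}")
    private String someProp;
}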
Alternative: have the application-level build script (the one building the .war file) merge the module-level property files into a unified application.yml file.
I am curious what the best practice is for environment-specific (or even server-specific) properties for dependencies.
This is not for properties that are managed by the application context, as I am aware of the {env}-application.properties convention that Spring supports.
I will give a small example to elaborate:
My project is project A. In A, we depend on project B.
A's pom.xml
<dependency>
    <groupId>com.B</groupId>
    <artifactId>B</artifactId>
</dependency>
Artifact B depends on a B.properties file, which it does not provide and which must be on the classpath. I am unable to refactor artifact B.
Contents of B.properties are:
b.someproperty=${property.placeholder}/b/dir
Contents of application.properties:
property.placeholder=default
Contents of dev-application.properties:
property.placeholder=dev
So when I run my spring boot app with -Dspring.profiles.active=dev, b.someproperty must resolve to dev/b/dir
What I have tried so far (that worked):
1) Externalized properties that sit on the server, with the placeholders already resolved, simply added to the classpath at runtime (so each server would have its own B.properties file with no placeholders, and this would not be part of the build/deploy process).
2) A class within the Spring bootable jar which takes the properties in B.properties, resolves all the placeholders, writes them out to a file on the server and then adds this file to the classpath (see the rough sketch below). So running the app with -Dspring.profiles.active=dev would generate a B.properties on the server with no placeholders, and everything is contained within the jar.
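Rough sketch of approach 2), just to make the idea concrete (class name, output location and helper method are arbitrary):
package com.example.bootstrap;

import java.io.InputStream;
import java.io.Writer;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Properties;

import org.springframework.core.env.Environment;

// Reads B.properties from the classpath, resolves ${...} placeholders against the
// active Spring environment (application.properties, dev-application.properties, ...)
// and writes a fully resolved copy to a directory that is later added to the classpath.
public class BPropertiesResolver {

    public static void resolve(Environment env, Path outputDir) throws Exception {
        Properties raw = new Properties();
        try (InputStream in = BPropertiesResolver.class.getResourceAsStream("/B.properties")) {
            raw.load(in);
        }

        Properties resolved = new Properties();
        for (String name : raw.stringPropertyNames()) {
            // env.resolvePlaceholders() substitutes ${property.placeholder} and friends
            resolved.setProperty(name, env.resolvePlaceholders(raw.getProperty(name)));
        }

        Files.createDirectories(outputDir);
        try (Writer out = Files.newBufferedWriter(outputDir.resolve("B.properties"))) {
            resolved.store(out, "placeholders resolved at startup");
        }
    }
}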
Neither of these solutions is very clean or good (IMO). Is it possible to resolve the placeholders as the properties are being used, even if they are not managed by the application context?
Any insight or criticism of my current solutions is appreciated.
I'm trying to write a custom Maven plugin that will parse the SCM changelog of the current Maven project, as well as any of its direct dependencies.
I know that MavenProject.getScm().getConnection() returns the connection URL of the current project.
However, I would also like to retrieve the connection URL of any direct dependencies. (They are already defined in each dependency's pom.xml)
I looked at MavenProject.getDependencies(), but it returns a List of Dependency objects which doesn't seem to contain the information I need.
Does anyone know how I can retrieve this information?
You will have to get an instance of MavenProject for each of the dependencies, e.g. obtain an instance of MavenProjectBuilder and build a MavenProject instance with it.
See the following question for a sample code snippet for resolving an individual dependency.
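For what it's worth, a rough sketch using the newer ProjectBuilder component of the Maven 3 plugin API; the goal name and error handling are placeholders, and the exact signatures should be checked against your Maven version:
package com.example.scm;

import org.apache.maven.artifact.Artifact;
import org.apache.maven.execution.MavenSession;
import org.apache.maven.model.Scm;
import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugins.annotations.Component;
import org.apache.maven.plugins.annotations.Mojo;
import org.apache.maven.plugins.annotations.Parameter;
import org.apache.maven.plugins.annotations.ResolutionScope;
import org.apache.maven.project.MavenProject;
import org.apache.maven.project.ProjectBuilder;

// Resolves the effective POM of each direct dependency and logs its <scm> connection URL.
@Mojo(name = "dependency-scm", requiresDependencyResolution = ResolutionScope.TEST)
public class DependencyScmMojo extends AbstractMojo {

    @Parameter(defaultValue = "${project}", readonly = true)
    private MavenProject project;

    @Parameter(defaultValue = "${session}", readonly = true)
    private MavenSession session;

    @Component
    private ProjectBuilder projectBuilder;

    @Override
    public void execute() {
        // getDependencyArtifacts() holds the direct dependencies only
        for (Artifact artifact : project.getDependencyArtifacts()) {
            try {
                MavenProject depProject =
                        projectBuilder.build(artifact, session.getProjectBuildingRequest()).getProject();
                Scm scm = depProject.getScm(); // may be null if the dependency's POM has no <scm>
                getLog().info(artifact.getId() + " -> "
                        + (scm != null ? scm.getConnection() : "no <scm> declared"));
            } catch (Exception e) {
                getLog().warn("Could not build project model for " + artifact.getId(), e);
            }
        }
    }
}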
I'm trying to load test data into a test DB during a maven build for integration testing. persistence.xml is being copied to target/test-classes/META-INF/ correctly, but I get this exception when the test is run.
javax.persistence.PersistenceException: No Persistence provider for EntityManager named aimDatabase
It looks like it's not finding or loading persistence.xml.
Just solved the same problem with a Maven/Eclipse based JPA project.
I had my META-INF directory under src/main/java, with the consequence that it was not copied to the target directory before the test phase.
Moving this directory to src/main/resources solved the problem and ensured that the META-INF/persistence.xml file was present in target/classes when the tests were run.
I think that the JPA facet put my META-INF/persistence.xml file in src/main/java, which turned out to be the root of my problem.
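For reference, a minimal persistence.xml using the unit name from the question; the provider and connection property are only examples, so match them to whatever JPA implementation and database you actually use:
<persistence xmlns="http://java.sun.com/xml/ns/persistence" version="1.0">
    <persistence-unit name="aimDatabase" transaction-type="RESOURCE_LOCAL">
        <!-- example provider for Hibernate 3.x (hibernate-entitymanager) -->
        <provider>org.hibernate.ejb.HibernatePersistence</provider>
        <properties>
            <property name="hibernate.connection.url" value="jdbc:h2:mem:testdb"/>
        </properties>
    </persistence-unit>
</persistence>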
I'm using Maven 2, and I had forgotten to add this dependency to my pom.xml file:
<dependency>
    <groupId>org.hibernate</groupId>
    <artifactId>hibernate-entitymanager</artifactId>
    <version>3.4.0.GA</version>
</dependency>
If this is on Windows, you can use Sysinternals' Process Monitor (procmon) to find out whether it's checking the right path.
Just filter by Path -> contains -> persistence.xml. Procmon will pick up any attempts to open a file named persistence.xml, and you can see which path or paths get tried.
See here for more detail on procmon: http://technet.microsoft.com/en-us/sysinternals/bb896645.aspx
I had the same problem and it wasn't that it couldn't find the persistence.xml file, but that it couldn't find the provider specified in the XML.
Ensure that you have the correct JPA provider dependencies and the correct provider definition in your XML file,
i.e. <provider>oracle.toplink.essentials.PersistenceProvider</provider>
With Maven, I had to install the two toplink-essentials jars locally, as there were no public repositories that held the dependencies.
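In case it helps, jars like that can be pushed into the local repository with the maven-install-plugin; the coordinates below are just whatever you decide to call them:
mvn install:install-file -Dfile=toplink-essentials.jar \
    -DgroupId=oracle.toplink.essentials -DartifactId=toplink-essentials \
    -Dversion=2.0 -Dpackaging=jar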
Is your persistence.xml located in src/test/resources? Because I was facing similar problems.
Everything is working fine as long as my persistence.xml is located in src/main/resources.
If I move persistence.xml to src/test/resources nothing works anymore.
The only helpful but sad answer is here: http://jira.codehaus.org/browse/SUREFIRE-427
Seems like it is not possible right now for unclear reasons. :-(
We got the same problem, did some tweaking on the project and finally found the following (a clearer error description):
at oracle.toplink.essentials.ejb.cmp3.persistence.PersistenceUnitProcessor.computePURootURL(PersistenceUnitProcessor.java:248)
With that information we recalled a primary rule:
NO WHITE SPACES IN PATH NAMES!!!
Try this; it works for us.
Maybe some day this will be fixed.
Hope this works for you. Good luck.