I'd like to start using Flyway to keep our different DB environments in sync.
The problem I have is that we can't align all the environments by applying a Prod dump to Test and Dev, since our Prod environment contains sensitive data that testers and developers can't access.
I understand that to start using flyway on an existing environment the steps are:
Create Prod dump
Execute Flyway Init
Align Test with prod dump
Execute Flyway Clean on Dev
Execute prod dump on Dev
Start using Flyway migrations normally
Based on Axel Fontaine's video (around minute 32:00), maybe there is another way to achieve this. So the question is: how can I use Flyway without a production dump? Any help or ideas?
As I said in the talk, dump the structure and the reference data. These are the things you will be managing with Flyway. The application data doesn't need to be dumped.
What you want is the structure and the reference data to be in sync, not the application data.
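A minimal sketch of that approach, assuming PostgreSQL and a reference table named country (the database and table names are illustrative; note that flyway init was renamed to flyway baseline in later Flyway versions):

pg_dump --schema-only proddb > sql/V1__base_version.sql
pg_dump --data-only --table=country proddb >> sql/V1__base_version.sql

Run flyway baseline against Prod so V1 is never re-applied there, then flyway clean followed by flyway migrate against Test and Dev to rebuild them from the dumped structure and reference data. All future changes go in as regular versioned migrations.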
I'm testing out PostgreSQL and CockroachDB with my application. I've set it up so that I can run my application with either PostgreSQL or CockroachDB. Is it possible to set up Flyway so that I can run either one with Flyway support, without errors caused by also having it configured for the other database I'm not using at the moment?
I've tried looking for documentation that answers this, but it seems that most documentation in this area pertains to running both databases concurrently, which isn't what I'm trying to do here.
Not a huge deal, but I am curious... Thank you!
The default behavior of Flyway uses the config file. Issuing a command like flyway migrate will go to the configured database with the designated locations (the folders where the migrations are stored).
So, to switch on the fly, you have two choices. You can create two config files and select the right one at execution time from the command line, or you can take direct control of the configuration settings through the command line. Two different command lines, each with the appropriate settings for where the migrations are stored and how to connect to the database, should let you do exactly this.
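For example, a minimal sketch using the -configFiles option (the file names are illustrative):

flyway -configFiles=conf/postgres.conf migrate
flyway -configFiles=conf/cockroach.conf migrate

Each config file would carry its own flyway.url, flyway.user, and flyway.locations, so only the database you are currently targeting is ever touched.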
I have a Java app which I deploy on various platforms (using Ansible).
This app uses a database which sometimes needs schema updates, which I perform and log/version with Flyway (as a software dependency).
I now face the need to update data on all platforms, but with different values depending on the platform. This is not a schema update, but it is nonetheless data (the list of other apps to which it connects) that forms the main structure of my app, and as such I want it versioned, in a similar way to what Flyway does.
At first I was thinking I should put the different data in my Ansible configuration, which seemed to make sense as it's Ansible that knows about the various platforms. And then I thought this information would get passed to Flyway somehow so that it performs the required updates.
However, if that is handled using versioned migrations, I could end up with version conflicts, because one environment requires an update and another doesn't (common versioning vs. per-environment versioning).
There is a brief mention of this issue in the Flyway FAQ, and one can set the flyway.locations property, or maybe I could use Flyway placeholders that are set by Ansible?
Am I on the right track? Or should I not use Flyway at all (is it meant to be used with DML, or should it be reserved for DDL)?
Flyway can be used for both schema and data updates, although its primary purpose is versioning schema updates.
It sounds like you need a way to deploy some scripts only in certain environments. Flyway provides functionality that supports this workflow, but you'll need to decide on the approach that works best for you.
Here are some ideas.
Use different locations
The simplest way I can think of is to keep environment-specific scripts in their own locations. You can also have a location for 'common' scripts.
When you deploy, you can specify the 'common' location, alongside the environment specific one. Something like:
flyway migrate -locations=filesystem:common/sql,filesystem:test/sql
flyway migrate -locations=filesystem:common/sql,filesystem:production/sql
And so on.
Use the shouldExecute script config and placeholders
Another way is to use the Flyway Teams feature shouldExecute. This lets you define a boolean expression that determines whether a script should run, and you can inject values into it from placeholders. There is a blog post that explains more about it.
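A minimal sketch, assuming you inject a placeholder named environment at deploy time (the file name, placeholder name, and expression are illustrative):

# V2__load_env_data.sql.conf -- a script config file placed next to the migration
shouldExecute=${environment}==test

flyway migrate -placeholders.environment=test

Here V2__load_env_data.sql only runs when the environment placeholder equals test.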
Use the cherryPick configuration option
Another Teams Edition feature is cherryPick, which allows you to specify exactly which scripts to deploy. So you might have a configuration file per environment with a cherryPick config that specifies the exact scripts to run. This one might be unwieldy since you need to explicitly list every script, but it does give you complete control.
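For example, a sketch with illustrative version numbers:

flyway migrate -cherryPick=2.1,2.3

or, in an environment-specific config file, flyway.cherryPick=2.1,2.3.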
I have written a REST web service using JAX-RS, and I currently prepare my test database using DbUnit. However, if I were to deploy my application now, this would no longer fit my needs. Thus, I am looking for a Maven plugin that lets me handle the preparation and update of the production database. I need something that creates my tables and inserts default data when I deploy my service for the first time, and that updates the tables when I deploy new releases while the service is running.
There are a couple of useful frameworks for this use case.
Have a look at:
Flyway
and Liquibase
Both can handle your use case quite neatly. The main difference is how migrations are defined: Flyway uses SQL- and Java-based migrations, while Liquibase uses XML.
My personal preference lies with Flyway, as I find it more natural.
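For illustration, a minimal Flyway migration is just a versioned SQL file that flyway migrate picks up (the file name and table are illustrative):

sql/V1__create_person_table.sql:
CREATE TABLE person (id INT PRIMARY KEY, name VARCHAR(100));

Liquibase would express the same change as an XML changeset.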
I am working on a Spring Webflow project using MySQL as the database.
I have some JUnit test cases that Maven runs during the build, and they use a test database, not the dev database. We have a different database for running the project in dev than for builds.
I have some test data that I need set up in dev before running my project. I was handling it with JUnit as part of the Maven package/test phase, but the issue is that those tests use a different XML config and database. How can I make the project run some delete statements in dev before it starts? Does anyone know of a MySQL plugin for Maven that will run a script beforehand?
I'm not entirely sure what you're asking, but I think you want to have certain data available in a DB when you run tests and that data is different in different environments.
Given you're already using Maven, I would suggest you set up build profiles for each environment, which will allow you to specify different DB connections. Indeed, if you use an ORM such as Hibernate, you can even have the DB schema and data dropped and recreated on each run in your test/dev environments.
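A minimal sketch, assuming profiles named dev and test are defined in your pom.xml (the profile names and property are illustrative):

mvn test -Ptest
mvn test -Pdev -Ddb.url=jdbc:mysql://localhost/devdb

Each profile can define its own connection properties, which your test configuration then resolves at build time.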
Furthermore, in environments where you want the data to be specific to your test run, and the data does not need to persist beyond the scope of your tests, use an in-memory DB such as HSQLDB, which can be seeded with data as required and wiped at the end. This will make your development environment more suitable for larger numbers of developers and remove the need for physical DB resources.
I think you're asking how to set up a database before running a JUnit test. If so, I suggest you look at DbUnit and (as suggested in another answer) an in-memory database.
I am trying to create an integration test which requires a running PostgreSQL server. Is it possible to start the server during the Maven build and stop it when the tests are completed (in a separate process, I think)? Assume the PostgreSQL server is not installed on the machine.
You are trying to push Maven far beyond its intended envelope, so you'll be in for a fair amount of hurt before it works.
Luckily, PostgreSQL can be downloaded as a ZIP archive.
As already mentioned above, Maven can use Ant tasks to extend its reach, and Ant has a large set of tasks to unzip files and run commands. The sequence would be as follows:
unzip postgresql-xxx.zip into a well-known directory --> INSTALL_DIR
create a data directory --> DATA_DIR
INSTALL_DIR/bin/initdb -D DATA_DIR
INSTALL_DIR/bin/postgres -D DATA_DIR
INSTALL_DIR/bin/createdb -E UNICODE test
This should give you a running server with a test database.
Further issues: creating a user, security (you likely want to connect via TCP/IP, but this is disabled by default if I recall correctly; that requires editing a config file before starting the database)
...
Good Luck.
I started writing a plugin for this purpose:
https://github.com/adrianboimvaser/postgresql-maven-plugin
It's in a very early stage and lacks documentation, but mostly works.
I already released version 0.1 to Maven Central.
I'm also releasing PostgreSQL binary distributions for all platforms as maven artifacts.
You can find the usage pattern in the plugin's integration tests.
Cheers!
Not to my knowledge. However, you could run a remote command that starts the server.
I think the usual scenario is to have a running integration test DB, and not to shut it down/restart it between builds.
But if you really want to, you could set up your continuous integration server to start/stop the DB.
You sound like you are trying to build a full continuous integration environment. You should probably look into using a full CI tool such as CruiseControl or Bamboo.
How I've done it before is to set up a dedicated CI DB that is accessible from the CI server, and then have a series of bash/python/whatever scripts run as an 'After Successful Build' step, which can then run whatever extra integration tasks you like. Pair that with something like Liquibase and you can wipe the CI DB and make sure it is up to date with the latest schema on every build.
Just to bring some fresh perspective into this matter:
You could also start the PostgreSQL database as a Docker instance.
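For example, a minimal sketch that wraps the tests in a throwaway container (the image tag and password are illustrative):

docker run -d --name it-postgres -e POSTGRES_PASSWORD=secret -p 5432:5432 postgres:15
mvn verify
docker rm -f it-postgres

A Docker Maven plugin, such as the ones below, can hook these steps into the pre- and post-integration-test phases for you.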
The plugin ecosystem for Docker still seems to be in flux, so you might need to decide for yourself which one fits. Here are a few links to speed up your search:
https://github.com/fabric8io/docker-maven-plugin
http://heidloff.net/article/23.09.2015102508NHEBVR.htm
https://dzone.com/articles/build-images-and-run-docker-containers-in-maven