My current Spring 3.0 project is integrating with Flyway.
Thanks to the Google Code site there is documentation I can count on, but unfortunately there is not much about integration with JPA.
So the questions are:
How do I integrate Flyway with persistence.xml, and how does it work? The JPA provider auto-generates schema updates each time, so how do I run a script before or after that?
I guess Flyway's migrations do not support HQL and the like, so is there any sample code I can go through to learn how to hook into the migration event? Should I design an interceptor or a new aspect? What should be done at the domain level?
Any hint is appreciated. Thanks in advance.
Flyway has no direct support for JPA or Spring. It basically runs your SQL (not HQL) scripts in order and keeps track of them, and does it well. It remains agnostic to how you use your database and how you produce your upgrade scripts.
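For example, a first migration could be a plain SQL file named with Flyway's versioned naming convention (the table and columns here are made-up placeholders):

-- V1__create_users.sql: a hypothetical first migration
CREATE TABLE users (
    id BIGINT PRIMARY KEY,
    email VARCHAR(255) NOT NULL
);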
However, there's hope. Your persistence provider will most likely support updating an existing schema (I know Hibernate and EclipseLink can), running ALTER and CREATE statements on startup. The generated SQL isn't perfect and won't always work, but it's a good start. Log these statements, collect them into an SQL file, clean it up and supply it to Flyway as a V*__*.sql migration file.
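As a sketch, with Hibernate you could capture the generated statements like this in hibernate.properties (these are standard Hibernate settings; adjust for your provider):

# Let Hibernate generate ALTER/CREATE statements on startup...
hibernate.hbm2ddl.auto=update
# ...and log them so they can be collected into a migration script.
hibernate.show_sql=true
hibernate.format_sql=true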
UPDATE: although there is no direct support for the Spring framework, you can easily integrate it with an existing Spring application. This approach is proven to work in production and plays nicely:
<bean id="flyway" class="com.googlecode.flyway.core.Flyway" init-method="migrate">
    <property name="dataSource" ref="..."/>
    ...
</bean>
Bonus: it works great with Java-style configuration (here in Scala) as well:
@Bean(initMethod = "migrate")
def flyway() = {
  val fly = new Flyway()
  fly.setDataSource(dataSource)
  fly
}
Related
Can anyone please help me with integrating Olingo (OData) into a Spring Boot Java application?
I'm pretty new to Spring Boot; I have implemented one project and want to convert it to Olingo (OData).
I have gone through various resources, but with the bunch of different approaches out there I'm not sure how to do it the correct way.
Please let me know if someone has worked on it and can guide me.
link to the project on which I applied spring-boot.
If you are trying to integrate Olingo JPA, there are a couple of things you will have to do:
Implement a JPAServiceFactory and initializeODataJPAContext; basically this is about defining the persistence unit and the entity manager (see the sketch after these steps).
Then you can create a Spring Boot configuration to mount your OData endpoint and initialize the EntityManagerFactory.
There you can also point to your database.
And finally you can define the JPA entities you want your service to expose.
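A minimal sketch of the first step, assuming the Olingo V2 JPA processor API (ODataJPAServiceFactory); the persistence unit name "demo" is a placeholder for your own unit from persistence.xml:

import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

import org.apache.olingo.odata2.jpa.processor.api.ODataJPAContext;
import org.apache.olingo.odata2.jpa.processor.api.ODataJPAServiceFactory;
import org.apache.olingo.odata2.jpa.processor.api.exception.ODataJPARuntimeException;

public class DemoJPAServiceFactory extends ODataJPAServiceFactory {

    // Hypothetical persistence-unit name; use the one from your persistence.xml.
    private static final String PERSISTENCE_UNIT = "demo";

    @Override
    public ODataJPAContext initializeODataJPAContext() throws ODataJPARuntimeException {
        ODataJPAContext ctx = getODataJPAContext();
        // In a real app you would create/cache the EntityManagerFactory once
        // (e.g. have Spring inject it) instead of creating it per request.
        EntityManagerFactory emf = Persistence.createEntityManagerFactory(PERSISTENCE_UNIT);
        ctx.setEntityManagerFactory(emf);
        ctx.setPersistenceUnitName(PERSISTENCE_UNIT);
        return ctx;
    }
}

The endpoint is then typically mounted by registering Olingo's REST servlet with this factory class as an init parameter.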
The full Spring Boot + JPA project sample is located on GitHub. Feel free to go through it, raise an issue or submit a pull request for improvements.
In my project I use an H2 in-memory database, and I want it to be created not by Hibernate but by an SQL script. In my hibernate.properties I set
hibernate.hbm2ddl.auto=none
to disable auto-creation of the database, and added
hibernate.hbm2ddl.import_files=schema.sql,insert-users.sql
schema.sql contains the SQL to create the schema, and insert-users.sql contains the initial data.
The project builds successfully, but when I try to hit the database, I get
a Table <tablename> not found exception.
Since Hibernate won't do this for you unless you use create or create-drop hbm2ddl, there are other ways to achieve what you want.
Specialized tools
There are tools created specifically for this: Flyway and Liquibase. These are often configured to run when the app is deployed and allow you to version the DB schema. They are applicable not only to testing (and mainly not to testing) but to production as well, and can ensure that the schema on all your environments is the same. If you use these tools, it's better to set hbm2ddl to validate.
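For instance, in hibernate.properties (validate is a standard hbm2ddl value):

# Flyway/Liquibase own the schema; Hibernate only verifies the mapping.
hibernate.hbm2ddl.auto=validate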
Spring's support
A less widespread way is to use Spring's support for embedded DBs:
<jdbc:embedded-database id="dataSource">
    <jdbc:script location="classpath:schema.sql"/>
    <jdbc:script location="classpath:test-data.sql"/>
</jdbc:embedded-database>
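The Java-config equivalent, as a sketch using Spring's EmbeddedDatabaseBuilder (part of spring-jdbc):

import javax.sql.DataSource;

import org.springframework.context.annotation.Bean;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseBuilder;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseType;

@Bean
public DataSource dataSource() {
    // Builds an in-memory H2 database and runs the scripts in order.
    return new EmbeddedDatabaseBuilder()
            .setType(EmbeddedDatabaseType.H2)
            .addScript("classpath:schema.sql")
            .addScript("classpath:test-data.sql")
            .build();
}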
Data for testing
If the intention is to create data for testing (not the schema), then it's better to create entities and use your DAO/repository layer to persist them in tests. This way you don't duplicate the mechanisms for persisting data.
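A minimal sketch of that approach, assuming JUnit 4 and a hypothetical UserRepository/User pair in your project:

import static org.junit.Assert.assertEquals;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:test-context.xml") // hypothetical test context
public class UserDataSetupTest {

    @Autowired
    private UserRepository userRepository; // hypothetical DAO/repository

    @Test
    public void persistsTestDataThroughTheRepository() {
        // Build the test fixture with entities instead of raw SQL.
        userRepository.save(new User("alice"));
        assertEquals(1, userRepository.count());
    }
}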
Two comments from the Hibernate documentation are relevant here:
This is useful for testing or demoing: by adding INSERT statements for example you can populate your database with a minimal set of data when it is deployed.
and
These statements are only executed if the schema is created ie if hibernate.hbm2ddl.auto is set to create or create-drop.
I'm not too sure that the import functionality will do what you want it to do.
I'm currently developing a web application using Spring MVC (without Maven).
What I need is to create a distributed transaction between two local databases, so that the code updates both of them in (theoretically) a two-phase commit.
Now, since I'm doing this for a school project, I'm in a simple environment which only needs to take a row from a table in one DB and put it in a table in the other DB, of course atomically (theoretically, such a transaction has to be distributed because I'm using two different databases rather than one).
My question is, how can I deploy a Spring bean that firstly connects to both MySQL databases and then does that distributed transaction? Should I use some external library or could I achieve all with only using the Spring framework? In which case, could you please kindly link me an example or a guide to do this?
Thank you in advance for your help :)
Spring has an interface PlatformTransactionManager, which is an abstraction, and it has many implementations like DataSourceTransactionManager, HibernateTransactionManager, etc.
Since you are using distributed transactions, you need to use JtaTransactionManager.
These transaction managers provided by Spring are wrappers around the implementations provided by other frameworks.
In the case of JTA, you would be using either an application server or a standalone JTA implementation like Atomikos.
Following are the steps:
Configure the transaction manager in Spring using the application server or standalone JTA implementation.
Enable transaction management in Spring.
Then put the @Transactional annotation above your method (a configuration sketch follows below).
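As a sketch, the classic XML wiring for a standalone Atomikos setup looks roughly like this (the Atomikos and Spring class names are standard; the two MySQL datasource beans are omitted):

<!-- Atomikos JTA transaction manager and user transaction -->
<bean id="atomikosTransactionManager" class="com.atomikos.icatch.jta.UserTransactionManager"
      init-method="init" destroy-method="close">
    <property name="forceShutdown" value="false"/>
</bean>
<bean id="atomikosUserTransaction" class="com.atomikos.icatch.jta.UserTransactionImp"/>

<!-- Spring's JTA wrapper delegating to Atomikos -->
<bean id="transactionManager" class="org.springframework.transaction.jta.JtaTransactionManager">
    <property name="transactionManager" ref="atomikosTransactionManager"/>
    <property name="userTransaction" ref="atomikosUserTransaction"/>
</bean>

<!-- Enables @Transactional -->
<tx:annotation-driven transaction-manager="transactionManager"/>

With this in place, a single @Transactional service method can write to both databases and commit or roll back atomically, provided each MySQL datasource is registered as an XA-capable datasource (e.g. Atomikos' AtomikosDataSourceBean wrapping MySQL's MysqlXADataSource).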
Have a look at the following links:
http://docs.spring.io/spring/docs/current/spring-framework-reference/html/transaction.html
http://www.byteslounge.com/tutorials/spring-jta-multiple-resource-transactions-in-tomcat-with-atomikos-example
I am developing a Spring Boot application that uses Spring Data JPA and will need to connect to many different databases, e.g. PostgreSQL, MySQL, MS-SQL, MongoDB.
I need to create all datasources at runtime, i.e. the user chooses these settings through a GUI in the running application:
- driver (one from a list),
- source,
- port,
- username,
- password.
After that, they write native SQL against the chosen database and get the results.
I have read a lot about this on Stack Overflow and the Spring forums (e.g. AbstractRoutingDataSource), but all of these tutorials show how to create datasources from XML configuration or a static definition in a Java bean. Is it possible to create many datasources at runtime? How do I manage transactions, and how do I create many session factories? Is it possible to use the @Transactional annotation? What is the best method to do this? Can someone explain how to do this step by step?
Hope it's not too late for an answer ;)
I developed a module which can be easily integrated into any Spring project. It uses a meta-datasource to hold the tenant-datasource connection details.
For the tenant-datasource an AbstractRoutingDataSource is used.
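The core of that approach, as a minimal sketch (AbstractRoutingDataSource is Spring's own class; TenantContext is a hypothetical ThreadLocal holder for the current tenant id):

import org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource;

// Hypothetical holder for the tenant chosen by the current request/thread.
class TenantContext {
    private static final ThreadLocal<String> CURRENT = new ThreadLocal<>();
    static void set(String tenantId) { CURRENT.set(tenantId); }
    static String get() { return CURRENT.get(); }
}

public class TenantRoutingDataSource extends AbstractRoutingDataSource {
    // Spring calls this on every connection request and uses the returned
    // key to pick one of the target DataSources registered via
    // setTargetDataSources(Map).
    @Override
    protected Object determineCurrentLookupKey() {
        return TenantContext.get();
    }
}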
Here you can find my core implementation using the AbstractRoutingDataSource.
https://github.com/Dactabird/multitenancy
Here is an example to show how to integrate it. https://github.com/Dactabird/multitenancy-sample
In this example I'm using H2 embedded db. But of course you can use whatever you want.
Feel free to modify it for your purposes or to ask if you have any questions left!
I'm familiar with how to make Spring handle multiple datasources dynamically via multiple persistence units and multiple entityManagerFactoryBean implementations, but what I'm struggling with is how to have a MySQL dialect and a DynamoDB dialect within the same Spring configuration, via spring-config XML files.
The work pattern is as follows:
[data POJO in, from some endpoint] -> Persist POJO into DynamoDB, retrieving the UUID of that object (business key as field on POJO) -> Persist UUID as a compound key (no referential integrity, it's just another column) into MySQL Database [with other related mapped entities].
I'm struggling with how on earth to add the DynamoDB instance to the Spring configuration files to achieve this.
For what it's worth, the related repositories are going to be in separate packages.
Any starters for 10 would be gratefully received! I've done some searching but all DynamoDB mapper frameworks seem to be at a much higher level - have I missed something here? I've been looking at Spring-Data DynamoDB but still can't make the link between the configuration file and Dynamo.
Thanks in advance,
A.
========= UPDATE IN THINKING =========
I think I've gone about this the wrong way. From digging around the samples a lot more and doing a local integration test [pure DynamoDB], I don't think it's possible to use DynamoDB as part of an EntityManagerFactory implementation. To that end, I think I'm going to have to "create" my own repository implementations that call out to the mapper and AWS connection helper classes etc. for Dynamo, rather than using any of the Spring-provided JPA code.
Unless anyone can recommend/suggest otherwise?
Question closed - after much investigation, the only real way to do it is to introduce one's own interpretation of a repository and a DAO-based implementation.
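A minimal sketch of such a hand-rolled DAO, assuming the AWS SDK's DynamoDBMapper (MyPojo and its table name are placeholders for your own @DynamoDBTable-annotated class):

import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBHashKey;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapper;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBTable;

@DynamoDBTable(tableName = "my_pojo") // hypothetical table name
class MyPojo {
    private String uuid;

    @DynamoDBHashKey
    public String getUuid() { return uuid; }
    public void setUuid(String uuid) { this.uuid = uuid; }
}

public class DynamoPojoDao {
    private final DynamoDBMapper mapper;

    public DynamoPojoDao(AmazonDynamoDB client) {
        this.mapper = new DynamoDBMapper(client);
    }

    public void save(MyPojo pojo) {
        mapper.save(pojo); // writes/updates the item in DynamoDB
    }

    public MyPojo findByUuid(String uuid) {
        return mapper.load(MyPojo.class, uuid); // lookup by hash key
    }
}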
There is one interesting project, however: Spring Data DynamoDB. It looks interesting but is not quite ready for an enterprise production release.