I am working on a Spring Batch solution and planning to use MongoDB as the job repository. I have been looking for references on this implementation but could not find any. I then checked spring-batch-core-3.0.7.RELEASE.jar and could not see a MongoDB schema in it. Does this mean Spring Batch does not support MongoDB as a job repository?
That is correct. Mongo is not a suitable data store for the job repository due to its transactionality requirements. The data store must be ACID-compliant in order to be used, which is why we have focused our efforts on relational databases for the repository implementation to date.
There is a recent project (v1.0.0, released 2021-11-02) that handles this; it is not managed by the Spring team:
https://github.com/europeana/spring-batch-mongo
This library provides MongoDB JobRepository support for Spring Batch.
On the official Spring side, there is this open issue:
https://github.com/spring-projects/spring-batch/issues/877
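For reference, a minimal sketch of the supported approach, i.e. backing the job repository with a relational, ACID-compliant DataSource (an embedded H2 database is used here purely for illustration):

import javax.sql.DataSource;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseBuilder;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseType;

@Configuration
@EnableBatchProcessing
public class BatchRepositoryConfig {

    // @EnableBatchProcessing builds the JobRepository on top of this DataSource.
    // The relational schema scripts ship inside spring-batch-core (schema-h2.sql, schema-mysql.sql, ...).
    @Bean
    public DataSource dataSource() {
        return new EmbeddedDatabaseBuilder()
                .setType(EmbeddedDatabaseType.H2)
                .addScript("classpath:org/springframework/batch/core/schema-h2.sql")
                .build();
    }
}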
Hi Neo4j/Spring Boot masters,
I am new to Spring Boot and I am trying to execute a Cypher query without the @Query annotation.
I can see that my requirement is possible with entityManager.createQuery() in a Spring (javax) repository,
but the adapter for Neo4j doesn't seem to have an EntityManager.
Any help is greatly appreciated.
I used the Neo4j Java driver in my Spring Boot application, as described in Using Neo4j from Java.
This is maybe not the first thing Neo4j offers, but I feel it is simple. You can also use it without transactions.
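For illustration, a minimal sketch of that approach with the Neo4j Java driver (4.x); the URI, credentials and Cypher statement are placeholders:

import org.neo4j.driver.AuthTokens;
import org.neo4j.driver.Driver;
import org.neo4j.driver.GraphDatabase;
import org.neo4j.driver.Result;
import org.neo4j.driver.Session;

public class DirectDriverExample {

    public static void main(String[] args) {
        // Open a driver and a session; both are AutoCloseable
        try (Driver driver = GraphDatabase.driver("bolt://localhost:7687",
                     AuthTokens.basic("neo4j", "secret"));
             Session session = driver.session()) {

            // Run an arbitrary Cypher statement without any @Query annotation
            Result result = session.run("MATCH (p:Person) RETURN p.name AS name");
            result.forEachRemaining(record -> System.out.println(record.get("name").asString()));
        }
    }
}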
Depending on your needs, have a look at org.springframework.data.neo4j.core.Neo4jClient - specifically the query() method. If you have set up Neo4j correctly in Spring, you can simply inject this class.
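A minimal sketch of what that can look like; the node label, property names and return type below are illustrative assumptions:

import java.util.Collection;
import org.springframework.data.neo4j.core.Neo4jClient;
import org.springframework.stereotype.Service;

@Service
public class PersonQueryService {

    private final Neo4jClient neo4jClient;

    public PersonQueryService(Neo4jClient neo4jClient) {
        this.neo4jClient = neo4jClient;
    }

    // Executes a Cypher statement without @Query, binding a parameter and mapping each record
    public Collection<String> namesOlderThan(int age) {
        return neo4jClient
                .query("MATCH (p:Person) WHERE p.age > $age RETURN p.name AS name")
                .bind(age).to("age")
                .fetchAs(String.class)
                .mappedBy((typeSystem, record) -> record.get("name").asString())
                .all();
    }
}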
I'm developing a web application with Spring Boot and MongoDB. I want my services to work with the @Transactional Spring annotation, but I don't know if that really works (I haven't worked with MongoDB before).
I added the annotation and it seems that everything runs fine (the application runs and I can perform all CRUD operations), but I don't know whether Spring is ignoring the annotation and working as usual, or is really considering the transactionality.
In another post, I have seen that I should add a new bean in the configuration class in order to enable transactionality between Spring and MongoDB. Is that really necessary? I only use transactions on single Mongo documents.
@Transactional works only with spring-data-mongodb version 2.1.0 and higher:
https://docs.spring.io/spring-data/mongodb/docs/2.1.0.RELEASE/api/
Indeed, you have to add the bean:
@Bean
MongoTransactionManager transactionManager(MongoDatabaseFactory dbFactory) {
    // Registers a MongoDB transaction manager so @Transactional is actually honoured
    return new MongoTransactionManager(dbFactory);
}
I don't know if Spring is ignoring the annotation and it is working as usual, or is really considering the transactionality
For this, you can throw an exception between two DB updates and check whether the first update has been rolled back.
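A minimal sketch of such a check, assuming a hypothetical Person document and PersonRepository, and that the MongoTransactionManager bean above is registered (note that MongoDB server-side transactions also require a replica set):

import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class TransactionCheckService {

    private final PersonRepository personRepository; // hypothetical Spring Data MongoDB repository

    public TransactionCheckService(PersonRepository personRepository) {
        this.personRepository = personRepository;
    }

    @Transactional
    public void updateTwoDocuments(Person first, Person second, boolean failInBetween) {
        personRepository.save(first);   // should be rolled back if the exception below is thrown
        if (failInBetween) {
            throw new IllegalStateException("simulated failure between the two updates");
        }
        personRepository.save(second);  // only reached when failInBetween is false
    }
}

If the first document is still updated after the exception, the annotation is being ignored and no MongoDB transaction is in play.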
But if you only use transactions within a single Mongo document, you don't need the @Transactional annotation:
In MongoDB, a write operation is atomic on the level of a single
document, even if the operation modifies multiple embedded documents
within a single document.
MongoDB documentation - Transactions
For reactive-style MongoDB and Spring Boot integration, the answer I provided here may be useful.
I'm currently developing a web application using Spring MVC (without Maven).
What I need is to create a distributed transaction across two local databases, so that the code updates both of them in (theoretically) a two-phase commit.
Since I'm doing this for a school project, I'm in a simple environment which only needs to take a row from a table in one database and put it into a table in the other database, atomically of course (theoretically, such a transaction should be distributed because I'm using two different databases rather than one).
My question is, how can I write a Spring bean that first connects to both MySQL databases and then performs that distributed transaction? Should I use some external library, or can I achieve this with the Spring Framework alone? In that case, could you please link me to an example or a guide?
Thank you in advance for your help :)
Spring has an interface, PlatformTransactionManager, which is an abstraction with many implementations such as DataSourceTransactionManager, HibernateTransactionManager, etc. Since you are using distributed transactions, you need the JtaTransactionManager. These transaction managers provided by Spring are wrappers around the implementations provided by other frameworks. In the case of JTA, you would use either an application server or a standalone JTA implementation such as Atomikos.
The steps are:
1. Configure the transaction manager in Spring, backed by the application server or a standalone JTA implementation.
2. Enable transaction management in Spring.
3. Put the @Transactional annotation on your methods.
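A minimal sketch of steps 1 and 2 with a standalone Atomikos setup and Java configuration; the resource name, URL and credentials are placeholders, and the second MySQL datasource would follow the same pattern:

import com.atomikos.icatch.jta.UserTransactionImp;
import com.atomikos.icatch.jta.UserTransactionManager;
import com.atomikos.jdbc.AtomikosDataSourceBean;
import java.util.Properties;
import javax.sql.DataSource;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.transaction.annotation.EnableTransactionManagement;
import org.springframework.transaction.jta.JtaTransactionManager;

@Configuration
@EnableTransactionManagement
public class JtaConfig {

    // First MySQL database, enlisted as an XA resource through Atomikos
    @Bean(initMethod = "init", destroyMethod = "close")
    public DataSource dataSourceOne() {
        AtomikosDataSourceBean ds = new AtomikosDataSourceBean();
        ds.setUniqueResourceName("dbOne");
        ds.setXaDataSourceClassName("com.mysql.cj.jdbc.MysqlXADataSource");
        Properties props = new Properties();
        props.setProperty("url", "jdbc:mysql://localhost:3306/db_one");
        props.setProperty("user", "user");
        props.setProperty("password", "password");
        ds.setXaProperties(props);
        return ds;
    }

    // Atomikos transaction service (standalone JTA implementation)
    @Bean(initMethod = "init", destroyMethod = "close")
    public UserTransactionManager atomikosTransactionManager() {
        UserTransactionManager utm = new UserTransactionManager();
        utm.setForceShutdown(false);
        return utm;
    }

    // Spring's JTA transaction manager wrapping the Atomikos implementation,
    // so that a single @Transactional method can span both datasources
    @Bean
    public JtaTransactionManager transactionManager() throws Exception {
        UserTransactionImp userTransaction = new UserTransactionImp();
        userTransaction.setTransactionTimeout(300);
        return new JtaTransactionManager(userTransaction, atomikosTransactionManager());
    }
}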
Have a look at the following links:
http://docs.spring.io/spring/docs/current/spring-framework-reference/html/transaction.html
http://www.byteslounge.com/tutorials/spring-jta-multiple-resource-transactions-in-tomcat-with-atomikos-example
I am developing a Spring Boot application that uses Spring Data JPA and will need to connect to many different databases, e.g. PostgreSQL, MySQL, MS-SQL, MongoDB.
I need to create all datasources at runtime, i.e. the user chooses these settings via a GUI in the running application:
- driver (one of a list),
- source,
- port,
- username,
- password.
After that, the user writes native SQL against the chosen database and gets the results.
I have read a lot about this on Stack Overflow and the Spring forums (e.g. AbstractRoutingDataSource), but all of these tutorials show how to create datasources from XML configuration or a static definition in a Java bean. Is it possible to create many datasources at runtime? How do I manage transactions, and how do I create many session factories? Is it possible to use the @Transactional annotation? What is the best way to do this? Can someone explain how to do it step by step?
Hope it's not too late for an answer ;)
I developed a module which can be easily integrated into any Spring project. It uses a meta-datasource to hold the tenant-datasource connection details.
For the tenant-datasource an AbstractRoutingDataSource is used.
Here you can find my core implementation using the AbstractRoutingDataSource:
https://github.com/Dactabird/multitenancy
Here is an example to show how to integrate it. https://github.com/Dactabird/multitenancy-sample
In this example I'm using an embedded H2 database, but of course you can use whatever you want.
Feel free to modify it for your purposes, or to ask if you have any questions!
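For reference, a minimal sketch of the general AbstractRoutingDataSource pattern (a simplified illustration under my own assumptions, not the actual code of the library above):

import java.util.Map;
import javax.sql.DataSource;
import org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource;

public class TenantRoutingDataSource extends AbstractRoutingDataSource {

    // Key of the target datasource the current thread should use
    private static final ThreadLocal<String> CURRENT_TENANT = new ThreadLocal<>();

    public static void setCurrentTenant(String tenantKey) {
        CURRENT_TENANT.set(tenantKey);
    }

    @Override
    protected Object determineCurrentLookupKey() {
        return CURRENT_TENANT.get();
    }

    // Target datasources can be registered (or replaced) at runtime
    public static TenantRoutingDataSource of(Map<Object, Object> targetDataSources,
                                             DataSource defaultDataSource) {
        TenantRoutingDataSource router = new TenantRoutingDataSource();
        router.setTargetDataSources(targetDataSources);
        router.setDefaultTargetDataSource(defaultDataSource);
        router.afterPropertiesSet(); // rebuilds the resolved datasource lookup map
        return router;
    }
}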
I'm familiar with how to make Spring handle multiple datasources dynamically via multiple persistence units and multiple entityManagerFactoryBean implementations, but what I'm struggling with is how to have a MySQL dialect and a DynamoDB dialect within the same Spring configuration, via spring-config XML files.
The work pattern is as follows:
[data POJO in, from some endpoint] -> Persist the POJO into DynamoDB, retrieving the UUID of that object (business key as a field on the POJO) -> Persist the UUID as a compound key (no referential integrity, it's just another column) into the MySQL database [with other related mapped entities].
I'm struggling with quite how on earth to go about adding the DynamoDB instance into the Spring configuration files to achieve this.
For what it's worth, the related repositories are going to be in separate packages.
Any starters for 10 would be gratefully received! I've done some searching, but all the DynamoDB mapper frameworks seem to operate at a much higher level - have I missed something here? I've been looking at Spring Data DynamoDB but still can't make the link between the configuration file and Dynamo.
Thanks in advance,
A.
========= UPDATE IN THINKING =========
I think I've gone about this the wrong way. From digging around the samples a lot more and doing a local integration test [pure DynamoDB], I don't think it's possible to use DynamoDB as part of an EntityManagerFactory implementation. To that end, I think I'm going to have to "create" my own repository implementations that call out to the mapper and the AWS connection helper classes etc. for Dynamo, rather than using any of the Spring-provided JPA code.
Unless anyone can recommend/suggest otherwise?
Question closed - after much investigation, the only real way to do it is to introduce one's own interpretation of a repository and a DAO-based implementation.
There is one interesting project, however: Spring Data DynamoDB. It looks interesting, but it is not quite ready for an enterprise production release.
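For anyone following the hand-rolled DAO route, a minimal sketch of a repository-style class built directly on the AWS SDK's DynamoDBMapper (the Customer entity and table name here are hypothetical):

import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBHashKey;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapper;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBTable;
import org.springframework.stereotype.Repository;

@DynamoDBTable(tableName = "Customer")
class Customer {

    private String uuid;

    @DynamoDBHashKey(attributeName = "uuid")
    public String getUuid() { return uuid; }
    public void setUuid(String uuid) { this.uuid = uuid; }
}

@Repository
public class CustomerDynamoDao {

    private final DynamoDBMapper mapper;

    public CustomerDynamoDao() {
        // In a real setup the client/mapper would be Spring-managed beans with region/credentials configured
        AmazonDynamoDB client = AmazonDynamoDBClientBuilder.standard().build();
        this.mapper = new DynamoDBMapper(client);
    }

    // Persist the POJO into DynamoDB; the UUID can then be stored in MySQL via JPA as usual
    public void save(Customer customer) {
        mapper.save(customer);
    }

    public Customer findByUuid(String uuid) {
        return mapper.load(Customer.class, uuid);
    }
}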