Could anyone show me how we could save documents with CAS aware policy in Spring Data Aerospike so that it uses the generation from the record to update the document with EXPECT_GEN_EQUAL generation policy?
I tried this:
Customer customer = customerRepository.findOne("335672888");
customer.setFieldX(someValue);
customerRepository.save(customer);
But I found that Spring Data Aerospike always uses the NONE generation policy, so it ignores the version property (generation) and unconditionally overwrites the record when calling save.
Anyone got an idea? Thank you!
You should be using the spring-data-aerospike dependency with groupId com.aerospike, as it contains all the contributed fixes to the project:
https://mvnrepository.com/artifact/com.aerospike/spring-data-aerospike
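In a Maven build the dependency looks like this (2.2.0.RELEASE shown; pick the release matching your Spring Boot version):

```xml
<dependency>
    <groupId>com.aerospike</groupId>
    <artifactId>spring-data-aerospike</artifactId>
    <version>2.2.0.RELEASE</version>
</dependency>
```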
2.2.0.RELEASE already supports Spring Boot 2.2.
2.1.1.RELEASE is for Spring Boot 2.1.
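If you need CAS semantics before the Spring Data layer supports them, one workaround is to drop down to the underlying AerospikeClient and set the generation policy yourself. A sketch with the plain Aerospike Java client (host, namespace, set, and bin names are illustrative):

```java
import com.aerospike.client.AerospikeClient;
import com.aerospike.client.Bin;
import com.aerospike.client.Key;
import com.aerospike.client.Record;
import com.aerospike.client.policy.GenerationPolicy;
import com.aerospike.client.policy.WritePolicy;

public class CasUpdateExample {
    public static void main(String[] args) {
        try (AerospikeClient client = new AerospikeClient("localhost", 3000)) {
            Key key = new Key("test", "customers", "335672888");

            // read the current record, including its generation counter
            Record record = client.get(null, key);

            WritePolicy policy = new WritePolicy();
            policy.generationPolicy = GenerationPolicy.EXPECT_GEN_EQUAL;
            policy.generation = record.generation;

            // succeeds only if nobody changed the record since the read above;
            // otherwise throws AerospikeException with ResultCode.GENERATION_ERROR
            client.put(policy, key, new Bin("fieldX", "some value"));
        }
    }
}
```

On a generation conflict you would typically re-read the record and retry the update.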
I want to get the details of entries in my custom cache after data is cached or evicted.
I tried adding the actuator dependency and hitting the 'actuator/metrics' path, but I only see an empty Tomcat server cache. There is no sign of my custom cache, say myCache (the name I passed into the @Cacheable annotation's value attribute).
You tagged your question with Caffeine and Spring Boot, so I assume you use those two products.
If you use a recent Spring Boot and Caffeine, cache details are automatically available at /actuator/caches. If not, double-check that you have the needed libraries on your classpath and no configuration that enables another cache provider or disables caching altogether, such as spring.cache.type=none.
If you don't use Spring Boot, but just Spring, you need to add a CacheManager to your configuration; otherwise Spring defaults to a ConcurrentHashMap-based cache, which does not record statistics.
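For Caffeine specifically, per-cache hit/miss statistics only appear if Caffeine records them; with Spring Boot that can be switched on via the cache spec. A sketch in application.properties (cache name and size are illustrative):

```properties
# predeclare the cache so Boot creates it eagerly
spring.cache.cache-names=myCache
# recordStats enables Caffeine's hit/miss counters, surfaced via the actuator
spring.cache.caffeine.spec=maximumSize=500,recordStats
# expose the caches and metrics endpoints over HTTP
management.endpoints.web.exposure.include=caches,metrics
```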
I just want to use Aerospike as the backing cache for Spring CacheManager.
Should I use Spring Data Aerospike when I don't intend to use Aerospike as a data store but only as a cache?
Is there any implementation available similar to HazelcastCacheManager or do I need to write my own?
Any help is appreciated.
Found this implementation for Aerospike Cache Manager.
https://github.com/shaileshmishra008/spring-cache-aerospike
I was able to write a version using this project as a reference.
There were some big changes in the caching area of Spring Data Aerospike, starting with versions 2.5.0/3.0.0.
A Medium article about Caching with Spring Boot and Aerospike:
https://medium.com/aerospike-developer-blog/caching-with-spring-boot-and-aerospike-17b91267d6c
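Once an Aerospike-backed CacheManager bean is registered (and @EnableCaching is on a configuration class), usage is just standard Spring caching. A minimal sketch, with a hypothetical Customer type and cache name:

```java
import org.springframework.cache.annotation.Cacheable;
import org.springframework.stereotype.Service;

@Service
public class CustomerService {

    // minimal stand-in for the cached domain object (hypothetical)
    public static class Customer {
        public final String id;
        public Customer(String id) { this.id = id; }
    }

    // The first call with a given id executes the method and stores the result
    // in the Aerospike-backed cache "myCache"; later calls with the same id
    // are served from the cache without invoking the method body.
    @Cacheable(value = "myCache", key = "#id")
    public Customer findById(String id) {
        return new Customer(id); // stands in for an expensive lookup
    }
}
```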
I am working on a Spring Batch solution and planning to use MongoDB as a job repository. I am looking for references on this implementation but could not find any. Then I checked the spring-batch-core-3.0.7.RELEASE jar and could not see a MongoDB schema there. Does this mean Spring Batch does not support MongoDB as a job repository?
That is correct. Mongo is not a suitable data store for the job repository due to the transactionality requirements of the job repository. The data store must be ACID compliant in order to be used which is why we have focused our efforts on relational databases for the repository implementation to date.
There is a recent project (v1.0.0, released 2021-11-02) to handle that; it is not managed by the Spring team:
https://github.com/europeana/spring-batch-mongo
This library provides MongoDB JobRepository support for Spring Batch.
On the official Spring side, there is this open issue:
https://github.com/spring-projects/spring-batch/issues/877
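Until MongoDB support lands, the supported route is a relational store for the job repository; with Spring Boot, an embedded database is enough for development. A sketch in application.properties, assuming H2 is on the classpath (development only, not production advice):

```properties
# in-memory H2 database backing the job repository
spring.datasource.url=jdbc:h2:mem:batchdb
spring.datasource.driver-class-name=org.h2.Driver
# create the BATCH_* metadata tables on startup
spring.batch.initialize-schema=always
```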
I have some data in the DB, and users are given explicit read permissions on it using Spring ACL.
Does Spring have any standard way of filtering the data?
I'm not sure if you already solved it, but I've just found an implementation here.
It seems that it could be easily installed on an existing Spring project.
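For reference, Spring Security's method security also offers a standard way to do this: @PostFilter, evaluated against the ACL module's PermissionEvaluator. A sketch, assuming a hypothetical Document entity and that method security (prePostEnabled) and an AclPermissionEvaluator are already configured:

```java
import java.util.List;
import org.springframework.security.access.prepost.PostFilter;

public interface DocumentService {

    // After findAllDocuments() returns, Spring Security evaluates hasPermission(...)
    // against the configured AclPermissionEvaluator for each element and silently
    // drops the entries the caller has no READ permission on.
    @PostFilter("hasPermission(filterObject, 'READ')")
    List<Document> findAllDocuments();

    // minimal stand-in for the ACL-protected domain entity (hypothetical)
    class Document {
        Long id;
        Document(Long id) { this.id = id; }
    }
}
```

Note that @PostFilter loads the full result set and filters it in memory, so for large datasets a query-level filter is usually preferable.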
I am trying to get a very minimal JPA + SDN (Spring Data Neo4j) cross store project running and am trying to demonstrate that saving a partial entity using a JPA repository call will create a corresponding node in Neo4j.
I have followed the instructions / advice that I have been able to find on SO, Google and Spring's site but am currently still having trouble standing things up. I currently have a minimal test project created at:
https://github.com/simon-lam/sdn-cross-store-poc
The project uses Spring Boot and has a simple domain containing a graph entity, GraphNodeEntity.java, and a partial entity, PartialEntity.java. I have written a very basic test, PartialEntityRepositoryTest.java, to do a save on the partial entity and am seeing:
The wrong transaction manager seems to be used, because the CrossStoreNeo4jConfiguration class does not properly autowire entityManagerFactory; it is null
As a result of the above ^, no ID is assigned to my entity
I do not see any SDN activity in the logs at all
Am I doing something glaringly wrong?
More generally, I was hoping to confirm some assumptions and better understand cross store persistence support in general:
To enable it, do I need to enable advanced mapping?
As part of enabling advanced mapping, I need to set up AspectJ; does this include enabling load-time weaving? If so, is this accomplished through the @EnableLoadTimeWeaving configuration?
Assuming that all my configuration is eventually fixed, should I expect to see partial nodes persist in Neo4j when I persist them using a JPA repository? This should be handled by the cross store support which is driven by aspects right?
Thank you for any help that can be offered!
I sent a message to the Neo4j Google Group and got some feedback from Michael Hunger so I'm going to share here:
Turns out the cross store lib has been dormant for a while
JPA repos are not supported, only the EntityManager operations are
The cross store setup was not meant for a remote server and was not tested
So in summary my core understanding / assumptions were off!
Source: https://groups.google.com/forum/#!topic/neo4j/FGI8692AVJQ
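For anyone still experimenting with it: since only EntityManager operations are supported, a save would have to go through the EntityManager rather than a JPA repository. A sketch (the service class name is hypothetical, PartialEntity is the entity from the linked project, and given that the library is dormant this is illustrative only):

```java
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class PartialEntityService {

    @PersistenceContext
    private EntityManager entityManager;

    // Persist through the EntityManager directly rather than a JPA repository,
    // since only EntityManager operations are advised by the cross-store aspects.
    @Transactional
    public PartialEntity create(PartialEntity entity) {
        entityManager.persist(entity);
        return entity;
    }
}
```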