Add a Mapper to a cache with the Java API - java

I want to add a Mapper to a cache with the Java API in MyBatis. I tried the approach below and added the Mapper's namespace to the caches, but the Mapper's statements still have no cache. Is there any way to add the Mapper's statements to the cache?
...
Cache cache = (new CacheBuilder("package.name.SampleEntity"))
.implementation(PerpetualCache.class)
.addDecorator(LruCache.class)
.clearInterval(null)
.size(null)
.readWrite(true)
.blocking(false)
.properties(new Properties())
.build();
sqlSessionFactory.getConfiguration().addCache(cache);
...
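For what it's worth, a cache registered via Configuration.addCache() is only used by statements whose MappedStatement was bound to a cache when the mapper was parsed; registering a cache afterwards does not attach it to already-built statements. A hedged sketch of the usual alternative, assuming annotation-based mappers (SampleEntityMapper, the SQL, and the size value are illustrative, not from the original question):

```
import org.apache.ibatis.annotations.CacheNamespace;
import org.apache.ibatis.annotations.Select;
import org.apache.ibatis.cache.decorators.LruCache;

// Declaring the cache on the mapper itself lets MyBatis bind it to every
// statement in this namespace at parse time, instead of registering it
// after the fact with Configuration.addCache().
@CacheNamespace(eviction = LruCache.class, size = 1024, readWrite = true)
public interface SampleEntityMapper {
    @Select("SELECT * FROM sample_entity WHERE id = #{id}")
    SampleEntity findById(long id);
}
```

XML mappers get the same effect from a `<cache/>` element in the mapper file; either way the binding happens while the mapper is being parsed, not afterwards.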

Related

How to skip Ignite (de)serialization for local cache?

I'm facing an issue when trying to store a HikariDataSource in an Ignite cache: it can't be (de)serialized by Ignite. I like Ignite's features for caches, so I want to reuse it for local needs as well.
Caused by: org.apache.ignite.binary.BinaryInvalidTypeException: com.zaxxer.hikari.util.ConcurrentBag$$Lambda$2327/0x00000008010b9840
at org.apache.ignite.internal.binary.BinaryContext.descriptorForTypeId(BinaryContext.java:697)
at org.apache.ignite.internal.binary.BinaryReaderExImpl.deserialize0(BinaryReaderExImpl.java:1765)
at org.apache.ignite.internal.binary.BinaryReaderExImpl.deserialize(BinaryReaderExImpl.java:1724)
at org.apache.ignite.internal.binary.BinaryReaderExImpl.readField(BinaryReaderExImpl.java:1987)
at org.apache.ignite.internal.binary.BinaryFieldAccessor$DefaultFinalClassAccessor.read0(BinaryFieldAccessor.java:702)
at org.apache.ignite.internal.binary.BinaryFieldAccessor.read(BinaryFieldAccessor.java:187)
... 70 common frames omitted
How to skip (de)serialization for CacheMode.LOCAL caches in Ignite?
Use a HashMap if you need to keep a reference to the data source locally. A map doesn't serialize its entries, while Ignite's local cache always serializes records.
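As a minimal sketch of that map-based approach (using ConcurrentHashMap rather than HashMap for thread safety; a plain Object stands in for the real HikariDataSource):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class LocalCacheDemo {
    // A plain map stores references directly: nothing is marshalled,
    // so objects Ignite cannot serialize (pooled data sources, lambdas)
    // are safe to keep here.
    static final Map<String, Object> localCache = new ConcurrentHashMap<>();

    public static void main(String[] args) {
        Object dataSource = new Object(); // stand-in for the real HikariDataSource
        localCache.put("ds", dataSource);
        // the exact same reference comes back, untouched by any (de)serialization
        System.out.println(localCache.get("ds") == dataSource); // prints true
    }
}
```

The trade-off is that a map gives you none of Ignite's eviction, expiry, or metrics; it is only suitable for purely local, non-serializable objects.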

Invalidate entire namespace using Simple Spring Memcached

Has anyone tried invalidating an entire memcached namespace?
For example, I have two read methods with different keys:
@Override
@ReadThroughSingleCache(namespace = UrlClientExclusion.TABLE_NAME, expiration = 24 * HOUR)
public List<UrlClientExclusion> list(@ParameterValueKeyProvider String idClient) {
@Override
@ReadThroughSingleCache(namespace = UrlClientExclusion.TABLE_NAME, expiration = 24 * HOUR)
public UrlClientExclusion find(@ParameterValueKeyProvider int idUrlClientExclusion) {
and I want to delete the entire namespace UrlClientExclusion.TABLE_NAME on update/delete operations.
I can't use a key-list method because there are many instances of the app:
@InvalidateMultiCache(namespace = UrlClientExclusion.TABLE_NAME)
public int update(UrlClientExclusion urlClientExclusion, /*@ParameterValueKeyProvider List<Object> keylist*/ /* it is impossible for me to put all the keys in this list */) {
so the solution is to delete the entire namespace.
What is the annotation to do this?
Is it possible to build a custom annotation to delete an entire namespace? How?
Many thanks
Memcached doesn't support namespaces; SSM provides namespaces only as a logical abstraction. It is not possible to flush all keys from a given namespace because memcached doesn't group keys into namespaces. Memcached supports only flushing/removing a single key or all keys.
You can flush all data from the memcached instance, or you need to provide the exact keys that should be removed.
I don't know how this can be handled with the simple-spring-memcached lib, but I would suggest using Spring's Cache Abstraction instead. That way you can switch the cache storage to one of your preference, e.g. ConcurrentHashMap, Ehcache, Redis, etc., with just a configuration change in your application. For eviction of the namespace, you could do something like:
@CacheEvict(cacheNames = "url_client_exclusion", allEntries = true)
public int update(...)
Unfortunately there is no official support for Memcached offered by Pivotal, but if you really need to use Memcached, you could check out the Memcached Spring Boot library, which is compliant with the Spring Cache Abstraction.
There is a sample Java app where you can see how this lib is used; there you can also find an example of @CacheEvict usage (link).
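To illustrate what allEntries = true buys you, here is a minimal map-backed sketch (all names are illustrative; a ConcurrentHashMap-backed Spring cache behaves essentially like this on eviction, one map per cache name):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class NamespaceEvictionDemo {
    // one map per "cache name" -- clearing one map mirrors what
    // @CacheEvict(allEntries = true) does against a single named cache
    static final Map<String, Map<Object, Object>> caches = new ConcurrentHashMap<>();

    static Map<Object, Object> cache(String name) {
        return caches.computeIfAbsent(name, n -> new ConcurrentHashMap<>());
    }

    public static void main(String[] args) {
        cache("url_client_exclusion").put("42", "row-42");
        cache("url_client_exclusion").put("43", "row-43");
        cache("other").put("1", "keep-me");

        cache("url_client_exclusion").clear(); // evict the whole "namespace"

        System.out.println(cache("url_client_exclusion").size()); // prints 0
        System.out.println(cache("other").size());                // prints 1
    }
}
```

This is exactly the capability raw memcached lacks: there is no server-side grouping of keys, so a store that does group entries by cache name (or a library that emulates it) is needed to evict them all at once.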

Remove cache name from generated cache key

I would like to know if there is some way to remove the cache name from the generated cache key in Spring Boot 2.
This is the code I'm currently using to cache data:
@Cacheable(value = "products", key = "#product.id")
public SimilarProducts findSimilarProducts(Product product){}
Spring Boot is concatenating the string "products" to every single key I generate to save in the cache. I have already tried writing my own key generator, but Spring Boot keeps concatenating the string "products" to the generated keys. Thanks for your attention.
For example, when I use:
Product p = new Product();
p.setId("12345");
findSimilarProducts(p);
The generated key will be:
products::12345
I would like it to be only 12345.
spring boot keeps concatenating the string "products" to the generated keys.
Spring Boot (or the cache abstraction, for that matter) doesn't do such a thing, but a particular Cache implementation might. It would have been interesting to share a bit more detail about your setup, but I can only guess that you are using Redis as the cache store, and the default CacheKeyPrefix does indeed add the name of the cache.
Please review the documentation.
You can (and maybe you need to) disable the key prefix like this:
@Bean
public RedisCacheManager cacheManager(RedisConnectionFactory connectionFactory) {
    // defaultCacheConfig() is statically imported from RedisCacheConfiguration
    RedisCacheManager cacheManager = RedisCacheManager.builder(connectionFactory)
            .cacheDefaults(defaultCacheConfig().disableKeyPrefix())
            .build();
    return cacheManager;
}

Spring Data Cassandra Query DSL RxJava2

So I am using Spring Data Cassandra and RxJava. I am looking for a way to use RxJava's Observable with custom query building (the find...By abstraction is too complicated to use in my case), and I was planning on using QueryDSL (the method findAll(Predicate), but it does not support async).
So far my best shot is to use AsyncCassandraTemplate to build a Query and return it as a ListenableFuture so that it can be mapped to an Observable and used with RxJava. Is there any other way?
There's no QueryDsl support in Spring Data for Apache Cassandra. You can use Query objects to create queries and ReactiveCassandraTemplate for reactive API usage:
// query() and where() are static imports from Query and Criteria
Mono<Person> person = reactiveCassandraTemplate.selectOne(query(where("age").is(33)), Person.class);
Maybe<Person> maybe = Flowable.fromPublisher(person).firstElement();

How to create CaffeineCache object using Java?

I am trying to use the Caffeine cache. How do I create the object for the Caffeine cache using Java? I am not using Spring in my project for now.
According to Caffeine's official repository and wiki, Caffeine is a high-performance, Java 8 based caching library providing a near-optimal hit rate, inspired by Google Guava.
Because Caffeine is an in-memory cache, it is quite simple to instantiate a cache object.
For example:
LoadingCache<Key, Graph> graphs = Caffeine.newBuilder()
.maximumSize(10_000)
.expireAfterWrite(5, TimeUnit.MINUTES)
.refreshAfterWrite(1, TimeUnit.MINUTES)
.build(key -> createExpensiveGraph(key));
Look up an entry, returning null if not found:
Graph graph = graphs.getIfPresent(key);
Look up an entry, computing it if absent; returns null if not computable:
graph = graphs.get(key, k -> createExpensiveGraph(k));
Note: createExpensiveGraph(key) may be a DB getter or an actual computed graph.
Insert or update an entry:
graphs.put(key, graph);
Remove an entry:
graphs.invalidate(key);
EDIT:
Thanks to @BenManes's suggestion, I'm adding the dependency. Edit your pom.xml and add:
<dependency>
<groupId>com.github.ben-manes.caffeine</groupId>
<artifactId>caffeine</artifactId>
<version>1.0.0</version>
</dependency>
