EHCache 3 not working as a Map - java

I'm using EHCache in my Spring application and I don't think it's working as expected.
As you can see below, I'm adding the same data to a HashMap and also to the Ehcache, but when I try to get it from the Ehcache it prints null.
Is there anything wrong in the way I have initialized the cache, or is there something I have missed?
My class
@Named
public class Myclass extends DataLoader {
    private static final Logger LOGGER = Logger.getLogger(Myclass.class.getName());

    @Inject
    private DictionaryInitializer dictionaryLoader;

    @Inject
    @Named("fileLoader")
    private FileLoader fileLoader;

    private Cache<String,Object> dictionary;

    @PostConstruct
    @Override
    public void loadResource() {
        List<String> spellTerms = null;
        dictionary = dictionaryLoader.getCache();
        String fileName = "C:\\spelling.txt";
        spellTerms = fileLoader.loadResource(fileName);
        HashMap<String,Object> dictData = new HashMap<>();
        for (String line : spellTerms) {
            for (String key : parseWords(line)) {
                HashMap<String,Object> tempMap = dictionaryUtil.indexWord(key);
                if (tempMap != null && !tempMap.isEmpty()) {
                    Set<Map.Entry<String, Object>> entries = tempMap.entrySet();
                    for (Map.Entry<String, Object> entry : entries) {
                        dictionary.put(entry.getKey(), entry.getValue());
                    }
                }
                dictData.putAll(tempMap);
            }
        }
        System.out.println(dictData.get("urcle"));   // prints 45670
        System.out.println(dictionary.get("urcle")); // prints null
    }

    private static Iterable<String> parseWords(String text) {
        List<String> allMatches = new ArrayList<String>();
        Matcher m = Pattern.compile("[\\w-[\\d_]]+").matcher(text.toLowerCase());
        while (m.find()) {
            allMatches.add(m.group());
        }
        return allMatches;
    }
}
My DictionaryInitializer is shown below:
public class DictionaryInitializer {
    private static final Logger LOGGER = Logger.getLogger(DictionaryInitializer.class.getName());

    private Cache<String,Object> dictionary;

    @PostConstruct
    public void initializeCache() {
        CacheManager cacheManager = CacheManagerBuilder.newCacheManagerBuilder()
                .build(true);
        this.dictionary = cacheManager.createCache("myCache",
                CacheConfigurationBuilder.newCacheConfigurationBuilder(String.class, Object.class,
                        ResourcePoolsBuilder.heap(100)).build());
        LOGGER.info("Dictionary loaded");
    }

    public Cache<String,Object> getCache() {
        return dictionary;
    }

    public void setCache(Cache<String,Object> dictionary) {
        this.dictionary = dictionary;
    }
}

I have found the issue.
I initialized the cache with a heap pool of just 100 entries, so the cache holds at most 100 entries and evicts the rest.
Fixed that and it works as expected.
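For reference, a minimal sketch of the fix using the same Ehcache 3 builders as above (the 100_000 capacity is an illustrative value, not from the original post; size it to your data):

```java
import org.ehcache.Cache;
import org.ehcache.CacheManager;
import org.ehcache.config.builders.CacheConfigurationBuilder;
import org.ehcache.config.builders.CacheManagerBuilder;
import org.ehcache.config.builders.ResourcePoolsBuilder;

public class DictionaryCacheDemo {
    public static void main(String[] args) {
        CacheManager cacheManager = CacheManagerBuilder.newCacheManagerBuilder().build(true);
        // Size the heap pool large enough for the whole dictionary;
        // heap(100) in the question meant 100 *entries*, not bytes.
        Cache<String, Object> dictionary = cacheManager.createCache("myCache",
                CacheConfigurationBuilder.newCacheConfigurationBuilder(String.class, Object.class,
                        ResourcePoolsBuilder.heap(100_000)).build());
        dictionary.put("urcle", 45670);
        System.out.println(dictionary.get("urcle")); // entry fits in the pool, so it is not evicted
        cacheManager.close();
    }
}
```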

Related

Java: How to Mock a protected method inside a static child class

I have a protected method inside a static child class. I am running a test case; it passes, but code coverage is not increasing.
public class A {
    private static final String var1 = "key1";

    protected static class B extends OsCmd {
        private String abc1;
        private String abc2;

        protected B(String xyz, String xyz2) {
            this.abc1 = xyz;
            this.abc2 = xyz2;
        }

        @Override
        protected void updateEnv(Map<String, String> env) {
            env.put(VAR1, "FALSE");
            env.put(VAR2, "TRUE");
            env.put("key3", abc1);
            env.put("key4", abc2);
        }
    }
}
Below is my test case
@ExtendWith(MockitoExtension.class)
public class ATest {
    private A mockA;

    @BeforeEach
    public void setup() {
        mockA = Mockito.spy(A.class);
    }

    @Test
    public void test2() {
        try (MockedConstruction<A.B> mockedConstruction =
                     mockConstruction(A.B.class)) {
            Map<String, String> map = new HashMap<String, String>();
            map.put("key1", "value1");
            A.B mockB = new A.B("a", "b");
            //doNothing().when(mockB).updateEnv(map);
            mockB.updateEnv(map);
        }
    }
}
Can someone please help me here? What mistake am I making?
When you mock the constructor, all internal method calls are also mocked and do not go through the actual code.
If you remove the following try-with-resources:
try (MockedConstruction<A.B> mockedConstruction =
        mockConstruction(A.B.class))
the real code will be executed and the coverage will increase.
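To illustrate why, here is a plain-Java sketch with a simplified stand-in for the question's A.B (the OsCmd parent and the VAR1/VAR2 entries are omitted; "key3"/"key4" match the question's updateEnv). Constructing the object directly means the real method body runs, which is what coverage tools record:

```java
import java.util.HashMap;
import java.util.Map;

public class ATestSketch {
    // Simplified stand-in for the question's A.B inner class.
    static class B {
        private final String abc1;
        private final String abc2;

        B(String xyz, String xyz2) {
            this.abc1 = xyz;
            this.abc2 = xyz2;
        }

        protected void updateEnv(Map<String, String> env) {
            env.put("key3", abc1);
            env.put("key4", abc2);
        }
    }

    public static void main(String[] args) {
        Map<String, String> map = new HashMap<>();
        B realB = new B("a", "b");   // real constructor: no mockConstruction
        realB.updateEnv(map);        // real method body runs, so coverage is recorded
        System.out.println(map.get("key3") + " " + map.get("key4")); // prints "a b"
    }
}
```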

Dynamodb attribute converter provider for enhanced type extending Hashmap

I have a type which extends HashMap<String, String>. As per the documentation here, it is possible to add a custom converter for the type, but it does not seem to work. The contents of the HashMap don't get converted; the output looks like this:
"summary": {
"en": null
},
Any idea how to convert Label and its fields along with its HashMap contents?
Parent
@DynamoDbBean(converterProviders = {
    CustomAttributeConverterProvider.class,
    DefaultAttributeConverterProvider.class})
public class Summary extends BaseEntry {
    private @Valid Label summary = null;
}
Child
@DynamoDbBean(converterProviders = {
    CustomAttributeConverterProvider.class,
    DefaultAttributeConverterProvider.class})
public class Label extends HashMap<String, String> {
    private @Valid String en = null;
}
HashMapAttributeConverter
public class HashMapAttributeConverter implements AttributeConverter<Map<String, String>> {
    private static AttributeConverter<Map<String, String>> mapConverter;

    /** Default constructor. */
    public HashMapAttributeConverter() {
        mapConverter =
            MapAttributeConverter.builder(EnhancedType.mapOf(String.class, String.class))
                .mapConstructor(HashMap::new)
                .keyConverter(StringStringConverter.create())
                .valueConverter(StringAttributeConverter.create())
                .build();
    }

    @Override
    public AttributeValue transformFrom(Map<String, String> input) {
        return mapConverter.transformFrom(input);
    }

    @Override
    public Map<String, String> transformTo(AttributeValue input) {
        return mapConverter.transformTo(input);
    }

    @Override
    public EnhancedType<Map<String, String>> type() {
        return mapConverter.type();
    }

    @Override
    public AttributeValueType attributeValueType() {
        return mapConverter.attributeValueType();
    }
}
CustomAttributeConverterProvider
public class CustomAttributeConverterProvider implements AttributeConverterProvider {
    private final List<AttributeConverter<?>> customConverters =
        Arrays.asList(new HashMapAttributeConverter());
    private final Map<EnhancedType<?>, AttributeConverter<?>> customConvertersMap;
    private final AttributeConverterProvider defaultProvider =
        DefaultAttributeConverterProvider.create();

    public CustomAttributeConverterProvider() {
        customConvertersMap =
            customConverters.stream().collect(Collectors.toMap(AttributeConverter::type, c -> c));
    }

    @Override
    public <T> AttributeConverter<T> converterFor(EnhancedType<T> enhancedType) {
        return (AttributeConverter<T>)
            customConvertersMap.computeIfAbsent(enhancedType, defaultProvider::converterFor);
    }
}
@Override
public <T> AttributeConverter<T> converterFor(EnhancedType<T> type) {
    // in this method you have to return only your converter based on type;
    // otherwise null should be returned. It will fix your issue.
}
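A hedged sketch of what that advice could look like, reusing the customConvertersMap field from the question's provider (an illustration, not a verified fix):

```java
@Override
@SuppressWarnings("unchecked")
public <T> AttributeConverter<T> converterFor(EnhancedType<T> type) {
    // Return the custom converter registered for this exact EnhancedType.
    // Returning null for every other type lets the enhanced client fall back
    // to DefaultAttributeConverterProvider, listed second in @DynamoDbBean.
    return (AttributeConverter<T>) customConvertersMap.get(type);
}
```

The key difference from the question's version is that this provider no longer delegates to the default provider itself (via computeIfAbsent), which would claim responsibility for every type.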

Spring, ehCache, multiple cache managers throwing errors where pre-existing caches of the same name found

Good evening all... maybe someone can help me sort this out.
I'm attempting to set up multiple cache managers, each with its own set of distinct caches. To that end, I have the following:
MyCachingFactory:
public interface MyCachingFactory {
    public JCacheCacheManager cacheManager();
    public JCacheCacheManager commonCacheManager();
    public JCacheCacheManager secondaryCacheManager();
    public JCacheCacheManager tertiaryCacheManager();
}
MyCachingFactoryImpl:
@EnableCaching
public class MyCachingFactoryImpl implements MyCachingFactory {
    private static final int DEFAULT_CACHE_EXPIRY = 64800;
    private static final List<String> DEFAULT_CACHES = Arrays.asList(
            "default");
    private static final List<String> COMMON_CACHES = Arrays.asList(
            "common_default",
            "common_config",
            "common_session",
            "common_roles",
            "common_capabilities");
    private static final List<String> SECONDARY_CACHES = Arrays.asList(
            "secondary_default",
            "secondary_foo");
    private static final List<String> TERTIARY_CACHES = Arrays.asList(
            "tertiary_default",
            "tertiary_foo");

    @Bean
    public JCacheCacheManager cacheManager() {
        return getNewJCacheManager(DEFAULT_CACHES);
    }

    @Bean
    public JCacheCacheManager commonCacheManager() {
        return getNewJCacheManager(COMMON_CACHES);
    }

    @Bean
    public JCacheCacheManager secondaryCacheManager() {
        return getNewJCacheManager(SECONDARY_CACHES);
    }

    @Bean
    public JCacheCacheManager tertiaryCacheManager() {
        return getNewJCacheManager(TERTIARY_CACHES);
    }

    private JCacheCacheManager getNewJCacheManager(List<String> cacheNames) {
        JCacheCacheManager springJCacheManager = new JCacheCacheManager();
        CachingProvider provider = Caching.getCachingProvider();
        EhcacheCachingProvider ehcacheProvider = (EhcacheCachingProvider) provider;
        javax.cache.CacheManager jCacheManager = ehcacheProvider.getCacheManager();
        AtomicInteger count = new AtomicInteger(1);
        cacheNames.forEach((listName) -> {
            logger.debug("[" + count + "] Creating cache name [" + listName + "].");
            jCacheManager.createCache(listName, new MutableConfiguration<>()
                    .setExpiryPolicyFactory(TouchedExpiryPolicy.factoryOf(new Duration(TimeUnit.SECONDS, DEFAULT_CACHE_EXPIRY)))
                    .setStoreByValue(false)
                    .setStatisticsEnabled(true));
            count.incrementAndGet();
        });
        springJCacheManager.setCacheManager(jCacheManager);
        return springJCacheManager;
    }
}
Example class:
public class Example {
    @Resource
    private JCacheCacheManager commonCacheManager;

    @CacheResult(cacheName = "common_config")
    public String getCachedConfigByName(String name) { ... }

    @CacheResult(cacheName = "common_config")
    public String getCachedConfigByType(String type) { ... }
}
However, when starting up my application, I see my debug statements for cache creation iterating over the same list (in this case, COMMON_CACHES) multiple times. This, in turn, causes a nested exception: Spring complains that a cache named X already exists.
What am I doing wrong here?
Thanks in advance (go easy please... I am new to spring caching).

AnnotationConfigApplicationContext#4c0bc4 has not been refreshed yet

I'm getting a stack trace for the following code:
public interface SequenceDAO {
    public Sequence getSequence(String sequenceId);
    public int getNextValue(String sequenceId);
}
public class Sequence {
    private int initial;
    private String prefix;
    private String suffix;

    public Sequence(int initial, String prefix, String suffix) {
        this.initial = initial;
        this.prefix = prefix;
        this.suffix = suffix;
    }
}
@Component("SequenceDAO")
public class SequenceDAOImpl implements SequenceDAO {
    private Map<String, Sequence> sequences;
    private Map<String, Integer> values;

    public SequenceDAOImpl() {
        sequences = new HashMap<>();
        sequences.put("IT", new Sequence(30, "IT", "A"));
        values = new HashMap<>();
        values.put("IT", 10000);
    }

    @Override
    public Sequence getSequence(String sequenceId) {
        return sequences.get(sequenceId);
    }

    @Override
    public int getNextValue(String sequenceId) {
        int value = values.get(sequenceId);
        values.put(sequenceId, value + 1);
        return value;
    }
}
public static void main(String[] args) {
    AnnotationConfigApplicationContext context = new AnnotationConfigApplicationContext();
    context.scan("com.example");
    SequenceDAO obj = context.getBean("SequenceDAO", SequenceDAO.class);
    System.out.println(obj.getNextValue("IT"));
    System.out.println(obj.getSequence("IT"));
}
Exception in thread "main" java.lang.IllegalStateException: org.springframework.context.annotation.AnnotationConfigApplicationContext#4c0bc4 has not been refreshed yet
at org.springframework.context.support.AbstractApplicationContext.assertBeanFactoryActive(AbstractApplicationContext.java:1041)
at org.springframework.context.support.AbstractApplicationContext.getBean(AbstractApplicationContext.java:1059)
at com.example.SpringAnnotationsSequenceGeneratorWithDaoIntroductionApplication.main(SpringAnnotationsSequenceGeneratorWithDaoIntroductionApplication.java:14)
I am new to Spring and I am learning Spring without annotations, so if anyone can tell me what went wrong here, any help is appreciated.
Best Regards
Your context init should look like this:
ApplicationContext aContext = new AnnotationConfigApplicationContext(ConcertConfig.class);
@Configuration
@EnableAspectJAutoProxy
@ComponentScan
public class ConcertConfig {
}
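Alternatively, keeping the question's no-arg constructor and scan(), the missing step is a refresh() call; an AnnotationConfigApplicationContext cannot serve beans until it has been refreshed:

```java
AnnotationConfigApplicationContext context = new AnnotationConfigApplicationContext();
context.scan("com.example");
context.refresh(); // without this, getBean() throws "has not been refreshed yet"
SequenceDAO obj = context.getBean("SequenceDAO", SequenceDAO.class);
```

Passing the configuration class to the constructor, as shown above, performs the register-and-refresh steps for you, which is why it also fixes the error.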

Custom CacheResolver not working

I have a Spring Boot project with a custom CacheResolver, as I need to decide at runtime which cache to use. I don't have any compilation errors, but when I run some tests and place a breakpoint in my custom CacheResolver, it never steps into it.
This is my Configuration class for the Cache:
@Configuration
@EnableCaching(proxyTargetClass = true)
@PropertySource(CacheConfig.CLASSPATH_DEPLOY_CACHE_PROPERTIES_PROPERTIES)
public class CacheConfig extends CachingConfigurerSupport {
    public static final String CLASSPATH_DEPLOY_CACHE_PROPERTIES_PROPERTIES = "classpath:/deploy/cache-properties.properties";
    public static final String CACHEABLE_DOCUMENTS_PROPERTY = "cacheable.documents";
    public static final String TTL_CACHEABLE_DOCUMENTS_PROPERTY = "ttl.cacheable.documents";
    public static final String SIZED_CACHEABLE_DOCUMENTS_PROPERTY = "sized.cacheable.documents";
    public static final String CACHE_NAME = "permanentCache";
    public static final String TTL_CACHE = "ttlCache";
    public static final String SIZED_CACHE = "sizedCache";
    public static final String CACHEABLE_DOCUMENTS = "cacheableDocuments";
    public static final String SIZED_CACHEABLE_DOCUMENTS = "sizedCacheableDocuments";
    public static final int WEIGHT = 1000000;
    public static final int TO_KBYTES = 1000;

    @Inject
    protected Environment environment;

    //@Bean
    @Override
    public CacheManager cacheManager() {
        SimpleCacheManager cacheManager = new SimpleCacheManager();
        GuavaCache sizedCache = new GuavaCache(SIZED_CACHE, CacheBuilder.newBuilder().maximumWeight(WEIGHT).weigher(
                (key, storable) -> {
                    String json = ((Storable) storable).toJson();
                    return json.getBytes().length / TO_KBYTES;
                }
        ).build());
        GuavaCache permanentCache = new GuavaCache(CACHE_NAME, CacheBuilder.newBuilder().build());
        //GuavaCache ttlCache = new GuavaCache(TTL_CACHE, CacheBuilder.newBuilder().expireAfterWrite(30, TimeUnit.MINUTES).build());
        cacheManager.setCaches(Arrays.asList(permanentCache, sizedCache));
        return cacheManager;
    }

    @Bean(name = "wgstCacheResolver")
    @Override
    public CacheResolver cacheResolver() {
        CacheResolver cacheResolver = new WgstCacheResolver(cacheManager(), cacheableDocuments(), sizedCacheableDocuments());
        return cacheResolver;
    }

    @Bean(name = CACHEABLE_DOCUMENTS)
    public List<String> cacheableDocuments() {
        String[] cacheableDocuments = StringUtils.commaDelimitedListToStringArray(environment.getProperty(CACHEABLE_DOCUMENTS_PROPERTY));
        return Arrays.asList(cacheableDocuments);
    }

    @Bean(name = SIZED_CACHEABLE_DOCUMENTS)
    public List<String> sizedCacheableDocuments() {
        String[] sizedCacheableDocuments = StringUtils.commaDelimitedListToStringArray(environment.getProperty(SIZED_CACHEABLE_DOCUMENTS_PROPERTY));
        return Arrays.asList(sizedCacheableDocuments);
    }
}
Here is my CacheResolver
public class WgstCacheResolver extends AbstractCacheResolver {
    private final List<String> cacheableDocuments;
    private final List<String> sizedCacheableDocuments;

    public WgstCacheResolver(final CacheManager cacheManager, final List<String> cacheableDocuments, final List<String> sizedCacheableDocuments) {
        super(cacheManager);
        this.cacheableDocuments = cacheableDocuments;
        this.sizedCacheableDocuments = sizedCacheableDocuments;
    }

    /**
     * Resolves the cache(s) to be updated at runtime.
     * @param context
     * @return
     */
    @Override
    protected Collection<String> getCacheNames(final CacheOperationInvocationContext<?> context) {
        final Collection<String> cacheNames = new ArrayList<>();
        final AbstractDao dao = (AbstractDao) context.getTarget();
        final String documentType = dao.getDocumentType().toString();
        if (cacheableDocuments.contains(documentType)) {
            cacheNames.add("permanentCache");
        }
        if (sizedCacheableDocuments.contains(documentType)) {
            cacheNames.add("sizedCache");
        }
        return cacheNames;
    }
}
And here is my DAO where I use the cache:
@Component
@Scope(value = ConfigurableBeanFactory.SCOPE_PROTOTYPE, proxyMode = ScopedProxyMode.DEFAULT)
@CacheConfig(cacheResolver = "wgstCacheResolver")
public class CacheableDao<T extends Storable> extends AbstractDao<T> {
    private final Logger logger = LoggerFactory.getLogger(CacheableDao.class);

    public CacheableDao(final Bucket bucket, final Class<T> typeParameterClass, final DocumentType documentType) {
        super(bucket, typeParameterClass, documentType);
    }

    @Cacheable(key = "{#root.methodName, #root.target.generateFullKey(#key)}")
    public T get(final String key) throws DatastoreAccessException, ObjectMappingException {
        //do something
    }

    // ...
}
I have tried implementing CacheResolver instead of extending AbstractCacheResolver but it didn't make any difference.
Thank you.
Cache names need to be included at some point; just specifying the CacheResolver to use is not enough. The @Cacheable class needs to be aware of the available cache names, so I included them with the @CacheConfig annotation:
@CacheConfig(cacheNames = {WgstCacheConfig.PERMANENT_CACHE, WgstCacheConfig.SIZED_CACHE},
        cacheResolver = WgstCacheConfig.WGST_CACHE_RESOLVER)
public class CacheableDao<T extends Storable> extends AbstractDao<T> {
One thing that I don't like is that I need to provide a null CacheManager, even though I'm not using it; otherwise I get the following error:
Caused by: java.lang.IllegalStateException: No CacheResolver specified, and no bean of type CacheManager found. Register a CacheManager bean or remove the #EnableCaching annotation from your configuration.
So I left it like this, and it works:
@Bean
public CacheManager cacheManager() {
    return null;
}

@Bean(name = WGST_CACHE_RESOLVER)
public CacheResolver cacheResolver() {
    CacheResolver cacheResolver = new WgstCacheResolver(cacheableDocuments(), sizedCacheableDocuments(), getPermanentCache(),
            getSizedCache());
    return cacheResolver;
}
I reran my tests, stepping through my custom CacheResolver, and it behaves as expected, resolving to the correct cache(s).
My configuration class no longer extends CachingConfigurerSupport.
After a bit of back and forth (Sorry about that!) it turns out this is indeed a bug in Spring Framework.
I've created SPR-13081. Expect a fix for the next maintenance release (4.1.7.RELEASE). Thanks for the sample project!
