Fetch HashMap instead of entity from Spring Repository using ehcache - java

I am using ehcache for caching data in my spring project.
For example, when fetching data from the mst_store table, I currently use the code below:
public interface MstStateRepository extends JpaRepository<MstState, Integer> {

    @Override
    @Cacheable("getAllState")
    List<MstState> findAll();
}
You can see that the findAll method returns List<MstState>. But instead of a List I need the return type to be a Map, with stateId as the key and the entity as the value. I can do this in the service layer, but then I need to write separate logic for it, as below:
@Service
class CacheService {

    @Autowired
    private MstStateRepository mstStateRepository;

    private final Map<Integer, MstState> cacheData = new HashMap<>();

    public Map<Integer, MstState> findAllState() {
        List<MstState> mstStates = mstStateRepository.findAll();
        for (MstState mstState : mstStates) {
            cacheData.put(mstState.getStateId(), mstState);
        }
        return cacheData;
    }
}
So instead of writing separate logic, can I get a Map directly from the repository? Please suggest.

You could use a Java 8 default method for that, which lets you write a default implementation directly in the repository interface (Spring Data JPA will not try to derive a query for it). You can combine it with the streams introduced in Java 8:
public interface MstStateRepository extends JpaRepository<MstState, Integer> {

    @Cacheable("getAllState")
    default Map<Integer, MstState> getAllState() {
        return findAll().stream()
            .collect(Collectors.toMap(
                MstState::getStateId,
                UnaryOperator.identity()
            ));
    }
}
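As a standalone illustration of the collector used above, here is a minimal sketch; the MstState class below is a simplified stand-in with only the fields needed, not the asker's actual entity:

```java
import java.util.List;
import java.util.Map;
import java.util.function.UnaryOperator;
import java.util.stream.Collectors;

public class ToMapDemo {

    // Stand-in for the MstState entity: just an id and a name.
    static class MstState {
        private final Integer stateId;
        private final String name;

        MstState(Integer stateId, String name) {
            this.stateId = stateId;
            this.name = name;
        }

        Integer getStateId() { return stateId; }
        String getName() { return name; }
    }

    public static void main(String[] args) {
        List<MstState> states = List.of(
            new MstState(1, "Alpha"),
            new MstState(2, "Beta"));

        // Same collector as in the repository default method:
        // key = stateId, value = the entity itself.
        Map<Integer, MstState> byId = states.stream()
            .collect(Collectors.toMap(
                MstState::getStateId,
                UnaryOperator.identity()));

        System.out.println(byId.get(2).getName()); // prints "Beta"
    }
}
```

One caveat worth knowing: Collectors.toMap throws an IllegalStateException on duplicate keys, so this assumes stateId is unique.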

Accessing Reactive CRUD repository from MapStruct Mapper stuck in block()

I am somewhat new to Spring and have recently generated a JHipster monolith application with the WebFlux option. My current aim is to make it compatible with Firestore and to implement some missing features, like inserting document references. To do so, I currently have the following structure:
A domain object class "Device" which holds a field String firmwareType;
A domain object class "FirmwareType"
A DTO object DeviceDTO which holds a field FirmwareType firmwareType;
I also have the corresponding Repository (extending FirestoreReactiveRepository, which extends ReactiveCrudRepository) and Controller implementations, which all work fine. To convert the full FirmwareType object in the DTO object into a String firmwareTypeId in the Device object, I implemented a MapStruct mapper:
@Mapper(unmappedTargetPolicy = org.mapstruct.ReportingPolicy.IGNORE, componentModel = "spring")
public abstract class DeviceMapper {

    private final Logger logger = LoggerFactory.getLogger(DeviceMapper.class);

    @Autowired
    protected FirmwareTypeRepository fwTypeRepo;

    public abstract Device dtoToDevice(DeviceDTO deviceDTO);

    public abstract DeviceDTO deviceToDto(Device device);

    public abstract List<DeviceDTO> devicesToDTOs(List<Device> devices);

    public abstract List<Device> dtosToDevices(List<DeviceDTO> dtos);

    public String map(FirmwareType value) {
        if (value == null || value.getId() == null || value.getId().isEmpty()) {
            return null;
        }
        return value.getId();
    }

    public FirmwareType map(String value) {
        if (value == null || value.isEmpty()) {
            return null;
        }
        return fwTypeRepo.findById(value).block(); // <<-- this gets stuck!
    }
}
The FirmwareTypeRepository which is autowired as the fwTypeRepo field:
@Repository
public interface FirmwareTypeRepository extends FirestoreReactiveRepository<FirmwareType> {

    Mono<FirmwareType> findById(String id);
}
The corresponding map functions get called perfectly fine, but the fwTypeRepo.findById(..) call in the marked line (third-last line) seems to get stuck somewhere and never returns or throws an error. When fwTypeRepo is called via its Controller endpoint, it works without any issues.
I suppose it must be some kind of calling-context issue? Is there another way to force a synchronous result from a Mono than block()?
Thanks for your help in advance, everyone!
Edit: At this point, I am sure it has something to do with autowiring the repository: the correct instance does not seem to be used. While a customized Interface+Impl is called correctly, the underlying logic (from FirestoreReactive/ReactiveCrudRepository) doesn't seem to supply data correctly (also when @Autowired is used in other components!). I found some hints pointing at the package structure (i.e. the Application class needing to be in a root package), but that isn't the issue.
MapStruct is not reactive as far as I know, so this approach won't work: you would need MapStruct to return a Mono that builds the object itself, but that wouldn't make sense, since it is a mapping library intended only for plain synchronous transformations.
You could try using two Monos/mappers, one for each DB call, then Mono.zip(dbCall1, dbCall2) and set the mapped output of one call into the other object's field:
var call1 = Mono.fromFuture(() -> db.loadObject1()).map(o -> mapper1.map(o));
var call2 = Mono.fromFuture(() -> db.loadObject2()).map(o -> mapper2.map(o));

Mono.zip(call1, call2)
    .map(t -> {
        var o1 = t.getT1();
        var o2 = t.getT2();
        o1.setField(o2);
        return o1;
    });

Replicated MongoDB + spring-data-mongodb

There is a replicated MongoDB setup (mongodb-1: primary, mongodb-2: secondary, mongodb-3: secondary).
The app runs through spring-boot-starter-data-mongodb.
Service:
public class FooBarService {

    private FooBarRepository repository;

    public FooBar method1() {
        return repository.someQuery();
    }

    public FooBar method2() {
        return repository.someQuery();
    }
}
Repository:
public interface FooBarRepository extends MongoRepository<FooBar, String> {

    FooBar someQuery();
}
My question is: how can I make method1 read from the primary member of the Mongo replica set and method2 read from a secondary member? I would like to manage this at the service level (something like @Transactional, but for selecting a replica set member). Can you advise me on any solutions?
Solution 1: @Meta Annotation
If you want to continue using repository interfaces, you can annotate the query method definition with the @Meta annotation, which lets you pass flags indicating that the query may read from a secondary MongoDB member:
public interface FooBarRepository extends MongoRepository<FooBar, String> {

    @Query("{}")
    @Meta(flags = Meta.CursorOption.SECONDARY_READS)
    FooBar someQuery();
}
But you cannot control this flag at the service level; you would have to create two query methods, one with the flag and one without, e.g. someQueryFromSecondary() and someQueryFromPrimary().
Solution 2: Using MongoTemplate
Another option is to use MongoTemplate directly and set the flag on the Query:
public FooBar someQuery(boolean readFromSecondary) {
    var query = Query.query(Criteria.where("someKey").is("1"));
    if (readFromSecondary) {
        query.allowSecondaryReads();
    }
    return mongoTemplate.findOne(query, FooBar.class);
}
Regardless of which solution you choose, be aware that reading from secondary members can return stale data. Consider taking a look at the MongoDB docs on read preference.

How do I pass the type as a variable?

I'm working with DynamoDB. I want to create a DB adapter class (rather than keeping a bunch of copy/pasted code in various sections of the program) that can do the following:
DynamoDBMapper mapper = Dynamo.getMapper(SomeClassHere.TABLE_NAME);
DynamoDBQueryExpression<SomeClassHere> queryExpression = new DynamoDBQueryExpression<SomeClassHere>()
    .withHashKeyValues(passedInObjectHere);
List<SomeClassHere> list = mapper.query(SomeClassHere.class, queryExpression);
Here SomeClassHere is a class used both as the type parameter and for mapping fields in the DynamoDBMapper. I'd like to pass in an object parameter similar to this:
public void getList(IDynamoDBModel model) {
    DynamoDBMapper mapper = Dynamo.getMapper(GetTheClassTypeFromTheModel);
    DynamoDBQueryExpression<GetTheClassTypeFromTheModel> queryExpression = new DynamoDBQueryExpression<GetTheClassTypeFromTheModel>()
        .withHashKeyValues(model);
    List<GetTheClassTypeFromTheModel> list = mapper.query(GetTheClassTypeFromTheModel, queryExpression);
}
This way, instead of copy/pasting 'code for save', 'code for delete', 'code for get item' everywhere, I can simply:
call a connection manager
get a connection to the DynamoDB from it
pass in the object with its preloaded values into the appropriate method (save, delete, getItem, etc...)
I realize that generics are a compile-time construct and this may be a fruitless effort, but I really want to clean up this inherited code base.
I am not familiar with DynamoDB, but from what I understand you want to implement a generic data access layer for DynamoDB. This can be achieved with generics; I have written the code below so you can see how. Please ignore the DynamoDB specifics: this code won't work by copy and paste, so make sure to edit it accordingly.
First, implement a DynamoTypeProvider interface:
@FunctionalInterface
public interface DynamoTypeProvider<DynamoType> {

    Class<DynamoType> getDynamoType();
}
Then implement a generic data access service for DynamoDB as below:
public abstract class DynamoQueryService<DynamoType, DynamoDBModel extends DynamoTypeProvider<DynamoType>> {

    public List<DynamoType> getList(DynamoDBModel model) {
        DynamoDBMapper mapper = Dynamo.getMapper(model.getDynamoType());
        DynamoDBQueryExpression<DynamoType> queryExpression = new DynamoDBQueryExpression<DynamoType>()
            .withHashKeyValues(model);
        return mapper.query(model.getDynamoType(), queryExpression);
    }
}
Suppose the Dynamo type is UserType and the DynamoDBModel is User; from User we should be able to get the Dynamo DB type UserType.
public class UserType {
    // your code
}

public class User implements DynamoTypeProvider<UserType> {
    // your code

    @Override
    public Class<UserType> getDynamoType() {
        return UserType.class;
    }
}
Then implement a model-specific data access service, here UserDynamoQueryService:
public class UserDynamoQueryService extends DynamoQueryService<UserType, User> {
}
Then you can get UserType instances as:
UserDynamoQueryService service = new UserDynamoQueryService();
List<UserType> types = service.getList(new User());
Using generics you can achieve this; please make sure to edit it accordingly.
Or you can try the example below:
public class DynamoQueryService {

    public <T, X> List<T> getList(X x, Class<T> tClass) {
        DynamoDBMapper mapper = Dynamo.getMapper(tClass);
        DynamoDBQueryExpression<T> queryExpression = new DynamoDBQueryExpression<T>()
            .withHashKeyValues(x);
        return mapper.query(tClass, queryExpression);
    }
}
And use it as below:
DynamoQueryService queryService = new DynamoQueryService();
List<UserType> userTypes = queryService.getList(new User(), UserType.class);
Then User does not need to implement DynamoTypeProvider.
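Stripped of the DynamoDB specifics, the class-token technique of the second example can be run standalone. A sketch with a hypothetical in-memory "mapper" (User, UserType, and QueryService here are illustrative stand-ins):

```java
import java.util.ArrayList;
import java.util.List;

public class ClassTokenDemo {

    // Hypothetical stand-ins for the Dynamo model and result type.
    static class User { }
    static class UserType {
        final String name;
        UserType(String name) { this.name = name; }
    }

    // Generic service: the Class<T> token carries the result type to
    // runtime, which is exactly what mapper.query(tClass, ...) needs.
    static class QueryService {
        <T, X> List<T> getList(X model, Class<T> tClass) {
            List<T> results = new ArrayList<>();
            // In the real code this would be mapper.query(tClass, queryExpression);
            // here we just fabricate one instance per query for demonstration.
            if (tClass == UserType.class) {
                results.add(tClass.cast(new UserType("alice")));
            }
            return results;
        }
    }

    public static void main(String[] args) {
        QueryService service = new QueryService();
        List<UserType> types = service.getList(new User(), UserType.class);
        System.out.println(types.get(0).name); // prints "alice"
    }
}
```

The key point is that `Class<T>` survives erasure as an ordinary object, so one generic method can serve every table.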

How to populate map of string and another map in a thread safe way?

I am measuring my application metrics using the class below, in which I increment and decrement metrics:
public class AppMetrics {

    private final AtomicLongMap<String> metricCounter = AtomicLongMap.create();

    private static class Holder {
        private static final AppMetrics INSTANCE = new AppMetrics();
    }

    public static AppMetrics getInstance() {
        return Holder.INSTANCE;
    }

    private AppMetrics() {}

    public void increment(String name) {
        metricCounter.getAndIncrement(name);
    }

    public AtomicLongMap<String> getMetricCounter() {
        return metricCounter;
    }
}
I am calling increment method of AppMetrics class from multithreaded code to increment the metrics by passing the metric name.
Problem Statement:
Now I want a metricCounter per clientId (a String). The same clientId can arrive multiple times, and sometimes it will be a new clientId, so I need to get the metricCounter map for that clientId and increment metrics on that particular map (which is the part I am not sure how to do).
What is the right way to do this, keeping in mind that it has to be thread safe and perform atomic operations? I was thinking of using a map like this instead:
private final Map<String, AtomicLongMap<String>> clientIdMetricCounterHolder = Maps.newConcurrentMap();
Is this the right way? If yes then how can I populate this map by passing clientId as it's key and it's value will be the counter map for each metric.
I am on Java 7.
If you use a map, then you'll need to synchronize the creation of new AtomicLongMap instances. I would recommend using a Guava LoadingCache instead. You might not end up using any of the actual caching features, but the loading feature is extremely helpful, as it synchronizes the creation of AtomicLongMap instances for you, e.g.:
LoadingCache<String, AtomicLongMap<String>> clientIdMetricCounterCache =
    CacheBuilder.newBuilder().build(new CacheLoader<String, AtomicLongMap<String>>() {
        @Override
        public AtomicLongMap<String> load(String key) throws Exception {
            return AtomicLongMap.create();
        }
    });
Now you can safely update metric counts for any client, without worrying about whether the client is new or not, e.g.:
clientIdMetricCounterCache.get(clientId).incrementAndGet(metricName);
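If pulling in Guava is not an option, the same create-if-absent step can be done with the plain ConcurrentMap.putIfAbsent idiom, which is available on Java 7. A sketch, using plain AtomicLong counters as a stand-in for Guava's AtomicLongMap:

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.atomic.AtomicLong;

public class PerClientMetrics {

    private final ConcurrentMap<String, ConcurrentMap<String, AtomicLong>> byClient =
        new ConcurrentHashMap<String, ConcurrentMap<String, AtomicLong>>();

    public void increment(String clientId, String metricName) {
        ConcurrentMap<String, AtomicLong> counters = byClient.get(clientId);
        if (counters == null) {
            ConcurrentMap<String, AtomicLong> fresh =
                new ConcurrentHashMap<String, AtomicLong>();
            // putIfAbsent returns the existing value if another thread won the race.
            ConcurrentMap<String, AtomicLong> prev = byClient.putIfAbsent(clientId, fresh);
            counters = (prev != null) ? prev : fresh;
        }
        AtomicLong counter = counters.get(metricName);
        if (counter == null) {
            AtomicLong freshCounter = new AtomicLong();
            AtomicLong prev = counters.putIfAbsent(metricName, freshCounter);
            counter = (prev != null) ? prev : freshCounter;
        }
        counter.incrementAndGet();
    }

    public long get(String clientId, String metricName) {
        ConcurrentMap<String, AtomicLong> counters = byClient.get(clientId);
        if (counters == null) return 0L;
        AtomicLong counter = counters.get(metricName);
        return (counter == null) ? 0L : counter.get();
    }
}
```

This is more verbose than the LoadingCache version, which is exactly why the cache is the nicer choice when Guava is already on the classpath.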
A Map<String, Map<String, T>> is just a Map<Pair<String, String>, T> in disguise. Create a MultiKey class:
class MultiKey {
    public String clientId;
    public String name;
    // be sure to add hashCode and equals
}
Then just use an AtomicLongMap<MultiKey>.
Edit: provided the set of metrics is well defined, it wouldn't be too hard to use this data structure to view the metrics for one client:
Set<String> possibleMetrics = // all the possible values for "name"

Map<String, Long> getMetricsForClient(String client) {
    return Maps.asMap(possibleMetrics, m -> metrics.get(new MultiKey(client, m)));
}
The returned map will be a live unmodifiable view. It might be a bit more verbose if you're using an older Java version, but it's still possible.
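As the comment in MultiKey notes, equals and hashCode are essential: without them, two keys built from the same strings would not find each other in a map. A minimal sketch (field and class names as in the answer; the demo values are illustrative):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Objects;

public class MultiKeyDemo {

    static final class MultiKey {
        final String clientId;
        final String name;

        MultiKey(String clientId, String name) {
            this.clientId = clientId;
            this.name = name;
        }

        @Override
        public boolean equals(Object o) {
            if (this == o) return true;
            if (!(o instanceof MultiKey)) return false;
            MultiKey other = (MultiKey) o;
            return clientId.equals(other.clientId) && name.equals(other.name);
        }

        @Override
        public int hashCode() {
            // Objects.hash is available since Java 7, matching the OP's version.
            return Objects.hash(clientId, name);
        }
    }

    public static void main(String[] args) {
        Map<MultiKey, Long> metrics = new HashMap<>();
        metrics.put(new MultiKey("client-1", "requests"), 5L);

        // A distinct instance with equal fields finds the same entry.
        System.out.println(metrics.get(new MultiKey("client-1", "requests"))); // prints 5
    }
}
```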

How to access a HashMap or ArrayList from any part of the application?

Every time I load certain values from the database, a HashMap is populated with keys and values from the database. How do I make this HashMap available to all the other classes without having to reload the values into it each time it is needed?
This is the class which contains method where HashMap is loaded:
public class Codes {

    private CodesDAO codesDAO = new CodesDAO(); // DAO class

    public HashMap<MultiKey, String> fetchCodes() {
        HashMap<MultiKey, String> map = new HashMap<MultiKey, String>();
        List<CODES> list = codesDAO.fetchGuiCodes(); // fetches codes from DB
        for (CODES gui : list) {
            MultiKey multiKey = new MultiKey(gui.getCode(), gui.getKEY());
            map.put(multiKey, gui.getDESC());
        }
        return map;
    }
}
You can save your map in a static field and initialize it in a static block; this way it is done only once:
public class Codes {

    private static Map<MultiKey, String> codes;

    static {
        CodesDAO codesDAO = new CodesDAO(); // DAO class
        HashMap<MultiKey, String> map = new HashMap<MultiKey, String>();
        List<CODES> list = codesDAO.fetchGuiCodes(); // fetches codes from DB
        for (CODES gui : list) {
            MultiKey multiKey = new MultiKey(gui.getCode(), gui.getKEY());
            map.put(multiKey, gui.getDESC());
        }
        codes = Collections.unmodifiableMap(map);
    }

    public static Map<MultiKey, String> fetchCodes() {
        return codes;
    }
}
Then you can retrieve the codes with:
Codes.fetchCodes();
If static fields are not an option, you could lazily initialise as follows:
private HashMap<MultiKey, String> map = null;

public HashMap<MultiKey, String> fetchCodes() {
    if (map == null) {
        map = new HashMap<MultiKey, String>();
        List<CODES> list = codesDAO.fetchGuiCodes(); // fetches codes from DB
        for (CODES gui : list) {
            MultiKey multiKey = new MultiKey(gui.getCode(), gui.getKEY());
            map.put(multiKey, gui.getDESC());
        }
    }
    return map;
}
Note: this is not thread-safe, but could be with some additional synchronization.
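One way to add that synchronization is double-checked locking on a volatile field. A generic sketch, where the hypothetical loadFromDb() stands in for the DAO call (and String keys stand in for MultiKey):

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

public class LazyCodes {

    // volatile is required for safe publication under double-checked locking.
    private volatile Map<String, String> map;

    public Map<String, String> fetchCodes() {
        Map<String, String> result = map;
        if (result == null) {
            synchronized (this) {
                result = map;
                if (result == null) {
                    // Load once; wrap so callers cannot mutate the shared map.
                    map = result = Collections.unmodifiableMap(loadFromDb());
                }
            }
        }
        return result;
    }

    // Stand-in for the DAO call that hits the database.
    private Map<String, String> loadFromDb() {
        Map<String, String> m = new HashMap<>();
        m.put("CODE1", "Description 1");
        return m;
    }
}
```

Every call after the first returns the same instance without taking the lock.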
Maybe load the data only once? I would use memoization from Guava:
Suppliers.memoize(/* implementation of Supplier<T> */)
If you use Spring, you could simply declare a singleton bean and implement the InitializingBean interface. You would then be forced to implement a method called afterPropertiesSet(), and you can load your Map there.
If you don't use Spring, you could initialize your map at startup, like you did, and put it in the servletContext; that scope is available to all sessions.
This is all fine for read-only data. If you need to update the map, be careful: this will not be thread-safe by default, and you will have to make it so.
Hope it helps.
I'm not sure how the OP designed his Java EE application and whether any 3rd-party frameworks are being used, but in a properly designed standard Java EE application using EJB, CDI, JPA, transactions and all of that, the DB is normally not available in a static context. The answers which suggest initializing it statically are in such a case severely misleading and broken.
The canonical approach is to just create one instance holding the preinitialized data and reuse it throughout application's lifetime. With the current Java EE standards, this can be achieved by creating and initializing the bean once during application's startup and storing it in the application scope. For example, an application scoped CDI bean:
@Named
@ApplicationScoped
public class Data {

    private List<Code> codes;

    @EJB
    private DataService service;

    @PostConstruct
    public void init() {
        codes = Collections.unmodifiableList(service.getAllCodes());
    }

    public List<Code> getCodes() {
        return codes;
    }
}
This is then available via #{data.codes} anywhere else in the application.