@CacheRemoveAll does not work - java

@CacheRemoveAll does not clear the cache. Every time I call getTechnologyColorObjects(), my data is cached, but when I call deleteTechnologyColorObjects(), the cache is still there.
@Singleton(name = "ThematicTechnologyColorCache")
@Startup
@Lock(LockType.READ)
@CacheDefaults(cacheName = "ThematicTechnologyColorCache")
@Keep
@KeepName
@KeepClassMemberNames
public class ThematicTechnologyColorCache {

    private static final Logger logger = Logger.getLogger(ThematicTechnologyColorCache.class.getName());

    @Inject
    private WebPortalManager webPortalManager;

    @CacheResult(cacheName = "ThematicTechnologyColorCache")
    public HashMap<String, List<ThemeText>> getTechnologyColorObjects() {
        HashMap<String, List<ThemeText>> technologyColorsList = null;
        HashMap<String, Integer> profileIDList = getProfileIDList();
        // Check for null before calling size(), not after.
        if (profileIDList != null && !profileIDList.isEmpty()) {
            technologyColorsList = getTechnologyColorList(profileIDList);
        }
        return technologyColorsList;
    }

    //This function is not working
    @CacheRemoveAll(cacheName = "ThematicTechnologyColorCache")
    public void deleteObjects() {
        logger.warning("In deleteTechnologyColorObjects");
    }
}
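One thing to check: JCache annotations are applied through an interceptor proxy, so calling deleteObjects() from another method of the same bean bypasses @CacheRemoveAll. As a workaround, the cache can be cleared programmatically through the JSR-107 API; a minimal sketch, assuming a CacheManager can be injected in this container (the helper class is illustrative):

import javax.cache.Cache;
import javax.cache.CacheManager;
import javax.inject.Inject;

public class CacheEvictionHelper {

    @Inject
    private CacheManager cacheManager; // assumes a CacheManager producer is configured

    // Clears every entry in the named cache without relying on the interceptor.
    public void evictAll() {
        Cache<Object, Object> cache = cacheManager.getCache("ThematicTechnologyColorCache");
        if (cache != null) {
            cache.removeAll();
        }
    }
}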


Paging 3, Guava/LiveData: liveData onChanged gets called before data is fetched from server

I'm developing an Android app for showing a list of products stored on a server. I'm using the Paging 3 library with Guava/LiveData. When debugging, I found that the onChanged method from LiveData is always called before the PagingSource fetches data from the server, and never again.
My PagingSource class:
public class ProductsPagingSource extends ListenableFuturePagingSource<Integer, Product> {

    ProductsRepository productsRepository;
    Integer initialLoadSize = 0;

    public ProductsPagingSource(ProductsRepository productsRepository) {
        this.productsRepository = productsRepository;
    }

    @Nullable
    @Override
    public Integer getRefreshKey(@NonNull PagingState<Integer, Product> pagingState) {
        return null;
    }

    @NonNull
    @Override
    public ListenableFuture<LoadResult<Integer, Product>> loadFuture(@NonNull LoadParams<Integer> loadParams) {
        Integer nextPageNumber = loadParams.getKey();
        if (nextPageNumber == null) {
            nextPageNumber = 1;
            initialLoadSize = loadParams.getLoadSize();
        }
        Integer offSet = 0;
        if (nextPageNumber == 2) {
            offSet = initialLoadSize;
        } else {
            offSet = ((nextPageNumber - 1) * loadParams.getLoadSize()) + (initialLoadSize - loadParams.getLoadSize());
        }
        ListeningExecutorService service = MoreExecutors.listeningDecorator(Executors.newSingleThreadExecutor());
        Integer finalOffSet = offSet;
        Integer finalNextPageNumber = nextPageNumber;
        ListenableFuture<LoadResult<Integer, Product>> lf = service.submit(new Callable<LoadResult<Integer, Product>>() {
            @Override
            public LoadResult<Integer, Product> call() {
                List<Product> productsList = productsRepository.loadProducts(loadParams.getLoadSize(), finalOffSet);
                Integer nextKey = null;
                if (productsList.size() >= loadParams.getLoadSize()) {
                    nextKey = finalNextPageNumber + 1;
                }
                return new LoadResult.Page<Integer, Product>(productsList, null, nextKey);
            }
        });
        return lf;
    }
}
The ViewModel class:
public class HomeViewModel extends AndroidViewModel {

    LiveData<PagingData<Product>> productsLd;

    public HomeViewModel(@NonNull Application application) {
        super(application);
        ProductsRepository productsRepository = new ProductsRepository(getApplication());
        CoroutineScope viewModelScope = ViewModelKt.getViewModelScope(this);
        Pager<Integer, Product> pager = new Pager(new PagingConfig(10), () -> new ProductsPagingSource(productsRepository));
        productsLd = PagingLiveData.cachedIn(PagingLiveData.getLiveData(pager), viewModelScope);
    }

    public LiveData<PagingData<Product>> getProductsLd() {
        return productsLd;
    }
}
Then, on method onCreate from HomeActivity, LiveData is observed:
myAdapter = new MyAdapter(this, new ProductComparator());
HomeViewModel homeViewModel = new ViewModelProvider(this).get(HomeViewModel.class);
LiveData<PagingData<Product>> productsLd = homeViewModel.getProductsLd();
productsLd.observe(this, new Observer<PagingData<Product>>() {
    @Override
    public void onChanged(PagingData<Product> productPagingData) {
        myAdapter.submitData(getLifecycle(), productPagingData);
    }
});
onChanged is called immediately after the LiveData is obtained from the ViewModel, before loadFuture from ProductsPagingSource runs, so the PagingData delivered is empty. loadFuture is then called and the data is fetched correctly from the server, but LiveData never triggers onChanged again.
Can someone help me? Thanks in advance!
I already solved the problem. It was a silly error: I forgot to set the adapter on the RecyclerView.
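For reference, the missing step was along these lines (a sketch; the view id and layout manager are assumptions):

RecyclerView recyclerView = findViewById(R.id.recycler_view); // hypothetical id
recyclerView.setLayoutManager(new LinearLayoutManager(this));
recyclerView.setAdapter(myAdapter); // without this, submitData() has no visible effect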

How to configure correct parallelism in persistor bolt?

I'm using Apache Storm to create a topology that initially reads a "stream" of tuples from a file, then splits the tuples and stores them in MongoDB.
I have a cluster on Atlas with a shared replica set. I've already developed the topology, and the solution works properly if I use a single thread.
public static StormTopology build() {
    return buildWithSpout();
}

public static StormTopology buildWithSpout() {
    Config config = new Config();
    TopologyBuilder builder = new TopologyBuilder();
    CsvSpout datasetSpout = new CsvSpout("file.txt");
    SplitterBolt splitterBolt = new SplitterBolt(",");
    PartitionMongoInsertBolt insertPartitionBolt = new PartitionMongoInsertBolt();
    builder.setSpout(DATA_SPOUT_ID, datasetSpout, 1);
    builder.setBolt(DEPENDENCY_SPLITTER_ID, splitterBolt, 1).shuffleGrouping(DATA_SPOUT_ID);
    builder.setBolt(UPDATER_COUNTER_ID, insertPartitionBolt, 1).shuffleGrouping(DEPENDENCY_SPLITTER_ID);
    // The method must return the built topology.
    return builder.createTopology();
}
However, when I use parallel processes, my persistor bolt doesn't save all tuples in MongoDB, even though the tuples are correctly emitted by the previous bolt.
builder.setSpout(DATA_SPOUT_ID, datasetSpout, 1);
builder.setBolt(DEPENDENCY_SPLITTER_ID, splitterBolt, 3).shuffleGrouping(DATA_SPOUT_ID);
builder.setBolt(UPDATER_COUNTER_ID, insertPartitionBolt, 3).shuffleGrouping(DEPENDENCY_SPLITTER_ID);
This is my first bolt:
public class SplitterBolt extends BaseBasicBolt {

    private String del;
    private MongoConnector db = null;

    public SplitterBolt(String del) {
        this.del = del;
    }

    public void prepare(Map stormConf, TopologyContext context) {
        db = MongoConnector.getInstance();
    }

    public void execute(Tuple input, BasicOutputCollector collector) {
        String tuple = input.getStringByField("tuple");
        int idTuple = Integer.parseInt(input.getStringByField("id"));
        String opString = "";
        String[] data = tuple.split(this.del);
        for (int i = 0; i < data.length; i++) {
            OpenBitSet attrs = new OpenBitSet();
            attrs.fastSet(i);
            opString = Utility.toStringOpenBitSet(attrs, 5);
            collector.emit(new Values(idTuple, opString, data[i]));
        }
        db.incrementCount();
    }

    public void declareOutputFields(OutputFieldsDeclarer declarer) {
        declarer.declare(new Fields("idtuple", "binaryattr", "value"));
    }
}
And this is my persistor bolt that stores all tuples in Mongo:
public class PartitionMongoInsertBolt extends BaseBasicBolt {

    private MongoConnector mongodb = null;

    public void prepare(Map stormConf, TopologyContext context) {
        // Singleton instance
        mongodb = MongoConnector.getInstance();
    }

    public void execute(Tuple input, BasicOutputCollector collector) {
        mongodb.insertUpdateTuple(input);
    }

    public void declareOutputFields(OutputFieldsDeclarer declarer) {}
}
My only doubt is that I used a singleton pattern for the MongoDB connection class. Can this be a problem?
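On that doubt: with parallelism greater than 1, several executor threads in the same worker can race through getInstance() and end up with separate MongoConnector instances. A minimal thread-safe variant (a sketch; note this is not necessarily what loses the tuples):

public static synchronized MongoConnector getInstance() {
    if (mongoInstance == null) {
        mongoInstance = new MongoConnector();
    }
    return mongoInstance;
}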
UPDATE
This is my MongoConnector class:
public class MongoConnector {

    private MongoClient mongoClient = null;
    private MongoDatabase database = null;
    private MongoCollection<Document> partitionCollection = null;
    private static MongoConnector mongoInstance = null;

    public MongoConnector() {
        MongoClientURI uri = new MongoClientURI("connection string");
        this.mongoClient = new MongoClient(uri);
        this.database = mongoClient.getDatabase("db.database");
        this.partitionCollection = database.getCollection("db.collection");
    }

    public static MongoConnector getInstance() {
        if (mongoInstance == null) {
            mongoInstance = new MongoConnector();
        }
        return mongoInstance;
    }

    // Called as insertUpdateTuple by the persistor bolt above.
    public void insertUpdateTuple(Tuple tuple) {
        int idTuple = (Integer) tuple.getValue(0);
        String attrs = (String) tuple.getValue(1);
        String value = (String) tuple.getValue(2);
        value = value.replace('.', ',');
        Bson query = Filters.eq("_id", attrs);
        Document docIterator = this.partitionCollection.find(query).first();
        if (docIterator != null) {
            Bson newValue = new Document(value, idTuple);
            Bson updateDocument = new Document("$push", newValue);
            this.partitionCollection.updateOne(docIterator, updateDocument);
        } else {
            Document document = new Document();
            document.put("_id", attrs);
            ArrayList<Integer> partition = new ArrayList<Integer>();
            partition.add(idTuple);
            document.put(value, partition);
            this.partitionCollection.insertOne(document);
        }
    }
}
SOLUTION UPDATE
I've solved the problem by changing this line:
this.partitionCollection.updateOne(docIterator, updateDocument);
to
this.partitionCollection.findOneAndUpdate(query, updateDocument);
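findOneAndUpdate applies the filter and the update atomically on the server, so concurrent bolt threads no longer race between the find and the update on a stale document. The find/insert/update branches could also be collapsed into a single atomic upsert; a sketch, assuming the synchronous MongoDB Java driver ($push creates the array, and upsert(true) creates the document, when missing):

this.partitionCollection.updateOne(
        Filters.eq("_id", attrs),
        Updates.push(value, idTuple),
        new UpdateOptions().upsert(true));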

Performance Issue with Java Code

I am using Java with MongoDB. I am getting a performance issue in the code below; can you please optimize it?
I need to increase the performance of this function.
// First function, which receives associationId as a parameter. Can you guys please help me increase the performance of this function?
public ArrayList<ResidentAssociationMapEntity> listUsersByAssociation(String associationId) {
    HashMap<String, Object> filterExpressions = new HashMap<String, Object>();
    filterExpressions.put("associationId", associationId);
    filterExpressions.put("residentStatus", 1);
    List<ResidentAssociationMapEntity> residents = DBManager.getInstance().getByMultipleFields(
            ResidentAssociationMapEntity.class, filterExpressions);
    if (residents == null) {
        // Guard against null before the copy below, which would otherwise throw.
        return new ArrayList<ResidentAssociationMapEntity>();
    }
    for (ResidentAssociationMapEntity map : residents) {
        map.setUser(getUser(map.getUserId()));
    }
    return new ArrayList<ResidentAssociationMapEntity>(residents);
}
// get user function
public UserEntity getUser(String userId) {
    UserEntity entity = DBManager.getInstance().getById(UserEntity.class, userId);
    ResidentAssociationMapEntity map = getResidentAssocaitionMap(entity.getUserId());
    if (map != null) {
        entity.setUnitNo(map.getUnitNumber() + "");
        entity.setAssociationId(map.getAssociationId());
        entity.setAssociationName(CompanyDAO.getInstance().getCompany(map.getAssociationId()).getName());
    }
    return entity;
}
// getResidentAssocaitionMap function
private ResidentAssociationMapEntity getResidentAssocaitionMap(String residentId) {
    HashMap<String, Object> filterExpressions = new HashMap<String, Object>();
    filterExpressions.put("userId", residentId);
    filterExpressions.put("residentStatus", 1);
    List<ResidentAssociationMapEntity> residents = DBManager.getInstance().getByMultipleFields(
            ResidentAssociationMapEntity.class, filterExpressions);
    if (residents != null && residents.size() > 0) {
        return residents.get(0);
    }
    return null;
}
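The dominant cost here is the N+1 query pattern: one query for the mapping list, then getUser() issues several more round trips per resident. Batching the user lookup removes most of them; a sketch, where getByFieldIn is a hypothetical $in-style batch helper on the question's own DBManager wrapper:

public ArrayList<ResidentAssociationMapEntity> listUsersByAssociation(String associationId) {
    HashMap<String, Object> filterExpressions = new HashMap<String, Object>();
    filterExpressions.put("associationId", associationId);
    filterExpressions.put("residentStatus", 1);
    List<ResidentAssociationMapEntity> residents = DBManager.getInstance().getByMultipleFields(
            ResidentAssociationMapEntity.class, filterExpressions);
    if (residents == null || residents.isEmpty()) {
        return new ArrayList<ResidentAssociationMapEntity>();
    }
    // Collect all user ids, then fetch the users in one query instead of one per resident.
    List<String> userIds = new ArrayList<String>();
    for (ResidentAssociationMapEntity map : residents) {
        userIds.add(map.getUserId());
    }
    // getByFieldIn is a hypothetical batch helper keyed by userId.
    Map<String, UserEntity> usersById = DBManager.getInstance().getByFieldIn(
            UserEntity.class, "userId", userIds);
    for (ResidentAssociationMapEntity map : residents) {
        map.setUser(usersById.get(map.getUserId()));
    }
    return new ArrayList<ResidentAssociationMapEntity>(residents);
}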

Concatenation of strings in a TreeMap from the following code

I'm calling the class and object this way, but I still can't figure out what is wrong in getting the concept right.
public class MMTUtil {

    private static Map<String, String> domainDocumentationMap = null;

    static {
        domainDocumentationMap = new TreeMap<String, String>();
    }

    public static Map<String, String> getDomainDocumentationMap() {
        return domainDocumentationMap;
    }

    public static void setDomainDocumentationMap(Map<String, String> domainDocumentationMap) {
        MMTUtil.domainDocumentationMap = domainDocumentationMap;
    }
}
How shall I use setDomainDocumentationMap and getDomainDocumentationMap effectively, so that my TreeMap holds the following pairs:
(objectType + objectName, documentationLink)
public UMRResultObject insertDocumentation(UMRDocumentationDTO documentationDTO) {
    Session session = UMRHibernateUtil.getUmrSession();
    Transaction tx = null;
    documentationLink = null;
    objectName = null;
    objectType = null;
    try {
        tx = session.beginTransaction();
        dao.insertDocumentation(documentationDTO, session);
        MMTUtil.getDomainDocumentationMap().put(objectName.getDomainName() + objectType.getDomainType(),
                documentationLink.getDocumentationLink());
        tx.commit();
        ro.setSuccess(true);
        ro.getMessages().add("Record Inserted Successfully");
    } catch (Exception e) {
        // The original snippet had a try with no catch/finally; roll back on failure.
        if (tx != null) {
            tx.rollback();
        }
    } finally {
        session.close();
    }
    return ro;
}
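If the goal is simply a TreeMap keyed by objectType + objectName with the documentation link as the value, the usage boils down to this (a sketch; the DTO getters are assumptions based on the names in the snippet):

// Build the key exactly as described: objectType concatenated with objectName.
String key = documentationDTO.getObjectType() + documentationDTO.getObjectName(); // hypothetical getters
MMTUtil.getDomainDocumentationMap().put(key, documentationDTO.getDocumentationLink());

// Later lookups use the same concatenated key; a TreeMap keeps the keys sorted.
String link = MMTUtil.getDomainDocumentationMap().get(key);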

How to Mock repository Items in ATG

I am trying to create a mock class for a droplet. I am able to mock the repository calls and req.getParameter, but I need help on how to mock the repository item list from the repository. Below is the sample code.
for (final RepositoryItem item : skuList) {
    final String skuId = (String) item.getPropertyValue("id");
    final String skuType = (String) item.getPropertyValue("skuType");
    if (this.isLoggingDebug()) {
        this.logDebug("skuType [ " + skuType + " ]");
    }
    final String skuActive = (String) item.getPropertyValue("isActive");
    if (EJSD.equalsIgnoreCase(skuType) && skuActive.equals("1")) {
        eSkuList.add(item);
        skuCode = (String) item.getPropertyValue(ESTConstants.SKU_MISC1);
    } else if (PJPROMIS.equalsIgnoreCase(skuType) && skuId.contains("PP") && skuActive.equals("1")) {
        personalSkuList.add(item);
        String tmp = "";
        if (skuId.lastIndexOf("-") > -1) {
            tmp = skuId.substring(skuId.lastIndexOf("-") + 1);
            tmp = tmp.toUpperCase();
            if (this.getDefaultDisplayNameMap() != null) {
                String val = this.getDefaultDisplayNameMap().get(tmp);
                if (StringUtils.isNotEmpty(val)) {
                    displayNameMap.put(skuId, val);
                } else {
                    val = (String) item.getPropertyValue("displayName");
                    displayNameMap.put(skuId, val);
                }
            } else {
                final String val = (String) item.getPropertyValue("displayName");
                displayNameMap.put(skuId, val);
            }
        }
    }
}
There are a multitude of ways to 'mock' the list. I've been doing it this way, as I feel it is more readable.
@Mock private RepositoryItem skuMockA;
@Mock private RepositoryItem skuMockB;

List<RepositoryItem> skuList = new ArrayList<RepositoryItem>();

@BeforeMethod(groups = { "unit" })
public void setup() throws Exception {
    testObj = new YourDropletName();
    MockitoAnnotations.initMocks(this);
    skuList = new ArrayList<RepositoryItem>();
    skuList.add(skuMockA);
    skuList.add(skuMockB);
    Mockito.when(skuMockA.getPropertyValue("id")).thenReturn("skuA");
    Mockito.when(skuMockA.getPropertyValue("skuType")).thenReturn(ActiveSkuDroplet.EJSD);
    Mockito.when(skuMockA.getPropertyValue(ESTConstants.SKU_MISC1)).thenReturn("skuCodeA");
    Mockito.when(skuMockA.getPropertyValue("displayName")).thenReturn("skuADisplayName");
    Mockito.when(skuMockB.getPropertyValue("id")).thenReturn("skuB-PP");
    Mockito.when(skuMockB.getPropertyValue("skuType")).thenReturn(ActiveSkuDroplet.PJPROMIS);
    Mockito.when(skuMockB.getPropertyValue(ESTConstants.SKU_MISC1)).thenReturn("skuCodeB");
    Mockito.when(skuMockB.getPropertyValue("displayName")).thenReturn("skuBDisplayName");
}
So when you then call this within a test it will be something like this:
Mockito.when(someMethodThatReturnsAList).thenReturn(skuList);
So the key really is that you are not mocking the List but instead the contents of the List.
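For instance, a quick sanity test over the stubbed mocks (TestNG, matching the @BeforeMethod groups above; the assertions only exercise the stubs themselves):

@Test(groups = { "unit" })
public void testSkuListIsStubbed() {
    // Each mock answers getPropertyValue exactly as stubbed in setup().
    Assert.assertEquals(skuList.get(0).getPropertyValue("id"), "skuA");
    Assert.assertEquals(skuList.get(1).getPropertyValue("skuType"), ActiveSkuDroplet.PJPROMIS);
}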
Creating a mock using Mockito is a good option, but here is a different way of mocking the repository item.
Create a common implementation of RepositoryItem, say MockRepositoryItemImpl, like this in your test package:
public class MockRepositoryItemImpl implements RepositoryItem {

    private Map<String, Object> properties;

    MockRepositoryItemImpl() {
        properties = new HashMap<>();
    }

    @Override
    public Object getPropertyValue(String propertyName) {
        return properties.get(propertyName);
    }

    public void setPropertyValue(String propertyName, Object propertyValue) {
        properties.put(propertyName, propertyValue);
    }

    // Remaining RepositoryItem methods omitted for brevity.
}
Use this implementation to create the mock object in your test case.
MockRepositoryItemImpl mockSKU = new MockRepositoryItemImpl();
mockSKU.setPropertyValue("id", "sku0001");
mockSKU.setPropertyValue("displayName", "Mock SKU");
mockSKU.setPropertyValue("skuType", "Type1");
mockSKU.setPropertyValue("isActive", "1");
