I'm calling the class this way, but I still can't figure out what I'm getting wrong conceptually:
public class MMTUtil {
    private static Map<String, String> domainDocumentationMap;

    static {
        domainDocumentationMap = new TreeMap<String, String>();
    }

    public static Map<String, String> getDomainDocumentationMap() {
        return domainDocumentationMap;
    }

    public static void setDomainDocumentationMap(Map<String, String> domainDocumentationMap) {
        MMTUtil.domainDocumentationMap = domainDocumentationMap;
    }
}
How should I use setDomainDocumentationMap and getDomainDocumentationMap effectively, so that my TreeMap ends up holding entries of the form
(objectType + objectName, documentationLink)?
public UMRResultObject insertDocumentation(UMRDocumentationDTO documentationDTO) {
    Session session = UMRHibernateUtil.getUmrSession();
    Transaction tx = null;
    try {
        tx = session.beginTransaction();
        dao.insertDocumentation(documentationDTO, session);
        // objectName, objectType and documentationLink must be populated
        // (e.g. from documentationDTO) before this put; assigning them null
        // first, as the original code did, causes a NullPointerException here.
        MMTUtil.getDomainDocumentationMap().put(
                objectName.getDomainName() + objectType.getDomainType(),
                documentationLink.getDocumentationLink());
        tx.commit();
        ro.setSuccess(true);
        ro.getMessages().add("Record Inserted Successfully");
    } catch (Exception e) {
        if (tx != null) {
            tx.rollback();
        }
        throw e;
    }
    return ro;
}
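To get entries of the form (objectType + objectName, documentationLink) into the shared map, the key is simply the concatenation of the two identifiers. A minimal, self-contained sketch of that pattern (the class and helper names below are illustrative, not part of the real MMTUtil):

```java
import java.util.Map;
import java.util.TreeMap;

public class DomainDocCacheDemo {
    // Mirrors MMTUtil: one static map, initialized once for the whole JVM.
    private static final Map<String, String> cache = new TreeMap<String, String>();

    // Key is the concatenation objectType + objectName, as the question requires.
    static void put(String objectType, String objectName, String link) {
        cache.put(objectType + objectName, link);
    }

    static String get(String objectType, String objectName) {
        return cache.get(objectType + objectName);
    }

    public static void main(String[] args) {
        put("Domain", "Customer", "http://docs/customer");
        System.out.println(get("Domain", "Customer")); // prints http://docs/customer
    }
}
```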
javax.persistence.PersistenceException: org.hibernate.TransactionException: Already have an associated managed connection
Caused by: org.hibernate.TransactionException: Already have an associated managed connection
I am using parallelStream().forEach(), which runs on multiple threads:
private void loadHcpDataIntoCassandra(ResultSet srcHcpData) throws SQLException {
List<Map<String, Object>> hcpDataList = resultSetToList(srcHcpData);
Cluster cluster = HcpDao.getCassandraConnection();
List<String> tblSpecs = HcpDao.getTableSpecs();
Session session = cluster.connect(tblSpecs.get(0));
//Call prepareHcpAndLoad() method :-
hcpDataList.parallelStream().forEach(hcpMap -> prepareHcpAndLoad(hcpMap, session));
cluster.close();
}
I got the above-mentioned exception, so I replaced parallelStream().forEach() with:
for (Map<String,Object> hcpMap : hcpDataList) {
prepareHcpAndLoad(hcpMap, session);
}
The enhanced for loop works perfectly for me, but I need the multi-threaded behaviour. How can I solve this problem so that I can still use parallelStream().forEach()?
private static void prepareHcpAndLoad(Map<String, Object> hcpMap, Session session) {
String hcpHceId = "";
for (Map.Entry<String, Object> entry : hcpMap.entrySet()) {
String colName = entry.getKey();
Object hcpTableRow = entry.getValue();
hcpMasterData.setHcpHceId(hcpTableRow.toString());
hcpHceId = hcpTableRow.toString();
}
/** Get MDM_id */
MdmId mdm = new MdmId();
mdm.setHcpHceId(hcpHceId);
String mdmId = getMdmId(mdm);
/** update mdmId */
hcpMasterData.setMdmId(mdmId);
mapper.save(hcpMasterData);
}
//@PersistenceContext
@PersistenceContext(type = PersistenceContextType.TRANSACTION)
private static EntityManager em = getEntityManager();
public static String getMdmId(MdmId mdm) {
if(em == null) {
em = getEntityManager();
}
String mdmId = "";
EntityTransaction tr = em.getTransaction();
try {
tr.begin(); //Error Line
em.persist(mdm);
em.flush();
mdmId = Long.toString(mdm.getId());
tr.commit();
} catch (Exception error) {
logger.error(error.getMessage());
error.printStackTrace();
}
return mdmId;
}
private static EntityManager getEntityManager() {
return Persistence.createEntityManagerFactory("sql-connection").createEntityManager();
}
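A likely cause of "Already have an associated managed connection" is that an EntityManager is not thread-safe, yet the static em field above is shared by every thread of the parallel stream. One common fix is to give each thread its own EntityManager, e.g. via ThreadLocal. A runnable sketch of the pattern, using StringBuilder as a stand-in for EntityManager so it runs without a JPA provider:

```java
public class ThreadLocalEmDemo {
    // Stand-in for ThreadLocal<EntityManager> built from the factory:
    // each thread lazily creates, and then keeps, its own instance.
    private static final ThreadLocal<StringBuilder> em =
            ThreadLocal.withInitial(StringBuilder::new);

    public static void main(String[] args) throws InterruptedException {
        StringBuilder mainEm = em.get();
        StringBuilder[] workerEm = new StringBuilder[1];
        Thread worker = new Thread(() -> workerEm[0] = em.get());
        worker.start();
        worker.join();
        // The same thread always sees its own instance...
        System.out.println(mainEm == em.get());    // true
        // ...while a different thread gets a different one.
        System.out.println(mainEm == workerEm[0]); // false
    }
}
```

In getMdmId you would then call em.get() on a ThreadLocal built from the EntityManagerFactory instead of reading a shared static field (remembering to close each per-thread EntityManager when its work is done); alternatively, create and close a fresh EntityManager inside getMdmId on every call.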
@CacheRemoveAll does not clear the cache. Every time I call getTechnologyColorObjects(), my data gets cached, but when I call deleteTechnologyColorObjects(), the cache is still there.
@Singleton(name = "ThematicTechnologyColorCache")
@Startup
@Lock(LockType.READ)
@CacheDefaults(cacheName = "ThematicTechnologyColorCache")
@Keep
@KeepName
@KeepClassMemberNames
public class ThematicTechnologyColorCache {
@Inject
private WebPortalManager webPortalManager;
@CacheResult(cacheName = "ThematicTechnologyColorCache")
public HashMap<String, List<ThemeText>> getTechnologyColorObjects() {
HashMap<String, List<ThemeText>> technologyColorsList = null;
HashMap<String, Integer> profileIDList = getProfileIDList();
if (profileIDList != null && !profileIDList.isEmpty()) { // null check must come first
technologyColorsList = getTechnologyColorList(profileIDList);
}
return technologyColorsList;
}
//This function is not working
@CacheRemoveAll(cacheName = "ThematicTechnologyColorCache")
public void deleteObjects() {
logger.warning("In deleteTechnologyColorObjects");
}
public class DownloadBoxHelper extends WCMUsePojo {
private static final Logger log = LoggerFactory.getLogger(DownloadBoxHelper.class);
private ArrayList<Map<String, String>> downloadList;
private ArrayList<Map<String, String>> downloadListFinal;
DownloadBoxModel downloadBoxModel;
@Override
public void activate() throws Exception {
log.info("Download Box activate Method started");
JcrUtilService jcrUtil = getSlingScriptHelper().getService(JcrUtilService.class);
downloadBoxModel = getResource().adaptTo(DownloadBoxModel.class);
downloadList = downloadBoxModel.getDownloadList();
downloadListFinal =DHLUtil.getSizeTypeOfAsset(downloadList, getResource(), jcrUtil);
downloadBoxModel.setDownloadListFinal(downloadListFinal);
log.info("Download Box activate Method Ended");
}
public DownloadBoxModel getDownloadBoxModel() {
return downloadBoxModel;
}
}
I want to mock this helper class, but it contains a call to a static method: downloadListFinal = DHLUtil.getSizeTypeOfAsset(downloadList, getResource(), jcrUtil);
The static method belongs to the DHLUtil class. Here is its declaration:
public static ArrayList<Map<String, String>> getSizeTypeOfAsset(ArrayList<Map<String, String>> downloadList,
Resource rs, JcrUtilService jcrUtil) {
log.info("DHLUtil getSizeTypeOfAsset() initiated ");
ArrayList<Map<String, String>> localDownloadList = new ArrayList<Map<String, String>>();
Session session = null;
Node assetMetaNode;
try {
session = jcrUtil.getSession(DHLSubService.readservice);
Iterator<Map<String, String>> it = downloadList.iterator();
while (it.hasNext()) {
Map<String, String> mp = it.next();
if (mp.get(DHLConstants.ASSET_DOWNLOAD_ITEM).toString().contains(".")) {
assetMetaNode = session.getNode((mp.get(DHLConstants.ASSET_DOWNLOAD_ITEM).toString())
+ DHLConstants.SLASH + JcrConstants.JCR_CONTENT +DHLConstants.SLASH + DamConstants.ACTIVITY_TYPE_METADATA);
String assetType = assetMetaNode.getProperty(DamConstants.DC_FORMAT).getString();
if(assetType!=null){
if(assetType.contains("vnd.openxmlformats-officedocument.spreadsheetml.sheet") || assetType.contains("vnd.ms-excel")){
assetType="ms-excel";
}
if(assetType.contains("vnd.openxmlformats-officedocument.wordprocessingml.document") || assetType.contains("msword")){
assetType="ms-word";
}
if(assetType.contains("vnd.openxmlformats-officedocument.presentationml.presentation") || assetType.contains("vnd.ms-powerpoint")){
assetType="ms-powerpoint";
}
}
Property assetSize = assetMetaNode.getProperty(DamConstants.DAM_SIZE);
double assetSizeUpdated = 0d;
DecimalFormat df = new DecimalFormat("0.0");
String assetSizeType = DHLConstants.BYTE;
if (assetSize.getLong() < (1024)) {
assetSizeUpdated = (double) assetSize.getLong();
}
if (assetSize.getLong() >= 1024 && assetSize.getLong() < (1024 * 1024)) {
assetSizeType = DHLConstants.KILOBYTE;
assetSizeUpdated = (double) assetSize.getLong() / 1024L;
}
if (assetSize.getLong() >= (1024 * 1024)) {
assetSizeType = DHLConstants.MEGABYTE;
assetSizeUpdated = ((double) assetSize.getLong() / (1024 * 1024));
}
if (assetType.contains("/")) {
String strSplit[] = assetType.split("/");
assetType = strSplit[1];
}
String strMetaData = assetType.toUpperCase() + DHLConstants.SPACE + DHLConstants.LEFT_BRACKET
+ DHLConstants.SPACE + df.format(assetSizeUpdated) + DHLConstants.SPACE + assetSizeType + DHLConstants.SPACE + DHLConstants.RIGHT_BRACKET;
mp.put(DamConstants.ACTIVITY_TYPE_METADATA, strMetaData);
localDownloadList.add(mp);
}
}
}catch (DHLException dhe) {
log.error("DHLException {}", dhe);
}catch (Exception e) {
log.error("Exception {}", e);
}finally {
if(null!=session && session.isLive()) {
session.logout();
}
}
return localDownloadList;
}
So how do I mock this?
My JUnit file is:
@RunWith(PowerMockRunner.class)
@PrepareForTest({DownloadBoxHelper.class, DHLUtil.class, DownloadBoxModel.class})
public class DownloadBoxHelperTest extends PowerMockTestCase {
private DownloadBoxHelper aFinalClass_mock = null;
@Test
public void mockFinalClassTest() {
ArrayList<Map<String, String>> downloadList = new ArrayList<Map<String, String>>();
ArrayList<Map<String, String>> downloadListFinal;
Map<String, String> n = new HashMap<String, String>();
n.put("a", "a");
n.put("b", "b");
downloadList.add(n);
DownloadBoxModel downloadBoxModel;
aFinalClass_mock = PowerMockito.mock(DownloadBoxHelper.class);
Mockito.when(aFinalClass_mock.getSlingScriptHelper()).thenReturn(null);
// Assert the mocked result is returned from method call
//Assert.assertEquals(aFinalClass_mock.getSlingScriptHelper()).thenReturn(null);
}
@Test
public void mockFinalClassTest_1() {
JcrUtilService jcrUtil;
ArrayList<Map<String, String>> downloadListFinal = new ArrayList<Map<String, String>>();
Map<String, String> n1 = new HashMap<String, String>();
n1.put("a", "a");
n1.put("b", "b");
downloadListFinal.add(n1);
Mockito.when(aFinalClass_mock.getDownloadListFinal()).thenReturn(downloadListFinal);
// Assert the mocked result is returned from method call
//Assert.assertEquals(aFinalClass_mock.getSizeTypeOfAsset(downloadListFinal, getResource(), jcrUtil);, mockedResult);
}
Please provide me a solution, or one reference JUnit file, where a static method of this type is mocked:

public static ArrayList<Map<String, String>> getSizeTypeOfAsset(ArrayList<Map<String, String>> downloadList,
        Resource rs, JcrUtilService jcrUtil) {
    log.info("DHLUtil getSizeTypeOfAsset() initiated ");
Thanks
You should add:
PowerMockito.mockStatic(DHLUtil.class);
and then you can stub the static method like any other mock (argument matchers shown for brevity):
when(DHLUtil.getSizeTypeOfAsset(any(ArrayList.class), any(Resource.class), any(JcrUtilService.class))).thenReturn(whatever);
I am trying to cache some data in a Storm bolt, but I am not sure whether this is the right way to do it. In the class below, employee id and employee name are cached in a hash map. For this, a database call is made to the Employee table to select all employees and populate the hash map in the prepare method (is that the right place to initialize the map?).
After some logging it turns out that, while running the Storm topology, the topology makes multiple database connections and initializes the map multiple times. Of course I want to avoid this; that is why I want to cache the result, so that it does not go to the database every time. Please help.
public class TestBolt extends BaseRichBolt {
private static final long serialVersionUID = 2946379346389650348L;
private OutputCollector collector;
private Map<String, String> employeeIdToNameMap;
private static final Logger LOG = Logger.getLogger(TestBolt.class);
@Override
public void execute(Tuple tuple) {
String employeeId = tuple.getStringByField("employeeId");
String employeeName = employeeIdToNameMap.get(employeeId);
collector.emit(tuple, new Values(employeeId, employeeName));
collector.ack(tuple);
}
@Override
public void prepare(Map stormConf, TopologyContext context, OutputCollector collector) {
// TODO Auto-generated method stub
this.collector = collector;
try {
employeeIdToNameMap = createEmployeIdToNameMap();
} catch (SQLException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
@Override
public void declareOutputFields(OutputFieldsDeclarer declarer) {
declarer.declare(new Fields(/*some fields*/));
}
private Map<String, String> createEmployeIdToNameMap() throws SQLException {
final Map<String, String> employeeIdToNameMap = new HashMap<>();
final DatabaseManager dbm = new PostgresManager();
final String query = "select id, name from employee;";
final Connection conn = dbm.createDefaultConnection();
final ResultSet result = dbm.executeSelectQuery(conn, query);
while(result.next()) {
String employeId = result.getString("id");
String name = result.getString("name");
employeeIdToNameMap.put(employeId, name);
}
conn.close();
return employeeIdToNameMap;
}
}
SOLUTION
I created a synchronized map and it is working fine for me:
private static Map<String, String> employeeIdToNameMap = Collections
.synchronizedMap(new HashMap<String, String>());
Since you have multiple bolt tasks, you can mark employeeIdToNameMap static and volatile. Initialize the map in prepare like this:
try {
synchronized(TestBolt.class) {
if (null == employeeIdToNameMap) {
employeeIdToNameMap = createEmployeIdToNameMap();
}
}
} catch (SQLException e) {
...
}
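The static + volatile + synchronized initialization above can be sketched as a runnable demo. The put below stands in for the database query, and the counter exists only to show that the load happens once per JVM even when several bolt tasks race to prepare:

```java
import java.util.HashMap;
import java.util.Map;

public class SharedCacheDemo {
    // static + volatile so all bolt tasks in the same worker JVM share one copy.
    private static volatile Map<String, String> cache;
    private static int loadCount = 0;

    static Map<String, String> getCache() {
        if (cache == null) {                       // first check, no lock
            synchronized (SharedCacheDemo.class) {
                if (cache == null) {               // second check, under lock
                    Map<String, String> m = new HashMap<String, String>();
                    m.put("42", "Alice");          // stands in for the DB query
                    loadCount++;
                    cache = m;                     // publish fully-built map
                }
            }
        }
        return cache;
    }

    public static void main(String[] args) throws InterruptedException {
        Thread[] tasks = new Thread[4];            // simulate 4 bolt tasks
        for (int i = 0; i < tasks.length; i++) {
            tasks[i] = new Thread(SharedCacheDemo::getCache);
            tasks[i].start();
        }
        for (Thread t : tasks) t.join();
        System.out.println(loadCount);             // prints 1: one DB load total
    }
}
```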
I retrieve values from a database, create a new Transaction Object and add it to an ArrayList<Transaction>, which I then return.
The problem is that every time returnList.add(t); is called, instead of just adding the new Transaction, it also replaces all the old Transactions with the new one.
Where is the error that causes this behaviour?
public ArrayList<Transaction> getTransactions(long intervall, Map<String, String> transactionFields) {
connect();
ArrayList<Transaction> returnList = new ArrayList<Transaction>();
Statement sta;
ZonedDateTime now = ZonedDateTime.now(ZoneOffset.UTC);
now = now.minusSeconds(intervall);
try {
sta = conn.createStatement();
String Sql = "...";
ResultSet rs = sta.executeQuery(Sql);
while (rs.next()) {
Transaction t = new Transaction(rs.getString("No"), transactionFields);
t.set("AgentName", rs.getString("cname"));
returnList.add(t);
}
} catch (SQLException e) {
...
}
disconnect();
return returnList;
}
Here is the Transaction class:
public class Transaction {
private Map<String, String> fields;
public Transaction(String number, Map<String, String> transactionFields) {
fields = transactionFields;
fields.put("Number", number);
}
public void set(String field, String value) {
fields.put(field, value);
}
public String get(String field) {
return fields.get(field);
}
public Map<String, String> getFieldMap() {
return fields;
}
@Override
public String toString() {
    return fields.toString();
}
}
You are using the same Map in all your Transaction instances.
Instead, pass in a new one each time:
Transaction t = new Transaction(rs.getString("No"), new HashMap<String, String>());
or just create the Map inside your Transaction class.
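A runnable sketch of the defensive-copy approach, with a simplified Transaction class (field names illustrative):

```java
import java.util.HashMap;
import java.util.Map;

public class TransactionDemo {
    static class Transaction {
        private final Map<String, String> fields;

        Transaction(String number, Map<String, String> template) {
            // Defensive copy: each Transaction owns its own map, so later
            // instances can no longer overwrite earlier ones.
            this.fields = new HashMap<String, String>(template);
            this.fields.put("Number", number);
        }

        String get(String field) {
            return fields.get(field);
        }
    }

    public static void main(String[] args) {
        Map<String, String> template = new HashMap<String, String>();
        Transaction a = new Transaction("1", template);
        Transaction b = new Transaction("2", template);
        // With a shared map, both would print "2 2"; with the copy each
        // Transaction keeps its own number.
        System.out.println(a.get("Number") + " " + b.get("Number")); // prints 1 2
    }
}
```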