Update: I've created an example on GitHub to demonstrate my problem; HibernateMapTest currently fails because the HashMap key is a proxy object. I'm hoping someone can suggest a way to query for the entity and fetch the map so that the test passes...
I'm simply trying to fetch the contents of a HashMap persisted with Hibernate, but I'm having trouble finding the correct way to do it...
The HBM mapping is as follows. I did not create this, but from my research it appears to be a ternary association mapping with a many-to-many relation. (Update: to simplify my question I've forced the map to lazy="false" so that a join fetch isn't needed):
<hibernate-mapping>
<class name="database.Document" table="document">
...
<map name="documentbundles" table="document_bundles" lazy="false">
<key column="id"/>
<index-many-to-many column="pkgitemid" class="database.PkgItem"/>
<many-to-many column="child" class="database.Document" />
</map>
</class>
</hibernate-mapping>
For simplicity I'm currently just attempting to fetch all the records with this map data populated:
DetachedCriteria criteria = DetachedCriteria.forClass(Document.class);
criteria.add(Restrictions.eq("id", 1));
List<Document> result = hibernateTemplate.findByCriteria(criteria);
After setting lazy="false", I now get the contents of the map without a LazyInitializationException being thrown, but none of the key objects have been initialised properly: they come back as uninitialised proxies rather than populated PkgItem instances.
I know that the fields are populated in the database, and I suspect my fetching strategy is still to blame. How do you fetch a <map> in Hibernate correctly?
The error is due to the HibernateTemplate opening a Hibernate session to execute this query:
List results = hibernateTemplate.find("from database.Document d where d.name = 'doc1'");
and then immediately closing the session after the query is run. When the code later loops through the keys, the session to which the map was linked is already closed, so the data can no longer be loaded and the proxy throws the LazyInitializationException.
This exception means that the proxy can no longer load its data transparently, because the session it is linked to is now closed.
One of the main goals of the HibernateTemplate is to know when to open and close sessions. The template will keep the session open if there is an ongoing transaction.
So the key here is to wrap the unit test in a TransactionTemplate (the programmatic equivalent of @Transactional), which causes the session to be kept open by the HibernateTemplate. Because the session stays open, no more lazy initialization exceptions occur.
Modifying the test like this will solve the problem (notice the use of TransactionTemplate):
import database.Document;
import database.PkgItem;
import org.junit.After;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import org.springframework.core.io.ClassPathResource;
import org.springframework.core.io.Resource;
import org.springframework.orm.hibernate3.HibernateTemplate;
import org.springframework.orm.hibernate3.HibernateTransactionManager;
import org.springframework.orm.hibernate3.LocalSessionFactoryBean;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.TransactionStatus;
import org.springframework.transaction.support.TransactionCallbackWithoutResult;
import org.springframework.transaction.support.TransactionTemplate;
import java.util.HashMap;
import java.util.List;
import java.util.Set;
public class HibernateMapTest {
private static final String TEST_DIALECT = "org.hibernate.dialect.HSQLDialect";
private static final String TEST_DRIVER = "org.hsqldb.jdbcDriver";
private static final String TEST_URL = "jdbc:hsqldb:mem:adportal";
private static final String TEST_USER = "sa";
private static final String TEST_PASSWORD = "";
private HibernateTemplate hibernateTemplate;
private TransactionTemplate transactionTemplate;
@Before
public void setUp() throws Exception {
hibernateTemplate = new HibernateTemplate();
LocalSessionFactoryBean sessionFactory = new LocalSessionFactoryBean();
sessionFactory.getHibernateProperties().put("hibernate.dialect", TEST_DIALECT);
sessionFactory.getHibernateProperties().put("hibernate.connection.driver_class", TEST_DRIVER);
sessionFactory.getHibernateProperties().put("hibernate.connection.password", TEST_PASSWORD);
sessionFactory.getHibernateProperties().put("hibernate.connection.url", TEST_URL);
sessionFactory.getHibernateProperties().put("hibernate.connection.username", TEST_USER);
sessionFactory.getHibernateProperties().put("hibernate.hbm2ddl.auto", "create");
sessionFactory.getHibernateProperties().put("hibernate.show_sql", "true");
sessionFactory.getHibernateProperties().put("hibernate.jdbc.batch_size", "0");
sessionFactory.getHibernateProperties().put("hibernate.cache.use_second_level_cache", "false");
sessionFactory.setMappingDirectoryLocations(new Resource[]{new ClassPathResource("database")});
sessionFactory.afterPropertiesSet();
hibernateTemplate.setSessionFactory(sessionFactory.getObject());
transactionTemplate = new TransactionTemplate(new HibernateTransactionManager(sessionFactory.getObject()));
}
@After
public void tearDown() throws Exception {
hibernateTemplate.getSessionFactory().close();
}
@Test
public void testFetchEntityWithMap() throws Exception {
transactionTemplate.execute(new TransactionCallbackWithoutResult() {
protected void doInTransactionWithoutResult(TransactionStatus status) {
// Store the entities and mapping
PkgItem key = new PkgItem();
key.setName("pkgitem1");
hibernateTemplate.persist(key);
Document doc2 = new Document();
doc2.setName("doc2");
hibernateTemplate.persist(doc2);
Document doc1 = new Document();
doc1.setName("doc1");
HashMap<PkgItem, Document> documentbundles = new HashMap<PkgItem, Document>();
documentbundles.put(key, doc2);
doc1.setDocumentbundles(documentbundles);
hibernateTemplate.persist(doc1);
// Now attempt a query
List results = hibernateTemplate.find("from database.Document d where d.name = 'doc1'");
Document result = (Document)results.get(0);
// Check the doc was returned
Assert.assertEquals("doc1", result.getName());
key = (PkgItem)hibernateTemplate.find("from database.PkgItem").get(0);
Set<PkgItem> bundleKeys = result.getDocumentbundles().keySet();
// Check the key is still present in the map. At this point the test fails because
// the map contains a proxy object of the key...
Assert.assertEquals(key, bundleKeys.iterator().next());
}
});
}
}
With this change in place, the test passes.
This is a supplement to jhadesdev's answer, since I needed to do a little more work to get exactly what I was looking for.
In summary, you can't fetch a PersistentMap with a Hibernate query and immediately start using it like a typical Java HashMap. The keys are always proxies; eager fetching / join fetching only initialises the map values, not the keys.
This means any code that deals with the hash map needs to run inside a Hibernate session or transaction, which caused me some architectural problems, as my data and service layers are separate.
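If you only need the keys resolved while a session is still open, you can force the proxies to load with Hibernate.initialize() (a sketch; whether map lookups then behave as expected still depends on your entities' equals()/hashCode()):
// Inside an open session/transaction: force each proxy key to initialise.
// Here doc is assumed to be a Document still attached to the session.
for (PkgItem key : doc.getDocumentbundles().keySet()) {
    Hibernate.initialize(key);
}
That still ties every consumer of the map to an open session, though.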
I worked around this by iterating the hash map within a single transaction and replacing the keys with the ones originally passed in. I kept performance up by batching up the keys I want to fetch and retrieving them in one go:
// Build a list of keys we want to fetch in one go
final List<PkgItem> pkgItems = Arrays.asList(pkgItem1, pkgItem2, ...);
Map<PkgItem, Document> bundles = transactionTemplate.execute(new TransactionCallback< Map<PkgItem, Document> >() {
@Override
public Map<PkgItem, Document> doInTransaction(TransactionStatus transactionStatus) {
if (doc1.getId() == null) return null;
// Merge the parent document into this transaction
Document container = hibernateTemplate.merge(doc1);
// Copy the original package items into the key set
Map<PkgItem, Document> out = new HashMap<PkgItem, Document>();
for (PkgItem dbKey : container.getDocumentbundles().keySet()) {
int keyIndex = pkgItems.indexOf(dbKey);
if (keyIndex > -1) out.put(pkgItems.get(keyIndex), container.getDocumentbundles().get(dbKey));
}
return out;
}
});
// Now we can perform a standard lookup
assertEquals("doc2", result.get(pkgItem1).getName());
I can now use the map without Hibernate in the resulting code, with only a minimal performance hit. I've also updated the test in my example GitHub project to demonstrate how this can work.
While I have Java Batch jobs that read data, process it, and store it in other places in the database, I now need a step that actually removes data from the database. All I need to run is a delete query via JPA.
The chunk-based Reader/Processor/Writer pattern does not make sense here, but the Batchlet alternative is giving me a headache too. What did I do?
I created a Batchlet that gets invoked via CDI. At that point it is easy to inject my JPA EntityManager. What is not easy is running the update query. The code looks like this:
package ...;
import javax.batch.api.AbstractBatchlet;
import javax.batch.api.BatchProperty;
import javax.inject.Inject;
import javax.inject.Named;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import javax.persistence.Query;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
#Named("CleanerBatchlet")
public class CleanerBatchlet extends AbstractBatchlet {
public static final Logger log = LogManager.getLogger(CleanerBatchlet.class);
@PersistenceContext(unitName = "...")
private EntityManager entityManager;
@Inject
@BatchProperty(name = "technologyIds")
private String technologyIds;
private void clearQueue(long technologyId) {
//EntityManager entityManager = ...getEntityManager();
//entityManager.getTransaction().begin();
Query q = entityManager.createQuery("delete from Record r where r.technologyId=:technologyId");
q.setParameter("technologyId", technologyId);
int count = q.executeUpdate();
//entityManager.getTransaction().commit();
log.debug("Deleted {} entries from queue {}", count, technologyId);
//entityManager.close();
}
@Override
public String doProcess() throws Exception {
log.debug("doProcess()");
System.out.println("technologyIds=" + technologyIds);
log.info("technologyIds=" + technologyIds);
try {
String[] parts = technologyIds.split(",");
for (String part: parts) {
long technologyId = Long.parseLong(part);
clearQueue(technologyId);
}
} catch (NullPointerException | NumberFormatException e) {
throw new IllegalStateException("technologyIds must be set to a string of comma-separated numbers.", e);
}
return "COMPLETED";
}
}
As you can see some lines are commented out - these are the ones I am experimenting with.
So if I run the code as-is, I get an exception telling me that the update query requires a transaction. This is regardless of which of the two persistence units in my project I use (one is configured for JTA, the other is not).
javax.persistence.TransactionRequiredException: Executing an update/delete query
It also does not matter whether I uncomment the begin/commit transaction-handling code. I still get the same error that a transaction is required to run the update query.
Even when I try to circumvent CDI and JTA completely by creating my own EntityManager via the Persistence API (and closing it afterwards), I get the very same exception.
So how can I run this delete query, or other update queries, from within the batch job?
I'd suggest using plain JDBC to run this delete query, with either auto-commit or a manual transaction commit.
During the batchlet processing, the incoming transaction is suspended. So the entity manager does not have a transaction context.
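A minimal sketch of that approach (the DataSource lookup and the table/column names are assumptions, not taken from the question):
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import javax.annotation.Resource;
import javax.sql.DataSource;

@Resource(lookup = "jdbc/myDataSource") // hypothetical JNDI name
private DataSource dataSource;

private void clearQueue(long technologyId) throws SQLException {
    // A plain connection needs no JTA context; rely on auto-commit,
    // or call con.commit() manually if the pool disables it.
    try (Connection con = dataSource.getConnection();
         PreparedStatement ps = con.prepareStatement(
                 "delete from RECORD where TECHNOLOGY_ID = ?")) { // names assumed
        ps.setLong(1, technologyId);
        int count = ps.executeUpdate();
        log.debug("Deleted {} entries from queue {}", count, technologyId);
    }
}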
Ultimately I made it work by following this tutorial: https://dzone.com/articles/resource-local-vs-jta-transaction-types-and-payara
and going for the Classic RESOURCE_LOCAL Application pattern.
It involves injecting the non-JTA EntityManagerFactory, using that to create the EntityManager, and closing it after use. The transaction has to be managed manually, but it now works.
The essential excerpt of my code looks like this:
@PersistenceUnit(unitName = "...")
private EntityManagerFactory emf;
@Inject
@BatchProperty(name = "technologyIds")
private String technologyIds;
private void clearQueue(long technologyId) {
EntityManager entityManager = emf.createEntityManager();
entityManager.getTransaction().begin();
Query q = entityManager.createQuery("delete from Record r where r.technologyId=:technologyId");
q.setParameter("technologyId", technologyId);
q.executeUpdate();
entityManager.getTransaction().commit();
entityManager.close();
}
We have a requirement where around 25 CSV files arrive each day and must be stored in the database in an equivalent table structure.
The column structure of any CSV file could change in future (columns added or removed), and the underlying DB table would have to align to the new format without a code change or redeployment.
Here is the choice of tech:
Spring Boot as runtime
Hibernate for JPA/DB interaction
Oracle DB as database
If using Hibernate, how do we achieve this dynamic column management of the table as per the incoming CSV?
As far as I know, Hibernate has Java entity classes equivalent to the table, which are used to persist data. Any table change needs an entity class change too.
A possible solution could be:
just define a basic JPA entity and table structure (ids, FKs linking to other tables, etc.) for the CSV-equivalent table,
then, on arrival of the CSV files, add the columns to the table by running ALTER TABLE commands from the application (see the sketch below),
and in future, if CSV columns are added/removed, use similar ALTER commands.
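For illustration, a minimal sketch of that second step, assuming a Hibernate Session is at hand and that the table and column names come from the parsed CSV header (all identifiers here are hypothetical):
// Run DDL derived from the CSV header through Hibernate's JDBC connection.
// Note: the static entity mapping will not know about the new column; it
// would have to be read/written via native SQL or a dynamic mapping.
void addCsvColumn(Session session, String tableName, String columnName) {
    session.doWork(connection -> {
        try (Statement stmt = connection.createStatement()) {
            stmt.executeUpdate("ALTER TABLE " + tableName
                    + " ADD " + columnName + " VARCHAR2(255)"); // Oracle syntax
        }
    });
}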
Is this achievable by Hibernate?
Or is any other product better suited for this kind of task?
Task definition
We have to implement a mechanism that allows creating/deleting custom fields in real time, avoiding an application restart, adding a value into such a field, and making sure the value is present in the application database. Besides, we have to make sure that the custom field can be used in queries.
Solution
Domain Model
We first need a business entity class to experiment with. Let it be the Contact class, with two persistent fields: id and name.
Besides these permanent and unchangeable fields, the class needs some sort of construct in which to store the values of custom fields. A Map is ideal for this.
Let's create a base class for all business entities supporting custom fields - CustomizableEntity - that contains the Map customProperties for working with custom fields:
package com.enterra.customfieldsdemo.domain;
import java.util.Map;
import java.util.HashMap;
public abstract class CustomizableEntity {
private Map customProperties;
public Map getCustomProperties() {
if (customProperties == null)
customProperties = new HashMap();
return customProperties;
}
public void setCustomProperties(Map customProperties) {
this.customProperties = customProperties;
}
public Object getValueOfCustomField(String name) {
return getCustomProperties().get(name);
}
public void setValueOfCustomField(String name, Object value) {
getCustomProperties().put(name, value);
}
}
Step 1 - base class CustomizableEntity
Inherit our class Contact from this base class:
package com.enterra.customfieldsdemo.domain;
import com.enterra.customfieldsdemo.domain.CustomizableEntity;
public class Contact extends CustomizableEntity {
private int id;
private String name;
public int getId() {
return id;
}
public void setId(int id) {
this.id = id;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
}
Step 2 - Class Contact inherited from CustomizableEntity.
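As a quick illustration (the field name and value here are hypothetical), custom field values are then accessed through the inherited helpers:
Contact contact = new Contact();
contact.setName("John Doe");
// Stored in the customProperties map; persisted only once the field is mapped
contact.setValueOfCustomField("phone", "555-1234");
Object phone = contact.getValueOfCustomField("phone");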
We should not forget about the mapping file for this class:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE hibernate-mapping PUBLIC "-//Hibernate/Hibernate Configuration DTD//EN"
"http://hibernate.sourceforge.net/hibernate-mapping-3.0.dtd">
<hibernate-mapping auto-import="true" default-access="property" default-cascade="none" default-lazy="true">
<class abstract="false" name="com.enterra.customfieldsdemo.domain.Contact" table="tbl_contact">
<id column="fld_id" name="id">
<generator class="native"/>
</id>
<property name="name" column="fld_name" type="string"/>
<dynamic-component insert="true" name="customProperties" optimistic-lock="true" unique="false" update="true">
</dynamic-component>
</class>
</hibernate-mapping>
Step 3 - Mapping Class Contact.
Please note that the properties id and name are mapped as ordinary properties; for customProperties, however, we use the <dynamic-component> tag. The documentation for Hibernate 3.2.0GA says this about the point of a dynamic-component:
"The semantics of a <dynamic-component> mapping are identical to <component>. The advantage of this kind of mapping is the ability to determine the actual properties of the bean at deployment time, just by editing the mapping document. Runtime manipulation of the mapping document is also possible, using a DOM parser. Even better, you can access (and change) Hibernate's configuration-time metamodel via the Configuration object."
Based on this statement from the Hibernate documentation, we will build this mechanism.
HibernateUtil and hibernate.cfg.xml
Once we have defined the domain model of our application, we have to create the necessary conditions for the Hibernate framework to function. For this we create the configuration file hibernate.cfg.xml and a class to work with the core Hibernate functions.
<?xml version='1.0' encoding='utf-8'?>
<!DOCTYPE hibernate-configuration
PUBLIC "-//Hibernate/Hibernate Configuration DTD//EN"
"http://hibernate.sourceforge.net/hibernate-configuration-3.0.dtd">
<hibernate-configuration>
<session-factory>
<property name="show_sql">true</property>
<property name="dialect">
org.hibernate.dialect.MySQLDialect</property>
<property name="cglib.use_reflection_optimizer">true</property>
<property name="hibernate.connection.driver_class">com.mysql.jdbc.Driver</property>
<property name="hibernate.connection.url">jdbc:mysql://localhost:3306/custom_fields_test</property>
<property name="hibernate.connection.username">root</property>
<property name="hibernate.connection.password"></property>
<property name="hibernate.c3p0.max_size">50</property>
<property name="hibernate.c3p0.min_size">0</property>
<property name="hibernate.c3p0.timeout">120</property>
<property name="hibernate.c3p0.max_statements">100</property>
<property name="hibernate.c3p0.idle_test_period">0</property>
<property name="hibernate.c3p0.acquire_increment">2</property>
<property name="hibernate.jdbc.batch_size">20</property>
<property name="hibernate.hbm2ddl.auto">update</property>
</session-factory>
</hibernate-configuration>
Step 4 - Hibernate configuration file.
The file hibernate.cfg.xml does not contain anything notable except for this line:
<property name="hibernate.hbm2ddl.auto">update</property>
Step 5 - using auto-update.
Later we will explain its purpose in detail and show how to do without it. There are several ways to implement the HibernateUtil class; our implementation differs a bit from the well-known ones due to the run-time changes to the Hibernate configuration.
package com.enterra.customfieldsdemo;
import org.hibernate.*;
import org.hibernate.mapping.PersistentClass;
import org.hibernate.tool.hbm2ddl.SchemaUpdate;
import org.hibernate.cfg.Configuration;
import com.enterra.customfieldsdemo.domain.Contact;
public class HibernateUtil {
private static HibernateUtil instance;
private Configuration configuration;
private SessionFactory sessionFactory;
private Session session;
public synchronized static HibernateUtil getInstance() {
if (instance == null) {
instance = new HibernateUtil();
}
return instance;
}
private synchronized SessionFactory getSessionFactory() {
if (sessionFactory == null) {
sessionFactory = getConfiguration().buildSessionFactory();
}
return sessionFactory;
}
public synchronized Session getCurrentSession() {
if (session == null) {
session = getSessionFactory().openSession();
session.setFlushMode(FlushMode.COMMIT);
System.out.println("session opened.");
}
return session;
}
private synchronized Configuration getConfiguration() {
if (configuration == null) {
System.out.print("configuring Hibernate ... ");
try {
configuration = new Configuration().configure();
configuration.addClass(Contact.class);
System.out.println("ok");
} catch (HibernateException e) {
System.out.println("failure");
e.printStackTrace();
}
}
return configuration;
}
public void reset() {
Session session = getCurrentSession();
if (session != null) {
session.flush();
if (session.isOpen()) {
System.out.print("closing session ... ");
session.close();
System.out.println("ok");
}
}
SessionFactory sf = getSessionFactory();
if (sf != null) {
System.out.print("closing session factory ... ");
sf.close();
System.out.println("ok");
}
this.configuration = null;
this.sessionFactory = null;
this.session = null;
}
public PersistentClass getClassMapping(Class entityClass){
return getConfiguration().getClassMapping(entityClass.getName());
}
}
Step 6 - HibernateUtil class.
Alongside the usual methods like getCurrentSession() and getConfiguration(), which are necessary for the regular operation of a Hibernate-based application, we have also implemented the methods reset() and getClassMapping(Class entityClass). In getConfiguration() we configure Hibernate and add the Contact class to the configuration.
The reset() method closes all resources used by Hibernate and clears all of its settings:
public void reset() {
Session session = getCurrentSession();
if (session != null) {
session.flush();
if (session.isOpen()) {
System.out.print("closing session ... ");
session.close();
System.out.println("ok");
}
}
SessionFactory sf = getSessionFactory();
if (sf != null) {
System.out.print("closing session factory ... "); sf.close();
System.out.println("ok");
}
this.configuration = null;
this.sessionFactory = null;
this.session = null;
}
Step 7 - method reset()
The getClassMapping(Class entityClass) method returns a PersistentClass object containing full information on the mapping of the related entity. In particular, manipulating the PersistentClass object allows the set of attributes of the entity class to be modified at run time.
public PersistentClass getClassMapping(Class entityClass) {
return getConfiguration().getClassMapping(entityClass.getName());
}
Step 8 - method getClassMapping(Class entityClass).
Manipulations with mapping
Once we have the business entity class (Contact) and the main class for interacting with Hibernate, we can start working. We can create and save instances of the Contact class. We can even place some data into our customProperties Map; however, we should be aware that this data (stored in the customProperties Map) is not yet saved to the DB.
To have the data saved, we need a mechanism for creating custom fields in our classes, and we must make Hibernate aware of how to work with them.
To provide for class mapping manipulation we create an interface. Let's call it CustomizableEntityManager; its name should reflect its purpose - managing a business entity, its contents and attributes:
package com.enterra.customfieldsdemo;
import org.hibernate.mapping.Component;
public interface CustomizableEntityManager {
public static String CUSTOM_COMPONENT_NAME = "customProperties";
void addCustomField(String name);
void removeCustomField(String name);
Component getCustomProperties();
Class getEntityClass();
}
Step 9 - Interface CustomizableEntityManager
The main methods of the interface are void addCustomField(String name) and void removeCustomField(String name). These create and remove the custom field in the mapping of the corresponding class.
Below is the way to implement the interface:
package com.enterra.customfieldsdemo;
import org.hibernate.cfg.Configuration;
import org.hibernate.mapping.*;
import java.util.Iterator;
public class CustomizableEntityManagerImpl implements CustomizableEntityManager {
private Component customProperties;
private Class entityClass;
public CustomizableEntityManagerImpl(Class entityClass) {
this.entityClass = entityClass;
}
public Class getEntityClass() {
return entityClass;
}
public Component getCustomProperties() {
if (customProperties == null) {
Property property = getPersistentClass().getProperty(CUSTOM_COMPONENT_NAME);
customProperties = (Component) property.getValue();
}
return customProperties;
}
public void addCustomField(String name) {
SimpleValue simpleValue = new SimpleValue();
simpleValue.addColumn(new Column("fld_" + name));
simpleValue.setTypeName(String.class.getName());
PersistentClass persistentClass = getPersistentClass();
simpleValue.setTable(persistentClass.getTable());
Property property = new Property();
property.setName(name);
property.setValue(simpleValue);
getCustomProperties().addProperty(property);
updateMapping();
}
public void removeCustomField(String name) {
Iterator propertyIterator = customProperties.getPropertyIterator();
while (propertyIterator.hasNext()) {
Property property = (Property) propertyIterator.next();
if (property.getName().equals(name)) {
propertyIterator.remove();
updateMapping();
return;
}
}
}
private synchronized void updateMapping() {
MappingManager.updateClassMapping(this);
HibernateUtil.getInstance().reset();
// updateDBSchema();
}
private PersistentClass getPersistentClass() {
return HibernateUtil.getInstance().getClassMapping(this.entityClass);
}
}
Step 10 - implementing interface CustomizableEntityManager
First of all, note that when creating a CustomizableEntityManager we specify the business entity class the manager will operate on. This class is passed as a parameter to the CustomizableEntityManagerImpl constructor:
private Class entityClass;
public CustomizableEntityManagerImpl(Class entityClass) {
this.entityClass = entityClass;
}
public Class getEntityClass() {
return entityClass;
}
Step 11 - constructor of CustomizableEntityManagerImpl
Now let's look more closely at how the method void addCustomField(String name) is implemented:
public void addCustomField(String name) {
SimpleValue simpleValue = new SimpleValue();
simpleValue.addColumn(new Column("fld_" + name));
simpleValue.setTypeName(String.class.getName());
PersistentClass persistentClass = getPersistentClass();
simpleValue.setTable(persistentClass.getTable());
Property property = new Property();
property.setName(name);
property.setValue(simpleValue);
getCustomProperties().addProperty(property);
updateMapping();
}
Step 12 - creating custom field.
As we can see from the implementation, Hibernate offers rich options for working with the properties of persistent objects and their representation in the DB. The essence of the method:
1) We create a SimpleValue that lets us denote how the value of this custom field will be stored in the DB - in which column of which table:
SimpleValue simpleValue = new SimpleValue();
simpleValue.addColumn(new Column("fld_" + name));
simpleValue.setTypeName(String.class.getName());
PersistentClass persistentClass = getPersistentClass();
simpleValue.setTable(persistentClass.getTable());
Step 13 - creating a new column of the table.
2) We create a property of the persistent object and add it to the dynamic component (!) that we planned to use for this purpose:
Property property = new Property();
property.setName(name);
property.setValue(simpleValue);
getCustomProperties().addProperty(property);
Step 14 - creating object property.
3) Finally, we make our application perform the corresponding changes in the xml files and update the Hibernate configuration. This is done via the method updateMapping().
It is necessary to clarify the purpose of two other getter methods used in the code above. The first is getCustomProperties():
public Component getCustomProperties() {
if (customProperties == null) {
Property property = getPersistentClass().getProperty(CUSTOM_COMPONENT_NAME);
customProperties = (Component) property.getValue();
}
return customProperties;
}
Step 15 - getting CustomProperties as Component.
This method finds and returns the Component object corresponding to the <dynamic-component> tag in the mapping of our business entity.
The second method is updateMapping():
private synchronized void updateMapping() {
MappingManager.updateClassMapping(this);
HibernateUtil.getInstance().reset();
// updateDBSchema();
}
Step 16 - method updateMapping().
This method is in charge of storing the updated mapping of the persistent class and refreshing Hibernate's configuration state so that the changes we make take effect.
By the way, we should get back to this line of the Hibernate configuration:
<property name="hibernate.hbm2ddl.auto">update</property>
If this line were missing, we would have to execute the DB schema updates ourselves using Hibernate's utilities. Using this setting allows us to avoid that.
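If you prefer not to rely on auto-update, the commented-out updateDBSchema() call could look roughly like this sketch (it assumes the Configuration is made accessible from HibernateUtil; the SchemaUpdate tool is already imported in Step 6):
// Sketch: apply the modified mapping to the DB schema on demand
private void updateDBSchema() {
    Configuration configuration = HibernateUtil.getInstance().getConfiguration();
    SchemaUpdate schemaUpdate = new SchemaUpdate(configuration);
    schemaUpdate.execute(true, true); // print the SQL and execute it
}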
Saving mapping
Modifications made to the mapping at run time are not saved into the corresponding xml mapping file by themselves, so for the changes to be active at the next launch of the application we need to save them to the mapping file manually.
To do this we use the class MappingManager, whose main purpose is to save the mapping of the designated business entity to its xml mapping file:
package com.enterra.customfieldsdemo;
import com.enterra.customfieldsdemo.domain.CustomizableEntity;
import org.hibernate.Session;
import org.hibernate.mapping.Column;
import org.hibernate.mapping.Property;
import org.hibernate.type.Type;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;
import java.util.Iterator;
public class MappingManager {
public static void updateClassMapping(CustomizableEntityManager entityManager) {
try {
Session session = HibernateUtil.getInstance().getCurrentSession();
Class<? extends CustomizableEntity> entityClass = entityManager.getEntityClass();
String file = entityClass.getResource(entityClass.getSimpleName() + ".hbm.xml").getPath();
Document document = XMLUtil.loadDocument(file);
NodeList componentTags = document.getElementsByTagName("dynamic-component");
Node node = componentTags.item(0);
XMLUtil.removeChildren(node);
Iterator propertyIterator = entityManager.getCustomProperties().getPropertyIterator();
while (propertyIterator.hasNext()) {
Property property = (Property) propertyIterator.next();
Element element = createPropertyElement(document, property);
node.appendChild(element);
}
XMLUtil.saveDocument(document, file);
} catch (Exception e) {
e.printStackTrace();
}
}
private static Element createPropertyElement(Document document, Property property) {
Element element = document.createElement("property");
Type type = property.getType();
element.setAttribute("name", property.getName());
element.setAttribute("column", ((Column) property.getColumnIterator().next()).getName());
element.setAttribute("type", type.getReturnedClass().getName());
element.setAttribute("not-null", String.valueOf(false));
return element;
}
}
Step 17 - the utility to update mapping of the persistent class.
The class literally performs the following:
Determines the location of, and loads, the xml mapping for the designated business entity into a DOM Document object for further manipulation;
Finds the <dynamic-component> element of this document - in particular, this is where we store the custom fields whose contents we change;
Deletes (!) all child elements from this element;
For each persistent property contained in our component in charge of custom field storage, creates a document element and sets the element's attributes from the corresponding property;
Saves the newly created mapping file.
When manipulating XML we use (as we can see from the code) the class XMLUtil, which in general can be implemented in any way, as long as it correctly loads and saves the xml file.
Our implementation is given below:
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;
import org.w3c.dom.Document;
import org.xml.sax.SAXException;
import javax.xml.parsers.ParserConfigurationException;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.transform.TransformerException;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.Transformer;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.dom.DOMSource;
import java.io.IOException;
import java.io.FileOutputStream;
public class XMLUtil {
public static void removeChildren(Node node) {
NodeList childNodes = node.getChildNodes();
int length = childNodes.getLength();
for (int i = length - 1; i > -1; i--)
node.removeChild(childNodes.item(i));
}
public static Document loadDocument(String file)
throws ParserConfigurationException, SAXException, IOException {
DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
DocumentBuilder builder = factory.newDocumentBuilder();
return builder.parse(file);
}
public static void saveDocument(Document dom, String file)
throws TransformerException, IOException {
TransformerFactory tf = TransformerFactory.newInstance();
Transformer transformer = tf.newTransformer();
transformer.setOutputProperty(OutputKeys.DOCTYPE_PUBLIC, dom.getDoctype().getPublicId());
transformer.setOutputProperty(OutputKeys.DOCTYPE_SYSTEM, dom.getDoctype().getSystemId());
DOMSource source = new DOMSource(dom);
StreamResult result = new StreamResult();
FileOutputStream outputStream = new FileOutputStream(file);
result.setOutputStream(outputStream);
transformer.transform(source, result);
outputStream.flush();
outputStream.close();
}
}
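To tie the pieces together, here is a hypothetical end-to-end usage of the mechanism (not part of the original article):
// Add a custom field to Contact, then persist a value into it
CustomizableEntityManager contactManager = new CustomizableEntityManagerImpl(Contact.class);
contactManager.addCustomField("birthday"); // updates the mapping and resets Hibernate

Session session = HibernateUtil.getInstance().getCurrentSession();
Transaction tx = session.beginTransaction();
Contact contact = new Contact();
contact.setName("John Doe");
contact.setValueOfCustomField("birthday", "1980-01-01"); // ends up in column fld_birthday
session.save(contact);
tx.commit();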
Source: please refer to this article for more detail.
I am using Spring 4.0 and Hibernate 4.3.
The problem is that if I get the Hibernate Session object through sessionFactory.openSession(), then I can initialize the lazy object anywhere, including a JSP, but then I have to manage the Hibernate session on my own, which is bad practice.
But if I create the session via sessionFactory.getCurrentSession(), then I get the benefit of Spring managing the session for me, but then I can't lazily initialize the object outside the transactional boundaries, such as in a JSP.
I don't want to explicitly set the query fetch mode to eager or join every time for every lazy object, and I also don't want to change the domain object annotations to eager.
Overall I am just asking: what is the correct way to deal with lazy initialization when using Spring with Hibernate? Please help.
Below is my logic to get lazy loaded data when needed.
import java.io.Serializable;
import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.orm.hibernate4.HibernateTemplate;
import org.springframework.stereotype.Service;
@Service
public class UserDAO implements GenericDao<User,String> {
@Autowired
private HibernateTemplate template;
@Override
public User load(final String id) {
return template.load(User.class,id);
}
@Override
public User get(final String id) {
return template.get(User.class,id);
}
public User getUserVideos(final String id) {
User user = template.get(User.class,id);
template.initialize(user.getVideo());
return user;
}
@Override
public Long count() {
return new Long(template.loadAll(User.class).size());
}
@Override
public void flush() {
template.flush();
}
}
And you can use the returned objects in your JSP or wherever you want.
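For example (assuming the User entity from the DAO above exposes a getVideo() association; the id and the view call are hypothetical):
User user = userDAO.getUserVideos("42"); // hypothetical id
// The video association was initialized inside the DAO while the session
// was open, so reading it here (e.g. in a JSP) no longer throws
// LazyInitializationException.
renderVideos(user.getVideo()); // hypothetical view-layer call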
I use the following technologies:
TestNG(6.9.10)
Spring(4.3.2.RELEASE)
Hibernate(5.1.0.Final)
Java 8
I test some functionality with integration tests, and I need to check the entity for correct save/update/delete or any other changes. Here is the sessionFactory configuration in my .xml:
<bean id="sessionFactory" class="org.springframework.orm.hibernate5.LocalSessionFactoryBean"
p:dataSource-ref="dataSource" p:hibernateProperties="jdbcProperties">
<property name="packagesToScan" value="my.package"/>
</bean>
and test class example:
@ContextConfiguration(locations = {"classpath:/applicationContext-test.xml",
"classpath:/applicationContext-dao.xml",
"classpath:/applicationContext-orm.xml"})
public class AccountServiceTest extends AbstractTransactionalTestNGSpringContextTests {
@Autowired
private SomeService someService;
@Autowired
private SessionFactory sessionFactory;
@Test
public void updateEntity() {
//given
Long entityId = 1L;
SomeClass expected = someService.get(entityId);
String newPropertyValue = "new value";
//when
someService.changeEntity(expected, newPropertyValue);
// Manual flush is required to avoid false positive in test
sessionFactory.getCurrentSession().flush();
//then
expected = someService.get(entityId);
Assert.assertEquals(expected.getChangedProperty(), newPropertyValue);
}
service method:
@Transactional
@Override
public int changeEntity(SomeClass entity, String newPropertyValue) {
return dao().executeNamedQuery(REFRESH_ACCESS_TIME_QUERY,
CollectionUtils.arrayToMap("id", entity.getId(), "myColumn", newPropertyValue));
}
dao:
@Override
public int executeNamedQuery(final String query, final Map<String, Object> parameters) {
Query queryObject = sessionFactory.getCurrentSession().getNamedQuery(query);
if (parameters != null) {
for (Map.Entry<String, Object> entry : parameters.entrySet()) {
NamedQueryUtils.applyNamedParameterToQuery(queryObject, entry.getKey(), entry.getValue());
}
}
return queryObject.executeUpdate();
}
But my entity property didn't change after flush()
As described here, one suggestion is to replace the @Autowired SessionFactory with a @PersistenceContext EntityManager and call flush() on the EntityManager - but I can't do this: I can't transform the SessionFactory into an EntityManager, and I don't want to introduce an EntityManager into my application, because that would mean changing my .xml config file and more.
Are there any other solutions to this problem?
Your code is actually working as expected.
Your test method is transactional, and thus your Session is alive during the whole execution of the test method. The Session is also Hibernate's 1st-level cache: when an entity is loaded from the database, it is put into the Session.
So the line SomeClass expected = someService.get(entityId); will load the entity from the database and also put it in the Session.
Now the line expected = someService.get(entityId); first checks (well, actually the dao method underneath does) whether an entity of the requested type with that id is already present in the Session; if so, it simply returns it. It will not query the database!
The main problem is that you are using Hibernate in a wrong way: you are basically bypassing Hibernate with the way you update your database. You should update your entity and persist it; you should not write queries to update the database!
Annotated test method
@Test
public void updateEntity() {
//given
Long entityId = 1L;
SomeClass expected = someService.get(entityId); // load from db and put in Sesion
String newPropertyValue = "new value";
//when
someService.changeEntity(expected, newPropertyValue); // updates directly in the database, bypassing the Session and the entity
// Manual flush is required to avoid false positive in test
sessionFactory.getCurrentSession().flush();
//then
expected = someService.get(entityId); // return entity from Session
Assert.assertEquals(expected.getChangedProperty(), newPropertyValue);
}
To only fix the test add a call to clear() after the flush().
sessionFactory.getCurrentSession().clear();
However what you actually should do is stop writing code like that and use Hibernate and persistent entities in the correct way.
@Test
public void updateEntity() {
//given
Long entityId = 1L;
String newPropertyValue = "new value";
SomeClass expected = someService.get(entityId);
expected.setMyColumn(newPropertyValue);
//when
someService.changeEntity(expected);
sessionFactory.getCurrentSession().flush();
// now you should use a SQL query to verify the state in the DB.
Map<String, Object> dbValues = getJdbcTemplate().queryForMap("select * from someClass where id=?", entityId);
//then
Assert.assertEquals(dbValues.get("myColumn"), newPropertyValue);
}
Your dao method should look something like this.
public void changeEntity(SomeClass entity) {
sessionFactory.getCurrentSession().saveOrUpdate(entity);
}
I am using the latest spring-data-mongodb (1.1.0.M2) and the latest Mongo driver (2.9.0-RC1). I have a situation where multiple clients connect to my application, and I want to give each one its own "schema/database" in the same Mongo server. This is not a very difficult task to achieve if I use the driver directly:
Mongo mongo = new Mongo( new DBAddress( "localhost", 27017 ) );
DB client1DB = mongo.getDB( "client1" );
DBCollection client1TTestCollection = client1DB.getCollection( "test" );
long client1TestCollectionCount = client1TTestCollection.count();
DB client2DB = mongo.getDB( "client2" );
DBCollection client2TTestCollection = client2DB.getCollection( "test" );
long client2TestCollectionCount = client2TTestCollection.count();
See, easy. But spring-data-mongodb does not allow an easy way to use multiple databases. The preferred way of setting up a connection to Mongo is to extend the AbstractMongoConfiguration class:
You will see that you override the following method:
getDatabaseName()
So it forces you to use one database name. The repository interfaces that you then build use that database name inside the MongoTemplate that is passed into the SimpleMongoRepository class.
Where on earth would I stick multiple database names? I would have to create multiple database names, multiple MongoTemplates (one per database name), and multiple other config classes. And that still doesn't get my repository interfaces to use the correct template. If anyone has tried such a thing, let me know. If I figure it out I will post the answer here.
Thanks.
Here is a link to an article I think is what you are looking for http://michaelbarnesjr.wordpress.com/2012/01/19/spring-data-mongo/
The key is to provide multiple templates.
Configure a template for the first database:
<bean id="vehicleTemplate" class="org.springframework.data.mongodb.core.MongoTemplate">
<constructor-arg ref="mongoConnection"/>
<constructor-arg name="databaseName" value="vehicledatabase"/>
</bean>
Then configure a template for the second database:
<bean id="imageTemplate" class="org.springframework.data.mongodb.core.MongoTemplate">
<constructor-arg ref="mongoConnection"/>
<constructor-arg name="databaseName" value="imagedatabase"/>
</bean>
<bean id="vehicleTemplate" class="org.springframework.data.mongodb.core.MongoTemplate">
<constructor-arg ref="mongoConnection"/>
<constructor-arg name="databaseName" value="vehicledatabase"/>
</bean>
Now, you need to tell Spring where your repositories are so it can inject them. They must all be in the same directory. I tried to have them in different sub-directories, and it did not work correctly. So they are all in the repository directory.
<mongo:repositories base-package="my.package.repository">
<mongo:repository id="imageRepository" mongo-template-ref="imageTemplate"/>
<mongo:repository id="carRepository" mongo-template-ref="vehicleTemplate"/>
<mongo:repository id="truckRepository" mongo-template-ref="vehicleTemplate"/>
</mongo:repositories>
Each repository is an Interface and is written as follows (yes, you can leave them blank):
@Repository
public interface ImageRepository extends MongoRepository<Image, String> {
}
@Repository
public interface TruckRepository extends MongoRepository<Truck, String> {
}
The name of the private variable imageRepository matches the repository id in the configuration above. Image.java will be saved to the image collection within the imagedatabase database.
Here is how you can find, insert, and delete records:
@Service
public class ImageService {
@Autowired
private ImageRepository imageRepository;
}
By Autowiring you match the variable name to the name (id) in your configuration.
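The snippet above only shows the wiring; the actual find/insert/delete calls are the standard MongoRepository ones (shown with the Spring Data 1.x method names):
public Image save(Image image) {
    return imageRepository.save(image);  // insert or update
}

public Image find(String id) {
    return imageRepository.findOne(id);  // findById(id).orElse(null) in newer versions
}

public void delete(String id) {
    imageRepository.delete(id);          // deleteById(id) in newer versions
}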
You may want to sub-class SimpleMongoDbFactory and strategize how the default DB, as returned by getDb(), is resolved. One option is to use a thread-local variable to decide on the DB to use, instead of using multiple MongoTemplates.
Something like this:
public class ThreadLocalDbNameMongoDbFactory extends SimpleMongoDbFactory {
private static final ThreadLocal<String> dbName = new ThreadLocal<String>();
private final String defaultName; // initialized in the constructor
// omitted constructor for clarity
public static void setDefaultNameForCurrentThread(String tlName) {
dbName.set(tlName);
}
public static void clearDefaultNameForCurrentThread() {
dbName.remove();
}
public DB getDb() {
String tlName = dbName.get();
return super.getDb(tlName != null ? tlName : defaultName);
}
}
Then, override mongoDBFactory() in your #Configuration class that extends from AbstractMongoConfiguration like so:
@Bean
@Override
public MongoDbFactory mongoDbFactory() throws Exception {
if (getUserCredentials() == null) {
return new ThreadLocalDbNameMongoDbFactory(mongo(), getDatabaseName());
} else {
return new ThreadLocalDbNameMongoDbFactory(mongo(), getDatabaseName(), getUserCredentials());
}
}
In your client code (maybe a ServletFilter or some such) you will need to call:
ThreadLocalDbNameMongoDbFactory.setDefaultNameForCurrentThread()
before doing any Mongo work and subsequently reset it with:
ThreadLocalDbNameMongoDbFactory.clearDefaultNameForCurrentThread()
after you are done.
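A hypothetical servlet filter doing just that (the tenant lookup from the request is an assumption):
import java.io.IOException;
import javax.servlet.*;

public class TenantDbFilter implements Filter {
    @Override
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        // Resolve the tenant database name, e.g. from a request parameter (assumption)
        String tenant = req.getParameter("tenant");
        ThreadLocalDbNameMongoDbFactory.setDefaultNameForCurrentThread(tenant);
        try {
            chain.doFilter(req, res);
        } finally {
            // Always clear, so pooled threads don't leak the tenant name
            ThreadLocalDbNameMongoDbFactory.clearDefaultNameForCurrentThread();
        }
    }

    @Override
    public void init(FilterConfig filterConfig) {}

    @Override
    public void destroy() {}
}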
So after much research and experimentation, I have concluded that this is not yet possible with the current spring-data-mongodb project. I tried baja's method above and ran into a specific hurdle: the MongoTemplate runs its ensureIndexes() method from within its constructor. This method calls out to the database to make sure annotated indexes exist. The constructor for MongoTemplate gets called when Spring starts up, so I never even have a chance to set a ThreadLocal variable. I would have to have a default already set when Spring starts, then change it when a request comes in. This is not workable because I don't want, nor do I have, a default database.
All was not lost, though. Our original plan was to have each client running on its own application server, pointed at its own MongoDB database on the MongoDB server. We can then provide a -Dprovider= system variable and each server runs pointing at only one database.
We were instructed to have a multi-tenant application, hence the attempt at the ThreadLocal variable. But since it did not work, we were able to run the application the way we had originally designed.
I believe there is a way to make this all work; it just takes more than is described in the other posts. You have to make your own RepositoryFactoryBean - there is an example in the Spring Data MongoDB Reference Docs. You would still have to implement your own MongoTemplate and delay or remove the ensureIndexes() call, and you would have to rewrite a few classes to make sure your MongoTemplate is called instead of Spring's. In other words, a lot of work - work that I would like to see happen, or even do; I just did not have the time.
Thanks for the responses.
The spot to look at is the MongoDbFactory interface. The basic implementation of it takes a Mongo instance and works with that throughout the entire application lifetime. To achieve per-thread (and thus per-request) database usage, you'll probably have to implement something along the lines of AbstractRoutingDataSource: a template method looks up the tenant per invocation (ThreadLocal-bound, I guess) and then selects a Mongo instance from a set of predefined ones, or applies some custom logic to come up with a fresh one for a new tenant, etc.
Keep in mind that MongoDbFactory is usually used through its getDb() method. However, there are features in MongoDB that require us to provide getDb(String name): DBRefs (something like a foreign key in the relational world) can point to documents in an entirely different database. So if you're doing the delegation, either avoid using that feature (I think DBRefs pointing to another DB are the only places calling getDb(name)) or handle it explicitly.
From a configuration point of view, you could either simply override mongoDbFactory() entirely, or not extend the base class at all and come up with your own Java-based configuration.
I used different DBs via Java config; this is how I did it:
@Bean
public MongoDbFactory mongoRestDbFactory() throws Exception {
MongoClientURI uri=new MongoClientURI(environment.getProperty("mongo.uri"));
return new SimpleMongoDbFactory(uri);
}
@Override
public String getDatabaseName() {
return "rest";
}
@Override
public @Bean(name = "secondaryMongoTemplate") MongoTemplate mongoTemplate() throws Exception { // the template names must differ so the bean container can tell them apart
return new MongoTemplate(mongoRestDbFactory());
}
And the other was like this:
@Bean
public MongoDbFactory restDbFactory() throws Exception {
MongoClientURI uri = new MongoClientURI(environment.getProperty("mongo.urirestaurants"));
return new SimpleMongoDbFactory(uri);
}
@Override
public String getDatabaseName() {
return "rest";
}
@Override
public @Bean(name = "primaryMongoTemplate") MongoTemplate mongoTemplate() throws Exception {
return new MongoTemplate(restDbFactory());
}
So when I need to change my database, I only select which config to use.
An example with Spring Boot v2.6.2:
Content of your "application.yml" file:
spring:
  application:
    name: myApp
  data:
    mongodb:
      host: localhost
      port: 27017
      database: FirstDatabase
    mongodbreference:
      host: localhost
      port: 27017
      database: SecondDatabase
In a class named "MultipleMongoProperties.java":
package your.packagename;
import lombok.Data;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.autoconfigure.mongo.MongoProperties;
import org.springframework.boot.context.properties.ConfigurationProperties;
@Data
@ConfigurationProperties(prefix = "spring.data")
public class MultipleMongoProperties {
private MongoProperties mongodb = new MongoProperties();
private MongoProperties mongodbreference = new MongoProperties();
}
And finally the class "MultipleMongoConfig.java":
package your.package;
import com.mongodb.client.MongoClients;
import lombok.RequiredArgsConstructor;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.boot.autoconfigure.mongo.MongoProperties;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.data.mongodb.MongoDatabaseFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoClientDatabaseFactory;
@Configuration
@RequiredArgsConstructor
@EnableConfigurationProperties(MultipleMongoProperties.class)
public class MultipleMongoConfig {
private static final Logger LOG = LoggerFactory.getLogger(MultipleMongoConfig.class);
private final MultipleMongoProperties mongoProperties;
private MongoProperties mongoDestination;
#Bean("referenceMongoTemplate")
#Primary
public MongoTemplate referenceMongoTemplate() {
return new MongoTemplate(referenceFactory(this.mongoProperties.getMongodbreference()));
}
#Bean("destinationMongoTemplate")
public MongoTemplate destinationMongoTemplate() {
return new MongoTemplate(destinationFactory(this.mongoProperties.getMongodb()));
}
public MongoDatabaseFactory referenceFactory(final MongoProperties mongo) {
this.setUriToMongoProperties(mongo);
return new SimpleMongoClientDatabaseFactory(MongoClients.create(mongo.getUri()), mongo.getDatabase());
}
public MongoDatabaseFactory destinationFactory(final MongoProperties mongo) {
this.setUriToMongoProperties(mongo);
return new SimpleMongoClientDatabaseFactory(MongoClients.create(mongo.getUri()), mongo.getDatabase());
}
private void setUriToMongoProperties(MongoProperties mongo) {
mongo.setUri("mongodb://" + mongo.getUsername() + ":" + String.valueOf(mongo.getPassword()) + "#" + mongo.getHost() + ":" + mongo.getPort() + "/" + mongo.getAuthenticationDatabase());
}
}
In another class, you just have to inject the two templates:
package your.package;
import com.mongodb.bulk.BulkWriteResult;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.stereotype.Component;
@Component
public class CollectionRepositoryImpl implements CollectionsRepository {
@Autowired
@Qualifier("referenceMongoTemplate")
private MongoTemplate referenceMongoTemplate;
@Autowired
@Qualifier("destinationMongoTemplate")
private MongoTemplate destinationMongoTemplate;
...
As far as I understand, you want more flexibility in changing the current db on the fly.
I've linked a project that implements multi tenancy in a simple way.
It could be used as a starting point for the application.
It extends SimpleMongoDbFactory and provides a custom getDb() method to resolve the correct DB to use at a given moment. It can be improved in many ways, for example by retrieving the DB details from the HttpSession via Spring Session, which could for instance be backed by Redis.
To have different MongoTemplates using different DBs at the same time, you might change the scope of your mongoDbFactory bean to session.
References:
multi-tenant-spring-mongodb