I want to know when Hibernate flushes the persistence context when I call session = sessionFactory.getCurrentSession().
The thing is, I have two methods in my DAO calling getCurrentSession(). When I process the update, the call to getCurrentSession() returns a session whose persistence context is empty:
SessionImpl(PersistenceContext[entityKeys=[],collectionKeys=[]];...)
How can I make these entities persist from the select method to the update method?
Here are my methods:
public List<SystemConfiguration> getListConfigurations() {
    List<SystemConfiguration> lista = new ArrayList<SystemConfiguration>();
    Session session = null;
    Query query = null;
    String sql = "from SystemConfiguration where description = :desc";
    try {
        /* BEFORE
        session = SessionFactoryUtil.getInstance().getCurrentSession();
        @SuppressWarnings("unused")
        Transaction ta = session.beginTransaction(); */
        // FOLLOWING LINE SOLVED THE PROBLEM
        session = SessionFactoryUtil.getInstance().openSession();
        query = session.createQuery(sql);
        query.setString("desc", "configuracion");
        lista = query.list();
        return lista;
    } catch (Exception e) {
        e.printStackTrace();
        return null;
    }
}
public void updateConfigurations(List<SystemConfiguration> configs) throws Exception {
    Session sess = null;
    Transaction tx = null;
    try {
        // BEFORE
        // sess = SessionFactoryUtil.getInstance().getCurrentSession();
        // FOLLOWING LINE SOLVED THE PROBLEM
        sess = SessionFactoryUtil.getInstance().openSession(new SystemConfigurationInterceptor());
        tx = sess.beginTransaction();
        for (SystemConfiguration sys : configs) {
            sess.update(sys);
        }
        tx.commit();
    } // try
    catch (Exception e) {
        e.printStackTrace();
        if (tx != null && tx.isActive()) {
            tx.rollback();
        } // if
        throw e;
    }
}
And this is my interceptor:
public class SystemConfigurationInterceptor extends EmptyInterceptor {

    private int updates;
    private int creates;
    private int loads;

    public void onDelete(Object entity,
                         Serializable id,
                         Object[] state,
                         String[] propertyNames,
                         Type[] types) {
        // do nothing
    }

    // This method is called when an entity gets updated.
    public boolean onFlushDirty(Object entity,
                                Serializable id,
                                Object[] currentState,
                                Object[] previousState,
                                String[] propertyNames,
                                Type[] types) {
        if (entity instanceof SystemConfiguration) {
            updates++;
            for (int i = 0; i < propertyNames.length; i++) {
                if ("updated_at".equals(propertyNames[i])) {
                    currentState[i] = new Timestamp(Calendar.getInstance().getTime().getTime());
                    return true;
                }
            }
        }
        return false;
    }

    public boolean onLoad(Object entity,
                          Serializable id,
                          Object[] state,
                          String[] propertyNames,
                          Type[] types) {
        if (entity instanceof SystemConfiguration) {
            loads++;
        }
        return false;
    }

    // This method is called when an entity gets created.
    public boolean onSave(Object entity,
                          Serializable id,
                          Object[] state,
                          String[] propertyNames,
                          Type[] types) {
        if (entity instanceof SystemConfiguration) {
            creates++;
            for (int i = 0; i < propertyNames.length; i++) {
                if ("updated_at".equals(propertyNames[i])) {
                    state[i] = new Timestamp(Calendar.getInstance().getTime().getTime());
                    return true;
                }
            }
        }
        return false;
    }

    public void afterTransactionCompletion(Transaction tx) {
        if (tx.wasCommitted()) {
            System.out.println("Creations: " + creates + ", Updates: " + updates + ", Loads: " + loads);
        }
        updates = 0;
        creates = 0;
        loads = 0;
    }
}
Hibernate will flush when you tell it to and when the current transaction is "closed" (usually when the DB connection is returned to the pool somehow).
So the answer to your question depends on which framework you use. With Spring, the session is flushed when the outermost @Transactional method returns.
Your "solution" above will not work for long, since it never closes the session. While it returns a result, it leaks a database connection, so after a few calls you will run out of connections.
Also, your question doesn't really make sense: a SELECT doesn't change objects, so they don't need to be "persisted" before you change them.
After you change them in updateConfigurations(), Hibernate can choose not to write them to the database immediately and just update the cache.
Eventually, if you have configured everything correctly, Spring will commit the transaction and that will flush the cache. But when you use Spring, you should never open and close sessions yourself, because that interferes with what Spring is doing.
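For illustration, here is a minimal sketch of the Spring approach; the service class, the wiring and the names are assumptions for the example, not your actual code:

import java.util.List;
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

// Hypothetical Spring-managed service; Spring opens, flushes and closes the session.
@Service
public class ConfigurationService {

    @Autowired
    private SessionFactory sessionFactory;

    // The persistence context is flushed and the transaction committed
    // when this outermost method returns.
    @Transactional
    public void updateConfigurations(List<SystemConfiguration> configs) {
        Session session = sessionFactory.getCurrentSession();
        for (SystemConfiguration sys : configs) {
            session.update(sys);
        }
        // no flush(), commit() or close() here; Spring handles all three
    }
}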
I'm using Ignite cache. I want to cache a view where the ID is not relevant, so implementing loadCache seems tricky to me when there is no ID.
How should I update the example below?
public class CacheJdbcPersonStore extends CacheStoreAdapter<Long, Person> {
    ...
    // This method is called whenever "IgniteCache.loadCache()" or
    // "IgniteCache.localLoadCache()" methods are called.
    @Override public void loadCache(IgniteBiInClosure<Long, Person> clo, Object... args) {
        if (args == null || args.length == 0 || args[0] == null)
            throw new CacheLoaderException("Expected entry count parameter is not provided.");

        final int entryCnt = (Integer) args[0];

        try (Connection conn = connection()) {
            try (PreparedStatement st = conn.prepareStatement("select * from PERSONS")) {
                try (ResultSet rs = st.executeQuery()) {
                    int cnt = 0;
                    while (cnt < entryCnt && rs.next()) {
                        Person person = new Person(rs.getLong(1), rs.getString(2), rs.getString(3));
                        clo.apply(person.getId(), person);
                        cnt++;
                    }
                }
            }
        }
        catch (SQLException e) {
            throw new CacheLoaderException("Failed to load values from cache store.", e);
        }
    }
    ...
}
The issue in my logic is the clo.apply(person.getId(), person) part: my view doesn't have an ID.
You need some unique ID to store data in Ignite. If there is nothing suitable in the actual data, your options are (see the sketch after this list):
UUID.randomUUID()
A simple counter (id++) or LongAdder/AtomicLong - works only if you are loading from a single node
IgniteAtomicSequence - works across the entire cluster
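For example, here is a minimal sketch of the counter option, assuming a single node calls loadCache(); the view name PERSON_VIEW and the column mapping are placeholders, everything else mirrors the example above:

import java.util.concurrent.atomic.AtomicLong;

public class CacheJdbcViewStore extends CacheStoreAdapter<Long, Person> {
    ...
    // Synthetic key source; safe only because a single node loads the cache.
    private final AtomicLong keyGen = new AtomicLong();

    @Override public void loadCache(IgniteBiInClosure<Long, Person> clo, Object... args) {
        try (Connection conn = connection();
             PreparedStatement st = conn.prepareStatement("select * from PERSON_VIEW");
             ResultSet rs = st.executeQuery()) {
            while (rs.next()) {
                // The view has no natural ID, so generate one per loaded row.
                Person person = new Person(0L, rs.getString(1), rs.getString(2));
                clo.apply(keyGen.incrementAndGet(), person);
            }
        }
        catch (SQLException e) {
            throw new CacheLoaderException("Failed to load values from cache store.", e);
        }
    }
    ...
}

For cluster-wide loading, replace the AtomicLong with an IgniteAtomicSequence, e.g. ignite.atomicSequence("viewKeys", 0, true).incrementAndGet().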
I want to audit my entity updates, so I have implemented EmptyInterceptor.
My onFlushDirty() method is not executing, but afterTransactionCompletion() does execute.
I'm using Spring 4.1 and Hibernate 5.0.
I have not done any configuration other than @Component in the configuration file, yet afterTransactionCompletion() does get executed.
What am I missing here?
Also, how can I intercept the query.executeUpdate() event?
My Interceptor class is as follows:
@Component
public class AuditLogInterceptor extends EmptyInterceptor {

    private static final long serialVersionUID = 1L;

    @Override
    public boolean onFlushDirty(Object entity, Serializable id,
            Object[] currentState, Object[] previousState,
            String[] propertyNames, Type[] types) {
        System.out.println("AuditLogInterceptor.onFlushDirty()");
        System.out.println();
        System.out.println("Property Names :- ");
        for (String propertyName : propertyNames) {
            System.out.print(", " + propertyName);
        }
        System.out.println();
        System.out.println("Current State :- ");
        for (Object current : currentState) {
            System.out.print(", " + String.valueOf(current));
        }
        System.out.println();
        System.out.println("Previous State :- ");
        for (Object previous : previousState) {
            System.out.print(", " + String.valueOf(previous));
        }
        return true;
        // return super.onFlushDirty(entity, id, currentState, previousState,
        //         propertyNames, types);
    }

    @Override
    public void afterTransactionCompletion(Transaction tx) {
        System.out.println("AuditLogInterceptor.afterTransactionCompletion()");
        super.afterTransactionCompletion(tx);
    }

    @Override
    public boolean onSave(Object entity, Serializable id, Object[] state,
            String[] propertyNames, Type[] types) {
        System.out.println("AuditLogInterceptor.onSave()");
        System.out.println("Property Names :- " + Arrays.toString(propertyNames));
        System.out.println("States :- " + Arrays.toString(state));
        return super.onSave(entity, id, state, propertyNames, types);
    }

    @Override
    public void postFlush(@SuppressWarnings("rawtypes") Iterator entities) {
        System.out.println();
        System.out.println("AuditLogInterceptor.postFlush()");
        while (entities.hasNext()) {
            System.out.println("-----" + entities.next().getClass().getSimpleName());
        }
        System.out.println();
        super.postFlush(entities);
    }
}
Code in my DAO:
@Override
public boolean updateAssignment(Integer workTaskDetailId, short workTaskStatus) {
    Session session = null;
    Transaction transaction = null;
    boolean isUpdated = false;
    try {
        session = this.sessionFactory.withOptions().interceptor(new AuditLogInterceptor()).openSession();
        transaction = session.beginTransaction();
        String COMPLETION_DATE = "";
        if (workTaskStatus == 263)
            COMPLETION_DATE = ", wtd.completionDate = :completionDate ";
        final String UPDATE_WORKTASKSTATUS = "update WorkTaskDetail wtd set wtd.workTaskStatus = :workTaskStatus "
                + COMPLETION_DATE + "where wtd.workTaskDetailId = :workTaskDetailId ";
        Query query = session.createQuery(UPDATE_WORKTASKSTATUS).setShort("workTaskStatus", workTaskStatus)
                .setInteger("workTaskDetailId", workTaskDetailId);
        if (workTaskStatus == 263)
            query.setDate("completionDate", new Date());
        int updateCount = query.executeUpdate();
        if (updateCount > 0)
            isUpdated = true;
        if (session != null)
            session.flush();
        if (transaction != null && transaction.getStatus().equals(TransactionStatus.ACTIVE))
            transaction.commit();
    } catch (Exception exception) {
        if (transaction != null && transaction.getStatus().equals(TransactionStatus.ACTIVE))
            transaction.rollback();
        LOGGER.error("Message :- " + exception.getMessage());
        LOGGER.error("Root Cause :- " + exception.getCause());
        LOGGER.error(" ************************************************************");
    } finally {
        if (session != null)
            session.close();
    }
    return isUpdated;
}
The onFlushDirty method was not called because no entity was modified by the currently running persistence context.
Only managed entities can generate automatic UPDATE statements. In your case, you are executing a manual bulk UPDATE query, which is beyond the scope of Hibernate entity state transitions, so the interceptor has nothing to observe.
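If you want onFlushDirty to fire with your current interceptor, the change has to go through a managed entity instead of executeUpdate(). A minimal sketch, where the setter names on WorkTaskDetail are assumptions:

// Load the entity into the persistence context, mutate it, and let flush detect the change.
WorkTaskDetail wtd = session.get(WorkTaskDetail.class, workTaskDetailId);
wtd.setWorkTaskStatus(workTaskStatus);
if (workTaskStatus == 263) {
    wtd.setCompletionDate(new Date());
}
session.flush(); // onFlushDirty runs here for the dirty entity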
Therefore, you have two choices (a sketch of the first follows):
Use Hibernate Envers, as there is no need to write a homemade audit log on top of Hibernate.
Use Debezium, since the DB already features an audit log anyway (a.k.a. WAL, redo log, transaction log).
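For reference, a minimal Envers sketch; the entity name is taken from your query, everything else is assumed. Note that Envers, like your interceptor, only tracks entity state transitions, so it also requires the entity-based update shown above rather than a bulk executeUpdate():

import javax.persistence.Entity;
import javax.persistence.Id;
import org.hibernate.envers.Audited;

// With hibernate-envers on the classpath, every committed change to this
// entity is recorded automatically in a WorkTaskDetail_AUD table.
@Audited
@Entity
public class WorkTaskDetail {

    @Id
    private Integer workTaskDetailId;

    private short workTaskStatus;

    // remaining fields, getters and setters ...
}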
I have a CDI bean with a Java object which I use to display profile data from the database:
Parent Bean
#Named("DCProfileTabGeneralController")
#ViewScoped
public class DCProfileTabGeneral implements Serializable
{
public DCObj dc;
public class DCObj
{
private int componentStatsId; // COMPONENTSTATSID NUMBER(38,0)
........
// Default Constructor
public DCObj(){};
public DCObj(int componentStatsId....)
{
this.componentStatsId = componentStatsId;
.......
}
public int getComponentStatsId()
{
return componentStatsId;
}
public void setComponentStatsId(int componentStatsId)
{
this.componentStatsId = componentStatsId;
}
....
}
// Getters ------------------------------------------------------------------------------------
public DCObj getdcData()
{
return dc;
}
#PostConstruct
public void initData() throws SQLException
{
initDBData();
}
// Generate data Object from Oracle
public void initDBData() throws SQLException
{
dc = new DCObj(result.getInt("COMPONENTSTATSID"),
.........
}
}
Validator
#Named("ValidatorDatacenterController")
#ViewScoped
public class ValidatorDatacenter implements Validator, Serializable
{
public ValidatorDatacenter()
{
}
#Inject
private DCProfileTabGeneral profileTabGeneral;
// Validate Datacenter Name
public void validateDatacenterName(FacesContext context, UIComponent component, Object value) throws ValidatorException, SQLException
{
int componentStatsId = -1;
if (profileTabGeneral != null)
{
DCObj dcData = profileTabGeneral.dc;
if (dcData != null)
{
componentStatsId = dcData.getComponentStatsId();
}
}
if (componentStatsId == -1)
{
return;
}
String l;
String s;
if (value != null && !(s = value.toString().trim()).isEmpty())
{
if (s.length() > 18)
{
throw new ValidatorException(new FacesMessage(FacesMessage.SEVERITY_ERROR,
" Value is too long! (18 digits max)", null));
}
if (ds == null)
{
throw new SQLException("Can't get data source");
}
Connection conn = null;
PreparedStatement ps = null;
ResultSet rs;
int resComponentStatsId = -1;
try
{
conn = ds.getConnection();
// if componentsstatsid <> edited componentstatsid in jsf -> throw validator exception
ps = conn.prepareStatement("SELECT componentstatsid from COMPONENTSTATS where NAME = ?");
ps.setString(1, s);
rs = ps.executeQuery();
while (rs.next())
{
resComponentStatsId = rs.getInt(1);
}
if (resComponentStatsId != -1 && resComponentStatsId != componentStatsId)
{
throw new ValidatorException(new FacesMessage(FacesMessage.SEVERITY_ERROR,
" '" + s + "' is already in use!", null));
}
}
catch (SQLException x)
{
throw new ValidatorException(new FacesMessage(FacesMessage.SEVERITY_ERROR,
" SQL error!", null));
}
finally
{
if (ps != null)
{
ps.close();
}
if (conn != null)
{
conn.close();
}
}
}
else
{
throw new ValidatorException(new FacesMessage(FacesMessage.SEVERITY_ERROR,
" This field cannot be empty!", null));
}
}
}
I have a custom validator which checks the input values from the profile page against the database. I tried to get the Java object from the parent bean using @Inject and to pass the object to the validator. It turns out that I get an empty Java object every time I use @Inject.
I also tested getting an int via CDI, which works, but when I tried to get the Java object again, I still got an empty object.
Can you tell me what the proper way is to get a Java object from a CDI bean? Why can't I get the Java object from the CDI bean?
If I recall correctly, CDI injection will not work in a validator. Use @Advanced from MyFaces CODI, as the JSF module from DeltaSpike is not ready yet: https://cwiki.apache.org/EXTCDI/jsf-usage.html
Or go for DeltaSpike and use the BeanProvider to get your instance:
BeanProvider.getContextualReference(DCProfileTabGeneral.class, false);
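A minimal sketch of the DeltaSpike option inside your validator, assuming the DeltaSpike core module is on the classpath; the lookup replaces the @Inject field:

import org.apache.deltaspike.core.api.provider.BeanProvider;

// Inside validateDatacenterName(), resolve the bean programmatically,
// since @Inject is not honored in a JSF validator:
DCProfileTabGeneral profileTabGeneral =
        BeanProvider.getContextualReference(DCProfileTabGeneral.class, false);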
I have a problem regarding how to save objects in the database in Java when there is a many-to-one relationship.
Basically I have two classes, UserVO and GroupVO, which look like this:
public class UserVO extends ValueObject implements Serializable {

    /**
     * The default serial version ID
     */
    private static final long serialVersionUID = 1L;

    private String login;
    private String password;
    private Long groupId;

    public UserVO() {
        super();
        this.setLogin("");
        this.setPassword("");
        this.setGroupId(0L);
    }

    // all the getters and setters
    // ...
}
and
public final class GroupVO extends ValueObject implements Serializable {

    /**
     * The default serial version ID
     */
    private static final long serialVersionUID = 1L;

    private String description;
    private Set<UserVO> users = new HashSet<UserVO>();

    public GroupVO() {
        super();
        this.setDescription("");
    }

    // all the getters and setters
    // ...
}
Their super-class is a very simple abstract class:
public abstract class ValueObject {

    private Long id;
    private String name;

    public ValueObject() {
        super();
        // the ID is auto-generated
        // this.setId(0L);
        this.setName("");
    }

    // all the getters and setters
    // ...
}
Now I have to create the DAO classes for them. In the UserDAO I have something like this for creating and inserting a user in the DB:
@Override
public Long create(UserVO user) throws IllegalArgumentException, DAOException {
    if (user.getId() != null) {
        throw new IllegalArgumentException("User may already be created, the user ID is not null.");
    }
    Object[] values = { user.getName(), user.getLogin(), user.getPassword(), user.getGroupId() };
    Connection connection = null;
    PreparedStatement preparedStatement = null;
    ResultSet generatedKeys = null;
    try {
        connection = daoFactory.getConnection();
        preparedStatement = DAOUtil.prepareStatement(connection, SQL_CREATE_USER, true, values);
        int affectedRows = preparedStatement.executeUpdate();
        if (affectedRows == 0) {
            throw new DAOException("Creating user failed, no rows affected.");
        }
        generatedKeys = preparedStatement.getGeneratedKeys();
        if (generatedKeys.next()) {
            user.setId(generatedKeys.getLong(1));
        } else {
            throw new DAOException("Creating user failed, no generated key obtained.");
        }
    } catch (SQLException e) {
        throw new DAOException(e);
    } finally {
        DAOUtil.close(connection, preparedStatement, generatedKeys);
    }
    return user.getId();
}
There are also some helper classes, but I guess you understand my code :).
And this is the GroupDAO create method:
@Override
public Long create(GroupVO group) throws IllegalArgumentException, DAOException {
    if (group.getId() != null) {
        throw new IllegalArgumentException("Group may already be created, the group ID is not null.");
    }
    Object[] values = { group.getName(), group.getDescription() };
    Connection connection = null;
    PreparedStatement preparedStatement = null;
    ResultSet generatedKeys = null;
    try {
        connection = daoFactory.getConnection();
        preparedStatement = DAOUtil.prepareStatement(connection, SQL_CREATE_GROUP, true, values);
        int affectedRows = preparedStatement.executeUpdate();
        if (affectedRows == 0) {
            throw new DAOException("Creating group failed, no rows affected.");
        }
        generatedKeys = preparedStatement.getGeneratedKeys();
        if (generatedKeys.next()) {
            group.setId(generatedKeys.getLong(1));
        } else {
            throw new DAOException("Creating group failed, no generated key obtained.");
        }
    } catch (SQLException e) {
        throw new DAOException(e);
    } finally {
        DAOUtil.close(connection, preparedStatement, generatedKeys);
    }
    return group.getId();
}
Now, if I write a small test in the main function that creates a group and saves it in the DB, everything goes well:
DAOFactory usersGroupsRolesFactory = DAOFactory.getInstance("UsersGroupsRolesDB.jdbc");
System.out.println("DAOFactory successfully obtained: " + usersGroupsRolesFactory);
// Create an instance of the GroupDAO class
GroupDAO dao = usersGroupsRolesFactory.getGroupDAO();
// Create some GroupVO objects
GroupVO group1 = new GroupVO();
group1.setName("Administrators");
group1.setDescription("These users have all the right in the application");
dao.create(group1);
As the GroupVO class has a set of UserVO objects, if in the main function I also type:
UserVO user1 = new UserVO();
user1.setName("Palancica Pavel");
user1.setLogin("login_pavel");
user1.setPassword("password_pavel");
group1.getUsers().add(user1); // I may also add some more users
and say I am calling dao.create(group1) for the first time:
Normally, this shouldn't only save the group info, but also all the associated UserVO objects.
For me that means that in the create method of the GroupDAO, after the group ID is successfully generated, I need to write a lot of other code.
I wonder whether this is the correct way of saving those users in the DB, as I think I have to make the GroupDAO class communicate with the UserDAO class, and also with the DAOFactory, which in my case can give us a UserDAO or GroupDAO object. Or I could do all the DB interaction for saving those users without using the UserDAO class.
The code I have in mind seems very long and messy/spaghetti, and I am not quite sure this is the correct approach :(.
Note that I am not using any ORM framework.
Please let me know what you think about that!
If you need more details, I can send you my project; it's not commercial :D
Thanks in advance
I would use the GroupDAO to only create the group, the UserDAO to only create a user, and a functional service delegating to these two DAOs to create a group with all its users. Its code would look like the following:
Long groupId = groupDao.createGroup(groupVO);
for (UserVO userVO : groupVO.getUsers()) {
    userDao.createUser(userVO, groupId);
}
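Fleshed out as a class, a minimal sketch could look like this; the service name and the setGroupId call are assumptions based on your UserVO, not a prescription:

// Hypothetical service coordinating the two DAOs; no ORM involved.
public class GroupService {

    private final GroupDAO groupDao;
    private final UserDAO userDao;

    public GroupService(GroupDAO groupDao, UserDAO userDao) {
        this.groupDao = groupDao;
        this.userDao = userDao;
    }

    // Create the group first so the generated ID can be set on each user.
    public Long createGroupWithUsers(GroupVO group) throws DAOException {
        Long groupId = groupDao.create(group);
        for (UserVO user : group.getUsers()) {
            user.setGroupId(groupId);
            userDao.create(user);
        }
        return groupId;
    }
}

Note that without a transaction spanning both DAOs, a failure while inserting users leaves a group without its users; if that matters, have the service obtain a single Connection, pass it to both DAOs, and commit or roll back once at the end.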
I wrote a Hibernate interceptor:
public class MyInterceptor extends EmptyInterceptor {

    private boolean isCanal = false;

    public boolean onSave(Object entity, Serializable arg1, Object[] arg2, String[] arg3, Type[] arg4) throws CallbackException {
        for (int i = 0; i < 100; i++) {
            System.out.println("Inside MyInterceptor(onSave) : " + entity.toString());
        }
        if (entity instanceof Canal) {
            isCanal = true;
        }
        return false;
    }

    public void afterTransactionCompletion(Transaction tx) {
        if (tx.wasCommitted() && isCanal) {
            for (int i = 0; i < 100; i++) {
                System.out.println("Inside MyInterceptor(afterTransactionCompletion) : Canal was saved to DB.");
            }
        }
    }
}
I can see the onSave method executing fine, but the afterTransactionCompletion method never gets executed, even though the transaction is successfully committed to the database.
I need a way to track every time a Canal object is successfully saved to the DB and react by printing some messages. Is that feasible, and how?
Here is the method I use to save objects in the DB:
public static Object enregObjet(Object obj) throws UpdateException,
        EnregException, ErreurException {
    Transaction tx = null;
    try {
        Session s = InterfaceBD.currentSession();
        tx = s.beginTransaction();
        try {
            // Set the dteUti and dteUtiModif fields
            Method dteUtiSetter = null;
            // Object being inserted
            dteUtiSetter = obj.getClass().getMethod("setDteUti",
                    new Class[] { java.util.Date.class });
            dteUtiSetter.invoke(obj, new Object[] { new java.util.Date() });
        } catch (ReflectiveOperationException ex) {
            // The dteUtiModif field does not exist (or is not accessible)
        }
        // Save
        IardNum.numeroterCode(obj);
        IardNum.numeroterId(obj);
        s.save(obj);
        s.flush();
        tx.commit();
        try {
            String id = "";
            // Read back the generated ID
            Method idGetter = null;
            idGetter = obj.getClass().getMethod("getId");
            id = (String) idGetter.invoke(obj);
            Connection conn = InterfaceBD.getConn();
            IardGenerator3.cleanSeq(id, conn);
            conn.close();
        } catch (ReflectiveOperationException ex) {
            // The getId method does not exist
        } catch (ClassCastException ex) {
            // just ignore it because we are dealing with a PK class (e.g. CausesAnnexesSinistrePK)
        }
        s.clear();
        return obj;
    } catch (HibernateException e) {
        // The catch block was cut off in the original snippet; a typical handler rolls back:
        if (tx != null) {
            tx.rollback();
        }
        throw e;
    }
}
I would suggest using the DAO design pattern. Basically, you just create a class which saves the Canal object, and use it everywhere you need to save one.
In this class you can add all the logic you need.
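A minimal sketch of that idea; the class and method names are hypothetical, and the session handling mirrors your enregObjet method:

import org.hibernate.HibernateException;
import org.hibernate.Session;
import org.hibernate.Transaction;

// Hypothetical DAO that centralizes saving Canal objects and reacting afterwards.
public class CanalDao {

    public Canal save(Canal canal) {
        Transaction tx = null;
        Session s = InterfaceBD.currentSession();
        try {
            tx = s.beginTransaction();
            s.save(canal);
            tx.commit();
            // The commit succeeded, so react here instead of in an interceptor.
            System.out.println("Canal was saved to DB.");
            return canal;
        } catch (HibernateException e) {
            if (tx != null) {
                tx.rollback();
            }
            throw e;
        }
    }
}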