I have a tricky situation in my Spring Boot application, which uses Hibernate. I load objects from the data store and modify them in several unrelated parts of the application. Before saving an updated instance I need to load the existing copy of the object from the database in order to create a backup. However, if I use the repository's findById method, Hibernate finds the (already modified) copy of the object in its cache and returns that one, which is not what I need: I want the original, unmodified object as it currently exists in the database. I tried using a separate Session, but with multiple objects the DB gets locked and I can no longer access the database (MS SQL Express). Does anyone have an idea how to obtain the original, unmodified object before persisting the changes to the database? Thanks
To keep a backup of entities you should use @Audited (it keeps versions / snapshots of each entity).
You can have a look over there https://www.baeldung.com/database-auditing-jpa
A more advanced approach is https://javers.org/.
Javers is the state-of-the-art way to do what you want to do. I think it will suit your needs.
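For reference, here is a minimal sketch of what auditing with Hibernate Envers (the library behind @Audited) could look like; the entity and its fields are just placeholders:

import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import org.hibernate.envers.Audited;

// Envers stores a snapshot of this entity in a separate _AUD table
// every time it is inserted, updated or deleted.
@Entity
@Audited
public class Customer {

    @Id
    @GeneratedValue
    private Long id;

    private String name;

    // getters / setters omitted
}

Old revisions can then be read back through Envers' AuditReader API (AuditReaderFactory.get(entityManager)), which gives you exactly the "copy of the object before it was modified" that the question asks for.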
Related
I need to keep a client in sync with a PostgreSQL database (only the data that is loaded from the database, not the entire database; 50+ DB tables and a lot of collections inside the entities). Since I recently added a Spring REST API server to my application, I could perhaps manage those changes differently / more efficiently, in a way that requires less work. Until now my approach has been to add a PostgreSQL notification that sends JSON, fired by a trigger:
CREATE TRIGGER extChangesOccured
AFTER INSERT OR UPDATE OR DELETE ON xxx_table
FOR EACH ROW EXECUTE PROCEDURE notifyUsers();
The client then receives the JSON, which is built as:
json_build_object(
'table',TG_TABLE_NAME,
'action', TG_OP,
'id', data,
'session', session_app_name);
It then compares whether the change was made by this client or by another one, and fetches the new data from the database.
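For reference, a rough sketch of how a Java client can pick these notifications up with the pgjdbc driver; the connection details and the channel name db_changes are assumptions (the channel must match whatever notifyUsers() passes to pg_notify):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import org.postgresql.PGConnection;
import org.postgresql.PGNotification;

public class ChangeListener {

    public static void main(String[] args) throws Exception {
        Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "secret");

        try (Statement stmt = conn.createStatement()) {
            // Hypothetical channel name - must match the pg_notify() call
            // inside the notifyUsers() procedure.
            stmt.execute("LISTEN db_changes");
        }

        PGConnection pgConn = conn.unwrap(PGConnection.class);
        while (true) {
            // pgjdbc delivers notifications only while the connection is
            // being used, so issue a cheap query before polling.
            try (Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery("SELECT 1")) {
            }
            PGNotification[] notifications = pgConn.getNotifications();
            if (notifications != null) {
                for (PGNotification n : notifications) {
                    // getParameter() holds the JSON built by json_build_object
                    System.out.println("change payload: " + n.getParameter());
                }
            }
            Thread.sleep(500);
        }
    }
}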
Then, on the client side, the new object is manually "rewritten" - something like a copyFromObject(new_entity) method where the variables are overridden (including collections, avoiding transient fields, etc.).
This approach requires keeping a copyFromObject method for each entity (hmm, it could still be optimized with reflection).
Problems with my approach are:
it requires some work when modifying variables (can be optimized using reflection)
the entire new entity is loaded when it is changed by some client
I am curious about your solutions for keeping clients in sync with the DB. Generally I have a desktop client here, and the client loads a lot of data from the database which must be kept in sync; loading the data can take as much as 1 minute on app start, depending on the chosen data period that should be fetched.
The perfect solution would be an engine that fetches/overrides only those variables in the entities that have really changed, and does it automatically.
A simple solution is to implement optimistic locking. It will prevent a user from persisting data if the entity was changed after the user fetched it.
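A minimal sketch of what that could look like with JPA/Hibernate (the entity is just an example):

import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Version;

@Entity
public class Document {

    @Id
    private Long id;

    private String content;

    // Hibernate increments this column on every update and adds it to the
    // WHERE clause; if another client changed the row in the meantime the
    // update fails with an OptimisticLockException instead of silently
    // overwriting the other client's change.
    @Version
    private Long version;

    // getters / setters omitted
}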
Or
You can use 3rd party apps for DB synchronization. I've played some time ago with Pusher, and you can find an extensive tutorial about client synchronization here: React client synchronization
Of course Pusher is not the only solution, and I'm not affiliated with that app's dev team at all.
For my purposes I implemented an AVL-tree-based engine that keeps the loaded entities in sync with the database: it builds repositories from the entities loaded by Hibernate, asynchronously walks through all the fields of the entities, and rewrites/merges the matching ones (so if some field (PK) identifies the same entity as one in the repository, it replaces it).
This way, synchronizing with the database comes down to finding the externally changed entity in the repository (basically in the AVL tree, which is O(log n)) and rewriting its fields.
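A very simplified illustration of that idea, using a TreeMap as the sorted structure and plain reflection to copy the fields; all the names here are made up, and it ignores collections, transient fields and nested entities:

import java.lang.reflect.Field;
import java.lang.reflect.Modifier;
import java.util.TreeMap;

public class EntityRepository<T> {

    // Sorted map keyed by primary key - lookups are O(log n),
    // much like the AVL tree described above.
    private final TreeMap<Long, T> loaded = new TreeMap<>();

    public void register(Long id, T entity) {
        loaded.put(id, entity);
    }

    // Copies every field of the freshly fetched entity onto the instance
    // the client already holds, so existing references stay valid.
    public void merge(Long id, T freshCopy) throws IllegalAccessException {
        T existing = loaded.get(id);
        if (existing == null) {
            return; // this client never loaded the entity
        }
        for (Field field : freshCopy.getClass().getDeclaredFields()) {
            if (Modifier.isStatic(field.getModifiers())) {
                continue;
            }
            field.setAccessible(true);
            field.set(existing, field.get(freshCopy));
        }
    }
}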
I have a spring mvc project that needs to create a bunch of tables. The source of the table data is from a non-SQL, remote data source that isn't available until after the user logs in. I can create dummy rows in DataConfiguration.initDatabase and then truncate each table. If I don't do that, then when I try to insert data in other places in the code, the reference to the repository is null.
Since I have quite a number of tables and they use referential integrity, is there a way to declare them in such a way that they are automatically created without actually inserting any data?
I am using Java configuration.
The tables were apparently created after all; I just couldn't find them because I was looking in the default H2 connection "jdbc:h2:~/test" rather than "jdbc:h2:mem:testdb".
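For anyone running into the same confusion, a sketch of a Java-config data source that points at the in-memory URL rather than the file-based default; the bean values are illustrative:

import javax.sql.DataSource;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.DriverManagerDataSource;

@Configuration
public class DataConfiguration {

    @Bean
    public DataSource dataSource() {
        DriverManagerDataSource ds = new DriverManagerDataSource();
        ds.setDriverClassName("org.h2.Driver");
        // The in-memory database the application actually writes to -
        // not the file-based default "jdbc:h2:~/test".
        ds.setUrl("jdbc:h2:mem:testdb;DB_CLOSE_DELAY=-1");
        ds.setUsername("sa");
        ds.setPassword("");
        return ds;
    }
}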
I am trying to create a desktop application using Eclipse RCP. In that application I use an ORM framework to load objects from the database and JFace data binding to bind these objects to the user interface, so users can modify the data these objects contain.
Once the objects are loaded, other users or other clients may be working with the same data, so by the time a user wants to save the objects back into the database, the data these objects contain may differ from the data in the database - and the difference may have been caused by my application or by someone else.
Should I check against the real data in the database when I need to save an object that may no longer be fresh?
Maybe this is a common problem with ORMs, but it is the first time I have had to deal with it.
Yes - it's not a bad idea to check against the "real" data before saving. You may have a special field for this: a last-update timestamp, or an increment counter.
Such an approach is called optimistic locking and, since it is very typical, it may well be supported by your ORM out of the box.
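A rough sketch of how that check surfaces in code when the ORM does the version comparison for you; this assumes a JPA-style EntityManager and an entity that carries a @Version (or timestamp) field as described above:

import javax.persistence.EntityManager;
import javax.persistence.OptimisticLockException;

public class SaveHandler {

    private final EntityManager em;

    public SaveHandler(EntityManager em) {
        this.em = em;
    }

    // Tries to persist the user's edits; returns false if the row was
    // modified by someone else since it was loaded (version mismatch).
    public boolean saveEdits(Object editedEntity) {
        try {
            em.getTransaction().begin();
            em.merge(editedEntity);
            em.flush(); // the UPDATE runs here and the version column is compared
            em.getTransaction().commit();
            return true;
        } catch (OptimisticLockException e) {
            em.getTransaction().rollback();
            // The database copy is newer - reload it and let the user
            // decide how to resolve the conflict.
            return false;
        }
    }
}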
I've got a Hibernate-interfaced MySQL database with a load of different types of objects, some of which are periodically retrieved and altered by other pieces of code operating in JADE agents. Because of the way the objects are retrieved (in queries, returning collections of objects) they don't seem to be managed by the entity manager, and they definitely aren't managed once they're passed to agents that have no entity manager factory or entity manager.
The objects from the database are passed around between agents before arriving back at the database. At that point I want to update the version of the object in the database - but each time I merge the object, it creates a new object in the database.
I'm fairly sure I'm not using the merge method properly. Can anyone suggest a good way to combine the updated object with the existing database object without knowing in advance which properties of the object have changed? Possibly something along the lines of searching for the existing object and deleting it, then adding the new one, but I'm not sure how to do that without messing up primary keys etc.
Hibernate has a saveOrUpdate method which either saves the object or updates it, depending on whether an object with the same ID already exists:
http://docs.jboss.org/hibernate/core/3.3/reference/en/html/objectstate.html#objectstate-saveorupdate
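A small sketch of how that looks in practice; it assumes the detached object still carries its original database identifier, otherwise Hibernate has no way of knowing which row to update:

import org.hibernate.Session;
import org.hibernate.SessionFactory;

public class AgentResultWriter {

    private final SessionFactory sessionFactory;

    public AgentResultWriter(SessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }

    // Writes a detached entity back without creating a duplicate row,
    // provided its identifier field still holds the original primary key.
    public void writeBack(Object detachedEntity) {
        Session session = sessionFactory.openSession();
        try {
            session.beginTransaction();
            // Issues an UPDATE if an entity with the same ID already exists,
            // otherwise an INSERT. merge() behaves similarly but returns a
            // managed copy instead of reattaching the given instance.
            session.saveOrUpdate(detachedEntity);
            session.getTransaction().commit();
        } finally {
            session.close();
        }
    }
}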
I have an application which can read/write changes to a database table. Another instance of the same application should be able to see the updated values in the database. I am using Hibernate for this purpose. If I have 2 instances of the application running and I make changes to the DB from one instance for the first time, the updated values can be seen from the second - but any further changes from the first instance are not reflected in the second. Please shed some light.
This seems to be a bug in your cache settings. By default, Hibernate assumes that it's the only one changing the database. This allows it to efficiently cache objects. If several parties can change tables in the DB, then you must switch off caching for those tables/instances.
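If you configure Hibernate programmatically, switching the shared caches off looks roughly like this; only the cache-related properties are shown, and the rest of the configuration is omitted:

import org.hibernate.SessionFactory;
import org.hibernate.cfg.Configuration;

public class HibernateSetup {

    public static SessionFactory buildSessionFactory() {
        Configuration cfg = new Configuration().configure();
        // Both application instances change the same tables, so neither of
        // them may serve entities or query results from its own cache.
        cfg.setProperty("hibernate.cache.use_second_level_cache", "false");
        cfg.setProperty("hibernate.cache.use_query_cache", "false");
        return cfg.buildSessionFactory();
    }
}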
You can use hibernate.connection.autocommit=true
This will make Hibernate commit each SQL update to the database immediately, and you should be able to see the changes from the other application.
HOWEVER, I would strongly discourage you from doing so. As Aaron pointed out, you should only use one Hibernate SessionFactory with a database.
If you do need multiple applications to be in sync, think about using a shared cache, e.g. Gemstone.