Hibernate multi level of transaction - java

I have some Hibernate code and I want it to run in a single transaction. Let me explain in code:
public void changeBranch(Branch branch) throws DatabaseException {
    // some code
    humanDao.update(he);
    superBranchUsername = branch.getFatherUsername();
    int superBranchId = branchDao.getBranchIdByUserName(superBranchUsername);
    BranchEntity superBranch = branchDao.load(superBranchId);
    BranchEntity be = new BranchEntity();
    setBranchEntity(be, he, pkId, bname, confirmed, level, studentCount, uname, superBranch);
    branchDao.update(be); // update the Branch table and set the corresponding Human
    // some code
}
Both humanDao.update(he) and branchDao.update(be) run in transactions handled by my GenericDAO, which humanDao and branchDao inherit from.
But I want the whole block of code above to run in a single transaction as well. How can I get Hibernate to do this?

DAOs should not handle transactions for exactly the reason you've discovered: they can't know when they're part of a larger transaction.
If you were using Spring declarative transactions, you'd have a service layer that would create the transaction context for both DAOs and deal with everything. I would recommend doing something like that.
UPDATE: I added a link to Spring.
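With Spring, that service layer can be as small as one @Transactional method: Spring opens the transaction before the method runs and commits (or rolls back) afterwards, and both DAOs participate. A minimal sketch, assuming Spring wiring; the service class and the mapping helpers are hypothetical, only the DAO names come from the question:

```java
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class BranchService {

    private final HumanDao humanDao;   // the question's DAOs, injected by Spring
    private final BranchDao branchDao;

    public BranchService(HumanDao humanDao, BranchDao branchDao) {
        this.humanDao = humanDao;
        this.branchDao = branchDao;
    }

    // One transaction spans the whole method: both updates commit together,
    // and any runtime exception rolls both back.
    @Transactional
    public void changeBranch(Branch branch) {
        HumanEntity he = mapToHuman(branch);   // hypothetical mapping helper
        humanDao.update(he);

        BranchEntity be = mapToBranch(branch); // hypothetical mapping helper
        branchDao.update(be);
    }
}
```

For this to work, the DAOs must use the Spring-managed session (e.g. sessionFactory.getCurrentSession()) rather than opening and committing their own transactions.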

Please see: Chapter 11. Transactions and Concurrency

I found how to fix it: if I open a new session in changeBranch(Branch branch) and pass that session as a parameter to my DAOs, my problem is solved.
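That session-passing approach looks roughly like this (a sketch: the sessionFactory field, the session-accepting DAO signatures, and the entity-building helpers are assumptions):

```java
import org.hibernate.Session;
import org.hibernate.Transaction;

public void changeBranch(Branch branch) throws DatabaseException {
    Session session = sessionFactory.openSession(); // assumed SessionFactory field
    Transaction tx = session.beginTransaction();
    try {
        HumanEntity he = buildHumanEntity(branch);   // hypothetical helper
        BranchEntity be = buildBranchEntity(branch); // hypothetical helper

        humanDao.update(session, he);  // DAOs reuse the caller's session
        branchDao.update(session, be); // instead of opening their own

        tx.commit();                   // a single commit covers both updates
    } catch (RuntimeException e) {
        tx.rollback();                 // either both succeed or neither does
        throw new DatabaseException(e);
    } finally {
        session.close();
    }
}
```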

Mandatory Flush for a repository

Situation: JPA, SpringBoot, Hibernate
public interface ViewRepository
        extends JpaRepository<SomeView, Long>, JpaSpecificationExecutor<SomeView> {
    Optional<SomeView> findByIdAndLanguageId(Long id, Long lid);
}
// service
SomeView getSomeView() {
    SomeView someView1 = repo.findByIdAndLanguageId(id, lid1);
    ....
    // now get it with a different language id
    SomeView someView2 = repo.findByIdAndLanguageId(id, lid2);
    // the problem: someView2 is the same as someView1, since Hibernate caches it
}
Is there an annotation / way to prevent this caching for calls to this repository only (not application-wide disabling of caching), at the service or repository level?
This is a core feature of Hibernate.
If you look something up and change the newly found entity, the changes will be saved to the database without any additional code.
If you don't want that, you need to use the so-called "stateless session". But please warn everyone around you about it, because otherwise you will end up with many surprised people. The "stateless session" is not a very popular thing, and no one will expect you to be using it.
If you don't want this caching to happen, you can immediately detach the entity after reading it i.e.:
SomeView someView1 = repo.findByIdAndLanguageId(id, lid1);
entityManager.detach(someView1);

Commit hook in JOOQ

I've been using JOOQ in backend web services for a while now. In many of these services, after persisting data to the database (or better said, after successfully committing data), we usually want to write some messages to Kafka about the persisted records so that other services know of these events.
What I'm essentially looking for is: Is there a way for me to register a post-commit hook or callback with JOOQ's DSLContext object, so I can run some code when a transaction successfully commits?
I'm aware of the ExecuteListener and ExecuteListenerProvider interfaces, but as far as I can tell the void end(ExecuteContext ctx) method (which is supposedly for end of lifecycle uses) is not called when committing the transaction. It is called after every query though.
Here's an example:
public static void main(String[] args) throws Throwable {
    Class.forName("org.postgresql.Driver");
    Connection connection = DriverManager.getConnection("<url>", "<user>", "<pass>");
    connection.setAutoCommit(false);

    DSLContext context = DSL.using(connection, SQLDialect.POSTGRES_9_5);
    context.transaction(conf -> {
        conf.set(new DefaultExecuteListenerProvider(new DefaultExecuteListener() {
            @Override
            public void end(ExecuteContext ctx) {
                System.out.println("End method triggered.");
            }
        }));

        DSLContext innerContext = DSL.using(conf);
        System.out.println("Pre insert.");
        innerContext.insertInto(...).execute();
        System.out.println("Post insert.");
    });

    connection.close();
}
Which always seems to print:
Pre insert.
End method triggered.
Post insert.
Making me believe this is not intended for commit hooks.
Is there perhaps a JOOQ guru that can tell me if there is support for commit hooks in JOOQ? And if so, point me in the right direction?
The ExecuteListener SPI is listening to the lifecycle of a single query execution, i.e. of this:
innerContext.insertInto(...).execute();
This isn't what you're looking for. Instead, you should implement your own TransactionProvider (possibly delegating to jOOQ's DefaultTransactionProvider). You can then implement any logic you want prior to the actual commit logic.
Note that jOOQ 3.9 will also provide a new TransactionListener SPI (see #5378) to facilitate this.
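A sketch of the suggested approach: a TransactionProvider that delegates to jOOQ's default one and runs registered callbacks only after the commit succeeds. The callback registry (onCommit) is my assumption, not jOOQ API; only begin/commit/rollback come from the TransactionProvider SPI:

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

import org.jooq.TransactionContext;
import org.jooq.TransactionProvider;

public class PostCommitTransactionProvider implements TransactionProvider {

    private final TransactionProvider delegate; // e.g. jOOQ's DefaultTransactionProvider
    private final List<Runnable> afterCommit = new CopyOnWriteArrayList<>();

    public PostCommitTransactionProvider(TransactionProvider delegate) {
        this.delegate = delegate;
    }

    // Register work to run after a successful commit, e.g. publishing to Kafka.
    public void onCommit(Runnable callback) {
        afterCommit.add(callback);
    }

    @Override
    public void begin(TransactionContext ctx) {
        delegate.begin(ctx);
    }

    @Override
    public void commit(TransactionContext ctx) {
        delegate.commit(ctx);               // the real commit happens first
        afterCommit.forEach(Runnable::run); // then the registered callbacks
    }

    @Override
    public void rollback(TransactionContext ctx) {
        delegate.rollback(ctx);             // no callbacks on rollback
    }
}
```

Caveat: DefaultTransactionProvider implements nested transactions via savepoints, so production code would need to track nesting depth and fire callbacks only for the outermost commit.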

NewRelic Java API and Transaction boundaries

I want to monitor a critical part of our Java Application (Glassfish v3.1.2 JSF2 application).
I want to track a specific function call as a new transaction. This method can be called within the "/Faces Servlet" or any other JAX-RS transactions.
The @Trace annotation seems perfect for my case, but from the docs it is not clear whether it supports nested transactions (like the REQUIRES_NEW J2EE transaction semantic).
Here is the method I want to track
@Trace(dispatcher = true, metricName = "Internal/Query")
public void query(Query q) {
    long st = -System.currentTimeMillis();
    // do my stuff
    st += System.currentTimeMillis();
    NewRelic.addCustomParameter("Client", q.getClient());
    // Add useful parameters
    NewRelic.recordResponseTimeMetric("Internal/Query/queryTime", st); // Is this needed?
}
And, for example, a JAX-RS WS like this:
@GET
public Response wsquery(...) { // <- Start NewRelic Transaction T1
    myBean.query(q1);          // <- Start nested Transaction T1.1
    myBean.query(q2);          // <- Start nested Transaction T1.2
}
Will I Have 3 transactions tracked?
One for the JAX-RS call to wsquery and two for Internal/Query.
Thanks.
Based on the information provided, it's not certain exactly what you're going to get. I recommend testing it. You can also bump the logging level up to "finest" and see exactly what is being instrumented. If you run into issues beyond that, contact us at http://support.newrelic.com

spring jdbc + PlatformTransactionManager + spring data + theory [closed]

It's difficult to tell what is being asked here. This question is ambiguous, vague, incomplete, overly broad, or rhetorical and cannot be reasonably answered in its current form. For help clarifying this question so that it can be reopened, visit the help center.
Closed 9 years ago.
I have started learning the Spring JDBC toolkit and have read some documentation about transactions in Spring and its templates. Nevertheless, some common things are not clear to me.
1) If we have Spring Data, why do we always hear only about Spring JDBC?
The Spring framework has projects like Spring MVC, Spring Security, etc. First I tried to find Spring JDBC on Spring's home site, but I did not find it. Instead I found the Spring Data project. After some research I found that Spring Data uses Spring JDBC in its JDBC Extensions sub-project, and the latter has some Oracle-specific operations that interest me. I also realized that I have never seen or heard any use of, or references to, Spring Data in tutorials. Is that really a bad thing?
2) Should I create a new instance of JdbcTemplate every time?
Next was JdbcTemplate, a useful template class. All the docs are replete with examples like:
public class JdbcCorporateEventDao implements CorporateEventDao {

    private JdbcTemplate jdbcTemplate;

    public void setDataSource(DataSource dataSource) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }

    // JDBC-backed implementations of the methods on the CorporateEventDao follow...
}
And they also write in the docs:
 Instances of the JdbcTemplate class are threadsafe once configured
This seems misleading. Why do they create a new instance in the setDataSource method instead of injecting the DataSource into an already created instance, or have I misunderstood it?
3) How can we use TransactionTemplate for writing sophisticated client logic?
TransactionTemplate is another template, one that works with TransactionStatus. As I understand it, it can help me manage my transactions, but how much does it help?
At the start of TransactionTemplate's execute method we have transactionManager.getTransaction(this), and at the end we have this.transactionManager.commit(status). Therefore, as I understand it, everything I place in the doInTransaction method executes in a single transaction. But what about executing other DAO methods in the same transaction from other methods? That is constraining when writing a client with sophisticated logic. I mean, must all the logic be in one method? I thought that could not be true.
What do I mean by sophisticated logic? For example, I have my own template method:
/* abstract class definition */
public final void execute() {
    onPreExec();
    exec();
    onPostExec();
}

abstract void exec(); // client executes a few DAO methods

public void onPreExec() {}
public void onPostExec() {} // commit or rollback transaction in another method
/* other class members */
4) Is it thread-safe to use PlatformTransactionManager, TransactionDefinition, and TransactionStatus?
Further, I began to investigate what stands behind this.transactionManager.commit(status). That led me to PlatformTransactionManager and TransactionDefinition. As I understand it, these classes can help me achieve the goal from question 3. For example, I can do this:
/* abstract class definition */
protected PlatformTransactionManager ptm;
protected TransactionDefinition td;
protected TransactionStatus ts; // TS with PROPAGATION_REQUIRED, ISOLATION_READ_COMMITTED

public final void execute() {
    onPreExec();
    exec();
    onPostExec();
}

abstract void exec(); // client executes a few DAO methods

public void onPreExec() { // start transaction
    ts = ptm.getTransaction(td);
}

public void onPostExec() { // end transaction
    if (exec.wasCompletedSuccessfully()) {
        dao.markJobCompleted(); // record that execution completed
        ptm.commit(ts);
    } else {
        ptm.rollback(ts);
    }
}
/* other class members */
At least this looks more convenient than the transactionTemplate.execute() method for some cases, although it merely divides transactionTemplate.execute() into several parts.
But it is still not clear whether it is thread-safe. That is, can I use it and be sure that all inner calls to JDBC's callableStatement.execute() methods from jdbcTemplate will refer to this and only this transaction, and not see other transactions in other threads?
Thanks for reading.
Regarding JdbcTemplate:
JdbcTemplate is provided by Spring to interact with the database. You can use plain JDBC code to connect to the database and perform operations, but in that case you have to handle issues like connection closing yourself. Spring's JdbcTemplate handles all of these issues, so the end user only needs to use the API and execute operations.
Regarding new JdbcTemplate(dataSource): I think this is just an example, i.e. you need not create a JdbcTemplate object in each class. You can declare it as a bean like a data source, or create a BaseDAO class for it.
Just one more thing: please go through the Spring Data Support videos. They are very good videos for learning the basics of JdbcTemplate.
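Regarding question 3: the callback passed to TransactionTemplate.execute can call as many DAO methods (or other methods) as it likes; everything reached from inside the callback runs in the same transaction, so the logic does not have to live in one method. A sketch (the service and DAO names are hypothetical):

```java
import org.springframework.transaction.support.TransactionTemplate;

public class TransferService {

    private final TransactionTemplate txTemplate; // configured with your PlatformTransactionManager
    private final AccountDao accountDao;          // hypothetical DAOs
    private final AuditDao auditDao;

    public TransferService(TransactionTemplate txTemplate,
                           AccountDao accountDao, AuditDao auditDao) {
        this.txTemplate = txTemplate;
        this.accountDao = accountDao;
        this.auditDao = auditDao;
    }

    public void transfer(long from, long to, long amount) {
        txTemplate.execute(status -> {
            // Both DAO calls, and anything they call in turn, run in one
            // transaction; a thrown RuntimeException rolls it all back.
            accountDao.move(from, to, amount);
            auditDao.record(from, to, amount);
            return null;
        });
    }
}
```

With PROPAGATION_REQUIRED, any method that uses the same transaction manager while this callback is running joins the ongoing transaction instead of starting a new one.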
You need to realize that Spring has been around for 11 years now. It's evolved. There have been lots of projects that started as ancillary to Spring that have been folded into it. Spring JDBC has been around since Rod Johnson first wrote it. Spring Data is a recent development.

Database cleanup after Junit tests

I have to test some Thrift services using Junit. When I run my tests as a Thrift client, the services modify the server database. I am unable to find a good solution which can clean up the database after each test is run.
Cleanup is important especially because the IDs need to be unique, and they are currently read from an XML file. Right now I have to manually change the IDs after running the tests so that the next set of tests can run without throwing primary key violations in the database. If I can clean up the database after each test run, the problem is completely resolved; otherwise I will have to think about other solutions, like generating random IDs and using them wherever IDs are required.
Edit: I would like to emphasize that I am testing a service which writes to the database; I don't have direct access to the database. But since the service is ours, I can modify it to provide a cleanup method if required.
If you are using Spring, everything you need is the @DirtiesContext annotation on your test class.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("/test-context.xml")
@DirtiesContext(classMode = ClassMode.AFTER_EACH_TEST_METHOD)
public class MyServiceTest {
    ....
}
Unless you are testing specific database actions (verifying you can query or update the database, for example), your JUnit tests shouldn't be writing to a real database. Instead you should mock the database classes. This way you don't actually have to connect to and modify the database, and therefore no cleanup is needed.
You can mock your classes in a couple of different ways. You can use a library such as JMock, which will do all the execution and validation work for you. My personal favorite way to do this is with dependency injection. This way I can create mock classes that implement my repository interfaces (you are using interfaces for your data access layer, right? ;-)) and implement only the needed methods with known actions/return values.
// Example repository interface.
public interface StudentRepository
{
    public List<Student> getAllStudents();
}

// Example mock database class.
public class MockStudentRepository implements StudentRepository
{
    // This method creates fake but known data.
    public List<Student> getAllStudents()
    {
        List<Student> studentList = new ArrayList<Student>();
        studentList.add(new Student(...));
        studentList.add(new Student(...));
        studentList.add(new Student(...));
        return studentList;
    }
}

// Example method to test.
public int computeAverageAge(StudentRepository aRepository)
{
    List<Student> students = aRepository.getAllStudents();
    int totalAge = 0;
    for (Student student : students)
    {
        totalAge += student.getAge();
    }
    return totalAge / students.size();
}

// Example test method.
public void testComputeAverageAge()
{
    int expectedAverage = 25; // the expected answer for your result set
    int actualAverage = computeAverageAge(new MockStudentRepository());
    assertEquals(expectedAverage, actualAverage);
}
How about using something like DBUnit?
Spring's unit testing framework has extensive capabilities for dealing with JDBC. The general approach is that the unit tests runs in a transaction, and (outside of your test) the transaction is rolled back once the test is complete.
This has the advantage of being able to use your database and its schema, but without making any direct changes to the data. Of course, if you actually perform a commit inside your test, then all bets are off!
For more reading, look at Spring's documentation on integration testing with JDBC.
When writing JUnit tests, you can override two specific methods: setUp() and tearDown(). In setUp() you can set up everything that's necessary to test your code, so you don't have to repeat the setup in each specific test case. tearDown() is called after each test case runs.
If possible, you could set it up so that you open your database in the setUp() method, then clear everything created by the tests and close the database in the tearDown() method. This is how we have done all our testing when a database was involved.
Here's an example:
@Override
protected void setUp() throws Exception {
    super.setUp();
    db = new WolfToursDbAdapter(mContext);
    db.open();
    // Set up other required state and data
}

@Override
protected void tearDown() throws Exception {
    super.tearDown();
    db.dropTables();
    db.close();
    db = null;
}

// Methods to run all the tests
Assuming you have access to the database: Another option is to create a backup of the database just before the tests and restore from that backup after the tests. This can be automated.
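For example, assuming a PostgreSQL database with pg_dump/pg_restore on the PATH (the database name, file name, and helper class here are illustrative), the backup-and-restore cycle can be scripted from the test harness; the commands are built in separate helpers so they can be inspected:

```java
import java.util.Arrays;
import java.util.List;

public class DatabaseSnapshot {

    // Dump the database to a custom-format archive file.
    static List<String> dumpCommand(String database, String file) {
        return Arrays.asList("pg_dump", "--format=custom", "--file=" + file, database);
    }

    // Restore the archive, dropping test-created objects first (--clean).
    static List<String> restoreCommand(String database, String file) {
        return Arrays.asList("pg_restore", "--clean", "--dbname=" + database, file);
    }

    // Run a command and fail loudly if it does not exit cleanly.
    static void run(List<String> command) throws Exception {
        Process process = new ProcessBuilder(command).inheritIO().start();
        if (process.waitFor() != 0) {
            throw new IllegalStateException("Command failed: " + command);
        }
    }
}
```

Call run(dumpCommand(...)) once before the tests (e.g. in a @BeforeClass hook) and run(restoreCommand(...)) afterwards (e.g. in @AfterClass).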
If you are using Spring + JUnit 4.x, then you don't need to insert anything in the DB.
Look at the AbstractTransactionalJUnit4SpringContextTests class.
Also check out the Spring documentation for JUnit support.
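A sketch of what such a test looks like (the context file and table names are assumptions): each test method runs inside a transaction that Spring rolls back automatically afterwards, so nothing the test writes survives.

```java
import org.junit.Test;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.AbstractTransactionalJUnit4SpringContextTests;

@ContextConfiguration("/test-context.xml")
public class ThriftServiceTest extends AbstractTransactionalJUnit4SpringContextTests {

    @Test
    public void createsRecord() {
        // countRowsInTable is a convenience method inherited from the base class.
        int before = countRowsInTable("some_table");

        // ... call the service under test; its inserts join this test's transaction ...

        // When the method returns, Spring rolls the transaction back,
        // so the inserted rows never become visible to other tests.
    }
}
```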
It's a bit draconian, but I usually aim to wipe out the database (or just the tables I'm interested in) before every test method execution. This doesn't tend to work as I move into more integration-type tests of course.
In cases where I have no control over the database, say I want to verify the correct number of rows were created after a given call, the test will count the number of rows before and after the tested call and make sure the difference is correct. In other words, take the existing data into account, then see how the tested code changed things, without assuming anything about the existing data. It can be a bit of work to set up, but it lets me test against a more "live" system.
In your case, are the specific IDs important? Could you generate the IDs on the fly, perhaps randomly, verify they're not already in use, then proceed?
I agree with Brainimus if you're trying to test against data you have pulled from a database. If you're looking to test modifications made to the database, another solution would be to mock the database itself. There are multiple in-memory database implementations that you can use to create a temporary database (for instance during JUnit's setUp()) and then remove the entire database from memory (during tearDown()). As long as you're not using vendor-specific SQL, this is a good way to test modifying a database without touching your real production one.
Some good Java databases that offer in memory support are Apache Derby, Java DB (but it is really Oracle's flavor of Apache Derby again), HyperSQL (better known as HSQLDB) and H2 Database Engine. I have personally used HSQLDB to create in-memory mock databases for testing and it worked great, but I'm sure the others would offer similar results.
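A sketch of the in-memory approach with HSQLDB (the hsqldb jar must be on the test classpath; the URL, credentials, and table are illustrative): the database is created fresh in setUp(), and since "mem:" databases live only inside the JVM, issuing SHUTDOWN in tearDown() discards everything.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class InMemoryDbTest {

    private Connection connection;

    protected void setUp() throws Exception {
        // Each named "mem:" database exists only in this JVM's memory.
        connection = DriverManager.getConnection("jdbc:hsqldb:mem:testdb", "SA", "");
        try (Statement st = connection.createStatement()) {
            st.execute("CREATE TABLE student (id INT PRIMARY KEY, age INT)");
        }
    }

    protected void tearDown() throws Exception {
        try (Statement st = connection.createStatement()) {
            st.execute("SHUTDOWN"); // drops the whole in-memory database
        }
        connection.close();
    }
}
```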
