Update row in Play Framework when running a job - java

I'm running a Job in Play! that takes a while (more than 1 hour). During this job, I want to save some IDs to my MySQL database. However, after writing the data, if I query the database in my terminal (using the MySQL monitor), the rows don't seem to be updated. So I need a way to update this data while the job is running.
This is the code:
this.ga.lastUID = msgnums[i-1];
this.ga.count = count[i-1];
this.ga.merge();
where this.ga is the instance of my JPA model.
If I use the save() method, I get a detached-entity exception. If I use findById(this.ga.id) to get a fresh object, I can call save(), but it has the same result as merge() (i.e. nothing is updated). Does anyone have an idea how to fix this?
Thanks!

A job execution is transactional: everything you do in your job will be committed at the end of the job.
To achieve what you want, you can create a subjob class and call it from the main one.
Something like:
public class SubJob extends Job {

    private Ga ga;
    private final long lastUID;
    private final long count;

    public SubJob(Ga ga, long lastUID, long count) {
        this.ga = ga;
        this.lastUID = lastUID;
        this.count = count;
    }

    @Override
    public void doJob() throws Exception {
        ga = ga.merge(); // reattach the detached entity to this job's session
        ga.lastUID = lastUID;
        ga.count = count;
        ga.save();
    }
}
And in your main job you call
new SubJob(ga, msgnums[i - 1], count[i - 1]).now().get();
Your subjob will execute the save in another transaction, which is committed at the end of the subjob.
Be careful: with this way of working you can't roll back the whole main job.


Annotating a class so that its every method can wait for X ms after execution

I am writing some JUnit unit tests to test my DynamoDB data accessor object. Here is one of the tests.
....
private static DynamoDBMapper ddbMapper;

@BeforeClass
public static void setup() {
    ddbMapper = DynamoDBClients.getTestingDynamoDBMapper();
}

@Before
public void setupTestItem() {
    Item item = new Item();
    item.setItemID(TEST_COUTSE_ITEM_ID);
    // Create an item with every field populated
    ddbMapper.save(item);
}
@Test
public void test_update_simple_attribute() {
    Item item = ddbMapper.load(Item.class, TEST_ITEM_ID, TEST_ITEM_VARIANT);
    item.setLanguageTag(TEST_LANGUAGE_TAG + "CHANGED");
    ddbMapper.save(item);
    Item updatedItem = ddbMapper.load(Item.class, TEST_ITEM_ID, TEST_ITEM_VARIANT);
    // JUnit's assertEquals takes the expected value first
    assertEquals(TEST_LANGUAGE_TAG + "CHANGED", updatedItem.getLanguageTag()); // This field has been changed
}
I have more tests that update the item, assert whether the update got pushed through to DynamoDB, and read it back.
However, I noticed that when I run these tests frequently, I sometimes run into an issue where DynamoDB has not yet fully written the updated data, and when I load it, it still shows the old data. Rerunning the tests usually solves the issue.
I believe DynamoDB uses an eventual consistency model, so it makes sense that an update can sometimes take longer to become visible than the Java code takes to execute. One way I could mitigate this is to have the JUnit test suspend for 100 ms or so.
But I would have to include code to suspend execution everywhere I call ddbMapper.save() or ddbMapper.delete(), which doesn't seem feasible for all the tests I have and the tests I will write.
It seems like I could tackle this with an annotation-driven approach, perhaps a class-level annotation @SuspendAtEndOfMethod that affects all of a class's methods. I am wondering if this is possible?
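An annotation-driven pause like the one described can be sketched in plain Java with a dynamic proxy, with no framework support required. Everything below (SuspendAtEndOfMethod, ItemDao, the 50 ms value) is a made-up illustration, not an existing API: the proxy invokes the real method, then sleeps for the number of milliseconds declared on the target's class.

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Proxy;

// Hypothetical class-level annotation: every method of an annotated class
// pauses for the given number of milliseconds after it returns.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.TYPE)
@interface SuspendAtEndOfMethod {
    long millis() default 100;
}

// Stand-in for a data accessor whose writes need time to settle.
interface ItemDao {
    void save(String item);
}

@SuspendAtEndOfMethod(millis = 50)
class ItemDaoImpl implements ItemDao {
    public void save(String item) {
        // a real implementation would call ddbMapper.save(item) here
    }
}

public class SuspendDemo {
    // Wraps the target in a proxy that sleeps after every method call
    // if the target's class carries @SuspendAtEndOfMethod.
    @SuppressWarnings("unchecked")
    static <T> T withSuspend(T target, Class<T> iface) {
        SuspendAtEndOfMethod ann =
                target.getClass().getAnnotation(SuspendAtEndOfMethod.class);
        if (ann == null) {
            return target;
        }
        return (T) Proxy.newProxyInstance(
                iface.getClassLoader(),
                new Class<?>[] { iface },
                (proxy, method, args) -> {
                    Object result = method.invoke(target, args);
                    Thread.sleep(ann.millis()); // the pause at the end of the method
                    return result;
                });
    }

    public static void main(String[] args) {
        ItemDao dao = withSuspend(new ItemDaoImpl(), ItemDao.class);
        long start = System.nanoTime();
        dao.save("test-item");
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        System.out.println("slept: " + (elapsedMs >= 40));
    }
}
```

In a test suite the wrapping could happen once in a @BeforeClass method, and AOP frameworks (Spring AOP, AspectJ) offer the same idea without hand-rolled proxies. For DynamoDB specifically, configuring the mapper for strongly consistent reads is another way to avoid the read-after-write race.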

Call method post transaction commit

I have a very peculiar requirement where I have to insert records into two audit tables if insertion into one particular table succeeds. I am not talking about @PreInsert in a listener here, because listeners are always called in the same transaction. I know this can be done manually by simply calling save after the first save succeeds, BUT I wanted to know whether there is another way, using a listener (be it JPA, EclipseLink, or Spring Data), so that future developers of the application are not forced to insert data into the audit tables manually.
Basically I am looking for @PostCommit-type functionality. Please help me.
I believe you ultimately do want the callback to run within the boundary of your current transaction; you just want it to run after Hibernate has done its thing, just like Hibernate Envers does.
To do this, you basically need to register an event action queue callback like the following:
session.getActionQueue().registerProcess(
    new BeforeTransactionCompletionProcess() {
        @Override
        public void doBeforeTransactionCompletion(SessionImplementor session) {
            // do whatever you want with the session here
        }
    }
);
If you ultimately must run your code outside the transaction, you could do something similar:
session.getActionQueue().registerProcess(
    new AfterTransactionCompletionProcess() {
        @Override
        public void doAfterTransactionCompletion(boolean success, SharedSessionContractImplementor session) {
            // do whatever you want with the session here
        }
    }
);
That should get you going either way.

Manually managed transaction

I have code where transaction control is done manually, but that code calls methods on other EJBs in the same application where the transaction is not controlled manually; for example, there are methods using a Hibernate Session that is managed by the container.
How do I prevent my manually managed transaction from committing when a query method such as Session.createCriteria runs?
When this occurs, my transaction commits to the database before my process has actually completed.
Example
@EJB
private Business business;

private void exe() throws Exception {
    this.beginTransaction();
    business.processar(); // example
    this.commit();
}

@Stateless
public class Business {
    @EJB
    private DAO dao;

    private void processar() throws Exception {
        // running process 1
        this.save();
        // running process 2
        this.update();
        // The save and update have not been committed yet. So far this is correct.
        Teste teste = dao.buscarTeste(1L);
        // Here, after the query runs, my transaction commits to the database
        // before the whole process has completed.
    }
}

@Stateless
public class DAO {
    public Teste buscarTeste(Long codigo) {
        Criteria cri = getSession().createCriteria(Teste.class);
        cri.add(Restrictions.eq("codigo", codigo));
        return (Teste) cri.uniqueResult();
    }
}
I am not sure I actually got your point.
But assuming you are not getting any error: when you invoke the Business.processar() method, it inherits the transaction, which remains 'pending' until the exe client commits.
So I would investigate what your getSession() does in the middle; I am quite sure it starts a brand-new transaction, which is going to retrieve uncommitted data.
By the way, is there a reason for using plain Hibernate instead of JPA with Hibernate as the concrete implementation?

Clear the in memory database after every testcase

I am using HSQLDB for testing some of the data access layer in Java. I have around 100 test cases. I create an in-memory database and insert some values into a table so that my test cases can load them, but the problem is that for every test case I need to clear the in-memory database (only the values, not the tables).
Is this possible? One option is to manually delete the rows from the tables, but is there something else I can use?
Thanks
If you use DbUnit in unit-tests, you can specify that DbUnit should perform a clean-and-insert operation before every test to ensure that the contents of the database are in a valid state before every test. This can be done in a manner similar to the one below:
@Before
public void setUp() throws Exception
{
    logger.info("Performing the setup of test {}", testName.getMethodName());
    IDatabaseConnection connection = null;
    try
    {
        connection = getConnection();
        IDataSet dataSet = getDataSet();
        // The following line cleans up all DbUnit-recognized tables and inserts test data before every test.
        DatabaseOperation.CLEAN_INSERT.execute(connection, dataSet);
    }
    finally
    {
        // Close the connection, as the persistence layer gets its connection from elsewhere
        if (connection != null)
        {
            connection.close();
        }
    }
}
Note that it is always recommended to perform any setup activities in a @Before setup method, rather than in an @After teardown method. The latter indicates that you are creating new database objects in a method being tested, which IMHO does not exactly lend itself to testable behavior. Besides, if you are cleaning up after a test to ensure that a second test runs correctly, then any such cleanup is actually part of the setup of the second test, and not a teardown of the first.
The alternative to using DbUnit is to start a new transaction in your @Before setup method and to roll it back in the @After teardown method. This would depend on how your data access layer is written.
If your data access layer accepts Connection objects, then your setup routine should create them and turn off auto-commit. There is also the assumption that your data access layer will not invoke Connection.commit. Assuming the above, you can roll back the transaction using Connection.rollback() in your teardown method.
With respect to transaction control, the below snippet demonstrates how one would do it using JPA for instance:
@Before
public void setUp() throws Exception
{
    logger.info("Performing the setup of test {}", testName.getMethodName());
    em = emf.createEntityManager();
    // Start the transaction before every test
    em.getTransaction().begin();
}

@After
public void tearDown() throws Exception
{
    logger.info("Performing the teardown of test {}", testName.getMethodName());
    if (em != null)
    {
        // Roll back the transaction after every test
        em.getTransaction().rollback();
        em.close();
    }
}
Similar approaches would have to be undertaken for other ORM frameworks or even your custom persistence layer, if you have written one.
Could you use HSQLDB transactions?
Before every test, start a new transaction:
START TRANSACTION;
After every test, roll it back:
ROLLBACK;
This would also allow you to have some permanent data.
Depending on your test framework, it is possible to execute a delete call after each test. In JUnit the annotation is @After, and a method with this annotation will be run after each @Test method.
You can use a TRUNCATE query to clear the in-memory tables; this link may also be helpful to you:
http://wiki.apache.org/db-derby/InMemoryBackEndPrimer

How do I insert a lot of entities in a Play! Job?

In my application I have to simulate various situations for analysis, and thus insert a (very) large number of rows into a database. (We're talking about a very large amount of data... several billion rows.)
Model
@Entity
public class Case extends Model {
    public String url;

    public Case(String url) {
        this.url = url;
    }
}
Job
public class Simulator extends Job {
    public void doJob() {
        for (long i = 0; i < TOTAL; i++) { // TOTAL: the (very large) number of rows
            // some stuff
            new Case(someString).save();
        }
    }
}
After half an hour, there is still nothing in the database, but debug traces show Play inserting some rows. I suspect some kind of cache is involved.
I've tried about everything:
Model.em().flush();
Changes nothing.
Model.em().getTransaction().commit();
throws TransactionRequiredException: no transaction is in progress
Model.em().setFlushMode(FlushModeType.COMMIT);
Model.em().setFlushMode(FlushModeType.AUTO);
Changes nothing.
I've also tried @NoTransaction annotations everywhere:
Class & functions in Controller
Class Case
Overriding save method in Model
Class & functions of my Job
Getting quite desperate. Every kind of advice is welcome.
EDIT: After a little research, the first row has appeared in the database. Its ID is about 550,000. That means about half a million rows are sitting somewhere between my application and the database.
Try
em.getTransaction().begin();
em.persist(model);
em.getTransaction().commit();
You can't commit a transaction before you begin it.
As per the documentation, the job should have its own transaction enabled, as Play requests do, so that's not the issue. Try doing this:
for (long i = 0; i < TOTAL; i++) { // TOTAL: the (very large) number of rows
    // some stuff
    Case tmp = new Case(someString);
    tmp = JPA.em().merge(tmp);
    tmp.save();
}
The idea is that you add the newly created object to the EntityManager explicitly before saving, making sure the object is part of the "dirty objects" that will be persisted.
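Separately from reattaching the objects, the usual way to keep several billion inserts from piling up in one huge transaction is to flush in chunks. The sketch below only illustrates the chunk-counting logic; flushAndClear() is a hypothetical stand-in for where em.flush(); em.clear(); (or a per-chunk commit) would go in real JPA code, and the numbers are made up.

```java
import java.util.ArrayList;
import java.util.List;

public class BatchInsertDemo {
    static final int BATCH_SIZE = 1_000;
    static int flushes = 0;

    // Hypothetical stand-in for em.flush(); em.clear(); or a per-chunk commit.
    static void flushAndClear(List<String> pending) {
        flushes++;
        pending.clear(); // drop references so the "persistence context" stays small
    }

    public static void main(String[] args) {
        List<String> pending = new ArrayList<>(); // stands in for the persistence context
        int total = 10_500; // imagine billions here
        for (int i = 0; i < total; i++) {
            pending.add("case-" + i); // stands in for new Case(someString).save();
            if ((i + 1) % BATCH_SIZE == 0) {
                flushAndClear(pending); // push a full chunk to the database
            }
        }
        if (!pending.isEmpty()) {
            flushAndClear(pending); // final partial chunk
        }
        System.out.println("flushes=" + flushes);
    }
}
```

With real JPA, clearing the persistence context after each flush also keeps memory bounded, which matters as much as commit frequency at this scale.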
You need to instruct Play! when it should run your job by annotating your class with one of these annotations: @OnApplicationStart, @Every or @On.
Please check Play! documentation on jobs
