batchUpdate freezes only for a large number of rows - Java

From my Spring Java code, I am trying to insert 13,500 rows into a table in an Oracle database with the step below.
namedParameterJdbcTemplate.batchUpdate(insertQuery, listName.toArray(new Map[list.size()]));
But the process freezes at this step; it does not throw any error.
I observed this only when inserting a larger number of records into the table.
The insert succeeds with the step above when I try to insert 12,000 or fewer rows.
But when I try to insert 13,500 rows, the process freezes and does not throw any error either.
I am able to insert even 40k rows at once into the same table directly from the database side.
The DB table itself is not causing any issue; I face this only with batchUpdate from Java, and only for more than 12,000 rows.
Could you please help me figure out what is causing this issue?
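One way to narrow this down is to split the 13,500 rows into smaller chunks and log between chunks, so you can see whether a particular batch size is the tipping point. Below is a minimal sketch, assuming the rows are already held as a List<Map<String, Object>> of named parameters; the chunk size of 1,000 is an arbitrary starting value to experiment with, not a known limit.

import java.util.List;
import java.util.Map;

import org.springframework.jdbc.core.namedparam.NamedParameterJdbcTemplate;

public class BatchInsertSketch {

    private static final int CHUNK_SIZE = 1000; // arbitrary; adjust and observe where it hangs

    private final NamedParameterJdbcTemplate jdbcTemplate;

    public BatchInsertSketch(NamedParameterJdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    public void insertInChunks(String insertSql, List<Map<String, Object>> rows) {
        for (int from = 0; from < rows.size(); from += CHUNK_SIZE) {
            int to = Math.min(from + CHUNK_SIZE, rows.size());
            List<Map<String, Object>> chunk = rows.subList(from, to);
            // batchUpdate takes one parameter map per row
            jdbcTemplate.batchUpdate(insertSql, chunk.toArray(new Map[0]));
        }
    }
}

If the smaller chunks go through but a single 13,500-row batch still hangs, that points at the size of the batch (memory, undo/redo pressure, or a lock on the Oracle side) rather than at the data itself.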

Related

Hibernate select returns only 1 row from Sybase DB

I have observed an issue which occurs rarely, approximately once in 700K iterations.
I am saving 5 records in a Sybase table using Hibernate's save method.
I then try to fetch those records with Hibernate getWithSomeId(Serializable someId); the SELECT query formed here should return the 5 rows above, but occasionally it returns only 1 row.
The time difference between the write to the DB and the read is ~200 ms.
Does anyone have any idea why such an issue can occur? TIA

insert into select vs bulk insert

I have a table A with about 400k rows per day, and I want to write a Java program to back up old data to another table. There are two ways:
Use JDBC to fetch data from table A, for example 500 rows at a time, then concatenate SQL like insert into table B values(value1, value2...),(value1, value2...),... and execute it.
Use insert into table B select * from A where ..., about 2~3 million rows.
Somebody said the second way is slower, but I am not sure, so which way is better? It must not crash the database.
The first one is the realistic option, but you don't have to do it by concatenating SQL. JDBC already has support for sending inserts in batches. In a loop you keep adding inserts one row at a time, and when you reach your desired batch size you call executeBatch() to post them as one big bulk insert.
The value you'll have to play around with is how many rows to insert at a time, i.e. the batch size. That answer depends on your hardware, so just see what works. The larger the batch size, the more rows are held in memory at any one time. Doing this for really large data sets I have still crashed the program, BUT NOT the database; only when I found the right batch size for my system did everything work out. For me 5,000 was fine, and running it many, many times for over a million rows the database held up. But again, this depends on the database as well. You are on the correct path.
https://www.tutorialspoint.com/jdbc/jdbc-batch-processing.htm
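A minimal sketch of that loop, assuming hypothetical table names A and B with two placeholder columns (col1, col2); the batch size is just a value to tune as described above.

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class ArchiveCopy {

    /**
     * Copies rows from table A to table B, flushing every batchSize rows
     * with executeBatch() so no giant SQL string is ever built.
     */
    public static void copyInBatches(Connection conn, int batchSize) throws SQLException {
        String insertSql = "INSERT INTO B (col1, col2) VALUES (?, ?)";
        try (Statement select = conn.createStatement();
             ResultSet rs = select.executeQuery("SELECT col1, col2 FROM A");
             PreparedStatement insert = conn.prepareStatement(insertSql)) {

            int count = 0;
            while (rs.next()) {
                insert.setString(1, rs.getString(1));
                insert.setString(2, rs.getString(2));
                insert.addBatch();
                if (++count % batchSize == 0) {
                    insert.executeBatch();   // send the accumulated rows in one round trip
                }
            }
            insert.executeBatch();           // flush the final, possibly partial batch
        }
    }
}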

How to update data from one table to another using JPA efficiently

I have a Linux CentOS system and am using a PostgreSQL 9.4.11 database.
I have a messaging system in which messages get stored in the DB, and their receipts are inserted into the same table with the updated status (so for 1 message, 3 rows are added to the table). Around a million rows are inserted into it every day, due to which generating reports from that table takes a long time. I don't have control over that table, so I cannot make it store the updated status in the same row.
I tried table partitioning to improve this, but it did not help much.
To overcome this, I started storing the sent messages in two tables; when I receive the receipt I update the same row in the second table and generate reports from that. To do this I tried a trigger, but the trigger was enormously slow: updating 100k rows took approximately 12 hours.
The third option I tried was a scheduler in Java which queries 100 records from the large table and updates the records in the new table by their correct id.
Doing this one by one is not efficient at all, as it increases the server load and takes a long time to update the data.
I tried the following batch update method to update all receipts in one go:
I added all the ids to an array, set the required status and time, and called executeUpdate, but this didn't work: the arrids array contains 1000 records, yet the update count shows only 20 or 50 records updated (some seemingly random count).
queryStm = "Update reciept set smsc_sub=:dlr,time_sub=:time,serverTime=:servertime where id IN :arrids ";
Query query = em.createNativeQuery(queryStm);
query.setParameter("arrids ",arrids );
query.setParameter("dlr", dlr);
query.setParameter("time", time);
query.setParameter("servertime", new Timestamp(new Date().getTime()));
query.executeUpdate();
Is there an efficient way to update data in one table from another table, either in Java/JPA or in Postgres?
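One option for moving the status over in bulk is to let Postgres do the join itself in a single set-based UPDATE ... FROM statement instead of looping over ids in Java. A minimal sketch, with hypothetical source table and column names (message_log, receipt_id, dlr_status, dlr_time) since the real schema isn't shown:

// Hypothetical source table and column names; adjust to the real schema.
String sql =
      "UPDATE reciept r "
    + "   SET smsc_sub = m.dlr_status, time_sub = m.dlr_time, serverTime = now() "
    + "  FROM message_log m "
    + " WHERE r.id = m.receipt_id";
int updated = em.createNativeQuery(sql).executeUpdate();

A single statement like this usually scales far better than row-by-row updates from the application, because all the work stays inside the database.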

Create a table with a row count limit

I am creating a Java program that is connected to a MySQL database. I want to create a table in MySQL that is limited to 10 rows, i.e. at most 10 INSERT statements. Can someone help me with this problem?
Take a look at this question: How can I set a maximum number of rows in MySQL table?
You should create a stored procedure to control that limit!
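If the limit does not have to be enforced inside MySQL itself, a simpler (though not concurrency-safe) alternative is to check the row count from the Java side before inserting. A minimal sketch with a placeholder table name limited_table and column val:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class RowLimitSketch {

    private static final int MAX_ROWS = 10;

    // Placeholder table/column names. The count-then-insert check is not safe
    // under concurrent writers, which is why a trigger or stored procedure in
    // the database is the more robust option.
    public static boolean insertIfUnderLimit(Connection conn, String value) throws SQLException {
        try (PreparedStatement count = conn.prepareStatement("SELECT COUNT(*) FROM limited_table");
             ResultSet rs = count.executeQuery()) {
            rs.next();
            if (rs.getLong(1) >= MAX_ROWS) {
                return false; // cap reached, reject the insert
            }
        }
        try (PreparedStatement insert = conn.prepareStatement(
                "INSERT INTO limited_table (val) VALUES (?)")) {
            insert.setString(1, value);
            insert.executeUpdate();
            return true;
        }
    }
}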

JDBC PreparedStatement.executeBatch does not update values in database and no error

Hi, I am trying to update a Sybase DB using JDBC addBatch/executeBatch. I am creating the table at run time and inserting values in a JDBC batch. I am reading values from a comma-separated file in the following format:
1,ABC,DEF
2,GHI,KJL
The create query is CREATE TABLE School(schoolid int, schoolname varchar, schooltype varchar)
The insert query is INSERT INTO School(schoolid, schoolname, schooltype) VALUES (?,?,?)
What happens is that I get no error and the code executes successfully, but in the end there are no values in the database; the table is empty. I am also calling dbConn.commit(), but the table is still empty. Please guide. Thanks in advance.
Most likely your batch is smaller than batchSize, which means you never reach the line with pstmt.executeBatch();, since ++count % batchSize == 0 never evaluates to true.
You can easily fix that by adding a pstmt.executeBatch(); right after the while loop closes. That way, any remaining rows that did not fill up a complete batch are still executed at the end.
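A minimal sketch of the corrected loop, assuming a hypothetical CSV file path and the column types from the question; the important part is the second executeBatch() after the loop, followed by the commit:

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class SchoolLoader {

    public static void load(Connection dbConn, String csvPath) throws IOException, SQLException {
        String insertSql = "INSERT INTO School(schoolid, schoolname, schooltype) VALUES (?,?,?)";
        int batchSize = 1000; // arbitrary; tune for your data volume
        dbConn.setAutoCommit(false);
        try (BufferedReader reader = new BufferedReader(new FileReader(csvPath));
             PreparedStatement pstmt = dbConn.prepareStatement(insertSql)) {
            String line;
            int count = 0;
            while ((line = reader.readLine()) != null) {
                String[] parts = line.split(",");
                pstmt.setInt(1, Integer.parseInt(parts[0]));
                pstmt.setString(2, parts[1]);
                pstmt.setString(3, parts[2]);
                pstmt.addBatch();
                if (++count % batchSize == 0) {
                    pstmt.executeBatch();   // full batch reached: send it to the DB
                }
            }
            pstmt.executeBatch();           // flush the final, partial batch
            dbConn.commit();
        }
    }
}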
