Spring Integration JDBC Batch Insert - java

I have a requirement to insert a list of POJOs into the database. I have a stored procedure which performs one insert at a time. In the current implementation, I have a splitter that splits the list into individual POJOs and passes each payload to a stored procedure outbound gateway that calls my stored procedure.
In a real scenario the list size could be as big as 500K. Is there a better implementation? Is there a way to perform a batch insert in a Spring Integration flow?
Thanks
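For context, the splitter-plus-gateway arrangement described above might look roughly like the following sketch (channel names, data source bean and procedure name are assumptions, and the usual int/int-jdbc namespace declarations are omitted):

<!-- Hypothetical sketch of the current flow: a splitter feeding a
     stored-procedure outbound gateway one POJO at a time. -->
<splitter input-channel="pojoListChannel" output-channel="singlePojoChannel"/>

<int-jdbc:stored-proc-outbound-gateway request-channel="singlePojoChannel"
                                       reply-channel="resultChannel"
                                       data-source="dataSource"
                                       stored-procedure-name="insert_pojo"/>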

I wrote a custom outbound channel adapter which uses the spring-jdbc batch update capability:
import java.util.ArrayList;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.jdbc.core.namedparam.BeanPropertySqlParameterSource;
import org.springframework.jdbc.core.namedparam.NamedParameterJdbcTemplate;
import org.springframework.jdbc.core.namedparam.SqlParameterSource;
import org.springframework.messaging.Message;

public class ArrayListSqlBatchOBCA {

    private static final Logger log = LoggerFactory.getLogger(ArrayListSqlBatchOBCA.class);

    private NamedParameterJdbcTemplate template;
    private String sql;

    // Setters used by the XML bean definition below
    public void setTemplate(NamedParameterJdbcTemplate template) { this.template = template; }
    public void setSql(String sql) { this.sql = sql; }

    public void process(Message<? extends ArrayList<?>> message) throws Exception {
        try {
            ArrayList<?> list = message.getPayload();
            // Map each POJO in the payload to a named-parameter source
            SqlParameterSource[] batchArgs = new SqlParameterSource[list.size()];
            for (int i = 0; i < list.size(); i++) {
                batchArgs[i] = new BeanPropertySqlParameterSource(list.get(i));
            }
            // Execute the whole collection as a single JDBC batch
            template.batchUpdate(sql, batchArgs);
        }
        catch (Exception e) {
            log.error("Exception while processing message", e);
            throw e;
        }
    }
}
This is how I use it (XML config):
<channel id="cc"/>

<outbound-channel-adapter channel="cc" method="process">
    <beans:bean class="mypackage.ArrayListSqlBatchOBCA">
        <beans:property name="template" ref="jdbcTemplate"/>
        <beans:property name="sql"
                        value="INSERT INTO test (field1,field2,field3)
                               VALUES (:f1,:f2,:f3)"/>
    </beans:bean>
</outbound-channel-adapter>
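The whole collection is then sent as a single message payload to the cc channel (no splitter needed). A sketch of the sending side, where MyPojo is a placeholder for the actual POJO type:

import java.util.ArrayList;

import org.springframework.context.ApplicationContext;
import org.springframework.messaging.MessageChannel;
import org.springframework.messaging.support.MessageBuilder;

public class BatchInsertClient {

    // Sketch: send the entire list as one message so the adapter receives
    // the whole collection and can batch it in a single call.
    public void send(ApplicationContext context, ArrayList<MyPojo> pojos) {
        MessageChannel cc = context.getBean("cc", MessageChannel.class);
        cc.send(MessageBuilder.withPayload(pojos).build());
    }
}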
Further, I added ?reWriteBatchedInserts=true to the JDBC URL of my datasource (I use the PostgreSQL JDBC driver).
From the driver documentation about the option:
reWriteBatchedInserts - Enable optimization to rewrite and collapse compatible INSERT statements that are batched. If enabled, pgjdbc rewrites batch of insert into ... values(?, ?) into insert into ... values(?, ?), (?, ?), ...
This way, with a small amount of custom code, I insert multiple rows at once from the input collection.
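As an illustration, the datasource wiring with that option might look like this (host, database and credentials are placeholders; DriverManagerDataSource is used only to keep the sketch short, a pooled datasource would normally be configured instead):

import org.springframework.jdbc.core.namedparam.NamedParameterJdbcTemplate;
import org.springframework.jdbc.datasource.DriverManagerDataSource;

public class DataSourceConfigSketch {

    // reWriteBatchedInserts=true lets the PostgreSQL driver collapse the
    // batched single-row INSERTs into multi-row INSERT statements.
    public NamedParameterJdbcTemplate jdbcTemplate() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource(
                "jdbc:postgresql://localhost:5432/testdb?reWriteBatchedInserts=true",
                "dbuser", "dbpassword");
        return new NamedParameterJdbcTemplate(dataSource);
    }
}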

Related

WildFly/JBoss/persistence-unit/EntityManager - How to create a new connection every time a user calls a GET API endpoint

I'm trying to modify an existing Java app (WildFly, JBoss, Oracle) which currently works fine using a persistence-unit and an EntityManager to connect to the Oracle database (via standalone.xml and persistence.xml). However, I need to create a new database connection every time a user calls the GET API endpoint, using the credentials from the HttpHeaders. Currently I'm creating a new EntityManager object whose session is committed, rolled back and closed. Unfortunately the response time for every call becomes higher and higher, there is a warning about "PersistenceUnitUser" already being registered, and memory usage grows constantly. So that is a bad solution.
Is there any proper way to do this that works without any harm?
P.S.
Currently the app uses standalone.xml and persistence.xml, and that works fine. I'm calling the Java API endpoint with an entity manager connected as an admin user/pass, but I need to create a new connection using the user/pass from the HttpHeaders and run one SQL statement to see the proper results, because Oracle uses reserved words such as 'user'. For instance: select * from table where create_usr = user. When done, the main EntityManager will use data from it to continue some process.
Please see the code example below:
@GET
@Path("/todo-list-enriched")
@Produces(MediaType.APPLICATION_JSON)
public Response getToDoListEnriched(@Context HttpHeaders httpHeaders, @QueryParam("skip") int elementNumber,
        @QueryParam("take") int pageSize, @QueryParam("orderby") String orderBy) {
    String userName = httpHeaders.getHeaderString(X_USER_NAME);
    String password = httpHeaders.getHeaderString(X_PASSWORD);
    EntityManager entityManager = null;
    try {
        Map<String, String> persistenceMap = new HashMap<String, String>();
        persistenceMap.put("hibernate.dialect", "org.hibernate.dialect.Oracle8iDialect");
        persistenceMap.put("hibernate.connection.username", userName);
        persistenceMap.put("hibernate.connection.password", password);
        // A new EntityManagerFactory is built on every request -- this is the expensive part
        EntityManagerFactory emf = Persistence.createEntityManagerFactory("PersistenceUnitUser", persistenceMap);
        entityManager = emf.createEntityManager();
        if (!entityManager.getTransaction().isActive()) {
            entityManager.getTransaction().begin();
        }
        // do some work: select, update, select
        // and after that
        if (entityManager.getTransaction().isActive()) {
            entityManager.getTransaction().commit();
        }
    }
    catch (Exception ex) {
        if (entityManager != null && entityManager.getTransaction().isActive()) {
            entityManager.getTransaction().rollback();
        }
    }
    finally {
        if (entityManager != null && entityManager.isOpen()) {
            entityManager.close();
        }
    }
    // response building elided in the original question
    return Response.ok().build();
}
Best Regards
Marcin
You should define a connection pool and a datasource in standalone.xml (cf. https://docs.wildfly.org/26.1/Admin_Guide.html#DataSource) and then use it in your persistence.xml and inject the EntityManager into your REST service class (cf. https://docs.wildfly.org/26.1/Developer_Guide.html#entity-manager).
You may look at this example application: https://github.com/wildfly/quickstart/tree/main/todo-backend
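A minimal sketch of what that looks like on the resource side (class name and the ToDoEntry entity are assumptions, the persistence unit name is taken from the question, and the datasource credentials live in standalone.xml rather than in the request; newer WildFly versions use jakarta.* instead of javax.* packages):

import java.util.List;

import javax.ejb.Stateless;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

@Stateless
@Path("/todo-list-enriched")
public class ToDoResource {

    // Container-managed EntityManager backed by the pooled datasource
    // referenced in persistence.xml; no factory is created per request.
    @PersistenceContext(unitName = "PersistenceUnitUser")
    private EntityManager entityManager;

    @GET
    @Produces(MediaType.APPLICATION_JSON)
    public Response getToDoListEnriched() {
        // ToDoEntry is a placeholder entity for illustration
        List<ToDoEntry> result = entityManager
                .createQuery("select t from ToDoEntry t", ToDoEntry.class)
                .getResultList();
        return Response.ok(result).build();
    }
}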

Not able to insert data in MSSQL using r2dbc (non spring)

I am new to R2DBC. I am trying to connect to a MSSQL database using R2DBC (a non-Spring project) with Reactor. It is not establishing the connection, and the data is not getting inserted into the table. I have even tried giving a wrong table name, but no exception is raised for it.
public Flux<MssqlResult> writetoDB() {
    return createDBConnection().create()
            .flatMapMany(c -> c.createStatement(
                    "INSERT INTO person (id, first_name, last_name) VALUES(@id, @firstname, @lastname)")
                    .bind("id", 1)
                    .bind("firstname", "Walter")
                    .bind("lastname", "White")
                    .execute()
                    .doFinally((st) -> c.close()))
            .log();
}

private MssqlConnectionFactory createDBConnection() {
    MssqlConnectionConfiguration configuration = MssqlConnectionConfiguration.builder()
            .host("sample-host").username("testuser")
            .password("testuser1").database("testDB").preferCursoredExecution(true).build();
    MssqlConnectionFactory factory = new MssqlConnectionFactory(configuration);
    return factory;
}
Kindly suggest what I am missing here.
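One thing worth checking: a Reactor pipeline is lazy and executes nothing until it is subscribed, so if writetoDB() is called without a subscriber nothing connects and nothing is inserted. A sketch of a standalone check (the enclosing class name InsertExample is an assumption):

// Sketch: invoke the pipeline from a plain main method in the same class.
// Blocking here is only for a standalone check, not for production reactive code.
public static void main(String[] args) {
    new InsertExample().writetoDB()
            .flatMap(result -> result.getRowsUpdated())      // updated row counts
            .doOnNext(count -> System.out.println("rows updated: " + count))
            .doOnError(Throwable::printStackTrace)           // surfaces errors such as a wrong table name
            .blockLast();
}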

In memory SQLITE in a Spring Boot application not working

I am trying to set up SQLite as an in-memory database in my Spring Boot application, but when I try to query the database it gives me a "no such table" error.
Can someone please tell me what I am doing wrong? I need SQLite as an in-memory database only, and we only use JDBC in our project.
Here is my code:
application.properties
spring.datasource.url=jdbc:sqlite:memory
spring.datasource.username=
spring.datasource.password=
spring.datasource.platform=sqlite
spring.datasource.driver-class-name=org.sqlite.JDBC
MyRepo.java
@Repository
public class MyRepo {

    @Autowired
    private NamedParameterJdbcTemplate namedJdbc;

    public String getUserName() throws Exception {
        String userName = null;
        String sql = "SELECT username FROM emp WHERE username=:name";
        MapSqlParameterSource paramSource = new MapSqlParameterSource();
        paramSource.addValue("name", "tuser");
        userName = this.namedJdbc.query(sql, paramSource, (rs) -> {
            String name = null;
            while (rs.next()) {
                name = rs.getString("username").trim();
                return name;
            }
            return null;
        });
        return userName;
    }
}
UserDaoTest.java
@SpringBootTest
public class UserDaoTest {

    @Autowired
    private MyRepo rep;

    @Test
    public void testFindByname() throws Exception {
        rep.getUserName();
    }
}
I also have schema.sql and data.sql files under src/main/resources
schema.sql
DROP TABLE IF EXISTS emp;CREATE TABLE IF NOT EXISTS emp(username VARCHAR(20), empId BIGINT, PRIMARY KEY(empId) )
data.sql
INSERT INTO emp(username,empId) VALUES ('tuser',1001);
The exception that I am getting:
PreparedStatementCallback; uncategorized SQLException for SQL [SELECT username FROM Chats WHERE username=?]; SQL state [null]; error code [1]; [SQLITE_ERROR] SQL error or missing database (no such table: Chats)
Well, I am shooting in the dark, but it looks like you need to add the schema for the 'Chats' table to your schema.sql as well.
https://sqlite.org/inmemorydb.html
The database ceases to exist as soon as the database connection is closed. Every :memory: database is distinct from every other. So, opening two database connections each with the filename ":memory:" will create two independent in-memory databases.
Your issue might be that Spring Boot opens multiple connections due to its connection pool configuration. If you're using the Hikari connection pool (the default in newer Spring Boot versions), try adding these properties:
spring.datasource.hikari.maximum-pool-size=1
spring.datasource.hikari.max-lifetime=0
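If limiting the pool to a single connection is not enough, another option (an assumption based on SQLite's in-memory documentation and the sqlite-jdbc driver's URL handling, so verify against your driver version) is a named in-memory database shared by all connections in the same process:

# Sketch: a named in-memory database visible to every pooled connection
spring.datasource.url=jdbc:sqlite:file:testdb?mode=memory&cache=shared
spring.datasource.hikari.max-lifetime=0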

MySQL stored procedure per connection or per transaction for thread safety

First let me describe the problem I am facing.
We are developing a web service using Jersey with MySQL at the back end. It is simple: a user can view or edit data via a RESTful API, and the service loads/saves the data from/to the MySQL database.
I created several stored procedures on the MySQL server, like SelectAnswer and UpdateAnswer. In the program we use standard JDBC and the MySQL connector to connect to the database.
At first I prepared the stored procedure calls per connection: each time the program opens a new connection to the database, it prepares a bunch of CallableStatements for the stored procedures.
public enum JDBCDataSource {
    INSTANCE;

    private BasicDataSource dataSource = new BasicDataSource();
    private Connection conn;
    private CallableStatement answerUpsert;

    private JDBCDataSource() {
        initConnection();
        try {
            conn = dataSource.getConnection();
            answerUpsert = prepareSP(answerUpsert, "{call upsert_answer(?, ?, ?)}");
        } catch (SQLException e) {
            e.printStackTrace();
        }
    }
And the program reuses that prepared call each time it is invoked:
private void executeUpsert(String app, String id, String content) throws SQLException {
    CallableStatement callableStatement = JDBCDataSource.INSTANCE.getUpsert();
    try {
        callableStatement.setString(1, app);
        callableStatement.setInt(2, Integer.valueOf(id));
        callableStatement.setString(3, content);
        callableStatement.execute();
    } catch (NumberFormatException e) {
        e.printStackTrace();
    } finally {
        callableStatement.clearParameters();
    }
}
So on each call the function sets its own parameters, executes the stored procedure, and finally clears the parameters.
But this is not thread safe: if the user posts two requests and one request is trying to set parameters while another is clearing them, it will cause an exception.
So for stored procedures in MySQL, should I prepare the call per transaction or per connection in order to keep it thread safe? Maybe my understanding of stored procedures and MySQL is not correct, or maybe it is a design problem. Please share your thoughts.
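For comparison, the usual thread-safe pattern with a pooled DataSource is to borrow a connection and prepare the call per invocation instead of sharing one CallableStatement across threads. A sketch (the DAO class name is an assumption; the procedure is the upsert_answer from the question):

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.SQLException;

import javax.sql.DataSource;

public class AnswerDao {

    private final DataSource dataSource;   // e.g. a shared BasicDataSource

    public AnswerDao(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    // Each call borrows its own connection and statement, so concurrent
    // requests never share mutable JDBC state.
    public void upsertAnswer(String app, int id, String content) throws SQLException {
        try (Connection conn = dataSource.getConnection();
             CallableStatement cs = conn.prepareCall("{call upsert_answer(?, ?, ?)}")) {
            cs.setString(1, app);
            cs.setInt(2, id);
            cs.setString(3, content);
            cs.execute();
        }
    }
}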

Concurrent access to a SQLite database

I'm developing an app with 3 parts:
- a JavaFX desktop app
- a Java server web app
- an Android app
I'm using Hibernate to map a SQLite database.
But when the desktop app is open and I try to insert a new object from the Android app through the server, it gives me an error: java.sql.SQLException: database is locked
My hibernate.cfg.xml file:
<property name="show_sql">true</property>
<property name="format_sql">true</property>
<property name="dialect">dialect.SQLiteDialect</property>
<property name="connection.driver_class">org.sqlite.JDBC</property>
<property name="connection.url">jdbc:sqlite:grainsa_provisional.sqlite</property>
<property name="connection.username"></property>
<property name="connection.password"></property>
And my "Objects Manager",the same way in the Server and in the Desktop by example:
private Session mSession;
private Transaction mTransaction;

private void initQuery() throws HibernateException {
    mSession = HibernateUtil.getSessionFactory().openSession();
    mTransaction = mSession.beginTransaction();
}

private void manejaExcepcion(HibernateException hibernateException) {
    mTransaction.rollback();
    throw new HibernateException("ha ocurrido un error con la Base de Datos!!!", hibernateException);
}

public Conductor selectConductorByID(Integer id) {
    Conductor conductor = new Conductor();
    try {
        initQuery();
        conductor = (Conductor) mSession.get(Conductor.class, id);
    } catch (HibernateException e) {
        manejaExcepcion(e);
        throw e;
    } finally {
        mSession.close();
    }
    return conductor;
}
If you need more information, please ask!
What am I doing wrong?
Thanks everyone, and sorry about my English!
Edit: I'm thinking of changing the access mode of my desktop JavaFX app to make the queries through the server, but that would take me a lot of time, and I don't think it is the best way to do it.
Edit 2:
Is this the right way to open the connection to the database, run the query and close it, i.e. lock/query/unlock?
private void initQuery() throws HibernateException {
    mSession = HibernateUtil.getSessionFactory().openSession();
    mTransaction = mSession.beginTransaction();
}

private void manejaExcepcion(HibernateException hibernateException) {
    mTransaction.rollback();
    throw new HibernateException("ha ocurrido un error con la Base de Datos!!!", hibernateException);
}

public Conductor selectConductorByID(Integer id) {
    Conductor conductor = new Conductor();
    try {
        initQuery();
        conductor = (Conductor) mSession.get(Conductor.class, id);
    } catch (HibernateException e) {
        manejaExcepcion(e);
        throw e;
    } finally {
        mSession.close();
    }
    return conductor;
}
Please help, and thanks again!
I'm getting a little bit desperate...
From item (5) of the SQLite FAQ:
But use caution: this locking mechanism might not work correctly
if the database file is kept on an NFS filesystem. This is because
fcntl() file locking is broken on many NFS implementations.
You should avoid putting SQLite database files on NFS if multiple
processes might try to access the file at the same time.
On Windows, Microsoft's documentation says that locking
may not work under FAT filesystems if you are not running the Share.exe daemon.
People who have a lot of experience with Windows tell me that file
locking of network files is very buggy and is not dependable.
If what they say is true, sharing an SQLite database between
two or more Windows machines might cause unexpected problems.
Maybe this is the cause of your problem? Are you working on Windows?
Multiple processes can have the same database open at the same time.
Multiple processes can be doing a SELECT at the same time.
But only one process can be making changes to the database
at any moment in time, however.
SQLite is problematic in multi-user scenarios, but it can still work fine if updates are short and fast.
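One mitigation worth trying (this assumes the sqlite-jdbc driver in use accepts pragma parameters in the JDBC URL, so verify with your driver version) is to give SQLite a busy timeout, so a briefly locked database is retried instead of failing immediately:

<!-- Sketch: wait up to 5 seconds (value in milliseconds) for a lock instead of
     failing with "database is locked" right away -->
<property name="connection.url">jdbc:sqlite:grainsa_provisional.sqlite?busy_timeout=5000</property>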
In your code it looks like you are not committing the transaction correctly.
Maybe that is what causes the database lock.
You should change the code; see the Hibernate documentation here:
private Transaction initQuery() throws HibernateException {
    mSession = HibernateUtil.getSessionFactory().openSession();
    mTransaction = mSession.beginTransaction();
    return mTransaction;
}

public Conductor selectConductorByID(Integer id) {
    Conductor conductor = new Conductor();
    Transaction tx = null;
    try {
        tx = initQuery();
        conductor = (Conductor) mSession.get(Conductor.class, id);
        // flush and commit before close
        mSession.flush();
        tx.commit();
    } catch (HibernateException e) {
        manejaExcepcion(e);
        throw e;
    } finally {
        mSession.close();
    }
    return conductor;
}
