SQLite 'Database Locked' weird error for specific queries - java

I'm working on a Java project where I have to make modifications to my SQLite database.
The connection works and everything runs pretty fine, except for this weird error.
s.executeUpdate("INSERT INTO STUDENTS VALUES('S0',11)");
...
//many statements... including queries
...
String c2="INSERT INTO STUDENTS VALUES ('S1', 2)";
s.executeUpdate(c2);
s.executeUpdate("DROP TABLE STUDENTS");
The statements s.executeUpdate("INSERT INTO STUDENTS VALUES('S0',11)"); and s.executeUpdate(c2); run perfectly and insert rows into the database. But when execution reaches the last statement, the DROP TABLE, I get the weird "database locked" error.
When I change that last statement to a different query, it also works fine; the error only appears at the very end. More precisely, everything written above it, including the queries and the first statement shown here, works fine.
Please help me find the bug.

I guess that the "s" variable is a Statement. Try closing the resources after you execute:
PreparedStatement updateStatement = connection.prepareStatement("INSERT INTO STUDENTS VALUES ('S1', 2)");
try {
    updateStatement.executeUpdate();
} catch (SQLException e) {
    e.printStackTrace();
} finally {
    updateStatement.close();
}
Do this after every call to the database.
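A minimal sketch of the same idea with try-with-resources (assuming java.sql imports and that "connection" is the open Connection behind "s"): each Statement, and any ResultSet it produced, is closed automatically, so nothing is still holding the lock when the DROP TABLE runs.
// A sketch under the assumptions above, not the original poster's code.
try (Statement stmt = connection.createStatement()) {
    stmt.executeUpdate("INSERT INTO STUDENTS VALUES ('S1', 2)");
}
// ... other statements and queries ...
try (Statement stmt = connection.createStatement()) {
    stmt.executeUpdate("DROP TABLE STUDENTS");
}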

Related

Unable to run several SQL update statements in Java

I am still quite new to the world of Java. I am working on my second application, which is a program that mass-updates a time field in my company's SQL database. I am able to run queries through Java and store each query line in a ResultSet just fine. The thing is that each line of the result set is an update statement, and I want to then run those lines. However, over and over I keep getting the "SQL command not properly ended" error message, when I know full well these statements are formatted correctly and run just fine in TOAD for Oracle. Can anyone help me understand what's going on here? I have also tried batching and continue to get the same error.
This is an example of one of the output lines of my query with table and field names changed.
Update sometable.somefield set COMPLETED_TS ='31-OCT-17 06.00.00.000000000 AM' Where eqact_id ='2559340';
Below you can see the end of my SQL string and my runScript2() method.
"\r\n" +
"\r\n" +
"where \"Center\" = S.CODE and S.TIMEZONE_ID = T.ID"; //This String is named SQL1
public void runScript2() {
    try {
        PreparedStatement statement0 = Connection1.conn.prepareStatement(SQL1);
        ResultSet result0 = statement0.executeQuery();
        Connection1.conn.setAutoCommit(false);
        while (result0.next()) {
            PreparedStatement statementq1 = Connection1.conn.prepareStatement(result0.getString(1));
            statementq1.executeUpdate();
        }
        Connection1.conn.commit();
    } catch (SQLException e1) {
        e1.printStackTrace();
    }
}
Well, I am angry and happy at the same time: I figured out that the issue was that each of my result0.getString(1) statements had a semicolon at the end, and the Oracle JDBC driver didn't like this. They run just fine without it.
Live and learn, I guess.
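A minimal sketch of that fix, reusing the names from runScript2() above: strip any trailing semicolon before preparing each statement.
// Sketch under the assumptions above: result0 returns the generated UPDATE statements.
while (result0.next()) {
    String update = result0.getString(1).trim();
    // The Oracle JDBC driver rejects a trailing ";" ("SQL command not properly ended").
    if (update.endsWith(";")) {
        update = update.substring(0, update.length() - 1);
    }
    try (PreparedStatement stmt = Connection1.conn.prepareStatement(update)) {
        stmt.executeUpdate();
    }
}
Connection1.conn.commit();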

How to change values of Microsoft Access database from java?

I am trying to make an app that changes certain values of an MS-Access database. I am not trying to add new lines or anything. My problem is that I get a net.ucanaccess.jdbc.UcanaccessSQLException: UCAExc:::5.0.0-SNAPSHOT attempt to assign to non-updatable column error. The current code I'm using is
try {
    sql = "SELECT * FROM MtnRoads";
    Connection connection = DriverManager.getConnection("jdbc:ucanaccess://C://Users//anyGenericProgrammer//Documents//Database1.accdb");
    Statement statement = connection.createStatement();
    ResultSet result = statement.executeQuery(sql);
    result.updateString(aNumber, aString);
} catch (Exception e) {
    errCode.setText(e.toString());
    System.out.println(e);
}
I have looked at this StackOverflow question to figure out how to even update the rows in the first place, but the example used there is extremely confusing. Is there any way to make this work without throwing errors? (I am using the javax.swing.JFrame library; errCode is a JLabel.)
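For what it's worth, a hedged sketch of one common approach, not taken from the original thread: run a parameterized UPDATE instead of editing the ResultSet in place. The table and column names (MtnRoads, RoadName, ID) and the use of aNumber as a row id are assumptions for illustration only.
// Hypothetical sketch under the assumptions above, not the original poster's solution.
String update = "UPDATE MtnRoads SET RoadName = ? WHERE ID = ?";
try (Connection connection = DriverManager.getConnection(
            "jdbc:ucanaccess://C://Users//anyGenericProgrammer//Documents//Database1.accdb");
     PreparedStatement stmt = connection.prepareStatement(update)) {
    stmt.setString(1, aString);  // the new value
    stmt.setInt(2, aNumber);     // which row to change (assumed to be an id)
    stmt.executeUpdate();
} catch (SQLException e) {
    errCode.setText(e.toString());
}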

SQL delete query in java app takes too long

I'm starting out with SQL and trying to use it from a Java app. I have a table ZAMESTNANEC containing 6 rows.
When I issue the command delete from ZAMESTNANEC where ID = 7; directly in SQL, it deletes in no time, a few milliseconds. But when I run it from my Java app, the app freezes while processing. I waited for 4 minutes and nothing happened (and while it is busy I can't do anything else). Oh, and the row wasn't deleted.
I read this topic about deleting but it didn't help me much. In fact it didn't help me at all.
oracle delete query taking too much time
I tried to debug it, but it freezes on this command. I don't understand why it works fine in SQL but not in the Java app. Other commands like SELECT work fine.
JDBC here - http://pastebin.com/BRh06yc8
Code from button here
private void jButtonOdeberZamActionPerformed(java.awt.event.ActionEvent evt) {
    try {
        OracleConnector.setUpConnection("xxxxxxxx", 1521, "ee11",
                "NAME", "PASSWORD");
        conn = OracleConnector.getConnection();
        stmt = conn.createStatement();
        stmt.executeQuery("delete from ZAMESTNANEC where ID = 7");
    } catch (SQLException ex) {
        System.out.println(ex);
    }
}
executeQuery should be used for queries that are expected to return results. Try executeUpdate instead and see if that helps. It could be that your app is waiting to receive results which never come back. By Tom H
Thank you Tom.
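A minimal sketch of Tom's suggestion, reusing conn from the question: use executeUpdate for DML, here via a PreparedStatement with the ID bound as a parameter.
// Sketch under the assumptions above.
String delete = "delete from ZAMESTNANEC where ID = ?";
try (PreparedStatement pstmt = conn.prepareStatement(delete)) {
    pstmt.setInt(1, 7);
    int deleted = pstmt.executeUpdate();   // returns the number of rows removed
    System.out.println("Deleted rows: " + deleted);
} catch (SQLException ex) {
    System.out.println(ex);
}
If the statement still hangs, it is also worth checking whether another session, such as the SQL client where the manual delete was tested, holds an uncommitted transaction that locks the row.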

infinite loop "hangs" after some iterations in java code during mysql query

I have a long piece of Java code which uses Selenium WebDriver and Firefox to test my website. Pardon me if I can't reproduce it here. It has an infinite while loop to keep doing its job repeatedly; that's what it's supposed to do. Also, I don't use multithreading.
Sometimes it gets stuck. I use a Windows system and the code runs in a command prompt. When it gets stuck, no errors or exceptions are thrown; it simply "hangs" (only the window in which the code runs hangs). Then I have to use CTRL + C. Sometimes it resumes working after that, other times it gets terminated and I restart it. It works fine but after some loops it "hangs" again. I've also noticed that it usually happens during the execution of one of the methods querying the MySQL database.
The code runs an infinite loop. Each iteration queries the MySQL database, fetches one value from a particular table whose 'status' field is not 'done', and proceeds with testing using that value. At the end of the loop, the table is updated (the 'status' column is set to 'done' for that value). When there are no values left with 'status' not equal to 'done' in that table, it should ideally display "NO NEW VALUE". However, after all the values have been used, it simply picks up the last used value again (even though its status was set to 'done' at the end of the previous loop) and goes ahead. I then have to terminate the execution and run the code again. This time, when the infinite loop begins, it queries the database and correctly displays "NO NEW VALUE", queries again, displays the message again, and so on, which is what it should do.
I close the SQL connection using con.close().
It appears that after running the loop a few times, some resource is getting exhausted somewhere. But this is only a wild guess.
Can anyone suggest what the problem is and how I can fix it?
Below is a relevant piece of code :
try {
    String sql = "select something from somewhere where id = ? and is_deleted = '0';";
    System.out.println("\n" + sql + "\n? = " + pID);
    PreparedStatement selQuery1 = conn.prepareStatement(sql);
    selQuery1.setString(1, pID);
    ResultSet rs1 = selQuery1.executeQuery();
    // Extract data from result set
    while (rs1.next() && i1 < 6) {
        // do something
    } // end while loop
    String sql2 = "select something2 from somewhere2 where id = ? and is_deleted = '0';";
    System.out.println("\n" + sql2 + "\n? = " + pjID);
    PreparedStatement selQuery2 = conn.prepareStatement(sql2);
    selQuery2.setString(1, pjID);
    ResultSet rs2 = selQuery2.executeQuery();
    // Extract data from result set
    while (rs2.next() && i1 < 6) {
        // do something
    } // end while loop
    System.out.println("\nDone.");
    conn.close();
} catch (SQLException e) {
    flag = false;
}
Please note that no exceptions are thrown anywhere. The window in which the code is running just freezes (and only once in a while) after displaying both query statements.
I forgot to close the statements and the result sets. Just closing the connection should implicitly close them, but it doesn't always work.
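A minimal sketch of that fix, reusing conn, pID, and i1 from the snippet above: try-with-resources closes each PreparedStatement and ResultSet on every iteration instead of relying on conn.close().
// Sketch under the assumptions above.
String sql = "select something from somewhere where id = ? and is_deleted = '0'";
try (PreparedStatement selQuery1 = conn.prepareStatement(sql)) {
    selQuery1.setString(1, pID);
    try (ResultSet rs1 = selQuery1.executeQuery()) {
        while (rs1.next() && i1 < 6) {
            // do something
        }
    } // ResultSet closed here
} // PreparedStatement closed here, releasing its server-side resources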
I also faced the same problem recently, but in my case the issue was with indexes. I am pointing it out here so that it can be helpful to other folks.
In my case I am fetching the menu items from a MenuMaster table. After a successful login, I hit the database to fetch the menu items using the MySQL Connector driver; I need to fetch each parent menu with its child menus. The column in my WHERE clause was neither a primary key nor a unique key, so the query was taking a long time. Creating an index on that column made it work like a charm...
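A hypothetical one-liner of that advice; the index and column names (idx_menu_parent, parent_menu_id) are assumptions, so use whatever column your WHERE clause actually filters on.
// Hypothetical sketch: index the column the menu query filters on.
try (Statement stmt = conn.createStatement()) {
    stmt.executeUpdate("CREATE INDEX idx_menu_parent ON MenuMaster (parent_menu_id)");
}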

Sybase JConnect: ENABLE_BULK_LOAD usage

Can anyone out there provide an example of bulk inserts via JConnect (with ENABLE_BULK_LOAD) to Sybase ASE?
I've scoured the internet and found nothing.
I got in touch with one of the engineers at Sybase and they provided me a code sample. So, I get to answer my own question.
Basically, here is a rundown, as the code sample is pretty large... It assumes a lot of pre-initialized variables; otherwise it would be a few hundred lines, but anyone interested should get the idea. This can yield up to 22K insertions a second in a perfect world (as per Sybase, anyway).
SybDriver sybDriver = (SybDriver) Class.forName("com.sybase.jdbc3.jdbc.SybDriver").newInstance();
sybDriver.setVersion(com.sybase.jdbcx.SybDriver.VERSION_6);
DriverManager.registerDriver(sybDriver);

// DBProps (after including normal login/password etc.)
props.put("ENABLE_BULK_LOAD", "true");

// open connection here for sybDriver
dbConn.setAutoCommit(false);

String SQLString = "insert into batch_inserts (row_id, colname1, colname2)\n values (?,?,?) \n";

PreparedStatement pstmt;
try
{
    pstmt = dbConn.prepareStatement(SQLString);
}
catch (SQLException sqle)
{
    displaySQLEx("Couldn't prepare statement", sqle);
    return;
}

for (String[] val : valuesToInsert)
{
    pstmt.setString(1, val[0]); // row_id varchar(30)
    pstmt.setString(2, val[1]); // logical_server varchar(30)
    pstmt.setString(3, val[2]); // client_host varchar(30)
    try
    {
        pstmt.addBatch();
    }
    catch (SQLException sqle)
    {
        displaySQLEx("Failed to build batch", sqle);
        break;
    }
}

try {
    pstmt.executeBatch();
    dbConn.commit();
    pstmt.close();
} catch (SQLException sqle) {
    // handle
}

try {
    if (dbConn != null)
        dbConn.close();
} catch (Exception e) {
    // handle
}
After following most of your advice we didn't see any improvement over simply creating a massive string and sending that across in batches of ~100-1000 rows with a surrounding transaction. We got around:
*Big String Method [5000 rows in 500 batches]: 1716 ms = ~2914 rows per second (which is terrible!).
Our db is sitting on a virtual host with one CPU (i7 underneath) and the table schema is:
CREATE TABLE archive_account_transactions
(
    account_transaction_id INT,
    entered_by INT,
    account_id INT,
    transaction_type_id INT,
    DATE DATETIME,
    product_id INT,
    amount FLOAT,
    contract_id INT NULL,
    note CHAR(255) NULL
)
with four indexes on account_transaction_id (pk), account_id, DATE, contract_id.
Just thought I would post a few comments. First, we're connecting using:
jdbc:sybase:Tds:40.1.1.2:5000/ikp?EnableBatchWorkaround=true;ENABLE_BULK_LOAD=true
We also tried the .addBatch syntax described above, but it was marginally slower than just using a Java StringBuilder to build the batch SQL manually and pushing it across in one execute statement. Removing the column names in the insert statement gave us a surprisingly large performance boost; it seemed to be the only thing that actually affected performance. The ENABLE_BULK_LOAD parameter didn't seem to affect it at all, nor did EnableBatchWorkaround; we also tried DYNAMIC_PREPARE=false, which sounded promising but also didn't seem to do anything.
Any help getting these parameters actually functioning would be great! In other words, are there any tests we could run to verify that they are in effect? I'm still convinced that this performance isn't close to pushing the boundaries of Sybase, as MySQL out of the box does more like 16,000 rows per second using the same "big string method" with the same schema.
Cheers
Rod
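For reference, a hedged sketch of the "big string method" described above, reusing the names from Chris Kannon's sample (dbConn, valuesToInsert, batch_inserts); the 500-row batch size and the omission of value escaping are simplifying assumptions, not Rod's actual code.
// Hypothetical sketch of the "big string" approach: concatenate many INSERT statements
// and push each batch across in a single execute call inside one transaction.
dbConn.setAutoCommit(false);
StringBuilder batch = new StringBuilder();
int rowsInBatch = 0;
try (Statement stmt = dbConn.createStatement()) {
    for (String[] val : valuesToInsert) {
        batch.append("insert into batch_inserts values ('")
             .append(val[0]).append("','")
             .append(val[1]).append("','")
             .append(val[2]).append("')\n");
        if (++rowsInBatch == 500) {              // assumed batch size
            stmt.execute(batch.toString());      // one round trip for the whole batch
            batch.setLength(0);
            rowsInBatch = 0;
        }
    }
    if (rowsInBatch > 0) {
        stmt.execute(batch.toString());
    }
    dbConn.commit();
}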
In order to get the sample provided by Chris Kannon working, do not forget to disable auto commit mode first:
dbConn.setAutoCommit(false);
And place the following line before dbConn.commit():
pstmt.executeBatch();
Otherwise this technique will only slow down the insertion.
I don't know how to do this in Java, but you can bulk-load text files with the LOAD TABLE SQL statement. We did it with Sybase ASA over JConnect.
Support for Batch Updates
Batch updates allow a Statement object to submit multiple update commands
as one unit (batch) to an underlying database for processing together.
Note: To use batch updates, you must refresh the SQL scripts in the sp directory
under your jConnect installation directory.
See BatchUpdates.java in the sample (jConnect 4.x) and sample2 (jConnect
5.x) subdirectories for an example of using batch updates with Statement,
PreparedStatement, and CallableStatement.
jConnect also supports dynamic PreparedStatements in batch.
Reference:
http://download.sybase.com/pdfdocs/jcg0420e/prjdbc.pdf
http://manuals.sybase.com/onlinebooks/group-jcarc/jcg0520e/prjdbc/#ebt-link;hf=0;pt=7694?target=%25N%14_4440_START_RESTART_N%25#X
Other Batch Update Resources
http://java.sun.com/j2se/1.3/docs/guide/jdbc/spec2/jdbc2.1.frame6.html
http://www.jguru.com/faq/view.jsp?EID=5079
