I'm trying to save an object into SQLite; the object's class implements Serializable. But I always get this error:
android.database.sqlite.SQLiteException: unrecognized token:
"[Ljava.lang.Object;#277c81d9" (code 1): , while compiling: insert
into mClass(classData) values(?)[Ljava.lang.Object;#277c81d9
Here is my code:
public boolean add(ReturnInfo ri) {
    ByteArrayOutputStream bos = new ByteArrayOutputStream();
    try {
        db = dh.getWritableDatabase();
        ObjectOutputStream oos = new ObjectOutputStream(bos);
        oos.writeObject(ri);
        oos.flush();
        byte[] data = bos.toByteArray();
        bos.close();
        oos.close();
        db.execSQL("insert into mClass(classData) values(?)" + new Object[]{data});
        db.close();
        Log.e("db", "insert succeeded");
        return true;
    } catch (Exception e) {
        e.printStackTrace();
        Log.e("db", "insert failed");
        return false;
    }
}
The database itself was created successfully; I have no idea what went wrong.
The issue is that you are using prepared statements incorrectly.
db.execSQL("insert into mClass(classData) values(?)" + new Object[]{data});
Here you generate an invalid SQL statement, because you just concatenate an object onto the end of the string and end up with something like this:
"insert into mClass(classData) values(?)[Ljava.lang.Object;#277c81d9"
which is not an SQL statement.
To use a prepared statement you need to write the following:
SQLiteStatement stmt =
        db.compileStatement("insert into mClass(classData) values(?)");
stmt.bindBlob(1, data);
stmt.execute();
Note that bindBlob is used here: data is a byte[], so bindString would not accept it.
Also, look at this question to get a better understanding of prepared statements in Android.
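Putting the pieces together, a corrected add() method might look like the sketch below (assuming the same db and dh fields as in the question, and a Java 7+ / API 19+ build for try-with-resources):

public boolean add(ReturnInfo ri) {
    try (ByteArrayOutputStream bos = new ByteArrayOutputStream();
         ObjectOutputStream oos = new ObjectOutputStream(bos)) {
        oos.writeObject(ri);
        oos.flush();
        byte[] data = bos.toByteArray();

        db = dh.getWritableDatabase();
        // Bind the serialized bytes as a BLOB parameter instead of
        // concatenating them into the SQL string.
        SQLiteStatement stmt = db.compileStatement("insert into mClass(classData) values(?)");
        stmt.bindBlob(1, data);
        stmt.executeInsert();
        stmt.close();
        db.close();
        Log.e("db", "insert succeeded");
        return true;
    } catch (Exception e) {
        Log.e("db", "insert failed", e);
        return false;
    }
}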
Related
I have code to save binary data to PostgreSQL (I am using JDK 1.5), but I get an error. After printing the insert statement and trying it in my PostgreSQL console, I get a similar error there:
File file = new File("E:\\myimage.gif");
FileInputStream fis;
try {
    fis = new FileInputStream(file);
    PreparedStatement ps = conn.prepareStatement("INSERT INTO golf_fnb.coba VALUES (?)");
    ps.setBinaryStream(1, fis, (int) file.length());
    System.out.println("SQL: " + ps);
    ps.executeUpdate();
    ps.close();
    fis.close();
} catch (FileNotFoundException e2) {
    e2.printStackTrace();
} catch (SQLException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}
and this is the error in my eclipse console:
org.postgresql.util.PSQLException: ERROR: syntax error at or near "\"
at org.postgresql.util.PSQLException.parseServerError(PSQLException.java:139)
at org.postgresql.core.QueryExecutor.executeV3(QueryExecutor.java:152)
at org.postgresql.core.QueryExecutor.execute(QueryExecutor.java:100)
at org.postgresql.core.QueryExecutor.execute(QueryExecutor.java:43)
at org.postgresql.jdbc1.AbstractJdbc1Statement.execute(AbstractJdbc1Statement.java:517)
at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:50)
at org.postgresql.jdbc1.AbstractJdbc1Statement.executeUpdate(AbstractJdbc1Statement.java:273)
at finger.ConsoleUserInterfaceFactory$ConsoleUserInterface.verify4(ConsoleUserInterfaceFactory.java:605)
at finger.ConsoleUserInterfaceFactory$ConsoleUserInterface.run(ConsoleUserInterfaceFactory.java:117)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:651)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:676)
at java.lang.Thread.run(Thread.java:595)
I think the main error is the missing column list; the syntax is INSERT INTO table(col1, ...) VALUES (val1, ...).
It might also be that you (or someone with a similar problem) intended UPDATE golf_fnb SET coba = ? WHERE id = ?. For the INSERT:
try (FileInputStream fis = new FileInputStream(file);
     PreparedStatement ps = conn.prepareStatement("INSERT INTO golf_fnb(coba) VALUES (?)",
             Statement.RETURN_GENERATED_KEYS)) {
    ps.setBinaryStream(1, fis, (int) file.length());
    System.out.println("SQL: " + ps);
    int updateCount = ps.executeUpdate();
    if (updateCount == 1) {
        try (ResultSet rs = ps.getGeneratedKeys()) {
            if (rs.next()) {
                long id = rs.getLong(1);
                System.out.println("ID " + id);
                return;
            }
        }
    }
} catch (SQLException | IOException e) {
    e.printStackTrace();
}
I used try-with-resources to close everything automatically.
You might also want to find the inserted record, so, assuming a long primary key, I added getGeneratedKeys.
The escaped apostrophe \' might cause problems: manually one would have to write \047 (octal) instead. I hope this to-octal conversion by the driver disappears with the parameterized syntax above.
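For completeness, reading the bytes back out might look like this sketch (assuming the generated long key id from above, Java 7+ for try-with-resources and Files.write, and a hypothetical output path):

try (PreparedStatement ps = conn.prepareStatement("SELECT coba FROM golf_fnb WHERE id = ?")) {
    ps.setLong(1, id);
    try (ResultSet rs = ps.executeQuery()) {
        if (rs.next()) {
            byte[] image = rs.getBytes(1); // materializes the bytea column as a byte[]
            java.nio.file.Files.write(java.nio.file.Paths.get("E:/myimage-copy.gif"), image); // hypothetical path
        }
    }
} catch (SQLException | IOException e) {
    e.printStackTrace();
}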
I have got a Microsoft Access database in the resource folder of my Java application.
When the user clicks a button, this database is copied to the temp directory of the PC. Then I make a temporary VBS file in the same directory and execute it.
(This VBS file calls a VBA macro within the database, that deletes some records.)
However, as the macro attempts to delete the records an error is thrown stating that the database is read only.
Why does this happen?
Here is my code:
When the user clicks the button, some variables are set and then the following code is executed:
private void moveAccess() throws IOException {
    String dbName = "sys_cl_imp.accdb";
    String tempDbPath = System.getenv("TEMP").replace('\\', '/') + "/" + dbName;
    InputStream in = ConscriptioLegere.class.getResourceAsStream("res/" + dbName);
    File f = new File(tempDbPath);
    Files.copy(in, f.toPath(), StandardCopyOption.REPLACE_EXISTING);
    this.dbFilePath = tempDbPath;
    System.out.println("access in temp");
    f = null;
}
Then a connection is made to the database to update some data, with:
Connection con = DriverManager.getConnection("jdbc:ucanaccess://" + dbFilePath);
Statement sql = con.createStatement();
...
sql.close();
con.close();
Afterwards this is executed:
public boolean startImport() {
    File vbsFile = new File(vbsFilePath);
    PrintWriter pw;
    try {
        updateAccess();
    } catch (IOException e) {
        e.printStackTrace();
        return false;
    }
    try {
        pw = new PrintWriter(vbsFile);
        pw.println("Set accessApp = CreateObject(\"Access.Application\")");
        pw.println("accessApp.OpenCurrentDatabase (\"" + dbFilePath + "\")");
        pw.println("accessApp.Run \"sys_cl_imp.importData\", \"" + saveLoc + "\"");
        pw.println("accessApp.CloseCurrentDatabase");
        pw.close();
        Process p = Runtime.getRuntime().exec("cscript /nologo \"" + vbsFilePath + "\"");
        // ...
While the process is running, the error occurs.
I don't understand why the database is opened as read-only.
I tried setting f to null after copying the DB, but that did not help.
Based on this discussion:
The solution is to add ;singleconnection=true to the JDBC URL. UCanAccess will then close the file once the JDBC connection is closed.
Connection con = DriverManager.getConnection("jdbc:ucanaccess://" + dbFilePath +";singleconnection=true");
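A minimal sketch of the intended sequence, assuming the dbFilePath and vbsFilePath fields from the question (exception handling elided):

// Open with singleconnection=true so UCanAccess releases the .accdb file on close.
Connection con = DriverManager.getConnection(
        "jdbc:ucanaccess://" + dbFilePath + ";singleconnection=true");
Statement sql = con.createStatement();
// ... run the updates ...
sql.close();
con.close(); // the file lock is released here
// Only now is it safe to let Access open the database:
Process p = Runtime.getRuntime().exec("cscript /nologo \"" + vbsFilePath + "\"");
p.waitFor(); // waitFor() throws InterruptedException; handle or declare it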
Thank you for your solution, beckyang.
I managed to get it working with it, but there was a second mistake:
I deleted the contents of a table with Java, then closed the connection and ran the VBA procedure.
In the VBA I was attempting to delete the data again; but as there was none left, this failed.
After deleting that SQL from the VBA, the project worked :)
This question already has answers here:
Android SQLite database: slow insertion
(5 answers)
Closed 6 years ago.
I'm trying to parse values from a CSV file into a SQLite DB; however, the file is quite large (~2,500,000 lines). I ran my program for a few hours, printing its progress, but by my calculation the file would have taken about 100 hours to parse completely, so I stopped it.
I'm going to have to run this program as a background process at least once a week, on a new CSV file that is around 90% similar to the previous one. I have come up with a few solutions to improve my program. However, I don't know much about databases, so I have questions about each of my solutions.
Is there a more efficient way to read a CSV file than what I have already?
Is instantiating an ObjectOutputStream and storing the object as a BLOB significantly computationally expensive? I could add the values directly instead, but I use the BLOB later, so storing it now saves me from instantiating a new one multiple times.
Would connection pooling, or changing the way I use the Connection in some other way be more efficient?
I'm setting the URL column as UNIQUE so I can use INSERT OR IGNORE, but testing this on smaller datasets (~10,000 lines) indicates no performance gain compared to dropping the table and repopulating. Is there a faster way to add only unique values?
Are there any obvious mistakes I'm making? (Again, I know very little about databases)
public class Database {

    public void createResultsTable() {
        Statement stmt;
        String sql = "CREATE TABLE results("
                + "ID INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT, "
                + "TITLE TEXT NOT NULL, "
                + "URL TEXT NOT NULL UNIQUE, "
                ...
                ...
                + "SELLER TEXT NOT NULL, "
                + "BEAN BLOB);";
        try {
            stmt = c.createStatement();
            stmt.executeUpdate(sql);
        } catch (SQLException e) { e.printStackTrace(); }
    }

    public void addCSVToDatabase(Connection conn, String src) {
        BufferedReader reader = null;
        DBEntryBean b;
        String[] vals;
        try {
            reader = new BufferedReader(new InputStreamReader(new FileInputStream(src), "UTF-8"));
            for (String line; (line = reader.readLine()) != null;) {
                // Each line takes the form: "title|URL|...|...|SELLER".
                // Note: "|" is a regex metacharacter, so it must be escaped.
                vals = line.split("\\|");
                b = new DBEntryBean();
                b.setTitle(vals[0]);
                b.setURL(vals[1]);
                ...
                ...
                b.setSeller(vals[n]);
                insert(conn, b);
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public void insert(Connection conn, DBEntryBean b) {
        PreparedStatement pstmt = null;
        String sql = "INSERT OR IGNORE INTO results("
                + "TITLE, "
                + "URL, "
                ...
                ...
                + "SELLER, "
                + "BEAN"
                + ") VALUES (?, ?, ..., ?, ?);";
        try {
            pstmt = conn.prepareStatement(sql);
            pstmt.setString(Constants.DB_COL_TITLE, b.getTitle());
            pstmt.setString(Constants.DB_COL_URL, b.getURL());
            ...
            ...
            pstmt.setString(Constants.DB_COL_SELLER, b.getSeller());
            // ByteArrayOutputStream baos = new ByteArrayOutputStream();
            // ObjectOutputStream oos = new ObjectOutputStream(baos);
            // oos.writeObject(b);
            // byte[] bytes = baos.toByteArray();
            // pstmt.setBytes(Constants.DB_COL_BEAN, bytes);
            pstmt.executeUpdate();
        } catch (SQLException e) {
            e.printStackTrace();
        } finally {
            if (pstmt != null) {
                try { pstmt.close(); }
                catch (SQLException e) { e.printStackTrace(); }
            }
        }
    }
}
The biggest bottleneck in your code is that you are not batching the insert operations. You should really call pstmt.addBatch() instead of pstmt.executeUpdate() and execute the batch once you have accumulated something like 10K rows, as sketched below.
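A rough sketch of that batching loop, with the column list shortened to just the named columns from the question (serialize(...) is a hypothetical helper producing the BLOB bytes; wrapping each batch in a transaction via setAutoCommit(false) is usually the bigger win with SQLite):

conn.setAutoCommit(false); // one transaction per batch, instead of one per row
PreparedStatement pstmt = conn.prepareStatement(
        "INSERT OR IGNORE INTO results(TITLE, URL, SELLER, BEAN) VALUES (?, ?, ?, ?)");
int count = 0;
for (String line; (line = reader.readLine()) != null;) {
    String[] vals = line.split("\\|");
    pstmt.setString(1, vals[0]);
    pstmt.setString(2, vals[1]);
    pstmt.setString(3, vals[2]);
    pstmt.setBytes(4, serialize(vals)); // hypothetical helper for the BEAN blob
    pstmt.addBatch();
    if (++count % 10000 == 0) { // flush every 10K rows
        pstmt.executeBatch();
        conn.commit();
    }
}
pstmt.executeBatch(); // flush the remainder
conn.commit();
pstmt.close();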
On the CSV parsing side, you should really consider using a CSV library to do the parsing for you. univocity-parsers has the fastest CSV parser around, and it should process these 2.5 million lines in less than a second. I'm the author of this library, by the way.
String.split() is convenient but not fast. For anything more than a few dozen rows it doesn't make sense to use it.
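For illustration, a minimal univocity-parsers sketch for the pipe-delimited file described in the question (the delimiter and column meanings are taken from the sample line in the question; treat this as a sketch, not the library's only usage pattern):

import com.univocity.parsers.csv.CsvParser;
import com.univocity.parsers.csv.CsvParserSettings;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

public class PipeFileParser {
    public static void parse(String src) throws IOException {
        CsvParserSettings settings = new CsvParserSettings();
        settings.getFormat().setDelimiter('|'); // the file is pipe-delimited, not comma-delimited
        CsvParser parser = new CsvParser(settings);
        parser.beginParsing(new InputStreamReader(new FileInputStream(src), StandardCharsets.UTF_8));
        String[] row;
        while ((row = parser.parseNext()) != null) {
            // row[0] = title, row[1] = URL, ... as in the question
        }
    }
}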
Hope this helps.
I have a byte[] which is actually an image.
I want to store it in Oracle 11g. I created a BLOB column in my table, and then I tried to insert it as follows:
String imageStr = "xyz...."
byte[] data = imageStr.getBytes();
String sQuery = "insert into Table (LOCATION , BLOB_DATA) Values ('Lahore', data) ";
It throws exception "java.sql.SQLException: ORA-01465: invalid hex number"
I searched and found that this kind of query should be done via a PreparedStatement, so I did something like the following:
PreparedStatement prepStmt = dbConnection.prepareStatement(
        "insert into Table (LOCATION, BLOB_DATA) values (?, ?)");
prepStmt.setString(1, "Lahore");
prepStmt.setBytes(2, data);
Now I get a compile error on dbConnection.prepareStatement(String), because DBConnection is not a standard JDBC class.
It is a custom class written by earlier developers for the database connection, and it does not have a prepareStatement(String) method in it.
So what should I do now?
1. Should I create a prepareStatement(String) method in the DBConnection class?
2. Or should I go with the first approach?
You can look at my example of storing an image in the DB:
Statement s;
Connection c;
FileInputStream fis;
PreparedStatement ps;
File file;

try {
    Class.forName("sun.jdbc.odbc.JdbcOdbcDriver"); // your driver
    c = DriverManager.getConnection("Jdbc:Odbc:image", "scott", "tiger"); // user/password change according to your DB
    s = c.createStatement();
    s.execute("Create table ImageStoring(Image_No number(5), Photo blob)");
} catch (Exception e1) {
    e1.printStackTrace();
}

try {
    file = new File("D:/ARU/Aruphotos/4.jpg");
    fis = new FileInputStream(file);
    Class.forName("sun.jdbc.odbc.JdbcOdbcDriver");
    c = DriverManager.getConnection("Jdbc:Odbc:image", "scott", "tiger");
    s = c.createStatement();
    ps = c.prepareStatement("insert into ImageStoring values(?,?)");
    ps.setInt(1, 2);
    ps.setBinaryStream(2, fis, (int) file.length());
    System.out.println("success");
    ps.execute();
    ps.close();
    c.close();
} catch (Exception e) {
    e.printStackTrace();
}
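For reference, the same insert against a current Oracle thin JDBC driver with try-with-resources might look like the sketch below (the URL, credentials, and file path are placeholders, not values from the question):

String url = "jdbc:oracle:thin:@//dbhost:1521/orcl"; // placeholder host/service
File file = new File("D:/ARU/Aruphotos/4.jpg");
try (Connection c = DriverManager.getConnection(url, "scott", "tiger");
     FileInputStream fis = new FileInputStream(file);
     PreparedStatement ps = c.prepareStatement(
             "insert into ImageStoring(Image_No, Photo) values (?, ?)")) {
    ps.setInt(1, 2);
    ps.setBinaryStream(2, fis, (int) file.length()); // streams the file into the BLOB column
    ps.executeUpdate();
} catch (SQLException | IOException e) {
    e.printStackTrace();
}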
I am writing a full database extract program in Java. The database is Oracle, and it is huge: some tables have ~260 million records. The program should create one file per table in a specific format, so using Oracle Data Pump etc. is not an option. Also, company security policies do not allow writing a PL/SQL procedure that creates files on the DB server for this requirement. I have to go with Java and JDBC.
The issue I am facing is that, since the files for some of the tables are huge (~30GB), I am running out of memory almost every time, even with a 20GB Java heap. When the file size exceeds the heap size during creation, even with one of the most aggressive GC policies, the process seems to hang. For example, if the file size is > 20GB and the heap size is 20GB, then once heap utilization hits the maximum it slows down to writing 2MB per minute or so, and at that speed it would take months to get a full extract.
I am looking for some way to overcome this issue. Any help would be greatly appreciated.
Here are some details of the system configuration I have:
Java - JDK 1.6.0_14
System config - RH Enterprise Linux (2.6.18) running on 4 x Intel Xeon E7450 (6 cores) @ 2.39GHz
RAM - 32GB
Database - Oracle 11g
The file-writing part of the code is below:
private void runQuery(Connection conn, String query, String filePath,
        String fileName) throws SQLException, Exception {
    PreparedStatement stmt = null;
    ResultSet rs = null;
    try {
        stmt = conn.prepareStatement(query,
                ResultSet.TYPE_SCROLL_INSENSITIVE,
                ResultSet.CONCUR_READ_ONLY);
        stmt.setFetchSize(maxRecBeforWrite);
        rs = stmt.executeQuery();
        // Write query result to file
        writeDataToFile(rs, filePath + "/" + fileName, getRecordCount(query, conn));
    } catch (SQLException sqle) {
        sqle.printStackTrace();
    } finally {
        try {
            rs.close();
            stmt.close();
        } catch (SQLException ex) {
            throw ex;
        }
    }
}

private void writeDataToFile(ResultSet rs, String tempFile, String cnt)
        throws SQLException, Exception {
    FileOutputStream fileOut = null;
    int maxLength = 0;
    try {
        fileOut = new FileOutputStream(tempFile, true);
        FileChannel fcOut = fileOut.getChannel();
        List<TableMetaData> metaList = getMetaData(rs);
        maxLength = getMaxRecordLength(metaList);
        // Write header
        writeHeaderRec(fileOut, maxLength);
        while (rs.next()) {
            // Now iterate over metaList and fetch all the column values.
            writeData(rs, metaList, fcOut);
        }
        // Write trailer
        writeTrailerRec(fileOut, cnt, maxLength);
    } catch (FileNotFoundException fnfe) {
        fnfe.printStackTrace();
    } catch (IOException ioe) {
        ioe.printStackTrace();
    } finally {
        try {
            fileOut.close();
        } catch (IOException ioe) {
            fileOut = null;
            throw new Exception(ioe.getMessage());
        }
    }
}

private void writeData(ResultSet rs, List<TableMetaData> metaList,
        FileChannel fcOut) throws SQLException, IOException {
    StringBuilder rec = new StringBuilder();
    String lf = "\n";
    for (TableMetaData tabMeta : metaList) {
        rec.append(getFormattedString(rs, tabMeta));
    }
    rec.append(lf);
    ByteBuffer byteBuf = ByteBuffer.wrap(rec.toString().getBytes("US-ASCII"));
    fcOut.write(byteBuf);
}

private String getFormattedString(ResultSet rs, TableMetaData tabMeta)
        throws SQLException, IOException {
    String colValue = null;
    // Check whether it is a CLOB column.
    if (tabMeta.isCLOB()) {
        // Column is a CLOB, so fetch it and retrieve the first clobLimit chars.
        colValue = String.format("%-" + tabMeta.getColumnSize() + "s",
                getCLOBString(rs, tabMeta));
    } else {
        colValue = String.format("%-" + tabMeta.getColumnSize() + "s",
                rs.getString(tabMeta.getColumnName()));
    }
    return colValue;
}
It's probably due to the way you call prepareStatement; see this question for a similar problem. You don't need scrollability, and a ResultSet is read-only by default, so just call:
stmt = conn.prepareStatement(query);
With TYPE_SCROLL_INSENSITIVE, the driver has to cache every row it reads so that the result set can scroll backwards, which is exactly what exhausts your heap.
Edit:
Map your database tables to classes using JPA, then load collections of objects from the DB with Hibernate in batches of some tolerable size and serialize them to the file.
Is your algorithm like the following? This is assuming a direct mapping between DB rows and lines in the file:
// open file for writing with buffered writer.
// execute JDBC statement
// iterate through result set
// convert rs to file format
// write to file
// close file
// close statement/rs/connection etc
Try using Spring JDBC Template to simplify the JDBC portion.
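In plain JDBC, that streaming approach might look like the following sketch (forward-only result set, modest fetch size, buffered writer; formatRow is a hypothetical stand-in for the per-row formatting in the question, and try-with-resources assumes Java 7+, unlike the JDK 1.6 mentioned above):

private void exportTable(Connection conn, String query, String filePath)
        throws SQLException, IOException {
    try (PreparedStatement stmt = conn.prepareStatement(query); // forward-only, read-only by default
         BufferedWriter out = new BufferedWriter(new FileWriter(filePath))) {
        stmt.setFetchSize(1000); // keep only a small window of rows in memory
        try (ResultSet rs = stmt.executeQuery()) {
            while (rs.next()) {
                out.write(formatRow(rs)); // hypothetical helper: formats one row as in the question
                out.newLine();
            }
        }
    } // writer is flushed and closed here; heap use stays flat regardless of file size
}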
I believe this should be possible with the default 32 MB Java heap. Just fetch each row, write the data to the file stream, and flush and close once done.
What value are you using for maxRecBeforWrite?
Perhaps the query for the max record length is defeating your setFetchSize by forcing JDBC to scan the entire result for the record length. Maybe you could delay writing your header and note the max record size on the fly.