I am trying to retrieve a zip file from my database, but each time the generated zip is corrupted. The zip is supposed to contain 3 PDF files, but when I generate it, it only contains the first one with size 0, and when I try to open the zip an "Unexpected end of archive" error popup is displayed. I cannot figure out what's wrong: the file in the database is not corrupted, and the code works on my PC and on many other remote servers, but not on one specific remote server (all run WildFly 10 with the same MySQL database configuration and the same zip stored in the database). My code is the following (JDBC):
...
Statement stmt = conn.createStatement();
ResultSet rs = stmt.executeQuery("SELECT document_data from table "
        + "WHERE condition");
if (rs.next() && rs.getBytes("document_data") != null) {
    new ByteArrayInputStream(rs.getBytes("document_data"));
    File zipped = new File("exported.zip");
    FileOutputStream fos = new FileOutputStream(zipped);
    byte[] buffer = new byte[1];
    InputStream is = rs.getBinaryStream(1);
    while (is.read(buffer) > 0) {
        fos.write(buffer);
    }
    fos.close();
}
I tried the following code too, but it didn't work either:
InputStream in = null; // zip bytes
OutputStream out;      // zip archive to be generated
.
.
.
if (rs.next() && rs.getBytes("document_data") != null) {
    in = new ByteArrayInputStream(rs.getBytes("document_data"));
}
out = new FileOutputStream("exported.zip");
byte[] buf = new byte[1024];
int len;
while ((len = in.read(buf)) > 0) {
    out.write(buf, 0, len);
}
in.close();
out.close();
When executing the following SQL query, the zip generated is NOT corrupted:
SELECT document_data INTO DUMPFILE '/tmp/exported.zip' FROM table WHERE condition;
NOTE: The zip size in the database (LONGBLOB field) is 830K.
NOTE: I tried with JDBC and Hibernate, but the result is the same.
Any ideas on this strange behavior?
You are both calling getBytes(), which reads out the blob, and then reading from its input stream, which by this stage will be empty.
Get rid of the getBytes() call.
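For reference, a minimal sketch of the retrieval loop with the getBytes() calls removed (table and column names taken from the question; the binary stream itself is checked for null instead of a separate getBytes() call):
Statement stmt = conn.createStatement();
ResultSet rs = stmt.executeQuery("SELECT document_data from table WHERE condition");
if (rs.next()) {
    // Read the BLOB once, as a stream, and copy it straight to the file.
    InputStream is = rs.getBinaryStream("document_data");
    if (is != null) {
        FileOutputStream fos = new FileOutputStream("exported.zip");
        byte[] buffer = new byte[8192];
        int read;
        while ((read = is.read(buffer)) > 0) {
            fos.write(buffer, 0, read); // write only the bytes actually read
        }
        fos.close();
        is.close();
    }
}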
I have a custom method to copy the database from the assets folder to the app's database directory:
AssetManager am = myContext.getAssets();
OutputStream os = new FileOutputStream(DBFile);
DBFile.createNewFile();
byte[] b = new byte[1024];
int i, r;
String[] Files = am.list("");
Arrays.sort(Files);
for (i = 1; i < 5; i++) {
    String fn = String.format("0%d.database", i);
    if (Arrays.binarySearch(Files, fn) < 0)
        break;
    InputStream is = am.open(fn);
    while ((r = is.read(b)) != -1)
        os.write(b, 0, r);
    is.close();
}
os.close();
Everything works as expected and the database is copied to the app's database directory, except when I change the Android system language (Settings > Language) to Arabic and reinstall the app: then it crashes and the database file isn't copied to the databases directory. It's just a blank file.
I'm thinking maybe I should read the database file with an explicit encoding, because something must change in Java the moment I change the language.
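The question itself doesn't confirm the cause, but one locale-sensitive spot in the loop above is String.format: with no explicit locale it uses the system default, and under an Arabic locale "%d" is rendered with Eastern Arabic digits, so the generated names would no longer match the asset file names. A minimal sketch of a locale-independent variant (using java.util.Locale; this is a guess, not a confirmed fix):
// Hypothetical fix sketch: pin the locale so "%d" always produces ASCII digits,
// independent of the system language chosen in Settings.
String fn = String.format(Locale.ROOT, "0%d.database", i);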
In a Java JSP, I'm using an input stream and a BLOB column to write any type of file into the database. I want to retrieve the BLOB file. How should I go about retrieving it? I tried using a SELECT statement and got this (material column).
Using a ResultSet, you can retrieve a java.sql.Blob instance:
Blob blob = resultSet.getBlob("MATERIAL");
Then you can open an input stream:
InputStream input = blob.getBinaryStream();
And write it to a file as described in Is it possible to create a File object from InputStream
Getting the Blob from the table:
Blob fileBlob = null;
try (ResultSet rs = statement.executeQuery(query)) {
    if (rs.next()) {
        fileBlob = rs.getBlob(1);
    }
}
Saving the contents of the Blob into a new file:
try (InputStream ipStream = fileBlob.getBinaryStream(1, fileBlob.length());
     OutputStream opStream = new FileOutputStream(new File("path/to/file"))) {
    int c;
    while ((c = ipStream.read()) > -1) {
        opStream.write(c);
    }
}
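If a byte-by-byte copy turns out to be slow for large files, java.nio.file.Files.copy can do the buffering for you; a minimal sketch (the target path is illustrative):
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

// Let Files.copy handle the buffering instead of reading one byte at a time.
try (InputStream ipStream = fileBlob.getBinaryStream()) {
    Files.copy(ipStream, Paths.get("path/to/file"), StandardCopyOption.REPLACE_EXISTING);
}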
I am writing a web app in which I would like to be able to import a CSV file into a database from a JSP. Previously I have used the following code to insert the CSV into the database:
LOAD DATA LOCAL INFILE "/myFileLocation.csv"
INTO TABLE myTable
COLUMNS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
This works great when I have the file locally.
My question: when I upload the CSV file in my JSP as a multipart, is it possible for me to pass that PartItem file as a variable and replace "/myFileLocation.csv" with the PartItem file's temp location?
I can see the temp location when I debug and inspect the PartItem file, which resides in repository/path under the variables table. Is this at all possible to access, or do I need to parse the CSV and insert it into the database that way?
I ended up finding a way to make this work. Not sure if it's the best solution, but it's working as I envisioned. Basically, I create a string pointing to an assets folder I created under web/resources, like this:
final String mainPath = this.getServletContext().getRealPath("resources/assets");
Then I read the uploaded file and check whether it already exists in my assets folder; if it does, I delete it:
Part filePart = request.getPart("csvFile");
String path = mainPath + "/" + filePart.getSubmittedFileName();
File fileTemp = new File(path);
if (fileTemp.exists()) {
    fileTemp.delete();
}
Lastly, I read the uploaded file and write a new file to the location I chose, which in this case is the assets folder I created:
final String fileName = filePart.getSubmittedFileName();
OutputStream out = null;
InputStream filecontent = null;
try {
    out = new FileOutputStream(new File(mainPath + File.separator + fileName));
    filecontent = filePart.getInputStream();
    int read = 0;
    final byte[] bytes = new byte[1024];
    while ((read = filecontent.read(bytes)) != -1) {
        out.write(bytes, 0, read);
    }
} catch (FileNotFoundException fne) {
    fne.printStackTrace();
} finally {
    if (out != null) {
        out.close();
    }
    if (filecontent != null) {
        filecontent.close();
    }
}
After that, I just passed a string containing the path to the file (including the file name) to the DAO I created, where I was able to use the SQL statement I posted above, as sketched below.
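For illustration, a minimal sketch of what that DAO call might look like over plain JDBC (the class and method names here are hypothetical, and newer versions of MySQL Connector/J may additionally require allowLoadLocalInfile=true on the connection URL):
// Hypothetical DAO method: runs the LOAD DATA LOCAL INFILE statement for a given CSV path.
// The path is concatenated into the SQL for simplicity; in real code, validate or escape it,
// since it is derived from an uploaded file name.
public void importCsv(Connection conn, String csvPath) throws SQLException {
    String sql = "LOAD DATA LOCAL INFILE '" + csvPath.replace("'", "''") + "' "
            + "INTO TABLE myTable "
            + "COLUMNS TERMINATED BY ',' "
            + "OPTIONALLY ENCLOSED BY '\"' "
            + "ESCAPED BY '\"' "
            + "LINES TERMINATED BY '\\n' "
            + "IGNORE 1 LINES";
    try (Statement stmt = conn.createStatement()) {
        stmt.execute(sql);
    }
}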
Like I stated before, I'm not sure if this is the best way to do this, but it seems to be working fine for me, and none of my Java code is contained within my JSP. If anyone has a better way of doing this or sees something wrong with what I did here, let me know; I'd be very interested to hear about it.
I am trying to transfer a SQLite database into an app by downloading it and then unzipping it to the correct location. I was successful in transferring the DB when it was unzipped. The error I get is that it cannot find any of the tables I query. I have also been successful in unzipping and reading normal text files.
The DB has Hebrew and English, but that has not caused problems before. The bilingual DB was copied successfully when it was not zipped and bilingual texts have been successfully unzipped and read. Still, it is a possibility that there is an encoding problem going on. That seems weird to me, because as you can see below in the code, I'm just copying the bytes directly.
-EDIT-
Let's say the pre-zipped DB is called test1.db. I zipped it, put it in the app, unzipped it, and called the result test2.db. When I ran a diff command on these two, there were no differences. So there must be a technical issue with the way Android is reading the file, or maybe an encoding issue on Android that doesn't exist on the PC?
I hate to do a code dump, but I will post both functions. copyDatabase() (which works) is what I used previously on an unzipped DB file; I include it here for comparison. Now I'm trying to use unzipDatabase() (which doesn't work) on a zipped DB file. The latter function was copied from How to unzip files programmatically in Android?
private void copyDatabase() throws IOException {
    String DB_NAME = "test.db";
    String DB_PATH = "/data/data/org.myapp.myappname/databases/";
    // Open your local db as the input stream
    InputStream myInput = myContext.getAssets().open(DB_NAME);
    // Path to the just created empty db
    String outFileName = DB_PATH + DB_NAME;
    // Open the empty db as the output stream
    OutputStream myOutput = new FileOutputStream(outFileName);
    // Transfer bytes from the input file to the output file
    byte[] buffer = new byte[1024];
    int length;
    while ((length = myInput.read(buffer)) > 0) {
        myOutput.write(buffer, 0, length);
    }
    // Close the streams
    myOutput.flush();
    myOutput.close();
    myInput.close();
}
private boolean unzipDatabase(String path)
{
    String DB_NAME = "test.zip";
    InputStream is;
    ZipInputStream zis;
    try
    {
        String filename;
        is = myContext.getAssets().open(DB_NAME);
        zis = new ZipInputStream(is);
        ZipEntry ze;
        byte[] buffer = new byte[1024];
        int count;
        while ((ze = zis.getNextEntry()) != null)
        {
            // write to a file
            filename = ze.getName();
            // Need to create directories if not exists, or
            // it will generate an Exception...
            if (ze.isDirectory()) {
                Log.d("yo", path + filename);
                File fmd = new File(path + filename);
                fmd.mkdirs();
                continue;
            }
            OutputStream fout = new FileOutputStream(path + filename);
            // reading and writing zip
            while ((count = zis.read(buffer)) != -1)
            {
                fout.write(buffer, 0, count);
            }
            fout.flush();
            fout.close();
            zis.closeEntry();
        }
        zis.close();
    }
    catch (IOException e)
    {
        e.printStackTrace();
        return false;
    }
    return true;
}
I still don't know why, but the problem is solved if I first delete the old copy of the database (located at DB_PATH + DB_NAME) and then unzip the new one there. I didn't need to do this when copying it directly.
So yay, it was a file overwriting issue... If someone knows why, feel free to comment.
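For reference, a minimal sketch of that delete-before-unzip step; the DB_PATH constant and file name are reused from the question's copyDatabase() purely for illustration:
// Delete any previously copied database before unzipping the fresh one,
// so the stale file cannot interfere with the new copy.
File oldDb = new File(DB_PATH + "test.db");   // path of the previously unzipped database
if (oldDb.exists()) {
    oldDb.delete();
}
unzipDatabase(DB_PATH);                       // then unzip the new database into place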
I tried to upload a PDF file into a MySQL BLOB field using java.sql.PreparedStatement with the following code:
File inFile = new File("Path+BLOCK.pdf");
byte[] b = new byte[(int) inFile.length()];
PreparedStatement psmnt = (PreparedStatement) con.prepareStatement(
        "INSERT INTO 2012DOC (SRNO, DOCUMENT) VALUES (?, ?)"); // con is a java.sql.Connection object
psmnt.setString(1, "1200021");
psmnt.setBytes(2, b);
psmnt.executeUpdate();
This code executes without error and the database shows BLOB content, but when I try to retrieve the file using the code below, it gives a corrupt file which doesn't open.
Statement stmt = con.createStatement();
ResultSet rs = stmt.executeQuery("SELECT DOCUMENT FROM 2012DOC");
rs.next();
response.setContentType("application/pdf");
response.setHeader("Content-Disposition", "attachment; filename=kjsahkjd.pdf");
java.sql.Blob blob = rs.getBlob("DOCUMENT");
ServletOutputStream servletOutputStream = response.getOutputStream();
InputStream in = blob.getBinaryStream();
int length = (int) blob.length();
int bufferSize = 1024;
byte[] buffer = new byte[bufferSize];
while ((length = in.read(buffer)) != -1) {
    servletOutputStream.write(buffer, 0, length);
}
in.close();
servletOutputStream.flush();
servletOutputStream.close();
It outputs a file with the same size as the original, but the file doesn't open.
The PDF reader is launched but cannot open the file, and gives the error 'the file was either damaged or not a supported file type'.
Ahhh... After a little debugging I found that the upload code was the troublesome part, and I finally got the right way to do it.
Here is what I did; I'm posting it so that others with the same problem can solve it.
First, convert the java.io.File to a java.io.FileInputStream:
FileInputStream io = new FileInputStream(inFile);
Then set the BLOB field using psmnt.setBinaryStream():
psmnt.setBinaryStream(2, (InputStream) io, (int) inFile.length());
remove " java.sql.Blob blob = rs.getBlob("DOCUMENT"); "
and don't initialize length i.e instead of
int length = (int) blob.length();
just write
int length;
then it downloads file successfully .. enjoy :)
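Putting the pieces together, a minimal sketch of the corrected upload and download using the same table and column names as above; this consolidates the fragments of the answer rather than reproducing the author's exact final code, and error handling is omitted for brevity:
// Upload: stream the PDF into the BLOB column instead of passing an unfilled byte array.
File inFile = new File("Path+BLOCK.pdf");
FileInputStream io = new FileInputStream(inFile);
PreparedStatement psmnt = con.prepareStatement(
        "INSERT INTO 2012DOC (SRNO, DOCUMENT) VALUES (?, ?)");
psmnt.setString(1, "1200021");
psmnt.setBinaryStream(2, io, (int) inFile.length());
psmnt.executeUpdate();
io.close();

// Download: read the column as a stream and copy it to the servlet response.
Statement stmt = con.createStatement();
ResultSet rs = stmt.executeQuery("SELECT DOCUMENT FROM 2012DOC");
if (rs.next()) {
    response.setContentType("application/pdf");
    response.setHeader("Content-Disposition", "attachment; filename=kjsahkjd.pdf");
    ServletOutputStream servletOutputStream = response.getOutputStream();
    InputStream in = rs.getBinaryStream("DOCUMENT");
    byte[] buffer = new byte[1024];
    int length; // holds the read count only; not pre-set to blob.length()
    while ((length = in.read(buffer)) != -1) {
        servletOutputStream.write(buffer, 0, length);
    }
    in.close();
    servletOutputStream.flush();
    servletOutputStream.close();
}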