Image not served inside jsp from Servlets - java

I have a JSP in which, for each row of a table, I need to display an image stored in the database. I retrieve all the row data from the database, including the image as a Blob, and store it in a bean. The image is stored in the bean as a byte array like this:
photo = rs.getBlob("PHOTO");
photoByteArray = photo.getBytes(1, (int)photo.length());
While looping over the list of beans in the JSP, the src attribute points to a servlet like this:
<img class="img" width="55" height="50" src="displayThumbnail?photoData=${part.photoData}">
which serves the image as shown below. The images don't show up, however, even though on debugging the byte arrays do seem to contain data.
protected void processRequest(HttpServletRequest request, HttpServletResponse response) throws IOException {
    response.setContentType("image/jpeg");
    OutputStream o = response.getOutputStream();
    String photoDataStr = request.getParameter("photoData");
    byte[] photoData = null;
    if (photoDataStr != null) {
        photoData = photoDataStr.getBytes();
    }
    o.write(photoData);
    o.close();
}
However, the image doesn't show up. Now, if I query the database for each individual image as shown below, the images do show up fine.
protected void processRequest(HttpServletRequest request, HttpServletResponse response) {
    PreparedStatement pstmt = null;
    ResultSet rs = null;
    Connection conn = null;
    try {
        if (conn == null) {
            conn = open();
        }
        pstmt = conn.prepareStatement("select photo from PART_PHOTOS where id = ?");
        String id = request.getParameter("id");
        pstmt.setString(1, id);
        rs = pstmt.executeQuery();
        if (rs.next()) {
            Blob b = rs.getBlob("photo");
            response.setContentType("image/jpeg");
            response.setContentLength((int) b.length());
            InputStream is = b.getBinaryStream();
            OutputStream os = response.getOutputStream();
            byte buf[] = new byte[(int) b.length()];
            is.read(buf);
            os.write(buf);
            os.close();
            is.close();
        }
    } catch (Exception ex) {
        System.out.println(ex.getMessage());
        ex.printStackTrace();
    } finally {
        if (rs != null) {
            try {
                rs.close();
            } catch (SQLException e) {
                e.printStackTrace();
            }
            rs = null;
        }
        if (pstmt != null) {
            try {
                pstmt.close();
            } catch (SQLException e) {
                e.printStackTrace();
            }
            pstmt = null;
        }
        // check if it's the end of the loop
        if (conn != null) {
            try {
                conn.close();
            } catch (SQLException e) {
                e.printStackTrace();
            }
            conn = null;
        }
    }
}
I would appreciate any recommendations on this.

You're assuming that you can put random binary data into an HTML file, and it will be parsed correctly, and sent back to your server intact. This is a bad assumption! If nothing else, the byte that corresponds to the ASCII for the quote character is going to cause problems, right? Not to mention encoding issues, and the fact that the parameters to a URL must be urlencoded. This is just doomed to fail.
To make this work, you'd have to have some kind of explicit text encoding of the binary data when you serve the page (base64, maybe), and then decode the servlet parameter back to binary image data after the URL is posted back.
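For instance, a minimal sketch of that round trip with java.util.Base64 (Java 8+; the URL-safe variant avoids characters that would otherwise need urlencoding, and the bean setter name is hypothetical):

// When building the page: store a URL-safe Base64 string on the bean
// instead of the raw bytes (setter name is hypothetical).
part.setPhotoData(Base64.getUrlEncoder().encodeToString(photoByteArray));

// In the displayThumbnail servlet: decode the parameter back to bytes.
byte[] photoData = Base64.getUrlDecoder().decode(request.getParameter("photoData"));
response.setContentType("image/jpeg");
response.getOutputStream().write(photoData);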

Your first processRequest() snippet is only sending back the byte representation of the photoData request parameter and not the photo data identified by the parameter. Looks like a bug in your code.
It seems you are trying to solve your problem in the wrong manner. When you first create the HTML table, storing the image in your bean from the first query gains you nothing unless you cache the data and the subsequent displayThumbnail request retrieves the image from the cache, avoiding the database query.
If you do not want to mess with caching, there is no need to store the image in your initial bean at all; just do something like your second processRequest() snippet and fetch the image directly when the browser asks for it.

Your ${part.photoData} expression must return some ID. In the processRequest() method you must read that ID value (using request.getParameter("photoData")), retrieve the image from the database (or better, from a cache or the file system) by that ID, and send the binary data to the web client.
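A minimal sketch of that approach, reusing the question's open() helper and assuming the bean exposes the row ID as ${part.id} (Java 7+ for try-with-resources):

// JSP: pass the row's ID instead of the raw bytes.
// <img width="55" height="50" src="displayThumbnail?id=${part.id}">

protected void processRequest(HttpServletRequest request, HttpServletResponse response)
        throws IOException {
    String id = request.getParameter("id");
    try (Connection conn = open();
         PreparedStatement pstmt =
                 conn.prepareStatement("select photo from PART_PHOTOS where id = ?")) {
        pstmt.setString(1, id);
        try (ResultSet rs = pstmt.executeQuery()) {
            if (rs.next()) {
                Blob b = rs.getBlob("photo");
                response.setContentType("image/jpeg");
                response.setContentLength((int) b.length());
                // Stream the blob in chunks instead of one big array.
                try (InputStream is = b.getBinaryStream();
                     OutputStream os = response.getOutputStream()) {
                    byte[] buf = new byte[8192];
                    for (int n; (n = is.read(buf)) != -1; ) {
                        os.write(buf, 0, n);
                    }
                }
            } else {
                response.sendError(HttpServletResponse.SC_NOT_FOUND);
            }
        }
    } catch (SQLException e) {
        throw new IOException(e);
    }
}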

Related

Object.toString() OutOfMemory - How to split if it contains Base64?

I have a response from a web server which contains the whole Base64 content of a PDF (I CANNOT CHANGE THE SERVER RESPONSE). The file is over 50MB.
So, when I try to process this response with an AsyncTask, the application runs into an OOM exception.
@Override
protected String doInBackground(Object... params) {
    String response = null;
    try {
        CommandCode = params[0].toString();
        App application = (App) params[1];
        String data = params[2].toString(); // <-- Exception here
        if (application != null) {
            switch (CommandCode) {
                // STORE ATTACHMENT
                case Constants.ParserCommand.STORE_ATTACHMENT: {
                    try {
                        if (JSONParser.ParseGetAttachment(new JSONObject(data), application)) {
                            response = "attachment stored";
                        } else {
                            response = null;
                        }
                    } catch (Exception ex) {
                        response = null;
                    }
                }
            }
        }
    } catch (Exception ex) {
        // some code
    }
    return response;
}
"Data" is a JSONObject which contains various informations including the base64 content. Above code is called by this few lines:
// GET_ATTACHMENT_BY_ATID
JSONObject jo = new JSONObject(_response).getJSONObject("DATA").getJSONObject("Attachment");
task.executeOnExecutor(AsyncTask.THREAD_POOL_EXECUTOR, Constants.ParserCommand.STORE_ATTACHMENT, appCDA, jo);
All of the data is needed to create a new record in the internal DB for the document, but the content itself is stored in a file on the filesystem.
My first thought was to isolate the content and work with it separately. Is there a way to do that? For example:
// All data without base64Content
JSONObject mainInfo = new JSONObject(_response).getJSONObject("DATA").getJSONObject("Attachment").without("Base64Content");
// Content isolated
String contentInfo = new JSONObject(_response).getJSONObject("DATA").getJSONObject("Attachment").getString("Base64Content");
Even then, the OOM exception is still raised. Is there a way to split this Base64 content into multiple parts and save them into the above-mentioned file by appending?
Using s As FileStream = File.Open("C:\FilePath\File", FileMode.Open)
    Using sr As New StreamReader(s)
        Using reader As New JsonTextReader(sr)
            While reader.Read()
                'Perform your action here
            End While
        End Using
    End Using
End Using
Here you can use a FileStream to read the JSON file; it will only load a part of the file into memory at a time rather than the whole file. The code is in VB (using Json.NET's JsonTextReader); you can implement the same idea in Java.
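A rough Java equivalent, sketched with Jackson's streaming JsonParser (an assumed substitute for Json.NET, and it assumes the HTTP response can be saved to a temp file or read as a stream rather than collected into a String first):

import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;

import java.io.File;
import java.io.FileOutputStream;
import java.io.OutputStream;

public class AttachmentStreamer {
    // Scans the JSON token stream for the "Base64Content" field and
    // streams its decoded bytes to outFile, never materializing the
    // 50MB string in memory.
    public static void extractBase64Field(File jsonFile, File outFile) throws Exception {
        JsonFactory factory = new JsonFactory();
        try (JsonParser parser = factory.createParser(jsonFile);
             OutputStream out = new FileOutputStream(outFile)) {
            while (parser.nextToken() != null) {
                if (parser.currentToken() == JsonToken.FIELD_NAME
                        && "Base64Content".equals(parser.getCurrentName())) {
                    parser.nextToken(); // advance to the field's value
                    parser.readBinaryValue(out); // incremental Base64 decode
                    break;
                }
            }
        }
    }
}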

byte[] to be stored in Oracle

I have a byte[] which is actually an image.
I want to store it in Oracle 11g. I created a BLOB column in my table, and I tried to insert it as follows:
String imageStr = "xyz...."
byte[] data = imageStr.getBytes();
String sQuery = "insert into Table (LOCATION , BLOB_DATA) Values ('Lahore', data) ";
It throws the exception "java.sql.SQLException: ORA-01465: invalid hex number".
I searched and found that this type of query should be done via a PreparedStatement,
so I did something like the following:
PreparedStatement prepStmt = dbConnection.prepareStatement("insert into Table (LOCATION, BLOB_DATA) values (?,?)");
prepStmt.setString(1, "Lahore");
prepStmt.setBytes(2, data);
I started getting an error on dbConnection.prepareStatement(String) because DBConnection is not a native Java class.
It's a custom class written by earlier developers for database connections, and it does not have a prepareStatement(String) method.
So what should I do now?
1. Should I create a prepareStatement(String) method in the DBConnection class?
2. Should I go with the first approach?
You can look at my example of storing an image in the db:
Statement s;
Connection c;
FileInputStream fis;
PreparedStatement ps;
File file;
try
{
    Class.forName("sun.jdbc.odbc.JdbcOdbcDriver"); // your driver
    c = DriverManager.getConnection("Jdbc:Odbc:image", "scott", "tiger"); // password and name change according to your db
    s = c.createStatement();
    s.execute("Create table ImageStoring(Image_No number(5), Photo blob)");
}
catch (Exception e1)
{
    e1.printStackTrace();
}
try
{
    file = new File("D:/ARU/Aruphotos/4.jpg");
    fis = new FileInputStream(file);
    Class.forName("sun.jdbc.odbc.JdbcOdbcDriver");
    c = DriverManager.getConnection("Jdbc:Odbc:image", "scott", "tiger");
    s = c.createStatement();
    ps = c.prepareStatement("insert into ImageStoring values(?,?)");
    ps.setInt(1, 2);
    ps.setBinaryStream(2, fis, (int) file.length());
    System.out.println("success");
    ps.execute();
    ps.close();
    c.close();
}
catch (Exception e)
{
    e.printStackTrace();
}
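If the custom DBConnection wrapper can expose its underlying java.sql.Connection (the getConnection() accessor below is hypothetical), a minimal try-with-resources sketch (Java 7+) of the insert from the question would be:

// Hypothetical accessor on the legacy wrapper; adjust to whatever
// the DBConnection class actually exposes.
Connection conn = dbConnection.getConnection();

// Table and column names follow the question's placeholders.
String sql = "insert into Table (LOCATION, BLOB_DATA) values (?, ?)";
try (PreparedStatement ps = conn.prepareStatement(sql)) {
    ps.setString(1, "Lahore");
    ps.setBytes(2, data); // the image byte[]
    ps.executeUpdate();
}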

Java vs PHP when making simple sql queries

I was wondering if any of you could help me identify why my two simple test applications give such different results. Each makes a simple database query and loops through the results; the first is written in PHP, the second in Java.
I knew that Java would perform better, but I have a hard time believing it would perform almost 20 times better (see results below).
PHP:
$sql = "select * from message where user_id=20";
$db = get_PDO();
$stm = $db->prepare($sql);
for($i=0;$i<10000;$i++)
{
echo $i."\n";
$res = $stm->execute();
$rows = $stm->fetchAll();
}
echo "done";
get_PDO() is just a function that connects to the database and returns a PDO object.
Java:
public class Connect
{
    public static void main(String[] args)
    {
        Connection conn = null;
        Statement st = null;
        try
        {
            String userName = ""; // db username
            String password = ""; // db password
            String url = "jdbc:mysql://localhost/test"; // test = db name
            Class.forName("com.mysql.jdbc.Driver").newInstance();
            conn = DriverManager.getConnection(url, userName, password);
            System.out.println("Database connection established");
        }
        catch (Exception e)
        {
            System.err.println("Cannot connect to database server: " + e.toString());
        }
        finally
        {
            if (conn != null)
            {
                String query = "select * from message where user_id=20";
                try
                {
                    st = conn.createStatement();
                }
                catch (Exception e)
                {
                    e.printStackTrace();
                    try
                    {
                        conn.close();
                    }
                    catch (Exception er) {}
                    return;
                }
                for (int i = 0; i < 10000; i++)
                {
                    System.out.println(i);
                    try
                    {
                        ResultSet rs = st.executeQuery(query);
                        while (rs.next())
                        {
                        }
                    }
                    catch (Exception e)
                    {
                        e.printStackTrace();
                    }
                }
                try
                {
                    conn.close();
                    System.out.println("Database connection terminated");
                }
                catch (Exception e) { /* ignore close errors */ }
            }
        }
    }
}
The results:
I measured the performance using time (i.e. time php test.php and time java Connect).
PHP:
real 1m34.337s
user 1m32.564s
sys 0m0.716s
Java:
real 0m5.388s
user 0m4.428s
sys 0m0.972s
Is Java really that much faster, or have I done something stupid? :)
Depending on the number of messages, PHP might need much more memory because the complete result set is held in memory after calling fetchAll().
AFAIK PHP creates an associative array for the result, which might also be time-consuming.
It could be that in PHP you retrieve all the data from the database by calling fetchAll(), whereas in Java you just move through the result set using rs.next() without actually reading anything. Depending on the JDBC driver implementation, this might give an opportunity for optimization that's not possible with the way the PHP version is implemented.
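To make the comparison fairer, the Java benchmark could materialize each row the way PDO's fetchAll() does. A sketch of such a helper (not from the original posts; it would replace the empty while loop above):

import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class FairLoop {
    // Materializes every row into a list of maps, mirroring what
    // PDO's fetchAll() does, so both benchmarks pay the same cost.
    static List<Map<String, Object>> fetchAll(Statement st, String query) throws SQLException {
        List<Map<String, Object>> rows = new ArrayList<>();
        try (ResultSet rs = st.executeQuery(query)) {
            ResultSetMetaData meta = rs.getMetaData();
            int columns = meta.getColumnCount();
            while (rs.next()) {
                Map<String, Object> row = new HashMap<>();
                for (int c = 1; c <= columns; c++) {
                    row.put(meta.getColumnName(c), rs.getObject(c));
                }
                rows.add(row);
            }
        }
        return rows;
    }
}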

How to overcome OutOfMemoryError during huge file write

I am writing a full database extract program in Java. The database is Oracle, and it is huge; some tables have ~260 million records. The program should create one file per table in a specific format, so using Oracle datapump etc. is not an option. Also, company security policy does not allow writing a PL/SQL procedure to create files on the DB server for this requirement. I have to go with Java and JDBC.
The issue I am facing is that, since the files for some tables are huge (~30GB), I am running out of memory almost every time, even with a 20GB Java heap. When the file size exceeds the heap size, even with one of the most aggressive GC policies the process seems to hang. For example, if the file size is > 20GB and the heap size is 20GB, once heap utilization hits the max it slows down to writing 2MB per minute or so, and at this speed it will take months to get the full extract.
I am looking for some way to overcome this issue. Any help would be greatly appreciated.
Here are some details of the system configuration I have:
Java - JDK 1.6.0_14
System config - RH Enterprise Linux (2.6.18) running on 4 x Intel Xeon E7450 (6 cores) @ 2.39GHz
RAM - 32GB
Database - Oracle 11g
The file-writing part of the code is below:
private void runQuery(Connection conn, String query, String filePath,
        String fileName) throws SQLException, Exception {
    PreparedStatement stmt = null;
    ResultSet rs = null;
    try {
        stmt = conn.prepareStatement(query,
                ResultSet.TYPE_SCROLL_INSENSITIVE,
                ResultSet.CONCUR_READ_ONLY);
        stmt.setFetchSize(maxRecBeforWrite);
        rs = stmt.executeQuery();
        // Write query result to file
        writeDataToFile(rs, filePath + "/" + fileName, getRecordCount(query, conn));
    } catch (SQLException sqle) {
        sqle.printStackTrace();
    } finally {
        try {
            rs.close();
            stmt.close();
        } catch (SQLException ex) {
            throw ex;
        }
    }
}

private void writeDataToFile(ResultSet rs, String tempFile, String cnt)
        throws SQLException, Exception {
    FileOutputStream fileOut = null;
    int maxLength = 0;
    try {
        fileOut = new FileOutputStream(tempFile, true);
        FileChannel fcOut = fileOut.getChannel();
        List<TableMetaData> metaList = getMetaData(rs);
        maxLength = getMaxRecordLength(metaList);
        // Write header
        writeHeaderRec(fileOut, maxLength);
        while (rs.next()) {
            // Now iterate on metaList and fetch all the column values.
            writeData(rs, metaList, fcOut);
        }
        // Write trailer
        writeTrailerRec(fileOut, cnt, maxLength);
    } catch (FileNotFoundException fnfe) {
        fnfe.printStackTrace();
    } catch (IOException ioe) {
        ioe.printStackTrace();
    } finally {
        try {
            fileOut.close();
        } catch (IOException ioe) {
            fileOut = null;
            throw new Exception(ioe.getMessage());
        }
    }
}

private void writeData(ResultSet rs, List<TableMetaData> metaList,
        FileChannel fcOut) throws SQLException, IOException {
    StringBuilder rec = new StringBuilder();
    String lf = "\n";
    for (TableMetaData tabMeta : metaList) {
        rec.append(getFormattedString(rs, tabMeta));
    }
    rec.append(lf);
    ByteBuffer byteBuf = ByteBuffer.wrap(rec.toString().getBytes("US-ASCII"));
    fcOut.write(byteBuf);
}

private String getFormattedString(ResultSet rs, TableMetaData tabMeta)
        throws SQLException, IOException {
    String colValue = null;
    // check if it is a CLOB column
    if (tabMeta.isCLOB()) {
        // Column is a CLOB, so fetch it and retrieve the first clobLimit chars.
        colValue = String.format("%-" + tabMeta.getColumnSize() + "s",
                getCLOBString(rs, tabMeta));
    } else {
        colValue = String.format("%-" + tabMeta.getColumnSize() + "s",
                rs.getString(tabMeta.getColumnName()));
    }
    return colValue;
}
It's probably due to the way you call prepareStatement; see this question for a similar problem. You don't need scrollability, and a ResultSet is read-only by default, so just call
stmt = conn.prepareStatement(query);
Edit:
Map your database tables to classes using JPA.
Then load collections of objects from the DB using Hibernate in batches of some tolerable size and serialize them to file.
Is your algorithm like the following? This is assuming a direct mapping between DB rows and lines in the file:
// open file for writing with buffered writer
// execute JDBC statement
// iterate through result set
//     convert rs row to file format
//     write row to file
// close file
// close statement/rs/connection etc.
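A minimal, self-contained sketch of that outline (forward-only result set, modest fetch size, buffered writer; formatRow() is a hypothetical stand-in for whatever per-row formatting the extract needs):

import java.io.BufferedWriter;
import java.io.FileWriter;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class TableExtractor {
    // Streams rows straight from the cursor to a buffered file writer,
    // so memory use stays flat regardless of table or file size.
    public void extract(Connection conn, String query, String outFile) throws Exception {
        try (PreparedStatement stmt = conn.prepareStatement(query); // forward-only, read-only by default
             BufferedWriter out = new BufferedWriter(new FileWriter(outFile))) {
            stmt.setFetchSize(1000); // rows buffered by the driver per round trip
            try (ResultSet rs = stmt.executeQuery()) {
                while (rs.next()) {
                    out.write(formatRow(rs)); // hypothetical per-row formatter
                    out.newLine();
                }
            }
        }
    }

    private String formatRow(ResultSet rs) throws Exception {
        return rs.getString(1); // placeholder: real code formats every column
    }
}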
Try using Spring JDBC Template to simplify the JDBC portion.
I believe this must be possible on the default 32MB Java heap. Just fetch each row, write the data to the file stream, and flush and close once done.
What value are you using for maxRecBeforWrite?
Perhaps the query for the max record length is defeating your setFetchSize by forcing JDBC to scan the entire result for the record length. Maybe you could delay writing your header and note the max record size on the fly.

Handling large records in a Java EE application

There is a table phonenumbers with two columns: id and number. There are about half a million entries in the table. The database is MySQL.
The requirement is to develop a simple Java EE application, connected to that database, that allows a user to download all number values in comma-separated form by following a specific URL.
If we get all the values into a huge String array, concatenate them (with commas in between) into a single String, and then send it down to the user, does that sound like a proper solution?
The application is not public and will be used by a limited number of people.
Your best bet is to not store the data in Java's memory in any way, but just write the obtained data to the response immediately as the data comes in. You also need to configure the MySQL JDBC driver to serve the resultset row-by-row by Statement#setFetchSize() as per the MySQL JDBC driver documentation, otherwise it will cache the whole thing in memory.
Assuming you're familiar with Servlets, here's a kickoff example which takes that all into account:
protected void doGet(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
    response.setContentType("text/plain");
    response.setHeader("Content-Disposition", "attachment;filename=numbers.txt"); // Force download popup.
    Connection connection = null;
    Statement statement = null;
    ResultSet resultSet = null;
    Writer writer = response.getWriter();
    try {
        connection = database.getConnection();
        statement = connection.createStatement(ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY);
        statement.setFetchSize(Integer.MIN_VALUE);
        resultSet = statement.executeQuery("SELECT number FROM phonenumbers");
        while (resultSet.next()) {
            writer.write(resultSet.getString("number"));
            if (!resultSet.isLast()) {
                writer.write(",");
            }
        }
    } catch (SQLException e) {
        throw new ServletException("Query failed!", e);
    } finally {
        if (resultSet != null) try { resultSet.close(); } catch (SQLException logOrIgnore) {}
        if (statement != null) try { statement.close(); } catch (SQLException logOrIgnore) {}
        if (connection != null) try { connection.close(); } catch (SQLException logOrIgnore) {}
    }
}
There's a bit more to properly formatting CSV output. It would be easiest to use an existing library such as this one to generate the output file.
You can generate output to a file on disk (on the web server) and then redirect the browser to that file (with a cron job or whatever to clean up old data) or just stream the result directly back to the user.
If you are streaming directly be sure and set the MIME type to something that will trigger a download in the user's browser (e.g. text/csv or text/comma-separated-values)
If using MySQL 5.1+, I would simply use the proprietary syntax to dump the file somewhere and stream it in a Servlet response.
SELECT a,b,a+b INTO OUTFILE '/tmp/result.txt'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM test_table;
http://dev.mysql.com/doc/refman/5.1/en/select.html
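A minimal sketch of the streaming half, inside a servlet's doGet with response in scope (the path matches the example above; the buffer size is arbitrary):

// Servlet side: stream the dumped file to the client in chunks.
response.setContentType("text/csv");
try (InputStream in = new FileInputStream("/tmp/result.txt");
     OutputStream out = response.getOutputStream()) {
    byte[] buf = new byte[8192];
    for (int n; (n = in.read(buf)) != -1; ) {
        out.write(buf, 0, n);
    }
}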
For so many records, if you still want to use JDBC, you may try the following:
1. Fetch the total number of records.
2. Fetch a few records (using a query LIMIT) and write them.
3. When you reach the end of a chunk, fetch the next one, until you have written the total number of records.
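A sketch of that chunked approach, reusing the connection and writer from the servlet example above (the ORDER BY keeps pages stable; the chunk size is arbitrary):

// Writes the numbers in LIMIT/OFFSET chunks so only one page of rows
// is held in memory at a time. Table and column names follow the question.
static void writeInChunks(Connection connection, Writer writer, int chunkSize)
        throws SQLException, IOException {
    String sql = "SELECT number FROM phonenumbers ORDER BY id LIMIT ? OFFSET ?";
    try (PreparedStatement ps = connection.prepareStatement(sql)) {
        boolean first = true;
        for (int offset = 0; ; offset += chunkSize) {
            ps.setInt(1, chunkSize);
            ps.setInt(2, offset);
            int rows = 0;
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    if (!first) writer.write(",");
                    writer.write(rs.getString("number"));
                    first = false;
                    rows++;
                }
            }
            if (rows < chunkSize) break; // last (possibly partial) chunk
        }
    }
}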
