Is there a faster way to output a PDF file? - java

This is a piece of code that outputs a PDF file to the browser. Could it be faster?
It is implemented in a Java servlet.
private ByteArrayOutputStream getByteArrayOutputStreamFromFile(File file) throws Exception {
    BufferedInputStream bis = null;
    ByteArrayOutputStream bos = null;
    try {
        bis = new BufferedInputStream(new FileInputStream(file));
        bos = new ByteArrayOutputStream();
        byte[] byteContent = new byte[1024 * 1024];
        int len = 0;
        while ((len = bis.read(byteContent)) != -1) {
            bos.write(byteContent, 0, len);
        }
        return bos;
    } catch (Exception ex) {
        throw ex;
    } finally {
        if (bis != null) {
            bis.close();
        }
        if (bos != null) {
            bos.close();
        }
    }
}

Using Google Guava you can reduce this to one line:
import com.google.common.io.Files;

private byte[] getByteArrayFromFile(File file) throws IOException {
    return Files.toByteArray(file);
}
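For the servlet case, usage might look like this (a sketch; file and response are assumed to come from the surrounding servlet code, not shown in the question):

// Illustrative only: 'file' and 'response' are assumed from the question's servlet context.
response.setContentType("application/pdf");
byte[] pdfBytes = Files.toByteArray(file);
response.setContentLength(pdfBytes.length);
response.getOutputStream().write(pdfBytes);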

response.setContentType("application/pdf");
ServletContext ctx = getServletContext();
InputStream is = ctx.getResourceAsStream("/erp.pdf");
int read = 0;
byte[] bytes = new byte[1024];
OutputStream os = response.getOutputStream();
while ((read = is.read(bytes)) != -1) {
    os.write(bytes, 0, read);
}
is.close();
os.flush();
os.close();

A suggestion:
Always look at libraries such as Apache Commons FileUtils; they provide simple, easy-to-use methods for this.
You can also leave out the BufferedInputStream, since you are already reading into your own buffer, but that will not make a big difference. Try using NIO instead of the streams; that might make some difference (see the sketch below).
Also look at this: How to download and save a file from Internet using Java? It might help you in some way.
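For instance, java.nio.file.Files can copy the file to the response in a single call. A minimal sketch, assuming the PDF lives inside the web application (the path below is illustrative, not from the question):

import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

// Files.copy streams the file to the response using its own internal buffer,
// so no manual read/write loop is needed.
Path pdf = Paths.get(getServletContext().getRealPath("/erp.pdf")); // illustrative location
response.setContentType("application/pdf");
response.setContentLength((int) Files.size(pdf));
Files.copy(pdf, response.getOutputStream());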

Related

Upload large files to the Google Drive

In my app, I'm uploading files to Google Drive using the GD API. It works fine for small files, but when the file is large (for example 200 MB), it throws a java.lang.OutOfMemoryError. I know why it crashes: it loads the whole file into memory. Can anyone suggest how I can fix this problem?
This is my code:
OutputStream outputStream = result.getDriveContents().getOutputStream();
FileInputStream fis;
try {
    fis = new FileInputStream(file.getPath());
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    byte[] buf = new byte[8192];
    int n;
    while (-1 != (n = fis.read(buf)))
        baos.write(buf, 0, n);
    byte[] photoBytes = baos.toByteArray();
    outputStream.write(photoBytes);
    outputStream.close();
    outputStream = null;
    fis.close();
    fis = null;
} catch (FileNotFoundException e) {
}
This line would allocate 200 MB of RAM and can definitely cause an OutOfMemoryError:
byte[] photoBytes = baos.toByteArray();
Why are you not writing directly to your outputStream?
while (-1 != (n = fis.read(buf)))
    outputStream.write(buf, 0, n);
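A minimal sketch of the streaming version, reusing the result and file objects from the question and try-with-resources for cleanup:

// Stream the file to Drive in 8 KB chunks; only one small buffer is held in memory.
try (FileInputStream fis = new FileInputStream(file.getPath());
     OutputStream outputStream = result.getDriveContents().getOutputStream()) {
    byte[] buf = new byte[8192];
    int n;
    while (-1 != (n = fis.read(buf))) {
        outputStream.write(buf, 0, n);
    }
}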

java client server chat transferred files not received properly

I have a client/server chat. The client sends files and the server receives them. The problem is that I don't think the files are received properly, because when I check the size of the received file it is roughly half of the original for some reason.
I am using a GUI to browse for files on the client side, and then I send a command to the server so it knows the client is sending a file. But it is not working.
Here are the client and the server:
public void sendFiles(String file) {
    try {
        BufferedOutputStream outToClient = null;
        outToClient = new BufferedOutputStream(sock.getOutputStream());
        System.out.println("Sending file...");
        if (outToClient != null) {
            File myFile = new File(file);
            byte[] mybytearray = new byte[(int) myFile.length()];
            FileInputStream fis = null;
            fis = new FileInputStream(myFile);
            BufferedInputStream bis = new BufferedInputStream(fis);
            this.out.println("SF");
            bis.read(mybytearray, 0, mybytearray.length);
            outToClient.write(mybytearray, 0, mybytearray.length);
            this.out.flush();
            outToClient.flush();
            outToClient.close();
            System.out.println("File sent!");
            return;
        }
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Server
public void recvFile() {
    try {
        byte[] aByte = new byte[1];
        int bytesRead;
        InputStream is = null;
        is = sock.getInputStream();
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        if (is != null) {
            FileOutputStream fos = null;
            BufferedOutputStream bos = null;
            try {
                fos = new FileOutputStream("/Users/Documents/Received.png");
                bos = new BufferedOutputStream(fos);
                bytesRead = is.read(aByte, 0, aByte.length);
                do {
                    baos.write(aByte);
                    bytesRead = is.read(aByte);
                } while (bytesRead != -1);
                bos.write(baos.toByteArray());
                bos.flush();
                bos.close();
                // clientSocket.close();
            } catch (IOException ex) {
                // Do exception handling
            }
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Can someone help me with this issue? I don't know how to properly send and receive files.
Thank you
You are using two copy techniques, and they are both wrong.
First:
byte[] mybytearray = new byte[(int) myFile.length()];
bis.read(mybytearray, 0, mybytearray.length);
outToClient.write(mybytearray, 0, mybytearray.length);
Here you are assuming:
That the file fits into memory.
That the file length fits into an int.
That read() fills the buffer.
None of these assumptions is valid.
Second:
byte[] aByte = new byte[1];
bytesRead = is.read(aByte, 0, aByte.length);
do {
    baos.write(aByte);
    bytesRead = is.read(aByte);
} while (bytesRead != -1);
Here you are:
Using a ridiculously small buffer of one byte.
Writing an extra byte if the file length is zero.
Using a do/while where the situation naturally calls for a while (as 99.99% of situations do), and therefore:
Using two read() calls, and only correctly checking the result of one of them.
Pointlessly using a ByteArrayOutputStream, which, as above, assumes the file fits into memory and that its size fits into an int. It also pointlessly adds latency.
Throw them both away and use this, at both ends:
byte[] buffer = new byte[8192];
int count;
while ((count = in.read(buffer)) > 0)
{
out.write(buffer, 0, count);
}
where:
in is a FileInputStream in the case of sending the file, or the socket input stream in the case of receiving the file.
out is a FileOutputStream in the case of receiving the file, or the socket output stream in the case of sending the file.
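For example, both sides could share one helper (a sketch; the socket and file names are illustrative):

// Shared helper: copies everything from in to out in 8 KB chunks.
static void copy(InputStream in, OutputStream out) throws IOException {
    byte[] buffer = new byte[8192];
    int count;
    while ((count = in.read(buffer)) > 0) {
        out.write(buffer, 0, count);
    }
}

// Sender: file -> socket
try (InputStream in = new FileInputStream(myFile)) {
    copy(in, sock.getOutputStream());
}

// Receiver: socket -> file
try (OutputStream out = new FileOutputStream("Received.png")) {
    copy(sock.getInputStream(), out);
}

Note that the receiver's loop only ends when the sender closes its side of the connection; if the socket has to stay open for the chat, you would need to send the file length first and count the bytes as they arrive.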

How do you read from an InputStream in Java and convert to byte array?

I am currently trying to read data from a server response. I am using a Socket to connect to the server, making an HTTP GET request, and then using a BufferedReader to read the data. Here is what the code looks like, compacted:
Socket conn = new Socket(server, 80);
//Request made here
BufferedReader inFromServer = new BufferedReader(new InputStreamReader(conn.getInputStream()));
String response;
while ((response = inFromServer.readLine()) != null) {
    System.out.println(response);
}
I would like to read the data as a byte array instead of a String, and write it to a file. How is this possible? Any help is greatly appreciated, thank you.
You need to use a ByteArrayOutputStream; do something like the code below:
Socket conn = new Socket(server, 80);
//Request made here
InputStream is = conn.getInputStream();
ByteArrayOutputStream baos = new ByteArrayOutputStream();
byte[] buffer = new byte[1024];
int readBytes = -1;
while ((readBytes = is.read(buffer)) != -1) {
    baos.write(buffer, 0, readBytes);
}
byte[] responseArray = baos.toByteArray();
One way is to use Apache commons-io IOUtils
byte[] bytes = IOUtils.toByteArray(inputstream);
With plain Java:
ByteArrayOutputStream output = new ByteArrayOutputStream();
try (InputStream stream = new FileInputStream("myFile")) {
    byte[] buffer = new byte[2048];
    int numRead;
    while ((numRead = stream.read(buffer)) != -1) {
        output.write(buffer, 0, numRead);
    }
} catch (IOException e) {
    e.printStackTrace();
}
// and here are your bytes
byte[] myDesiredBytes = output.toByteArray();
If you are not using the Apache commons-io library in your project, here is a pretty simple method that does the same thing without it:
/*
 * Reads bytes from the InputStream one at a time, writes them to a
 * ByteArrayOutputStream, and then converts that to a byte array.
 */
public static byte[] toByteArrayUsingJava(InputStream is) throws IOException {
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    int reads = is.read();
    while (reads != -1) {
        baos.write(reads);
        reads = is.read();
    }
    return baos.toByteArray();
}
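On Java 9 and newer, the JDK can also do this in one call; a minimal sketch using the conn socket from the question:

// Reads the whole stream into memory, so it is only suitable for responses
// that are known to be reasonably small.
byte[] bytes;
try (InputStream is = conn.getInputStream()) {
    bytes = is.readAllBytes();
}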

EOFException: Unexpected end of ZLIB input stream

I am facing this issue while unzipping a file and writing it to another file. Here is the code. Can anyone please let me know what changes are required?
I get this exception on the line with while ((len = zis.read(buffer)) > 0)
private FileItem readZippedFileRequest(HttpServletRequest request, Part part, String fileName) throws IOException {
    FileItem fileItem = null;
    byte[] buffer = new byte[1024];
    InputStream inputStream = part.getInputStream();
    ZipInputStream zis = new ZipInputStream(inputStream);
    ZipEntry entry;
    while ((entry = zis.getNextEntry()) != null) {
        ByteArrayOutputStream fos = new ByteArrayOutputStream();
        int len;
        while ((len = zis.read(buffer)) > 0) {
            fos.write(buffer, 0, len);
        }
        InputStream myByteArray = new ByteArrayInputStream(fos.toByteArray());
        fileItem = createCSVFile(myByteArray, fileName, ImportExportConstant.FILE_TYPE_EXCEL);
    }
    return fileItem;
}
There's nothing wrong with your code. There is something wrong with the file, as the message says. Are you sure it is zipped, and not GZipped for example? It would be more usual for a part to be GZipped. Try a GZIPInputStream.
NB there's no need for the ByteArrayInputStream. It's a complete waste of time and space. Just pass the zip/gzip input stream directly to your createCSVFile() method.
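A minimal sketch of that GZIP variant, reusing the question's createCSVFile() method and constants (those are assumptions from the question, not a standard API):

import java.util.zip.GZIPInputStream;

private FileItem readGzippedFileRequest(Part part, String fileName) throws IOException {
    // Decompress on the fly and hand the stream straight to the CSV parser;
    // no intermediate ByteArrayOutputStream/ByteArrayInputStream is needed.
    try (GZIPInputStream gzis = new GZIPInputStream(part.getInputStream())) {
        return createCSVFile(gzis, fileName, ImportExportConstant.FILE_TYPE_EXCEL);
    }
}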
I had this error too and searched around a little. I read that there should be a zis.closeEntry() before len = zis.read(buffer), but when I tried that, the error appeared at zis.closeEntry() instead.
After more searching, here is what finally worked for me: I experimented a bit, then replaced the throws IOException with a try/catch block, and now everything works.
The exception is a well-known bug. You have to put everything in a try/catch block and do nothing in the catch:
private FileItem readZippedFileRequest(HttpServletRequest request, Part part, String fileName) {
    FileItem fileItem = null;
    byte[] buffer = new byte[1024];
    try {
        InputStream inputStream = part.getInputStream();
        ZipInputStream zis = new ZipInputStream(inputStream);
        ZipEntry entry;
        while ((entry = zis.getNextEntry()) != null) {
            ByteArrayOutputStream fos = new ByteArrayOutputStream();
            int len;
            while ((len = zis.read(buffer)) > 0) {
                fos.write(buffer, 0, len);
            }
            InputStream myByteArray = new ByteArrayInputStream(fos.toByteArray());
            fileItem = createCSVFile(myByteArray, fileName, ImportExportConstant.FILE_TYPE_EXCEL);
        }
    } catch (IOException ex) {
        //Do nothing here
    }
    return fileItem;
}

Fastest way to copy text from a File to a HttpServletResponse

I need a very fast way to copy text from a file to the body of an HttpServletResponse.
Currently I am copying byte by byte in a loop, from a BufferedReader to response.getWriter(), but I believe there must be a faster and more straightforward way of doing it.
Thanks!
I like using the read() method that accepts a byte array, since you can tweak the buffer size and tune the performance.
public static void copy(InputStream is, OutputStream os) throws IOException {
    byte[] buffer = new byte[8192];
    int bytesRead;
    BufferedInputStream bis = new BufferedInputStream(is);
    while ((bytesRead = bis.read(buffer)) != -1) {
        os.write(buffer, 0, bytesRead);
    }
    is.close();
    os.flush();
    os.close();
}
There's no need to do this stuff yourself. It is such a common requirement that open source, battle-tested, optimised solutions exist.
Apache Commons IO has an IOUtils class with a range of static copy methods. Perhaps you could use
IOUtils.copy(reader, writer);
http://commons.apache.org/io/api-1.4/org/apache/commons/io/IOUtils.html#copy(java.io.Reader, java.io.Writer)
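In the servlet, the call might look like this (a sketch assuming commons-io is on the classpath and the file contains character data):

// IOUtils.copy handles the buffering internally.
try (Reader reader = new BufferedReader(new FileReader(file))) {
    IOUtils.copy(reader, response.getWriter());
}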
This is how I do it in my servlet, with a 4K buffer:
// Send the file.
OutputStream out = response.getOutputStream();
BufferedInputStream is = new BufferedInputStream(new FileInputStream(file));
byte[] buf = new byte[4 * 1024]; // 4K buffer
int bytesRead;
while ((bytesRead = is.read(buf)) != -1) {
    out.write(buf, 0, bytesRead);
}
is.close();
out.flush();
out.close();
