Java copy part of InputStream to OutputStream

I have a file with 3236000 bytes and I want to read the first 2936000 bytes from it and write them to an OutputStream:
InputStream is = new FileInputStream(file1);
OutputStream os = new FileOutputStream(file2);
AFunctionToCopy(is,os,0,2936000); /* a function or source code to write bytes 0 to 2936000 of the input stream */
I can read and write byte by byte, but I think that's too slow compared to buffered reading.
How can I copy it?

public static void copyStream(InputStream input, OutputStream output, long start, long end)
        throws IOException
{
    for (long i = 0; i < start; i++) input.read();   // dispose of the unwanted leading bytes

    byte[] buffer = new byte[1024];                   // adjust the buffer size if you want
    long remaining = end - start;                     // bytes still to copy
    int bytesRead;
    // stop at EOF or once the requested range has been copied
    while (remaining > 0
            && (bytesRead = input.read(buffer, 0, (int) Math.min(buffer.length, remaining))) != -1) {
        output.write(buffer, 0, bytesRead);
        remaining -= bytesRead;
    }
}
should work for you.

If you have access to the Apache Commons library, you can use:
IOUtils.copyLarge(InputStream input, OutputStream output, long inputOffset, long length)
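For example (a sketch, assuming a Commons IO version that has this four-argument copyLarge overload on the classpath, with file1 and file2 from the question):

try (InputStream is = new FileInputStream(file1);
     OutputStream os = new FileOutputStream(file2)) {
    // skip 0 bytes from the start, then copy the next 2936000 bytes
    IOUtils.copyLarge(is, os, 0, 2936000);
}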

Related

Reading InputStream bytes and writing to ByteArrayOutputStream

I have a code block that reads the specified number of bytes from an InputStream and returns a byte[] via a ByteArrayOutputStream. When I write that byte[] to a file, the resulting file on the filesystem seems broken. Can anyone help me find the problem in the code block below?
public byte[] readWrite(long bytes, InputStream in) throws IOException {
    ByteArrayOutputStream bos = new ByteArrayOutputStream();
    int maxReadBufferSize = 8 * 1024; //8KB
    long numReads = bytes/maxReadBufferSize;
    long numRemainingRead = bytes % maxReadBufferSize;
    for(int i=0; i<numReads; i++) {
        byte bufr[] = new byte[maxReadBufferSize];
        int val = in.read(bufr, 0, bufr.length);
        if(val != -1) {
            bos.write(bufr);
        }
    }
    if(numRemainingRead > 0) {
        byte bufr[] = new byte[(int)numRemainingRead];
        int val = in.read(bufr, 0, bufr.length);
        if(val != -1) {
            bos.write(bufr);
        }
    }
    return bos.toByteArray();
}
My understanding of the problem statement
Read the requested number of bytes (the bytes parameter) from the given InputStream into a ByteArrayOutputStream.
Finally, return a byte array.
Key observations
A lot of work is done to make sure bytes are read in chunks of 8KB.
Also, the last remaining chunk of odd size is read separately.
A lot of work is also done to make sure we are reading from the correct offset.
My views
Unless we are reading a very large file (>10MB) I don't see a valid reason for reading in chunks of 8KB.
Let Java libraries do all the hard work of maintaining offset and making sure we don't read outside limits.
E.g. we don't have to supply an offset: simply call inputStream.read(b) over and over, and up to b.length bytes will be read into b each time. Similarly, we can simply write to the outputStream.
Code
public byte[] readWrite(long bytes, InputStream in) throws IOException {
    ByteArrayOutputStream bos = new ByteArrayOutputStream();
    byte[] buffer = new byte[(int) bytes];
    // note: a single read() may return fewer than 'bytes' bytes; that is usually fine for
    // small local files, but it is not guaranteed for sockets or large streams
    in.read(buffer);
    bos.write(buffer);
    return bos.toByteArray();
}
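If you are on Java 9 or newer, InputStream.readNBytes does the bounded read (including the looping on short reads) for you; a minimal sketch, not part of the original answer:

public byte[] readWrite(long bytes, InputStream in) throws IOException {
    // readNBytes keeps reading until 'bytes' bytes are available or the stream ends
    return in.readNBytes((int) bytes);
}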
References
About InputStreams
Byte Array to Human Readable Format

Java: File to String, problems with using a buffer, byte array not clean

Take the following static method:
public static String fileToString(String filename) throws Exception {
FileInputStream fis = new FileInputStream(filename);
byte[] buffer = new byte[8192];
StringBuffer sb = new StringBuffer();
int bytesRead; // unused? weird compiler messages...
while((bytesRead = fis.read(buffer)) != -1) { // InputStream.read() returns -1 at EOF
sb.append(new String(buffer));
}
return new String(sb);
}
As you can see everything looks okay, and it is perfect for small text files. But once you get to big files with thousands of lines, you encounter problems with repeating text. Based on my intuition, I thought byte[] buffer was "unclean", so to speak. So I added the following line to the method:
buffer = new byte[8192];
So that it is now:
public static String fileToString(String filename) throws Exception {
FileInputStream fis = new FileInputStream(filename);
byte[] buffer = new byte[8192];
StringBuffer sb = new StringBuffer();
int bytesRead; // unused? weird compiler messages...
while((bytesRead = fis.read(buffer)) != -1) { // InputStream.read() returns -1 at EOF
sb.append(new String(buffer));
buffer = new byte[8192]; // added new line here
}
return new String(sb);
}
And it's perfect, except for the fact that at the end of the String that the static method returns, I get a lot of null characters (depends on the buffer size). What's going on here?
Actually, the "// unused? weird compiler messages..." comment is not weird: you never read bytesRead.
How could sb.append(new String(buffer)) know how many bytes were actually written into the buffer?
Exactly; this is where bytesRead comes into play.
So you need new String(bytes, offset, length):
public static String fileToString(String filename) throws Exception {
    FileInputStream fis = new FileInputStream(filename);
    byte[] buffer = new byte[8192];
    StringBuffer sb = new StringBuffer();
    int bytesRead;
    while ((bytesRead = fis.read(buffer)) != -1) { // InputStream.read() returns -1 at EOF
        sb.append(new String(buffer, 0, bytesRead));
        buffer = new byte[8192]; // no longer strictly needed once the length is passed, but harmless
        bytesRead = 0;
    }
    return new String(sb);
}
might work
You really shouldn't be reading bytes and creating a String from the raw bytes. This is wrong because it completely ignores the encoding of the text. You might be lucky and be reading ASCII, in which case things will just work out. In all other cases this is asking for trouble.
You really should use a BufferedReader which wraps an InputStreamReader which wraps your source InputStream.
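A minimal sketch of that reader chain (the UTF-8 charset here is an assumption; use whatever encoding the file actually has):

public static String fileToString(String filename) throws IOException {
    StringBuilder sb = new StringBuilder();
    try (BufferedReader reader = new BufferedReader(
            new InputStreamReader(new FileInputStream(filename), "UTF-8"))) {
        char[] chunk = new char[8192];
        int charsRead;
        while ((charsRead = reader.read(chunk)) != -1) {
            sb.append(chunk, 0, charsRead);   // append only the chars actually read
        }
    }
    return sb.toString();
}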
Don't reinvent the wheel. If you are not doing school homework, use an existing library like Apache Commons IO.
http://commons.apache.org/io/apidocs/org/apache/commons/io/IOUtils.html#toString%28java.io.InputStream,%20java.nio.charset.Charset%29
For example, you can read the file into a String in just a few lines, like the following:
public static String fileToString(String filepath) throws Exception {
return IOUtils.toString(new FileInputStream(filepath), "utf-8");
}
This will save you from a lot of hand-written custom code and will likely have far fewer bugs.

Write an InputStream to an HttpServletResponse

I have an InputStream that I want written to a HttpServletResponse.
There's this approach, which takes too long due to the use of byte[]
InputStream is = getInputStream();
int contentLength = getContentLength();
byte[] data = new byte[contentLength];
is.read(data);
//response here is the HttpServletResponse object
response.setContentLength(contentLength);
response.getOutputStream().write(data);
I was wondering what could possibly be the best way to do it, in terms of speed and efficiency.
Just write in blocks instead of copying it entirely into Java's memory first. The below basic example writes it in blocks of 10KB. This way you end up with a consistent memory usage of only 10KB instead of the complete content length. Also the end user will start getting parts of the content much sooner.
response.setContentLength(getContentLength());
byte[] buffer = new byte[10240];
try (
InputStream input = getInputStream();
OutputStream output = response.getOutputStream();
) {
for (int length = 0; (length = input.read(buffer)) > 0;) {
output.write(buffer, 0, length);
}
}
As creme de la creme with regard to performance, you could use NIO Channels and a directly allocated ByteBuffer. Create the following utility/helper method in some custom utility class, e.g. Utils:
public static long stream(InputStream input, OutputStream output) throws IOException {
try (
ReadableByteChannel inputChannel = Channels.newChannel(input);
WritableByteChannel outputChannel = Channels.newChannel(output);
) {
ByteBuffer buffer = ByteBuffer.allocateDirect(10240);
long size = 0;
while (inputChannel.read(buffer) != -1) {
buffer.flip();
size += outputChannel.write(buffer);
buffer.clear();
}
return size;
}
}
Which you then use as below:
response.setContentLength(getContentLength());
Utils.stream(getInputStream(), response.getOutputStream());
OutputStream os = new BufferedOutputStream(response.getOutputStream());
BufferedInputStream in = new BufferedInputStream(new FileInputStream(file));
BufferedOutputStream out = new BufferedOutputStream(os);

byte[] buffer = new byte[1024 * 8];
int j;
while ((j = in.read(buffer)) != -1) {
    out.write(buffer, 0, j);
}
out.flush(); // make sure the buffered bytes actually reach the response
in.close();
I think that is very close to the best way, but I would suggest the following change. Use a fixed-size buffer (say 20 KB) and then do the read/write in a loop.
For the loop do something like
byte[] buffer=new byte[20*1024];
outputStream=response.getOutputStream();
while(true) {
int readSize=is.read(buffer);
if(readSize==-1)
break;
outputStream.write(buffer,0,readSize);
}
P.S. Your program will not always work as-is, because read() doesn't always fill the entire array you give it.
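If you really do need the array filled completely, a common pattern is to loop until the requested count has been read; a sketch, not from the original answer:

// Keeps calling read() until 'buffer' is full or the stream ends; returns the count actually read
static int readFully(InputStream in, byte[] buffer) throws IOException {
    int offset = 0;
    while (offset < buffer.length) {
        int n = in.read(buffer, offset, buffer.length - offset);
        if (n == -1) {
            break;  // end of stream before the buffer was filled
        }
        offset += n;
    }
    return offset;
}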

Java InputStream reading problem

I have a Java class, where I'm reading data in via an InputStream
byte[] b = null;
try {
b = new byte[in.available()];
in.read(b);
} catch (IOException e) {
e.printStackTrace();
}
It works perfectly when I run my app from the IDE (Eclipse).
But when I export my project and it's packed in a JAR, the read call doesn't read all the data. How can I fix it?
This problem mostly occurs when the InputStream comes from a file (~10 KB).
Thanks!
Usually I prefer using a fixed-size buffer when reading from an input stream. As evilone pointed out, using available() as the buffer size might not be a good idea because, say, if you are reading a remote resource, you might not know the available bytes in advance. You can read the javadoc of InputStream to get more insight.
Here is the code snippet I usually use for reading input stream:
byte[] buffer = new byte[BUFFER_SIZE];
int bytesRead = 0;
while ((bytesRead = in.read(buffer)) >= 0){
for (int i = 0; i < bytesRead; i++){
//Do whatever you need with the bytes here
}
}
The version of read() I'm using here will fill the given buffer as much as possible and
return the number of bytes actually read. This means there is a chance that your buffer may contain trailing garbage data, so it is very important to use bytes only up to bytesRead.
Note the condition (bytesRead = in.read(buffer)) >= 0: there is nothing in the InputStream spec saying that read() cannot read 0 bytes. You may need to handle the case where read() reads 0 bytes as a special case, depending on your situation. For local files I have never seen it happen; however, when reading remote resources I have actually seen read() return 0 bytes constantly, turning the above code into an infinite loop. I solved the infinite-loop problem by counting the number of times I read 0 bytes; when the counter exceeds a threshold I throw an exception. You may not encounter this problem, but just keep it in mind :)
I would probably stay away from creating a new byte array for each read, for performance reasons.
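A sketch of the zero-read guard mentioned above (the threshold of 100 is my own arbitrary choice, and in/BUFFER_SIZE are taken from the snippet above):

byte[] buffer = new byte[BUFFER_SIZE];
int zeroReads = 0;
int bytesRead;
while ((bytesRead = in.read(buffer)) >= 0) {
    if (bytesRead == 0) {
        if (++zeroReads > 100) {            // arbitrary threshold to avoid spinning forever
            throw new IOException("read() returned 0 bytes too many times");
        }
        continue;
    }
    zeroReads = 0;                          // reset once real data arrives
    // ... use buffer[0 .. bytesRead-1] here ...
}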
read() will return -1 when the InputStream is depleted. There is also a version of read which takes a byte array; this allows you to do chunked reads. It returns the number of bytes actually read, or -1 when at the end of the InputStream. Combine this with a dynamic buffer such as ByteArrayOutputStream to get the following:
InputStream in = ...
ByteArrayOutputStream buffer = new ByteArrayOutputStream();
int read;
byte[] input = new byte[4096];
while ( -1 != ( read = in.read( input ) ) ) {
buffer.write( input, 0, read );
}
input = buffer.toByteArray();
This cuts down a lot on the number of methods you have to invoke and allows the ByteArrayOutputStream to grow its internal buffer faster.
File file = new File("/path/to/file");
try {
InputStream is = new FileInputStream(file);
byte[] bytes = IOUtils.toByteArray(is);
System.out.println("Byte array size: " + bytes.length);
} catch (IOException e) {
e.printStackTrace();
}
Below is a snippet of code that downloads a file (*.png, *.jpeg, *.gif, ...) and writes it to a BufferedOutputStream wrapping the HttpServletResponse.
BufferedInputStream inputStream = bo.getBufferedInputStream(imageFile);
try {
    ByteArrayOutputStream buffer = new ByteArrayOutputStream();
    int bytesRead = 0;
    byte[] input = new byte[DefaultBufferSizeIndicator.getDefaultBufferSize()];
    while (-1 != (bytesRead = inputStream.read(input))) {
        buffer.write(input, 0, bytesRead);
    }
    input = buffer.toByteArray();

    response.reset();
    response.setBufferSize(DefaultBufferSizeIndicator.getDefaultBufferSize());
    response.setContentType(mimeType);
    // Here's the secret. Content-Length should equal the number of bytes read.
    response.setHeader("Content-Length", String.valueOf(buffer.size()));
    response.setHeader("Content-Disposition", "inline; filename=\"" + imageFile.getName() + "\"");

    BufferedOutputStream outputStream = new BufferedOutputStream(response.getOutputStream(), DefaultBufferSizeIndicator.getDefaultBufferSize());
    try {
        outputStream.write(input, 0, buffer.size());
    } finally {
        ImageBO.close(outputStream);
    }
} finally {
    ImageBO.close(inputStream);
}
Hope this helps.

Java: BufferedReader reads more than a line?

I'm making a program in Java with sockets. I can send commands to the client and from the client to the server. To read the commands I use a BufferedReader; to write them, a PrintWriter. But now I want to transfer a file through that socket (not simply create a second connection). First I write to the output stream how many bytes the file contains, for example 40000 bytes. So I write the number 40000 through the socket, but the other side of the connection reads 78.
So I was thinking: the BufferedReader reads more than just the line (when readLine() is called), and that way I lose some bytes of the file data, because they are sitting in the BufferedReader's buffer.
So the number 78 is a byte of the file I want to transmit.
Is this way of thinking right or not? If so, how do I solve this problem?
I hope I've explained it well.
Here is my code; my native language is Dutch, so some variable names may sound strange.
public void flushStreamToStream(InputStream is, OutputStream os, boolean closeIn, boolean closeOut) throws IOException {
    byte[] buffer = new byte[BUFFERSIZE];
    int bytesRead;

    if ((!closeOut) && closeIn) { // To Socket from File
        action = "Upload";
        os.write(is.available()); // Here I write 400000
        max = is.available();
        System.out.println("Bytes to send: " + max);
        while ((bytesRead = is.read(buffer)) != -1) {
            startTiming(); // Two lines to compute the speed
            os.write(buffer, 0, bytesRead);
            stopTiming(); // Speed computation
            process += bytesRead;
        }
        os.flush();
        is.close();
        return;
    }

    if ((!closeIn) && closeOut) { // To File from Socket
        action = "Download";
        int bytesToRead = -1;
        bytesToRead = is.read(); // Here he reads 78.
        System.out.println("Bytes to read: " + bytesToRead);
        max = bytesToRead;
        int nextBufferSize;
        while ((nextBufferSize = Math.min(BUFFERSIZE, bytesToRead)) > 0) {
            startTiming();
            bytesRead = is.read(buffer, 0, nextBufferSize);
            bytesToRead -= bytesRead;
            process += nextBufferSize;
            os.write(buffer, 0, bytesRead);
            stopTiming();
        }
        os.flush();
        os.close();
        return;
    }

    throw new IllegalArgumentException("The only two boolean combinations are: closeOut == false && closeIn == true AND closeOut == true && closeIn == false");
}
Here is the solution, thanks to James' suggestion.
I think laginimaineb's answer was a piece of the solution.
Read the commands.
DataInputStream in = new DataInputStream(is); // Originally a BufferedReader
// Read the request line
String str;
while ((str = in.readLine()) != null) {
    if (str.trim().equals("")) {
        continue;
    }
    handleSocketInput(str);
}
Now the flushStreamToStream:
public void flushStreamToStream(InputStream is, OutputStream os, boolean closeIn, boolean closeOut) throws IOException {
    byte[] buffer = new byte[BUFFERSIZE];
    int bytesRead;

    if ((!closeOut) && closeIn) { // To Socket from File
        action = "Upload";
        DataOutputStream dos = new DataOutputStream(os);
        dos.writeInt(is.available());
        max = is.available();
        System.out.println("Bytes to send: " + max);
        while ((bytesRead = is.read(buffer)) != -1) {
            startTiming();
            dos.write(buffer, 0, bytesRead);
            stopTiming();
            process += bytesRead;
        }
        os.flush();
        is.close();
        return;
    }

    if ((!closeIn) && closeOut) { // To File from Socket
        action = "Download";
        DataInputStream dis = new DataInputStream(is);
        int bytesToRead = dis.readInt();
        System.out.println("Bytes to read: " + bytesToRead);
        max = bytesToRead;
        int nextBufferSize;
        while ((nextBufferSize = Math.min(BUFFERSIZE, bytesToRead)) > 0) {
            startTiming();
            bytesRead = is.read(buffer, 0, nextBufferSize);
            bytesToRead -= bytesRead;
            process += nextBufferSize;
            os.write(buffer, 0, bytesRead);
            stopTiming();
        }
        os.flush();
        os.close();
        return;
    }

    throw new IllegalArgumentException("The only two boolean combinations are: closeOut == false && closeIn == true AND closeOut == true && closeIn == false");
}
Martijn.
I'm not sure I've followed your explanation.
However, yes - you have no real control over how much a BufferedReader will actually read. The point of such a reader is that it optimistically reads chunks of the underlying resource as needed to replenish its buffer. So when you first call readLine, it will see that its internal buffer doesn't have enough to serve you the request, and will go off and read however many bytes it feels like into its buffer from the underlying source, which will generally be much more than you asked for just then. Once the buffer's been populated, it returns your line from the buffered content.
Thus, once you wrap an input stream in a BufferedReader, you should be sure to only read that stream through the same buffered reader. If you don't, you'll end up losing data (as some bytes will have been consumed and are now sitting in the BufferedReader's cache waiting to be served).
DataInputStream is most likely what you want to use. Also, don't use the available() method as it is generally useless.
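A minimal sketch of the length-prefixed exchange that advice implies (socket and fileBytes are illustrative names, not from the original code):

// Sender side: prefix the payload with its exact length
DataOutputStream dos = new DataOutputStream(socket.getOutputStream());
dos.writeInt(fileBytes.length); // writes all four bytes of the int (OutputStream.write(int) would keep only the low byte)
dos.write(fileBytes);
dos.flush();

// Receiver side: read the length, then block until exactly that many bytes have arrived
DataInputStream dis = new DataInputStream(socket.getInputStream());
int length = dis.readInt();
byte[] payload = new byte[length];
dis.readFully(payload);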
A BufferedReader assumes that it is the only one reading from the underlying input stream.
Its purpose is to minimize the number of reads from the underlying stream (which are expensive, as they can delegate quite deeply). To that end, it keeps a buffer which it fills by reading as many bytes as possible into it in a single call to the underlying stream.
So yes, your diagnosis is accurate.
Just a wild stab here - 40000 is 1001110001000000 in binary. Now, the first seven bits here are 1001110 which is 78. Meaning, you're writing 2 bytes of information but reading seven bits.
