Is it possible to close a Reader without closing the stream? - java

I have a method which accepts an InputStream (of binary data) and serializes it to XML. In order to do so, it wraps the stream with a base64 encoder and a Reader to convert it to character data. However, since the InputStream is passed in as a parameter, I would consider it a harmful side effect to close the stream, and the contract for Reader.close() says it would do just that. If I don't close the reader, the compiler warns me that I have a
Resource leak: reader is never closed
So, I can add a @SuppressWarnings( "resource" ) to the reader declaration, but is that the right thing to do? Am I missing something?
Here is the actual code:
/**
* Writes base64 encoded text read from the binary stream.
*
* @param binaryStream
*            The binary stream to write from
* @return <code>this</code> XmlWriter (for chaining)
* @throws IOException
*/
public XmlWriter binary( InputStream binaryStream ) throws IOException {
    Reader reader = new InputStreamReader(
            new Base64InputStream( binaryStream, true, base64LineLength, base64LineSeparator.getBytes( charset ) ) );
    int bufferSize = 2048;
    int charsRead;
    char[] buffer = new char[bufferSize];
    while ( (charsRead = reader.read( buffer, 0, bufferSize )) >= 0 ) {
        writer.write( buffer, 0, charsRead );
    }
    return this;
}

If you are a happy Java 7 user, try this:
try (InputStream binaryStream = /* ... */) {
    xmlWriter.binary(binaryStream);
}
and the stream is closed for you. If you can't use Java 7, I agree that it's not the responsibility of the binary() method to close() the stream. Just ignore the warning and don't let tools drive your design. It's fine.
As a last resort you can write a lightweight Reader wrapper that ignores close(), but I don't advise it, as it makes following the program flow harder.
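For illustration, such a close-ignoring wrapper could look like this (a minimal sketch; NonClosingReader is a name I'm making up here, not a library class):

```java
import java.io.FilterReader;
import java.io.Reader;

// Illustrative wrapper: delegates everything to the wrapped Reader
// but deliberately ignores close().
class NonClosingReader extends FilterReader {
    NonClosingReader(Reader in) {
        super(in);
    }

    @Override
    public void close() {
        // Swallow close() so the underlying stream stays open;
        // whoever created the stream remains responsible for closing it.
    }
}
```

Closing the wrapper then has no effect on the wrapped stream, which silences the resource warning without the side effect.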
Also let Apache Commons IO help you with IOUtils.copy():
public XmlWriter binary( InputStream binaryStream ) throws IOException {
    Reader reader = new InputStreamReader(
            new Base64InputStream( binaryStream, true, base64LineLength, base64LineSeparator.getBytes( charset ) ) );
    IOUtils.copy(reader, writer);
    return this;
}

This is perhaps a "feature" in the way Base64InputStream works: even though you specify the length to read, closing it will close the underlying stream, when you clearly intended not to read the whole stream.
You could wrap the binaryStream in an InputStream which ignores the close, or you could suppress the warning as you have.
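A minimal sketch of such a close-ignoring InputStream wrapper (the CloseShieldStream name is made up; if you already depend on Apache Commons IO, its CloseShieldInputStream does the same job):

```java
import java.io.FilterInputStream;
import java.io.InputStream;

// Illustrative sketch: an InputStream decorator whose close() is a no-op,
// shielding the wrapped stream from being closed by downstream readers.
class CloseShieldStream extends FilterInputStream {
    CloseShieldStream(InputStream in) {
        super(in);
    }

    @Override
    public void close() {
        // Do nothing: the caller keeps ownership of the real stream.
    }
}
```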

Related

JsonGenerator appending JSON data to file without overwriting

I am reading bytes from a socket and then writing to a JSON file using JsonGenerator. The problem is that JsonGenerator overwrites the file every time a stream is received from the socket. How do I make it append the subsequent streams instead of overwriting?
JsonFactory factory = new JsonFactory();
JsonGenerator generator = factory.createGenerator(
        new File("transactions.json"), JsonEncoding.UTF8);
try {
    while ( (bytesRead = in.read(bytes)) != -1 ) { // -1 indicates EOF
        output = new String(bytes, "UTF-8");
        String length = output.substring(0, 4).trim();
        String mti = output.substring(4, 8).trim();
        String resp = "000";
        String newoutput = "";
        String bitmap = output.substring(8, 24);
        String stan = output.substring(24, 30);
        String date = output.substring(30, 44);
        String function_code = output.substring(44, 47);
        mti = "1814";
        // output to file
        generator.writeStartObject();
        generator.writeStringField("MTI", mti);
        generator.writeStringField("lenght", length);
        generator.writeStringField("stan", stan);
        generator.writeStringField("date", date);
        generator.writeStringField("Function Code", function_code);
        generator.writeEndObject();
    }
} catch (Exception e) {
    System.out.println("Exceptions " + e);
} finally {
    generator.close();
}
Also, when I declare the generator outside the while loop and close it after the loop, for some reason the data is not written to the file, so I assume the generator buffers the output and only writes it to the file when closed.
I might be missing something in your question, but the overwriting reason that jumps out to me is that you aren't specifying that the file should be appended to. Most Java APIs (including Jackson) default to overwriting instead of appending. The simple solution to this is just to use:
// the second parameter specifies whether the file should be appended
try (OutputStream fos = new FileOutputStream(new File("transactions.json"), true)) {
    // pass the FileOutputStream to the generator instead
    JsonGenerator generator = factory.createGenerator(fos, JsonEncoding.UTF8);
}
I would leave it at that for my answer, but I would be remiss if I didn't point out that if you are reading from multiple sockets concurrently, then you are probably going to end up with JSON data written interleaved.
I suggest wrapping the method in a synchronized block of some sort to prevent this and make it thread-safe.
Below I have an example of how I would re-write this functionality.
import com.fasterxml.jackson.core.JsonEncoding;
import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonGenerator;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
/**
 * A thread-safe class that will append JSON transaction data to a file.
 */
public class TransactionWriter {

    private static final JsonFactory jsonFactory = new JsonFactory();

    /**
     * Helper method that will read a number of UTF8 characters from an input stream and return them in a string.
     *
     * @param inputStream stream to read from
     * @param charsToRead number of characters to read
     * @return string of read characters
     * @throws IOException when unable to read enough characters from the stream
     */
    private static String readUtf8Chars(InputStream inputStream, int charsToRead) throws IOException {
        // since we know this is UTF8 up front, we can assume one byte per char
        byte[] buffer = new byte[charsToRead];
        // fill the buffer
        int readBytes = inputStream.read(buffer);
        // check that the buffer was actually filled
        if (readBytes < charsToRead)
            throw new IOException("fewer bytes available to read than expected: " + readBytes + " instead of " + charsToRead);
        // create a string from the buffer
        return new String(buffer, StandardCharsets.UTF_8);
    }

    private final File file;
    private final Object writeLock = new Object();

    /**
     * Constructs a new instance for an output file.
     *
     * @param file file to append to
     */
    public TransactionWriter(File file) {
        this.file = file;
    }

    /**
     * Reads a transaction from the input stream and appends a JSON representation to this instance's output file.
     *
     * @param inputStream stream to read from; will be closed after this method returns
     * @throws IOException when reading or writing failed
     */
    public void write(InputStream inputStream) throws IOException {
        // since we have multiple threads appending to the same file, synchronize to prevent concurrency issues
        synchronized (writeLock) {
            // open the output stream to append to the file
            try (FileOutputStream outputStream = new FileOutputStream(file, true)) {
                // create the generator for the output stream
                JsonGenerator generator = jsonFactory.createGenerator(outputStream, JsonEncoding.UTF8);
                // write the data to the generator
                generator.writeStartObject();
                generator.writeStringField("length", readUtf8Chars(inputStream, 4).trim());
                generator.writeStringField("MTI", readUtf8Chars(inputStream, 4).trim());
                String bitmap = readUtf8Chars(inputStream, 16);
                generator.writeStringField("stan", readUtf8Chars(inputStream, 8));
                generator.writeStringField("date", readUtf8Chars(inputStream, 14));
                generator.writeStringField("Function Code", readUtf8Chars(inputStream, 3));
                generator.writeEndObject();
                // close the generator so its buffered output actually reaches the file
                generator.close();
            } finally {
                // output stream is closed in try-with-resources, but also close the input stream
                inputStream.close();
            }
        }
    }
}
To be clear, I have not tested this code at all. I simply know that it compiles on Java 7 language level.
Declare the generator outside the loop (before the loop).
Wrap the loop in a try-catch statement; there are two options:
you should close the generator (after the loop) in a finally block, or use it in a try-with-resources statement if you use Java 1.7.
Additionally, do you know that you should ensure that you read the whole message? The code you present could read half a message and try to process it, which will probably result in an exception. You should have some protocol that reads the messages off the socket's InputStream and processes only whole messages, not half-loaded chunks. – Krzysztof Cichocki
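To illustrate that comment's point, here is one hedged sketch (the MessageReader name is illustrative, not part of any library): DataInputStream.readFully blocks until exactly the requested number of bytes has arrived, or throws EOFException if the stream ends first, so a half-received field is never processed.

```java
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;

// Illustrative helper: reads exactly len bytes from the stream,
// blocking until they all arrive (or throwing EOFException on early EOF).
class MessageReader {
    static byte[] readExactly(InputStream in, int len) throws IOException {
        byte[] buf = new byte[len];
        new DataInputStream(in).readFully(buf);
        return buf;
    }
}
```

Each fixed-size field of the message (length, MTI, bitmap, ...) can then be read with one readExactly call instead of hoping a single read() returned the whole message.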

Testing a method that reads from standard input and outputs to standard output

I have a method which reads lines from the standard input and writes lines to the standard output.
From within a JUnit test, how can I send input to the method, and how can I capture its output so that I can make assertions on it?
You should not have a method which reads from standard input and writes to standard output.
You should have a method which accepts as parameters the InputStream from which it reads, and the PrintStream into which it writes. (This is an application, at the method level, of a principle known as Dependency Injection (Wikipedia) which is generally used at the class level.)
Then, under normal circumstances, you invoke that method passing it System.in and System.out as parameters.
But when you want to test it, you can pass it an InputStream and a PrintStream that you have created for test purposes.
So, you can use something along these lines:
void testMyAwesomeMethod( String testInput, String expectedOutput )
{
    byte[] bytes = testInput.getBytes( StandardCharsets.UTF_8 );
    InputStream inputStream = new ByteArrayInputStream( bytes );
    StringWriter stringWriter = new StringWriter();
    try ( PrintWriter printWriter = new PrintWriter( stringWriter ) )
    {
        myAwesomeMethod( inputStream, printWriter );
    }
    String result = stringWriter.toString();
    assert result.equals( expectedOutput );
}

Why output is incomplete using OutputStreamWriter with GZIPOutputStream?

I'm hoping someone can shed light on a problem I'm having with the code below.
private static String encode(String data) throws IOException {
    try (
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        InputStream is = new ByteArrayInputStream(data.getBytes());
        BufferedReader br = new BufferedReader(new InputStreamReader(is));
        OutputStreamWriter writer = new OutputStreamWriter(new GZIPOutputStream(out));
    ) {
        char[] charBuffer = new char[data.length()];
        while (br.read(charBuffer) != -1) {
            writer.write(charBuffer);
        }
        // writer.close();
        return new String(Base64.encodeBase64(out.toByteArray()));
    }
}
My test value is
This is some text that I can test some base64 encoding with.
I am getting a strange problem with the encoded value
when writer.close is commented out, my output is
H4sIAAAAAAAAAA==
When it's not commented, my output is
H4sIAAAAAAAAACWMzQ2DMBSD70jdwRP0VLED9y7wACuJRJKKZ37GJxTJF/uz/Y3J0eQ1E+IpKJowYLLSvOshozn7D1imOqcScCTF96sbYBmB0p0ZXKuVQPzWOi7M/1b747PjjN2WjRd08DfZcwAAAA==
This second output is the correct value and can be decoded back into the original value.
It is my understanding that the process is as follows:
Try code is executed
Return value is evaluated and stored
Resources are all closed
The return value is returned.
Why then does the writer need to be closed for the output to be correct?
A GZIP output stream has to be told when to finish the compression and write the result to the underlying stream. close() implicitly flushes the writer and calls finish() on the GZIP stream.
So, call at least flush() on the writer and finish() or close() on the GZIP stream, or just close the writer, which will do all that and won't hurt anyway.
It's because closing the writer flushes the stream. When you don't close the writer it does not get flushed (the buffer is flushed automatically only when it gets full or the writer is closed). You can also do it explicitly by writing
writer.flush()
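To make the accepted explanation concrete, here is a sketch of the question's encode() with the resource scopes rearranged so the writer is closed (which finish()es the GZIP stream) before the buffer is read. I've swapped the Commons Codec Base64 for the JDK's java.util.Base64 (Java 8+) purely to keep the sketch self-contained:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.nio.charset.StandardCharsets;
import java.util.Base64;
import java.util.zip.GZIPOutputStream;

class GzipEncoder {
    static String encode(String data) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        // Closing the writer flushes it and calls finish() on the GZIP stream,
        // so the complete compressed data (including the trailer) is in `out`
        // before we read the bytes back out.
        try (OutputStreamWriter writer =
                 new OutputStreamWriter(new GZIPOutputStream(out), StandardCharsets.UTF_8)) {
            writer.write(data);
        }
        return Base64.getEncoder().encodeToString(out.toByteArray());
    }
}
```

The key design point is that the ByteArrayOutputStream's scope outlives the writer's try-with-resources block, so toByteArray() only runs after the GZIP stream has finished.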

Why does my Sax Parser produce no results after using InputStream Read?

I have this piece of code which I'm hoping will be able to tell me how much data I have downloaded (and soon put it in a progress bar), and then parse the results through my Sax Parser. If I comment out basically everything above the //xr.parse(new InputSource(request.getInputStream())); line and swap the xr.parse's over, it works fine. But at the moment, my Sax parser tells me I have nothing. Is it something to do with is.read (buffer) section?
Also, just as a note, request is a HttpURLConnection with various signatures.
/* Input stream to read from our connection */
InputStream is = request.getInputStream();
/* We make a 2 KB buffer to accelerate the download, instead of reading the file one byte at a time */
byte[] buffer = new byte[2048];
/* How many bytes we have already downloaded */
int totBytes, bytes, sumBytes = 0;
totBytes = request.getContentLength();
while (true) {
    /* How many bytes we got */
    bytes = is.read(buffer);
    /* If no more bytes, we're done with the download */
    if (bytes <= 0) break;
    sumBytes += bytes;
    Log.v("XML", sumBytes + " of " + totBytes + " " + ((float) sumBytes / (float) totBytes) * 100 + "% done");
}
/* Parse the XML data from our URL. */
// OLD, and works if we comment out all of the above
//xr.parse(new InputSource(request.getInputStream()));
xr.parse(new InputSource(is));
/* Parsing has finished. */
Can anyone help me at all??
Kind regards,
Andy
'I could only find a way to do that with bytes, unless you know another method?'
But you haven't found a method. You've just written code that doesn't work. And you don't want to save the input to a String either. You want to count the bytes while you're parsing them. Otherwise you're just adding latency, i.e. wasting time and slowing everything down. For an example of how to do it right, see javax.swing.ProgressMonitorInputStream. You don't have to use that, but you certainly do have to use a FilterInputStream of some sort, probably one you write yourself, that is wrapped around the request input stream and passed to the parser.
Your while loop is consuming the input stream and leaving nothing for the parser to read.
For what you're trying to do, you might want to look into implementing a FilterInputStream subclass wrapping the input stream.
You are building an InputStream over another InputStream whose data has already been consumed.
If you want to avoid reading just single bytes you could use a BufferedInputStream or something like a BufferedReader.
In any case it's better to obtain the whole content before parsing it, unless you need to parse it dynamically.
If you really want to keep it on like you are doing you should create two piped streams:
PipedOutputStream pipeOut = new PipedOutputStream();
PipedInputStream pipeIn = new PipedInputStream();
pipeIn.connect(pipeOut);
pipeOut.write(yourBytes);
xr.parse(pipeIn);
Streams in Java, as their name suggests, don't have a precise size, nor do you know when they will end, so once you read from an InputStream you cannot then pass the same InputStream to another object: the data has already been consumed.
If you want to do both things (downloading and parsing) together, you have to hook into the data received from the HttpURLConnection. You should:
first, know the length of the data being downloaded; this can be obtained from the HttpURLConnection headers
second, use a custom InputStream that decorates the connection's stream (this is how streams work in Java, see here), updating the progress bar as data is read.
Something like:
class MyInputStream extends FilterInputStream
{
    private final int total;

    MyInputStream(InputStream is, int total)
    {
        super(is);
        this.total = total;
    }

    public int read() throws IOException
    {
        stepProgress(1);
        return super.read();
    }

    public int read(byte[] b) throws IOException
    {
        int l = super.read(b);
        stepProgress(l);
        return l;
    }

    public int read(byte[] b, int off, int len) throws IOException
    {
        int l = super.read(b, off, len);
        stepProgress(l);
        return l;
    }
}

InputStream mis = new MyInputStream(request.getInputStream(), length);
..
xr.parse(mis);
You can save your data in a file, and then read it back out.
InputStream is = request.getInputStream();
if (is != null) {
    File file = new File(path, "someFile.txt");
    FileOutputStream os = new FileOutputStream(file);
    buffer = new byte[2048];
    bufferLength = 0;
    while ((bufferLength = is.read(buffer)) > 0)
        os.write(buffer, 0, bufferLength);
    os.flush();
    os.close();
    XmlPullParserFactory factory = XmlPullParserFactory.newInstance();
    factory.setNamespaceAware(true);
    XmlPullParser xpp = factory.newPullParser();
    FileInputStream fis = new FileInputStream(file);
    xpp.setInput(new InputStreamReader(fis));
}

Determine the size of an InputStream

My current situation is: I have to read a file and put the contents into InputStream. Afterwards I need to place the contents of the InputStream into a byte array which requires (as far as I know) the size of the InputStream. Any ideas?
As requested, I will show the input stream that I am creating from an uploaded file
InputStream uploadedStream = null;
FileItemFactory factory = new DiskFileItemFactory();
ServletFileUpload upload = new ServletFileUpload(factory);
java.util.List items = upload.parseRequest(request);
java.util.Iterator iter = items.iterator();
while (iter.hasNext()) {
    FileItem item = (FileItem) iter.next();
    if (!item.isFormField()) {
        uploadedStream = item.getInputStream();
        //CHANGE uploadedStreambyte = item.get()
    }
}
The request is an HttpServletRequest object, while FileItemFactory and ServletFileUpload are from the Apache Commons FileUpload package.
This is a REALLY old thread, but it was still the first thing to pop up when I googled the issue. So I just wanted to add this:
InputStream inputStream = conn.getInputStream();
int length = inputStream.available();
Worked for me. And MUCH simpler than the other answers here.
Warning: this solution does not provide reliable results regarding the total size of a stream. Excerpt from the JavaDoc:
Note that while some implementations of {@code InputStream} will return the total number of bytes in the stream, many will not.
I would read into a ByteArrayOutputStream and then call toByteArray() to get the resultant byte array. You don't need to define the size in advance (although it's possibly an optimisation if you know it. In many cases you won't)
You can't determine the amount of data in a stream without reading it; you can, however, ask for the size of a file:
http://java.sun.com/javase/6/docs/api/java/io/File.html#length()
If that isn't possible, you can write the bytes you read from the input stream to a ByteArrayOutputStream which will grow as required.
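A minimal sketch of that approach (the StreamBytes helper name is mine, not a library API):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

// Illustrative helper: copies the stream into a growing in-memory buffer;
// the resulting array's length is the stream's total size.
class StreamBytes {
    static byte[] toByteArray(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buffer = new byte[4096];
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
        }
        return out.toByteArray();
    }
}
```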
I just wanted to add, Apache Commons IO has stream support utilities to perform the copy. (Btw, what do you mean by placing the file into an inputstream? Can you show us your code?)
Edit:
Okay, what do you want to do with the contents of the item?
There is an item.get() which returns the entire thing in a byte array.
Edit2
item.getSize() will return the uploaded file size.
For an InputStream:
org.apache.commons.io.IOUtils.toByteArray(inputStream).length
For an Optional<MultipartFile>:
Stream.of(multipartFile.get()).mapToLong(file -> file.getSize()).findFirst().getAsLong()
You can get the size of an InputStream using getBytes(inputStream) from Utils.java; check the following link:
Get Bytes from Inputstream
The function below should work with any InputStream. As other answers have hinted, you can't reliably find the length of an InputStream without reading through it, but unlike other answers, you should not attempt to hold the entire stream in memory by reading into a ByteArrayOutputStream, nor is there any reason to. Instead of reading the stream, you should ideally rely on other API for stream sizes, for example getting the size of a file using the File API.
public static int length(InputStream inputStream, int chunkSize) throws IOException {
    byte[] buffer = new byte[chunkSize];
    int chunkBytesRead = 0;
    int length = 0;
    while ((chunkBytesRead = inputStream.read(buffer)) != -1) {
        length += chunkBytesRead;
    }
    return length;
}
Choose a reasonable value for chunkSize appropriate to the kind of InputStream. E.g. reading from disk it would not be efficient to have too small a value for chunkSize.
When explicitly dealing with a ByteArrayInputStream, then contrary to some of the comments on this page, you can use the available() method to get the size. You just have to do it before you start reading from it.
From the JavaDocs:
Returns the number of remaining bytes that can be read (or skipped
over) from this input stream. The value returned is count - pos, which
is the number of bytes remaining to be read from the input buffer.
https://docs.oracle.com/javase/7/docs/api/java/io/ByteArrayInputStream.html#available()
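A quick sketch demonstrating this behavior:

```java
import java.io.ByteArrayInputStream;

class AvailableDemo {
    public static void main(String[] args) {
        ByteArrayInputStream in = new ByteArrayInputStream(new byte[42]);
        // For ByteArrayInputStream, available() is exact: it returns count - pos.
        System.out.println(in.available()); // prints 42
        in.read();
        System.out.println(in.available()); // prints 41
    }
}
```

This exactness is specific to ByteArrayInputStream; for general InputStreams, available() remains only an estimate, as the warning above says.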
If you need to stream the data to another object that doesn't allow you to directly determine the size (e.g. javax.imageio.ImageIO), then you can wrap your InputStream within a CountingInputStream (Apache Commons IO), and then read the size:
CountingInputStream countingInputStream = new CountingInputStream(inputStream);
// ... process the whole stream ...
int size = countingInputStream.getCount();
If you know that your InputStream is a FileInputStream or a ByteArrayInputStream, you can use a little reflection to get at the stream size without reading the entire contents. Here's an example method:
static long getInputLength(InputStream inputStream) {
    try {
        if (inputStream instanceof FilterInputStream) {
            FilterInputStream filtered = (FilterInputStream) inputStream;
            Field field = FilterInputStream.class.getDeclaredField("in");
            field.setAccessible(true);
            InputStream internal = (InputStream) field.get(filtered);
            return getInputLength(internal);
        } else if (inputStream instanceof ByteArrayInputStream) {
            ByteArrayInputStream wrapper = (ByteArrayInputStream) inputStream;
            Field field = ByteArrayInputStream.class.getDeclaredField("buf");
            field.setAccessible(true);
            byte[] buffer = (byte[]) field.get(wrapper);
            return buffer.length;
        } else if (inputStream instanceof FileInputStream) {
            FileInputStream fileStream = (FileInputStream) inputStream;
            return fileStream.getChannel().size();
        }
    } catch (NoSuchFieldException | IllegalAccessException | IOException exception) {
        // Ignore all errors and just return -1.
    }
    return -1;
}
This could be extended to support additional input streams, I am sure.
Add to your pom.xml:
<dependency>
    <groupId>commons-io</groupId>
    <artifactId>commons-io</artifactId>
    <version>2.5</version>
</dependency>
Use it to get the content length (InputStream file):
IOUtils.toByteArray(file).length
Use this method; you just have to pass it the InputStream:
public String readIt(InputStream is) throws IOException {
    if (is != null) {
        BufferedReader reader = new BufferedReader(new InputStreamReader(is, "utf-8"), 8);
        StringBuilder sb = new StringBuilder();
        String line;
        while ((line = reader.readLine()) != null) {
            sb.append(line).append("\n");
        }
        is.close();
        return sb.toString();
    }
    return "error: ";
}
InputStream connInputStream = null;
try {
    connInputStream = connection.getInputStream();
} catch (IOException e) {
    e.printStackTrace();
}
int size = connInputStream.available();
int available ()
Returns an estimate of the number of bytes that can be read (or skipped over) from this input stream without blocking by the next invocation of a method for this input stream. The next invocation might be the same thread or another thread. A single read or skip of this many bytes will not block, but may read or skip fewer bytes.
InputStream - Android SDK | Android Developers
