Issue in reading data from Socket - java

I am facing a problem when reading data from a socket. If there are null bytes in the socket stream, the DataInputStream does not read the full data, and so at the receiving end there is an exception when parsing the data.
What is the right way to read data from a socket so that no data is lost?
Thanks in advance.

You should post the code you are using to read from the socket, but to me the most likely cause is that the reading code incorrectly interprets a 0 byte as the end of the stream, similar to this code:
InputStream is = ...;
int val;
while (0 != (val = is.read())) { // wrong: 0 is a perfectly valid byte value
    // do something
}
But the end-of-stream indicator is actually -1:
InputStream is = ...;
int val;
while (-1 != (val = is.read())) {
    // do something
}
EDIT: in response to your comment on using isavailable(): I assume you mean available(), since there is no isAvailable() method on InputStream. If you are using available() to detect the end of the stream, that is also wrong. That method only tells you how many bytes can be read without blocking (i.e. how many are currently in the buffer), not how many bytes are left in the stream.
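For reference, a minimal sketch of reading everything the peer sends until end of stream, buffering into a ByteArrayOutputStream (the class and variable names here are illustrative, not taken from the question's code):
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.Socket;

public class ReadAll {
    // Reads until the peer closes the connection; a 0 byte is treated as data, not as end of stream.
    static byte[] readFully(Socket socket) throws IOException {
        InputStream in = socket.getInputStream();
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        int n;
        while ((n = in.read(buf)) != -1) { // -1 is the only reliable end-of-stream signal
            out.write(buf, 0, n);
        }
        return out.toByteArray();
    }
}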

I personally prefer ObjectInputStream over DataInputStream since it can handle all types, including strings, arrays, and even objects.
Yes, you can read an entire object with a single receive.readObject() call, but don't forget to type-cast the returned object.
A plain read() might seem easier since you read the whole thing in one line, but it is not reliable. Read the data field by field instead, like this:
receive.readBoolean()
receive.readInt()
receive.readChar()
etc..
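A minimal sketch of the ObjectInputStream approach over a socket; the MyPayload class and the method below are illustrative assumptions, not code from the question:
import java.io.ObjectInputStream;
import java.io.Serializable;
import java.net.Socket;

// Hypothetical payload type; anything Serializable can be sent the same way.
class MyPayload implements Serializable {
    String text;
    int id;
}

class ObjectReceiveExample {
    // Assumes the sender wrapped its side of the socket in an ObjectOutputStream
    // and called writeObject(...) with a MyPayload instance.
    static MyPayload receiveOne(Socket socket) throws Exception {
        ObjectInputStream receive = new ObjectInputStream(socket.getInputStream());
        // readObject() returns Object, so cast it to the expected type.
        return (MyPayload) receive.readObject();
    }
}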

String finalString = "";
int finalSize = remainingData; // sizeOfDataN
int reclen = 0;
int count_for_breaking_loop_for_reading_data = 0;
boolean for_parsing_data = true;
boolean allDataReceived = false;
while (!allDataReceived) {
    // available() only reports what is currently buffered, so this may allocate 0 bytes
    ByteBuffer databuff = ByteBuffer.allocate(dis.available());
    dis.read(databuff.array());
    String receivedStringN = trimNull(databuff.array());
    finalString = finalString + receivedStringN;
    System.out.println("final string length " + finalString.length());
    if (finalString.length() == finalSize) {
        allDataReceived = true;
    }
    count_for_breaking_loop_for_reading_data++;
    if (count_for_breaking_loop_for_reading_data > 1500) {
        for_parsing_data = false;
        break;
    }
}

Related

Java - Socket read from datainputstream and not get stuck in it

I've been trying to communicate from another language to Java, but when I try to read data from the DataInputStream in a while loop...
static String getData(DataInputStream stream){
int charbyte;
StringBuilder strbuilder = new StringBuilder();
try {
while ((charbyte = stream.read()) != -1){
strbuilder.append(Character.toChars(charbyte));
}
stream.close();
return new String(strbuilder);
} catch (Exception e) {
return null;
}
}
The problem is stream.read() is not returning -1 because it just keeps waiting for new data to be sent. How can I just get the data that was just sent?
The method never returns because the while loop never ends, and this is caused by the connection or the DataInputStream remaining open.
To send a variable number of bytes over a network connection where the reader reads a stream of characters, you have three options:
Send the number of bytes to follow, as an int in network order, followed by as many bytes.
If the bytes are printable characters, send a null byte to indicate the end.
Close the stream after sending the bytes.
For #1, change the loop to
try {
int count = stream.readInt();
for( int i = 0; i < count; ++i ){
strbuilder.append(Character.toChars(stream.read()));
}
return strbuilder.toString();
}
For #2, use
try {
while ((charbyte = stream.read()) != 0){
strbuilder.append(Character.toChars(charbyte));
}
return strbuilder.toString();
}
The code you have now is for #3.
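For completeness, a minimal sketch of the sending side for option #1, assuming a DataOutputStream wrapped around the socket's output stream (the class and names are illustrative; DataOutputStream.writeInt() already writes in network byte order):
import java.io.DataOutputStream;
import java.io.IOException;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

class LengthPrefixedSender {
    // Writes the byte count first, then the bytes themselves,
    // matching the readInt()/read() loop on the receiving side.
    static void send(Socket socket, String message) throws IOException {
        DataOutputStream out = new DataOutputStream(socket.getOutputStream());
        byte[] payload = message.getBytes(StandardCharsets.US_ASCII);
        out.writeInt(payload.length);
        out.write(payload);
        out.flush();
    }
}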

Java: Reading file in two parts - partly as String and partly as byte[]

I have a file which is split into two parts by "\n\n": the first part is a reasonably short String and the second is a byte array, which can be quite long.
I am trying to read the file as follows:
byte[] result;
try (final FileInputStream fis = new FileInputStream(file)) {
final InputStreamReader isr = new InputStreamReader(fis);
final BufferedReader reader = new BufferedReader(isr);
String line;
// reading until \n\n
while (!(line = reader.readLine()).trim().isEmpty()){
// processing the line
}
// copying the rest of the byte array
result = IOUtils.toByteArray(reader);
reader.close();
}
Even though the resulting array is the size it should be, its contents are broken. If I try to use toByteArray directly on fis or isr, the contents of result are empty.
How can I read the rest of the file correctly and efficiently?
Thanks!
The reason your contents are broken is that the IOUtils.toByteArray(...) overload you used treats your data as text in the default character encoding, i.e. the 8-bit binary values have already been converted into text characters using whatever logic your default encoding prescribes. This usually leads to many of the binary values getting corrupted.
Depending on how exactly the charset is implemented, there is a slight chance that this might work:
result = IOUtils.toByteArray(reader, "ISO-8859-1");
ISO-8859-1 uses only a single byte per character. Not all character values are defined, but many implementations will pass them through anyway. Maybe you'll get lucky with it.
But a much cleaner solution would be to read the String at the beginning as binary data first and then convert it to text via new String(bytes), rather than reading the binary data at the end as a String and then converting it back.
This might mean, though, that you need to implement your own version of a BufferedReader for performance reasons.
You can find the source code of the standard BufferedReader via the obvious Google search, which will (for example) lead you here:
http://www.docjar.com/html/api/java/io/BufferedReader.java.html
It's a bit long, but conceptually not too difficult to understand, so hopefully it will be useful as a reference.
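A minimal sketch of that idea, reading the header bytes yourself and only then handing the rest of the stream to IOUtils (the single-byte read() is just for clarity; a real version would buffer, and the names here are illustrative):
import java.io.ByteArrayOutputStream;
import java.io.FileInputStream;
import java.io.IOException;
import org.apache.commons.io.IOUtils;

class SplitFileReader {
    static void read(String path) throws IOException {
        try (FileInputStream fis = new FileInputStream(path)) {
            // Collect header bytes until the "\n\n" separator.
            ByteArrayOutputStream header = new ByteArrayOutputStream();
            int prev = -1, cur;
            while ((cur = fis.read()) != -1) {
                if (prev == '\n' && cur == '\n') {
                    break; // hit the "\n\n" separator
                }
                header.write(cur);
                prev = cur;
            }
            byte[] headerBytes = header.toByteArray();
            int headerLen = headerBytes.length;
            if (headerLen > 0 && headerBytes[headerLen - 1] == '\n') {
                headerLen--; // drop the first '\n' of the separator
            }
            String headerText = new String(headerBytes, 0, headerLen); // default charset, as in the question
            byte[] body = IOUtils.toByteArray(fis); // the rest of the file, as untouched bytes
            // use headerText and body as needed
        }
    }
}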
Alternatively, you could read the whole file into a byte array, find the \n\n position and split the array into the line and the bytes:
byte[] a = Files.readAllBytes(Paths.get("file"));
String line = "";
byte[] result = a;
for (int i = 0; i < a.length - 1; i++) {
    if (a[i] == '\n' && a[i + 1] == '\n') {
        line = new String(a, 0, i);
        int len = a.length - i - 2; // skip both '\n' bytes of the separator
        result = new byte[len];
        System.arraycopy(a, i + 2, result, 0, len);
        break;
    }
}
Thanks for all the comments - the final implementation was done in this way:
try (final FileInputStream fis = new FileInputStream(file)) {
ByteBuffer buffer = ByteBuffer.allocate(64);
boolean wasLast = false;
String headerValue = null, headerKey = null;
byte[] result = null;
while (true) {
int read = fis.read();
if (read == -1) {
break; // end of file reached before the "\n\n" separator
}
byte current = (byte) read;
if (current == '\n') {
if (wasLast) {
// this is \n\n
break;
} else {
// just a new line in header
wasLast = true;
headerValue = new String(buffer.array(), 0, buffer.position());
buffer.clear();
}
} else if (current == '\t') {
// headerKey\theaderValue\n
headerKey = new String(buffer.array(), 0, buffer.position());
buffer.clear();
} else {
buffer.put(current);
wasLast = false;
}
}
// reading the rest
result = IOUtils.toByteArray(fis);
}

InputStream.read(byte[], 0, length) stops early?

I have been writing something to read a request stream (containing gzipped data) from an incoming HttpServletRequest ('request' below); however, it appears that the normal InputStream read method doesn't actually read all the content?
My code was:
InputStream requestStream = request.getInputStream();
if ((length = request.getContentLength()) != -1)
{
received = new byte[length];
requestStream.read(received, 0, length);
}
else
{
// create a variable length list of bytes
List<Byte> bytes = new ArrayList<Byte>();
boolean endLoop = false;
while (!endLoop)
{
// try and read the next value from the stream.. if not -1, add it to the list as a byte. if
// it is, we've reached the end.
int currentByte = requestStream.read();
if (currentByte != -1)
bytes.add((byte) currentByte);
else
endLoop = true;
}
// initialize the final byte[] to the right length and add each byte into it in the right order.
received = new byte[bytes.size()];
for (int i = 0; i < bytes.size(); i++)
{
received[i] = bytes.get(i);
}
}
What I found during testing was that sometimes the top part (for when a content length is present) would just stop reading part way through the incoming request stream and leave the remainder of the 'received' byte array blank. If I just make it run the else part of the if statement at all times, it reads fine and all the expected bytes are placed in 'received'.
So, it seems like I can just leave my code alone now with that change, but does anyone have any idea why the normal read(byte[], int, int) method stopped reading? The description says that it may stop if an end of file is present. Could it be that the gzipped data just happened to include bytes matching whatever the signature for that looks like?
You need to add a while loop at the top to get all the bytes. The stream will attempt to read as many bytes as it can, but it is not required to return len bytes at once:
An attempt is made to read as many as len bytes, but a smaller number may be read, possibly zero.
if ((length = request.getContentLength()) != -1)
{
received = new byte[length];
int pos = 0;
do {
int read = requestStream.read(received, pos, length-pos);
// check for end of file or error
if (read == -1) {
break;
} else {
pos += read;
}
} while (pos < length);
}
EDIT: fixed while.
You need to check how much of the buffer was filled. It's only guaranteed to give you at least one byte.
Perhaps what you wanted was DataInputStream.readFully();
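A minimal sketch of that suggestion, under the assumption that the request declares a Content-Length (the class and method names are illustrative); readFully() loops internally until the buffer is full or throws EOFException if the stream ends early:
import java.io.DataInputStream;
import java.io.IOException;
import javax.servlet.http.HttpServletRequest;

class ReadFullyExample {
    static byte[] readBody(HttpServletRequest request) throws IOException {
        int length = request.getContentLength();
        if (length == -1) {
            throw new IOException("no Content-Length; fall back to reading until end of stream");
        }
        byte[] received = new byte[length];
        DataInputStream in = new DataInputStream(request.getInputStream());
        in.readFully(received); // blocks until 'length' bytes have been read
        return received;
    }
}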

Why does my Sax Parser produce no results after using InputStream Read?

I have this piece of code which I'm hoping will be able to tell me how much data I have downloaded (and soon put it in a progress bar), and then parse the results through my Sax Parser. If I comment out basically everything above the //xr.parse(new InputSource(request.getInputStream())); line and swap the xr.parse calls over, it works fine. But at the moment, my Sax parser tells me I have nothing. Is it something to do with the is.read(buffer) section?
Also, just as a note, request is an HttpURLConnection with various signatures.
/*Input stream to read from our connection*/
InputStream is = request.getInputStream();
/*we make a 2 Kb buffer to accelerate the download, instead of reading the file a byte at once*/
byte [ ] buffer = new byte [ 2048 ] ;
/*How many bytes do we have already downloaded*/
int totBytes,bytes,sumBytes = 0;
totBytes = request.getContentLength () ;
while ( true ) {
/*How many bytes we got*/
bytes = is.read (buffer);
/*If no more byte, we're done with the download*/
if ( bytes <= 0 ) break;
sumBytes+= bytes;
Log.v("XML", sumBytes + " of " + totBytes + " " + ( ( float ) sumBytes/ ( float ) totBytes ) *100 + "% done" );
}
/* Parse the xml-data from our URL. */
// OLD, and works if comment all the above
//xr.parse(new InputSource(request.getInputStream()));
xr.parse(new InputSource(is));
/* Parsing has finished. */
Can anyone help me at all??
Kind regards,
Andy
'I could only find a way to do that with bytes, unless you know another method?'
But you haven't found a method. You've just written code that doesn't work. And you don't want to save the input to a String either. You want to count the bytes while you're parsing them. Otherwise you're just adding latency, i.e. wasting time and slowing everything down. For an example of how to do it right, see javax.swing.ProgressMonitorInputStream. You don't have to use that, but you certainly do have to use a FilterInputStream of some sort, probably one you write yourself, that is wrapped around the request input stream and passed to the parser.
Your while loop is consuming the input stream and leaving nothing for the parser to read.
For what you're trying to do, you might want to look into implementing a FilterInputStream subclass wrapping the input stream.
You are building an InputSource over an InputStream whose data has already been consumed.
If you want to avoid reading single bytes at a time you could use a BufferedInputStream or something like a BufferedReader.
In any case it's better to obtain the whole content before parsing it, unless you need to parse it incrementally.
If you really want to keep it the way you are doing it, you could create two piped streams:
PipedOutputStream pipeOut = new PipedOutputStream();
PipedInputStream pipeIn = new PipedInputStream();
pipeIn.connect(pipeOut);
pipeOut.write(yourBytes);
xr.parse(pipeIn);
Streams in Java, as their name suggests, don't have a definite size, nor do you know when they will end. So whenever you create an InputStream and read from it, you cannot then pass the same InputStream to another object, because the data has already been consumed by the first reader.
If you want to do both things (downloading and parsing) together, you have to hook into the data received from the HttpURLConnection. You should:
first know the length of the data being downloaded; this can be obtained from the HttpURLConnection header
use a custom InputStream that decorates the original one (this is how streams work in Java, see here), updating the progress bar.
Something like:
class MyInputStream extends FilterInputStream
{
    private final int total;

    MyInputStream(InputStream is, int total)
    {
        super(is); // FilterInputStream delegates every call to the wrapped stream
        this.total = total;
    }

    public int read() throws IOException
    {
        stepProgress(1);
        return super.read();
    }

    public int read(byte[] b) throws IOException
    {
        int l = super.read(b);
        stepProgress(l);
        return l;
    }

    public int read(byte[] b, int off, int len) throws IOException
    {
        int l = super.read(b, off, len);
        stepProgress(l);
        return l;
    }

    private void stepProgress(int bytes)
    {
        // update the progress bar here, e.g. using bytes read so far out of 'total'
    }
}
InputStream mis= new MyInputStream(request.getInputStream(), length);
..
xr.parse(mis);
You can save your data to a file, and then read it back:
InputStream is = request.getInputStream();
if (is != null) {
    File file = new File(path, "someFile.txt");
    FileOutputStream os = new FileOutputStream(file);
    byte[] buffer = new byte[2048];
    int bufferLength;
    while ((bufferLength = is.read(buffer)) > 0) {
        os.write(buffer, 0, bufferLength);
    }
    os.flush();
    os.close();

    XmlPullParserFactory factory = XmlPullParserFactory.newInstance();
    factory.setNamespaceAware(true);
    XmlPullParser xpp = factory.newPullParser();
    FileInputStream fis = new FileInputStream(file);
    xpp.setInput(new InputStreamReader(fis));
}

Efficient way of handling file pointers in Java? (Using BufferedReader with file pointer)

I have a log file which gets updated every second. I need to read the log file periodically, and once I do a read, I need to store the file pointer position at the end of the last line I read and in the next periodic read I should start from that point.
Currently, I am using a RandomAccessFile in Java, using the getFilePointer() method to get the offset value and the seek() method to go back to that offset.
However, I have read in most articles, and even in the Java doc recommendations, that a BufferedReader should be used for efficient reading of a file. How can I achieve this (getting the file pointer and moving to the last line) using a BufferedReader, or is there any other efficient way to achieve this task?
A couple of ways that should work:
open the file using a FileInputStream, skip() the relevant number of bytes, then wrap the BufferedReader around the stream (via an InputStreamReader);
open the file (with either FileInputStream or RandomAccessFile), call getChannel() on the stream/RandomAccessFile to get an underlying FileChannel, call position() on the channel, then call Channels.newInputStream() to get an input stream from the channel, which you can pass to InputStreamReader -> BufferedReader.
I haven't honestly profiled these to see which is better performance-wise, but you should see which works better in your situation.
The problem with RandomAccessFile is essentially that its readLine() method is very inefficient. If it's convenient for you to read from the RAF and do your own buffering to split the lines, then there's nothing wrong with RAF per se; it's just that its readLine() is poorly implemented.
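A minimal sketch of the first approach (FileInputStream plus skip(), then a BufferedReader); the offset bookkeeping below assumes a single-byte charset and plain '\n' line endings, which is exactly the caveat discussed in the next answer, and the names are illustrative:
import java.io.BufferedReader;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

class LogTail {
    // Resumes reading at a previously saved byte offset and returns the new offset.
    static long readFrom(String path, long savedOffset) throws IOException {
        FileInputStream fis = new FileInputStream(path);
        long newOffset = fis.skip(savedOffset); // skip() may skip fewer bytes; check this in real code
        BufferedReader reader = new BufferedReader(
                new InputStreamReader(fis, StandardCharsets.US_ASCII));
        String line;
        while ((line = reader.readLine()) != null) {
            // process the line...
            newOffset += line.length() + 1; // +1 for '\n'; only valid for single-byte charsets
        }
        reader.close();
        return newOffset; // save this for the next periodic read
    }
}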
Neil Coffey's solution is good if you are reading fixed-length files. However, for files of variable length (data keeps coming in) there are some problems with using a BufferedReader directly on a FileInputStream or FileChannel input stream via an InputStreamReader. For example, consider these cases:
1)
You want to read data from some offset up to the current file length. So you use a BufferedReader on the FileInputStream/FileChannel (via an InputStreamReader) and use its readLine method. But while you are busy reading the data, some data gets added, which causes the BufferedReader's readLine to read more data than you expected (past the previous file length).
2)
You finish the readLine calls, but when you query the current file length/channel position some data has just been added, which causes the current file length/channel position to increase even though you have already read less data than that.
In both of the above cases it is difficult to know how much data you have actually read (you cannot just use the length of the data returned by readLine, because it skips some characters like the carriage return).
So it is better to read the data as buffered bytes and wrap a BufferedReader around them. I wrote some methods like this:
/** Read data from offset to length bytes in RandomAccessFile using BufferedReader
 * @param offset
 * @param length
 * @param accessFile
 * @throws IOException
 */
public static void readBufferedLines(long offset, long length, RandomAccessFile accessFile) throws IOException{
if(accessFile == null) return;
int bufferSize = BYTE_BUFFER_SIZE;// constant say 4096
if(offset < length && offset >= 0){
int index = 1;
long curPosition = offset;
/*
* iterate (length-from)/BYTE_BUFFER_SIZE times to read into buffer no matter where new line occurs
*/
while((curPosition + (index * BYTE_BUFFER_SIZE)) < length){
accessFile.seek(offset); // seek to last parsed data rather than last data read in to buffer
byte[] buf = new byte[bufferSize];
int read = accessFile.read(buf, 0, bufferSize);
index++;// Increment whether or not read successful
if(read > 0){
int lastnewLine = getLastLine(read,buf);
if(lastnewLine <= 0){ // no new line found in the buffer reset buffer size and continue
bufferSize = bufferSize+read;
continue;
}
else{
bufferSize = BYTE_BUFFER_SIZE;
}
readLine(buf, 0, lastnewLine); // read the lines from buffer and parse the line
offset = offset+lastnewLine; // update the last data read
}
}
// Read last chunk. The last chunk size in worst case is the total file when no newline occurs
if(offset < length){
accessFile.seek(offset);
byte[] buf = new byte[(int) (length-offset)];
int read = accessFile.read(buf, 0, buf.length);
if(read > 0){
readLine(buf, 0, read);
offset = offset+read; // update the last data read
}
}
}
}
private static void readLine(byte[] buf, int from , int lastnewLine) throws IOException{
String readLine = "";
BufferedReader reader = new BufferedReader(new InputStreamReader(new ByteArrayInputStream(buf,from,lastnewLine) ));
while( (readLine = reader.readLine()) != null){
//do something with readLine
System.out.println(readLine);
}
reader.close();
}
private static int getLastLine(int read, byte[] buf) {
if(buf == null ) return -1;
if(read > buf.length) read = buf.length;
while( read > 0 && !(buf[read-1] == '\n' || buf[read-1] == '\r')) read--;
return read;
}
public static void main(String[] args) throws IOException {
RandomAccessFile accessFile = new RandomAccessFile("C:/sri/test.log", "r");
readBufferedLines(0, accessFile.length(), accessFile);
accessFile.close();
}
I had a similar problem, and I created this class to take lines from a BufferedReader and count how many bytes have been read so far by using getBytes(). We assume the line separator is a single byte by default, and we re-instantiate the BufferedReader so that seek() works.
public class FileCounterIterator {
public Long position() {
return _position;
}
public Long fileSize() {
return _fileSize;
}
public FileCounterIterator newlineLength(Long newNewlineLength) {
this._newlineLength = newNewlineLength;
return this;
}
private Long _fileSize = 0L;
private Long _position = 0L;
private Long _newlineLength = 1L;
private RandomAccessFile fp;
private BufferedReader itr;
public FileCounterIterator(String filename) throws IOException {
fp = new RandomAccessFile(filename, "r");
_fileSize = fp.length();
this.seek(0L);
}
public FileCounterIterator seek(Long newPosition) throws IOException {
this.fp.seek(newPosition);
this._position = newPosition;
itr = new BufferedReader(new InputStreamReader(new FileInputStream(fp.getFD())));
return this;
}
public Boolean hasNext() throws IOException {
return this._position < this._fileSize;
}
public String readLine() throws IOException {
String nextLine = itr.readLine();
this._position += nextLine.getBytes().length + _newlineLength;
return nextLine;
}
}
