If I have a file that contains 4000 bytes, can I have 4 threads read from the file at the same time, with each thread accessing a different section of the file?
For example, thread 1 reads bytes 0-999, thread 2 reads bytes 1000-1999, and so on.
Please give an example in Java.
The file is very, very small and will load very quickly. What I would do is create a thread-safe data class that loads the data. Each processing thread can then request an ID from that class and receive a unique one, with a guarantee that no other thread sends the same ID to your remote service.
In this manner, you remove the need to have all the threads accessing the file and trying to figure out who has read and sent which ID.
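Here is a minimal sketch of such a dispenser, assuming the IDs sit one per line in the file; the class name IdDispenser and the file name are made up for illustration:
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical dispenser: loads all IDs once, then hands each caller a unique one.
public class IdDispenser {
    private final List<String> ids;
    private final AtomicInteger next = new AtomicInteger(0);

    public IdDispenser(String fileName) throws IOException {
        // The file is tiny, so loading it eagerly is cheap.
        this.ids = Files.readAllLines(Paths.get(fileName), StandardCharsets.UTF_8);
    }

    // Returns the next unused ID, or null once every ID has been handed out.
    public String nextId() {
        int index = next.getAndIncrement();
        return index < ids.size() ? ids.get(index) : null;
    }
}
Because getAndIncrement is atomic, two threads can never receive the same index, and therefore never the same ID.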
RandomAccessFile or FileChannel will let you access bytes at arbitrary offsets within a file. For waiting until your threads finish, look at CyclicBarrier or CountDownLatch.
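As a rough sketch of the positional-read approach, assuming a 4000-byte file split into four 1000-byte slices (the file name data.bin is made up):
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;
import java.util.concurrent.CountDownLatch;

public class SlicedRead {
    public static void main(String[] args) throws IOException, InterruptedException {
        final int slices = 4, sliceSize = 1000;
        final byte[][] result = new byte[slices][];
        final CountDownLatch done = new CountDownLatch(slices);
        // FileChannel.read(ByteBuffer, position) does not move the channel's own
        // position, so several threads can safely share one channel.
        try (FileChannel channel = FileChannel.open(Paths.get("data.bin"), StandardOpenOption.READ)) {
            for (int i = 0; i < slices; i++) {
                final int slice = i;
                new Thread(() -> {
                    try {
                        ByteBuffer buf = ByteBuffer.allocate(sliceSize);
                        int read = channel.read(buf, (long) slice * sliceSize); // positional read
                        result[slice] = new byte[Math.max(read, 0)];
                        buf.flip();
                        buf.get(result[slice]);
                    } catch (IOException e) {
                        e.printStackTrace();
                    } finally {
                        done.countDown(); // signal that this slice is finished
                    }
                }).start();
            }
            done.await(); // block until all four slices have been read
        }
    }
}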
Given this comment by the question's author:
I want to run a batch file which contains thousands of unique IDs. Each unique ID will be sent as a request to the remote system, so I want to send the requests in parallel using threads to speed up the process. But if I use multiple threads, all the threads read the complete data and duplicate requests are sent. I want to avoid these duplicate requests.
I would suggest that you load the file into memory as some kind of data structure, perhaps an array of IDs, and have the threads consume IDs from that array. Be sure to access the array in a synchronized manner.
If the file is larger than you'd like to load into memory, or the file is constantly being appended to, then create a single producer thread that watches and reads from the file and inserts IDs into a queue-type structure, as in the sketch below.
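A minimal sketch of the producer/queue variant, assuming one ID per line and that empty lines do not occur in the file (the file name, worker count, queue capacity and the sending code are made up):
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class IdQueueDemo {
    private static final String POISON = ""; // empty string marks "no more IDs"

    public static void main(String[] args) throws IOException, InterruptedException {
        BlockingQueue<String> queue = new LinkedBlockingQueue<>(1000);
        int workers = 4;

        // Consumers: every take() returns a different ID, so no duplicates are sent.
        for (int i = 0; i < workers; i++) {
            new Thread(() -> {
                try {
                    String id;
                    while (!(id = queue.take()).equals(POISON)) {
                        System.out.println("sending request for id " + id);
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }).start();
        }

        // Single producer: only this thread ever touches the file.
        try (BufferedReader reader = new BufferedReader(new FileReader("ids.txt"))) {
            String line;
            while ((line = reader.readLine()) != null) {
                queue.put(line);
            }
        }
        for (int i = 0; i < workers; i++) {
            queue.put(POISON); // one poison pill per worker shuts them all down
        }
    }
}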
Sorry, here is the working code. I've now tested it myself :-)
package readfilemultithreading;
import java.io.BufferedInputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
public class MultiThreadFileReader
{
public MultiThreadFileReader(File fileToRead, int numThreads, int numBytesForEachThread)
{
this.file = fileToRead;
this.numThreads = numThreads;
this.bytesForEachThread = numBytesForEachThread;
this.bytes = new byte[(int) file.length()];
}
private File file;
private int numThreads;
private byte[] bytes;
int bytesForEachThread;
public byte[] getResult()
{
return bytes;
}
public void startReading()
{
List<ReaderThread> readers = new ArrayList<ReaderThread>();
for (int i = 0; i < numThreads; i ++) {
ReaderThread rt = new ReaderThread(i * bytesForEachThread, bytesForEachThread, file);
readers.add(rt);
rt.start();
}
// Each Thread is Reading....
int resultIndex = 0;
for (int i = 0; i < numThreads; i++) {
ReaderThread thread = readers.get(i);
while (!thread.done) {
try {
Thread.sleep(1);
} catch (Exception e) {
}
}
for (int b = 0; b < thread.len; b++, resultIndex++)
{
bytes[resultIndex] = thread.rb[b];
}
}
}
private class ReaderThread extends Thread
{
public ReaderThread(int off, int len, File f)
{
this.off = off;
this.len = len;
this.f = f;
}
public int off, len;
private File f;
public byte[] rb;
public volatile boolean done = false; // volatile so the polling main thread sees the update
@Override
public void run()
{
done = false;
rb = readPiece();
done = true;
}
private byte[] readPiece()
{
try {
BufferedInputStream reader = new BufferedInputStream(new FileInputStream(f));
if (off + len > f.length()) {
len = (int) (f.length() - off);
if (len < 0)
{
len = 0;
}
System.out.println("Correct Length to: " + len);
}
if (len == 0)
{
System.out.println("No bytes to read");
return new byte[0];
}
byte[] b = new byte[len];
System.out.println("Length: " + len);
setName("Thread for " + len + " bytes");
reader.skip(off);
for (int i = off, index = 0; i < len + off; i++, index++)
{
b[index] = (byte) reader.read();
}
reader.close();
return b;
} catch (IOException e) {
e.printStackTrace();
}
return null;
}
}
}
Here is the usage code:
package readfilemultithreading;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
public class Main
{
public static void main(String[] args)
{
new Main().start(args);
}
public void start(String[] args)
{
try {
MultiThreadFileReader reader = new MultiThreadFileReader(new File("C:\\Users\\Martijn\\Documents\\Test.txt"), 4, 2500);
reader.startReading();
byte[] result = reader.getResult();
FileOutputStream stream = new FileOutputStream(new File("C:\\Users\\Martijn\\Documents\\Test_cop.txt"));
for (byte b : result) {
System.out.println(b);
stream.write((int) b);
}
stream.close();
} catch (IOException ex) {
System.err.println("Reading failed");
}
}
}
Can I now get my +1 back? ;-)
You must somehow synchronize read access to the file. I suggest using an ExecutorService:
Your main thread reads the IDs from the file and passes them to the executor service one at a time. The executor will run N threads to process N IDs concurrently.
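A minimal sketch of that pattern, assuming one ID per line; the file name, pool size and the sendRequest method are made up for illustration:
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class IdBatchSender {
    public static void main(String[] args) throws IOException, InterruptedException {
        ExecutorService executor = Executors.newFixedThreadPool(8);
        // Only the main thread touches the file, so no synchronization on it is needed.
        try (BufferedReader reader = new BufferedReader(new FileReader("ids.txt"))) {
            String id;
            while ((id = reader.readLine()) != null) {
                final String current = id;
                executor.submit(() -> sendRequest(current)); // each ID is submitted exactly once
            }
        }
        executor.shutdown();
        executor.awaitTermination(1, TimeUnit.HOURS);
    }

    // Placeholder for the actual call to the remote system.
    private static void sendRequest(String id) {
        System.out.println("sending " + id);
    }
}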
What can I use to read log file in real time in Java 8?
I read some blogs and understood that BufferedReader is a good option for reading files. I tried the code below:
BufferedReader reader = new BufferedReader(new InputStreamReader(inputStream));
String line;
while (true) {
    line = reader.readLine(); // blocks until next line available
    // do whatever you want with line
}
However, it keeps printing null regardless of whether the file is updated or not. Any idea what could be going wrong?
Are there any other options?
Details are as below:
I am trying to create a utility in Java 8 or above that reads the log file of an application in real time (as live transactions occur and are printed to the log).
I can access the log file since I am on the same server, so that is not an issue.
Some of the specifics are below:
-> I don't want to poll the log file for changes; I want to keep the bridge open and read the log file in a "while true" loop. Ideally, I want my reader to block when no new lines are being printed.
-> I don't want to store the entire content of the file in memory at all times, as I want the utility to be memory efficient.
-> My code will run as a separate application that reads the log file of another application.
-> The only job of my code is to read the log, match it against a pattern and, if it matches, send a message with the log content.
Kindly let me know if any detail is ambiguous.
Any help is appreciated, thanks.
For this to work, your inputStream must block until new data becomes available, which a standard FileInputStream does not do when it reaches end-of-file.
I suppose you initialize inputStream with just new FileInputStream("my-logfile.log");. This stream will only read up to the current end of the log file and then signal the end-of-file condition to the BufferedReader, which in turn signals end-of-file by returning null from readLine().
Have a look at the utility org.apache.commons.io.input.Tailer. It allows you to write programs like the Unix utility tail -f.
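A minimal sketch of using Tailer, assuming commons-io is on the classpath; the file name, polling delay and match pattern are made up for illustration:
import java.io.File;
import org.apache.commons.io.input.Tailer;
import org.apache.commons.io.input.TailerListenerAdapter;

public class LogWatcher {
    public static void main(String[] args) {
        TailerListenerAdapter listener = new TailerListenerAdapter() {
            @Override
            public void handle(String line) {
                // Called for every new line appended to the file.
                if (line.contains("ERROR")) {
                    System.out.println("matched: " + line);
                }
            }
        };
        // Poll the file every 500 ms, starting from the current end of the file.
        Tailer tailer = Tailer.create(new File("my-logfile.log"), listener, 500, true);
        // Call tailer.stop() to shut the background thread down when you are done.
    }
}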
To make your code work, you would have to use an "infinite" input stream, which can be realized with a RandomAccessFile as in the following example:
package test;
import java.io.BufferedReader;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.io.PrintWriter;
import java.io.RandomAccessFile;
import java.nio.file.Files;
import java.nio.file.StandardOpenOption;
public class TestRead {
public static void main(String[] args) throws IOException, InterruptedException {
File logFile = new File("my-log.log");
// Make sure to start from a defined condition.
logFile.delete();
try (OutputStream out = Files.newOutputStream(logFile.toPath(), StandardOpenOption.CREATE)) {
// Just create an empty file to append later on.
}
Thread analyzer = Thread.currentThread();
// Simulate log file writing.
new Thread() {
@Override
public void run() {
try {
for (int n = 0; n < 16; n++) {
try (OutputStream out = Files.newOutputStream(logFile.toPath(), StandardOpenOption.APPEND)) {
PrintWriter printer = new PrintWriter(out);
String line = "Line " + n;
printer.println(line);
printer.flush();
System.out.println("wrote: " + line);
}
Thread.sleep(1000);
}
} catch (Exception ex) {
ex.printStackTrace();
} finally {
analyzer.interrupt();
}
}
}.start();
// The original code reading the log file.
try (InputStream inputStream = new InfiniteInputStream(logFile);) {
BufferedReader reader = new BufferedReader(new InputStreamReader(inputStream), 8);
String line;
while (true) {
line = reader.readLine();
if (line == null) {
System.out.println("End-of-file.");
break;
}
System.out.println("read: " + line);
}
}
}
public static class InfiniteInputStream extends InputStream {
private final RandomAccessFile _in;
public InfiniteInputStream(File file) throws IOException {
_in = new RandomAccessFile(file, "r");
}
@Override
public int read(byte[] b, int off, int len) throws IOException {
if (b == null) {
throw new NullPointerException();
} else if (off < 0 || len < 0 || len > b.length - off) {
throw new IndexOutOfBoundsException();
} else if (len == 0) {
return 0;
}
int c = read();
if (c == -1) {
return -1;
}
b[off] = (byte)c;
int i = 1;
try {
for (; i < len ; i++) {
c = readDirect();
if (c == -1) {
break;
}
b[off + i] = (byte)c;
}
} catch (IOException ee) {
}
return i;
}
@Override
public int read() throws IOException {
int result;
while ((result = readDirect()) < 0) {
// Poll until more data becomes available.
try {
Thread.sleep(500);
} catch (InterruptedException ex) {
return -1;
}
}
return result;
}
private int readDirect() throws IOException {
return _in.read();
}
}
}
I am trying to read a file in chunks and pass each chunk to a thread that counts how many times each byte occurs in that chunk. The trouble is that when I pass the whole file to a single thread I get the correct result, but passing it to multiple threads produces very strange results. Here's my code:
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.util.HashSet;
import java.util.Scanner;
import java.util.Set;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
public class Main{
public static void main(String[] args) throws InterruptedException, ExecutionException, IOException
{
// get number of threads to be run
Scanner in = new Scanner(System.in);
int numberOfThreads = in.nextInt();
// read file
File file = new File("testfile.txt");
long fileSize = file.length();
long chunkSize = fileSize / numberOfThreads;
FileInputStream input = new FileInputStream(file);
byte[] buffer = new byte[(int)chunkSize];
ExecutorService pool = Executors.newFixedThreadPool(numberOfThreads);
Set<Future<int[]>> set = new HashSet<Future<int[]>>();
while(input.available() > 0)
{
if(input.available() < chunkSize)
{
chunkSize = input.available();
}
input.read(buffer, 0, (int) chunkSize);
Callable<int[]> callable = new FrequenciesCounter(buffer);
Future<int[]> future = pool.submit(callable);
set.add(future);
}
// let's assume we will use extended ASCII characters only
int alphabet = 256;
// hold how many times each character is contained in the input file
int[] frequencies = new int[alphabet];
// sum the frequencies from each thread
for(Future<int[]> future: set)
{
for(int i = 0; i < alphabet; i++)
{
frequencies[i] += future.get()[i];
}
}
input.close();
for(int i = 0; i< frequencies.length; i++)
{
if(frequencies[i] > 0) System.out.println((char)i + " " + frequencies[i]);
}
}
}
// helper class for multithreaded frequency counting
class FrequenciesCounter implements Callable<int[]>
{
private int[] frequencies = new int[256];
private byte[] input;
public FrequenciesCounter(byte[] buffer)
{
input = buffer;
}
public int[] call()
{
for(int i = 0; i < input.length; i++)
{
frequencies[(int)input[i]]++;
}
return frequencies;
}
}
My testfile.txt is aaaaaaaaaaaaaabbbbcccccc.
With 1 thread the output is:
a 14
b 4
c 6
With 2 threads the output is:
a 4
b 8
c 12
With 3 threads the output is:
b 6
c 18
And other strange results that I cannot figure out. Could anybody help?
Every thread is using the same buffer, and one thread will be overwriting the buffer as another thread is trying to process it.
You need to make sure every thread has its own buffer that nobody else can modify.
Create a byte[] array for every thread:
public static void main(String[] args) throws InterruptedException, ExecutionException, IOException {
// get number of threads to be run
Scanner in = new Scanner(System.in);
int numberOfThreads = in.nextInt();
// read file
File file = new File("testfile.txt");
long fileSize = file.length();
long chunkSize = fileSize / numberOfThreads;
FileInputStream input = new FileInputStream(file);
ExecutorService pool = Executors.newFixedThreadPool(numberOfThreads);
Set<Future<int[]>> set = new HashSet<Future<int[]>>();
while (input.available() > 0) {
if (input.available() < chunkSize) {
chunkSize = input.available();
}
// create a separate buffer for every chunk so the threads never share one,
// sized after the adjustment so the last chunk carries no zero padding
byte[] buffer = new byte[(int) chunkSize];
input.read(buffer, 0, (int) chunkSize);
Callable<int[]> callable = new FrequenciesCounter(buffer);
Future<int[]> future = pool.submit(callable);
set.add(future);
}
// let's assume we will use extended ASCII characters only
int alphabet = 256;
// hold how many times each character is contained in the input file
int[] frequencies = new int[alphabet];
// sum the frequencies from each thread
for (Future<int[]> future : set) {
for (int i = 0; i < alphabet; i++) {
frequencies[i] += future.get()[i];
}
}
input.close();
for (int i = 0; i < frequencies.length; i++) {
if (frequencies[i] > 0)
System.out.println((char) i + " " + frequencies[i]);
}
}
I have a writer program that writes a huge serialized Java object (on the scale of 1 GB) into a binary file on the local disk at a specific speed. The writer program (implemented in C) is actually a network receiver that receives the bytes of the serialized object from a remote server. The implementation of the writer is fixed.
Now I want to implement a Java reader program that reads the file and deserializes it into a Java object. Since the file can be very large, it is beneficial to reduce the latency of deserializing the object. In particular, I want the Java reader to start reading/deserializing the object as soon as the first byte has been written to the file, so that the reader can begin deserializing even before the entire serialized object has been written. The reader knows the size of the file ahead of time (before the first byte is written).
I think what I need is something like a blocking file InputStream that blocks when it reaches end-of-file but has not yet read the expected number of bytes (the eventual size of the file). That way, whenever new bytes are written to the file, the reader's InputStream can keep reading the new content. However, FileInputStream in Java does not support this.
I probably also need a file listener that monitors the changes made to the file to achieve this.
I am wondering if there is any existing solution/library/package that achieves this. The question may be similar to some of the questions about monitoring log files.
The flow of the bytes is like this:
FileInputStream -> SequenceInputStream -> BufferedInputStream -> JavaSerializer
You need two threads: thread 1 downloads from the server and writes to a file, and thread 2 reads the file as it becomes available.
Both threads should share a single RandomAccessFile, so that access to the OS file can be synchronized correctly. You could use a wrapper class like this:
public class ReadWriteFile {
ReadWriteFile(File f, long size) throws IOException {
_raf = new RandomAccessFile(f, "rw");
_size = size;
_writer = new OutputStream() {
@Override
public void write(int b) throws IOException {
write(new byte[] {
(byte)b
});
}
@Override
public void write(byte[] b, int off, int len) throws IOException {
if (len < 0)
throw new IllegalArgumentException();
synchronized (_raf) {
_raf.seek(_nw);
_raf.write(b, off, len);
_nw += len;
_raf.notify();
}
}
};
}
void close() throws IOException {
_raf.close();
}
InputStream reader() {
return new InputStream() {
@Override
public int read() throws IOException {
if (_pos >= _size)
return -1;
byte[] b = new byte[1];
if (read(b, 0, 1) != 1)
throw new IOException();
return b[0] & 255;
}
@Override
public int read(byte[] buff, int off, int len) throws IOException {
synchronized (_raf) {
while (true) {
if (_pos >= _size)
return -1;
if (_pos >= _nw) {
try {
_raf.wait();
continue;
} catch (InterruptedException ex) {
throw new IOException(ex);
}
}
_raf.seek(_pos);
len = (int)Math.min(len, _nw - _pos);
int nr = _raf.read(buff, off, len);
_pos += Math.max(0, nr);
return nr;
}
}
}
private long _pos;
};
}
OutputStream writer() {
return _writer;
}
private final RandomAccessFile _raf;
private final long _size;
private final OutputStream _writer;
private long _nw;
}
The following code shows how to use ReadWriteFile from two threads:
public static void main(String[] args) throws Exception {
File f = new File("test.bin");
final long size = 1024;
final ReadWriteFile rwf = new ReadWriteFile(f, size);
Thread t1 = new Thread("Writer") {
public void run() {
try {
OutputStream w = new BufferedOutputStream(rwf.writer(), 16);
for (int i = 0; i < size; i++) {
w.write(i);
sleep(1);
}
System.out.println("Write done");
w.close();
} catch (Exception ex) {
ex.printStackTrace();
}
}
};
Thread t2 = new Thread("Reader") {
public void run() {
try {
InputStream r = new BufferedInputStream(rwf.reader(), 13);
for (int i = 0; i < size; i++) {
int b = r.read();
assert (b == (i & 255));
}
int eof = r.read();
assert (eof == -1);
r.close();
System.out.println("Read done");
} catch (IOException ex) {
ex.printStackTrace();
}
}
};
t1.start();
t2.start();
t1.join();
t2.join();
rwf.close();
}
Hi, I have a program that downloads a file from the web and prints out the progress while doing so. The problem is that the line that prints the progress slows the program down a lot. Is there any way to stop this?
import java.io.FileOutputStream;
import java.io.InputStream;
import java.net.URL;
public class download {
public static void main(String[] args) {
try{
URL u = new URL("http://upload.wikimedia.org/wikipedia/commons/1/16/Appearance_of_sky_for_weather_forecast,_Dhaka,_Bangladesh.JPG");
FileOutputStream fos = new FileOutputStream("C://Users/xxx/Desktop/test.jpg");
InputStream is = u.openStream();
long size = u.openConnection().getContentLengthLong();
int data;
long done = 0;
while((data = is.read())!=-1){
double progress = (double) (done)/(double)(size)*100;
System.out.println(progress); // if we don't do this then it completes fast
fos.write(data);
done++;
}
fos.close();
}catch(Exception e){
e.printStackTrace();
}
}
}
First of all, every I/O operation has a high cost, and here you're printing a message for every single byte read (see InputStream#read).
If you want or need to print the progress, do it once per bunch of KBs read, say every 4 KB. You can do this by using a byte[] buffer to read and write the data between the streams.
BufferedInputStream input = null;
BufferedOutputStream output = null;
final int DEFAULT_BUFFER_SIZE = 4 * 1024;
try {
input = new BufferedInputStream(is, DEFAULT_BUFFER_SIZE);
output = new BufferedOutputStream(fos, DEFAULT_BUFFER_SIZE);
byte[] buffer = new byte[DEFAULT_BUFFER_SIZE];
int length;
while ((length = input.read(buffer)) > 0) {
output.write(buffer, 0, length);
done += length;
double progress = (double) (done) / (double) (size) * 100;
System.out.println(progress);
}
} catch (IOException e) {
//log your exceptions...
} finally {
closeResource(output);
closeResource(input);
}
And have this closeResource method:
public void closeResource(Closeable resource) {
if (resource != null) {
try {
resource.close();
} catch (IOException e) {
logger.error("Error while closing the resource.", e);
}
}
}
Try printing only on every xth pass through the loop:
if(done % 10 == 0) System.out.println(progress);
You can print the line only if (done % 100 == 0), say.
Also, you can use a buffered way of reading; that would speed the program up.
Suggestion: don't print the progress on every iteration of the loop. Use a counter, decide on a reasonable frequency (a number to take the counter modulo by), and print the progress at that frequency, as in the sketch below.
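For instance, a fragment along these lines, reusing the question's is, fos and size variables and printing only when the integer percentage actually changes (the lastPercent and percent names are made up):
long done = 0;
int lastPercent = -1;
int data;
while ((data = is.read()) != -1) {
    fos.write(data);
    done++;
    int percent = (int) (done * 100 / size);
    if (percent != lastPercent) { // prints at most 101 times in total
        System.out.println(percent + "%");
        lastPercent = percent;
    }
}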
After browsing some other threads regarding my problem, I think I've understood that I need to redesign my application. But just for clarification: I have a single TCP/IP connection between a client and a server. On the client side there are a number of threads running concurrently, and randomly one or more of these threads use the TCP/IP connection to communicate with the server. I've found out that, e.g. while a long-running file transfer is active, using the connection with another thread concurrently can lead to errors. Although I've preceded each message with a specific header including the data length, it appears to me that the IP stack sometimes delivers a mix of more than one message to my program, which means that even though one message has not yet been delivered completely, part of another message is delivered to my read method. Is this a correct observation, and does it match the intended TCP/IP behaviour? Thanks in advance - Mario
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
For anybody who's interested: the following is the source code of my test program. You may play with various values for BUFFER_SIZE and the number of THREADS used to bombard the server socket with concurrent TCP/IP sends over the same socket. I've left out some error handling and removed a more sophisticated termination, including the closing of the sockets. Tests with a BUFFER_SIZE greater than 64 KB always lead to errors on my machine.
import java.io.*;
import java.net.*;
import java.nio.ByteBuffer;
public class TCPTest
{
private final static String INPUT_FILE = "c:/temp/tcptest.in";
private final static int BUFFER_SIZE = 64 * 1024 - 8; //65536;
private final static int MESSAGE_SIZE = 512 * 64 * 1024;
private final static int THREADS = 3;
private final static int SIZE_OF_INT = 4;
private final static int LENGTH_SIZE = SIZE_OF_INT;
private final static int ID_SIZE = SIZE_OF_INT;
private final static int HEADER_SIZE = LENGTH_SIZE + ID_SIZE;
private final static String NEW_LINE = System.getProperty("line.separator");
private ServerSocket m_serverSocket = null;
private Socket m_clientSocket = null;
private int m_iThreadCounter;
public static void main(String[] args)
{
new TCPTest();
} // main
public TCPTest()
{
final String id = "ReaderThread[*]";
// start a new thread creating a server socket waiting for connections
new Thread(new Runnable()
{
public void run()
{
try
{
// create server socket and accept client requests
m_serverSocket = new ServerSocket(12345);
m_clientSocket = m_serverSocket.accept();
// client request => prepare and read data
long startTime = System.currentTimeMillis();
byte[] buffer = new byte[BUFFER_SIZE];
ByteBuffer header = ByteBuffer.allocate(HEADER_SIZE);
int iTotalBytesRead = 0;
boolean fTerminate = false;
int iBytesRead;
// get hold of socket's input stream
InputStream clientInputStream = m_clientSocket.getInputStream();
// loop
while (false == fTerminate)
{
// loop to read next header
for (int i = 0; i < HEADER_SIZE; i++)
clientInputStream.read(header.array(), i, 1);
header.rewind();
// get information of interest
int iLength = header.getInt();
int iId = header.getInt();
int iLengthSoFar = 0;
int iBytesLeft = iLength;
int iBytesToRead;
// any length given?
if ((0 < iLength) && (BUFFER_SIZE >= iLength))
{
// that's the case => read complete message
while (iLengthSoFar < iLength)
{
// calculate number of bytes left
iBytesLeft = iLength - iLengthSoFar;
// calculate maximum number of bytes to read
if (iBytesLeft > BUFFER_SIZE)
iBytesToRead = BUFFER_SIZE;
else
iBytesToRead = iBytesLeft;
// read next portion of bytes
if ((iBytesRead = clientInputStream.read(buffer, 0, iBytesToRead)) != -1)
{
// maintain statistics
iTotalBytesRead += iBytesRead;
iLengthSoFar += iBytesRead;
} // if
else
{
// finish => print message
System.out.println("==> "+id+": ERROR length=<-1> received " +
"for id=<"+iId+">");
fTerminate = true;
break;
} // else
} // while
} // if
else
{
System.out.println("==> "+id+": ERROR data length <= 0 for id=<"+iId+">");
dump(header, 0, HEADER_SIZE / SIZE_OF_INT, "Error header");
} // else
} // while
System.out.println("==> "+id+": "+ iTotalBytesRead + " bytes read in "
+ (System.currentTimeMillis() - startTime) + " ms.");
} // try
catch (IOException e)
{
e.printStackTrace();
} // catch
} // run
}).start();
// create the socket writer threads
try
{
// ensure server is brought up and request a connection
Thread.sleep(1000);
System.out.println("==> "+id+": just awoke");
Socket socket = new Socket("localhost", 12345);
OutputStream socketOutputStream = socket.getOutputStream();
System.out.println("==> "+id+": socket obtained");
// create some writer threads
for (int i = 0; i < THREADS; i++)
// create a new socket writer and start the thread
(new SocketWriter(socket,
(i+1),
BUFFER_SIZE,
new String("WriterThread["+(i+1)+"]"),
socketOutputStream)).start();
} // try
catch (Exception e)
{
e.printStackTrace();
} // catch
} // TCPTest
private final static void dump(ByteBuffer bb, int iOffset, int iInts, String header)
{
System.out.println(header);
bb.rewind();
for (int i = 0; i < iInts; i++)
System.out.print(" " + Integer.toHexString(bb.getInt()).toUpperCase());
System.out.print(NEW_LINE);
} // dump
private class SocketWriter extends Thread
{
Socket m_socket;
int m_iId;
int m_iBufferSize;
String m_id;
OutputStream m_os;
protected SocketWriter(Socket socket, int iId, int iBufferSize, String id, OutputStream os)
{
m_socket = socket;
m_iId = iId;
m_iBufferSize = iBufferSize;
m_id = id;
m_os = os;
// increment thread counter
synchronized (m_serverSocket)
{
m_iThreadCounter++;
} // synchronized
} // SocketWriter
public final void run()
{
try
{
long startTime = System.currentTimeMillis();
ByteBuffer buffer = ByteBuffer.allocate(m_iBufferSize + HEADER_SIZE);
int iTotalBytesRead = 0;
int iNextMessageSize = 512 * m_iBufferSize;
int iBytesRead;
// open input stream for file to read and send
FileInputStream fileInputStream = new FileInputStream(INPUT_FILE);
System.out.println("==> "+m_id+": file input stream obtained");
// loop to read complete file
while (-1 != (iBytesRead = fileInputStream.read(buffer.array(), HEADER_SIZE, m_iBufferSize)))
{
// add length and id to buffer and write over TCP
buffer.putInt(0, iBytesRead);
buffer.putInt(LENGTH_SIZE, m_iId);
m_os.write(buffer.array(), 0, HEADER_SIZE + iBytesRead);
// maintain statistics and print message if so desired
iTotalBytesRead += iBytesRead;
if (iNextMessageSize <= iTotalBytesRead)
{
System.out.println("==> "+m_id+": <"+iTotalBytesRead+"> bytes processed");
iNextMessageSize += MESSAGE_SIZE;
} // if
} // while
// close my file input stream
fileInputStream.close();
System.out.println("==> "+m_id+": file input stream closed");
System.out.println("==> "+m_id+": <"+ iTotalBytesRead + "> bytes written in "
+ (System.currentTimeMillis() - startTime) + " ms.");
// decrement thread counter
synchronized (m_serverSocket)
{
m_iThreadCounter--;
// last thread?
if (0 >= m_iThreadCounter)
// that's the case => terminate
System.exit(0);
} // synchronized
} // try
catch (Exception e)
{
e.printStackTrace();
} // catch
} // run
} // SocketWriter
} // TCPTest
Yes. TCP is a byte-oriented stream protocol, which means the application receives an (undelimited) stream of bytes. The concept of a "message" has to be provided by the application itself (or you can use a message-oriented protocol instead).
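A minimal sketch of application-level framing under these assumptions: every message is preceded by a 4-byte length and a 4-byte ID, writers synchronize on the shared output stream so a whole frame is written atomically, and the reader uses readFully so a frame is never split across reads (the class and method names are made up):
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class Framing {
    // Writer side: one synchronized block per message keeps frames from
    // interleaving when several threads share the same socket output stream.
    public static void writeFrame(DataOutputStream out, int id, byte[] payload) throws IOException {
        synchronized (out) {
            out.writeInt(payload.length);
            out.writeInt(id);
            out.write(payload);
            out.flush();
        }
    }

    // Reader side: readFully blocks until the complete frame has arrived,
    // regardless of how TCP split the bytes into segments.
    public static byte[] readFrame(DataInputStream in) throws IOException {
        int length = in.readInt();
        int id = in.readInt(); // tells you which writer the frame came from
        byte[] payload = new byte[length];
        in.readFully(payload);
        return payload;
    }
}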