Java: read serial data for a given time period

I have a function that reads serial data from an embedded device. My program shows a picture and a title, and the device basically acts as a buzzer for a game. Is there a way to check for serial data for, let's say, 5 seconds and, if nothing was received, to continue with the code (go to the next picture and title)? My current function looks like this:
public String getUARTLine() {
    String inputLine = null;
    try {
        BufferedReader input = new BufferedReader(new InputStreamReader(serialPort.getInputStream()));
        inputLine = input.readLine();
        if (inputLine.length() == 0)
            return null;
    } catch (IOException e) {
        //System.out.println("IOException: " + e);
        return null;
    }
    return inputLine;
}

You can start reading data from serialPort and start a timer in another thread. Something like this:
class ReadItWithTimeLimit implements Runnable {
    int miliSeconds;
    BufferedReader reader;

    public ReadItWithTimeLimit(BufferedReader reader, int miliSeconds) {
        this.miliSeconds = miliSeconds;
        this.reader = reader;
    }

    public void run() {
        try {
            Thread.sleep(miliSeconds);
            this.reader.close();   // closing the reader makes the blocked readLine() fail
        } catch (InterruptedException | IOException e) {
            // minimal handling so the snippet compiles; see the note below
        }
    }
}
So you can call it from your code:
// ...
BufferedReader input = new BufferedReader(new InputStreamReader(serialPort.getInputStream()));
new Thread(new ReadItWithTimeLimit(input, 5000)).start();
inputLine = input.readLine();
// ...
This is only a sketch with minimal exception handling, so it still requires some finalization work...

Drop the buffer. Start a read on the input stream yourself and, in a different thread, count 5 seconds. After that, close the stream (which will cause the read call to return -1).

Yes, you can. Use a separate timer thread that triggers the timeout and closes the input stream (which will cause input.readLine() to come back with an IOException). Alternatively you can use java.nio, but I personally prefer the first method.
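A different way to bound the blocking call, not mentioned in the answers above, is to hand readLine() to an ExecutorService and put a timeout on the Future. A minimal sketch, assuming the reader is built from serialPort.getInputStream(); note that on a timeout the worker thread stays blocked in readLine() until data arrives or the stream is closed:

import java.io.*;
import java.util.concurrent.*;

public final class TimedLineReader {

    // Returns the next line, or null if nothing arrives within the timeout.
    public static String readLineOrNull(BufferedReader reader, long timeoutMillis) {
        ExecutorService executor = Executors.newSingleThreadExecutor();
        Callable<String> readOne = reader::readLine;
        Future<String> pending = executor.submit(readOne);
        try {
            return pending.get(timeoutMillis, TimeUnit.MILLISECONDS);
        } catch (TimeoutException e) {
            return null;                    // nothing received in time: the caller moves on
        } catch (InterruptedException | ExecutionException e) {
            return null;
        } finally {
            executor.shutdownNow();         // the worker may still be blocked in readLine()
        }
    }
}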

Related

Java: Write to and read from same process multiple times

I've gone through so many related StackOverflow questions for this that I'm getting lost in them, and I've coded this multiple ways, but none seem to solve this problem in a way that works for me: How can I send output to the same command and process multiple times while at the same time receiving input from this same process?
(See Input various strings to same process in Java for a similar question, but this ended with only a theoretical answer.)
The command (command line, from a C++ executable) loads a large file, and then I want to send input to it very quickly, get back the answer, do other stuff in between, then send different input and get the corresponding answer. Multiply this by thousands or millions of times.
One implementation, with threads:
ProcessBuilder pb = new ProcessBuilder(command.split(" "));
kenLMProcess = pb.start();
KenLMInThread lmInput = new KenLMInThread(kenLMProcess.getInputStream());
KenLMInThread lmError = new KenLMInThread(kenLMProcess.getErrorStream());
KenLMOutThread lmOutput = new KenLMOutThread(kenLMProcess.getOutputStream());
lmOutput.inStr = "Test . \n";
lmInput.start();
lmOutput.start();
lmError.start();
lmOutput.join();
lmInput.join();
lmError.join();
outStr = lmInput.newStr;
But join waits until the thread ends. What if I don't want to wait for it to end? I can't seem to figure out how to use wait() for that purpose. For one I'd prefer to not have to keep opening and closing a new output stream and input stream every time I query the command. But at least that's better than starting a new ProcessBuilder every time.
Here's what run() looks like for KenLMOutThread:
public void run() {
    try {
        pw.write(inStr + "\n");
        pw.write('\n');
    } catch (Exception e) {
        System.out.println("Error while inputting to KenLM.");
        e.printStackTrace();
    } finally {
        pw.flush();
        try {
            pw.flush();
            bw.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
flush() is supposed to let it move on, and the "\n" at the end is supposed to help, but it just hangs unless I use close(). And if I use close(), I can't use the OutputStream anymore; I'm also then unable to make a new OutputStream from the Process.
If it helps, here's a simpler implementation with everything together (taken from How to send EOF to a process in Java?):
Note that close() is used, and using flush() without close() causes the program to hang.
public static String pipe(String str, String command2) throws IOException, InterruptedException {
    Process p2 = Runtime.getRuntime().exec(command2);
    OutputStream out = p2.getOutputStream();
    out.write(str.getBytes());
    out.close();
    p2.waitFor();
    BufferedReader reader =
            new BufferedReader(new InputStreamReader(p2.getInputStream()));
    StringBuilder sb = new StringBuilder();
    String line;
    while ((line = reader.readLine()) != null) {
        sb.append(line + "\n");
    }
    return sb.toString();
}
Other things I've tried:
Using exec(): Process kenLMProcess=Runtime.getRuntime().exec(command);
Putting the command process in its own thread: KenLMProcessThread procThread = new KenLMProcessThread(pb.start());
If the target process is hanging unless you close the output stream, the problem is at that end: it is reading until end of stream before doing anything. Nothing you can do about that at the sending end.
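When the child process does respond line by line (and flushes each response), the usual pattern is to keep one writer and one reader open for the life of the process, flushing after each query instead of closing. A minimal sketch with a hypothetical command; it only works if the child really answers one line per query rather than reading its input to EOF first, which, as the answer above notes, you cannot fix from the Java side:

import java.io.*;
import java.nio.charset.StandardCharsets;

public class InteractivePipe {
    public static void main(String[] args) throws IOException {
        // Hypothetical interactive command; substitute the real invocation here.
        ProcessBuilder pb = new ProcessBuilder("some-command", "--interactive");
        pb.redirectErrorStream(true);
        Process process = pb.start();

        BufferedWriter toChild = new BufferedWriter(
                new OutputStreamWriter(process.getOutputStream(), StandardCharsets.UTF_8));
        BufferedReader fromChild = new BufferedReader(
                new InputStreamReader(process.getInputStream(), StandardCharsets.UTF_8));

        for (String query : new String[] { "Test .", "Another query ." }) {
            toChild.write(query);
            toChild.newLine();
            toChild.flush();                      // flush, but do NOT close: keep the pipe open
            String answer = fromChild.readLine(); // assumes one response line per query
            System.out.println(query + " -> " + answer);
        }

        toChild.close();                          // signal EOF only when completely done
        process.destroy();
    }
}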

Multiple threads reading from same file/odd behavior

I'm writing a program that needs to read lines from a very large file (400K+ lines) and send the data in each line on to a web service. I decided to try threading and am seeing some behavior I did not expect: it appears that my BufferedReader starts reusing lines it has already given me when I call readLine() on it.
My program is made up of two classes: a "Main" class that kicks off the threads, holds a static reference to the BufferedReader, and has a static synchronized readNextLine() method that the threads use to call readLine() on the BufferedReader; and a "Runnable" class that calls readNextLine() and makes a web service call with the data from each readNextLine() call. I made the BufferedReader and readNextLine() static just because that's the only way I could think of for the threads to share the reader, aside from passing an instance of my main class into the threads; I wasn't sure which was better.
After about 5 minutes, I start seeing errors in my web service saying that it's processing a line it's already processed. I'm able to verify lines are indeed being sent multiple times, minutes apart.
Does anyone have any ideas as to why the BufferedReader seems to be giving the threads lines it has already read? I was under the impression that readLine() was sequential, and all I needed to do was make sure the calls to readLine() were synchronized.
I'll show some of the Main class code below. The runnable is essentially a while loop that calls readNextLine() and processes each line until there are no more lines left.
Main class:
//showing reader and thread creation
inputStream = sftp.get(path to file);
reader = new BufferedReader(new InputStreamReader(inputStream));
ExecutorService executor = Executors.newFixedThreadPool(threads);
Collection<Future> futures = new ArrayList<Future>();
for (int i = 0; i < threads; i++) {
    MyRunnable runnable = new MyRunnable(i);
    futures.add(executor.submit(runnable));
}
LOGGER.debug("futures.get()");
for (Future f : futures) {
    f.get(); //use to wait until all threads are done
}

public synchronized static String readNextLine() {
    String results = null;
    try {
        if (reader != null) {
            results = reader.readLine();
        }
    } catch (Exception e) {
        LOGGER.error("Error reading from file");
    }
    return results;
}
I was testing what you said, but I think there is a logic error in your readNextLine() method: how can reader.readLine() be invoked when results is null, given that the if condition requires it not to be null?
Now I have finished my demo, and it seems to work well; the following is the demo, and no re-read lines happened:
static BufferedReader reader;

public static void main(String[] args) throws FileNotFoundException, ExecutionException, InterruptedException {
    reader = new BufferedReader(new FileReader("test.txt"));
    ExecutorService service = Executors.newFixedThreadPool(3);
    List<Future> results = new ArrayList<Future>();
    for (int i = 0; i < 3; i++) {
        results.add(service.submit(new Runnable() {
            @Override
            public void run() {
                try {
                    String line = null;
                    while ((line = readNextLine()) != null) {
                        System.out.println(line);
                    }
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }));
    }
    service.shutdown(); // let the submitted tasks finish, then allow the JVM to exit
}

public synchronized static String readNextLine() throws IOException {
    return reader.readLine();
}

Exec'ing multiple processes from Java: outputs mixed up?

I have a servlet which creates a new Action object inside doGet(), and this object uses exec() to run external processes. There may be several requests to the servlet at the same time, so I might have several Action objects where each one is running an external process at the same time. Occasionally when this happens I find that the output from one process gets mixed up with the output from one of the others.
Each Action creates a unique temporary directory and runs the process with this as its current directory. Separate threads then read the output streams from the process into a string buffer. The code that the Action object executes looks basically like this:
Process proc = null;
ReaderThread stdout = null;
ReaderThread stderr = null;
StringBuffer buff = new StringBuffer();
int exit = -1;
try {
    //
    // Run the process
    //
    proc = Runtime.getRuntime().exec(command, null, directory);
    //
    // Read the output from the process
    //
    stdout = new ReaderThread(proc.getInputStream(), buff);
    stderr = new ReaderThread(proc.getErrorStream(), buff);
    stdout.start();
    stderr.start();
    //
    // Get the exit code
    //
    exit = proc.waitFor();
    //
    // Wait for all the output to be read
    //
    stdout.join();
    stderr.join();
}
catch (InterruptedException e) {
    if (proc != null) {
        proc.destroy();
    }
}
catch (Exception e) {
    buff.append(e.getClass() + ": " + e.getMessage());
    if (proc != null) {
        proc.destroy();
    }
}
So each request uses a separate Action object to run a process, and this has its own StringBuffer "buff" that the output of the process is accumulated into by the two ReaderThreads. But what I find is that, when two requests are running two processes at the same time, the output of one will sometimes end up in the StringBuffer of the thread that is running the other one, and one of the two servlet requests will see output intended for the other one. It basically behaves as if Runtime.exec() provides a single global pipe to which the output streams of all the processes are connected.
The ReaderThread looks like this:
public class ReaderThread extends Thread {

    private BufferedReader reader;
    private StringBuffer buffer;

    public ReaderThread(InputStream stream, StringBuffer buffer) {
        this.reader = new BufferedReader(new InputStreamReader(stream));
        this.buffer = buffer;
    }

    @Override
    public void run() {
        try {
            String line;
            while ((line = reader.readLine()) != null) {
                synchronized (buffer) {
                    buffer.append(line + "\n");
                }
            }
        }
        catch (IOException e) {
            synchronized (buffer) {
                buffer.append(e.getMessage() + "\n");
            }
        }
    }
}
Can anyone suggest what I can do to fix this?
Use a ThreadLocal variable to store the output of each thread.
When and how should I use a ThreadLocal variable?
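If you do want to go that route, a minimal sketch of a per-thread output buffer might look like this (illustrative only; as the explanation below shows, the real culprit in this case was elsewhere):

public class PerThreadOutput {
    private static final ThreadLocal<StringBuilder> OUTPUT =
            ThreadLocal.withInitial(StringBuilder::new);

    public static void append(String line) {
        OUTPUT.get().append(line).append('\n');   // each thread writes to its own builder
    }

    public static String drain() {
        String result = OUTPUT.get().toString();
        OUTPUT.remove();                          // avoid leaking buffers in pooled threads
        return result;
    }
}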
Here's the explanation, partly as a cautionary tale:

- the output was being added to an ArrayList in the XML node that accessed it
- the XML nodes were created by cloning prototypes
- the ArrayList was initialised in the declaration, not in the initialise() method of the class, so every instance ended up referring to the same object.
Duh.
Another two days of my life down the drain!
Happy new year...
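A minimal illustration of that trap, with hypothetical names rather than the actual servlet code: Object.clone() copies field references, so every clone of the prototype keeps pointing at the single ArrayList created in the field initializer:

import java.util.ArrayList;
import java.util.List;

class NodePrototype implements Cloneable {
    // Initialised in the declaration: clone() will NOT create a fresh list.
    private List<String> output = new ArrayList<String>();

    void addOutput(String line) { output.add(line); }
    List<String> getOutput()    { return output; }

    @Override
    public NodePrototype clone() {
        try {
            return (NodePrototype) super.clone();   // shallow copy: shares "output"
        } catch (CloneNotSupportedException e) {
            throw new AssertionError(e);
        }
    }

    public static void main(String[] args) {
        NodePrototype prototype = new NodePrototype();
        NodePrototype a = prototype.clone();
        NodePrototype b = prototype.clone();
        a.addOutput("from request A");
        System.out.println(b.getOutput());          // prints [from request A]: shared state
    }
}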

Java Process.waitFor() and Readline hangs

First, this is my code:
import java.io.*;
import java.util.Date;

import com.banctecmtl.ca.vlp.shared.exceptions.*;

public class PowershellTest implements Runnable {

    public static final String PATH_TO_SCRIPT = "C:\\Scripts\\ScriptTest.ps1";
    public static final String SERVER_IP = "XX.XX.XX.XXX";
    public static final String MACHINE_TO_MOD = "MachineTest";

    /**
     * @param args
     * @throws OperationException
     */
    public static void main(String[] args) throws OperationException {
        new PowershellTest().run();
    }

    public PowershellTest() {}

    @Override
    public synchronized void run() {
        String input = "";
        String error = "";
        boolean isHanging = false;
        try {
            Runtime runtime = Runtime.getRuntime();
            Process proc = runtime.exec("powershell -file " + PATH_TO_SCRIPT + " " + SERVER_IP + " " + MACHINE_TO_MOD);
            proc.getOutputStream().close();
            InputStream inputstream = proc.getInputStream();
            InputStreamReader inputstreamreader = new InputStreamReader(inputstream);
            BufferedReader bufferedreader = new BufferedReader(inputstreamreader);
            proc.waitFor();
            String line;
            while (!isHanging && (line = bufferedreader.readLine()) != null) {
                input += (line + "\n");
                Date date = new Date();
                while (!bufferedreader.ready()) {
                    this.wait(1000);
                    // if it's been more than 1 minute since a line has been read, it's hanging.
                    if (new Date().getTime() - date.getTime() >= 60000) {
                        isHanging = true;
                        break;
                    }
                }
            }
            inputstream.close();
            inputstream = proc.getErrorStream();
            inputstreamreader = new InputStreamReader(inputstream);
            bufferedreader = new BufferedReader(inputstreamreader);
            isHanging = false;
            while (!isHanging && (line = bufferedreader.readLine()) != null) {
                error += (line + "\n");
                Date date = new Date();
                while (!bufferedreader.ready()) {
                    this.wait(1000);
                    // if it's been more than 1 minute since a line has been read, it's hanging.
                    if (new Date().getTime() - date.getTime() >= 60000) {
                        isHanging = true;
                        break;
                    }
                }
            }
            inputstream.close();
            proc.destroy();
        } catch (IOException e) {
            //throw new OperationException("File IO problem.", e);
        } catch (InterruptedException e) {
            //throw new OperationException("Script thread problem.", e);
        }
        System.out.println("Error : " + error + "\nInput : " + input);
    }
}
I'm currently trying to run a PowerShell script that will start/stop a VM (VMware) on a remote server. The script works from the command line, and so does this code. The thing is, I hate having to use a thread (and make it wait for the script to respond, as explained further down) for such a job. I had to do it because both BufferedReader.readLine() and proc.waitFor() hang forever.
The script, when run from cmd, takes a long time to execute: it stalls for 30 seconds to 1 minute between validating authentication with the server and executing the actual script. From what I saw while debugging, readLine() hangs when it starts hitting those delays from the script.
I'm also pretty sure it's not a memory problem since I never had any OOM error in any debugging session.
Now I understand that Process.waitFor() requires me to drain both the error stream and the regular stream for it to work, and that's mainly why I don't use it (I need the output to manage VM-specific errors, certificate issues, etc.).
I would like to know if someone could explain to me why it hangs and whether there is a way to just use a typical readLine() without having it hang so hard. Even when the script should have ended a while ago, it still hangs (I tried running both the Java application and a cmd command using the exact same command line as in the Java application at the same time, left it running for 1 hour, and nothing worked). It is not just stuck in the while loop; the readLine() is where the hanging is.
Also this is a test version, nowhere close to the final code, so please spare me the : this should be a constant, this is useless, etc. I will clean the code later. Also the IP is not XX.XX.XX.XXX in my code, obviously.
Either an explanation or a suggestion on how to fix it would be greatly appreciated.
Oh, by the way, here is the script I currently use:
Add-PSSnapin vmware.vimautomation.core
Connect-VIServer -server $args[0]
Start-VM -VM "MachineTest"
If you need more details I will try to give as much as I can.
Thanks in advance for your help!
EDIT: I also previously tested the code with a less demanding script, whose job was to get the content of a file and print it. Since no waiting was needed to get the information, readLine() worked well. I'm thus fairly certain that the problem lies in the wait time coming from the script execution.
Also, forgive my errors, English is not my main language.
Thanks in advance for your help!
EDIT 2: Since I cannot answer my own question:
Here is my "final" code, after using threads:
import java.io.*;

public class PowershellTest implements Runnable {

    public InputStream is;

    public PowershellTest(InputStream newIs) {
        this.is = newIs;
    }

    @Override
    public synchronized void run() {
        String input = "";
        String error = "";
        try {
            InputStreamReader inputstreamreader = new InputStreamReader(is);
            BufferedReader bufferedreader = new BufferedReader(inputstreamreader);
            String line;
            while ((line = bufferedreader.readLine()) != null) {
                input += (line + "\n");
            }
            is.close();
        } catch (IOException e) {
            //throw new OperationException("File IO problem.", e);
        }
        System.out.println("Error : " + error + "\nInput : " + input);
    }
}
And the main simply creates and starts two threads (PowershellTest instances), one with the error stream and one with the input stream.
I believe I made a dumb error when I first coded the app and fixed it somewhere along the way as I reworked the code over and over. It still takes a good 5-6 minutes to run, which is similar to, if not longer than, my previous code (which is logical, since the error stream and input stream get their information sequentially in my case).
Anyway, thanks for all your answers, and especially to Miserable Variable for his hint on threading.
First, don't call waitFor() until after you've finished reading the streams. I would highly recommend you look at ProcessBuilder instead of simply using Runtime.exec, and split the command up yourself rather than relying on Java to do it for you:
ProcessBuilder pb = new ProcessBuilder("powershell", "-file", PATH_TO_SCRIPT,
SERVER_IP, MACHINE_TO_MOD);
pb.redirectErrorStream(true); // merge stdout and stderr
Process proc = pb.start();
redirectErrorStream merges the error output into the normal output, so you only have to read proc.getInputStream(). You should then be able to just read that stream until EOF, then call proc.waitFor().
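Putting that together, a minimal sketch of the read-then-wait sequence (using the constants from the question, inside a method that declares IOException and InterruptedException):

ProcessBuilder pb = new ProcessBuilder("powershell", "-file", PATH_TO_SCRIPT,
                                       SERVER_IP, MACHINE_TO_MOD);
pb.redirectErrorStream(true);                 // merge stderr into stdout
Process proc = pb.start();
proc.getOutputStream().close();               // the script needs no input

StringBuilder output = new StringBuilder();
BufferedReader reader = new BufferedReader(new InputStreamReader(proc.getInputStream()));
String line;
while ((line = reader.readLine()) != null) {  // blocks per line, but cannot deadlock now
    output.append(line).append('\n');
}
int exitCode = proc.waitFor();                // only after the stream has hit EOF
System.out.println("Exit " + exitCode + ":\n" + output);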
You are currently waiting to complete reading from inputStream before starting to read from errorStream. If the process writes to its stderr before stdout maybe you are getting into a deadlock situation.
Try reading from both streams from concurrently running threads. While you are at it, also remove proc.getOutputStream().close();. It shouldn't affect the behavior, but it is not required either.

Java: Pause thread and get position in file

I'm writing a multithreaded application in Java which I want to be able to pause and resume.
The thread reads a file line by line while finding lines that match a pattern. It has to continue from the place where I paused the thread. To read the file I use a BufferedReader in combination with an InputStreamReader and FileInputStream:
fip = new FileInputStream(new File(*file*));
fileBuffer = new BufferedReader(new InputStreamReader(fip));
I use this FileInputStream because I need the filepointer for the position in the file.
When processing the lines it writes the matching lines to a MySQL database. To use a MySQL-connection between the threads I use a ConnectionPool to make sure just one thread is using one connection.
The problem is that when I pause the threads and resume them, a few matching lines just disappear. I also tried subtracting the buffer size from the offset, but it still has the same problem.
What is a decent way to solve this problem or what am I doing wrong?
Some more details:
The loop
// Regex engine
RunAutomaton ra = new RunAutomaton(this.conf.getAuto(), true);
lw = new LogWriter();

while ((line = fileBuffer.readLine()) != null) {
    if (line.length() > 0) {
        if (ra.run(line)) {
            // Write to LogWriter
            lw.write(line, this.file.getName());
            lw.execute();
        }
    }
}

// Loop when paused.
while (pause) { }
Calculating place in file
// Get the position in the file
public long getFilePosition() throws IOException {
    long position = fip.getChannel().position() - bufferSize + fileBuffer.getNextChar();
    return position;
}
Putting it into the database
// Get the connector
ConnectionPoolManager cpl = ConnectionPoolManager.getManager();
Connector con = null;
while (con == null)
    con = cpl.getConnectionFromPool();

// Insert the query
con.executeUpdate(this.sql.toString());
cpl.returnConnectionToPool(con);
Here's an example of what I believe you're looking for. You didn't show much of your implementation so it's hard to debug what might be causing gaps for you. Note that the position of the FileInputStream is going to be a multiple of 8192 because the BufferedReader is using a buffer of that size. If you want to use multiple threads to read the same file you might find this answer helpful.
public class ReaderThread extends Thread {

    private final FileInputStream fip;
    private final BufferedReader fileBuffer;
    private volatile boolean paused;

    public ReaderThread(File file) throws FileNotFoundException {
        fip = new FileInputStream(file);
        fileBuffer = new BufferedReader(new InputStreamReader(fip));
    }

    public void setPaused(boolean paused) {
        this.paused = paused;
    }

    public long getFilePos() throws IOException {
        return fip.getChannel().position();
    }

    public void run() {
        try {
            String line;
            while ((line = fileBuffer.readLine()) != null) {
                // process your line here
                System.out.println(line);
                while (paused) {
                    sleep(10);
                }
            }
        } catch (IOException e) {
            // handle I/O errors
        } catch (InterruptedException e) {
            // handle interrupt
        }
    }
}
I think the root of the problem is that you shouldn't be subtracting bufferSize. Rather you should be subtracting the number of unread characters in the buffer. And I don't think there's a way to get this.
The easiest solution I can think of is to create a custom subclass of FilterReader that keeps track of the number of characters read. Then stack the streams as follows:
FileReader
< BufferedReader
< custom filter reader
< BufferedReader(sz == 1)
The final BufferedReader is there so that you can use readLine ... but you need to set the buffer size to 1 so that the character count from your filter matches the position that the application has reached.
Alternatively, you could implement your own readLine() method in the custom filter reader.
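A minimal sketch of such a counting filter, assuming all you need back is the number of characters consumed so far:

import java.io.FilterReader;
import java.io.IOException;
import java.io.Reader;

// Counts every character that passes through. Stack it as
// new BufferedReader(new CountingReader(new BufferedReader(new FileReader(file))), 1)
// so the outermost reader cannot read ahead of the count.
public class CountingReader extends FilterReader {
    private long charactersRead = 0;

    public CountingReader(Reader in) {
        super(in);
    }

    @Override
    public int read() throws IOException {
        int c = super.read();
        if (c != -1) {
            charactersRead++;
        }
        return c;
    }

    @Override
    public int read(char[] cbuf, int off, int len) throws IOException {
        int n = super.read(cbuf, off, len);
        if (n > 0) {
            charactersRead += n;
        }
        return n;
    }

    public long getCharactersRead() {
        return charactersRead;   // equals the byte offset only for single-byte encodings
    }
}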
After a few days of searching I found out that subtracting the buffer size and adding the position in the buffer was indeed not the right way to do it. The position was never right and I was always missing some lines.
While searching for a new way to do the job, I didn't count the number of characters because there are just too many characters to count, which would decrease my performance a lot. But I found something else: software engineer Mark S. Kolich created a class, JumpToLine, which uses the Apache Commons IO library to jump to a given line. It can also provide the last line it has read, so this is really what I need.
There are some examples on his homepage for those interested.
