How to find status of producer using consumer - java

I am reading a large file, approximately 2 GB in size, but I do not know whether the writer has finished writing it. Is there any way to check from the reader side whether the file has been written completely?

Without any code it's going to be hard to make concrete suggestions, but why not just extend whatever class you're using as a writer and add a simple CountDownLatch to it (I'm assuming the reader and writer are running in two different threads), calling countDown() once writing is done? On the reader side, you simply use the latch's await() method.
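A minimal sketch of that idea, assuming the reader and writer run as two threads inside the same JVM (the thread bodies are placeholders):
import java.util.concurrent.CountDownLatch;

// Sketch only: the writer signals completion, the reader waits for it.
CountDownLatch writeDone = new CountDownLatch(1);

Thread writer = new Thread(() -> {
    // ... write the 2 GB file here (placeholder) ...
    writeDone.countDown();               // signal "file is complete"
});

Thread reader = new Thread(() -> {
    try {
        writeDone.await();               // blocks until the writer has finished
        // ... now it is safe to read the file ...
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
    }
});

writer.start();
reader.start();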

AFAIK, there isn't a portable way to do it. However on Linux you can test if some other process has a file open by running the lsof command and processing its output.
For example:
lsof /home/fred/someFile
will output the pids for the process or processes that have that file open. If you can assume that the process that is writing the file closes it once it has finished writing (and doesn't reopen it), you can use the lsof output to test if the file has been fully written.
You can run the lsof command from Java using Process / ProcessBuilder.
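A rough sketch of that check from Java, assuming lsof is on the PATH; the method name is mine, and it relies on the "writer closes the file when done" assumption above:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

// Sketch only (Linux, lsof must be installed): treat "no process has the
// file open" as "the writer has finished and closed it".
static boolean looksFullyWritten(String path) throws IOException, InterruptedException {
    Process p = new ProcessBuilder("lsof", path).start();
    boolean stillOpen;
    try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
        stillOpen = r.readLine() != null;   // any output means some process holds the file open
    }
    p.waitFor();
    return !stillOpen;
}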

Java ProcessBuilder how to capture file reading request from process and provide as stream?

I am somewhat familiar with ProcessBuilder and do process the streams.
Now I have run into the problem that the process I am automating reads some information from two files that I need to provide.
Currently, I write the files and provide their paths to the program via ProcessBuilder.
Since I am expecting millions of runs in the near future, I would like to speed things up by doing all the work in memory instead of reading from and writing to files.
Basically, what I need is to capture the file-open request from the automated program and serve the expected data from an in-memory stream or something similar.
Of course, if I could somehow tell ProcessBuilder that the file paths I am passing should be replaced by streams, that would be even better.
How can I achieve this?
There is no interface to Process that allows you to intercept and modify I/O access like that. Unless you have the source code for the program whose execution you're trying to automate, you'll most likely have to do it on OS level.
It could be achieved though by creating a ram disk. If you're on Linux for instance, it's not that complicated. Have a look at this link: Linux RAM Disk: Creating A Filesystem In RAM.
I suppose another alternative would be to let your Java program create named pipes and pass their paths to the automated program.
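If the program only reads the files sequentially, a named pipe (FIFO) can work; here is a rough Linux-only sketch that shells out to mkfifo, since Java has no built-in FIFO API. The program name, pipe path and method are placeholders:
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

static void runWithFifo(byte[] inMemoryData) throws IOException, InterruptedException {
    Path fifo = Paths.get("/tmp/input-1.fifo");                            // placeholder path
    new ProcessBuilder("mkfifo", fifo.toString()).start().waitFor();

    // Start the consumer first: opening a FIFO for writing blocks until
    // a reader has opened the other end.
    Process consumer =
        new ProcessBuilder("the-automated-program", fifo.toString()).start();  // placeholder command

    try (OutputStream out = new FileOutputStream(fifo.toFile())) {
        out.write(inMemoryData);                                           // data served from memory
    }

    consumer.waitFor();
    Files.delete(fifo);
}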

Java Dealing with child Process

I have a simple script that writes output to the console; I run it using ProcessBuilder and write to it with an OutputStreamWriter. See:
Java Process with Input/Output Stream
(an earlier thread). I believe my problem rests with the fact that a sub-process is spawned and the initial parent process is killed, which results in the OutputStreamWriter throwing a java.io.IOException: Broken pipe. Given that the application spawns a second process, how can I connect my OutputStreamWriter to this new process, including being able to read back the output it generates? Is this even possible in Java? Surely Java should be able to follow on to the spawned process.
Thanks
Are you calling waitFor() on the process you are running? That should ensure your parent doesn't complete before the child. You may also want to look at Commons Exec, an open-source library designed to make your life easier when running separate processes in Java.
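For example (a sketch, with the command as a placeholder), waiting on the child and draining its output so it cannot block:
import java.io.BufferedReader;
import java.io.InputStreamReader;

ProcessBuilder pb = new ProcessBuilder("myScript.sh");      // placeholder command
pb.redirectErrorStream(true);                               // merge stderr into stdout
Process p = pb.start();

try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
    String line;
    while ((line = r.readLine()) != null) {
        System.out.println(line);                           // echo the child's output
    }
}
int exitCode = p.waitFor();                                 // parent waits for the child to finish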

External program blocks when run by Runtime exec

I'm attempting to launch an instance of the VideoLAN program from within a java application. One of the ways I've tried to do this is shown here:
Process p = Runtime.getRuntime().exec("\"C:\\Program Files\\VideoLAN\\VLC\\vlc.exe\" \"http://www.dr.dk/Forms/Published/PlaylistGen.aspx?qid=1316859&odp=true\" :sout=#std{access=udp,mux=ts,dst=127.0.0.1:63928}");
If I execute the above command the vlc program will be launched, and will start a streaming operation (it goes through connect, buffering and then streaming phases).
When the command is executed via Runtime exec (or ProcessBuilder start), the vlc program hangs when it reaches the end of the buffering phase. If all threads in the Java program terminate or run to completion, the vlc program will progress to the streaming phase. The Java process will not terminate until the vlc process is closed, so this behavior is obviously the result of some sort of coupling between the processes.
I have tried to execute the command indirectly by writing it to a .cmd file and then executing that, but it results in the same behavior.
Any ideas for how I can avoid the external process hanging?
Hmm, my guess would be that VLC filled your STDOUT buffer and is hung in a printf statement because STDOUT is waiting for that buffer to empty.
You need to get the stream for the process's output and read it (even if you discard it).
I recommend you read this article
On the 4th page is a good example of how to read the streams in threads so your child process won't block.
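The pattern from that article boils down to something like this (a sketch; the class name is mine):
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

// One "gobbler" thread per stream: it keeps draining the child's output
// so the child can never block on a full pipe buffer.
class StreamGobbler extends Thread {
    private final InputStream in;

    StreamGobbler(InputStream in) {
        this.in = in;
    }

    @Override
    public void run() {
        try (BufferedReader r = new BufferedReader(new InputStreamReader(in))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println(line);    // or simply discard / log it
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

// Usage: start a gobbler for stdout and one for stderr right after exec().
Process p = Runtime.getRuntime().exec(command);   // the vlc command string from above
new StreamGobbler(p.getInputStream()).start();
new StreamGobbler(p.getErrorStream()).start();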
This site is fantastic :). For some reason an approach I thought had already been tried suddenly started working.
The problem is that vlc writes to its stderr (output which is not visible when it is run from a prompt). It then blocks once some output buffer is full. A solution is to redirect stderr to stdout and then have a thread drain the process object's input stream.
It is, however, not an optimal solution, since I need a fair number of external processes and you can't do non-blocking I/O on their input streams. I will experiment a bit with having a timer service drive the empty-reading for a number of processes. Other suggestions for how to decouple the processes and avoid this problem are very welcome.
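A rough sketch of that timer-driven approach, using InputStream.available() so the periodic reads never block (the helper that starts the processes is hypothetical):
import java.io.IOException;
import java.io.InputStream;
import java.util.List;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

ScheduledExecutorService timer = Executors.newSingleThreadScheduledExecutor();
List<Process> processes = startAllVlcInstances();    // hypothetical helper; stderr redirected to stdout

timer.scheduleAtFixedRate(() -> {
    for (Process p : processes) {
        try {
            InputStream in = p.getInputStream();
            int n = in.available();                  // how much can be read without blocking
            if (n > 0) {
                in.skip(n);                          // discard it so the buffer never fills
            }
        } catch (IOException e) {
            // the process has probably exited; ignore or drop it from the list
        }
    }
}, 0, 200, TimeUnit.MILLISECONDS);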

Java ProcessBuilder: external process hangs

I'm using Java's ProcessBuilder class to run an external process. The process should not terminate before the Java program does; it must stay alive in command/response mode.
I know that the process streams may easily 'jam' if neglected, so I've done the following:
The program reads the process's combined output and error streams in a "reader" thread, and uses a "writer" thread to manage the commands. The reader thread does blocking character reads from process output, buffers them up into Strings and dispatches the results. The writer thread writes complete "command" lines via a PrintWriter; it uses a queue to ensure that no two command writes are "too close together" (currently 100ms), and that no new command gets written before the output of the previous command is complete. I also call flush() and checkError() after every println().
This scheme works fine for a few seconds or minutes, then the reader thread hangs on the blocking read(). No errors, no exceptions thrown, no more process output. Thereafter nothing will revive the external process (short of restarting it). (BTW this happens on both Linux and Windows.)
I've looked at the code and test-cases in Jakarta Commons Exec and in Plexus Utils http://plexus.codehaus.org/plexus-utils/ but (a) neither gives an example of using a long-lived Process and (b) neither appears to be doing anything basically different from what I've described.
Does anyone have a clue what's happening here please?
Thanks!
Do you also have a thread managing stderr? You only mention the two streams.
I have implemented the error, input and output streams in three separate threads, and I can read from and write to external processes without any problem.
I tested this on both Windows and Linux with a multitude of built-in apps (cmd/bash) as well as other command-line binaries, and it works fine; on some occasions it just throws an I/O stream exception, so I catch the exception and restart the thread, and the program keeps on working.
If you are trying to, e.g., run ssh on Linux, you might run into the problem that you won't be able to write to the same stdin; this is for security reasons.
Try taking input from System.in and see if it works; it worked in my case.
Just a guess, but have you tried un-combining the error and output streams?

Best Way to Launch External Process from Java Web-Service?

I've inherited a Java web-services code-base (BEA/Oracle Weblogic) and need to start/launch an external background application from a web-service.
I've already tried:
ProcessBuilder pb = new ProcessBuilder(arg);
pb.start();
as well as:
Runtime.exec(cmdString);
But I am experiencing strange behavior when launching applications this way (i.e., the launched application stops working even though the process is still active; the application works fine when run manually from a normal command line).
Is there a better way to launch an external processes?
EDIT: ----------------------
I have some additional information that may help shed some light on the problem.
The process we are trying to start will take hours to complete, so waiting for completion (using waitFor()) in the web service is not an ideal scenario.
Yes, the process we are trying to start from the webservice was created by a fellow team member [cue: your eyes roll... now]
I have had success when I use ProcessBuilder to start a bash script in which the external application is launched as a background process (using "&").
#!/bin/bash
java -jar myApp.jar &
This obviously creates an orphaned process but at least the application does continue to execute.
Simply put: if the launched application writes to STDOUT/STDERR and you don't drain them frequently (see Process.getErrorStream/Process.getInputStream), the process will block once the buffer is full (and that buffer is really small, 4 KB or less).
I recommend invoking ProcessBuilder.redirectErrorStream(true) before starting the process. Then create a thread whose run() method is along the lines of:
public void run() {
    try {
        // Drain the child's (combined) output so its buffer never fills up.
        BufferedReader reader =
            new BufferedReader(new InputStreamReader(process.getInputStream()));
        String line;
        while ((line = reader.readLine()) != null) {
            System.out.println(line);
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}
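Putting it together might look like this (a sketch; outputDrainer is whatever Runnable carries the run() method above):
ProcessBuilder pb = new ProcessBuilder(arg);
pb.redirectErrorStream(true);          // merge stderr into stdout
Process process = pb.start();
new Thread(outputDrainer).start();     // the Runnable with the run() shown above
int exit = process.waitFor();          // optional; only if you actually want to block for hours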
By "stops working even though the process is still active" I am assuming you might be expecting some output from the application you have launched and not getting anything.
Try using the following:
ProcessBuilder pb = new ProcessBuilder(arg);
Process p = pb.start();
p.waitFor();
waitFor() causes the current thread to wait, if necessary, until the process represented by this Process object has terminated.
http://java.sun.com/javase/6/docs/api/java/lang/Process.html#waitFor()
Firstly, is this happening on Windows or on Linux? Also, what is the launched application supposed to more or less do? (is it a script? is it a binary? is it your binary?)
EDIT
OK, so starting a bash script (using ProcessBuilder) which in turns spawns a new JVM (java -jar myApp.jar) works.
1. What happens exactly when you try to spawn the new JVM directly using ProcessBuilder? You originally said:
"the launched application stops working"
By launched application do you mean "java -jar myApp.jar", when invoked directly, not via an intermediate bash script?
2. What are the exact and complete parameters (and their values) that you pass to the various ProcessBuilder methods (and in which order) when you try to launch Java directly and this new JVM stops working? (e.g. provide annotated code)
3. If you install lsof on your *nix machine, what file is shown to be associated with file descriptor 2 (look at the FD column) when running: lsof -p 1234 (where 1234 is the process ID of the "hung" JVM)? It might be interesting to attach the entire output of the lsof command here.
4. What is appended to the file you identified in step 3 above (for file descriptor 2), up to several seconds after you issue the command: kill -QUIT 1234 (where 1234 is the process ID of the "hung" JVM)?
Are you properly handling the standard input and output of the process? If the standard input or output is being processed by your application and you are not properly handling it, then the process you execute will hang waiting for I/O.
A way to test this is to write a script that runs your program, redirecting standard input, output and error to files. Then have your web service app run the script instead of the program. If the program runs to completion this way, then the problem is handling of the output of the process.
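An equivalent test can also be done directly from Java (Java 7+) by redirecting the child's streams to files with ProcessBuilder; the command and file names below are placeholders:
import java.io.File;

ProcessBuilder pb = new ProcessBuilder("myLongRunningApp");   // placeholder command
pb.redirectInput(new File("stdin.txt"));                      // pre-prepared input, if any
pb.redirectOutput(new File("stdout.log"));
pb.redirectError(new File("stderr.log"));
Process p = pb.start();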
I'm guessing that the problem might be that the thread which launches the process gets terminated once the request is over. Try having a single thread in the application that you are sure is always kept alive; you can use it to start the processes by making calls to it from other threads.
