So, what I want to do is basically filter everything that passes through System.out and System.err. Is it possible to apply a filter of some sort, or to create my own OutputStream to divert System.out into, and then process the output normally?
Edit for clarity:
To clarify, I need to read what goes out of System.out from other programs and manipulate it how I see fit, so a logger is not really an option as I do not have control over what the other program will use.
Edit for more clarity:
I am creating a plugin for a larger program that needs to read everything written to System.out by other plugins. Because it is a plugin-based system, the process my plugin runs in will always be the same process the other plugins run in.
Something I am not clear on: you mentioned that you want to "read what goes out of System.out from other programs". Is your application creating the process and wanting to monitor its standard out/err, or do you want to monitor the standard out/err of your own process?
In the former case, after you have created the process, you can get its output, input and error streams.
In the latter case, you can replace the standard out of your Java process with System.setOut(yourOwnPrintStream) (note that setOut() takes a PrintStream, so wrap your OutputStream in one).
However, if you are trying to deal with the streams of a totally unrelated process, I believe there is no way of doing so without having the caller pipe its stdout/stderr to your process (except perhaps through some platform-specific methods).
Update:
In order to "intercept" the standard out, it is nothing different from all those classic "filter outputstreams". What you can do is something like this:
import java.io.FilterOutputStream;
import java.io.IOException;
import java.io.OutputStream;

class AnalyzingOutputStream extends FilterOutputStream {

    public AnalyzingOutputStream(OutputStream out) {
        super(out);
    }

    @Override
    public void write(int b) throws IOException {
        // do whatever analysis you want
        super.write(b); // delegate to the superclass' write, which in turn
                        // delegates to the wrapped output stream
    }

    // other overrides
}
To use it, your main logic should do something like the following (note that System.setOut() takes a PrintStream, so the filter stream is wrapped in one):

AnalyzingOutputStream analyzingOutputStream = new AnalyzingOutputStream(System.out);
System.setOut(new PrintStream(analyzingOutputStream));
// then you can call methods of AnalyzingOutputStream to do whatever you want
You should use a logger, but if you don't want to, you can create your own PrintStream which wraps System.out. Then call System.setOut(yourPrintStream).
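A minimal sketch of that idea (the class name is illustrative); only println(String) is shown, and the other print/write overloads would need the same treatment:

import java.io.PrintStream;

// Illustrative PrintStream that inspects each line before delegating to the
// original System.out it wraps.
class InterceptingPrintStream extends PrintStream {

    InterceptingPrintStream(PrintStream original) {
        super(original, true); // auto-flush so output appears promptly
    }

    @Override
    public void println(String line) {
        // analyze or manipulate the line here
        super.println(line);   // forward to the wrapped System.out
    }
}

// Installation:
// System.setOut(new InterceptingPrintStream(System.out));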
To capture all output from another process in Java, the simplest way is to launch the process yourself using Runtime.exec, which gives you a Process object with a getInputStream() method. Reading from that stream gives you the stdout of the launched process.
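A minimal sketch of that approach (the "ls -l" command is just a placeholder for the program you want to run):

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class CaptureChildOutput {
    public static void main(String[] args) throws IOException, InterruptedException {
        // launch the external program; the command here is only an example
        Process p = Runtime.getRuntime().exec(new String[] { "ls", "-l" });

        // getInputStream() is the child's standard output from our point of view
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println("child: " + line);
            }
        }
        p.waitFor();
    }
}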
If you're trying to capture the output of an arbitrary process, you've got a bigger task ahead of you - and it's probably not a good idea. I have no idea how it works on Windows. Under Linux, you'll need to be running under the same user as the process you want to investigate or (more likely) root. You can look at the file descriptors for any given process under /proc/<process id>/fd/<file descriptor>. The file descriptors for stdin, stdout and stderr are 0, 1 and 2 respectively. I would recommend you stop at this point and re-think what you're trying to do.
Related
I want to read what gets written to stdout in a Java process that spawns other processes using inheritIO. I cannot use redirectOutput as I have no control over the code that starts the process! Also note that resetting System.setOut doesn't work in this case.
Also I don't have access to the Process object.
Example:
new ProcessBuilder().command("/bin/echo", "FooBar").inheritIO().start();
// read 'FooBar' from standard out
By definition, inheritIO() causes the output of the subprocess to go to the same place as the output of the caller. So as soon as you call it on the ProcessBuilder, the calling process cannot read the standard output or error streams of the callee.
As you say you cannot change that, the only way I can imagine is to use an external launcher that redirects output to a pipe (the default for ProcessBuilder...), starts the program containing the line you showed (new ProcessBuilder().command("/bin/echo", "FooBar").inheritIO().start()), and processes that output.
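A hypothetical launcher along those lines (the class name and command are illustrative); because the target program is started with piped output rather than inherited output, everything it and its inheritIO children write ends up in the stream the launcher reads:

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class Launcher {
    public static void main(String[] args) throws Exception {
        // illustrative command: the program that internally calls inheritIO()
        Process target = new ProcessBuilder("java", "-jar", "target-program.jar")
                .redirectErrorStream(true)   // merge stderr into stdout for simplicity
                .start();                    // output is piped by default, not inherited

        try (BufferedReader r = new BufferedReader(new InputStreamReader(target.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                // "FooBar" from the echo subprocess would show up here too
                System.out.println("captured: " + line);
            }
        }
    }
}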
I'd like to set up a blocking file read in Java. That is, have a file such that when it is wrapped by FileInputStream and any read() method is called, the call blocks.
I can't think of an easy OS-independent way - on Unix-like OSes I could try to create a FIFO using mkfifo and read from that file. A possible workaround would be to just create a very large file and read from that - the read is unlikely to complete before I capture the stack, but it's ugly and slow (and indeed reads can still be incredibly fast when cached).
The corresponding socket read() case is trivial to set up - create a socket yourself and read from it, and you can have deterministic blocking.
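For reference, a minimal sketch of that socket variant (names are illustrative); the read blocks deterministically because the other end of the connection never writes anything:

import java.net.ServerSocket;
import java.net.Socket;

public class BlockingSocketRead {
    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(0)) {            // any free port
            Socket client = new Socket("localhost", server.getLocalPort());
            Socket accepted = server.accept();

            Thread reader = new Thread(() -> {
                try {
                    client.getInputStream().read();  // blocks: the peer never writes
                } catch (Exception ignored) {
                }
            }, "blocked-socket-reader");
            reader.start();

            Thread.sleep(200);                        // give the reader time to block
            for (StackTraceElement e : reader.getStackTrace()) {
                System.out.println(e);                // the top frames to examine
            }
            accepted.close();                         // unblock the reader
        }
    }
}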
The purpose is to examine the stack of the reading method to determine what the top frames are in such a case. Imagine I have a component which periodically samples the stack traces of all running threads and then tries to categorize what each thread is doing at the moment. One thing it could be doing is file IO. So I need to know what the "top of stack" looks like during file IO. I have already determined that by experimentation (simply read a file in a variety of ways and sample the stack), but I want to write a test that will fail if this ever changes.
The natural way to write such a test is to kick off a thread which does a file read, then examine the top frame(s). To do this reliably, I want a blocking read (or else the thread may finish its read before the stack trace is taken, etc).
To get guaranteed blocking I/O, read from a console, e.g. /dev/console on Linux or CON on Windows.
To make this platform-independent, you may hack the FileDescriptor of FileInputStream:
import java.io.File;
import java.io.FileDescriptor;
import java.io.FileInputStream;
import java.lang.reflect.Field;

// Open a dummy FileInputStream
File f = File.createTempFile("dummy", ".tmp");
f.deleteOnExit();
FileInputStream fis = new FileInputStream(f);

// Replace the FileInputStream's descriptor with stdin via reflection
Field fd = FileInputStream.class.getDeclaredField("fd");
fd.setAccessible(true);
fd.set(fis, FileDescriptor.in);

System.out.println("Reading...");
fis.read(); // blocks until something arrives on stdin
System.out.println("Complete");
UPDATE
I've realized you don't even need a method to block. In order just to get a proper stacktrace you may invoke read() on an invalid FileInputStream:
FileInputStream fis = new FileInputStream(new FileDescriptor());
fis.read(); // This will throw IOException exactly with the right stacktrace
If you still need a blocking read(), named pipes are the way to go: run mkfifo using Runtime.exec on POSIX systems, or create \\.\PIPE\MyPipeName on Windows.
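A rough POSIX-only sketch of the mkfifo approach (the path, thread names and timings are all illustrative); an idle writer keeps the pipe open so the reading thread gets past open() and then blocks inside read():

import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;

public class FifoBlockingRead {
    public static void main(String[] args) throws Exception {
        File fifo = new File("/tmp/blocking-read-test.fifo");   // illustrative path
        new ProcessBuilder("mkfifo", fifo.getAbsolutePath()).start().waitFor();

        // Open the write end but never write to it, so the reader gets past
        // open() and then blocks inside read().
        Thread writer = new Thread(() -> {
            try (FileOutputStream out = new FileOutputStream(fifo)) {
                Thread.sleep(Long.MAX_VALUE);
            } catch (Exception ignored) {
            }
        }, "idle-fifo-writer");
        writer.setDaemon(true);
        writer.start();

        Thread reader = new Thread(() -> {
            try (FileInputStream in = new FileInputStream(fifo)) {
                in.read();   // blocks: the writer never sends any data
            } catch (Exception ignored) {
            }
        }, "blocked-fifo-reader");
        reader.start();

        Thread.sleep(500);   // give the reader time to block
        for (StackTraceElement e : reader.getStackTrace()) {
            System.out.println(e);
        }
        fifo.delete();
        System.exit(0);      // the blocked threads are simply abandoned in this sketch
    }
}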
I don't know of any way to create a file, in an OS-independent way, that will always block when read.
If I were trying to find the stack trace when a specific function is called, I would run the program under a debugger and set a breakpoint on that function. However, method breakpoints slow down your program and can give you different results than you would normally get if timing is important.
If you have access to the source code of the program, you could make a fake FileInputStream that extends the real one but always blocks on read. All you need to do is switch out the import statements throughout the code. However, this won't capture places where you are not able to switch out the import statements, and it could be a pain if there is a lot of code.
If you want to use your own FileInputStream without changing the program source code or compiling, you can make a custom class loader that loads your custom FileInputStream class instead of the real one. You can specify which class loader to use on the command line by:
java -Djava.system.class.loader=com.test.MyClassLoader xxx
Now that I think about it, I have an even better idea: instead of making a custom FileInputStream that blocks on read(), make a custom FileInputStream that prints out the stack trace on read(). The custom class can then call the real version of read(). This way you will get the stack traces for all calls.
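A sketch of that last idea (the class name is illustrative); each read() logs the current call stack and then delegates to the real implementation:

import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;

// Drop-in replacement that records the call stack of every read() before
// delegating to the real FileInputStream behaviour.
class TracingFileInputStream extends FileInputStream {

    TracingFileInputStream(File file) throws FileNotFoundException {
        super(file);
    }

    @Override
    public int read() throws IOException {
        new Exception("read() called here").printStackTrace(); // dump the current stack
        return super.read();                                   // then do the real read
    }
}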
From my understanding you want to write a test which inspects the stack trace of FileInputStream.read() method. What about descendants of FileInputStream if they override the read() method?
If you don't need to inspect the descendants, I think you can use the JVM Tool Interface by inserting a breakpoint at runtime in the desired method, and in the handler for that breakpoint event, dump the stack trace.
After the dump is completed you remove the break point and continue the execution.
(This all occurs in runtime using this API, no black magic :) )
You could have a separate thread watch for changes to the file's access time and generate a JVM thread dump when that happens. As to generating the thread dump in code, I haven't tried it, but it looks like that's answered here: Generate a Java thread dump without restarting.
I don't know how well this will work with the timing between your threads, but I imagine it should come pretty close. I'm also not 100% sure about the OS independence of this solution as I haven't tested it, but it should work for most modern-ish systems. See the javadocs for java.nio.file.attribute.BasicFileAttributes to see what is returned if it's not supported.
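One way that watcher might look (the path and polling interval are illustrative); it polls the file's lastAccessTime and prints every thread's stack when it changes:

import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.attribute.BasicFileAttributes;
import java.nio.file.attribute.FileTime;
import java.util.Map;

public class AccessTimeWatcher {
    public static void main(String[] args) throws Exception {
        Path watched = Paths.get("/tmp/watched-file.txt");   // illustrative path
        FileTime last = Files.readAttributes(watched, BasicFileAttributes.class).lastAccessTime();

        while (true) {
            FileTime now = Files.readAttributes(watched, BasicFileAttributes.class).lastAccessTime();
            if (!now.equals(last)) {
                last = now;
                // poor man's thread dump: print every live thread's stack
                for (Map.Entry<Thread, StackTraceElement[]> e : Thread.getAllStackTraces().entrySet()) {
                    System.out.println(e.getKey());
                    for (StackTraceElement frame : e.getValue()) {
                        System.out.println("\tat " + frame);
                    }
                }
            }
            Thread.sleep(50);   // illustrative polling interval
        }
    }
}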
One trick: if it is possible to modify your API to return a Reader instead of a File, then you can wrap a String with a custom StringReader (class SlowAsRubyStringReader extends Reader, say) that overrides the various int read() methods with a Thread.sleep(500) before doing the real work. Only during testing, of course.
#see http://docs.oracle.com/javase/7/docs/api/java/io/StringReader.html
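A test-only sketch of that reader (here extending StringReader for brevity); only the no-argument read() is shown, and the other read overloads would need the same treatment:

import java.io.IOException;
import java.io.StringReader;

// Test-only reader that sleeps before every read, so the calling thread is
// reliably inside the read when its stack is sampled.
class SlowAsRubyStringReader extends StringReader {

    SlowAsRubyStringReader(String s) {
        super(s);
    }

    @Override
    public int read() throws IOException {
        try {
            Thread.sleep(500);               // artificial delay, test use only
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return super.read();                 // then perform the real read
    }
}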
I think there is a larger issue here, not just files: you want to inspect the context in which an API is getting called during your test cases, is it not? That is, you want to be able to examine the stack and say, "aha! I caught you calling the MudFactory API from the JustTookABath object, OUTRAGEOUS!". If this is the case, then you may have to delve into dynamic proxies, which allow you to hijack function calls, or use aspect-oriented programming, which allows you to do the same but in a more systematic way. See http://en.wikipedia.org/wiki/Pointcut
read() dives quickly into native code, so yes, you probably need to go native to block at that level. Alternatively, you may want to consider logging a stack trace at the point in your code before or after the read().
Something like:
log ( ExceptionUtils.getStackTrace(new Exception()) );
The ExceptionUtils documentation is here: https://commons.apache.org/proper/commons-lang/javadocs/api-3.1/org/apache/commons/lang3/exception/ExceptionUtils.html
I want to invoke an external program from Java code, and Google tells me that Runtime or ProcessBuilder can help me do this. I have tried it, but a problem came up: the Java program never exits. Both the subprocess and the parent process wait forever; they hang, or deadlock.
Someone told me the reason is that the subprocess's buffer is too small: when it tries to give data back to the parent process but the parent doesn't read it in time, both of them hang. They advised me to fork a thread to be in charge of reading the subprocess's buffered data. I did what they told me, but there were still problems.
Then I closed the output stream obtained from getOutputStream(). Finally, the program succeeded. But I don't know why this happens. Is there some relationship between the output stream and the input stream?
You have provided very few details in your question, so I can only provide a general answer.
All processes have three standard streams: standard input, standard output and standard error. Standard input is used for reading in data, standard output for writing out data, and standard error for writing out error messages. When you start an external program using Runtime.getRuntime().exec() or ProcessBuilder, Java will create a Process object for the external program, and this Process object will have methods to access these streams.
These streams are accessed as follows:
process.getOutputStream(): returns the standard input of the external program. This is an OutputStream as it is something your Java code will write to.
process.getInputStream(): returns the standard output of the external program. This is an InputStream as it is something your Java code will read from.
process.getErrorStream(): returns the standard error of the external program. This is an InputStream as, like standard output, it is something your Java code will read from.
Note that the names of getInputStream() and getOutputStream() can be confusing.
All streams between your Java code and the external program are buffered. This means each stream has a small amount of memory (a buffer) where the writer can write data that is yet to be read by the reader. The writer does not have to wait for the reader to read its data immediately; it can leave its output in the buffer and continue.
There are two ways in which writing to buffers and reading from them can hang:
attempting to write data to a buffer when there is not enough space left for the data,
attempting to read from an empty buffer.
In the first situation, the writer will wait until space is made in the buffer by reading data out of it. In the second, the reader will wait until data is written into the buffer.
You mention that closing the stream returned by getOutputStream() caused your program to complete successfully. This closes the standard input of the external program, telling it that there will be nothing more for it to read. If your program then completes successfully, this suggests that your program was waiting for more input to come when it was hanging.
It is perhaps arguable that if you do run an external program, you should close its standard input if you don't need to use it, as you have done. This tells the external program that there will be no more input, and so removes the possibility of it being stuck waiting for input. However, it doesn't answer the question of why your external program is waiting for input.
Most of the time, when you run external programs using Runtime.getRuntime().exec() or ProcessBuilder, you don't often use the standard input. Typically, you'd pass whatever inputs you'd need to the external program on the command line and then read its output (if it generates any at all).
Does your external program do what you need it to and then get stuck, apparently waiting for input? Do you ever need to send it data to its standard input? If you start a process on Windows using cmd.exe /k ..., the command interpreter will continue even after the program it started has exited. In this case, you should use /c instead of /k.
Finally, I'd like to emphasise that there are two output streams, standard output and standard error. There can be problems if you read from the wrong stream at the wrong time. If you attempt to read from the external program's standard output while its buffer is empty, your Java code will wait for the external program to generate output. However, if your external program is writing a lot of data to its standard error, it could fill the buffer and then find itself waiting for your Java code to make space in the buffer by reading from it. The end result of this is your Java code and the external program are both waiting for each other to do something, i.e. deadlock.
This problem can be eliminated simply by using a ProcessBuilder and ensuring that you call its redirectErrorStream() method with a true value. Calling this method redirects the standard error of the external program into its standard output, so you only have one stream to read from.
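A minimal sketch of that approach (the command is just a placeholder); it closes the child's standard input, merges standard error into standard output with redirectErrorStream(true), and drains everything from a single reader:

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class RunAndDrain {
    public static void main(String[] args) throws Exception {
        ProcessBuilder pb = new ProcessBuilder("some-command", "arg1");  // illustrative command
        pb.redirectErrorStream(true);        // fold stderr into stdout

        Process p = pb.start();
        p.getOutputStream().close();         // nothing to send to the child's stdin

        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println(line);    // consume everything so the child never blocks
            }
        }
        System.out.println("exit code: " + p.waitFor());
    }
}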
I have started a process in my Java code. This process takes a very long time to run and may generate output from time to time. I need to react to each piece of output as it is generated; what is the best way to do this?
What kind of reaction are you talking about? Is the process writing to its standard output and/or standard error? If so, I suspect Process.getInputStream and Process.getErrorStream are what you're looking for. Read from both of those and react accordingly. Note that you may want to read from them on different threads, to prevent the individual buffer for either stream from filling up.
Alternatively, if you don't need the two separately, just set redirectErrorStream in ProcessBuilder to true, so the error and output streams are merged.
You should start a thread which reads from Process.getInputStream() and getErrorStream() (or alternatively use ProcessBuilder.redirectErrorStream(true)) and handle whatever shows up in the stream. There are many ways to handle it; the right way depends on how the data is being used. Please give more details.
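A minimal sketch of that reader thread (the command and the handle method are illustrative placeholders):

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class OutputWatcher {
    public static void main(String[] args) throws Exception {
        Process p = new ProcessBuilder("long-running-command")   // illustrative command
                .redirectErrorStream(true)                        // one stream to watch
                .start();

        Thread watcher = new Thread(() -> {
            try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
                String line;
                while ((line = r.readLine()) != null) {
                    handle(line);             // react to each line as it appears
                }
            } catch (Exception e) {
                e.printStackTrace();
            }
        }, "process-output-watcher");
        watcher.start();

        p.waitFor();
        watcher.join();
    }

    private static void handle(String line) {
        System.out.println("got: " + line);   // placeholder reaction
    }
}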
Here is one real-life example: SbtRunner uses ProcessRunner to send commands to a command line application and wait for the command to finish execution (the application will print "> " when a command finishes execution). There is some indirection happening to make it easier to read from the process' output (the output is written to a MulticastPipe from where it is then read by an OutputReader).
I am creating a GUI using Java. This GUI launches a program from the command line using the ProcessBuilder class.
A little information on the process being launched: from the command line, it creates another window and prints information to said window.
In my GUI window, I have a text area to where I would like to redirect said output. I originally intended to use a SwingWorker object to constantly check for more output and not hold up the GUI. To test and make sure I had the original syntax down (without even bringing the GUI into things) I thought I would print the output from the secondary process' window to System.out. However, something seems to be wrong as I can see the output in the secondary process' window, but not the terminal from which I am working.
Excerpt of code is as follows:
Process p = pb.start();
Scanner s = new Scanner(p.getInputStream());
SwingWorker<String, Void> pipe = new SwingWorker<String, Void>() {
    public String doInBackground() {
        while (run) {
            if (s.hasNextLine()) {
                System.out.println("S has next!");
                System.out.println(s.nextLine());
            }
        }
        return null;
    }
};
pipe.execute();
The boolean run is defined elsewhere in the program and is set to false when the process p exits or is force quit (additional question: is that a really bad idea? I feel like it might be...).
Does anyone have an idea as to why I am never getting any output when I can see it being printed to the other window? Initially my reaction was to use p.getOutputStream(), but Scanner does not take an OutputStream as a parameter.
Thank you for your time.
You should also scan p.getErrorStream() - some programs write to STDERR which is indistinguishable from STDOUT when run from the command line. It is generally good practice to consume both streams, as if either one is not consumed it can cause the external process to hang.
If the external process is writing its output to its own window, it is almost certain that the output is NOT being written to STDOUT, which is what you are reading with your code. If it did so, then the external program's output would be appearing both in its window and in the command line session from which it was launched (if one existed). Without access to the source of the external program it's unlikely you will be able to intercept its output unless the authors made provisions for that functionality (i.e. a command-line switch that redirects output to STDOUT instead of the window).
As to p.getOutputStream(), that returns a stream which is "output" from YOUR point of view -- i.e. you write to it to send data to the process' STDIN. Your use of p.getInputStream() would be correct for the case where the external program writes to its STDOUT.
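To make the direction of each stream concrete, here is a small sketch (using cat purely as an example program that echoes its standard input back):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.io.Writer;

public class StreamDirections {
    public static void main(String[] args) throws Exception {
        Process p = new ProcessBuilder("cat").start();   // cat echoes stdin to stdout

        // getOutputStream(): we WRITE here; it becomes the child's STDIN
        try (Writer toChild = new OutputStreamWriter(p.getOutputStream())) {
            toChild.write("hello from Java\n");
        }   // closing it signals end-of-input to the child

        // getInputStream(): we READ here; it is the child's STDOUT
        try (BufferedReader fromChild = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            System.out.println(fromChild.readLine());    // prints "hello from Java"
        }
        p.waitFor();
    }
}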