I've created a class which processes files and if it encounters certain specific errors, it outputs relevant error messages to the error stream.
I am working on another class that needs to access these error messages, and I'm not sure how to do this. I am a beginner in Java programming. Based on my limited knowledge, I thought my two options were either to call the main method of the first class (but I don't know how I would get the error messages in that case) or to execute the compiled class and access the messages through the getErrorStream() method of the Process class. However, I am having trouble with the system deadlocking, or possibly not even executing the exec command, so I'm not sure how to implement the second option either.
I'm not quite sure what you're asking here, but a potential problem with your code is that you're not reading from the process' stdout. Per the Process API, "failure to promptly ... read the output stream of the subprocess may cause the subprocess to block, and even deadlock." Is this the "trouble" you mentioned?
Edit: So you can either keep doing what you're doing, being sure to read both the error stream and the output stream (see my comment), or you can call the main method directly from your code, in which case the error output will be written to System.err. You could use System.setErr() to install your own stream and capture whatever is written to it, but keep in mind that any error output from your own app (the one that's running the other app) will also show up there. It sounds like spawning a separate process, as you're already doing, is what you want.
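If you do go the in-process route, a rough sketch of the System.setErr() idea looks like this; the runFileProcessor method is just a hypothetical stand-in for the other class's main method:

import java.io.ByteArrayOutputStream;
import java.io.PrintStream;

public class CaptureErrDemo {

    // Hypothetical stand-in for the file-processing class's main method.
    static void runFileProcessor() {
        System.err.println("error: could not parse line 42");
    }

    public static void main(String[] args) {
        PrintStream originalErr = System.err;
        ByteArrayOutputStream captured = new ByteArrayOutputStream();
        System.setErr(new PrintStream(captured));
        try {
            runFileProcessor();               // in real code: OtherClass.main(new String[0])
        } finally {
            System.setErr(originalErr);       // always restore the real error stream
        }
        System.out.println("Captured error output:\n" + captured);
    }
}

As noted above, anything your own code writes to System.err while the stream is swapped will end up in the same buffer.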
You can't build modularity based on many little programs with a main method. You have to make blocks of function as classes that are designed to be called from elsewhere -- and that means returning status information in some programmatic fashion, not just blatting it onto System.err. If it really is an error, throw an exception. If you have to return status, design a data structure to hold the status and return it. But don't go launching new processes all over the place and reading their error streams.
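To make that last point concrete, here is a minimal sketch (class and method names are made up for illustration) of returning status to the caller instead of printing to System.err:

import java.io.File;
import java.util.ArrayList;
import java.util.List;

public class FileProcessor {

    // Simple status holder returned to the caller instead of text on System.err.
    public static class Result {
        public final boolean success;
        public final List<String> errors;

        public Result(boolean success, List<String> errors) {
            this.success = success;
            this.errors = errors;
        }
    }

    public Result process(File file) {
        List<String> errors = new ArrayList<>();
        if (!file.exists()) {
            errors.add("File not found: " + file);   // record the problem programmatically
        }
        // ... real processing would go here ...
        return new Result(errors.isEmpty(), errors);
    }
}

The second class can then inspect Result.errors directly, or you can throw an exception instead if the condition really is fatal.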
Is it possible to debug a running Java process (with Eclipse / IntelliJ) without having a breakpoint? It would be useful to get the state of an object with a construct like this:
double d = Math.random();
BlockingQueue<String> queue = new LinkedBlockingQueue<String>();
queue.take();
"take()" will block forever, and I'd like to get the value of d with the debugger after take() was called.
I'd like to get the value of d with the debugger after take() was called
A simple solution would be to print the value; then you don't need the debugger at all. The debugger is useful for things like changing data while testing something or inspecting certain objects at runtime.
queue.take();
System.out.println(d);
In your case that's not an option. One option you do have is to decompile the jar containing the BlockingQueue class, convert it to source files, include it in your project, and set breakpoints inside take() to see the behavior.
One of the best decompilers is:
http://jd.benow.ca/
Using it, you can see the source of the take() method, copy the whole class into your package as BlockingQueue.java, and debug as you wish.
Viewing the state of a thread after an error would be very useful; while it is more common in some interpreted languages, it is sadly lacking in Java. There are, however, four approaches that come to mind, bearing in mind that the standard approach in Java is to log important state for later diagnostics:
journal your system, and keep all processing idempotent and deterministic.
attach a debugger after the error; you will not be able to roll back to the point of the exception but you will be able to inspect the current state of the system
add a REPL to your server, one that you can telnet into and use to inspect the system
Investigate DVR solutions for Java, such as http://chrononsystems.com. They allow rollback of the system to the point of the exception.
Just noticed that there is a pause / suspend button in Eclipse and IntelliJ. This does the job.
I'd like to set up a blocking file read in Java. That is, have a file such that when it is wrapped by a FileInputStream and any read() method is called, the call blocks.
I can't think of an easy OS-independent way - on Unix-like OSes I could try to create a FIFO using mkfifo and read from that file. A possible workaround would be to just create a very large file and read from that - the read is unlikely to complete before I capture the stack, but it's ugly and slow (and indeed reads can still be incredibly fast when cached).
The corresponding socket read() case is trivial to set up - create a socket yourself and read from it, and you can have deterministic blocking.
The purpose is to examine the stack of the method to determine what the top frames are in such a case. Imagine I have a component which periodically samples the stack traces of all running threads and then tries to categorize what each thread is doing at the moment. One thing it could be doing is file IO. So I need to know what the "top of stack" looks like during file IO. I have already determined that by experimentation (simply read a file in a variety of ways and sample the stack), but I want to write a test that will fail if this ever changes.
The natural way to write such a test is to kick off a thread which does a file read, then examine the top frame(s). To do this reliably, I want a blocking read (or else the thread may finish its read before the stack trace is taken, etc).
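For what it's worth, a rough sketch of the test shape I have in mind follows; FileDescriptor.in is just a stand-in here for whatever reliably blocking file source I end up with:

import java.io.FileDescriptor;
import java.io.FileInputStream;

public class FileReadStackTest {
    public static void main(String[] args) throws Exception {
        Thread reader = new Thread(() -> {
            try {
                new FileInputStream(FileDescriptor.in).read();   // blocks until input arrives
            } catch (Exception ignored) {
            }
        });
        reader.setDaemon(true);
        reader.start();
        Thread.sleep(200);   // crude: give the reader thread time to reach read()

        for (StackTraceElement frame : reader.getStackTrace()) {
            System.out.println(frame);   // the top frames show what file IO looks like
        }
    }
}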
To get guaranteed blocking I/O, read from a console, e.g. /dev/console on Linux or CON on Windows.
To make this platform-independent, you may hack the FileDescriptor of FileInputStream:
import java.io.File;
import java.io.FileDescriptor;
import java.io.FileInputStream;
import java.lang.reflect.Field;

// Open a dummy FileInputStream
File f = File.createTempFile("dummy", ".tmp");
f.deleteOnExit();
FileInputStream fis = new FileInputStream(f);

// Replace FileInputStream's descriptor with stdin, so read() blocks until console input arrives
Field fd = FileInputStream.class.getDeclaredField("fd");
fd.setAccessible(true);
fd.set(fis, FileDescriptor.in);

System.out.println("Reading...");
fis.read();
System.out.println("Complete");
UPDATE
I've realized you don't even need the method to block. Just to get a proper stack trace, you can invoke read() on an invalid FileInputStream:
FileInputStream fis = new FileInputStream(new FileDescriptor());
fis.read(); // This will throw IOException exactly with the right stacktrace
If you still need a blocking read(), named pipes are the way to go: run mkfifo using Runtime.exec on POSIX systems, or create \\.\PIPE\MyPipeName on Windows.
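For the POSIX case, a rough sketch could look like this; the FIFO path is arbitrary and mkfifo is assumed to be on the PATH:

import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;

public class FifoBlockingRead {
    public static void main(String[] args) throws Exception {
        File fifo = new File("/tmp/blocking-read-test.fifo");
        new ProcessBuilder("mkfifo", fifo.getAbsolutePath()).start().waitFor();

        // Hold the write end open in a background thread without ever writing,
        // so the reader blocks inside read() rather than seeing end-of-file.
        Thread writer = new Thread(() -> {
            try (FileOutputStream out = new FileOutputStream(fifo)) {
                Thread.sleep(Long.MAX_VALUE);
            } catch (Exception ignored) {
            }
        });
        writer.setDaemon(true);
        writer.start();

        try (FileInputStream in = new FileInputStream(fifo)) {
            in.read();   // blocks indefinitely; sample the thread's stack trace now
        }
    }
}

Note that on a FIFO the open calls themselves also block until both ends are open, which is why both ends are opened here before read() is reached.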
I don't know of any way to create a file, in an OS-independent way, that will always block when read.
If I were trying to find the stack trace when a specific function is called, I would run the program under a debugger and set a breakpoint on that function. Be aware, though, that method breakpoints will slow down your program and can give you different results than you would normally get if timing is important.
If you have access to the source code of the program, you could make a fake FileInputStream that extends the real one but always blocks on a read. All you need to do is switch out the import statements throughout the code. However, this won't capture places where you are not able to switch out import statements, and it could be a pain if there is a lot of code.
If you want to use your own FileInputStream without changing the program source code or compiling, you can make a custom class loader that loads your custom FileInputStream class instead of the real one. You can specify which class loader to use on the command line by:
java -Djava.system.class.loader=com.test.MyClassLoader xxx
Now that I think about it, I have an even better idea: instead of making a custom FileInputStream that blocks on read(), make one that prints out the stack trace on read() and then calls the real version of read(). This way you will get stack traces for all calls.
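A hedged sketch of that idea (the class name is made up; how you swap it in, via imports or a class loader, is as described above):

import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;

public class TracingFileInputStream extends FileInputStream {

    public TracingFileInputStream(File file) throws FileNotFoundException {
        super(file);
    }

    @Override
    public int read() throws IOException {
        new Exception("read() called").printStackTrace();   // dump the caller's stack
        return super.read();                                 // then do the real read
    }
}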
From my understanding, you want to write a test which inspects the stack trace of the FileInputStream.read() method. What about descendants of FileInputStream that override the read() method?
If you don't need to inspect the descendants, I think you can use the JVM Tool Interface: insert a breakpoint at runtime in the desired method and, while processing the resulting breakpoint event, dump the stack trace.
After the dump is complete, you remove the breakpoint and continue execution.
(This all occurs in runtime using this API, no black magic :) )
You could have a separate thread watch for changes to the file's access time and generate a JVM thread dump when that happens. As for generating the thread dump in code, I haven't tried it, but it looks like that's answered here: Generate a Java thread dump without restarting.
I don't know how well this will work with the timing between your threads, but I imagine it should come pretty close. I'm also not 100% sure about the OS independence of this solution, as I haven't tested it, but it should work for most modern-ish systems. See the javadocs for java.nio.file.attribute.BasicFileAttributes to see what is returned if it's not supported.
One trick is: if it is possible to modify your API to return a Reader instead of a File, then you can wrap a String with a custom StringReader (class SlowAsRubyStringReader extends Reader, say) that overrides the various int read() methods with a Thread.sleep(500) before it does the real work. Only during testing, of course.
#see http://docs.oracle.com/javase/7/docs/api/java/io/StringReader.html
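A minimal sketch of what that test-only reader might look like (the delay and the delegation are illustrative):

import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;

public class SlowAsRubyStringReader extends Reader {
    private final StringReader delegate;

    public SlowAsRubyStringReader(String s) {
        this.delegate = new StringReader(s);
    }

    @Override
    public int read(char[] cbuf, int off, int len) throws IOException {
        try {
            Thread.sleep(500);   // artificial delay so the caller stays blocked long enough
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return delegate.read(cbuf, off, len);
    }

    @Override
    public void close() throws IOException {
        delegate.close();
    }
}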
I think there is a larger issue here, not just files: you want to inspect the context in which an API is being called during your test cases, is it not? That is, you want to be able to examine the stack and say, "aha! I caught you calling the MudFactory API from the JustTookABath object, OUTRAGEOUS!". If this is the case, then you may have to delve into dynamic proxies, which allow you to hijack method calls, or use aspect-oriented programming, which lets you do the same in a more systematic way. See http://en.wikipedia.org/wiki/Pointcut
read() dives quickly into native code, so yes, you probably need to go native to block at that level. Alternatively, you may want to consider logging a stack trace at the point in your code just before or after read().
Something like:
log ( ExceptionUtils.getStackTrace(new Exception()) );
The ExceptionUtils documentation is here: https://commons.apache.org/proper/commons-lang/javadocs/api-3.1/org/apache/commons/lang3/exception/ExceptionUtils.html
I want to invoke an external program from my Java code, and Google tells me that Runtime or ProcessBuilder can help me do this. I have tried it, and a problem came up: the Java program never exits, meaning both the child process and the parent process wait forever; they hang, or deadlock.
Someone told me the reason is that the child process's buffer is too small: when it tries to send data back to the parent process and the parent doesn't read it in time, both of them hang. So they advised me to fork a thread responsible for reading the child process's buffered output. I did as they said, but there was still a problem.
Then I closed the output stream obtained from getOutputStream(), and finally the program succeeded. But I don't know why that worked. Is there some relationship between the output stream and the input stream?
You have provided very few details in your question, so I can only provide a general answer.
All processes have three standard streams: standard input, standard output and standard error. Standard input is used for reading in data, standard output for writing out data, and standard error for writing out error messages. When you start an external program using Runtime.getRuntime().exec() or ProcessBuilder, Java will create a Process object for the external program, and this Process object will have methods to access these streams.
These streams are accessed as follows:
process.getOutputStream(): returns the standard input of the external program. This is an OutputStream because it is something your Java code writes to.
process.getInputStream(): returns the standard output of the external program. This is an InputStream because it is something your Java code reads from.
process.getErrorStream(): returns the standard error of the external program. This is an InputStream because, like standard output, it is something your Java code reads from.
Note that the names of getInputStream() and getOutputStream() can be confusing.
All streams between your Java code and the external program are buffered. This means each stream has a small amount of memory (a buffer) where the writer can write data that is yet to be read by the reader. The writer does not have to wait for the reader to read its data immediately; it can leave its output in the buffer and continue.
There are two ways in which writing to buffers and reading from them can hang:
attempting to write data to a buffer when there is not enough space left for the data,
attempting to read from an empty buffer.
In the first situation, the writer will wait until space is made in the buffer by reading data out of it. In the second, the reader will wait until data is written into the buffer.
You mention that closing the stream returned by getOutputStream() caused your program to complete successfully. This closes the standard input of the external program, telling it that there will be nothing more for it to read. If your program then completes successfully, this suggests that your program was waiting for more input to come when it was hanging.
It is perhaps arguable that if you do run an external program, you should close its standard input if you don't need to use it, as you have done. This tells the external program that there will be no more input, and so removes the possibility of it being stuck waiting for input. However, it doesn't answer the question of why your external program is waiting for input.
Most of the time, when you run external programs using Runtime.getRuntime().exec() or ProcessBuilder, you don't use the standard input at all. Typically, you pass whatever inputs the external program needs on the command line and then read its output (if it generates any at all).
Does your external program do what you need it to and then get stuck, apparently waiting for input? Do you ever need to send it data to its standard input? If you start a process on Windows using cmd.exe /k ..., the command interpreter will continue even after the program it started has exited. In this case, you should use /c instead of /k.
Finally, I'd like to emphasise that there are two output streams, standard output and standard error. There can be problems if you read from the wrong stream at the wrong time. If you attempt to read from the external program's standard output while its buffer is empty, your Java code will wait for the external program to generate output. However, if your external program is writing a lot of data to its standard error, it could fill the buffer and then find itself waiting for your Java code to make space in the buffer by reading from it. The end result of this is your Java code and the external program are both waiting for each other to do something, i.e. deadlock.
This problem can be eliminated simply by using a ProcessBuilder and ensuring that you call its redirectErrorStream() method with a true value. Calling this method redirects the standard error of the external program into its standard output, so you only have one stream to read from.
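A minimal sketch of that approach, with a placeholder command, might look like this:

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class RunExternal {
    public static void main(String[] args) throws Exception {
        ProcessBuilder pb = new ProcessBuilder("some-program", "arg1");   // placeholder command
        pb.redirectErrorStream(true);                 // merge standard error into standard output
        Process process = pb.start();
        process.getOutputStream().close();            // nothing to send to its standard input

        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(process.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println("external: " + line);
            }
        }
        System.out.println("Exited with " + process.waitFor());
    }
}

Closing the process's standard input up front, as discussed earlier, also rules out the hang where the external program sits waiting for input that will never come.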
I am calling a .exe file from my java code using :
Runtime r=Runtime.getRuntime();
Process p=null;
p=r.exec("ABCD.exe");
I want the program to wait until the exe completes its job. (This is actually server-side code; control passes to the client side after this.) The problem now is that the UI on the client side is populated before the .exe on the server side can create the required components. Hence, the UI that is formed does not have the correct files.
I have tried the normal p.waitFor() approach, but it doesn't seem to work.
Any suggestions?
The short answer is that you want to call Process.waitFor() in your main thread, as you allude to.
However, dealing with Processes is not exactly fire-and-forget, because, as referenced by the class javadocs, you likely need to be reading the process' output. If you don't do this (which in this case will require a separate thread) then in many instances you'll have an effective deadlock - your Java app is waiting for the process to finish, but the process is trying to write output to a full buffer and thus waiting for the Java app to read its output.
If you gave more information about how "it didn't work", that would help with the diagnosis too.
Edit: on a completely separate point, there's no purpose in initialising p to null and then immediately reassigning it. Your second line would be clearer and less confusing as Process p = r.exec("ABCD.exe");.
I have started a process from my Java code. This process takes a very long time to run and can generate output from time to time. I need to react to every output as it is generated; what is the best way to do this?
What kind of reaction are you talking about? Is the process writing to its standard output and/or standard error? If so, I suspect Process.getInputStream and Process.getErrorStream are what you're looking for. Read from both of those and react accordingly. Note that you may want to read from both of them from different threads, to avoid the individual buffer for either stream from filling up.
Alternatively, if you don't need the two separately, call redirectErrorStream(true) on the ProcessBuilder, so the error and output streams are merged.
You should start a thread which reads from Process.getInputStream() and getErrorStream() (or alternatively use ProcessBuilder.redirectErrorStream(true)) and handle whatever shows up in the stream. There are many ways to handle it - the right way depends on how the data is being used. Please give more details.
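A rough sketch of that pattern, with a placeholder command, reacting to each line as soon as it shows up on either stream:

import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.util.function.Consumer;

public class OutputWatcher {

    static Thread watch(InputStream stream, Consumer<String> onLine) {
        Thread t = new Thread(() -> {
            try (BufferedReader reader = new BufferedReader(new InputStreamReader(stream))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    onLine.accept(line);      // react to each line as it is generated
                }
            } catch (Exception ignored) {
            }
        });
        t.start();
        return t;
    }

    public static void main(String[] args) throws Exception {
        Process process = new ProcessBuilder("some-long-running-program").start();
        watch(process.getInputStream(), line -> System.out.println("out: " + line));
        watch(process.getErrorStream(), line -> System.err.println("err: " + line));
        process.waitFor();
    }
}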
Here is one real-life example: SbtRunner uses ProcessRunner to send commands to a command line application and wait for the command to finish execution (the application will print "> " when a command finishes execution). There is some indirection happening to make it easier to read from the process' output (the output is written to a MulticastPipe from where it is then read by an OutputReader).