What I'm trying to do is simply run a batch file that does some preparatory work necessary for the subsequent commands to be executed successfully (setting environment variables and stuff). To prove this I put together a sample that uses Commons Exec
public class Tester {
public static void main(String[] args) throws Exception {
Tester tester = new Tester();
MyResultHandler handler = tester.new MyResultHandler();
CommandLine commandLine = CommandLine.parse("bash");
PipedOutputStream ps = new PipedOutputStream();
PipedInputStream is = new PipedInputStream(ps);
BufferedWriter os = new BufferedWriter(new OutputStreamWriter(ps));
Executor executor = new DefaultExecutor();
PumpStreamHandler ioh = new PumpStreamHandler(System.out, System.err, is);
executor.setStreamHandler(ioh);
ioh.start();
executor.execute(commandLine, handler);
os.write("export MY_VAR=test");
os.flush();
os.write("echo $MY_VAR");
os.flush();
os.close();
}
private class MyResultHandler extends DefaultExecuteResultHandler {
@Override
public void onProcessComplete(final int exitValue) {
super.onProcessComplete(exitValue);
System.out.println("\nsuccess");
}
@Override
public void onProcessFailed(final ExecuteException e) {
super.onProcessFailed(e);
e.printStackTrace();
}
}
}
But that prints an empty string instead of the word "test". Any clues?
Answering my own question based on feedback from another forum. The trick is to add a newline character at the end of each command, like this:
os.write("export MY_VAR=test\n");
os.flush();
os.write("echo $MY_VAR\n");
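For completeness, the full write sequence then becomes the following; closing the writer sends end-of-input to bash, so the process can exit and onProcessComplete gets called:
os.write("export MY_VAR=test\n");
os.flush();
os.write("echo $MY_VAR\n");
os.flush();
os.close(); // end-of-input: bash exits and the result handler fires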
Related
I'm writing to a file via multiple threads so I am using the following code:
synchronized (this) {
BufferedWriter bw = new BufferedWriter(
new FileWriter(path, true));
bw.write(data);
}
and I'm wondering for educational purposes how I can use some synchronizer (Semaphore, CountDownLatch, CyclicBarrier, Phaser or Exchanger) to essentially achieve the same thing: safe multithreaded writing.
Thanks
Honestly, the much better (nearly canonical) way to write from multiple sources to the same writer is to use a (blocking) queue. This way, all threads can drop their messages on the queue, and a single writer thread can pick the messages off the queue and write them to the file. It's generally easier to implement, as well as more efficient.
-- edit --
Example:
public class MyQueueableWriter implements Runnable {
// Msg is the message type from the original sketch; it is not defined here
private final BlockingQueue<Msg> q = new LinkedBlockingQueue<Msg>();
private final FileOutputStream out;
private volatile boolean running = true;
public MyQueueableWriter(FileOutputStream out) {
this.out = out;
}
public void run() {
try {
while (running) {
Msg m = q.take();
out.write(m.toBytes()); // assumes Msg can serialize itself to bytes
}
} catch (InterruptedException ie) {
Thread.currentThread().interrupt(); // interrupted while waiting: stop writing
} catch (IOException iox) {
// handle or log the write failure
} finally {
try { out.close(); } catch (IOException ignored) { }
}
}
public void addMsg(Msg m) {
q.offer(m); // offer() never blocks on an unbounded LinkedBlockingQueue
}
public void stop() {
// note: a blocked take() only sees this after the next message arrives;
// a "poison pill" sentinel message or an interrupt is the usual fix
running = false;
}
}
Then adding to the queue:
public class EnqueueMsgRunnable implements Runnable {
private final MyQueueableWriter writer;
public EnqueueMsgRunnable(MyQueueableWriter writer) {
this.writer = writer;
}
public void run() {
writer.addMsg(myMessage); // myMessage / myMessage2 are whatever the producer wants logged
writer.addMsg(myMessage2);
}
}
Then just
for (int i = 0; i < numSources; i++) {
EnqueueMsgRunnable r = new EnqueueMsgRunnable(...);
new Thread(r).start();
}
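One piece the sketch above leaves out is starting the writer thread itself. A minimal wiring, assuming the hypothetical Msg type above and an illustrative file name (exception handling omitted), could look like this:
    FileOutputStream out = new FileOutputStream("messages.log", true); // file name is illustrative
    MyQueueableWriter writer = new MyQueueableWriter(out);
    new Thread(writer, "queue-writer").start(); // the single consumer draining the queue
    // ... start the EnqueueMsgRunnable producers, passing them the writer ...
    writer.stop(); // once all producers are done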
I'm using ScheduledThreadPoolExecutor to create a file every fileIntervalInSeconds seconds:
executorService = new ScheduledThreadPoolExecutor(1);
executorService.scheduleAtFixedRate(new Runnable()
{
@Override
public void run()
{
File file = new File(fileName);
if (file.exists())
{
Log.debug("creating new file");
openFileWriter(file);
}
}
}, fileIntervalInSeconds, fileIntervalInSeconds, TimeUnit.SECONDS);
}
private void openFileWriter() throws FileSystemNotificationException
{
// 1 - close the existing writer
writer.close();
// 2 - rename to backup file name
...
// 3 - create new file
FileWriter writerFile = new FileWriter(fileName, true);
writer = new PrintWriter(writerFile);
}
And I'm writing alert messages to the file all the time:
private synchronized void writeLine(String line) throws InterruptedException
{
writer.println(line);
}
My problems are:
How can I ensure that I'm only using the writer while it is not closed (writer.close())?
How can I wait for the ScheduledThreadPoolExecutor to finish creating the file before I start writing?
How about checking that the file exists before you write to it? No need for a background thread or synchronization:
private synchronized void writeLine(String line) {
if (!file.exists())
reopenWritingFile();
writer.println(line);
}
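reopenWritingFile() is not shown there; assuming it mirrors the openFileWriter logic from the question (close, rename to a backup, reopen), a sketch might be:
    private void reopenWritingFile() {
        try {
            if (writer != null) {
                writer.close();                   // 1 - close the existing writer
            }
            // 2 - rename the old file to a backup name (details elided, as in the question)
            writer = new PrintWriter(new FileWriter(fileName, true)); // 3 - create the new file
        } catch (IOException e) {
            // handle or log the failure
        }
    }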
You could simply create the new file every hour within your write method. You would have some slight overhead for the time check, but that should be negligible. The example below will create a new log file every hour, with the time in milliseconds added to the front of the file name. You can format the time however it suits you.
public class LogWriter {
private long lastCreationTime;
PrintWriter writer;
String logFileName;
public LogWriter(String logFileName) {
this.logFileName = logFileName;
createLogFile(logFileName);
}
private void createLogFile(String fileName) {
if(writer != null) {
writer.close();
}
lastCreationTime = System.currentTimeMillis();
FileWriter writerFile;
try {
writerFile = new FileWriter(lastCreationTime + "_" + fileName, true);
writer = new PrintWriter(writerFile);
} catch (IOException e) {
e.printStackTrace();
}
}
private synchronized void writeLine(String line) {
if(lastCreationTime < System.currentTimeMillis() - 3600000) {
createLogFile(logFileName);
}
writer.write(line);
}
}
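Using it is then just a matter of constructing one LogWriter and calling writeLine wherever you log (writeLine would need to be public, or at least package-visible, for that):
    LogWriter log = new LogWriter("app.log");   // file name is illustrative
    log.writeLine("something happened\n");      // rolls over to a new hourly file automatically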
You have two alternatives:
Create the new writer before closing the old one.
Establish a lock before closing the writer which you check when writing.
Examples:
volatile PrintWriter writer;
ReadWriteLock lock = new ReentrantReadWriteLock();
Lock writeLock = lock.writeLock();
Lock readLock = lock.readLock();
private void openFileWriterWithLock() throws IOException {
if (writeLock.tryLock()) {
try {
// 1 - close the existing writer
writer.close();
// 2 - rename to backup file name
//...
// 3 - create new file
FileWriter writerFile = new FileWriter(fileName, true);
writer = new PrintWriter(writerFile);
} finally {
writeLock.unlock();
}
}
}
private synchronized void writeLineWithLock(String line) throws InterruptedException {
readLock.lock();
try {
writer.println(line);
} finally {
readLock.unlock();
}
}
private void openFileWriterWithoutLock() throws IOException {
// 0. Note old file.
PrintWriter oldWriter = writer;
// 1. Create new file.
FileWriter writerFile = new FileWriter(fileName, true);
// 2. Swap the new one in.
writer = new PrintWriter(writerFile);
// 3 - close old writer
oldWriter.close();
}
private synchronized void writeLineWithoutLock(String line) throws InterruptedException {
writer.println(line);
}
How about having a separate thread that handles the logging instead of that rather complicated construct?
public class Logger extends Thread {
private final LinkedBlockingQueue<String> linesToWrite = new LinkedBlockingQueue<>();
private final String filename;
public Logger(String filename) {
super("Logging thread");
this.filename = filename;
this.setDaemon(true);
this.setPriority(Thread.MIN_PRIORITY);
}
@Override
public void run() {
try (BufferedWriter out = new BufferedWriter(new FileWriter(filename, true))) {
String line;
while (!this.isInterrupted()) {
line = linesToWrite.take();
out.write(line);
out.newLine();
out.flush();
}
} catch (InterruptedException e) {
// expected when interrupt() is used to stop the logger: fall through so the try-with-resources closes the file
} catch (IOException ex) {
System.out.println("Failed to access log file: " + ex);
}
}
public void log(final String line) {
this.linesToWrite.add(line);
}
}
Then initialize the logger once:
final Logger logger = new Logger("test.log");
logger.start();
And then you can use it from anywhere in a thread-safe way like this:
logger.log("Test message");
You don't need to stop the logger, because Java will ensure with the try construct that the file is properly closed. However if you want, you can stop it like this:
logger.interrupt();
Now you can do all file manipulation in a single-threaded way, because there is only one thread accessing the log files ever at any time.
I'm trying to execute a terminal command in Linux through Java and I can't get any output from the InputStream.
This is my code:
ProcessBuilder build = new ProcessBuilder("/usr/bin/xterm", "find /home");
Process pr = null;
BufferedReader buf;
try {
build.redirectErrorStream(true);
pr = build.start();
buf = new BufferedReader(new InputStreamReader( pr.getInputStream()));
String line = buf.readLine();
pr.waitFor();
while (true) {
System.out.println(line + "sadasdas");
line = buf.readLine();
}
} catch (Exception e) {
e.printStackTrace();
}
The process is executed and the terminal closes immediately, and no output is caught or printed. On the other hand, if I compose an unknown command, I get all the lines with tips on how to use the commands. I had the same problem with the Windows cmd. I also tried the Runtime.getRuntime().exec(cmd) method, but the result is the same.
I've also tried to create separate threads for the process and the reader, which looks like this:
public class kurdee
{
public static Thread thread;
public kurdee()
{
List cmd = new LinkedList();
cmd.add(new String("/usr/bin/xterm"));
cmd.add(new String("find"));
thisProc thispr = new thisProc(cmd);
this.thread = new Thread(thispr);
thread.start();
reader rd = new reader(thispr.proc);
Thread thread1 = new Thread(rd);
thread1.start();}
public static void main(String args[])
{
java.awt.EventQueue.invokeLater(new Runnable() {
public void run() {
kurdee kurd = new kurdee();
}
});
}
}
class reader implements Runnable
{
private BufferedReader buf;
private Process proc;
public reader(Process proc)
{
this.proc=proc;
this.buf = new BufferedReader(new InputStreamReader(proc.getInputStream()));
}
public void run()
{
String line="";
System.out.println("Thread is alive");
try{
//Thread.sleep(1000);
line = buf.readLine();
}catch(Exception ex){System.out.println(ex + " before first while started");}
while(kurdee.thread.isAlive())
{
System.out.println("Thread is alive");
while(line!=null)
{
try{
//System.out.println(proc.exitValue());
System.out.println(line + " asd");
line=buf.readLine();
}catch(Exception e){System.out.println(e + " Inner while loop");}
}
}
}
}
class thisProc implements Runnable
{
private ProcessBuilder build;
public static Process proc=null;
public thisProc(List<String> args)
{
this.build = new ProcessBuilder(args);
build.redirectErrorStream(true);
try{
this.proc = build.start();
}catch(Exception ex){System.out.println(ex + " proc class");}
}
public void run()
{
try{
proc.waitFor();
}catch(Exception ex){System.out.println(ex + " proc class");}
}
}
But with any combination of invoking threads etc. that I have tried, there is still nothing to read.
I'm trying to use the command "find /home -xdev -samefile file" to get all hard links to a file, so maybe there is an easier way.
xterm is not the way to execute processes in Unix; it is not a shell. A shell is something like "/bin/sh". However, "find" is a normal Unix executable, so you should just execute that directly, e.g. new ProcessBuilder("find", "/home"). And yes, you should always process the streams on separate threads, as recommended by this article.
First, don't try to execute the command with xterm; that's pointless. Just run it directly. Second, be careful when you compose your array of command strings to put one word into each string; passing, for example, "find /home" as a single string among many to ProcessBuilder is going to error out.
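Putting those two points together, a minimal sketch of running find directly and reading its output (the path and the -samefile arguments are illustrative):
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class FindHardLinks {
    public static void main(String[] args) throws Exception {
        // one argument per string, and no xterm in between
        ProcessBuilder builder = new ProcessBuilder("find", "/home", "-xdev", "-samefile", "somefile");
        builder.redirectErrorStream(true); // merge stderr into stdout so one reader is enough
        Process process = builder.start();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(process.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) { // read before waitFor() so the pipe never fills up
                System.out.println(line);
            }
        }
        int exitCode = process.waitFor();
        System.out.println("find exited with " + exitCode);
    }
}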
How do I know when a program is done writing a file, if I am executing that program from Java? For example, I am executing geniatagger.exe with an input file RawText that will produce an output file TAGGEDTEXT.txt. When geniatagger.exe has finished writing the TAGGEDTEXT.txt file, I can do some other stuff with it. The problem is: how can I know that geniatagger has finished writing the text file?
try{
Runtime rt = Runtime.getRuntime();
Process p = rt.exec("geniatagger.exe -i "+ RawText+ " -o TAGGEDTEXT.txt");
}
You can't, or at least not reliably.
In this particular case your best bet is to watch the Process complete.
You get the process's return code as a bonus; this could tell you whether an error occurred.
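A minimal sketch of that idea, using ProcessBuilder with the command from the question (and inheritIO() so unread output cannot block the process), assuming the tagger really does exit once its output file is complete:
    ProcessBuilder pb = new ProcessBuilder("geniatagger.exe", "-i", RawText, "-o", "TAGGEDTEXT.txt");
    pb.inheritIO();                 // output and errors go to the console, so the process cannot block on full pipes
    Process p = pb.start();
    int exitCode = p.waitFor();     // returns only after geniatagger has exited
    if (exitCode == 0) {
        // TAGGEDTEXT.txt is complete at this point; safe to process it
    }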
If you are actually talking about this GENIA tagger, below is a practical example which demonstrates various points (see the explanation of the numbered comments beneath the code). The code was tested with v1.0 for Linux and demonstrates how to safely run a process which expects both input and output stream piping to work correctly.
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.util.concurrent.Callable;
import org.apache.commons.io.IOUtils;
public class GeniaTagger {
/**
* @param args
*/
public static void main(String[] args) {
tagFile(new File("inputText.txt"), new File("outputText.txt"));
}
public static void tagFile(File input, File output) {
FileInputStream ifs = null;
FileOutputStream ofs = null;
try {
ifs = new FileInputStream(input);
ofs = new FileOutputStream(output);
final FileInputStream ifsRef = ifs;
final FileOutputStream ofsRef = ofs;
// {1}
ProcessBuilder pb = new ProcessBuilder("geniatagger.exe");
final Process pr = pb.start();
// {2}
runInThread(new Callable<Void>() {
public Void call() throws Exception {
IOUtils.copy(ifsRef, pr.getOutputStream());
IOUtils.closeQuietly(pr.getOutputStream()); // {3}
return null;
}
});
runInThread(new Callable<Void>() {
public Void call() throws Exception {
IOUtils.copy(pr.getInputStream(), ofsRef); // {4}
return null;
}
});
runInThread(new Callable<Void>() {
public Void call() throws Exception {
IOUtils.copy(pr.getErrorStream(), System.err);
return null;
}
});
// {5}
pr.waitFor();
// output file is written at this point.
} catch (Exception e) {
e.printStackTrace();
} finally {
// {6}
IOUtils.closeQuietly(ifs);
IOUtils.closeQuietly(ofs);
}
}
public static void runInThread(final Callable<?> c) {
new Thread() {
public void run() {
try {
c.call();
} catch (Exception e) {
e.printStackTrace();
} finally {
}
}
}.start();
}
}
{1} Use a ProcessBuilder to start your process; it has a better interface than plain old Runtime.getRuntime().exec(...).
{2} Set up stream piping in different threads, otherwise the waitFor() call in {5} might never complete.
{3} Note that I piped a FileInputStream to the process. According to the afore-mentioned GENIA page, this command expects actual input on stdin instead of a -i parameter. The OutputStream which connects to the process must be closed, otherwise the program will keep running!
{4} Copy the result of the process to a FileOutputStream, the result file you are waiting for.
{5} Let the main thread wait until the process completes.
{6} Clean up all streams.
If the program exits after generating the output file then you can call Process.waitFor() to let it run to completion then you can process the file. Note that you will likely have to drain both the standard output and error streams (at least on Windows) for the process to finish.
[Edit]
Here is an example, untested and likely fraught with problems:
// ...
Process p = rt.exec("geniatagger.exe -i "+ RawText+ " -o TAGGEDTEXT.txt");
drain(p.getInputStream());
drain(p.getErrorStream());
int exitCode = p.waitFor();
// Now you should be able to process the output file.
}
private static void drain(InputStream in) throws IOException {
while (in.read() != -1);
}
I have a Java program that outputs some text to the console. It uses print, println, and some other methods to do this.
At the end of the program, I want to read all the text in the console and copy it into a String buffer. How could I do this in Java? I need to read stdout and stderr separately.
OK, this was a fun problem. There doesn't seem to be an elegant way of solving it for all PrintStream methods at once. (Unfortunately there is no FilterPrintStream.)
I did write up an ugly reflection-based workaround though (not to be used in production code I suppose :)
class LoggedPrintStream extends PrintStream {
final StringBuilder buf;
final PrintStream underlying;
LoggedPrintStream(StringBuilder sb, OutputStream os, PrintStream ul) {
super(os);
this.buf = sb;
this.underlying = ul;
}
public static LoggedPrintStream create(PrintStream toLog) {
try {
final StringBuilder sb = new StringBuilder();
Field f = FilterOutputStream.class.getDeclaredField("out");
f.setAccessible(true);
OutputStream psout = (OutputStream) f.get(toLog);
return new LoggedPrintStream(sb, new FilterOutputStream(psout) {
public void write(int b) throws IOException {
super.write(b);
sb.append((char) b);
}
}, toLog);
} catch (NoSuchFieldException shouldNotHappen) {
} catch (IllegalArgumentException shouldNotHappen) {
} catch (IllegalAccessException shouldNotHappen) {
}
return null;
}
}
...that can be used like this:
public class Test {
public static void main(String[] args) {
// Create logged PrintStreams
LoggedPrintStream lpsOut = LoggedPrintStream.create(System.out);
LoggedPrintStream lpsErr = LoggedPrintStream.create(System.err);
// Set them to stdout / stderr
System.setOut(lpsOut);
System.setErr(lpsErr);
// Print some stuff
System.out.print("hello ");
System.out.println(5);
System.out.flush();
System.err.println("Some error");
System.err.flush();
// Restore System.out / System.err
System.setOut(lpsOut.underlying);
System.setErr(lpsErr.underlying);
// Print the logged output
System.out.println("----- Log for System.out: -----\n" + lpsOut.buf);
System.out.println("----- Log for System.err: -----\n" + lpsErr.buf);
}
}
Resulting output:
hello 5
Some error
----- Log for System.out: -----
hello 5
----- Log for System.err: -----
Some error
(Note though, that the out field in FilterOutputStream is protected and documented, so it is part of the API :-)
You can't do that once the program is finished running. You need to do it before the program starts to write output.
See this article (archive.org) for details on how to replace stdout and stderr. The core calls are System.setOut() and System.setErr().
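A minimal sketch of that approach for stdout (repeat with System.setErr() for stderr), capturing into a ByteArrayOutputStream and restoring the original stream afterwards:
    PrintStream originalOut = System.out;
    ByteArrayOutputStream captured = new ByteArrayOutputStream();
    System.setOut(new PrintStream(captured, true)); // autoflush so nothing stays buffered
    try {
        System.out.println("hello");                // ends up in the buffer
    } finally {
        System.setOut(originalOut);                 // always restore the real stdout
    }
    String output = captured.toString();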
You can use PipedInputStream and PipedOutputStream.
// create pairs of piped input and output streams for stdout and stderr
final PipedInputStream outPipedInputStream = new PipedInputStream();
final PrintStream outPrintStream = new PrintStream(new PipedOutputStream(
outPipedInputStream));
final BufferedReader outReader = new BufferedReader(
new InputStreamReader(outPipedInputStream));
final PipedInputStream errPipedInputStream = new PipedInputStream();
final PrintStream errPrintStream = new PrintStream(new PipedOutputStream(
errPipedInputStream));
final BufferedReader errReader = new BufferedReader(
new InputStreamReader(errPipedInputStream));
final PrintStream originalOutStream = System.out;
final PrintStream originalErrStream = System.err;
final Thread writingThread = new Thread(new Runnable() {
@Override
public void run() {
try {
System.setOut(outPrintStream);
System.setErr(errPrintStream);
// You could also set the System.in here using a
// PipedInputStream
DoSomething();
// Even better would be to refactor DoSomething to accept
// PrintStream objects as parameters to replace all uses of
// System.out and System.err. DoSomething could also have
// an overload with DoSomething() calling:
DoSomething(outPrintStream, errPrintStream);
} finally {
// may also want to add a catch for exceptions but it is
// essential to restore the original System output and error
// streams since it can be very confusing to not be able to
// find System.out output on your console
System.setOut(originalOutStream);
System.setErr(originalErrStream);
//You must close the streams which will auto flush them
outPrintStream.close();
errPrintStream.close();
}
} // end run()
}); // end writing thread
//Start the code that will write into streams
writingThread.start();
String line;
final List<String> completeOutputStreamContent = new ArrayList<String>();
while ((line = outReader.readLine()) != null) {
completeOutputStreamContent.add(line);
} // end reading output stream
final List<String> completeErrorStreamContent = new ArrayList<String>();
while ((line = errReader.readLine()) != null) {
completeErrorStreamContent.add(line);
} // end reading output stream
Here is a utility Class named ConsoleOutputCapturer. It allows the output to go to the existing console however behind the scene keeps capturing the output text. You can control what to capture with the start/stop methods. In other words call start to start capturing the console output and once you are done capturing you can call the stop method which returns a String value holding the console output for the time window between start-stop calls. This class is not thread-safe though.
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.io.PrintStream;
import java.util.Arrays;
import java.util.List;
public class ConsoleOutputCapturer {
private ByteArrayOutputStream baos;
private PrintStream previous;
private boolean capturing;
public void start() {
if (capturing) {
return;
}
capturing = true;
previous = System.out;
baos = new ByteArrayOutputStream();
OutputStream outputStreamCombiner =
new OutputStreamCombiner(Arrays.asList(previous, baos));
PrintStream custom = new PrintStream(outputStreamCombiner);
System.setOut(custom);
}
public String stop() {
if (!capturing) {
return "";
}
System.setOut(previous);
String capturedValue = baos.toString();
baos = null;
previous = null;
capturing = false;
return capturedValue;
}
private static class OutputStreamCombiner extends OutputStream {
private List<OutputStream> outputStreams;
public OutputStreamCombiner(List<OutputStream> outputStreams) {
this.outputStreams = outputStreams;
}
public void write(int b) throws IOException {
for (OutputStream os : outputStreams) {
os.write(b);
}
}
public void flush() throws IOException {
for (OutputStream os : outputStreams) {
os.flush();
}
}
public void close() throws IOException {
for (OutputStream os : outputStreams) {
os.close();
}
}
}
}
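Usage is a start/stop pair around the code whose output you want to capture:
    ConsoleOutputCapturer capturer = new ConsoleOutputCapturer();
    capturer.start();                  // output still reaches the console, but is also recorded
    System.out.println("captured line");
    String text = capturer.stop();     // restores System.out and returns everything printed in between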
Don't do it afterwards; create two StringBuilder objects before the first System.out.print() gets called, and just append every string you want to save to the appropriate StringBuilder.
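A minimal sketch of that suggestion, shown for stdout only, with a made-up helper name for illustration:
    public class CapturedOutput {
        private static final StringBuilder OUT_LOG = new StringBuilder();

        // hypothetical helper: call this instead of System.out.println directly
        static void say(String s) {
            OUT_LOG.append(s).append(System.lineSeparator());
            System.out.println(s);
        }

        public static void main(String[] args) {
            say("hello");
            System.out.print("Captured: " + OUT_LOG);
        }
    }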
These two lines of code will put your output into a text file, or you can change the destination as you require.
// Redirect System.out to a file (this also creates the file):
System.setOut(new PrintStream(new FileOutputStream("D:/MyOutputFile.txt")));
// Anything printed now goes to that file:
System.out.println("Hello to custom output stream!");
Hope this helps. :)