I have an application that prints messages from Test.exe to the console. My Java program creates a process by executing this Test.exe and prints the messages by reading from that process's input stream.
The problem that I am facing is this. I have two scenarios:
1) When I double-click Test.exe, the messages ("Printing : %d") are printed once per second.
2) But when I run my Java application, all of the messages are printed at the end, just before Test.exe terminates (not one per second). If the .exe has a very large amount of output, it prints those messages whenever the buffer becomes full and gets flushed.
How can I get the messages printed as in the first case?
Help from anyone would be appreciated. :)
Here is the code for this Test.exe.
#include <stdio.h>
#include <windows.h>

int main(void)
{
    int i = 0;
    while (1)
    {
        Sleep(500);
        printf("\nPrinting : %d", i);
        i++;
        if (i == 10)
        //if (i == 100)
        {
            return 0;
        }
    }
}
And my Java application is below:
import java.io.BufferedInputStream;
import java.io.IOException;
import java.io.InputStream;

public class MainClass {
    public static void main(String[] args) {
        String str = "G:\\Charan\\Test\\Debug\\Test.exe";
        try {
            Process testProcess = Runtime.getRuntime().exec(str);
            InputStream inputStream = new BufferedInputStream(
                    testProcess.getInputStream());
            int read = 0;
            byte[] bytes = new byte[1000];
            String text;
            while (read >= 0) {
                if (inputStream.available() > 0) {
                    read = inputStream.read(bytes);
                    if (read > 0) {
                        text = new String(bytes, 0, read);
                        System.out.println(text);
                    }
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
Is the reverse direction possible as well? If I type some text in the console, Java should read it and pass that String on to the .exe (or testProcess). How can the .exe scan something coming from the Java program?
Could anyone help me?
Given that you're trying to print stdout from that process line by line, I would create a BufferedReader object using the process's input stream and call its readLine() method. You can build such a BufferedReader with the following chain of constructors:
BufferedReader testProcessReader = new BufferedReader(new InputStreamReader(testProcess.getInputStream()));
And to read line by line:
String line;
while ((line = testProcessReader.readLine()) != null) {
    System.out.println(line);
}
The assumption here is that Test.exe is flushing its output, which is required for the Java side to have anything to read. You can flush the output from C by calling fflush(stdout) after every call to printf().
If you don't flush, the data only lives in a buffer. It is a performance trade-off: how often you want the data to actually be written vs. how many write/flush operations you want to save. If performance is critical, you can look into a more efficient inter-process communication mechanism to pass data between the processes instead of stdout. Since you are on Windows, a first step might be to take a look at the Microsoft IPC help page.
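For the reverse direction asked about in the question (Java sending text to the .exe), here is a minimal sketch along the same lines; it assumes the .exe actually reads from its stdin (for example with scanf or fgets), which the Test.exe shown above does not do:
BufferedWriter testProcessWriter = new BufferedWriter(
        new OutputStreamWriter(testProcess.getOutputStream()));
BufferedReader consoleReader = new BufferedReader(new InputStreamReader(System.in));

String input = consoleReader.readLine();   // read one line typed into the Java console
testProcessWriter.write(input);
testProcessWriter.newLine();               // the C side can pick this up with fgets()/scanf()
testProcessWriter.flush();                 // without flush() the text may never leave the buffer
The same buffering caveat applies in this direction too: without the flush() call the bytes may sit in the writer's buffer instead of reaching the child process.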
This seems to have something to do with not flushing, and I guess it's on both sides: the C library you use appears to flush output automatically only when writing to a terminal. Flush manually after calling printf.
On the Java side, try reading from a non-buffered stream.
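As a rough sketch of that last point (my reading of the suggestion, not tested code), you could drop the BufferedInputStream and copy the raw process stream straight to the console:
InputStream rawStream = testProcess.getInputStream();   // no BufferedInputStream wrapper
int b;
while ((b = rawStream.read()) != -1) {
    System.out.write(b);       // pass every byte straight through
    System.out.flush();
}
// Note: this only helps once Test.exe actually flushes on its side.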
I'm developing IDE-like software for C/C++ in Java (there are lots of IDEs available already, but I want my own) that can compile and execute a C or C++ program. So I tried a simple program that compiles and executes a C program in Java using Process and ProcessBuilder.
Here is my simple Java program, which compiles and executes the C program:
import java.io.IOException;
import java.io.InputStream;

public class RunProgram {
    public static void main(String[] args) throws Exception {
        // Compile the source file using gcc and wait for the compilation to finish
        new ProcessBuilder("gcc", "-o", "first", "first.c").start().waitFor();
        /*
         Although I have to handle the error stream, for now my
         assumption is that there is no error in the program.
        */
        ProcessBuilder run = new ProcessBuilder("./first");
        run.redirectErrorStream(true);
        Process runProcess = run.start();
        StreamReader sr = new StreamReader(runProcess.getInputStream());
        new Thread(sr).start(); // A new thread to handle the output of the program.
        // Rest of the code provides input via the OutputStream of 'runProcess' and closes the stream.
    }
}
class StreamReader implements Runnable {
    private InputStream reader;

    public StreamReader(InputStream inStream) {
        reader = inStream;
    }

    @Override
    public void run() {
        byte[] buf = new byte[1024];
        int size = 0;
        try {
            while ((size = reader.read(buf)) != -1) {
                System.out.println(new String(buf, 0, size)); // only convert the bytes actually read
            }
            reader.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
And here is my first.c program.
#include <stdio.h>

int main() {
    int a;
    int k;
    printf("input a: ");
    scanf("%d", &a);
    for (k = 0; k < a; k++)
        printf("k = %d\n", k);
    return 0;
}
I want to create an interactive I/O console just like most IDEs or command terminals (Terminal on Linux-based OSes, Command Prompt on Windows). For the example above it should first print "input a: ", then wait for input to be provided, and then run the rest of the program. But it doesn't work the way I expected: it doesn't print the result of the printf statement that appears before scanf until I provide the input, possibly via the OutputStream.
I googled my problem and visited many links, but didn't find a solution. Meanwhile, I found this link which suggests appending fflush after every printf statement, or using setbuf or setvbuf (from some other sub-links) to disable the buffer. But a new person who is just learning C might not be aware of fflush or these functions and will never use them, since they aren't required in other IDEs or even on terminals.
How can I solve this problem and build an integrated console for my IDE?
From the comments above, I think adding a little explanation of how buffering for I/O streams works makes sense here.
What happens behind the scenes when you call printf(3) and the like is that the data is written to a buffer until the buffer fills up or some trigger occurs; the content of the buffer is then copied from the buffer to the actual output device (or to another output buffer).
The trigger is usually encountering a line end (\n under Linux/Unix) when the stream is connected to a terminal.
Thus a crude version of this buffering is:
struct buffered_file_t {
    char* buffer;
    size_t capacity;
    size_t current_char;
    FILE* file;
};

void flush_buffered(struct buffered_file_t* file) {
    assert(0 != file);
    assert(0 != file->buffer);
    fwrite(file->buffer, file->current_char, 1, file->file);
    file->current_char = 0;
}

void print(struct buffered_file_t* file, const char* str) {
    assert(0 != file);
    assert(0 != file->buffer);
    assert(0 != str);
    for (size_t i = 0; 0 != str[i]; ++i) {
        if (file->current_char >= file->capacity - 1) flush_buffered(file);
        file->buffer[file->current_char++] = str[i];
        if ('\n' == str[i]) flush_buffered(file);
    }
}
Now, if you invoke print like
const size_t BUFSIZE = 100;
struct buffered_file_t stdout_buffered = {
    .buffer = calloc(1, BUFSIZE),
    .capacity = BUFSIZE,
    .current_char = 0,
    .file = stdout,
};

print(&stdout_buffered, "Naglfar\n");
print(&stdout_buffered, "Surthur");
You won't ever see Surthur appear on stdout.
In order to have it written from the buffer to stdout, you have to either
call flush_buffered explicitly, or
disable buffering by reducing the buffer size (buffered_file.capacity = 1 in the example above).
In your case, you cannot invoke fflush(3) explicitly (you stated that as a requirement), so the only means left is disabling buffering.
How to do this is OS dependent, IMHO.
For Linux, look at stdbuf(1) from the Coreutils package to find out how to disable buffering for certain streams of foreign processes.
Under GNU/Linux, to switch off buffering for the standard I/O streams, you could use stdbuf(1) like so:
....
ProcessBuilder run = new ProcessBuilder("stdbuf", "-o0", "./first");
....
Add the -e0 and -i0 options if you want to turn off buffering for stderr and stdin as well.
Of course, it would be nicer if you did not have to rely on external tools and could switch off buffering in your own code. The simplest thing is to have a look at the source of stdbuf, but I guess that would end up with you having to use the JNI, and then I guess I would just stick with stdbuf...
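Once buffering is out of the way, the interactive console itself boils down to two copies running at the same time: one thread pumps the child's output to the IDE console, while the main thread forwards what the user types to the child's stdin. A rough sketch of that wiring, reusing the StreamReader class from the question and assuming stdbuf is available on the PATH, could look like this:
ProcessBuilder run = new ProcessBuilder("stdbuf", "-o0", "./first");
run.redirectErrorStream(true);              // merge stderr into stdout for simplicity
Process runProcess = run.start();

new Thread(new StreamReader(runProcess.getInputStream())).start();  // child -> console

// console -> child: forward every line the user types to the child's stdin
try (BufferedWriter toChild = new BufferedWriter(
         new OutputStreamWriter(runProcess.getOutputStream()));
     BufferedReader fromUser = new BufferedReader(new InputStreamReader(System.in))) {
    String line;
    while ((line = fromUser.readLine()) != null && runProcess.isAlive()) {
        toChild.write(line);
        toChild.newLine();
        toChild.flush();                    // push the input through immediately
    }
}
This is only a sketch of the plumbing; a real IDE console would also need to handle process exit, errors, and encoding more carefully.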
I'm trying to write a curl-like program in Java that uses only Java socket programming (and not Apache HttpClient or any other APIs).
I want to have the option of showing either the whole response or only the body of the response to my GET request. Currently I came up with the following code:
BufferedReader br = new BufferedReader(new InputStreamReader(s.getInputStream()));
String t;
while ((t = br.readLine()) != null) {
    if (t.isEmpty() && !parameters.isVerbose()) {
        StringBuilder responseData = new StringBuilder();
        while ((t = br.readLine()) != null) {
            responseData.append(t).append("\r\n");
        }
        System.out.println(responseData.toString());
        parameters.verbose = false;
        break;
    } else if (parameters.isVerbose()) { // handle output
        System.out.println(t);
    }
}
br.close();
When the verbose option is on, it works quickly and shows the whole response in less than a second. But when I want just the body of the message, it takes much longer (approx. 10 seconds) to produce it.
Does anyone know how it can be processed faster?
Thank you.
I'm going to assume that by slow you mean it starts displaying something almost immediately but keeps printing lines for a long time. Writing to the console takes time, and you're printing each line individually, while in the other code path you first store the entire response in memory and then flush it to the console in one go.
If the verbose response is small enough to fit in memory, you should do the same; otherwise you can decide on an arbitrary number of lines to print in batches (i.e. you accumulate n lines in memory, flush them to the console, clear the StringBuilder and repeat).
The most elegant way to implement my suggestion is to use a PrintStream wrapping a BufferedOutputStream, itself wrapping System.out. All my comments and advice are condensed in the following snippet:
private static final int BUFFER_SIZE = 4096;

public static void printResponse(Socket socket, Parameters parameters) throws IOException {
    try (BufferedReader br = new BufferedReader(new InputStreamReader(socket.getInputStream()));
         PrintStream printStream = new PrintStream(new BufferedOutputStream(System.out, BUFFER_SIZE))) {
        // there is no functional difference in your code between the verbose and non-verbose code paths
        // (they have the same output). That's a bug, but I'm not fixing it in my snippet as I don't know
        // what you intended to do.
        br.lines().forEach(line -> printStream.append(line).append("\r\n"));
    }
}
If it uses any language construct you don't know about, feel free to ask further questions.
A part of my application writes data to a .csv file in the following way:
public class ExampleWriter {

    public static final int COUNT = 10_000;
    public static final String FILE = "test.csv";

    public static void main(String[] args) throws Exception {
        try (OutputStream os = new FileOutputStream(FILE)) {
            os.write(239);
            os.write(187);
            os.write(191);
            BufferedWriter writer = new BufferedWriter(new OutputStreamWriter(os, StandardCharsets.UTF_8));
            for (int i = 0; i < COUNT; i++) {
                writer.write(Integer.toString(i));
                writer.newLine();
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
        System.out.println(checkLineCount(COUNT, new File(FILE)));
    }

    public static String checkLineCount(int expectedLineCount, File file) throws Exception {
        BufferedReader expectedReader = new BufferedReader(new FileReader(file));
        try {
            int lineCount = 0;
            while (expectedReader.readLine() != null) {
                lineCount++;
            }
            if (expectedLineCount == lineCount) {
                return "correct";
            } else {
                return "incorrect";
            }
        } finally {
            expectedReader.close();
        }
    }
}
The file will be opened in Excel, and all kinds of languages are present in the data. The os.write calls prefix the file with a byte order mark so that all kinds of characters can be represented.
Somehow the number of lines in the file does not match the count in the loop, and I cannot figure out why. Any help on what I am doing wrong here would be greatly appreciated.
You simply need to flush and close your output stream before opening the file for input and counting the lines. Try adding:
writer.flush();
writer.close();
inside your try block, after the for loop in the main method.
(As a side note.)
Note that using a BOM is optional and in many cases reduces the portability of your files (because not all consuming apps are able to handle it well). It does not guarantee that the file has the advertised character encoding. So I would recommend removing the BOM. When using Excel, just select the file and choose UTF-8 as the encoding.
You are not flushing the stream. Refer to the Oracle docs for more info, which say:
Flushes this output stream and forces any buffered output bytes to be
written out. The general contract of flush is that calling it is an
indication that, if any bytes previously written have been buffered by
the implementation of the output stream, such bytes should immediately
be written to their intended destination. If the intended destination
of this stream is an abstraction provided by the underlying operating
system, for example a file, then flushing the stream guarantees only
that bytes previously written to the stream are passed to the
operating system for writing; it does not guarantee that they are
actually written to a physical device such as a disk drive.
The flush method of OutputStream does nothing.
You need to flush as well as close the stream. There are two ways:
manually call close() and flush(), or
use try-with-resources.
As I can see from your code, you have already used try-with-resources, and since BufferedWriter also implements Closeable and Flushable, use code like the following:
public static void main(String[] args) throws Exception {
    try (OutputStream os = new FileOutputStream(FILE);
         BufferedWriter writer = new BufferedWriter(new OutputStreamWriter(os, StandardCharsets.UTF_8))) {
        os.write(239);
        os.write(187);
        os.write(191);
        for (int i = 0; i < COUNT; i++) {
            writer.write(Integer.toString(i));
            writer.newLine();
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
    System.out.println(checkLineCount(COUNT, new File(FILE)));
}
When COUNT is 1, the code in main() will write a file with two lines, a line with data plus an empty line afterwards. Then you call checkLineCount(COUNT, file) expecting that it will return 1 but it returns 2 because the file has actually two lines.
Therefore if you want the counter to match you must not write a new line after the last line.
(As another side note.)
Notice that writing CSV files the way you are doing it is really bad practice. CSV is not as easy as it may look at first sight! So, unless you really know what you are doing (i.e. you are aware of all the CSV quirks), use a library!
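For illustration only (the answer above doesn't name a particular library, so the choice of Apache Commons CSV here is my own), the same data could be written roughly like this:
// requires org.apache.commons.csv (CSVFormat, CSVPrinter) plus the usual java.io/java.nio.charset imports
try (Writer out = new OutputStreamWriter(new FileOutputStream("test.csv"), StandardCharsets.UTF_8);
     CSVPrinter printer = new CSVPrinter(out, CSVFormat.DEFAULT)) {
    for (int i = 0; i < COUNT; i++) {
        printer.printRecord(i);   // quoting and escaping are handled by the library
    }
}   // try-with-resources flushes and closes the underlying writer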
I want to use an external tool while extracting some data (looping through lines).
For that I first used Runtime.getRuntime().exec() to execute it.
But then my extraction got really slow. So I am searching for a way to execute the external tool in each iteration of the loop while reusing the same instance of the shell.
I found out that I should use ProcessBuilder. But it's not working yet.
Here is my code to test the execution (already incorporating input from answers here in the forum):
public class ExecuteShell {

    ProcessBuilder builder;
    Process process = null;
    BufferedWriter process_stdin;
    BufferedReader reader, errReader;

    public ExecuteShell() {
        String command;
        command = getShellCommandForOperatingSystem();
        if (command.equals("")) {
            return; // Error! No error handling yet
        }
        // init shell
        builder = new ProcessBuilder(command);
        builder.redirectErrorStream(true);
        try {
            process = builder.start();
        } catch (IOException e) {
            System.out.println(e);
        }
        // get stdout of shell
        reader = new BufferedReader(new InputStreamReader(process.getInputStream()));
        errReader = new BufferedReader(new InputStreamReader(process.getErrorStream()));
        // get stdin of shell
        process_stdin = new BufferedWriter(new OutputStreamWriter(process.getOutputStream()));
        System.out.println("ExecuteShell: Constructor successfully finished");
    }
    public String executeCommand(String commands) {
        StringBuffer output;
        String line;
        try {
            // single execution
            process_stdin.write(commands);
            process_stdin.newLine();
            process_stdin.flush();
        } catch (IOException e) {
            System.out.println(e);
        }
        output = new StringBuffer();
        line = "";
        try {
            if (!reader.ready()) {
                output.append("Reader empty \n");
                return output.toString();
            }
            while ((line = reader.readLine()) != null) {
                output.append(line + "\n");
                return output.toString();
            }
            if (!reader.ready()) {
                output.append("errReader empty \n");
                return output.toString();
            }
            while ((line = errReader.readLine()) != null) {
                output.append(line + "\n");
            }
        } catch (Exception e) {
            System.out.println("ExecuteShell: error in executeShell2File");
            e.printStackTrace();
            return "";
        }
        return output.toString();
    }
    public int close() {
        // finally close the shell by executing the exit command
        try {
            process_stdin.write("exit");
            process_stdin.newLine();
            process_stdin.flush();
        } catch (IOException e) {
            System.out.println(e);
            return 1;
        }
        return 0;
    }

    private static String getShellCommandForOperatingSystem() {
        Properties prop = System.getProperties();
        String os = prop.getProperty("os.name");
        if (os.startsWith("Windows")) {
            //System.out.println("WINDOWS!");
            return "C:/cygwin64/bin/bash";
        } else if (os.startsWith("Linux")) {
            //System.out.println("Linux!");
            return "/bin/sh";
        }
        return "";
    }
}
I want to call it from another class, like this test class:
public class TestExec {
    public static void main(String[] args) {
        String result = "";
        ExecuteShell es = new ExecuteShell();
        for (int i = 0; i < 5; i++) {
            // do something
            result = es.executeCommand("date"); // execute some command
            System.out.println("result:\n" + result); // do something with the result
            // do something
        }
        es.close();
    }
}
My problem is that the output stream is always empty:
ExecuteShell: Constructor successfully finished
result:
Reader empty
result:
Reader empty
result:
Reader empty
result:
Reader empty
result:
Reader empty
I read the thread here: Java Process with Input/Output Stream
But the code snippets there were not enough to get me going; I am missing something. I have not really worked with multiple threads much, and I am not sure if or how a Scanner would be of any help to me. I would really appreciate some help.
Ultimately, my goal is to call an external command repeatedly and make it fast.
EDIT:
I changed the loop so that es.close() is outside it. And I wanted to add that this call is not the only thing I want inside the loop.
EDIT:
The problem with the timing was that the command I called caused an error. When the command does not cause an error, the timing is acceptable.
Thank you for your answers
You are probably experiencing a race condition: after writing the command to the shell, your Java program continues to run and almost immediately calls reader.ready(). The command you wanted to execute has probably not yet produced any output, so the reader has no data available. An alternative explanation would be that the command does not write anything to stdout, only to stderr (or the shell may have failed to start the command). In practice, however, you never read from stderr anyway.
To properly handle the output and error streams, you cannot check reader.ready(); you need to call readLine() (which waits until data is available) in a loop. With your code, even if the program got to that point, you would read exactly one line of the output; if the program produced more than one line, that data would be interpreted as the output of the next command. The typical solution is to read in a loop until readLine() returns null, but that does not work here, because it would mean your program waits in the loop until the shell terminates, which never happens, so it would just hang indefinitely.
Fixing this would be pretty much impossible unless you know exactly how many lines each command will write to stdout and stderr.
However, your complicated approach of starting a shell and sending commands to it is probably completely unnecessary. Starting a command directly from your Java program is just as fast as starting it from within the shell, and much easier to write. Similarly, there is no performance difference between Runtime.exec() and ProcessBuilder (the former just calls the latter); you only need ProcessBuilder if you need its advanced features.
If you are experiencing performance problems when calling external programs, you should find out where exactly they occur and address them there, not with this approach. For example, one normally starts a thread for reading from both the output and the error stream (if you do not start separate threads and the command produces a lot of output, everything may hang). If spawning threads for every command turns out to be slow, you could use a thread pool to avoid creating them repeatedly.
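As a concrete illustration of that usual pattern (a sketch of running one command per invocation rather than the shell-reuse approach from the question; the class and method names here are made up), you could run each command directly and drain stdout and stderr on their own threads:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;

public class RunCommand {

    // Read every line of one stream on its own thread so neither pipe can fill up and block the child.
    private static Thread drain(InputStream in, List<String> sink) {
        Thread t = new Thread(() -> {
            try (BufferedReader r = new BufferedReader(new InputStreamReader(in))) {
                String line;
                while ((line = r.readLine()) != null) {
                    synchronized (sink) {
                        sink.add(line);
                    }
                }
            } catch (IOException ignored) {
            }
        });
        t.start();
        return t;
    }

    public static List<String> run(String... command) throws IOException, InterruptedException {
        Process p = new ProcessBuilder(command).start();
        List<String> out = new ArrayList<>();
        List<String> err = new ArrayList<>();
        Thread outThread = drain(p.getInputStream(), out);
        Thread errThread = drain(p.getErrorStream(), err);
        p.waitFor();          // each drain thread ends once the process closes its streams
        outThread.join();
        errThread.join();
        out.addAll(err);      // keep it simple: return stdout followed by stderr
        return out;
    }
}
Calling something like RunCommand.run("date") inside your loop would replace the whole ExecuteShell class for the simple case shown in TestExec.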
I'm firing up an external process from Java and grabbing its stdin, stdout and stderr via process.getInputStream() etc. My issue is: when I want to write data to my output stream (the proc's stdin), it is not actually sent until I call close() on the stream, even though I am explicitly calling flush().
I did some experimenting and noticed that if I increased the number of bytes I was sending, it would eventually go through. The magic number, on my system, is 4058 bytes.
To test, I'm sending the data over to a Perl script which reads like this:
#!/usr/bin/perl
use strict;
use warnings;
print "Perl starting";
while(<STDIN>) {
print "Perl here, printing this: $_"
}
Now, here's the java code:
import java.io.InputStream;
import java.io.IOException;
import java.io.OutputStream;
public class StreamsExecTest {
private static String readInputStream(InputStream is) throws IOException {
int guessSize = is.available();
byte[] bytes = new byte[guessSize];
is.read(bytes); // This call has side effect of filling the array
String output = new String(bytes);
return output;
}
    public static void main(String[] args) {
        System.out.println("Starting up streams test!");
        ProcessBuilder pb;
        pb = new ProcessBuilder("./test.pl");
        // Run the proc and grab the streams
        try {
            Process p = pb.start();
            InputStream pStdOut = p.getInputStream();
            InputStream pStdErr = p.getErrorStream();
            OutputStream pStdIn = p.getOutputStream();
            int counter = 0;
            while (true) {
                String output = readInputStream(pStdOut);
                if (!output.equals("")) {
                    System.out.println("<OUTPUT> " + output);
                }
                String errors = readInputStream(pStdErr);
                if (!errors.equals("")) {
                    System.out.println("<ERRORS> " + errors);
                }
                if (counter == 50) {
                    // Write to the stdin of the execed proc. The \n should
                    // in turn trigger it to treat it as a line to process
                    System.out.println("About to send text to proc's stdin");
                    String message = "hello\n";
                    byte[] pInBytes = message.getBytes();
                    pStdIn.write(pInBytes);
                    pStdIn.flush();
                    System.out.println("Sent " + pInBytes.length + " bytes.");
                }
                if (counter == 100) {
                    break;
                }
                Thread.sleep(100);
                counter++;
            }
            // Cleanup
            pStdOut.close();
            pStdErr.close();
            pStdIn.close();
            p.destroy();
        } catch (Exception e) {
            // Catch everything
            System.out.println("Exception!");
            e.printStackTrace();
            System.exit(1);
        }
    }
}
So when I run this, I get effectively nothing back. If, immediately after calling flush(), I call close() on pStdIn, it works as expected. That isn't what I want, though; I want to be able to keep the stream open and write to it whenever it pleases me. As mentioned before, if message is 4058 bytes or larger, this works without the close().
Is the operating system (64-bit Linux with a 64-bit Sun JDK, for what it's worth) buffering the data before sending it? I could see Java having no real control over that; once the JVM makes the system call to write to the pipe, all it can do is wait. There's another puzzle, though:
The Perl script prints a line before going into the while loop. Since I check for input from Perl's stdout on every iteration of my Java loop, I would expect to see it on the first run through the loop, then see the attempt at sending data from Java to Perl, and then nothing. But I actually only see the initial message from Perl (after that OUTPUT marker) when the write to the output stream happens. Is something blocking that I'm not aware of?
Any help greatly appreciated!
You haven't told Perl to use unbuffered output. Look in perlvar and search for $| for different ways to set unbuffered mode. In essence, one of:
HANDLE->autoflush( EXPR )
$OUTPUT_AUTOFLUSH
$|
Perl may be buffering it before it starts printing anything.
is.read(bytes); // This call has side effect of filling the array
No it doesn't. It reads between 1 and bytes.length bytes into the array (or returns -1 at end of stream); it does not necessarily fill it. See the Javadoc.
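If the intent really is to fill the array (or read until the stream ends), a sketch of the usual loop, which could replace the body of readInputStream above, looks like this:
// Keep calling read() until the buffer is full or the stream ends;
// a single read() is allowed to return fewer bytes than requested.
int offset = 0;
while (offset < bytes.length) {
    int n = is.read(bytes, offset, bytes.length - offset);
    if (n == -1) {
        break;              // end of stream reached before the buffer was full
    }
    offset += n;
}
String output = new String(bytes, 0, offset);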
I don't see any obvious buffering in your code, so it may be on the Perl side. What happens if you put a newline \n at the end of your print statement?
Note also that you can't, in general, read stdout and stderr on the main thread like that. You'll be subject to deadlock: for example, if the child process prints a lot to stderr while the parent is busy reading stdout, the stderr buffer will fill up and the child process will block, but the parent will stay blocked forever trying to read stdout.
You need to use separate threads to read stdout and stderr (also separate from the main thread, which here is used to pump input to the process).
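If you don't actually need stderr kept separate from stdout, a common simplification (my suggestion, not something the answer above proposes; java.io imports omitted) is to merge the two pipes so a single reader thread suffices:
ProcessBuilder pb = new ProcessBuilder("./test.pl");
pb.redirectErrorStream(true);                 // child's stderr is folded into its stdout
Process p = pb.start();

// one background thread drains the merged output so the child can never block on a full pipe
Thread drainer = new Thread(() -> {
    try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
        String line;
        while ((line = r.readLine()) != null) {
            System.out.println("<OUTPUT> " + line);
        }
    } catch (IOException ignored) {
    }
});
drainer.start();

// the main thread is now free to write to the child's stdin whenever it likes
OutputStream pStdIn = p.getOutputStream();
pStdIn.write("hello\n".getBytes());
pStdIn.flush();
// Note: the Perl side still needs autoflush ($| = 1) for its output to arrive promptly.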