I am using Fedora Linux, where ulimit -n 10000 raises the open-file limit to 10000. I want to achieve the same from a Java program.
How do I write a Java program that increases the file limit using ulimit?
I have tried the program below, but it did not work. It gave no error, but it did not increase the file limit either.
import java.io.IOException;

public class IncreaseFile {
    public static void main(String[] args) {
        String command = "/bin/bash ulimit -n 10000";
        // String command = "pwd";
        try {
            Runtime.getRuntime().exec(command);
        } catch (IOException ex) {
            ex.printStackTrace();
        }
    }
}
Thanks
Sunil Kumar Sahoo
From the man page:
The ulimit utility shall set or report the file-size writing limit imposed on files written by the shell and its child processes.
Your Java program is not the shell or one of its child processes - it is the ancestor process, and is therefore unaffected by anything its child processes do. To get a different ulimit you must somehow contrive to call ulimit before the JVM is started.
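If the higher limit is needed by something your Java program launches (rather than by the JVM itself), you can apply that advice from Java by letting a shell raise the limit and then exec the real workload. A minimal sketch - the target program path is a placeholder, and a non-root shell can only raise the soft limit up to its hard limit:

import java.io.IOException;

public class RunWithHigherLimit {
    public static void main(String[] args) throws IOException, InterruptedException {
        // The JVM cannot raise its own limit after start-up, but a child shell can
        // call ulimit and then exec the program that actually needs more descriptors.
        ProcessBuilder pb = new ProcessBuilder(
                "/bin/bash", "-c", "ulimit -n 10000 && exec /path/to/your/program");
        pb.inheritIO(); // show the child's output on this console
        int exit = pb.start().waitFor();
        System.out.println("child exited with " + exit);
    }
}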
The program did not give any error.
Indeed, you're ignoring any result. You need to get hold of the returned Process object and read its getInputStream() and getErrorStream() (which return the program's stdout and stderr respectively). This information should tell you more about the cause of the problem, and understanding the cause should lead to the solution.
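Something along these lines - a minimal sketch; for commands that produce a lot of output you would consume both streams concurrently, as discussed further down this page:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class ExecWithOutput {
    public static void main(String[] args) throws IOException, InterruptedException {
        Process p = Runtime.getRuntime().exec("/bin/bash ulimit -n 10000");
        try (BufferedReader out = new BufferedReader(new InputStreamReader(p.getInputStream()));
             BufferedReader err = new BufferedReader(new InputStreamReader(p.getErrorStream()))) {
            String line;
            while ((line = out.readLine()) != null) {
                System.out.println("stdout: " + line);
            }
            while ((line = err.readLine()) != null) {
                System.err.println("stderr: " + line);
            }
        }
        System.out.println("exit code: " + p.waitFor()); // non-zero means the command failed
    }
}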
Check this article (all 4 pages!) to learn how to use Runtime#exec() properly:
When Runtime.exec() won't
I made this simple piece of code to test ProcessBuilder:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class TerminalDemoApplication {
    public static void main(String[] args) {
        SpringApplication.run(TerminalDemoApplication.class, args);
        try {
            System.out.println("hello");
            Process process = new ProcessBuilder("python", "--version").start();
            BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()));
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
            int exitCode = process.waitFor();
            System.out.println("\nExited with error code : " + exitCode);
        } catch (IOException | InterruptedException e) {
            e.printStackTrace();
        }
    }
}
It works on Windows (it prints the Python version installed on my system), but the same code on my MacBook returns just an end of line, so basically nothing. Does this need further configuration depending on the OS? Why is this happening?
What error code are you getting?
There are (at least) two explanations; that error code would indicate which one it is.
You're not running python, or running 'the wrong' python
This would mean you are getting an error code of some sort, or an exception.
The likely reason for this is a path issue.
Running python just like that - as in, with no path information at all - is, strictly speaking, broken: that's just not how your OS works; it has no idea what to do with such a path.
It's a bashism (as in, the shell does it, not the OS) to interpret such a command as 'oh, actually, go through each listed entry in the $PATH environment variable, stick that path in front of this name, and see if you find an executable there. If you do, run that and stop.'
Java mostly doesn't engage in any bashisms. But, in a few bizarre places, it does - it tries to do basic space splitting when you use the single-string version of Runtime.exec(), which is a shellism, and it does attempt basic PATH lookup, but that's about where it ends. It won't do * unpacking, which on Windows is an OS-level thing but on posix systems is a shellism.
I strongly, strongly advise you to avoid java's basic shellisms. It's unreliable and highly OS-specific.
So: Always pass arguments explicitly (good, you're doing that), always use ProcessBuilder (good, you're doing that), never use relative paths (that's where you're going wrong).
It's going to the error stream instead
Processes on most OSes are hooked up to 3 pipes, not 2: 'standard in', 'standard out' and 'standard err'. Your own Java process exposes these as System.in, System.out, and System.err.
In linux in particular, it is common to redirect standard out of some process to a file or another process.
This means that standard err naturally has the property that it tends to emit to the console, even if you are redirecting things. In other words, the terms 'standard out' and 'standard err' are really stupid names on posix. The much better naming would be 'standard process output' and 'standard process messages'.
Asking python to print its version is a bit of a limbo scenario. The string "Python v3.0.1" or whatnot is certainly not an error, but it's a bit dubious whether one should consider this 'the output of the process'. It's likely that the authors of the python tool consider it more like 'some information I should print to you, even if you are redirecting things'.
Thus, my guess is that this version is heading out to standard err instead.
You can solve this in two ways: either read from standard err as well, or use ProcessBuilder's features: you can ask it to bundle standard out and standard err into a single stream (.redirectErrorStream(true)).
I would expect the exit code to be 0 if this explanation is the correct one.
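Putting both points together, a sketch of the adjusted program - the absolute python path is an assumption, use whatever "which python3" prints on your Mac:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class PythonVersion {
    public static void main(String[] args) {
        try {
            // Absolute path instead of relying on PATH lookup, and stderr merged into
            // stdout so the version string is captured wherever python sends it.
            ProcessBuilder pb = new ProcessBuilder("/usr/bin/python3", "--version");
            pb.redirectErrorStream(true);
            Process process = pb.start();
            try (BufferedReader reader = new BufferedReader(
                    new InputStreamReader(process.getInputStream()))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    System.out.println(line);
                }
            }
            System.out.println("Exited with code: " + process.waitFor());
        } catch (IOException | InterruptedException e) {
            e.printStackTrace();
        }
    }
}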
I'm making an app which, using root, takes a logcat so it can find a certain error on another app. The easiest approach for me is saving the logcat to a file directly from the command. So all I have to do is run su and then logcat | grep --line-buffered "search string" > /path/to/save/logcat.log. When I run this on a terminal emulator (like this or even this), it saves the output to a file just exactly how I want it to do so. But when I run the exact same command from my app, it gets me a blank file. I've tried many different ways to output the logcat but they all get me an empty file. Interestingly, when I take a normal logcat using the app (without grep, using ">" to output), the file is being saved as it should and it contains the string I want to grep. What am I doing wrong?
Here is the code I use:
try {
    Process p = Runtime.getRuntime().exec("su");
    DataOutputStream dos = new DataOutputStream(p.getOutputStream());
    dos.writeBytes("logcat | grep --line-buffered \"search string\" > /storage/emulated/0/logcat.log\n");
    dos.flush();
} catch (IOException e) {
    e.printStackTrace();
}
I'm listing my comment as an answer since that evidently helped solve your issue but there is more going on here which I'm not capable of fully addressing:
Try redirecting stderr as well to see if there is any error which can then be captured - I think that would be &> (that's bash) - or > outfile 2>&1 for the more general syntax.
So
dos.writeBytes("logcat | grep --line-buffered \"search string\" &> /storage/emulated/0/logcat.log\n");
Or
dos.writeBytes("logcat | grep --line-buffered \"search string\" > /storage/emulated/0/logcat.log 2>&1\n");
The original intention of the comment was to get you more information as to what was really going on - as it turns out, it helped you get the result you were looking for.
I believe there are a few factors at work here which may contribute to why adding stderr helped:
stderr (which the comment suggested adding) is non-buffered - I think (but can't prove) that the non-buffering is what is helping you, even though the output being captured is stdout.
stdout processing is sensitive to TTY (terminal emulation) vs non-TTY (your program) and uses different buffering approaches (line-buffered in a TTY, fully buffered otherwise). I realize your grep option should overcome this. This TTY vs non-TTY difference may explain the source of your problem.
The code posted sends a command (logcat...) to the created process and then continues. So, for example, if logcat were to output lots of data, in theory your posted code would continue and leave scope - what happens to the created process once p goes out of scope, I'm not sure.
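If you want to remove that last uncertainty, here is a rough sketch of one way to do it. It assumes a one-shot dump (logcat -d) rather than the continuous capture in the question, so that the shell can actually finish, and it assumes you can block on waitFor() off the UI thread:

try {
    Process p = Runtime.getRuntime().exec("su");
    DataOutputStream dos = new DataOutputStream(p.getOutputStream());
    // -d makes logcat dump the current buffer and exit, so the shell can finish.
    dos.writeBytes("logcat -d | grep --line-buffered \"search string\" > /storage/emulated/0/logcat.log 2>&1\n");
    dos.writeBytes("exit\n"); // end the su shell
    dos.flush();
    dos.close();
    int code = p.waitFor();   // call this off the UI thread
    android.util.Log.d("LogcatDump", "su exited with " + code);
} catch (IOException | InterruptedException e) {
    e.printStackTrace();
}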
Anyways, glad you were able to proceed.
Possible Duplicate:
java/shellscript code to find out if a jar file is already running on current machine
I would love to get a cross-platform solution for this, but even if it's unix-only, that would be fine.
The simple solution would be to do this from the shell (pseudocode): if the output of ps -aux | grep myJar | awk '{print $2}' contains myPID, don't run myProgram.
Now unfortunately our linux team doesn't want a script like that running in production since it can (admittedly) have undesired behaviors.
So what I need is for the program, when it runs, to check whether another instance is already running. If an instance is already running and it is still under the time limit, the new one should not run.
A bit of an example:
Myprog.jar -- timeout 5 min
Myprog.jar is in a cron job that runs every 4 minutes.
The first time it's called it launches; the second time it's called it's still running, but since it hasn't exceeded the timeout, that's fine.
If it's still running when the third check comes around (8 minutes into its execution), it is killed and then started again.
If someone can help me understand how to do this, I'd appreciate it (we've been trying to set up a lock file, with limited success).
Thanks!
You could make your program open a dummy file for writing with a FileWriter when your program starts, and keep the file open until the program is finished.
When you now start a second instance of your program, it will also try to open this file for writing, which will throw an IOException, because only one process can have a write handle to a file at the same time.
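Note that whether a second FileWriter actually fails can depend on the OS and filesystem; a more portable variant of the same idea is java.nio file locking, named here as a substitute. A rough sketch, with an arbitrary lock-file location:

import java.io.File;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.channels.FileLock;

public class SingleInstanceLock {
    public static void main(String[] args) throws IOException {
        File lockFile = new File(System.getProperty("java.io.tmpdir"), "myapp.lock");
        RandomAccessFile raf = new RandomAccessFile(lockFile, "rw");
        FileLock lock = raf.getChannel().tryLock(); // null if another process holds the lock
        if (lock == null) {
            System.out.println("Another instance is already running, exiting.");
            return;
        }
        // ... run the application; the lock is released when the JVM exits ...
    }
}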
You could use a port as a semaphore. See this question for more info on that. I think a port would be a good cross-platform solution
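The idea, roughly - a sketch, with an arbitrary port number; pick one that nothing else on your machines uses:

import java.io.IOException;
import java.net.ServerSocket;

public class SingleInstancePort {
    private static ServerSocket lockSocket; // keep a reference so the socket stays open

    public static void main(String[] args) {
        try {
            lockSocket = new ServerSocket(49152); // arbitrary fixed port
        } catch (IOException alreadyBound) {
            System.out.println("Another instance already owns the port, exiting.");
            return;
        }
        // ... run the application; the port is freed when the JVM exits ...
    }
}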
You can create a temporary file at a fixed location.
private static final File LOCK_FILE = new File("app.lock");

public static boolean checkIfAlreadyRunning()
{
    return LOCK_FILE.exists();
}

public static void createLockFile() throws IOException
{
    LOCK_FILE.createNewFile();
    final Runnable shutDown = new Runnable()
    {
        public void run()
        {
            try
            {
                LOCK_FILE.delete();
            } catch (Exception e) { /* Sad but true */ }
        }
    };
    // Remove the lock file on normal shutdown...
    Runtime.getRuntime().addShutdownHook(new Thread(shutDown));
    // ...and also when the application dies with an uncaught exception.
    Thread.setDefaultUncaughtExceptionHandler(new Thread.UncaughtExceptionHandler()
    {
        public void uncaughtException(Thread t, Throwable e)
        {
            shutDown.run();
            System.exit(-1);
        }
    });
}
I had exactly the same problem, and it can be pretty tricky to solve. Both File and Socket based approaches can be made to work, but it gets really tricky on some OS's (think of Windows with multiple users in multiple terminal server sessions etc.).
First, determine the scope where you want only one instance. Then decide on a solution.
The ServerSocket method with a fixed port number will allow you one instance per machine (maybe not exactly what you want).
The locking file approach can be tailored to create the locking file in the user's temp directory, so it gives one instance per session/user.
I personally use a combined approach where the locking file specifies a random port and a second instance connects to that port to pass its command line parameters to the running instance.
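Roughly like this - a sketch of that combined idea, not my actual code; the file location, the one-line protocol and the Java 11+ Files.readString/writeString helpers are all assumptions:

import java.io.IOException;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class SingleInstanceCombined {
    private static final Path PORT_FILE =
            Paths.get(System.getProperty("java.io.tmpdir"), "myapp.port");

    public static void main(String[] args) throws IOException {
        if (Files.exists(PORT_FILE)) {
            int port = Integer.parseInt(Files.readString(PORT_FILE).trim());
            try (Socket s = new Socket("localhost", port);
                 PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
                out.println(String.join(" ", args)); // hand our arguments to the running instance
                return;                              // and exit: we are the second instance
            } catch (IOException stale) {
                // nobody is listening: the previous instance died, fall through and take over
            }
        }
        ServerSocket server = new ServerSocket(0);   // 0 = pick a random free port
        Files.writeString(PORT_FILE, Integer.toString(server.getLocalPort()));
        PORT_FILE.toFile().deleteOnExit();
        // ... normal application start-up; a background thread would accept() on
        // `server` and process the arguments sent by later invocations ...
    }
}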
I'm creating a simple Java wrapper for git executable, that I want to use in my app.
A small code example:
public static void main(String[] args) {
    String gitpath = "C:/eclipse/git/bin/git.exe";
    File folder = new File("C:/eclipse/teste/ssadasd");
    try {
        folder.mkdirs();
        Runtime.getRuntime().exec(
                gitpath + " clone git#192.168.2.15:test.git", null,
                folder);
    } catch (IOException e) {
        e.printStackTrace();
    }
}
The code simply never finishes executing... it seems to get stuck inside exec.
If I run the git clone via the command line, it works as expected.
If I try another repository, from GitHub e.g., it works too.
Does anyone have an idea of what is going on here?
Thanks in advance
This isn't a direct answer to your question, but you may want to take a look at JGit, which is a direct Java implementation of Git operations (no wrapping of command-line git). JGit gets a lot of use and stabilization work as it is the foundation for EGit (Eclipse Git integration).
Runtime.getRuntime().exec returns a Process object that you can use to interact with the process and see what's going on. My suspicion is that you just need to do something like this:
Process p = Runtime.getRuntime().exec(
gitpath + " clone git#192.168.2.15:test.git", null,
folder);
p.waitFor();
If not, you can also call getInputStream() or getErrorStream() on the process to see what it's writing out; that might be helpful in debugging.
Runtime.exec() can cause hanging under various circumstances - see this article which quotes the Javadoc, which says (in JDK 7):
Because some native platforms only provide limited buffer size for standard input and output streams, failure to promptly write the input
stream or read the output stream of the subprocess may cause the
subprocess to block, and even deadlock.
The article gives some example solutions, which consume the output and error streams, although I think the ProcessBuilder class was introduced after the article was written, so may be more satisfactory: the newer Javadoc adds:
Where desired, subprocess I/O can also be redirected using methods of the ProcessBuilder class.
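For this particular case, that could look roughly like the following sketch - the git path, folder and repository URL are copied from the question and will differ on other machines:

import java.io.BufferedReader;
import java.io.File;
import java.io.IOException;
import java.io.InputStreamReader;

public class GitCloneExample {
    public static void main(String[] args) throws IOException, InterruptedException {
        String gitpath = "C:/eclipse/git/bin/git.exe";   // from the question
        File folder = new File("C:/eclipse/teste/ssadasd");
        folder.mkdirs();

        ProcessBuilder pb = new ProcessBuilder(gitpath, "clone", "git#192.168.2.15:test.git");
        pb.directory(folder);
        pb.redirectErrorStream(true); // git reports progress on stderr; merge it in
        Process p = pb.start();
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println(line); // consuming the output is what prevents the hang
            }
        }
        System.out.println("git exited with " + p.waitFor());
    }
}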
I want to send a command to the Linux shell and get its response with Java. How can I do this?
Have a look at ProcessBuilder - example here.
You should look at the Runtime class, and its exec() family of methods.
It's probably best to explicitly specify that you want to run the command through a shell, i.e. create a command line like "bash -c 'my command'".
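For example - a sketch; "ls -l /tmp | wc -l" is just a stand-in for whatever command you actually want to run:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class ShellCommand {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Running through bash means shell features such as pipes and globs work.
        ProcessBuilder pb = new ProcessBuilder("/bin/bash", "-c", "ls -l /tmp | wc -l");
        pb.redirectErrorStream(true);
        Process p = pb.start();
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
        System.out.println("exit code: " + p.waitFor());
    }
}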
Execute a process like this
Runtime.getRuntime().exec("ls");
...then you could get the process input stream and read it with a Reader to get the response
See the Runtime class and the exec() method.
Note that you need to consume the process's stdout/stderr concurrently, otherwise you'll get peculiar blocking behaviour. See this answer for more information.
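A minimal sketch of that concurrent consumption - it assumes simply forwarding everything to the console is good enough:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

public class ConsumeBothStreams {
    // Drain one stream on its own thread so neither pipe fills up and blocks the child.
    private static Thread drain(final InputStream in, final String prefix) {
        Thread t = new Thread(new Runnable() {
            public void run() {
                try (BufferedReader r = new BufferedReader(new InputStreamReader(in))) {
                    String line;
                    while ((line = r.readLine()) != null) {
                        System.out.println(prefix + line);
                    }
                } catch (IOException ignored) {
                }
            }
        });
        t.start();
        return t;
    }

    public static void main(String[] args) throws IOException, InterruptedException {
        Process p = new ProcessBuilder("/bin/bash", "-c", "ls /etc").start();
        Thread out = drain(p.getInputStream(), "out: ");
        Thread err = drain(p.getErrorStream(), "err: ");
        out.join();
        err.join();
        System.out.println("exit code: " + p.waitFor());
    }
}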
I wrote a little class to do this in a very similar question a couple of weeks ago:
java shell for executing/coordinating processes?
The class basically lets you do:
ShellExecutor executor = new ShellExecutor("/bin/bash", "-s");
try {
System.out.println(executor.execute("ls / | sort -r"));
} catch (IOException e) {
e.printStackTrace();
}