I need to perform some operations on files: rename, delete, and so on.
Which is better: using cmd commands or the java.io.File methods?
Thanks.
Normally it's not a good idea to depend on OS-specific things in a platform-independent environment, not to mention that spawning local commands would be much slower.
I would stick with the Java implementations if they can do what you need.
After lots of comments I finally caught the real question, so my answer is:
Of course it is better to use the Java features, because:
If you want your program to be portable you can't use the command line, because it wouldn't work on Unix or other systems except Windows, and it might not even work on some other versions of Windows.
To use command-line features from your code you have to create a new process and execute the commands in it, which is very slow.
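For example, a minimal sketch of the portable approach (the paths below are just placeholders; the newer java.nio.file API gives better error reporting than plain java.io.File):

import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class FileOpsSketch {
    public static void main(String[] args) throws IOException {
        // Placeholder paths; adjust to your environment.
        Path source = Paths.get("data/report.txt");
        Path target = Paths.get("data/report-old.txt");

        // Rename (move) a file; throws an IOException with a reason on failure,
        // unlike File.renameTo() which only returns false.
        Files.move(source, target, StandardCopyOption.REPLACE_EXISTING);

        // Delete a file if it exists.
        Files.deleteIfExists(Paths.get("data/obsolete.tmp"));

        // The older java.io.File style still works, but gives fewer diagnostics.
        boolean renamed = new File("a.txt").renameTo(new File("b.txt"));
        boolean deleted = new File("c.txt").delete();
        System.out.println("renamed=" + renamed + ", deleted=" + deleted);
    }
}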
I want to access a Java library from Ruby. For example, Kafka already provides a jar for every operation; what do I need to do to use it from Ruby?
Maybe I just need to run a shell command to execute the jar from Ruby, or do I need to port the library to Ruby? If it comes down to porting the library, how do I do that?
Thank you in advance.
PS: Java, Ruby, and Kafka are just examples. What I need to know is the big picture of how to port a library. Of course, if you add a code example too, I'll be more than happy :)
I agree with Aetherus that the fastest and most convenient way is to use JRuby. However, I believe there are other options besides communicating with external Java processes. What to choose probably depends on what code you want to call. I see at least two other options.
Wrap the Java code you want to call in a main program and call it on the command line (there is a sketch of this below). This will be slow, since the JVM has to start every time and that takes a while, but it may be a fast way forward in some cases.
Call the Java code from C code that you compile against your Ruby. This will still need to load the JVM, but you could probably make that happen only once. I found an article outlining how to go about doing it with JNI.
Both these paths will probably cause you a lot of pain, but if it is important to stay on MRI it may be worth the trip. Have fun!
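To illustrate the first option, here is a hypothetical wrapper; the class name, arguments, and the library call it stands in for are all made up for the example. Ruby would then shell out to something like: java -cp your.jar:wrapper.jar SendMessageWrapper mytopic "hello".

// Hypothetical command-line wrapper around some Java library call.
public class SendMessageWrapper {
    public static void main(String[] args) {
        if (args.length < 2) {
            System.err.println("usage: SendMessageWrapper <topic> <message>");
            System.exit(1);
        }
        String topic = args[0];
        String message = args[1];
        // Replace this with the real library call you want to expose.
        System.out.println("would send '" + message + "' to topic '" + topic + "'");
    }
}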
With JRuby, you can import the jar file, then include the classes you need in that jar:
require 'java'
require '/path/to/your.jar'
include_class 'com.really.long.ClassName'
But with Ruby implementations other than JRuby, you have no choice but to communicate with external Java processes (through sockets, IPC, signals, and so on).
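As a rough, hypothetical sketch of that route: run a small Java "bridge" process that exposes the library over a socket, and let MRI Ruby connect to it with its standard socket library. The port and the line-based protocol here are invented purely for illustration.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

// Minimal line-based bridge server; a Ruby client would connect to
// localhost:4567, send one request per line and read one reply per line.
public class LibraryBridge {
    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(4567)) {
            while (true) {
                try (Socket client = server.accept();
                     BufferedReader in = new BufferedReader(
                             new InputStreamReader(client.getInputStream()));
                     PrintWriter out = new PrintWriter(client.getOutputStream(), true)) {
                    String request;
                    while ((request = in.readLine()) != null) {
                        // A real bridge would call into the Java library here.
                        out.println("echo: " + request);
                    }
                }
            }
        }
    }
}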
Suppose I want to search for some text in a file. I want to know when we should use system utilities/programs like grep, and when we should use Java APIs instead (reading the file line by line and searching each line, or using the Java Scanner class).
I want to understand the trade-offs between the two approaches. For example, if we use grep, will there be communication overhead between the JVM and the grep process? Is creating a new OS process for grep an overhead in itself?
Does grep perform better than a plain Java file search?
Please help...
Yes, there will be an overhead. Starting an external process and communicating with it is costly. Moreover, many systems don't have a grep command. If you want to make your Java code portable, don't rely on OS-specific commands.
Another problem is that OS commands can search (for example) in files, but not in your in-memory data structures.
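For comparison, a plain-Java version of a grep-like search stays portable and is only a few lines; the file name and search term below are placeholders:

import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

public class SimpleGrep {
    public static void main(String[] args) throws IOException {
        String needle = "ERROR";  // placeholder search term
        try (BufferedReader reader = Files.newBufferedReader(Paths.get("app.log"))) {
            String line;
            int lineNo = 0;
            while ((line = reader.readLine()) != null) {
                lineNo++;
                if (line.contains(needle)) {
                    System.out.println(lineNo + ": " + line);
                }
            }
        }
    }
}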
You're basically trading system independence for the perceived gain of the tool; in some cases this can't be avoided.
Not every system will have the tools you want installed, in the locations you expect, or in the version you need.
Even if you can deploy the tools with your application, you will need to provide an implementation for each of your target platforms.
Sure, it's easy to say "it will never be run on X", but "never" can come around real quick ;)
There is also the added overhead of executing the external application and managing its I/O; while not difficult, it's much more complex than a well-written Java API.
As I said, sometimes you simply don't have a choice (I have some media-inspection tools that I use on Windows and Mac that I'm not about to try to reimplement in Java; not because it can't be done, but because it's complex and time consuming and somebody has already done it with a native program).
You need to weigh the benefits of the external command against the issues of using it. You should also investigate whether an API has already been developed that might solve the problem at hand.
IMHO
I have MATLAB code that processes images. I want to create a Hadoop mapper that uses that code. I came across the following solutions, but I'm not sure which one is best (installing the MATLAB Compiler Runtime on each slave node in Hadoop would be very difficult for me):
Manually convert the MATLAB code to OpenCV in C++ and call the resulting exe/dll (with appropriate parameters) from the mapper. I'm not sure about this, since the cluster runs Linux on every node rather than Windows.
Use Hadoop Streaming. But Hadoop Streaming requires an executable as the mapper, and a MATLAB executable also requires the MATLAB Compiler Runtime, which is very difficult to install on every slave node.
Convert the code automatically to C/C++ and build the executable automatically (not sure whether this is right, because either the executable will still require the MATLAB runtime, or the conversion can hit compiler issues that are very difficult to fix).
Use MATLAB Java Builder. But the jar file created that way will need the runtime too.
Any suggestions?
Thanks in advance.
As you are probably already suspecting, this is going to be inherently difficult to do because of the runtime requirement for MATLAB. I had a similar experience (having to distribute the runtime libraries) when attempting to run MATLAB code over Condor.
As far as the options you are listing are concerned, option #1 will work best. Also, you will probably not be able to avoid working with Linux.
However, if you don't want to lose the convenience provided by higher level software (such as MATLAB, Octave, Scilab and others) you could try Hadoop streaming in combination with Octave executable scripts.
Hadoop Streaming does not care about the nature of the executable, whether it is an executable script or a compiled executable file, according to this: http://hadoop.apache.org/common/docs/r0.15.2/streaming.html
All it requires is that it is given an "executable" that can a) read from stdin and b) send its output to stdout.
GNU Octave programs can be turned into executable scripts (in Linux) with the ability to read from stdin and send the output to stdout (http://www.gnu.org/software/octave/doc/interpreter/Executable-Octave-Programs.html).
As a simple example consider this:
Create a file (for example "al.oct") with the following contents:
#!/bin/octave -qf
# Note: in my installation I had to use "#!/etc/alternatives/octave -qf"
Q = fread(stdin);  # Standard Octave / MATLAB code from here on
disp(Q);
Now from the command prompt issue the following command:
chmod +x al.oct
al.oct is now an executable; you can run it with "./al.oct". To see where stdin and stdout fit in (so that you can use it with Hadoop), you can try this:
cat al.oct | ./al.oct | sort
In other words: "cat" the file al.oct, pipe its output to the executable script al.oct, and then pipe al.oct's output to the sort utility (this is just an example; we could have "cat"-ed any file, but since we know al.oct is a simple text file, we just use it).
It could of course be that Octave does not support everything your MATLAB code is trying to call, but this could be an alternative way of using Hadoop Streaming without losing the convenience and power of higher-level code.
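The same stdin/stdout contract applies to any language Hadoop Streaming can launch. Purely for illustration, a word-count-style mapper written in Java would look like the sketch below (the tokenizing logic is just an assumed example, not anything your MATLAB code requires):

import java.io.BufferedReader;
import java.io.InputStreamReader;

// Reads lines from stdin and writes tab-separated key/value pairs to stdout,
// which is all Hadoop Streaming expects from a mapper executable.
public class StreamingMapperSketch {
    public static void main(String[] args) throws Exception {
        BufferedReader in = new BufferedReader(new InputStreamReader(System.in));
        String line;
        while ((line = in.readLine()) != null) {
            for (String token : line.trim().split("\\s+")) {
                if (!token.isEmpty()) {
                    System.out.println(token + "\t1");
                }
            }
        }
    }
}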
Doesn't the nature of the algorithm to be converted matter? If the MATLAB/Octave code is tightly coupled, spreading it out over a map-reduce may yield horrible behavior.
With respect to your first option: MATLAB Coder now supports many image processing functions (partly via system objects) for automatically generating C code from your algorithm, which is basically platform independent and needs no runtime environment. In my experience this code is about a factor of 2 to 3 slower than hand-coded OpenCV (it depends strongly on your algorithm and CPU).
The main drawback is that you need a MATLAB Coder license ($$$).
Most of the answers here seem to predate MATLAB R2014b.
In R2014b, MATLAB allows mapreduce from within MATLAB and integration with Hadoop.
I cannot be certain about your specific use case but you may want to check:
http://www.mathworks.com/help/matlab/mapreduce.html
http://www.mathworks.com/discovery/matlab-mapreduce-hadoop.html
I'm trying to build a Java application to manage Linux servers from a web interface.
Is it a bad idea to perform each task by calling the bash shell?
Are there any other options besides using C, Perl, or another language?
It's not necessarily a bad idea to use bash to do the actual work. It would help if you gave us more of an idea of what exactly the web interface is changing.
Java in particular does not provide many system-specific controls, because it was designed as a cross-platform language, so putting platform-specific tools in would go against its purpose.
You could certainly do it that way. Ideally you'd open up a port and, through a socket library, accept only specially crafted, specific actions that perform the intended operations (an interface server).
I should think that the disadvantages of calling bash scripts for your commands are all related to error handling and return values. Each of your bash scripts will need to return sensible, useful information to the Java app in the case of failure (or even success). You'll probably also want a common interface for that, so that each bash script, no matter its function, returns the same types and the Java side can interpret them easily.
So, in that sense, making the changes from your Java program reduces the complexity of handing information back and forth. On the other hand, if bash is easier for you, you may find it faster and more flexible.
Opening up a single bash shell and sending it commands (and properly parsing the results) shouldn't be bad. It does tie your program to a single OS and even a single shell, but that's the nature of what you are trying to do.
What I wouldn't do is open a shell for each command and close it after each result. It seems to me that would cause unnecessary thrashing when many commands are executed in a row.
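A rough sketch of that pattern, assuming a Unix-like host with /bin/bash: start one long-lived shell, write commands to its stdin, and read the merged output back. The commands here are only placeholders, and a real version needs much more careful output parsing and error handling.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;

public class PersistentShell {
    public static void main(String[] args) throws Exception {
        ProcessBuilder pb = new ProcessBuilder("/bin/bash");
        pb.redirectErrorStream(true);          // merge stderr into stdout
        Process shell = pb.start();

        PrintWriter toShell = new PrintWriter(shell.getOutputStream(), true);
        BufferedReader fromShell = new BufferedReader(
                new InputStreamReader(shell.getInputStream()));

        // Send a few commands down the same shell, then tell it to exit.
        // A real implementation would echo a sentinel (e.g. __DONE__) after
        // each command so it knows where one command's output ends.
        toShell.println("uname -a");
        toShell.println("uptime");
        toShell.println("exit");

        // Read everything the shell produced until it exits.
        String line;
        while ((line = fromShell.readLine()) != null) {
            System.out.println(line);
        }
        System.out.println("shell exited with status " + shell.waitFor());
    }
}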
Security concerns jump out at me. If you have a "type a command" style form, then someone clever could exploit it. For example, "configure network" is fine, but "configure network; install rootkit" can be typed, because a semicolon allows commands to be chained.
All in all, this is not tight integration. If this is a personal project, go for it; it's a good learning exercise to turn a procedural script into a Java program. If this is a serious effort to recreate the many web-based admin tools that already exist, I'd seriously suggest skipping it. I hate the VPanel/CPanel-style things I've seen. There used to be a PHP-based Linux admin tool that looked OK, but I find these tools easy to learn and easy to outgrow, because the net and the community are full of command-line knowledge.
If you are trying to automate a large deployment, look at the Ruby-based Puppet. It looks really neat.
I have a socket server where I perform different operations (for example, using ifconfig by calling the shell), and I plan to integrate a client into a JSF application. I'm not experienced and I intend to make this my graduation project, but I'm not sure whether calling bash from Java to configure a Linux box is a good solution.
Java does have some Linux-configuring classes (well, at least, OpenJDK does). Check OperatingSystemMXBean or something like that.
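Note that OperatingSystemMXBean exposes read-only information about the host rather than configuration hooks, but querying it is straightforward; a minimal sketch:

import java.lang.management.ManagementFactory;
import java.lang.management.OperatingSystemMXBean;

public class OsInfo {
    public static void main(String[] args) {
        OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
        // Read-only host information; this bean does not change OS settings.
        System.out.println("Name:    " + os.getName());
        System.out.println("Arch:    " + os.getArch());
        System.out.println("Version: " + os.getVersion());
        System.out.println("CPUs:    " + os.getAvailableProcessors());
        System.out.println("Load:    " + os.getSystemLoadAverage());
    }
}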
You're better off writing your own server configuration utility in the language you prefer, sanitizing it, making it secure, and then calling it via bash from your Java app.
There are two questions here:
1) Should you try to do the configuration work in Java, or should you externalize it?
2) Assuming that you have externalized it, should you use bash scripts?
In this case, the answer to question 1) depends. This kind of thing can be difficult to implement in pure Java, which leaves two choices: externalize the task using a Process, or try to do it in a native code library via JNI or JNA. The latter approach is complicated and gives you JVM crashes if you make a mistake, so I would rule it out.
On the other hand, if you can find a good standard or third-party Java API that does what you need (without infesting your JVM with flaky JNI, etc.), you should use that.
The answer to 2) is that bash scripts will work as well as any other scripting language. I think that using scripts gives you a bit more flexibility: for example, if you need to compensate for differences between the various flavours of Linux, UNIX, or maybe even Windows (!), you can put that into the externalized scripts. (A corollary is that the scripts need to be configurable, so don't embed them in your source code!)
Another alternative might be to run the commands (e.g. ifconfig) directly, using a fully qualified command name and supplying the arguments as an array of strings, etc. But unless your app is going to run the external command 100s of times a minute, it is probably not worth the (Java) coding effort and the loss of flexibility / configurability.
A lot of the response depends on why you're doing this and why other, more obvious solutions aren't possible. Knowing why you would choose to roll your own rather than install Webmin would be good, or why you're choosing a web UI at all rather than VNC to control the box. There are some obvious responses to the latter (firewall issues, for instance), but there's no immediately obvious reason for the former. Without knowing more about the requirements, answering questions about implementation details like bash scripts versus Perl or C is meaningless.
I have to execute several command-line executables on Windows through Java. Rather than using Runtime.exec, I thought of using Ant tasks to do these invocations, hoping it would be a better way.
Is there any better approach, or a popular library for doing this?
What exactly is wrong with using Runtime.exec()? It's in the API for a good reason...
Runtime.exec is the way of doing this in Java. Any other library would likely be platform specific. You'd need a very good reason not to use Runtime.exec.
I've currently got several programs that make use of this feature and it hasn't caused any trouble.
With Runtime.exec you can optionally block while waiting for the call to complete. You can capture return codes and anything the command-line program writes to the console. I'm sure there are many other features; those are just the ones that were useful to me.
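A minimal sketch of that pattern (the command here, cmd /c dir, is just a Windows placeholder; substitute whatever executable you actually need):

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class RunCommand {
    public static void main(String[] args) throws Exception {
        // Placeholder command; replace with the executable you actually need.
        Process process = Runtime.getRuntime()
                .exec(new String[] {"cmd", "/c", "dir"});

        // Capture everything the program writes to its console.
        try (BufferedReader out = new BufferedReader(
                new InputStreamReader(process.getInputStream()))) {
            String line;
            while ((line = out.readLine()) != null) {
                System.out.println(line);
            }
        }

        // Block until the command completes and check its return code.
        int exitCode = process.waitFor();
        System.out.println("exit code: " + exitCode);
    }
}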
And you can always have it invoke an ant task if you really want to use ant! :)
I have a similar situation now. I use Runtime and Process classes. They work just fine.