When I run my Java program as a user in a terminal with the command:
java -jar progName.jar
I get a stable, working program that writes information to some files. But when I set up a cron job to run this program regularly, the program itself works fine, yet the text in the files is written in the wrong encoding and I get ??? instead of the text.
I use Ubuntu 14.04 on my server machine.
Thanks for your answers and tips. I solved this problem using a comment from user eg04lt3r:
This issue depends on the OS default encoding. If you don't specify an explicit encoding when writing to a file, the default OS encoding is used. So, set the character encoding when you write data to a file in your app.
So, I added an explicit encoding to the writer in my Java program, and now everything works fine.
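For anyone who runs into the same thing, here is a minimal sketch of that change, assuming the output goes to a file named output.txt (the file name and sample text are placeholders):

    import java.io.BufferedWriter;
    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.nio.file.StandardOpenOption;

    public class ExplicitEncodingWriter {
        public static void main(String[] args) throws IOException {
            // Name the charset explicitly instead of relying on the platform default,
            // which differs between an interactive shell and cron's minimal environment.
            try (BufferedWriter writer = Files.newBufferedWriter(
                    Paths.get("output.txt"), StandardCharsets.UTF_8,
                    StandardOpenOption.CREATE, StandardOpenOption.APPEND)) {
                writer.write("Grüße, some non-ASCII text");
                writer.newLine();
            }
        }
    }

On most JDK versions you can also force the default with java -Dfile.encoding=UTF-8 -jar progName.jar in the crontab entry, but naming the charset in code, as the comment above suggests, is the more reliable fix.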
Related
I am trying to get my Java console application to accept commands from both Windows PowerShell and the Linux terminal while it is running. It isn't working properly, though, because of special characters (like ä, ü, ö, etc.) in a command.
When the program is executed on Linux everything works fine because of UTF-8. My simple program just takes input via a Scanner and checks whether or not it matches a specific String. The expected command is read from a .txt file, which is in UTF-8.
I tried comparing the bytes:
Arrays.equals(userInputAsString.getBytes(StandardCharsets.UTF_8), expectedInputToExecuteCommandAsString.getBytes(StandardCharsets.UTF_8))
But unfortunately it doesn't work when run in PowerShell.
I hope someone may be willing to help me!
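In case it helps narrow things down, here is a sketch of one commonly suggested approach, not a confirmed fix: decode stdin with an explicit charset and normalize both strings before comparing them. The file name expected.txt is a placeholder.

    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.text.Normalizer;
    import java.util.Scanner;

    public class Utf8CommandCheck {
        public static void main(String[] args) throws Exception {
            // Decode stdin as UTF-8 explicitly instead of the console's default charset.
            Scanner in = new Scanner(System.in, StandardCharsets.UTF_8.name());

            // expected.txt stands in for the UTF-8 file holding the expected command.
            String expected = new String(
                    Files.readAllBytes(Paths.get("expected.txt")), StandardCharsets.UTF_8).trim();

            String typed = in.nextLine().trim();

            // Normalize both sides so precomposed and decomposed umlauts compare equal.
            boolean matches = Normalizer.normalize(typed, Normalizer.Form.NFC)
                    .equals(Normalizer.normalize(expected, Normalizer.Form.NFC));
            System.out.println(matches ? "command recognised" : "no match");
        }
    }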
I have an issue where I think my local JVM is running with a different character encoding from the one on a server. Is there a command I can pass to the VM to display which character encoding it has? Something like
java show 'charset'
I've been searching for ages, and all of the solutions require writing a test in Java code, but I don't really want to have to write code, deploy it to the server, etc.
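For what it's worth, a throwaway one-file check looks like the sketch below; on reasonably recent JDKs you can also print the same information without compiling anything by running java -XshowSettings:properties -version and looking for file.encoding in the output.

    import java.nio.charset.Charset;

    public class ShowCharset {
        public static void main(String[] args) {
            // The charset the JVM falls back to when none is given explicitly.
            System.out.println("defaultCharset = " + Charset.defaultCharset());
            System.out.println("file.encoding  = " + System.getProperty("file.encoding"));
        }
    }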
So I have an application written in JavaFX 2.2 that has been packaged for Linux, Mac, and Windows. I am getting a strange issue with some of the text fields, though. The application reads a file and populates some labels based on what's found in the file. When run on Ubuntu or Mac, the special accent character over the c is rendered just fine. However, on Windows it shows up garbled. Any idea as to why this is happening? I was a bit confused as it is the exact same application on all three. Thanks.
Make sure to specify character encoding when reading the file, in order to avoid using the platform's default encoding, which varies between operating systems. Just by coincidence, the default on Linux and Mac happens to match the file encoding and produces correct output, but you should not rely on it.
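A minimal sketch of what that looks like, assuming the file is UTF-8 (the path labels.txt is a placeholder):

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Paths;

    public class ReadWithCharset {
        public static void main(String[] args) throws IOException {
            // Name the charset explicitly so the result is the same on Windows, Linux and Mac.
            try (BufferedReader reader = Files.newBufferedReader(
                    Paths.get("labels.txt"), StandardCharsets.UTF_8)) {
                reader.lines().forEach(System.out::println);
            }
        }
    }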
I'm building a web application that helps people improve their English pronunciation of certain words. The website displays a sentence to the user, who speaks it and then presses the "Results" button. The web application then sends two files to the server: a .txt and a .wav file.
The server (which runs Linux (Ubuntu)) should take the files, do some analysis and calculations, and then print the results to a file called "Results.txt". The web application (which is PHP based) should then read the results from that file and display them to the user.
The problem is: I'm not sure of the best way to handle the communication between the web application and the Linux server. So far, I have succeeded in writing the .txt and .wav files to the server, and I can build a Linux script that takes these two files and does the required calculations. What I'm stuck on is that I don't know how to properly and effectively start the script. More importantly: when the script is done, how do I know that I can safely read the results from the "Results.txt" file? I need a synchronization tool or method.
I asked some people, and they told me to use a Java application on the server side, but I'm not sure how to do it!
Any help, Please?? :)
First, you can do it with PHP. Using the shell_exec() function you can run your Linux scripts and also read their output. For example:
$output = shell_exec("ls -l > outputFile.txt");
will write the current directory listing to a file called outputFile.txt
Second, you can also do the same using Java. Use:
Runtime.getRuntime().exec("ls -l").waitFor();
Not calling waitFor() at the end will cause your shell script to run asynchronously. You can also read the stdout and stderr streams of your Linux script using the
Process.getInputStream()
and
Process.getErrorStream()
methods respectively. Hope that helps.
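Putting that together, a sketch of what the server-side call could look like; analyse.sh and the input file names are placeholders, and the waitFor() call is the synchronization point after which it should be safe to read Results.txt:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.nio.charset.StandardCharsets;

    public class RunAnalysis {
        public static void main(String[] args) throws Exception {
            // analyse.sh stands in for the script that processes the .txt and .wav files.
            ProcessBuilder pb = new ProcessBuilder("/bin/bash", "analyse.sh", "input.txt", "input.wav");
            pb.redirectErrorStream(true); // merge stderr into stdout for simpler reading
            Process process = pb.start();

            // Drain the script's output so the process cannot block on a full pipe.
            try (BufferedReader out = new BufferedReader(
                    new InputStreamReader(process.getInputStream(), StandardCharsets.UTF_8))) {
                out.lines().forEach(System.out::println);
            }

            int exitCode = process.waitFor(); // blocks until the script has finished
            System.out.println("script finished with exit code " + exitCode);
            // Only after waitFor() returns is it safe to read Results.txt.
        }
    }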
For example, one application that I'm working on stores PDF files in a database, then can pull them back out for display. I've got a call in there using Runtime.exec to do a "cmd /c start " plus the PDF filename. Works great on Windows. I'd prefer to find a platform-independent way to do this, though (trying to avoid OS detection with alternate methods for the various OSes), as we also run the software on Solaris and Mac.
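For context, the Windows-only call described above presumably looks something like the sketch below (the post doesn't show the exact code, and report.pdf is a placeholder for the file pulled from the database):

    import java.io.File;
    import java.io.IOException;

    public class OpenPdfWindowsOnly {
        public static void main(String[] args) throws IOException {
            File pdf = new File("report.pdf"); // placeholder for the file written out of the database
            // Hand the file to "start" via cmd, which only works on Windows.
            Runtime.getRuntime().exec("cmd /c start " + pdf.getAbsolutePath());
        }
    }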
Look at Desktop, which has an open method and is platform independent.
Launches the associated application to open the file.
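A short sketch of that approach, with report.pdf as a placeholder file name; the isDesktopSupported() check matters because Desktop isn't available in headless environments:

    import java.awt.Desktop;
    import java.io.File;
    import java.io.IOException;

    public class OpenWithDesktop {
        public static void main(String[] args) throws IOException {
            File pdf = new File("report.pdf"); // placeholder for the file pulled from the database

            // Desktop is not available everywhere (e.g. headless servers), so check first.
            if (Desktop.isDesktopSupported()
                    && Desktop.getDesktop().isSupported(Desktop.Action.OPEN)) {
                Desktop.getDesktop().open(pdf); // launches the platform's associated PDF viewer
            } else {
                System.err.println("Desktop.open is not supported on this platform");
            }
        }
    }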
I'd be interested to see if there is a 'correct' answer for this. If I were to do this, I'd have a properties file mapping each OS to the command to run, and then resolve the OS at runtime, along the lines of the sketch after the example below.
E.g. in a properties file:
windows=cmd /c start
mac=open #(I think)
linux=... etc
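A rough sketch of that idea, under the assumption that the mappings above live in a file named launcher.properties on the classpath (the file name, lookup keys, and report.pdf are made up for the example):

    import java.io.IOException;
    import java.io.InputStream;
    import java.util.Locale;
    import java.util.Properties;

    public class PlatformLauncher {
        public static void main(String[] args) throws IOException {
            // launcher.properties holds the per-OS commands shown above (windows=..., mac=..., linux=...).
            Properties commands = new Properties();
            try (InputStream in = PlatformLauncher.class.getResourceAsStream("/launcher.properties")) {
                commands.load(in);
            }

            // Resolve the current OS to one of the property keys.
            String osName = System.getProperty("os.name").toLowerCase(Locale.ROOT);
            String key = osName.contains("win") ? "windows"
                    : osName.contains("mac") ? "mac" : "linux";

            String file = "report.pdf"; // placeholder for the document to open
            String[] command = (commands.getProperty(key) + " " + file).split(" ");
            Runtime.getRuntime().exec(command);
        }
    }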