Java application does not work as jar [duplicate] - java

I have a program developed in Java with NetBeans. It has a text pane that takes text written in a non-English language and performs some operations, including save, open, and new.
The program was complete and worked flawlessly when I ran it from NetBeans. But when I go to the dist folder and run the jar (which is supposed to be the executable), it runs fine until I open a previously saved file in the editor, which then shows mysterious characters.
Like this:
লিখ "The original inputs are" << নতুন_লাইন;
চলবে(সংখ্যা প=০;প<যতটা;প++)
becomes
লিখ "The original inputs are" << নত�ন_লাইন;
চলবে(সংখ�যা প=০;প<যতটা;প++)
One more interesting thing: if I type in the editor, it also works fine (no font problem).
I am using these two functions to read from and write to a file:
public void writeToFile(String data, String address)
{
    try {
        // Create file
        FileWriter fstream = new FileWriter(address);
        BufferedWriter out = new BufferedWriter(fstream);
        out.write(data);
        //Close the output stream
        out.close();
    } catch (Exception e) { //Catch exception if any
        System.err.println("Error: " + e.getMessage());
    }
}

public String readFromFile(String fileName) {
    String output = "";
    try {
        File file = new File(fileName);
        FileReader reader = new FileReader(file);
        BufferedReader in = new BufferedReader(reader);
        String string;
        while ((string = in.readLine()) != null) {
            output = output + string + "\n";
        }
        in.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
    return output;
}
I have set the font of the text pane to Vrinda, which works from within the IDE as I mentioned.
Please help me identify what is wrong.
Is there something I need to do to publish a JAR when native language support is required?

Try changing your reading logic to use InputStreamReader, which allows setting the encoding:
InputStreamReader inputStreamReader =
        new InputStreamReader(new FileInputStream(file), "UTF-8");
Also change your writing logic to use OutputStreamWriter, which allows setting the encoding:
OutputStreamWriter outputStreamWriter =
        new OutputStreamWriter(new FileOutputStream(file), "UTF-8");

The root problem is that your current application is reading the file using the "platform default" character set / character encoding. This is obviously different when you are running from the command line and from NetBeans. In the former case, it depends on the locale settings of the host OS or the current shell ... depending on your platform. In NetBeans, it seems to default to UTF-8.
@Andrey Adamovich's answer explains how to specify a character encoding when opening a file with a file reader, or when adapting a byte stream using an input stream reader.
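Putting that together with the asker's own code, the two methods could be rewritten roughly as follows. This is a minimal sketch, assuming Java 7 or later and that the files should always be stored as UTF-8; the method names are kept from the question:
import java.io.*;
import java.nio.charset.StandardCharsets;

public void writeToFile(String data, String address) {
    // OutputStreamWriter pins the encoding instead of relying on the platform default
    try (BufferedWriter out = new BufferedWriter(
            new OutputStreamWriter(new FileOutputStream(address), StandardCharsets.UTF_8))) {
        out.write(data);
    } catch (IOException e) {
        System.err.println("Error: " + e.getMessage());
    }
}

public String readFromFile(String fileName) {
    StringBuilder output = new StringBuilder();
    // InputStreamReader with an explicit charset decodes the bytes the same way they were written
    try (BufferedReader in = new BufferedReader(
            new InputStreamReader(new FileInputStream(fileName), StandardCharsets.UTF_8))) {
        String line;
        while ((line = in.readLine()) != null) {
            output.append(line).append("\n");
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
    return output.toString();
}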

Related

SonarQube issue with New FileReader

The code snippet below gives a Sonar comment with the following squid rule: squid:S1943
try (BufferedReader reader = new BufferedReader(
        new FileReader(properties.get(FILE_BASED_CONFIGURATION).toString())))
{
    //some code
}
catch (IOException | ArrayIndexOutOfBoundsException e)
{
    LOG.error("Exception while reading from File", e);
    //customerInfo.clear();
}
[SONAR] MAJOR: Remove this use of constructor "FileReader(String)"
The issue here is that you haven't specified an encoding for the file, which means that the file will be read with your system's default encoding. This means that the behaviour of the code could vary from system to system.
You should explicitly state the file's encoding, for example,
new InputStreamReader(
new FileInputStream(
properties.get(FILE_BASED_CONFIGURATION).toString()), "UTF-8")
This reads the file with a FileInputStream (which reads bytes from a file), then wraps this in an InputStreamReader which converts those bytes to characters using the stated encoding.
This is due to the use of the default system encoding by FileReader, which is generally bad. You should use:
try (BufferedReader reader = new BufferedReader(new InputStreamReader(
        new FileInputStream(properties.get(FILE_BASED_CONFIGURATION).toString()), encoding)))
{
    //some code
}
catch (IOException | ArrayIndexOutOfBoundsException e)
{
    LOG.error("Exception while reading from File", e);
    //customerInfo.clear();
}
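On Java 7 or later, a slightly shorter alternative that should also satisfy the rule, since the charset is explicit, is Files.newBufferedReader. A sketch reusing the names from the question:
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

// properties, FILE_BASED_CONFIGURATION and LOG are the ones from the snippet above
try (BufferedReader reader = Files.newBufferedReader(
        Paths.get(properties.get(FILE_BASED_CONFIGURATION).toString()),
        StandardCharsets.UTF_8)) {
    //some code
} catch (IOException | ArrayIndexOutOfBoundsException e) {
    LOG.error("Exception while reading from File", e);
}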

trying to write strings line by line into a file

The following code writes an array into the file, but my problem is that it writes everything onto one line, when instead I need each entry on a new line, and I can't figure out how to make this part of the code work. I tried adding a newline as you would for strings, but I'm assuming this is not the correct way, as it doesn't work.
private class SaveButtonListener implements ActionListener
{
    public void actionPerformed(ActionEvent event)
    {
        String[] data = dataSource.getList();
        JFileChooser chooser = new JFileChooser();
        chooser.setCurrentDirectory(new File("/home/me/Documents"));
        int retrival = chooser.showSaveDialog(null);
        if (retrival == JFileChooser.APPROVE_OPTION) {
            try {
                FileWriter fw = new FileWriter(chooser.getSelectedFile() + ".txt");
                for (int i = 0; i < data.length; i++)
                {
                    fw.write(data[i] + " \n");
                }
                //fw.write(data.toString());
                fw.close();
            } catch (Exception ex) {
                ex.printStackTrace();
            }
        }
    }
}
Have you tried
fw.write(data[i] + System.getProperty("line.separator"));
Technically it is writing newlines, but you are likely viewing the text file with a text editor that doesn't recognize a bare \n (Notepad, for instance).
Try:
fw.write(data[i] + " \r\n");
You can use a PrintWriter to easily print individual lines
PrintWriter fw = new PrintWriter(new FileWriter(chooser.getSelectedFile() + ".txt"));
for (int i = 0; i < data.length; i++) {
    fw.println(data[i]);
}
fw.close();
Note: you should put the close in a finally block, as otherwise there are situations where close is never called. In fact, you should use the Java 7 try-with-resources syntax:
try (PrintWriter fw = new PrintWriter(new FileWriter(chooser.getSelectedFile() + ".txt"))) {
    // stuff
} catch (IOException ex) {
    ex.printStackTrace();
}
Note that this will output platform-specific line breaks, so \r\n on Windows and \n on Unix. One problem that you will run into time and time again when creating files is opening a file created on one platform on another platform.
FileWriter fw = new FileWriter(chooser.getSelectedFile()+".txt");
chooser.getSelectedFile() should already return a file name with an extension, so appending ".txt" to it doesn't make much sense to me.
File writing might be a lengthy operation, so you should run it on a separate thread to avoid blocking the EDT, the thread on which Swing does its event management and GUI rendering. Otherwise your application might freeze (see the sketch below).
You might be interested in FileWriter(File file, boolean append) for appending your data to an existing file.
Try using System.getProperty("line.separator") to get the correct newline separator for your platform.
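Putting the threading and line-separator tips together, here is a rough sketch of what the save could look like. It reuses data and chooser from the listener in the question and assumes the usual imports (javax.swing.SwingWorker, java.io.FileWriter, java.io.PrintWriter):
// Sketch only: write the lines off the EDT, one per platform-specific line break.
new SwingWorker<Void, Void>() {
    @Override
    protected Void doInBackground() throws Exception {
        try (PrintWriter out = new PrintWriter(
                new FileWriter(chooser.getSelectedFile() + ".txt"))) {
            for (String line : data) {
                out.println(line); // println appends System.lineSeparator()
            }
        }
        return null;
    }
}.execute();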

Write a file in UTF-8 using FileWriter (Java)?

I have the following code; however, I want it to write a UTF-8 file to handle foreign characters. Is there a way of doing this? Is there some parameter I need to pass?
I would really appreciate your help with this. Thanks.
try {
    BufferedReader reader = new BufferedReader(new FileReader("C:/Users/Jess/My Documents/actresses.list"));
    writer = new BufferedWriter(new FileWriter("C:/Users/Jess/My Documents/actressesFormatted.csv"));
    while ((line = reader.readLine()) != null) {
        //If the line starts with a tab then we just want to add a movie
        //using the current actor's name.
        if (line.length() == 0)
            continue;
        else if (line.charAt(0) == '\t') {
            readMovieLine2(0, line, surname.toString(), forename.toString());
        } //Else we've reached a new actor
        else {
            readActorName(line);
        }
    }
} catch (IOException e) {
    e.printStackTrace();
}
Safe Encoding Constructors
Getting Java to properly notify you of encoding errors is tricky. You must use the most verbose and, alas, the least used of the four alternate constructors for each of InputStreamReader and OutputStreamWriter to receive a proper exception on an encoding glitch.
For file I/O, always make sure to use as the second argument to both OutputStreamWriter and InputStreamReader the fancy encoder argument:
Charset.forName("UTF-8").newEncoder()
There are other even fancier possibilities, but none of the three simpler possibilities work for exception handling. These do:
OutputStreamWriter char_output = new OutputStreamWriter(
    new FileOutputStream("some_output.utf8"),
    Charset.forName("UTF-8").newEncoder()
);
InputStreamReader char_input = new InputStreamReader(
    new FileInputStream("some_input.utf8"),
    Charset.forName("UTF-8").newDecoder()
);
As for running with
$ java -Dfile.encoding=utf8 SomeTrulyRemarkablyLongcLassNameGoeShere
The problem is that this will not use the full encoder argument form for the character streams, and so you will again miss encoding problems.
Longer Example
Here’s a longer example, this one managing a process instead of a file, where we promote two different input byte streams and one output byte stream all to UTF-8 character streams with full exception handling:
// this runs a perl script with UTF-8 STD{IN,OUT,ERR} streams
Process
slave_process = Runtime.getRuntime().exec("perl -CS script args");
// fetch his stdin byte stream...
OutputStream
__bytes_into_his_stdin = slave_process.getOutputStream();
// and make a character stream with exceptions on encoding errors
OutputStreamWriter
chars_into_his_stdin = new OutputStreamWriter(
__bytes_into_his_stdin,
/* DO NOT OMIT! */ Charset.forName("UTF-8").newEncoder()
);
// fetch his stdout byte stream...
InputStream
__bytes_from_his_stdout = slave_process.getInputStream();
// and make a character stream with exceptions on encoding errors
InputStreamReader
chars_from_his_stdout = new InputStreamReader(
__bytes_from_his_stdout,
/* DO NOT OMIT! */ Charset.forName("UTF-8").newDecoder()
);
// fetch his stderr byte stream...
InputStream
__bytes_from_his_stderr = slave_process.getErrorStream();
// and make a character stream with exceptions on encoding errors
InputStreamReader
chars_from_his_stderr = new InputStreamReader(
__bytes_from_his_stderr,
/* DO NOT OMIT! */ Charset.forName("UTF-8").newDecoder()
);
Now you have three character streams that all raise exceptions on encoding errors, respectively called chars_into_his_stdin, chars_from_his_stdout, and chars_from_his_stderr.
This is only slightly more complicated than what you need for your problem, whose solution I gave in the first half of this answer. The key point is that this is the only way to detect encoding errors.
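For completeness, here is a sketch of how the reading side might consume one of those streams and surface an encoding error. MalformedInputException (from java.nio.charset) is what a REPORT-mode decoder throws; since it is an IOException subclass, catch it before the general case:
try (BufferedReader lines = new BufferedReader(chars_from_his_stdout)) {
    String line;
    while ((line = lines.readLine()) != null) {
        System.out.println(line);
    }
} catch (MalformedInputException e) {
    // the child process did not emit valid UTF-8
    System.err.println("Encoding error in child output: " + e);
} catch (IOException e) {
    e.printStackTrace();
}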
Just don’t get me started about PrintStreams eating exceptions.
Ditch FileWriter and FileReader, which are useless exactly because they do not allow you to specify the encoding. Instead, use
new OutputStreamWriter(new FileOutputStream(file), StandardCharsets.UTF_8)
and
new InputStreamReader(new FileInputStream(file), StandardCharsets.UTF_8);
You need to use the OutputStreamWriter class as the writer parameter for your BufferedWriter. It does accept an encoding. Review the javadocs for it.
Somewhat like this:
BufferedWriter out = new BufferedWriter(new OutputStreamWriter(
new FileOutputStream("jedis.txt"), "UTF-8"
));
Or you can set the default system encoding to UTF-8 with the system property file.encoding:
java -Dfile.encoding=UTF-8 com.jediacademy.Runner arg1 arg2 ...
You may also set it as a system property at runtime with System.setProperty(...) if you only need it for this specific file, but in a case like this I think I would prefer the OutputStreamWriter.
By setting the system property you can use FileWriter and expect that it will use UTF-8 as the default encoding for your files, in that case for all the files that you read and write.
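If you go that route, it is worth double-checking which default the JVM actually picked up (and note that from Java 18 onwards the default is UTF-8 regardless). A quick, throwaway check, with a made-up class name:
import java.nio.charset.Charset;

public class DefaultCharsetCheck {
    public static void main(String[] args) {
        // This is the charset FileWriter/FileReader use when no encoding is given.
        System.out.println(Charset.defaultCharset());
    }
}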
EDIT
Starting from API 19, you can replace the String "UTF-8" with StandardCharsets.UTF_8
As suggested in the comments below by tchrist, if you intend to detect encoding errors in your file you would be forced to use the OutputStreamWriter approach and use the constructor that receives a charset encoder.
Somewhat like
CharsetEncoder encoder = Charset.forName("UTF-8").newEncoder();
encoder.onMalformedInput(CodingErrorAction.REPORT);
encoder.onUnmappableCharacter(CodingErrorAction.REPORT);
BufferedWriter out = new BufferedWriter(new OutputStreamWriter(new FileOutputStream("jedis.txt"),encoder));
You may choose between actions IGNORE | REPLACE | REPORT
Since Java 11 you can do:
FileWriter fw = new FileWriter("filename.txt", Charset.forName("utf-8"));
Since Java 7 there is an easy way to handle the character encoding of BufferedWriters and BufferedReaders. You can create a BufferedWriter directly by using the Files class instead of creating various instances of Writer. You can simply create a BufferedWriter that considers character encoding by calling:
Files.newBufferedWriter(file.toPath(), StandardCharsets.UTF_8);
You can find more about it in JavaDoc:
Files class
Files#newBufferedWriter
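For reference, a small self-contained sketch of that approach; the file name is just a placeholder:
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

public class NewBufferedWriterExample {
    public static void main(String[] args) {
        try (BufferedWriter writer = Files.newBufferedWriter(
                Paths.get("output.txt"), StandardCharsets.UTF_8)) {
            writer.write("non-ASCII text: àéîöü"); // round-trips cleanly as UTF-8
            writer.newLine();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}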
With Chinese text, I tried to use the charset UTF-16 and luckily it worked.
Hope this helps!
PrintWriter out = new PrintWriter( file, "UTF-16" );
OK it's 2019 now, and from Java 11 you have a constructor with Charset:
FileWriter​(String fileName, Charset charset)
Unfortunately, we still cannot modify the byte buffer size, and it's
set to 8192. (https://www.baeldung.com/java-filewriter)
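If the internal buffer size ever matters, one workaround (just a sketch using plain JDK classes) is to skip FileWriter and size the BufferedWriter yourself:
// 32 KiB buffer chosen arbitrarily for illustration
BufferedWriter out = new BufferedWriter(
        new OutputStreamWriter(new FileOutputStream("filename.txt"), StandardCharsets.UTF_8),
        32 * 1024);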
Use an OutputStream instead of FileWriter to set the encoding type:
// file is your File object where you want to write you data
OutputStream outputStream = new FileOutputStream(file);
OutputStreamWriter outputStreamWriter = new OutputStreamWriter(outputStream, "UTF-8");
outputStreamWriter.write(json); // json is your data
outputStreamWriter.flush();
outputStreamWriter.close();
In my opinion, if you want to write UTF-8, you should turn the string into a byte array. Then you can do something like the following:
byte[] by = ("<?xml version=\"1.0\" encoding=\"utf-8\"?>" + "Your string").getBytes(StandardCharsets.UTF_8);
Then you can write each byte into the file you created.
Example:
OutputStream f = new FileOutputStream(xmlfile);
byte[] by = ("<?xml version=\"1.0\" encoding=\"utf-8\"?>" + "Your string").getBytes(StandardCharsets.UTF_8);
for (int i = 0; i < by.length; i++) {
    byte b = by[i];
    f.write(b);
}
f.close();

How to write to Standard Output using BufferedWriter

I am currently writing an application that produces several log files using BufferedWriter. While debugging, however, I want to write to System.out instead of a file. I figured I could change from:
log = new BufferedWriter(new FileWriter(tokenizerLog));
to:
BufferedWriter log = new BufferedWriter(new OutputStreamWriter(System.out));
log.write("Log output\n");
as opposed to:
System.out.println("log output")
The OutputStreamWriter option has not been working, though. How do I change just the object inside the BufferedWriter constructor to redirect from a file to standard out? Because I have several log files I will be writing to, using System.out everywhere and then changing the output to a file isn't really an option.
Your approach does work; you are just forgetting to flush the output:
try {
    BufferedWriter log = new BufferedWriter(new OutputStreamWriter(System.out));
    log.write("This will be printed on stdout!\n");
    log.flush();
}
catch (Exception e) {
    e.printStackTrace();
}
Both OutputStreamWriter and PrintWriter are Writer instances, so you can just do something like:
BufferedWriter log;

Writer openForFile(String fileName) throws FileNotFoundException {
    if (fileName != null)
        return new PrintWriter(fileName);
    else
        return new OutputStreamWriter(System.out);
}

log = new BufferedWriter(openForFile(null));        // stdout
log = new BufferedWriter(openForFile("mylog.log")); // using a file
...or whatever; it is just to give you the idea.
Since you mention that this is for logging, you might want to look at using a logger library like log4j. It'll let you change the log destination (either log file or console) by making changes in configuration files only.
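For example, with log4j 2 on the classpath the code only asks for a logger, and the destination (console, file, or both) is decided in the configuration file, conventionally log4j2.xml. The class name below is made up for the sketch:
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

public class Tokenizer {
    private static final Logger LOG = LogManager.getLogger(Tokenizer.class);

    public void process(String token) {
        // Whether this ends up on the console, in a file, or both is pure configuration.
        LOG.debug("Tokenized: {}", token);
    }
}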
