The code snippet below produces a Sonar warning for the following squid rule: squid:S1943
try (BufferedReader reader = new BufferedReader(
        new FileReader(properties.get(FILE_BASED_CONFIGURATION).toString())))
{
    // some code
}
catch (IOException | ArrayIndexOutOfBoundsException e)
{
    LOG.error("Exception while reading from File", e);
    //customerInfo.clear();
}
[SONAR] MAJOR: Remove this use of constructor "FileReader(String)"
The issue is that you haven't specified an encoding for the file, so it will be read with your system's default encoding, and the behaviour of the code can therefore vary from system to system.
You should explicitly state the file's encoding, for example:
new InputStreamReader(
new FileInputStream(
properties.get(FILE_BASED_CONFIGURATION).toString()), "UTF-8")
This reads the file with a FileInputStream (which reads bytes from a file), then wraps this in an InputStreamReader which converts those bytes to characters using the stated encoding.
This is because FileReader uses the default system encoding, which is generally a bad idea. You should use:
try (BufferedReader reader = new BufferedReader(new InputStreamReader(
        new FileInputStream(properties.get(FILE_BASED_CONFIGURATION).toString()), encoding)))
{
    // some code
}
catch (IOException | ArrayIndexOutOfBoundsException e)
{
    LOG.error("Exception while reading from File", e);
    //customerInfo.clear();
}
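On Java 7 and later there is also the java.nio route, which does the same wrapping in one call. This is only a minimal sketch, assuming the property value is a plain file path and that UTF-8 is the encoding you actually want:
// Sketch only: Files.newBufferedReader (java.nio.file) opens the file with an
// explicit charset instead of the platform default.
try (BufferedReader reader = Files.newBufferedReader(
        Paths.get(properties.get(FILE_BASED_CONFIGURATION).toString()),
        StandardCharsets.UTF_8))
{
    // some code
}
catch (IOException | ArrayIndexOutOfBoundsException e)
{
    LOG.error("Exception while reading from File", e);
}
Either form satisfies the Sonar rule, because the charset is stated explicitly rather than inherited from the platform.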
Related
I have a program developed in Java with NetBeans. It has a text pane that takes text written in a non-English language and does some operations, including save, open, and new.
The program was complete and worked flawlessly when I ran it from NetBeans. But when I go to the dist folder and run the JAR (which is supposed to be the executable), it runs fine, except that when I open a previously saved file in the editor it shows garbled characters,
like this:
লিখ "The original inputs are" << নতুন_লাইন;
চলবে(সংখ্যা প=০;প<যতটা;প++)
becomes
লিখ "The original inputs are" << নত�ন_লাইন;
চলবে(সংখ�যা প=০;প<যতটা;প++)
One more interesting thing: if I type in the editor, it also works fine (no font problem).
I am using these two functions to read from and write to the file:
public void writeToFile(String data, String address)
{
    try {
        // Create file
        FileWriter fstream = new FileWriter(address);
        BufferedWriter out = new BufferedWriter(fstream);
        out.write(data);
        // Close the output stream
        out.close();
    } catch (Exception e) { // Catch exception if any
        System.err.println("Error: " + e.getMessage());
    }
}
public String readFromFile(String fileName) {
    String output = "";
    try {
        File file = new File(fileName);
        FileReader reader = new FileReader(file);
        BufferedReader in = new BufferedReader(reader);
        String string;
        while ((string = in.readLine()) != null) {
            output = output + string + "\n";
        }
        in.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
    return output;
}
I have set the font of the text pane to Vrinda, which works from within the IDE as I mentioned.
Please help me identify what is wrong.
Is there something I need to do when publishing a JAR that requires non-English text support?
Try changing your reading logic to use InputStreamReader, which allows setting the encoding:
InputStreamReader inputStreamReader =
new InputStreamReader(new FileInputStream (file), "UTF-8" );
Also change your writing logic to use OutputStreamWriter, which allows setting the encoding:
OutputStreamWriter outputStreamWriter =
new OutputStreamWriter(new FileOutputStream (file), "UTF-8" );
The root problem is that your current application is reading the file using the "platform default" character set / character encoding. This is obviously different when you are running from the command line and from NetBeans. In the former case, it depends on the locale settings of the host OS or the current shell, depending on your platform. In NetBeans, it seems to default to UTF-8.
@Andrey Adamovich's answer explains how to specify a character encoding when opening a file with a file reader, or when adapting a byte stream with an input stream reader.
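Putting the two together, a minimal sketch of the question's two helper methods with an explicit charset could look like this (assuming the files are meant to be stored as UTF-8; the method names are kept from the question):
// Sketch: same helpers as in the question, but with an explicit UTF-8 charset
// so the behaviour no longer depends on the platform default encoding.
public void writeToFile(String data, String address) {
    try (BufferedWriter out = new BufferedWriter(
            new OutputStreamWriter(new FileOutputStream(address), "UTF-8"))) {
        out.write(data);
    } catch (IOException e) {
        System.err.println("Error: " + e.getMessage());
    }
}

public String readFromFile(String fileName) {
    StringBuilder output = new StringBuilder();
    try (BufferedReader in = new BufferedReader(
            new InputStreamReader(new FileInputStream(fileName), "UTF-8"))) {
        String line;
        while ((line = in.readLine()) != null) {
            output.append(line).append("\n");
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
    return output.toString();
}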
Java code that I've been working on in Windows worked perfectly, but when I tried to run it in Linux it didn't work (i.e. it didn't create the file and therefore didn't write). These are the functions I'm using:
BufferedWriter writer = null;
String directory = "folder/";
java.io.File directory1 = new File(directory + "resultado");
String directory2;
directory1.mkdirs();
directory2 = directory + "resultado/";
try {
    writer = new BufferedWriter(new OutputStreamWriter(
            new FileOutputStream(directory2 + "resultado.txt"), "utf-8"));
    writer.write("something");
    writer.newLine();
} catch (IOException ex) {
    System.out.println("ERRORR!!!!");
    ex.printStackTrace();
    // report
} finally {
    try {
        writer.close();
    } catch (Exception ex) {
        // ignore
    }
}
Even though I have the catch for IOException to print "ERROR", it gives me this error:
Exception in thread "main" java.lang.NullPointerException
at memoria.bosques.imprimirenarchivos(bosques.java:17281)
at memoria.bosques.main2(bosques.java:18096)
at memoria.bosques.main(bosques.java:18139)
The directory is created, but it seems the code doesn't create a file to write to. What can I do?
I suspect writer is null in your finally block, because you got a prior exception, which you didn't tell us about. Either test it for null before closing, or use try-with-resources.
And when you get an exception, don't just print out "ERROR!!!!". It's useless. Print the exception.
And when you call a method like mkdirs() that returns a result, don't ignore it.
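A minimal sketch of what that could look like for the code in the question, using try-with-resources so the writer is closed (and flushed) automatically and there is no null-prone finally block; the paths are taken from the question, and the mkdirs() check is an assumption about how you want to report failure:
// Sketch: create the directory, check the result, and let try-with-resources
// close and flush the writer.
File dir = new File("folder/resultado");
if (!dir.mkdirs() && !dir.isDirectory()) {
    System.err.println("Could not create directory: " + dir);
}
try (BufferedWriter writer = new BufferedWriter(new OutputStreamWriter(
        new FileOutputStream(new File(dir, "resultado.txt")), "utf-8"))) {
    writer.write("something");
    writer.newLine();
} catch (IOException ex) {
    ex.printStackTrace(); // print the actual exception instead of just "ERROR"
}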
If I use the following code:
try {
    writer = new BufferedWriter(new OutputStreamWriter(
            new FileOutputStream(fileName), "utf-8"));
    writer.write("<title>");
} catch (IOException e) {
    throw new RuntimeException(e);
} finally {
    try {
        writer.close();
    } catch (Exception e) {
        throw new RuntimeException(e);
    }
}
nothing shows up in the file, but if I remove the "<" and try to output "title>", it works fine. How can I get around this?
The problem isn't in your code, it's in the viewer (editor) that you're using to view the output. Instead of showing you the plain text, it's interpreting the data in the file and showing you its interpretation. Use a plain editor such as Notepad or vi to see what is actually in the file.
Try flushing your writer after you write:
writer.write("<title>");
writer.flush();
I am currently having problems writing to the text file in my code. Every part of the program is reached and everything prints out to the console, with no errors, but the file is empty. Any suggestions?
public textFiles(String filePath)
{
    File file = new File(filePath);
    try {
        fstream = new FileWriter(filePath, true);
    } catch (Exception e) { // Catch exception if any
        System.err.println("Error: " + e.getMessage());
    }
    out = new BufferedWriter(fstream);
    System.out.println("try");
    addToText("WOOOOHOOO");
    System.out.println(file.exists());
}

public void addToText(String Line)
{
    try {
        out.write(Line);
        out.newLine();
    } catch (IOException e) {
        System.err.println("writing Error");
    }
    System.out.println("SHOULDA F****** WORKED");
}
You're never closing the stream, and so probably never flushing it either - the text essentially gets buffered when you write it, and gets flushed to the file in chunks (usually chunks that are much bigger than what you're writing, hence the lack of output).
Make sure you close the stream when you're done (out.close(); closing the BufferedWriter flushes it through to the underlying FileWriter), and it should work fine, since the stream automatically flushes any buffered output when it's closed.
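Since the writer in this class is a field rather than a local variable, the simplest fix is to give the class its own close method and call it once all writing is finished. This is only a sketch; closeFile() is a hypothetical name, not something from the original code:
// Sketch: hypothetical closeFile() method; closing the BufferedWriter flushes
// the buffered text to disk and closes the underlying FileWriter.
public void closeFile() {
    try {
        if (out != null) {
            out.close();
        }
    } catch (IOException e) {
        System.err.println("closing Error");
    }
}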
Try this code to write a .txt file on any drive:
try
{
    String ss = "html file write in java";
    File file = new File("F:\\inputfile\\aa.txt");
    FileWriter fwhn = new FileWriter(file);
    fwhn.write(ss);
    fwhn.flush();
    fwhn.close();
}
catch (Exception ex)
{
    ex.printStackTrace();
}