Why is the other part not printed on the next line? - java

From the following code:
import java.io.*;

class fileTester {
    public static void main(String args[]) throws IOException {
        String string = "Suhail" + "\n" + "gupta";
        FileOutputStream fos = new FileOutputStream(new File("break.txt"));
        byte[] data = string.getBytes();
        fos.write(data);
        fos.close();
    }
}
I expected the output to be:
Suhail
Gupta
in the created file (i.e. both strings on separate lines), but the output is on a single line: Suhail gupta
Why is that, when I have used the \n character between the two Strings?

You shouldn't hard-code the new line character when writing to a file. Use the OS-specific newline String instead:
String newline = System.getProperty("line.separator");
Also, rather than use a FileOutputStream to write raw bytes to a text file, why not wrap it in a PrintStream object so you can easily just use println(...) to do your newlines for you?
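For instance, a minimal sketch of that approach, reusing the file name from the question:
PrintStream out = new PrintStream(new FileOutputStream(new File("break.txt")));
out.println("Suhail"); // println() appends the platform line separator for you
out.println("gupta");
out.close();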

I guess you are using Notepad to view the file.
The end-of-line character varies from system to system. A more advanced text editor (e.g. Notepad++) will show it correctly, because it tries to detect which system the file was prepared for.
Usually, instead of always using "\n", use
java.lang.System.getProperties().get("line.separator")

If your operating system is Windows, then you have to use \r\n for a new line; \n alone won't work on Windows. You can find more details here

This is because on Windows the newline is \r\n. On other OSes, \n will be fine

When you need a new line, the best practice is to use the system newline string by inserting System.getProperty("line.separator") where you want a line break.
That way, it will use the right newline for the platform you are creating the file on (Windows/Mac/Linux).
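Applied to the code in the question, that would look roughly like this:
String newline = System.getProperty("line.separator");
String string = "Suhail" + newline + "gupta";
FileOutputStream fos = new FileOutputStream(new File("break.txt"));
fos.write(string.getBytes());
fos.close();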

Related

Format of the output

I am using Eclipse to run my program. My program gives 1000 lines of output, and I write the output to a text file successfully. The problem is that the output in the text file is not the same as on the console. On the console there are separate lines, but in the text file all the lines are appended as one line.
How to get the same console format in a text file?
You will have to make sure of the following:
When writing a line to the file, you are including the line separator character(s); you can get a platform-independent line separator using the following:
System.getProperty("line.separator");
When viewing the text file, some apps (like Notepad) may not display newline characters the same way as others
The app you are using to view the file will need to be set to view in a monospaced font (such as Courier New)
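To illustrate the first point, here is a rough sketch of the write loop, assuming the 1000 output lines have been collected in a (hypothetical) List<String> called lines and that "output.txt" stands in for your real file name:
String sep = System.getProperty("line.separator");
BufferedWriter out = new BufferedWriter(new FileWriter("output.txt"));
for (String line : lines) {
    out.write(line);
    out.write(sep); // explicit separator, so the file breaks lines the way the console does
}
out.close();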
Completely guessing what you are doing, but I think you need to do this:
BufferedWriter bw = new BufferedWriter(new FileWriter(f, false));
while (rs.next()) {
    // code to write a line.
    bw.write("\r\n");
}
Use
bw.write("\r\n");
instead of
bw.newLine();
This is for Windows systems; POSIX systems handle newlines differently, I believe.
\n is a newline character; just remember that.
Well, if you are using a PrintWriter I would simply do:
PrintWriter pw = new PrintWriter(file);
while (...you still have data) {
    pw.println(<yourString>);
}
You can also append the string "\n" to create a new line manually.

Reading UTF-8 file and writing plain ANSI?

I have a UTF-8 file (it's a CSV).
I need to read this file line by line, do some replacements, and then write it line by line into another file.
BufferedWriter bw = new BufferedWriter(new OutputStreamWriter(
        new FileOutputStream(fileFix), "ASCII"));
bw.write(""); // clean current file
BufferedReader br = new BufferedReader(new InputStreamReader(
        new FileInputStream(file), "UTF-8"));
String line;
while ((line = br.readLine()) != null) {
    line = line.replace(";", ",");
    bw.append(line + "\n");
}
Simple as that.
The problem is that the output file (fileFix) is UTF-8 and I think it has the BOM character.
How can I write the file as plain ANSI without the BOM?
The error I am getting while reading my file with a piece of software (Weka):
The first line of this file:
Consider that Notepad++ tells me the charset is UTF-8. If I try to convert this file to plain ASCII (with Windows Notepad), those characters disappear.
Solution
When you are on the first line, run:
line = line.substring(1);
to remove any BOM char.
It sounds like this is a BOM issue rather than an encoding issue as such.
You can just remove any BOM characters as you write the file, with:
line = line.replace("\ufeff", "");
That leaves the question of whether you're reading the data accurately in the first place... I'd strongly advise you not to use FileWriter and FileReader at all - instead, use InputStreamReader and OutputStreamWriter, specifying the encoding explicitly for both of them. Set the reader encoding to UTF-8 (assuming the input file really is UTF-8), and set the writer encoding to whatever you want... but I'd recommend sticking with UTF-8, to be honest.
Also note that you should be closing your reader/writer in finally blocks, or using the try-with-resources statement if you're using Java 7.
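Putting those points together, a rough sketch of the loop from the question with explicit encodings and try-with-resources (assuming the input really is UTF-8, and keeping UTF-8 for the output as suggested above):
try (BufferedReader br = new BufferedReader(new InputStreamReader(
             new FileInputStream(file), "UTF-8"));
     BufferedWriter bw = new BufferedWriter(new OutputStreamWriter(
             new FileOutputStream(fileFix), "UTF-8"))) {
    String line;
    while ((line = br.readLine()) != null) {
        line = line.replace("\ufeff", ""); // drop any BOM character
        line = line.replace(";", ",");
        bw.write(line);
        bw.newLine();
    }
}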
Look at http://en.wikipedia.org/wiki/Byte_order_mark for the pattern to replace; it looks like EF BB BF rather than FE FF.
This solution is wrong; check Jon's answer instead.

Displaying special characters

I am running into issues when displaying special characters on the Windows console.
I have written the following code:
public static void main(String[] args) throws IOException {
    File newFile = new File("sampleInput.txt");
    File newOutFile = new File("sampleOutput.txt");
    FileReader read = new FileReader(newFile);
    FileWriter write = new FileWriter(newOutFile);
    PushbackReader reader = new PushbackReader(read);
    int c;
    while ((c = reader.read()) != -1) {
        write.write(c);
    }
    read.close();
    write.close();
}
The output file looks exactly like the input file, including the special characters; i.e. for the input file contents © Ø ŻƩ abcdefĦ, the output file contains exactly the same contents. But when I add the line System.out.printf("%c", (char) c), the contents on the console are: ÿþ© (containing more characters, but I am not able to copy-paste them here). I did read that the issue might be with the Windows console character set, but I am not able to figure out the fix for it.
Considering that the output medium could be anything in the future, I do not want to run into issues with Unicode character display for any type of output stream.
Can anyone please help me understand the issue and how I can fix it?
The Reader and Writer will use the platform default charset for transforming characters to bytes. In your environment that's apparently not a Unicode-compatible charset like UTF-8.
You need InputStreamReader and OutputStreamWriter wherein you can explicitly specify the charset.
Reader read = new InputStreamReader(new FileInputStream(newFile), "UTF-8");
Writer write = new OutputStreamWriter(new FileOutputStream(newOutFile), "UTF-8");
// ...
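For completeness, a sketch of the copy loop from the question using those wrappers, with try-with-resources taking care of closing:
try (Reader read = new InputStreamReader(new FileInputStream(newFile), "UTF-8");
     Writer write = new OutputStreamWriter(new FileOutputStream(newOutFile), "UTF-8")) {
    int c;
    while ((c = read.read()) != -1) {
        write.write(c);
    }
}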
Also, the console needs to be configured to use UTF-8 to display the characters. In for example Eclipse you can do that by Window > Preferences > General > Workspace > Text File Encoding.
In the command prompt console it's not possible to display those characters due to the lack of a font supporting them. You'd want to head towards a Swing-like UI console approach.
See also:
Unicode - How to get the characters right?
Instead of FileWriter try using OutputStreamWriter and specify the encoding of the output.

Newline in FileWriter

How do you produce a new line in FileWriter? It seems that "\n" does not work.
log = new FileWriter("c:\\" + s.getInetAddress().getHostAddress() + ".txt", true);
log.append("\n" + Long.toString(fileTransferTime));
log.close();
The file output of the code above is just a long string of numbers without the new line.
I'll take a wild guess that you're opening the .txt file in Notepad, which won't show newlines with just \n.
Have you tried using your system-specific newline character combination?
log.append(System.getProperty("line.separator") + Long.toString(fileTransferTime));
You should either encapsulate your FileWriter in a PrintWriter if your goal is to have formatted content (println() will help you), or use the system property line.separator to have a separator adapted to your operating system.
System.getProperty("line.separator")
Resources:
JavaDoc - PrintWriter
JavaDoc - Properties available on System.getProperty
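A minimal sketch of the first option, wrapping the FileWriter from the question in a PrintWriter:
PrintWriter log = new PrintWriter(new FileWriter(
        "c:\\" + s.getInetAddress().getHostAddress() + ".txt", true));
log.println(fileTransferTime); // println() appends the platform line separator
log.close();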
I'm using "\r\n" and it works great for me, even when opening the .txt document in Notepad ;)
Try changing \t to \n in the second line.

Newlines in string not writing out to file

I'm trying to write a program that manipulates Unicode strings read in from a file. I thought of two approaches: one where I read the whole file (newlines included), perform a couple of regex substitutions, and write it back out to another file; the other where I read the file line by line, match and substitute on the individual lines, and write them out. I haven't been able to test the first approach because the newlines in the string are not written as newlines to the file. Here is some example code to illustrate:
String output = "Hello\nthere!";
BufferedWriter oFile = new BufferedWriter(new OutputStreamWriter(
new FileOutputStream("test.txt"), "UTF-16"));
System.out.println(output);
oFile.write(output);
oFile.close();
The print statement outputs
Hello
there!
but the file contents are
Hellothere!
Why aren't my newlines being written to file?
You should try using
System.getProperty("line.separator")
Here is an untested example:
String output = String.format("Hello%sthere!", System.getProperty("line.separator"));
BufferedWriter oFile = new BufferedWriter(new OutputStreamWriter(
        new FileOutputStream("test.txt"), "UTF-16"));
System.out.println(output);
oFile.write(output);
oFile.close();
I haven't been able to test the first approach because the newlines in the string are not written as newlines to the file
Are you sure about that? Could you post some code that shows that specific fact?
Use System.getProperty("line.separator") to get the platform-specific newline.
Consider using a PrintWriter to get the println method known from e.g. System.out.
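For the UTF-16 example above, that could look roughly like this (untested sketch):
PrintWriter oFile = new PrintWriter(new OutputStreamWriter(
        new FileOutputStream("test.txt"), "UTF-16"));
oFile.println("Hello");
oFile.println("there!");
oFile.close();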
