Redirect OutputStream to a file - Java

I have to use a method from an API that writes a model:
model.write(OutputStream out);
At the moment I am doing this to see the output on my console:
model.write(System.out);
However, I would like to specify a file path so the output is written to a file.
Any recommendations on how I could take this OutputStream and turn it into a FileOutputStream that writes to a file?
I appreciate your answers!

model.write(new BufferedOutputStream(new FileOutputStream(file)));
This is for binary data. If text is wanted (as the use of System.out suggests), check whether there is a
model.write(Writer out);
Then use a Writer to convert Java text (Unicode) into binary data (bytes) in some encoding.
You may also omit the encoding to fall back to the default platform encoding, i.e. for a file that stays on the local machine.
String encoding = "UTF-8";
model.write(new BufferedWriter(new OutputStreamWriter(
        new FileOutputStream(file), encoding)));

A FileOutputStream is an OutputStream, so you can simply do
model.write(new FileOutputStream("filename"));
Note that you will have to add some exception handling (the FileOutputStream constructor can throw a FileNotFoundException).
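On Java 7+ a try-with-resources block takes care of both the exception handling and closing the stream. A minimal sketch, assuming the model object comes from your API and using a hypothetical file name:
// "model.out" is a hypothetical file name; model is assumed to be provided by the API
try (OutputStream out = new BufferedOutputStream(new FileOutputStream("model.out"))) {
    model.write(out);
} catch (FileNotFoundException e) {
    System.err.println("Could not open the file: " + e.getMessage());
} catch (IOException e) {
    System.err.println("Error while writing: " + e.getMessage());
}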

Related

java Files.readAllBytes(image.png) doesn't work

I was trying to read from one file and then write to another file. I use the code below to do so.
byte[] bytes = Files.readAllBytes(file1);
Writer writer = new BufferedWriter(new OutputStreamWriter(new FileOutputStream(file2), "UTF-8"));
for (int i = 0; i < bytes.length; i++)
    writer.write(bytes[i]);
writer.close();
But when I change file1 to picture.png and file2 to picture2.png, this method doesn't work and I can't open picture2.png using image viewer.
What have I done wrong?
Writers are for writing text, possibly in different encodings (i.e. UTF-8, UTF-16, etc.). For writing raw bytes, don't use writers. Just use (File)OutputStreams.
It is truly as simple as
byte[] bytes = ...;
FileOutputStream fos = ...;
fos.write(bytes);
The other answers explain why what you have potentially fails.
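A complete version of that sketch, using the file names from the question, might look like this (a minimal example, not tuned for large files):
// Read all bytes from the source and write them unchanged to the target;
// no Reader/Writer is involved, so the bytes are never reinterpreted as text.
byte[] bytes = Files.readAllBytes(Paths.get("picture.png"));
try (FileOutputStream fos = new FileOutputStream("picture2.png")) {
    fos.write(bytes);
}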
I'm curious why you're already using one Java NIO method but not the others. The library already has methods to do this for you.
byte[] bytes = Files.readAllBytes(file1);
Files.write(file2, bytes, StandardOpenOption.CREATE_NEW); // or relevant OpenOptions
or
FileOutputStream out = new FileOutputStream(file2.toFile()); // or buffered
Files.copy(file1, out);
out.close();
or
Files.copy(file1, file2, options);
The problem is that Writer.write() doesn't take a byte. It takes a char value, which the writer then encodes as text; with UTF-8, any value above 0x7F turns into a multi-byte sequence, so arbitrary binary data gets corrupted.
But once you've got the whole thing read in as a byte[], you can just use Files.write() to send the whole array to a file in much the same way that you read it in:
Files.write(file2, bytes);
This is the more modern NIO idiom, rather than using an OutputStream.
It's worth reading the tutorial.
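Putting it together, a minimal fixed version of the code from the question (the paths are the ones mentioned there) could be:
// Copy the image byte-for-byte with NIO; no character encoding is involved.
Path file1 = Paths.get("picture.png");
Path file2 = Paths.get("picture2.png");
byte[] bytes = Files.readAllBytes(file1);
Files.write(file2, bytes);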

Reading, writing unreadable file stream to save in database

I need to write a file stream to a database. The file content must be readable only through
the program; opening the file manually should not display readable content. I decided to use
ObjectOutputStream as it is the binary writing mechanism in Java. But I can still see the string
content when I open the file.
Writing to stream
ByteArrayOutputStream baos = new ByteArrayOutputStream();
ObjectOutputStream os = new ObjectOutputStream(baos);
os.writeObject("HIIIIIIIIIIIIIIIIIIIIII HOW ARE YOU");
The created content looks like
’ t #HIIIIIIIIIIIIIIIIIIIIII HOW ARE YOU
How to get complete binary stream output?
The file content must be readable only through the program. Manual open file should not display the readable content.
So you need some security.
I decided to use ObjectOutput stream as it is the binary writing mechanism in java.
That's (a) a non sequitur, and (b) security by obscurity: i.e. it is no security at all.
You should use encryption.
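For example, the standard javax.crypto API can encrypt the bytes before they are stored. A minimal AES/GCM sketch (key handling is deliberately simplified and the names are illustrative, not a production design):
// In practice, load the key from a keystore instead of generating a new one each run
KeyGenerator keyGen = KeyGenerator.getInstance("AES");
keyGen.init(256);
SecretKey key = keyGen.generateKey();

// GCM needs a fresh 12-byte IV for every encryption; store it alongside the ciphertext
byte[] iv = new byte[12];
new SecureRandom().nextBytes(iv);

Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
byte[] ciphertext = cipher.doFinal("HIIIIIIIIIIIIIIIIIIIIII HOW ARE YOU".getBytes(StandardCharsets.UTF_8));
// ciphertext (plus the IV) is what goes into the file or the database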

How to convert strange characters from a web page?

On the web page the text reads "Why don't we".
But when I parse the webpage and save it to a text file, it becomes this under Eclipse:
Why don鈥檛 we
More information about my implementation:
The webpage is UTF-8.
I use jSoup to parse; the file is saved as a .txt.
I use FileWriter f = new FileWriter(...) to write to the file.
UPDATE:
I actually solved the display problem in Eclipse by changing Eclipse's encoding to UTF-8.
FileWriter is a utility class that uses the current platform default encoding. That is non-portable, and probably incorrect.
BufferedWriter f = new BufferedWriter(new OutputStreamWriter(
        new FileOutputStream(file), StandardCharsets.UTF_8));
f.write("\uFEFF"); // Optional BOM character, written so that other tools
                   // recognize the text as UTF-8
...
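On Java 7+ the same writer can be created more concisely with NIO (a sketch; file is assumed to be a java.io.File as above):
// Creates a UTF-8 writer directly, independent of the platform default encoding
BufferedWriter f = Files.newBufferedWriter(file.toPath(), StandardCharsets.UTF_8);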

Printing Unicode characters using Java in eclipse works, but not when I export to a jar

I have code that creates a PrintWriter and prints Unicode symbols
out = new PrintWriter(new FileWriter("test.txt"));
out.print("\u2588");
out.close();
It works perfectly when the file is saved with UTF-8 encoding inside Eclipse, but when I export it to a jar it just prints question marks. How would I tell it to use UTF-8 when printing strings?
Eclipse may be helping you to create a UTF-8 encoded file, but it is better to specify the right encoding in your code as well.
FileWriter does not take any parameter for the encoding. You can use OutputStreamWriter instead, as it accepts an encoding. You may change your PrintWriter initialization to:
PrintWriter out = new PrintWriter(new OutputStreamWriter(new FileOutputStream("test.txt"), "UTF-8"));
Instead of a FileWriter, create a FileOutputStream, then wrap it in an OutputStreamWriter, which allows you to pass an encoding in the constructor.
FileOutputStream fos = new FileOutputStream("test.txt");
OutputStreamWriter osw = new OutputStreamWriter(fos, "UTF-8");
PrintWriter out = new PrintWriter(osw);
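As a shorter alternative, PrintWriter also has a convenience constructor that takes a file name and a charset name directly (available since Java 5):
// Does the same as the three lines above in a single call
PrintWriter out = new PrintWriter("test.txt", "UTF-8");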

Commons Net FTPClient retrieved file encoding issue

I'm retrieving a file from an FTP server. The file is encoded as UTF-8.
ftpClient.connect(props.getFtpHost(), props.getFtpPort());
ftpClient.login(props.getUsername(), props.getPassword());
ftpClient.setFileType(FTP.BINARY_FILE_TYPE);
inputStream = ftpClient.retrieveFileStream(fileNameBuilder.toString());
And then somewhere else I'm reading the input stream
bufferedReader = new BufferedReader(new InputStreamReader(
inputStream, "UTF-8"));
But the file is not getting read as UTF-8 encoded!
I tried ftpClient.setAutodetectUTF8(true); but it still doesn't work.
Any ideas?
EDIT:
For example a row in the original file is
...00248090041KENAN SARÐIN 00000000015.993FAC...
After downloading it through FTPClient, I parse it and load it into a Java object; one of the fields of that object is name, which for this row is read as "KENAN SAR�IN".
I tried dumping to disk directly:
File file = new File("D:/testencoding/downloaded-file.txt");
FileOutputStream fop = new FileOutputStream(file);
ftpClient.retrieveFile(fileName, fop);
if (!file.exists()) {
    file.createNewFile();
}
I compared the MD5 checksums of the two files (the one on the FTP server and the one dumped to disk), and they're the same.
I would separate out the problems first: dump the file to disk, and compare it with the original. If it's the same as the original, the problem has nothing to do with UTF-8. The FTP code looks okay though, and if you're saying you want the raw binary data, I'd expect it not to mess with anything.
If the file is the same after transfer as before, then the problem has nothing to do with FTP. You say "the file is not getting read as UTF-8 Encoded" but it's not clear what you mean. How certain are you that it's UTF-8 text to start with? If you could edit your question with the binary data, how it's being read as text, and how you'd expect it to be read as text, that would really help.
Try to download the file content as bytes and not as characters, using an InputStream and OutputStream instead of an InputStreamReader. This way you can be sure that the file is not changed during transfer.
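Since the MD5 checksums match, the transfer is byte-faithful, so the real question is which encoding the file actually uses. A small diagnostic sketch (the path is the one from the question; the candidate charsets are only guesses, with ISO-8859-9 and windows-1254 picked because the sample text looks Turkish):
byte[] raw = Files.readAllBytes(Paths.get("D:/testencoding/downloaded-file.txt"));
for (String name : new String[] {"UTF-8", "ISO-8859-9", "windows-1254"}) {
    // Decode the same bytes with each candidate charset and inspect the name field
    System.out.println(name + ": " + new String(raw, Charset.forName(name)));
}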
