Export to CSV JSF and PrimeFaces

Export to Excel JSF and PrimeFaces
I am trying to download a CSV file that is created at runtime. The linked answer works for Excel, and I need to do the same for CSV. HSSFWorkbook is used for Excel, but I am using FileWriter for CSV. I need a replacement for the line workbook.write(externalContext.getResponseOutputStream()); I cannot use writer.write(externalContext.getResponseOutputStream()); because writer is a FileWriter variable, and its write method does not accept an OutputStream as a parameter.

It seems to me that you have two issues here:
You shouldn't use a FileWriter if you don't want to write to a file - you need to choose the right implementation of the abstract Writer class for your use case (here, the one that writes to an OutputStream, not to a File: OutputStreamWriter).
You're trying to use Writer#write(...) like HSSFWorkbook#write(java.io.OutputStream), but they don't do the same thing at all. In HSSFWorkbook, the write method writes the workbook's content to some OutputStream; the parameter tells the method where to write. In Writer, the write method writes something to the writer itself; the parameter tells the method what to write.
Based on your link for writing from an HSSFWorkbook, writing a CSV in a similar way could look something like:
public void getReportData() throws IOException {
    FacesContext facesContext = FacesContext.getCurrentInstance();
    ExternalContext externalContext = facesContext.getExternalContext();
    externalContext.setResponseContentType("text/csv");
    externalContext.setResponseHeader("Content-Disposition", "attachment; filename=\"my.csv\"");
    OutputStream out = externalContext.getResponseOutputStream();
    Writer writer = new OutputStreamWriter(out);
    // Let's write the CSV content
    try {
        writer.write("Line number,Col 1,Col 2\n");
        writer.write("1,Value 1,Value 2\n");
        writer.write("2,Value 3,Value 4\n");
    } finally {
        if (writer != null) {
            // Closing the writer also flushes it, and does the same to the underlying OutputStream
            writer.close();
        }
    }
    facesContext.responseComplete();
}

Here is a whole working example that you may use:
String csvFileName = "mydoc.csv";
FileWriter writer = new FileWriter(csvFileName);
int columnNameSize = activeTab.getColumnNames().size();
for (int i = 0; i < columnNameSize; i++) {
    writer.append(activeTab.getColumnNames().get(i));
    if (i != (columnNameSize - 1)) {
        if (delimiterType.equalsIgnoreCase(TAB_DELIMITER_VALUE_NAME)) {
            writer.append('\t');
        } else {
            writer.append(delimiterType);
        }
    }
}
writer.append("\n");
for (DBData[] temp : activeTab.getTabularData()) {
    int tempSize = temp.length;
    for (int k = 0; k < tempSize; k++) {
        writer.append(temp[k].toFullString());
        if (k != (tempSize - 1)) {
            if (delimiterType.equalsIgnoreCase(TAB_DELIMITER_VALUE_NAME)) {
                writer.append('\t');
            } else {
                writer.append(delimiterType);
            }
        }
    }
    writer.append("\n");
}
writer.flush();
writer.close();
InputStream stream = new BufferedInputStream(new FileInputStream(csvFileName));
exportFile = new DefaultStreamedContent(stream, "application/csv", csvFileName);

Related

Write multiple Zip files to an OutputStream using IOUtils.copy method

This question feels very difficult to explain, but I'll do my best.
Currently I have a method that returns an InputStream with a Zip file that I have to add to a main zip file. The problem is that when I write something into the OutputStream, it overwrites the previously written data. I tried using ZipOutputStream with ZipEntries, but this recompresses the file and does weird things, so it's not a solution. Things that I'm required to use and that are not negotiable:
Retrieving the file with the method that returns an InputStream
Using the IOUtils.copy() method to download the file (this may be optional if you have another solution that allows me to download the file through a browser)
This is the code so far:
OutputStream os = null;
InputStream is = null;
try {
    os = response.getOutputStream();
    for (int i = 0; i < splited.length; i += 6) {
        String[] file = // an array with the data to retrieve the file
        is = FileManager.downloadFile(args);
        int read;
        byte[] buffer = new byte[1024];
        while (0 < (read = is.read(buffer))) {
            os.write(buffer, 0, read);
        }
    }
} catch (Exception ex) {
    // Exception captures
}
response.setHeader("Content-Disposition", "attachment; filename=FileName");
response.setContentType("application/zip");
IOUtils.copy(is, os);
os.close();
is.close();
return forward;
You can use a ZipOutputStream wrapping the response's OutputStream, but instead of close call finish, and do not call close on the response's OutputStream.
You must set the HTTP headers first.
response.setHeader("Content-Disposition", "attachment; filename=FileName");
response.setContentType("application/zip");
try {
    ZipOutputStream os = new ZipOutputStream(response.getOutputStream());
    for (int i = 0; i < splited.length - 5; i += 6) {
        String[] file = // an array with the data to retrieve the file
        try (InputStream is = FileManager.downloadFile(args)) {
            os.putNextEntry(new ZipEntry(filePath));
            is.transferTo(os);
            os.closeEntry();
        }
    }
    os.finish();
} catch (Exception ex) {
    // Exception captures
}
return forward;
Since Java 9, InputStream.transferTo(OutputStream) copies an InputStream to an OutputStream.
One can also copy a Path with Files.copy(Path, OutputStream), where Path is a URI-based generalisation of File, so URLs can also be copied directly.
Here try-with-resources ensures that every is is closed.
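For illustration, here is a minimal, self-contained sketch of both JDK helpers mentioned above (the class and method names are my own, not from the question):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class CopyDemo {
    // Copies an InputStream to a byte array using InputStream.transferTo (Java 9+).
    public static byte[] copyStream(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        in.transferTo(out); // reads `in` to exhaustion, writing everything to `out`
        return out.toByteArray();
    }

    // Copies a file's bytes to a byte array using Files.copy(Path, OutputStream).
    public static byte[] copyFile(Path path) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        Files.copy(path, out);
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] data = "hello, world".getBytes();
        System.out.println(copyStream(new ByteArrayInputStream(data)).length); // 12

        Path tmp = Files.createTempFile("copydemo", ".bin");
        Files.write(tmp, data);
        System.out.println(copyFile(tmp).length); // 12
        Files.delete(tmp);
    }
}
```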

Copy content from one file to multiple file using java

FileInputStream Fread = new FileInputStream("somefilename");
FileOutputStream Fwrite = null;
for (int i = 1; i <= 5; i++) {
    String fileName = "file" + i + ".txt";
    Fwrite = new FileOutputStream(fileName);
    int c;
    while ((c = Fread.read()) != -1) {
        Fwrite.write((char) c);
    }
    Fwrite.close();
}
Fread.close();
The above code writes only to one file. How to make it work to write the content of one file to multiple files?
FYI: Note that the read() method you used returns a byte value (as an int), not a char, so calling write((char) c) should have been just write(c).
To write to multiple files in parallel when copying a file, create an array of output streams for the destination files, then iterate over the array to write the data to all of them.
For better performance, you should always do this using a buffer; writing one byte at a time will not perform well.
public static void copyToMultipleFiles(String inFile, String... outFiles) throws IOException {
    OutputStream[] outStreams = new OutputStream[outFiles.length];
    try {
        for (int i = 0; i < outFiles.length; i++)
            outStreams[i] = new FileOutputStream(outFiles[i]);
        try (InputStream inStream = new FileInputStream(inFile)) {
            byte[] buf = new byte[16384];
            for (int len; (len = inStream.read(buf)) > 0; )
                for (OutputStream outStream : outStreams)
                    outStream.write(buf, 0, len);
        }
    } finally {
        for (OutputStream outStream : outStreams)
            if (outStream != null)
                outStream.close();
    }
}
You will have to create multiple FileOutputStream instances (fwrite1, fwrite2, fwrite3), one per file you want to write to; then, as you read, you simply write each byte to all of them.
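A minimal sketch of that idea (class, method, and file names are illustrative), reading each byte once and writing it to three streams:

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.FileWriter;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class MultiCopy {
    // Copies the source file to three destinations in one pass:
    // each byte read is written to every output stream.
    public static void copyToThree(String src, String dst1, String dst2, String dst3)
            throws IOException {
        try (InputStream fread = new FileInputStream(src);
             OutputStream fwrite1 = new FileOutputStream(dst1);
             OutputStream fwrite2 = new FileOutputStream(dst2);
             OutputStream fwrite3 = new FileOutputStream(dst3)) {
            int c;
            while ((c = fread.read()) != -1) {
                fwrite1.write(c);
                fwrite2.write(c);
                fwrite3.write(c);
            }
        }
    }

    public static void main(String[] args) throws IOException {
        // Demo on temp files so the example is self-contained
        File src = File.createTempFile("src", ".txt");
        try (FileWriter w = new FileWriter(src)) { w.write("abc"); }
        File d1 = File.createTempFile("d1", ".txt");
        File d2 = File.createTempFile("d2", ".txt");
        File d3 = File.createTempFile("d3", ".txt");
        copyToThree(src.getPath(), d1.getPath(), d2.getPath(), d3.getPath());
        System.out.println(d1.length() + " " + d2.length() + " " + d3.length()); // 3 3 3
    }
}
```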
Add this line:
Fread.reset();
after Fwrite.close();
And change the first line of code to this:
InputStream Fread = new BufferedInputStream(new FileInputStream("somefilename"));
Fread.mark(0);
The FRead stream reaches the end once, and then there is nothing to make it start from the beginning again.
To solve this you can:
call FRead.reset() after writing each file
cache FRead's content somewhere and write to FWrite from that source
create an array / collection of FileOutputStreams and write each byte to all of them during the iteration
The recommended solution is, of course, the first one.
Also, there are some problems in your code:
You are highly encouraged to use try-with-resources for streams, as they should be safely closed
You don't seem to follow the naming conventions, which say to name variables in lowerCamelCase
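Put together, a minimal sketch of the mark/reset approach with try-with-resources (class and method names are my own). One caveat worth noting: mark(0) as shown above only works while the whole file happens to fit in BufferedInputStream's internal buffer; passing a read limit that covers the file size is safer.

```java
import java.io.BufferedInputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class ResetCopy {
    // Copies the source file to `copies` destination files by rewinding the
    // input stream with mark()/reset() between copies.
    // Returns the length of each file written.
    public static long[] copyWithReset(File src, int copies) throws IOException {
        long[] lengths = new long[copies];
        try (InputStream fread = new BufferedInputStream(new FileInputStream(src))) {
            // The read limit passed to mark() must cover everything read before
            // reset(), otherwise the mark may be invalidated.
            fread.mark((int) src.length() + 1);
            for (int i = 0; i < copies; i++) {
                File out = File.createTempFile("file" + (i + 1), ".txt");
                try (OutputStream fwrite = new FileOutputStream(out)) {
                    int c;
                    while ((c = fread.read()) != -1) {
                        fwrite.write(c);
                    }
                }
                fread.reset(); // rewind to the mark for the next copy
                lengths[i] = out.length();
            }
        }
        return lengths;
    }
}
```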

FileWriter output to csv file is blank

FileWriter outfile = new FileWriter("ouput.csv", true); // true = append
for (int len = 0; len < tempList.size(); len++) {
    LineItem tempItem = tempList.get(len);
    if (len == 0) {
        lastTime = tempItem.getTimeEnd();
        tempItem.setStatus("OK");
        // out
        output(tempItem.toCSV(), outfile);
    } else {
        if (tempItem.getTimeStart().compareTo(lastTime) <= 0) {
            // WARN
            if (!tempItem.getStatus().equals("OVERLAP")) {
                tempItem.setStatus("WARN");
            }
        } else {
            // OK
            //System.out.println("OK ;" + tempItem.toCSV());
            if (!tempItem.getStatus().equals("OVERLAP")) {
                tempItem.setStatus("OK");
            }
        }
        // file out write
        output(tempItem.toCSV(), outfile);
        lastTime = tempItem.getTimeEnd();
    }
}
}
private static void output(String line, FileWriter outfile) throws IOException {
    System.out.println(line);
    // Write each line to a new csv file
    outfile.write(line + "\n");
}
Why is my output.csv file 0 KB and empty? When I print each line, I see the strings in my console...
You aren't closing the FileWriter.
NB: the suggestion to flush as well as close is redundant.
After output(tempItem.toCSV(), outfile); please add the statement below. You forgot to flush; closing automatically flushes for you.
outfile.close();
When you call outfile.flush(), the buffered data will be written to the file.
When you call outfile.close(), it is flushed automatically too. Sometimes you want to flush() at other times, but often it's not necessary. You should always close files when you've finished with them.
Since Java 7, it's often a good idea to use try-with-resources:
try(FileWriter outfile = new FileWriter("output.csv", true)) {
// code that writes to outfile
}
Because FileWriter implements Closeable, the try-with-resources statement will call outfile.close() automatically when execution leaves this block.

RDD Save as Text file

How can I save a text file in a delimited format using RDD.saveAsTextFile? I also need to write the dataframe columns as headers. How do I achieve that?
Is there an easier way than the code below for large RDDs?
List<Row> data = resultFrame.toJavaRDD().collect();
try {
    File file = new File(fileName);
    if (!file.exists()) {
        file.createNewFile();
    }
    FileWriter fw = new FileWriter(file);
    BufferedWriter bufferedWriter = new BufferedWriter(fw);
    for (Row dataRow : data) {
        StringBuilder row = new StringBuilder();
        for (int i = 0; i < dataRow.size(); i++) {
            row.append(dataRow.get(i));
            if (i != dataRow.size() - 1) {
                row.append("~");
            }
        }
        bufferedWriter.write(row.toString());
        bufferedWriter.write("\n");
        row.setLength(0);
    }
    bufferedWriter.close();
} catch (IOException e) {
    LOGGER.error("Error in writing to the ruf file");
}
Just as you read using SQLContext.read (Java API), you need to use DataFrame.write (Java API).
The other ways are deprecated (e.g. SQLContext.parquetFile, SQLContext.jsonFile).
Thanks for the response. The following worked:
public class TildaDelimiter implements Function<Row, String> {
    public String call(Row r) {
        return r.mkString("~");
    }
}
In my save I did the following to save it as a ~ delimited file:
resultFrame.toJavaRDD().map(new TildaDelimiter()).coalesce(1, true)
.saveAsTextFile(folderName);

java split string[] array to multiple files

I'm having a problem figuring out how to split a string across multiple files. At the moment I should get two files, both with JSON data. The code below writes to the first file but leaves the second one empty. Any ideas why?
public void splitFile(List<String> results) throws IOException {
    int name = 0;
    for (int i = 0; i < results.size(); i++) {
        write = new FileWriter("/home/tom/files/" + name + ".json");
        out = new BufferedWriter(write);
        out.write(results.get(i));
        if (results.get(i).startsWith("}")) {
            name++;
        }
    }
}
Edit: it splits at a line starting with { because that denotes the start of a new JSON document.
Enhance the cut-control
Group together this:
write = new FileWriter("/home/tom/files/" + name + ".json");
out = new BufferedWriter(write);
and this:
name++;
Check for the start, not the end
Check for a line starting with {, and execute those three lines to open the file.
Remember to close and flush
If it's not the first line (i > 0), then close the previous writer (out.close();).
Close the last opened writer
if (!results.isEmpty())
    out.close();
Result
It should look something like this:
public void splitFile(List<String> results) throws IOException {
    int name = 0;
    BufferedWriter out = null;
    for (int i = 0; i < results.size(); i++) {
        String line = results.get(i);
        if (line.startsWith("{")) {
            if (out != null) // it's not the first
                out.close(); // closing the buffered writer also makes it flush
            FileWriter writer = new FileWriter("/home/tom/files/" + name + ".json");
            out = new BufferedWriter(writer);
            name++;
        }
        if (out == null)
            throw new IllegalArgumentException("first line doesn't start with {");
        out.write(line);
    }
    if (out != null) // there was at least one file
        out.close();
}
I would close your BufferedWriter after each completed write sequence, i.e. after each iteration through the loop, before you assign write to a new FileWriter().
Closing the BufferedWriter will close the underlying FileWriter, and consequently force a flush of the data written to disk.
Note: if you're using a distinct FileWriter per loop iteration, then I'd scope that variable to that inner loop, e.g.
FileWriter write = new FileWriter("/home/tom/files/" + name + ".json");
The same goes for the BufferedWriter. In fact you can write:
BufferedWriter outer = new BufferedWriter(new FileWriter(...
and just deal with outer.
Try the following code:
public void splitFile(List<String> results) throws IOException {
    int name = 0;
    for (int i = 0; i < results.size(); i++) {
        write = new FileWriter("/home/tom/files/" + name + ".json");
        out = new BufferedWriter(write);
        out.write(results.get(i));
        out.flush();
        out.close(); // you have to close your stream every time in your case
        if (results.get(i).startsWith("}")) {
            name++;
        }
    }
}
