I have a REST API which calls a method to retrieve a CSV. Since the CSV can contain millions of rows, I retrieve 50k rows at a time. I want the download to begin as soon as I receive the first query result, and then keep writing to the same file.
I have written the following code, but the file download still starts only after the entire result has been retrieved.
@GET
@Path("network/all/stats/csv/download")
@Produces({"text/csv"})
public Response downloadCSV(UIAffiliateRequest uiInstallsRequest) {
StreamingOutput stream = new StreamingOutput() {
@Override
public void write(OutputStream os) throws IOException, WebApplicationException {
Writer writer = new BufferedWriter(new OutputStreamWriter(os));
int off=0;
try {
CSVResponseType rs;
do {
rs = installsDao.getCSVResult(uiInstallsRequest, off);
String[] lines = rs.getFileName().split("\\r?\\n");
for (String line : lines) {
writer.write(line + "\n");
}
writer.flush();
off+=50000;
}
while (!rs.isEmpty());
}
catch (DaoException e) {
ResponseUtils.getDataAccessErrorResponse(e); // note: this return value is discarded
}
}
};
return Response.ok(stream).header("Content-Disposition", "attachment;filename=resp.csv").build();
}
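As a side note, the paging loop this code is aiming for can be sketched container-independently. Here `fetchPage` is a hypothetical stand-in for `installsDao.getCSVResult`, and `pageSize` replaces the hard-coded 50000. Two details matter: the condition must keep looping while the page is *not* empty, and `flush()` must run after each page:

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.io.OutputStream;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.util.Collections;
import java.util.List;

public class ChunkedCsvWriter {

    // Stand-in for installsDao.getCSVResult: returns the next page of rows,
    // or an empty list once the offset is past the end.
    static List<String> fetchPage(List<String> allRows, int offset, int pageSize) {
        if (offset >= allRows.size()) {
            return Collections.emptyList();
        }
        return allRows.subList(offset, Math.min(offset + pageSize, allRows.size()));
    }

    // Write pages to the output stream as they arrive, flushing after each
    // page so the client can start receiving data immediately.
    public static void writeAll(List<String> allRows, OutputStream os, int pageSize)
            throws IOException {
        Writer writer = new BufferedWriter(new OutputStreamWriter(os));
        int off = 0;
        List<String> page;
        // loop WHILE the page is not empty (the code above has this inverted)
        while (!(page = fetchPage(allRows, off, pageSize)).isEmpty()) {
            for (String line : page) {
                writer.write(line + "\n");
            }
            writer.flush(); // push this page to the client now
            off += pageSize;
        }
    }
}
```

Whether the client actually sees data early also depends on the server and any proxies not buffering the response; with JAX-RS StreamingOutput, flushing the stream is usually what triggers chunked transfer encoding.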
I have created a REST API in core Java. This API processes a large amount of data and then returns an Excel file.
The issue I'm facing: when I send a single request from Postman, the Excel file contains complete data. But when I send multiple requests, say 3, the Excel files for the first two requests contain incomplete data (only 40-60 records instead of 100), while the last request's file is complete again.
It seems that whenever a new request arrives, the processing for the old one stops.
The API code:
@Path("dynamictest")
@POST
@Produces(XLSX)
public Object getDynamicExcelReports(ReportParams params)
throws SQLException, IOException, IllegalAccessException, NoSuchFieldException {
params.setUserId(getUserId());
if (params.getMail()) {
new Thread(() -> {
try {
MimeBodyPart attachment = new MimeBodyPart();
attachment.setFileName("report.xlsx");
attachment.setDataHandler(new DataHandler(new ByteArrayDataSource(
Dynamic.getDynamicExcelReporttest(params).toByteArray(), "application/octet-stream")));
Context.getMailManager().sendMessage(
params.getUserId(), "Report", "The report is in the attachment.", attachment, User.class, null);
} catch (Exception e) {
LOGGER.warn("Report failed", e);
}
}).start();
return Response.noContent().build();
} else {
return Response.ok(Dynamic.getDynamicExcelReporttest(params).toByteArray())
.header(HttpHeaders.CONTENT_DISPOSITION, CONTENT_DISPOSITION_VALUE_XLSX).build();
}
}
the excel write function
public static ByteArrayOutputStream processExcelV2test(Collection<DynamicReport> reports, ReportParams params, RpTmplWrapper rpTmplWrapper,
Date from, Date to, boolean isDriverReport) throws IOException, IllegalAccessException, NoSuchFieldException, SQLException {
XSSFWorkbook workbook = new XSSFWorkbook();
List<RpTmplTblWrapper> tblListWrapper = new ArrayList<>(rpTmplWrapper.getRpTmplTblWrappers());
tblListWrapper.sort(Comparator.comparing(tblWrapper -> tblWrapper.getRpTmplTbl().getPosition()));
tblListWrapper.forEach((tblWrapper) -> { //loop for multiple sheets
try {
String sheetName = tblWrapper.getRpTmplTbl().getLabel().replaceAll("[^A-Za-z0-9]", "|");
Sheet sheet = workbook.createSheet(sheetName);
/**setting data in rows and columns**/
} catch (Exception ex) {} // note: exceptions are silently swallowed here
});
Logger.getLogger(DynamicExcelUtils.class.getName()).log(Level.WARNING, "workbook completed");
ByteArrayOutputStream stream = new ByteArrayOutputStream();
workbook.write(stream);
workbook.close();
return stream;
}
Any help or suggestion would be appreciated. Thanks.
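One common cause of exactly this symptom (a new request truncating an earlier request's output) is per-request working state kept in static fields, which every request then shares. The following is a hypothetical illustration, not the actual Dynamic class; the fix is to keep the rows in a local variable so each request has its own copy:

```java
import java.util.ArrayList;
import java.util.List;

public class ReportStateDemo {

    // BROKEN pattern: all requests share this field, so a second request
    // can reset it while the first request is still filling it.
    static List<Integer> sharedRows = new ArrayList<>();

    // SAFE pattern: per-request state lives in a local variable,
    // so concurrent requests cannot interfere with each other.
    static List<Integer> buildReport(int requestId, int rowCount) {
        List<Integer> rows = new ArrayList<>();
        for (int i = 0; i < rowCount; i++) {
            rows.add(requestId);
        }
        return rows;
    }
}
```

If Dynamic.getDynamicExcelReporttest (or anything it calls) keeps the workbook, row buffers, or counters in static fields, concurrent requests will interleave in just this way; making that state local or per-request is the usual fix.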
I've got kind of a weird situation. I have these methods:
public void generateRecords(Request request) {
String pathToFile = request.getPathFile();
String recordOne = generateRecordOne(request);
String recordTwo = generateRecordTwo(request);
fileService.writeToFile(pathToFile, recordOne);
fileService.writeToFile(pathToFile, recordTwo);
}
public void writeToFile(String path, String content) {
try {
FileWriter writer = new FileWriter(path, true);
writer.append(content);
writer.close();
} catch (IOException e) {
e.printStackTrace();
}
}
generateRecords() is executed from a REST endpoint. I am getting something like this:
id:1:record1
id:2:record1
id:1:record2
id:2:record2
While I would like to get something like this:
id:1:record1
id:1:record2
id:2:record1
id:2:record2
It only happens sometimes, but it is still corrupting my file. How can I avoid this?
Try using synchronized on the writeToFile method.
Also, consider using a try-with-resources statement. In the code you have right now, an exception thrown while writing would leave the FileWriter unclosed.
public synchronized void writeToFile(String path, String content) {
try (FileWriter writer = new FileWriter(path, true)) {
writer.append(content);
} catch (IOException e) {
e.printStackTrace();
}
}
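One caveat, and this is an assumption about how fileService is wired: synchronized on an instance method only serializes callers that share the same fileService instance. If more than one instance can write to the same file, a lock shared by all instances (a static object here) is needed:

```java
import java.io.FileWriter;
import java.io.IOException;

public class FileService {
    // one lock shared by every FileService instance
    private static final Object FILE_LOCK = new Object();

    public void writeToFile(String path, String content) {
        synchronized (FILE_LOCK) {
            // try-with-resources closes the FileWriter even on exception
            try (FileWriter writer = new FileWriter(path, true)) {
                writer.append(content);
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
}
```

With a per-instance lock (or a synchronized instance method), two controllers each holding their own FileService could still interleave appends to the same file.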
I'm working on compiling a bunch of tweets for an information retrieval class, using both the REST API and the Streaming API through twitter4j. When using the Streaming API, I made the following modifications to this example:
final LimitedFileWriter output = new LimitedFileWriter("Tweets","tweets");
TwitterStream twitterStream = new TwitterStreamFactory(cb.build()).getInstance();
StatusListener listener = new StatusListener() {
@Override
public void onStatus(Status status) {
try{
output.write("#" + status.getUser().getScreenName() + " -- " + status.getText()+"\n");
}
catch(IOException e){
e.printStackTrace();
}
}
};
twitterStream.addListener(listener);
twitterStream.sample("en");
//output.close();
It seems I can't ever close my writer. The writer I am using simply wraps BufferedWriter, while keeping track of file size. If the file exceeds a certain size (128MB), the writer will close the current file and create a new file. Here are the relevant class functions:
public void write(String s) throws IOException
{
if(bytesWritten + s.getBytes(charset).length >= MAXSIZE){
output.close();
bytesWritten = 0;
fileNum++;
String fileName = directory + "/" + baseName+fmt.format(fileNum);
currentFile = new File(fileName);
output = new BufferedWriter
(new OutputStreamWriter(new FileOutputStream(fileName),charset));
}
output.write(s);
bytesWritten += s.getBytes(charset).length;
}
public void close() throws IOException{
output.close();
}
If I try to close the writer after twitterStream.sample() (commented out), the program crashes because I am trying to write to a closed file. If my understanding is correct, this is because the TwitterStream class creates a new thread which runs concurrently with the main thread. Then, the main thread closes the stream and the twitterStream can no longer write to it.
If that's the case, where should I close my writer?
If I have understood your question correctly, you want to be able to turn off tweet collection at some point, close your open file writers, and exit cleanly. To achieve this you can use a synchronized block.
final Object lock = new Object();
final LimitedFileWriter output = new LimitedFileWriter("Tweets","tweets");
TwitterStream twitterStream = new TwitterStreamFactory(cb.build()).getInstance();
StatusListener listener = new StatusListener() {
@Override
public void onStatus(Status status) {
try{
output.write("#" + status.getUser().getScreenName() + " -- " + status.getText()+"\n");
// free the lock
if (some_condition_like_I_have_enough_files) {
synchronized (lock) {
lock.notify();
}
}
}
catch(IOException e){
e.printStackTrace();
}
}
};
twitterStream.addListener(listener);
twitterStream.sample("en");
try {
synchronized (lock) {
lock.wait();
}
} catch (InterruptedException e) {
e.printStackTrace();
}
twitterStream.shutdown(); // stop the stream first
output.close(); // then it is safe to close the writer
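A caveat with wait()/notify(): if the listener thread reaches notify() before the main thread reaches wait(), the notification is lost and the main thread blocks forever. A CountDownLatch avoids that race. This sketch stubs out the twitter4j parts:

```java
import java.util.concurrent.CountDownLatch;

public class ShutdownLatchDemo {
    public static void main(String[] args) throws InterruptedException {
        CountDownLatch done = new CountDownLatch(1);

        // Stands in for the StatusListener thread. Unlike notify(), a
        // countDown() that happens before await() is not lost.
        Thread listener = new Thread(() -> {
            // ... write tweets until the stop condition is met ...
            done.countDown();
        });
        listener.start();

        done.await(); // returns once countDown() has run, even if it ran first
        // twitterStream.shutdown();  // then stop the stream
        // output.close();            // and close the writer safely
    }
}
```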
I have a couple of classes, Foo and FooContainer. Foo is the JSON object returned from the API I'm fetching from, and I use FooContainer alongside Gson to parse the incoming response.
What I'm unclear about is that the API involves pagination, and each time I make a new request I'm unsure how to save the previously loaded data along with the newest data.
In other words, right now when I save the latest FooContainer, none of the previous data is saved along with it. I suppose I'm just unclear on how to effectively use Gson to save all of my data for offline access.
FooContainer
public class FooContainer {
public List<Foo> foos;
}
put
public void put(FooContainer fooContainer) {
final File file = getFileForKey(FOO_CACHE);
Writer writer = null;
try {
writer = new BufferedWriter(new FileWriter(file));
mGson.toJson(fooContainer, FooContainer.class, new JsonWriter(writer));
} catch (final IOException e) {
LOGE(TAG, "put - Error adding FooContainer", e);
} finally {
IOUtils.closeQuietly(writer);
}
}
get
public FooContainer get() {
Reader reader = null;
try {
reader = new BufferedReader(new FileReader(getFileForKey(FOO_CACHE)));
return mGson.fromJson(reader, FooContainer.class);
} catch (final IOException e) {
LOGE(TAG, "get - Error retrieving FooContainer", e);
} finally {
IOUtils.closeQuietly(reader);
}
return null;
}
I'm using an AsyncTaskLoader to fetch the data, which basically just consists of:
new Gson().fromJson(reader, FooContainer.class);
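The reason previous pages disappear is that put overwrites the whole cache file with only the latest FooContainer. One fix is to load the existing container first, append the new page, and persist the merged result. Below is a plain-Java sketch of the merge step, using minimal stand-ins for the classes above; in the real code this would be wired as get(), then merge, then put():

```java
import java.util.ArrayList;
import java.util.List;

// Minimal stand-ins for the classes in the question
class Foo {
    String id;
    Foo(String id) { this.id = id; }
}

class FooContainer {
    List<Foo> foos = new ArrayList<>();
}

public class FooCacheMerge {
    // Append a newly fetched page to the previously cached container.
    static FooContainer merge(FooContainer cached, FooContainer newPage) {
        if (cached == null) {
            cached = new FooContainer(); // first page: nothing cached yet
        }
        cached.foos.addAll(newPage.foos);
        return cached;
    }
}
```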
I need to list all the subfolders in a directory and write them to a text file, but with my code only the last subfolder is written to the file. Please help; I am a beginner in Java.
public class Main {
// private Object bufferedWriter;
/**
* Prints some data to a file using a BufferedWriter
*/
public void writeToFile(String filename) {
try
{
BufferedWriter bufferedWriter = null;
bufferedWriter = new BufferedWriter(new FileWriter(filename));
int i=1;
File f=new File("D:/Moviezzz");
File[] fi=f.listFiles();
for(File fil:fi)
{
if(fil.isHidden())
{
System.out.print("");
}
else if(fil.isDirectory()||fil.isFile())
{
int s=i++;
String files = fil.getName();
//Start writing to the output stream
bufferedWriter.write(s+" "+fil);
bufferedWriter.newLine();
// bufferedWriter.write(s+" "+files);
}
}
//Construct the BufferedWriter object
} catch (FileNotFoundException ex) {
ex.printStackTrace();
}catch (IOException ex) {
ex.printStackTrace();}
}
public static void main(String[] args) {
new Main().writeToFile("d://my.txt");
}
}
Until you call the flush() method of BufferedWriter, it will not write your data to the file.
It is not necessary to flush() on every iteration of the loop; you can call it once after the loop ends.
The main point of flush() is to empty the buffer: after you call it, the buffered data is released from memory and written to your file.
Close the BufferedWriter after the loop.
for(File fil:fi)
{
...
}
bufferedWriter.close();
Also, I suggest these changes in your code to make it more readable and efficient:
BufferedWriter bufferedWriter = new BufferedWriter(new FileWriter(filename));
...
if(!fil.isHidden() && (fil.isDirectory() || fil.isFile()))
{
...
}
You can create the BufferedWriter directly. Then, you are getting the file name but not doing anything with it, so just remove that line. And last, you don't have to put System.out.print(""); in an if branch just to check whether the file is hidden. You can use an empty statement, no code at all, or the ! operator to invert the condition.
if(fil.isHidden())
{
; // Do nothing
}
else
{
// Do something
}
if(fil.isHidden()); // Do nothing
else
{
// Do something
}
if(!fil.isHidden())
{
// Do something
}
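Putting these suggestions together, a version using try-with-resources (which closes, and therefore flushes, the writer automatically) might look like the sketch below; the directory is passed in as a parameter rather than hard-coded to D:/Moviezzz:

```java
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;

public class DirectoryLister {
    public void writeToFile(String filename, String dir) {
        // try-with-resources closes (and therefore flushes) the writer,
        // even if an exception is thrown while writing
        try (BufferedWriter bufferedWriter = new BufferedWriter(new FileWriter(filename))) {
            File[] entries = new File(dir).listFiles();
            if (entries == null) {
                return; // dir does not exist or is not a directory
            }
            int i = 1;
            for (File entry : entries) {
                if (!entry.isHidden() && (entry.isDirectory() || entry.isFile())) {
                    bufferedWriter.write(i++ + " " + entry);
                    bufferedWriter.newLine();
                }
            }
        } catch (IOException ex) {
            ex.printStackTrace();
        }
    }
}
```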