On multiple API requests, old request processing stops - Java

I have created a REST API in core Java.
The API processes a large amount of data and then returns an Excel file.
The issue I'm facing: when I send a single request from Postman, the Excel file contains the complete data. But when I fire multiple requests, say 3 from Postman, the Excel files for the first two requests contain incomplete data (only 40-60 records instead of 100), while the file for the last request is complete again.
It seems that whenever a new request comes in, the processing for the old one stops.
The API code:
@Path("dynamictest")
@POST
@Produces(XLSX)
public Object getDynamicExcelReports(ReportParams params)
throws SQLException, IOException, IllegalAccessException, NoSuchFieldException {
params.setUserId(getUserId());
if (params.getMail()) {
new Thread(() -> {
try {
MimeBodyPart attachment = new MimeBodyPart();
attachment.setFileName("report.xlsx");
attachment.setDataHandler(new DataHandler(new ByteArrayDataSource(
Dynamic.getDynamicExcelReporttest(params).toByteArray(), "application/octet-stream")));
Context.getMailManager().sendMessage(
params.getUserId(), "Report", "The report is in the attachment.", attachment, User.class, null);
} catch (Exception e) {
LOGGER.warn("Report failed", e);
}
}).start();
return Response.noContent().build();
} else {
return Response.ok(Dynamic.getDynamicExcelReporttest(params).toByteArray())
.header(HttpHeaders.CONTENT_DISPOSITION, CONTENT_DISPOSITION_VALUE_XLSX).build();
}
}
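A side note on the mail branch above: spawning a raw Thread per request is unbounded, so a burst of requests can start many concurrent report builds at once. A shared, fixed-size executor caps that. A minimal sketch, where the counter stands in for the real build-and-mail work:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class ReportMailer {
    // Stand-in for building the attachment and mailing it.
    static void buildAndMail(long userId, AtomicInteger sent) {
        sent.incrementAndGet();
    }

    // Bounded pool instead of an unbounded new Thread() per request.
    public static int sendReports(int requests) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        AtomicInteger sent = new AtomicInteger();
        for (int i = 0; i < requests; i++) {
            final long userId = i;
            pool.submit(() -> buildAndMail(userId, sent));
        }
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
        return sent.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(sendReports(10)); // prints "10"
    }
}
```

In the real API, the pool would be a long-lived field and the task body would build the workbook and call the mail manager, as in the snippet above.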
The Excel write function:
public static ByteArrayOutputStream processExcelV2test(Collection<DynamicReport> reports, ReportParams params, RpTmplWrapper rpTmplWrapper,
Date from, Date to, boolean isDriverReport) throws IOException, IllegalAccessException, NoSuchFieldException, SQLException {
XSSFWorkbook workbook = new XSSFWorkbook();
List<RpTmplTblWrapper> tblListWrapper = new ArrayList<>(rpTmplWrapper.getRpTmplTblWrappers());
tblListWrapper.sort(Comparator.comparing(tblWrapper -> tblWrapper.getRpTmplTbl().getPosition()));
tblListWrapper.forEach((tblWrapper) -> { //loop for multiple sheets
try {
String sheetName = tblWrapper.getRpTmplTbl().getLabel().replaceAll("[^A-Za-z0-9]", "|");
Sheet sheet = workbook.createSheet(sheetName);
// ... set data in rows and columns ...
} catch (Exception ex) {
// note: exceptions are silently swallowed here, so failures go unnoticed
}
});
Logger.getLogger(DynamicExcelUtils.class.getName()).log(Level.WARNING, "workbook completed");
ByteArrayOutputStream stream = new ByteArrayOutputStream();
workbook.write(stream);
workbook.close();
return stream;
}
Any help or suggestions would be appreciated.
Thanks.
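One pattern that produces exactly this symptom, offered only as a guess since the body of Dynamic.getDynamicExcelReporttest is not shown: the report builder keeps its row buffer, workbook, or JDBC statement in a static/shared field, so each new request resets state the previous request is still writing, and only the last request finishes with complete data. Keeping all state local to the call makes concurrent requests independent. A minimal sketch with a hypothetical buildReport:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.Collections;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class PerRequestState {
    // All state lives in local variables, so concurrent calls cannot clobber each other.
    static byte[] buildReport(List<String> rows) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        for (String row : rows) {
            out.write((row + "\n").getBytes(StandardCharsets.UTF_8));
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws Exception {
        List<String> rows = IntStream.range(0, 100)
                .mapToObj(i -> "record-" + i).collect(Collectors.toList());
        Callable<byte[]> request = () -> buildReport(rows);
        ExecutorService pool = Executors.newFixedThreadPool(3);
        // Three concurrent "requests": each must come back with all 100 records.
        for (Future<byte[]> f : pool.invokeAll(Collections.nCopies(3, request))) {
            System.out.println(new String(f.get(), StandardCharsets.UTF_8).split("\n").length);
        }
        pool.shutdown();
    }
}
```

If the real builder does share a field (or a single JDBC connection/statement) between requests, moving it into the method, as above, would explain and fix the 40-60 record truncation.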

Related

Wait subscribe method to get all data into array

I have an endpoint whose purpose is to receive a CSV file, make a couple of changes to its name, and then pass it to a method that uploads all the data it contains as a single plain-text file to Google Cloud.
The file can have more than 100,000 records, so when parsing it I have to collect all the data in a variable and only then save it to Google Cloud. Today it works, but I am overwriting the same file all the time, because I don't know how to make the method wait for the subscribe to complete before uploading; so every time data is added to the array, the file is uploaded again.
Although the method achieves the goal, I want to improve its performance, since uploading a file of only 2 MB with 100,000 records takes approximately 15 minutes. Any ideas?
private Storage storage;
private void uploadToGoogleCloudStorage(FilePart filePart, BlobInfo blobInfo) throws IOException {
try (ByteArrayOutputStream bos = new ByteArrayOutputStream()) {
filePart.content()
.subscribe(dataBuffer -> {
byte[] bytes = new byte[dataBuffer.readableByteCount()];
dataBuffer.read(bytes);
DataBufferUtils.release(dataBuffer);
try {
bos.write(bytes);
storage.createFrom(blobInfo, new ByteArrayInputStream(bos.toByteArray()));
} catch (IOException e) {
e.printStackTrace();
}
});
}
}
Finally I got the solution. I changed the subscribe to map, then I take the last element from the flux, and with that I subscribe to upload the data to Google Cloud Storage via the Storage interface (the package from Google for their API).
private Storage storage;
private void uploadToGoogleCloudStorage(FilePart filePart, BlobInfo blobInfo) throws IOException {
try (ByteArrayOutputStream bos = new ByteArrayOutputStream()) {
filePart.content()
.map(dataBuffer -> {
byte[] bytes = new byte[dataBuffer.readableByteCount()];
dataBuffer.read(bytes);
DataBufferUtils.release(dataBuffer);
try {
bos.write(bytes);
} catch (IOException e) {
e.printStackTrace();
}
return bos;
}).last().subscribe(data -> {
try {
storage.createFrom(blobInfo, new ByteArrayInputStream(bos.toByteArray()));
} catch (IOException e) {
e.printStackTrace();
}
});
}
}
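What makes this version work is that the upload now happens once, after the last chunk, instead of once per chunk on a growing buffer. The same accumulate-then-upload-once idea, sketched in plain Java outside Reactor (the byte-array chunks stand in for DataBuffers, and upload is a hypothetical sink standing in for storage.createFrom):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.List;
import java.util.concurrent.atomic.AtomicInteger;

public class AccumulateThenUpload {
    static final AtomicInteger UPLOADS = new AtomicInteger();

    // Hypothetical sink standing in for storage.createFrom(...).
    static void upload(byte[] data) {
        UPLOADS.incrementAndGet();
    }

    // Wrong shape: uploads once per chunk (what the original subscribe did).
    static int uploadPerChunk(List<byte[]> chunks) throws IOException {
        UPLOADS.set(0);
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        for (byte[] chunk : chunks) {
            bos.write(chunk);
            upload(bos.toByteArray()); // re-uploads the growing buffer every time
        }
        return UPLOADS.get();
    }

    // Right shape: accumulate all chunks, then upload exactly once.
    static int uploadOnce(List<byte[]> chunks) throws IOException {
        UPLOADS.set(0);
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        for (byte[] chunk : chunks) {
            bos.write(chunk);
        }
        upload(bos.toByteArray());
        return UPLOADS.get();
    }

    public static void main(String[] args) throws IOException {
        List<byte[]> chunks = List.of("abc".getBytes(), "def".getBytes());
        System.out.println(uploadPerChunk(chunks) + " vs " + uploadOnce(chunks)); // prints "2 vs 1"
    }
}
```

The 15-minute upload time was almost certainly the per-chunk re-upload: with N chunks, roughly N uploads of an ever-larger buffer instead of one.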

Unable to visualize PDF file after submitting via FilePond component in React to a servlet (backend)

I have a React front-end application with a document upload form using the FilePond library, and as backend I use Java to save the PDF document to my local files.
I am a little stuck, because I have the PDF file and I can see the content (in Notepad/code editor), but when I open it with Adobe/Microsoft Edge all the pages are blank, and I don't know why.
The FilePond documentation does not specify the default encoding, and I don't know whether there is a way to see the encoding in the request (req). Also, if that is not possible, how can I send the file with getFileEncodeBase64String? Thank you very much for any ideas.
Code in React :
const APIClient = axios.create({
// baseURL: 'https://postman-echo.com',
baseURL: Config.faqServerUrl,
timeout: Config.timeout,
headers: {
'Accept': 'application/json;charset=UTF-8',
'Content-Type': 'application/json;charset=UTF-8'
}
});
Call backend :
function processFile(fieldName, file, metadata, load, error, progress, abort, setFileName) {
const formData = new FormData();
let feedbackWS = null;
formData.append('file', file, file.name);
try{
feedbackWS = APIClient.post('/upload-document',formData, {
onUploadProgress: (e) => {
// updating progress indicator
progress(e.lengthComputable, e.loaded, e.total);
}
}).then((response) => {
load(response.data);
setFileName(response.data);
})
.catch((error) => {
console.error(error);
});
} catch (error) {
console.log(error);
}
}
FilePond component :
<FilePond
server={{
process:(fieldName, file, metadata, load, error, progress, abort) => {
processFile(fieldName, file, metadata, load, error, progress, abort);
}
}}
oninit={() => handleInit()}
// callback onupdatefiles- a file has been added or removed, receives a list of file items
onupdatefiles={(fileItems) => {
// Set current file objects to this.state
setFile(fileItems.map(fileItem => fileItem.file));
}}
instantUpload={true}
onprocessfile={(error, file)=>{
console.log('PROCESSED', file, 'SERVER_ID', file.serverId);
// console.log('ERROR PROCESSED', error);
}}
onaddfile={(error, file)=>{
console.log('PROCESSED', file, 'SERVER_ID', file.serverId);
// console.log('ERROR PROCESSED', error);
}}
/>
Java code :
protected void doPost(HttpServletRequest req, HttpServletResponse resp) throws ServletException, IOException {
System.out.println("encoding" + req.getCharacterEncoding()); // it is null
nom_upload_faq = "test.pdf";
String repertoireFaqTemp = "/temp/";
String pathComplet = repertoireFaqTemp + nom_upload_faq;
File fichierTemp = null;
try {
fichierTemp = new File(pathComplet);
if (fichierTemp.exists() && fichierTemp.isFile()) {
fichierTemp.delete();
}
if (fichierTemp.createNewFile()) {
fichierTemp.setReadable(true, false);
fichierTemp.setWritable(true, false);
fichierTemp.setExecutable(true, false);
fichierTemp.deleteOnExit();
} else {
System.out.println("Impossible to get here: the file was already deleted above");
}
} catch (IOException e) {
System.out.println("An error occurred.");
e.printStackTrace();
}
//byte[] fileAsBytes = getArrayFromInputStream(req.getInputStream());
byte[] fileAsBytes = null;
try {
List<FileItem> items = new ServletFileUpload(new DiskFileItemFactory()).parseRequest(req);
for (FileItem item: items) {
if (!item.isFormField()) { // it's a file
fileAsBytes = getArrayFromInputStream(item.getInputStream());
}
}
} catch (FileUploadException e1) {
// TODO Auto-generated catch block
e1.printStackTrace();
}
try (BufferedOutputStream output = new BufferedOutputStream(new FileOutputStream(fichierTemp))) {
output.write(fileAsBytes);
output.flush();
// no explicit close() needed: try-with-resources closes the stream
} catch (Exception e) {
e.printStackTrace();
}
}
private static byte[] getArrayFromInputStream(InputStream inputStream) throws IOException {
byte[] bytes = null;
byte[] buffer = new byte[1024];
try (BufferedInputStream is = new BufferedInputStream(inputStream)) {
ByteArrayOutputStream bos = new ByteArrayOutputStream();
int length;
while ((length = is.read(buffer)) > -1) {
bos.write(buffer, 0, length);
}
bos.flush();
bytes = bos.toByteArray();
} catch (Exception e) {
e.printStackTrace();
}
return bytes;
}
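On Java 9+, the manual copy loop above can be replaced by InputStream.readAllBytes() (or transferTo), which copies raw bytes with no Reader or charset anywhere in the path. A minimal sketch:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Arrays;

public class StreamCopy {
    // Binary-safe copy: bytes in, identical bytes out, no charset conversion involved.
    static byte[] readFully(InputStream in) throws IOException {
        return in.readAllBytes(); // Java 9+
    }

    public static void main(String[] args) throws IOException {
        byte[] pdfHeader = {0x25, 0x50, 0x44, 0x46}; // the "%PDF" magic bytes
        byte[] copy = readFully(new ByteArrayInputStream(pdfHeader));
        System.out.println(Arrays.equals(pdfHeader, copy)); // prints "true"
    }
}
```

It also propagates the IOException instead of swallowing it, so a truncated upload fails loudly rather than returning a partial (blank-page) PDF.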
Edit: Thank you Rik, indeed the call formData.append('file', file, file.name); is the right one, but I still have the same problem with the encoding (? symbols being added). That is why I thought the only possibility might be to send the PDF file as Base64. I've installed the plugin, but since it's asynchronous I can't call the getFileEncodeBase64String method, and in the servlet the request stream is not the right one, as you can see in the picture.
Here you can see the sequence: before the mouse selection I have an 8 followed by a ?.
You don't have to encode the file. Instead of:
formData.append('file', file.getFileEncodeBase64String(), file.name);
you can do:
formData.append('file', file, file.name);
and your browser will post a File object, which you can handle like any other file posted with a form. See: https://developer.mozilla.org/en-US/docs/Web/API/FormData/Using_FormData_Objects

Stream csv file download immediately without waiting

I have a REST API that calls a method to retrieve a CSV. Since the CSV can be millions of rows, I retrieve 50k rows at a time. I want the download to begin as soon as I receive the first query result, and then keep appending to the same file.
I have written the following code, but the file download still starts only after all the results have been retrieved.
@GET
@Path("network/all/stats/csv/download")
@Produces({"text/csv"})
public Response downloadCSV(UIAffiliateRequest uiInstallsRequest) {
StreamingOutput stream = new StreamingOutput() {
@Override
public void write(OutputStream os) throws IOException, WebApplicationException {
Writer writer = new BufferedWriter(new OutputStreamWriter(os));
int off=0;
try {
CSVResponseType rs;
do {
rs = installsDao.getCSVResult(uiInstallsRequest, off);
String lines[] = rs.getFileName().split("\\r?\\n");
for (String path : lines) {
writer.write(path.toString() + "\n");
}
writer.flush();
off+=50000;
}
while (rs.isEmpty());
}
catch (DaoException e) {
ResponseUtils.getDataAccessErrorResponse(e);
}
}
};
return Response.ok(stream).header("Content-Disposition", "attachment;filename=resp.csv").build();
}
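For the download to start immediately, each batch has to actually reach the client: write the batch, then flush, and make sure nothing between the servlet and the client buffers the whole body. Also, the loop condition while (rs.isEmpty()) looks inverted, assuming isEmpty means what its name suggests. The write-then-flush-per-batch shape, sketched against a plain OutputStream with a hypothetical fetchPage standing in for installsDao.getCSVResult:

```java
import java.io.BufferedWriter;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class ChunkedCsv {
    // Hypothetical page fetcher standing in for installsDao.getCSVResult.
    static List<String> fetchPage(int offset, int pageSize, int total) {
        int end = Math.min(offset + pageSize, total);
        return IntStream.range(offset, end)
                .mapToObj(i -> "row-" + i).collect(Collectors.toList());
    }

    static void writeCsv(OutputStream os, int total, int pageSize) throws IOException {
        Writer writer = new BufferedWriter(new OutputStreamWriter(os));
        int off = 0;
        List<String> page;
        do {
            page = fetchPage(off, pageSize, total);
            for (String line : page) {
                writer.write(line + "\n");
            }
            writer.flush(); // push this batch to the client before fetching the next
            off += pageSize;
        } while (!page.isEmpty()); // loop while NOT empty
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        writeCsv(out, 5, 2);
        System.out.print(out); // prints row-0 through row-4, one per line
    }
}
```

Inside StreamingOutput this shape is the same; if the body still arrives all at once, look for a Content-Length header or a compression/buffering filter in front of the servlet, which forces the container to buffer the response.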

Apache POI 3.14 Writing SXSSF to ServletResponse OutputStream is corrupting workbook JSF

In my webapp, I'm building SXSSFWorkbook objects to generate reports with roughly 30-60k records each. I kick off a process through a request to first fetch and build each SXSSFWorkbook. I'm able to generate the report and open a FileOutputStream to export my object to my desktop(locally of course). However, I want to let user choose which report to download through a request(JSF)to the server. When I feed the OutputStream from the servlet response, I can download the .xlsx file but it tells me it's been corrupted. I've done some research, tried some different workarounds all with no results. Posted is my code. I'm kind of at a loss for what's going on here.
p.s. I was previously generating HSSFWorkbook objects and downloading them but they were starting to causing heap space issues. Hence, the switch to SXSSFWorkbook.
My command button
<h:commandButton value="Download Report" styleClass="secondary-button"
action="#{backingBean.getLatestReport}"
id="thirdReportButton">
</h:commandButton>
My action
public void getLatestReport() throws Exception {
FacesContext faces = FacesContext.getCurrentInstance();
String templateName = "Report.xlsx";
HttpServletResponse response = null;
OutputStream outputStream = null;
//workbookForLatestReport is a SXSSFWorkbook object
try {
if (workbookForLatestReport != null) {
response = (HttpServletResponse) faces.getExternalContext()
.getResponse();
response.reset();
response.setContentType("application/vnd.ms-excel");
response.setHeader("Content-disposition",
"attachment; filename=\"" + templateName + "\"");
outputStream = response.getOutputStream();
workbookForLatestReport.write(outputStream);
outputStream.close();
workbookForLatestReport.dispose();
}
faces.renderResponse();
} catch (Exception e) {
throw e;
}
}
Recently I successfully accomplished a similar task, so I might be able to help you.
I've just run your code (on Payara 4.1, using Chrome as the browser), adding the part that you omitted from your post:
@ManagedBean(name = "backingBean")
@ViewScoped
public class BackingBean {
//your command button should call getLatestReport and not getSecondReport() as in your original post
public void getLatestReport() throws Exception {
FacesContext faces = FacesContext.getCurrentInstance();
String templateName = "Report.xlsx";
HttpServletResponse response = null;
OutputStream outputStream = null;
//workbookForLatestReport is a SXSSFWorkbook object
//MY ADDITION START
//I've created the SXSSFWorkbook object since your post didn't contain this part
//100K rows, 100 columns
SXSSFWorkbook workbookForLatestReport = new SXSSFWorkbook(SXSSFWorkbook.DEFAULT_WINDOW_SIZE);
workbookForLatestReport.setCompressTempFiles(true);
SXSSFSheet sheet = workbookForLatestReport.createSheet();
for (int rowNumber = 0; rowNumber < 100000; rowNumber++) {
SXSSFRow row = sheet.createRow(rowNumber);
for (int columnNumber = 0; columnNumber < 100; columnNumber++) {
SXSSFCell cell = row.createCell(columnNumber);
cell.setCellValue("ROW " + rowNumber + " COLUMN " + columnNumber);
}
}
//MY ADDITION END
try {
if (workbookForLatestReport != null) {
response = (HttpServletResponse) faces.getExternalContext()
.getResponse();
response.reset();
response.setContentType("application/vnd.ms-excel");
response.setHeader("Content-disposition",
"attachment; filename=\"" + templateName + "\"");
outputStream = response.getOutputStream();
workbookForLatestReport.write(outputStream);
outputStream.close();
workbookForLatestReport.dispose();
}
faces.renderResponse();
} catch (Exception e) {
throw e;
}
}
}
It was working just fine and as expected.
I have 2 suggestions:
1. Your command button calls action="#{backingBean.getSecondReport}", but your managed bean action is named public void getLatestReport. Check whether that is a typo or not.
2. Compare your SXSSFWorkbook object creation code with my example. Are there any crucial differences?
I was able to come to a solution with the aid of OmniFaces and by changing some code around.
After I create my workbook, I use a ByteArrayOutputStream to set a byte[] attribute on my model bean that can be referenced when the download listener from the JSF is clicked.
ByteArrayOutputStream bos;
byte[] workbookForLatestBytes;
SXSSFWorkbook workbookForLatestReport;
.
.
.
workbookForLatestReport = <generate the report here>
if(workbookForLatestReport != null){
bos = new ByteArrayOutputStream();
workbookForLatestReport.write(bos);
workbookForLatestBytes = bos.toByteArray();
workbookForLatestReport.dispose();
}
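The pattern here, with POI set aside: render the report into a byte[] once, then serve those same bytes on every download without touching the workbook again. A generic sketch, where render stands in for workbook.write(bos):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

public class CachedReport {
    private byte[] reportBytes; // rendered once, served many times

    // Stand-in for workbook.write(bos); any renderer that writes to a stream fits here.
    void render(String content) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        bos.write(content.getBytes(StandardCharsets.UTF_8));
        reportBytes = bos.toByteArray();
        // dispose()/close() the source object here: the bytes no longer need it
    }

    byte[] download() {
        return reportBytes.clone(); // every download gets the same finished bytes
    }

    public static void main(String[] args) throws IOException {
        CachedReport r = new CachedReport();
        r.render("report-data");
        System.out.println(new String(r.download(), StandardCharsets.UTF_8)); // prints "report-data"
    }
}
```

This also explains why the corruption disappears: the workbook is fully written and disposed before any response is involved, so a half-written or already-disposed workbook can never reach the response stream.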
Here is the action being fired from my JSF code.
<h:commandButton value="Download Report"
styleClass="secondary-button"
action="#{productRuleAuditCompareBackingBean.getSecondReport}"
id="thirdReportButton"
rendered="#{not empty productRuleAuditCompareModelBean.workbookForLatestBytes}">
</h:commandButton>
My backing bean action is as follows. I reset the HTTP Response before writing the byte array to the response output stream.
public void getSecondReport() {
FacesContext faces = FacesContext.getCurrentInstance();
try {
faces.getExternalContext().responseReset();
this.prAuditCompareModelBean.getLatestReport();
faces.responseComplete();
} catch (Exception e) {
PODBException.wrap(e, PODBExceptionInformation.create(
PODBExceptionType.ERROR,
ProductRuleAuditCompareBackingBean.class.getName(),
"getting first report",
"Error while getting second report for download", "1"));
}
}
Here I'm using the OmniFaces Faces#sendFile method to write the workbook bytes to the response output stream.
public void getLatestReport() throws Exception {
try {
if (getWorkbookForLatestBytes() != null) {
Faces.sendFile(getWorkbookForLatestBytes(), reportName, true);
}
} catch (Exception e) {
throw e;
}
}

Exception : Stream is closed during downloading multiple attachments

I have requirement where a user can upload a file and later if other user see that he can download a file. New requirement suggest that now a user can upload multiple attachments and any user who see it can download multiple attachment as well.
So i took a list in which attachments are added and direct it to download controller, i changed the earlier line and kept a for-loop but during download only first attachment is downloaded and later it gives exception stream is closed.Below is the code of controller.Please let me know how can i over come this?
@ApiOperation(value = "Download content")
@RequestMapping(value = "/api/content/{id}/download/", method = RequestMethod.GET)
public ResponseEntity<String> downloadContent(HttpServletResponse response, @PathVariable("id") final Long id)
throws IOException, APIException {
Content content = null;
try {
content = this.contentService.get(this.contentUtils.getContentObject(id));
} catch (ServiceException e) {
throw new APIException("Access denied");
}
if (null == content) {
throw new APIException("Invalid content id");
}
List<Document> documentList = this.contentService.getDocumentByContent(content);
if (documentList != null && !documentList.isEmpty()) {
//Document document = documentList.get(0); //If multiple files supported?, then need to be handled here
for (Document document : documentList) {
File file = new File(document.getLocalFilePath());
if (file.exists()) {
response.setHeader("Content-Disposition", "attachment;filename=\"" + file.getName() + "\"");
try (InputStream inputStream = new FileInputStream(file); ServletOutputStream sos = response.getOutputStream();) {
IOUtils.copy(inputStream, sos);
} catch (final IOException e) {
LOGGER.error("File not found during content download" + id, e);
throw new APIException("Error during content download:" + id);
}
} else {
try {
s3FileUtil.download(document.getS3Url(), document.getLocalFilePath());
} catch (S3UtilException e) {
throw new APIException("Document not found");
}
}
}
} else {
//404
return new ResponseEntity<String>(HttpStatus.NOT_FOUND);
}
return new ResponseEntity<String>(HttpStatus.OK);
}
Practically, you cannot download all the files at once over a single response: once you open the stream and write one file's content to it, you have to close the stream.
If you write the files in a for-loop, you end up appending the file contents to one another, which is not the expected behavior.
When you want to download multiple files at once, you have to zip the files and download the archive.
Check this link: download multiple files
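The zip approach described above can be sketched with java.util.zip: each attachment becomes one ZipEntry, and the single response stream carries the archive (the file names and contents here are made up):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.Map;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
import java.util.zip.ZipOutputStream;

public class ZipDownload {
    // One response stream, many files: each file becomes a zip entry.
    static byte[] zipAll(Map<String, byte[]> files) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ZipOutputStream zos = new ZipOutputStream(bos)) {
            for (Map.Entry<String, byte[]> f : files.entrySet()) {
                zos.putNextEntry(new ZipEntry(f.getKey()));
                zos.write(f.getValue());
                zos.closeEntry();
            }
        }
        return bos.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] zip = zipAll(Map.of(
                "a.txt", "first".getBytes(StandardCharsets.UTF_8),
                "b.txt", "second".getBytes(StandardCharsets.UTF_8)));
        // Count the entries back out to show both files made it into the archive.
        int entries = 0;
        try (ZipInputStream zis = new ZipInputStream(new ByteArrayInputStream(zip))) {
            while (zis.getNextEntry() != null) entries++;
        }
        System.out.println(entries); // prints "2"
    }
}
```

In the controller, these bytes would be served with Content-Type application/zip and a single Content-Disposition header, writing to response.getOutputStream() exactly once.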
