Wait for the subscribe method to collect all data into an array before uploading - Java

I have an endpoint whose purpose is to receive a CSV file, make a couple of changes to its name, and then pass it to a method that uploads all of its data to Google Cloud as a single plain-text file.
The file can have more than 100,000 records, so when parsing it I have to collect all the data in a variable and only then save it to Google Cloud. It works today, but I keep overwriting the same file: I don't know how to make the method wait until the subscribe has finished before uploading, so every time data is added to the array the file is uploaded again.
Although the method does what I want, I also want to improve its performance, since uploading a file of only 2 MB with 100,000 records takes approximately 15 minutes. Any ideas?
private Storage storage;

private void uploadToGoogleCloudStorage(FilePart filePart, BlobInfo blobInfo) throws IOException {
    try (ByteArrayOutputStream bos = new ByteArrayOutputStream()) {
        filePart.content()
            .subscribe(dataBuffer -> {
                byte[] bytes = new byte[dataBuffer.readableByteCount()];
                dataBuffer.read(bytes);
                DataBufferUtils.release(dataBuffer);
                try {
                    bos.write(bytes);
                    storage.createFrom(blobInfo, new ByteArrayInputStream(bos.toByteArray()));
                } catch (IOException e) {
                    e.printStackTrace();
                }
            });
    }
}

Finally I got the solution. I changed the subscribe to a map, then took the last element of the Flux and subscribed to that to upload the data to Google Cloud Storage with the Storage interface (the class from Google's client library).
private Storage storage;

private void uploadToGoogleCloudStorage(FilePart filePart, BlobInfo blobInfo) throws IOException {
    try (ByteArrayOutputStream bos = new ByteArrayOutputStream()) {
        filePart.content()
            .map(dataBuffer -> {
                byte[] bytes = new byte[dataBuffer.readableByteCount()];
                dataBuffer.read(bytes);
                DataBufferUtils.release(dataBuffer);
                try {
                    bos.write(bytes);
                } catch (IOException e) {
                    e.printStackTrace();
                }
                return bos;
            })
            .last()
            .subscribe(data -> {
                try {
                    storage.createFrom(blobInfo, new ByteArrayInputStream(bos.toByteArray()));
                } catch (IOException e) {
                    e.printStackTrace();
                }
            });
    }
}
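For what it's worth, a simpler variant is to let Reactor assemble the buffers with DataBufferUtils.join and upload once the joined buffer arrives. This is not the poster's code, just a minimal sketch under the assumption that Spring's DataBufferUtils and the google-cloud-storage Storage client (which the original code already uses) are on the classpath:

import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.Storage;
import org.springframework.core.io.buffer.DataBufferUtils;
import org.springframework.http.codec.multipart.FilePart;
import reactor.core.publisher.Mono;

import java.io.ByteArrayInputStream;

class GcsUploader {

    private final Storage storage;

    GcsUploader(Storage storage) {
        this.storage = storage;
    }

    // Returns a Mono so the caller can compose on it (or await it) instead of
    // uploading once per buffer inside subscribe.
    Mono<Void> uploadToGoogleCloudStorage(FilePart filePart, BlobInfo blobInfo) {
        return DataBufferUtils.join(filePart.content())   // one DataBuffer containing the whole body
                .map(dataBuffer -> {
                    byte[] bytes = new byte[dataBuffer.readableByteCount()];
                    dataBuffer.read(bytes);
                    DataBufferUtils.release(dataBuffer);
                    return bytes;
                })
                .flatMap(bytes -> Mono.fromCallable(() -> {
                    // createFrom is a blocking call; it runs exactly once, after all bytes arrived
                    storage.createFrom(blobInfo, new ByteArrayInputStream(bytes));
                    return true;
                }))
                .then();
    }
}

As in the original, the whole file is buffered in memory; since storage.createFrom is blocking, in a fully reactive pipeline it could additionally be offloaded to Schedulers.boundedElastic().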

Related

On multiple API requests, old request processing stops

I have created a REST API in core Java.
This API processes a large amount of data and then returns an Excel file.
The issue I'm facing: when I send a single request from Postman, the Excel file contains complete data. But when I send multiple requests (say 3) from Postman, the Excel files for the first two requests contain incomplete data (only 40-60 records instead of 100), while the last request again returns a complete file.
It seems that whenever a new request comes in, the processing of the old one stops.
The API code:
#Path("dynamictest")
#POST
#Produces(XLSX)
public Object getDynamicExcelReports(ReportParams params)
throws SQLException, IOException, IllegalAccessException, NoSuchFieldException {
params.setUserId(getUserId());
if (params.getMail()) {
new Thread(() -> {
try {
MimeBodyPart attachment = new MimeBodyPart();
attachment.setFileName("report.xlsx");
attachment.setDataHandler(new DataHandler(new ByteArrayDataSource(
Dynamic.getDynamicExcelReporttest(params).toByteArray(), "application/octet-stream")));
Context.getMailManager().sendMessage(
params.getUserId(), "Report", "The report is in the attachment.", attachment, User.class, null);
} catch (Exception e) {
LOGGER.warn("Report failed", e);
}
}).start();
return Response.noContent().build();
} else {
return Response.ok(Dynamic.getDynamicExcelReporttest(params).toByteArray())
.header(HttpHeaders.CONTENT_DISPOSITION, CONTENT_DISPOSITION_VALUE_XLSX).build();
}
}
The Excel write function:
public static ByteArrayOutputStream processExcelV2test(Collection<DynamicReport> reports, ReportParams params,
        RpTmplWrapper rpTmplWrapper, Date from, Date to, boolean isDriverReport)
        throws IOException, IllegalAccessException, NoSuchFieldException, SQLException {
    XSSFWorkbook workbook = new XSSFWorkbook();
    List<RpTmplTblWrapper> tblListWrapper = new ArrayList<>(rpTmplWrapper.getRpTmplTblWrappers());
    tblListWrapper.sort(Comparator.comparing(tblWrapper -> tblWrapper.getRpTmplTbl().getPosition()));
    tblListWrapper.forEach((tblWrapper) -> { // loop for multiple sheets
        try {
            String sheetName = tblWrapper.getRpTmplTbl().getLabel().replaceAll("[^A-Za-z0-9]", "|");
            Sheet sheet = workbook.createSheet(sheetName);
            /** setting data in rows and columns **/
        } catch (Exception ex) {
        }
    });
    Logger.getLogger(DynamicExcelUtils.class.getName()).log(Level.WARNING, "workbook completed");
    ByteArrayOutputStream stream = new ByteArrayOutputStream();
    workbook.write(stream);
    workbook.close();
    return stream;
}
Any help or suggestion would be appreciated.
Thanks.
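The symptom described (earlier requests losing rows as soon as a new request starts) usually points to mutable state shared across requests somewhere in the report-building code, for example a static collection that every request clears and refills. The internals of Dynamic.getDynamicExcelReporttest are not shown here, so this is only a hypothetical illustration of the pattern to look for, not the actual code:

import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

// Hypothetical illustration only: a static buffer shared by every request lets
// one request truncate another's data; per-request local state does not.
class ReportDataHolder {

    // BAD: one list shared by all requests; request B's clear() cuts off request A's report.
    private static final List<String> ROWS = new ArrayList<>();

    static byte[] buildShared(List<String> records) {
        ROWS.clear();
        ROWS.addAll(records);
        return render(ROWS);
    }

    // BETTER: each request works on its own local list, so requests cannot interfere.
    static byte[] buildPerRequest(List<String> records) {
        List<String> rows = new ArrayList<>(records);
        return render(rows);
    }

    private static byte[] render(List<String> rows) {
        return String.join("\n", rows).getBytes(StandardCharsets.UTF_8);
    }
}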

How do I create a file sending client/server with RSocket?

I can't seem to find any resources/tutorials on RSocket, other than just reading their code on GitHub, which I don't understand.
I have a file's path on my server: String serverFilePath;
I'd like to be able to download it from my client (using RSocket's Aeron implementation, preferably). Does anyone know how to do this using RSocket?
Thanks in advance.
I work on RSocket and wrote a large portion of the Java version, including the Aeron transport.
I wouldn't recommend using the Aeron implementation currently. There are a couple of ways you can send files:
Use requestChannel to push the data to a remote server.
Use requestChannel or requestStream to stream bytes to a client.
Here's an example using requestStream:
public class FileCopy {
    public static void main(String... args) throws Exception {
        // Create a socket that receives incoming connections
        RSocketFactory.receive()
            .acceptor(
                new SocketAcceptor() {
                    @Override
                    // Create a new socket acceptor
                    public Mono<RSocket> accept(ConnectionSetupPayload setup, RSocket sendingSocket) {
                        return Mono.just(
                            new AbstractRSocket() {
                                @Override
                                public Flux<Payload> requestStream(Payload payload) {
                                    // Get the path of the file to copy
                                    String path = payload.getDataUtf8();
                                    SeekableByteChannel _channel = null;
                                    try {
                                        _channel = Files.newByteChannel(Paths.get(path), StandardOpenOption.READ);
                                    } catch (IOException e) {
                                        return Flux.error(e);
                                    }
                                    ReferenceCountUtil.safeRelease(payload);
                                    SeekableByteChannel channel = _channel;
                                    // Use Flux.generate to create a publisher that returns the file
                                    // 1024 bytes at a time
                                    return Flux.generate(
                                        sink -> {
                                            try {
                                                ByteBuffer buffer = ByteBuffer.allocate(1024);
                                                int read = channel.read(buffer);
                                                buffer.flip();
                                                sink.next(DefaultPayload.create(buffer));
                                                if (read == -1) {
                                                    channel.close();
                                                    sink.complete();
                                                }
                                            } catch (Throwable t) {
                                                sink.error(t);
                                            }
                                        });
                                }
                            });
                    }
                })
            .transport(TcpServerTransport.create(9090))
            .start()
            .subscribe();

        String path = args[0];
        String dest = args[1];

        // Connect to a server
        RSocket client =
            RSocketFactory.connect().transport(TcpClientTransport.create(9090)).start().block();

        File f = new File(dest);
        f.createNewFile();

        // Open a channel to a new file
        SeekableByteChannel channel =
            Files.newByteChannel(f.toPath(), StandardOpenOption.CREATE, StandardOpenOption.WRITE);

        // Request a stream of bytes
        client
            .requestStream(DefaultPayload.create(path))
            .doOnNext(
                payload -> {
                    try {
                        // Write the bytes received to the new file
                        ByteBuffer data = payload.getData();
                        channel.write(data);
                        // Release the payload
                        ReferenceCountUtil.safeRelease(payload);
                    } catch (Exception e) {
                        throw new RuntimeException(e);
                    }
                })
            // Block until all the bytes are received
            .blockLast();

        // Close the file you're writing to
        channel.close();
    }
}
There is now a resumable file transfer example here
https://github.com/rsocket/rsocket-java/commit/d47629147dd1a4d41c7c8d5af3d80838e01d3ba5

Writing with an OutputStream to a DocumentFile: data seem to be written but file ends up empty

The application KDE Connect allows remotely browsing an Android device from a desktop computer through SFTP. Since Android 4.4, developers don't have write permission to SD cards directly through the filesystem anymore. So I am trying to port the SFTP module using the Storage Access Framework (DocumentFile, etc.)
I am taking the permission with an Intent.ACTION_OPEN_DOCUMENT_TREE and FLAG_GRANT_WRITE_URI_PERMISSION and passing the context to my classes.
I am able to create new empty files, rename files and delete files on the SD card inside my class so I believe I am getting the necessary permissions. However, transferring a file results in an empty file (0 bytes) being created. I can see the transfer taking a certain time and a progress bar on the desktop side, so it doesn't just abort.
Here is the relevant part of the SftpSubsystem class from the Apache SSHD library (see doc here) with my own comments to explain what's going on:
public class SftpSubsystem implements Command, Runnable, SessionAware, FileSystemAware {

    // This method receives a buffer from an InputStream and processes it
    // according to its type. In this situation, it would also contain
    // a block of the file being transferred (4096 bytes)
    protected void process(Buffer buffer) {
        int type = buffer.getByte();
        switch (type) {
            case WRITE:
                FileHandle fh = getHandleFromString(buffer.getString());
                long offset = buffer.getLong();
                byte[] data = buffer.getBytes();
                fh.write(data, offset);
                break;
            // other cases
        }
    }

    // This class is a handle to a file (duh) with
    // an OutputStream to write and an InputStream to read
    protected static class FileHandle {
        SshFile file;
        OutputStream output;
        long outputPos;
        InputStream input;
        long inputPos;

        // Method called inside process()
        public void write(byte[] data, long offset) throws IOException {
            if (output != null && offset != outputPos) {
                IoUtils.closeQuietly(output);
                output = null;
            }
            if (output == null) {
                // This is called once at the start of the transfer.
                // This is what I think I need to rewrite to make
                // it work with DocumentFile objects.
                output = file.createOutputStream(offset);
            }
            output.write(data);
            outputPos += data.length;
        }
    }
}
The original implementation of createOutputStream() that I want to rewrite because RandomAccessFile doesn't work with DocumentFile:
public class NativeSshFile implements SshFile {

    private File file;

    public OutputStream createOutputStream(final long offset)
            throws IOException {
        // permission check
        if (!isWritable()) {
            throw new IOException("No write permission : " + file.getName());
        }

        // move to the appropriate offset and create output stream
        final RandomAccessFile raf = new RandomAccessFile(file, "rw");
        try {
            raf.setLength(offset);
            raf.seek(offset);

            // The IBM jre needs to have both the stream and the random access file
            // objects closed to actually close the file
            return new FileOutputStream(raf.getFD()) {
                public void close() throws IOException {
                    super.close();
                    raf.close();
                }
            };
        } catch (IOException e) {
            raf.close();
            throw e;
        }
    }
}
One of the ways I tried to implement it:
class SimpleSftpServer {

    static class AndroidSshFile extends NativeSshFile {

        // This is the DocumentFile that is stored after
        // create() created the empty file
        private DocumentFile docFile;

        public OutputStream createOutputStream(final long offset) throws IOException {
            // permission check
            if (!isWritable()) {
                throw new IOException("No write permission : " + docFile.getName());
            }

            ParcelFileDescriptor pfd = context.getContentResolver().openFileDescriptor(docFile.getUri(), "rw");
            FileDescriptor fd = pfd.getFileDescriptor();
            try {
                android.system.Os.lseek(fd, offset, OsConstants.SEEK_SET);
            } catch (ErrnoException e) {
                Log.e("SimpleSftpServer", "" + e);
                return null;
            }
            // fd is already positioned at offset by the lseek above
            return new FileOutputStream(fd);
        }
    }
}
I also tried a simpler version (the offset is ignored, but it's just a test):
public OutputStream createOutputStream(final long offset) throws IOException {
    // permission check
    if (!isWritable()) {
        throw new IOException("No write permission : " + docFile.getName());
    }
    return context.getContentResolver().openOutputStream(docFile.getUri());
}
I also tried using a FileChannel, and flushing and syncing the FileOutputStream.
Any idea why I end up with an empty file?
EDIT: here is a small example of a test I did that just writes a new file from an existing file. It works, but it is not what I actually want to do (see the code above); I provide it only to show that I understand the basics of writing to an OutputStream.
private void createDocumentFileFromFile() {
    File fileToRead = new File("/storage/0123-4567/lady.m4a");
    File fileToWrite = new File("/storage/0123-4567/lady2.m4a");
    File dir = fileToWrite.getParentFile();
    DocumentFile docDir = DocumentFile.fromTreeUri(context, SimpleSftpServer.externalStorageUri);
    try {
        DocumentFile createdFile = docDir.createFile(null, fileToWrite.getName());
        Uri uriToRead = Uri.fromFile(fileToRead);
        InputStream in = context.getContentResolver().openInputStream(uriToRead);
        OutputStream out = context.getContentResolver().openOutputStream(createdFile.getUri());
        try {
            int nbOfBytes = 0;
            final int BLOCKSIZE = 4096;
            byte[] bytesRead = new byte[BLOCKSIZE];
            while (true) {
                nbOfBytes = in.read(bytesRead);
                if (nbOfBytes == -1) {
                    break;
                }
                out.write(bytesRead, 0, nbOfBytes);
            }
        } finally {
            in.close();
            out.close();
        }
    } catch (IOException e) {
    }
}
"When using ACTION_OPEN_DOCUMENT_TREE, your app gains access only to the files in the directory that the user selects. You don't have access to other apps' files that reside outside this user-selected directory.
This user-controlled access allows users to choose exactly what content they're comfortable sharing with your app."
This means that you can only read/write/delete the content and metadata of files that already exist, or of sub-directories inside the selected directory: the scope the user agreed to be "comfortable" with.
Actually, the user granted permission to a list of URIs under this folder; for each file or sub-directory there is a separate URI permission.
For example, if I try to create a new file under the selected URI using DocumentFile, I will succeed, but if I then try to write new data to that file through an OutputStream, I will fail, because the user did not grant permission to write to this newly created file.
They only granted permission at the directory-path level, which means "create new files here".
The same happens when you try to move/transfer a file to another path that the user has not granted permission for.
A path can be a folder or a file, and for each new path the user needs to grant new access.
move file = new path
write to a just-created file = new path
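For reference, the tree grant this answer talks about is taken and persisted like this. This is only a minimal sketch of standard Storage Access Framework usage inside an Activity, not code from the question; the request code PICK_TREE is a hypothetical name.

import android.content.Intent;
import android.net.Uri;

// Inside an Activity; PICK_TREE is a hypothetical request-code constant.
// The picker is launched elsewhere with:
// startActivityForResult(new Intent(Intent.ACTION_OPEN_DOCUMENT_TREE), PICK_TREE);
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == PICK_TREE && resultCode == RESULT_OK && data != null) {
        Uri treeUri = data.getData();
        // The grant applies to the picked tree; persist it so it survives reboots.
        getContentResolver().takePersistableUriPermission(treeUri,
                Intent.FLAG_GRANT_READ_URI_PERMISSION | Intent.FLAG_GRANT_WRITE_URI_PERMISSION);
    }
}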

Exception: Stream is closed when downloading multiple attachments

I have a requirement where a user can upload a file and, later, another user who sees it can download that file. A new requirement says that a user can now upload multiple attachments, and any user who sees them can download the multiple attachments as well.
So I took a list to which the attachments are added and passed it to the download controller. I replaced the earlier line with a for-loop, but during download only the first attachment is downloaded and then it throws an exception that the stream is closed. Below is the controller code. Please let me know how I can overcome this.
@ApiOperation(value = "Download content")
@RequestMapping(value = "/api/content/{id}/download/", method = RequestMethod.GET)
public ResponseEntity<String> downloadContent(HttpServletResponse response, @PathVariable("id") final Long id)
        throws IOException, APIException {
    Content content = null;
    try {
        content = this.contentService.get(this.contentUtils.getContentObject(id));
    } catch (ServiceException e) {
        throw new APIException("Access denied");
    }
    if (null == content) {
        throw new APIException("Invalid content id");
    }
    List<Document> documentList = this.contentService.getDocumentByContent(content);
    if (documentList != null && !documentList.isEmpty()) {
        //Document document = documentList.get(0); //If multiple files supported?, then need to be handled here
        for (Document document : documentList) {
            File file = new File(document.getLocalFilePath());
            if (file.exists()) {
                response.setHeader("Content-Disposition", "attachment;filename=\"" + file.getName() + "\"");
                try (InputStream inputStream = new FileInputStream(file);
                     ServletOutputStream sos = response.getOutputStream()) {
                    IOUtils.copy(inputStream, sos);
                } catch (final IOException e) {
                    LOGGER.error("File not found during content download" + id, e);
                    throw new APIException("Error during content download:" + id);
                }
            } else {
                try {
                    s3FileUtil.download(document.getS3Url(), document.getLocalFilePath());
                } catch (S3UtilException e) {
                    throw new APIException("Document not found");
                }
            }
        }
    } else {
        // 404
        return new ResponseEntity<String>(HttpStatus.NOT_FOUND);
    }
    return new ResponseEntity<String>(HttpStatus.OK);
}
Practically, you cannot download all the files at once this way, because once you open the response stream and write one file's content to it, you have to close the stream.
If you add the files in a for-loop, you would have to append each file's content to the same stream rather than send separate files, which is not the expected behavior.
When you want to download multiple files at once, you have to zip the files and download the archive.
Check this link: download multiple files
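A minimal sketch of the zip-and-stream approach suggested above, reusing the question's Document entity and controller style for illustration (only getLocalFilePath() is assumed from it); not a drop-in replacement:

import org.apache.commons.io.IOUtils;

import javax.servlet.http.HttpServletResponse;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

// Streams every attachment into one ZIP archive over a single response stream,
// so the response OutputStream is opened and closed exactly once.
public static void writeDocumentsAsZip(List<Document> documentList, HttpServletResponse response, Long id)
        throws IOException {
    response.setContentType("application/zip");
    response.setHeader("Content-Disposition", "attachment;filename=\"content-" + id + ".zip\"");
    try (ZipOutputStream zos = new ZipOutputStream(response.getOutputStream())) {
        for (Document document : documentList) {
            File file = new File(document.getLocalFilePath());
            try (InputStream in = new FileInputStream(file)) {
                zos.putNextEntry(new ZipEntry(file.getName())); // one entry per attachment
                IOUtils.copy(in, zos);
                zos.closeEntry();
            }
        }
    }
}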

PDF file download using BlockingQueue

I'm trying to download a PDF file using URLConnection. Here's how I set up the connection object.
URL serverUrl = new URL(url);
urlConnection = (HttpURLConnection) serverUrl.openConnection();
urlConnection.setDoInput(true);
urlConnection.setRequestMethod("GET");
urlConnection.setRequestProperty("Content-Type", "application/pdf");
urlConnection.setRequestProperty("ENCTYPE", "multipart/form-data");
String contentLength = urlConnection.getHeaderField("Content-Length");
I obtained an InputStream from the connection object.
bufferedInputStream = new BufferedInputStream(urlConnection.getInputStream());
And the output stream to write the file contents.
File dir = new File(context.getFilesDir(), mFolder);
if(!dir.exists()) dir.mkdir();
final File f = new File(dir, String.valueOf(documentName));
f.createNewFile();
final BufferedOutputStream bufferedOutputStream = new BufferedOutputStream(new FileOutputStream(f, true)); //true for appendMode
A BlockingQueue is created so that the threads performing the read and write operations can share data through it.
final BlockingQueue<ByteArrayWrapper> blockingQueue = new ArrayBlockingQueue<ByteArrayWrapper>(MAX_VALUE,true);
final byte[] dataBuffer = new byte[MAX_VALUE];
Now a thread is created to read data from the InputStream.
Thread readerThread = new Thread(new Runnable() {
    @Override
    public void run() {
        try {
            int count = 0;
            while ((count = bufferedInputStream.read(dataBuffer, 0, dataBuffer.length)) != -1) {
                ByteArrayWrapper byteArrayWrapper = new ByteArrayWrapper(dataBuffer);
                byteArrayWrapper.setBytesReadCount(count);
                blockingQueue.put(byteArrayWrapper);
            }
            blockingQueue.put(null); // end of file
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            try {
                bufferedInputStream.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
});
The writer thread then consumes those file contents.
Thread writerThread = new Thread(new Runnable() {
    @Override
    public void run() {
        try {
            while (true) {
                ByteArrayWrapper byteWrapper = blockingQueue.take();
                if (null == byteWrapper) break;
                bufferedOutputStream.write(byteWrapper.getBytesRead(), 0, byteWrapper.getBytesReadCount());
            }
            bufferedOutputStream.flush();
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            try {
                bufferedOutputStream.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
});
Finally, threads are started.
readerThread.start();
writerThread.start();
Theoretically it should read the file from the InputStream and save it to the target file. In reality, however, it produces a blank PDF file; at other times it shows an invalid PDF format exception. The file size matches the content length of the InputStream. Is there anything I'm missing?
I'm not familiar with ByteArrayWrapper. Does it just hold a reference to the array, like this?
public class ByteArrayWrapper {
    private final byte[] data;

    public ByteArrayWrapper(byte[] data) {
        this.data = data;
    }

    public byte[] getBytesRead() {
        return data;
    }

    /* ...etc... */
}
If so, that would be the problem: all of the ByteArrayWrapper objects are backed by the same array, which the reading loop keeps overwriting, even though the BlockingQueue did the hard work of safely publishing each object from one thread to the other.
The simplest fix might be to make the ByteArrayWrapper effectively immutable, i.e. don't change it after publishing it to another thread. Taking a copy of the array on construction would be simplest:
public ByteArrayWrapper(byte[] data) {
    this.data = Arrays.copyOf(data, data.length);
}
One other problem is that "BlockingQueue does not accept null elements" (see BlockingQueue docs), and so the "end of input" sentinel value doesn't work. Replacing null with a
private static ByteArrayWrapper END = new ByteArrayWrapper(new byte[]{});
in the appropriate places will fix that.
By making those changes to a copy of the code I was able to retrieve a faithful copy of a PDF file.
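To make the two fixes concrete, here is a small self-contained demo of the pattern (defensive copy plus a shared END sentinel instead of null); it is a stand-alone sketch, not the question's download code:

import java.util.Arrays;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Minimal demo: the wrapper copies the array, and an END sentinel marks end of input.
public class WrapperDemo {

    static class ByteArrayWrapper {
        private final byte[] data;
        private int bytesReadCount;

        ByteArrayWrapper(byte[] data) {
            this.data = Arrays.copyOf(data, data.length); // defensive copy
        }

        void setBytesReadCount(int count) { bytesReadCount = count; }
        byte[] getBytesRead() { return data; }
        int getBytesReadCount() { return bytesReadCount; }
    }

    private static final ByteArrayWrapper END = new ByteArrayWrapper(new byte[]{});

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<ByteArrayWrapper> queue = new ArrayBlockingQueue<>(16, true);

        Thread reader = new Thread(() -> {
            try {
                byte[] buffer = new byte[4];
                for (int i = 0; i < 3; i++) {        // stands in for the InputStream read loop
                    Arrays.fill(buffer, (byte) ('a' + i));
                    ByteArrayWrapper w = new ByteArrayWrapper(buffer);
                    w.setBytesReadCount(buffer.length);
                    queue.put(w);
                }
                queue.put(END);                      // end-of-input sentinel, not null
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        Thread writer = new Thread(() -> {
            try {
                while (true) {
                    ByteArrayWrapper w = queue.take();
                    if (w == END) break;             // stop on the sentinel
                    System.out.write(w.getBytesRead(), 0, w.getBytesReadCount());
                }
                System.out.flush();
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        reader.start();
        writer.start();
        reader.join();
        writer.join();
    }
}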
Try using Android's DownloadManager (http://developer.android.com/reference/android/app/DownloadManager.html); it is designed to handle long-running HTTP requests in the background.
With it you don't need to think about the received bytes, and the progress is displayed in the notification bar.
There is a good tutorial here: http://blog.vogella.com/2011/06/14/android-downloadmanager-example/
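As an illustration of that suggestion, a minimal sketch of enqueueing the PDF with DownloadManager, assuming it runs with a valid Context and that url and documentName come from the question's surrounding code:

import android.app.DownloadManager;
import android.content.Context;
import android.net.Uri;
import android.os.Environment;

// Hands the long-running HTTP download off to the system DownloadManager.
void enqueuePdfDownload(Context context, String url, String documentName) {
    DownloadManager.Request request = new DownloadManager.Request(Uri.parse(url));
    request.setTitle(documentName);
    request.setMimeType("application/pdf");
    // Show progress in the notification bar and keep the notification when done.
    request.setNotificationVisibility(DownloadManager.Request.VISIBILITY_VISIBLE_NOTIFY_COMPLETED);
    request.setDestinationInExternalFilesDir(context, Environment.DIRECTORY_DOWNLOADS, documentName);

    DownloadManager downloadManager =
            (DownloadManager) context.getSystemService(Context.DOWNLOAD_SERVICE);
    downloadManager.enqueue(request);
}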
