Export large data from DB to CSV - java

I need to export a table into a CSV file and serve it for download via my JSF/IceFaces web application.
How can I do that? I have a table with 20+ columns and over 10 million rows.
At the moment, I use a Java thread that loads all the data into RAM. Then I create a new file and iterate over the collection, writing it row by row into the file. When the thread is done, the user can download the large file via a servlet.
But I don't want to load that many GB into RAM; I can't guarantee that I won't run into memory problems.
Can Hibernate do this for me? Or does somebody have another idea?
I'm connected to a DB2 database. The table I want to export is mapped to a Hibernate bean, but it is also possible to write native SQL.
Thank you for any response!

Do you need the intermediate stage of a file? Have you tried loading from the database and writing each row straight to your servlet output stream? That way you're acting simply as a pipe between the client and the DB.
Just set the Content-Disposition header appropriately, and that will signal the client's browser to treat the incoming data as a CSV file.
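A minimal sketch of that pipe, assuming a plain servlet with JDBC access to the table (the query, fetch size, and DataSource wiring are placeholders, not from the original question):

    import java.io.IOException;
    import java.io.PrintWriter;
    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import javax.sql.DataSource;

    public class CsvExportServlet extends HttpServlet {

        private DataSource dataSource; // wire up via JNDI lookup or injection

        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            resp.setContentType("text/csv");
            resp.setHeader("Content-Disposition", "attachment; filename=\"export.csv\"");
            try (Connection conn = dataSource.getConnection();
                 Statement st = conn.createStatement()) {
                st.setFetchSize(1000); // hint the driver to stream rows instead of buffering everything
                try (ResultSet rs = st.executeQuery("SELECT * FROM MY_TABLE")) {
                    PrintWriter out = resp.getWriter();
                    int cols = rs.getMetaData().getColumnCount();
                    while (rs.next()) {
                        StringBuilder row = new StringBuilder();
                        for (int i = 1; i <= cols; i++) {
                            if (i > 1) row.append(',');
                            row.append(rs.getString(i)); // naive: add CSV quoting/escaping as needed
                        }
                        out.println(row);
                    }
                }
            } catch (SQLException e) {
                throw new IOException(e);
            }
        }
    }

Only one row is in memory at a time, so the 10-million-row table never has to fit in RAM.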

I have also run into a similar kind of problem. The way I solved it was to write a CSV file to disk, fetching 25K records from the DB per batch and appending them to the file, repeating the process until all the data required by the report had been written.
Then send the file URL to the client to download the file.
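A rough sketch of that batching loop, assuming JDBC and OFFSET/FETCH paging (the table name, ordering column, and paging syntax are illustrative; adjust them to your database version):

    import java.io.BufferedWriter;
    import java.io.File;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.io.OutputStreamWriter;
    import java.nio.charset.StandardCharsets;
    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    void exportInBatches(Connection conn, File csv) throws SQLException, IOException {
        final int batchSize = 25_000;
        try (BufferedWriter w = new BufferedWriter(
                new OutputStreamWriter(new FileOutputStream(csv), StandardCharsets.UTF_8))) {
            int offset = 0;
            while (true) {
                int written = 0;
                String sql = "SELECT * FROM MY_TABLE ORDER BY ID "
                           + "OFFSET " + offset + " ROWS FETCH FIRST " + batchSize + " ROWS ONLY";
                try (Statement st = conn.createStatement();
                     ResultSet rs = st.executeQuery(sql)) {
                    int cols = rs.getMetaData().getColumnCount();
                    while (rs.next()) {
                        StringBuilder row = new StringBuilder();
                        for (int i = 1; i <= cols; i++) {
                            if (i > 1) row.append(',');
                            row.append(rs.getString(i)); // naive: no quoting/escaping
                        }
                        w.write(row.toString());
                        w.newLine();
                        written++;
                    }
                }
                if (written < batchSize) break; // last (partial) batch reached
                offset += batchSize;
            }
        }
    }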

Related

Export SQL database to mdb file (MS Access) using Java-Spring

I'm trying to export a SQL database to an mdb file through my Java-Spring application, but I haven't found anything really useful.
The idea is to call a controller endpoint, which calls my service. There, I need to:
Execute a SQL query and get the result set
Return the mdb file, which contains just a new database with the table generated from the result set above
Download the ms-access file from the browser
How can this be achieved?
Thanks in advance
I haven't really tried many things, as the only way I seem to be able to achieve this is by creating the mdb file directly on the client's computer and then opening a connection to execute an insert with the data, and that is not what I want to do. I'm stuck here.
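One route that avoids touching the client's machine is the open-source Jackcess library, which can create an Access file entirely on the server. A hedged sketch, assuming Jackcess 2.x (the ImportUtil call is from memory, so verify it against the current Jackcess docs):

    import com.healthmarketscience.jackcess.Database;
    import com.healthmarketscience.jackcess.DatabaseBuilder;
    import com.healthmarketscience.jackcess.util.ImportUtil;
    import java.io.File;
    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.Statement;

    // sketch: run the SQL query, dump the ResultSet into a fresh .mdb, return the file
    File exportToMdb(Connection conn, String sql) throws Exception {
        File mdb = File.createTempFile("export-", ".mdb");
        try (Database db = DatabaseBuilder.create(Database.FileFormat.V2000, mdb); // V2000 => .mdb
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery(sql)) {
            ImportUtil.importResultSet(rs, db, "ExportedTable"); // copies rows and column types
        }
        return mdb;
    }

The Spring controller can then stream the resulting file back with a Content-Disposition: attachment header so the browser downloads it.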

What is the correct/proper way to upload files to a server?

What is the correct/proper way to upload files to a server? I'm only talking about small files such as images, text files, and excel/word files.
I know that I can upload images to the database using BLOB. But what about the others?
I have a table called "Ticket" which contains information such as date created, ticket number, caller, attachment, etc.
I'm having problems on how to upload an attachment to a server.
The first option should be uploading the file to a file server and storing the file ID or UUID in your Ticket table, or in a one-to-many table that stores all attachments (see the sketch below).
Always avoid using BLOBs to store image binaries in the database. The fact that the database has this capability doesn't mean it's a good way to use it.
If you are working on a small project, you may not see the problem. But if concurrency is relatively high, imagine you store the files in the database: even if they are all just images, each few-MB image ends up in memory whenever you retrieve the tickets. It's a waste of server memory.
If you are using an ORM to retrieve the list, it will be even worse, and your server can easily run out of memory.
One more thing: if your system has a Web Application Firewall in front, it's also advisable to separate file uploads from normal form submissions.
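A minimal sketch of that first option, keeping the bytes on a file server or shared directory and only a UUID reference in the database (the path and table names are invented for illustration):

    import java.io.InputStream;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardCopyOption;
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.util.UUID;

    // save the upload to storage, keep only a reference in the DB
    String storeAttachment(Connection conn, long ticketId, InputStream upload,
                           String originalName) throws Exception {
        String fileId = UUID.randomUUID().toString();
        Path target = Paths.get("/var/app/attachments", fileId);
        Files.copy(upload, target, StandardCopyOption.REPLACE_EXISTING);

        try (PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO ticket_attachment (ticket_id, file_id, original_name) VALUES (?, ?, ?)")) {
            ps.setLong(1, ticketId);
            ps.setString(2, fileId);
            ps.setString(3, originalName);
            ps.executeUpdate();
        }
        return fileId;
    }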

How to resolve length exceeded exception in MySQL

I'm working on a website and I need to implement functionality that allows a user to upload videos. I tried doing it, but at run time I got a "max length exceeded" error. I tried resolving it by writing max_allowed_packet=100 M in MySQL's my.ini file, but it did not help, as I then got an "access denied" message.
Can you please guide me on how to get this fixed?
Don't do that. Use the filesystem to store multi-megabyte videos, and use the database to store filenames and paths. A huge video file in a database is also served less efficiently than a video stored directly in a file: web servers can use system calls like sendfile and splice to serve large files efficiently, while going through the database adds several copies and delays.
That being said, you need to set max_allowed_packet on both the client and server sides.
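Two things to check here. In my.ini the setting must read max_allowed_packet=100M under the [mysqld] section, with no space before the unit (the "100 M" spelling will not parse). And changing the value at runtime requires an administrative privilege (SUPER, or SYSTEM_VARIABLES_ADMIN on MySQL 8), which is a common cause of an "access denied" message. A sketch of the runtime route from Java, assuming an account that has that privilege (URL and credentials are placeholders):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    // raises the server-wide limit; takes effect for new connections
    void raiseMaxAllowedPacket() throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/mydb", "admin_user", "secret");
             Statement st = conn.createStatement()) {
            st.execute("SET GLOBAL max_allowed_packet = 104857600"); // 100 MB, in bytes
        }
    }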

Create a temporary file, then upload it using FTP (Java webapp)

Users of my web application have an option to start a process that generates a CSV file (populated by some data from a database) and uploads it to an FTP server (another department will read the file from there). I'm just trying to figure out how to best implement this. I use the Commons Net FTP functionality. It offers two ways to upload data to the FTP server:
storeFile(String remote, InputStream local)
storeFileStream(String remote)
It can take a while to generate all the CSV data so I think keeping a connection open the whole time (storeFileStream) would not be the best way. That's why I want to generate a temporary file, populate it and only then transfer it.
What is the best way to generate a temporary file in a webapp? Is it safe and recommended to use File.createTempFile?
As long as you don't create thousands of CSV files concurrently, the upload time doesn't matter from my point of view. Databases usually output the data row by row, and if this is already the format you need for the CSV file, I strongly recommend not using temporary files at all; just do the conversion on the fly:
Create an InputStream implementation that reads the database data row by row, converts it to CSV, and publishes the data via its read() methods.
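A bare-bones sketch of such an InputStream (no CSV quoting or escaping, purely illustrative):

    import java.io.IOException;
    import java.io.InputStream;
    import java.nio.charset.StandardCharsets;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    // streams a ResultSet as CSV bytes, one row converted at a time
    class ResultSetCsvInputStream extends InputStream {
        private final ResultSet rs;
        private final int columnCount;
        private byte[] buffer = new byte[0];
        private int pos = 0;

        ResultSetCsvInputStream(ResultSet rs) throws SQLException {
            this.rs = rs;
            this.columnCount = rs.getMetaData().getColumnCount();
        }

        @Override
        public int read() throws IOException {
            if (pos >= buffer.length) { // current row exhausted, fetch the next one
                try {
                    if (!rs.next()) return -1; // end of data
                    StringBuilder row = new StringBuilder();
                    for (int i = 1; i <= columnCount; i++) {
                        if (i > 1) row.append(',');
                        row.append(rs.getString(i)); // naive: no quoting/escaping
                    }
                    row.append('\n');
                    buffer = row.toString().getBytes(StandardCharsets.UTF_8);
                    pos = 0;
                } catch (SQLException e) {
                    throw new IOException(e);
                }
            }
            return buffer[pos++] & 0xFF;
        }
    }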
BTW: you mentioned that the conversion is done by a web application and that it can take a long time. This can be problematic, as the default web client has a timeout; the long-running process is therefore better done by a background thread that is only triggered by the webapp interface.
It is OK to use createTempFile; new File(tmpDir, UUID.randomUUID().toString()) will do as well. Just do not use deleteOnExit(): it is a leak master (the paths are held in memory until the JVM exits). Make sure you delete the file on your own.
Edit: since you WILL have the data in memory, do not store it anywhere; wrap it in a java.io.ByteArrayInputStream and use the method with the InputStream. A much neater and better solution.
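Putting that together with Commons Net, a sketch (host and credentials are placeholders):

    import java.io.ByteArrayInputStream;
    import java.io.InputStream;
    import org.apache.commons.net.ftp.FTP;
    import org.apache.commons.net.ftp.FTPClient;

    void uploadCsv(byte[] csvBytes) throws Exception {
        FTPClient ftp = new FTPClient();
        try {
            ftp.connect("ftp.example.com");
            ftp.login("user", "password");
            ftp.enterLocalPassiveMode();
            ftp.setFileType(FTP.BINARY_FILE_TYPE);
            try (InputStream in = new ByteArrayInputStream(csvBytes)) {
                ftp.storeFile("report.csv", in); // connection is held only for the transfer itself
            }
            ftp.logout();
        } finally {
            ftp.disconnect();
        }
    }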

Java: how to upload files to server and handle authentication?

I have a small Linux VPS. I have written a Java client application, which needs to connect and submit large string data and images. The string data will be stored as regular text files on the server, and will be parsed by another Java application that runs on the server and uses these uploaded files and images.
The next part of the problem is that, because this Java client will be run by several users, I need some way to uniquely tie each uploaded file to the currently logged-in user session on the website (the user needs to log in on the website to be able to run the tasks). Any suggestions or more efficient patterns?
Don't write the stuff to files. Punch the uploaded data into 'raw' database tables keyed by user ID. The batch job can pull the data out, parse/format/fold/spindle/mutilate it, stuff the results into the real tables, then delete the raw data.
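A sketch of what that raw staging insert could look like (the table and column names are invented for illustration):

    import java.sql.Connection;
    import java.sql.PreparedStatement;

    // stage the uploaded payload keyed by the authenticated user's ID;
    // a server-side batch job later parses it and deletes the row
    void stageUpload(Connection conn, long userId, String payload) throws Exception {
        try (PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO raw_upload (user_id, payload, uploaded_at) "
              + "VALUES (?, ?, CURRENT_TIMESTAMP)")) {
            ps.setLong(1, userId);
            ps.setString(2, payload);
            ps.executeUpdate();
        }
    }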
