I am going to make a business application for my father to make GST (Goods and Services Tax) filing easier. I have the design ready and I am going to use JavaFX.
The user will enter the data in a TableView, and that data needs to be stored for future reference.
The TableView needs to be converted to an Excel file (I am going to use Apache POI). The Excel file will be sent to a CA (chartered accountant) who will file GST on my father's behalf.
The application will need to import/export data into/from the TableView and edit the data as necessary.
I have 2 options:
Store/retrieve data from MySQL to the TableView, update it according to the user's will, and later export the data into Excel files for sending to the CA.
Store/retrieve data from Excel files to the TableView, update it according to the user's will, and send the Excel file to the CA.
I am planning to expand the application into complete business software that can manage the entire business.
What should I use?
Which one will be more efficient and why?
I hope I am able to convey my question (I ain't good at writing).
In my own opinion, MySQL is more efficient and gives more possibilities to exploit the data, because reading and writing an Excel file takes a lot of time and is slower.
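For reference, the POI export step described above could look roughly like this; the Invoice row model and the column layout are made-up placeholders, not part of the actual design:

```java
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;

import java.io.FileOutputStream;
import java.io.IOException;
import java.util.List;

public class ExcelExporter {

    /** Placeholder row model; the real one would mirror the TableView columns. */
    public static class Invoice {
        public final String invoiceNo;
        public final String party;
        public final double taxableValue;
        public Invoice(String invoiceNo, String party, double taxableValue) {
            this.invoiceNo = invoiceNo;
            this.party = party;
            this.taxableValue = taxableValue;
        }
    }

    public static void export(List<Invoice> rows, String path) throws IOException {
        try (Workbook workbook = new XSSFWorkbook();
             FileOutputStream out = new FileOutputStream(path)) {

            Sheet sheet = workbook.createSheet("GST");

            // Header row mirroring the TableView columns.
            Row header = sheet.createRow(0);
            header.createCell(0).setCellValue("Invoice No");
            header.createCell(1).setCellValue("Party");
            header.createCell(2).setCellValue("Taxable Value");

            // One spreadsheet row per TableView item.
            int r = 1;
            for (Invoice inv : rows) {
                Row row = sheet.createRow(r++);
                row.createCell(0).setCellValue(inv.invoiceNo);
                row.createCell(1).setCellValue(inv.party);
                row.createCell(2).setCellValue(inv.taxableValue);
            }

            workbook.write(out);
        }
    }
}
```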
I'll answer my own question, since I have got the answer.
I'm going with SQLite for now, as using CSV or Excel files consumes a lot of resources (I tried it).
I am going to sync the .db file to Drive using scripts run from the application itself. MySQL is definitely a better choice, but I want the database to be usable from 2 computers at a time (not on a network), so I would have to pay for an online database.
I will store the .db file in Drive and retrieve it whenever the application runs. That way it is also going to be safe.
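For reference, a minimal sketch of the SQLite side, assuming the sqlite-jdbc driver is on the classpath; the table and column names are illustrative only:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class InvoiceStore {

    // Local SQLite file, the same .db file that gets synced to Drive.
    private static final String URL = "jdbc:sqlite:gst.db";

    public static void init() throws SQLException {
        try (Connection con = DriverManager.getConnection(URL);
             Statement st = con.createStatement()) {
            st.execute("CREATE TABLE IF NOT EXISTS invoice ("
                    + "invoice_no TEXT PRIMARY KEY, "
                    + "party TEXT, "
                    + "taxable_value REAL)");
        }
    }

    public static void save(String invoiceNo, String party, double taxableValue) throws SQLException {
        try (Connection con = DriverManager.getConnection(URL);
             PreparedStatement ps = con.prepareStatement(
                     "INSERT OR REPLACE INTO invoice VALUES (?, ?, ?)")) {
            ps.setString(1, invoiceNo);
            ps.setString(2, party);
            ps.setDouble(3, taxableValue);
            ps.executeUpdate();
        }
    }

    // Load everything back, e.g. to repopulate the TableView on startup.
    public static void loadAll() throws SQLException {
        try (Connection con = DriverManager.getConnection(URL);
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery(
                     "SELECT invoice_no, party, taxable_value FROM invoice")) {
            while (rs.next()) {
                // Add the values to the TableView's ObservableList here.
                System.out.println(rs.getString("invoice_no"));
            }
        }
    }
}
```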
I have to build an app that extracts stock exchange data from an Excel sheet and then saves it in a database. The problem is that the Excel sheet obtains new data via a Bloomberg plug-in.
To refresh the sheet, the user has to open the spreadsheet and hit the refresh button on the plug-in in Excel. After a second the data is added to the sheet.
I have seen guides about getting or putting data into an Excel sheet via Java, but I am not sure how to open the sheet, trigger the update and get the data.
Maybe you can retrieve the data directly in Excel by using Web Queries.
So basically, this feature allows you to get data from the web. You specify the URL, then you select which data on the page you want to import.
You can write these queries in VB as well (ask if you want more details on how to do it with VB).
http://office.microsoft.com/
It is not clear what you are trying to achieve. I understand that you only use the spreadsheet to get the data and save it to a database.
There are a few alternatives I can think of:
[dirty] create a scheduled task in Windows that opens the spreadsheet every x minutes/hours. In your workbook, create a Workbook_Open event that uses Application.OnTime to execute a macro that does what you need to do with the data after a few minutes (the time it takes for the data to update). This is error-prone and will probably fail from time to time.
[better] use something similar, except that you get the data programmatically with the VBA add-in, populate the sheet from VBA and/or do what you need to do with the data. No need for Application.OnTime in that case. You can even automatically save/close the spreadsheet.
[better] have your Java code get the data and send it to your database (see the sketch after the note below). If users need the data in Excel, you can have Excel query the database when required.
Note: with a typical Bloomberg Terminal/Anywhere license, it is not permitted to save data on a different machine so the database would need to be located on the local PC. Other licenses have different terms.
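A rough sketch of that last option, assuming a recent POI version and a local SQLite database (per the licensing note above); the file name, sheet layout and the quote table are made up for the example:

```java
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.ss.usermodel.WorkbookFactory;

import java.io.FileInputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class QuoteImporter {

    public static void main(String[] args) throws Exception {
        // Assumes the refreshed spreadsheet has been saved to quotes.xlsx and
        // that a local table "quote(ticker TEXT, price REAL)" already exists.
        try (Workbook wb = WorkbookFactory.create(new FileInputStream("quotes.xlsx"));
             Connection con = DriverManager.getConnection("jdbc:sqlite:quotes.db");
             PreparedStatement ps = con.prepareStatement(
                     "INSERT INTO quote (ticker, price) VALUES (?, ?)")) {

            Sheet sheet = wb.getSheetAt(0);
            for (Row row : sheet) {
                if (row.getRowNum() == 0) {
                    continue; // skip the header row
                }
                // Assumes column 0 is a text ticker and column 1 a numeric price.
                ps.setString(1, row.getCell(0).getStringCellValue());
                ps.setDouble(2, row.getCell(1).getNumericCellValue());
                ps.executeUpdate();
            }
        }
    }
}
```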
I think you should look into using Bloomberg's API v3 to retrieve the data. You have access to any field which you see in FLDS on the Bloomberg terminal.
This may be a dumb question, but I have a project for a class where I have to store/retrieve files from a SQL database that connects to a web page. Now, I could just make a web page to store pictures or music files, but I am currently working on creating some basic games in Java. I know that there are ways to access these files from a web page, but like I said, the project has to include a SQL database.
So my question is, is there any way to store and retrieve these kinds of files from the SQL database? Being able to download the files would be fine as long as the user would be able to open them, though I would prefer the user be able to open them in the browser.
If anybody has any suggestions I would appreciate it.
When storing into a SQL database, you don't really store the files; you store the file contents. In its most generic form, you could make a table with a big binary field (a BLOB or CLOB, depending on which database you use) or a big text field (a VARCHAR) and put the contents of the file into that. Other columns could store file names and such.
To really leverage the SQL database, you would want to know enough about the content of the files to take advantage of indexing and such by breaking it up into more detailed parts. For example, if you are putting a save file in there, you could make a detailed table with columns for username, and all sorts of game-specific state that needs to be saved.
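As a rough illustration of the BLOB approach with plain JDBC; the files table, its columns and the connection handling are made up for the example:

```java
import java.io.FileInputStream;
import java.io.InputStream;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class FileDao {

    // Assumed schema (MySQL flavour):
    // CREATE TABLE files (name VARCHAR(255) PRIMARY KEY, content LONGBLOB)

    public static void store(Connection con, String path) throws Exception {
        try (InputStream in = new FileInputStream(path);
             PreparedStatement ps = con.prepareStatement(
                     "INSERT INTO files (name, content) VALUES (?, ?)")) {
            ps.setString(1, path);
            ps.setBinaryStream(2, in); // streams the file contents into the BLOB column
            ps.executeUpdate();
        }
    }

    public static byte[] load(Connection con, String name) throws Exception {
        try (PreparedStatement ps = con.prepareStatement(
                "SELECT content FROM files WHERE name = ?")) {
            ps.setString(1, name);
            try (ResultSet rs = ps.executeQuery()) {
                // The webapp can serve these bytes with the right Content-Type
                // so the browser can open or download the file.
                return rs.next() ? rs.getBytes("content") : null;
            }
        }
    }
}
```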
I have this requirement for my business. We have a Swing desktop application that works with a MySQL database. At the end of each day the Swing app exports the data that has changed and uploads it to a server. The setup is: a user working in an office will have many companies that he is working with. If he changes any data for a company, then I export that company's data alone from the database. The data is exported in the form of Java objects, serialised and stored in a file which gets uploaded.
The next day, if there are any changes made to that company again then I will replace the file in the server with the latest uploaded file.
Now on my server, I would like to work with this file. I would like to convert each of these files into mini databases that a webapp can read. It will not write to them. Every time the user uploads, the database will be deleted and recreated.
So ultimately each of these files are a small subset of the data that a user has in his desktop application.
Now the issues are:
The objects that I have exported are "Apache Torque" objects. Torque is an ORM tool; basically the object represents the table. I need to convert this object into a database. SQLite, HSQLDB, Derby...? The database should be small. If the object file is about 5 KB, then the database that represents that file shouldn't be 3 MB. Derby actually did that.
The Java object classes could change, since the underlying database could change. Hence I will need to deserialise these objects and create a database from them as soon as they are uploaded; otherwise, I will not be able to deserialise them later on. Small changes to the database are fine for the web application, but if I don't deserialise it immediately, then I am stuck.
The conversion from the Java objects to the database should be fast. Since the user actually waits while his data is getting uploaded, I would like to add a maximum of 5-10 s for the conversion.
Is it ok to have thousands of these mini databases lying around? Is this design okay? Is there an alternate solution?
I wouldn't try to put each dataset into its own database. I would put all of them in one big database, along with a column in the key tables indicating the dataset that each row applies to (this sounds like it should just be a company identifier). This is a more normalised design than having many small databases.
You will then need to write the webapp so it makes queries for particular datasets, rather than connecting to a particular database.
If you adopt that approach, you can deserialise and store the datasets as soon as they arrive. The storage is simply inserting rows into an existing database, so it should be very fast.
In addition, I expect that one big database will be much easier to manage, maintain, report on, etc., than many small databases.
If you tell us more about the details of your schema, we could discuss how the database could be organised, if that would be useful.
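In case it helps, here is a rough JDBC sketch of that idea; the orders table, its columns and the Order class are placeholders standing in for your actual Torque-generated classes:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.util.List;

public class DatasetStore {

    /** Stands in for one of the deserialised Torque objects (hypothetical). */
    public static class Order {
        private final String orderNo;
        private final double amount;
        public Order(String orderNo, double amount) { this.orderNo = orderNo; this.amount = amount; }
        public String getOrderNo() { return orderNo; }
        public double getAmount() { return amount; }
    }

    /** Replace a company's dataset in one transaction, right after the upload is deserialised. */
    public static void replaceDataset(Connection con, int companyId, List<Order> orders) throws Exception {
        boolean oldAutoCommit = con.getAutoCommit();
        con.setAutoCommit(false);
        try (PreparedStatement del = con.prepareStatement(
                     "DELETE FROM orders WHERE company_id = ?");
             PreparedStatement ins = con.prepareStatement(
                     "INSERT INTO orders (company_id, order_no, amount) VALUES (?, ?, ?)")) {

            // Drop the previous upload for this company only.
            del.setInt(1, companyId);
            del.executeUpdate();

            // Batch-insert the new rows, each tagged with the company id.
            for (Order o : orders) {
                ins.setInt(1, companyId);
                ins.setString(2, o.getOrderNo());
                ins.setDouble(3, o.getAmount());
                ins.addBatch();
            }
            ins.executeBatch();
            con.commit();

            // The webapp then reads one dataset with:
            // SELECT order_no, amount FROM orders WHERE company_id = ?
        } catch (Exception e) {
            con.rollback();
            throw e;
        } finally {
            con.setAutoCommit(oldAutoCommit);
        }
    }
}
```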
I am developing an internal system that is intended to work very much like Google Docs. The main piece I am implementing mimics their web-based Spreadsheet implementation. For multiple reasons I am not able to use Google Docs or ZK, which has a very robust Spreadsheet API. I chose POI 3.7 as a starting point for my Excel spreadsheet processing.
Currently when a user uploads an Excel spreadsheet, I take the file byte[] and store it in our db as a blob. When a user wants to view the spreadsheet, I pull out the byte[], build the Workbook, and push it to the client UI for editing. The pushing to the UI isn't my concern. When a user makes edits to the spreadsheet, I push the edits to the server, store them on a stack, and only apply the updates when the user presses the "save" button. On save, I pull the workbook back out of the database, make the changes and push the Workbook back to the db. That way, I don't keep it in memory. It's no surprise that all of this is pretty fast except when multiple users start doing this; obviously, exploding Workbooks into memory eats RAM, as described in other posts here.
A user will only update one tab at a time, so why should I need to open the entire workbook? When a user initially uploads an Excel spreadsheet, can I pull out each Sheet, convert each to a byte[] and save each as an individual "worksheet" db record? The POI Sheet has a protected "#write(Stream)" method, but I would not like to get into the business of re-compiling POI. I also would not like to explode every cell into a new db entry. Would you guys do this differently in the first place?
Backend is java/spring/jdbc. For internal reasons, these are the technologies I'm stuck using.
Storing big binary blobs in the database is in itself not a good thing if performance is important. You would be much better off storing the workbooks on the disk.
I can only give you half an answer to your question, and that is that you can read xlsx (not xls) files one sheet at a time using XSSFReader (http://poi.apache.org/apidocs/index.html?org/apache/poi/xssf/eventusermodel/XSSFReader.html), and that you can use a SAX parser to avoid holding each full sheet in memory. I don't think there is any way of saving a sheet without creating a Sheet object.
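For that first half, a bare-bones sketch of the XSSFReader/SAX route could look like this; the file name is made up, and the handler only counts rows (real code would also resolve cell values against the shared strings table):

```java
import org.apache.poi.openxml4j.opc.OPCPackage;
import org.apache.poi.openxml4j.opc.PackageAccess;
import org.apache.poi.xssf.eventusermodel.XSSFReader;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;
import java.io.InputStream;
import java.util.Iterator;

public class SheetStreamer {

    public static void main(String[] args) throws Exception {
        SAXParserFactory factory = SAXParserFactory.newInstance();
        factory.setNamespaceAware(true);

        OPCPackage pkg = OPCPackage.open("workbook.xlsx", PackageAccess.READ);
        try {
            XSSFReader reader = new XSSFReader(pkg);
            Iterator<InputStream> sheets = reader.getSheetsData();

            int index = 0;
            while (sheets.hasNext()) {
                // Each sheet is streamed through SAX, so no Sheet object is built in memory.
                try (InputStream sheet = sheets.next()) {
                    SAXParser parser = factory.newSAXParser();
                    RowCounter handler = new RowCounter();
                    parser.parse(sheet, handler);
                    System.out.println("Sheet " + index++ + ": " + handler.rows + " rows");
                }
            }
        } finally {
            pkg.close();
        }
    }

    /** Bare-bones handler: just counts <row> elements. */
    static class RowCounter extends DefaultHandler {
        int rows;
        @Override
        public void startElement(String uri, String localName, String qName, Attributes attributes) {
            if ("row".equals(localName)) {
                rows++;
            }
        }
    }
}
```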
Warning, this is a hack: you could use reflection to call the protected method, as sketched below. There is of course no guarantee that this will work in future versions of POI.
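A sketch of that reflection call, assuming the protected write(OutputStream) method the question mentions is declared on XSSFSheet itself with that exact signature (if it is not, getDeclaredMethod will throw):

```java
import org.apache.poi.xssf.usermodel.XSSFSheet;

import java.io.ByteArrayOutputStream;
import java.io.OutputStream;
import java.lang.reflect.Method;

public class SheetSerializer {

    /** Serialize a single sheet's XML via the protected write(OutputStream) method. */
    public static byte[] writeSheet(XSSFSheet sheet) throws Exception {
        ByteArrayOutputStream out = new ByteArrayOutputStream();

        // Look up the protected method and make it callable; this is the fragile part.
        Method write = XSSFSheet.class.getDeclaredMethod("write", OutputStream.class);
        write.setAccessible(true);
        write.invoke(sheet, out);

        return out.toByteArray();
    }
}
```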
With Excel files, some things are stored at the sheet level, but other bits are stored at the workbook level. As your user edits a sheet, while most of their changes will be on the sheet part, some bits will need to touch the workbook level entities, and for that you'll need the whole file.
You might want to take a look at how SharePoint does its collaborative editing, which allows several people using Excel to work on the same file much like Google Docs. All the SharePoint protocol documents are publicly available, and there was an event on the docs very recently for which videos and presentations should be online soon; keep an eye on the Office interop blog for when they do. In the SharePoint docs you should find the details of how Microsoft chunks up an Excel file for collaborative editing, and there's something to be said for doing the same yourself!
I would consider looking into saving the sheets as separate XMLs in the database. If you store additional (meta)data in the database about which sheets belong together, it shouldn't be too much hassle keeping them together. The reason for using XML is that from Excel 2003 onwards, spreadsheets can be saved as XML and can therefore easily be created by code as well.
If at one point you seem to be hitting too many walls with Apache POI, you could look into the OpenOffice API as well.
In my Java application I am connecting to Microsoft Excel with the Jacob libraries. Everything is fine, but I do not know how I can catch COM events when any change occurs in an Excel sheet using the Jacob libraries. For example, in my project I connect to a database, take table values and copy these values into cells of an Excel sheet. Whenever a cell value is changed, the table value should also be changed. That is, I want synchronization between Java and the Microsoft Excel application via Jacob.
Don't use Java to achieve that. See this question: Excel OnChange event, with emphasis on this answer. You should access your database using COM directly from Excel. That's easy using ADO. This ADO tutorial from w3schools also looks fine.
If the task is too complex to perform directly from Excel, you may think of putting a small marker somewhere (e.g. in the database) indicating that the data changed, and processing this marker from another app, possibly a Java app. The difficulty is that the credentials to access the database must be hardcoded in the Excel sheet, but you can create a separate database user with narrow database permissions.
Seeing your comment, I also tried using the Change event of Excel to detect changes made by other users. My experience with Excel 2003 shows that this works only locally; that is, the event is fired only for the user who made the change. If many users have the worksheet open, they don't receive the Change event caused by changes from other users, so your approach is unfeasible. You may test it with Excel 2010, but my impression is that the events in general work only locally. I couldn't find anything on the net about it, only this general article: Track changes in a shared workbook.