Hosting the database file on any cloud service - Java

I have an Android application that requires the user to log in each time he runs the app. The login procedure is simple and uses a SQLite database file: I've copied the file into the assets folder and made the necessary modifications. But the database file is of no use unless it is on a server. I don't have a server, so I'm thinking of keeping the database file on Dropbox, Google Drive, etc. and then reading or updating that file according to the user's actions. The question is how to do that. I was searching the web and found that the only way is to download the db file, modify it, and upload it back. Can anyone give me an example?

Doing that isn't possible unless you have a server.
If you use Dropbox, you'll first have to make your file public in order to download it (not recommended at all; it compromises security). Then you can use the URL to download the file, but you won't be able to upload it back (unless you log in to Dropbox from your Android code).
Instead, if you have a web server with MySQL and PHP, you can easily send POST requests to your server.
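A minimal sketch of that idea in Java, assuming a hypothetical login.php endpoint backed by MySQL (the URL, parameter names, and response format are placeholders, not details from the question):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class LoginClient {
    // Sends the credentials as a form-encoded POST and returns the first line of the server's reply.
    public static String login(String user, String pass) throws Exception {
        URL url = new URL("https://example.com/login.php");   // hypothetical endpoint
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        String body = "username=" + URLEncoder.encode(user, "UTF-8")
                + "&password=" + URLEncoder.encode(pass, "UTF-8");
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body.getBytes(StandardCharsets.UTF_8));
        }
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            return in.readLine();   // e.g. "ok" or "invalid", depending on what the PHP script returns
        }
    }
}

The PHP script would check the credentials against MySQL and write back a response; on Android this call must run off the main thread (e.g. in an AsyncTask or a background thread).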

Related

Include better data export option in Web service

We have a Java Vert.x project and have implemented a download button in our web service. When the user clicks the download button, we convert the (potentially huge) data the user asked for from our database into an Excel file, upload it to AWS S3, and return the S3 URL in the response to the user's download request. But this whole process takes time (especially generating the Excel file), and everything is done in the backend. Please suggest better approaches for implementing the download option on the page (the user can download either filtered data or all of the data he has access to).
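For reference, a hypothetical sketch of the S3 step described above using the AWS SDK for Java v1 (aws-java-sdk-s3); the bucket name and expiry are assumptions, not details from the question. Returning a time-limited pre-signed URL keeps the export private while still letting the browser download it directly:

import com.amazonaws.HttpMethod;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import java.io.File;
import java.net.URL;
import java.util.Date;

public class ExportUploader {
    // Uploads the generated Excel file and returns a URL that expires after 15 minutes.
    public static URL uploadAndSign(File excelFile, String key) {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
        s3.putObject("exports-bucket", key, excelFile);   // "exports-bucket" is a placeholder
        Date expiry = new Date(System.currentTimeMillis() + 15 * 60 * 1000);
        return s3.generatePresignedUrl("exports-bucket", key, expiry, HttpMethod.GET);
    }
}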

Unable to access uploaded resources in Spring on Heroku [duplicate]

I've built an app where users can upload their avatars. I used the paperclip gem and everything works fine on my local machine. On Heroku everything works fine until the server restarts; then every uploaded image disappears. Is it possible to keep them on the server?
Notice: I probably should use a service such as Amazon S3 or Google Cloud Storage. However, each of those services requires credit card or bank account information, even for the free tier. This is a small app just for my portfolio and I would rather avoid sending that information.
No, this isn't possible. Heroku's filesystem is ephemeral and there is no way to make it persistent. You will lose your uploads every time your dyno restarts.
You must use an off-site file storage service like Amazon S3 if you want to store files long-term.
(Technically you could store your images directly in your database, e.g. as a bytea in Postgres, but I strongly advise against that. It's not very efficient and then you have to worry about how to provide the saved files to the browser. Go with S3 or something similar.)

Store files in google drive from Google App Engine

All,
I have a Google App Engine (Java) application that needs to store some images. I tried using Blobs and storing them in the datastore, but as you know there is a size limit on the data that can be stored there.
So as a result I'm storing the images on a different server and keeping the path in my datastore, and all works fine.
Now I'm thinking of using a Google Drive folder instead of that server: upload the files to Drive and use the share link to display them later.
I've seen https://developers.google.com/drive/web/quickstart/java and got it to work fine. When I try to use it in my application, however, it obviously won't work, because the code assumes credentials for a local user.
I created a service key for my application and want to change the sample code above to use it, but I'm not sure if that's the correct approach.
I tried searching for samples but can't find one that takes the same approach. Is there a working sample that shows how to authenticate an application (not a user) and, say, store a file in Google Drive?
I've also seen https://developers.google.com/drive/web/examples/. Please note that what I want is to store files in my Google Drive, not the user's Google Drive. So if user A and user B come to my app, they shouldn't have to authorize my application, and both should be able to upload a file to my Google Drive.
I don't know whether this can be done directly from their browser or whether I have to move the file to my application (appspot) first and then push it to Google Drive.
Thanks
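A rough sketch of the service-account approach described above, using the Google Drive v3 Java client; the key file path, application name, and MIME type are placeholders, and this is only a sketch of the authentication flow, not a confirmed App Engine sample:

import com.google.api.client.googleapis.auth.oauth2.GoogleCredential;
import com.google.api.client.googleapis.javanet.GoogleNetHttpTransport;
import com.google.api.client.http.FileContent;
import com.google.api.client.json.jackson2.JacksonFactory;
import com.google.api.services.drive.Drive;
import com.google.api.services.drive.DriveScopes;
import com.google.api.services.drive.model.File;
import java.io.FileInputStream;
import java.util.Collections;

public class DriveUploader {
    // Authenticates as the application (service account) and uploads a local file to Drive.
    public static String upload(java.io.File local) throws Exception {
        GoogleCredential credential = GoogleCredential
                .fromStream(new FileInputStream("service-account-key.json"))   // hypothetical key file
                .createScoped(Collections.singleton(DriveScopes.DRIVE));
        Drive drive = new Drive.Builder(GoogleNetHttpTransport.newTrustedTransport(),
                JacksonFactory.getDefaultInstance(), credential)
                .setApplicationName("my-gae-app")                              // hypothetical name
                .build();
        File meta = new File().setName(local.getName());
        File created = drive.files()
                .create(meta, new FileContent("image/png", local))             // assumed MIME type
                .setFields("id, webViewLink")
                .execute();
        return created.getWebViewLink();
    }
}

Note that a service account has its own Drive storage, separate from your personal account; to have files land in your own Drive you generally share a target folder with the service account's email address and set that folder as the file's parent.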

Java - Best way to send or upload and retrieve files to a server within a network?

I'm currently developing a tracking system with JavaFX and a MySQL database that is kept on a server. My application is used within a network and allows users to upload and download pictures and several types of documents.
My question: what is the best way to send files to and retrieve them from a server within a network? Should I store those files in MySQL, or just their paths? If only the paths, do I need FTP or some other technique? I need a detailed answer because this is my first time developing such an application.
Edit: I want to store the data on a server. I built this application for client machines so that clients can keep documents on the server and access their files from any machine. I have no idea how to transfer files from the client machines to the server. Please help me!
You should use socket programming, like the example at the following link:
http://www.rgagnon.com/javadetails/java-0542.html
But if you are using Java 7, the Files class is the better choice instead of BufferedInputStream or FileInputStream (no extra library required):
/* You can get Path from file also: file.toPath() */
Files.copy(InputStream in, Path target)
Files.copy(Path source, OutputStream out)
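A minimal client-side sketch of that socket-plus-Files.copy approach, assuming a hypothetical server listening on port 9000 that accepts the raw file bytes:

import java.io.OutputStream;
import java.net.Socket;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class FileUploadClient {
    public static void main(String[] args) throws Exception {
        Path file = Paths.get("scan.jpg");                        // hypothetical file to send
        try (Socket socket = new Socket("192.168.1.10", 9000);    // hypothetical server address/port
             OutputStream out = socket.getOutputStream()) {
            Files.copy(file, out);                                // streams the whole file to the server
        }
    }
}

The server side does the reverse: accept the connection and call Files.copy(socket.getInputStream(), targetPath) to write the upload to disk.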

Zip file on a web server to extract into local machine

We have a web application that allows the user to download a zip file from a web server. We just set a dummy iframe's source to the full URL of the zip file on the web server. This approach lets the end user use the browser's controls to open the zip or save it to his local machine.
We now have a requirement that the zip file be automatically extracted and saved to a specific location on the user's machine. Any thoughts on how this can be achieved?
Thanks.
I highly doubt that you'll be able to do that. The closest you're likely to get is to generate a self-extracting executable file (which would be OS-dependent, of course).
I certainly wouldn't want a zip file to be automatically extracted - and I wouldn't want my browser to be able to force that decision upon me.
The short answer is that I don't believe this is possible using the simple URL link you've implemented.
Fundamentally the problem is that you have no control over what the user does on their end, since you've ceded control to the browser.
If you do want to do this, then you'll need some client-side code that downloads the zip file and unzips it.
I suspect Java is the way to go for this - JavaScript and Flash both have problems writing files to the local drive. Of course, if you want to be Windows-only then a COM object could work.
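If you do go the Java client route, a rough sketch using java.util.zip; the download URL and destination directory are placeholders:

import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

public class ZipDownloader {
    public static void main(String[] args) throws Exception {
        Path targetDir = Paths.get("C:/extracted");                         // hypothetical destination
        URL url = new URL("http://example.com/files/archive.zip");          // hypothetical zip URL
        try (ZipInputStream zip = new ZipInputStream(url.openStream())) {
            ZipEntry entry;
            while ((entry = zip.getNextEntry()) != null) {
                Path out = targetDir.resolve(entry.getName()).normalize();
                if (!out.startsWith(targetDir)) {
                    continue;                                               // skip entries that try to escape the target directory
                }
                if (entry.isDirectory()) {
                    Files.createDirectories(out);
                } else {
                    Files.createDirectories(out.getParent());
                    Files.copy(zip, out);                                   // writes the current entry to disk
                }
                zip.closeEntry();
            }
        }
    }
}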
Instead of sending a zip file, why don't you instruct the web server to compress all the web traffic and just send the files directly?
See http://articles.sitepoint.com/article/web-output-mod_gzip-apache# for example.
