Unable to access uploaded resources in Spring on Heroku [duplicate]

I've built an app where users can upload their avatars. I used the paperclip gem, and everything works fine on my local machine. On Heroku everything also works fine until the server restarts; then all uploaded images disappear. Is it possible to keep them on the server?
Notice: I probably should use a service such as Amazon S3 or Google Cloud. However, each of those services requires credit card or bank account information even for the free tier. This is a small app just for my portfolio, and I would rather avoid handing over that information.

No, this isn't possible. Heroku's filesystem is ephemeral and there is no way to make it persistent. You will lose your uploads every time your dyno restarts.
You must use an off-site file storage service like Amazon S3 if you want to store files long-term.
(Technically you could store your images directly in your database, e.g. as a bytea in Postgres, but I strongly advise against that. It's not very efficient, and you then have to worry about how to serve the saved files back to the browser. Go with S3 or something similar.)
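For reference, uploading to S3 from Java is only a few lines with the AWS SDK for Java v2. Here is a minimal sketch (the bucket name, object key, and local path are made up, and credentials are assumed to come from the SDK's default provider chain); with paperclip you would instead point its built-in S3 storage at the bucket:

```java
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;
import java.nio.file.Paths;

public class AvatarUploader {
    public static void main(String[] args) {
        // Credentials are picked up from the default provider chain (env vars, profile, etc.)
        try (S3Client s3 = S3Client.builder().region(Region.US_EAST_1).build()) {
            PutObjectRequest request = PutObjectRequest.builder()
                    .bucket("my-avatar-bucket")  // hypothetical bucket name
                    .key("avatars/user-42.png")  // hypothetical object key
                    .build();
            // Upload the locally staged file; it survives any number of dyno restarts
            s3.putObject(request, Paths.get("/tmp/user-42.png"));
        }
    }
}
```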

Related

How to compress files on azure data lake store

I'm using Azure Data Lake Store as a storage service for my Java app. Sometimes I need to compress multiple files. What I do for now is copy all the files to my server, compress them locally, and then send the zip to Azure. Even though this works, it takes a lot of time, so I'm wondering: is there a way to compress files directly on Azure? I checked the data-lake-store SDK, but there's no such functionality.
Unfortunately, at the moment there is no option to do that sort of compression.
There is an open feature request, "HTTP compression support for Azure Storage Services (via Accept-Encoding/Content-Encoding fields)", that discusses uploading compressed files to Azure Storage, but there is no estimate of when this feature might be released.
The only option for you is to implement such a mechanism on your own (using an Azure Function, for example).
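By way of illustration, the local part of such a mechanism needs nothing beyond java.util.zip. The sketch below streams a set of named inputs into a single zip; the map of names to streams is whatever your Data Lake client hands you, and the same code could run inside an Azure Function:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Map;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public final class ZipUtil {
    /** Streams each named input into a single zip written to {@code out}. */
    public static void zip(Map<String, InputStream> files, OutputStream out) throws IOException {
        try (ZipOutputStream zip = new ZipOutputStream(out)) {
            byte[] buffer = new byte[8192];
            for (Map.Entry<String, InputStream> file : files.entrySet()) {
                zip.putNextEntry(new ZipEntry(file.getKey()));
                try (InputStream in = file.getValue()) {
                    int read;
                    while ((read = in.read(buffer)) != -1) {
                        zip.write(buffer, 0, read);
                    }
                }
                zip.closeEntry();
            }
        }
    }
}
```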
Hope it helps!

Store files in google drive from Google App Engine

All,
I have a Google App Engine application (Java) that needs to store some images. I tried using Blobs and storing them in the datastore, but as you know there is a size limit on the data that can be stored there.
So as a result I'm storing the images on a different server and keeping the path in my datastore, and all works fine.
Now I'm thinking of using a Google Drive folder instead of a separate server: upload the files to Drive and use the share link to display them later.
I've seen https://developers.google.com/drive/web/quickstart/java and got it to work fine. When I try to use it in my application, however, it obviously won't work, as the code assumes credentials for a local user.
I created a service key for my application and want to change the sample code above to use it, but I'm not sure if that's the correct approach.
I tried searching for samples but can't find any that take the same approach. Is there a working sample that shows how to authenticate an application, not a user, and, say, store a file in Google Drive?
I've also seen https://developers.google.com/drive/web/examples/. Please note that what I want is to store files in my Google Drive, not the user's Google Drive. So if user A and user B come to my app, they shouldn't have to authorize my application, and both should be able to upload a file to my Google Drive.
I don't know if this can be done directly from their browser or whether I have to move the file to my application (appspot) first and then push it to Google Drive.
Thanks
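For what it's worth, a service account is the standard way to authenticate an application rather than a user, and the quickstart changes very little. Below is a minimal sketch against the Drive v3 Java client (the key-file path and application name are hypothetical). Note that files uploaded this way land in the service account's own Drive, so you would typically share a folder between the service account and your personal account and upload into that folder:

```java
import java.io.FileInputStream;
import java.util.Collections;

import com.google.api.client.googleapis.auth.oauth2.GoogleCredential;
import com.google.api.client.googleapis.javanet.GoogleNetHttpTransport;
import com.google.api.client.http.FileContent;
import com.google.api.client.json.jackson2.JacksonFactory;
import com.google.api.services.drive.Drive;
import com.google.api.services.drive.DriveScopes;
import com.google.api.services.drive.model.File;

public class DriveUploader {
    public static void main(String[] args) throws Exception {
        // Hypothetical path to the service-account JSON key from the Cloud console
        GoogleCredential credential = GoogleCredential
                .fromStream(new FileInputStream("service-account.json"))
                .createScoped(Collections.singleton(DriveScopes.DRIVE));

        Drive drive = new Drive.Builder(
                GoogleNetHttpTransport.newTrustedTransport(),
                JacksonFactory.getDefaultInstance(),
                credential)
                .setApplicationName("my-gae-app") // hypothetical name
                .build();

        File metadata = new File();
        metadata.setName("image.png");
        FileContent content = new FileContent("image/png", new java.io.File("image.png"));

        // Uploads into the service account's Drive; share a folder with your own
        // account (and set its ID as the parent) to see the files yourself.
        drive.files().create(metadata, content).execute();
    }
}
```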

hosting the database file on any cloud service

I have an Android application that requires the user to log in each time he runs the app. The login procedure is simple, using the SQLite database file I've bundled: I copied the file into the assets folder and made the necessary modifications. But the database file is of no use unless it is on a server. I don't have a server, so I'm thinking of keeping the database file on Dropbox, Google Drive, etc., and then reading or updating that file as the user works. The question is how to do that. I searched the web and found that the only way is to download the db file, modify it, and upload it back. Can anyone give me an example?
Doing that isn't possible unless you have a server.
Because if you use Dropbox, you'll first have to make your file public in order to download it (not recommended at all; it compromises security). Then you can use the URL to download the file. But you won't be able to upload it back (unless you can log in to Dropbox from your Android code).
Instead, if you have a web server with MySQL and PHP, you can easily send POST requests to your server.
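To illustrate the server-backed approach, here is a minimal sketch of the client side using plain HttpURLConnection (the endpoint URL and parameter names are hypothetical; the matching PHP script would check the credentials against MySQL and return a status):

```java
import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public final class LoginClient {
    /** POSTs credentials to a hypothetical login endpoint and returns the HTTP status. */
    public static int login(String user, String pass) throws IOException {
        URL url = new URL("https://example.com/login.php"); // hypothetical endpoint
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
        String body = "user=" + URLEncoder.encode(user, "UTF-8")
                + "&pass=" + URLEncoder.encode(pass, "UTF-8");
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body.getBytes(StandardCharsets.UTF_8));
        }
        return conn.getResponseCode(); // e.g. 200 on success, 401 on bad credentials
    }
}
```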

Uploading Multiple Videos on server simultaneously

We have a web application (with Spring, Hibernate, and MySQL as the database) in which multiple users can store heavy videos (pre-recorded, or recorded from the application itself) on the server at the same time.
In that scenario the server load will definitely be high. We are assuming there will be 500-2000 users of the application.
So what strategy should I use to reduce the load on the server and make the response time faster?
1) Storing the videos on our own server (with large disk space), and using ActiveMQ/RabbitMQ queues for file upload and download.
2) Storing the videos on some third-party service (like YouTube, Vimeo, etc.) that uploads all the videos to one central account. I recently checked this with YouTube and Vimeo, but they require the end user's login credentials for each upload, and I don't want end users of my application to have to provide their credentials before each upload.
Is there any other way to reduce the workload and improve the response time for simultaneous uploads to the server? If so, please guide me.
Thanks In Advance,
Arun
Multiple servers can help.
On a single server:
If you use a single-core processor, only ONE client gets served at a time.
If you use a multi-core processor and open a new thread per connection, only as many clients as you have cores get served, and even that is optimistic, because your local memory might run out before your OS can flush the data to your local hard disk (which has a single bus). So serving 500-2000 clients leads you to a multi-server solution.
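As a sketch of option 1's queueing idea: the upload handler would stage the video on disk and hand a job to a work queue, so heavy processing happens outside the request thread. Assuming a RabbitMQ broker and the com.rabbitmq.client Java API, the producer side could look like this (queue name and file path are hypothetical):

```java
import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;
import java.nio.charset.StandardCharsets;

public class UploadQueueProducer {
    public static void main(String[] args) throws Exception {
        ConnectionFactory factory = new ConnectionFactory();
        factory.setHost("localhost"); // assumed broker host
        try (Connection connection = factory.newConnection();
             Channel channel = connection.createChannel()) {
            // Durable queue so jobs survive a broker restart
            channel.queueDeclare("video-processing", true, false, false, null);
            String job = "/uploads/tmp/video-123.mp4"; // hypothetical staged upload path
            channel.basicPublish("", "video-processing", null,
                    job.getBytes(StandardCharsets.UTF_8));
        }
    }
}
```

A pool of consumer workers (possibly on separate machines) then drains the queue at its own pace, which keeps the web tier responsive regardless of how many uploads arrive at once.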

scalable file upload/download permissions

What would be a scalable file upload/download system/database?
I'm building a website where users can log in and upload images that are private, truly private. I can't just put them in a folder on a server's hard disk, since that would not scale (what happens if we add more servers?), and it wouldn't be private, since anyone could go to:
http://127.372.171.33/images/private_picture.png
and download the file.
I am building the project in Play Framework (scala/java)
How do websites like Flickr handle this kind of thing? Do they put the images in a database? And what kind of database would be suitable for this situation?
Thanks for the help
I can't tell you how those big sites handle it, but putting the images into a database might be one way.
Another way would be to put the files into a virtual filesystem that spans a cluster of servers, or to distribute them across different servers, and simply not make the directories that contain the images visible to the web server. That way nobody can open an image just by knowing the server and the path on it.
To actually deliver the images, you could then implement a streaming service that sends a byte stream to the browser for display (just as the web server itself would). This service could first check the download permissions for the requested image.
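To make that last idea concrete, here is a minimal sketch of such a permission-checking streaming endpoint as a plain Java servlet (a Play controller would have the same shape; the store path and the permission lookup are hypothetical):

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

/** Streams a stored image only after checking the session user's permission. */
public class ImageServlet extends HttpServlet {
    private static final Path STORE = Paths.get("/srv/image-store"); // outside the web root

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        String user = (String) req.getSession().getAttribute("user");
        String imageId = req.getParameter("id");
        if (user == null || imageId == null || !mayView(user, imageId)) {
            resp.sendError(HttpServletResponse.SC_FORBIDDEN);
            return;
        }
        Path file = STORE.resolve(imageId + ".png").normalize();
        if (!file.startsWith(STORE) || !Files.exists(file)) { // block path traversal
            resp.sendError(HttpServletResponse.SC_NOT_FOUND);
            return;
        }
        resp.setContentType("image/png");
        try (InputStream in = Files.newInputStream(file);
             OutputStream out = resp.getOutputStream()) {
            in.transferTo(out); // stream bytes to the browser
        }
    }

    private boolean mayView(String user, String imageId) {
        return true; // hypothetical permission lookup against your database
    }
}
```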
