How to access files from S3 storage using Java

I have created my VM in AWS using an EC2 instance. I have also created a bucket in S3 and some folders in that bucket. I have my Java JAR file, which I upload with FileZilla, and now I want to run my application. This scenario works fine when the files I want to access are also transferred with FileZilla, but now that I have created the S3 bucket I do not know how my Java application can access those folders. Do I need to create a new Java class in my application in order to access these folders from S3?
I am not very familiar with AWS and would appreciate any help. Even some useful links that address my problem would be a great help.
Thanks in advance.
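A minimal sketch of reading those "folders" with the AWS SDK for Java v2, assuming the SDK is on the classpath and credentials come from the default provider chain (for example, an EC2 instance role). The bucket name, prefix, and object key below are placeholders; note that folders in S3 are really just key prefixes:

```java
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.GetObjectRequest;
import software.amazon.awssdk.services.s3.model.ListObjectsV2Request;
import software.amazon.awssdk.services.s3.model.S3Object;

import java.nio.file.Paths;

public class S3FolderReader {
    public static void main(String[] args) {
        // Region, bucket, prefix, and key are placeholders -- substitute your own.
        try (S3Client s3 = S3Client.builder().region(Region.US_EAST_1).build()) {
            // List everything under the "folder" (an S3 key prefix).
            ListObjectsV2Request listReq = ListObjectsV2Request.builder()
                    .bucket("my-bucket")
                    .prefix("my-folder/")
                    .build();

            for (S3Object obj : s3.listObjectsV2(listReq).contents()) {
                System.out.println("Found: " + obj.key());
            }

            // Download one object to the local filesystem.
            s3.getObject(GetObjectRequest.builder()
                            .bucket("my-bucket")
                            .key("my-folder/data.txt")
                            .build(),
                    Paths.get("/tmp/data.txt"));
        }
    }
}
```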

Related

Unable to access uploaded resources in Spring on Heroku [duplicate]

I've built an app where users can upload their avatars. I used the paperclip gem and everything works fine on my local machine. On Heroku everything works fine until the server restarts; then all uploaded images disappear. Is it possible to keep them on the server?
Notice: I probably should use a service such as Amazon S3 or Google Cloud. However, each of those services requires credit card or bank account information, even if you only want to use the free tier. This is a small app just for my portfolio, and I would rather avoid sending that information.
No, this isn't possible. Heroku's filesystem is ephemeral and there is no way to make it persistent. You will lose your uploads every time your dyno restarts.
You must use an off-site file storage service like Amazon S3 if you want to store files long-term.
(Technically you could store your images directly in your database, e.g. as a bytea in Postgres, but I strongly advise against that. It's not very efficient and then you have to worry about how to provide the saved files to the browser. Go with S3 or something similar.)

Upload File to Cloud Storage directly using SignedURL

I am trying to upload a file directly to Google Cloud Storage using the Java client library.
The code I have written is
Instead of uploading the new file to Cloud Storage, I am getting this output
What am I missing in the code to make the upload to Cloud Storage work?
You need to configure the authorization keys: a .json service-account key file referenced in your environment. See the documentation: https://cloud.google.com/iam/docs/creating-managing-service-account-keys#iam-service-account-keys-create-gcloud
I don't think you have the correct "BUCKET_NAME" set. Please compare the bucket name you are using with the bucket name in your Google Cloud Console to see if it's set correctly.
The way it's written, it looks like the compiler thought you were using a different overload of BlobInfo.newBuilder.
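Since the original code isn't shown, here is a minimal sketch of the signed-URL flow with the google-cloud-storage Java client. The bucket name, object name, and file path are placeholders, and it assumes GOOGLE_APPLICATION_CREDENTIALS points at a valid service-account key:

```java
import com.google.cloud.storage.BlobId;
import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.HttpMethod;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.concurrent.TimeUnit;

public class SignedUrlUpload {
    public static void main(String[] args) throws Exception {
        // Assumes GOOGLE_APPLICATION_CREDENTIALS points at a service-account key.
        Storage storage = StorageOptions.getDefaultInstance().getService();

        BlobInfo blobInfo =
                BlobInfo.newBuilder(BlobId.of("my-bucket", "upload.txt")).build();

        // V4 signed URL valid for 15 minutes, restricted to PUT.
        URL signedUrl = storage.signUrl(
                blobInfo,
                15, TimeUnit.MINUTES,
                Storage.SignUrlOption.httpMethod(HttpMethod.PUT),
                Storage.SignUrlOption.withV4Signature());

        // Upload the file with a plain HTTP PUT to the signed URL.
        HttpURLConnection conn = (HttpURLConnection) signedUrl.openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("PUT");
        try (OutputStream out = conn.getOutputStream()) {
            out.write(Files.readAllBytes(Paths.get("upload.txt")));
        }
        System.out.println("Response code: " + conn.getResponseCode());
    }
}
```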

Access a file on a share (file server)

I want to access a file server that is not on my network, but I have credentials for another domain that can be used to access the files.
How do I gain access to the files on the share?
Is it possible to gain access to the files using a Java program?
The operating system is Windows. I want to read the contents of .txt and .csv files on the share and display them on a web page.
I used the jcifs library to solve this. It works great.
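A minimal sketch of that approach with the classic jcifs 1.x API; the domain, credentials, and smb:// path are placeholders:

```java
import jcifs.smb.NtlmPasswordAuthentication;
import jcifs.smb.SmbFile;
import jcifs.smb.SmbFileInputStream;

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class SmbShareReader {
    public static void main(String[] args) throws Exception {
        // Domain, user, password, and path below are placeholders.
        NtlmPasswordAuthentication auth =
                new NtlmPasswordAuthentication("OTHERDOMAIN", "user", "password");
        SmbFile file = new SmbFile("smb://fileserver/share/data.txt", auth);

        // Read the remote text file line by line.
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(new SmbFileInputStream(file)))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}
```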
You can use the FTP protocol.
You can also map a drive to the shared folder on your system. It's a simple solution.

Is it safe to store .ppk files in S3? - use of S3, Java and SSH

Part of the application I'm working on connects to an instance using SSH. It requires a .ppk file, which I currently have stored in S3.
My concern is that this is not secure enough, and I'm looking for a way to improve it.
I've considered encrypting the S3 bucket and allowing programmatic access only; the bucket and file location can be fed to the app via environment variables.
I really don't want to keep the file in the resources, as anyone who gets the JAR can unzip it and obtain the file, and the same goes for hardcoded values in the codebase. Is this a safe way of storing this file? Would encrypting it be worth the additional steps?
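As a sketch of the approach described (encrypted bucket, programmatic access only, locations supplied via environment variables), assuming the AWS SDK for Java v2 and an IAM role with s3:GetObject (plus kms:Decrypt if the bucket uses SSE-KMS); KEY_BUCKET and KEY_OBJECT are hypothetical variable names:

```java
import software.amazon.awssdk.core.ResponseBytes;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.GetObjectRequest;
import software.amazon.awssdk.services.s3.model.GetObjectResponse;

public class PrivateKeyLoader {
    // Bucket and object names come from the environment, as the question suggests.
    public static byte[] loadKey() {
        String bucket = System.getenv("KEY_BUCKET");
        String objectKey = System.getenv("KEY_OBJECT");

        try (S3Client s3 = S3Client.create()) {
            // SSE-S3 / SSE-KMS decryption is transparent to the client;
            // the role just needs s3:GetObject (and kms:Decrypt for SSE-KMS).
            ResponseBytes<GetObjectResponse> bytes = s3.getObjectAsBytes(
                    GetObjectRequest.builder().bucket(bucket).key(objectKey).build());
            return bytes.asByteArray();  // keep in memory; avoid writing to disk
        }
    }
}
```

Keeping the key bytes in memory rather than writing them to a temp file avoids leaving a copy on the instance's disk.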

File transfer to a remote machine in Amazon EC2

After creating an instance in the Amazon cloud using the web service API in Java, I need to transfer an executable or WAR file programmatically from my local machine to the newly created instance, and then execute it. I found that there is something called CreateBucket in the EC2 API; using that, I can upload the file and then pass a reference with PutObjectRequest to a remote computer in Amazon. Is this possible? If this approach is wrong, please suggest the correct way to transfer a file from my local machine to the Amazon EC2 instance.
The basic suggestion is that you shouldn't transfer the file(s) with CreateBucket, which is actually an S3 API. Using scp may be a better solution.
Amazon S3, which you are trying to use via CreateBucket, is a data storage service mainly for flexible, public (with authentication) file sharing. You can use the REST or SOAP APIs to access the data, but you cannot read/write it in EC2 instances as if it were on a local hard disk.
How to access the file system of EC2 instances really depends on the operating system (on EC2). If it's running Linux, scp is a mature choice. You can use Java to invoke scp directly if you are using Linux locally, or pscp if you are using Windows. If the EC2 instance is running Windows, one choice is to host an SSH/SFTP environment with FreeSSHD and then proceed as with Linux. Another option is to use a shared folder and regular file copy.
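For the "invoke scp from Java" route, a minimal sketch using ProcessBuilder; the key path, local file, user, and host are placeholders, and it assumes scp is available on the PATH:

```java
import java.io.IOException;

public class ScpUpload {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Key path, local file, user, and host are placeholders for your own values.
        ProcessBuilder scp = new ProcessBuilder(
                "scp",
                "-i", "/path/to/my-key.pem",   // the EC2 key pair's private key
                "target/myapp.war",            // local file to upload
                "ec2-user@ec2-203-0-113-10.compute-1.amazonaws.com:/home/ec2-user/");
        scp.inheritIO();                       // show scp's output in our console

        int exitCode = scp.start().waitFor();
        System.out.println("scp exited with code " + exitCode);
    }
}
```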
