Upload File to Cloud Storage directly using SignedURL - Java

I am trying to upload a file directly to Google Cloud Storage using the Java client library.
The code I have written is
Instead of uploading the new file to Cloud Storage, I am getting this output
What am I missing in the code to make the upload to Cloud Storage work?

You need to configure the authorization keys: a .json service account key file in your environment. See the documentation: https://cloud.google.com/iam/docs/creating-managing-service-account-keys#iam-service-account-keys-create-gcloud

I don't think you have the correct "BUCKET_NAME" set. Please compare the bucket name you are using with the bucket name in your Google Cloud Console to see if it's set correctly.
The way it's set, it looks like the compiler resolved your BlobInfo.newBuilder call to a different overload than the one you intended.
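For reference, a minimal sketch of the signed-URL flow with the Cloud Storage Java client: generate a V4 signed URL restricted to PUT, then upload the bytes with a plain HTTP PUT. The bucket name, object name, and file path below are illustrative assumptions, not taken from the question's code.

import com.google.cloud.storage.*;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.concurrent.TimeUnit;

public class SignedUrlUpload {
    public static void main(String[] args) throws Exception {
        // Assumes GOOGLE_APPLICATION_CREDENTIALS points at a service account key
        Storage storage = StorageOptions.getDefaultInstance().getService();

        // Illustrative bucket and object names
        BlobInfo blobInfo = BlobInfo.newBuilder(
                BlobId.of("your-bucket-name", "uploads/photo.jpg")).build();

        // V4 signed URL valid for 15 minutes, restricted to PUT
        URL signedUrl = storage.signUrl(blobInfo, 15, TimeUnit.MINUTES,
                Storage.SignUrlOption.httpMethod(HttpMethod.PUT),
                Storage.SignUrlOption.withV4Signature());

        // Upload the file bytes with a plain HTTP PUT to the signed URL
        HttpURLConnection conn = (HttpURLConnection) signedUrl.openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("PUT");
        conn.getOutputStream().write(Files.readAllBytes(Paths.get("photo.jpg")));
        System.out.println("Upload response: " + conn.getResponseCode());
    }
}

No Authorization header is needed on the PUT itself: the signature embedded in the URL's query string is what authorizes the request.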

Related

Upload file to AWS for ingestion into a Postgres DB

I need to upload a file using a web form to AWS and then trigger a function to import it into a Postgres DB. I have the file import into a DB working locally using Java, but need it to work in the cloud.
The file upload, along with some settings (such as which table to import into), needs to be passed through a Java function that imports it into the Postgres DB.
I can upload files to an EC2 instance with PHP, but then need to trigger a Lambda function on that file. My research suggests S3 buckets are perhaps a better solution? I'm looking for pointers on which services would be best suited.
There are two main steps in your scenario:
Step 1: Upload a file to Amazon S3
It is simple to create an HTML form that uploads data directly to an Amazon S3 bucket.
However, it is typically unwise to allow anyone on the Internet to use the form, since they might upload any number and type of files. Typically, you will want your back-end to confirm that they are entitled to upload the file. Your back-end can then generate a presigned URL (see Upload objects using presigned URLs - Amazon Simple Storage Service), which authorizes the user to perform the upload; a Java sketch follows the links below.
For some examples in various coding languages, see:
Direct uploads to AWS S3 from the browser (crazy performance boost)
File Uploads Directly to S3 From the Browser
Amazon S3 direct file upload from client browser - private key disclosure
Uploading to Amazon S3 directly from a web or mobile application | AWS Compute Blog
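As a rough illustration, generating such a presigned URL in Java (AWS SDK for Java v1) might look like the sketch below; the bucket name and object key are placeholder assumptions:

import com.amazonaws.HttpMethod;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import java.net.URL;
import java.util.Date;

public class PresignUpload {
    public static void main(String[] args) {
        // Credentials are resolved from the environment / instance role
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        // URL valid for 10 minutes; the browser form then PUTs the file to it
        Date expiry = new Date(System.currentTimeMillis() + 10 * 60 * 1000);
        URL url = s3.generatePresignedUrl("my-bucket", "uploads/image.jpg",
                expiry, HttpMethod.PUT);
        System.out.println(url);
    }
}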
Step 2: Load the data into the database
When the object is created in the Amazon S3 bucket, you can configure S3 to trigger an AWS Lambda function, which can be written in the programming language of your choice.
The Bucket and Filename (Key) of the object will be passed into the Lambda function via the event parameter. The Lambda function can then:
Read the object from S3
Connect to the database
Insert the data into the desired table
It is your job to code this functionality but you will find many examples on the Internet.
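To make the Lambda side concrete, here is a minimal sketch of an S3-triggered handler in Java that streams the object into Postgres over JDBC. The table name, single-column layout, and environment variable names are illustrative assumptions:

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.S3Event;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class S3ToPostgresHandler implements RequestHandler<S3Event, String> {
    private final AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

    @Override
    public String handleRequest(S3Event event, Context context) {
        // The Bucket and Key arrive in the event, as described above
        String bucket = event.getRecords().get(0).getS3().getBucket().getName();
        String key = event.getRecords().get(0).getS3().getObject().getKey();

        try (BufferedReader reader = new BufferedReader(new InputStreamReader(
                     s3.getObject(bucket, key).getObjectContent()));
             Connection conn = DriverManager.getConnection(
                     System.getenv("JDBC_URL"),       // e.g. jdbc:postgresql://host:5432/db
                     System.getenv("DB_USER"),
                     System.getenv("DB_PASSWORD"));
             PreparedStatement stmt = conn.prepareStatement(
                     "INSERT INTO imports (line) VALUES (?)")) {
            // Insert each line of the object into the (assumed) target table
            String line;
            while ((line = reader.readLine()) != null) {
                stmt.setString(1, line);
                stmt.addBatch();
            }
            stmt.executeBatch();
        } catch (Exception e) {
            throw new RuntimeException("Import failed for " + key, e);
        }
        return "Imported " + key;
    }
}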
You can use the AWS SDK in the language of your choice to invoke Lambda.
Please refer to this documentation.

AWS API to upload file in S3 and write record in DB

I need to implement an AWS backend API that allows the users of my mobile app to upload a file (image) to Amazon S3.
Creating an API directly interfaced with Amazon S3 is not an option, because I will not be able to correlate the uploaded file with the user's record in DynamoDB.
I've thought of creating a Lambda function (Java), triggered by an API, that performs the following steps:
1) calls the Amazon S3 functionality to upload the file
2) writes the record into my DynamoDB with the reference to the file
Is there a way to provide a binary file as input to my Lambda function exposed as an API?
Please let me know. Thank you!
Davide
The best way to do this is with presigned URLs. You can generate a URL that will let the user upload a file directly to S3 with a specific name and type. This way you don't have to worry about big files slowing down your server, Lambda limits, or double charges for bandwidth. It's also faster for the user in most cases and supports S3 Transfer Acceleration.
The process can look something like:
User requests link from your server
Your server writes an entry in DynamoDB and returns a presigned URL
User uploads the file directly to S3 using the presigned URL (with the exact name of your server's choice)
Once the upload is done, you either get a notification via Lambda or have the user tell your server the upload is done
Your server performs any required post-processing and marks the file as ready
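A minimal sketch of the second step (write the DynamoDB record, return a presigned URL), assuming the AWS SDK for Java v1; the bucket name, key scheme, and table layout are illustrative, not prescribed:

import com.amazonaws.HttpMethod;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;
import com.amazonaws.services.dynamodbv2.model.AttributeValue;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import java.net.URL;
import java.util.Date;
import java.util.Map;
import java.util.UUID;

public class PresignedUploadService {
    private final AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
    private final AmazonDynamoDB dynamo = AmazonDynamoDBClientBuilder.defaultClient();

    public URL createUpload(String userId) {
        // Server chooses the exact object name, so it can be correlated later
        String key = "uploads/" + userId + "/" + UUID.randomUUID();

        // Write the DynamoDB record before handing out the URL
        dynamo.putItem("uploads", Map.of(
                "fileKey", new AttributeValue(key),
                "userId", new AttributeValue(userId),
                "status", new AttributeValue("PENDING")));

        // Presigned PUT URL valid for 10 minutes
        Date expiry = new Date(System.currentTimeMillis() + 10 * 60 * 1000);
        return s3.generatePresignedUrl("my-upload-bucket", key, expiry, HttpMethod.PUT);
    }
}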
And to answer your actual question: yes, there is a way to pass binary data to Lambda functions. The link is a step-by-step tutorial, but basically in API Gateway you have to set "Request body passthrough" to "When there are no templates defined (recommended)" and fill in your expected content types. Your mapping should include "base64data": "$input.body", and you need to set up your types under "Binary Support". In your actual Lambda function, you will then have access to the data as "base64data".

How to access the files from a s3 storage using java

I have created my VM in AWS using an EC2 instance. I have also created a bucket in S3 and some folders in that bucket. I have my Java jar file in FileZilla and now I want to run my application. This scenario works when the files I want to access are also in FileZilla, but now that I have created an S3 bucket I do not know how my Java application can access those folders. Do I need to create a new Java class in my application in order to access these folders in S3?
I am not very familiar with AWS and I would appreciate any help. Even some useful links that solve my problem would be a great help.
Thanks in advance
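As a starting point, here is a minimal sketch of listing and reading objects with the AWS SDK for Java v1; the bucket and folder names are illustrative assumptions:

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.S3ObjectSummary;
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class S3FolderReader {
    public static void main(String[] args) throws Exception {
        // Credentials are resolved from the environment or the EC2 instance role
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        // "Folders" in S3 are just key prefixes
        for (S3ObjectSummary summary :
                s3.listObjects("my-bucket", "my-folder/").getObjectSummaries()) {
            System.out.println("Found: " + summary.getKey());
        }

        // Read one object's content as text
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(
                s3.getObject("my-bucket", "my-folder/data.txt").getObjectContent()))) {
            reader.lines().forEach(System.out::println);
        }
    }
}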

Reading data from Azure Blob with Spark

I am having an issue reading data from Azure blobs via Spark Streaming.
JavaDStream<String> lines = ssc.textFileStream("hdfs://ip:8020/directory");
Code like the above works for HDFS, but is unable to read a file from an Azure blob:
https://blobstorage.blob.core.windows.net/containerid/folder1/
The above is the path shown in the Azure UI, but it doesn't work. Am I missing something, and how can we access it?
I know Event Hubs are the ideal choice for streaming data, but my current situation demands using storage rather than queues.
In order to read data from blob storage, two things need to be done. First, you need to tell Spark which native file system to use in the underlying Hadoop configuration. This means you also need the hadoop-azure JAR on your classpath (note there may be runtime requirements for more JARs related to the Hadoop family):
import org.apache.hadoop.conf.Configuration;
import org.apache.spark.api.java.JavaSparkContext;

JavaSparkContext ct = new JavaSparkContext();
Configuration config = ct.hadoopConfiguration();
// Register the native Azure file system implementation
config.set("fs.azure", "org.apache.hadoop.fs.azure.NativeAzureFileSystem");
// Storage account key so Hadoop can authenticate to your account
config.set("fs.azure.account.key.youraccount.blob.core.windows.net", "yourkey");
Now, refer to the file using the wasb:// prefix (note the [s] is for an optional secure connection):
ssc.textFileStream("wasb[s]://<BlobStorageContainerName>@<StorageAccountName>.blob.core.windows.net/<path>");
This goes without saying that you'll need to have proper permissions set from the location making the query to blob storage.
As a supplement, there is a very helpful tutorial about HDFS-compatible Azure Blob storage with Hadoop; please see https://azure.microsoft.com/en-us/documentation/articles/hdinsight-hadoop-use-blob-storage.
Meanwhile, there is an official sample on GitHub for Spark Streaming on Azure. Unfortunately, the sample is written in Scala, but I think it's still helpful for you.
df = spark.read.format("csv").load("wasbs://blob_container@account_name.blob.core.windows.net/example.csv", inferSchema = True)

Store files in google drive from Google App Engine

All,
I have a Google App Engine (Java) application that needs to store some images. I tried using Blobs and storing them in the Datastore, but as you know there is a size limit on data that can be stored in the Datastore.
So as a result I'm storing the images on a different server, storing the path in my Datastore, and all works fine.
Now I'm thinking of using a Google Drive folder instead of a separate server: upload the files to Drive and use the share link to display them later.
I've seen https://developers.google.com/drive/web/quickstart/java and got it to work fine. When I try to use it in my application, however, it obviously won't work, as the code assumes a credential for a local user.
I created a service account key for my application and want to change the sample code above to use it, but I'm not sure if that's the correct approach.
I've tried searching for samples but can't find anyone taking the same approach. Is there a working sample that shows how to authenticate an application, not a user, and, let's say, store a file in Google Drive?
I've also seen https://developers.google.com/drive/web/examples/. Please note that what I want is to store files in my Google Drive, not the user's Google Drive. So if user A and user B come to my app, they shouldn't have to authorize my application, and both should be able to upload a file to my Google Drive.
I don't know if this can be done directly from their browser or if I have to move the file to my application (appspot) and then push it to Google Drive.
Thanks
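For what it's worth, a minimal sketch of the service-account approach with the Drive v3 Java client; the key file path, application name, and file names are illustrative assumptions:

import com.google.api.client.googleapis.javanet.GoogleNetHttpTransport;
import com.google.api.client.http.FileContent;
import com.google.api.client.json.gson.GsonFactory;
import com.google.api.services.drive.Drive;
import com.google.api.services.drive.DriveScopes;
import com.google.api.services.drive.model.File;
import com.google.auth.http.HttpCredentialsAdapter;
import com.google.auth.oauth2.GoogleCredentials;
import java.io.FileInputStream;
import java.util.List;

public class DriveServiceUpload {
    public static void main(String[] args) throws Exception {
        // A service account key authorizes the *application*, not an end user
        GoogleCredentials credentials = GoogleCredentials
                .fromStream(new FileInputStream("service-account-key.json"))
                .createScoped(List.of(DriveScopes.DRIVE_FILE));

        Drive drive = new Drive.Builder(
                GoogleNetHttpTransport.newTrustedTransport(),
                GsonFactory.getDefaultInstance(),
                new HttpCredentialsAdapter(credentials))
                .setApplicationName("my-app")
                .build();

        // Upload a local file and print its shareable link
        File metadata = new File().setName("image.png");
        FileContent content = new FileContent("image/png", new java.io.File("image.png"));
        File uploaded = drive.files().create(metadata, content)
                .setFields("id, webViewLink")
                .execute();
        System.out.println("Uploaded: " + uploaded.getWebViewLink());
    }
}

One caveat: files created this way land in the service account's own Drive storage, not in your personal account's Drive, unless you share a folder with the service account or use domain-wide delegation.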
