I am trying to run a query in Google BigQuery and export the data to Google Cloud Storage using GZIP compression.
JobConfigurationExtract jobExtractConfig = new JobConfigurationExtract()
        .setSourceTable(tableReference)
        .setDestinationFormat("CSV")
        .setDestinationUri("gs://dev-app-uploads/results.zip")
        .setCompression("GZIP");
By using this config, I am able to generate a results.zip file successfully in Cloud Storage in the configured bucket dev-app-uploads. But the file inside the zip is generated without a .csv extension. When I extract the zip file, I get a file named "results", and only after manually adding the .csv extension can I open it and see the contents.
But what I need is to generate the file with a .csv extension, zip it, and place it in Cloud Storage.
Please let me know if this is possible, or whether there are better options for exporting data from BigQuery with compression.
Instead of
gs://dev-app-uploads/results.zip
use
gs://dev-app-uploads/results.csv.zip
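This works because extraction tools generally name the output by stripping only the last extension, so doubling the extension preserves the inner .csv. A tiny plain-Java sketch of that naming rule (purely illustrative, no Cloud Storage dependency):

```java
public class ExtractName {
    // Extraction tools typically name the output by dropping only the
    // final extension: "results.csv.zip" unpacks to "results.csv".
    public static String extractedName(String objectName) {
        int dot = objectName.lastIndexOf('.');
        return dot < 0 ? objectName : objectName.substring(0, dot);
    }

    public static void main(String[] args) {
        System.out.println(extractedName("results.csv.zip")); // prints "results.csv"
        System.out.println(extractedName("results.zip"));     // prints "results"
    }
}
```

So results.csv.zip unpacks to results.csv, while results.zip unpacks to a bare "results" file, which is exactly the behaviour described in the question.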
I am trying to get metadata of a file lying in Azure blob storage.
I am using ffprobe for this purpose. Though it works, since the ffprobe binary lives on my local system and the file lives in blob storage, the entire process is too slow.
What would be the best way to do the above, i.e. to get metadata for a remote file?
Two ways for your reference:
1. Use blob.downloadAttributes(), then blob.getMetadata():
This method populates the blob's system properties and user-defined
metadata. Before reading or modifying a blob's properties or metadata,
call this method or its overload to retrieve the latest values for the
blob's properties and metadata from the Microsoft Azure storage
service.
2. Use the Get Metadata activity in Azure Data Factory (ADF).
Get a file's metadata:
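For option 2, a Get Metadata activity in an ADF pipeline has roughly this shape (the activity and dataset names below are placeholders, not from the question; fieldList picks which properties to return):

```json
{
  "name": "GetBlobMetadata",
  "type": "GetMetadata",
  "typeProperties": {
    "dataset": {
      "referenceName": "MyBlobDataset",
      "type": "DatasetReference"
    },
    "fieldList": [ "itemName", "size", "lastModified" ]
  }
}
```

The activity's output then exposes those fields (size, lastModified, etc.) to downstream activities, so no bytes need to leave Azure just to read the metadata.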
I am trying to upload a file directly to Google Cloud Storage using the Java client library.
The Code I have written is
Instead of uploading the new file to cloud storage I am getting this output
What am I missing in the code to make the upload to Cloud Storage work?
You need to configure the authorization keys: a .json key file for your environment. See the documentation: https://cloud.google.com/iam/docs/creating-managing-service-account-keys#iam-service-account-keys-create-gcloud
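A minimal sketch of that setup, assuming the gcloud CLI is installed; the service-account and project names are placeholders you must replace with your own:

```shell
# Create a JSON key for a service account (names here are placeholders)
gcloud iam service-accounts keys create key.json \
    --iam-account=my-sa@my-project.iam.gserviceaccount.com

# Point the client library at the key so it can authenticate
export GOOGLE_APPLICATION_CREDENTIALS="$PWD/key.json"
```

With GOOGLE_APPLICATION_CREDENTIALS set, the Java client library picks up the credentials automatically via Application Default Credentials.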
I don't think you have the correct "BUCKET_NAME" set, please compare the bucket name you are using with your bucket name on your Google Cloud Console so you can see if it's set correctly.
The way it's set, it looks like the compiler resolved a different overload of the BlobInfo.newBuilder method than the one you intended.
I am writing a web server with Play Framework 2.6 in Java. I want to upload a file to the web server through a multipart form, do some validations, and then upload the file to S3. The default implementation in Play saves the upload to a temporary file on the file system, but I do not want to do that; I want to send the file straight to AWS S3.
I looked into this tutorial, which explains how to save the file permanently in the file system instead of using a temporary file. To my knowledge, I have to write a custom Accumulator or Sink that collects the incoming ByteStrings into a byte array, but I cannot find out how to do so. Can someone point me in the right direction?
Thanks.
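Conceptually, the Sink you want folds each arriving ByteString chunk into a growing buffer. Here is that accumulation sketched in plain Java, with byte arrays standing in for Akka's ByteString; this shows the idea, not the actual Play/Akka API:

```java
import java.io.ByteArrayOutputStream;
import java.util.List;

public class ChunkFold {
    // Fold incoming byte chunks into one array, the way a folding Sink
    // over ByteString would accumulate a multipart request body.
    public static byte[] accumulate(List<byte[]> chunks) {
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        for (byte[] chunk : chunks) {
            buffer.write(chunk, 0, chunk.length);
        }
        return buffer.toByteArray();
    }

    public static void main(String[] args) {
        byte[] body = accumulate(List.of("hel".getBytes(), "lo".getBytes()));
        System.out.println(new String(body)); // prints "hello"
    }
}
```

In the real body parser you would wrap the equivalent fold in an Accumulator and, once materialized, hand the resulting bytes to the S3 client for validation and upload.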
Is it possible to upload a file directly to HDFS via some web service? So far I have been writing the file to the local system first and then moving it into HDFS.
WebHDFS provides REST APIs to support all the filesystem operations.
Direct uploading is not possible, though. It takes two steps:
1. Create the file at the HDFS location: PUT http://<HOST>:<PORT>/webhdfs/v1/<PATH>?op=CREATE. The namenode does not accept the data itself; it answers with a 307 redirect whose Location header points at a datanode.
2. Write to that file: PUT the contents of the local file you want to upload to the redirect URL, http://<DATANODE>:<PORT>/webhdfs/v1/<PATH>?op=CREATE
Refer to the APIs here: WebHDFS APIs
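The two steps above can be sketched with the JDK's own HttpURLConnection, with no Hadoop client needed. The host, port, path, and user below are placeholders; note that step 1 must not auto-follow the redirect, because we need to read the datanode URL out of the Location header ourselves:

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class WebHdfsUpload {
    // Build the step-1 CREATE URL for the namenode (host/port/path/user are placeholders).
    public static String createUrl(String host, int port, String hdfsPath, String user) {
        return "http://" + host + ":" + port + "/webhdfs/v1" + hdfsPath
                + "?op=CREATE&user.name=" + user + "&overwrite=true";
    }

    // Two-step upload: PUT to the namenode with no body and redirects disabled,
    // read the 307 Location header, then PUT the file bytes to that datanode URL.
    public static void upload(String nameNodeCreateUrl, File localFile) throws IOException {
        HttpURLConnection step1 = (HttpURLConnection) new URL(nameNodeCreateUrl).openConnection();
        step1.setRequestMethod("PUT");
        step1.setInstanceFollowRedirects(false); // we need the Location header ourselves
        String dataNodeUrl = step1.getHeaderField("Location");
        step1.disconnect();

        HttpURLConnection step2 = (HttpURLConnection) new URL(dataNodeUrl).openConnection();
        step2.setRequestMethod("PUT");
        step2.setDoOutput(true);
        try (OutputStream out = step2.getOutputStream();
             InputStream in = new FileInputStream(localFile)) {
            in.transferTo(out); // stream the local file to the datanode
        }
        if (step2.getResponseCode() != 201) { // WebHDFS answers 201 Created on success
            throw new IOException("Upload failed: HTTP " + step2.getResponseCode());
        }
        step2.disconnect();
    }
}
```

This is a sketch under the assumption of an unsecured cluster using user.name authentication; a Kerberos-secured cluster would need SPNEGO handling on both requests.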
In my project, a compressed file is served as bytes by a PHP page.
I am trying to find a way to read the PHP page, decompress the file using GZIP, and write it out to the assets folder.
The file that I am reading has to be placed in the data/data/package/database file.
I have a class that reads a file from the assets folder and places it into data/data/package/database.
Is it possible to write to the assets folder at runtime? If not, is there a better way to do this?
If the PHP script is running on a server, your Android app will need to make an HTTP request to retrieve the content, and then it has to store that content somewhere. The SQLite database is a good option, but you could also store the content in files on the SD card.
There are a couple of ways to do the HTTP connection part, and they're written up on the Android Dev Blog. Myself, I prefer HttpURLConnection.
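Wherever you end up storing the result, the GZIP step itself is covered by java.util.zip. A minimal round-trip sketch; the gzip helper here just stands in for the server's compressed response, and in the app the input would be the HTTP response stream:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipRoundTrip {
    // Decompress a GZIP stream (e.g. the HTTP response body) into raw bytes
    // that can then be written to internal storage such as data/data/<package>.
    public static byte[] gunzip(byte[] compressed) throws IOException {
        try (GZIPInputStream in = new GZIPInputStream(new ByteArrayInputStream(compressed));
             ByteArrayOutputStream out = new ByteArrayOutputStream()) {
            in.transferTo(out);
            return out.toByteArray();
        }
    }

    // Helper used here only to fake a compressed server response for the demo.
    public static byte[] gzip(byte[] raw) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (GZIPOutputStream out = new GZIPOutputStream(bytes)) {
            out.write(raw);
        }
        return bytes.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] original = "database payload".getBytes();
        byte[] restored = gunzip(gzip(original));
        System.out.println(new String(restored)); // prints "database payload"
    }
}
```

On Android the decompressed bytes would go to the app's private files or databases directory rather than the assets folder, since assets are read-only at runtime.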