I want to modify the content of a file stored in a Google Cloud Storage bucket and serve it to the user. The delivered content will change based on the user's profile.
Here is how I am trying to achieve it:
The user will request a particular hash key, which is mapped to a Cloud Storage URL. When the hash key is used in the URL, say /serve/hash-key, the contents are read, modified, and served to the user.
The user will not know the Cloud Storage URL, nor have direct access to the content of the file.
A Java servlet is mapped on the server side to process the user's request, handling all /serve/* queries.
The user can make the request either through the current tab or through a new tab.
The problem I am facing is that when the request is processed in the Java servlet, the App Engine user details cannot be fetched using the OAuthServiceFactory class: an OAuthRequestException is thrown when the getCurrentUser API is called.
Is there a possible way to achieve this or address the error?
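For reference, a minimal sketch of the setup described above, assuming a servlet mapped to /serve/*; the class name and error handling are illustrative, and getCurrentUser() is the call that throws:

```java
import java.io.IOException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import com.google.appengine.api.oauth.OAuthRequestException;
import com.google.appengine.api.oauth.OAuthServiceFactory;
import com.google.appengine.api.users.User;

// Illustrative servlet for /serve/<hash-key>; names are assumptions.
@WebServlet("/serve/*")
public class ServeServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        String hashKey = req.getPathInfo().substring(1); // part after "/serve/"
        try {
            // Throws OAuthRequestException when the request carries no valid
            // OAuth credentials, e.g. a plain request from a browser tab.
            User user = OAuthServiceFactory.getOAuthService().getCurrentUser();
            // Look up the Cloud Storage object for hashKey, modify the
            // content for this user's profile, and write it to resp here.
        } catch (OAuthRequestException e) {
            resp.sendError(HttpServletResponse.SC_UNAUTHORIZED, e.getMessage());
        }
    }
}
```

Note that OAuth credentials are only present when the client sends an OAuth token; a URL opened directly in a browser tab will not carry one.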
Caused by: com.azure.storage.blob.models.BlobStorageException: If you are using a StorageSharedKeyCredential, and the server returned an error message that says 'Signature did not match', you can compare the string to sign with the one generated by the SDK. To log the string to sign, pass in the context key value pair 'Azure-Storage-Log-String-To-Sign': true to the appropriate method call.
If you are using a SAS token, and the server returned an error message that says 'Signature did not match', you can compare the string to sign with the one generated by the SDK. To log the string to sign, pass in the context key value pair 'Azure-Storage-Log-String-To-Sign': true to the appropriate generateSas method call.
Please remember to disable 'Azure-Storage-Log-String-To-Sign' before going to production as this string can potentially contain PII.
Status code 403, "AuthorizationFailure: This request is not authorized to perform this operation."
I was trying to upload a document to Azure Blob Storage.
I resolved this issue by adding my client IP address under Firewalls and virtual networks.
If you are facing a similar problem, you might not have read or write access to the Azure blob.
All you need to do is open the Azure resource group -> Storage account -> in the Security + networking section open Networking -> in the Firewall section you will see an option called "Add your client IP address"; check that checkbox and save your changes.
That's it; you will be able to read and write to Blob Storage after that.
I am trying to store metadata into an STS "assume role" session so that I can retrieve it when the session user calls my service.
To accomplish this, I am setting a tag during the STS assumeRole creation:
AWSSecurityTokenService service = ...
AssumeRoleRequest request = new AssumeRoleRequest();
request.setTags(ImmutableList.of(new Tag().withKey("metadataKey").withValue("metadataValue")));
...
service.assumeRole(request);
In my backend service, I receive the username and ARN of the caller which corresponds to the temporary session. However, I am not able to lookup the details of the IAM user (which would contain the tags).
AmazonIdentityManagement iamClient = ...
GetUserRequest request = new GetUserRequest();
request.setUserName(...);
// this next line fails because the temporary user has a colon in the username
iamClient.getUser(request);
How would I retrieve the Tags of a temporary 'Assume Role user'?
How would I retrieve the Tags of a temporary 'Assume Role user'?
This question is based on a misunderstanding of what Tags are used for. Tags are used to further ALLOW / DENY access to resources. They are not used as a canvas for storing metadata. This is supported by the AWS documentation:
When you use the session credentials to make a subsequent request, the request context includes the aws:PrincipalTag context key. You can use the aws:PrincipalTag key in the Condition element of your policies to allow or deny access based on those tags. See more here
Temporary session users cannot be looked up from an IAM ARN as there is no persistent data stored by AWS.
However, there is a workaround: you can store limited metadata using the "session name" field. AWS includes the session name in the ARN, so values can be stored there as long as they are not sensitive information.
During the role creation:
AWSSecurityTokenService service = ...
AssumeRoleRequest request = new AssumeRoleRequest();
request.setRoleSessionName("metadata=test");
service.assumeRole(request);
Finally, the caller's ARN is in the following format and can be read by another service:
[generatedId]:metadata=test[moreData]
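Assuming the session name was set to "metadata=test" as above, the value can be recovered from the caller's ARN with a plain string parse; this helper class is hypothetical, not part of the AWS SDK:

```java
// Hypothetical helper: pull the metadata value out of an assumed-role ARN
// such as arn:aws:sts::123456789012:assumed-role/MyRole/metadata=test
public class SessionMetadata {
    public static String extract(String arn) {
        // The session name is the last path segment of the ARN
        String sessionName = arn.substring(arn.lastIndexOf('/') + 1);
        return sessionName.startsWith("metadata=")
                ? sessionName.substring("metadata=".length())
                : null;
    }
}
```

For example, extract("arn:aws:sts::123456789012:assumed-role/MyRole/metadata=test") returns "test".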
I need to pass data to my URL to get the data back.
I have done it like this:
Document document = Jsoup.connect("MYURL").data("PASSCODE", "001100").post();
System.out.println(document);
but I am not getting the proper output.
I need help. I have checked this link,
and this and this as well.
Usually, logging in to a web site requires two steps:
You send a GET request to get the page, and you extract from it some values, like the session ID, as well as the cookies.
You send a POST request with the values from step 1, plus your user name and password.
To find out which values you need to send, use your browser in developer mode (press F12) and examine the traffic.
If you want to write an Android app, change the user-agent string to match your browser, since some sites send different pages to different clients.
You can see an example HERE.
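The two steps can be sketched with Jsoup as follows; the URL, the csrf_token field name, and the form field names are assumptions that will differ per site, so copy them from the traffic you captured in the developer tools:

```java
import java.util.Map;
import org.jsoup.Connection;
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;

public class JsoupLogin {
    public static void main(String[] args) throws Exception {
        // Step 1: GET the login page to collect the cookies and any
        // hidden form values (session ID, CSRF token, ...)
        Connection.Response loginPage = Jsoup.connect("https://example.com/login")
                .userAgent("Mozilla/5.0")          // match a real browser
                .method(Connection.Method.GET)
                .execute();
        Map<String, String> cookies = loginPage.cookies();
        String token = loginPage.parse()
                .select("input[name=csrf_token]")  // hypothetical hidden field
                .val();

        // Step 2: POST the credentials together with the values from step 1
        Document home = Jsoup.connect("https://example.com/login")
                .userAgent("Mozilla/5.0")
                .cookies(cookies)
                .data("username", "user")
                .data("password", "pass")
                .data("csrf_token", token)
                .post();
        System.out.println(home.title());
    }
}
```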
The argument has been made and settled on numerous occasions: the Blobstore is better than the Datastore for storing images. Now say I have an app similar to Instagram, Facebook, Yelp, or any of those image-intensive apps. In my particular case the ideal model would be
public class IdealPostModel {
    Integer userId;
    String synopsys;
    Blob image;
    ... // more data/fields about the post
}
But since I must use the Blobstore, my model does not have a Blob but instead a URL or BlobKey. The heavy catch is that to send a post (i.e., save a post to the server) the app must, in that order:
1. Send all the non-blob data to the server.
2. Wait for the server to respond with BlobstoreUtils.generateServingUrl(null) data.
3. Send the images to the Blobstore.
4. Have the Blobstore send a response to my server with the BlobKey or URL of the image.
5. Store the BlobKey/URL in my Datastore entity.
That's a lot of handshakes!!!
Is there a way to send all the data to the server in step one, strings and image, and then have the server do everything else? Of course I am hoping to reduce the amount of work. I imagine App Engine must be quite mature by now, and there has to be a simpler way than my architecture.
Now of course I am here because I am experiencing situations where the data is saved but the BlobKey or URL is not saved to the entity. It happens about 10% of the time, or maybe less, but it feels like 10%. It's driving my users insane, which means it's driving me even more insane, since I don't want to lose my users.
Ideally
1. The app sends everything to the server in one call: the image and metadata such as userId and synopsys.
2. The server somehow gets a blob key from the Blobstore.
3. The server sends the image to the Blobstore, to be stored under the provided blob key, and sends the other data to the Datastore, including the blob key.
Update
public static String generateServingUrl(String path) {
    BlobstoreService blobstoreService = BlobstoreServiceFactory.getBlobstoreService();
    return blobstoreService.createUploadUrl(null == path ? "/upload" : path);
}
This is a snippet on my server.
The workflow is different. Given your example:
There is a "Post" button and an "Attach an image" button under a form for a new post.
When a user hits the Attach button, you ask the user to select a file and save it to the Blobstore (or Google Cloud Storage). There is no need to call BlobstoreUtils.generateServingUrl(null); you need to get an upload URL. When the call returns a key, you store it in the Post object.
When a user hits the Post button, you save the Post, including the image key (or keys), in the Datastore. Thus the entire workflow takes only two calls.
If a user hits the Cancel button, remember to delete the uploaded image(s), if any, or they will be orphaned. The tricky part is deleting these images if a user simply leaves your app or loses the connection.
If you want, you can reverse the process: save the post first, then let users attach images. This is not what most users expect, but it solves the problem of orphaned images.
You actually can send everything in a single request.
Bear in mind that when you send a blob using a URL obtained by calling blobstoreService.createUploadUrl("/yourservingurl"), GAE actually makes a first request to a different URL, stores the blob, and then calls /yourservingurl, passing you the blob keys, which you can retrieve by doing:
Map<String, List<BlobKey>> blobs = blobstoreService.getUploads(req);
List<BlobKey> blobKeys = blobs.get("myFile");
So in effect all other form values will be lost after that first request, BUT if you build that URL dynamically, e.g.
blobstoreService.createUploadUrl("/yourservingurl?param1=a&param2=b")
then you can get those parameters back in your servlet and persist everything (including the blob key of the already-stored blob) at once, making far fewer calls to the Datastore.
UPDATE:
The steps would be:
1) Gather all parameters on the client side and create an upload URL with those params, e.g. blobstoreService.createUploadUrl("/yourservingurl?userId=989787"). GAE will generate a unique URL for that specific request.
When you submit the form, GAE will persist the blobs and call /yourservingurl.
2) In the servlet serving /yourservingurl, you get all the blob keys for the files you uploaded by calling blobstoreService.getUploads(request), AND you can get the parameters you included with the standard request.getParameter("userId").
3) Now in your servlet you have all the parameters you sent (e.g. userId) plus the blob key, and you can persist your Post object in one call.
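The steps above can be sketched as the servlet behind /yourservingurl; the upload field name "myFile" and the userId parameter are carried over from the answer, while the Datastore write and the redirect target are placeholders:

```java
import java.io.IOException;
import java.util.List;
import java.util.Map;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import com.google.appengine.api.blobstore.BlobKey;
import com.google.appengine.api.blobstore.BlobstoreService;
import com.google.appengine.api.blobstore.BlobstoreServiceFactory;

// Illustrative servlet mapped to /yourservingurl
public class PostUploadServlet extends HttpServlet {
    private final BlobstoreService blobstoreService =
            BlobstoreServiceFactory.getBlobstoreService();

    @Override
    public void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        // Step 2: Blobstore has already stored the file; fetch its key
        Map<String, List<BlobKey>> blobs = blobstoreService.getUploads(req);
        BlobKey imageKey = blobs.get("myFile").get(0);

        // The parameters baked into the upload URL survive the callback
        String userId = req.getParameter("userId");

        // Step 3: persist the Post entity (userId + imageKey) in a single
        // Datastore call here, then send the user somewhere sensible
        resp.sendRedirect("/posts");
    }
}
```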
I want to provide a URL to a requesting user through email for a download request. The URL is valid for a minute; when the user tries to access the URL after that minute, the web app should redirect them to another page. What is the best logic for this? Kindly let me know your views.
Create a URL to the downloadable resource that contains a query string with an encrypted (or signed) expiration date. Very simple to manage, and you don't have to rely on a database.
http://example.com/download.php?aid=jHYgIK7d
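A sketch of this approach, signing the expiry timestamp with an HMAC so the client cannot forge it; the parameter names, the secret, and the one-minute window are assumptions:

```java
import java.nio.charset.StandardCharsets;
import java.security.GeneralSecurityException;
import java.util.Base64;
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;

public class ExpiringLink {
    // Server-side secret; in production, load it from configuration
    private static final byte[] KEY =
            "server-side-secret".getBytes(StandardCharsets.UTF_8);

    // Sign the expiry timestamp so the client cannot tamper with it
    static String sign(long expiresAtMillis) {
        try {
            Mac mac = Mac.getInstance("HmacSHA256");
            mac.init(new SecretKeySpec(KEY, "HmacSHA256"));
            byte[] sig = mac.doFinal(Long.toString(expiresAtMillis)
                    .getBytes(StandardCharsets.UTF_8));
            return Base64.getUrlEncoder().withoutPadding().encodeToString(sig);
        } catch (GeneralSecurityException e) {
            throw new IllegalStateException(e);
        }
    }

    // Valid while the deadline is in the future and the signature matches
    static boolean isValid(long expiresAtMillis, String sig) {
        return System.currentTimeMillis() < expiresAtMillis
                && sign(expiresAtMillis).equals(sig);
    }

    public static void main(String[] args) {
        long expires = System.currentTimeMillis() + 60_000; // one minute
        System.out.println("http://example.com/download?expires=" + expires
                + "&sig=" + sign(expires));
    }
}
```

The download servlet then reads the two parameters, calls isValid, and either streams the file or redirects to the "expired" page.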
1. Generate a large random string (GUID).
2. Write this string with a timestamp to a database.
3. Give the user a link to /download?guid=[your guid].
4. Write a servlet and map it to /download.
5. In your servlet:
5.1. Read the GUID from the request parameter.
5.2. Check the database that the time is still valid.
5.3. If yes, read the file from your server and stream it from the servlet to the user (make sure to set the content type correctly).
5.4. Update the DB table to indicate that this GUID has already been used.
5.3'. If not, redirect to an error page.
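The list above can be sketched with an in-memory map standing in for the database table; a real implementation would persist the GUID, timestamp, and used flag, and the one-minute TTL and all names here are illustrative:

```java
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;

public class DownloadTokens {
    private static final long TTL_MILLIS = 60_000; // link valid for one minute
    private final Map<String, Long> issuedAt = new ConcurrentHashMap<>();

    // Steps 1-2: generate a GUID and record when it was issued
    public String issue() {
        String guid = UUID.randomUUID().toString();
        issuedAt.put(guid, System.currentTimeMillis());
        return guid;
    }

    // Steps 5.2 and 5.4: valid only once, and only within the TTL;
    // removing the entry marks the GUID as used
    public boolean redeem(String guid) {
        Long ts = issuedAt.remove(guid);
        return ts != null && System.currentTimeMillis() - ts < TTL_MILLIS;
    }
}
```

The /download servlet would call redeem(guid) and either stream the file (step 5.3) or redirect to the error page (step 5.3').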