I'm trying to change the base URL of the AWS SDK for Java.
It is possible with the SDK for PHP like this:
require 'vendor/autoload.php';
use Aws\DynamoDb\DynamoDbClient;
// Create a client that contacts a completely customized base URL
$client = DynamoDbClient::factory(array(
'endpoint' => 'http://my-custom-url',
'region' => 'my-region-1',
'credentials' => array(
'key' => 'abc',
'secret' => '123'
)
));
It's also possible to set this up for s3cmd in .s3conf:
host_base = s3.mylocalaws.com
host_bucket = %(bucket)s.s3.mylocalaws.com
I can't figure out how this works for the Java SDK.
I've tried the following, but the resulting URL is https://bucketName.s3.mylocalaws.com rather than https://s3.mylocalaws.com/bucketName/key as I expected:
AmazonS3 s3Client = new AmazonS3Client(new AWSTestCredentialsImpl());
s3Client.setEndpoint("https://s3.mylocalaws.com");
S3Object resource = s3Client.getObject(
new GetObjectRequest(bucketName, key));
Look at Choosing a Specific Endpoint in the documentation.
AmazonEC2 ec2 = new AmazonEC2Client(myCredentials);
ec2.setEndpoint("https://ec2.eu-west-1.amazonaws.com");
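As for getting https://s3.mylocalaws.com/bucketName/key instead of the virtual-hosted-style URL, the v1 S3 client can also be switched to path-style access. A minimal sketch, assuming the classic (v1) SDK and reusing the classes from your snippet (S3ClientOptions is in com.amazonaws.services.s3):
AmazonS3Client s3Client = new AmazonS3Client(new AWSTestCredentialsImpl());
s3Client.setEndpoint("https://s3.mylocalaws.com");
// Keep the bucket name in the path instead of the hostname
s3Client.setS3ClientOptions(new S3ClientOptions().withPathStyleAccess(true));
S3Object resource = s3Client.getObject(new GetObjectRequest(bucketName, key));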
Related
I wrote the controller below to generate pre-signed S3 upload links.
case class S3Controller(private val s3Config: S3Config, private val awsConfig: AwsConfig) {
val URL_TIMEOUT_IN_MILLIS: Long = 60 * 5 * 1000
def getPreSignedURLForUpload(keyName: String): String = {
val preSigner: S3Presigner = DefaultS3Presigner.builder()
.serviceConfiguration(s3Config.s3Configuration)
.credentialsProvider(awsConfig.awsCredentialsProvider).build()
val objectRequest: PutObjectRequest = PutObjectRequest.builder()
.bucket(s3Config.bucketName)
.key(keyName)
.contentType("text/plain")
.build()
val preSignRequest: PutObjectPresignRequest = PutObjectPresignRequest.builder()
.signatureDuration(Duration.ofMinutes(10))
.putObjectRequest(objectRequest)
.build()
val preSignedRequest: PresignedPutObjectRequest = preSigner.presignPutObject(preSignRequest)
val myURL: String = preSignedRequest.url().toString
myURL
}
}
These are the config objects I used:
case class S3Config (
bucketName: String,
s3Configuration: S3Configuration
)
case class AwsConfig (
awsCredentialsProvider: AwsCredentialsProvider
)
I tried to test it with the following code:
test("S3 Controller"){
val s3Configuration: S3Configuration = S3Configuration.builder()
.pathStyleAccessEnabled(true).build()
val s3Config: S3Config = S3Config(
bucketName = "ccc",
s3Configuration = s3Configuration
)
val awsCredentials: AwsCredentials = AwsBasicCredentials.create("aaa", "bbb")
val awsCredentialsProvider: AwsCredentialsProvider = AwsCredentialsProviderChain.of(StaticCredentialsProvider.create(awsCredentials))
val awsConfig: AwsConfig = AwsConfig(awsCredentialsProvider = awsCredentialsProvider)
val s3Controller: S3Controller = S3Controller(s3Config, awsConfig)
s3Controller.getPreSignedURLForUpload("ab")
}
This test fails with the message:
software.amazon.awssdk.core.exception.SdkClientException: Unable to load region from any of the providers in the chain software.amazon.awssdk.regions.providers.DefaultAwsRegionProviderChain@4e5ed836: [software.amazon.awssdk.regions.providers.SystemSettingsRegionProvider@5f8edcc5: Unable to load region from system settings. Region must be specified either via environment variable (AWS_REGION) or system property (aws.region)., software.amazon.awssdk.regions.providers.AwsProfileRegionProvider@60015ef5: No region provided in profile: default, software.amazon.awssdk.regions.providers.InstanceProfileRegionProvider@2ab4bc72: Unable to contact EC2 metadata service.]
I understand this is happening since I've not configured the Region anywhere.
All the ways to configure the Region that I've found go through environment variables or config files.
Is there a way to programmatically configure the Region?
You set a Region when you create a service object (or presigner) with its builder():
Region region = Region.US_EAST_1;
S3Presigner presigner = S3Presigner.builder()
.region(region)
.build();
All the Java V2 code examples show this way of setting a Region:
https://github.com/awsdocs/aws-doc-sdk-examples/tree/master/javav2/example_code/s3/src/main/java/com/example/s3
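Applied to the presigner from the question, that means passing the Region into the builder alongside the service configuration and credentials. A sketch in Java, reusing the placeholder values from your test ("aaa"/"bbb", bucket "ccc", path-style access); the region value is an assumption, and translating this into the Scala controller is mechanical:
import java.time.Duration;
import software.amazon.awssdk.auth.credentials.AwsBasicCredentials;
import software.amazon.awssdk.auth.credentials.StaticCredentialsProvider;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Configuration;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;
import software.amazon.awssdk.services.s3.presigner.S3Presigner;
import software.amazon.awssdk.services.s3.presigner.model.PutObjectPresignRequest;

public class PresignExample {
    public static void main(String[] args) {
        S3Presigner presigner = S3Presigner.builder()
                .region(Region.US_EAST_1) // the Region is set programmatically here
                .serviceConfiguration(S3Configuration.builder()
                        .pathStyleAccessEnabled(true)
                        .build())
                .credentialsProvider(StaticCredentialsProvider.create(
                        AwsBasicCredentials.create("aaa", "bbb")))
                .build();

        PutObjectRequest objectRequest = PutObjectRequest.builder()
                .bucket("ccc")
                .key("ab")
                .contentType("text/plain")
                .build();

        PutObjectPresignRequest presignRequest = PutObjectPresignRequest.builder()
                .signatureDuration(Duration.ofMinutes(10))
                .putObjectRequest(objectRequest)
                .build();

        // Print the pre-signed upload URL
        System.out.println(presigner.presignPutObject(presignRequest).url());
    }
}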
If you are sending an object to a webmethod to use it as a parameter:
Example: sending an object that will be represented as a class on the server, like this:
$creditCard = new stdClass();
$creditCard->number="705";
$creditCard->expiryDate="03/10/2019";
$creditCard->controlNumber=9;
$param = array("creditCard" => $creditCard);
$Response = $client->__soapCall('valider', array($param));
None of the properties of the server-side CreditCard class will be populated, and every call such as creditCard.getNumber() (or any other accessor) will throw a null pointer error, because the object is never really created.
To fix this problem you have to modify the server web service, not the client: add the trivial JAX-WS annotation @WebParam:
public String valider(@WebParam(name = "creditCard") CreditCard creditCard) { ... }
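For context, a minimal sketch of what the annotated endpoint might look like (the class name and method body here are hypothetical; only the @WebParam placement matters):
import javax.jws.WebMethod;
import javax.jws.WebParam;
import javax.jws.WebService;

@WebService
public class CreditCardService {

    @WebMethod
    public String valider(@WebParam(name = "creditCard") CreditCard creditCard) {
        // With the explicit parameter name, the PHP stdClass is unmarshalled
        // into CreditCard instead of arriving as null.
        return creditCard.getNumber() != null ? "valid" : "invalid";
    }
}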
And the PHP SoapClient code stays very simple (if you want more security, encode your credentials or use another authentication mechanism such as digest authentication or one of the JAX-WS security implementations):
try {
$opts = array (
'http' => [
'header' => "username: myrealname\r\n".
'password: pass1myrealpassword']
);
$par= array(
'trace' => 1,
'exceptions' =>0,
'soap_version' => SOAP_1_1,
'connection_timeout' => 1800,
'stream_context' => stream_context_create($opts),
);
$client = new SoapClient($wsdl,$par);
    //...... and call your service method as before, e.g.
    $response = $client->__soapCall('valider', array($param));
} catch (SoapFault $fault) {
    echo $fault->getMessage();
}
I am attempting to implement a relatively simple ETL pipeline that iterates through files in a Google Cloud Storage bucket. The bucket has two folders: /input and /output.
What I'm trying to do is write a Java/Scala script to iterate through files in /input, and have the transformation applied to those that are not present in /output or those that have a timestamp later than that in /output. I've been looking through the Java API doc for a function I can leverage (as opposed to just calling gsutil ls ...), but haven't had any luck so far. Any recommendations on where to look in the doc?
Edit: there is a better way to do this than building raw requests against the JSON API (the old method below):
public Page<Blob> listBlobs() {
    // 'bucket' is a com.google.cloud.storage.Bucket obtained from the Storage client
    Page<Blob> blobs = bucket.list();
    for (Blob blob : blobs.iterateAll()) {
        // do something with the blob
    }
    return blobs;
}
Old method:
def getBucketFolderContents(bucketName: String) = {
  val credential = getCredential
  val httpTransport = GoogleNetHttpTransport.newTrustedTransport()
  val requestFactory = httpTransport.createRequestFactory(credential)
  // List objects under the raw/ "folder" by passing it as a prefix to the JSON API
  val uri = "https://www.googleapis.com/storage/v1/b/" +
    URLEncoder.encode(bucketName, "UTF-8") + "/o?prefix=raw%2F"
  val url = new GenericUrl(uri)
  val request = requestFactory.buildGetRequest(url)
  val response = request.execute()
  response
}
You can list objects under a folder by setting the prefix string on the object listing API: https://cloud.google.com/storage/docs/json_api/v1/objects/list
The results of listing are sorted, so you should be able to list both folders and then walk through both in order and generate the diff list.
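Putting the two ideas together, here is a sketch of the diff walk with the google-cloud-storage client; the bucket name and the input/ and output/ prefixes are assumptions taken from the question, and object update times stand in for your "timestamp" comparison:
import com.google.cloud.storage.Blob;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class BucketDiff {
    public static List<String> filesNeedingTransform(String bucketName) {
        Storage storage = StorageOptions.getDefaultInstance().getService();

        // Index output objects by their name relative to the output/ prefix.
        Map<String, Long> outputTimestamps = new HashMap<>();
        for (Blob blob : storage.list(bucketName, Storage.BlobListOption.prefix("output/")).iterateAll()) {
            outputTimestamps.put(blob.getName().substring("output/".length()), blob.getUpdateTime());
        }

        // Keep input objects that are missing from output/ or newer than their output counterpart.
        List<String> toProcess = new ArrayList<>();
        for (Blob blob : storage.list(bucketName, Storage.BlobListOption.prefix("input/")).iterateAll()) {
            String relative = blob.getName().substring("input/".length());
            Long outputTime = outputTimestamps.get(relative);
            if (outputTime == null || blob.getUpdateTime() > outputTime) {
                toProcess.add(blob.getName());
            }
        }
        return toProcess;
    }
}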
I am trying to establish a connection to AWS to perform basic operations on an S3 bucket. Following is the code:
def list(){
AWSCredentials credentials = new BasicAWSCredentials("Access key", "Secret Key");
AmazonS3 s3client = new AmazonS3Client(credentials);
String bucketName = "sample-bucket-from-java-code";
System.out.println("Listing all buckets : ");
for (Bucket bucket : s3client.listBuckets()) {
System.out.println(" - " + bucket.getName());
}
}
This gives me the error:
request- Received error response: com.amazonaws.services.s3.model.AmazonS3Exception: The request signature we calculated does not match the signature you provided. Check your key and signing method.
I have also double-checked the access key and secret key that I am using, but I cannot figure out the issue.
It's generally better to use a Cognito identity (accountId/identityPoolId) rather than a raw access key and secret key, because a Cognito identity can be restricted to limited AWS permissions, and those permissions can be changed at any time.
// Initialize the Amazon Cognito credentials provider.
AWSCognitoCredentialsProvider* credentialsProvider = [AWSCognitoCredentialsProvider
    credentialsWithRegionType:AWSRegionUSEast1
    accountId:@"xxxxxxxxxxx"
    identityPoolId:@"xxxxxxxxxxx"
    unauthRoleArn:@"arn:aws:iam::xxxxxxxxxxx"
    authRoleArn:nil];
AWSServiceConfiguration* configuration = [AWSServiceConfiguration configurationWithRegion:AWSRegionUSWest2
    credentialsProvider:credentialsProvider];
[AWSServiceManager defaultServiceManager].defaultServiceConfiguration = configuration;
You can also find running code from my blog.
https://tauheeda.wordpress.com/2015/10/15/use-aws-service-to-downloadupload-files-in-your-applications/
Do not forget to add your credential info:
accountId:@"xxxxxxxx"
identityPoolId:@"xxxxxxxx-xxxxxxxx"
unauthRoleArn:@"xxxxxxxx-xxxxxxxx-xxxxxxxx-xxxxxxxx"
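If you do stay with static keys in the plain Java SDK, double-check that the copied secret key has no stray whitespace or lost characters (a classic cause of this signature error). For comparison, a sketch of the same listing code using the builder-based client with an explicit region; the region and key values are placeholders:
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.Bucket;

public class ListBuckets {
    public static void main(String[] args) {
        // Build the client with explicit credentials and region
        AmazonS3 s3 = AmazonS3ClientBuilder.standard()
                .withRegion(Regions.US_EAST_1)
                .withCredentials(new AWSStaticCredentialsProvider(
                        new BasicAWSCredentials("Access key", "Secret Key")))
                .build();
        System.out.println("Listing all buckets:");
        for (Bucket bucket : s3.listBuckets()) {
            System.out.println(" - " + bucket.getName());
        }
    }
}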
I'm trying to get all the associated products for a configurable product using the Magento SOAP API v2. The catalogProductLink call looks close, but doesn't handle configurable types. I don't see any other calls that contain the associated product and configurable type information for a product. How have others solved this issue?
I'm using Magento version 1.6 and the SOAP API V2 with Java.
I looked deeper into this solution and realized that you may need to override the API model (Mage_Catalog_Model_Product_Api) to achieve the results you're looking for.
In the items function (around line 90), you can do something like this:
foreach ($collection as $product) {
$childProductIds = Mage::getModel('catalog/product_type_configurable')->getChildrenIds($product->getId());
$result[] = array(
'product_id' => $product->getId(),
'sku' => $product->getSku(),
'name' => $product->getName(),
'set' => $product->getAttributeSetId(),
'type' => $product->getTypeId(),
'category_ids' => $product->getCategoryIds(),
'website_ids' => $product->getWebsiteIds(),
'children' => $childProductIds[0],
);
}
Create the folder app/code/local/Mage/Catalog/Model/Product/Link.
Copy app/code/core/Mage/Catalog/Model/Product/Link/Api.php into this folder and edit it. (I've only done the change for V1, but I'm pretty sure it's just as easy in V2.)
Replace the content of the items() function with the following:
if($type == "associated"){
$product = $this->_initProduct($productId);
try
{
$result = Mage::getModel('catalog/product_type_configurable')->getUsedProducts(null,$product);
}
catch (Exception $e)
{
$this->_fault('data_invalid', Mage::helper('catalog')->__('The product is not configurable.'));
}
}
else
{
$typeId = $this->_getTypeId($type);
$product = $this->_initProduct($productId, $identifierType);
$link = $product->getLinkInstance()
->setLinkTypeId($typeId);
$collection = $this->_initCollection($link, $product);
$result = array();
foreach ($collection as $linkedProduct) {
$row = array(
'product_id' => $linkedProduct->getId(),
'type' => $linkedProduct->getTypeId(),
'set' => $linkedProduct->getAttributeSetId(),
'sku' => $linkedProduct->getSku()
);
foreach ($link->getAttributes() as $attribute) {
$row[$attribute['code']] = $linkedProduct->getData($attribute['code']);
}
$result[] = $row;
}
}
return $result;
Then you can call the API like this:
$client->call($sessionId, 'product_link.list', array('associated', $id_of_your_configurable_product));
Basically the code checks the type provided, and if it's "associated" it returns the child products.
I’m pretty sure there’s better way of doing it but I thought that the Product Link API was the most relevant place to do it.
Enjoy!
(Please note: this code is not mine; I just adapted it and thought it would be a nice idea to share it here to help you out.)