I am new to Azure-related concepts and am facing an issue connecting to Azure Key Vault.
Please find my code snippets below and let me know why I am getting the following exception:
Get Key started.../n
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Get Key failedjava.lang.RuntimeException: java.util.concurrent.ExecutionException: com.microsoft.aad.adal4j.AuthenticationException: {"error_description":"AADSTS70002: Error validating credentials. AADSTS50012: Invalid client secret is provided.\r\nTrace ID: 13f8e909-89d8-472f-a1c1-9f4bcf693700\r\nCorrelation ID: bf818c41-4092-4f7d-8292-b1275a5da62f\r\nTimestamp: 2017-10-17 07:22:12Z","error":"invalid_client"}
Exception in thread "main" java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: com.microsoft.aad.adal4j.AuthenticationException: {"error_description":"AADSTS70002: Error validating credentials. AADSTS50012: Invalid client secret is provided.\r\nTrace ID: 1234\r\nCorrelation ID: 123456\r\nTimestamp: 2017-10-17 07:22:12Z","error":"invalid_client"}
at com.google.common.util.concurrent.AbstractFuture$Sync.getValue(AbstractFuture.java:299)
at com.google.common.util.concurrent.AbstractFuture$Sync.get(AbstractFuture.java:286)
at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:116)
at Program.main(Program.java:88)
Here is the corresponding code I am using to connect to Azure Key Vault:
KeyVaultCredentials kvCred = new ClientSecretKeyVaultCredential("clientID", "client Secret");
KeyVaultClient vc = new KeyVaultClient(kvCred);
byte[] byteText = textToEncrypt.getBytes("UTF-16");
/*************************************/
// Get Key from Key Vault
System.out.println("Get Key started.../n");
start = System.currentTimeMillis();
ServiceCallback<KeyBundle> serviceCallbackgetkey = new ServiceCallback<KeyBundle>(){
@Override
public void failure(Throwable t) {
System.out.println("Get Key failed"+t.toString());
}
@Override
public void success(KeyBundle result ) {//ServiceResponse
System.out.println("Get Key Success");
JsonWebKey myKey = result.key();
keyIdentifier = myKey.kid();
System.out.println("Key ID:"+keyIdentifier);
end = System.currentTimeMillis();
formatter = new DecimalFormat("#0.00000");
System.out.print("Get Key Execution time is " + formatter.format((end - start) / 1000d) + " seconds\n");
start = 0;
end =0;
}
};
ServiceCall<KeyBundle> call = vc.getKeyAsync(keyVaultURI, "MyKey1", serviceCallbackgetkey);
System.out.println(call.get());
Note: I am using the same Client ID and Client Secret in Postman to call a different REST API, and that works fine.
I also tried executing the code from here, but I face the same issue.
Please help me identify why I am unable to connect to the vault.
I tried to reproduce your issue but could not.
Your issue probably results from your application not being authorized to call the Key Vault API.
You can refer to the code below, which works for me.
Program Class:
import java.io.UnsupportedEncodingException;
import java.net.URISyntaxException;
import java.util.concurrent.ExecutionException;
import com.microsoft.azure.keyvault.KeyVaultClient;
import com.microsoft.azure.keyvault.authentication.KeyVaultCredentials;
public class Program {
public static void main(String[] args)
throws InterruptedException, ExecutionException, URISyntaxException, UnsupportedEncodingException {
KeyVaultCredentials kvCred = new ClientSecretKeyVaultCredential("APP_ID", "APP_SECRET");
KeyVaultClient vc = new KeyVaultClient(kvCred);
String keyIdentifier = "https://jaygong.vault.azure.net/keys/jaytest/b21bae081025418c806d73affc2937e0";
System.out.println(vc.getKey(keyIdentifier));
}
}
ClientSecretKeyVaultCredential Class:
import java.net.MalformedURLException;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import com.microsoft.aad.adal4j.AuthenticationContext;
import com.microsoft.aad.adal4j.AuthenticationResult;
import com.microsoft.aad.adal4j.ClientCredential;
import com.microsoft.azure.keyvault.authentication.KeyVaultCredentials;
public class ClientSecretKeyVaultCredential extends KeyVaultCredentials {
private String applicationId;
private String applicationSecret;
public ClientSecretKeyVaultCredential(String applicationId, String applicationSecret) {
this.setApplicationId(applicationId);
this.setApplicationSecret(applicationSecret);
}
public String getApplicationId() {
return applicationId;
}
private void setApplicationId(String applicationId) {
this.applicationId = applicationId;
}
public String getApplicationSecret() {
return applicationSecret;
}
private void setApplicationSecret(String applicationSecret) {
this.applicationSecret = applicationSecret;
}
@Override
public String doAuthenticate(String authorization, String resource, String scope) {
AuthenticationResult res = null;
try {
res = GetAccessToken(authorization, resource, applicationId, applicationSecret);
} catch (InterruptedException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (ExecutionException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
return res.getAccessToken();
}
private AuthenticationResult GetAccessToken(String authorization, String resource, String clientID, String clientKey)
throws InterruptedException, ExecutionException {
AuthenticationContext ctx = null;
ExecutorService service = Executors.newFixedThreadPool(1);
try {
ctx = new AuthenticationContext(authorization, false, service);
} catch (MalformedURLException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
Future<AuthenticationResult> resp = ctx.acquireToken(resource, new ClientCredential(
clientID, clientKey), null);
AuthenticationResult res = resp.get();
return res;
}
}
Results:
Please note that you need to authorize your application to use the key or secret.
Here is the PowerShell command mentioned in the official documentation:
Set-AzureRmKeyVaultAccessPolicy -VaultName 'XXXXXXX' -ServicePrincipalName XXXXX -PermissionsToKeys decrypt,sign,get,unwrapKey
Update Answer:
I'm not sure whether your application has permission to call the Key Vault API. You can add this permission in the portal.
Hope it helps you.
Figured out the issue: the problem is with the client_secret, which contains special characters such as % when it is generated. It seems Azure Key Vault only accepts the client secret in its base64-encoded form, special characters included.
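For anyone hitting the same thing, here is a minimal sketch of percent-encoding a secret that contains special characters. This is purely illustrative (the secret value is a made-up placeholder, and adal4j should normally form-encode the secret for you when you pass it in raw); it mainly matters if you build the token request body yourself:
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class SecretEncodingCheck {
    public static void main(String[] args) throws Exception {
        // Made-up placeholder secret containing characters that break a raw form body
        String rawSecret = "Ab%12/34+xyz=";
        // Percent-encode it for use in an application/x-www-form-urlencoded body
        String encoded = URLEncoder.encode(rawSecret, StandardCharsets.UTF_8.name());
        System.out.println("client_secret=" + encoded);
    }
}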
I am currently trying to send a request to a Node.js server from a Java client that I created, but I am getting the error shown in the title. I've been doing some research on it but can't seem to figure out why it is happening. The server I created in Node.js:
var grpc = require('grpc');
const protoLoader = require('@grpc/proto-loader')
const packageDefinition = protoLoader.loadSync('AirConditioningDevice.proto')
var AirConditioningDeviceproto = grpc.loadPackageDefinition(packageDefinition);
var AirConditioningDevice = [{
device_id: 1,
name: 'Device1',
location: 'room1',
status: 'On',
new_tempature: 11
}];
var server = new grpc.Server();
server.addService(AirConditioningDeviceproto.AirConditioningDevice.Airconditioning_service.service,{
currentDetails: function(call, callback){
console.log(call.request.device_id);
for(var i =0; i <AirConditioningDevice.length; i++){
console.log(call.request.device_id);
if(AirConditioningDevice[i].device_id == call.request.device_id){
console.log(call.request.device_id);
return callback(null, AirConditioningDevice [i]);
}
console.log(call.request.device_id);
}
console.log(call.request.device_id);
callback({
code: grpc.status.NOT_FOUND,
details: 'Not found'
});
},
setTemp: function(call, callback){
for(var i =0; i <AirConditioningDevice.length; i++){
if(AirConditioningDevice[i].device_id == call.request.device_id){
AirConditioningDevice[i].new_tempature == call.request.new_tempature;
return callback(null, AirConditioningDevice[i]);
}
}
callback({
code: grpc.status.NOT_FOUND,
details: 'Not found'
});
},
setOff: function(call, callback){
for(var i =0; i <AirConditioningDevice.length; i++){
if(AirConditioningDevice[i].device_id == call.request.device_id && AirConditioningDevice[i].status == 'on'){
AirConditioningDevice[i].status == 'off';
return callback(null, AirConditioningDevice[i]);
}else{
AirConditioningDevice[i].status == 'on';
return callback(null, AirConditioningDevice[i]);
}
}
callback({
code: grpc.status.NOT_FOUND,
details: 'Not found'
});
}
});
server.bind('localhost:3000', grpc.ServerCredentials.createInsecure());
server.start();
This is the client that I have created in Java:
package com.air.grpc;
import java.util.concurrent.TimeUnit;
import java.util.logging.Level;
import java.util.logging.Logger;
import javax.swing.JFrame;
import javax.swing.JLabel;
import javax.swing.JPanel;
import com.air.grpc.Airconditioning_serviceGrpc;
import com.air.grpc.GrpcClient;
import com.air.grpc.deviceIDRequest;
import com.air.grpc.ACResponse;
import io.grpc.ManagedChannel;
import io.grpc.ManagedChannelBuilder;
import io.grpc.StatusRuntimeException;
public class GrpcClient {
private static final Logger logger = Logger.getLogger(GrpcClient.class.getName());
private final ManagedChannel channel;
private final Airconditioning_serviceGrpc.Airconditioning_serviceBlockingStub blockingStub;
private final Airconditioning_serviceGrpc.Airconditioning_serviceStub asyncStub;
public GrpcClient(String host, int port) {
this(ManagedChannelBuilder.forAddress(host, port)
// Channels are secure by default (via SSL/TLS). For the example we disable TLS to avoid
// needing certificates.
.usePlaintext()
.build());
}
GrpcClient(ManagedChannel channel) {
this.channel = channel;
blockingStub = Airconditioning_serviceGrpc.newBlockingStub(channel);
asyncStub = Airconditioning_serviceGrpc.newStub(channel);
}
public void shutdown() throws InterruptedException {
channel.shutdown().awaitTermination(5, TimeUnit.SECONDS);
}
public void currentDetails(int id) {
logger.info("Will try to get device " + id + " ...");
deviceIDRequest deviceid = deviceIDRequest.newBuilder().setDeviceId(id).build();
ACResponse response;
try {
response =blockingStub.currentDetails(deviceid);
}catch(StatusRuntimeException e) {
logger.log(Level.WARNING, "RPC failed: {0}", e.getStatus());
return;
}
logger.info("Device: " + response.getAirConditioning ());
}
public static void main(String[] args) throws Exception {
GrpcClient client = new GrpcClient("localhost", 3000);
try {
client.currentDetails(1);
}finally {
client.shutdown();
}
}
}
Right now the only one I have tested, because it's the most basic one, is currentDetails. As you can see, I have created an AirConditioningDevice object. I am trying to get its details by typing 1 (the id) into a textbox, but like I said, when I send it I get the error in the title. This is the proto file that I have created:
syntax = "proto3";
package AirConditioningDevice;
option java_package = "AircondioningDevice.proto.ac";
service Airconditioning_service{
rpc currentDetails(deviceIDRequest) returns (ACResponse) {};
rpc setTemp( TempRequest ) returns (ACResponse) {};
rpc setOff(deviceIDRequest) returns (ACResponse) {};
}
message AirConditioning{
int32 device_id =1;
string name = 2;
string location = 3;
string status = 4;
int32 new_tempature = 5;
}
message deviceIDRequest{
int32 device_id =1;
}
message TempRequest {
int32 device_id = 1;
int32 new_temp = 2;
}
message ACResponse {
AirConditioning airConditioning = 1;
}
Lastly, this is everything I get back in the console:
Apr 02, 2020 4:23:29 PM AircondioningDevice.proto.ac.AirConClient currentDetails
INFO: Will try to get device 1 ...
Apr 02, 2020 4:23:30 PM AircondioningDevice.proto.ac.AirConClient currentDetails
WARNING: RPC failed: Status{code=NOT_FOUND, description=Not found, cause=null}
I don't know whether I am completely off or if the error is small. Any suggestions? One other thing: I use the same proto file in the Java client and the Node server; I don't know if that matters. Finally, I also get this when I run my server: DeprecationWarning: grpc.load: Use the @grpc/proto-loader module with grpc.loadPackageDefinition instead. I don't know if that has anything to do with it.
In your .proto file, you declare deviceIDRequest with a field device_id, but you are checking call.request.id in the currentDetails handler. If you look at call.request.id directly, it's probably undefined.
You also aren't getting to this bit yet, but the success callback is using the books array instead of the AirConditioningDevice array.
I need to call an API: https://sandbox.api.visa.com/cybersource/payments/flex/v1/keys?apikey={apikey}
I am imitating the official X-Pay Token documentation, but the call fails with a "Token validation failed" error.
{
"responseStatus": {
"status": 401,
"code": "9159",
"severity": "ERROR",
"message": "Token validation failed",
"info": ""
}
}
Below is my x-pay-token generation code.
import java.math.BigInteger;
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.security.SignatureException;
public class T {
private static String resourcePath = "payments/flex/v1/keys";
private static String queryString = "apikey=6DC0NMXO53QQFE6NFOLE213HXA-pvG6xE-1NtuCd5oOQr-O-s";
private static String requestBody = "{encryptionType:RsaOaep256}";
private static String sharedSecret = "gAynzAGf89+V}3{Q4Jx5cp-/R#Y#PEv#1XvxnjQC";
public static void main(String[] args) throws SignatureException {
System.out.println(T.generateXpaytoken(resourcePath, queryString, requestBody, sharedSecret));
}
public static String generateXpaytoken(String resourcePath, String queryString, String requestBody, String sharedSecret) throws SignatureException {
String timestamp = timeStamp();
String beforeHash = timestamp + resourcePath + queryString + requestBody;
String hash = hmacSha256Digest(beforeHash, sharedSecret);
String token = "xv2:" + timestamp + ":" + hash;
return token;
}
private static String timeStamp() {
return String.valueOf(System.currentTimeMillis() / 1000L);
}
private static String hmacSha256Digest(String data, String sharedSecret) throws SignatureException {
return getDigest("HmacSHA256", sharedSecret, data, true);
}
private static String getDigest(String algorithm, String sharedSecret, String data, boolean toLower) throws SignatureException {
try {
Mac sha256HMAC = Mac.getInstance(algorithm);
SecretKeySpec secretKey = new SecretKeySpec(sharedSecret.getBytes(StandardCharsets.UTF_8), algorithm);
sha256HMAC.init(secretKey);
byte[] hashByte = sha256HMAC.doFinal(data.getBytes(StandardCharsets.UTF_8));
String hashString = toHex(hashByte);
return toLower ? hashString.toLowerCase() : hashString;
} catch (Exception e) {
throw new SignatureException(e);
}
}
private static String toHex(byte[] bytes) {
BigInteger bi = new BigInteger(1, bytes);
return String.format("%0" + (bytes.length << 1) + "X", bi);
}
}
Can somebody help me, please?
The URL you are using:
https://sandbox.api.visa.com/cybersource/payments/flex/v1/keys?apikey={apikey}
Should be:
https://sandbox.api.visa.com/cybersource/payments/flex/v1/keys?apikey=6DC0NMXO53QQFE6NFOLE213HXA-pvG6xE-1NtuCd5oOQr-O-s
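To make this concrete, here is a minimal sketch of sending the request with the generated token. The endpoint and body are taken from the question, the x-pay-token header name follows Visa's X-Pay-Token scheme, and the API key value is a placeholder. Note that the body passed to generateXpaytoken must match the body you actually send, byte for byte:
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class XPayTokenRequest {
    public static void main(String[] args) throws Exception {
        String apiKey = "YOUR_API_KEY";                      // placeholder: the real apikey from your project
        String requestBody = "{encryptionType:RsaOaep256}";  // must be the exact string that was hashed into the token
        String xPayToken = "xv2:...";                        // the value returned by generateXpaytoken(...)
        URL url = new URL("https://sandbox.api.visa.com/cybersource/payments/flex/v1/keys?apikey=" + apiKey);
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setRequestProperty("x-pay-token", xPayToken);
        conn.setDoOutput(true);
        try (OutputStream os = conn.getOutputStream()) {
            os.write(requestBody.getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("HTTP status: " + conn.getResponseCode());
    }
}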
As of Feb 2020, if someone still has this issue, the points below can help resolve it.
Please make sure you have generated an API Key and Secret for your sandbox project.
You can get these details in Dashboard -> Project -> Credentials -> Inbound and Authentication Keys -> API Key / Secret.
Please check the "Status" of the Key which should be Active.
If your "Credentials" tab does not have details for "Inbound and Authentication Keys" Please make sure to add the respective API then this section automatically appears.
Visa has "Visa Developer Center PlayGround" [similar to SoapUI/Postman] tool where you can easily test your API's. Unfortunately this is only supported with Windows as on Feb 2020, In future they may release the same for Mac/Linux too.
You can find this tool in Dashboard -> Project -> Assets -> Bottom of the page.
How can I send a Pub/Sub message manually (that is to say, without using PubsubIO) in Dataflow?
Importing (via Maven) google-cloud-dataflow-java-sdk-all 2.5.0 already pulls in a version of com.google.pubsub.v1 for which I was unable to find an easy way to send messages to a Pub/Sub topic (this version doesn't, for instance, allow manipulating Publisher instances, which is the approach described in the official documentation).
Would you consider using PubsubUnboundedSink? Quick example:
import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.options.ValueProvider.StaticValueProvider;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubClient;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage;
public class PubsubTest {
public static void main(String[] args) {
DataflowPipelineOptions options = PipelineOptionsFactory.fromArgs(args)
.as(DataflowPipelineOptions.class);
// writes message to "output_topic"
TopicPath topic = PubsubClient.topicPathFromName(options.getProject(), "output_topic");
Pipeline p = Pipeline.create(options);
p
.apply("input string", Create.of("This is just a message"))
.apply("convert to Pub/Sub message", ParDo.of(new DoFn<String, PubsubMessage>() {
@ProcessElement
public void processElement(ProcessContext c) {
c.output(new PubsubMessage(c.element().getBytes(), null));
}
}))
.apply("write to topic", new PubsubUnboundedSink(
PubsubJsonClient.FACTORY,
StaticValueProvider.of(topic), // topic
"timestamp", // timestamp attribute
"id", // ID attribute
5 // number of shards
));
p.run();
}
}
Here's a way I found browsing https://github.com/GoogleCloudPlatform/cloud-pubsub-samples-java/blob/master/dataflow/src/main/java/com/google/cloud/dataflow/examples/StockInjector.java:
import java.io.IOException;
import java.security.GeneralSecurityException;
import java.util.Collections;
// Supporting imports this snippet relies on (google-http-client, google-api-client and slf4j)
import com.google.api.client.googleapis.auth.oauth2.GoogleCredential;
import com.google.api.client.googleapis.javanet.GoogleNetHttpTransport;
import com.google.api.client.http.HttpBackOffIOExceptionHandler;
import com.google.api.client.http.HttpBackOffUnsuccessfulResponseHandler;
import com.google.api.client.http.HttpRequest;
import com.google.api.client.http.HttpRequestInitializer;
import com.google.api.client.http.HttpResponse;
import com.google.api.client.http.HttpTransport;
import com.google.api.client.http.HttpUnsuccessfulResponseHandler;
import com.google.api.client.json.JsonFactory;
import com.google.api.client.json.jackson2.JacksonFactory;
import com.google.api.client.util.ExponentialBackOff;
import com.google.api.client.util.Preconditions;
import com.google.api.client.util.Sleeper;
import com.google.api.services.pubsub.Pubsub;
import com.google.api.services.pubsub.model.PublishRequest;
import com.google.api.services.pubsub.model.PubsubMessage;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class PubsubManager {
private static final Logger logger = LoggerFactory.getLogger(PubsubManager.class);
private static final JsonFactory JSON_FACTORY = JacksonFactory.getDefaultInstance();
private static final Pubsub pubsub = createPubsubClient();
public static class RetryHttpInitializerWrapper implements HttpRequestInitializer {
// Intercepts the request for filling in the "Authorization"
// header field, as well as recovering from certain unsuccessful
// error codes wherein the Credential must refresh its token for a
// retry.
private final GoogleCredential wrappedCredential;
// A sleeper; you can replace it with a mock in your test.
private final Sleeper sleeper;
private RetryHttpInitializerWrapper(GoogleCredential wrappedCredential) {
this(wrappedCredential, Sleeper.DEFAULT);
}
// Use only for testing.
RetryHttpInitializerWrapper(
GoogleCredential wrappedCredential, Sleeper sleeper) {
this.wrappedCredential = Preconditions.checkNotNull(wrappedCredential);
this.sleeper = sleeper;
}
@Override
public void initialize(HttpRequest request) {
final HttpUnsuccessfulResponseHandler backoffHandler =
new HttpBackOffUnsuccessfulResponseHandler(
new ExponentialBackOff())
.setSleeper(sleeper);
request.setInterceptor(wrappedCredential);
request.setUnsuccessfulResponseHandler(
new HttpUnsuccessfulResponseHandler() {
@Override
public boolean handleResponse(HttpRequest request,
HttpResponse response,
boolean supportsRetry)
throws IOException {
if (wrappedCredential.handleResponse(request,
response,
supportsRetry)) {
// If credential decides it can handle it, the
// return code or message indicated something
// specific to authentication, and no backoff is
// desired.
return true;
} else if (backoffHandler.handleResponse(request,
response,
supportsRetry)) {
// Otherwise, we defer to the judgement of our
// internal backoff handler.
logger.info("Retrying " + request.getUrl());
return true;
} else {
return false;
}
}
});
request.setIOExceptionHandler(new HttpBackOffIOExceptionHandler(
new ExponentialBackOff()).setSleeper(sleeper));
}
}
/**
* Creates a Cloud Pub/Sub client.
*/
private static Pubsub createPubsubClient() {
try {
HttpTransport transport = GoogleNetHttpTransport.newTrustedTransport();
GoogleCredential credential = GoogleCredential.getApplicationDefault();
HttpRequestInitializer initializer =
new RetryHttpInitializerWrapper(credential);
return new Pubsub.Builder(transport, JSON_FACTORY, initializer).build();
} catch (IOException | GeneralSecurityException e) {
logger.error("Could not create Pubsub client: " + e);
}
return null;
}
/**
* Publishes the given message to a Cloud Pub/Sub topic.
*/
public static void publishMessage(String message, String outputTopic) {
int maxLogMessageLength = 200;
if (message.length() < maxLogMessageLength) {
maxLogMessageLength = message.length();
}
logger.info("Received ...." + message.substring(0, maxLogMessageLength));
// Publish message to Pubsub.
PubsubMessage pubsubMessage = new PubsubMessage();
pubsubMessage.encodeData(message.getBytes());
PublishRequest publishRequest = new PublishRequest();
publishRequest.setMessages(Collections.singletonList(pubsubMessage));
try {
pubsub.projects().topics().publish(outputTopic, publishRequest).execute();
} catch (java.io.IOException e) {
logger.error("Stuff happened in pubsub: " + e);
}
}
}
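Usage is then a single call. For example (the project and topic names below are placeholders; the REST publish call expects the fully qualified topic name):
public class PubsubManagerDemo {
    public static void main(String[] args) {
        // Publish a small JSON payload to a topic (placeholder names)
        PubsubManager.publishMessage("{\"symbol\":\"GOOG\",\"price\":123.45}",
                "projects/my-project/topics/output_topic");
    }
}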
You can send a Pub/Sub message using the PubsubIO writeMessages method.
Dataflow pipeline steps:
Pipeline p = Pipeline.create(options);
p.apply("Transformer1", ParDo.of(new Fn.method1()))
.apply("Transformer2", ParDo.of(new Fn.method2()))
.apply("PubsubMessageSend", PubsubIO.writeMessages().to(PubSubConfig.getTopic(options.getProject(), options.getpubsubTopic ())));
Define the project name and the pubsubTopic (the topic to which you want to send the Pub/Sub message) in your PipelineOptions.
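A minimal sketch of such an options interface follows (the interface name is an assumption; the getter/setter pair matches the options.getpubsubTopic() call above, and getProject() is already inherited from DataflowPipelineOptions). PubsubIO.writeMessages().to(...) expects the full "projects/<project>/topics/<topic>" string, which is presumably what PubSubConfig.getTopic builds:
import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.options.Description;

public interface MyOptions extends DataflowPipelineOptions {
    @Description("Pub/Sub topic name to publish to, e.g. output_topic")
    String getpubsubTopic();
    void setpubsubTopic(String value);
}
The options can then be created with PipelineOptionsFactory.fromArgs(args).withValidation().as(MyOptions.class).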
I'm trying to access data from the Google Analytics Reporting API using Java.
I was following the "Hello Analytics API: Java quickstart for installed applications" tutorial; I did everything it tells you to do, and I get the following issues:
com.google.api.client.util.store.FileDataStoreFactory setPermissionsToOwnerOnly
WARNING: unable to change permissions for everybody: C:\Users\<user>\.store\hello_analytics
com.google.api.client.util.store.FileDataStoreFactory setPermissionsToOwnerOnly
WARNING: unable to change permissions for owner: C:\Users\timst\.store\hello_analytics
java.lang.NullPointerException
at java.io.Reader.<init>(Reader.java:78)
at java.io.InputStreamReader.<init>(InputStreamReader.java:72)
at com.example.demo.HelloAnalytics.initializeAnalytics(HelloAnalytics.java:60)
at com.example.demo.HelloAnalytics.main(HelloAnalytics.java:44)
I tried using the full path for the client_secret.json.
I also tried different methods I found online, but none seem to work.
After getting frustrated by this error, I tried the "Hello Analytics API: Java quickstart for service accounts" tutorial.
But here I have the issue that I can't add users to the account, property, or view for the accounts I can access.
I have access to other people's Analytics accounts, and I can only remove myself from those accounts.
All the code I'm using is from the tutorials, using IntelliJ and Gradle.
tl;dr: All I want to do is access the Analytics data for all my accounts using the Reporting API, so I can put all this data in my own database and use that database for my other projects. The tutorials Google provides don't work for me. (The data is mostly Google AdWords data.)
The warning is not the problem; it's a known issue that the permission change doesn't work properly on Windows.
The java.lang.NullPointerException occurs because the profile I query has no rows of data for the given metrics, so .getRows() on the result is null because there is no row value.
You should check for the rows first:
GaData results = getResults(analytics, profileId); // the GaData returned by your query (see the full class below)
if (null != results) {
    if (results.get("rows") != null) {
        if (!results.getRows().isEmpty()) {
            // do something with the rows, e.g.:
            for (List<String> row : results.getRows()) {
                for (int i = 0; i < results.getColumnHeaders().size(); i++) {
                    List<GaData.ColumnHeaders> headers = results.getColumnHeaders();
                    System.out.println(headers.get(i).getName() + ": " + row.get(i));
                }
            }
        }
    }
}
In the example I also use the ColumnHeaders, which you should check first as well.
It was also easier to check every single account I had access to, and every web property and profile, rather than just the first value of each of those.
Also, the Query Explorer is really useful; you should use it to check which metrics and dimensions you can use.
Here is my full HelloAnalytics class. I just print everything that might be useful to the console, and I use multiple metrics and a dimension from Google AdWords in the getResults method:
import com.google.api.client.auth.oauth2.Credential;
import com.google.api.client.extensions.java6.auth.oauth2.AuthorizationCodeInstalledApp;
import com.google.api.client.extensions.jetty.auth.oauth2.LocalServerReceiver;
import com.google.api.client.googleapis.auth.oauth2.GoogleAuthorizationCodeFlow;
import com.google.api.client.googleapis.auth.oauth2.GoogleClientSecrets;
import com.google.api.client.googleapis.javanet.GoogleNetHttpTransport;
import com.google.api.client.http.javanet.NetHttpTransport;
import com.google.api.client.json.JsonFactory;
import com.google.api.client.json.gson.GsonFactory;
import com.google.api.client.util.store.FileDataStoreFactory;
import com.google.api.services.analytics.Analytics;
import com.google.api.services.analytics.AnalyticsScopes;
import com.google.api.services.analytics.model.*;
import java.io.*;
import java.util.ArrayList;
import java.util.List;
/**
* A simple example of how to access the Google Analytics API.
*/
public class HelloAnalytics {
// Path to client_secrets.json file downloaded from the Developer's Console.
// The path is relative to HelloAnalytics.java.
private static final String CLIENT_SECRET_JSON_RESOURCE = "/client_secret.json";
// The directory where the user's credentials will be stored.
private static final File DATA_STORE_DIR = new File("out/DataStore/hello_analytics");
private static final File OUTPUT_FILE = new File("out/DataStore/output.text");
private static final String APPLICATION_NAME = "Online Marketing Buddy";
private static final JsonFactory JSON_FACTORY = GsonFactory.getDefaultInstance();
private static NetHttpTransport httpTransport;
private static FileDataStoreFactory dataStoreFactory;
public static void main(String[] args) {
try {
Analytics analytics = initializeAnalytics();
getProfileIds(analytics);
} catch (Exception e) {
e.printStackTrace();
}
}
private static Analytics initializeAnalytics() throws Exception {
httpTransport = GoogleNetHttpTransport.newTrustedTransport();
dataStoreFactory = new FileDataStoreFactory(DATA_STORE_DIR);
// Load client secrets.
InputStream in =
HelloAnalytics.class.getResourceAsStream(CLIENT_SECRET_JSON_RESOURCE);
GoogleClientSecrets clientSecrets =
GoogleClientSecrets.load(JSON_FACTORY, new InputStreamReader(in));
// Set up authorization code flow for all auth scopes.
GoogleAuthorizationCodeFlow flow = new GoogleAuthorizationCodeFlow
.Builder(httpTransport, JSON_FACTORY, clientSecrets,AnalyticsScopes.all())
.setDataStoreFactory(dataStoreFactory)
.build();
// Authorize.
Credential credential = new AuthorizationCodeInstalledApp(flow, new LocalServerReceiver())
.authorize("user");
// Construct the Analytics service object.
Analytics response = new Analytics
.Builder(httpTransport, JSON_FACTORY, credential)
.setApplicationName(APPLICATION_NAME).build();
return response;
}
private static void getProfileIds(Analytics analytics) throws IOException {
// Get the all view (profile) IDs for the authorized user.
List<String> profileIds = new ArrayList<>();
// Query for the list of all accounts associated with the service account.
Accounts accounts = analytics.management().accounts().list().execute();
if (accounts.getItems().isEmpty()) {
System.err.println("No accounts found");
} else {
for (Account account : accounts.getItems()) {
System.out.println("account: " + account.getName());
String accountId = account.getId();
// Query for the list of properties associated with the each account.
Webproperties properties = analytics.management().webproperties()
.list(accountId).execute();
if (properties.getItems().isEmpty()) {
System.err.println("No properties found for accountId: " + accountId);
} else {
for (Webproperty webproperty : properties.getItems()) {
System.out.println("\nwebproperty: " + webproperty.getName());
String webpropertyId = webproperty.getId();
// Query for the list views (profiles) associated with the property.
Profiles profiles = analytics.management().profiles()
.list(accountId, webpropertyId).execute();
if (profiles.getItems().isEmpty()) {
System.err.println("No views (profiles) found for accoundId: " + accountId + "and webpropertyId: " + webpropertyId);
} else {
// Return the first (view) profile associated with the property.
for (Profile profile : profiles.getItems()) {
System.out.println("\nprofileId added for profile: " + profile.getName());
profileIds.add(profile.getId());
printResults(getResults(analytics,profile.getId()), profile.getId());
}
}
System.out.println("---------- ---------- end webproperty: " + webproperty.getName() + "---------- ----------");
}
}
System.out.println("---------- ---------- end account: " + account.getName() + "---------- ----------");
}
}
}
private static GaData getResults(Analytics analytics, String profileId) throws IOException {
// Query the Core Reporting API for the number of sessions
// in the past 30 days.
GaData data = analytics.data().ga()
.get("ga:" + profileId, "30daysAgo", "yesterday", "ga:adClicks, ga:adCost, ga:transactions, ga:transactionRevenue, ga:users, ga:sessions")
.setDimensions("ga:adwordsCampaignID")
.execute();
return data;
}
private static void printResults(GaData results, String profile) {
// Parse the response from the Core Reporting API for
// the profile name and number of sessions.
if (null != results) {
System.out.println("View (Profile: " + profile + ") Name: "
+ results.getProfileInfo().getProfileName() + "\n");
if (results.get("rows") != null && results.get("columnHeaders") != null) {
if (!results.getRows().isEmpty() && !results.getColumnHeaders().isEmpty()) {
for (List<String> row : results.getRows()) {
for (int i=0; i<results.getColumnHeaders().size();i++) {
List<GaData.ColumnHeaders> headers = results.getColumnHeaders();
System.out.println( headers.get(i).getName()+": " + row.get(i) + "\n");
}
System.out.println("---------- ---------- ----------\n");
}
} else {
System.out.println("No rows or columHeaders empty\n");
}
} else {
System.out.println("No rows or columHeaders\n");
}
}
}
}
When I run Microsoft Azure Media Services code written in Java locally, it works, but when I deploy the same code to the dev environment I am unable to access Azure and it throws java.net.HostNotFoundException.
What is the best approach to use a network proxy to connect to Azure?
Below is the code I am using in Java with the azure-java-sdk:
import java.io.*;
import java.security.NoSuchAlgorithmException;
import java.util.EnumSet;
import com.microsoft.windowsazure.Configuration;
import com.microsoft.windowsazure.exception.ServiceException;
import com.microsoft.windowsazure.services.media.MediaConfiguration;
import com.microsoft.windowsazure.services.media.MediaContract;
import com.microsoft.windowsazure.services.media.MediaService;
import com.microsoft.windowsazure.services.media.WritableBlobContainerContract;
import com.microsoft.windowsazure.services.media.models.AccessPolicy;
import com.microsoft.windowsazure.services.media.models.AccessPolicyInfo;
import com.microsoft.windowsazure.services.media.models.AccessPolicyPermission;
import com.microsoft.windowsazure.services.media.models.Asset;
import com.microsoft.windowsazure.services.media.models.AssetFile;
import com.microsoft.windowsazure.services.media.models.AssetFileInfo;
import com.microsoft.windowsazure.services.media.models.AssetInfo;
import com.microsoft.windowsazure.services.media.models.Job;
import com.microsoft.windowsazure.services.media.models.JobInfo;
import com.microsoft.windowsazure.services.media.models.JobState;
import com.microsoft.windowsazure.services.media.models.ListResult;
import com.microsoft.windowsazure.services.media.models.Locator;
import com.microsoft.windowsazure.services.media.models.LocatorInfo;
import com.microsoft.windowsazure.services.media.models.LocatorType;
import com.microsoft.windowsazure.services.media.models.MediaProcessor;
import com.microsoft.windowsazure.services.media.models.MediaProcessorInfo;
import com.microsoft.windowsazure.services.media.models.Task;
public class HelloMediaServices
{
// Media Services account credentials configuration
private static String mediaServiceUri = "https://media.windows.net/API/";
private static String oAuthUri = "https://wamsprodglobal001acs.accesscontrol.windows.net/v2/OAuth2-13";
private static String clientId = "account name";
private static String clientSecret = "account key";
private static String scope = "urn:WindowsAzureMediaServices";
private static MediaContract mediaService;
// Encoder configuration
private static String preferedEncoder = "Media Encoder Standard";
private static String encodingPreset = "H264 Multiple Bitrate 720p";
public static void main(String[] args)
{
try {
// Set up the MediaContract object to call into the Media Services account
Configuration configuration = MediaConfiguration.configureWithOAuthAuthentication(
mediaServiceUri, oAuthUri, clientId, clientSecret, scope);
mediaService = MediaService.create(configuration);
// Upload a local file to an Asset
AssetInfo uploadAsset = uploadFileAndCreateAsset("BigBuckBunny.mp4");
System.out.println("Uploaded Asset Id: " + uploadAsset.getId());
// Transform the Asset
AssetInfo encodedAsset = encode(uploadAsset);
System.out.println("Encoded Asset Id: " + encodedAsset.getId());
// Create the Streaming Origin Locator
String url = getStreamingOriginLocator(encodedAsset);
System.out.println("Origin Locator URL: " + url);
System.out.println("Sample completed!");
} catch (ServiceException se) {
System.out.println("ServiceException encountered.");
System.out.println(se.toString());
} catch (Exception e) {
System.out.println("Exception encountered.");
System.out.println(e.toString());
}
}
private static AssetInfo uploadFileAndCreateAsset(String fileName)
throws ServiceException, FileNotFoundException, NoSuchAlgorithmException {
WritableBlobContainerContract uploader;
AssetInfo resultAsset;
AccessPolicyInfo uploadAccessPolicy;
LocatorInfo uploadLocator = null;
// Create an Asset
resultAsset = mediaService.create(Asset.create().setName(fileName).setAlternateId("altId"));
System.out.println("Created Asset " + fileName);
// Create an AccessPolicy that provides Write access for 15 minutes
uploadAccessPolicy = mediaService
.create(AccessPolicy.create("uploadAccessPolicy", 15.0, EnumSet.of(AccessPolicyPermission.WRITE)));
// Create a Locator using the AccessPolicy and Asset
uploadLocator = mediaService
.create(Locator.create(uploadAccessPolicy.getId(), resultAsset.getId(), LocatorType.SAS));
// Create the Blob Writer using the Locator
uploader = mediaService.createBlobWriter(uploadLocator);
File file = new File("BigBuckBunny.mp4");
// The local file that will be uploaded to your Media Services account
InputStream input = new FileInputStream(file);
System.out.println("Uploading " + fileName);
// Upload the local file to the asset
uploader.createBlockBlob(fileName, input);
// Inform Media Services about the uploaded files
mediaService.action(AssetFile.createFileInfos(resultAsset.getId()));
System.out.println("Uploaded Asset File " + fileName);
mediaService.delete(Locator.delete(uploadLocator.getId()));
mediaService.delete(AccessPolicy.delete(uploadAccessPolicy.getId()));
return resultAsset;
}
// Create a Job that contains a Task to transform the Asset
private static AssetInfo encode(AssetInfo assetToEncode)
throws ServiceException, InterruptedException {
// Retrieve the list of Media Processors that match the name
ListResult<MediaProcessorInfo> mediaProcessors = mediaService
.list(MediaProcessor.list().set("$filter", String.format("Name eq '%s'", preferedEncoder)));
// Use the latest version of the Media Processor
MediaProcessorInfo mediaProcessor = null;
for (MediaProcessorInfo info : mediaProcessors) {
if (null == mediaProcessor || info.getVersion().compareTo(mediaProcessor.getVersion()) > 0) {
mediaProcessor = info;
}
}
System.out.println("Using Media Processor: " + mediaProcessor.getName() + " " + mediaProcessor.getVersion());
// Create a task with the specified Media Processor
String outputAssetName = String.format("%s as %s", assetToEncode.getName(), encodingPreset);
String taskXml = "<taskBody><inputAsset>JobInputAsset(0)</inputAsset>"
+ "<outputAsset assetCreationOptions=\"0\"" // AssetCreationOptions.None
+ " assetName=\"" + outputAssetName + "\">JobOutputAsset(0)</outputAsset></taskBody>";
Task.CreateBatchOperation task = Task.create(mediaProcessor.getId(), taskXml)
.setConfiguration(encodingPreset).setName("Encoding");
// Create the Job; this automatically schedules and runs it.
Job.Creator jobCreator = Job.create()
.setName(String.format("Encoding %s to %s", assetToEncode.getName(), encodingPreset))
.addInputMediaAsset(assetToEncode.getId()).setPriority(2).addTaskCreator(task);
JobInfo job = mediaService.create(jobCreator);
String jobId = job.getId();
System.out.println("Created Job with Id: " + jobId);
// Check to see if the Job has completed
checkJobStatus(jobId);
// Done with the Job
// Retrieve the output Asset
ListResult<AssetInfo> outputAssets = mediaService.list(Asset.list(job.getOutputAssetsLink()));
return outputAssets.get(0);
}
public static String getStreamingOriginLocator(AssetInfo asset) throws ServiceException {
// Get the .ISM AssetFile
ListResult<AssetFileInfo> assetFiles = mediaService.list(AssetFile.list(asset.getAssetFilesLink()));
AssetFileInfo streamingAssetFile = null;
for (AssetFileInfo file : assetFiles) {
if (file.getName().toLowerCase().endsWith(".ism")) {
streamingAssetFile = file;
break;
}
}
AccessPolicyInfo originAccessPolicy;
LocatorInfo originLocator = null;
// Create a 30-day readonly AccessPolicy
double durationInMinutes = 60 * 24 * 30;
originAccessPolicy = mediaService.create(
AccessPolicy.create("Streaming policy", durationInMinutes, EnumSet.of(AccessPolicyPermission.READ)));
// Create a Locator using the AccessPolicy and Asset
originLocator = mediaService
.create(Locator.create(originAccessPolicy.getId(), asset.getId(), LocatorType.OnDemandOrigin));
// Create a Smooth Streaming base URL
return originLocator.getPath() + streamingAssetFile.getName() + "/manifest";
}
private static void checkJobStatus(String jobId) throws InterruptedException, ServiceException {
boolean done = false;
JobState jobState = null;
while (!done) {
// Sleep for 5 seconds
Thread.sleep(5000);
// Query the updated Job state
jobState = mediaService.get(Job.get(jobId)).getState();
System.out.println("Job state: " + jobState);
if (jobState == JobState.Finished || jobState == JobState.Canceled || jobState == JobState.Error) {
done = true;
}
}
}
}
I verified that the code below works when going through the Fiddler proxy. Thanks to the "how to Capture https with fiddler, in java" post, which gave me hints:
System.setProperty("http.proxyHost", "127.0.0.1");
System.setProperty("https.proxyHost", "127.0.0.1");
System.setProperty("http.proxyPort", "8888");
System.setProperty("https.proxyPort", "8888");
System.setProperty("javax.net.ssl.trustStore", "C:\\Program Files\\Java\\jdk1.8.0_102\\bin\\FiddlerKeyStore");
System.setProperty("javax.net.ssl.trustStorePassword", "mypassword");
For others who face an issue like mine: you can connect to Azure Media Services through a network proxy by using the code below.
// Set up the MediaContract object to call into the Media Services account
Configuration configuration = MediaConfiguration.configureWithOAuthAuthentication(
mediaServiceUri, oAuthUri, clientId, clientSecret, scope);
configuration.getProperties().put(Configuration.PROPERTY_HTTP_PROXY_HOST, "Hostvalue");
configuration.getProperties().put(Configuration.PROPERTY_HTTP_PROXY_PORT, "Portvalue");
configuration.getProperties().put(Configuration.PROPERTY_HTTP_PROXY_SCHEME, "http");
MediaContract mediaService = MediaService.create(configuration);
Now use the mediaService to perform other operations.
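For example, a quick sanity check (a minimal sketch using the same SDK types as the sample above) that the proxied client can reach the service:
// List the assets in the Media Services account through the proxied client
ListResult<AssetInfo> assets = mediaService.list(Asset.list());
for (AssetInfo assetInfo : assets) {
    System.out.println("Asset: " + assetInfo.getName());
}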