I have a requirement to trigger a Cloud Dataflow pipeline from Cloud Functions, and the Cloud Function must be written in Java. The trigger for the Cloud Function is Google Cloud Storage's Finalize/Create event, i.e., when a file is uploaded to a GCS bucket, the Cloud Function must trigger the Dataflow pipeline.
When I create a Dataflow (batch) pipeline and execute it, it creates a Dataflow pipeline template and a Dataflow job.
But when I create a Cloud Function in Java and upload a file, the status just says "ok", and it does not trigger the Dataflow pipeline.
Cloud Function
package com.example;
import com.example.Example.GCSEvent;
import com.google.api.client.googleapis.javanet.GoogleNetHttpTransport;
import com.google.api.client.http.HttpRequestInitializer;
import com.google.api.client.http.HttpTransport;
import com.google.api.client.json.JsonFactory;
import com.google.api.client.json.jackson2.JacksonFactory;
import com.google.api.services.dataflow.Dataflow;
import com.google.api.services.dataflow.model.CreateJobFromTemplateRequest;
import com.google.api.services.dataflow.model.RuntimeEnvironment;
import com.google.auth.http.HttpCredentialsAdapter;
import com.google.auth.oauth2.GoogleCredentials;
import com.google.cloud.functions.BackgroundFunction;
import com.google.cloud.functions.Context;
import java.io.IOException;
import java.security.GeneralSecurityException;
import java.util.HashMap;
import java.util.logging.Logger;
public class Example implements BackgroundFunction<GCSEvent> {
private static final Logger logger = Logger.getLogger(Example.class.getName());
@Override
public void accept(GCSEvent event, Context context) throws IOException, GeneralSecurityException {
logger.info("Event: " + context.eventId());
logger.info("Event Type: " + context.eventType());
HttpTransport httpTransport = GoogleNetHttpTransport.newTrustedTransport();
JsonFactory jsonFactory = JacksonFactory.getDefaultInstance();
GoogleCredentials credentials = GoogleCredentials.getApplicationDefault();
HttpRequestInitializer requestInitializer = new HttpCredentialsAdapter(credentials);
Dataflow dataflowService = new Dataflow.Builder(httpTransport, jsonFactory, requestInitializer)
.setApplicationName("Google Dataflow function Demo")
.build();
String projectId = "my-project-id";
RuntimeEnvironment runtimeEnvironment = new RuntimeEnvironment();
runtimeEnvironment.setBypassTempDirValidation(false);
runtimeEnvironment.setTempLocation("gs://my-dataflow-job-bucket/tmp");
CreateJobFromTemplateRequest createJobFromTemplateRequest = new CreateJobFromTemplateRequest();
createJobFromTemplateRequest.setEnvironment(runtimeEnvironment);
createJobFromTemplateRequest.setLocation("us-central1");
createJobFromTemplateRequest.setGcsPath("gs://my-dataflow-job-bucket-staging/templates/cloud-dataflow-template");
createJobFromTemplateRequest.setJobName("Dataflow-Cloud-Job");
createJobFromTemplateRequest.setParameters(new HashMap<String,String>());
createJobFromTemplateRequest.getParameters().put("inputFile","gs://cloud-dataflow-bucket-input/*.txt");
dataflowService.projects().templates().create(projectId,createJobFromTemplateRequest);
throw new UnsupportedOperationException("Not supported yet.");
}
public static class GCSEvent {
String bucket;
String name;
String metageneration;
}
}
pom.xml (Cloud Function)
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>cloudfunctions</groupId>
<artifactId>http-function</artifactId>
<version>1.0-SNAPSHOT</version>
<properties>
<maven.compiler.target>11</maven.compiler.target>
<maven.compiler.source>11</maven.compiler.source>
</properties>
<dependencies>
<!-- https://mvnrepository.com/artifact/com.google.auth/google-auth-library-credentials -->
<dependency>
<groupId>com.google.auth</groupId>
<artifactId>google-auth-library-credentials</artifactId>
<version>0.21.1</version>
</dependency>
<dependency>
<groupId>com.google.apis</groupId>
<artifactId>google-api-services-dataflow</artifactId>
<version>v1b3-rev207-1.20.0</version>
</dependency>
<dependency>
<groupId>com.google.cloud.functions</groupId>
<artifactId>functions-framework-api</artifactId>
<version>1.0.1</version>
</dependency>
<dependency>
<groupId>com.google.auth</groupId>
<artifactId>google-auth-library-oauth2-http</artifactId>
<version>0.21.1</version>
</dependency>
</dependencies>
<!-- Required for Java 11 functions in the inline editor -->
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.8.1</version>
<configuration>
<excludes>
<exclude>.google/</exclude>
</excludes>
</configuration>
</plugin>
</plugins>
</build>
</project>
Cloud Function logs
I went through the blogs below (adding them for reference), where they trigger Dataflow from Cloud Storage via Cloud Functions. But the code is written in either Node.js or Python, whereas my Cloud Function must be written in Java.
Triggering Dataflow pipeline via cloud functions in Node.js
https://dzone.com/articles/triggering-dataflow-pipelines-with-cloud-functions
Triggering dataflow pipeline via cloud functions using python
https://medium.com/google-cloud/how-to-kick-off-a-dataflow-pipeline-via-cloud-functions-696927975d4e
Any help on this is very much appreciated.
RuntimeEnvironment runtimeEnvironment = new RuntimeEnvironment();
runtimeEnvironment.setBypassTempDirValidation(false);
runtimeEnvironment.setTempLocation("gs://karthiksfirstbucket/temp1");
LaunchTemplateParameters launchTemplateParameters = new LaunchTemplateParameters();
launchTemplateParameters.setEnvironment(runtimeEnvironment);
launchTemplateParameters.setJobName("newJob" + (new Date()).getTime());
Map<String, String> params = new HashMap<String, String>();
params.put("inputFile", "gs://karthiksfirstbucket/sample.txt");
params.put("output", "gs://karthiksfirstbucket/count1");
launchTemplateParameters.setParameters(params);
writer.write("4");
Dataflow.Projects.Templates.Launch launch = dataflowService.projects().templates().launch(projectId, launchTemplateParameters);
launch.setGcsPath("gs://dataflow-templates-us-central1/latest/Word_Count");
launch.execute();
The above code launches a template and executes the Dataflow pipeline. It:
- uses application default credentials (which can be changed to user or service account credentials);
- runs in the default region (which can be changed);
- creates a job for every HTTP trigger (the trigger can be changed).
The complete code can be found below:
https://github.com/karthikeyan1127/Java_CloudFunction_DataFlow/blob/master/Hello.java
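One detail worth flagging with the job naming above: as far as I can tell, Dataflow job names must consist only of lowercase letters, digits, and hyphens, and must start with a letter, so a name like "newJob1583..." (or "Dataflow-Cloud-Job" in the question) can be rejected by the service for its uppercase characters. A small sketch of a sanitizer (the class and helper name here are my own, not from the answer):

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.util.Locale;

public class JobNames {
    // Hypothetical helper: lowercases the input, replaces any character that
    // is not [a-z0-9] with a hyphen, and appends a timestamp so each trigger
    // gets a unique, valid Dataflow job name.
    static String sanitizeJobName(String base, LocalDateTime now) {
        String cleaned = base.toLowerCase(Locale.ROOT).replaceAll("[^a-z0-9]", "-");
        // Job names must start with a letter; prefix one if needed.
        if (cleaned.isEmpty() || !Character.isLetter(cleaned.charAt(0))) {
            cleaned = "job-" + cleaned;
        }
        return cleaned + "-" + now.format(DateTimeFormatter.ofPattern("yyyyMMddHHmmss"));
    }

    public static void main(String[] args) {
        LocalDateTime t = LocalDateTime.of(2020, 1, 2, 3, 4, 5);
        System.out.println(sanitizeJobName("newJob", t));          // newjob-20200102030405
        System.out.println(sanitizeJobName("Sample_File.txt", t)); // sample-file-txt-20200102030405
    }
}
```

Passing such a name to setJobName() avoids an invalid-job-name rejection from the service.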
This is my solution, using the new Dataflow dependencies:
public class Example implements BackgroundFunction<Example.GCSEvent> {
private static final Logger logger = Logger.getLogger(Example.class.getName());
@Override
public void accept(GCSEvent event, Context context) throws Exception {
String filename = event.name;
logger.info("Processing file: " + filename);
logger.info("Bucket name: " + event.bucket);
String projectId = "cedar-router-268801";
String region = "us-central1";
String tempLocation = "gs://cedar-router-beam-poc/temp";
String templateLocation = "gs://cedar-router-beam-poc/template/poc-template.json";
logger.info("path: " + String.format("gs://%s/%s", event.bucket, filename));
String scenario = filename.substring(0, 3); // it comes as TWO or ONE
logger.info("scenario: " + scenario);
Map<String, String> params = Map.of("sourceFile", String.format("%s/%s", event.bucket, filename),
"scenario", scenario,
"years", "2013,2014",
"targetFile", "gs://cedar-router-beam-poc-kms/result/testfile");
extractedJob(projectId, region, tempLocation, templateLocation, params);
}
private static void extractedJob(String projectId,
String region,
String tempLocation,
String templateLocation,
Map<String, String> params) throws Exception {
HttpTransport httpTransport = GoogleApacheHttpTransport.newTrustedTransport();
JsonFactory jsonFactory = GsonFactory.getDefaultInstance();
GoogleCredentials credentials = GoogleCredentials.getApplicationDefault();
HttpRequestInitializer httpRequestInitializer = new RetryHttpRequestInitializer(ImmutableList.of(404));
ChainingHttpRequestInitializer chainingHttpRequestInitializer =
new ChainingHttpRequestInitializer(new HttpCredentialsAdapter(credentials), httpRequestInitializer);
Dataflow dataflowService = new Dataflow.Builder(httpTransport, jsonFactory, chainingHttpRequestInitializer)
.setApplicationName("Dataflow from Cloud function")
.build();
FlexTemplateRuntimeEnvironment runtimeEnvironment = new FlexTemplateRuntimeEnvironment();
runtimeEnvironment.setTempLocation(tempLocation);
LaunchFlexTemplateParameter launchFlexTemplateParameter = new LaunchFlexTemplateParameter();
launchFlexTemplateParameter.setEnvironment(runtimeEnvironment);
String jobName = params.get("sourceFile").substring(34, 49).replace("_","");
logger.info("job name: " + jobName);
launchFlexTemplateParameter.setJobName("job" + LocalDateTime.now().format(DateTimeFormatter.ofPattern("yyyyMMddHHmmss")));
launchFlexTemplateParameter.setContainerSpecGcsPath(templateLocation);
launchFlexTemplateParameter.setParameters(params);
LaunchFlexTemplateRequest launchFlexTemplateRequest = new LaunchFlexTemplateRequest();
launchFlexTemplateRequest.setLaunchParameter(launchFlexTemplateParameter);
Launch launch = dataflowService.projects()
.locations()
.flexTemplates()
.launch(projectId, region, launchFlexTemplateRequest);
launch.execute();
logger.info("running job");
}
public static class GCSEvent {
String bucket;
String name;
String metageneration;
}
}
Just adapt it to your case
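A caution on the params.get("sourceFile").substring(34, 49) above: it only works for one exact path length, and will cut the wrong characters (or throw StringIndexOutOfBoundsException) as soon as the bucket or folder names change. A more defensive sketch derives the pieces from the object name itself (the class and helper names are hypothetical):

```java
public class NamePartsDemo {
    // Hypothetical helper: take the file's base name (after the last '/'),
    // drop the extension, and strip underscores, instead of relying on
    // fixed character offsets into the full gs:// path.
    static String baseNameWithoutExtension(String objectName) {
        String base = objectName.substring(objectName.lastIndexOf('/') + 1);
        int dot = base.lastIndexOf('.');
        return (dot > 0 ? base.substring(0, dot) : base).replace("_", "");
    }

    // Guarded version of filename.substring(0, 3): avoids an exception
    // on names shorter than three characters.
    static String scenarioPrefix(String filename) {
        return filename.length() >= 3 ? filename.substring(0, 3) : filename;
    }

    public static void main(String[] args) {
        System.out.println(baseNameWithoutExtension("folder/TWO_2013_data.csv")); // TWO2013data
        System.out.println(scenarioPrefix("TWO_2013_data.csv"));                  // TWO
    }
}
```

The derived base name can then feed the job-name and parameter logic regardless of how deep the object sits in the bucket.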
Related
It's a Spring Boot Java Lambda for secure SMTP mail transmission, and it does not work, while the same approach works in another Spring Boot API codebase. Whenever I run this code it gives either a SecurityException on one occasion or an AuthenticationException on another, and when I use the getDefaultInstance() method of Session instead of getInstance(), it gives "cannot create a default session".
It does not even create an object for the Authenticator, and gives an AuthenticationException on Lambda.
Code below:
import java.util.Properties;
import javax.mail.PasswordAuthentication;
import javax.mail.Session;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.env.Environment;
@Configuration
public class EmailConfig {
private String username = "";
private String password = "";
@Bean
public Session getEmailSession(Environment env) {
Properties props = System.getProperties();
props.setProperty("mail.smtp.host", "smtp.gmail.com");
props.setProperty("mail.smtp.port", "587");
props.setProperty("mail.smtp.auth", "true");
props.setProperty("mail.smtp.starttls.enable", "true");
Session session = Session.getInstance(props,
new javax.mail.Authenticator() {
protected PasswordAuthentication getPasswordAuthentication() {
return new PasswordAuthentication(username, password); // username & password coming from vault
}
});
return session;
}
}
It throws this error in Lambda:
2022-02-27T04:49:42.216+05:30
software/amazon/awssdk/regions/Region: java.lang.NoClassDefFoundError
java.lang.NoClassDefFoundError: software/amazon/awssdk/regions/Region
at aws.smtp.lambda.demo.SendMessageEmailRequest.main_meth(SendMessageEmailRequest.java:47)
at aws.smtp.lambda.demo.LambdaFunctionHandler.handleRequest(LambdaFunctionHandler.java:42)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.base/java.lang.reflect.Method.invoke(Unknown Source)
Caused by: java.lang.ClassNotFoundException: software.amazon.awssdk.regions.Region. Current classpath: file:/var/task/
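For what it's worth, a NoClassDefFoundError for a class that compiled fine usually means the jar containing it never made it into the artifact Lambda actually runs (note the "Current classpath: file:/var/task/" in the trace). Java Lambdas are normally deployed as an uber-jar with all dependencies bundled, e.g. via the maven-shade-plugin, which is commented out in the pom below. A minimal sketch of re-enabling it:

```xml
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<configuration>
<createDependencyReducedPom>false</createDependencyReducedPom>
</configuration>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
```

After mvn package, the shaded jar in target/ (not the original thin jar) is what should be uploaded to Lambda.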
Code below:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.6.4</version>
<relativePath /> <!-- lookup parent from repository -->
</parent>
<groupId>aws.smtp.lambda</groupId>
<artifactId>demo</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>demo</name>
<description>Demo project for Spring Boot Lambda smtp</description>
<properties>
<java.version>1.8</java.version>
</properties>
<dependencies>
<dependency>
<groupId>software.amazon.awssdk</groupId>
<artifactId>bundle</artifactId>
<version>2.9.3</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter</artifactId>
</dependency>
<!-- <dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>-->
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-lambda-java-core</artifactId>
<version>1.0.0</version>
</dependency>
<!-- <dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-lambda-java-tests</artifactId>
<version>1.0.0</version>
<scope>test</scope>
</dependency>-->
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-mail</artifactId>
</dependency>
<!-- https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk-ses -->
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-ses</artifactId>
<version>1.9.16</version>
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-ses</artifactId>
<version>1.11.561</version>
</dependency>
</dependencies>
<!-- <build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<configuration>
<createDependencyReducedPom>false</createDependencyReducedPom>
</configuration>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build> -->
</project>
package aws.smtp.lambda.demo;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import com.amazonaws.services.lambda.runtime.Context;
//import com.amazonaws.services.lambda.runtime.RequestHandler;
//import com.amazonaws.services.lambda.runtime.events.ScheduledEvent;
@SpringBootApplication
public class LambdaFunctionHandler /*implements RequestHandler<ScheduledEvent, String>*/ {
//private Context applicationContext;
// @Autowired
private SendMessageEmailRequest reqSES = new SendMessageEmailRequest();
/*@Autowired
private AmazonSESSample sesSample = new AmazonSESSample();*/
/*@Autowired
private SimpleEmail simpleEmail = new SimpleEmail();
*/
public LambdaFunctionHandler() {
}
/* public LambdaFunctionHandler(Context context) {
applicationContext = context;
}*/
/*public void initialize() {
applicationContext = new SpringApplicationBuilder(LambdaFunctionHandler.class).web(WebApplicationType.NONE)
.run();
}*/
public String handleRequest(/*ScheduledEvent input,*/ Context context) throws Exception {
/* try {
if(Objects.isNull(applicationContext)) {
initialize();
}*/
context.getLogger().log("Input: " );
// simpleEmail.sendMail();
// sesSample.sendMail();
System.out.println("Going Inside SES Class ");
reqSES.main_meth();
return "Hello World - " ;//+ input;
/*}
catch(Exception ex) {
ex.printStackTrace();
return "Failure";
}*/
}
public static void main(String args[]) {
}
}
package aws.smtp.lambda.demo;
import javax.mail.MessagingException;
//snippet-end:[ses.java2.sendmessage.request.import]
import org.springframework.stereotype.Service;
//snippet-start:[ses.java2.sendmessage.request.import]
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.ses.SesClient;
import software.amazon.awssdk.services.ses.model.Body;
import software.amazon.awssdk.services.ses.model.Content;
import software.amazon.awssdk.services.ses.model.Destination;
import software.amazon.awssdk.services.ses.model.Message;
import software.amazon.awssdk.services.ses.model.SendEmailRequest;
//import software.amazon.awssdk.services.ses.model.SesException;
/**
* To run this Java V2 code example, ensure that you have setup your development environment, including your credentials.
*
* For information, see this documentation topic:
*
* https://docs.aws.amazon.com/sdk-for-java/latest/developer-guide/get-started.html
*/
@Service
public class SendMessageEmailRequest {
public void main_meth(/*String[] args*/) {
System.out.println(" Inside SES Class : main_meth");
final String USAGE = "\n" +
"Usage:\n" +
" SendMessage <sender> <recipient> <subject> \n\n" +
"Where:\n" +
" sender - an email address that represents the sender. \n"+
" recipient - an email address that represents the recipient. \n"+
" subject - the subject line. \n" ;
System.out.println("Inside SES Class ");
/* if (args.length != 3) {
System.out.println(USAGE);
System.exit(1);
}*/
String sender = ""; // written correctly in the original code
String recipient = ""; // written correctly in the original code
String subject = "Amazon SES test (AWS SDK for Java)";
Region region = Region.AP_SOUTH_1; // failing in this line
/*SesClient client = SesClient.builder()
.region(region)
.build();*/
// The email body for non-HTML email clients
String bodyText = "Hello,\r\n" + "See the list of customers. ";
// The HTML body of the email
String bodyHTML = "<html>" + "<head></head>" + "<body>" + "<h1>Hello!</h1>"
+ "<p> See the list of customers.</p>" + "</body>" + "</html>";
try {
System.out.println(" Inside SES Class : main_meth : in try block");
// send(client, sender, recipient, subject, bodyText, bodyHTML);
//client.close();
System.out.println("Done");
} catch (/*Messaging*/Exception e) {
e.getStackTrace();
}
}
// snippet-start:[ses.java2.sendmessage.request.main]
/*public void send(SesClient client,
String sender,
String recipient,
String subject,
String bodyText,
String bodyHTML
) throws MessagingException {
Destination destination = Destination.builder()
.toAddresses(recipient)
.build();
Content content = Content.builder()
.data(bodyHTML)
.build();
Content sub = Content.builder()
.data(subject)
.build();
Body body = Body.builder()
.html(content)
.build();
Message msg = Message.builder()
.subject(sub)
.body(body)
.build();
SendEmailRequest emailRequest = SendEmailRequest.builder()
.destination(destination)
.message(msg)
.source(sender)
.build();
try {
System.out.println("Attempting to send an email through Amazon SES " + "using the AWS SDK for Java...");
client.sendEmail(emailRequest);
} catch (SesException e) {
System.err.println(e.awsErrorDetails().errorMessage());
System.exit(1);
}
// snippet-end:[ses.java2.sendmessage.request.main]
}*/
}
Look at using the Amazon Simple Email Service if you need email functionality from either an AWS Lambda function or even a Spring Boot app. There are no issues when doing so. You can easily send an email message using this SES Java Code:
// snippet-start:[ses.java2.sendmessage.request.import]
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.ses.SesClient;
import software.amazon.awssdk.services.ses.model.*;
import software.amazon.awssdk.services.ses.model.Message;
import software.amazon.awssdk.services.ses.model.Body;
import javax.mail.MessagingException;
// snippet-end:[ses.java2.sendmessage.request.import]
/**
* To run this Java V2 code example, ensure that you have setup your development environment, including your credentials.
*
* For information, see this documentation topic:
*
* https://docs.aws.amazon.com/sdk-for-java/latest/developer-guide/get-started.html
*/
public class SendMessageEmailRequest {
public static void main(String[] args) {
final String USAGE = "\n" +
"Usage:\n" +
" SendMessage <sender> <recipient> <subject> \n\n" +
"Where:\n" +
" sender - an email address that represents the sender. \n"+
" recipient - an email address that represents the recipient. \n"+
" subject - the subject line. \n" ;
if (args.length != 3) {
System.out.println(USAGE);
System.exit(1);
}
String sender = args[0];
String recipient = args[1];
String subject = args[2];
Region region = Region.US_EAST_1;
SesClient client = SesClient.builder()
.region(region)
.build();
// The email body for non-HTML email clients
String bodyText = "Hello,\r\n" + "See the list of customers. ";
// The HTML body of the email
String bodyHTML = "<html>" + "<head></head>" + "<body>" + "<h1>Hello!</h1>"
+ "<p> See the list of customers.</p>" + "</body>" + "</html>";
try {
send(client, sender, recipient, subject, bodyText, bodyHTML);
client.close();
System.out.println("Done");
} catch (MessagingException e) {
e.getStackTrace();
}
}
// snippet-start:[ses.java2.sendmessage.request.main]
public static void send(SesClient client,
String sender,
String recipient,
String subject,
String bodyText,
String bodyHTML
) throws MessagingException {
Destination destination = Destination.builder()
.toAddresses(recipient)
.build();
Content content = Content.builder()
.data(bodyHTML)
.build();
Content sub = Content.builder()
.data(subject)
.build();
Body body = Body.builder()
.html(content)
.build();
Message msg = Message.builder()
.subject(sub)
.body(body)
.build();
SendEmailRequest emailRequest = SendEmailRequest.builder()
.destination(destination)
.message(msg)
.source(sender)
.build();
try {
System.out.println("Attempting to send an email through Amazon SES " + "using the AWS SDK for Java...");
client.sendEmail(emailRequest);
} catch (SesException e) {
System.err.println(e.awsErrorDetails().errorMessage());
System.exit(1);
}
// snippet-end:[ses.java2.sendmessage.request.main]
}
}
I am not sure why I am receiving an undefined credentials error. I am trying to call Textract on an object in an Amazon S3 bucket on AWS. The suggested fixes did not help either. Can someone help me with this issue?
This project consists of App.java and pom.xml; it extracts the relevant text from an image uploaded to an Amazon S3 bucket and processes it into forms.
App.java
package com.textract;
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.S3Object;
import com.amazonaws.services.s3.model.S3ObjectInputStream;
import software.amazon.awssdk.core.SdkBytes;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.textract.TextractClient;
import software.amazon.awssdk.services.textract.model.*;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
public class App {
public static Map<String, String> getRelationships(Map<String, Block> blockMap, Map<String, Block> keyMap,Map<String, Block> valueMap) {
Map<String, String> result = new LinkedHashMap<>();
for(Map.Entry<String, Block> itr : keyMap.entrySet()) {
Block valueBlock = findValue(itr.getValue(), valueMap);
String key = getText(itr.getValue(), blockMap);
String value = getText(valueBlock, blockMap);
result.put(key, value);
}
return result;
}
public static Block findValue(Block keyBlock, Map<String, Block> valueMap) {
Block b = null;
for(Relationship relationship : keyBlock.relationships()) {
if(relationship.type().toString().equals("VALUE")) {
for(String id : relationship.ids()) {
b = valueMap.get(id);
}
}
}
return b;
}
public static String getText(Block result, Map<String, Block> blockMap) {
StringBuilder stringBuilder = new StringBuilder();
for(Relationship relationship : result.relationships()) {
if(relationship.type().toString().equals("CHILD")) {
for(String id : relationship.ids()) {
Block b = blockMap.get(id);
if(b.blockTypeAsString().equals("WORD")) {
stringBuilder.append(b.text()).append(" ");
}
}
}
}
return stringBuilder.toString();
}
public static void main(String[] args) {
BasicAWSCredentials creds = new BasicAWSCredentials("Access Key", "Secret Key");
AmazonS3 s3client = AmazonS3Client.builder()
.withRegion("ap-southeast-1")
.withCredentials(new AWSStaticCredentialsProvider(creds))
.build();
// AmazonS3 s3client = AmazonS3ClientBuilder.standard().build();
S3Object s3Object = s3client.getObject("bucket-name", "image.jpg");
S3ObjectInputStream s3ObjectInputStream = s3Object.getObjectContent();
SdkBytes bytes = SdkBytes.fromInputStream(s3ObjectInputStream);
Document doc = Document.builder().bytes(bytes).build();
List<FeatureType> list = new ArrayList<>();
list.add(FeatureType.FORMS);
AnalyzeDocumentRequest request = AnalyzeDocumentRequest.builder().featureTypes(list).document(doc).build();
// Error on this line with "withCredentials"
TextractClient textractClient = TextractClient.builder().region(Region.AP_SOUTHEAST_1).withCredentials(new AWSStaticCredentialsProvider(creds)).build();
AnalyzeDocumentResponse response = textractClient.analyzeDocument(request);
List<Block> blocks = response.blocks();
Map<String, Block> blockMap = new LinkedHashMap<>();
Map<String, Block> keyMap = new LinkedHashMap<>();
Map<String, Block> valueMap = new LinkedHashMap<>();
for (Block b : blocks) {
String block_id = b.id();
blockMap.put(block_id, b);
if(b.blockTypeAsString().equals("KEY_VALUE_SET")) {
for(EntityType entityType : b.entityTypes()) {
if(entityType.toString().equals("KEY")) {
keyMap.put(block_id, b);
} else {
valueMap.put(block_id, b);
}
}
}
}
System.out.println(getRelationships(blockMap, keyMap, valueMap));
textractClient.close();
}
}
pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.textract</groupId>
<artifactId>DevProblems</artifactId>
<version>1.0</version>
<name>DevProblems</name>
<url>http://www.example.com</url>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
</properties>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-bom</artifactId>
<version>1.11.795</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.11</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>software.amazon.awssdk</groupId>
<artifactId>textract</artifactId>
<version>2.15.61</version>
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-s3</artifactId>
<version>1.11.772</version>
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-textract</artifactId>
<version>1.11.959</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk</artifactId>
<version>1.11.959</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-core</artifactId>
<version>1.11.959</version>
<scope>compile</scope>
</dependency>
</dependencies>
</project>
You've got a mixture of SDK v1 and v2 classes there. The com.amazonaws classes like AWSStaticCredentialsProvider are from the version 1 SDK, and software.amazon.awssdk classes are from version 2.
Your TextractClient is from the v2 SDK, so you need an AwsCredentialsProvider from the v2 SDK, such as a SystemPropertyCredentialsProvider ...
TextractClient textractClient = TextractClient.builder()
.region(...)
.credentialsProvider(SystemPropertyCredentialsProvider.create())
Consider moving all of your Textract code to AWS SDK for Java V2. You can create an AWS application that analyzes PDF document images located in an Amazon Simple Storage Service (Amazon S3) bucket by using the Amazon Textract service. To learn how to successfully implement this use case with Textract Java V2, see:
Creating an AWS document analyzer application using the AWS SDK for Java
Also, to handle creds, you can place them in a file named credentials located in:
Windows: C:\Users\<yourUserName>\.aws\credentials
Linux, macOS, Unix: ~/.aws/credentials
More information here.
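The credentials file mentioned above is a plain INI-style file; a minimal example with placeholder values:

```ini
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
```

The v2 SDK's default credentials chain (and ProfileCredentialsProvider) should pick this up automatically when no provider is set explicitly.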
I am trying to execute a simple Java example that uses the AWS Polly service, using the code provided in the AWS documentation. I created a simple Maven project with the following:
1. group id - com.amazonaws.polly
2. artifact id - java-demo
3. version - 0.0.1-SNAPSHOT
Following is my project structure:
Following is my pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.amazonaws.polly</groupId>
<artifactId>java-demo</artifactId>
<version>0.0.1-SNAPSHOT</version>
<dependencies>
<!-- https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk-polly -->
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-polly</artifactId>
<version>1.11.77</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.googlecode.soundlibs/jlayer -->
<dependency>
<groupId>com.googlecode.soundlibs</groupId>
<artifactId>jlayer</artifactId>
<version>1.0.1-1</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>1.2.1</version>
<executions>
<execution>
<goals>
<goal>java</goal>
</goals>
</execution>
</executions>
<configuration>
<mainClass>com.amazonaws.demos.polly.PollyDemo</mainClass>
</configuration>
</plugin>
</plugins>
</build>
</project>
Following is my Java class:
package com.amazonaws.demos.polly;
import java.io.IOException;
import java.io.InputStream;
import com.amazonaws.ClientConfiguration;
import com.amazonaws.auth.DefaultAWSCredentialsProviderChain;
import com.amazonaws.regions.Region;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.polly.AmazonPollyClient;
import com.amazonaws.services.polly.model.DescribeVoicesRequest;
import com.amazonaws.services.polly.model.DescribeVoicesResult;
import com.amazonaws.services.polly.model.OutputFormat;
import com.amazonaws.services.polly.model.SynthesizeSpeechRequest;
import com.amazonaws.services.polly.model.SynthesizeSpeechResult;
import com.amazonaws.services.polly.model.Voice;
import javazoom.jl.player.advanced.AdvancedPlayer;
import javazoom.jl.player.advanced.PlaybackEvent;
import javazoom.jl.player.advanced.PlaybackListener;
public class PollyDemo {
private final AmazonPollyClient polly;
private final Voice voice;
private static final String SAMPLE = "Congratulations. You have successfully built this working demo "+
"of Amazon Polly in Java. Have fun building voice enabled apps with Amazon Polly (that's me!), and always "+
"look at the AWS website for tips and tricks on using Amazon Polly and other great services from AWS";
public PollyDemo(Region region) {
//Didn't work
//AWSCredentials credentials = new BasicAWSCredentials("someAccessKey","someSecretKey");
//polly = new AmazonPollyClient(credentials);
//Didn't work
// create an Amazon Polly client in a specific region
polly = new AmazonPollyClient(new DefaultAWSCredentialsProviderChain(),
new ClientConfiguration());
polly.setRegion(region);
// Create describe voices request.
DescribeVoicesRequest describeVoicesRequest = new DescribeVoicesRequest();
// Synchronously ask Amazon Polly to describe available TTS voices.
DescribeVoicesResult describeVoicesResult = polly.describeVoices(describeVoicesRequest);
voice = describeVoicesResult.getVoices().get(0);
}
public InputStream synthesize(String text, OutputFormat format) throws IOException {
SynthesizeSpeechRequest synthReq =
new SynthesizeSpeechRequest().withText(text).withVoiceId(voice.getId())
.withOutputFormat(format);
SynthesizeSpeechResult synthRes = polly.synthesizeSpeech(synthReq);
return synthRes.getAudioStream();
}
public static void main(String[] args) throws Exception {
//create the test class
PollyDemo helloWorld = new PollyDemo(Region.getRegion(Regions.US_EAST_1));
//get the audio stream
InputStream speechStream = helloWorld.synthesize(SAMPLE, OutputFormat.Mp3);
//create an MP3 player
AdvancedPlayer player = new AdvancedPlayer(speechStream,
javazoom.jl.player.FactoryRegistry.systemRegistry().createAudioDevice());
player.setPlayBackListener(new PlaybackListener() {
@Override
public void playbackStarted(PlaybackEvent evt) {
System.out.println("Playback started");
System.out.println(SAMPLE);
}
@Override
public void playbackFinished(PlaybackEvent evt) {
System.out.println("Playback finished");
}
});
// play it!
player.play();
}
}
I am running the code locally, so I have my AWS IAM credentials configured on my system.
My IAM user also has access to the AWS Polly service.
I am getting the following error when I run the code -
Exception in thread "main" com.amazonaws.services.polly.model.AmazonPollyException: The security token included in the request is invalid. (Service: AmazonPolly; Status Code: 403; Error Code: UnrecognizedClientException; Request ID: 4d4b01fb-8015-11e8-8e18-4548f95fba92)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleErrorResponse(AmazonHttpClient.java:1586)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1254)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1035)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:747)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:721)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:704)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:672)
at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:654)
at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:518)
at com.amazonaws.services.polly.AmazonPollyClient.doInvoke(AmazonPollyClient.java:668)
at com.amazonaws.services.polly.AmazonPollyClient.invoke(AmazonPollyClient.java:644)
at com.amazonaws.services.polly.AmazonPollyClient.describeVoices(AmazonPollyClient.java:383)
at com.amazonaws.demos.polly.PollyDemo.<init>(PollyDemo.java:39)
at com.amazonaws.demos.polly.PollyDemo.main(PollyDemo.java:54)
I am referring to the following AWS doc for the Polly Java example:
https://docs.aws.amazon.com/polly/latest/dg/examples-java.html
Can someone help fix my code? What do I need to change?
It's a 403 error. Where are you passing the AWS access and secret keys? You can try this:
/**
* Constructs a new client to invoke service methods on AmazonPolly. A
* credentials provider chain will be used that searches for credentials in
* this order:
* <ul>
* <li>Environment Variables - AWS_ACCESS_KEY_ID and AWS_SECRET_KEY</li>
* <li>Java System Properties - aws.accessKeyId and aws.secretKey</li>
* <li>Instance profile credentials delivered through the Amazon EC2
* metadata service</li>
* </ul>
* <p>
* All service calls made using this new client object are blocking, and
* will not return until the service call completes.
*
* @see DefaultAWSCredentialsProviderChain
*/
public AmazonPollyClient() {
this(new DefaultAWSCredentialsProviderChain(), new ClientConfiguration());
}
https://github.com/aws/aws-sdk-android/blob/master/aws-android-sdk-polly/src/main/java/com/amazonaws/services/polly/AmazonPollyClient.java
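To make the chain's resolution order concrete, here is a stdlib-only sketch; this only illustrates the first-non-null lookup order, not the SDK's actual implementation, and resolve is a hypothetical helper:

```java
// Illustrates the first-non-null lookup order used by the default chain.
public class CredentialsChainSketch {

    // Returns the first non-null source, in the chain's documented order.
    static String resolve(String envValue, String sysPropValue, String instanceProfileValue) {
        if (envValue != null) return envValue;          // 1. environment variables
        if (sysPropValue != null) return sysPropValue;  // 2. Java system properties
        return instanceProfileValue;                    // 3. EC2 instance profile metadata
    }

    public static void main(String[] args) {
        // With no environment value set, the chain falls through to the system property.
        System.out.println(resolve(null, "fromSystemProperty", "fromInstanceProfile"));
    }
}
```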
Replace
new DefaultAWSCredentialsProviderChain()
with
new AWSStaticCredentialsProvider(new BasicAWSCredentials("AccessKey", "SecretKey"))
(note the new keyword; you will also need to import com.amazonaws.auth.AWSStaticCredentialsProvider and com.amazonaws.auth.BasicAWSCredentials)
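Putting that together, a minimal sketch of constructing the client with static credentials might look like this (the key values are placeholders, hardcoded keys should not ship in real code, and this assumes aws-java-sdk-polly is on the classpath):

```java
import com.amazonaws.ClientConfiguration;
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.regions.Region;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.polly.AmazonPollyClient;

public class StaticCredentialsExample {
    public static void main(String[] args) {
        // Placeholder keys — substitute your own IAM user's access key and secret key.
        AWSStaticCredentialsProvider credentials =
                new AWSStaticCredentialsProvider(new BasicAWSCredentials("AccessKey", "SecretKey"));
        AmazonPollyClient polly = new AmazonPollyClient(credentials, new ClientConfiguration());
        polly.setRegion(Region.getRegion(Regions.US_EAST_1));
        // The client is now ready for describeVoices/synthesizeSpeech calls.
    }
}
```

Using the default chain (environment variables or ~/.aws/credentials) is generally preferable; static credentials are mainly useful to rule out a misconfigured credentials file, as in this question.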
I noticed there is no POM file in the GitHub repository for this service. We will fix that. Here is a POM file that works.
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>pollyJ1Project</groupId>
<artifactId>pollyJ1Project</artifactId>
<version>1.0-SNAPSHOT</version>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<java.version>1.8</java.version>
</properties>
<dependencies>
<!-- https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk-polly -->
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-polly</artifactId>
<version>1.11.774</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.awaitility/awaitility -->
<dependency>
<groupId>org.awaitility</groupId>
<artifactId>awaitility</artifactId>
<version>4.0.2</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
</plugins>
</build>
</project>
Also - here is an example you can try:
package com.amazonaws.polly.samples;
import com.amazonaws.services.polly.AmazonPolly;
import com.amazonaws.services.polly.AmazonPollyClientBuilder;
import com.amazonaws.services.polly.model.DescribeVoicesRequest;
import com.amazonaws.services.polly.model.DescribeVoicesResult;
public class DescribeVoicesSample {
public static void main(String[] args) {
AmazonPolly client = AmazonPollyClientBuilder.defaultClient();
describeVoices(client);
}
public static void describeVoices(AmazonPolly client) {
DescribeVoicesRequest allVoicesRequest = new DescribeVoicesRequest();
DescribeVoicesRequest enUsVoicesRequest = new DescribeVoicesRequest()
.withLanguageCode("en-US");
try {
String nextToken;
do {
DescribeVoicesResult allVoicesResult =
client.describeVoices(allVoicesRequest);
nextToken = allVoicesResult.getNextToken();
allVoicesRequest.setNextToken(nextToken);
System.out.println("All voices: " + allVoicesResult.getVoices());
} while (nextToken != null);
do {
DescribeVoicesResult enUsVoicesResult = client.describeVoices(enUsVoicesRequest);
nextToken = enUsVoicesResult.getNextToken();
enUsVoicesRequest.setNextToken(nextToken);
System.out.println("en-US voices: " + enUsVoicesResult.getVoices());
} while (nextToken != null);
} catch (Exception e) {
System.err.println("Exception caught: " + e);
}
}
}
The above code is V1. You can find V2 code examples here:
https://github.com/awsdocs/aws-doc-sdk-examples/tree/master/javav2/example_code/polly
I found a nice working demo here: https://youtu.be/WMMSQAn_vHI. It is a working Java example for the AWS Polly service using only new DefaultAWSCredentialsProviderChain().
Per my understanding, RxJava works within a single JVM. Is there a wrapper/library/API that supports clustered environments by combining a distributed cache, JMS, or any other queue, so that subscribers can scale across distributed environments? I'd like to check here before reinventing the wheel.
You can deploy Vert.x instances in a cluster and use RxJava on top of it. The idea is to use the EventBus as a transport layer and subscribe to messages using RxJava. It's not a pure RxJava solution.
A very simple runnable example:
package com.example;
import java.util.concurrent.TimeUnit;
import io.reactivex.Flowable;
import io.vertx.core.DeploymentOptions;
import io.vertx.core.VertxOptions;
import io.vertx.core.json.JsonObject;
import io.vertx.core.spi.cluster.ClusterManager;
import io.vertx.reactivex.core.AbstractVerticle;
import io.vertx.reactivex.core.Vertx;
import io.vertx.reactivex.core.eventbus.EventBus;
import io.vertx.spi.cluster.hazelcast.HazelcastClusterManager;
public class MainVerticle extends AbstractVerticle {
static final String CENTRAL = "CENTRAL";
@Override
public void start() throws Exception {
EventBus eventBus = vertx.eventBus();
JsonObject config = config();
String nodeID = config.getString("nodeID");
eventBus.consumer(CENTRAL).toFlowable()
.map(msg -> (JsonObject) msg.body())
.filter(msgBody -> !msgBody.getString("sender", "").equals(nodeID))
.subscribe(msgBody -> {
System.out.println(msgBody);
});
Flowable.interval(1, TimeUnit.SECONDS)
.subscribe(tick -> {
JsonObject msg = new JsonObject()
.put("sender", nodeID)
.put("msg", "Hello world");
eventBus.publish(CENTRAL, msg);
});
}
public static void main(String[] args) {
ClusterManager clusterManager = new HazelcastClusterManager();
VertxOptions options = new VertxOptions().setClusterManager(clusterManager);
Vertx.rxClusteredVertx(options)
.doOnError(throwable -> throwable.printStackTrace())
.subscribe(vertx -> {
if (vertx.isClustered()) {
System.out.println("Vertx is running clustered");
}
String nodeID = clusterManager.getNodeID();
System.out.println("Node ID : " + nodeID);
String mainVerticle = MainVerticle.class.getCanonicalName();
DeploymentOptions deploymentOptions = new DeploymentOptions();
deploymentOptions.setConfig(new JsonObject().put("nodeID", nodeID));
vertx.rxDeployVerticle(mainVerticle, deploymentOptions).subscribe();
});
}
}
Maven dependencies:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.example</groupId>
<artifactId>rxjava2-clustered</artifactId>
<version>0.42</version>
<packaging>jar</packaging>
<name>rxjava2-clustered</name>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<dependencies>
<dependency>
<groupId>io.vertx</groupId>
<artifactId>vertx-core</artifactId>
<version>3.5.0</version>
</dependency>
<dependency>
<groupId>io.vertx</groupId>
<artifactId>vertx-rx-java2</artifactId>
<version>3.5.0</version>
</dependency>
<dependency>
<groupId>io.vertx</groupId>
<artifactId>vertx-hazelcast</artifactId>
<version>3.5.0</version>
</dependency>
</dependencies>
</project>
In this example, I am using the Hazelcast ClusterManager. Implementations also exist for Infinispan, Apache Ignite, and Apache ZooKeeper. Refer to the Vert.x documentation for a full reference.
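Switching cluster managers is mostly a dependency change; for instance, for Infinispan you would swap the vertx-hazelcast dependency for something like the fragment below (the artifact version is assumed to match the Vert.x version used above) and pass an InfinispanClusterManager to VertxOptions instead:

```xml
<dependency>
<groupId>io.vertx</groupId>
<artifactId>vertx-infinispan</artifactId>
<version>3.5.0</version>
</dependency>
```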
I tried to create a simple class that creates a table and adds some columns with HBase and Google App Engine.
I have already created a project and an instance in Google Cloud platform.
I cloned this repository: https://github.com/GoogleCloudPlatform/cloud-bigtable-examples/blob/master/java/hello-world/src/main/java/com/example/cloud/bigtable/helloworld/HelloWorld.java
It works like a charm to create a table in my instance.
But when I try to create a new Maven project with the same config, it doesn't work; I can't create anything.
I get this issue:
InvocationTargetException: Could not find an appropriate constructor for com.google.cloud.bigtable.hbase1_x.BigtableConnection: com.google.common.util.concurrent.MoreExecutors.platformThreadFactory()Ljava/util/concurrent/ThreadFactory;
Here is my AudioBridgeData.java file:
package com.xxx.xxx;
import com.google.cloud.bigtable.hbase.BigtableConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;
import java.io.IOException;
/**
* A minimal application that connects to Cloud Bigtable using the native HBase API
* and performs some basic operations.
*/
public class AudioBridgeData {
private static final byte[] TABLE_NAME = Bytes.toBytes("audio-bridge");
private static final byte[] COLUMN_FAMILY_NAME = Bytes.toBytes("phone-number");
private static final byte[] COLUMN_NAME = Bytes.toBytes("number");
private static final String[] NUMBERS =
{ "+33697584976", "+19553560976", "+4879665676" };
/**
* Connects to Cloud Bigtable, runs some basic operations and prints the results.
*/
private static void doAudioBridge(String projectId, String instanceId) {
try (Connection connection = BigtableConfiguration.connect(projectId, instanceId)) {
Admin admin = connection.getAdmin();
HTableDescriptor descriptor = new HTableDescriptor(TableName.valueOf(TABLE_NAME));
descriptor.addFamily(new HColumnDescriptor(COLUMN_FAMILY_NAME));
print("Create table " + descriptor.getNameAsString());
admin.createTable(descriptor);
Table table = connection.getTable(TableName.valueOf(TABLE_NAME));
print("Write some numbers to the table");
for (int i = 0; i < NUMBERS.length; i++) {
Put put = new Put(Bytes.toBytes(i));
put.addColumn(COLUMN_FAMILY_NAME, COLUMN_NAME, Bytes.toBytes(NUMBERS[i]));
table.put(put);
}
int rowKey = 0;
Result getResult = table.get(new Get(Bytes.toBytes(rowKey)));
String number = Bytes.toString(getResult.getValue(COLUMN_FAMILY_NAME, COLUMN_NAME));
System.out.println("Get a single number by row key");
System.out.printf("\t%s = %s\n", rowKey, number);
Scan scan = new Scan();
print("Scan for all numbers:");
ResultScanner scanner = table.getScanner(scan);
for (Result row : scanner) {
byte[] valueBytes = row.getValue(COLUMN_FAMILY_NAME, COLUMN_NAME);
System.out.println('\t' + Bytes.toString(valueBytes));
}
} catch (IOException e) {
System.err.println("Exception while running HelloWorld: " + e.getMessage());
e.printStackTrace();
System.exit(1);
}
System.exit(0);
}
private static void print(String msg) {
System.out.println("Number: " + msg);
}
public static void main(String[] args) {
String projectId = requiredProperty("bigtable.projectID");
String instanceId = requiredProperty("bigtable.instanceID");
doAudioBridge(projectId, instanceId);
}
private static String requiredProperty(String prop) {
String value = System.getProperty(prop);
if (value == null) {
throw new IllegalArgumentException("Missing required system property: " + prop);
}
return value;
}
}
Here is my pom.xml file:
<?xml version="1.0" encoding="UTF-8"?>
<project>
<modelVersion>4.0.0</modelVersion>
<groupId>com.xxx</groupId>
<artifactId>xxx</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>jar</packaging>
<url>http://maven.apache.org</url>
<name>xxx</name>
<properties>
<bigtable.version>1.0.0-pre1</bigtable.version>
<hbase.version>1.1.5</hbase.version>
<maven.compiler.target>1.8</maven.compiler.target>
<maven.compiler.source>1.8</maven.compiler.source>
</properties>
<repositories>
<repository>
<id>snapshots-repo</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
<releases><enabled>false</enabled></releases>
<snapshots><enabled>true</enabled></snapshots>
</repository>
</repositories>
<dependencies>
<dependency>
<groupId>com.google.cloud.bigtable</groupId>
<artifactId>bigtable-hbase-1.x</artifactId>
<version>${bigtable.version}</version>
</dependency>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-tcnative-boringssl-static</artifactId>
<version>1.1.33.Fork26</version>
</dependency>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-client</artifactId>
<version>0.98.11-hadoop2</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>1.5.0</version>
<configuration>
<mainClass>com.xxx.xxx.AudioBridgeData</mainClass>
</configuration>
</plugin>
</plugins>
</build>
</project>
And I tried to run this command:
mvn exec:java -Dbigtable.projectID=xxx -Dbigtable.instanceID=quickstart-instance
Thanks a lot for your help! :)
I had the same error message; weirdly, it was caused by missing application credentials. If it's the same issue, you should set this environment variable:
GOOGLE_APPLICATION_CREDENTIALS
It should be set to the location of the client credentials file you (might have) downloaded after creating a service account key. This is a good page:
https://developers.google.com/identity/protocols/application-default-credentials
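For example, on Linux/macOS you could set it like this before running the code (the path below is a hypothetical placeholder; use the actual location of the JSON key file you downloaded):

```shell
# Hypothetical path — point this at your own service-account key file.
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/keys/my-service-account.json"
```

The client libraries pick this variable up automatically as part of Application Default Credentials, so no code change is needed.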