I am trying to execute a simple Java example that uses the AWS Polly service. I am using the code provided by AWS in their documentation. I created a simple Maven project using the following -
1. group id - com.amazonaws.polly
2. artifact id - java-demo
3. version - 0.0.1-SNAPSHOT
Following is my project structure -
Following is my pom.xml -
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.amazonaws.polly</groupId>
<artifactId>java-demo</artifactId>
<version>0.0.1-SNAPSHOT</version>
<dependencies>
<!-- https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk-polly -->
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-polly</artifactId>
<version>1.11.77</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.googlecode.soundlibs/jlayer -->
<dependency>
<groupId>com.googlecode.soundlibs</groupId>
<artifactId>jlayer</artifactId>
<version>1.0.1-1</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>1.2.1</version>
<executions>
<execution>
<goals>
<goal>java</goal>
</goals>
</execution>
</executions>
<configuration>
<mainClass>com.amazonaws.demos.polly.PollyDemo</mainClass>
</configuration>
</plugin>
</plugins>
</build>
</project>
Following is my java class -
package com.amazonaws.demos.polly;
import java.io.IOException;
import java.io.InputStream;
import com.amazonaws.ClientConfiguration;
import com.amazonaws.auth.DefaultAWSCredentialsProviderChain;
import com.amazonaws.regions.Region;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.polly.AmazonPollyClient;
import com.amazonaws.services.polly.model.DescribeVoicesRequest;
import com.amazonaws.services.polly.model.DescribeVoicesResult;
import com.amazonaws.services.polly.model.OutputFormat;
import com.amazonaws.services.polly.model.SynthesizeSpeechRequest;
import com.amazonaws.services.polly.model.SynthesizeSpeechResult;
import com.amazonaws.services.polly.model.Voice;
import javazoom.jl.player.advanced.AdvancedPlayer;
import javazoom.jl.player.advanced.PlaybackEvent;
import javazoom.jl.player.advanced.PlaybackListener;
public class PollyDemo {
private final AmazonPollyClient polly;
private final Voice voice;
private static final String SAMPLE = "Congratulations. You have successfully built this working demo "+
"of Amazon Polly in Java. Have fun building voice enabled apps with Amazon Polly (that's me!), and always"+
"look at the AWS website for tips and tricks on using Amazon Polly and other great services from AWS";
public PollyDemo(Region region) {
//Didn't work
//AWSCredentials credentials = new BasicAWSCredentials("someAccessKey","someSecretKey");
//polly = new AmazonPollyClient(credentials);
//Didn't work
// create an Amazon Polly client in a specific region
polly = new AmazonPollyClient(new DefaultAWSCredentialsProviderChain(),
new ClientConfiguration());
polly.setRegion(region);
// Create describe voices request.
DescribeVoicesRequest describeVoicesRequest = new DescribeVoicesRequest();
// Synchronously ask Amazon Polly to describe available TTS voices.
DescribeVoicesResult describeVoicesResult = polly.describeVoices(describeVoicesRequest);
voice = describeVoicesResult.getVoices().get(0);
}
public InputStream synthesize(String text, OutputFormat format) throws IOException {
SynthesizeSpeechRequest synthReq =
new SynthesizeSpeechRequest().withText(text).withVoiceId(voice.getId())
.withOutputFormat(format);
SynthesizeSpeechResult synthRes = polly.synthesizeSpeech(synthReq);
return synthRes.getAudioStream();
}
public static void main(String args[]) throws Exception {
//create the test class
PollyDemo helloWorld = new PollyDemo(Region.getRegion(Regions.US_EAST_1));
//get the audio stream
InputStream speechStream = helloWorld.synthesize(SAMPLE, OutputFormat.Mp3);
//create an MP3 player
AdvancedPlayer player = new AdvancedPlayer(speechStream,
javazoom.jl.player.FactoryRegistry.systemRegistry().createAudioDevice());
player.setPlayBackListener(new PlaybackListener() {
@Override
public void playbackStarted(PlaybackEvent evt) {
System.out.println("Playback started");
System.out.println(SAMPLE);
}
@Override
public void playbackFinished(PlaybackEvent evt) {
System.out.println("Playback finished");
}
});
// play it!
player.play();
}
}
I am running the code locally, therefore I have my AWS IAM credentials configured on my system.
My IAM user also has access to the AWS Polly service.
I am getting the following error when I run the code -
Exception in thread "main" com.amazonaws.services.polly.model.AmazonPollyException: The security token included in the request is invalid. (Service: AmazonPolly; Status Code: 403; Error Code: UnrecognizedClientException; Request ID: 4d4b01fb-8015-11e8-8e18-4548f95fba92)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleErrorResponse(AmazonHttpClient.java:1586)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1254)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1035)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:747)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:721)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:704)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:672)
at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:654)
at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:518)
at com.amazonaws.services.polly.AmazonPollyClient.doInvoke(AmazonPollyClient.java:668)
at com.amazonaws.services.polly.AmazonPollyClient.invoke(AmazonPollyClient.java:644)
at com.amazonaws.services.polly.AmazonPollyClient.describeVoices(AmazonPollyClient.java:383)
at com.amazonaws.demos.polly.PollyDemo.<init>(PollyDemo.java:39)
at com.amazonaws.demos.polly.PollyDemo.main(PollyDemo.java:54)
I am referring to the following AWS doc for the Polly Java example -
https://docs.aws.amazon.com/polly/latest/dg/examples-java.html
Can someone help fix my code? What do I change in my code?
It's a 403 error. Where are you passing the AWS access key and secret key? You can try this:
/**
* Constructs a new client to invoke service methods on AmazonPolly. A
* credentials provider chain will be used that searches for credentials in
* this order:
* <ul>
* <li>Environment Variables - AWS_ACCESS_KEY_ID and AWS_SECRET_KEY</li>
* <li>Java System Properties - aws.accessKeyId and aws.secretKey</li>
* <li>Instance profile credentials delivered through the Amazon EC2
* metadata service</li>
* </ul>
* <p>
* All service calls made using this new client object are blocking, and
* will not return until the service call completes.
*
* @see DefaultAWSCredentialsProviderChain
*/
public AmazonPollyClient() {
this(new DefaultAWSCredentialsProviderChain(), new ClientConfiguration());
}
https://github.com/aws/aws-sdk-android/blob/master/aws-android-sdk-polly/src/main/java/com/amazonaws/services/polly/AmazonPollyClient.java
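For example, assuming you keep the default provider chain, you could supply the keys through the Java system properties mentioned above before the client is constructed (a quick local test only; the values below are placeholders):
// Quick local test (placeholder values): the default chain picks these up
// through its system-properties credentials provider.
System.setProperty("aws.accessKeyId", "YOUR_ACCESS_KEY_ID");
System.setProperty("aws.secretKey", "YOUR_SECRET_KEY");
AmazonPollyClient polly = new AmazonPollyClient(
        new DefaultAWSCredentialsProviderChain(), new ClientConfiguration());
polly.setRegion(Region.getRegion(Regions.US_EAST_1));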
Alternatively, replace
new DefaultAWSCredentialsProviderChain()
with
new AWSStaticCredentialsProvider(new BasicAWSCredentials("AccessKey", "SecretKey"))
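Applied to the PollyDemo constructor above, that change might look roughly like this (a sketch; "AccessKey" and "SecretKey" are placeholders, and hard-coding credentials is only sensible for local testing):
// Sketch: build the Polly client with static credentials instead of the default chain.
// Requires imports for com.amazonaws.auth.AWSCredentialsProvider,
// com.amazonaws.auth.AWSStaticCredentialsProvider and com.amazonaws.auth.BasicAWSCredentials.
AWSCredentialsProvider credentialsProvider =
        new AWSStaticCredentialsProvider(new BasicAWSCredentials("AccessKey", "SecretKey"));
polly = new AmazonPollyClient(credentialsProvider, new ClientConfiguration());
polly.setRegion(region);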
I noticed there is no POM file in the Github repository for this service. We will fix that. Here is a POM file that works.
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>pollyJ1Project</groupId>
<artifactId>pollyJ1Project</artifactId>
<version>1.0-SNAPSHOT</version>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<java.version>1.8</java.version>
</properties>
<dependencies>
<!-- https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk-polly -->
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-polly</artifactId>
<version>1.11.774</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.awaitility/awaitility -->
<dependency>
<groupId>org.awaitility</groupId>
<artifactId>awaitility</artifactId>
<version>4.0.2</version>
<scope>compile</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
</plugins>
</build>
</project>
Also - here is an example you can try:
package com.amazonaws.polly.samples;
import com.amazonaws.services.polly.AmazonPolly;
import com.amazonaws.services.polly.AmazonPollyClientBuilder;
import com.amazonaws.services.polly.model.DescribeVoicesRequest;
import com.amazonaws.services.polly.model.DescribeVoicesResult;
public class DescribeVoicesSample {
public static void main(String[] args) {
AmazonPolly client = AmazonPollyClientBuilder.defaultClient();
describeVoices(client);
}
public static void describeVoices(AmazonPolly client ) {
DescribeVoicesRequest allVoicesRequest = new DescribeVoicesRequest();
DescribeVoicesRequest enUsVoicesRequest = new DescribeVoicesRequest()
.withLanguageCode("en-US");
try {
String nextToken;
do {
DescribeVoicesResult allVoicesResult =
client.describeVoices(allVoicesRequest);
nextToken = allVoicesResult.getNextToken();
allVoicesRequest.setNextToken(nextToken);
System.out.println("All voices: " + allVoicesResult.getVoices());
} while (nextToken != null);
do {
DescribeVoicesResult enUsVoicesResult = client.describeVoices(enUsVoicesRequest);
nextToken = enUsVoicesResult.getNextToken();
enUsVoicesRequest.setNextToken(nextToken);
System.out.println("en-US voices: " + enUsVoicesResult.getVoices());
} while (nextToken != null);
} catch (Exception e) {
System.err.println("Exception caught: " + e);
}
}
}
The above code is V1. You can find V2 code examples here:
https://github.com/awsdocs/aws-doc-sdk-examples/tree/master/javav2/example_code/polly
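For comparison, a minimal DescribeVoices call with the V2 SDK (the software.amazon.awssdk:polly artifact) might look roughly like the sketch below; the class and method names follow the standard V2 builder style:
// Sketch of the same DescribeVoices call using the AWS SDK for Java 2.x.
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.polly.PollyClient;
import software.amazon.awssdk.services.polly.model.DescribeVoicesRequest;
import software.amazon.awssdk.services.polly.model.DescribeVoicesResponse;
public class DescribeVoicesSampleV2 {
    public static void main(String[] args) {
        // Credentials and region are resolved from the default provider chain / builder.
        try (PollyClient polly = PollyClient.builder().region(Region.US_EAST_1).build()) {
            DescribeVoicesRequest request = DescribeVoicesRequest.builder()
                    .languageCode("en-US")
                    .build();
            DescribeVoicesResponse response = polly.describeVoices(request);
            response.voices().forEach(voice -> System.out.println(voice.name()));
        }
    }
}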
I found a nice working demo here: https://youtu.be/WMMSQAn_vHI. It is a working Java example for the AWS Polly service that uses only new DefaultAWSCredentialsProviderChain().
Related
As an RPGLE programmer I was asked to convert an IMAP mail poller to EWS, so I'm a bit out of my comfort zone.
With all the documentation I managed to cobble a lot of pieces together, but it looks like I'm missing a vital piece, because the program keeps crashing at Folder folder = service.bindToFolder(folderId, propertySet);
At first I thought it had something to do with authorization. But then I found a little program, EwsEditor, on GitHub, which works just fine.
Could someone point me to what I am missing?
My testing code:
package test.ewstest;
import com.microsoft.aad.msal4j.ClientCredentialFactory;
import com.microsoft.aad.msal4j.ClientCredentialParameters;
import com.microsoft.aad.msal4j.ConfidentialClientApplication;
import java.io.File;
import java.io.FileInputStream;
import java.net.URI;
import java.util.Collections;
import java.util.Properties;
import microsoft.exchange.webservices.data.core.ExchangeService;
import microsoft.exchange.webservices.data.core.PropertySet;
import microsoft.exchange.webservices.data.core.enumeration.misc.ConnectingIdType;
import microsoft.exchange.webservices.data.core.enumeration.misc.ExchangeVersion;
import microsoft.exchange.webservices.data.core.enumeration.property.BasePropertySet;
import microsoft.exchange.webservices.data.core.enumeration.property.WellKnownFolderName;
import microsoft.exchange.webservices.data.core.enumeration.search.FolderTraversal;
import microsoft.exchange.webservices.data.core.service.schema.FolderSchema;
import microsoft.exchange.webservices.data.credential.TokenCredentials;
import microsoft.exchange.webservices.data.misc.ImpersonatedUserId;
import microsoft.exchange.webservices.data.property.complex.FolderId;
import microsoft.exchange.webservices.data.property.complex.Mailbox;
import microsoft.exchange.webservices.data.search.FindFoldersResults;
import microsoft.exchange.webservices.data.search.FolderView;
import microsoft.exchange.webservices.data.core.service.folder.Folder;
import org.apache.log4j.LogManager;
import org.apache.log4j.Logger;
public class EwsTest {
static String exchangeUser;
static String exchangeTenant;
static String exchangeClientId;
static String exchangeClientSecret;
static ExchangeService service;
static Logger logger = LogManager.getLogger(EwsTest.class.getName());
static Properties properties = new Properties();
public static void main(String[] args) {
try {
properties.load(new FileInputStream(System.getProperty("user.dir") + File.separator + "EwsTest.properties"));
exchangeUser = properties.getProperty("user");
exchangeTenant = properties.getProperty("tenant");
exchangeClientId = properties.getProperty("client");
exchangeClientSecret = properties.getProperty("secret");
createConnection();
displayFolders();
}
catch (Exception ex) {
logger.error(ex.getMessage(), ex);
}
}
static void createConnection() throws Exception {
service = new ExchangeService(ExchangeVersion.Exchange2010_SP2);
service.getHttpHeaders().put("X-AnchorMailbox",exchangeUser);
service.getHttpHeaders().put("X-PublicFolderMailbox",exchangeUser);
service.setCredentials(new TokenCredentials(createOauthToken()));
service.setImpersonatedUserId(new ImpersonatedUserId(ConnectingIdType.SmtpAddress, exchangeUser));
service.setUrl(new URI("https://outlook.office365.com/EWS/Exchange.asmx"));
}
static String createOauthToken() throws Exception {
ConfidentialClientApplication app = ConfidentialClientApplication.builder(
exchangeClientId,
ClientCredentialFactory.createFromSecret(exchangeClientSecret))
.authority("https://login.microsoftonline.com/" + exchangeTenant + "/")
.build();
ClientCredentialParameters clientCredentialParam = ClientCredentialParameters.builder(
Collections.singleton("https://outlook.office365.com/.default"))
.build();
return app.acquireToken(clientCredentialParam).get().accessToken();
}
static void displayFolders() throws Exception {
PropertySet propertySet = new PropertySet(BasePropertySet.IdOnly);
propertySet.add(FolderSchema.DisplayName);
FolderView view = new FolderView(100);
view.setPropertySet(propertySet);
view.setTraversal(FolderTraversal.Deep);
Mailbox mailbox = new Mailbox(exchangeUser);
FolderId folderId = new FolderId(WellKnownFolderName.MsgFolderRoot, mailbox);
logger.info("service.bindToFolder");
Folder folder = service.bindToFolder(folderId, propertySet);
logger.info("Ok");
logger.info("service.findFolders");
FindFoldersResults findFolderResults = service.findFolders(folder.getId(), view);
logger.info("Ok");
// find specific folder
for (Folder f : findFolderResults)
{
logger.info(f.getId());
}
}
}
And my Netbeans maven dependencies:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>test</groupId>
<artifactId>EwsTest</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>jar</packaging>
<dependencies>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-1.2-api</artifactId>
<version>2.11.2</version>
</dependency>
<dependency>
<groupId>javax.xml.ws</groupId>
<artifactId>jaxws-api</artifactId>
<version>2.3.1</version>
</dependency>
<dependency>
<groupId>com.microsoft.azure</groupId>
<artifactId>msal4j</artifactId>
<version>1.13.3</version>
</dependency>
<dependency>
<groupId>com.microsoft.ews-java-api</groupId>
<artifactId>ews-java-api</artifactId>
<version>2.0</version>
</dependency>
</dependencies>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven.compiler.source>19</maven.compiler.source>
<maven.compiler.target>19</maven.compiler.target>
<exec.mainClass>test.ewstest.EwsTest</exec.mainClass>
</properties>
</project>
I created my EwsTest.properties file and filled it with the keys:
user=example@example.nl
client=9999x9xx-xx99-x99x-xx99-xx99x9999x9x
tenant=x99x9999-xx99-9999-xx99-9999xx9x9999
secret=XXxxX!9xx-...-
When I run the program, I see that the connection is being built and a token is being exchanged. But it crashes at the statement:
Folder folder = service.bindToFolder(folderId, propertySet);
With an error: The request failed. An internal server error occurred. The operation failed.
microsoft.exchange.webservices.data.core.exception.service.remote.ServiceRequestException: The request failed. An internal server error occurred. The operation failed.
at microsoft.exchange.webservices.data.core.request.SimpleServiceRequestBase.internalExecute(SimpleServiceRequestBase.java:74) ~[ews-java-api-2.0.jar:?]
at microsoft.exchange.webservices.data.core.request.MultiResponseServiceRequest.execute(MultiResponseServiceRequest.java:158) ~[ews-java-api-2.0.jar:?]
at microsoft.exchange.webservices.data.core.ExchangeService.bindToFolder(ExchangeService.java:504) ~[ews-java-api-2.0.jar:?]
at test.ewstest.EwsTest.displayFolders(EwsTest.java:91) ~[classes/:?]
at test.ewstest.EwsTest.main(EwsTest.java:49) [classes/:?]
Caused by: microsoft.exchange.webservices.data.core.exception.service.remote.ServiceResponseException: An internal server error occurred. The operation failed.
at microsoft.exchange.webservices.data.core.request.ServiceRequestBase.processWebException(ServiceRequestBase.java:548) ~[ews-java-api-2.0.jar:?]
at microsoft.exchange.webservices.data.core.request.ServiceRequestBase.validateAndEmitRequest(ServiceRequestBase.java:641) ~[ews-java-api-2.0.jar:?]
at microsoft.exchange.webservices.data.core.request.SimpleServiceRequestBase.internalExecute(SimpleServiceRequestBase.java:62) ~[ews-java-api-2.0.jar:?]
... 4 more
I tried several folders like inbox and root but they all give that same error:
FolderId folderId = new FolderId(WellKnownFolderName.MsgFolderRoot, mailbox);
But since EwsEditor seems to work, I'm clearly missing something. I tried looking at the code of EwsEditor, but that's C#, which is even harder for me to understand.
I have a requirement to trigger a Cloud Dataflow pipeline from Cloud Functions, but the Cloud Function must be written in Java. The trigger for the Cloud Function is Google Cloud Storage's finalize/create event, i.e., when a file is uploaded to a GCS bucket, the Cloud Function must trigger the Cloud Dataflow pipeline.
When I create a dataflow pipeline (batch) and I execute the pipeline, it creates a Dataflow pipeline template and creates a Dataflow job.
But when I create a cloud function in Java, and a file is uploaded, the status just says "ok", but it does not trigger the dataflow pipeline.
Cloud function
package com.example;
import com.example.Example.GCSEvent;
import com.google.api.client.googleapis.javanet.GoogleNetHttpTransport;
import com.google.api.client.http.HttpRequestInitializer;
import com.google.api.client.http.HttpTransport;
import com.google.api.client.json.JsonFactory;
import com.google.api.client.json.jackson2.JacksonFactory;
import com.google.api.services.dataflow.Dataflow;
import com.google.api.services.dataflow.model.CreateJobFromTemplateRequest;
import com.google.api.services.dataflow.model.RuntimeEnvironment;
import com.google.auth.http.HttpCredentialsAdapter;
import com.google.auth.oauth2.GoogleCredentials;
import com.google.cloud.functions.BackgroundFunction;
import com.google.cloud.functions.Context;
import java.io.IOException;
import java.security.GeneralSecurityException;
import java.util.HashMap;
import java.util.logging.Logger;
public class Example implements BackgroundFunction<GCSEvent> {
private static final Logger logger = Logger.getLogger(Example.class.getName());
@Override
public void accept(GCSEvent event, Context context) throws IOException, GeneralSecurityException {
logger.info("Event: " + context.eventId());
logger.info("Event Type: " + context.eventType());
HttpTransport httpTransport = GoogleNetHttpTransport.newTrustedTransport();
JsonFactory jsonFactory = JacksonFactory.getDefaultInstance();
GoogleCredentials credentials = GoogleCredentials.getApplicationDefault();
HttpRequestInitializer requestInitializer = new HttpCredentialsAdapter(credentials);
Dataflow dataflowService = new Dataflow.Builder(httpTransport, jsonFactory, requestInitializer)
.setApplicationName("Google Dataflow function Demo")
.build();
String projectId = "my-project-id";
RuntimeEnvironment runtimeEnvironment = new RuntimeEnvironment();
runtimeEnvironment.setBypassTempDirValidation(false);
runtimeEnvironment.setTempLocation("gs://my-dataflow-job-bucket/tmp");
CreateJobFromTemplateRequest createJobFromTemplateRequest = new CreateJobFromTemplateRequest();
createJobFromTemplateRequest.setEnvironment(runtimeEnvironment);
createJobFromTemplateRequest.setLocation("us-central1");
createJobFromTemplateRequest.setGcsPath("gs://my-dataflow-job-bucket-staging/templates/cloud-dataflow-template");
createJobFromTemplateRequest.setJobName("Dataflow-Cloud-Job");
createJobFromTemplateRequest.setParameters(new HashMap<String,String>());
createJobFromTemplateRequest.getParameters().put("inputFile","gs://cloud-dataflow-bucket-input/*.txt");
dataflowService.projects().templates().create(projectId,createJobFromTemplateRequest);
throw new UnsupportedOperationException("Not supported yet.");
}
public static class GCSEvent {
String bucket;
String name;
String metageneration;
}
}
pom.xml(cloud function)
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>cloudfunctions</groupId>
<artifactId>http-function</artifactId>
<version>1.0-SNAPSHOT</version>
<properties>
<maven.compiler.target>11</maven.compiler.target>
<maven.compiler.source>11</maven.compiler.source>
</properties>
<dependencies>
<!-- https://mvnrepository.com/artifact/com.google.auth/google-auth-library-credentials -->
<dependency>
<groupId>com.google.auth</groupId>
<artifactId>google-auth-library-credentials</artifactId>
<version>0.21.1</version>
</dependency>
<dependency>
<groupId>com.google.apis</groupId>
<artifactId>google-api-services-dataflow</artifactId>
<version>v1b3-rev207-1.20.0</version>
</dependency>
<dependency>
<groupId>com.google.cloud.functions</groupId>
<artifactId>functions-framework-api</artifactId>
<version>1.0.1</version>
</dependency>
<dependency>
<groupId>com.google.auth</groupId>
<artifactId>google-auth-library-oauth2-http</artifactId>
<version>0.21.1</version>
</dependency>
</dependencies>
<!-- Required for Java 11 functions in the inline editor -->
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.8.1</version>
<configuration>
<excludes>
<exclude>.google/</exclude>
</excludes>
</configuration>
</plugin>
</plugins>
</build>
</project>
cloud function logs
I went through the blogs below (added for reference), where they trigger Dataflow from Cloud Storage via a Cloud Function. But the code is written in either Node.js or Python, while my Cloud Function must be written in Java.
Triggering Dataflow pipeline via cloud functions in Node.js
https://dzone.com/articles/triggering-dataflow-pipelines-with-cloud-functions
Triggering dataflow pipeline via cloud functions using python
https://medium.com/google-cloud/how-to-kick-off-a-dataflow-pipeline-via-cloud-functions-696927975d4e
Any help on this is very much appreciated.
RuntimeEnvironment runtimeEnvironment = new RuntimeEnvironment();
runtimeEnvironment.setBypassTempDirValidation(false);
runtimeEnvironment.setTempLocation("gs://karthiksfirstbucket/temp1");
LaunchTemplateParameters launchTemplateParameters = new LaunchTemplateParameters();
launchTemplateParameters.setEnvironment(runtimeEnvironment);
launchTemplateParameters.setJobName("newJob" + (new Date()).getTime());
Map<String, String> params = new HashMap<String, String>();
params.put("inputFile", "gs://karthiksfirstbucket/sample.txt");
params.put("output", "gs://karthiksfirstbucket/count1");
launchTemplateParameters.setParameters(params);
writer.write("4");
Dataflow.Projects.Templates.Launch launch = dataflowService.projects().templates().launch(projectId, launchTemplateParameters);
launch.setGcsPath("gs://dataflow-templates-us-central1/latest/Word_Count");
launch.execute();
The above code launches a template and executes the Dataflow pipeline. It uses application default credentials (which can be changed to user or service account credentials), runs in the default region (which can be changed), and creates a job for every HTTP trigger (the trigger can be changed).
The complete code can be found below:
https://github.com/karthikeyan1127/Java_CloudFunction_DataFlow/blob/master/Hello.java
This is my solution using the new Dataflow dependencies:
public class Example implements BackgroundFunction<Example.GCSEvent> {
private static final Logger logger = Logger.getLogger(Example.class.getName());
@Override
public void accept(GCSEvent event, Context context) throws Exception {
String filename = event.name;
logger.info("Processing file: " + filename);
logger.info("Bucket name" + event.bucket);
String projectId = "cedar-router-268801";
String region = "us-central1";
String tempLocation = "gs://cedar-router-beam-poc/temp";
String templateLocation = "gs://cedar-router-beam-poc/template/poc-template.json";
logger.info("path" + String.format("gs://%s/%s", event.bucket, filename));
String scenario = filename.substring(0, 3); //it comes TWO OR ONE
logger.info("scneario " + scenario);
Map<String, String> params = Map.of("sourceFile", String.format("%s/%s", event.bucket, filename),
"scenario", scenario,
"years", "2013,2014",
"targetFile", "gs://cedar-router-beam-poc-kms/result/testfile");
extractedJob(projectId, region, tempLocation, templateLocation, params);
}
private static void extractedJob(String projectId,
String region,
String tempLocation,
String templateLocation,
Map<String, String> params) throws Exception {
HttpTransport httpTransport = GoogleApacheHttpTransport.newTrustedTransport();
JsonFactory jsonFactory = GsonFactory.getDefaultInstance();
GoogleCredentials credentials = GoogleCredentials.getApplicationDefault();
HttpRequestInitializer httpRequestInitializer = new RetryHttpRequestInitializer(ImmutableList.of(404));
ChainingHttpRequestInitializer chainingHttpRequestInitializer =
new ChainingHttpRequestInitializer(new HttpCredentialsAdapter(credentials), httpRequestInitializer);
Dataflow dataflowService = new Dataflow.Builder(httpTransport, jsonFactory, chainingHttpRequestInitializer)
.setApplicationName("Dataflow from Cloud function")
.build();
FlexTemplateRuntimeEnvironment runtimeEnvironment = new FlexTemplateRuntimeEnvironment();
runtimeEnvironment.setTempLocation(tempLocation);
LaunchFlexTemplateParameter launchFlexTemplateParameter = new LaunchFlexTemplateParameter();
launchFlexTemplateParameter.setEnvironment(runtimeEnvironment);
String jobName = params.get("sourceFile").substring(34, 49).replace("_","");
logger.info("job name" + jobName);
launchFlexTemplateParameter.setJobName("job" + LocalDateTime.now().format(DateTimeFormatter.ofPattern("yyyyMMddHHmmss")));
launchFlexTemplateParameter.setContainerSpecGcsPath(templateLocation);
launchFlexTemplateParameter.setParameters(params);
LaunchFlexTemplateRequest launchFlexTemplateRequest = new LaunchFlexTemplateRequest();
launchFlexTemplateRequest.setLaunchParameter(launchFlexTemplateParameter);
Launch launch = dataflowService.projects()
.locations()
.flexTemplates()
.launch(projectId, region, launchFlexTemplateRequest);
launch.execute();
logger.info("running job");
}
public static class GCSEvent {
String bucket;
String name;
String metageneration;
}
}
Just adapt it to your case
I want to use picocli with jline3. So I create a project with the following pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.flaxel</groupId>
<artifactId>picocli_test</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>jar</packaging>
<name>picocli</name>
<url>http://maven.apache.org</url>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven.compiler.source>11</maven.compiler.source>
<maven.compiler.target>11</maven.compiler.target>
<picocli.version>3.9.3</picocli.version>
<jline.version>3.9.0</jline.version>
</properties>
<dependencies>
<dependency>
<groupId>info.picocli</groupId>
<artifactId>picocli</artifactId>
<version>${picocli.version}</version>
</dependency>
<dependency>
<groupId>info.picocli</groupId>
<artifactId>picocli-shell-jline3</artifactId>
<version>${picocli.version}</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>3.0.2</version>
<configuration>
<archive>
<manifest>
<mainClass>com.flaxel.picocli.App</mainClass>
</manifest>
</archive>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.7.0</version>
</plugin>
</plugins>
</build>
</project>
Now I have copied a class from the picocli github page:
package com.flaxel.picocli;
import java.io.IOException;
import java.io.PrintWriter;
import java.nio.charset.Charset;
import java.util.concurrent.Callable;
import java.util.concurrent.TimeUnit;
import org.jline.reader.EndOfFileException;
import org.jline.reader.LineReader;
import org.jline.reader.LineReaderBuilder;
import org.jline.reader.MaskingCallback;
import org.jline.reader.ParsedLine;
import org.jline.reader.UserInterruptException;
import org.jline.reader.impl.DefaultParser;
import org.jline.reader.impl.LineReaderImpl;
import org.jline.terminal.Terminal;
import org.jline.terminal.TerminalBuilder;
import picocli.CommandLine;
import picocli.CommandLine.Command;
import picocli.CommandLine.Option;
import picocli.CommandLine.ParentCommand;
import picocli.shell.jline3.PicocliJLineCompleter;
/**
* Example that demonstrates how to build an interactive shell with JLine3 and
* picocli.
*
* @since 3.9
*/
public class App {
/**
* Top-level command that just prints help.
*/
#Command(name = "", description = "Example interactive shell with completion", footer = { "",
"Press Ctl-D to exit." }, subcommands = { MyCommand.class, ClearScreen.class })
static class CliCommands implements Runnable {
LineReaderImpl reader;
PrintWriter out;
CliCommands() {
}
public void setReader(LineReader reader) {
this.reader = (LineReaderImpl) reader;
out = reader.getTerminal().writer();
}
@Override
public void run() {
out.println(new CommandLine(this).getUsageMessage());
}
}
/**
* A command with some options to demonstrate completion.
*/
#Command(name = "cmd", mixinStandardHelpOptions = true, version = "1.0", description = "Command with some options to demonstrate TAB-completion"
+ " (note that enum values also get completed)")
static class MyCommand implements Runnable {
#Option(names = { "-v", "--verbose" })
private boolean[] verbosity = {};
#Option(names = { "-d", "--duration" })
private int amount;
#Option(names = { "-u", "--timeUnit" })
private TimeUnit unit;
#ParentCommand
CliCommands parent;
#Override
public void run() {
if (verbosity.length > 0) {
parent.out.printf("Hi there. You asked for %d %s.%n", amount, unit);
} else {
parent.out.println("hi!");
}
}
}
/**
* Command that clears the screen.
*/
#Command(name = "cls", aliases = "clear", mixinStandardHelpOptions = true, description = "Clears the screen", version = "1.0")
static class ClearScreen implements Callable<Void> {
#ParentCommand
CliCommands parent;
#Override
public Void call() throws IOException {
parent.reader.clearScreen();
return null;
}
}
public static void main(String[] args) {
try {
// set up the completion
CliCommands commands = new CliCommands();
CommandLine cmd = new CommandLine(commands);
Terminal terminal = TerminalBuilder.builder().build();
LineReader reader = LineReaderBuilder.builder()
.terminal(terminal)
.completer(new PicocliJLineCompleter(cmd.getCommandSpec()))
.parser(new DefaultParser())
.build();
commands.setReader(reader);
String prompt = "prompt> ";
String rightPrompt = null;
// start the shell and process input until the user quits with Ctrl-D
String line;
while (true) {
try {
line = reader.readLine(prompt, rightPrompt, (MaskingCallback) null, null);
ParsedLine pl = reader.getParser().parse(line, 0);
String[] arguments = pl.words().toArray(new String[0]);
CommandLine.run(commands, arguments);
} catch (UserInterruptException e) {
// Ignore
} catch (EndOfFileException e) {
return;
}
}
} catch (Throwable t) {
t.printStackTrace();
}
}
}
If I run the code in the Eclipse IDE, I can write a command in the console and I get an answer. But if I create a jar file with maven, it does not work. I get the following error:
Error: Unable to initialize main class com.flaxel.picocli.App
Caused by: java.lang.NoClassDefFoundError: org/jline/reader/UserInterruptException
I work with Eclipse 2018-09 and Java 11.
The NoClassDefFoundError occurs because you are missing the JLine classes when running your jar. Try using the shade Maven plugin to create an Uber-jar that contains all the dependencies.
With JLine 3 on Windows, you probably want to add the org.jline:jline-terminal-jansi:3.9.0 dependency in addition to info.picocli:picocli-shell-jline3:3.9.3. Without this (or the jline-terminal-jna dependency) you will get a “dumb” terminal without ANSI colors or autocompletion.
(info.picocli:picocli-shell-jline3:3.9.3 will bring in info.picocli:picocli and org.jline:jline as transitive dependencies, so there is no need to include these explicitly.)
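If you add it, the extra dependency would look something like this in the pom.xml above (reusing the jline.version property already defined there):
<dependency>
    <groupId>org.jline</groupId>
    <artifactId>jline-terminal-jansi</artifactId>
    <version>${jline.version}</version>
</dependency>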
Perfect, that works for me. I use the Maven Shade Plugin and add some properties, so I get this pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.flaxel</groupId>
<artifactId>picocli_test</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>jar</packaging>
<name>picocli</name>
<url>http://maven.apache.org</url>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven.compiler.source>11</maven.compiler.source>
<maven.compiler.target>11</maven.compiler.target>
<picocli.version>3.9.3</picocli.version>
<jline.version>3.9.0</jline.version>
</properties>
<dependencies>
<dependency>
<groupId>info.picocli</groupId>
<artifactId>picocli</artifactId>
<version>${picocli.version}</version>
</dependency>
<dependency>
<groupId>info.picocli</groupId>
<artifactId>picocli-shell-jline3</artifactId>
<version>${picocli.version}</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<configuration>
<archive>
<manifest>
<mainClass>com.flaxel.picocli.App</mainClass>
<addDefaultImplementationEntries>true</addDefaultImplementationEntries>
</manifest>
</archive>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.7.0</version>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.3</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<createDependencyReducedPom>false</createDependencyReducedPom>
<shadedArtifactAttached>true</shadedArtifactAttached>
<shadedClassifierName>jar-with-dependencies</shadedClassifierName>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
EDIT
I have at least one question. If I start the application and want to use tab completion, it does not work. I get no completion; only four spaces are inserted:
Feb. 07, 2019 6:50:29 NACHM. org.jline.utils.Log logr
WARNING: Unable to create a system terminal, creating a dumb terminal (enable debug logging for more information)
prompt> cmd
Or did I do something wrong?
As I understand it, RxJava works within a single JVM. Is there a wrapper/library/API to support clustered environments, in combination with a distributed cache, JMS, or any other queue, so that subscribers can scale across distributed environments? I would like to check here before reinventing the wheel.
You can deploy Vert.x instances in a cluster and use RxJava on top of it. The idea is to use the EventBus as a transport layer and subscribe to the messages using RxJava. It's not a pure RxJava solution.
A very simple runnable example:
package com.example;
import java.util.concurrent.TimeUnit;
import io.reactivex.Flowable;
import io.vertx.core.DeploymentOptions;
import io.vertx.core.VertxOptions;
import io.vertx.core.json.JsonObject;
import io.vertx.core.spi.cluster.ClusterManager;
import io.vertx.reactivex.core.AbstractVerticle;
import io.vertx.reactivex.core.Vertx;
import io.vertx.reactivex.core.eventbus.EventBus;
import io.vertx.spi.cluster.hazelcast.HazelcastClusterManager;
public class MainVerticle extends AbstractVerticle {
String nodeId;
static final String CENTRAL = "CENTRAL";
@Override
public void start() throws Exception {
EventBus eventBus = vertx.eventBus();
JsonObject config = config();
String nodeID = config.getString("nodeID");
eventBus.consumer(CENTRAL).toFlowable()
.map(msg -> (JsonObject) msg.body())
.filter(msgBody -> !msgBody.getString("sender", "").equals(nodeID))
.subscribe(msgBody -> {
System.out.println(msgBody);
});
Flowable.interval(1, TimeUnit.SECONDS)
.subscribe(tick -> {
JsonObject msg = new JsonObject()
.put("sender", nodeID)
.put("msg", "Hello world");
eventBus.publish(CENTRAL, msg);
});
}
public static void main(String[] args) {
ClusterManager clusterManager = new HazelcastClusterManager();
VertxOptions options = new VertxOptions().setClusterManager(clusterManager);
Vertx.rxClusteredVertx(options)
.doOnError(throwable -> throwable.printStackTrace())
.subscribe(vertx -> {
if (vertx.isClustered()) {
System.out.println("Vertx is running clustered");
}
String nodeID = clusterManager.getNodeID();
System.out.println("Node ID : " + nodeID);
String mainVerticle = MainVerticle.class.getCanonicalName();
DeploymentOptions deploymentOptions = new DeploymentOptions();
deploymentOptions.setConfig(new JsonObject().put("nodeID", nodeID));
vertx.rxDeployVerticle(mainVerticle, deploymentOptions).subscribe();
});
}
}
Maven dependencies:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.example</groupId>
<artifactId>rxjava2-clustered</artifactId>
<version>0.42</version>
<packaging>jar</packaging>
<name>rxjava2-clustered</name>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<dependencies>
<dependency>
<groupId>io.vertx</groupId>
<artifactId>vertx-core</artifactId>
<version>3.5.0</version>
</dependency>
<dependency>
<groupId>io.vertx</groupId>
<artifactId>vertx-rx-java2</artifactId>
<version>3.5.0</version>
</dependency>
<dependency>
<groupId>io.vertx</groupId>
<artifactId>vertx-hazelcast</artifactId>
<version>3.5.0</version>
</dependency>
</dependencies>
</project>
In this example, I am using the Hazelcast ClusterManager. Implementations also exist for Infinispan, Apache Ignite, and Apache ZooKeeper. Refer to the documentation for a full reference.
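Swapping the cluster manager only changes the bootstrap in main; for example, with Apache Ignite it might look like the sketch below (assuming the io.vertx:vertx-ignite artifact is on the classpath):
// Sketch: use the Ignite cluster manager instead of Hazelcast (assumes io.vertx:vertx-ignite).
ClusterManager clusterManager = new io.vertx.spi.cluster.ignite.IgniteClusterManager();
VertxOptions options = new VertxOptions().setClusterManager(clusterManager);
Vertx.rxClusteredVertx(options).subscribe(vertx -> System.out.println("Clustered with Ignite"));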
I tried to create a simple class to create a table and add some columns with HBase and Google app engine.
I have already created a project and an instance in Google Cloud platform.
I cloned this repository : https://github.com/GoogleCloudPlatform/cloud-bigtable-examples/blob/master/java/hello-world/src/main/java/com/example/cloud/bigtable/helloworld/HelloWorld.java
It works like a charm to create a table in my instance.
But when I try to create a new Maven project with the same config, it doesn't work; I can't create anything.
I get this issue:
InvocationTargetException: Could not find an appropriate constructor for com.google.cloud.bigtable.hbase1_x.BigtableConnection: com.google.common.util.concurrent.MoreExecutors.platformThreadFactory()Ljava/util/concurrent/ThreadFactory;
Here is my AudioBridgeData.java file:
package com.xxx.xxx;
import com.google.cloud.bigtable.hbase.BigtableConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;
import java.io.IOException;
/**
* A minimal application that connects to Cloud Bigtable using the native HBase API
* and performs some basic operations.
*/
public class AudioBridgeData {
private static final byte[] TABLE_NAME = Bytes.toBytes("audio-bridge");
private static final byte[] COLUMN_FAMILY_NAME = Bytes.toBytes("phone-number");
private static final byte[] COLUMN_NAME = Bytes.toBytes("number");
private static final String[] NUMBERS =
{ "+33697584976", "+19553560976", "+4879665676" };
/**
* Connects to Cloud Bigtable, runs some basic operations and prints the results.
*/
private static void doAudioBridge(String projectId, String instanceId) {
try (Connection connection = BigtableConfiguration.connect(projectId, instanceId)) {
Admin admin = connection.getAdmin();
HTableDescriptor descriptor = new HTableDescriptor(TableName.valueOf(TABLE_NAME));
descriptor.addFamily(new HColumnDescriptor(COLUMN_FAMILY_NAME));
print("Create table " + descriptor.getNameAsString());
admin.createTable(descriptor);
Table table = connection.getTable(TableName.valueOf(TABLE_NAME));
print("Write some numbers to the table");
for (int i = 0; i < NUMBERS.length; i++) {
Put put = new Put(Bytes.toBytes(i));
put.addColumn(COLUMN_FAMILY_NAME, COLUMN_NAME, Bytes.toBytes(NUMBERS[i]));
table.put(put);
}
int rowKey = 0;
Result getResult = table.get(new Get(Bytes.toBytes(rowKey)));
String number = Bytes.toString(getResult.getValue(COLUMN_FAMILY_NAME, COLUMN_NAME));
System.out.println("Get a single number by row key");
System.out.printf("\t%s = %s\n", rowKey, number);
Scan scan = new Scan();
print("Scan for all numbers:");
ResultScanner scanner = table.getScanner(scan);
for (Result row : scanner) {
byte[] valueBytes = row.getValue(COLUMN_FAMILY_NAME, COLUMN_NAME);
System.out.println('\t' + Bytes.toString(valueBytes));
}
} catch (IOException e) {
System.err.println("Exception while running HelloWorld: " + e.getMessage());
e.printStackTrace();
System.exit(1);
}
System.exit(0);
}
private static void print(String msg) {
System.out.println("Number: " + msg);
}
public static void main(String[] args) {
String projectId = requiredProperty("bigtable.projectID");
String instanceId = requiredProperty("bigtable.instanceID");
doAudioBridge(projectId, instanceId);
}
private static String requiredProperty(String prop) {
String value = System.getProperty(prop);
if (value == null) {
throw new IllegalArgumentException("Missing required system property: " + prop);
}
return value;
}
}
Here is my pom.xml file:
<?xml version="1.0" encoding="UTF-8"?>
<project>
<modelVersion>4.0.0</modelVersion>
<groupId>com.xxx</groupId>
<artifactId>xxx</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>jar</packaging>
<url>http://maven.apache.org</url>
<name>xxx</name>
<properties>
<bigtable.version>1.0.0-pre1</bigtable.version>
<hbase.version>1.1.5</hbase.version>
<maven.compiler.target>1.8</maven.compiler.target>
<maven.compiler.source>1.8</maven.compiler.source>
</properties>
<repositories>
<repository>
<id>snapshots-repo</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
<releases><enabled>false</enabled></releases>
<snapshots><enabled>true</enabled></snapshots>
</repository>
</repositories>
<dependencies>
<dependency>
<groupId>com.google.cloud.bigtable</groupId>
<artifactId>bigtable-hbase-1.x</artifactId>
<version>${bigtable.version}</version>
</dependency>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-tcnative-boringssl-static</artifactId>
<version>1.1.33.Fork26</version>
</dependency>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-client</artifactId>
<version>0.98.11-hadoop2</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>1.5.0</version>
<configuration>
<mainClass>com.xxx.xxx.AudioBridgeData</mainClass>
</configuration>
</plugin>
</plugins>
</build>
</project>
And I tried to run this command:
mvn exec:java -Dbigtable.projectID=xxx -Dbigtable.instanceID=quickstart-instance
Thanks a lot for your help! :)
I had the same error message; weirdly, it was caused by missing application credentials. If it's the same issue, you should set this environment variable:
GOOGLE_APPLICATION_CREDENTIALS
It should be set to the location of the client credentials file you (might have) downloaded after creating a service account key. This is a good page:
https://developers.google.com/identity/protocols/application-default-credentials
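For example, on Linux or macOS you could point the variable at the downloaded JSON key before running the app (the path below is a placeholder):
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/your-service-account-key.json
mvn exec:java -Dbigtable.projectID=xxx -Dbigtable.instanceID=quickstart-instance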