I am trying to run a PDI transformation involving a database (any database, though NoSQL ones are preferred) from Java.
I've tried MongoDB and Cassandra and got missing-plugin errors; I've already asked about that here: Running PDI Kettle on Java - Mongodb Step Missing Plugins, but no one has replied yet.
I've tried switching to a SQL database using PostgreSQL too, but it still doesn't work. From the research I did, I think it is because I didn't connect to the database from Java properly, yet I haven't found any tutorial or direction that works for me. I've tried following the directions from this blog: http://ameethpaatil.blogspot.co.id/2010/11/pentaho-data-integration-java-maven.html, but still ran into problems with the repository (I don't have one, and one seems to be required).
The transformations run fine from Spoon. They only fail when I run them from Java.
Can anyone help me run a PDI transformation involving a database? Where did I go wrong?
Has anyone ever succeeded in running a PDI transformation involving either a NoSQL or a SQL database? Which DB did you use?
I'm sorry if I asked too many questions; I am quite desperate. Any information will be much appreciated. Thank you.
Executing PDI jobs from Java is pretty straightforward. You just need to import all the necessary jar files (including the database drivers) and then call the Kettle classes. The best way is to use Maven to manage the dependencies; in the Maven pom.xml file, just declare the database drivers.
A sample Maven dependency section, assuming you are using Pentaho v5.0.0GA and PostgreSQL as the database, would look like this:
<dependencies>
    <!-- Pentaho Kettle core dependencies -->
    <dependency>
        <groupId>pentaho-kettle</groupId>
        <artifactId>kettle-core</artifactId>
        <version>5.0.0.1</version>
    </dependency>
    <dependency>
        <groupId>pentaho-kettle</groupId>
        <artifactId>kettle-dbdialog</artifactId>
        <version>5.0.0.1</version>
    </dependency>
    <dependency>
        <groupId>pentaho-kettle</groupId>
        <artifactId>kettle-engine</artifactId>
        <version>5.0.0.1</version>
    </dependency>
    <dependency>
        <groupId>pentaho-kettle</groupId>
        <artifactId>kettle-ui-swt</artifactId>
        <version>5.0.0.1</version>
    </dependency>
    <dependency>
        <groupId>pentaho-kettle</groupId>
        <artifactId>kettle5-log4j-plugin</artifactId>
        <version>5.0.0.1</version>
    </dependency>
    <!-- The database driver. Include it if your kettle file involves database connectivity. -->
    <dependency>
        <groupId>postgresql</groupId>
        <artifactId>postgresql</artifactId>
        <version>9.1-902.jdbc4</version>
    </dependency>
</dependencies>
You can check my blog for more. It works for database connections.
Hope this helps :)
I had the same problem in an application using the Pentaho libraries. I resolved it with this code:
The singleton to init Kettle:
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.exception.KettleException;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
 * Initializes the Kettle environment variable configuration.
 *
 * @author Marcos Souza
 * @version 1.0
 *
 */
public class AtomInitKettle {
private static final Logger LOGGER = LoggerFactory.getLogger(AtomInitKettle.class);
private AtomInitKettle() throws KettleException {
try {
LOGGER.info("Iniciando kettle");
KettleJNDI.protectSystemProperty();
KettleEnvironment.init();
LOGGER.info("Kettle iniciado com sucesso");
} catch (Exception e) {
LOGGER.error("Message: {} Cause {} ", e.getMessage(), e.getCause());
}
}
}
And the code that saved me:
import java.io.File;
import java.util.Properties;
import org.pentaho.di.core.Const;
import org.pentaho.di.core.exception.KettleException;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class KettleJNDI {
private static final Logger LOGGER = LoggerFactory.getLogger(KettleJNDI.class);
public static final String SYS_PROP_IC = "java.naming.factory.initial";
private static boolean init = false;
private KettleJNDI() {
}
public static void initJNDI() throws KettleException {
String path = Const.JNDI_DIRECTORY;
LOGGER.info("Kettle Const.JNDI_DIRECTORY= {}", path);
if (path == null || path.equals("")) {
try {
File file = new File("simple-jndi");
path = file.getCanonicalPath();
} catch (Exception e) {
throw new KettleException("Error initializing JNDI", e);
}
Const.JNDI_DIRECTORY = path;
LOGGER.info("Kettle null > Const.JNDI_DIRECTORY= {}", path);
}
System.setProperty("java.naming.factory.initial", "org.osjava.sj.SimpleContextFactory");
System.setProperty("org.osjava.sj.root", path);
System.setProperty("org.osjava.sj.delimiter", "/");
}
public static void protectSystemProperty() {
if (init) {
return;
}
System.setProperties(new ProtectionProperties(SYS_PROP_IC, System.getProperties()));
if (LOGGER.isInfoEnabled()) {
LOGGER.info("Kettle System Property Protector: System.properties replaced by custom properies handler");
}
init = true;
}
public static class ProtectionProperties extends Properties {
private static final long serialVersionUID = 1L;
private final String protectedKey;
public ProtectionProperties(String protectedKey, Properties prprts) {
super(prprts);
if (protectedKey == null) {
throw new IllegalArgumentException("Properties protection was provided a null key");
}
this.protectedKey = protectedKey;
}
@Override
public synchronized Object setProperty(String key, String value) {
// We forbid changes in general, but do it silently ...
if (protectedKey.equals(key)) {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Kettle System Property Protector: Protected change to '" + key + "' with value '" + value + "'");
}
return super.getProperty(protectedKey);
}
return super.setProperty(key, value);
}
}
}
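To tie the two classes together, a minimal bootstrap sketch (assumed usage; the original post does not show how these classes are invoked) could be:
public class KettleBootstrap {
    public static void main(String[] args) throws Exception {
        // Protect java.naming.factory.initial before anything else touches it
        KettleJNDI.protectSystemProperty();
        // Point simple-jndi at the connection definitions, then start Kettle
        KettleJNDI.initJNDI();
        org.pentaho.di.core.KettleEnvironment.init();
    }
}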
I think your problem is with the database connection. You can configure the connection in the transformation itself and do not need to use JNDI.
public class DatabaseMetaStep {
private static final Logger LOGGER = LoggerFactory.getLogger(DatabaseMetaStep.class);
/**
 * Adds the configuration for access to the database.
 *
 * @return the configured DatabaseMeta
 */
public static DatabaseMeta createDatabaseMeta() {
DatabaseMeta databaseMeta = new DatabaseMeta();
LOGGER.info("Carregando informacoes de acesso");
databaseMeta.setHostname("localhost");
databaseMeta.setName("stepName");
databaseMeta.setUsername("user");
databaseMeta.setPassword("password");
databaseMeta.setDBPort("port");
databaseMeta.setDBName("database");
databaseMeta.setDatabaseType("MonetDB"); // sql, MySql ...
databaseMeta.setAccessType(DatabaseMeta.TYPE_ACCESS_NATIVE);
return databaseMeta;
}
}
Then you need to set the databaseMeta on the TransMeta:
DatabaseMeta databaseMeta = DatabaseMetaStep.createDatabaseMeta();
TransMeta transMeta = new TransMeta();
transMeta.setUsingUniqueConnections(true);
transMeta.setName("transMetaName");
List<DatabaseMeta> databases = new ArrayList<>();
databases.add(databaseMeta);
transMeta.setDatabases(databases);
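With the databases set, a minimal sketch for actually executing a transformation against this connection could look like the following (the .ktr file name is an assumption for illustration; the file must reference the same connection name):
KettleEnvironment.init();
// Load the transformation and inject the programmatic connection list
TransMeta loadedMeta = new TransMeta("transformation.ktr");
loadedMeta.setDatabases(databases);
Trans trans = new Trans(loadedMeta);
trans.execute(null);
trans.waitUntilFinished();
if (trans.getErrors() > 0) {
    System.out.println("Error executing transformation");
}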
I tried your code with a "transformation without JNDI" and it works!
But I needed to add this repository to my pom.xml:
<repositories>
    <repository>
        <id>pentaho-releases</id>
        <url>http://repository.pentaho.org/artifactory/repo/</url>
    </repository>
</repositories>
Also, when I try with a datasource I get this error: Cannot instantiate class: org.osjava.sj.SimpleContextFactory [Root exception is java.lang.ClassNotFoundException: org.osjava.sj.SimpleContextFactory]
Complete log here:
https://gist.github.com/eb15f8545e3382351e20.git
[FIX]: Add this dependency:
<dependency>
    <groupId>pentaho</groupId>
    <artifactId>simple-jndi</artifactId>
    <version>1.0.1</version>
</dependency>
After that a new error occurs:
transformation_with_jndi - Dispatching started for transformation [transformation_with_jndi]
Table input.0 - ERROR (version 5.0.0.1.19046, build 1 from 2013-09-11_13-51-13 by buildguy) : An error occurred, processing will be stopped:
Table input.0 - Error occured while trying to connect to the database
Table input.0 - java.io.File parameter must be a directory. [D:\opt\workspace-eclipse\invoke-ktr-jndi\simple-jndi]
Complete log : https://gist.github.com/jrichardsz/9d74c7263f3567ac4b45
[EXPLANATION] This is because, in
KettleEnvironment.init();
https://github.com/jrichardsz/pentaho-pdi-spoon-usefull-templates/blob/master/running-etl-transformation-using-java/researching-pentaho-classes/KettleEnvironment.java
there is an initialization:
if (simpleJndi) {
JndiUtil.initJNDI();
}
And in JndiUtil:
String path = Const.JNDI_DIRECTORY;
if ((path == null) || (path.equals("")))
https://github.com/jrichardsz/pentaho-pdi-spoon-usefull-templates/blob/master/running-etl-transformation-using-java/researching-pentaho-classes/JndiUtil.java
And in the Const class:
public static String JNDI_DIRECTORY = NVL(System.getProperty("KETTLE_JNDI_ROOT"), System.getProperty("org.osjava.sj.root"));
https://github.com/jrichardsz/pentaho-pdi-spoon-usefull-templates/blob/master/running-etl-transformation-using-java/researching-pentaho-classes/Const.java
So we need to set the KETTLE_JNDI_ROOT variable.
[FIX] A small change in your example: just add
System.setProperty("KETTLE_JNDI_ROOT", jdbcPropertiesPath);
before
KettleEnvironment.init();
A complete example based on your code:
import java.io.File;
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.exception.KettleException;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;
public class ExecuteSimpleTransformationWithJndiDatasource {
public static void main(String[] args) {
String resourcesPath = (new File(".").getAbsolutePath())+"\\src\\main\\resources";
String ktr_path = resourcesPath+"\\transformation_with_jndi.ktr";
//KETTLE_JNDI_ROOT could be the simple-jndi folder in your pdi or spoon home.
//in this example, is the resources folder
String jdbcPropertiesPath = resourcesPath;
try {
/**
* Initialize the Kettle Environment
*/
System.setProperty("KETTLE_JNDI_ROOT", jdbcPropertiesPath);
KettleEnvironment.init();
/**
* Create a trans object to properly assign the ktr metadata.
*
* @filedb: The ktr file path to be executed.
*
*/
TransMeta metadata = new TransMeta(ktr_path);
Trans trans = new Trans(metadata);
// Execute the transformation
trans.execute(null);
trans.waitUntilFinished();
// checking for errors
if (trans.getErrors() > 0) {
System.out.println("Erroruting Transformation");
}
} catch (KettleException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
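For reference, the simple-jndi folder that KETTLE_JNDI_ROOT points to holds a jdbc.properties file. A sketch of what it might contain, following the stock PDI simple-jndi format (the connection name and values here are assumptions):
# jdbc.properties (inside the KETTLE_JNDI_ROOT folder)
SampleData/type=javax.sql.DataSource
SampleData/driver=org.postgresql.Driver
SampleData/url=jdbc:postgresql://localhost:5432/sampledb
SampleData/user=user
SampleData/password=password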
For a complete example, check my GitHub repository:
https://github.com/jrichardsz/pentaho-pdi-spoon-usefull-templates/tree/master/running-etl-transformation-using-java/invoke-transformation-from-java-jndi/src/main/resources
Related
I'm trying to set a JavaFX ImageView image from a resource folder, but I can't seem to get an appropriate URL/String file path that won't throw an exception.
var x = getRandomImageFromPackage("pictures").toString();
var y = getClass().getClassLoader().getResource("pictures/mindwave/active/Super Saiyan.gif").toString();
this.iconImageView.setImage(new Image(x));
x returns
/home/sarah/Desktop/Dropbox/School/Current/MAX MSP Utilities/MindWaveMobileDataServer/target/classes/pictures/0515e3b7cb30ac92ebfe729440870a5c.jpg
whereas y returns something that looks like:
file:/home/sarah/Desktop/Dropbox/School/Current/MAX%20MSP%20Utilities/MindWaveMobileDataServer/target/classes/pictures/mindwave/active/Super%20Saiyan.gif
In theory either of these would be acceptable; however, only x throws an exception when placed in the setImage(...) line shown above.
Is there any way to get a list of images in the package so that I can select a random one and set the ImageView?
I know there was a custom scanner option, but it appears rather dated (it is over 11 years old and wasn't really supported even at the time):
Get a list of resources from classpath directory
Routine:
/**
 * Gets a picture from the classpath pictures folder.
 *
 * @param packageName The string path (in package format) to the classpath
 * folder
 * @return The random picture
 */
private Path getRandomImageFromPackage(String packageName) {
    try {
        var list = Arrays.asList(new File(Thread.currentThread().getContextClassLoader().getResource(packageName)
                .toURI()).listFiles());
        return list.get(new Random().nextInt(list.size())).toPath();
    } catch (URISyntaxException ex) {
        throw new IllegalStateException("Encountered an error while trying to get a picture from the classpath "
                + "filesystem", ex);
    }
}
For reference, this is the resource folder:
Issues with your approach
You don't have a well-formed url
new Image(String url) takes a url as a parameter.
A space is not a valid character for a URL:
Which characters make a URL invalid?
which is why your x string is not a valid URL and cannot be used to construct an image.
You need to provide an input recognized by the Image constructor
Note that it is slightly more complex: from the Image javadoc, the url parameter can be something other than a straight URL, but even so, none of the accepted forms match what you are trying to look up.
If a URL string is passed to a constructor, it can be any of the following:
- the name of a resource that can be resolved by the context ClassLoader for this thread
- a file path that can be resolved by File
- a URL that can be resolved by URL and for which a protocol handler exists
The RFC 2397 "data" scheme for URLs is supported in addition to the protocol handlers that are registered for the application. If a URL uses the "data" scheme, the data must be base64-encoded and the MIME type must either be empty or a subtype of the image type.
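For illustration, inputs the Image constructor can resolve include the following; these example paths are assumptions, not taken from your project:
// A classpath resource name, resolved by the context ClassLoader:
Image byResource = new Image("pictures/mindwave/active/active.gif");
// A well-formed URL with a registered protocol handler; note the %20-encoded spaces:
Image byUrl = new Image("file:/home/sarah/images/Super%20Saiyan.gif");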
You are assuming the resources are in a file system, but that won't always work
If you pack your resources into a jar, then this will not work:
Arrays.asList(
new File(
Thread.currentThread()
.getContextClassLoader()
.getResource(packageName)
.toURI()
).listFiles()
);
This doesn't work because files in the jar are located using the jar: protocol rather than the file: protocol. So, you will be unable to create File objects from the jar: protocol URIs that will be returned by getResource.
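You can see the difference with a small check (a sketch; the jar: form of the output is described in the comments rather than captured from a run):
URL url = Thread.currentThread().getContextClassLoader().getResource("pictures");
System.out.println(url);
// When run from the IDE this prints a file: URL and new File(url.toURI()) works.
// When run from a packaged jar it prints a jar: URL such as jar:file:/...!/pictures,
// and new File(url.toURI()) fails because a jar: URI is not hierarchical.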
Recommended Approach: Use Spring
Getting a list of resources from a jar is actually a pretty tricky thing. From the question you linked, the easiest solution is the one which uses Spring's PathMatchingResourcePatternResolver.
Unfortunately, that means requiring a dependency on the Spring framework, which is total overkill for this task... however, I don't know of any other simple, robust solution. But at least you can just call the Spring utility class; you don't need to start up a whole Spring dependency injection container to use it, so you don't really need to know any Spring at all or suffer any Spring overhead to do it this way.
So, you could write something like this (ResourceLister is a class I created, as well as the toURL method, see the example app):
public List<String> getResourceUrls(String locationPattern) throws IOException {
ClassLoader classLoader = ResourceLister.class.getClassLoader();
PathMatchingResourcePatternResolver resolver = new PathMatchingResourcePatternResolver(classLoader);
Resource[] resources = resolver.getResources(locationPattern);
return Arrays.stream(resources)
.map(this::toURL)
.filter(Objects::nonNull)
.collect(Collectors.toList());
}
Executable Example
ResourceLister.java
import org.springframework.core.io.Resource;
import org.springframework.core.io.support.PathMatchingResourcePatternResolver;
import java.io.IOException;
import java.util.*;
import java.util.stream.Collectors;
public class ResourceLister {
// currently, only gets pngs, if needed, can add
// other patterns and union the results to get
// multiple image types.
private static final String IMAGE_PATTERN =
"classpath:/img/*.png";
public List<String> getImageUrls() throws IOException {
return getResourceUrls(IMAGE_PATTERN);
}
public List<String> getResourceUrls(String locationPattern) throws IOException {
ClassLoader classLoader = ResourceLister.class.getClassLoader();
PathMatchingResourcePatternResolver resolver = new PathMatchingResourcePatternResolver(classLoader);
Resource[] resources = resolver.getResources(locationPattern);
return Arrays.stream(resources)
.map(this::toURL)
.filter(Objects::nonNull)
.collect(Collectors.toList());
}
private String toURL(Resource r) {
try {
if (r == null) {
return null;
}
return r.getURL().toExternalForm();
} catch (IOException e) {
return null;
}
}
public static void main(String[] args) throws IOException {
ResourceLister lister = new ResourceLister();
System.out.println(lister.getImageUrls());
}
}
AnimalApp.java
import javafx.application.Application;
import javafx.geometry.*;
import javafx.scene.Scene;
import javafx.scene.control.Button;
import javafx.scene.image.*;
import javafx.scene.layout.VBox;
import javafx.stage.Stage;
import java.io.IOException;
import java.util.*;
import java.util.stream.Collectors;
public class AnimalApp extends Application {
private static final double ANIMAL_SIZE = 512;
// remove the magic seed if you want a different random sequence all the time.
private final Random random = new Random(42);
private final ResourceLister resourceLister = new ResourceLister();
private List<Image> images;
@Override
public void init() {
List<String> imageUrls = findImageUrls();
images = imageUrls.stream()
.map(Image::new)
.collect(Collectors.toList());
}
@Override
public void start(Stage stage) {
ImageView animalView = new ImageView();
animalView.setFitWidth(ANIMAL_SIZE);
animalView.setFitHeight(ANIMAL_SIZE);
animalView.setPreserveRatio(true);
Button findAnimalButton = new Button("Find animal");
findAnimalButton.setOnAction(e ->
animalView.setImage(randomImage())
);
VBox layout = new VBox(10,
findAnimalButton,
animalView
);
layout.setPadding(new Insets(10));
layout.setAlignment(Pos.CENTER);
stage.setScene(new Scene(layout));
stage.show();
}
private List<String> findImageUrls() {
try {
return resourceLister.getImageUrls();
} catch (IOException e) {
e.printStackTrace();
}
return new ArrayList<>();
}
/**
* Chooses a random image.
*
* Allows the next random image chosen to be the same as the previous image.
*
* @return a random image or null if no images were found.
*/
private Image randomImage() {
if (images == null || images.isEmpty()) {
return null;
}
return images.get(random.nextInt(images.size()));
}
public static void main(String[] args) {
launch(args);
}
}
pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example</groupId>
    <artifactId>resource-lister</artifactId>
    <version>1.0-SNAPSHOT</version>
    <name>resource-lister</name>
    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <junit.version>5.7.1</junit.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.openjfx</groupId>
            <artifactId>javafx-controls</artifactId>
            <version>17.0.2</version>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-core</artifactId>
            <version>LATEST</version>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.8.1</version>
                <configuration>
                    <source>17</source>
                    <target>17</target>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>
images
Place in src/main/resources/img.
chicken.png
cow.png
pig.png
sheep.png
execution command
Set the VM arguments for your JavaFX SDK installation:
-p C:\dev\javafx-sdk-17.0.2\lib --add-modules javafx.controls
There is no easy and reliable way to do that. Therefore I create and put an inventory file into my resources folder. At runtime I can read that in and then have all the file names available that I need.
Here is a little test that shows how I create that file:
public class ListAppDefaultsInventory {
@Test
public void test() throws IOException {
List<String> inventory = listFilteredFiles("src/main/resources/app-defaults", Integer.MAX_VALUE);
assertFalse("Directory 'app-defaults' is empty.", inventory.isEmpty());
System.out.println("# src/main/resources/app-defaults-inventory.txt");
inventory.forEach(s -> System.out.println(s));
}
public List<String> listFilteredFiles(String dir, int depth) throws IOException {
try (Stream<Path> stream = Files.walk(Paths.get(dir), depth)) {
return stream
.filter(file -> !Files.isDirectory(file))
.filter(file -> !file.getFileName().toString().startsWith("."))
.map(Path::toString)
.map(s -> s.replaceFirst("src/main/resources/app-defaults/", ""))
.collect(Collectors.toList());
}
}
}
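The runtime counterpart is simply reading that inventory back from the classpath. A minimal sketch (assuming the file was saved as app-defaults-inventory.txt, matching the test output above):
public List<String> readInventory() throws IOException {
    InputStream in = getClass().getResourceAsStream("/app-defaults-inventory.txt");
    if (in == null) {
        throw new FileNotFoundException("app-defaults-inventory.txt not found on classpath");
    }
    try (BufferedReader reader = new BufferedReader(new InputStreamReader(in, StandardCharsets.UTF_8))) {
        return reader.lines().collect(Collectors.toList());
    }
}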
I have a class that scans a column from a DynamoDB table using the AWS SDK for Java (main method taken out for simplicity):
public class fetchCmdbColumn {
public static List<String> CMDB(String tableName, String tableColumn) throws Exception {
DynamoDbClient client = DynamoDbClient.builder()
.region(Region.EU_WEST_1)
.build();
List<String> ListValues = new ArrayList<>();
try {
ScanRequest scanRequest = ScanRequest.builder()
.tableName(tableName)
.build();
ScanResponse response = client.scan(scanRequest);
for (Map<String, AttributeValue> item : response.items()){
Set<String> keys = item.keySet();
for (String key : keys) {
if (tableColumn.equals(key)) { // compare string contents with equals(), not ==
ListValues.add(item.get(key).s()) ;
}
}
}
//To check what is being returned, comment out below
// System.out.println(ListValues);
} catch (DynamoDbException e){
e.printStackTrace();
System.exit(1);
}
client.close();
return ListValues;
}
}
I also have a JUnit test created for that class:
public class fetchCMDBTest {
// Define the data members required for the test
private static String tableName = "";
private static String tableColumn = "";
@BeforeAll
public static void setUp() throws IOException {
// Run tests on Real AWS Resources
try (InputStream input = fetchCMDBTest.class.getClassLoader().getResourceAsStream("config.properties")) {
Properties prop = new Properties();
if (input == null) {
System.out.println("Sorry, unable to find config.properties");
return;
}
//load a properties file from class path, inside static method
prop.load(input);
// Populate the data members required for all tests
tableName = prop.getProperty("environment_list");
tableColumn = prop.getProperty("env_name");
} catch (IOException ex) {
ex.printStackTrace();
}
}
@Test
void fetchCMDBtable() throws Exception{
try {
fetchCmdbColumn.CMDB(tableName, tableColumn);
System.out.println("Test 1 passed");
} catch (Exception e) {
System.out.println("Test 1 failed!");
e.printStackTrace();
}
}
}
When I run the test using mvn test I get the error:
software.amazon.awssdk.core.exception.SdkClientException: Multiple HTTP implementations were found on the classpath,
even though I have only declared the client builder once in the class.
What am I missing?
I run the unit tests from the IntelliJ IDE. I find using the IDE works better than the command line. Once I set up the config.properties file that contains the values for the tests and run them, all tests pass.
In fact, we test all Java V2 code examples in this manner to ensure they all work.
I also tested all DynamoDB examples from the command line using mvn test. All passed.
Amend your test to build a single instance of the DynamoDB client and then, as your first test, make sure it was created successfully. See if this works for you. Once you get this working, add more tests!
public class DynamoDBTest {
private static DynamoDbClient ddb;
@BeforeAll
public static void setUp() throws IOException {
// Run tests on Real AWS Resources
Region region = Region.US_WEST_2;
ddb = DynamoDbClient.builder().region(region).build();
try (InputStream input = DynamoDBTest.class.getClassLoader().getResourceAsStream("config.properties")) {
Properties prop = new Properties();
if (input == null) {
System.out.println("Sorry, unable to find config.properties");
return;
}
//load a properties file from class path, inside static method
prop.load(input);
} catch (IOException ex) {
ex.printStackTrace();
}
}
@Test
@Order(1)
public void whenInitializingAWSService_thenNotNull() {
assertNotNull(ddb);
System.out.println("Test 1 passed");
    }
}
It turns out my pom file contained other HTTP clients, so I had to remove the likes of:
<dependency>
    <groupId>software.amazon.awssdk</groupId>
    <artifactId>s3</artifactId>
    <exclusions>
        <exclusion>
            <groupId>software.amazon.awssdk</groupId>
            <artifactId>netty-nio-client</artifactId>
        </exclusion>
        <exclusion>
            <groupId>software.amazon.awssdk</groupId>
            <artifactId>apache-client</artifactId>
        </exclusion>
    </exclusions>
</dependency>
and replaced them with:
<dependency>
    <groupId>software.amazon.awssdk</groupId>
    <artifactId>aws-crt-client</artifactId>
    <version>2.14.13-PREVIEW</version>
</dependency>
as mentioned in https://aws.amazon.com/blogs/developer/introducing-aws-common-runtime-http-client-in-the-aws-sdk-for-java-2-x/
As a complement to the other answers: for me, only option 4 from the reference worked.
Option 4: Change the default HTTP client using a system property in Java code.
I defined it in the setUp() method of my integration test using JUnit 5.
@BeforeAll
public static void setUp() {
System.setProperty(
SdkSystemSetting.SYNC_HTTP_SERVICE_IMPL.property(),
"software.amazon.awssdk.http.apache.ApacheSdkHttpService");
}
And because I am using Gradle:
implementation ("software.amazon.awssdk:s3:${awssdk2Version}") {
exclude group: 'software.amazon.awssdk', module: 'netty-nio-client'
exclude group: 'software.amazon.awssdk', module: 'apache-client'
}
implementation "software.amazon.awssdk:aws-crt-client:2.17.71-PREVIEW"
I am working on a Java project that runs in Azure Functions. The problem is that I can't make Java CDI 2.0 work in the application.
Please refer to the application codes below.
Function.java
public class Function {
@Inject
private Util util;
/**
* This function listens at endpoint "/api/HttpTrigger-Java". Two ways to invoke it using "curl" command in bash:
* 1. curl -d "HTTP Body" {your host}/api/HttpTrigger-Java&code={your function key}
* 2. curl "{your host}/api/HttpTrigger-Java?name=HTTP%20Query&code={your function key}"
* Function Key is not needed when running locally, to invoke HttpTrigger deployed to Azure, see here(https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-http-webhook#authorization-keys) on how to get function key for your app.
*/
@FunctionName("HttpTrigger-Java")
public HttpResponseMessage run(
@HttpTrigger(name = "req", methods = {HttpMethod.GET, HttpMethod.POST}, authLevel = AuthorizationLevel.FUNCTION) HttpRequestMessage<Optional<String>> request,
final ExecutionContext context) {
context.getLogger().info("Java HTTP trigger processed a request.");
// Parse query parameter
String query = request.getQueryParameters().get("name");
String name = request.getBody().orElse(query);
util.display();
if (name == null) {
return request.createResponseBuilder(HttpStatus.BAD_REQUEST).body("Please pass a name on the query string or in the request body").build();
} else {
return request.createResponseBuilder(HttpStatus.OK).body("Hello, " + name).build();
}
}
}
Util.java
@RequestScoped
public class Util {
public void display(){
System.out.println("testing..");
}
}
I have this in my pom.xml
<dependency>
    <groupId>javax.enterprise</groupId>
    <artifactId>cdi-api</artifactId>
    <version>2.0</version>
</dependency>
<dependency>
    <groupId>javax.inject</groupId>
    <artifactId>javax.inject</artifactId>
    <version>1</version>
</dependency>
When I deploy this and hit the endpoint, I get a NullPointerException when accessing the method on the injected bean.
Can someone enlighten me regarding this matter?
I have an issue here that I'm hoping to resolve. First, when I call the Cloud Translate service with source and target languages, I encounter the following error:
java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
at com.google.cloud.translate.TranslateImpl.optionMap(TranslateImpl.java:131)
at com.google.cloud.translate.TranslateImpl.access$000(TranslateImpl.java:40)
at com.google.cloud.translate.TranslateImpl$4.call(TranslateImpl.java:113)
at com.google.cloud.translate.TranslateImpl$4.call(TranslateImpl.java:110)
This is what I'm doing:
protected Translate getTranslationServiceClient() throws IOException {
if (translationServiceClient == null) {
synchronized (this) {
if (translationServiceClient == null) {
try (InputStream is = new FileInputStream(new File(getCredentialFilePath()))) {
final GoogleCredentials myCredentials = GoogleCredentials.fromStream(is);
translationServiceClient = TranslateOptions.newBuilder().setCredentials(myCredentials).build().getService();
} catch (IOException ioe) {
throw new NuxeoException(ioe);
}
}
}
}
return translationServiceClient;
}
public TranslationResponse translateText(String text, String sourceLanguage, String targetLanguage) throws IOException {
Translation response = translationService.translate(text, TranslateOption.sourceLanguage("en"), TranslateOption.sourceLanguage("es"));
//System.out.println(response.getTranslatedText());
GoogleTranslationResponse gtr = new GoogleTranslationResponse(response);
return gtr;
}
The error points to the Cloud library's TranslateImpl.optionMap method, and the NoSuchMethodError is thrown on checkArgument. Am I passing the TranslateOptions incorrectly?
private Map<TranslateRpc.Option, ?> optionMap(Option... options) {
Map<TranslateRpc.Option, Object> optionMap = Maps.newEnumMap(TranslateRpc.Option.class);
for (Option option : options) {
Object prev = optionMap.put(option.getRpcOption(), option.getValue());
checkArgument(prev == null, "Duplicate option %s", option);
}
return optionMap;
}
In an effort to get any kind of response from the API, I've tried calling the service without passing any options, or with just the targetLanguage. Without any options, I don't get any errors and my text is translated into English, as expected. If I just add TranslateOption.targetLanguage("es"), I still get the NoSuchMethodError.
I had this exact same error. The problem was an ancient version of Google Guava being brought in by some other dependency. I found this by running mvn dependency:tree. I had to exclude the ancient version of Guava like this:
<exclusions>
    <exclusion>
        <groupId>com.google.guava</groupId>
        <artifactId>guava</artifactId>
    </exclusion>
</exclusions>
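Note that the exclusions element must sit inside whichever dependency is dragging the old Guava in; running mvn dependency:tree -Dincludes=com.google.guava:guava will show you which one that is. A sketch (the outer artifact is a placeholder; substitute the offender from your own tree):
<dependency>
    <groupId>com.example</groupId>
    <artifactId>offending-artifact</artifactId>
    <version>1.0</version>
    <exclusions>
        <exclusion>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
        </exclusion>
    </exclusions>
</dependency>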
I am trying to write a custom stream grouping which writes to MongoDB. I am running a local cluster for now. I have a custom stream grouping class and a Mongo object. I write to MongoDB in both prepare() and chooseTasks(). It writes to MongoDB, but the supervisors cannot start. I see this error in the supervisor log:
b.s.d.worker [ERROR] Error on initialization of server mk-worker
java.lang.NoClassDefFoundError: com/mongodb/MongoClient
at storm.starter.MongoMonitorObject.<init>(MongoMonitorObject.java:23) ~[stormjar.jar:0.10.0]
at storm.starter.ModStreamGrouping.prepare(ModStreamGrouping.java:94)
~[stormjar.jar:0.10.0]
I am making changes in the storm starter project for now.
public class ModStreamGrouping implements CustomStreamGrouping, Serializable {

    private List<Integer> targetTasks = new ArrayList<>();
    private int numTasks;

    @Override
    public List<Integer> chooseTasks(int taskId, List<Object> values) {
        System.out.println("taskId = " + taskId);
        System.out.println("values = " + values);
        // Route every tuple to the first target task for now
        return Arrays.asList(targetTasks.get(0));
    }

    @Override
    public void prepare(WorkerTopologyContext context, GlobalStreamId stream, List<Integer> targetTasks) {
        MongoMonitorObject mmo = new MongoMonitorObject(targetTasks);
        System.out.println(" in prep() ");
        System.out.println("targetTasks = " + targetTasks);
        this.targetTasks = targetTasks;
        numTasks = targetTasks.size();
    }
}
public class MongoMonitorObject {
private static final Logger LOG = LoggerFactory.getLogger(MongoMonitorObject.class);
public MongoMonitorObject(java.util.List<java.lang.Integer> targetTasks){
try{
MongoClient mongoClient = new MongoClient("localhost", 27017);
DB db = mongoClient.getDB( "loadDB" );
DBCollection collection = db.getCollection("testCollection");
for (Integer task : targetTasks) {
BasicDBObject document = new BasicDBObject();
document.put("tid", task);
collection.insert(document);
}
}
catch (UnknownHostException e) {
System.out.println(" in UnknownHostException ");
LOG.info(" in UnknownHostException ");
}
catch (Exception e) {
System.out.println(" in Exception ");
LOG.info(" in Exception ");
}
}
}
The stream grouping is defined in ModStreamGrouping.java and the Mongo connection in MongoMonitorObject.java. Both belong to the package storm.starter.
I can upload the topology, but the supervisors cannot spawn workers. There's a small link I'm missing somewhere, but I don't know where exactly. I added the following to storm starter's pom.xml to include MongoDB connectivity:
<dependency>
    <groupId>org.mongodb</groupId>
    <artifactId>mongo-java-driver</artifactId>
    <version>2.13.3</version>
</dependency>
<dependency>
    <groupId>org.mongodb</groupId>
    <artifactId>bson</artifactId>
    <version>2.13.3</version>
</dependency>
Edit:
I read it here: https://github.com/mongodb/mongo-java-driver
mongo-java-driver is an all-in-one jar; it contains bson and core. Therefore the mongo-java-driver dependency alone is enough. If you use the mongodb-driver dependency instead, the bson and mongodb-driver-core dependencies are also needed.
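For example, the all-in-one artifact on its own would be declared like this (version chosen to match the 3.x line suggested below; adjust as needed):
<dependency>
    <groupId>org.mongodb</groupId>
    <artifactId>mongo-java-driver</artifactId>
    <version>3.2.2</version>
</dependency>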
Original post:
Try adding mongodb-driver-core and use the newer versions of the dependencies:
<dependency>
    <groupId>org.mongodb</groupId>
    <artifactId>mongodb-driver-core</artifactId>
    <version>3.2.2</version>
</dependency>
The MongoDB Java Driver uber-artifact, containing mongodb-driver, mongodb-driver-core, and bson
Check it here