I am working on a Java project that runs in Azure Functions. The problem is that I can't get Java CDI 2.0 to work in the application.
Please refer to the application code below.
Function.java
public class Function {
@Inject
private Util util;
/**
* This function listens at endpoint "/api/HttpTrigger-Java". Two ways to invoke it using "curl" command in bash:
* 1. curl -d "HTTP Body" {your host}/api/HttpTrigger-Java&code={your function key}
* 2. curl "{your host}/api/HttpTrigger-Java?name=HTTP%20Query&code={your function key}"
* Function Key is not needed when running locally, to invoke HttpTrigger deployed to Azure, see here(https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-http-webhook#authorization-keys) on how to get function key for your app.
*/
@FunctionName("HttpTrigger-Java")
public HttpResponseMessage run(
@HttpTrigger(name = "req", methods = {HttpMethod.GET, HttpMethod.POST}, authLevel = AuthorizationLevel.FUNCTION) HttpRequestMessage<Optional<String>> request,
final ExecutionContext context) {
context.getLogger().info("Java HTTP trigger processed a request.");
// Parse query parameter
String query = request.getQueryParameters().get("name");
String name = request.getBody().orElse(query);
util.display();
if (name == null) {
return request.createResponseBuilder(HttpStatus.BAD_REQUEST).body("Please pass a name on the query string or in the request body").build();
} else {
return request.createResponseBuilder(HttpStatus.OK).body("Hello, " + name).build();
}
}
}
Util.java
@RequestScoped
public class Util {
public void display(){
System.out.println("testing..");
}
}
I have this in my pom.xml
<dependency>
<groupId>javax.enterprise</groupId>
<artifactId>cdi-api</artifactId>
<version>2.0</version>
</dependency>
<dependency>
<groupId>javax.inject</groupId>
<artifactId>javax.inject</artifactId>
<version>1</version>
</dependency>
When I deploy this and hit the endpoint, I get a NullPointerException when accessing the method on the injected bean.
Can someone enlighten me regarding this matter?
I am trying to create a user entity along with a data/file (PDF format). Uploading and saving to the database works fine, but when I fetch the user in Postman with a GET request, the data field shows some terrible data, and I also cannot see my PDF file in my database.
pom.xml
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
<java.version>1.8</java.version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
<exclusions>
<exclusion>
<groupId>org.junit.vintage</groupId>
<artifactId>junit-vintage-engine</artifactId>
</exclusion>
</exclusions>
</dependency>
</dependencies>
Model Class
@Entity(name = "employee")
public class Employee {
@Id
@GeneratedValue
private Integer id;
@Column(nullable = false, length = 30)
private String name;
@Column(nullable = false, length = 30)
private String university;
@Column(nullable = false, length = 30)
private String department;
@Column(nullable = false, length = 2)
private Integer year_of_experience;
@Lob
private byte[] data;
Controller Class
@GetMapping
public List<Employee> getAllEmployee(){
return employeeService.getAllEmployee();
}
@GetMapping("{id}")
public Employee getEmployeeById(@PathVariable Integer id){
return employeeService.getEmployeeById(id);
}
@PostMapping(consumes = MediaType.MULTIPART_FORM_DATA_VALUE,
produces = MediaType.APPLICATION_JSON_VALUE)
public void addEmployee(@RequestParam("file") MultipartFile file, @RequestParam("emp") String emp) throws IOException {
ObjectMapper objectMapper = new ObjectMapper();
Employee employee = objectMapper.readValue(emp,Employee.class);
employeeService.addEmployee(employee,file);
}
Service Implementation Class
@Override
public void addEmployee(Employee employee, MultipartFile file) throws IOException {
for (int i = 0; i < employee.getExperienceList().size(); i++) {
Experience experience = employee.getExperienceList().get(i);
experienceService.addExperience(experience);
}
byte[] temp = file.getBytes();
InputStream inputStream = new ByteArrayInputStream(temp);
employee.setData(temp);
employeeRepostitory.save(employee);
}
Postman view when getting the employee:
The POST method is set with "form-data" in the body: one key is selected as a file and the other key holds my user entity as text. I also tried content type (application/json) but it is not working. How can I convert this unexpected data into a PDF file or some standard format I can view?
"id": 62,
"name": "raj",
"university": "ewu",
"department": "bba",
"year_of_experience": 3,
"data": "JVBERi0xLjcNCiW1tbW1DQoxIDAgb2JqDQo8PC9UeXBlL0NhdGFsb2cvUGFnZXMgMiAwIFIvTGFuZyhlbi1VUykgL1N0cnVjdFRyZWVSb290IDkgMCBSL01hcmtJbmZvPDwvTWFya2VkIHRydWU+Pi9NZXRhZGF0YSAyNSAwIFIvVmlld2VyUHJlZmVyZW5jZXMgMjYgMCBSPj4NCmVuZG9iag0KMiAwIG9iag0KPDwvVHlwZS9QYWdlcy9Db3VudCAxL0tpZHNbIDMgMCBSXSA+Pg0KZW5kb2JqDQozIDAgb2JqDQo8PC9UeXBlL1BhZ2UvUGFyZW50IDIgMCBSL1Jlc291cmNlczw8L0ZvbnQ8PC9GMSA1IDAgUj4+L0V4dEdTdGF0ZTw8L0dTNyA3IDAgUi9HUzggOCAwIFI+Pi9Qcm9jU2V0Wy9QREYvVGV4dC9JbWFnZUIvSW1hZ2VDL0ltYWdlSV0gPj4vTWVkaWFCb3hbIDAgMCA2MTIgNzkyXSAvQ29udGVudHMgNCAwIFIvR3JvdXA8PC9UeXBlL0dyb3VwL1MvVHJhbnNwYXJlbmN5L0NTL0RldmljZVJHQj4+L1RhYnMvUy9TdHJ1Y3RQYXJlbnRzI
You should take a look at the community project Spring Content. This project gives you a Spring Data-like approach to content. It is for unstructured data (documents, images, videos, etc), what Spring Data is for structured data.
You could add it to your project with something like this:
pom.xml
<!-- Java API -->
<dependency>
<groupId>com.github.paulcwarren</groupId>
<artifactId>spring-content-jpa-boot-starter</artifactId>
<version>1.0.0.M9</version>
</dependency>
<!-- REST API -->
<dependency>
<groupId>com.github.paulcwarren</groupId>
<artifactId>spring-content-rest-boot-starter</artifactId>
<version>1.0.0.M9</version>
</dependency>
Configuration
@Configuration
@EnableJpaStores
// enable REST API
@Import(org.springframework.content.rest.config.RestConfiguration.class)
public class ContentConfig {
}
NB: technically this configuration is not needed when using the Spring Boot starters, but included for clarity
To associate content, add Spring Content annotations to your Employee entity.
Employee.java
@Entity
public class Employee {
// replace the @Lob field with:
@ContentId
private String contentId;
@ContentLength
private long contentLength = 0L;
@MimeType
private String mimeType = "application/pdf";
Create a "store":
EmployeeContentStore.java
@StoreRestResource
public interface EmployeeContentStore extends ContentStore<Employee, String> {
}
This is all you need to create REST endpoints for handling your employee's content under the URI /employees. When your application starts, Spring Content will look at your dependencies (seeing Spring Content JPA and REST), look at your EmployeeContentStore interface, and inject an implementation of that interface for JPA. It will also inject an @Controller that forwards HTTP requests to that implementation. This saves you having to implement any of this yourself. So, very similar in programming model and operation to Spring Data.
Then...
curl -X POST /employees/{employeeId} -H "Content-Type: application/pdf" -F "file=@/path/to/local/file.pdf"
will store the content of /path/to/local/file.pdf in the database and associate it with the employee entity whose id is employeeId.
curl /employees/{employeeId} -H "Accept: application/pdf"
will fetch it again and so on...supports full CRUD.
Since you are associating the content with your employee entity, when you get your json responses back from the Spring Data endpoints, they will provide the content-related metadata, but NOT the "terrible data" that you are seeing with your current approach.
There are a couple of getting started guides and videos here. The reference guide is here.
HTH
I am trying to build a Java-based Azure Function that is triggered by HTTP and sends data to a Service Bus topic. This is based on the sample code below. The HTTP trigger works fine ("Hello, name" is returned), but no data is sent to the Service Bus, and I do not get an error message. I have verified with a C#-based Function that the "queueconstring" in local.settings.json is correct.
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-service-bus-output?tabs=java
/* 20.3.2020 HTTP Trigger and Topic output binding*/
package com.function;
import java.util.*;
import com.microsoft.azure.functions.annotation.*;
import com.microsoft.azure.functions.*;
/**
* Azure Functions with HTTP Trigger.
*/
public class HttpTriggerSBOutputJava {
/**
* This function listens at endpoint "/api/HttpTriggerSBOutputJava". Two ways to invoke it using
"curl" command in bash:
* 1. curl -d "HTTP Body" {your host}/api/HttpTriggerSBOutputJava
* 2. curl {your host}/api/HttpTriggerSBOutputJava?name=HTTP%20Query
*/
@FunctionName("HttpTriggerSBOutputJava")
public HttpResponseMessage run(
@HttpTrigger(name = "req", methods = {HttpMethod.GET, HttpMethod.POST}, authLevel = AuthorizationLevel.ANONYMOUS) HttpRequestMessage<Optional<String>> request,
@ServiceBusTopicOutput(name = "message", topicName = "ContactInformationChanged", subscriptionName = "Playground", connection = "queueconstring") OutputBinding<String> message,
final ExecutionContext context) {
String name = request.getBody().orElse("Azure Functions");
message.setValue(name);
return request.createResponseBuilder(HttpStatus.OK).body("Hello, " + name).build();
}
}
You can use the code below; it works fine on my side:
Function.java
package com.function;
import java.util.*;
import com.microsoft.azure.functions.annotation.*;
import com.microsoft.azure.functions.*;
/**
* Azure Functions with HTTP Trigger.
*/
public class Function {
/**
* This function listens at endpoint "/api/HttpExample". Two ways to invoke it using "curl" command in bash:
* 1. curl -d "HTTP Body" {your host}/api/HttpExample
* 2. curl "{your host}/api/HttpExample?name=HTTP%20Query"
*/
@FunctionName("HttpExample")
@ServiceBusTopicOutput(name = "message", topicName = "test", subscriptionName = "test", connection = "ServiceBusConnection")
public HttpResponseMessage run(
@HttpTrigger(name = "req", methods = {HttpMethod.GET, HttpMethod.POST}, authLevel = AuthorizationLevel.ANONYMOUS) HttpRequestMessage<Optional<String>> request,
final ExecutionContext context) {
context.getLogger().info("Java HTTP trigger processed a request.");
// Parse query parameter
String query = request.getQueryParameters().get("name");
String name = request.getBody().orElse(query);
if (name == null) {
return request.createResponseBuilder(HttpStatus.BAD_REQUEST).body("Please pass a name on the query string or in the request body").build();
} else {
return request.createResponseBuilder(HttpStatus.OK).body("Hello, " + name).build();
}
}
}
local.settings.json
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "",
"FUNCTIONS_WORKER_RUNTIME": "java",
"ServiceBusConnection":"Endpoint=sb://bowmantest.servicebus.windows.net/;SharedAccessKeyName=test;SharedAccessKey=xxxxxxxxx"
}
}
I want to start up a REST webservice using Jersey. Multiple tutorials stated that this is the way to go:
ResourceConfig resourceConfig = new PackagesResourceConfig(TestSystem.class.getPackageName());
HttpServer server = HttpServerFactory.create("http://localhost:8080/rest", resourceConfig);
server.start();
With a declaration of
<dependency>
<groupId>com.sun.jersey</groupId>
<artifactId>jersey-server</artifactId>
<version>1.19.4</version>
</dependency>
However I get:
java.lang.IllegalArgumentException
at jersey.repackaged.org.objectweb.asm.ClassReader.<init>(ClassReader.java:170)
at jersey.repackaged.org.objectweb.asm.ClassReader.<init>(ClassReader.java:153)
at jersey.repackaged.org.objectweb.asm.ClassReader.<init>(ClassReader.java:424)
at com.sun.jersey.spi.scanning.AnnotationScannerListener.onProcess(AnnotationScannerListener.java:138)
at com.sun.jersey.core.spi.scanning.uri.FileSchemeScanner$1.f(FileSchemeScanner.java:86)
at com.sun.jersey.core.util.Closing.f(Closing.java:71)
at com.sun.jersey.core.spi.scanning.uri.FileSchemeScanner.scanDirectory(FileSchemeScanner.java:83)
at com.sun.jersey.core.spi.scanning.uri.FileSchemeScanner.scan(FileSchemeScanner.java:71)
at com.sun.jersey.core.spi.scanning.PackageNamesScanner.scan(PackageNamesScanner.java:226)
at com.sun.jersey.core.spi.scanning.PackageNamesScanner.scan(PackageNamesScanner.java:142)
(Even when I don't use the resourceConfig.)
Naturally my first instinct was that something must be wrong with Jersey, so I tried 1.18.1:
java.lang.ArrayIndexOutOfBoundsException: Index 52264 out of bounds for length 287
at jersey.repackaged.org.objectweb.asm.ClassReader.readClass(ClassReader.java:1976)
at jersey.repackaged.org.objectweb.asm.ClassReader.accept(ClassReader.java:464)
at jersey.repackaged.org.objectweb.asm.ClassReader.accept(ClassReader.java:420)
at com.sun.jersey.spi.scanning.AnnotationScannerListener.onProcess(AnnotationScannerListener.java:138)
at com.sun.jersey.core.spi.scanning.JarFileScanner.scan(JarFileScanner.java:97)
at com.sun.jersey.core.spi.scanning.JarFileScanner$1.f(JarFileScanner.java:74)
at com.sun.jersey.core.util.Closing.f(Closing.java:71)
For everything above 1.19 I get the first exception; for everything below, the second one. So maybe something is wrong with my web service:
@ApplicationPath(TestSystem.URL)
public class TestSystem extends Application {
public static final String URL = "/testsystem/";
}
@Path(HelloWorldEndpoint.URL)
public class HelloWorldEndpoint {
public static final String URL = "hello";
@GET
@Produces(MediaType.TEXT_PLAIN)
@Path("/")
public String greet() {
return "Hello World!";
}
}
How do I start my REST webservice using Jersey?
I am trying to run a PDI transformation involving a database (any database, though NoSQL ones are preferred) from Java.
I've tried using MongoDB and CassandraDB and got missing-plugin errors; I've already asked here: Running PDI Kettle on Java - Mongodb Step Missing Plugins, but no one has replied yet.
I've tried switching to a SQL database using PostgreSQL too, but it still doesn't work. From the research I did, I think it is because I didn't connect the database from Java properly, yet I haven't found any tutorial or direction that works for me. I've tried following the directions from this blog: http://ameethpaatil.blogspot.co.id/2010/11/pentaho-data-integration-java-maven.html, but still got some problems about a repository (because I don't have one and it seems to be required).
The transformations run fine from Spoon. They only fail when I run them from Java.
Can anyone help me run a PDI transformation involving a database? Where did I go wrong?
Has anyone ever succeeded in running a PDI transformation involving either a NoSQL or SQL database from Java? What DB did you use?
I'm sorry if I asked too many questions; I am quite desperate. Any kind of information will be much appreciated. Thank you.
Executing PDI jobs from Java is pretty straightforward. You just need to import all the necessary jar files (for the databases) and then call the Kettle classes. The best way is obviously to use Maven to manage the dependencies. In the Maven pom.xml file, just declare the database drivers.
A sample Maven file would be something like the one below, assuming you are using Pentaho v5.0.0 GA and PostgreSQL as the database:
<dependencies>
<!-- Pentaho Kettle Core dependencies development -->
<dependency>
<groupId>pentaho-kettle</groupId>
<artifactId>kettle-core</artifactId>
<version>5.0.0.1</version>
</dependency>
<dependency>
<groupId>pentaho-kettle</groupId>
<artifactId>kettle-dbdialog</artifactId>
<version>5.0.0.1</version>
</dependency>
<dependency>
<groupId>pentaho-kettle</groupId>
<artifactId>kettle-engine</artifactId>
<version>5.0.0.1</version>
</dependency>
<dependency>
<groupId>pentaho-kettle</groupId>
<artifactId>kettle-ui-swt</artifactId>
<version>5.0.0.1</version>
</dependency>
<dependency>
<groupId>pentaho-kettle</groupId>
<artifactId>kettle5-log4j-plugin</artifactId>
<version>5.0.0.1</version>
</dependency>
<!-- The database dependency files. Use it if your kettle file involves database connectivity. -->
<dependency>
<groupId>postgresql</groupId>
<artifactId>postgresql</artifactId>
<version>9.1-902.jdbc4</version>
</dependency>
</dependencies>
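With those dependencies in place, a minimal sketch of invoking a transformation from Java could look like the one below (the .ktr path is illustrative, and it assumes the database connection is defined inside the transformation itself):
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.exception.KettleException;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

public class RunTransformation {
    public static void main(String[] args) throws KettleException {
        // Boot the Kettle engine (registers core steps, plugins and database types)
        KettleEnvironment.init();
        // Load the transformation metadata from a .ktr file created in Spoon
        TransMeta transMeta = new TransMeta("path/to/your_transformation.ktr");
        // Create and execute the transformation, then wait for all steps to finish
        Trans trans = new Trans(transMeta);
        trans.execute(null);
        trans.waitUntilFinished();
        if (trans.getErrors() > 0) {
            throw new KettleException("Errors occurred while executing the transformation");
        }
    }
}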
You can check my blog for more. It works for database connections.
Hope this helps :)
I had the same problem in an application using the Pentaho libraries. I resolved the problem with this code:
The singleton to initialize Kettle:
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.exception.KettleException;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
* Initializes the Kettle environment configuration
*
* @author Marcos Souza
* @version 1.0
*
*/
public class AtomInitKettle {
private static final Logger LOGGER = LoggerFactory.getLogger(AtomInitKettle.class);
private AtomInitKettle() throws KettleException {
try {
LOGGER.info("Iniciando kettle");
KettleJNDI.protectSystemProperty();
KettleEnvironment.init();
LOGGER.info("Kettle iniciado com sucesso");
} catch (Exception e) {
LOGGER.error("Message: {} Cause {} ", e.getMessage(), e.getCause());
}
}
}
And the code that saved me:
import java.io.File;
import java.util.Properties;
import org.pentaho.di.core.Const;
import org.pentaho.di.core.exception.KettleException;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class KettleJNDI {
private static final Logger LOGGER = LoggerFactory.getLogger(KettleJNDI.class);
public static final String SYS_PROP_IC = "java.naming.factory.initial";
private static boolean init = false;
private KettleJNDI() {
}
public static void initJNDI() throws KettleException {
String path = Const.JNDI_DIRECTORY;
LOGGER.info("Kettle Const.JNDI_DIRECTORY= {}", path);
if (path == null || path.equals("")) {
try {
File file = new File("simple-jndi");
path = file.getCanonicalPath();
} catch (Exception e) {
throw new KettleException("Error initializing JNDI", e);
}
Const.JNDI_DIRECTORY = path;
LOGGER.info("Kettle null > Const.JNDI_DIRECTORY= {}", path);
}
System.setProperty("java.naming.factory.initial", "org.osjava.sj.SimpleContextFactory");
System.setProperty("org.osjava.sj.root", path);
System.setProperty("org.osjava.sj.delimiter", "/");
}
public static void protectSystemProperty() {
if (init) {
return;
}
System.setProperties(new ProtectionProperties(SYS_PROP_IC, System.getProperties()));
if (LOGGER.isInfoEnabled()) {
LOGGER.info("Kettle System Property Protector: System.properties replaced by custom properies handler");
}
init = true;
}
public static class ProtectionProperties extends Properties {
private static final long serialVersionUID = 1L;
private final String protectedKey;
public ProtectionProperties(String protectedKey, Properties prprts) {
super(prprts);
if (protectedKey == null) {
throw new IllegalArgumentException("Properties protection was provided a null key");
}
this.protectedKey = protectedKey;
}
#Override
public synchronized Object setProperty(String key, String value) {
// We forbid changes in general, but do it silently ...
if (protectedKey.equals(key)) {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Kettle System Property Protector: Protected change to '" + key + "' with value '" + value + "'");
}
return super.getProperty(protectedKey);
}
return super.setProperty(key, value);
}
}
}
I think your problem is with the database connection. You can configure it in the transformation and do not need to use JNDI.
public class DatabaseMetaStep {
private static final Logger LOGGER = LoggerFactory.getLogger(DatabaseMetaStep.class);
/**
* Creates the database access configuration
*
* @return the configured DatabaseMeta
*/
public static DatabaseMeta createDatabaseMeta() {
DatabaseMeta databaseMeta = new DatabaseMeta();
LOGGER.info("Carregando informacoes de acesso");
databaseMeta.setHostname("localhost");
databaseMeta.setName("stepName");
databaseMeta.setUsername("user");
databaseMeta.setPassword("password");
databaseMeta.setDBPort("port");
databaseMeta.setDBName("database");
databaseMeta.setDatabaseType("MonetDB"); // sql, MySql ...
databaseMeta.setAccessType(DatabaseMeta.TYPE_ACCESS_NATIVE);
return databaseMeta;
}
}
Then you need to set the databaseMeta on the TransMeta:
DatabaseMeta databaseMeta = DatabaseMetaStep.createDatabaseMeta();
TransMeta transMeta = new TransMeta();
transMeta.setUsingUniqueConnections(true);
transMeta.setName("transMetaName");
List<DatabaseMeta> databases = new ArrayList<>();
databases.add(databaseMeta);
transMeta.setDatabases(databases);
I tried your code with a "transformation without JNDI" and it works!
But I needed to add this repository to my pom.xml:
<repositories>
<repository>
<id>pentaho-releases</id>
<url>http://repository.pentaho.org/artifactory/repo/</url>
</repository>
</repositories>
Also, when I try with a datasource I get this error: Cannot instantiate class: org.osjava.sj.SimpleContextFactory [Root exception is java.lang.ClassNotFoundException: org.osjava.sj.SimpleContextFactory]
Complete log here:
https://gist.github.com/eb15f8545e3382351e20.git
[FIX]: Add this dependency:
<dependency>
<groupId>pentaho</groupId>
<artifactId>simple-jndi</artifactId>
<version>1.0.1</version>
</dependency>
After that a new error occurs:
transformation_with_jndi - Dispatching started for transformation [transformation_with_jndi]
Table input.0 - ERROR (version 5.0.0.1.19046, build 1 from 2013-09-11_13-51-13 by buildguy) : An error occurred, processing will be stopped:
Table input.0 - Error occured while trying to connect to the database
Table input.0 - java.io.File parameter must be a directory. [D:\opt\workspace-eclipse\invoke-ktr-jndi\simple-jndi]
Complete log : https://gist.github.com/jrichardsz/9d74c7263f3567ac4b45
[EXPLANATION] This is because inside
KettleEnvironment.init();
https://github.com/jrichardsz/pentaho-pdi-spoon-usefull-templates/blob/master/running-etl-transformation-using-java/researching-pentaho-classes/KettleEnvironment.java
there is an initialization:
if (simpleJndi) {
JndiUtil.initJNDI();
}
And in JndiUtil:
String path = Const.JNDI_DIRECTORY;
if ((path == null) || (path.equals("")))
https://github.com/jrichardsz/pentaho-pdi-spoon-usefull-templates/blob/master/running-etl-transformation-using-java/researching-pentaho-classes/JndiUtil.java
And in the Const class:
public static String JNDI_DIRECTORY = NVL(System.getProperty("KETTLE_JNDI_ROOT"), System.getProperty("org.osjava.sj.root"));
https://github.com/jrichardsz/pentaho-pdi-spoon-usefull-templates/blob/master/running-etl-transformation-using-java/researching-pentaho-classes/Const.java
So we need to set the variable KETTLE_JNDI_ROOT.
[FIX] A small change in your example: just add this
System.setProperty("KETTLE_JNDI_ROOT", jdbcPropertiesPath);
before
KettleEnvironment.init();
A complete example based on your code:
import java.io.File;
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.exception.KettleException;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;
public class ExecuteSimpleTransformationWithJndiDatasource {
public static void main(String[] args) {
String resourcesPath = (new File(".").getAbsolutePath())+"\\src\\main\\resources";
String ktr_path = resourcesPath+"\\transformation_with_jndi.ktr";
//KETTLE_JNDI_ROOT could be the simple-jndi folder in your pdi or spoon home.
//in this example, is the resources folder
String jdbcPropertiesPath = resourcesPath;
try {
/**
* Initialize the Kettle Environment
*/
System.setProperty("KETTLE_JNDI_ROOT", jdbcPropertiesPath);
KettleEnvironment.init();
/**
* Create a trans object to properly assign the ktr metadata.
*
* @filedb: The ktr file path to be executed.
*
*/
TransMeta metadata = new TransMeta(ktr_path);
Trans trans = new Trans(metadata);
// Execute the transformation
trans.execute(null);
trans.waitUntilFinished();
// checking for errors
if (trans.getErrors() > 0) {
System.out.println("Erroruting Transformation");
}
} catch (KettleException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
For a complete example check my github channel:
https://github.com/jrichardsz/pentaho-pdi-spoon-usefull-templates/tree/master/running-etl-transformation-using-java/invoke-transformation-from-java-jndi/src/main/resources
I have been writing some simple unit tests for a simple Spring web application. When I add the @JsonIgnore annotation on a getter method of a resource, the resulting JSON object does not include the corresponding JSON element. So when my unit test tries to check that this field is null (which is the expected behavior in my case, since I don't want the password to be available in the JSON object), the test runs into an exception:
java.lang.AssertionError: No value for JSON path: $.password, exception: No results for path: $['password']
This is the unit test method I wrote, testing the 'password' field with the is(nullValue()) matcher:
@Test
public void getUserThatExists() throws Exception {
User user = new User();
user.setId(1L);
user.setUsername("zobayer");
user.setPassword("123456");
when(userService.getUserById(1L)).thenReturn(user);
mockMvc.perform(get("/users/1"))
.andExpect(jsonPath("$.username", is(user.getUsername())))
.andExpect(jsonPath("$.password", is(nullValue())))
.andExpect(jsonPath("$.links[*].href", hasItem(endsWith("/users/1"))))
.andExpect(status().isOk())
.andDo(print());
}
I have also tried it with jsonPath().exists() which gets a similar exception stating that the path doesn't exist. I am sharing some more code snippets so that the whole situation becomes more readable.
The controller method I am testing looks something like this:
@RequestMapping(value="/users/{userId}", method= RequestMethod.GET)
public ResponseEntity<UserResource> getUser(@PathVariable Long userId) {
logger.info("Request arrived for getUser() with params {}", userId);
User user = userService.getUserById(userId);
if(user != null) {
UserResource userResource = new UserResourceAsm().toResource(user);
return new ResponseEntity<>(userResource, HttpStatus.OK);
} else {
return new ResponseEntity<>(HttpStatus.NOT_FOUND);
}
}
I am using the Spring HATEOAS resource assembler for converting entities to resource objects, and this is my resource class:
public class UserResource extends ResourceSupport {
private Long userId;
private String username;
private String password;
public Long getUserId() {
return userId;
}
public void setUserId(Long userId) {
this.userId = userId;
}
public String getUsername() {
return username;
}
public void setUsername(String username) {
this.username = username;
}
@JsonIgnore
public String getPassword() {
return password;
}
public void setPassword(String password) {
this.password = password;
}
}
I understand why this throws an exception, and in a way the test is successful in that it could not find the password field. But what I want is to run this test to ensure that the field is not present, or if present, contains a null value. How can I achieve this?
There is a similar post on Stack Overflow:
Hamcrest with MockMvc: check that key exists but value may be null
In my case, the field may be non-existent as well.
For the record, these are the versions of test packages I am using:
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<version>2.6.1</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-annotations</artifactId>
<version>2.6.1</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.6.1</version>
</dependency>
<dependency>
<groupId>com.jayway.jsonpath</groupId>
<artifactId>json-path</artifactId>
<version>2.0.0</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.jayway.jsonpath</groupId>
<artifactId>json-path-assert</artifactId>
<version>2.0.0</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.12</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-all</artifactId>
<version>1.10.19</version>
<scope>test</scope>
</dependency>
[EDIT]
To be more precise: say you have to write a test for an entity where you know some of the fields need to be null or empty, or should not even exist, and you don't actually go through the code to see whether a @JsonIgnore has been added on top of the property. You want your tests to pass; how can I do this?
Please feel free to tell me that this is not practical at all, but it would still be nice to know.
[EDIT]
The above test succeeds with the following older json-path dependencies:
<dependency>
<groupId>com.jayway.jsonpath</groupId>
<artifactId>json-path</artifactId>
<version>0.9.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.jayway.jsonpath</groupId>
<artifactId>json-path-assert</artifactId>
<version>0.9.1</version>
<scope>test</scope>
</dependency>
[EDIT] Found a quick fix that works with the latest version of jayway.jsonpath after reading the documentation of Spring's JSON path matcher:
.andExpect(jsonPath("$.password").doesNotExist())
I had the same problem with the newer version. It looks to me that the doesNotExist() function will verify that the key is not in the result:
.andExpect(jsonPath("$.password").doesNotExist())
There is a difference between a property that is present but has a null value, and a property that is not present at all.
If the test should fail only when there is a non-null value, use:
.andExpect(jsonPath("password").doesNotExist())
If the test should fail as soon as the property is present, even with a null value, use:
.andExpect(jsonPath("password").doesNotHaveJsonPath())
@JsonIgnore is behaving as expected, not producing the password in the JSON output, so how could you expect to test something that you are explicitly excluding from the output?
The line:
.andExpect(jsonPath("$.property", is("some value")));
or even a test that the property is null:
.andExpect(jsonPath("$.property").value(IsNull.nullValue()));
correspond to a json like:
{
...
"property": "some value",
...
}
where the important part is the left side, that is, the existence of "property".
Instead, @JsonIgnore does not produce the property in the output at all, so you cannot expect it either in the test or in the production output.
If you don't want the property in the output, that's fine, but then you can't expect it in the test.
If you want it empty in the output (both in prod and test), you want to create a static Mapper method in the middle that does not pass the value of the property to the JSON object:
Mapper.mapPersonToRest(User user) {//exclude the password}
and then your method would be:
@RequestMapping(value="/users/{userId}", method= RequestMethod.GET)
public ResponseEntity<UserResource> getUser(@PathVariable Long userId) {
logger.info("Request arrived for getUser() with params {}", userId);
User user = Mapper.mapPersonToRest(userService.getUserById(userId));
if(user != null) {
UserResource userResource = new UserResourceAsm().toResource(user);
return new ResponseEntity<>(userResource, HttpStatus.OK);
} else {
return new ResponseEntity<>(HttpStatus.NOT_FOUND);
}
}
At this point, if your expectations are for Mapper.mapPersonToRest to return a user with a null password, you can write a normal Unit test on this method.
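For illustration, here is a minimal sketch of such a mapper together with a plain JUnit test for it (the Mapper shown here is hypothetical and assumes your User entity exposes the usual getters and setters):
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertNull;

import org.junit.Test;

public class MapperTest {

    // Hypothetical mapper: copies only the fields that should be exposed over REST,
    // deliberately leaving the password out.
    static final class Mapper {
        static User mapPersonToRest(User user) {
            if (user == null) {
                return null;
            }
            User rest = new User();
            rest.setId(user.getId());
            rest.setUsername(user.getUsername());
            // the password is intentionally never copied
            return rest;
        }
    }

    @Test
    public void mapPersonToRestStripsPassword() {
        User user = new User();
        user.setId(1L);
        user.setUsername("zobayer");
        user.setPassword("123456");

        User rest = Mapper.mapPersonToRest(user);

        assertNull(rest.getPassword());
        assertEquals("zobayer", rest.getUsername());
    }
}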
P.S. Of course the password is encrypted in the DB, right? ;)
Use doesNotHaveJsonPath() to check that the property is not in the JSON body.
I wanted to reuse the same code for testing both when the parameter is supplied and when it is missing, and this is what I came up with:
@Test
void testEditionFoundInRequest() throws JsonProcessingException {
testEditionWithValue("myEdition");
}
@Test
void testEditionNotFoundInRequest() {
try {
testEditionWithValue(null);
throw new RuntimeException("Shouldn't pass");
} catch (AssertionError | JsonProcessingException e) {
var msg = e.getMessage();
assertTrue(msg.contains("No value at JSON path"));
}
}
void testEditionWithValue(String edition) throws JsonProcessingException {
var HOST = "fakeHost";
var restTemplate = new RestTemplate();
var myRestClientUsingRestTemplate = new MyRestClientUsingRestTemplate(HOST, restTemplate);
MockRestServiceServer mockServer;
ObjectMapper objectMapper = new ObjectMapper();
String id = "userId";
var mockResponse = "{}";
var request = new MyRequest.Builder(id).edition(edition).build();
mockServer = MockRestServiceServer.bindTo(restTemplate).bufferContent().build();
mockServer
.expect(method(POST))
// THIS IS THE LINE I'd like to say "NOT" found
.andExpect(jsonPath("$.edition").value(edition))
.andRespond(withSuccess(mockResponse, APPLICATION_JSON));
var response = myRestClientUsingRestTemplate.makeRestCall(request);
}