I am trying to work with a decision table in Excel format (*.xlsx). I don't want to put the spreadsheet in my application's resource folder, so I created a separate folder: folder/Discount.xls.
But when I run my program it throws a FileNotFoundException. It seems the spreadsheet cannot be loaded if it is not in the resource folder, i.e. inside the jar.
I would like to know whether the spreadsheet holding my rules can be loaded dynamically, so that I could pick one spreadsheet or another and hot-swap my rules. Is that possible?
Here is my code:
KieServices kieServices = KieServices.Factory.get();
File file = new File("folder/Discount.xls");
Resource resource = ResourceFactory.newFileResource(file);
KieFileSystem kieFileSystem = kieServices.newKieFileSystem().write(resource);
KieBuilder kieBuilder = kieServices.newKieBuilder(kieFileSystem);
kieBuilder.buildAll();
KieRepository kieRepository = kieServices.getRepository();
ReleaseId krDefaultReleaseId = kieRepository.getDefaultReleaseId();
KieContainer kieContainer = kieServices.newKieContainer(krDefaultReleaseId);
KieSession kieSession = kieContainer.newKieSession();
The previous code throws a runtime exception:
Exception in thread "main" java.lang.RuntimeException: Cannot find KieModule: org.default:artifact:1.0.0
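For reference, a commonly shown variant of the KieFileSystem approach (a sketch, not necessarily the only fix) is to write the resource under an explicit target path inside the virtual file system and take the release id from the built module; the path "src/main/resources/rules/Discount.xls" below is just an example:
KieServices kieServices = KieServices.Factory.get();
Resource resource = ResourceFactory.newFileResource(new File("folder/Discount.xls"));
// write the external file under a classpath-style path inside the KieFileSystem
KieFileSystem kieFileSystem = kieServices.newKieFileSystem()
        .write("src/main/resources/rules/Discount.xls", resource);
KieBuilder kieBuilder = kieServices.newKieBuilder(kieFileSystem);
kieBuilder.buildAll();
if (kieBuilder.getResults().hasMessages(Message.Level.ERROR)) {
    throw new IllegalStateException(kieBuilder.getResults().toString());
}
// use the release id of the module that was just built
KieContainer kieContainer = kieServices.newKieContainer(kieBuilder.getKieModule().getReleaseId());
KieSession kieSession = kieContainer.newKieSession();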
I found the solution to my problem. The spreadsheet can be in another folder than the Java resource folder. My test class is the following:
public class Main {
private static InternalKnowledgeBase createKnowledgeBaseFromSpreadsheet() throws Exception {
DecisionTableConfiguration decisionTableConfiguration = KnowledgeBuilderFactory.newDecisionTableConfiguration();
decisionTableConfiguration.setInputType(DecisionTableInputType.XLS);
KnowledgeBuilder knowledgeBuilder = KnowledgeBuilderFactory.newKnowledgeBuilder();
File file = new File("folder/Discount.xls");
Resource resourceFile = ResourceFactory.newFileResource(file);
knowledgeBuilder.add(resourceFile, ResourceType.DTABLE, decisionTableConfiguration);
if (knowledgeBuilder.hasErrors()) {
throw new RuntimeException(knowledgeBuilder.getErrors().toString());
}
InternalKnowledgeBase internalKnowledgeBase = KnowledgeBaseFactory.newKnowledgeBase();
Collection<KiePackage> kiePackages = knowledgeBuilder.getKnowledgePackages();
internalKnowledgeBase.addPackages(kiePackages);
return internalKnowledgeBase;
}
public static void main(String[] args) {
(new Main()).run();
}
public void run() {
System.out.println("--- Start Code ---");
StatelessKieSession session = null;
try {
InternalKnowledgeBase knowledgeBase = createKnowledgeBaseFromSpreadsheet();
session = knowledgeBase.newStatelessKieSession();
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
Customer customer = new Customer(CustomerType.INDIVIDUAL, 1);
System.out.println(customer.toString());
session.execute(customer);
System.out.println(customer.toString());
System.out.println("--- End Code ---");
}
}
My Customer class (you still need to add the getters, setters and override the toString method):
public class Customer {
public enum CustomerType {
INDIVIDUAL, BUSINESS;
}
private CustomerType type;
private int years;
private int discount;
public Customer(CustomerType individual, int years) {
this.type = individual;
this.years = years;
}
}
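For completeness, a minimal sketch of the accessors and toString mentioned above, to be added inside the Customer class (the names are the obvious ones; adapt as needed):
public CustomerType getType() { return type; }
public void setType(CustomerType type) { this.type = type; }
public int getYears() { return years; }
public void setYears(int years) { this.years = years; }
public int getDiscount() { return discount; }
public void setDiscount(int discount) { this.discount = discount; }
@Override
public String toString() {
    return "Customer [type=" + type + ", years=" + years + ", discount=" + discount + "]";
}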
My pom.xml contains:
<dependencies>
<dependency>
<groupId>org.kie</groupId>
<artifactId>kie-ci</artifactId>
<version>7.28.0.Final</version>
</dependency>
<dependency>
<groupId>org.drools</groupId>
<artifactId>drools-decisiontables</artifactId>
<version>7.28.0.Final</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>1.7.7</version>
<scope>runtime</scope>
</dependency>
</dependencies>
Software version: Redmine 4.2.40, JDK 17
Dependencies:
<dependency>
<groupId>com.taskadapter</groupId>
<artifactId>redmine-java-api</artifactId>
<version>1.17</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-simple</artifactId>
<version>1.7.25</version>
</dependency>
Code:
public class test {
public static void main(String[] args) {
RedmineManager mgr = new RedmineManager("https://localhost/redmine/",
"c35df9265e9f9929f0648edde4a6ab8547b9f0de");
try {
tryGetIssues(mgr);
} catch (Exception e) {
e.printStackTrace();
}
}
private static void tryGetIssues(RedmineManager mgr) throws Exception {
Project project = mgr.getProjectByKey("143");
for (Tracker tk : project.getTrackers()) {
System.out.println(tk.getId() + "-->" + tk.getName());
}
List<Issue> issues = mgr.getIssues("143", null);
for (Issue issue : issues) {
System.out.println(issue.getId() + issue.toString());
}
}
}
The following error occurred:
Exception in thread "main" com.taskadapter.redmineapi.NotFoundException: Server returned '404 not found'. response body:
at com.taskadapter.redmineapi.internal.comm.redmine.RedmineErrorHandler.processContent(RedmineErrorHandler.java:48)
at com.taskadapter.redmineapi.internal.comm.redmine.RedmineErrorHandler.processContent(RedmineErrorHandler.java:22)
at com.taskadapter.redmineapi.internal.comm.ComposingHandler.processContent(ComposingHandler.java:25)
at com.taskadapter.redmineapi.internal.comm.ComposingHandler.processContent(ComposingHandler.java:25)
at com.taskadapter.redmineapi.internal.comm.BaseCommunicator.sendRequest(BaseCommunicator.java:47)
at com.taskadapter.redmineapi.internal.comm.redmine.RedmineAuthenticator.sendRequest(RedmineAuthenticator.java:52)
at com.taskadapter.redmineapi.internal.comm.FmapCommunicator.sendRequest(FmapCommunicator.java:26)
at com.taskadapter.redmineapi.internal.comm.FmapCommunicator.sendRequest(FmapCommunicator.java:26)
at com.taskadapter.redmineapi.internal.comm.BasicSimplifier.sendRequest(BasicSimplifier.java:24)
at com.taskadapter.redmineapi.internal.Transport.send(Transport.java:604)
at com.taskadapter.redmineapi.internal.Transport.getJsonResponseFromGet(Transport.java:511)
at com.taskadapter.redmineapi.internal.Transport.getObjectsListNoPaging(Transport.java:486)
at com.taskadapter.redmineapi.internal.Transport.getObjectsList(Transport.java:458)
at com.taskadapter.redmineapi.IssueManager.getIssues(IssueManager.java:186)
at TaskManager.main(TaskManager.java:27)
I have spent a long time trying to solve this problem; I hope to get your help!
I have a class that scans a column from a DynamoDB table using the AWS SDK for Java v2 (main method taken out for simplicity):
public class fetchCmdbColumn {
    public static List<String> CMDB(String tableName, String tableColumn) throws Exception {
        DynamoDbClient client = DynamoDbClient.builder()
                .region(Region.EU_WEST_1)
                .build();
        List<String> listValues = new ArrayList<>();
        try {
            // scan the whole table and collect the values of the requested column
            ScanRequest scanRequest = ScanRequest.builder()
                    .tableName(tableName)
                    .build();
            ScanResponse response = client.scan(scanRequest);
            for (Map<String, AttributeValue> item : response.items()) {
                Set<String> keys = item.keySet();
                for (String key : keys) {
                    // compare strings with equals(), not ==
                    if (key.equals(tableColumn)) {
                        listValues.add(item.get(key).s());
                    }
                }
            }
            // To check what is being returned, uncomment the line below
            // System.out.println(listValues);
        } catch (DynamoDbException e) {
            e.printStackTrace();
            System.exit(1);
        }
        client.close();
        return listValues;
    }
}
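A minimal usage sketch (the table and column names here are placeholders, replace them with your own):
public static void main(String[] args) throws Exception {
    List<String> values = fetchCmdbColumn.CMDB("my-table", "my-column");
    values.forEach(System.out::println);
}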
I also have a JUnit test created for that class:
public class fetchCMDBTest {
// Define the data members required for the test
private static String tableName = "";
private static String tableColumn = "";
@BeforeAll
public static void setUp() throws IOException {
// Run tests on Real AWS Resources
try (InputStream input = fetchCMDBTest.class.getClassLoader().getResourceAsStream("config.properties")) {
Properties prop = new Properties();
if (input == null) {
System.out.println("Sorry, unable to find config.properties");
return;
}
//load a properties file from class path, inside static method
prop.load(input);
// Populate the data members required for all tests
tableName = prop.getProperty("environment_list");
tableColumn = prop.getProperty("env_name");
} catch (IOException ex) {
ex.printStackTrace();
}
}
@Test
void fetchCMDBtable() throws Exception{
try {
fetchCmdbColumn.CMDB(tableName, tableColumn);
System.out.println("Test 1 passed");
} catch (Exception e) {
System.out.println("Test 1 failed!");
e.printStackTrace();
}
}
}
When I run the test using mvn test I get the error:
software.amazon.awssdk.core.exception.SdkClientException: Multiple HTTP implementations were found on the classpath,
even though I have only declared the client builder once in the class.
What am I missing?
I run the unit tests from the IntelliJ IDE. I find using the IDE works better than the command line. Once I set up the config.properties file that contains the values for the tests and run them, all tests pass.
In fact, we test all Java V2 code examples in this manner to ensure they all work.
I also tested all DynamoDB examples from the command line using mvn test. All passed.
Amend your test to build a single instance of the DynamoDB client and then as your first test, make sure it was created successfully. See if this works for you. Once you get this working, add more tests!
public class DynamoDBTest {
private static DynamoDbClient ddb;
@BeforeAll
public static void setUp() throws IOException {
// Run tests on Real AWS Resources
Region region = Region.US_WEST_2;
ddb = DynamoDbClient.builder().region(region).build();
try (InputStream input = DynamoDBTest.class.getClassLoader().getResourceAsStream("config.properties")) {
Properties prop = new Properties();
if (input == null) {
System.out.println("Sorry, unable to find config.properties");
return;
}
//load a properties file from class path, inside static method
prop.load(input);
} catch (IOException ex) {
ex.printStackTrace();
}
}
@Test
@Order(1)
public void whenInitializingAWSService_thenNotNull() {
assertNotNull(ddb);
System.out.println("Test 1 passed");
}
It turns out my pom file contained other clients, so I had to remove the likes of:
<dependency>
<groupId>software.amazon.awssdk</groupId>
<artifactId>s3</artifactId>
<exclusions>
<exclusion>
<groupId>software.amazon.awssdk</groupId>
<artifactId>netty-nio-client</artifactId>
</exclusion>
<exclusion>
<groupId>software.amazon.awssdk</groupId>
<artifactId>apache-client</artifactId>
</exclusion>
</exclusions>
</dependency>
and replaced them with:
<dependency>
<groupId>software.amazon.awssdk</groupId>
<artifactId>aws-crt-client</artifactId>
<version>2.14.13-PREVIEW</version>
</dependency>
as mentioned in https://aws.amazon.com/blogs/developer/introducing-aws-common-runtime-http-client-in-the-aws-sdk-for-java-2-x/
As a complement to the other answers: for me, only option 4 from the referenced article worked.
Option 4: Change the default HTTP client using a system property in Java code.
I defined it in the setUp() method of my integration test using JUnit 5.
@BeforeAll
public static void setUp() {
System.setProperty(
SdkSystemSetting.SYNC_HTTP_SERVICE_IMPL.property(),
"software.amazon.awssdk.http.apache.ApacheSdkHttpService");
}
And because I am using Gradle:
implementation ("software.amazon.awssdk:s3:${awssdk2Version}") {
exclude group: 'software.amazon.awssdk', module: 'netty-nio-client'
exclude group: 'software.amazon.awssdk', module: 'apache-client'
}
implementation "software.amazon.awssdk:aws-crt-client:2.17.71-PREVIEW"
When I'm running my test class, the later tests are using the mocks set up by the previous ones. I use JMockit with Maven. I've read that they might be running on the same JVM branch? If this is the case, can someone explain how I can run them on different branches? If it's not, can anyone explain why the re-use of mocks is occurring (and thus breaking tests)?
public class ServiceUploadTest {
    private String filePath = "src/test/resources/AudioTestFile.mp3";
    private ServiceUpload serviceUpload = new ServiceUpload();

    @Test
    @DisplayName("TestConversionOfMp4ToMp3")
    void testConversionOfMp4ToMp3() {
        new MockUp<Encoder>() {
            @Mock
            public void encode(MultimediaObject multimediaObject, File target, EncodingAttributes attributes) throws IllegalArgumentException, InputFormatException, EncoderException {
            }
        };
        assertEquals("src/test/resources/Audio.mp3", serviceUpload.convertToMp3(filePath));
    }

    @Test
    @DisplayName("Test cutting loop when length is over 5000000")
    void testLongCuttingLoop() throws IOException {
        InputStream inputStream = new FileInputStream("/Users/hywelgriffiths/Documents/IntellijProjects/sipho/transcriptionSoftware/audio.transcribe.front/src/test/java/resources/base64.txt");
        BufferedReader bufferedReader = new BufferedReader(new InputStreamReader(inputStream));
        String base64 = bufferedReader.readLine();
        ServiceUpload serviceUpload = new ServiceUpload();
        new MockUp<ProviderUpload>() {
            @Mock
            public String executeUploadHttp(String mp3Base64, String jobName, String tag, String email) {
                return null;
            }
        };
        assertNull(serviceUpload.cuttingLoop(base64, "JOBNAME", null));
    }

    @Test
    @DisplayName("Test cutting loop when length is under 5000000")
    void testShortCuttingLoop() throws IOException {
        ServiceUpload serviceUpload = new ServiceUpload();
        new MockUp<ProviderUpload>() {
            @Mock
            public String executeUploadHttp(String mp3Base64, String jobName, String tag, String email) {
                return null;
            }
        };
        assertNull(serviceUpload.cuttingLoop("SHORTBASE64", "JOBNAME", null));
    }

    @Test
    @DisplayName("Test convertToBase64AndSend")
    void testConvertToBase64AndSend() {
        ServiceUpload serviceUpload = new ServiceUpload();
        File file = new File("src/test/java/resources/fakeMp4.txt");
        String jobName = "JOBNAME";
        new MockUp<ServiceUpload>() {
            @Mock
            public String convertToMp3(String mp4File) {
                return "src/test/java/resources/fakeMp4.txt";
            }
        };
        assertNull("\"complete\"", serviceUpload.convertToBase64AndSend(jobName, file, null, false));
    }

    @Test
    @DisplayName("Test convertToBase64andSendCatchBlock")
    void testConvertToBase64AndSendCatch() {
        ServiceUpload serviceUpload = new ServiceUpload();
        File file = new File("src/test/java/resources/fakeMp4.txt");
        String jobName = "JOBNAME";
        new MockUp<ServiceUpload>() {
            @Mock
            public String convertToMp3(String mp4File) throws Exception {
                throw new Exception("Forced Exception");
            }
        };
        assertEquals("\"complete\"", serviceUpload.convertToBase64AndSend(jobName, file, null, false));
    }

    @Test
    @DisplayName("Test convertToMp3 catch block")
    void testConvertToMp3CatchBlock() {
        new MockUp<ServiceUpload>() {
            @Mock
            public String createMp3(String mp4file) throws Exception {
                throw new Exception("Forced Exception");
            }
        };
        assertNull(serviceUpload.convertToMp3(filePath));
    }
}
NOTE:
It turns out it was my dependencies in the POM (thanks Jeff). I was using:
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.13</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.junit.jupiter</groupId>
<artifactId>junit-jupiter</artifactId>
<version>RELEASE</version>
</dependency>
and changed them to:
<dependency>
<groupId>org.junit.jupiter</groupId>
<artifactId>junit-jupiter-params</artifactId>
<version>5.3.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.junit.jupiter</groupId>
<artifactId>junit-jupiter-engine</artifactId>
<version>5.3.1</version>
<scope>test</scope>
</dependency>
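One additional thing worth checking (an assumption on my part, not something stated in the original fix): JUnit 5 tests are only picked up natively by Maven Surefire 2.22.0 or newer, so the build/plugins section may also need something like:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <!-- 2.22.0 or newer runs JUnit 5 tests natively -->
    <version>2.22.2</version>
</plugin>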
You've got something subtle going on, and I'd check your assumptions before you pull your hair out. First, confirm that the MockUp is truly leaking between tests (it shouldn't be). An easy way to do that would be to add a System.out.println in each MockUp (and maybe in setup/teardown), and then as it runs each test, you should see printlns that are not expected. If you don't, then JMockIt is behaving as one would expect.
Assuming your theory is sound, I'd take a look at the pom. Specifically, the surefire settings (it would be nice if you posted it). I'm guessing your comment on 'branches' is really addressed at the forking/threading/test-parallelization that surefire does. You may have something glitchy there and it can be tricky to get it tuned properly.
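For example, a quick diagnostic along those lines (a sketch, reusing one of the mocks from the question) could be:
new MockUp<ProviderUpload>() {
    @Mock
    public String executeUploadHttp(String mp3Base64, String jobName, String tag, String email) {
        // if this prints during a test that did not install this MockUp, the mock really is leaking
        System.out.println("ProviderUpload mock invoked");
        return null;
    }
};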
I think you missed the annotation at the top of the test class; see this hint.
I have integrated the AWS Java SDK in my application. Unfortunately I am getting "Internal Failure. Please try your request again" as the response.
This is how I have implemented it.
Using Maven, I added this in pom.xml:
<dependencies>
<dependency>
<groupId>software.amazon.awssdk</groupId>
<artifactId>transcribe</artifactId>
</dependency>
</dependencies>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>software.amazon.awssdk</groupId>
<artifactId>bom</artifactId>
<version>2.10.12</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
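Note that the code below also uses the v1 S3 client (AmazonS3ClientBuilder), so presumably the project also has a v1 dependency that is not shown here, something along these lines (version left out deliberately):
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-s3</artifactId>
    <!-- use whatever v1 SDK version your project already manages -->
</dependency>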
And in code,
String localAudioPath = "/home/****.wav";
String key = config.awsSecretAccessKey;
String keyId = config.awsAccessKeyId;
String regionString = config.awsRegion; //"ap-south-1"
String outputBucketName = config.awsOutputBucket;
Region region = Region.of(regionString);
String inputLanguage = "en-US";
LanguageCode languageCode = LanguageCode.fromValue(inputLanguage);
AwsCredentials credentials = AwsBasicCredentials.create(keyId, key);
AwsCredentialsProvider transcribeCredentials=StaticCredentialsProvider.create(credentials);
AWSCredentialsProvider s3AwsCredentialsProvider = getS3AwsCredentialsProvider(key, keyId);
String jobName = subJob.getId()+"_"+subJob.getProgram_name().replace(" ", "");
String fileName = jobName + ".wav";
AmazonS3 s3 =
AmazonS3ClientBuilder.standard().withRegion(regionString).withClientConfiguration(new
ClientConfiguration()).withCredentials(s3AwsCredentialsProvider).build();
s3.putObject(outputBucketName, fileName, new File(localAudioPath));
String fileUri = s3.getUrl(outputBucketName, fileName).toString();
System.out.println(fileUri);
Media media = Media.builder().mediaFileUri(fileUri).build();
String mediaFormat = MediaFormat.WAV.toString();
jobName = jobName +"_"+ System.currentTimeMillis();
Settings settings = Settings.builder()
.showSpeakerLabels(true)
.maxSpeakerLabels(10)
.build();
StartTranscriptionJobRequest request = StartTranscriptionJobRequest.builder()
.languageCode(languageCode)
.media(media)
.mediaFormat(mediaFormat)
.settings(settings)
.transcriptionJobName(jobName)
.build();
TranscribeAsyncClient client = TranscribeAsyncClient.builder()
.region(region)
.credentialsProvider(transcribeCredentials)
.build();
CompletableFuture<StartTranscriptionJobResponse> response =
client.startTranscriptionJob(request);
System.out.println(response.get().toString());
GetTranscriptionJobRequest jobRequest =
GetTranscriptionJobRequest.builder().transcriptionJobName(jobName).build();
while( true ){
CompletableFuture<GetTranscriptionJobResponse> transcriptionJobResponse =
client.getTranscriptionJob(jobRequest);
GetTranscriptionJobResponse response1 = transcriptionJobResponse.get();
if (response1 != null && response1.transcriptionJob() != null) {
if (response1.transcriptionJob().transcriptionJobStatus() ==
TranscriptionJobStatus.FAILED) {
//It comes here and gives response1.failureReason = "Internal Failure. Please try your request again".
break;
}
}
}
private AWSCredentialsProvider getS3AwsCredentialsProvider(String key, String keyId) {
    return new AWSCredentialsProvider() {
        @Override
        public AWSCredentials getCredentials() {
            return new AWSCredentials() {
                @Override
                public String getAWSAccessKeyId() {
                    return keyId;
                }

                @Override
                public String getAWSSecretKey() {
                    return key;
                }
            };
        }

        @Override
        public void refresh() {
        }
    };
}
The same thing works with the Python SDK. Same region, same wav file, same language, same settings, same output bucket, etc. What am I doing wrong?
Your flow looks correct. It may be an issue with the audio file you are uploading to AWS. I suggest you check it once.
I am trying to run a PDI transformation involving a database (any database, but NoSQL ones are preferred) from Java.
I've tried using MongoDB and Cassandra and got missing plugins; I've already asked here: Running PDI Kettle on Java - Mongodb Step Missing Plugins, but no one has replied yet.
I've tried switching to an SQL database using PostgreSQL too, but it still doesn't work. From the research I did, I think it is because I didn't set up the database connection from Java correctly, yet I haven't found any tutorial or direction that works for me. I've tried following the directions from this blog: http://ameethpaatil.blogspot.co.id/2010/11/pentaho-data-integration-java-maven.html, but still got some problems about the repository (because I don't have one, and it seems to be required).
The transformations are fine when I run them from Spoon. They only fail when I run them from Java.
Can anyone help me run a PDI transformation involving a database? Where did I go wrong?
Has anyone ever succeeded in running a PDI transformation involving either a NoSQL or SQL database? Which DB did you use?
I'm sorry if I asked too many questions; I am so desperate. Any kind of information will be very much appreciated. Thank you.
Executing PDI jobs from Java is pretty straightforward. You just need to import all the necessary jar files (for the databases) and then call the Kettle classes. The best way is obviously to use Maven to control the dependencies. In the Maven pom.xml file, just declare the database drivers.
A sample Maven file would be something like below, assuming you are using Pentaho v5.0.0GA and PostgreSQL as the database:
<dependencies>
<!-- Pentaho Kettle Core dependencies development -->
<dependency>
<groupId>pentaho-kettle</groupId>
<artifactId>kettle-core</artifactId>
<version>5.0.0.1</version>
</dependency>
<dependency>
<groupId>pentaho-kettle</groupId>
<artifactId>kettle-dbdialog</artifactId>
<version>5.0.0.1</version>
</dependency>
<dependency>
<groupId>pentaho-kettle</groupId>
<artifactId>kettle-engine</artifactId>
<version>5.0.0.1</version>
</dependency>
<dependency>
<groupId>pentaho-kettle</groupId>
<artifactId>kettle-ui-swt</artifactId>
<version>5.0.0.1</version>
</dependency>
<dependency>
<groupId>pentaho-kettle</groupId>
<artifactId>kettle5-log4j-plugin</artifactId>
<version>5.0.0.1</version>
</dependency>
<!-- The database dependency files. Use it if your kettle file involves database connectivity. -->
<dependency>
<groupId>postgresql</groupId>
<artifactId>postgresql</artifactId>
<version>9.1-902.jdbc4</version>
</dependency>
</dependencies>
You can check my blog for more. It works for database connections.
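For reference, a minimal sketch of the Java side that goes with these dependencies (the .ktr path is just a placeholder) could look like this:
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

public class RunTransformation {
    public static void main(String[] args) throws Exception {
        // initialize the Kettle environment once per JVM
        KettleEnvironment.init();
        // load the transformation created in Spoon (placeholder path)
        TransMeta transMeta = new TransMeta("path/to/your_transformation.ktr");
        Trans trans = new Trans(transMeta);
        // run it and wait for completion
        trans.execute(null);
        trans.waitUntilFinished();
        if (trans.getErrors() > 0) {
            throw new RuntimeException("There were errors during transformation execution.");
        }
    }
}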
Hope this helps :)
I had the same problem in an application using the Pentaho libraries. I resolved the problem with this code:
The singleton to init Kettle:
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.exception.KettleException;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
 * Initializes the Kettle environment variable configuration
 *
 * @author Marcos Souza
 * @version 1.0
 *
 */
public class AtomInitKettle {
private static final Logger LOGGER = LoggerFactory.getLogger(AtomInitKettle.class);
private AtomInitKettle() throws KettleException {
try {
LOGGER.info("Iniciando kettle");
KettleJNDI.protectSystemProperty();
KettleEnvironment.init();
LOGGER.info("Kettle iniciado com sucesso");
} catch (Exception e) {
LOGGER.error("Message: {} Cause {} ", e.getMessage(), e.getCause());
}
}
}
And the code that saved me:
import java.io.File;
import java.util.Properties;
import org.pentaho.di.core.Const;
import org.pentaho.di.core.exception.KettleException;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class KettleJNDI {
private static final Logger LOGGER = LoggerFactory.getLogger(KettleJNDI.class);
public static final String SYS_PROP_IC = "java.naming.factory.initial";
private static boolean init = false;
private KettleJNDI() {
}
public static void initJNDI() throws KettleException {
String path = Const.JNDI_DIRECTORY;
LOGGER.info("Kettle Const.JNDI_DIRECTORY= {}", path);
if (path == null || path.equals("")) {
try {
File file = new File("simple-jndi");
path = file.getCanonicalPath();
} catch (Exception e) {
throw new KettleException("Error initializing JNDI", e);
}
Const.JNDI_DIRECTORY = path;
LOGGER.info("Kettle null > Const.JNDI_DIRECTORY= {}", path);
}
System.setProperty("java.naming.factory.initial", "org.osjava.sj.SimpleContextFactory");
System.setProperty("org.osjava.sj.root", path);
System.setProperty("org.osjava.sj.delimiter", "/");
}
public static void protectSystemProperty() {
if (init) {
return;
}
System.setProperties(new ProtectionProperties(SYS_PROP_IC, System.getProperties()));
if (LOGGER.isInfoEnabled()) {
LOGGER.info("Kettle System Property Protector: System.properties replaced by custom properies handler");
}
init = true;
}
public static class ProtectionProperties extends Properties {
private static final long serialVersionUID = 1L;
private final String protectedKey;
public ProtectionProperties(String protectedKey, Properties prprts) {
super(prprts);
if (protectedKey == null) {
throw new IllegalArgumentException("Properties protection was provided a null key");
}
this.protectedKey = protectedKey;
}
@Override
public synchronized Object setProperty(String key, String value) {
// We forbid changes in general, but do it silently ...
if (protectedKey.equals(key)) {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Kettle System Property Protector: Protected change to '" + key + "' with value '" + value + "'");
}
return super.getProperty(protectedKey);
}
return super.setProperty(key, value);
}
}
}
I think your problem is with the database connection. You can configure it in the transformation and do not need to use JNDI.
public class DatabaseMetaStep {
private static final Logger LOGGER = LoggerFactory.getLogger(DatabaseMetaStep.class);
/**
* Sets up the database access configuration
*
* @return the configured DatabaseMeta
*/
public static DatabaseMeta createDatabaseMeta() {
DatabaseMeta databaseMeta = new DatabaseMeta();
LOGGER.info("Carregando informacoes de acesso");
databaseMeta.setHostname("localhost");
databaseMeta.setName("stepName");
databaseMeta.setUsername("user");
databaseMeta.setPassword("password");
databaseMeta.setDBPort("port");
databaseMeta.setDBName("database");
databaseMeta.setDatabaseType("MonetDB"); // sql, MySql ...
databaseMeta.setAccessType(DatabaseMeta.TYPE_ACCESS_NATIVE);
return databaseMeta;
}
}
Then you need to set the databaseMeta on the TransMeta:
DatabaseMeta databaseMeta = DatabaseMetaStep.createDatabaseMeta();
TransMeta transMeta = new TransMeta();
transMeta.setUsingUniqueConnections(true);
transMeta.setName("ransmetaNeame");
List<DatabaseMeta> databases = new ArrayList<>();
databases.add(databaseMeta);
transMeta.setDatabases(databases);
I tried your code with a transformation without JNDI and it works!
But I needed to add this repository to my pom.xml:
<repositories>
<repository>
<id>pentaho-releases</id>
<url>http://repository.pentaho.org/artifactory/repo/</url>
</repository>
</repositories>
Also, when I try with a datasource I get this error: Cannot instantiate class: org.osjava.sj.SimpleContextFactory [Root exception is java.lang.ClassNotFoundException: org.osjava.sj.SimpleContextFactory]
Complete log here:
https://gist.github.com/eb15f8545e3382351e20.git
[FIX]: Add this dependency:
<dependency>
<groupId>pentaho</groupId>
<artifactId>simple-jndi</artifactId>
<version>1.0.1</version>
</dependency>
After that a new error occurs:
transformation_with_jndi - Dispatching started for transformation [transformation_with_jndi]
Table input.0 - ERROR (version 5.0.0.1.19046, build 1 from 2013-09-11_13-51-13 by buildguy) : An error occurred, processing will be stopped:
Table input.0 - Error occured while trying to connect to the database
Table input.0 - java.io.File parameter must be a directory. [D:\opt\workspace-eclipse\invoke-ktr-jndi\simple-jndi]
Complete log : https://gist.github.com/jrichardsz/9d74c7263f3567ac4b45
[EXPLANATION] This is due to the following in
KettleEnvironment.init();
https://github.com/jrichardsz/pentaho-pdi-spoon-usefull-templates/blob/master/running-etl-transformation-using-java/researching-pentaho-classes/KettleEnvironment.java
There is an initialization:
if (simpleJndi) {
JndiUtil.initJNDI();
}
And in JndiUtil:
String path = Const.JNDI_DIRECTORY;
if ((path == null) || (path.equals("")))
https://github.com/jrichardsz/pentaho-pdi-spoon-usefull-templates/blob/master/running-etl-transformation-using-java/researching-pentaho-classes/JndiUtil.java
And in the Const class:
public static String JNDI_DIRECTORY = NVL(System.getProperty("KETTLE_JNDI_ROOT"), System.getProperty("org.osjava.sj.root"));
https://github.com/jrichardsz/pentaho-pdi-spoon-usefull-templates/blob/master/running-etl-transformation-using-java/researching-pentaho-classes/Const.java
So we need to set the KETTLE_JNDI_ROOT variable.
[FIX] A small change to your example: just add this
System.setProperty("KETTLE_JNDI_ROOT", jdbcPropertiesPath);
before
KettleEnvironment.init();
A complete example based on your code:
import java.io.File;
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.exception.KettleException;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;
public class ExecuteSimpleTransformationWithJndiDatasource {
public static void main(String[] args) {
String resourcesPath = (new File(".").getAbsolutePath())+"\\src\\main\\resources";
String ktr_path = resourcesPath+"\\transformation_with_jndi.ktr";
//KETTLE_JNDI_ROOT could be the simple-jndi folder in your pdi or spoon home.
//in this example, is the resources folder
String jdbcPropertiesPath = resourcesPath;
try {
/**
* Initialize the Kettle Environment
*/
System.setProperty("KETTLE_JNDI_ROOT", jdbcPropertiesPath);
KettleEnvironment.init();
/**
* Create a trans object to properly assign the ktr metadata.
*
* @filedb: The ktr file path to be executed.
*
*/
TransMeta metadata = new TransMeta(ktr_path);
Trans trans = new Trans(metadata);
// Execute the transformation
trans.execute(null);
trans.waitUntilFinished();
// checking for errors
if (trans.getErrors() > 0) {
System.out.println("Erroruting Transformation");
}
} catch (KettleException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
For a complete example check my github channel:
https://github.com/jrichardsz/pentaho-pdi-spoon-usefull-templates/tree/master/running-etl-transformation-using-java/invoke-transformation-from-java-jndi/src/main/resources