how to correctly fire Drools rules on multiple objects? - java

I'm getting my hands on Drools (with Java) for the first time and I'm quite confused about its sessions and its ability to work with collections of objects.
This is the case:
I'm building a web application made of REST services.
I have a class called Log with two fields (eventType and riskLevelId).
My code retrieves several objects of this kind from a DB within a defined time frame.
If this collection of objects happens to contain one Log with eventType == 2 and riskLevelId == 1 and another Log with eventType == 3 and riskLevelId == 1, the rule should fire.
Via the Drools interfaces I correctly retrieve KieServices, KieBuilder, KieContainer, KieBase and KieSession.
try {
    // load up the knowledge base
    KieServices kieServices = KieServices.Factory.get();
    KieFileSystem kfs = kieServices.newKieFileSystem();
    FileInputStream fis = f; // f is the FileInputStream for the DRL file (created earlier, not shown)
    kfs.write( "src/main/resources/simple.drl",
               kieServices.getResources().newInputStreamResource( fis ) );
    KieBuilder kieBuilder = kieServices.newKieBuilder( kfs ).buildAll();
    Results results = kieBuilder.getResults();
    if ( results.hasMessages( Message.Level.ERROR ) ) {
        System.out.println( results.getMessages() );
        throw new IllegalStateException( "### errors ###" );
    }
    KieContainer kieContainer = kieServices.newKieContainer( kieServices.getRepository().getDefaultReleaseId() );
    KieBase kieBase = kieContainer.getKieBase();
    kieSession = kieContainer.newKieSession();
} catch (Throwable t) {
    t.printStackTrace();
}
I then retrieve each single Log instance in a for loop. Still inside the loop, I add the object to the KieSession and fire the rules:
@Autowired
KieSessionFactory kieSessionFactory;

@Override
public void run() {
    KieSession kieS = kieSessionFactory.getKieSessionCheckSavedLog();
    try {
        List<Log> logs = logRepo.getAllInGivenTimeSec(10);
        for (Log l : logs) {
            kieS.insert(l);
            kieS.fireAllRules();
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Here is the rule I've written:
package com.sample

import it.protodrools.beans.Log;

dialect "java"

rule "log2"
    when
        $l1 : Log( eventType == 2 && riskLevelId == 1 )
        $l2 : Log( this != $l1 && eventType == 3 && riskLevelId == 1 )
    then
        System.out.println( "deadly threat !" );
end
My question is: will this rule take into account the whole set of objects I'm passing (one by one rather than as a List, since I've read that inserting the List itself is not good practice), and thus detect a condition-matching pair among them?
Would you suggest a different approach?
Thanks in advance.

No, it will not.
for (Log l : logs) {
    kieS.insert(l);
    kieS.fireAllRules();
}
In your loop you insert one object and then immediately fire all rules after every single insert. I am not sure exactly how Drools will behave with that loop, but what you probably want to do is insert all Logs into working memory first and then fire the rules:
for (Log l : logs) {
    kieS.insert(l);
}
kieS.fireAllRules();
Designing a JUnit test class would show you this immediately though.
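For example, a minimal JUnit sketch along the following lines would demonstrate it. It assumes the rules are available through a classpath kmodule (your KieFileSystem setup would work just as well) and that Log has an (eventType, riskLevelId) constructor; adjust it to your project:

import static org.junit.Assert.assertEquals;

import org.junit.Test;
import org.kie.api.KieServices;
import org.kie.api.runtime.KieContainer;
import org.kie.api.runtime.KieSession;

import it.protodrools.beans.Log;

public class Log2RuleTest {

    @Test
    public void firesWhenBothMatchingLogsAreInWorkingMemory() {
        KieServices kieServices = KieServices.Factory.get();
        // assumes a kmodule.xml on the classpath that includes simple.drl
        KieContainer kieContainer = kieServices.getKieClasspathContainer();
        KieSession kieSession = kieContainer.newKieSession();
        try {
            // assumed constructor: Log(eventType, riskLevelId)
            kieSession.insert(new Log(2, 1));
            kieSession.insert(new Log(3, 1));

            // both facts are in working memory before firing, so the pair can be matched
            int fired = kieSession.fireAllRules();

            assertEquals(1, fired); // "log2" fires once for the pair
        } finally {
            kieSession.dispose();
        }
    }
}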

Related

Drools : How to write a rule that hits when data is not available in entry point

I am new to Drools. I am expecting sensor data sent from a tracking device (like a tag device). I am using a Drools entry point to track the sensor data, and I need to raise alerts on certain events based on it.
The DRL file is as below:
import com.sample.AlertRuleModel;

declare AlertRuleModel
    @role( event )
    @timestamp( timespamp )
end

rule "No signals are coming from any entry-point for more than 10s"
when
    $f : AlertRuleModel() from entry-point "AlertRuleStream"
    not( AlertRuleModel( this != $f, this after[0s, 10s] $f ) from entry-point "AlertRuleStream" )
then
    $f.setRuleId(1);
    <Do alert here>
end

rule "Rule on Tag1 has not been in zone1 for more than 1 minutes"
when
    $f : AlertRuleModel( tagId == 1, zoneId == 1 ) from entry-point "AlertRuleStream"
    not( AlertRuleModel( this != $f, tagId == 1, zoneId != 1, this after[0s, 1m] $f ) from entry-point "AlertRuleStream" )
then
    $f.setRuleId(2);
    <Do alert here>
end
Java code
kSession = RuleExecutionService.getKieSession(packetProcessorData.getAlertRuleDrlPath());
ruleStream = kSession.getEntryPoint("AlertRuleStream");
kSession.addEventListener(new DefaultAgendaEventListener() {
    public void afterMatchFired(AfterMatchFiredEvent event) {
        super.afterMatchFired(event);
        onPostExecution(event, RuleTypeEnum.ALERT_RULE.getName());
    }
});

new Thread() {
    @Override
    public void run() {
        kSession.fireUntilHalt();
    }
}.start();
Stream data insertion part
private BlockingQueue<AlertRuleModel> alertFactQueue;
// ...
AlertRuleModel alertRuleModel = null;
while (true) {
    alertRuleModel = alertFactQueue.poll(1, TimeUnit.SECONDS);
    if (alertRuleModel != null) {
        //LOGGER.debug("Inserting alertRuleModel into \"AlertRuleStream\"");
        ruleStream.insert(alertRuleModel);
        continue;
    }
    try {
        Thread.sleep(1000);
    } catch (InterruptedException e) {
        LOGGER.error("Exception while sleeping thread during alertFactQueue polling..", e);
    }
}
But when I run the application:
The first rule, "No signals are coming from any entry-point for more than 10s", is not firing at all. I don't know why; please tell me if I am doing anything wrong or if there is a syntax error in the first rule.
In the case of the second rule, "Tag1 has not been in zone1 for more than 1 minutes", it always fires immediately when I insert a fact with tagId == 1 and zoneId == 1. I tried different time gaps such as after[0s, 10m], but it still fires immediately after inserting a fact with those values.
Please tell me where I am making mistakes.
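One thing worth checking, offered only as a hedged sketch rather than a confirmed fix: temporal operators such as after, and a not(...) pattern over an entry point, generally require the KieBase to run in STREAM event-processing mode, and the first rule can only fire once the session clock has advanced 10s past the event. A minimal configuration sketch (the container and kmodule setup are assumed, not taken from your code):

import org.kie.api.KieBase;
import org.kie.api.KieBaseConfiguration;
import org.kie.api.KieServices;
import org.kie.api.conf.EventProcessingOption;
import org.kie.api.runtime.KieContainer;
import org.kie.api.runtime.KieSession;
import org.kie.api.runtime.KieSessionConfiguration;
import org.kie.api.runtime.conf.ClockTypeOption;

public class StreamSessionFactory {

    /** Builds a session suitable for CEP-style rules (entry points, after, not + after). */
    public static KieSession newStreamSession() {
        KieServices kieServices = KieServices.Factory.get();
        // assumes a kmodule.xml on the classpath that includes the alert DRL
        KieContainer kieContainer = kieServices.getKieClasspathContainer();

        // STREAM mode is what gives the temporal operators their meaning
        KieBaseConfiguration baseConfig = kieServices.newKieBaseConfiguration();
        baseConfig.setOption(EventProcessingOption.STREAM);
        KieBase kieBase = kieContainer.newKieBase(baseConfig);

        // a pseudo clock is convenient for testing time windows;
        // drop this option to use the realtime clock in production
        KieSessionConfiguration sessionConfig = kieServices.newKieSessionConfiguration();
        sessionConfig.setOption(ClockTypeOption.get("pseudo"));

        return kieBase.newKieSession(sessionConfig, null);
    }
}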

how to insert facts in drools at runtime to share between rules?

I have a simple rule that checks whether a user ID is present in the DB:
rule "check if user is already present"
agenda-group "dbcheck"
when
$blackListUserDto : BlackListUserDto( )
eval( BlackListServiceImpl.isUserBlacklisted($blackListUserDto) )
then
System.out.println("to be executed first");
System.out.println($blackListUserDto.isUserAlreadyBlacklisted());
end
The method isUserBlacklisted is as follows:
public static boolean isUserBlacklisted(BlackListUserDto blackListUserDto)
{
    try {
        BlackListEntity blackListEntity = blackListRepository.findByUserId(blackListUserDto.getUserId());
        if (blackListEntity != null)
        {
            blackListUserDto.setUserAlreadyBlacklisted(true);
        }
        else
        {
            // do something else
        }
    } catch (Exception e) {
        e.printStackTrace();
        return false;
    }
    return true;
}
As can be seen, I am modifying the fact (DTO) blackListUserDto via setUserAlreadyBlacklisted(true).
But in the "then" part of the rule, when I print the value with
System.out.println($blackListUserDto.isUserAlreadyBlacklisted()); the
output is still false.
I also need to share this data with another rule, which is as follows:
rule "blacklist user"
agenda-group "blacklist"
when
(BlackListUserDto( userAlreadyBlacklisted == false ))
then
//do something else
end
My understanding so far is that when I edit facts I need to re-insert (or update) them. If so, how do I do that in the same session? The session is created in another method, as follows:
public void blacklistUser(String userId) throws IOException
{
    BlackListUserDto blackListUserDto = new BlackListUserDto();
    blackListUserDto.setUserId(userId);
    KieSession kieSession = kContainer.newKieSession();
    Agenda agenda = kieSession.getAgenda();
    agenda.getAgendaGroup( "blacklist" ).setFocus();
    agenda.getAgendaGroup( "dbcheck" ).setFocus();
    kieSession.insert(blackListUserDto);
    kieSession.insert(queryTypeDto);
    kieSession.fireAllRules();
    kieSession.dispose();
}
What changes need to be made so that the fact gets updated and the updated value is reflected in the next rule?
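For what it's worth, the general Java-side pattern (outside a rule's right-hand side) is to keep the FactHandle returned by insert() and call update() on it after mutating the object; the engine does not notice plain setter calls on its own. A minimal sketch, reusing the DTO from above with the rest of the setup assumed:

import org.kie.api.runtime.KieSession;
import org.kie.api.runtime.rule.FactHandle;

public class BlackListRuleRunner {

    public void run(KieSession kieSession, BlackListUserDto blackListUserDto) {
        // keep the handle so the engine can be told when the object changes
        FactHandle handle = kieSession.insert(blackListUserDto);

        // some plain Java code mutates the DTO outside the engine
        blackListUserDto.setUserAlreadyBlacklisted(true);

        // without this call, working memory still evaluates the old property values
        kieSession.update(handle, blackListUserDto);

        kieSession.fireAllRules();
        kieSession.dispose();
    }
}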
I found a solution to the above and I am sharing the rule that solved this use case:
rule "check if user is already blacklisted 1"
agenda-group "dbcheck"
when
(QueryTypeDto( queryType == "blacklist" ))
$blackListUser : BlackListUserDto( )
not ControlFact( blackListUserDto == $blackListUser )
$blackListUserDto : BlackListUserDto( )
eval( BlackListServiceImpl.isUserBlacklisted($blackListUser) == false )
$queryTypeDto : QueryTypeDto()
then
System.out.println("to be executed first");
System.out.println($blackListUser.isBlackListFraudUser());
modify($blackListUser){
setBlackListFraudUser(true)
}
insert (new ControlFact($blackListUser));
//$queryTypeDto.setUserBlackListed(false);
end
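For completeness, the ControlFact used above is not shown in the question; it is presumably just a marker fact that records which DTO has already been checked, so that the not ControlFact(...) condition stops the rule from firing again. A hypothetical sketch:

// hypothetical marker fact; the real class is not shown in the question
public class ControlFact {

    private final BlackListUserDto blackListUserDto;

    public ControlFact(BlackListUserDto blackListUserDto) {
        this.blackListUserDto = blackListUserDto;
    }

    public BlackListUserDto getBlackListUserDto() {
        return blackListUserDto;
    }
}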
This blog post will help in understanding the use of modify in Drools: https://ilesteban.wordpress.com/2012/11/16/about-drools-and-infinite-execution-loops/

Will the Apache Camel library hold on to memory for a long time?

I am using verbose GC to capture some data and try to analyze the memory usage of my application.
I have a module that pulls data from a database or a third party, puts it into a list object, and only then returns it to the front end for display.
When I choose a date range, it pulls the data from the database.
When I choose today's date, my application sends a request to the MQ server, and the MQ server responds with an XML message. I then use the Apache Camel library to handle it.
Here is my verbose GC screenshot when pulling data from the database:
As you can see, every time I trigger the search function the memory usage increases and then drops back. This is normal, and also what I expected.
And this is the verbose GC screenshot when pulling data from the third party:
As you can see, after the memory increases it stays flat for a period and only then drops back.
I suspect that org.apache.camel.Exchange or org.apache.camel.Message, or other Camel objects, are holding on to the memory for a longer time.
Here is some of my code to handle the XML message from the third party:
/**
 * Camel Exchange producer template
 */
protected ProducerTemplate< Exchange > template;

@SuppressWarnings("unchecked")
private < T > T doSend(final Object request, final String headerName,
        final Object headerObject,
        final SendEaiMessageTemplateCallBack callback)
        throws BaseRuntimeException {

    log.debug( "doSend START >> {} ", request );

    if ( this.requestObjectValidator != null
            && requestObjectValidator
                    .requiredValidation( requestObjectValidator ) ) {
        requestObjectValidator.validateRequest( request );
    }

    final Exchange exchange = template.request( to, new Processor( ) {
        public void process(final Exchange exchange) throws Exception {
            exchange.getIn( ).setBody( request );
            if ( headerName != null && headerObject != null ) {
                exchange.getIn( ).setHeader( headerName, headerObject );
            }
        }
    } );

    log.debug( "doSend >> END >> exchange is failed? {}",
            exchange.isFailed( ) );

    Message outBoundMessage = null;
    if ( callback != null ) {
        // provide the callBack method to access exchange
        callback.exchangeCallBack( exchange );
    }
    if ( exchange.isFailed( ) ) {
        failedHandler.handleExchangeFailed( exchange, request );
    } else {
        outBoundMessage = exchange.getOut( false );
    }
    // handle outbound message
    if ( this.outboundMessageHandler != null ) {
        this.outboundMessageHandler.handleMessage( outBoundMessage );
    }
    if ( outBoundMessage != null ) {
        if ( outBoundMessage.getBody( ) != null ) {
            log.debug( "OutBoundMessage body {}", outBoundMessage.getBody( ) );
        }
        return (T) outBoundMessage.getBody( );
    } else {
        return null;
    }
}
Because of this, my application was hitting an OutOfMemoryError. I am not sure whether it is because of the Apache Camel library or not; kindly advise.
Other than that, when I open the heap dump file, 52% of the heap is attributed to com/ibm/xml/xlxp2/scan/util/SimpleDataBufferFactory$DataBufferLink.
The rest is reported as "Java heap is used by this char[] alone", which is a subcategory under DataBufferLink as well.
I googled this; everything I found talks about the XML message being too large.
I have no idea which direction I should take to continue troubleshooting; can you advise on this?
FYI, I am using camel-core-1.5.0.jar

getResourceAsStream returning null despite called file being in same dir as class getResourceAsStream is called in

I imported an Android sample by Amazon involving AWS's DynamoDB, which I got from here and which was presumably written for Eclipse:
https://github.com/awslabs/aws-sdk-android-samples/tree/master/DynamoDBMapper_UserPreference
Since Android Studio (0.8.1) uses Gradle instead of Ant, things naturally got moved around in terms of directory structure when importing, so (part of) it looks like this:
PropertyLoader gets the TVM credential info it needs to connect to DynamoDB from AwsCredentials.properties. Relevant methods:
public class PropertyLoader {

    private boolean hasCredentials = false;
    private String tokenVendingMachineURL = null;
    private boolean useSSL = false;
    private String testTableName = null;

    private static PropertyLoader instance = null;

    public static PropertyLoader getInstance() {
        if ( instance == null ) {
            instance = new PropertyLoader();
        }
        return instance;
    }

    public PropertyLoader() {
        try {
            Properties properties = new Properties();
            properties.load( this.getClass().getResourceAsStream( "AwsCredentials.properties" ) );

            this.tokenVendingMachineURL = properties.getProperty( "tokenVendingMachineURL" );
            this.useSSL = Boolean.parseBoolean( properties.getProperty( "useSSL" ) );
            this.testTableName = properties.getProperty( "testTableName" );

            if ( this.tokenVendingMachineURL == null || this.tokenVendingMachineURL.equals( "" ) || this.tokenVendingMachineURL.equals( "CHANGEME" ) || this.testTableName.equals( "" ) ) {
                this.tokenVendingMachineURL = null;
                this.useSSL = false;
                this.hasCredentials = false;
                this.testTableName = null;
            }
            else {
                this.hasCredentials = true;
            }
        }
        catch ( Exception exception ) {
            Log.e( "PropertyLoader", "Unable to read property file." );
        }
    }
However, the getResourceAsStream call in properties.load( this.getClass().getResourceAsStream( "AwsCredentials.properties" ) ); returns null. As you can see in my screenshot, AwsCredentials.properties is in the same directory as PropertyLoader and matches the case, which is all that should be required based on my reading of the method:
http://mindprod.com/jgloss/getresourceasstream.html
getResourceAsStream() is always returning null
I have tried other things such as prefixing "\" (i.e. properties.load( this.getClass().getResourceAsStream( "\AwsCredentials.properties" ) );) and copying the credentials file into the src folder (you can't see it in this screenshot because the explorer sorts by file type(?) and places 'main' first, but it's there) as per this:
getResourceAsStream returning null
However, that hasn't fixed the issue either. Having tried these options and done research, I'm confused as to why it's returning null. How can I fix this?
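For reference (not specific to Android Studio), the two lookup styles resolve resource names differently, which is often at the root of these null results; the file also has to actually end up on the classpath, which is what the Gradle changes below address. A short sketch, using a hypothetical package com.example.app:

import java.io.InputStream;

public class ResourceLookupDemo {

    public static void main(String[] args) {
        // Class.getResourceAsStream: a name without a leading slash is resolved
        // relative to this class's package, e.g. com/example/app/AwsCredentials.properties
        InputStream relative = ResourceLookupDemo.class
                .getResourceAsStream("AwsCredentials.properties");

        // a leading slash makes the name absolute from the classpath root
        InputStream absolute = ResourceLookupDemo.class
                .getResourceAsStream("/com/example/app/AwsCredentials.properties");

        // ClassLoader.getResourceAsStream: always absolute from the classpath root,
        // and the name must not start with a slash
        InputStream viaLoader = ResourceLookupDemo.class.getClassLoader()
                .getResourceAsStream("com/example/app/AwsCredentials.properties");

        System.out.println(relative != null);
        System.out.println(absolute != null);
        System.out.println(viaLoader != null);
    }
}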
I created a dir called resources under /src/main/, placed AwsCredentials.properties there, and used
properties.load( PropertyLoader.class.getClassLoader().getResourceAsStream( "AwsCredentials.properties" ) );
instead of
properties.load( this.getClass().getResourceAsStream("AwsCredentials.properties" ) );
Not as elegant as I would like, but it works.
I was struggling with this for almost a day as well, and finally I was able to resolve it quite neatly. The problem is not in the Java code but in the overall project structure. E.g. in Android Studio the whole project is under src/main/java, where main is a flavour of the project. So if you have file(s) to read from the source's package (e.g. com/my/example/app), you have to edit the build.gradle file for the read (clazz.getResourceAsStream(file)) to work properly, i.e. under android define sourceSets like this:
android {
    /* ... Your stuff ... */
    sourceSets {
        // Lets have two flavours to make it more clear
        main {
            resources.srcDirs = ['src/main/java']
        }
        flavourFoo {
            resources.srcDirs = ['src/flavourFoo/java']
        }
    }
}
Hope this helps!

Best Practice to Build a Java SDK for a REST API

I'm about to develop a Java SDK against a REST API and would like to know what the best-practice approach to building it would be. I've searched Google and also used a number of SDKs that connect to REST APIs, and there is never much consistency. I've come across some patterns I find interesting and would like to know which one could be considered best practice, if any, or whether there are alternatives.
I've provided sample / pseudo code to illustrate.
1) The models / requests / client are all separated. Example call:
Client client = new Client( ... credentials ... );
try {
    Something obj = client.post( new PostSomethingRequest( ..params... ) );
} catch( Exception oops ) { ...handle... }
try {
    Something obj2 = client.get( new GetSomethingRequest( id ) );
} catch( Exception oops ) { ...handle... }
2) The models and request are tied together and the client is separate. Example call:
Client client = new Client( ... credentials ... );
try {
    Something obj = client.post( new Something( ..params... ) );
} catch( Exception oops ) { ...handle... }
try {
    Something obj2 = client.get( new Something( id ) );
} catch( Exception oops ) { ...handle... }
3) The model contains everything. Example call:
Client.setCredentials( ... credentials ... );
Something obj = new Something( ..params... );
try {
    obj.post();
} catch( Exception oops ) { ...handle... }
try {
    Something obj2 = Something.get( id );
} catch( Exception oops ) { ...handle... }
If there are better ways of building this I'd also be glad to hear about them.
If you build an SDK for a specific REST API, I would use method names that represent the REST service calls rather than such generic ones.
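For instance, a hedged sketch of what such a resource-oriented client surface might look like (the names are illustrative only, reusing Something from the question; ClientException is hypothetical):

import java.util.List;

// illustrative interface only; not tied to any particular HTTP library
public interface SomethingClient {

    /** POST /somethings */
    Something createSomething(Something newSomething) throws ClientException;

    /** GET /somethings/{id} */
    Something getSomething(String id) throws ClientException;

    /** GET /somethings */
    List<Something> listSomethings() throws ClientException;

    /** DELETE /somethings/{id} */
    void deleteSomething(String id) throws ClientException;
}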
