Experience with Drools Flow and/or OSWorkflow? - java

I'm looking for a straightforward Java workflow engine that:
can handle both automated and manual (GUI-based) steps within a workflow
supports long-running, asynchronous tasks
provides support for restarting workflows in the event of a server crash
stores a full audit history of previously executed workflows
provides easy access to this audit history data
Possible candidates include the new Drools Flow process engine in Drools 5, and OSWorkflow from OpenSymphony. From my current understanding, OSWorkflow seems to offer more of what I want (Drools Flow doesn't appear to store much in the way of an audit history); however, the most recent release of OSWorkflow was back in early 2006. Is it a mistake to now use OSWorkflow when it's no longer under active development?
Does anyone have much experience with either/both of these frameworks? Are there any other workflow engines I should be looking at? All recommendations welcome - thanks.

Just to clarify how Drools Flow supports the requirements you are describing (referring to the Drools Flow documentation):
can handle both automated and manual (GUI-based) steps within a workflow
Drools Flow uses (domain-specific) work items (Chapter 8) to interact with external systems. These could be automated services, or a human task management component (Chapter 9) for manual tasks. This human task component is fully pluggable but Drools Flow supports a WS-HumanTask implementation out of the box. Drools 5.1 will include web-based task lists, including custom task forms.
supports long-running, asynchronous tasks
The engine allows you to start processes that can live for a long time. It supports different kinds of wait states (work item nodes, event nodes, event wait nodes, sub-processes, etc.) for modelling long-running processes. External tasks can be integrated synchronously or asynchronously.
provides support for restarting workflows in the event of a server crash
The runtime state of all process instances can easily be stored in a data source by turning on persistence (Chapter 5.1). Therefore, all processes can simply be restored to the state they were in before a server crash.
stores a full audit history of previously executed workflows
Drools Flow generates events about what is happening during the execution of your processes. By turning on audit logging (Chapter 5.3), these events can be stored in a database, providing a full audit history of whatever happened during execution.
provides easy access to this audit history data
The history data is stored in a few simple database tables. These tables can be queried directly, or you can use them to generate custom reports (Chapter 12.1) showing the key performance indicators that are relevant for your application.
Furthermore, we believe that a knowledge-oriented approach, allowing you to seamlessly combine processes with rules and event processing whenever necessary, will offer you more power and flexibility compared to the aforementioned process-oriented engines.
Kris Verlaenen
Drools Flow Lead

I've not had any experience with the candidates you've mentioned, but judging from the projects that I've worked on, it might be worth looking at jBPM. Quite a few developers I've worked with swear by it, and I think it fits your criteria quite nicely.

Drools Flow is a lot more sophisticated and powerful than both jBPM and OSWorkflow, and development is moving at a faster pace than either. We provide a lot of details and screenshots here:
http://www.jboss.org/drools/drools-flow.html
In summary: you get interactive debugging across rules, workflow and event processing, and a larger set of built-in nodes, which increases the number of problems you can model declaratively. Audit logging and reporting are correlated across rules, processes and events. We also provide a very simple yet powerful mechanism for building domain-specific workflows via our pluggable work items.
Drools 5.0 has just been released and 5.1 is going to follow in the next 4 to 6 weeks. We are adding simulation and testing for this, using an MVEL DSL, which we believe will be a huge hit. This will also include more extensive work for remote admin GUIs for processes, all integrated into Guvnor.
The Drools team also prides itself on being more accessible than any of the other projects mentioned. Feel free to pop onto IRC for a chat.
Mark

I have experience with both. I was also involved in a tool for migrating existing OSWorkflow processes to Drools 5.0; you can read an article about that at: http://blog.athico.com/2009/01/drools-flow-and-osworkflow-migration.html. It is worth mentioning that this migration tool/translator was created to allow old projects that are using OSWorkflow to upgrade to Drools 5.0 and take advantage of the full Drools 5.0 Business Integration Platform.
Greetings


How to effectively manage a bunch of jar files and their plumbing?

This is a rather high-level question so apologies if it's off-topic. I'm new to the enterprise Java world.
Suppose I have written some individual Java packages that do things like parse data feeds and store the parsed information to a queue. Another package might read from that queue and ingest those entries into a rules engine package. Tripped alerts get fed into another queue, which is polled by an alerting service (assume it's written in Python) that reads from the queue and issues emails.
As it stands I have to manually run each jar file and stick it in the background. While I could probably daemonize some or all of these services for resiliency or write some kind of service manager to do the same, this strikes me as being very amateur. Especially since I'd have to start a dozen services for this single workflow at boot.
I feel like I'm missing something, but I don't know what I don't know. Short of writing one giant, monolithic application, what should I be looking into to help me manage all these discrete components and be able to (conceptually) deliver a holistic application? I'd like to end up with some sort of supervisor where I can click one button, it starts/stops all the above services, provides me some visibility into their status and makes sure the services are running when they should be.
Is this where frameworks come into play? I see a number of them but don't know if that's just overkill, especially if I'm not actively developing a solution for that framework.
It seems you architected a system with a lot of components, and then after some time you decided to aggregate some of them because they happen to share the same programming language: Java. So, first a warning: this is not the best way to wire components together.
Also, it seems you don't know Java very well, because you mix up terms like package, jar and executable, which are distinct concepts.
However, let's assume that the current state of the art is the best possible and is immutable. Your current requirement is building a graphical interface (I guess HTTP/HTML based) to manage all the distinct components of the system written in Java. I suggest you use a single JVM, write your components as EJBs (essentially a start() method, a stop() method and a method to query the component's state that returns a custom object), and finally wire everything up with the Spring framework, which has nice annotation-driven configuration via @Bean.
Spring Boot also has an actuator package that simplifies exposing objects. You may also find it useful to register your beans as managed beans (MBeans) and use the Hawtio console to administer them (via a Jolokia agent).
I am not sure if you're actually using J2EE (i.e. Java Enterprise Edition). It is also possible to write enterprise software in J2SE. J2SE doesn't have much available off the shelf for this, but on the other hand there are plenty of micro-frameworks, such as Ninja, and full-stack frameworks, such as the Play framework, which work quite well, are much easier to program with, and perform much better than J2EE.
If you're not using J2EE, then you can go as simple as:
make one new Java project
add all the jars as dependency to that project (see the comment on Maven above by NimChimpsky)
start the classes in the jars by simply calling their constructor
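As a sketch of how that might look, assuming each jar's entry class is wrapped in a common lifecycle interface (the Service interface and QueueIngestService class here are invented for the example, not part of any framework):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical common lifecycle interface that each component implements.
interface Service {
    void start();
    void stop();
    boolean isRunning();
}

// Stand-in for one of the real components (feed parser, queue reader, etc.).
class QueueIngestService implements Service {
    private volatile boolean running;
    public void start() { running = true; }   // real code would spawn a worker thread here
    public void stop()  { running = false; }
    public boolean isRunning() { return running; }
}

// One main class registers every component and starts/stops them together.
public class ServiceManager {
    private final List<Service> services = new ArrayList<>();

    public void register(Service s) { services.add(s); }

    public void startAll() { services.forEach(Service::start); }

    public void stopAll() {
        // stop in reverse registration order, mirroring dependency order
        for (int i = services.size() - 1; i >= 0; i--) services.get(i).stop();
    }

    public long runningCount() {
        return services.stream().filter(Service::isRunning).count();
    }
}
```

This is essentially a poor man's version of what an application server or an orchestrator would give you, but it does put everything behind a single start/stop button.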
This is quite a naive approach, but it can serve you at this point. Of course, if you're aiming for a scalable platform, there is a lot more you need to learn first. For scalability, I suggest the Play framework as a good start. Alternatively, you can use Vert.x, which has its own message queue implementation as well as support for high-performance distributed caches.
The standard J2EE approach is doable (and considered de facto in many old-school enterprises) but has fundamental flaws, or "differences", which make for a very steep learning curve and a very much non-scalable application.
It seems like you're writing your application in a microservice architecture.
You need an orchestrator.
If you are running everything on a single machine, a simple orchestrator that you are probably already running is systemd. You write a systemd service description (a unit file), and systemd will maintain your service according to that description. You can specify the order services should be brought up in based on dependencies between them, the restart policy if a service goes down unexpectedly, logging for stdout/stderr, etc. Note that this is the same systemd that runs the startup sequence of most modern Linux distros.
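For instance, a minimal unit file for one of the components might look like this (the service name, jar path, user and the queue dependency are placeholders):

```ini
# /etc/systemd/system/feed-parser.service
[Unit]
Description=Feed parser service
After=network.target rabbitmq-server.service

[Service]
ExecStart=/usr/bin/java -jar /opt/myapp/feed-parser.jar
Restart=on-failure
User=myapp

[Install]
WantedBy=multi-user.target
```

After placing the file under /etc/systemd/system, systemctl enable --now feed-parser starts the service and keeps restarting it on failure, and systemctl status feed-parser shows its state and recent log output.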
If you're running multiple machines, you can still keep using a single-machine orchestrator like systemd, but usually the requirements for the orchestrator become more complex. With multiple machines, you now have to take into account things like moving services between machines, phased roll-outs, etc. For these setups, there is software that adapts systemd for multi-machine orchestration, like CoreOS's fleetd, and there are also standalone multi-machine orchestrators like Kubernetes. Both use Docker as the application container mechanism.
None of what I've described here is Java specific, which means you can use the same orchestration for Java as you used for Python or other languages or architecture.
You have to choose. As Raffaele suggested, you can write all your requirements into one app/service. That seems feasible using Java EJBs, or using Spring Integration's amqpTemplate (you can write to a queue with amqpTemplate and receive the message with a dedicated listener).
Or choose a microservices architecture: write one service that pushes to the queue, another that contains the listener, and so on. That's a task that can be done easily with Spring Boot.
"One button to control them all" - in the case of a monolithic app - it's easy.
If you choose a microservices architecture, it depends on your needs. If it's just start/stop operations, I guess starting and stopping your Tomcat (or other server) will do. For other metrics, there is a variety of solutions; again, it depends on your needs.

distributed transaction .net java SAP

I use several systems connected in a service-oriented architecture; they include .NET technology, a Java WebLogic service, and SAP RFC.
Is it possible to achieve a distributed transaction across these different technologies? For example, when updating SAP fails, we need to make sure the .NET and Java transactions won't happen at all.
I'd really appreciate your input, or maybe you can point out where we can learn the basics of how to do this?
I would do this as follows:
Each interaction with the service is a Command with execute() and undo() methods defined.
Maintain a stack of commands, push each command once it is successful.
When a command fails, i.e. interaction with a service fails, pop each command from the stack and call undo().
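A plain-Java sketch of that command stack (the Command interface here is a stand-in for your actual service interactions, not an existing API):

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Each service interaction is modelled as a command with a compensating undo.
interface Command {
    void execute();
    void undo();
}

public class SagaRunner {
    // Executes commands in order; on any failure, undoes the completed ones
    // in reverse order. Returns true only if every command succeeded.
    public static boolean run(Iterable<Command> commands) {
        Deque<Command> done = new ArrayDeque<>();
        try {
            for (Command c : commands) {
                c.execute();
                done.push(c);          // push only after success
            }
            return true;
        } catch (RuntimeException e) {
            while (!done.isEmpty()) {
                done.pop().undo();     // compensate in reverse order
            }
            return false;
        }
    }
}
```

In the SAP example from the question, the SAP update would be one Command, and its failure would trigger undo() on the already-executed .NET and Java commands.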
You might look for any Distributed Transaction APIs in .NET that would provide this functionality out of the box, though.
I would not recommend using distributed transactions between systems, as it creates a lot of coupling between them and increases the load on each system as it holds locks for the others (you can see a write-up I did on this called "transactional integration anti-pattern").
Usually a better option, as suggested by Vikdor is using a Saga which you can coordinate manually or by some external coordinator (e.g. using Orchestration).
Note that compensation logic might not be able to completely undo the operation (e.g. in a retail scenario, if an order caused you to hit an inventory threshold and order from your supplier, cancelling that order doesn't have to cancel your order from the supplier; also, you may not refund the full sum, etc.).

java bpm for large data and multi-instantiation of sub-workflows

Can anyone suggest an open-source Java BPM/workflow solution for operating on a large number of records in the database? These records may number in the millions, and every record will create a sub-workflow to perform string operations on it.
We have used Bonita Open Solution for this, but to no avail, as the execution/initiation time it takes is too long. We need a BPM engine that is lightweight and fast.
Any suggestion will be highly appreciated. Thanks!
jBPM - http://www.jboss.org/jbpm/
jBPM is a flexible Business Process Management (BPM) suite. It bridges the gap between business analysts and developers. Traditional BPM engines have a focus that is limited to non-technical people only; jBPM has a dual focus: it offers process management features in a way that both business users and developers like.
Activiti BPM Platform - http://activiti.org/
Activiti is a light-weight workflow and Business Process Management (BPM) platform targeted at business people, developers and system admins. Its core is a super-fast and rock-solid BPMN 2 process engine for Java. It's open source and distributed under the Apache license. Activiti runs in any Java application, on a server, on a cluster or in the cloud. It integrates perfectly with Spring, it is extremely lightweight and based on simple concepts.

Java based workflow engine required

I am looking for a Java-based workflow engine that is powerful enough to offer most workflow features but simple to implement.
Features like:
transition from one state to another based on approvals by people who have permission to approve a state
automatic transition if all the required fields/data are available
single or multiple people required to approve a state
a visual editor to create the workflow
transitions that can only happen if certain fields have particular values (i.e. rules)
sending a notification to the approver, and once approved, notifying people watching the state
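To make the requirement concrete, the rule-gated, approval-driven transition described above is essentially a guarded state machine. A minimal sketch in plain Java, with invented state, role and field names, might look like:

```java
import java.util.Map;
import java.util.function.Predicate;

// Hypothetical guarded transition: a step moves from PENDING to APPROVED only
// when the caller has the approver role and the field rule holds.
public class ApprovalStep {
    public enum State { PENDING, APPROVED }

    private State state = State.PENDING;
    private final Predicate<Map<String, String>> rule;

    public ApprovalStep(Predicate<Map<String, String>> rule) {
        this.rule = rule;
    }

    public boolean approve(String approverRole, Map<String, String> fields) {
        // Only an "approver" may trigger the transition, and the rule must hold.
        if (state == State.PENDING && "approver".equals(approverRole) && rule.test(fields)) {
            state = State.APPROVED;
            // a real engine would notify the watchers of this state here
            return true;
        }
        return false;
    }

    public State state() { return state; }
}
```

A real engine layers persistence, multiple approvers, and notifications on top of this core, which is why a product with a visual editor is usually worth it.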
It sounds like an implementation of the Business Process Execution Language (BPEL) is what you want.
To add to Samuel's suggestion, I'd suggest having a look at the videos on the OpenESB website. The NetBeans plugin is quite powerful.
More generically, see also Wikipedia's Comparison of BPEL engines.
Edit: I've also spent some time over the past week or so using Bonita Open Solution (GPLv2). In my opinion, it's superior to jBPM and Activiti. The range of plugins (i.e. business logic or notifications that you can farm off to third parties) is very impressive, and the GUI editor is pretty easy to use. Furthermore, it comes out of the box with a pretty easy-to-use portal feature, which means you can quite quickly do mock-ups of proper workflow tasks, assignments, etc. within a web portal. My only criticism at the moment is that I wish they'd provide a more user-friendly way to connect to web services (it can still be done, it's just a bit fiddly). Also, the forums are actively staffed, and questions usually get answered by their employees within a day.
Activiti (http://activiti.org/) is the new jBPM offspring. It looks promising, and if you are starting with workflow now, I would go for it.
How about JBoss jBPM?

What's the limit of BONITA Platform ? Can a Business Analyst generate a whole Java Web App with just pre-programmed connectors?

I'm studying some systems that allow faster app dev cycles, and I stumbled upon BONITA. It seems that by preparing some connectors you can allow a business user to generate a whole app.
What's the limit of Bonita and what is needed to improve it ?
Bonita is one of probably many business process tools, and like all such tools there are good and bad things. Yes, they allow some business processes to be automated by users by dragging and dropping components, or process steps, in some sort of GUI. And like other systems, say SAP for example, if you fit precisely in the SAP box it can get you going much faster and easier. I've never met any business that fit "exactly" in the SAP box, though, and usually there are years spent customizing it.
In my experience, the users usually give up on this after a while and want developers to do it for them when they hit something that doesn't quite do what they thought it would. The developers try it out, become frustrated with the limitations or just get overwhelmed by NIH syndrome, and the tool is soon abandoned. Alternatively, and perhaps more concerning, is the true user enthusiast who quite innocently takes down a production database in the middle of the day trying to look at their data in new ways, but without training wheels.
YMMV
Actually, I'm evaluating Bonita Open Solution (BOS) for internal use in a small company, and I think it's a great platform to build workflow applications on. I believe that a manager with basic BPM knowledge can quite easily prototype the whole process, including user interaction forms, some of the process variables and conditions on flows, and simulate it anytime with one click. Since you're modelling the application, you can completely reorganize the flow when optimizing the process. At any time a programmer can come and implement interactions with external services using a connector, again not a hard thing. Once you are ready, just export the application and deploy it on a server.
Bonita has a simple but powerful Java API; through it you can work around most limitations.
A nice thing about Bonita is its active development. The forum is also quite supportive, and even the BOS developers answer questions in a short time (hours, or a few days on weekends).
