Rick


Thursday, February 3, 2011

Thoughts on Activiti deployments and usage


I am trying to get some ideas on how people deploy Activiti. Do you use it embedded? Did you just extend the REST-API war project? Do you actually deploy jar files to the REST-API war project? Etc.
We have a project that is going to manage a series of batch jobs, described here under "problem domain." The first phase of the project will require no human interaction (HI), but future phases will.
I don't want to include HI, but I don't want to preclude it either.
To use Activiti with all of its tooling, tracking, auditing, and process viewing, we need to deploy our jars under the activiti-rest webapp. Essentially, the tools (Modeler, Probe, Explorer, etc.) use the Activiti REST API (I think).
I think we want to reduce this type of deployment as much as possible; i.e., we don't want to be deploying jar files full of tasks into that webapp.
I assert: if we want custom tasks and want to use the Activiti tooling (likely, as per our last meeting at work), then we will have to deploy Java classes (jar files) into that webapp, or create a webapp that encompasses both our tasks and the REST API. I don't like this approach, and I don't think we could get it past the SCM group.
The other option is to get a process id for the workflow, send it around, and have the clients of the workflow mark the current task as done (which advances it to the next task or branch decision). This approach relies heavily on the API and, in my opinion, violates separation of concerns.
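To make the process-id approach concrete, here is a minimal sketch of the idea in plain Java. The class and method names are illustrative only, not the Activiti API: clients hold nothing but a process id and call back to mark the current task done, which advances an in-memory cursor over the steps.

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicLong;

// Illustrative sketch of the "pass a process id around" option.
// A real deployment would call the Activiti engine instead of this map.
public class ProcessIdFlow {
    private final List<String> steps = List.of("extract", "transform", "load");
    private final Map<Long, Integer> cursor = new ConcurrentHashMap<>();
    private final AtomicLong ids = new AtomicLong();

    // Start a new process instance and hand back only its id.
    public long start() {
        long id = ids.incrementAndGet();
        cursor.put(id, 0);
        return id;
    }

    // What a client (batch node) would see as its current task.
    public String currentTask(long processId) {
        int i = cursor.get(processId);
        return i < steps.size() ? steps.get(i) : "finished";
    }

    // The client marks its task done; the flow advances to the next step.
    public String completeCurrentTask(long processId) {
        int i = cursor.merge(processId, 1, Integer::sum);
        return i < steps.size() ? steps.get(i) : "finished";
    }
}
```

The drawback shows up right in the sketch: every client needs a reference to the flow (in reality, the Activiti API or a wrapper around it), which is the separation-of-concerns problem noted above.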
Our original idea (Ian's and mine) was to use a message bus (Spring Integration) to mark each task done: each task sends out a message, one or more batch processes (living in another JVM) handle that step, and they send a "done" message when they finish.
The other possibility is to run embedded and not use the tools (no chance of HI in the future, and we lose a lot of the workflow's tracking ability).
I can send out some code examples and diagrams with pros and cons of each approach.
Options:
  • Option 1: Run embedded to avoid deployment of custom tasks 
Can we still use the Activiti tooling? Perhaps just to visualize, but we can't run the process remotely; it must run locally, since custom tasks will not be deployed.
    • Custom tasks are embedded with custom nodes
If processing nodes exist on two different boxes, how do we send the id around?
    • If it all exists on one box, then it will call into our existing custom service bus
  • Option 2: Send process id around, have each node be responsible for marking its task done
    • We could still use the tooling
This does spread the code around; we could wrap it in a facade/lib
    • How do the nodes get the process id?
    • Seems like we need messaging
  • Option 3: Have the workflow be the message coordinator
    • Create one custom task that fires a start process message via Spring Integration (and passes name/value pairs of the execution context)
    • This custom task waits until it gets a message back that says that step is done, then it marks the task done
    • Use the complete tooling of Activiti to see processes and manage them
    • The message listeners can be Spring Managed Message Driven Beans, we can use JMS and Spring Integration for the message delivery
  • Option 4: Create custom war file based on activiti REST API war file
    • Allows use of Activiti tooling
    • We need to carry the source or break it up into jar files
    • Deploying new tasks is just a matter of rebuilding our war file
    • Calls out to services in our custom service bus
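The heart of Option 3 above can be sketched in plain Java. In-memory BlockingQueues stand in for the JMS/Spring Integration channels, and the class and method names are illustrative assumptions, not Activiti or Spring APIs: the single custom task fires a "step start" message and blocks until a worker (conceptually in another JVM) replies that the step is done.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Illustrative sketch of Option 3: the workflow's one custom task
// publishes a message and waits for a "done" reply. Queues simulate
// the JMS/Spring Integration request and reply channels.
public class MessageCoordinator {
    private final BlockingQueue<String> requests = new ArrayBlockingQueue<>(10);
    private final BlockingQueue<String> replies = new ArrayBlockingQueue<>(10);

    // The custom task body: fire a message, then wait for completion
    // before marking the workflow task done.
    public String runStep(String stepName) {
        try {
            requests.put(stepName);   // would be a JMS/SI channel send
            return replies.take();    // would be a reply-channel receive
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            throw new IllegalStateException(e);
        }
    }

    // A processing node that would really live in another JVM,
    // consuming step messages and replying when the work is done.
    public void startWorker() {
        Thread worker = new Thread(() -> {
            try {
                while (true) {
                    String step = requests.take();
                    // ... do the real batch work for this step here ...
                    replies.put(step + ":done");
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        worker.setDaemon(true);
        worker.start();
    }

    public static void main(String[] args) {
        MessageCoordinator c = new MessageCoordinator();
        c.startWorker();
        System.out.println(c.runStep("extract"));  // prints "extract:done"
    }
}
```

Note that only the coordinator knows about the workflow; the worker sees nothing but messages, which is exactly the decoupling that makes Option 3 attractive.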
To me, the only options that make sense for our goals are Option 3 with some Option 2 (mostly Option 3, with a little Option 2 where needed) or Option 4.
I don't like Option 1 because we lose a lot of the tooling, auditing and future human interaction.
If I had to pick, I would pick Option 3. It largely decouples us from Activiti: we just create one custom task that sends a message and waits until it gets a continue or fail message. We can integrate easily with all other tasks, including HI.
None of our code (processing nodes) will rely on a workflow engine. Also, there is already some work done to integrate Spring Integration and Activiti, so we might not even have to create any custom tasks; we could just use that integration as is.
Based on internal meetings we have had it sounds like we are going to do either Option 1 or Option 4.