I have debated this issue multiple times, and the fact that it came up again in two distinct discussions today prompted this blog entry:
Whether it’s workflow, orchestration, business process management, or a similar concept under some local name familiar to you, one always seems to end up looking for a flexible, accessible way to define how data flows from one processing node to another. In an XML-centric world, you’ll think about XML documents that get transformed, separated into parts, aggregated again; and your search for the ideal solution always seems to follow a series of steps that look something like this:
- You start out thinking this is just a simple static configuration issue, and cook up your own little XML language.
- You discover that it’s a little more than plain configuration and that you need some control structures, so you start adding them one after another.
- You find out that this sucks and start looking for some standard that fulfills your needs.
- After endless debates, you end up with some open source or commercial solution based on some half-baked standard that you just can’t seem to really like.
To avoid going through all of this every single time, it helps to think about what, exactly, it is that you want to achieve. For all of the approaches above, you’ll need some engine capable of interpreting (or executing) your configuration (or process definition). Now let’s just say you pick the JVM as your engine, and Java as your process definition language — what exactly do you lose? Not dynamic enough? When was the last time you changed a process already in deployment without serious consideration? Even so, pick Jython or Groovy — now what is the difference? How about using some other scripting language?
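To make this concrete, here is a minimal sketch of what such a "process definition" looks like when it is just ordinary code — Python standing in for Jython or any other scripting language. All names (`split_items`, `transform`, `aggregate`, the `<order>` document) are illustrative, not taken from any framework or standard:

```python
import xml.etree.ElementTree as ET

def split_items(doc):
    """Separate an <order> document into one fragment per <item>."""
    return [ET.tostring(item, encoding="unicode")
            for item in ET.fromstring(doc).iter("item")]

def transform(item_doc):
    """Uppercase the sku attribute -- a stand-in for any per-part transformation."""
    el = ET.fromstring(item_doc)
    el.set("sku", el.get("sku", "").upper())
    return ET.tostring(el, encoding="unicode")

def aggregate(item_docs):
    """Reassemble the transformed parts into a single result document."""
    root = ET.Element("result")
    for d in item_docs:
        root.append(ET.fromstring(d))
    return ET.tostring(root, encoding="unicode")

def process(doc):
    # Control flow -- looping, branching, error handling -- comes for free
    # from the host language instead of a homegrown XML dialect.
    parts = [transform(p) for p in split_items(doc)]
    return aggregate(parts)

order = '<order><item sku="ab1"/><item sku="cd2"/></order>'
result = process(order)
```

The point is not this particular pipeline, but that the moment you want a condition or a loop here, you write `if` or `for` — you don’t bolt a new element onto a configuration schema.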
Call me a heretic, but I believe that every time somebody tries to sell you some XML-based, pseudo-declarative language, it either a) is Turing-complete or b) sucks. And I have not yet seen convincing arguments that this would be different for BPEL.