SWAMP - Workflow Administration and Management Platform

Creating a Workflow Definition


The definition files are stored in XML. XML makes it easy to define hierarchical structures, and XML files are easy to verify and to parse. However, editing XML files in a text editor is not particularly comfortable or straightforward. Since the syntax highlighting capabilities of modern editors improve the situation, and since a GUI for creating the definition files could be written relatively easily, this disadvantage was considered acceptable for SWAMP.

This chapter explains how to write a workflow definition file for SWAMP. All shown snippets are taken from the workflow "Example" that is included in the SWAMP release, so it should be easy to learn how to define a workflow and to play around with it.

Note: The system expects all text files to be UTF-8 encoded to be able to handle special characters.

The XML Workflow Definition File

The syntactical rules for a workflow definition are specified in the workflow DTD, located at conf/dtds/workflow.dtd. If a workflow definition is malformed, does not validate against the DTD, or has other semantic errors, SWAMP will refuse to load it. The workflow definition file starts with an XML header consisting of an XML declaration and a document type declaration:

<?xml version="1.0" standalone="no" ?>
<!DOCTYPE workflow SYSTEM "../../../dtds/workflow.dtd">

This also defines the root (top-level) element, which is called workflow. The workflow element requires the attributes name (which has to be the same as the directory the workflow version is stored in), version (the version of this workflow) and leastSWAMPVersion (the minimum version of SWAMP this workflow requires to run on).

If the workflow is a subworkflow, you additionally have to set the attributes parentwf (the parent workflow name) and parentwfversion (the version of the parent workflow). With this additional information the workflow verifier is able to perform some checks on the workflow.

<workflow name="ExampleWorkflow" version="0.1" leastSWAMPVersion="1.2">
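If this workflow were itself a subworkflow, its opening element would additionally carry the parent attributes described above. A hypothetical sketch (the name and version values are made-up examples):

<!-- hypothetical subworkflow header; the name/version values are examples -->
<workflow name="ExampleSub" version="0.1" leastSWAMPVersion="1.2"
	parentwf="ExampleWorkflow" parentwfversion="0.1">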
A raw view on the structure of the workflow definition looks like this:

<workflow ... >
	<!-- Meta information of that workflow -->
	<!-- definition of roles needed in that workflow -->

	<node ...>
	<!-- definition of nodes including actions and conditional 
		edges to other nodes -->

	<dataset ...>
	<!-- definition of the workflow's dataset -->
</workflow>

Now we are going to show the details of each workflow element and describe the example implementation as found in the ExampleWorkflow workflow.

Metadata + Role definitions

The metadata element contains the mandatory elements templatedescription, description and roles.

	<templatedescription>Workflow for Testing Issues</templatedescription>
	<description>Workflow for Testing Issues</description>
	<roles>
		<role name="owner" restricted="true" type="databit" />
		<!-- @type is set to "value" by default -->
		<role name="admin" restricted="true">
			<rolevalue>swamp_user, swamp_admin</rolevalue>
		</role>
		<role name="starter" restricted="true" type="reference" />
		<role name="user" restricted="false" type="databit" />
		<!-- reference the group with name "supporter" from the database: -->
		<role name="supporter" restricted="true" type="dbreference" />
	</roles>

templatedescription contains the general description of this workflow type, whereas description describes a single workflow instance and may be dynamically modified with script content as described later.

The shown role definitions are the minimum set of required roles. The standard roles owner, admin, starter and user have to be defined in each workflow; these roles were already explained in the previous chapter. Each role has a restricted flag with which you can turn the role off: if restricted is set to false, every logged-in user automatically has that role and is allowed to perform actions that require it. If restricted is set to true, you have two options:

  • Specify a rolevalue, as the example does for the role admin. This way all workflow instances share the admin definition of the template, and if you want to add a new admin to all workflows of that type, just add the new admin to the template and reload the template. These roles are called "static" roles in SWAMP.

  • Specify a target databit (roledatabit) where the usernames of the users that are assigned to that role are stored. This way the members of that role can be changed individually for each workflow instance.

The owner role is automatically set to the user that has started the workflow. To change the users that are assigned to a role in a workflow instance simply edit the corresponding Databit.
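As a sketch of the second option, a role definition could reference a target databit via the roledatabit element mentioned above (the role name and databit path here are made-up examples):

<!-- hypothetical role whose members are stored in a databit,
	so they can differ per workflow instance -->
<role name="reviewer" restricted="true" type="databit">
	<roledatabit>testdataset.roles.reviewer</roledatabit>
</role>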

Merging roles:

To merge two or more roles into a new one, use the roleref element:

<role name="mergerole" restricted="true" type="reference">
	<description>Merged role</description>
	<!-- roleref elements referencing the roles to merge go here -->
</role>


Nodes

A raw overview of a node definition looks like this:

<node [type="start"] name="start">
	<description>Mandatory start node</description>
	<!-- duedate definition -->
	<duedate ... />
	<!-- definition of a milestone -->
	<milestone ... />
	<!-- definitions of included actions -->
	<action ... />
	<!-- definitions of edges leaving the node -->
	<edge ... />
</node>
Each workflow must have exactly one node with the attribute type="start", because this node gets activated on workflow creation and starts the process. Normal nodes do not have the type attribute set, whereas nodes with type="end" mark endpoints of a workflow and signal that the workflow is finished. When an end node is reached, all remaining active tasks are canceled and the workflow disappears from the "running" list of workflows. The name attribute is the unique identifier of a node.
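For illustration, a minimal end node could look like this (the node name and description are made-up examples):

<!-- hypothetical end node: entering it finishes the workflow -->
<node type="end" name="finished">
	<description>Workflow is finished</description>
</node>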

Nodes can include a milestone that marks a point of significant progress and makes it easier to track the progress of a complex workflow in the GUI. A node can have any number of jobs to be done; in SWAMP, jobs are called actions. If a node is entered, all included actions get activated. Finally, a node can define edges and conditions that determine which edge should be taken in which situation.


Milestones

Milestones can help to generate a linear list of points in the workflow that show the progress of a process in a simple way and hide the workflow complexity. Milestones can be rendered nicely in the workflow list pages of webSWAMP and give a fast overview of the workflow's progress. An example milestone definition looks like this:

<milestone name="m1" weight="5">
	<description>Milestone 1 reached</description>
</milestone>
The description will be displayed to the user in the GUI, and the weight factor orders the milestones, imposing a linear order that is often not inherent to complex workflow scenarios.


Duedates

Each node can be marked with a "duedate", a date by which the node should be left. Leaving a node usually happens when the contained tasks are done, depending on the conditions attached to the leaving edges. Workflows with duedates that are close to their target time, or already late, can be displayed in yellow or red in the GUI's workflow lists to show that something is stuck. The XML definition:

<duedate databit="testdataset.duedate1"/>
databit contains the path to the databit where the duedate value is stored. Usually this value is set there by a preceding dataedit action or by some automatic mechanism.

Actions - The tasks that have to be done

There are two classes of available actions: system-actions and user-actions.

  • System-actions are jobs that are done automatically by the system, such as sending notifications, starting subworkflows or running scripted actions.

  • A user-action needs the interaction of the assigned user/role in the GUI. This can be entering data, making a decision or just clicking "OK" at some point of the workflow.

The actions have some attributes in common: The attribute name always contains a unique identifier for that action, and the description element contains a description for that action. The user-actions have the following attributes in common:

  • role contains the name of the assigned role. The definition of roles was described earlier in this chapter. This attribute is not mandatory, but if you don't specify a role for an action, you cannot send notifications to the assigned role or restrict the execution of the action to the role members.

  • restricted (boolean) defines whether the execution of this action is only allowed to users that are in the configured role, or whether every valid workflow user may perform this action. This attribute is not mandatory and is set to false by default.

  • notificationtemplate contains the path to the mail-notification template. When the action gets activated the users of the assigned role will get notified.

  • mandatory is a hook for the GUI to distinguish between "maintenance" tasks and important tasks. Only mandatory tasks are shown in the workflow lists. Default is "true".
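Putting the common attributes together, a user-action header might look like this (the action name, event string and template path are made-up examples; the attribute names are the ones listed above):

<!-- hypothetical manualtask using the common user-action attributes -->
<manualtask name="approve_release" eventtype="RELEASE_OK"
	role="admin" restricted="true" mandatory="true"
	notificationtemplate="templates/approve_notify">
	<description>Please approve the release</description>
</manualtask>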

Dataedit (user-action)

A dataedit action asks the user to enter or edit certain fields of the workflow's dataset. Example code:

<dataedit name="dataedit1" eventtype="DATAEDIT2_OK">
	<description>Please fill in the fields.</description>
	<longdesc>Workflow threads are united now</longdesc>
	<field path="testdataset.product.product_name" mandatory="yes" />
	<field path="testdataset.roles.manualtask_owner" mandatory="yes" />
</dataedit>
Here, the user gets a view where he has to edit 2 fields of the dataset. The mandatory attribute determines whether the user may leave the field blank. If an entered value does not fit the datatype of the field, it will not be saved and the user will get an error message. When no errors are reported, the event "DATAEDIT2_OK" is sent to the workflow and the task is done.

Manualtask (user-action)

Manualtask is a simple acknowledgement by the user. The content of the description and longdesc elements is shown, and when the user clicks "OK" the defined event is sent out. This task can be used to acknowledge that some work has been done outside the system, or a manager can give his OK for the workflow to continue.

<manualtask name="manualtask3" eventtype="UNITE">
	<description>Go on in the Workflow</description>
</manualtask>

Decision (user-action)

A Decision presents the user with a description and some possible options. Each answer is connected with an event that will get sent when the user has made a choice.

<decision name="decision" >
	<description>Testing the CDs</description>
	<question>Please test the CD</question>
	<answer eventtype="PATH1">Take Path 1 in this Workflow.</answer>
	<answer eventtype="PATH2">Take Path 2 in this Workflow.</answer>
	<answer eventtype="PATH3">Take Path 3 in this Workflow.</answer>
</decision>

Notification (system-action)

The notification action sends out notifications at any point of the workflow progress. Based on a notification template, any number of recipients can be added, as the following snippet shows:

<notification name="notify_owner">
	<recipient recipientemail="please_change@swamp.swamp"/>
	<recipient dbit="testdataset.roles.user"/>
	<recipient recipientrole="user"/>
	<recipient recipientname="swamp_user"/>
</notification>
Recipients can be configured directly by their mail address, by their SWAMP username, as all users that have a certain role in that workflow, or as all usernames or mail addresses that are contained in a certain databit.

Customtask (system-action)

A customtask allows the instantiation of a custom Java class at runtime. Thus it is possible to invoke a piece of code at any point of a workflow. The usage of this action is not recommended, as the Java class has to be compiled and must be available to the Tomcat classloader; it cannot be contained in a workflow resource bundle. Please use a scriptaction when possible.

<customtask name="custom_test" eventtype="none" 
	function="customTest" >
	<description>Calling CustomActionExample.customTest()</description>
</customtask>
The invoked method must have the following signature:
public Boolean customTest(Integer wfid, Integer userId) throws Exception
A customaction can be used for calling external programs on the server. An example on how to do this is included in the de.suse.swamp.custom.CustomActionExample class.

Startsubworkflow (system-action)

This action starts a new workflow and attaches it as a subworkflow to the workflow from which it was called. A subworkflow may be a complex sub-process in a workflow that happens one or more times. A subworkflow is defined the same way as a normal workflow; it is just a normal workflow that has a reference to its parent workflow and sends an event to its parent workflow on finish. The definition looks like this:

<startsubworkflow name="startsub" subname="Example" subversion="0.1">
	<description>Starting Subworkflow</description>
</startsubworkflow>
When a subworkflow is started, it is given a reference to the datasets of its parent workflow; this way the subworkflow has access to its parent workflow's data. In addition, it is checked whether there are databits in the subworkflow's default dataset that have the same name as databits in the default dataset of the root workflow. If there are, their content is copied from the root workflow to the subworkflow.

Sendevent (system-action)

The sendevent action can send out events dependent on time constraints. With this action we are able to implement a reminder after a certain time, or take any other action in the workflow if a node hasn't been left in time.

<sendevent name="reminder" eventtype="DELAY_1D">
	<triggerdate databit="System.path2.enterDate" 
		offset="+1d" />
</sendevent>
In this example the event "DELAY_1D" gets sent if the node isn't left one day after it was activated. The databit attribute can reference a normal databit that contains a date value, or, as in this case, a special "pseudo" databit that contains workflow values. More details about these system databits can be found in a later section. The offset parameter has the format "+[0-9][m|h|d]", standing for an amount of minutes, hours or days.

Note: if you use time triggers for intervals shorter than 30 minutes, you have to reconfigure the scheduler thread to run more often. This can be done in the database table TURBINE_SCHEDULED_JOB.

The sendevent action does not need a triggerdate; if none is set, the event is sent immediately. It is also possible to send events to other workflows with this action, for example:

<sendevent name="reminder" eventtype="DELAY_1D">
	<targetwfs>
	#foreach ($subwf in $wf.getSubWorkflows(true))
		...
	#end
	</targetwfs>
</sendevent>
This action sends the event DELAY_1D to all attached subworkflows. The targetwfs element expects a comma-separated list of workflow ids, which can be generated by Velocity scripting.

Scriptaction (system-action)

A scriptaction allows you to invoke Velocity and Groovy scripts at a certain point of the workflow. These scripts are executed in a special environment where they have some limited access to workflow objects. This is an example:

<scriptaction name="script_example" language="velocity">
	<description>Setting comment</description>
	<!-- velocity script body follows here -->
</scriptaction>

<scriptaction name="script_example" language="groovy">
	<description>Setting comment</description>
	<!-- groovy script body follows here -->
</scriptaction>
The workflow object is available as ($)wf in the script context. It can be altered in almost any way, like changing a databit as in the example, or activating and deactivating nodes. This assumes you know what you are doing, as it is easy to break a workflow this way. Other objects that are available in the script context are:

  • uname: the current username

  • wf: reference to the workflow object

  • bTools: for triggering Bugzilla actions

  • hist: an arraylist where messages can be appended that will be shown in the GUI afterwards. Usage: hist.addResult(boolean isError, String result)

  • scriptapi: an object that allows invoking the following methods:

    • createSubWorkflow(String name, String version): create a subworkflow

    • sendEvent(String eventString, int wfid): send an event

    • getWfConfigItem(String name): retrieve property from workflow.conf

    • getSWAMPProperty(String name): retrieve property from defaults

  • executor: Allows the execution of external programs and handling of their output and return code.

Example for calling external scripts from a scriptaction:
<scriptaction name="call_external" language="velocity">
	<description>Calling an external program</description>
	<!-- script body using the executor object follows here -->
</scriptaction>

The flow of the work - Edges and Conditions

When a node is entered (at workflow start this is the start node), its edges get activated: each edge checks whether its associated condition is true. If a condition is true, the source node is left and the edge's target node becomes active. Only edges that leave an active node are active and listen to events, data changes etc. If more than one edge leaves a node and the condition of one of them resolves to true, the source node and all its leaving edges get deactivated.

<edge to="node1">
	<!-- conditions for that edge -->
</edge>

Each edge needs a condition assigned to it, which may be a compound condition built from AND/OR constructions. Examples of all possible condition types follow:

Event - condition

An event condition waits for an event to arrive and resolves to true when the event is received. An event is identified by an event string. For example:

<edge to="path2_is_late" >
	<event type="DELAY_1D"/>
</edge>
waits for the event "DELAY_1D" to arrive. If an edge has only one event-condition attached, like in the example, a short definition is available:
<edge to="path2_is_late" event="DELAY_1D" />
The event string "NONE" is a reserved event that defines that no event is required, and the condition will always immediately resolve to true on activation. This event is needed for modelling purposes, for example to split the workflow thread.
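Using the short form, such a thread-splitting edge could look like this (the target node name is a made-up example):

<!-- hypothetical edge that is taken immediately on activation -->
<edge to="parallel_branch" event="NONE" />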

Data - condition

A datacondition waits for data to be changed, or to match against a given regular expression.

<edge to="node_product1">
	<data check="regexp" 
		field="testdataset.product.product_name" value=".*SLES.*"/>
</edge>

This data condition waits until the databit "testdataset.product.product_name" matches the regular expression ".*SLES.*". The change of the data does not necessarily have to happen within a dataedit task; it can also happen through a user editing the workflow's data directly, or via another subsystem, e.g. the SOAP interface.

To make a data-condition watch for any change of a databit, use this code:

<edge to="node_product1">
	<data check="changed" 
		field="testdataset.product.product_name" value=""/>
</edge>
This condition resolves to true whenever the content of the databit changes. The value parameter has no effect in this case.

Subworkflowsfinished - condition

To wait until all attached subworkflows are finished, the "subsfinished" condition can be used. By definition, a finished subworkflow sends the event "SUBWORKFLOW_FINISHED" to its parent workflow. The "subsfinished" condition waits for that event, checks whether it was sent by the configured workflow type and version, and resolves to true if the finished subworkflow was the last running subworkflow. Example definition:

<subsfinished subname="Example" subversion="0.1"/>
It can be used to stop the main workflow at a certain point and wait until all started subworkflows are finished.
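Attached to an edge, this could look as follows (the target node name is a made-up example):

<!-- hypothetical edge waiting for all "Example" subworkflows to finish -->
<edge to="all_subs_finished">
	<subsfinished subname="Example" subversion="0.1"/>
</edge>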

Connecting conditions with AND, OR and NOT

Conditions can be nested into complex constructions by combining the conditions shown above with AND, OR and NOT operators. For example:

<edge to="node_product1">
	<and>
		<event type="DATAEDIT2_OK"/>
		<data check="regexp" field="testdataset.product.product_name" 
			value=".*SLES.*"/>
	</and>
</edge>

The AND and OR elements both take exactly 2 subelements, and the NOT element takes just one. That means, to connect for example 3 event conditions by AND, you have to nest them this way:

<edge to="node2">
	<and>
		<event type="DATAEDIT1_OK"/>
		<and>
			<event type="DATAEDIT2_OK"/>
			<event type="DATAEDIT3_OK"/>
		</and>
	</and>
</edge>


Datasets

Datasets are the workflow's storage for data. They can store texts, dates and role assignments. At the moment, each workflow defines one root dataset in its XML definition, which also has to be referenced as "defaultdataset" in the workflow element. Datasets can be nested, and the elements that contain the actual data are called databits.

<dataset description="Roles" name="roles">
	<dataset ... />
	<databit ... />
</dataset>

Each databit has a datatype against which its content is verified on changes (type attribute). The visibility of databits and datasets can be set by the state attribute, for example to hide workflow-internal data from the user in the GUI. A databit is defined like this:
<databit name="admin" description="Workflow-Admins" 
		type="person" state="read-write">
	<defaultvalue>...</defaultvalue>
</databit>
The element defaultvalue sets the initial value of the databit. Databits can be referenced from various places within a workflow, e.g. data-conditions, dataedit actions, scriptactions, notification templates and more. The notation for referencing a databit is the dot-separated path of the enclosing dataset names followed by the databit name, for example testdataset.roles.admin.

System databits

To extend the power and flexibility of the previously shown workflow elements that use databits, some workflow information is also mapped into the databit "namespace".

These special dynamic databits are referenced with a leading "System." as pseudo dataset name. The system databits available so far let you query the attached due date of a node and the date when a node was entered (for example System.path2.enterDate, as used in the sendevent action above). This can be used, for example, in the sendevent action (see section "Actions") to determine how long a node has been active. Mechanisms that notify responsible persons when duedates are not met can also be implemented that way.
