When writing your first workflow, you probably just started pulling objects over and linking them together to achieve a simple task. After a bit of playing around, you probably got it to work and do what you wanted (see example below).
If you try to write a more complicated workflow, for example automated Virtual Desktop provisioning, it quickly becomes apparent that dividing it into multiple workflows helps with readability and efficiency. So how do you link workflows together? Opalis provides three critical objects for this: Custom Start, Trigger Policy, and Publish Policy Data.
The Custom Start object is available in the Workflow Control object folder.
The purpose of the Custom Start object is to define input parameters for your workflow. Whenever your workflow is called from another source (a Trigger Policy inside another workflow, a web service call from the Web Console, or the Testing Console), that source will be prompted to provide input for this workflow. Every workflow that does not start with a Monitor-type object should start with a Custom Start. Because Custom Start is a special type of object called an Event object, it must be at the start of the workflow, and you can have only one per workflow.
To add input parameters to your Custom Start, simply open it and click Add. You then give the parameter a type (String, Integer, etc.) and a name. At this point you can reference this input parameter from any object ‘downstream’.
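Opalis workflows are built in the Designer GUI rather than in code, but conceptually a Custom Start behaves like a typed function signature: it declares what the caller must supply before anything downstream runs. A hypothetical sketch of that role (all names here are illustrative, not Opalis internals):

```python
# Conceptual sketch only: models a Custom Start as a typed entry point.

def custom_start(declared, supplied):
    """Validate caller-supplied values against the declared input parameters."""
    coerced = {}
    for name, ptype in declared.items():
        if name not in supplied:
            raise ValueError(f"Missing input parameter: {name}")
        coerced[name] = ptype(supplied[name])  # e.g. String -> str, Integer -> int
    return coerced

# Equivalent of clicking Add and declaring name/type pairs:
declared = {"UserName": str, "RetryCount": int}

# The calling source is prompted for these values:
inputs = custom_start(declared, {"UserName": "jdoe", "RetryCount": "3"})
# Downstream objects can now subscribe to inputs["UserName"], etc.
```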
The next important Workflow Control Object is the Trigger Policy. This object is used to start another workflow.
When you open this object you will be presented with a policy browser. Simply browse to the policy you want to call.
Once you have selected the other policy, the variables defined in that policy’s Custom Start object will be displayed in the Parameters area. Fill those values in, decide whether to check ‘Trigger by Path’ or ‘Wait for Completion’, and click Finish.
Trigger by Path
There are two ways Opalis can reference your workflows: by policy GUID or by path. The default is by GUID. When you select a policy using the browser, Opalis shows the path to that policy in the text box, but what it really references internally is the policy’s GUID. If you later move the called policy to another folder, or rename it, the relationship stays intact. The other option is to check ‘Trigger by Path’, which means your policy will always trigger whatever policy sits at the path you originally specified. In that case, if you move or rename the called policy, the call will fail unless you place a different policy with the same input parameters at that path.
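The difference between the two modes comes down to which lookup table the reference resolves against. A minimal sketch, with hypothetical data structures standing in for Opalis internals:

```python
# Conceptual sketch: "by GUID" vs "Trigger by Path" resolution.
# Names and structures are illustrative, not Opalis internals.

policies_by_guid = {"guid-123": "Get User Groups"}            # stable identity
policies_by_path = {"/Policies/AD/Get User Groups": "guid-123"}  # location-based

def resolve(reference, by_path=False):
    """Return the GUID of the policy a Trigger Policy will actually run."""
    if by_path:
        # Breaks if the policy was moved or renamed, unless a replacement
        # with the same input parameters now lives at this path.
        guid = policies_by_path.get(reference)
        if guid is None:
            raise LookupError(f"No policy at path: {reference}")
        return guid
    # A GUID reference survives moves and renames.
    return reference
```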
Wait for Completion
This option will force your currently running policy to wait for the policy it is calling to complete before it moves on to its next step. Normally this is the functionality you want.
Running policies that contain Trigger Policy objects in the Testing Console will yield unexpected results; every call to a Trigger Policy object will fail.
To get around this, we turn logging on in all of the related policies we want to test, then trigger the top-level policy from another workflow that contains only a Trigger Policy object. We start this testing workflow by hand, and can then look at the logs of all the policies to see their results (whether they returned as expected or not). To turn logging on in a policy, right-click the policy and select Properties, then go to the Logging tab and select the level of detail you want.
‘Store Object-Specific Published Data’ will save all published data from each object (each object’s outputs). ‘Store Common Published Data’ will save all attributes defined as common. If you want to see what each will capture for a given object, simply reference that object from a ‘Subscribe to Published Data’ prompt. The default view shows the object-specific published data; checking ‘Show Common Published Data’ will show you all common published data.
Capturing common published data can be useful when dealing with loops.
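The two logging options can be thought of as toggles over two categories of data each object emits. A rough sketch, with illustrative field names (not exact Opalis attribute names):

```python
# Conceptual sketch: what each logging option retains per object.
# Field names below are illustrative only.

object_specific = {"Distinguished Name": "CN=jdoe,OU=Users,DC=corp,DC=example,DC=com"}
common = {"Object status": "Success", "Loop attempts": 1}

def logged(store_object_specific, store_common):
    """Combine whichever categories the policy's logging settings enable."""
    entry = {}
    if store_object_specific:
        entry.update(object_specific)  # the object's own outputs
    if store_common:
        entry.update(common)           # attributes shared by all objects
    return entry
```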
Running the test
So, to test my ‘abs’ workflow with its Trigger Policy, I turn on logging in ‘abs’ (see above), create a workflow to call ‘abs’, and then start the calling workflow. I can then examine the logs.
The ‘abs’ workflow takes in a user (an input parameter in the Custom Start), enumerates all of the domains in our forest, and then searches those domains for the input user. After it finds the user in a domain, it passes that user’s Distinguished Name forward to another policy. That policy has a Custom Start with UserDN as a parameter; it takes the UserDN, looks up all groups the user is a member of, and publishes them back to the calling workflow.
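The logic of that two-workflow chain can be sketched as two functions, where the second stands in for the called policy. The forest data, names, and group results below are all hypothetical:

```python
# Conceptual sketch of the 'abs' workflow chain; all data is hypothetical.

forest = {
    "corp.example.com": {"jdoe": "CN=jdoe,OU=Users,DC=corp,DC=example,DC=com"},
    "dev.example.com": {},
}

def find_user_dn(user):
    """Enumerate every domain in the forest and search each for the user."""
    for domain, users in forest.items():
        if user in users:
            return users[user]  # Distinguished Name passed to the next policy
    raise LookupError(f"{user} not found in any domain")

def get_user_groups(user_dn):
    """Stand-in for the called policy: takes UserDN, publishes the group list."""
    memberships = {
        "CN=jdoe,OU=Users,DC=corp,DC=example,DC=com": ["Domain Users", "VDI Users"],
    }
    return memberships.get(user_dn, [])

dn = find_user_dn("jdoe")
published = get_user_groups(dn)  # data published back to the calling workflow
```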
Publish Policy Data
The third Workflow Control object that is vital for having policies call other policies is Publish Policy Data. This object allows you to return data to the calling workflow. Setting it up is a two-step process.
Step 1: Modify the policy to have Policy Data
The first step is to modify the policy itself so that it knows it will be publishing data. This is done so that calling workflows know what data will be available on the bus after this workflow completes. Right-click the policy, go to its properties, choose “Policy Data”, and add variable name and type pairs.
Step 2: Add a publish policy data object
Drag a Publish Policy Data object into your workflow. Attach it to each of the exit points of your workflow and set the data you want to return. In my example I have one exit point and will be publishing the output of the Get Active Directory User’s Groups object.
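The two steps together behave like a function declaring and filling in a return value, which the caller then sees when ‘Wait for Completion’ is checked. A hypothetical sketch of that flow:

```python
# Conceptual sketch: Publish Policy Data as the workflow's "return" step.
# All names and data are illustrative.

def child_policy(user_dn):
    # Step 1 equivalent: the Policy Data shape declared in the policy's properties.
    policy_data = {"Groups": None}
    # ... workflow body: look up the user's groups (hypothetical result) ...
    groups = ["Domain Users", "VDI Users"]
    # Step 2 equivalent: the Publish Policy Data object at the exit point.
    policy_data["Groups"] = groups
    return policy_data

def parent_policy():
    # A Trigger Policy with 'Wait for Completion' checked blocks here until
    # the child finishes, then can subscribe to its published data.
    result = child_policy("CN=jdoe,OU=Users,DC=corp,DC=example,DC=com")
    return result["Groups"]
```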
You now know how to use the objects necessary to create multi-tiered workflows. I will cover why you want to use multi-tiered workflows and my best practices around this soon! Stay tuned!