Windows Workflow Foundation (WF)

A question from one of my blog readers: what about multiple workflows calling into the same Local Communication Service with respect to possible threading and synchronization issues?

Consider the following scenario. Assume you have one WorkflowRuntime in your host application, together with one registered Local Communication Service defined by the following interface contract:

[ExternalDataExchange]
interface IBar
{
   void Bar(int i);
   void Foo(int i);
}

Assume Workflow1 relies on the Bar method while Workflow2 relies on the Foo method, with both workflow definitions like the following:

The whileActivity1 has a condition that evaluates to true at all times (read: endless loop). The sequenceActivity1 acts as the container for a set of activities in the body of the loop, and wraps the callExternalMethodActivity1 (which calls the Bar and Foo method for Workflow1 and Workflow2 respectively) as well as the delayActivity1 (which has a 1 second and a 3 second delay for Workflow1 and Workflow2 respectively).
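Since the designer view isn't reproduced here, the following is a rough code-only sketch of what the Workflow1 definition boils down to (an assumption about the designer output; the binding of the Bar method's int parameter is omitted for brevity):

public sealed class Workflow1 : SequentialWorkflowActivity
{
   public Workflow1()
   {
      // callExternalMethodActivity1: calls IBar.Bar (IBar.Foo in Workflow2)
      CallExternalMethodActivity callExternalMethodActivity1 = new CallExternalMethodActivity();
      callExternalMethodActivity1.InterfaceType = typeof(IBar);
      callExternalMethodActivity1.MethodName = "Bar";

      // delayActivity1: 1 second in Workflow1, 3 seconds in Workflow2
      DelayActivity delayActivity1 = new DelayActivity();
      delayActivity1.TimeoutDuration = TimeSpan.FromSeconds(1);

      // sequenceActivity1: the single child of the while loop
      SequenceActivity sequenceActivity1 = new SequenceActivity();
      sequenceActivity1.Activities.Add(callExternalMethodActivity1);
      sequenceActivity1.Activities.Add(delayActivity1);

      // whileActivity1: condition that always evaluates to true (endless loop)
      WhileActivity whileActivity1 = new WhileActivity();
      CodeCondition always = new CodeCondition();
      always.Condition += delegate(object sender, ConditionalEventArgs e) { e.Result = true; };
      whileActivity1.Condition = always;
      whileActivity1.Activities.Add(sequenceActivity1);

      Activities.Add(whileActivity1);
   }
}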

Next, assume the following host application code (note: there's a little issue in the code below since I'm running more than one workflow and the waitHandle would be set upon completion of either one, causing the app to terminate - since both workflows never terminate due to the endless loop, this doesn't cause further problems in this particular demo):

class Program
{
   static void Main(string[] args)
   {
      using(WorkflowRuntime workflowRuntime = new WorkflowRuntime())
      {
         AutoResetEvent waitHandle = new AutoResetEvent(false);
         workflowRuntime.WorkflowCompleted += delegate(object sender, WorkflowCompletedEventArgs e) {waitHandle.Set();};
         workflowRuntime.WorkflowTerminated += delegate(object sender, WorkflowTerminatedEventArgs e)
         {
            Console.WriteLine(e.Exception.Message);
            waitHandle.Set();
         };

         ExternalDataExchangeService edx = new ExternalDataExchangeService();
         workflowRuntime.AddService(edx);
         edx.AddService(new MyService());

         WorkflowInstance instance1 = workflowRuntime.CreateWorkflow(typeof(TestConcurrency.Workflow1));
         instance1.Start();

         WorkflowInstance instance2 = workflowRuntime.CreateWorkflow(typeof(TestConcurrency.Workflow2));
         instance2.Start();

         waitHandle.WaitOne();
      }
   }
}

where MyService implements the IBar interface:

class MyService : IBar
{
   public void Bar(int n)
   {
      for (int i = 0; i < 100; i++)
      {
         Thread.Sleep(10);
         Console.Write('#');
      }
   }

   public void Foo(int n)
   {
      for (int i = 0; i < 100; i++)
      {
         Thread.Sleep(10);
         Console.Write('@');
      }
   }
}
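As an aside, one possible fix for the wait-handle issue mentioned above is to count completions and only release the handle once every started instance has finished; a minimal sketch (assuming two instances, as in this demo):

int pending = 2;
workflowRuntime.WorkflowCompleted += delegate(object sender, WorkflowCompletedEventArgs e)
{
   // only let the host exit after both workflow instances have completed
   if (Interlocked.Decrement(ref pending) == 0)
   {
      waitHandle.Set();
   }
};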

Question: Predict the console output of the workflow execution above. Tip: what about adding [MethodImpl(MethodImplOptions.Synchronized)] to the method declarations? What about inspecting the ManagedThreadId property of the Thread.CurrentThread in both methods?
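To experiment with the tip, you could instrument the service along these lines (a sketch; the exact interleaving of '#' and '@' you observe depends on how the runtime schedules the two workflow instances):

using System.Runtime.CompilerServices;
using System.Threading;

class MyService : IBar
{
   // Synchronized takes a lock on the service instance, so Bar and Foo can no longer overlap.
   [MethodImpl(MethodImplOptions.Synchronized)]
   public void Bar(int n)
   {
      Console.WriteLine("Bar on thread {0}", Thread.CurrentThread.ManagedThreadId);
      for (int i = 0; i < 100; i++) { Thread.Sleep(10); Console.Write('#'); }
   }

   [MethodImpl(MethodImplOptions.Synchronized)]
   public void Foo(int n)
   {
      Console.WriteLine("Foo on thread {0}", Thread.CurrentThread.ManagedThreadId);
      for (int i = 0; i < 100; i++) { Thread.Sleep(10); Console.Write('@'); }
   }
}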

Happy threading!


Introduction

A reader of mine noticed I'm using Visual Studio 2005 as the tool to create all of my Windows Workflow Foundation related samples, instead of taking my favorite minimal command-line approach to create basic samples. So here it is today: a simple Hello World demo of WF using the command-line.

 

Please welcome WFC

Just like csc.exe is the C# programmer's best friend, wfc.exe plays that role for WF programmers. Basically it turns a .xoml file (XOML stands for eXtensible Object Markup Language) into a .dll assembly file representing the workflow you've designed. Below you can see a simple demo.xoml file:

<SequentialWorkflowActivity x:Name="Demo" x:Class="FooBar.Demo"
    xmlns:x="
http://schemas.microsoft.com/winfx/2006/xaml"
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/workflow">
   <CodeActivity x:Name="sayHello" ExecuteCode="sayHello_ExecuteCode">
      <x:Code>
         <![CDATA[
            void sayHello_ExecuteCode(object sender, EventArgs e)
            {
               Console.WriteLine("Hello World");
            }

         ]]>
      </x:Code>
   </CodeActivity>
</SequentialWorkflowActivity>

This simple sample can be compiled into a .dll file by invoking the following:

>wfc demo.xoml

In this case, we've been using a CDATA section to put the code inline within the XOML definition. Another approach would be to put the following in the demo.xoml file:

<SequentialWorkflowActivity x:Name="Demo" x:Class="FooBar.Demo"
    xmlns:x="
http://schemas.microsoft.com/winfx/2006/xaml"
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/workflow">
   <CodeActivity x:Name="sayHello" ExecuteCode="sayHello_ExecuteCode" />
</SequentialWorkflowActivity>

and to create another file called demo.cs:

using System;

namespace FooBar
{
   partial class Demo
   {
      void sayHello_ExecuteCode(object sender, EventArgs e)
      {
         Console.WriteLine("Hello World");
      }
   }
}

which can be compiled using:

>wfc demo.xoml demo.cs

Notice the use of the partial keyword in the class definition. In the end, the .xoml file is a partial definition of the workflow class, so the .cs file is another part of it and has to be marked as "partial". Both compilations will yield the same assembly:

Using the assembly in a host application is another concern of course. Basically you just have to compile the following piece of code with a reference to the demo.dll file generated above:

using System;
using System.Threading;
using System.Workflow.Runtime;

namespace FooBar
{
   class Hello
   {
      public static void Main()
      {
         using (WorkflowRuntime workflowRuntime = new WorkflowRuntime())
         {
            AutoResetEvent waitHandle = new AutoResetEvent(false);

            workflowRuntime.WorkflowCompleted += delegate(object sender, WorkflowCompletedEventArgs e)
            {
               waitHandle.Set();
            };

            WorkflowInstance instance = workflowRuntime.CreateWorkflow(typeof(FooBar.Demo));
            instance.Start();

            waitHandle.WaitOne();
         }
      }
   }
}

Adding the reference to the System.Workflow assemblies, which reside in the GAC, might be a little tricky though. A more convenient demo approach is to merge the host application and the code-behind for the workflow into one single .cs file, like this:

using System;
using System.Threading;
using System.Workflow.Runtime;

namespace FooBar
{
   class Hello
   {
      public static void Main()
      {
         using (WorkflowRuntime workflowRuntime = new WorkflowRuntime())
         {
            AutoResetEvent waitHandle = new AutoResetEvent(false);

            workflowRuntime.WorkflowCompleted += delegate(object sender, WorkflowCompletedEventArgs e)
            {
               waitHandle.Set();
            };

            WorkflowInstance instance = workflowRuntime.CreateWorkflow(typeof(FooBar.Demo));
            instance.Start();

            waitHandle.WaitOne();
         }
      }
   }

   partial class Demo
   {
      void sayHello_ExecuteCode(object sender, EventArgs e)
      {
         Console.WriteLine("Hello World");
      }
   }

}

Now compile this file (demo.cs) together with the .xoml file, as follows:

>wfc /target:exe /r:System.Xml.dll demo.xoml demo.cs

 

MsBuild style

There's of course another approach that leans more towards Visual Studio 2005 compilation, i.e. using MsBuild. Assuming you have a file called demo.xoml:

<SequentialWorkflowActivity x:Name="Demo" x:Class="FooBar.Demo"
    xmlns:x="
http://schemas.microsoft.com/winfx/2006/xaml"
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/workflow">
   <CodeActivity x:Name="sayHello" ExecuteCode="sayHello_ExecuteCode">
      <x:Code>
         <![CDATA[
            void sayHello_ExecuteCode(object sender, EventArgs e)
            {
               Console.WriteLine("Hello World");
            }

         ]]>
      </x:Code>
   </CodeActivity>
</SequentialWorkflowActivity>

and a file called hello.cs:

using System;
using System.Threading;
using System.Workflow.Runtime;

namespace FooBar
{
   class Hello
   {
      public static void Main()
      {
         using (WorkflowRuntime workflowRuntime = new WorkflowRuntime())
         {
            AutoResetEvent waitHandle = new AutoResetEvent(false);

            workflowRuntime.WorkflowCompleted += delegate(object sender, WorkflowCompletedEventArgs e)
            {
               waitHandle.Set();
            };

            WorkflowInstance instance = workflowRuntime.CreateWorkflow(typeof(FooBar.Demo));
            instance.Start();

            waitHandle.WaitOne();
         }
      }
   }
}

you can create a .csproj build file that contains this:

<Project DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <RootNamespace>FooBar</RootNamespace>
    <AssemblyName>Demo</AssemblyName>
    <OutputPath>.</OutputPath>
  </PropertyGroup>
  <ItemGroup>
    <Reference Include="System" />
    <Reference Include="System.Xml" />
    <Reference Include="System.Workflow.Activities" />
    <Reference Include="System.Workflow.ComponentModel" />
    <Reference Include="System.Workflow.Runtime" />
  </ItemGroup>
  <ItemGroup>
    <Compile Include="Hello.cs">
      <DependentUpon>demo.xoml</DependentUpon>
      <SubType>Component</SubType>
    </Compile>
  </ItemGroup>
  <ItemGroup>
    <Content Include="demo.xoml"/>
  </ItemGroup>
  <Import Project="$(MSBuildBinPath)\Microsoft.CSharp.Targets" />
  <Import Project="$(MSBuildExtensionsPath)\Microsoft\Windows Workflow Foundation\v3.0\Workflow.Targets" />
</Project>

which can be built using:

>msbuild

Enjoy command-line compiling WF-apps (for demo purposes only I hope)!


Introduction

This post builds upon the foundation created in yesterday's post about the HandleExternalEventActivity. Please follow the instructions in that post before continuing with this one. As you've learned from that previous post, waiting for an external event to occur is a very powerful mechanism to drive execution of a workflow. There are a lot of scenarios, however, where you'll want to wait for more than one event at the same time, or where you want such a wait to time out after a while. To make this kind of dream a reality, WF has the ListenActivity in its magic toolbox.

The demo

Take the solution you built in the previous post on working with events and go to Workflow1.cs's designer view. It should look like this:

Your next job is to transform this into the following:

An easy way to accomplish this is outlined below:

  • Add a ListenActivity below the WhileActivity.
  • Rename the left branch of the ListenActivity to clientArrived and the right branch to timeoutOccurred.
  • Drag and drop the clientArrival and doWork activities from the while loop to the left branch.
  • Delete the SequenceActivity from the WhileActivity's body.
  • Drag and drop the ListenActivity to the body of our WhileActivity.
  • Add a DelayActivity (set to 10 seconds) called timeout to the right branch, as well as a CodeActivity called oops with the following ExecuteCode handler code:

    Console.ForegroundColor = ConsoleColor.Cyan;
    Console.WriteLine("Oops! Timeout occurred");
    Console.ResetColor();

Basically, the ListenActivity blocks till one branch completes. Notice you can add additional branches, e.g. to listen to three, four, ... events at the same time:
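For reference, a ListenActivity with its branches boils down to something like the following in code (a sketch under the assumption of the activity names used above; each branch is an EventDrivenActivity whose first child must be an event-style activity such as HandleExternalEventActivity or DelayActivity):

ListenActivity listen = new ListenActivity();

EventDrivenActivity clientArrived = new EventDrivenActivity();
clientArrived.Name = "clientArrived";
HandleExternalEventActivity clientArrival = new HandleExternalEventActivity();
clientArrival.InterfaceType = typeof(IBar);
clientArrival.EventName = "Foo";
clientArrived.Activities.Add(clientArrival);
// ...followed by the doWork CodeActivity

EventDrivenActivity timeoutOccurred = new EventDrivenActivity();
timeoutOccurred.Name = "timeoutOccurred";
DelayActivity timeout = new DelayActivity();
timeout.TimeoutDuration = TimeSpan.FromSeconds(10);
timeoutOccurred.Activities.Add(timeout);
// ...followed by the oops CodeActivity

listen.Activities.Add(clientArrived);
listen.Activities.Add(timeoutOccurred);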

When you try to execute this workflow, you'll see timeouts occurring when the client waits too long to enter her name:

I'm sorry for the defaced console output, but hey that's not the goal of the demo is it?

Conclusion

Waiting for one event to occur is a common requirement; waiting for multiple events is an even more common one. WF makes it really easy to do this using the ListenActivity, without having to mess around with threading and wait handles (even more so in a stateful, long-running world that relies on additional runtime services like persistence). I bet you'll find yourself leveraging the power of this activity on a regular basis, especially in combination with a DelayActivity to model timeouts. Now that you know everything about waiting for events in WF, it's time to wake up and explore WF without waiting any longer! Cheers!


Introduction

In yesterday's post, you learned that the vast majority of workflows need to exchange data with other parties to get their jobs done. Just one of the benefits of workflows is the possibility to visualize this kind of interaction by means of different activities, like the CallExternalMethodActivity that was explained in the previous post. Based on a contract definition (read: interface) a workflow can be defined while the choice of the service implementation is left as a decision for the workflow host application. In today's post, we continue our journey on the Local Communication Services and External Data Exchange path with the HandleExternalEventActivity.

Scenario

The CallExternalMethodActivity explained in the previous post is used to perform methods calls from inside the workflow to some service. Based on an interface and method contract, the workflow can be defined. At runtime, an appropriate service implementation is attached to the workflow and chosen by the engine to process service requests. An example is a workflow calling to an order processing system in some way defined by the contract (e.g. PlaceOrder).

Besides making calls from the workflow to some outside party, one can also rely on the host application to notify the workflow when it needs to do something. This is done using the HandleExternalEventActivity, which makes a workflow block till the corresponding event (again in a contract-based manner) is raised by some service. As an example, think of a workflow waiting for approval based on some event (e.g. OrderApproved).

A simple demo

Workflow definition

As usual on this blog, we'll try to keep things simple and approachable, to focus on the basics of the topic discussed. Create a simple Sequential Workflow Console application called EventsDemo:

Next, create the following workflow definition in Workflow1.cs:

This needs some elaboration. On top, we have some CodeActivity called start with the simple ExecuteCode event handler displayed below. At the bottom, there's a similar activity with the event handler shown below as well:

private void start_ExecuteCode(object sender, EventArgs e)
{
   Console.ForegroundColor = ConsoleColor.Green;
   Console.WriteLine("Waiting for clients...");
   Console.ResetColor();
}

private void stop_ExecuteCode(object sender, EventArgs e)
{
   Console.ForegroundColor = ConsoleColor.Red;
   Console.WriteLine("Served 5 clients; time for early retirement!");
   Console.ResetColor();
}

Next, there's an activity called clientListener of the type WhileActivity. This one just loops till five clients have been served, based on the following Declarative Rule Condition:

this.clientCount < 5

relying on the following private variable in the code-behind:

private int clientCount = 0;

A WhileActivity can only contain one single child activity. Because of this, a SequenceActivity is added to the body of the WhileActivity. Inside this SequenceActivity, two child activities are added:

  • clientArrival is of type HandleExternalEventActivity and will be discussed below shortly
  • doWork is of type CodeActivity and has the following ExecuteCode event handler:

    private void doWork_ExecuteCode(object sender, EventArgs e)
    {
       Console.ForegroundColor = ConsoleColor.Yellow;
       Console.WriteLine("Event captured - Hello {0}. You're client number {1}.", args.Name, clientCount);
       Console.ResetColor();
    }

The core of the workflow is the HandleExternalEventActivity called clientArrival. This activity works in a similar way as the CallExternalMethodActivity and relies on an interface and in this case an event to wait for. The logical next step is to define this interface (IBar.cs):

using System;
using System.Workflow.Activities;

namespace EventsDemo
{
   [ExternalDataExchange]
   public interface IBar
   {
      event EventHandler<FooEventArgs> Foo;
   }

   [Serializable]
   public class FooEventArgs : ExternalDataEventArgs
   {
      public FooEventArgs(Guid instanceId, string name) : base(instanceId)
      {
         this.name = name;
      }

      private string name;

      public string Name
      {
         get { return name; }
         set { name = value; }
      }
   }
}

Notice the interface definition is annotated with the ExternalDataExchangeAttribute, which is required for the workflow to communicate with it using Local Communication Services. Next, the defined event has an event arguments object derived from ExternalDataEventArgs. The constructor of this event arguments object is worth mentioning because it requires a call to a base class constructor that takes a workflow instance identifier:

public FooEventArgs(Guid instanceId, string name) : base(instanceId)
{

This is required for the workflow runtime to be able to correlate the event with the right workflow instance.

When you've defined the interface with the event, you can continue to set up the HandleExternalEvent activity. Start by setting the InterfaceType to the IBar interface. Next, set the EventName to Foo. In order to capture the event arguments in the workflow instance and do something useful with them at a later stage, you can bind the e parameter (à la EventArgs e) to some local variable:

private FooEventArgs args;

public FooEventArgs Args
{
   get { return args; }
   set { args = value; }
}

Finally hook up an event handler for the Invoked event of the HandleExternalEvent activity. This will be used to increment the counter that keeps track of the number of served clients (you could do this inside the doWork CodeActivity too):

private void clientArrival_Invoked(object sender, ExternalDataEventArgs e)
{
   clientCount++;
}

Notice you could also use this event handler to extract information from the event args that were passed along with the event.

Finally, the property grid of the HandleExternalEventActivity clientArrival should look like this:
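In code, those designer settings roughly correspond to the following (a sketch of what the designer generates; the names are the ones assumed above):

clientArrival.InterfaceType = typeof(IBar);
clientArrival.EventName = "Foo";
clientArrival.Invoked += new EventHandler<ExternalDataEventArgs>(clientArrival_Invoked);

// bind the event's "e" parameter to the Args property so the workflow can read the FooEventArgs later
WorkflowParameterBinding bindE = new WorkflowParameterBinding("e");
bindE.SetBinding(WorkflowParameterBinding.ValueProperty, new ActivityBind("Workflow1", "Args"));
clientArrival.ParameterBindings.Add(bindE);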

The host

On to the host side now. What we want to do, is ask the end-user for a name and then raise the event to the workflow instance to indicate a "client arrival". The workflow should then proceed in the WhileActivity loop and perform work for the newly arrived user.

To do this, we'll first implement IBar as follows:

class Bar : IBar
{
   public event EventHandler<FooEventArgs> Foo;

   public void RaiseEvent(Guid instanceId, string name)
   {
      if (Foo != null)
      {
         EventHandler<FooEventArgs> evt = Foo;
         FooEventArgs args = new FooEventArgs(instanceId, name);
         evt(null, args);
      }
   }
}

Next, to establish the communication between the workflow and the "Bar" service, we need to register an ExternalDataExchangeService (from System.Workflow.Activities):

bar = new Bar();
ExternalDataExchangeService svc = new ExternalDataExchangeService();
workflowRuntime.AddService(svc);
svc.AddService(bar);

I've created bar as a static field in the host class:

private static Bar bar;

Once the workflow instance is started, we'll start a new thread (rather quick-n-dirty) to accept user input till the workflow terminates:

WorkflowInstance instance = workflowRuntime.CreateWorkflow(typeof(EventsDemo.Workflow1));
instance.Start();

Thread userInput = new Thread(UserInput);
userInput.Start(instance.InstanceId);

waitHandle.WaitOne();

userInput.Abort();

That is, the background thread userInput is started with a parameter to indicate the workflow instance to raise events in. When the workflow completes (when waitHandle is set), the background thread is aborted (there exist cleaner ways to implement this idea, but for demo purposes this should be okay). This background thread is defined as follows:

static void UserInput(object instanceId)
{
   for (; ; )
   {
      Thread.Sleep(500); // dirty demo trick
      Console.Write("User name: ");
      string name = Console.ReadLine();
      bar.RaiseEvent((Guid) instanceId, name);
   }
}

The core line of code is the call to bar.RaiseEvent, which does the real work of notifying the workflow instance of the simulated "client arrival". Don't worry about the Thread.Sleep call; it's just a dirty trick to keep the console output nice and smooth and the demo simple.

Now run the application, it should produce the following output (enter a few names and see what happens):

Conclusion

Local communication between a workflow and external data services is an absolute must for a lot of workflow scenarios. In this post and the previous post, you learned how to establish this kind of communication with a workflow in both directions, by making method calls and by waiting for events to occur. Enjoy!


Introduction

Workflows don't stand on their own; they need a host application to run in, but in a lot of cases there's more: external services are needed to provide functionality to the workflow. You might think: let's just use a CodeActivity to call into some piece of functionality. But that doesn't offer enough flexibility in most cases. Okay, you can work with an interface and parameterize the workflow, but the workflow still doesn't reveal its intentions visibly: a CodeActivity is an opaque container where the visualized world of workflows is traded for procedural code again.

Enter the Communication Services, part one. In this post, you'll learn how to exchange data with the workflow by hooking in an "external data exchange service".

CallExternalMethodActivity

At the core of external data exchange is the CallExternalMethodActivity. In a later post, we'll focus on HandleExternalEvent but for now, let's just rely on plain simple method calls. The scenario is the following:

  1. We need a workflow to get the job done (for reasons that may vary a lot, which have been discussed extensively already).
  2. It can't be 100% self-contained and has to rely on external services to process things, for example to make an order.
  3. This external service communication needs a great deal of genericity, so that it can be replaced by an alternative implementation when required. In other words, it needs to be contract-based.

(And no, it isn't a web service we want to call, because in that case we could rely on InvokeWebService.)

Demo

A workflow definition

Right, on to the demo with a somewhat simplified contract: let's call a simple calculator to get things done (I don't want to get the curse of the bad demo gods :-)). Start by creating a simple console-based workflow application:

Next, add a single CallExternalMethodActivity from the toolbox to the workflow definition as shown below:

This is what you should get to see when you set the properties correctly as outlined below. First, go to the code behind of the workflow and add three properties:

private int a;

public int A
{
   get { return a; }
   set { a = value; }
}

private int b;

public int B
{
   get { return b; }
   set { b = value; }
}

private int result;

public int Result
{
   get { return result; }
   set { result = value; }
}

Now, add an interface to the project called ICalculator:

[ExternalDataExchange]
interface ICalculator
{
   int Add(int a, int b);
   int Subtract(int a, int b);
   int Multiply(int a, int b);
   int Divide(int a, int b);
}

Notice the use of the ExternalDataExchangeAttribute, which lives in System.Workflow.Activities. This indicates the interface acts as the contract for external data exchange.

Now go back to the designer and set the CallExternalMethodActivity properties. Start by specifying the InterfaceType:

Next, set the MethodName to Add. Then, set the parameters a, b and (ReturnValue) that appear in the property grid to point to the properties A, B and Result respectively:

If you've done things correctly, this should result in the following:

Right, so now your workflow knows how to transform some input parameters into the corresponding output.

Letting the host know about data services

That's all that's needed for the workflow definition itself. On to the host now (Program.cs). First, we'll parameterize the workflow instantiation as follows:

Dictionary<string, object> parameters = new Dictionary<string, object>();
parameters.Add("A", 1);
parameters.Add("B", 2);

WorkflowInstance instance = workflowRuntime.CreateWorkflow(typeof(ExternalDataDemo.Workflow1), parameters);
instance.Start();

Next, create some class that implements ICalculator. In this case this is very straightforward since we live in the same project as the workflow definition. In reality you'd have to reference the assembly with the interface definition of course. For the sake of the demo, a simple implementation as a nested class inside Program will meet our needs:

class CalculatorService : ICalculator
{
   #region ICalculator Members

   public int Add(int a, int b)
   {
      return a + b;
   }

   public int Subtract(int a, int b)
   {
      return a - b;
   }

   public int Multiply(int a, int b)
   {
      return a * b;
   }

   public int Divide(int a, int b)
   {
      return a / b;
   }

   #endregion
}

Now it's time to hook in this service to the workflow runtime, so that instances can rely on it to get their jobs done. This is done in the Program class again, as follows:

ExternalDataExchangeService edx = new ExternalDataExchangeService();
workflowRuntime.AddService(edx); //keep this order!
edx.AddService(new CalculatorService()); //keep this order!

It's important to keep the order of the last two lines, otherwise you'll end up with an exception. Also, import the System.Workflow.Activities namespace that contains the ExternalDataExchangeService class. This class acts as a container for all external data exchange services, i.e. implementations of interfaces that were annotated with the ExternalDataExchangeAttribute earlier on. You can only have one implementation of each service.

Finally, to get the result back, we'll change the WorkflowCompleted event handler as displayed below:

int res = 0;

AutoResetEvent waitHandle = new AutoResetEvent(false);
workflowRuntime.WorkflowCompleted += delegate(object sender, WorkflowCompletedEventArgs e)
{
   res = (int) e.OutputParameters["Result"];
   waitHandle.Set();
};

...

In here, res is a variable declared at the top of the Main method which gets printed out at the end of the method:

WorkflowInstance instance = workflowRuntime.CreateWorkflow(typeof(ExternalDataDemo.Workflow1), parameters);
instance.Start();

waitHandle.WaitOne();

Console.WriteLine(res);

The result when executing should be obvious :-).

Conclusion

Exchanging data with external parties should be apparent when looking at the workflow definition. That's why the CallExternalMethodActivity was created to encapsulate such a call for external data. By doing so, external data exchange has the potential to become just yet another service, like tracking and persistence. In the next post, we'll talk about events. Enjoy!


Introduction

In the past, I've been talking about making a workflow dynamic by applying changes to it at runtime:

Yesterday, you learned how to use tracking (and persistence) services to visualize what's going on inside a workflow in flight, using the WorkflowMonitor sample application that comes with the Windows SDK. Today, we'll combine the dynamic adaptation of a workflow with tracking services.

Modifying the WorkflowMonitor

As discussed yesterday, the WorkflowMonitor tracks workflows by polling a tracking database used by WF. To visualize what a workflow is keeping itself busy with, it relies on the workflow definition type that gets loaded into the workflow designer rehosted by the sample application. However, when making dynamic changes, the WorkflowMonitor doesn't visualize the modifications made to the workflow, because the type doesn't get reloaded. To make things work in a quick-n-dirty fashion, you'll need to change the WorkflowMonitor sample a bit (one line to be precise).

Go to MainForm.cs and locate the method called UpdateActivities. Change the code as follows (only a few lines are displayed, just enough for you to find the spot where modification is needed):

...

ListViewItem currentWorkflow = listViewWorkflows.SelectedItems[0];
if (currentWorkflow != null)
{
   Guid workflowInstanceId = workflowStatusList[(currentWorkflow.SubItems[0]).Text].InstanceId;

   SqlTrackingWorkflowInstance sqlTrackingWorkflowInstance = null;
   if (true == monitorDatabaseServiceValue.TryGetWorkflow(workflowInstanceId, out sqlTrackingWorkflowInstance))
   {
      //Edited by Bart De Smet - 10/05/06
      GetWorkflowDefinition(workflowInstanceId);
      //End edit

      listViewActivities.Items.Clear();
      activityStatusListValue.Clear();

      ...

Dynamic updates

To keep things easy, I'll rely on the results of yesterday's work, so follow those instructions first and make sure the app runs fine. Then go to Workflow1.cs in the TrackingDemoLibrary project and change the ExecuteCode event handler for allowAccess like this:

private void allowAccess_ExecuteCode(object sender, EventArgs e)
{
   Console.ForegroundColor = ConsoleColor.Green;
   Console.WriteLine("You're granted access");
   Console.ResetColor();

   WorkflowChanges wc = new WorkflowChanges(this);

   CodeActivity hello = new CodeActivity("hello");
   hello.ExecuteCode += new EventHandler(hello_ExecuteCode);

   IfElseActivity ageChecker = (IfElseActivity)wc.TransientWorkflow.Activities["ageChecker"];
   IfElseBranchActivity plusEighteen = (IfElseBranchActivity)ageChecker.GetActivityByName("plusEighteen");
   plusEighteen.Activities.Add(hello);

   this.ApplyWorkflowChanges(wc);
}

Next, add another delay on top of the workflow, set to 10 seconds. This will give us the opportunity to see the dynamic change happen in the WorkflowMonitor:

Recompile and make sure to re-register the workflow definition assembly in the GAC:
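For instance, using gacutil.exe from the SDK (assuming the output assembly is called TrackingDemoLibrary.dll):

>gacutil /i TrackingDemoLibrary.dll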

You might need to recreate the tracking database too (that's the easy way to clean it when doing demos), as explained in the post on Tracking Services. Just drop the SqlTrackingDemo database, create it again and execute the two .sql scripts.

Now start the adapted (see above) WorkflowMonitor and then start the TrackingDemo solution. Keep an eye on the WorkflowMonitor and see what happens:

The result of the execution looks as follows thanks to dynamic adaptation:

Conclusion

Tracking in a world of dynamic updates is even more interesting and with one simple change to the WorkflowMonitor sample code, one creates a very appealing piece of tracking functionality. Have fun!


Introduction

In the previous post I've covered the use of persistence services with Windows Workflow Foundation to enable long-running workflows in a reliable way. However, there's still another dark side to long-running workflows that needs a solution: what's happening inside? As you can imagine, a workflow in progress can be blocked waiting for a lot of things, such as the expiration of a timer to indicate a timeout (e.g. by putting a DelayActivity in parallel with another activity using a ParallelActivity) or a wait for an external chunk of data to arrive. In order to visualize what's going on inside, at runtime, the WF Gods created tracking services.

Getting started

Just like persistence services, tracking services rely on some store to put information - in this case tracking information - in. Again, WF comes with a SQL Server enabled service out of the box, called the SqlTrackingService. There's quite a bit to tell about tracking services, but we'll stick with the basics in this post. Nevertheless, a short list of cool things:

  • Filtering tracking information using tracking profiles
  • Querying tracked information using SqlTrackingQuery

These might get covered in much more detail somewhere in the future. As usual, in case you have questions, you can contact me too.

Right, on to the real stuff now. To enable tracking services using SQL Server (2000, 2005, MSDE, Express) we need to create a tracking database first. We'll use SQL Server 2005 Express Edition in this post (download here) with SQL Server Management Studio Express installed on the machine:

  1. Connect to the database server:


  2. Create a new database called SqlTrackingDemo:


  3. Now define the database structure and logic required by tracking services, based on two .sql scripts that ship with WF in the %windir%\Microsoft.NET\Framework\v3.0\Windows Workflow Foundation\SQL\EN folder:


  4. Execute (in order) Tracking_Schema.sql and Tracking_Logic.sql, while making sure you've selected the newly created database from step 2:


  5. Finally you should find a bunch of tables and stored procedures in the database (check this).

A tracking-enabled workflow project

A simple solution

Create a new Sequential Workflow Console Application first, called TrackingDemo:

Remove Workflow1 from the Solution Explorer and add a new Sequential Workflow Library project to the solution, called TrackingDemoLibrary:

This step is required because we'll use our workflow definition to visualize tracking information and to do so, the assembly containing the workflow definition will be put in the GAC (as shown later on). Next, add a reference to the TrackingDemoLibrary project in the TrackingDemo (host) project:

In the code of the host application (Program.cs) change the call to CreateWorkflow as follows:

WorkflowInstance instance = workflowRuntime.CreateWorkflow(typeof(TrackingDemoLibrary.Workflow1));

Finally, go to the project properties of TrackingDemoLibrary and make the assembly strong-named:

Defining the workflow

Now go to Workflow1 in the TrackingDemoLibrary project. First, parameterize the workflow by adding two properties to the code:

private string firstName;

public string FirstName
{
   get { return firstName; }
   set { firstName = value; }
}

private int age;

public int Age
{
   get { return age; }
   set { age = value; }
}

Now create the following workflow definition:

In here, the IfElseActivity's plusEighteen branch relies on a Declarative Rule Condition defined as follows:

The ExecuteCode event handlers for the different CodeActivity activities are defined like this:

private void welcome_ExecuteCode(object sender, EventArgs e)
{
   Console.WriteLine("Welcome " + firstName);
}

private void allowAccess_ExecuteCode(object sender, EventArgs e)
{
   Console.ForegroundColor = ConsoleColor.Green;
   Console.WriteLine("You're granted access");
   Console.ResetColor();
}

private void denyAccess_ExecuteCode(object sender, EventArgs e)
{
   Console.ForegroundColor = ConsoleColor.Red;
   Console.WriteLine("You're denied access");
   Console.ResetColor();
}

Finally, the delay activities in the workflow were added to simulate long-running operations and have a timeout value set to 10 seconds.

To test things, go to the Program.cs file and change the workflow instance creation code to pass through parameters:

Dictionary<string, object> parameters = new Dictionary<string, object>();
parameters.Add("FirstName", "Bart");
parameters.Add("Age", 23);

WorkflowInstance instance = workflowRuntime.CreateWorkflow(typeof(TrackingDemoLibrary.Workflow1), parameters);
instance.Start();

No magic yet:

Track it

Time to enable tracking. This is done by adding just one single line of code to the host code (Program.cs):

using System.Workflow.Runtime.Tracking;

...

workflowRuntime.AddService(new SqlTrackingService("Data Source=localhost\\SQLEXPRESS;Initial Catalog=SqlTrackingDemo;Integrated Security=SSPI;"));

Now, execute the application once more and then go to the SQL Server Management Studio Express to execute the following query:

use SqlTrackingDemo
exec dbo.GetWorkflows

This will result in something like this (a few columns were hidden in the screenshot):

In here you can see the identifier of the workflow instance as well as information about the workflow type. Other database tables include much more detailed information about the current state the workflow is in, effectively enabling tracking scenarios. Instead of doing low level work with these tables or stored procedures, we'll use a monitor application to see what's going on.

Lights on the WorkflowMonitor

When you install the Windows SDK, a folder with samples is created on the system, typically in %programfiles%\Microsoft SDKs\Windows\v6.0\Samples:

Extract the WFSamples.zip folder to some location on your system. In the Applications\WorkflowMonitor\CS subfolder of the samples, you'll find a WorkflowMonitor application that we'll take a look at now. Open it in Visual Studio 2005.

Now run the application. After a few seconds, the app will crash with the following error message:

This is because we didn't use the default configuration for tracking (i.e. different database name etc). However, we get the opportunity to change the settings right away:

The settings shown above are the valid ones in our case, using SQL Server 2005 Express with the right database name and a polling time of 5 seconds.

When you now try to start the monitoring (using the green "play" button) you'll likely see the following error (or something similar). You might need to restart the application to see the message popping up too:

Basically this means that the WorkflowMonitor is able to get the workflows using GetWorkflows but now tries to load the type specified in the database to visualize the workflows (which may be in flight). For the WorkflowMonitor to find our TrackingDemoLibrary.Workflow1 type, it will need to find the assembly containing the definition. You can either copy the dll file to the bin\Debug (or bin\Release depending on how you're running the WorkflowMonitor sample) folder of the WorkflowMonitor or register the workflow definition in the GAC. We choose the latter one:

Now restart the WorkflowMonitor and you should see something like this:

Notice the mark signs indicate the followed path during execution. It's a very interesting time investment to look at the code of this sample to find out how the workflow information is queried using SqlTrackingQuery. This sample is also interesting because of the designer rehosting used to visualize the workflow in the right-hand side pane. I'll cover designer rehosting some time in the future on this blog too.
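If you want to poke at the tracking data yourself, a minimal sketch of using SqlTrackingQuery looks like this (assuming the connection string from earlier and an instanceId Guid obtained from the runtime):

SqlTrackingQuery query = new SqlTrackingQuery("Data Source=localhost\\SQLEXPRESS;Initial Catalog=SqlTrackingDemo;Integrated Security=SSPI;");

SqlTrackingWorkflowInstance trackedInstance;
if (query.TryGetWorkflow(instanceId, out trackedInstance))
{
   // dump the activity-level tracking records for this instance
   foreach (ActivityTrackingRecord record in trackedInstance.ActivityEvents)
   {
      Console.WriteLine("{0} - {1}", record.QualifiedName, record.ExecutionStatus);
   }
}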

Now, run our workflow application again while the monitor is polling the tracking database. Contrary to what you might expect, you won't see a new workflow instance appearing in the list right away. This happens because tracking only performs its job when asked to do so, e.g. because of a persistence opportunity (this statement is somewhat simplified; check out the docs for full information on how the system works, related to work batches and so on). As an exercise, try to combine persistence (see previous post) with tracking to track a workflow in progress. You'll end up with a piece of hosting code like this:

workflowRuntime.AddService(
   new SqlWorkflowPersistenceService(
      "Data Source=localhost\\SQLEXPRESS;Initial Catalog=SqlPersistenceDemo;Integrated Security=SSPI;",
      true,
      TimeSpan.FromHours(1.0),
      TimeSpan.FromSeconds(5.0)
   )
);

workflowRuntime.AddService(
   new SqlTrackingService(
      "Data Source=localhost\\SQLEXPRESS;Initial Catalog=SqlTrackingDemo;Integrated Security=SSPI;"
   )
);

Chances are high your app will block or throw an exception. This is because the system needs to connect to the two databases at the same time in a transactional manner. To have this work, you need to start the DTC service on your machine:

Now execute the app again and you should see the following in the monitor:

As you can see, it's now possible to monitor a workflow in flight.

Note: An alternative approach to avoid the DTC problem is to use the - fasten your seatbelts - SharedConnectionWorkflowCommitBatchService service instead of a combination of the tracking service and the persistence service. As the name implies, this one uses a shared connection for both, effectively avoiding distributed transactions and the need for DTC to be enabled and started on the machine. I won't cover this here.

Conclusion

The use of long-running workflows in combination with a lot of services can turn the whole thing into an opaque box without you (and your managers) knowing what's going on inside. Tracking services are the window into the internals of workflow instances in all their states, even when in progress. Combining this goodness with things like designer rehosting can open up a lot of great scenarios, as shown in the WorkflowMonitor sample.


Introduction

In the past, I've been writing about WF quite a bit. If you haven't read these articles already, you might find it interesting to check these out first before digging into this one. That being said, this post will focus on persistence services (also known as state persistence services).

Why persistence?

Before we dive deeper in the technical details, let's answer a crucial question: why persistence? As you probably know by now, workflows can be defined (at least partially) as a great tool to visualize either human workflow (e.g. approval processes) or machine workflow (e.g. business integration scenarios). Instead of writing procedural code, one creates a flow diagram or state diagram that represents the underlying logic that gets managed by the workflow runtime and engine. Read my WF introduction post for more information on this.

A common question is why one would prefer to use workflow instead of classic procedural code. There are a lot of answers to the question, one of which relates to the graphical aspect of a workflow that makes it more approachable and easier to understand. Another good answer is the support for scenarios that would end up with a rather high level complexity in procedural code, like parallel execution, data exchange with other components, transactional support, etc. Yet another one - the one that will interest us most - is a set of runtime services available to cope with more complex situations, including persistence services.

Besides the distinction between state workflows and sequential workflows and the distinction between machine workflows and human workflows, one can also draw a line between long-running and non-long-running workflows. It's not atypical (even more, rather typical) for workflows to be long-running. For example, approval processes will require human interaction by one or more parties to continue the workflow execution. Yet another example is a business integration application that waits for external data to arrive. In such a case, we say the workflow is idled. Needless to say, it would be a waste of resources (and even unreliable) to keep the state of an idle workflow in memory. That's why we need persistence services.

The basics: getting idle

We'll start by taking a look at a workflow becoming idle. Start by creating a simple sequential workflow:

To force a workflow getting idle, we can use the DelayActivity. So, define the following workflow, containing a DelayActivity with the delay set to 10 seconds:

Next, define the ExecuteCode event handler for both CodeActivity activities as follows:

private void codeActivity1_ExecuteCode(object sender, EventArgs e)
{
   Console.ForegroundColor = ConsoleColor.Green;
   Console.WriteLine("CodeActivity1 speaking");
   Console.ResetColor();
}

private void codeActivity2_ExecuteCode(object sender, EventArgs e)
{
   Console.ForegroundColor = ConsoleColor.Green;
   Console.WriteLine("CodeActivity2 speaking");
   Console.ResetColor();
}

When you launch this workflow application, no surprises will happen. At least not visibly (yet). The workflow will be launched, the "CodeActivity1 speaking" message will be printed out, the workflow will pause for 10 seconds, and then "CodeActivity2 speaking" will appear. Behind the scenes however, the workflow is idled when the delay activity is started. To visualize this, go to the host application's Program.cs file and hook in an event handler for the WorkflowIdled event as shown below:

class Program
{
   static void Main(string[] args)
   {
      using(WorkflowRuntime workflowRuntime = new WorkflowRuntime())
      {
         AutoResetEvent waitHandle = new AutoResetEvent(false);
         workflowRuntime.WorkflowCompleted += delegate(object sender, WorkflowCompletedEventArgs e) {waitHandle.Set();};
         workflowRuntime.WorkflowTerminated += delegate(object sender, WorkflowTerminatedEventArgs e)
         {
            Console.WriteLine(e.Exception.Message);
            waitHandle.Set();
         };
         workflowRuntime.WorkflowIdled += new EventHandler<WorkflowEventArgs>(workflowRuntime_WorkflowIdled);

         WorkflowInstance instance = workflowRuntime.CreateWorkflow(typeof(PersistenceDemo.Workflow1));
         instance.Start();

         waitHandle.WaitOne();
      }
   }

   static void workflowRuntime_WorkflowIdled(object sender, WorkflowEventArgs e)
   {
      Console.WriteLine("Idled " + e.WorkflowInstance.InstanceId);
   }
}

Run the application now. It should print the following before it completes (i.e. screenshot taken during the delay activity execution):

How persistence works

When a workflow gets idle, it can be persisted, so that it can be unloaded from memory. A persistence service inherits from the System.Workflow.Runtime.Hosting.WorkflowPersistenceService base class that's used by the workflow engine to perform persistence. I recommend you check out this class. WF comes with a SQL Server driven persistence service out of the box, System.Workflow.Runtime.Hosting.SqlWorkflowPersistenceService.

When a workflow gets idle, the system checks the registered services to find a persistence service, if any. If it finds one, the persistence service gets called to perform the persistence. Next, the system uses the persistence information to find out about the "next timer expiration", so that it can wake up the idled workflow instance when required and let it continue executing.
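Besides the automatic unload-on-idle behavior discussed below, you can also drive this process by hand from the host; a minimal sketch (assuming a persistence service has been registered with the runtime):

WorkflowInstance instance = workflowRuntime.CreateWorkflow(typeof(PersistenceDemo.Workflow1));
instance.Start();

instance.Unload();   // persists the instance state and removes it from memory
// ...later on...
instance.Load();     // loads the instance back into memory so execution can continue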

Setting up persistence

The first step to set up persistence using SQL Server (2000, MSDE, 2005, Express) is to create the persistence database. In this post, I'll use SQL Server 2005 Express Edition to illustrate the persistence service. You can download the software over here. Don't forget to install SQL Server Management Studio Express too.

  1. Open SQL Server Management Studio Express and connect to the server (typically localhost\SQLEXPRESS):


  2. Next, create a new database called SqlPersistenceDemo, either by invoking CREATE DATABASE or by using the tools:


  3. Now it's time to define the database by executing the WF .sql scripts provided in %windir%\Microsoft.NET\Framework\v3.0\Windows Workflow Foundation\SQL\EN:


  4. First, execute SqlPersistenceService_Schema.sql, then execute SqlPersistenceService_Logic.sql. Make sure to use the right database when you execute the scripts (SqlPersistenceDemo):


  5. You're ready. The persistence database should contain the following tables and sprocs:

Now, we can alter the code to support persistence. This is done in the host application by creating an instance of the SqlWorkflowPersistenceService class and registering it as a service with the workflow runtime:

workflowRuntime.AddService(
   new SqlWorkflowPersistenceService(
      "Initial Catalog=SqlPersistenceDemo;Data Source=localhost\\SQLEXPRESS;Integrated Security=SSPI;",
      true,
      TimeSpan.FromHours(1.0),
      TimeSpan.FromSeconds(5.0)
   )
);

The first parameter is straightforward and points to the database. In this case I'm using the SQLEXPRESS instance and connecting using Windows integrated authentication. The second parameter is called "unloadOnIdle"; when set to true, the workflow is unloaded when it gets idled, which is the point in time where persistence happens. The third parameter isn't relevant for our elaboration right now and is used when multiple hosts can load and unload workflow instances from the database (it sets the ownership duration for the host, so that if another host tries to load the instance during that interval, it doesn't succeed in doing so and an exception is thrown). The last parameter specifies the loading interval used to check whether a persisted workflow instance has to be loaded again (i.e. rehydrated from the database). We set this value to 5 seconds for the sake of the demo.

Tip: Check out SqlConnectionStringBuilder if you didn't do so yet and you want a more structured approach to create a connection string:

SqlConnectionStringBuilder sb = new SqlConnectionStringBuilder();
sb.IntegratedSecurity = true;
sb.DataSource = "localhost\\SQLEXPRESS";
sb.InitialCatalog = "SqlPersistenceDemo";
string connString = sb.ToString();

Testing it

Time to test. But before we do so, let's hook in another set of event handlers to find out about unload/load/persist events for the workflow:

   workflowRuntime.WorkflowIdled += new EventHandler<WorkflowEventArgs>(workflowRuntime_WorkflowIdled);
   workflowRuntime.WorkflowLoaded += new EventHandler<WorkflowEventArgs>(workflowRuntime_WorkflowLoaded);
   workflowRuntime.WorkflowUnloaded += new EventHandler<WorkflowEventArgs>(workflowRuntime_WorkflowUnloaded);
   workflowRuntime.WorkflowPersisted += new EventHandler<WorkflowEventArgs>(workflowRuntime_WorkflowPersisted);

...

static void workflowRuntime_WorkflowPersisted(object sender, WorkflowEventArgs e)
{
   Console.WriteLine("Persisted {0} on {1}", e.WorkflowInstance.InstanceId, DateTime.Now.ToUniversalTime());
}

static void workflowRuntime_WorkflowUnloaded(object sender, WorkflowEventArgs e)
{
   Console.WriteLine("Unloaded {0} on {1}", e.WorkflowInstance.InstanceId, DateTime.Now.ToUniversalTime());
}

static void workflowRuntime_WorkflowLoaded(object sender, WorkflowEventArgs e)
{
   Console.WriteLine("Loaded {0} on {1}", e.WorkflowInstance.InstanceId, DateTime.Now.ToUniversalTime());
}

static void workflowRuntime_WorkflowIdled(object sender, WorkflowEventArgs e)
{
   Console.WriteLine("Idled {0} on {1}", e.WorkflowInstance.InstanceId, DateTime.Now.ToUniversalTime());
}

Before we press the magic F5 button, there's one more thing to do. Go to SQL Server Management Studio Express and create a new query:

use SqlPersistenceDemo

exec dbo.RetrieveAllInstanceDescriptions
select * from dbo.InstanceState

Basically, the two are almost identical. Generally, I'd recommend relying on the stored procedures to retrieve information about the workflow instances in progress, but let's just illustrate both:

Now, run the workflow application. You'll see something like this:

As you can see, the idled event is followed by unloading the workflow, thanks to the persistence service (cf. the second "unloadOnIdle" parameter in the constructor call of SqlWorkflowPersistenceService). Next, when the workflow is unloaded, it's time to persist the workflow instance in the database. While the application is running you have 10 seconds to re-run the query to see this:

If you want to understand how a workflow is persisted, check out the Windows SDK documentation on writing a custom persistence service (search for "Creating Custom Persistence Services"). All of this works using serialization of the runtime workflow instance state.

Finally, the workflow will be reloaded and execution continues:

Notice execution didn't proceed after 10 seconds, but took 15 seconds instead. The reason for this is the specified loading interval (cf. the last parameter of the SqlWorkflowPersistenceService constructor). Based on this interval, the runtime checks whether a timer has expired amongst the persisted workflow instances.

Tip: Lower the delay's TimeoutDuration to 9 seconds and see what happens. Keep in mind little delays incurred by the runtime to do the persistence when analyzing the results.

Conclusion

In this post, we've seen the basics of the workflow state persistence services. Thanks to this service, long-running workflows can be implemented efficiently without you having to worry much about the persistence itself. In other words, services like the persistence service free developers from the burden of coping with error-prone tasks and recurring patterns like implementing persistence. In the next workflow post, I'll focus on the tracking service, so keep an eye on my RSS feed.


Introduction

In my previous post on Windows Workflow Foundation I covered how to expose a workflow via .NET 2.0 web services. You saw how easy it is to do this thanks to the built-in support in the Visual Studio 2005 Extension for Workflow. However, the next-generation service platform is Windows Communication Foundation (WCF), and so we'll take a look into the basics of hosting a workflow in WCF.

Defining a workflow

We'll start this demo by creating a new Empty Workflow Project, as shown below. The reason to choose a workflow project is to have the VS2005 Extension for Workflow loaded, which we wouldn't have if another non-WF project type is chosen.

Next, add a new Sequential Workflow to the project (State Machine Workflows will be covered in a later post) and call it Adder. Needless to say, using WF (and the calculator service around it) is pointless for this kind of application, but hey, that's what demos are all about, isn't it? We'll just examine how to host a workflow inside a WCF host, period.

Tip: Choose the Sequential Workflow (with code separation) item. You can compare this with code-behind separation in ASP.NET. One file will contain markup (in this case that will be the .xoml file - XOML stands for eXtensible Object Markup Language) and another one (the code-behind file if you want) contains the C# code you'll be adding (.xoml.cs extension).

On the workflow, add one single CodeActivity called add. Next, create an event handler for ExecuteCode with the following piece of code in it:

private void add_ExecuteCode(object sender, EventArgs e)
{
   result = a + b;
}

In here a, b and result are private members that have a corresponding (starting with upper-case) property as shown below (prop-prop-prop):

private int a;

public int A
{
   get { return a; }
   set { a = value; }
}

private int b;

public int B
{
   get { return b; }
   set { b = value; }
}

private int result;

public int Result
{
   get { return result; }
   set { result = value; }
}

Adding the WCF service

The next step is to add a WinFX Service (in later builds this will likely be changed to reflect the .NET Framework 3.0 branding), which we'll call CalculatorService.

In the CalculatorService.cs file you'll find an interface called ICalculatorService and a class called CalculatorService implementing that interface. The interface is called the service contract and has to be changed as follows:

[ServiceContract()]
public interface ICalculatorService
{
   [OperationContract]
   int Add(int a, int b);

   [OperationContract]
   int Subtract(int a, int b);

   [OperationContract]
   int Multiply(int a, int b);

   [OperationContract]
   int Divide(int a, int b);
}

We'll only be implementing the Add method, you can create an implementation for the other methods yourself if you want. Tip: Divide is somewhat more tricky because you have to cope with a "division by zero" situation. To solve that problem, go and find out about FaultContracts in WCF and play around a bit.
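As a hint of the direction (a sketch, not the implementation used in this post; MathFault is a hypothetical name), a fault contract for the division-by-zero case could look like this:

// requires System.ServiceModel and System.Runtime.Serialization
[DataContract]
public class MathFault
{
   [DataMember]
   public string Problem;
}

// on the service contract:
[OperationContract]
[FaultContract(typeof(MathFault))]
int Divide(int a, int b);

// in the service implementation:
public int Divide(int a, int b)
{
   if (b == 0)
   {
      MathFault fault = new MathFault();
      fault.Problem = "division by zero";
      throw new FaultException<MathFault>(fault);
   }
   return a / b;
}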

Next, change the CalculatorService class as outlined below. Tip: Put the cursor somewhere in the ICalculatorService part of the class declaration, click on the smart tag and choose to implement the interface (implicitly) - or press SHIFT-ALT-F10. Leave the Subtract, Multiply and Divide methods as they are now, i.e. just throwing an exception (I still wonder why the VS 2005 IDE folks didn't choose NotImplementedException for this piece of auto code generation):

public int Subtract(int a, int b)
{
   throw new Exception("The method or operation is not implemented.");
}

public int Multiply(int a, int b)
{
   throw new Exception("The method or operation is not implemented.");
}

public int Divide(int a, int b)
{
   throw new Exception("The method or operation is not implemented.");
}

The Add method is the most interesting one of course:

public int Add(int a, int b)
{
   int res = 0;

   using (WorkflowRuntime wr = new WorkflowRuntime())
   {
      AutoResetEvent waitHandle = new AutoResetEvent(false);

      wr.WorkflowCompleted +=
         delegate(object sender, WorkflowCompletedEventArgs e)
         {
            res = (int)e.OutputParameters["Result"];
            waitHandle.Set();
         };

      Dictionary<string, object> arguments = new Dictionary<string, object>();
      arguments.Add("A", a);
      arguments.Add("B", b);

      WorkflowInstance wi = wr.CreateWorkflow(typeof(Adder), arguments);
      wi.Start();

      waitHandle.WaitOne();
   }

   return res;
}

This should be familiar from previous posts. One thing that might be new is the use of output parameters:

res = (int)e.OutputParameters["Result"];

It might be a little hard to read the code from top to bottom because the result is grabbed inside the anonymous method that's hooked up to the WorkflowCompleted event. The OutputParameters dictionary contains an entry for each public property of the workflow, so the "Result" key maps to the Result property we defined earlier.

In order to make our service functional, we have to create a configuration file first:

As you should have seen already, the CalculatorService.cs file contains a comment section on how to create the configuration file. Uncomment the configuration XML and cut and paste it into the <configuration> section of the app.config file. Next, make three changes in total (explained below the listing):

<?xml version="1.0" encoding="utf-8" ?>
<configuration>
   <system.serviceModel>
      <services>
         <!-- Before deployment, you should remove the returnFaults behavior configuration to avoid disclosing information in exception messages -->
         <service name="WFviaWCF.CalculatorService" behaviorConfiguration="returnFaults">
            <endpoint contract="WFviaWCF.ICalculatorService" binding="wsHttpBinding" />
            <endpoint contract="IMetadataExchange" binding="mexHttpBinding" address="mex" />
         </service>
      </services>

      <behaviors>
         <serviceBehaviors>
            <behavior name="returnFaults">
               <serviceMetadata httpGetEnabled="true" />
               <serviceDebug includeExceptionDetailInFaults="true" />
            </behavior>
         </serviceBehaviors>
      </behaviors>
   </system.serviceModel>
</configuration>

The first change is required because of a configuration schema change (the service's type attribute has changed to become a name attribute); the other two changes enable MEX (Metadata EXchange) that allows us to create a web service proxy (read: retrieve the WSDL service definition over HTTP).

A console application host

Next on our to-do list is the creation of a hosting application. You could opt for a Windows Service or walk the path of IIS hosting (slightly more involved options), but we'll stick with a simple console application. So add a new class file to the project and call it Program.cs:

Change the code of the Program.cs file like this:

using System;

namespace WFviaWCF
{
   class Program
   {
      public static void Main()
      {
         MyServiceHost.StartService();
         Console.ReadLine();
      }
   }
}

This code uses the MyServiceHost class that was auto-generated as part of the WinFX Service creation process (see CalculatorService.cs for its definition) and just starts the service and waits for user input to stop the service host.
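
In case you're curious what's behind StartService, the template-generated helper roughly boils down to the following. This is a sketch from memory rather than the literal generated code; the field name myServiceHost is illustrative:

using System.ServiceModel;

internal class MyServiceHost
{
   internal static ServiceHost myServiceHost = null;

   internal static void StartService()
   {
      // Create the host for CalculatorService, picking up endpoints and behaviors from app.config.
      myServiceHost = new ServiceHost(typeof(CalculatorService));
      myServiceHost.Open();
   }

   internal static void StopService()
   {
      // Close the host gracefully when we're done.
      if (myServiceHost != null)
      {
         myServiceHost.Close();
         myServiceHost = null;
      }
   }
}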

Before we can launch the application we'll need to set the project properties to compile to a console application with a given entry point:

Running the application

Time to press F5 and see the Console application service host in action. Whoops, something goes wrong (assuming you're running on Vista and you haven't elevated yourself to run VS2005 with administrator privileges). No worries, I wouldn't post this if I didn't have a solution in mind...

What happens is this: Windows Vista has a kernel-mode listener called http.sys (just like Windows Server 2003 and Windows XP SP2). All HTTP traffic passes through it. But before an app can start listening on an HTTP address, it needs to get registered with http.sys. More specifically, the user account running the service host needs permission for the URL it wants to listen on. In Windows Server 2003 and Windows XP SP2 there's a tool called httpcfg.exe (see TechNet; see Support Tools for Windows XP). However, it doesn't ship with Windows Vista. Geeks could compile httpcfg.exe from the code that comes with the Platform SDK (folder Samples\NetDS\HTTP\ServiceConfig).

However, the good news is that httpcfg.exe is dead. Its successor is called netsh http. So, go to a command prompt, elevated using "Run as administrator", and do the following:

The command used in here is

add urlacl url=http://+:8080/WFviaWCF/CalculatorService user=VISTA-9400\Bart

and contains the URL as displayed in our AddressAccessDeniedException, plus the user who is allowed to use the URL (replace it with your own DOMAIN\user). Alternatively you can also grant access to the URL to BUILTIN\Users or \Everyone, but please understand the possible security risks when doing so.
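
If you'd rather not enter the interactive netsh prompt, the same reservation can be added (and removed again afterwards) with one-liners along these lines, assuming the URL and a DOMAIN\user placeholder from this demo:

netsh http add urlacl url=http://+:8080/WFviaWCF/CalculatorService user=DOMAIN\user
netsh http delete urlacl url=http://+:8080/WFviaWCF/CalculatorService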

Try to launch the application again; it should work correctly now. Time to open up a browser and navigate to http://localhost:8080/WFviaWCF/CalculatorService. It should look like this:

If it looks like the screenshot below, you've not configured MEX correctly as I outlined earlier on. Notice however that WCF helps you out with the problem, so you don't need to know the XML configuration syntax by heart.

A client application

In order to test our service we'll have to build a client application now. Just open another instance of Visual Studio 2005 and create a new console application called WFviaWCFClient. Next, go to the solution explorer, right-click and choose Add Service Reference. This is the WCF equivalent of Add Web Reference and calls svcutil.exe behind the scenes (instead of wsdl.exe). It fully supports WCF channels and the contract methodology, but you can inspect this yourself if you like in the Service References\localhost.map\localhost.cs file that will get generated.
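
For reference, the manual command-line equivalent would be roughly something like the following (the exact URL and options depend on your setup):

svcutil http://localhost:8080/WFviaWCF/CalculatorService /language:C# /out:localhost.cs /config:app.config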

Enter the Service URI and accept the default Service reference name. Finally click on OK:

Last but not least, change the main method in program.cs like this:

static void Main(string[] args)
{
   int a = 1, b = 2;

   localhost.CalculatorServiceClient svc = new localhost.CalculatorServiceClient();
   Console.WriteLine("{0} + {1} = {2}", a, b, svc.Add(a, b));
   Console.ReadLine();
}

Time to run the application (it's almost a shame to show you such a trivial calculation, hence the little image below :-)):

Exercise: Make multiple calls to the Add method and notice you don't end up with an exception like we did in the ASP.NET 2.0 Web Service scenario. Why is this? (Tip: think about persistence, sessions and our WCF host implementation).

Conclusion

Hosting our workflow behind a service facade using WCF shouldn't be difficult at all (try to experiment with other transport mechanisms, check out the WCF documentation). However, keep in mind this was a trivial stateless example. Also notice the lack (today) of equivalents to the WebServiceInput/Output/Fault activities. More complex scenarios (supporting state etc) will require more work; I'll come back to this later on, so keep an eye on my blog.

See you again in the WF space soon!


Introduction

Previously in my WF blog series I've been talking about the WF basics as well as dynamic updates to workflow instances at runtime. These posts had one thing in common: all of the demo applications were hosted in a simple console application that was for the most part generated by the Visual Studio 2005 WF Extension. As you probably know by now, WF is an in-process workflow engine, so hosting the WF runtime is required to get it to work. In this post, you'll see how to host WF in ASP.NET 2.0 and expose it through a web service.

A simple workflow library

Creating the project

In the previous posts, we started coding by creating a Workflow-enabled console application. This time however, we need to create a library that defines a workflow. Later on, this workflow definition will be used by a web service in an ASP.NET website. So, open Visual Studio 2005 and create a new Sequential Workflow Library project entitled "WorkflowViaWS":

In order to concentrate on the real topic of this post, i.e. web service hosting of workflows, we'll create a trivial workflow: a divider. So, rename Workflow1.cs to Divider.cs first.

A webservice 'contract'

Now, when talking about web services one word should pop up in your brain almost instantaneously: the contract. Although we're not working with WCF over here yet, WF also has a notion of a contract of sorts. Basically, we need to create an interface that will act as the contract between the web service and the workflow it encapsulates. Add a new interface to the project called IDivider:

namespace WorkflowViaWS
{
   interface IDivider
   {
      double Divide(double a, double b);
   }
}

Accepting input == WebServiceInputActivity

Now we're ready to define the workflow itself. In order to accept input from the web service, we need to use a WebServiceInputActivity from the Toolbox.

When you drag and drop this activity to the workflow's designer surface, the designer will tell you to set the IsActivating property of the activity to true. This is required to tell the system that this web service input actually activates the (new) workflow instance. Think about this for a while ... you can indeed have more than one WebServiceInputActivity in your workflow: welcome to the world of stateful webservices and state hydration/dehydration. This falls outside the scope of this post however, but stay tuned to discover this too!

Next, the designer will ask you to set the InterfaceType property. This is where IDivider enters the scene. Setting the property couldn't be a simpler job thanks to the dialog support:

Next you'll have to set the MethodName with the assistance of a dropdown list in the Properties pane. Needless to say we'll choose the only method available: Divide.

Now pay attention to the Properties pane: a set of Parameter fields will come out of the blue:

Setting these fields is quite easy again, but there's some work to do first: adding properties to the Divider workflow class:

private double a;

public double A
{
   get { return a; }
   set { a = value; }
}

private double b;

public double B
{
   get { return b; }
   set { b = value; }
}

Tip: use the prop code snippet in Visual Studio 2005 as shown below (type 'prop' without quotes and press TAB).

These properties will act as inputs to the workflow, just like we did in the previous posts with our console-hosted workflows and the arguments dictionary passed to CreateWorkflow. This is how the properties would be used in a console-based application (or another non-WS hosting environment):

Dictionary<string, object> arguments = new Dictionary<string, object>();
arguments.Add("A", 10.0);
arguments.Add("B", 2.0);

WorkflowInstance instance = workflowRuntime.CreateWorkflow(typeof(WorkflowViaWS.Divider), arguments);

Binding a parameter to a property is assisted by the following dialog that shows all of the valid candidates for binding. Bind parameter 'a' to property 'A' and parameter 'b' to property 'B':

Now that we have completed all the required properties of the WebServiceInputActivity, the system still complains about something else...

What goes in must go out - WebServiceOutputActivity

A web service hosted workflow won't do much if it can't produce results (so, there's no built-in concept of "one-way web methods" available, although you can think of workarounds for that).

Therefore, add a WebServiceOutputActivity to the designer (of course below the WebServiceInputActivity) and set its InputActivityName to webServiceInput1, the name of the corresponding input activity (note: I've been lazy with the activity naming in here, but since we only have one input and one output activity this doesn't matter much - btw, you're absolutely right to discard this excuse).

Next, we have to set the (ReturnValue) property of the output activity. Again, we'll add a property to the workflow class:

private double result;

public double Result
{
   get { return result; }
   set { result = value; }
}

Selecting the return value is done using exactly the same dialog as we used to set the input parameter bindings. The result should look like:

Division by zero? Fault! - Some additional logic

In Kindergarten (or a bit later) everyone learned that dividing by zero isn't possible. The .NET Framework however takes a more mathematical approach: dividing a non-zero double by zero yields Infinity, and 0.0 / 0.0 yields NaN (not-a-number). Let's take the former approach, which gives us the opportunity to add some additional logic to the workflow and to illustrate the WebServiceFaultActivity.
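
If you want to see that "mathematical approach" for yourself, a quick snippet like the following illustrates it:

// Floating-point division by zero doesn't throw in .NET:
double x = 10.0 / 0.0;   // PositiveInfinity
double y = -10.0 / 0.0;  // NegativeInfinity
double z = 0.0 / 0.0;    // NaN
Console.WriteLine("{0} {1} {2}", x, y, z);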

Perform the following jobs:

  1. Drag and drop an IfElseActivity to the designer surface in between the input and output activities. Rename it to divisionByZeroCheck.
  2. As you can see the IfElseActivity has two branches. Select the left branch and rename it to isZero. Then select the right branch and rename it to nonZero.
  3. The designer now complains on the left (isZero) branch and asks to set the Condition property.
    • In the Properties pane, select Declarative Rule Condition.
    • Next, expand the Condition property and set the ConditionName to CheckForZero.
    • Now click the ellipsis (...) to create a new rule:


    • The result should look like:


    • Tip: Take a look at the Divider.rules file in the Solution Explorer. This file contains the declarative rule. The reason I've chosen a "declarative rule condition" is to show you this feature (and the underlying XML); however, in this situation a "code condition" would be better because the condition won't ever change. Declarative rule conditions and code conditions both have pros and cons, but this falls outside the scope of this post. A sketch of the code-condition alternative follows right after this list.
  4. Drag and drop a WebServiceFaultActivity into the left (isZero) branch.
    • Set its InputActivityName to webServiceInput1.
    • Next, you'll have to set the Fault property. In order to do this, first switch to the code view and add the following:

      private ArgumentException divisionByZero = new ArgumentException("Division by zero.");

      public ArgumentException DivisionByZero
      {
         get { return divisionByZero; }
         set { divisionByZero = value; }
      }

    • Now you can set the Fault property to DivisionByZero.

  5. In the right branch (nonZero), add a CodeActivity and call it divide. Add an ExecuteCode handler with the following piece of code:

    private void divide_ExecuteCode(object sender, EventArgs e)
    {
       result = a / b;
    }

  6. Move the webServiceOutput1 activity below the divide activity in the right branch (nonZero).
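
As promised in the tip above, here's a minimal sketch of what the code-condition alternative to the declarative rule could look like. The handler name CheckForZeroCode is purely illustrative; you'd hook it up via the Condition property after picking Code Condition in the Properties pane:

private void CheckForZeroCode(object sender, ConditionalEventArgs e)
{
   // Setting Result to true makes the isZero branch execute.
   e.Result = (b == 0);
}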

If you've done the jobs above correctly, you should see this:

Get it published

Time to start the web service creation and publication. Right-click the WorkflowViaWS project in the Solution Explorer and choose Publish as Web Service:

This will create an ASP.NET Web Site called WorkflowViaWS_WebService and add it to the current solution:

As you can see a web.config file was created (it should pop up immediately after completion of the "Publish as Web Service" action). In there you'll see some interesting regions such as:

<WorkflowRuntime Name="WorkflowServiceContainer">
   <Services>
      <add type="System.Workflow.Runtime.Hosting.ManualWorkflowSchedulerService, System.Workflow.Runtime, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
      <add type="System.Workflow.Runtime.Hosting.DefaultWorkflowCommitWorkBatchService, System.Workflow.Runtime, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
   </Services>
</WorkflowRuntime>

More information about these two services can be found in the Windows SDK; in a nutshell, the ManualWorkflowSchedulerService makes the workflow run synchronously on the calling (ASP.NET request) thread rather than on CLR thread pool threads, which is what you want in a web hosting scenario. You'll also see an HttpModule being hooked in:

<httpModules>
   <add type="System.Workflow.Runtime.Hosting.WorkflowWebHostingModule, System.Workflow.Runtime, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" name="WorkflowHost" />
</httpModules>

This module (WorkflowWebHostingModule) is responsible for maintaining (reading and writing) a client cookie that holds the workflow's instance identifier (a GUID that uniquely identifies the workflow instance the user is connected to). This allows multiple web method calls to be part of one workflow instance.

Finally there are references to the System.Workflow.* assemblies:

<add assembly="System.Workflow.Activities, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"/>
<add assembly="System.Workflow.ComponentModel, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"/>
<add assembly="System.Workflow.Runtime, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"/>

The web service file (called WorkflowViaWS.Divider_WebService.asmx but renameable) has a fairly short definition:

<%@WebService Class="WorkflowViaWS.Divider_WebService" %>

Geeky readers might consider running ildasm on the WorkflowViaWS.dll file that was dropped in the Bin folder:

Hit F5 and run

Eager to see what you've created? Just hit F5 (with focus somewhere inside the ASP.NET Web Site so that the website is the startup project) and choose to add a debugging configuration to web.config (this might be useful if you decide to play with breakpoints in the workflow). The built-in ASP.NET Development Server will get launched:

Finally the web service page will be displayed in the browser:

Click on Divide to test the web service and enter the values 10 and 20. The result should be - help where is my calc.exe? - 0.5:

Note: If you try to invoke the web service more than once in the same session, you'll see the following error.

System.InvalidOperationException: The workflow hosting environment does not have a persistence service as required by an operation on the workflow instance "21b04980-9926-4b35-a87f-3d1c98b3a987".

This is normal because of our configuration. Recall the WorkflowWebHostingModule HttpModule that keeps track of the workflow instance identifier in a cookie on the web client. In between calls from the same session the system needs to be able to persist the workflow instance data to a durable store (typically a SQL Server database). Because we didn't configure a persistence service, this fails. Persistence services will be covered in a separate post later on; a quick preview of the configuration is shown below.
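
Just to give you an idea of what's coming: enabling persistence boils down to registering a persistence service in the WorkflowRuntime section of web.config, roughly like the sketch below. The connection string is a placeholder, and the details (creating the persistence database, etc.) will follow in that later post:

<WorkflowRuntime Name="WorkflowServiceContainer">
   <Services>
      <!-- ... the scheduler and work batch services from above ... -->
      <add type="System.Workflow.Runtime.Hosting.SqlWorkflowPersistenceService, System.Workflow.Runtime, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
           ConnectionString="Initial Catalog=WorkflowPersistence;Data Source=localhost;Integrated Security=SSPI"
           UnloadOnIdle="true" />
   </Services>
</WorkflowRuntime>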

What about the fault? Restart your browser and open the web service page again. Now try to make a call with parameters a=10 and b=0. When invoking the service, you'll end up with an HTTP 500 Internal Server Error. If you want to see the exception that was sent in the fault, create a little web service client application (e.g. a console application) like this:

class Program
{
   static void Main(string[] args)
   {
      localhost.Divider_WebService ws = new localhost.Divider_WebService();

      try
      {
         double res = ws.Divide(10, 0);
         Console.WriteLine(res);
      }
      catch (Exception ex)
      {
         // The SoapException wrapping our ArgumentException fault ends up here;
         // inspect ex (or set a breakpoint) to see the fault details.
         Console.WriteLine(ex.Message);
      }
   }
}

I assume everyone knows how to add a web service reference to an existing web service and how to use the proxy object. There's one caveat however: if you restart the ASP.NET Development Server that comes with Visual Studio 2005, it will likely listen on another random port. The web service proxy will then be invalid, unless you change the Url property on the proxy object to reflect the current URL to the web service (http://localhost:<port>/WorkflowViaWS_WebService/WorkflowViaWS.Divider_WebService.asmx).
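 
Setting the Url property is a one-liner; something like this, where the port number 1234 is just an example for whatever port the development server picked:

localhost.Divider_WebService ws = new localhost.Divider_WebService();
ws.Url = "http://localhost:1234/WorkflowViaWS_WebService/WorkflowViaWS.Divider_WebService.asmx";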

You can copy the application to an IIS machine too of course, but keep in mind the same version (= CTP build) of WF (and hence, .NET Framework 3.0) needs to be present on the web server machine. If you're using your own pc as IIS web server, there shouldn't be a problem if configuration is correct (ASP.NET 2.0 is registered - cf. aspnet_regiis.exe - and - in W2K3 - the web service extension is allowed). For Vista users some experience with IIS 7 might be helpful although it's pretty plug-and-play.

That said, the result of a debugging session should be like this:

Conclusion

Exposing a workflow through simple ASP.NET 2.0 web services is not rocket science. In this post you saw the basics; further posts in this workflow series will cover more advanced scenarios with workflow persistence and the use of WCF as a WF host.

Keep up the good WF-work!

