Sharing your living documentation with everyone using Pickles

BDD is a great way to collaboratively define the features within your system and understand their intent by writing scenarios in the Gherkin language (Given, When, Then). Another bonus of writing scenarios in this style is that scenarios written in plain English can easily be turned into automated tests; SpecFlow and Cucumber are but two frameworks that support this. The consequence is that scenarios generally live within the code base. With SpecFlow, for example, they are most easily accessed through Visual Studio, and to be sure you have the latest version you need access to the source code repository. Effectively, you need a dev setup! In any software project the code base is the only truth you can rely on, so it is right to house the scenarios there. But that creates a problem: you now need a development IDE to view them. You definitely don’t want everyone installing an IDE just to read feature files, but you also don’t want to be sending around copies of feature files, which could soon become out of date. Roll up Pickles!

Pickles is an open source living documentation generator that works on feature files written in the Gherkin language and supports frameworks such as SpecFlow and Cucumber. Pickles is simple to use: you tell it where to find the feature files, and you tell it where to put the results. The results can be output in a number of formats, including JSON, Word and the default option, HTML.

You can get Pickles via NuGet with the following command:

Install-Package Pickles

Pickles offers a number of ways to invoke the generation of the documentation: an NAnt task, an MSBuild task or even a PowerShell command. With these options it is very simple to add documentation generation to your continuous integration process. Let’s take the PowerShell example:

Import-Module 'C:\Pickle\Pickles.PowerShell.dll'

Pickle-Features -FeatureDirectory 'C:\MyFeatures' -OutputDirectory 'C:\MyOutput'

A simple two-liner. The Pickles NuGet package includes a PowerShell module, which you first need to import. The module contains a single cmdlet called Pickle-Features. -FeatureDirectory tells Pickles where to find your feature files; Pickles searches this directory and any subdirectories. -OutputDirectory tells Pickles where to write the results to.

So a typical scenario, like the one below written for SpecFlow,

Gherkin Style Scenario

can be viewed as HTML in a web browser by running Pickles over the feature file.

Pickle Output

We use TeamCity for our continuous integration. We have added a step to the build process that runs Pickles from PowerShell to generate the documentation files, and an extra tab on the build results screen so the documentation can be viewed for every check-in. In addition, we copy the files to a virtual directory so they can always be accessed directly in a browser, without needing to go to TeamCity. This means anyone in the office can view the feature files at any time from the same URL, and they can be sure they are always seeing the latest version.

Passing values between steps in SpecFlow

In a previous post I introduced a BDD testing framework called SpecFlow. SpecFlow enables scenarios written by a business analyst to be converted into unit tests.

The steps that make up a single scenario (i.e. the Givens, Whens and Thens) don’t all have to be implemented within the same step definition class file. SpecFlow also offers the ability to parametrise steps based on the values within the scenario.

For example:

Scenario: Click the login link
Given that I want to navigate to the "login page"

I can accept the "login page" as a parameter as follows:

[Given(@"that I want to navigate to the (.*)")]
public void GivenThatIWantToNavigateToAPage(string pageValue)

This is highly beneficial from a reusability point of view: steps can be reused, keeping the code base cleaner and more maintainable. It also tends to lead to step definitions being grouped into files that reflect the intent of the steps.
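Under the hood the match is just a regular expression: SpecFlow tests the step text against the binding’s pattern and passes each captured group to the method as an argument. Below is a minimal sketch of that matching, illustrative only and not SpecFlow’s actual matcher:

```csharp
using System;
using System.Text.RegularExpressions;

public static class StepMatcher
{
    // Match a step's text against a binding pattern and return the first
    // captured group, roughly what SpecFlow does before invoking the binding.
    public static string ExtractParameter(string stepText, string pattern)
    {
        var match = Regex.Match(stepText, pattern);
        return match.Success ? match.Groups[1].Value : null;
    }
}
```

Because the pattern is an ordinary regex, one binding can serve many concrete steps, which is exactly what makes the reuse described above possible.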

This leads to the issue of how to pass values from one step in a scenario, implemented in one step definition file, to another step of the same scenario in a separate step definition file.

Fortunately SpecFlow provides the ability to accomplish this easily using the ScenarioContext object. The ScenarioContext object has a Current property, which can be used to hold state by persisting the required data to a dictionary. To simplify use of the dictionary, generic Set and Get methods are provided.

For example, in the first step of a scenario I put an instance of a class into the Current dictionary:

ScenarioContext.Current.Set<MyClass>(myClass);

and in a later step it can be retrieved by simply referring to the type that is stored:

var myClass = ScenarioContext.Current.Get<MyClass>();

If you wish to store two or more objects of the same type, you can optionally provide an id, which is then used to retrieve the value:

To Set

ScenarioContext.Current.Set<MyClass>(myClass, "Class1");

To Get

var myClass = ScenarioContext.Current.Get<MyClass>("Class1");
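Conceptually, Set and Get behave like a dictionary keyed on the stored type, with the optional id appended to the key. The sketch below is purely illustrative, not SpecFlow’s actual implementation:

```csharp
using System;
using System.Collections.Generic;

// Illustrative sketch of a type-keyed context; SpecFlow's real
// ScenarioContext offers much more (step info, TryGetValue, etc.).
public class SimpleScenarioContext
{
    private readonly Dictionary<string, object> store =
        new Dictionary<string, object>();

    // Key on the full type name, plus the optional id.
    public void Set<T>(T value, string id = null)
    {
        store[typeof(T).FullName + "|" + (id ?? "")] = value;
    }

    public T Get<T>(string id = null)
    {
        return (T)store[typeof(T).FullName + "|" + (id ?? "")];
    }
}
```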
For further information, see the SpecFlow documentation on the ScenarioContext class.

Writing a Load Test Plugin to dynamically change the user count

Visual Studio 2010 Ultimate edition comes bundled with a load testing framework, which gives a developer or a QA the tools to create load tests. In its simplest form, a load test typically contains several unit tests, which are executed as part of the load test run. Each unit test within the load test can be independently configured to specify how frequently it executes, to best simulate the distribution of load on a server over a given duration. One of the key configuration settings of a load test is the virtual user count, which lets the tester see how many users the system can support before performance begins to degrade or the system even crashes.

A nice feature of the Microsoft load testing framework is the ability to extend the available functionality of a load test by writing your own load test plugin.

This is achieved by adding a class that implements the ILoadTestPlugin interface. Implementing this interface lets you write custom code to handle events raised while the load test runs.

As mentioned above, the user count lets you simulate the load on a system for a given number of users. One feature that does not come out of the box with the load testing framework is the ability to change the user count while a load test is executing.

One of the projects I recently worked on was to measure the performance of one of our existing applications. This was done by creating a suite of load tests and running them under different load configurations. A feature I wanted to provide was the ability to change the user count at any time during the execution of a load test. To accomplish this I needed a custom plugin that could interact with a user interface, allowing the test runner to change the current user count at any time.

The answer to this requirement was a custom load test plugin containing a self-hosted WCF service, which exposes a single operation that receives an updated user count. The user count is set from a WPF application that provides a slider control; every time the slider value changes, the application calls the service hosted within the plugin.

Below is the implementation of the custom plugin. It contains a simple WCF service called PluginService. Within the Initialize method of the plugin, the service is started and hosted on a separate thread. The service provides the interface that allows any consumer, in this case a WPF application, to update the userLoad static variable within the plugin.

    public class DynamicWcfRunSettingsPlugin : ILoadTestPlugin, IDisposable
    {
        private static Microsoft.VisualStudio.TestTools.LoadTesting.LoadTest myLoadTest;
        public static int userLoad;
        private bool serviceIsOnline = true;
        private Task task;

        [Description("Starting Current Load")]
        public static int UserLoad
        {
            get { return userLoad; }
            set { userLoad = value; }
        }

        public void Initialize(Microsoft.VisualStudio.TestTools.LoadTesting.LoadTest loadTest)
        {
            myLoadTest = loadTest;
            task = Task.Factory.StartNew(HostService);
            myLoadTest.Heartbeat += new EventHandler<HeartbeatEventArgs>(LoadTestHeartBeat);
        }

        private void HostService()
        {
            var baseAddress = new Uri("http://localhost/pluginService");
            using (var host = new ServiceHost(typeof(PluginService), baseAddress))
            {
                host.Open();

                // Keep the host alive until the plugin is disposed.
                while (serviceIsOnline)
                {
                    Thread.Sleep(1000);
                }
            }
        }

        private void LoadTestHeartBeat(object sender, HeartbeatEventArgs e)
        {
            if (!e.IsWarmupComplete)
            {
                return;
            }

            // Apply the last user count received by the service.
            myLoadTest.Scenarios[0].CurrentLoad = userLoad;
        }

        public void Dispose()
        {
            serviceIsOnline = false;
        }
    }

    [ServiceContract]
    public interface IPluginService
    {
        [OperationContract]
        void SetVirtualUserLoad(int virtualUsers);
    }

    public class PluginService : IPluginService
    {
        public void SetVirtualUserLoad(int virtualUsers)
        {
            DynamicWcfRunSettingsPlugin.UserLoad = virtualUsers;
        }
    }
The final piece of the puzzle is the LoadTestHeartBeat event handler. The ILoadTestPlugin interface allows code within a load test plugin to run at different points during the execution of a load test. The Heartbeat event is raised every second by the running load test, which lets us change a configuration value and have it picked up almost immediately. It is here that we set the CurrentLoad property of the load test scenario to the last value received by our service. Job done!
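Stripped of the WCF and load-testing machinery, the mechanism boils down to a shared variable that the service operation writes and each heartbeat reads. Below is a simplified, self-contained model of that flow; all names here are illustrative stand-ins, not the real load-testing types:

```csharp
using System;

// Simplified model: the service writes a shared user count, and the
// heartbeat handler copies it onto the running scenario's current load.
public class LoadModel
{
    public static int UserLoad;   // written by the (modelled) service call
    public int CurrentLoad;       // stands in for LoadTestScenario.CurrentLoad

    // Modelled service operation, invoked when the WPF slider moves.
    public static void SetVirtualUserLoad(int virtualUsers)
    {
        UserLoad = virtualUsers;
    }

    // Modelled heartbeat handler, invoked once per second after warm-up.
    public void OnHeartbeat(bool isWarmupComplete)
    {
        if (!isWarmupComplete) return;
        CurrentLoad = UserLoad;
    }
}
```

Because the heartbeat fires every second, any value pushed by the slider is applied to the running test within at most a second.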