By Jonathan George, Software Engineer IV
Using complex objects in BDD Scenarios with SpecFlow

SpecFlow has been succeeded by ReqnRoll, and the code in this post has been ported over to Corvus.Testing.ReqnRoll. It's available as the Corvus.Testing.ReqnRoll package, and also via two meta packages: Corvus.Testing.ReqnRoll.NUnit and Corvus.Testing.ReqnRoll.MSTest. The functionality is the same; it has simply been migrated to ReqnRoll.

If you use SpecFlow as regularly as we do at endjin, you're no doubt familiar with using tables to set up data. This works really well for simple objects, but as soon as you have any kind of complex object or hierarchy, it falls over. I encountered this problem whilst pairing with a colleague recently and we came up with a simple solution we really like.


Before getting into our solution, it's worth mentioning that various approaches have already been suggested online, such as "SpecFlow: table.CreateInstance<T> only loads a shallow model, table.CreateDeepInstance<T> to the rescue" – we liked the intent, but really didn't like the syntax.

It's also worth mentioning that when you start talking about ways of expressing complex objects in your specs, you may be told that you're cuking it wrong. The main objection to what we're doing here is that it likely means you're not writing specs in the language of your users - and of course, this very much depends on who your end users are. BDD specs have considerable value beyond just testing that which business users see, so it's perfectly valid to consider scenarios where the end user is actually a developer. As such, I'm not going to argue the case one way or the other here.

So, let's start with what we want.

  • The ability to express complex objects in our table data in specifications
  • A solution that doesn't make either the specifications or their data hard to read
  • A generic approach that's not tied to the structure of the specific objects – ideally a binding that we can apply when needed.

The second point is particularly important because keeping the scenarios readable is vital to the success of BDD specs, regardless of the audience. As a result, we decided that we should define objects independently and compose them together when needed. Let's look at how the approach we came up with works in practice:

@useChildObjects
Scenario: Demonstrating how this approach looks
  Given I have a User called 'Jon'
  And I have a User called 'Matthew'
  And I have the following expense claims
    | Date       | Amount | User      | Description                    |
    | 26/02/2020 | £25    | {Jon}     | Travel to Endjin team away day |
    | 26/02/2020 | £40    | {Matthew} | Travel to Endjin team away day |

Breaking this down:

  • The first line enables our new approach - more on how this is implemented in a moment.
  • The next two lines create two new User objects with specific names.
  • Then, in the table for the expense claims, we can use those named objects simply by referring to them by name, surrounded by curly braces - an ordinary step binding consumes the table, as sketched below.
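The step binding that consumes the table doesn't need to do anything special: Table.CreateSet<T> works as normal, with the registered value retriever resolving the curly-brace references. Here's a minimal sketch - the User and ExpenseClaim types are illustrative (they're not part of the library), and Amount is kept as a string to sidestep currency parsing:

using System;
using System.Collections.Generic;
using TechTalk.SpecFlow;
using TechTalk.SpecFlow.Assist;

// Illustrative domain types - yours will differ.
public class User
{
    public string Name { get; set; }
}

public class ExpenseClaim
{
    public DateTime Date { get; set; }
    public string Amount { get; set; } // a string here, to avoid currency parsing concerns
    public User User { get; set; }
    public string Description { get; set; }
}

[Binding]
public class ExpenseClaimSteps
{
    private readonly ScenarioContext scenarioContext;

    public ExpenseClaimSteps(ScenarioContext scenarioContext)
    {
        this.scenarioContext = scenarioContext;
    }

    [Given("I have the following expense claims")]
    public void GivenIHaveTheFollowingExpenseClaims(Table table)
    {
        // CreateSet runs each cell through the registered value retrievers,
        // so {Jon} and {Matthew} arrive as the User objects stored in the
        // ScenarioContext rather than as strings.
        IEnumerable<ExpenseClaim> claims = table.CreateSet<ExpenseClaim>();
        this.scenarioContext.Set(claims, "ExpenseClaims");
    }
}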

So, how does this work? Here's the implementation of the bindings for @useChildObjects (the full code can be found here):

[BeforeScenario("@useChildObjects", Order = 100)]
public static void SetupValueRetrievers(ScenarioContext scenarioContext)
{
    // Create a value retriever that can resolve {name} references from the
    // current scenario's context, register it with SpecFlow, and stash it
    // in the ScenarioContext so we can unregister it after the scenario.
    var instance = new ChildObjectValueRetriever(scenarioContext);
    scenarioContext.Set(instance, ChildObjectValueRetrieverKey);
    Service.Instance.ValueRetrievers.Register(instance);
}

[AfterScenario("@useChildObjects", Order = 100)]
public static void TearDownValueRetrievers(ScenarioContext scenarioContext)
{
    // Unregister the retriever so it doesn't leak into other scenarios.
    scenarioContext.RunAndStoreExceptions(() =>
        Service.Instance.ValueRetrievers.Unregister(scenarioContext.Get<ChildObjectValueRetriever>(ChildObjectValueRetrieverKey)));
}

As you can see, it's creating a specialised SpecFlow ValueRetriever. These are part of the mechanism SpecFlow uses to turn tabular data into objects when you use the Table.CreateInstance<T> or Table.CreateSet<T> methods. The interface is very simple, providing one method to determine whether a ValueRetriever can process a value and another to actually do the work.
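For reference, this is the shape of the IValueRetriever interface in SpecFlow's TechTalk.SpecFlow.Assist namespace at the time of writing:

using System;
using System.Collections.Generic;

namespace TechTalk.SpecFlow.Assist
{
    public interface IValueRetriever
    {
        // Can this retriever handle the given table cell for the target property type?
        bool CanRetrieve(KeyValuePair<string, string> keyValuePair, Type targetType, Type propertyType);

        // Convert the cell's string value into the object to assign.
        object Retrieve(KeyValuePair<string, string> keyValuePair, Type targetType, Type propertyType);
    }
}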


Our ChildObjectValueRetriever is very straightforward, looking for values that are surrounded by curly braces and attempting to look up the named values from the ScenarioContext. You can see the code here.
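The linked code is the authoritative version, but the essence is something like this sketch:

using System;
using System.Collections.Generic;
using TechTalk.SpecFlow;
using TechTalk.SpecFlow.Assist;

public class ChildObjectValueRetriever : IValueRetriever
{
    private readonly ScenarioContext scenarioContext;

    public ChildObjectValueRetriever(ScenarioContext scenarioContext)
    {
        this.scenarioContext = scenarioContext;
    }

    public bool CanRetrieve(KeyValuePair<string, string> keyValuePair, Type targetType, Type propertyType)
    {
        // Only handle cell values wrapped in curly braces, e.g. {Jon}.
        string value = keyValuePair.Value;
        return value != null && value.StartsWith("{") && value.EndsWith("}");
    }

    public object Retrieve(KeyValuePair<string, string> keyValuePair, Type targetType, Type propertyType)
    {
        // Strip the braces and look the named object up in the
        // ScenarioContext, where an earlier step stored it.
        string name = keyValuePair.Value.Substring(1, keyValuePair.Value.Length - 2);
        return this.scenarioContext[name];
    }
}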

That just leaves the question - how do the objects get into the ScenarioContext in the first place? Well, that's up to you. There are many different ways you might want to create your test data objects, and little value in imposing constraints on how that works for the sake of what we're trying to achieve here. The key thing is to give the objects names and store them in the current ScenarioContext using that name. In the example above, the binding for I have a User called 'name' could always generate a user with the same property values, or it could use a tool like DataGenerator to create new objects - it doesn't matter.
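For example, a minimal binding using the illustrative User type from the earlier sketch might look like this (the step wording and fixed property values are just one option):

using TechTalk.SpecFlow;

[Binding]
public class UserSteps
{
    private readonly ScenarioContext scenarioContext;

    public UserSteps(ScenarioContext scenarioContext)
    {
        this.scenarioContext = scenarioContext;
    }

    [Given("I have a User called '(.*)'")]
    public void GivenIHaveAUserCalled(string name)
    {
        // How the object is created doesn't matter - fixed values,
        // DataGenerator, or anything else. The key is storing it in the
        // ScenarioContext under the name used in the spec.
        var user = new User { Name = name };
        this.scenarioContext.Set(user, name);
    }
}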

If you want to use this approach in your own projects, it's part of the endjin-sponsored Corvus.NET project which consists of many useful tools we've built over the past few years. You can get the package from NuGet, and see the code on GitHub. It's all written in .NET Standard 2.0 so you can use it regardless of whether you're on .NET Core or .NET Framework.

Once you've installed the package, make sure you reference it in your SpecFlow configuration, as described here. If you're using XML config it looks like this:

<specFlow>
  <stepAssemblies>
    <stepAssembly assembly="Corvus.SpecFlow.Extensions" />
  </stepAssemblies>
</specFlow>

and if you're using JSON config, like this:

{
  "stepAssemblies": [
    { "assembly": "Corvus.SpecFlow.Extensions" }
  ]
}

Jonathan George, Software Engineer IV
Jon is an experienced project lead and architect who has spent nearly 20 years delivering industry-leading solutions for clients across multiple industries including oil and gas, retail, financial services and healthcare. At endjin, he helps clients take advantage of the huge opportunities presented by cloud technologies to better understand and grow their businesses.