Simulate or Don't Simulate: That is the question

Chris Lees 9th Feb 2018
5 min read

Certification Best Practice

One topic that always seems to crop up when talking to firms about automated certification is simulation: is it a necessary requirement that an automated onboarding tool be able to simulate responses to inbound messages?

There seem to be quite polarised views on this, so in this update, I thought I would try to summarise views for and against.

The Case For Simulation

There are two typical lines of argument in favour of simulation.

The first is that without it, an element of manual intervention is still required to test edge cases. For example, if you have a test covering trade cancellation, then unless you have a simulator to send the "bust" message, someone on your team must send it manually, eroding the benefits of automation.
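To make that argument concrete, here is a minimal sketch of the kind of canned response a simulator might emit for this scenario: an ExecutionReport carrying ExecType=H ("Trade Cancel") so the test can proceed without a human on the other side. The tag numbers follow FIX 4.4; the helper function is illustrative, not any particular vendor's API.

```python
# Minimal sketch of a simulator canning a "bust" response.
# Tag numbers follow FIX 4.4; the builder below is illustrative only.
SOH = "\x01"  # FIX field delimiter

def build_bust(order_id: str, exec_id: str) -> str:
    """Build a bare-bones ExecutionReport (35=8) with ExecType=H (Trade Cancel)."""
    fields = [
        ("35", "8"),       # MsgType = ExecutionReport
        ("37", order_id),  # OrderID
        ("17", exec_id),   # ExecID
        ("150", "H"),      # ExecType = H, i.e. Trade Cancel (a "bust")
    ]
    return SOH.join(f"{tag}={value}" for tag, value in fields) + SOH

msg = build_bust("ORD-1", "EXEC-9")
print(msg.replace(SOH, "|"))  # 35=8|37=ORD-1|17=EXEC-9|150=H|
```

A real simulator would, of course, also need session-level fields (sequence numbers, checksums, timestamps) kept consistent with the counterparty's spec, which is exactly the maintenance burden discussed below.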

A second argument is that simulation gives certainty of response. For example, what if my test assumes the customer should receive a fill, but they entered their order into a test environment with no resting orders to fill against? Or what if the test environment has too many orders, or strangely priced ones? And what about dark pools, where users may not know the correct price to enter to generate a trade? Simulation, they argue, is the only answer.

The Case Against Simulation

The major drawback of simulators is the need for somebody to code the responses they give.

I won't name names, but there was a well-known product in our little niche of onboarding that has left quite a damaging legacy. Many people tell the same story of trying to implement this tool and then ripping it out, because the effort involved in re-scripting each time the spec updated exceeded the effort of doing the onboarding manually.

Pro tip: If you have had a bad experience with automated certification scripting in the past, then I'd encourage you to take another look because the next generation tools have removed this problem.

Even where you have the time and knowledge in-house to do the scripting, we find that the simulator always lags development and becomes another delay to your launch plan. An inaccurate or unavailable simulator slowly erodes trust in its responses, and with it the value of the whole tool.

In a world where most trading venues and banks have dedicated test environments (something now mandated under MiFID II for a variety of business models), you have to question the need for simulators anyway. Why maintain a third environment when you could simply run certification from test? The core objective of automation is to automate as many of the manual tasks as possible and to make them consistent, even with junior resources. Simulation is one potential tool for achieving part of that, but it isn't part of the requirement itself.

The other major problem we see with simulation is how it impacts the end product. When the customer interacts with a simulator, this enforces quite a rigid, structured certification model. We've all seen the testing tools which clients need to step through in order; the simulation largely enforces this, or at least frames that way of thinking. It also requires the customer to use a UI, making it a "self-service or nothing" approach.

By contrast, old-school, manual onboarding is far more fluid: a log grep will find a message wherever it appears in a file, which frees the customer from rigid structure, lets them test in whatever order they like, and allows a single message to satisfy multiple scenarios.
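The fluid matching described above can be sketched as a scenario matcher that scans a whole log and lets any message satisfy any scenario, in any order. The log format, tag values (FIX 4.4), and scenario predicates here are illustrative assumptions, not a description of any particular tool:

```python
# Sketch of order-agnostic scenario matching over a FIX message log.
# Each scenario is a predicate over a parsed message; one message may
# satisfy several scenarios, and message order does not matter.
SOH = "\x01"  # FIX field delimiter

def parse(raw: str) -> dict:
    """Parse a SOH-delimited FIX message into a tag -> value dict."""
    return dict(f.split("=", 1) for f in raw.strip(SOH).split(SOH) if "=" in f)

# Hypothetical certification scenarios (tag 35 = MsgType, 150 = ExecType).
SCENARIOS = {
    "new_order_ack": lambda m: m.get("35") == "8" and m.get("150") == "0",
    "any_exec_report": lambda m: m.get("35") == "8",
}

def match_log(lines):
    """Return the set of scenario names satisfied anywhere in the log."""
    satisfied = set()
    for raw in lines:
        msg = parse(raw)
        for name, predicate in SCENARIOS.items():
            if predicate(msg):
                satisfied.add(name)
    return satisfied

log = ["35=8\x01150=0\x0139=0\x01"]  # one ExecutionReport acknowledging a new order
print(sorted(match_log(log)))  # ['any_exec_report', 'new_order_ack']
```

Note that the single message satisfies both scenarios, which is precisely the behaviour a rigid step-through workflow forbids.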

We believe that this fluid testing is better, more efficient, and more natural for both testers and onboarders. The goal of automated tools should, therefore, be to support and sustain this fluid approach, while allowing manual tasks to be reduced as far as possible.

What Is FixSpec's Axe In All This?

As a leading vendor of onboarding tools, we obviously have an axe to grind, so here it is…

When we started out, we deliberately based certification off the spec itself, as opposed to scripting. This approach is paying real dividends, as today our products are able to effortlessly and consistently handle multi-protocol testing without onboarders ever writing a single script.

Like most vendors, we've dabbled with simulation too. We felt the pain of trying to keep up with APIs as they changed (and being the last ones to know about them!). It was a bad outcome for us, a bad outcome for our clients and their end customers, and it was not scalable across specs or protocols. So we retired the product and haven’t looked back.

Today, our onboarding tools focus on automating the process of running certifications as opposed to changing the process by introducing simulation. It’s about giving our customers (and their customers) tools to allow them to quickly spot and correct problems and to eliminate manual log grepping and scripting completely.

If you want to give your customers a UI to look at, then we can offer that, but if you just want tools to help your internal team onboard faster and in parallel, then we can do that too. We are keeping our focus on the tasks that take the majority of your time instead of building complex, brittle technology to satisfy the tasks that take the minority of your time.

We’d love to hear your views on the key requirements of automated onboarding tools, and the most painful parts of the process for your firm. Get in touch!

Find This Useful?

To receive more tips for improving efficiency in YOUR connectivity process, sign up to our FREE monthly email newsletter.
