At Comic Relief, we have 5 core values that we aim to embody in everything we do: Bold, Creative, Fun, Trustworthy and Engaging. These values can be seen front-and-centre in our campaign activity, but we also embrace them in how we optimize our digital product offering, to ensure that, as well as hitting our campaign targets, we can continue to innovate.

On 9th November, Hiral Patel (Comic Relief’s Analytics Manager) and I attended Opticon London, the annual conference hosted by Optimizely. It was a great event, with speakers from multiple industries addressing the unpredictable wins optimization can bring. The day was a good mix of talks and Q&A sessions, which let us consider how we might shape our long-term strategy as well as some immediate best-practice learnings we could implement on our return to the office. Below are a couple of those learnings, along with some other observations from the day.

A/B testing shouldn’t always be iterative

It may grate on the nerves of the scientific mind to say it, but Rohit Gupta of Secret Escapes’ talk “Don’t be a scientist” was a helpful reminder that the purpose of testing is to optimize a product to achieve its goal, not necessarily to understand why. Is it better to get a 5% increase in conversion and know it was because of a specific button CTA change, or a 10% increase where the entire page was different? The scientist in us all wants to know what changed user behaviour, but that doesn’t actually matter – what matters is that the new version worked (or didn’t) – you can always find out the ‘why’ later!

[Slide: “Don’t be a scientist” (Dexter’s Lab) – Rohit Gupta]

Use Minimum Viable Product principles when you test

Stephen from conversion.com gave a fantastic talk about the effort and resource that should go into testing – and the importance of testing broadly, so that you don’t waste resource pursuing a ‘favourite’ hypothesis.

[Slide: Scaling testing – Stephen, conversion.com]

His example of the Guardian’s ‘Save for later’ button was one that really stuck in my mind: rather than building out for desktop the functionality that already existed on mobile, the Guardian’s desktop product team simply added the button without the backend functionality to back it up. This meant the test could be executed quickly and cheaply, providing insight into whether desktop users would actually use the feature. I’m now building out some similar tests to run on our Comic Relief websites – results to follow soon.
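The mechanics of a ghost-functionality (or ‘fake door’) test like the Guardian’s are simple enough to sketch in a few lines. This Python sketch is purely illustrative – the function names, experiment name and in-memory click counter are my own invention, not the Guardian’s or Comic Relief’s actual implementation – but it shows the core idea: deterministically bucket users into variants, show the non-functional button to one bucket, and record clicks as a measure of demand before any backend is built.

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "save-for-later-ghost") -> str:
    """Deterministically bucket a user into control or the ghost-button variant.

    Hashing (experiment + user id) means the same user always sees the same
    variant, without storing any assignment state.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "ghost-button" if int(digest, 16) % 2 == 0 else "control"

# Illustrative in-memory counter; a real test would log to an analytics tool.
clicks = {"ghost-button": 0}

def record_ghost_click(user_id: str) -> None:
    """The button does nothing yet - the click itself is the data point."""
    if assign_variant(user_id) == "ghost-button":
        clicks["ghost-button"] += 1
```

The click-through rate on the ghost button, compared against overall traffic in that variant, tells you whether the feature is worth building for real.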

[Slide: Ghost functionality testing – Stephen, conversion.com]

Every business has constraints on testing

“Experiment Everywhere” was the theme of the day, but it was clear that all the attendees found limits to their ability to test. The three most common were:

  • Resource – whilst tools like Optimizely are making it easier for us to test without relying on development time, any test will still require time to implement and, naturally, nobody has unlimited resource. Prioritizing resource for testing can sometimes feel like a ‘nice to have’ (otherwise known as bottom-of-the-backlog) – but companies who use testing in early proof of concept work can save immeasurable amounts of time and money as testing will invalidate hypotheses as well as validate them.
  • Company culture – the speaker from Orange Europe found that experimentation and analytics weren’t part of the company’s regular way of working. Even though there was an organisation-wide training programme, only 5% of staff were applying the training in their day-to-day work.

    [Slide: Experimentation as culture – Orange Europe]

    The way Orange repositioned experimentation was by allowing anyone in the company to propose tests; initially they needn’t bring much more than a barebones concept, but once teams saw the benefits and insights that testing produced, more and more of the company got involved and used their analytics knowledge to propose further tests. Now 50% of those trained at Orange Europe are using analytics tools in their day-to-day work (how’s that for an ROI stat?).

  • Traffic levels – this is a big one for us at Comic Relief (but Sarah Stellini, Head of Growth Optimization at Betsson, also feels our pain!). Whilst we experience huge spikes in traffic around campaign time (Red Nose Day or Sport Relief), gathering statistically significant results outside campaign time can be an issue. We also struggle because different audiences typically use our sites at different times of year; results from a test run from September to December would not necessarily be relevant to our Night of TV traffic, for example.
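To put rough numbers on that traffic constraint: a standard two-proportion sample-size estimate shows how quickly the required traffic grows as the effect you want to detect shrinks. The baseline conversion rate and lift in this sketch are hypothetical examples, not Comic Relief figures, and the z-values correspond to the conventional 95% confidence / 80% power setup.

```python
import math

def sample_size_per_variant(baseline: float, relative_lift: float,
                            z_alpha: float = 1.96, z_beta: float = 0.8416) -> int:
    """Approximate visitors needed per variant for a two-proportion z-test.

    Defaults assume a two-sided test at 95% confidence with 80% power.
    """
    p1 = baseline
    p2 = baseline * (1 + relative_lift)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2)

# Hypothetical example: a 3% conversion rate, hoping to detect a 10% relative lift.
small_lift = sample_size_per_variant(0.03, 0.10)
# The same baseline with a 50% relative lift needs far fewer visitors.
big_lift = sample_size_per_variant(0.03, 0.50)
```

With a low baseline and a modest lift, the required sample runs into the tens of thousands of visitors per variant – exactly why off-campaign traffic levels constrain what we can test.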

So what’s next for experimentation at Comic Relief?

First up, we’re going to be doing more of it! With Red Nose Day 2017 sneaking up on us at speed, we’re about to hit the best traffic levels for us to test. We’re also going to embody our company value of being bold with our tests and encourage the company as a whole to make use of the tools at our fingertips. And lastly, we’re going to talk about the results more – not just the tests that worked; acknowledging and being open about when our assumptions are proven wrong is potentially more important than celebrating when they were right.

To finish, I’d like to thank Optimizely for hosting us and the other attendees for sharing insights into their own challenges and successes with testing. All the speakers and further information can be found on the Opticon London website.
