How to stay relevant and challenge assumptions through customised CRO process and flexible reporting: a Delhaize case study

Subject
Conversion Rate Optimisation (CRO) 

Time to read
10 minutes

Intended audience
Readers with some experience of CRO projects within a large organisation with multiple stakeholders (often pulling in different directions).

Adobe Experience Cloud Solutions
Adobe Target, Adobe Analytics + Workspace 


This is the first part of a 2-part blog discussing the importance of having a customised Conversion Rate Optimisation (CRO) process and the value of building flexibility into your test reporting. To demonstrate these points, we’ll consider our work supporting Delhaize.be as presented at the Digital First conference in Belgium.

If you’re already a CRO process-ninja…

1. Well done to you!!
2. You will soon be able to read the second part of the story which deals with building on these foundations with a good reporting set-up.

If that’s not you (yet), let’s dive right in and take a look at this together…


Part 1: The process

A CRO program’s key ingredients

As anyone involved in conversion rate optimisation within a large organisation will confirm, there are specific ingredients which cannot be skipped if you want your CRO program to succeed. If we were all to sit around a table and discuss it, I’m sure we’d have differing opinions on exactly what ALL the ingredients are, but I’m confident we could agree on this short-list:

1. People (talented, curious, open-minded)
2. Governance (detailed, efficient, transparent)
3. Process (simple, linear, iterative)

What I am not certain about is whether ‘process’ would receive as much love as it should.

It’s not the most exciting ingredient (that’s definitely ‘people’), nor the easiest (arguably that’s ‘governance’), but in our experience it’s the absolute top dog on that list. Without a good process in place, known and followed, the other two ingredients will struggle to positively influence success.

Default CRO Process

So, let’s talk about ‘process’. First of all, what is a ‘good’ process, and how can we turn up the dial and make it ‘great’? Well, let’s start with a basic, best-practice approach to how a conversion rate optimisation project should be run.

It starts with very clear phases and core tasks that must be passed through as you progress with your testing roadmap and each individual test. At Nobi this is our default starting point:

First, along the top of the flow, comes all the preparation work needed to actually identify good tests to run and ultimately decide what it is you’re going to test next.

Then, once there is stakeholder agreement you enter the 3 phases each individual test should pass through… you plan it, you build it, you learn something (hopefully 😉 ).

Next, for each wave of testing, you want to be in a position to recommend to the organisation what to do next. Based on your test’s outcome, should you hard-code the alternative experience, iterate with a new hypothesis or, possibly, abandon it and move on?

And this is all a never-ending cycle…
prep > test > act > prep > test > act > prep > test > act…
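To make that cycle concrete, here is a minimal sketch (illustrative only, not Delhaize’s actual tooling or Nobi’s internal system) of those phases modelled as a simple loop that always returns to preparation:

```python
from enum import Enum, auto


class Phase(Enum):
    """Phases a test passes through; the cycle always loops back to PREP."""
    PREP = auto()   # research, ideation, evaluation of what to test next
    PLAN = auto()   # hypothesis, design, success metrics
    BUILD = auto()  # implement and launch the test
    LEARN = auto()  # analyse the results
    ACT = auto()    # hard-code, iterate with a new hypothesis, or abandon


# The never-ending cycle: prep > test (plan/build/learn) > act > prep ...
NEXT_PHASE = {
    Phase.PREP: Phase.PLAN,
    Phase.PLAN: Phase.BUILD,
    Phase.BUILD: Phase.LEARN,
    Phase.LEARN: Phase.ACT,
    Phase.ACT: Phase.PREP,  # back to the start for the next wave
}


def advance(current: Phase) -> Phase:
    """Move a test (or wave) on to its next phase in the cycle."""
    return NEXT_PHASE[current]
```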

Great (fine-tuned) CRO process


“Behind every successful CRO program, stands a great process”*

Process can only live up to its top-dog position if it works for us, not against us, so figuring out what needs to be added, tweaked or dropped is vital. This fine-tuning is what takes a process from ‘good’ to ‘great’.

* This is a completely made up quote but that doesn’t mean it’s not true.


On a recent project, Nobi worked with Ahold Delhaize to evolve the initial default CRO process into something more specific to their needs, guided by some early teething issues, so that they now have a customised process that supports their successful CRO program.

Who are Ahold Delhaize?

  • Founded 1867
  • 788 Delhaize stores in Belgium
  • 14,000 employees
  • €5 billion revenue (2018)
  • Also operating in Greece, Romania, Serbia


But let’s not jump too far ahead… you need details!

The problem and our solutions

In a nutshell, the Delhaize CRO team were not always running the tests the wider organisation cared most about at that moment in time.

Ideation | Open the ideation funnel

The first part of the problem was that although we were collecting valid test ideas through standard ideation sessions, we were limited by how many people could actually attend those sessions and contribute their insights. Delhaize has teams in Belgium, Greece, Romania and Serbia, each working semi-independently, so not being able to easily include all of those teams was clearly a significant handicap. On top of that, even at the local level, it was impractical to expect timely representation from all the Belgian teams that work in one way or another on the Delhaize online stores.

Nobi’s first solution was to increase the scope and flexibility of our ideation funnel by adding a crowd-sourced approach that runs independently of our standard ideation workshops.
This involved an online, open-access form which allowed anyone within the Delhaize organisation to propose a test idea at any time, as long as they could provide 4 essential pieces of information:

If they can tell us this, we’ll then give the usual in-depth consideration to how valid the proposal is, and if it’s accepted, it gets added to our list of ideas just as if it had come up during a scheduled, structured workshop.

The obvious benefit for Delhaize is that their next test (and the next, and the next, and so on) is always being chosen from a complete and up-to-date pool of proposals.
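As a rough sketch of how such a crowd-sourced intake could work, the snippet below models a proposal that is only accepted into the idea pool once every essential field is filled in. The four actual fields aren’t listed in this post, so the field names used here are hypothetical placeholders, not Delhaize’s real form:

```python
from dataclasses import dataclass, fields
from typing import Optional


@dataclass
class TestIdea:
    """A crowd-sourced test proposal. Field names are hypothetical placeholders."""
    problem_observed: Optional[str] = None
    proposed_change: Optional[str] = None
    expected_impact: Optional[str] = None
    affected_page_or_flow: Optional[str] = None


def is_complete(idea: TestIdea) -> bool:
    """Only ideas with every essential field filled in move on to in-depth evaluation."""
    return all(getattr(idea, f.name) for f in fields(idea))


idea_pool: list[TestIdea] = []


def submit(idea: TestIdea) -> bool:
    """Accept a complete proposal into the pool; otherwise send it back to the proposer."""
    if is_complete(idea):
        idea_pool.append(idea)
        return True
    return False
```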

Evaluation | Score local importance

The second part of the problem we had to solve was that the default approach, with one ideation workshop every X months, did not allow us to stay focused on the most relevant ideas, because the gauge of what was or wasn’t relevant was always shifting. What was a super-top priority for the online store in January would probably cease to be a priority by February and become of very little interest by March.
So, the idea of a roadmap marked out for X months’ worth of testing, written in stone, just didn’t make sense for Delhaize.

Nobi’s second solution was to reconfigure the Evaluation task to ensure we were always up to date with priority levels and our tests were relevant for the local teams.

1. Evaluation moved from being a once-every-X-months task to being a per-wave task… constant re-evaluation.
2. Scoring of our ideas would include a new ‘local priority’ field, which local teams could adjust themselves to rank old and new proposals on a priority scale of 1–10.

January 15th:
What the CRO team hear: “We need this insight now and you must make it happen.”

February 5th:
What the CRO team hear: “It would be good to know.”

March 4th:
What the CRO team hear: “If you have nothing else… why not!?”

This continual re-assessment by stakeholders ensures that Delhaize’s next test is always chosen from the most relevant candidates.
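One way to picture that per-wave re-evaluation is sketched below: a stable base score from the evaluation task is combined with the stakeholder-adjustable ‘local priority’ (1–10), and the whole backlog is re-ranked before each wave. The weighting, field names and example ideas are illustrative assumptions, not the actual Delhaize scoring model:

```python
from dataclasses import dataclass


@dataclass
class ScoredIdea:
    name: str
    base_score: float    # e.g. effort/impact assessment from the evaluation task
    local_priority: int  # 1-10, adjustable by local teams at any time


def rank_for_next_wave(backlog: list[ScoredIdea]) -> list[ScoredIdea]:
    """Re-rank the full backlog before each wave; the simple additive weighting is illustrative."""
    return sorted(
        backlog,
        key=lambda idea: idea.base_score + idea.local_priority,
        reverse=True,
    )


# A priority that was a 10 in January may drop to a 3 by March,
# pushing that idea down the list for the next wave.
backlog = [
    ScoredIdea("Simplify checkout step", base_score=6.0, local_priority=3),
    ScoredIdea("New promo banner", base_score=4.5, local_priority=9),
]
next_up = rank_for_next_wave(backlog)
```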

Conclusion

We recommend improving your CRO process by moving away from limited, occasional ideation and one-off evaluation, and instead finding a way to allow anyone with an interest in improving the business’s key online goals to both suggest and score test ideas whenever, and however often, they need.


If you want more…

Nobi and Delhaize presented on this topic at the Digital First conference and you can grab a copy of the deck here:



If you’re interested in seeing how we used the Delhaize CRO process and the flexible reporting capabilities of Adobe Workspace to challenge a stakeholder assumption and deliver valuable insights, you will soon be able to read the second part of the story, which deals with building on these foundations with a good reporting set-up.

Contact us:

If you’d like to talk with us about how Nobi can help your organisation get started with CRO process or possibly move your own ‘good’ process to a ‘great’ one, please get in touch.
