{squeaky lean}

The 'Trigger Objective' – The secret of good strategy


('Trigger' is a UK cultural reference: http://en.wikipedia.org/wiki/Trigger_(Only_Fools_and_Horses))

Top Rank Objectives can be daunting. So can the strategic planning process required to decide how to go about achieving them.

You know the kind of objective I mean: 'Increase revenue', 'Increase audience'.

They're a bit like: 'Discover meaning of life', 'Deliver world peace'.

Where on earth do you start?!

Just to give you a chance of getting your head around them, you need to break them down into lower level objectives, strategies and tactics.

However, if you read through the conflicting definitions you get from googling 'objectives, strategies, tactics', you'll get a headache.

What we are all actually looking for is a way of improving our strategic planning process. A good one will deliver us from daunting Top Rank Objectives, safely and easily, to what I call 'Trigger Objectives', which aren't daunting at all.

A Trigger Objective is a much lower level objective, set in the reality of the here and now. It relates closely to the knowledge and experience of you and your team, so you can easily list a set of actions or experiments that you think might achieve it.

But the real beauty of a Trigger Objective is that when you achieve it, it sets off a chain of events that sends you on your way to that daunting Top Rank Objective, without you having to spend a moment thinking about it.

Here's how you find Trigger Objectives:

1. Write down your Top Rank Objective (TRO)

For most profit-making organisations, this is either 'Increase profit', or 'Increase revenue', depending on which stage of development your company is at.

Not exactly life-affirming, I know.

But life-affirming is what your Vision, Values and Mission statements are for. There's no need to be fancy about your TRO; just be honest and write down what it is.

Now. Get everyone you need to contribute to your TRO together. Or at least a representative from each relevant department. They need to be a part of the debate that comes next.

2. Decide your 2nd rank objectives and pick one

We're still talking high level here. Looking at your P&L might help, as these could be related to your main revenue lines. In my world, if I was doing this for one of our publishing websites like NME.com or GoodtoKnow.co.uk, our 2nd rank objectives might be:

  • Increase display ad revenues
  • Increase sponsorship revenues
  • Increase affiliate revenues

If you stopped here, and asked a team to suggest actions or experiments that would achieve one of these, you would get either blank faces or a cacophony of ideas with no focus. And no idea which ones to start with.

So keep moving. You need to focus on one of the objectives and dig into it further.

Weight them in terms of current value (perhaps based on overall planned revenue contribution) and opportunity for new value (based on which ones you feel have more room for growth).

In my example we'll pick 'Increase display ad revenues'.
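The weighting step above can be sketched as a simple score. The objective names come from the example; the numbers, and the equal weighting of current value against growth opportunity, are illustrative assumptions rather than real figures.

```python
# A minimal sketch of weighting 2nd rank objectives. Scores are on a
# 0-10 scale and are made up for illustration; in practice they would
# come from planned revenue contribution and the team's growth estimates.

def score(objective):
    # Equal weighting of current value and growth opportunity; tune to taste.
    return 0.5 * objective["current_value"] + 0.5 * objective["growth_opportunity"]

objectives = [
    {"name": "Increase display ad revenues",  "current_value": 8, "growth_opportunity": 7},
    {"name": "Increase sponsorship revenues", "current_value": 5, "growth_opportunity": 6},
    {"name": "Increase affiliate revenues",   "current_value": 3, "growth_opportunity": 5},
]

# Pick the objective with the highest combined score to dig into.
focus = max(objectives, key=score)
print(focus["name"])  # → Increase display ad revenues
```

The 50/50 split is just a starting point; a team betting on growth over defending existing revenue would weight opportunity more heavily.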

3. Decide your 3rd rank objectives and pick one

Thinking through the levers that affect 'Increase display ad revenues', I can list these 3rd rank objectives:

  • Increase display ad inventory (providing more ad space to sell)
  • Increase display ad yield (a higher value per ad sold)

Again, we're still too high, so I'll pick one to dig into further.

Let's say ad yield is holding strong, but we're short of ad inventory. If this were true, then choosing would be easy: we'd dig into 'Increase display ad inventory'.

4. Decide your 4th rank objectives and pick one

There is really only one metric that affects the amount of inventory we have on a website: page impressions. Some people might include ads per page, but we certainly don't go about adding more ad slots to pages just to increase inventory. So our 4th rank objectives should be levers that affect page impressions:

  • Increase unique visitors
  • Increase visits per unique visitor
  • Increase pages per visit

Right. At about this level, lively debate breaks out.

Suddenly there are people in the room exclaiming that our SEO is woeful so we should work on that to increase Unique Visitors. Someone else is complaining that retention is low and that a membership club of some kind would do wonders to increase our Visits Per Unique Visitor.

This is how you know you're close to Trigger Objectives. Pennies start dropping and people start to jump a step, because we're at a low enough level for people to engage their own knowledge and experience.

Data is still your closest ally though. Use your metrics to settle the debate and pick one objective to focus on. In our example, the data is pointing to Unique Visitors as our search rankings have dropped sharply in the past few months.

5. Decide your Trigger Objective

What can be done to 'Increase Unique Visitors'?

  • Increase offline marketing activity
  • Increase online marketing activity
  • Increase visitors from organic search
  • Increase social referrals
  • Increase month on month visitor retention

From the discussion in step 4, SEO was raised and search metrics discussed. It's what helped us decide to focus on Unique Visitors in the first place, so let's just pick it:

  • Increase visitors from organic search


Done. You have discovered your Trigger Objective. At Level 5.  

Interestingly, I'd say most Trigger Objectives are found at this level. It's perhaps why Eric Ries's 'Five Whys' technique works so well.
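The five-level drill-down can be sketched as a walk down a tree of objectives, choosing one child per level. The tree below mirrors the worked example; the single-option chooser is a stand-in for the team's data-informed debate at each level.

```python
# A sketch of the drill-down: the objective hierarchy as a nested dict,
# walked one level at a time until we hit a leaf, i.e. the Trigger Objective.
# Only the chosen branch is shown here; a real tree would keep siblings too.

tree = {
    "Increase revenue": {
        "Increase display ad revenues": {
            "Increase display ad inventory": {
                "Increase unique visitors": {
                    "Increase visitors from organic search": {},  # leaf
                },
            },
        },
    },
}

def find_trigger(node, choose):
    """Descend the tree, letting `choose` pick one child objective per level."""
    path = []
    while node:
        name = choose(list(node.keys()))
        path.append(name)
        node = node[name]
    return path

# With a single-option branch the choice is forced; in practice `choose`
# would weigh metrics and the team's knowledge at every level.
path = find_trigger(tree, choose=lambda options: options[0])
print(" -> ".join(path))
```

Keeping the unchosen siblings in the tree is what later makes pivoting cheap: you move back up a level and walk down a different branch.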

No stopping now though. Time to move onto Tactics.

Your tactics are the actions you're going to take to hit your Trigger Objective. As it's a Trigger Objective, you won't have any problem coming up with a nice, big, healthy list. The team has helped steer you to this objective, so they should be brimming with ideas.

List them and prioritise them.

I can reveal now that at IPC, we recently went through these exact steps on one website and ended up with this exact Trigger Objective. We also chose 'Improve page load times' as our priority tactic. And at the time of writing, the entire team, from Publishing Director to Front End Developer, is completely focused on page load times.

'Even the Publishing Director!?' Yes. He was in the room as we travelled from Increase Revenues, the objective he is ultimately responsible for, all the way to Improve page load times. He's been a part of the collective reasoning that linked one to the other. He is buying into the assumption that faster page load times = increased revenues.

And here is the path that links them, in all its glory:

Increase revenue → Increase display ad revenues → Increase display ad inventory → Increase unique visitors → Increase visitors from organic search → Improve page load times

Looking at this path, you can see there's no need for a long-winded strategy document. It contains your Top Rank Objective, your Strategy, your Trigger Objective and your Priority Tactic. Displaying it in a format like this makes it immediately clear what you are focusing on, which makes it more valuable.

Also, if you decide to pivot from this strategy, you can see that you don't need to dismiss the entire path. You might instead move up one level and down to a different Trigger Objective. Or you might go up two or three levels, before choosing a different path. You have a mechanism for the team to use to make these decisions without having to start from scratch.

It should be a living model, changing as you discover new levers that affect the objectives in it, or prove existing levers redundant.

For now though: find your Trigger Objective, decide the Priority Tactic, and go build, launch, measure, learn, build, launch, measure, learn… and so on.


By Kevin Heery.


Hit objectives quicker through Validated Learning. Here's how.

Validated Learning is another key concept in Eric Ries's Lean Startup book. He talks about how startups can learn from what they've already launched to influence what they build next.

For bigger companies, installing a validated learning approach is more complicated.

The problem is that a lot of them see Agile like this:

"Yeah, Agile! We do that. We hand the dev team a prioritised list of things to build, and they tell us how much they can get done in 4 weeks.

We draw straws. Whoever picks the short one, goes to see the developers once a day to discuss progress. They do this at a white board with cards and coloured pens. And no-one's allowed to sit down in case they can't get up again.

At the end of the 4 weeks the developers will have built and launched all our stuff. We hand them a new list and they start again. I think we pay them in cake."

Sounds good right?

Actually, there are two things wrong with it:

  1. It assumes that whoever hands the dev team the prioritised list is the rare kind of genius who knows exactly what should be done and in what order. That they need no feedback at all to help them grow your company faster than mere mortals could possibly achieve; and
  2. It assumes that the products are always perfect first time, every time. That you can just launch them and forget them, confident that the world will be enthralled by their sheer magnificence. And without a single tweak required.

Thing is:

  1. They're not that rare kind of genius; and
  2. Products are never right first time.

Assuming we're agreed on those two points, we should try to change the way we do things to accommodate them.

So along with our developers, editorial team, sales people and anyone else with a say in what the dev team do, we need to always:

  1. Improve our knowledge of which products and features will delight our audience; and
  2. Check that the products and features we launch are having the desired effect on our audience

The good news is that just by doing number 2, we get number 1 for free!

Validating launched products and features

I'm assuming that your dev team is using an Agile approach to development. If not, at least go and sort that bit out first. I don't care which flavour of Agile they use, just that the basic concepts are being followed.

Done? Good. Your dev team should be proudly standing by a whiteboard that looks like this:

Good for your dev team. Now get everyone else to join in.

Everyone needs to feel responsible for whether or not the products and features being delivered are the right ones to deliver. And everyone needs to feel responsible for whether or not they are performing as well as they could be.

To achieve this, all you need to do is:

  1. Make sure, as a team, you've agreed your objective along with an immediate KPI to improve. Read 'Objective setting for dummies' for more on this
  2. Write the KPI, the current score and a target score on the top of the whiteboard
  3. Add a column to the end of the workflow board entitled 'Validating'

User stories (Agile speak for a feature that needs building) normally finish their journey across an Agile workflow board once they're released to the live environment and no bugs show up. They are considered DONE.

But not when you add a Validating column.

The story would instead move from DONE to VALIDATING, meaning that the team are in the process of measuring and monitoring the story to decide whether it's working in business terms, i.e. whether it is positively affecting the chosen KPI.
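In code terms, the change is just one extra stop on the story's journey across the board. A minimal sketch, with illustrative column names (flavours of Agile name them differently):

```python
# The board's columns, left to right. 'Validating' is the addition:
# a story is no longer finished when it reaches 'Done'.
BOARD_COLUMNS = ["Backlog", "In Progress", "Done", "Validating"]

def advance(story):
    """Move a story one column to the right, if there is one."""
    i = BOARD_COLUMNS.index(story["column"])
    if i < len(BOARD_COLUMNS) - 1:
        story["column"] = BOARD_COLUMNS[i + 1]
    return story

story = {"title": "Improve page load times", "column": "Done"}
advance(story)
print(story["column"])  # → Validating
```

The point of the extra column is that leaving it requires a business judgement against the KPI, not just a bug-free release.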

I recently launched a pilot programme to try this approach at the company I work for.

How WE do it

Every two weeks, about 10 people who work on a brand website (publishing, editorial, development, and in some cases even the Managing Director) stand at the board and discuss the stories in the Validating column.

By this time, the product manager will have ensured there is results data against the chosen KPI, and they will have written the relevant numbers under the story card.

As a story's merits are debated, there are three possible outcomes for it, each of which normally triggers an action:

    1. It is proven to be a success and gets to live out its days proudly in the 'Good Idea' envelope at the bottom of the board

      • New similar stories might be generated because of its success.
      • Some further A/B testing might be agreed, to see if its results can be improved even further
      • Related stories in the backlog might be prioritised higher because there is more confidence that they will improve the KPI
    2. It is deemed a failure and is banished to the 'Bad Idea' envelope at the bottom of the board

      • The product or feature might be rolled back so it can't do any more damage
      • Related cards in the backlog might be removed, ripped up and binned
      • We may learn that it was our approach that was flawed, rather than the idea, and a new story is generated to do it the way it should have been done first time round
    3. It's deemed too early to tell if it is a good or a bad idea

      • We might agree to wait another two weeks to debate again
      • We decide whether or not related stories should be put on hold until this one is validated

As you can see from the pic below, we weren't so harsh as to actually call the envelopes 'Good Idea' / 'Bad Idea', but you get the idea.
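The three outcomes and their typical follow-up actions can be sketched as a simple decision. The sample-size check, KPI threshold and action lists below are illustrative assumptions, not our actual rules:

```python
# A sketch of the fortnightly validation debate. `kpi_change` is the
# measured movement in the chosen KPI since launch; `min_sample` is a
# made-up guard against judging a story on too little data.

def validate(story, kpi_change, min_sample):
    if story["observations"] < min_sample:
        return "too early", ["wait another two weeks",
                             "decide whether to pause related stories"]
    if kpi_change > 0:
        return "good idea", ["generate similar stories",
                             "consider further A/B testing",
                             "prioritise related backlog stories higher"]
    return "bad idea", ["consider rolling the feature back",
                        "remove related backlog stories",
                        "maybe retry with a better approach"]

verdict, actions = validate({"observations": 20}, kpi_change=3.5, min_sample=10)
print(verdict)  # → good idea
```

In reality the verdict comes from debate rather than a threshold, but encoding it like this makes plain that every story ends with an explicit decision and a follow-up action.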

Once this debate is finished, we go into a meeting room and build a backlog for the next two weeks with all of this data and debate fresh in our minds.

This is Validated Learning.

We've been doing it this way for 3 months and the sense of pace towards a focused goal has increased significantly.

Also, because everyone is motivated by the KPI, there is a natural collective push for Minimum Viable Product. The team now crave fast feedback through metrics. They know they can get it quicker by first validating a less sophisticated version of the planned end product, so they do.

Overall, they are more engaged because they always have a common goal and are regularly measuring themselves against it. As a team.

And suddenly, the trusty workflow board has developed from:

  • a tool used to help a product manager and a dev team work through their backlog of work,
  • to a tool used by the wider business to pinpoint the tactics required to hit the business' objectives.

And it's the same £100 whiteboard it was before. Pretty good value if you ask me.

So, in summary:

  1. Pick a KPI
  2. Add a VALIDATING column
  3. Debate the value of things in the Validating column regularly
  4. Take action based on the outcome of these debates

And that's Validated Learning.



By Kevin Heery.