{squeaky lean}

Hit objectives quicker through Validated Learning. Here's how.

Validated Learning is another key concept in Eric Ries's Lean Startup book. He talks about how startups can learn from what they've already launched, to influence what they build next.

For bigger companies, installing a validated learning approach is more complicated.

The problem is that a lot of them see Agile like this:

"Yeah, Agile! We do that. We hand the dev team a prioritised list of things to build, and they tell us how much they can get done in 4 weeks.

We draw straws. Whoever picks the short one, goes to see the developers once a day to discuss progress. They do this at a white board with cards and coloured pens. And no-one's allowed to sit down in case they can't get up again.

At the end of the 4 weeks the developers will have built and launched all our stuff. We hand them a new list and they start again. I think we pay them in cake."

Sounds good right?

Actually, there are two things wrong with it:

  1. It assumes that whoever hands the dev team the prioritised list is the rare kind of genius that knows exactly what should be done and in what order. That they need no feedback at all to help them grow your company faster than mere mortals could possibly achieve; and
  2. It assumes that the products are always perfect first time, every time. That you can just launch them and forget them, confident that the world will be enthralled by their sheer magnificence. And without a single tweak required.

Thing is:

  1. They're not that rare kind of genius; and
  2. Products are never right first time

Assuming we're agreed on those two points, we should try to change the way we do things to accommodate them.

So along with our developers, editorial team, sales people and anyone else with a say in what the dev team do, we need to always:

  1. Improve our knowledge of which products and features will delight our audience; and
  2. Check that the products and features we launch are having the desired effect on our audience

The good news is that just by doing number 2, we get number 1 for free!

Validating launched products and features

I'm assuming that your dev team is using an Agile approach to development. If not, at least go and sort that bit out first. I don't care which flavour of Agile they use, just that the basic concepts are being followed.

Done? Good. Your dev team should be proudly standing by a whiteboard that looks like this:

Good for your dev team. Now get everyone else to join in.

Everyone needs to feel responsible for whether or not the products and features being delivered are the right products and features to deliver. And everyone needs to feel responsible for whether or not they are performing as well as they could be.

To achieve this, all you need to do is:

  1. Make sure, as a team, you've agreed your objective along with an immediate KPI to improve. Read objective setting for dummies for more on this
  2. Write the KPI, the current score and a target score on the top of the whiteboard
  3. Add a column to the end of the workflow board entitled 'Validating'

User stories (Agile speak for a feature that needs building) normally finish their journey across an Agile workflow board once they're released to the live environment and no bugs show up. They are considered DONE.

But not when you add a Validating column.

The story would instead move from DONE to VALIDATING, meaning that the team are in the process of measuring and monitoring the story to decide whether it's working in business terms, i.e. whether it is positively affecting the chosen KPI.
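To make the idea concrete, here's a minimal sketch (in Python, with hypothetical names throughout) of a story's extended lifecycle, with VALIDATING added after DONE:

```python
from enum import Enum

class State(Enum):
    # The usual workflow states, plus the extra Validating stage
    BACKLOG = "backlog"
    IN_PROGRESS = "in progress"
    DONE = "done"              # released to live, no bugs showing
    VALIDATING = "validating"  # being measured against the chosen KPI

class Story:
    """A card on the board. kpi_results stands in for the numbers the
    product manager writes under the card."""
    def __init__(self, title):
        self.title = title
        self.state = State.BACKLOG
        self.kpi_results = []

    def start_validating(self):
        # Instead of stopping at DONE, the story moves on to VALIDATING
        if self.state is State.DONE:
            self.state = State.VALIDATING

story = Story("Categorised news alerts")
story.state = State.DONE   # released, no bugs showed up
story.start_validating()
print(story.state)         # State.VALIDATING
```

This is only a model of the whiteboard, of course; the point is that "done" is no longer the final state.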

I recently launched a pilot programme to try this approach at the company I work for.

How WE do it

Every two weeks, about 10 people who work on a brand website (publishing, editorial, development, and in some cases even the Managing Director) stand at the board and discuss the stories in the Validating column.

By this time, the product manager will have ensured there is results data against the chosen KPI, and they will have written the relevant numbers under the story card.

As a story's merits are debated, there are three possible outcomes for it, each of which normally triggers an action:

    1. It is proven to be a success and gets to live out its days proudly in the 'Good Idea' envelope at the bottom of the board
      Actions:

      • New similar stories might be generated because of its success.
      • Some further A/B testing might be agreed, to see if its results can be improved even further
      • Related stories in the backlog might be prioritised higher because there is more confidence that they will improve the KPI
    2. It is deemed a failure and is vanquished to the 'Bad Idea' envelope at the bottom of the board
      Actions:

      • The product or feature might be rolled back so it can't do any more damage
      • Related cards in the backlog might be removed, ripped up and binned
      • We may learn that it was our approach that was flawed, rather than the idea, and a new story is generated to do it the way it should have been done first time round
    3. It's deemed too early to tell if it is a good or a bad idea
      Actions:

      • We might agree to wait another two weeks to debate again
      • We decide whether or not related stories should be put on hold until this one is validated
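The three outcomes and their typical actions can be sketched as a tiny decision function (Python; the names and thresholds are hypothetical, for illustration only):

```python
def debate_outcome(kpi_delta, enough_data):
    """Decide a validating story's fate at the fortnightly debate.

    kpi_delta:   measured change in the chosen KPI since release
    enough_data: whether the team trust the numbers yet
    """
    if not enough_data:
        # Too early to tell: wait another two weeks, maybe hold related stories
        return "still validating"
    if kpi_delta > 0:
        # Success: spawn similar stories, consider A/B tests,
        # prioritise related backlog items higher
        return "good idea"
    # Failure: maybe roll back, bin related cards, or retry a better approach
    return "bad idea"

print(debate_outcome(kpi_delta=2.5, enough_data=True))    # good idea
print(debate_outcome(kpi_delta=-0.8, enough_data=True))   # bad idea
print(debate_outcome(kpi_delta=0.1, enough_data=False))   # still validating
```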
As you can see from the pic below, we weren't so harsh as to actually call the envelopes "Good Idea" / "Bad Idea", but you get the idea.

Once this debate is finished, we go into a meeting room and build a backlog for the next two weeks with all of this data and debate fresh in our minds.

This is Validated Learning.

We've been doing it this way for 3 months and the sense of pace towards a focused goal has increased significantly.

Also, because everyone is motivated by the KPI, there is a natural collective push for Minimum Viable Product. The team now crave fast feedback through metrics. They know they can get it quicker by first validating a less sophisticated version of the planned end product, so they do.

Overall, they are more engaged because they always have a common goal and are regularly measuring themselves against it. As a team.

And suddenly, the trusty workflow board has developed from a tool used to help a product manager and a dev team work through their backlog of work, into a tool used by the wider business to pinpoint the tactics required to hit the business's objectives.

And it's the same £100 whiteboard it was before. Pretty good value if you ask me.

So, in summary:

  1. Pick a KPI
  2. Add a VALIDATING column
  3. Debate the value of things in the Validating column regularly
  4. Take action based on the outcome of these debates

And that's Validated Learning.

By Kevin Heery.

MVP Jenga – What's a Minimum Viable Product and How do I Get one?

Minimum Viable Product (MVP). What's one of them? How do I identify one for my next launch?

Well, let's imagine you run a news website.

You've measured your customer journey KPIs and identified your next objective:

"Improve first to second visit conversion rate"

You arrived at this objective after seeing good audience acquisition numbers, but disappointing return visits. You dug a bit further and decided it was the new users, rather than returning users, that were the problem.

Nice objective setting. I'm impressed.

You went further. You got your team together, ran a brainstorm and everyone is now very excited about one big idea:

Categorised News Alerts

  • Users sign up via Facebook, Twitter or an internal registration system, which will be built by your own team
  • Users will select from 20 different categories of news
  • And will be automatically alerted via their chosen communication method, as soon as a relevant news story is published

So, you are now about to go and tell your team to make it so. They will carry out this instruction smoothly and efficiently, and in no time at all your return visit numbers will be legendary.

OR

It will take your team five times longer than expected, your new visitors will completely ignore this feature and will never return. And it will have been an enormous waste of time.

Sorry. I even annoyed myself slightly there. BUT, let's face it, it's not like the latter situation is rare. It happens all the time, and it's often nobody's fault.

As hard as people try to make things happen EXACTLY as planned, unforeseen circumstances and unpredictable audience behaviour can make a mockery of your plan and your wonderful idea.

How do you account for this then?

Let me tell you about MVP Jenga.

What's that you ask?

Well, if I'm honest, I just made it up. But stay with me here.

MVP Jenga

...is a game you play with your product team. If you've read 5 lessons a lean startup must relearn, then you'll know that this product team consists of everyone who might contribute to you achieving your objective. Here's how you play:

Setup:

Make sure the team know the background of the product idea. Talk through:

– The objective and how you arrived at it
– Your product vision

Beginning the game:

Ask people to suggest simpler versions of the product. A successful suggestion must meet three criteria:

  1. It must be simpler and easier to develop
  2. An audience would recognise it as an improvement to the current available product
  3. It must be a valid test of whether or not the product vision will actually help achieve your objective

The first person who has a suggestion that passes these three criteria gets 1 point and wins round 1.

Repeat:

Keep going through as many rounds as you can. In round 2 a successful suggestion earns 2 points, in round 3 it's 4 points, in round 4 it's 8 points, and the scores double like that until no-one can improve on the previous suggestion.

When that happens, the game ends and the team have unearthed a Minimum Viable Product.
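The doubling scores are easy to sketch, and they have a handy property: since 1 + 2 + ... + 2^(n-2) = 2^(n-1) - 1, the final successful round is always worth more than all the earlier rounds combined, so whoever makes the last improvement wins outright:

```python
def round_points(n):
    # Round 1 earns 1 point, and the prize doubles each round: 1, 2, 4, 8, ...
    return 2 ** (n - 1)

print([round_points(n) for n in range(1, 5)])     # [1, 2, 4, 8]
print(sum(round_points(n) for n in range(1, 4)))  # 7: less than round 4's 8 points
```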

The person with the most points is given a 20% pay rise while the person with the least points is escorted from the building.

That last part is optional.

Right, let's imagine your news website team playing MVP Jenga with the product idea: Categorised News Alerts.

Round 1:

Tracy, digital marketer, is very excited that there is some kind of game being played at work. She is ready to dive straight in.

"Maybe we can tell if people like the idea of news alerts without them having to sign up to our own system. All my friends use Facebook these days", says Tracy, "And I bet there are hundreds more people on there I don't even know! Maybe that's enough. Maybe it's enough without Twittle even?"

The team let out a collective sigh, but after debating her idea against the three criteria, Bang! Point in the bag for Tracy.

Tracy screams and then taps away on her phone to tell everyone she knows of the monumental event that's just occurred.

Scores: Tracy 1. Everyone else 0.

Round 2:

After just 20 seconds of silence, Phil, the new PHP developer in the team, stops scratching himself and says "As far as I can tell, the complicated bit of this is the automation of the alerts. What if the editorial team just alert people manually when they publish a news story, by logging into Facebook Admin?"

Phil is immediately hit by ice-cold stares from Janet, the editor, and Steve, the staff writer.

Janet thinks fast and says "Phil seems simply to be moving the work from himself to Steve. Steve would be the one who has to do all the extra work. I mean, I wouldn't have the time to help out with that."

Steve's eyes move quickly from Phil to Janet, and then slowly to the floor. "What the..?!" thinks Steve.

The team debate the idea and eventually fall on the side of Phil, based on the expected increased speed to market of his idea.

Even though there are six people in the room, Steve feels very lonely.

Scores: Tracy 1, Phil 2.

Round 3:

Quinten, the publisher, has been wondering why we can't just build the original product vision. But all of a sudden he understands. He can see the benefit of having a product in the market much sooner, and at far less cost, giving him feedback on whether this whole news alert idea is actually any good in the first place.

He stands up and shouts "I'VE GOT ONE! Rather than doing this for 20 categories straight away, how about we just do it for sport stories. If that seemed to work we could roll it out for other categories later! And it will make Steve's job much easier until we decide to automate it after all!"

Steve looks up at Quinten thankfully. Quinten pats Steve on the head.

The team debate this idea and SCORE! Quinten comes from nowhere to take the lead with 4 points.

Round 4:

Eddie, the product manager, has been squirming a little, thinking 'I'm the product manager for pity's sake, everyone's wondering why I haven't got any points yet. Come on Eddie, think. THINK!…At least say something. ANYTHING!'

"I know", squeaks Eddie (he doesn't normally squeak, and a few people can't help but snigger), "To find out if new visitors might be interested in news alerts at all, why don't we just ask them when they turn up? And get their email address from them.

You know: 'GET SPORT NEWS ALERTS. STRAIGHT TO YOUR MAILBOX AS IT HAPPENS. SIGN UP NOW'.

Once they do it we can tell them the feature is coming soon, but not do any more development unless enough people show interest"

Silence. No Facebook, no Twitter, no registration system, just capturing an email address. No categories, not even any actual alerts! 'It's too simple' thinks Eddie, 'they think I'm an idiot. What have I DONE?!'

Phil, itchy PHP developer, breaks the silence. "Well it's certainly easier from my point of view. Check cookie, present message, take email address. I can just stick it in a text file. Couple of hours work I reckon"

Next, Tracy. "Still looks like extra value. If people are interested in alerts then it should be a nice surprise for them. They won't know we're planning more, so they won't be disappointed"

And finally, Steve. "And isn't it still a valid test too? If plenty of people sign up, then we know they're interested in alerts. If not, then we know they're not. We'll have a good idea of what percentage of new visitors are interested and we can decide whether or not to do any more development once those results are in."

'YESSSSS!!!', thinks Eddie. And everyone is agreed. His idea passes. 8 points.

Scores: Tracy 1, Phil 2, Quinten 4, Eddie 8.
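Phil's couple-of-hours estimate sounds about right. In outline (sketched here in Python rather than Phil's PHP, with hypothetical names throughout), Eddie's whole MVP is just:

```python
def handle_visit(cookies, submitted_email=None, store="signups.txt"):
    """Check cookie, present message, take an email address.

    cookies: a dict standing in for the visitor's cookies.
    Returns the message to show, or None for returning visitors.
    """
    if submitted_email:
        # "Stick it in a text file" -- the entire persistence layer
        with open(store, "a") as f:
            f.write(submitted_email + "\n")
        return "Thanks! Sport news alerts are coming soon."
    if cookies.get("returning"):
        return None  # the test targets new visitors only
    return ("GET SPORT NEWS ALERTS. STRAIGHT TO YOUR MAILBOX "
            "AS IT HAPPENS. SIGN UP NOW")

print(handle_visit({}))  # a new visitor sees the sign-up message
```

That's the appeal of the winning suggestion: the entire experiment fits in one small function, yet it still measures what matters.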

Round 5:

Nothing. Eddie wins this game of MVP Jenga, and this online news site team has identified a Minimum Viable Product – an MVP.

But, what happened to the vision? What about the actual news alerts?

Well, if the MVP shows there is an audience for news alerts, then the team might progress to sending some manual sport alerts out, to see if their 1st to 2nd visit conversion rate actually goes up.

If it does, they might progress to either manual alerts on more categories, or decide it's already successful enough to build in the automation – and save Steve's weekends.

Then they might introduce Facebook Connect, to see if that improves sign up conversion, and then Twitter to see if it improves even further.

And hey presto, they arrive at their original product vision, probably just as quickly, having had a product live in the market providing business value for months.

Even if the MVP is successful though, somewhere along their original path, the team will see data about their audience's behaviour that will surprise them and it will steer them onto a different path.

This is a good thing.

They will be driven by real user feedback rather than assumptions. They will find the product people WANT rather than the one the team ASSUME they want.

And their business will be more successful because of it.

Because of MVP Jenga.

Try it.

And let me know if it works!
