Keep learning from users

The Challenge

Make sure you're solving the right problem in the right way, so you deliver on your service promise and make the most of opportunities.

My Role

Researcher | Design Strategist | Facilitator

The Process

Introduce continuous feedback loops; test, measure and learn for an insight-driven and iterative design process.

A continuous feedback loop is a mechanism for receiving feedback, usually against agreed metrics or desired outcomes, so you can identify strengths and weaknesses and make changes to improve. Feedback loops work well when you want to deliver a high-impact experience but you're not yet sure which solution will deliver the desired impact.

Step 1: Define the bet; what do you want to learn and how will you measure success?

  • Each test-and-learn loop focuses on defined learning objectives for a given bet, with clear metrics or outcomes that indicate you have achieved, or are close to achieving, the desired impact on users.
  • Learning objectives early in the continuous cycle are likely to be broad; for example, testing whether the overall concept fits the problem. Once this is established you can gradually refine the learning objectives.
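One way to keep a bet honest is to write it down as a testable hypothesis with a named metric and a success threshold. The sketch below is illustrative only; the `Bet` structure, field names, and threshold value are assumptions, not part of any framework described in this case study.

```python
from dataclasses import dataclass


@dataclass
class Bet:
    """A hypothetical structure for a test-and-learn bet (illustrative only)."""
    hypothesis: str           # what we believe and want to learn
    metric: str               # the measure that indicates impact on users
    success_threshold: float  # value at or above which the bet is supported

    def evaluate(self, observed: float) -> str:
        """Compare the observed metric against the success threshold."""
        return "supported" if observed >= self.success_threshold else "not supported"


# An early, broad bet: does the overall concept fit the problem?
bet = Bet(
    hypothesis="Users who see the new onboarding flow complete sign-up more often",
    metric="sign-up completion rate",
    success_threshold=0.40,
)
print(bet.evaluate(0.47))  # observed 47% completion -> "supported"
```

Writing the threshold down before testing is the point: it turns the weekly review from a debate about opinions into a check against a pre-agreed measure.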

Step 2: Only test what you need to test - keep concepts to a minimum

Create tools that include only the features you want to learn about; anything that doesn't inform learning is waste, so build the minimum required for learning. Consider a range of Minimum Viable Products (MVPs) for different learning stages.

For example:

  • Low-fidelity features: this could be as simple as paper and post-its
  • Wizard of Oz: build a front-end interface and have a developer manually simulate the back-end functionality during facilitated testing
  • Fake doors: a link to a feature on the website that doesn't go anywhere; the user clicks the link and gets a message such as "We are exploring this option, thank you for indicating your interest."
  • Fully interactive prototypes.
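A fake door can be as little as a click handler that records interest and returns the holding message. This minimal Python sketch is an illustration under assumed names (`fake_door_click`, `interest_log`); a real implementation would sit behind a web route and persist to analytics.

```python
# Minimal fake-door sketch: count clicks on a not-yet-built feature
# and return the holding message shown to the user. Names are illustrative.
interest_log: dict[str, int] = {}


def fake_door_click(feature: str) -> str:
    """Record one click of interest in `feature` and return the message displayed."""
    interest_log[feature] = interest_log.get(feature, 0) + 1
    return "We are exploring this option, thank you for indicating your interest."


# Simulate three users clicking the fake link for a hypothetical feature
for _ in range(3):
    message = fake_door_click("saved-searches")

print(interest_log["saved-searches"])  # 3 clicks logged
```

The click count becomes the learning metric: enough recorded interest supports investing in the real feature; very little suggests the bet should be dropped before any build cost is incurred.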

Step 3: Stay disciplined - facilitate ongoing learning

Regular team practices and easy-to-understand metrics help to keep people engaged with learning:

  • Weekly co-ordination sessions focused on setting objectives and key results; celebrate successes, including early failures, linked to the learning objectives that drive good design and investment decisions
  • Weekly learning sessions to review and interpret findings; coach teams to elicit and exploit insights and steer roadmaps
  • Celebrate decisions and the progress they deliver; go, no-go or pivot decisions based on insights are solid progress and worth celebrating.
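A go, no-go or pivot decision can be framed as simple thresholds on the learning metric. The cut-off values and function name below are illustrative assumptions, not a prescribed rule from this process.

```python
def decide(observed: float, success: float, floor: float) -> str:
    """Map an observed metric to a roadmap decision (thresholds are illustrative).

    - at or above `success`: go (invest and build it out)
    - below `floor`: no-go (stop investing in this bet)
    - in between: pivot (the need looks real, but this solution isn't landing)
    """
    if observed >= success:
        return "go"
    if observed < floor:
        return "no-go"
    return "pivot"


# A 25% result sits between the 10% floor and the 40% success bar
print(decide(0.25, success=0.40, floor=0.10))  # "pivot"
```

Agreeing the thresholds before the test keeps the decision mechanical, which makes a no-go or pivot easier to celebrate as genuine progress rather than treat as failure.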

Outcomes from this phase include:

  • Bottom-up innovation driven by cross-functional and self-managing teams
  • Transparent, easy-to-understand team performance metrics
  • Baseline metrics that demonstrate user value - sometimes called North Star metrics
  • Strengthened learning capability, and the ability to apply it to design and development decisions
  • A service blueprint to support delivery at scale
