I have recently been working with Paul Hill on the first NHS Digital project to implement the Jobs To Be Done (JTBD) theory in Domain E.
JTBD is something of a hot topic in UX at the moment, even though there are few examples of how it has actually been applied to a project. That is why I wanted to share our honest experience with it.
We initially decided to use JTBD for four reasons:
To broaden our skill set
We were curious about JTBD
We had a gut feeling it might work
We felt we had a homogeneous user base
Jobs and user research
We spent 6 weeks gathering existing research and conducting our own ethnographic research through 4 participatory workshops around the country. Taking the view that our users were mostly homogeneous, we thought that empathy mapping across a generic user journey would allow us to understand the whats, whys and hows of the users’ jobs to be done. By looking at what they did and how they felt, we were able to construct some obvious jobs to be done that were focussed around their motivations and goals, not just what they said they wanted to do.
Difficulties with job stories
As this was our first time writing job stories (a technique promoted by Alan Klement), we spent quite a while writing them… and then rewriting them… and then rewriting them.
We weren’t sure whether we should write out a different job story for every potential situation when the motivation and outcome were the same.
A lot of the initial job stories we wrote had specific motivations. We weren’t necessarily solutionising, but we were reducing the design freedom that is one of the benefits of the JTBD approach.
We found it difficult at first to capture the real outcome of the job. We had to really ask ourselves why the users wanted to do something.
The job prioritisation exercise
One of the recommended approaches for JTBD is to use a prioritisation survey to uncover the underserved and overserved jobs in the market, so you can focus on developing solutions that are going to make a difference to users (suggested by Tony Ulwick in his Outcome-Driven Innovation (ODI) method). For every job story or job statement, you ask users how important that job is and how satisfied they are with current solutions, using a 10-point Likert scale. You send this survey out to a large number of users to try and get a quantitative angle on the problem.
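For anyone curious what scoring that survey might look like, Ulwick’s ODI method ranks jobs by an “opportunity” score: importance plus the gap between importance and satisfaction (floored at zero). Here is a minimal sketch in Python; the job statements and numbers are invented for illustration, not our real data:

```python
# Sketch of Ulwick-style opportunity scoring for a JTBD survey.
# Each job gets a mean importance and mean satisfaction score
# from respondents on a 1-10 scale; a job is "underserved" when
# importance outstrips satisfaction.

def opportunity(importance, satisfaction):
    """Opportunity score: unmet importance counts twice."""
    return importance + max(importance - satisfaction, 0)

# Mean survey scores per job (hypothetical data).
jobs = {
    "Find my nearest GP surgery": (9.1, 7.8),
    "Book an appointment out of hours": (8.4, 3.2),
    "See my test results online": (7.9, 4.5),
}

# Rank jobs from biggest to smallest opportunity.
ranked = sorted(
    ((job, opportunity(imp, sat)) for job, (imp, sat) in jobs.items()),
    key=lambda pair: pair[1],
    reverse=True,
)

for job, score in ranked:
    print(f"{score:5.1f}  {job}")
```

The point of the formula is that a very important job with poor current solutions floats to the top, while an important job that is already well served does not.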
Though we initially planned to conduct this survey, we eventually had to concede defeat. We found the survey was too long and therefore too expensive to conduct. Ultimately, however, we underestimated how long it would take to write and release, and we were pushed for time.

Instead of giving up, we managed to find a single example of someone using a card sort for the prioritisation exercise (I can’t find the link). Given the time constraints, we conducted the card sort with a small number of internal users. They read each statement and decided how satisfied they were with the current solution (not applicable, not at all, slightly or very). We then asked them to rank, in terms of importance, the jobs for which they were not at all satisfied or slightly satisfied with the current solutions.
Even with this approach we found we had too much data to handle in our short time frame. To tackle this we arbitrarily took the top 10 most important job stories from the “not at all satisfied” bucket and tallied them up. Those that appeared at least twice were then taken as our prioritised list of job stories.
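The tallying step we used could be sketched in a few lines of Python with a `Counter`. The participant names, job stories and thresholds below are invented to illustrate the process, not our real data:

```python
from collections import Counter

# Sketch of our ad-hoc tally: keep each participant's top-N most
# important "not at all satisfied" job stories, then treat any job
# picked by at least `min_votes` participants as prioritised.

def prioritise(rankings, top_n=10, min_votes=2):
    """rankings: participant -> job stories, most to least important."""
    tally = Counter()
    for ranked_jobs in rankings.values():
        tally.update(ranked_jobs[:top_n])
    return [job for job, votes in tally.most_common() if votes >= min_votes]

# Hypothetical card-sort output from three internal users.
rankings = {
    "participant_a": ["book appointment", "see results", "renew prescription"],
    "participant_b": ["see results", "book appointment", "update details"],
    "participant_c": ["renew prescription", "see results"],
}

print(prioritise(rankings))
```

In this made-up run, “update details” drops out because only one participant ranked it, which mirrors our appears-at-least-twice rule.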
Jobs and the design process
We kicked off the design process with an ideation workshop that used our job stories as the focal point. We included subject matter experts who had no prior knowledge of jobs to be done and indeed little experience of any design process. It turned out they had no problems understanding the jobs we’d written, and every idea they proposed was clearly linked to a job to be done.

We took their ideas and created an initial prototype. After every round of testing we made sure that any new features were linked to one of our uncovered jobs to be done.
Shifting the Job fidelity
Though this was not mentioned in any literature we found, we decided to make the job stories high level to begin with, as this enabled us to maximise our freedom within the design process. It also ensured we didn’t get swamped by an unmanageable number of ‘similar’ job stories, and we felt it would help us to design better solutions.

After multiple rounds of testing we made the job stories much more granular to improve clarity for our delivery partners. This shift in ‘job fidelity’ meant the job stories were more flexible and could evolve as we moved from uncertainty to a greater level of confidence.
What worked well
Job stories resonated with stakeholders
During the ideation phase we found stakeholders could easily put themselves in the user’s shoes — one of the proposed benefits of personas. They were able to explain how and why their solutions might allow the user to get their job done.
Job stories resonated with users
Users ‘got it’. Often when they saw our solutions they would describe the situations they’d been in when it would have been useful and emphasise why it would have been useful. They were describing their job stories in their own words.
Insight captured within the job story
Using job stories allowed us to remove most of the ambiguity that would exist in a typical user story, because of their focus on situations and motivations. To highlight why this context helps, I have taken one user story from the GDS website and created two possible job stories.
In the example user story there is no context of time or confidence, so this can’t be fed into the design process (unless you start adding lengthy acceptance criteria). Although the two users in this example want to get their details on the electoral register, you may have to design different solutions to meet their needs.
What didn’t work well
Limited resources on practically applying JTBD
We found various Medium articles and blog posts about why and when to use JTBD, but felt there was a lack of information on its practical application and any real case studies (one of the reasons I felt we should share our experience).
Conflicting approaches from ‘the experts’
The information that is available is often conflicting, with the likes of Tony Ulwick and Alan Klement having a public and long-running argument over who really understands JTBD. This meant we ended up applying the JTBD theory in our own way (also known as “winging it”).
Prioritisation exercises aren’t quick
Academically, a prioritisation exercise sounded like a great idea, but in reality we found it long and cumbersome, and participants found it hard to understand what we were asking. That said, we believe it may work better if you were to use job statements, which are considerably less wordy, e.g. “Transport me and my belongings via the ground”. We’re fully open to debate on this.
How do we know a solution satisfies a Job to be Done?
Until something goes live, how do we know it works or that it’s a success? We felt during concept validation that the jobs resonated with users, which made us confident that what we were proposing would meet the jobs to be done. But as with any design process, until something is live and can be measured, you’re never sure.
Too many jobs?
As job stories are very specific in nature, we ended up with a very large number of job stories to write, prioritise and design for. I believe this may have led to a larger workload than if we had used personas and user stories.

As I mentioned earlier, we struggled with some job stories being very similar. The situation is different, but is it really different? Maybe. What we do know is that it felt like we were working with a lot of duplicate jobs that users couldn’t differentiate between when we were prioritising.
“The statements are very repetitive and seem to ask the same thing multiple times. This makes it a little tedious and frustrating to fill out.”
What would we do differently?
Write clearer job stories from the start
Merge some job stories — if it seems similar, it probably is
Complete a job prioritisation survey with simpler language and more distinct job stories or job statements
Would we use JTBD again?
If it felt like the right project.