Kings Court Trust provides legal services for families after a death. Most of its clients come via referrals from business partners such as will-writers. Its web app, Insight, helps partners make referrals and track cases. By 2014, though, Insight was drawing a lot of negative feedback, and when I joined the company in early 2015 I was asked to help.
Below: Insight in early 2015
The broad goal of the project was to identify the problems with Insight, then design and launch an improved version that would raise satisfaction and engagement.
How might we help partners make referrals and track cases more easily through Insight?
Me – Sole product designer and project lead, working as part of an in-house cross-functional product team.
I travelled around the country visiting users at their places of work. Getting a feel for their working environment, and conducting user tests on their computers and devices, helped me gain a richer understanding of context, use cases and problems.
Above: I recorded all user tests with both screen recording software (for a clear view on mouse movements and clicks) and a camera (for hand gestures and contextual information).
Below: After each test I reviewed the videos and summarised the results in a spreadsheet, highlighting key issues for review.
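To show the kind of tallying that turns a pile of session notes into priorities, here is a minimal sketch. The issue labels below are invented examples, not findings from the real study:

```python
from collections import Counter

# Hypothetical issue log: one entry per problem observed across test sessions.
# These labels are made up for illustration only.
observations = [
    "unclear call to action", "confusing menu", "unclear call to action",
    "low contrast", "confusing menu", "unclear call to action",
]

# Count how often each issue appeared, most frequent first,
# to help decide which problems to fix first.
for issue, count in Counter(observations).most_common():
    print(f"{count}x {issue}")
```

In practice this lived in a spreadsheet rather than code, but the principle is the same: frequency across sessions is one useful signal for prioritising fixes.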
As well as the qualitative research, I sent out a survey – partly to uncover and prioritise user needs, and partly to benchmark satisfaction so I could later judge whether the project had succeeded.
As part of the discovery phase I profiled users based on evidence from surveys, user testing and user interviews. I created a set of personas - fictionalised, evidence-based summaries of users - to help keep the design process user-focused.
I also helped to identify users’ “red routes” – their top priorities when using Insight. These were based partly on the quantitative research (the user survey) and partly on the contextual research.
Analysis of navigation
One of users’ top complaints was about the app’s usability. Research helped build an understanding of their needs and priorities, and uncovered problems with the navigation.
Above: the existing navigation, with a confusing structure, unclear calls to action and a lack of hierarchy.
Above: the old (left) and new (right) designs for the main menu. I hoped the new version would:
- reflect users’ priorities
- create a clearer hierarchy
- clarify calls to action
- remove noise
Beyond the navigation, I revamped many aspects of the UI and created a design system. Some of the changes included:
- Clearer, shorter content that reflected users’ needs
- Clearer calls to action
- A clearer sense of progress, for example when using the document storage process
- Better contrast and readability
- Clearer indications of errors, and how to recover from them
- A new design system with changes to fonts, inputs, colours and layout
Further testing and iteration
I created an interactive prototype in Axure and ran usability tests on the new designs, asking users to perform key tasks while ‘thinking out loud’. The results were hugely positive, but also uncovered areas for improvement.
After a few rounds of iteration, the developers began the build – I worked with them until launch in mid-2015.
At the start of the project I had sent a survey to users, partly for benchmarking – to measure the existing level of satisfaction. A few months after the revamped system launched, I sent the same questions to the same users. The results (below) showed a significant increase in satisfaction.
Overall, there was a 32% increase in user satisfaction.
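As a rough sketch of how a before/after benchmark like this is calculated, here is the arithmetic with invented scores (these are not the real survey data, and the resulting figure is illustrative, not the 32% above):

```python
# Hypothetical satisfaction ratings on a 7-point scale.
before = [4, 3, 5, 4, 3, 4, 5, 3]  # same users, pre-redesign
after = [5, 5, 6, 5, 4, 6, 6, 5]   # same users, post-redesign

mean_before = sum(before) / len(before)
mean_after = sum(after) / len(after)

# Percentage change in mean satisfaction.
uplift = (mean_after - mean_before) / mean_before * 100
print(f"before: {mean_before:.2f}, after: {mean_after:.2f}, change: {uplift:+.0f}%")
```

Surveying the same users with the same questions, as described above, is what makes the before/after comparison meaningful.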
- I didn’t realise it at the time, but when I joined KCT I wasn’t mindful enough that everyone who’d worked on the existing system had done the best work possible with the time and resources available. It’s one of my career regrets.
- Contextual research is fun
- My colleague Clare Ridd at Farewill would be appalled to know I used a 7 point scale in my benchmarking survey