Showing progress through a journey to increase research programme sign-ups

Time to read: 5 minutes

Problem  

  • 10 percentage point drop-off on the longest page of the registration experience
  • Existing design elements did not accurately set expectations or align with the design system
  • Quantitative evidence that participant understanding of the research programme could be improved

Discovery 

Analytics 

  • Analytics indicated an average conversion rate of 99% across all but one page of the sign-up experience
  • Typical dwell time on that page was 3–4 minutes, far less than the time needed to read the information

The team created a detailed user journey map, which showed:

  • Each screen of the journey 
  • Conversion rate of each screen across several months
  • Existing user insights where available

This was a valuable resource in quickly identifying areas of opportunity.

Usability interviews

I contributed to the discussion guide, took notes for all 12 interviews, helped plan the structure of the collaborative analysis workshop, and contributed design insight throughout.

Focusing on the participant information sheet content, insights showed that: 

  • Users would not read the information comprehensively, as it was too long
  • Users would leave the process as their expectations were not met 
  • Users did not know what was coming next in the process
  • Users were frustrated by the length of the information they were required to read

User need

As someone who has decided to join the research programme, I need clear expectations of what the sign-up journey involves, so I can set aside enough time to do it and I don’t have any surprises.  

Hypothesis 

We believe that an increase in the number of users who sign up for the programme can be achieved if potential participants have clear expectations of what the sign-up process involves. This can be achieved by better visualising and providing feedback on progress throughout the joining journey. 

Design and ideation

Competitor analysis

This activity allowed me to analyse the strengths and weaknesses of other progress indicators and understand user expectations.

It also highlighted the unique challenge of a journey with one exceptionally long page, and how expectations can be set around the time required to complete it.

This led to two “How might we” questions:

  1. How might we improve wayfinding and orientation for users joining the programme?
  2. How might we better show progress through the five pages of participant information, setting expectations about the commitment required to read it and consent?

Early thinking of different solutions to these problems was presented at a design critique with the wider design team. This helped to refine the solution in advance of the first round of usability interviews.  

Questions raised included: 

  • Should a newly loaded page show as “read” in the progress indicator?
  • What is the order of components for screen reader users?
  • Will it take up too much room and increase page length?

Usability testing

What was tested in round one

  • A single progress bar that increased in proportion to page length after each page was read
  • Progress increased with an animation, to delight users and create a sense of accomplishment
  • The page number was shown to set expectations
  • An estimated reading time was included to set expectations

User insights from round one

  • Some appreciated expectations being set with the estimated reading time. 
  • Some were distracted by time to read and were not comfortable with being told how long they should take to read the page.  
  • There was a mixed understanding of whether time to read is for the current page or all five pages of information.  
  • Participants understood the progress bar, but did not comment on it until prompted. 

Iterations made based on user insights

I made the following changes to the design based on these insights: 

  • Replacing the estimated reading time with progress bars whose lengths related to each page
  • Adding “Your progress is saved” to reassure users
  • Showing the current page as completed progress to better motivate users to continue
  • Adding a “Participant information” title at the top of the page to better orient users

User insights from round two

  • Much less frustration was expressed throughout the participant information journey
  • Interviewees were much more likely to understand where they were in the participant information, even though they did not always refer to the progress bar 

User testing outcome

The design that succeeded in round two would be AB tested on the production site. I created high-fidelity, development-ready files in Figma and collaborated with an engineer to ensure the final build met the spec of the design file.

Experimentation

Stakeholder engagement

Given that the participant information sheet is owned by the Our Future Health Ethics team, I provided them with visibility into the design proposals, the success of the usability testing, and the hypotheses for AB testing.  

Accessibility  

When navigating with a screen reader, the percentage of progress is announced, allowing screen reader users to take advantage of the feature. The feature was tested as part of a regular accessibility audit and no concerns were raised.
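For illustration, announcing a percentage to screen readers is typically done with the ARIA progressbar role. The sketch below is an assumption about how such markup might look, not the production component; the class names and label are made up.

```typescript
// Hypothetical sketch of an accessible progress bar (not the production code).
// role="progressbar" plus aria-valuenow/valuemin/valuemax is what lets a
// screen reader announce the percentage of progress.
function renderProgressBar(percent: number): string {
  // Clamp and round so the announced value is always a whole number 0-100.
  const value = Math.max(0, Math.min(100, Math.round(percent)));
  return (
    `<div role="progressbar" aria-valuenow="${value}" ` +
    `aria-valuemin="0" aria-valuemax="100" ` +
    `aria-label="Participant information progress">` +
    `<div class="progress-fill" style="width:${value}%"></div>` +
    `</div>`
  );
}
```

Because the value lives in `aria-valuenow`, assistive technology announces it without any extra scripting, which also keeps the visual fill and the announced value in sync.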

AB testing

An AB test with the new progress bar achieved a statistically significant 2.27 percentage point conversion increase to page 3. Page dwell time also increased by several minutes.

The progress bar variant also achieved a 3.01 percentage point overall increase in consent rate. This is equivalent to 7,500+ more consented participants per month, based on historical consent rates. 
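A lift like this is usually declared statistically significant with a two-proportion z-test. The sketch below shows that calculation with made-up counts chosen only to mirror a 3.01 percentage point lift; the real traffic and consent numbers are not part of this write-up.

```typescript
// Two-proportion z-test: is the difference between two conversion rates
// statistically significant? All counts here are illustrative, not real data.
function twoProportionZ(x1: number, n1: number, x2: number, n2: number): number {
  const p1 = x1 / n1; // control conversion rate
  const p2 = x2 / n2; // variant conversion rate
  const pooled = (x1 + x2) / (n1 + n2); // pooled rate under the null hypothesis
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2));
  return (p2 - p1) / se;
}

// Hypothetical arms: 60.00% control consent vs 63.01% variant
// (a 3.01 percentage point lift) with 10,000 users per arm.
const z = twoProportionZ(6000, 10000, 6301, 10000);
// |z| > 1.96 corresponds to p < 0.05 (two-tailed).
```

At these (assumed) sample sizes the z-score comfortably clears the 1.96 threshold; with smaller traffic the same lift could fail to reach significance, which is why the test depends on the counts and not just the percentage difference.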

Outcome and impact 

The progress bar variant was deployed to 100% of traffic and significantly increased conversion rates, in line with the AB test results.