Project overview
Westpac launched their annual Businesses of Tomorrow (BoT) awards program in 2016, providing support, resources, mentoring and exposure to 200 leading Australian businesses.
An online hub was successfully planned, designed and launched to support and facilitate year 1 applicants (see year 1 design case study).
This case study focuses on the evaluative UX research and design optimisations carried out ahead of the site refresh for year 2 applications in 2017/18. The work was conducted over a 2 month period by myself and a technical data analyst.


Project objectives
The first year was a success, attracting over 100,000 unique visitors and 2,000 applications. However, the completion rate was lower than desired and the site had been launched without time for rigorous UX testing. On top of this, the bank was hoping to reach a more diverse range of businesses in year 2.
Year 2 objectives
 • Educate users about the program in a succinct, user friendly way
 • Increase conversion rates from site visitors to applicants
 • Reduce drop-off during the application process
 • Attract more entries from businesses operating outside the major capital cities and from a wider range of industries
 • Increase applications from businesses with women in leadership positions

Project planning
To help meet these objectives we collaboratively road-mapped our approach to the project with the client. This broke the project into 2 parts for my Experience Design team.
Part 1: UX research and analysis
  1. Data analysis - Analysing anonymised user activity from web analytics and platform reports
  2. Usability testing - Qualitative review of the original BoT website user experience
  3. Picturing our users - Honing our understanding of target audiences through desktop research
Part 2: Design optimisations
  4. IA, navigation and content revisions - Test and improve the site structure and way finding
  5. Interface changes - Design and implement improvements to the wireframes and UI


Data analysis
We discovered that during the 3 month entry period 48% of users abandoned their applications. There was also a strong likelihood an applicant wouldn’t return to the application form if they couldn’t complete it in their first session.
Uncovering the barriers to entry
By interviewing program facilitators and reviewing reports we found that:
 • Many applicants did not retain their login details to resume an application
 • The password recovery process was unclear, leading Westpac to be contacted directly for support
 • Winners were confused by having 2 logins (application and account credentials)

Analysis of the entry form itself uncovered the questions with the highest barrier to completion. This was then used to better articulate, simplify or remove questions.
Question completion rate for year 1 applications.
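For illustration, a rough sketch (in TypeScript, with hypothetical field names) of how per-question completion rates could be derived from anonymised form events; the actual figures came from the platform reports and our data analyst's work.

```typescript
// Sketch: per-question completion rates from anonymised form events.
// Field names (applicantId, questionId, answered) are hypothetical.
interface QuestionEvent {
  applicantId: string;
  questionId: string;
  answered: boolean;
}

function questionCompletionRates(events: QuestionEvent[]): Map<string, number> {
  const viewed = new Map<string, Set<string>>();   // questionId -> applicants who reached it
  const answered = new Map<string, Set<string>>(); // questionId -> applicants who answered it

  for (const e of events) {
    if (!viewed.has(e.questionId)) viewed.set(e.questionId, new Set());
    viewed.get(e.questionId)!.add(e.applicantId);
    if (e.answered) {
      if (!answered.has(e.questionId)) answered.set(e.questionId, new Set());
      answered.get(e.questionId)!.add(e.applicantId);
    }
  }

  const rates = new Map<string, number>();
  for (const [questionId, applicants] of viewed) {
    rates.set(questionId, (answered.get(questionId)?.size ?? 0) / applicants.size);
  }
  return rates; // the lowest-rated questions are candidates to reword, simplify or remove
}
```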

Device challenges
We also found from analytics that pages per session and session duration for mobile visitors were considerably lower than for desktop visitors. This suggested additional barriers to completion on certain devices (such as a lack of access to required files).
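As a simple illustration of that device segmentation, assuming an anonymised export of sessions with hypothetical field names:

```typescript
// Sketch: average engagement per device category from an anonymised sessions export.
// Field names (device, pages, durationSec) are hypothetical.
interface SessionRecord {
  device: "mobile" | "desktop" | "tablet";
  pages: number;
  durationSec: number;
}

function engagementByDevice(sessions: SessionRecord[]) {
  const totals = new Map<string, { count: number; pages: number; duration: number }>();
  for (const s of sessions) {
    const t = totals.get(s.device) ?? { count: 0, pages: 0, duration: 0 };
    t.count += 1;
    t.pages += s.pages;
    t.duration += s.durationSec;
    totals.set(s.device, t);
  }
  // Averages per device highlight where engagement drops off (e.g. mobile vs desktop).
  return [...totals].map(([device, t]) => ({
    device,
    avgPages: t.pages / t.count,
    avgDurationSec: t.duration / t.count,
  }));
}
```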


Usability testing

Recruitment
10 participants were recruited, all holding business leadership or strong influencer positions at Australian SMEs across a range of sectors. Sessions were split between desktop and mobile devices.

Test scenarios 
Scenarios were themed around the six hypotheses we wanted to test: program comprehension, information findability, business relevance, program benefits, entry criteria and ease of application.

Mobile user testing

Key findings
Task completion was high and the website was generally well received; however, there was some clear feedback to work on.

  1. The purpose of the awards program wasn’t clear
  2. The requirements/criteria were not surfaced early enough or were hard to find
  3. Users wanted to see previous winners for comparison
  4. The application form missed important context and status feedback
  5. Users expected an easy way to return to the application where they left off

"To be honest, i'm not really sure what it is just yet."

"Oh no, where am I now?"
 –  User testing participants

Desktop user testing
Demographic research

Secondary research
Desktop research from various industry reports helped to better understand the desired audience. With a focus on diversity in industry, gender and region, these insights would help guide design and content changes.

Sample desktop research findings on our target audience


Information architecture revisions

Content re-categorisation
No fundamental structural issues were uncovered during user testing; however, some clear structural improvements were required to help drive users through the funnel to apply.

"That’s useful information. i would expect this on the home page."
 –  User testing participant

Hybrid card sort
Participants were asked to arrange cards of content types from the year one website into pre-defined categories or create their own. They were then asked to arrange the content in each category in order of importance.
Hybrid card sort and synthesis of results.


Year 2 site map
The IA was then restructured to answer common questions at the right time and in the right order as users learn about the program.

The restructured and relabelled site map reflected analytics and user feedback to better anticipate user flow.

Navigation revisions
Navigating content - year 1
Discovery content was originally split between only 2 long pages (about / entry), requiring a secondary navigation to quickly jump between sections.
Problem: The navigation pattern was not familiar to some users (particularly on mobile) and at times caused confusion due to its fixed placement and inconsistencies.
In year 1, the in-page navigation pinned to the main navigation as a user scrolled.

Solution: The newly revised IA negated the need for in-page links on long pages. A simpler responsive link bar was introduced, providing contextually relevant pathways between pages.

Links would reflect common user paths to direct to relatable content.
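A minimal sketch of the idea behind the link bar, assuming a simple page-to-links mapping; the page paths and labels here are illustrative rather than the production configuration.

```typescript
// Sketch: contextual quick links per page. The mapping below is illustrative,
// not the production configuration.
const quickLinks: Record<string, { label: string; href: string }[]> = {
  "/": [
    { label: "Why apply", href: "/rewards" },
    { label: "Entry criteria", href: "/criteria" },
  ],
  "/rewards": [
    { label: "Entry criteria", href: "/criteria" },
    { label: "Meet previous winners", href: "/winners" },
  ],
  "/criteria": [
    { label: "Judging process", href: "/judging" },
    { label: "Start your application", href: "/apply" },
  ],
};

function linkBarFor(path: string) {
  // Fall back to the homepage pathways when a page has no explicit mapping.
  return quickLinks[path] ?? quickLinks["/"];
}
```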

Homepage revisions
Problem: The homepage didn’t align with user expectations at this point in their journey: users wanted to quickly understand the program’s offering and be funnelled through to additional information, not ‘fluffy’ storytelling or a hard sell to apply.

"There's not enough information before it asks me to apply."
 –  User testing participant

Multiple apply buttons were seen as premature, asking users to commit before they knew what the program was.

Solution: To meet the needs of our time-poor audience we dialled back the emotional messaging and surfaced important consideration content and links, making the process quicker and more efficient.

Revised homepage wireframes and visual design.

Encouraging applications
A dynamic alert bar helped users understand what the program is and when it runs, creating a sense of immediacy to start or complete their applications.
 Messages dynamically update to inform users as the application deadline gets closer.
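A minimal sketch of how deadline-aware messaging could work, assuming the deadline is available client-side; the copy and thresholds shown are illustrative.

```typescript
// Sketch: choose an alert-bar message based on time remaining until the deadline.
// Copy and thresholds are illustrative, not the production content.
function alertMessage(now: Date, deadline: Date): string {
  const msPerDay = 24 * 60 * 60 * 1000;
  const daysLeft = Math.floor((deadline.getTime() - now.getTime()) / msPerDay);

  if (daysLeft < 0) return "Applications for this year have now closed.";
  if (daysLeft === 0) return "Applications close today. Finish your entry now.";
  if (daysLeft <= 7) return `Only ${daysLeft} day${daysLeft === 1 ? "" : "s"} left to apply.`;
  return `Applications are open. Entries close ${deadline.toLocaleDateString("en-AU")}.`;
}
```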

Password-free registration
Problem: Forgotten passwords were a major barrier to entry for returning users and a pain point for the bank due to the number of reset enquiries.
Solution: We created a system that removes passwords entirely, requiring applicants only to remember the email address they registered with, which they use to send themselves a unique quick-access link. The aim was to reduce the number of duplicate or abandoned applications as well as support enquiries.
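In effect this is a 'magic link' flow: the applicant enters their registered email, the system issues a single-use token, and the emailed link reopens their saved application. A minimal sketch, with the token store, sendEmail helper and URL as illustrative assumptions:

```typescript
// Sketch: password-free "quick access" links. Token storage and email delivery are
// stubbed out; the names (AccessToken, sendEmail) and URL are hypothetical.
import { randomBytes } from "crypto";

interface AccessToken {
  email: string;
  expiresAt: Date;
}

const tokens = new Map<string, AccessToken>(); // in production this would be a database table

async function requestAccessLink(
  email: string,
  sendEmail: (to: string, body: string) => Promise<void>,
): Promise<void> {
  const token = randomBytes(32).toString("hex"); // unguessable, single-use token
  tokens.set(token, { email, expiresAt: new Date(Date.now() + 24 * 60 * 60 * 1000) });

  // The applicant only needs the email address they registered with.
  await sendEmail(email, `Resume your application: https://example.com/apply?token=${token}`);
}

function redeemToken(token: string): string | null {
  const record = tokens.get(token);
  if (!record || record.expiresAt < new Date()) return null; // unknown or expired link
  tokens.delete(token); // single use: a fresh link is requested next time
  return record.email;  // identifies which saved application to reopen
}
```

Expiring and deleting the token on use keeps the link reasonably safe to send over email while still letting applicants resume with nothing more than their inbox.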


Self-guided applications
Screens were adapted or created to focus on hand-holding the user through the process. Progress was now cached and autosaved to prevent any accidental loss of data.

Reassuring messaging and feedback such as status indicators, tool tips, errors and validation help guide the user.

Progress indicators reflected the approximate time and effort remaining rather than page numbers.
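A minimal sketch of the autosave and time-remaining ideas, assuming answers are held client-side and persisted after a short pause in typing; the save callback and per-question minute estimates are illustrative.

```typescript
// Sketch: debounced autosave of in-progress answers plus an effort-based progress estimate.
// The save callback and per-question minute estimates are illustrative.
type Answers = Record<string, string>;

function createAutosave(save: (answers: Answers) => Promise<void>, delayMs = 2000) {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (answers: Answers) => {
    if (timer) clearTimeout(timer);
    // Persist shortly after the user stops typing so nothing is lost between sessions.
    timer = setTimeout(() => void save(answers), delayMs);
  };
}

// Estimate remaining effort in minutes rather than reporting "page X of Y".
function minutesRemaining(answers: Answers, estimates: Record<string, number>): number {
  return Object.entries(estimates)
    .filter(([questionId]) => !answers[questionId])
    .reduce((total, [, minutes]) => total + minutes, 0);
}
```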

Removing criteria ambiguity
Website copy was revised to be clear and direct, removing any confusion around what BoT is. Imagery and video content was also updated to be more inclusive and highlight BoTs enjoying some of the program rewards.

Visual design of the application requirements and judging process pages

Showing what success looks like
A new section was added for year 2 showing the diverse breadth of industries that applied and the value the program brought to the chosen BoTs, told through compelling testimonials.

Potential applicants wanted to see information on previous winners to assess their own chances of winning.

Outcome
The website updates were successfully implemented in time for the 2017/18 application period. Application completion rates increased from 53% in year 1 to 66%. A dramatic reduction in support enquiries and an increase in business diversity were also recorded, meeting the project goals.

Streamlined user flow and navigation
Analysis of analytics, heat maps and screen recordings after the 2 month application period showed that the improvements had clearly aided the site experience and performance. Click heat maps showed the revised site navigation and quick links component were being well utilised.

Higher engagement
Users spent between 1:15 and 2:00 minutes per session, visiting on average more than 2 pages per session. The newly added ‘rewards’ and ‘judging’ pages had the greatest engagement.

Scroll heat maps and screen recordings further indicated that users were thoroughly consuming the information provided.

User research validation
This was a fantastic UX project to lead, with every wording and structural change based upon insight. The re-categorisation of content, clearer navigation pathways and more direct, digestible content have already made a big impact, highlighting the benefit of research-driven design.

"Well done team, you guys did such an amazing job
of this BEAST of a website, it looks so good.”

 –  Client feedback

View the latest website iteration at https://businessesoftomorrow.com.au

The project team celebrating the successful relaunch.

New learnings
Data is everywhere: Accessing application data offered a wealth of insights which standard analytics wouldn’t normally provide.
Quant. informed qual.: With deeper quantitative analysis up front we could have adapted the qualitative research approach and probed more directly into the specific barriers to form completion.
Setting participant recruitment expectations: A lack of direct access to, and interest from, high-profile cohorts meant we had to restructure the study. This has made me rethink recruitment strategies to focus on attainable participants.
Humility: Having played a key role in the initial design, it was important to remain open to feedback and changes to further improve the website.
