Laura Griffee

NOLS

Advanced Search Redesign

Reimagining course discovery for a global wilderness school to improve usability and conversion rates for program enrollment.

  • Product Design
  • Design Systems
A mockup of the advanced search interface in grid view for expeditions after the redesign.

Role

  • UI Designer
  • UX Researcher

Responsibilities

  • Concept Ideation
  • Wireframing and Mockups
  • Prototyping
  • Usability Testing
  • UI Guidelines

Team

  • 1 Designer
  • 2 Engineers
  • 1 Manager
  • 4 Directors

Tools

  • Sketch
  • Abstract
  • InVision

Timeline

Aug 2018 - Jan 2020
(1 year 5 months)

Overview

Company

National Outdoor Leadership School (NOLS)
A global nonprofit wilderness school developing leaders through outdoor education and expedition-based learning.

Goals

User
Help prospective and returning customers quickly compare course offerings matching their interests, budget, and time so that they can easily find and apply to the right course.

Business
Improve the user experience of the website’s advanced search feature to attract more customers and increase revenue.

Summary

I led the complete redesign of NOLS' course search experience as UI designer and UX researcher, delivering a 30% increase in application engagement year-over-year.

NOLS customers were struggling to find relevant courses with the website's advanced search tool, directly impacting enrollment rates. As the primary UI designer and UX researcher, I drove an iterative, human-centered process with three rounds of usability testing. The resulting redesign increased application engagement 30% year-over-year and, per follow-up testing, resolved the core usability problems.

A mockup of the advanced search interface in grid view for expeditions after the redesign.
Redesign Mockup (Expeditions, Grid View)
A mockup of the advanced search interface in list view for wilderness medicine after the redesign.
Redesign Mockup (Wilderness Medicine, List View)
Advanced search interface in grid view for all programs before the redesign.
Website Before (All Programs, Grid View)
Advanced search interface in grid view for expeditions after the redesign.
Website After (Expeditions, Grid View)
Advanced search interface in list view for all programs before the redesign.
Website Before (All Programs, List View)
Advanced search interface in list view for wilderness medicine after the redesign.
Website After (Wilderness Medicine, List View)

Full Case Study

Introduction

Context

NOLS, a global nonprofit wilderness school, wanted to improve the user experience of their website’s advanced search tool to attract more customers and increase revenue.

NOLS is a global nonprofit wilderness school that educates students in leadership and wilderness skills. It offers courses in four main program areas: Expeditions, Wilderness Medicine, Risk Services, and Custom Education. Of these programs, Expeditions and Wilderness Medicine account for the majority of revenue and courses for the company.

At the time of this project, the company was trying to grow revenue from its Expedition course offerings but had been unable to do so for the previous two years due to a shift in customer preference toward shorter, less profitable courses.

One idea NOLS had to address this shift was to attract more customers by increasing brand awareness and driving more website traffic. To ensure they had the best chance of attracting customers on their website, the company wanted to improve the user experience of the advanced search feature.

Problem

Customers were having trouble using the advanced search tool to quickly find course offerings based on their interests, budget, and time.

Based on previous usability testing, customer complaints, and a UX audit by an external SEO company, our team knew that prospective and returning customers were having trouble using the advanced search tool to quickly filter down course offerings. Over the course of the project, I realized that these usability issues were part of a much larger problem with how the NOLS website segmented customers.

Advanced search interface in grid view for all programs before the redesign
Advanced search interface prior to the redesign.

Users & Audience

NOLS’ advanced search users are outdoor recreationalists, outdoor professionals, and medical professionals seeking education in leadership and wilderness skills.

To help guide our design decisions and priorities, our team spent time getting to know NOLS’ advanced search tool users through proto-personas, validated personas, and usability testing conducted both before and during this project.

Below is key information about two personas that our team focused on:

Emilia Worton, 17

Validated Persona

Gap Year Student

Emilia is a 17-year-old student who graduated early from high school and is taking a gap semester before she applies for college. She has never taken a NOLS course.

Key Needs

  • Go on a longer course (she's fine with taking time out of school).
  • Travel somewhere new and distant.
  • Compare multiple products before making a decision.
  • Have information on price and reviews.

Flash Davidson, 45

Proto-Persona

Outdoor Guide

Flash is a 45-year-old climbing and skiing guide who needs to recertify his Wilderness First Responder (WFR) certification for his job. He has taken multiple NOLS courses.

Key Needs

  • Recertify his WFR in the fastest & easiest way possible.
  • Know what dates are available in his area.

*These personas were prioritized because they represented customers from the programs that account for the vast majority of NOLS’ revenue and courses. Other personas not listed here were also considered.

Role

UI Designer

I was responsible for concept ideation, visual design, wireframing, hi-fi mockups, prototyping, design system components, and developer feedback.

During this project, I spearheaded the redesign of the course finder’s interface, developing a cleaner, more minimal, and more modern visual style for NOLS’ web brand. Alongside this redesign, I documented and created guidelines for the new style in WindPants, NOLS’ primary design system, which I was building out at the time.

I was also responsible for creating and updating 100+ wireframes, high fidelity mockups, and clickable prototypes. After passing off my mockups to the developers from the external web studio, I gave them detailed feedback on implementation.

Deliverables

  • Wireframes (InVision Freehand)
  • High Fidelity Mockups (Sketch)
  • Clickable Prototypes (InVision Prototypes)
  • Design System Components (Sketch, Abstract, InVision DSM)
  • Development Feedback (GitHub, Google Docs)

UX Researcher

I conducted and synthesized the research for two usability tests.

During this project, I conducted two of the three usability tests: 1) moderated in-person tests with employee subject matter experts and 2) moderated remote tests with new and returning customers. I was also responsible for compiling my findings from these sessions into a usability report. Since I joined the project partway through, other team members were responsible for creating the testing plan and script.

Deliverables

  • Recorded Usability Tests (Zoom)
  • Usability Testing Reports (Google Docs)

Team

I was the sole designer for most of the project. I reported to one project manager, worked with four directors, and collaborated with two developers.

In-House (6 people)

  • Web Manager
  • Frontend Designer
  • Art Director
  • Creative Director
  • Marketing Director
  • Information Systems Director

External Web Studio (2 people)

  • Backend Developer
  • Frontend Developer

Scope & Constraints

Joining Mid-Project
I joined this project partway through and needed to hit the ground running. I had to simultaneously catch up on what had previously been accomplished while also preparing for the next round of usability testing.

Technical Debt
NOLS used multiple aging databases to keep track of student and course information. This led to limitations and complications during this project related to accessing and editing course data.

Scope Creep
Changing requirements from stakeholders and usability testing expanded this project’s scope from minor updates to the website’s advanced search interface to a total overhaul of the entire site.

No UI Guidelines
The NOLS Creative team did not have any layout or spacing UI guidelines in place for the web, and I ended up needing to take time during the project to create them.

Minimal UI & UX Experience
When I joined this project to replace another team member, I had no UX experience and very little UI experience. I had to get up to speed on both disciplines within a matter of weeks through intensive research and practice.

Process

Overview

Our team followed an iterative, human-centered design process that spanned three usability testing cycles. After the first cycle, I joined the project and took over UI design and UX research.

To ensure our design decisions addressed real user needs, the team constructed a process out of Stanford d.school’s five modes of design thinking—empathize, define, ideate, prototype, and test—that spanned three usability testing cycles. During each cycle the team 1) conducted usability testing to better understand our users, 2) identified where our users’ problems existed, 3) created low fidelity sketches to brainstorm solutions, 4) built high-fidelity interactive prototypes of our ideas, and 5) returned to our users again for feedback.

When the teammate originally responsible for the UI design and UX research was unable to keep working on the project in the same capacity after the first cycle, I transitioned onto the team and assumed both roles.

Cycle 1

3 months

Understand Users

Usability Testing

Define Problems

Identify & Prioritize Needs

Brainstorm Ideas

Lo-Fi Sketches

Prototype Solutions

Hi-Fi Mockups


*Before I joined the project

Cycle 2

3 months

Understand Users

Usability Testing

Define Problems

Identify & Prioritize Needs

Brainstorm Ideas

Lo-Fi Sketches

Prototype Solutions

Hi-Fi Mockups

Clickable Prototype

Cycle 3

6 months

Validate Solutions

Usability Testing

Define Problems

Identify & Prioritize Needs

Brainstorm Ideas

Lo-Fi Sketches

Prototype Solutions

Hi-Fi Mockups

Clickable Prototype

Implementation

8 months

Phase 1

Hi-Fi Spec Mockups

Dev Feedback

Phase 2

Hi-Fi Spec Mockups

Dev Feedback

Joining Mid-Project

To get myself up to speed upon joining the project, I reviewed the usability testing report from the first cycle and identified three key findings.

Before moving on to the next round of usability testing, I analyzed the team’s report from the previous cycle so that I could: 1) get myself up to speed on who they tested, why they tested them, and what they discovered; 2) distill my own key findings from this information; and 3) identify any recurring issues in the next round of testing.

I learned from the team’s report that they had conducted five remote unmoderated usability tests with new customers on the live advanced search webpage to evaluate how easy the interface was to use for the first time. Since NOLS was looking to expand its customer base, the team thought it was critical to begin usability testing with new customers to ensure the advanced search tool was easy to use without prior knowledge of the interface. The five participants, of representative ages (18-45) and devices (computer, tablet, smartphone), were recruited via UserTesting.com.

From the report’s findings, I distilled three key issues that seemed to be part of larger overarching usability problems:

Key Report Findings

Busy Visual Design

Users were distracted by the interface’s busy visual design and had a hard time distinguishing the hierarchy of elements on the page.

4/5 users thought the interface was busy, overwhelming, and confusing

My initial reaction is there’s a lot going on. It’s kind of overwhelming...

Unclear Language

Users were confused by certain checkbox and section titles and were unsure if the terms would help them find the right course.

3/5 users came across language that confused them

Leave No Trace, I think is an eco-friendly type of thing?

Missing Search Criteria

Users were frustrated by the absence of important search criteria and were unable to search and compare courses based on key deciding factors.

4/5 users were unable to filter by one of their key deciding factors

You’d think there would be an easier way to filter by this...I don’t see first aid anywhere.

Understand Users

Following the team’s usability testing plan and script, I conducted eight in-person moderated usability tests with subject matter experts on a high fidelity clickable prototype to evaluate how easy it was for returning customers to use the interface.

In the second round of usability testing, the team wanted to evaluate how easy it was for returning customers to use the advanced search tool. In addition to attracting new customers, NOLS was also looking to grow its business by selling more courses to current customers. For this reason, it was important to conduct usability testing with returning customers to ensure the advanced search tool was easy to use for customers with more specific needs and deeper product knowledge.

In an effort to recruit the smallest number of participants with the greatest knowledge of program-specific customer needs, the team recruited eight internal subject matter experts spanning NOLS’ course offerings to take in-person moderated usability tests on a high fidelity interactive prototype. I conducted these tests with participants at my desktop computer and took notes during the sessions. Following testing, I gathered my findings from these sessions into a usability testing report.

Advanced search interface mockup from the previous designer in list view for all programs
The advanced search mockup I inherited from the previous designer and used for the usability tests.

Define Problems

After conducting the tests, I used affinity diagramming to help me identify six usability issues that needed to be fixed.

To define what problems users were having, I reviewed the notes I had taken during the usability sessions and recorded participants’ pain points in a Google Doc. I then used affinity diagramming to organize related participant pain points into distinct clusters and prioritize them based on severity (did the issue prevent users from completing critical tasks?) and pervasiveness (how many users experienced this problem?).
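The ranking logic behind this prioritization can be sketched in a few lines. This is a hypothetical re-creation: the user counts below come from the findings in this case study, but which issues blocked critical tasks is an invented assumption for illustration.

```python
# Hypothetical re-creation of the affinity-diagram prioritization step.
# "users" counts match the case study; "blocks_task" flags are assumed.
pain_clusters = [
    {"cluster": "Excessive Filtering Options", "users": 8, "blocks_task": True},
    {"cluster": "Unclear Language",            "users": 7, "blocks_task": True},
    {"cluster": "Busy Visual Design",          "users": 6, "blocks_task": False},
    {"cluster": "Unnatural Listing Order",     "users": 6, "blocks_task": False},
    {"cluster": "Missing Search Criteria",     "users": 4, "blocks_task": True},
    {"cluster": "Data Inconsistency",          "users": 3, "blocks_task": False},
]

# Severity (does the issue block a critical task?) outranks
# pervasiveness (how many of the 8 participants hit it).
ranked = sorted(
    pain_clusters,
    key=lambda p: (p["blocks_task"], p["users"]),
    reverse=True,
)
```

Sorting on a (severity, pervasiveness) tuple keeps the comparison explicit: two issues with the same severity fall back to how many users they affected.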

I decided to tackle the following six usability issues that were the most severe and pervasive:

Top Usability Issues

Excessive Filtering Options

Users were overwhelmed by the number of filtering options they had to choose from and had a difficult time quickly filtering.

8/8 users had trouble quickly filtering using the checkboxes

Feels busy, there are so many checkboxes...I’m overwhelmed by the amount.

Unclear Language

Users were confused by certain checkbox and section titles and were unsure if the terms would help them find the right course.

7/8 users came across language that confused them

Preferred Start Date Range...I’m not really sure [what] this is trying to be…

Busy Visual Design

Users were distracted by the interface’s busy visual design and had a hard time distinguishing the hierarchy of elements on the page.

6/8 users thought the interface was busy, overwhelming, and confusing

Really busy...the amount of info here is overwhelming.

Unnatural Listing Order

Users were puzzled by how the default course results were sorted. They were also confused by the non-alphabetical listing of checkboxes.

6/8 users were confused by the ordering of the checkboxes and results

I can’t tell how the courses are sorted in this initial presentation.

Missing Search Criteria

Users were frustrated by the absence of important search criteria and were unable to search and compare courses based on key deciding factors.

4/8 users were unable to filter by one of their key deciding factors

Sponsors are not here...I want to be able to search for courses using that...

Data Inconsistency

Users were disoriented by the varying number of results and course information between list and grid views.

3/8 users were unsure why data changed between views

Why can I only see 600 courses versus 800 courses now?

Brainstorm Ideas

To find the best solutions to the usability problems, I conducted several rounds of brainstorming and sketching.

I began addressing each usability problem by first brainstorming a wide range of possible solutions. I then explored the viability of those solutions through low fidelity sketches. Finally, I narrowed the field and chose the best solutions to implement in the high fidelity mockups.

Prototype Solutions

Based on my proposed solutions, I used Sketch to update the high fidelity mockups and InVision to stitch the new mockups into a clickable prototype.

When I opened the mockups that I had inherited from the previous UI designer for the first time, I took time to review the file organization in Sketch. High fidelity mockups can be quite time intensive to work on and I knew that making sure my file had meaningful naming, logical foldering, and useful symbols would save me (and future designers) a lot of time down the road.

After organizing the file to my liking, I set to work implementing my solutions:

Usability Updates

Streamline Filters

  • Remove unnecessary filters
  • Combine similar filters
  • Get rid of sublevel filters
  • Order filters by importance*
  • Declutter checkboxes

*Based on Google Analytics and usability testing

Clarify Language

  • Use language familiar to users
  • Shorten filter titles and labels
  • Use tooltips for unavoidable jargon

Simplify Visual Design

  • Reduce visual weight of elements
  • Add more spacing
  • Remove meaningless indicators
  • Replace uncommon UI patterns

Make Listing Order Intuitive

  • Alphabetize checkboxes
  • List default results by start date*

*Implemented after Cycle 3 due to permission needed from stakeholders

Expand Search Criteria

  • Add missing queryable data*
  • Add missing filtering options

*Implemented after Cycle 3 due to technical constraints

Make View Data Consistent

  • Don’t switch data between views
  • Results should all be sessions

After implementing these solutions I stitched the mockups together into a clickable prototype with InVision for the next round of usability testing.

Advanced search interface mockup from the previous designer in list view
Inherited Mockup Before (List View)
Updated advanced search interface mockup in list view
Mockup After (List View)
Advanced search interface mockup from the previous designer in grid view for all programs
Inherited Mockup Before (Grid View)
Updated advanced search interface mockup in grid view
Mockup After (Grid View)

Validate Solutions

Following the same usability testing plan and script, I conducted three of five remote moderated usability tests with new and returning customers on the updated high fidelity clickable prototype to validate my design solutions.

In the third round of usability testing, the team wanted to 1) evaluate if the design iterations had fixed the issues users were having and 2) find any new usability problems introduced by the changed design.

Five participants of representative ages (18-45), experiences (new & returning), and program interests (Expeditions & Wilderness Medicine*) were recruited in-house to take remote moderated usability tests on a high fidelity interactive desktop prototype. I moderated three of these tests with participants over recorded Zoom calls, and a teammate moderated the other two. Following testing, I took notes on the session recordings and gathered my findings into a usability testing report.

In retrospect, the team should have conducted this usability testing on a variety of devices instead of testing only on desktop. Half of NOLS’ website traffic comes from customers on mobile devices and this round of testing completely missed out on gathering further insights into how to improve that experience.

*Since these two programs accounted for the vast majority of the company’s revenue and courses they were prioritized above other programs for usability testing.

Moderating a usability test on the updated hi-fi mockup
Moderating a usability test on the updated hi-fi mockup.

Define Problems

After conducting the tests, I used affinity diagramming to help me identify seven key findings.

To identify new, persisting, and solved usability issues, I reviewed my notes from the usability session recordings and documented participants’ pain and high points in a Google Doc. I then organized related points into distinct clusters and ranked them based on severity (did the issue prevent users from completing critical tasks?) and pervasiveness (how many users experienced this problem?).

I generated the following seven key findings from my affinity diagramming:

Solved Issues

Excessive Filtering Options

Users were no longer overwhelmed by the number of filtering options they had to choose from and had an easy time quickly filtering.

5/5 users found and selected checkboxes easily

It was easy to search and find what I was looking for.

Busy Visual Design

Users were no longer distracted by the interface’s busy visual design and had an easy time distinguishing the hierarchy of elements on the page.

4/5 users thought the page was simple, clear, and intuitive

I think it’s concise and not too busy. Pretty easy to navigate...

Data Inconsistency

Users were no longer disoriented by varying numbers of results and course information between list and grid views.

5/5 users were not disoriented when switching between views

Basically it’s just a picture version of what was listed.

Persisting Issues

Unclear Language

All other language issues were solved, but the checkbox label for beginner wilderness safety courses was not obvious to users.

5/5 users didn’t use “Initial Certification” checkbox

I did get confused with the course types and skills…for Wilderness Medicine.

Unnatural Listing Order*

Users were no longer confused by checkbox listing order, but they were still puzzled by how the default course results were sorted.

4/5 users expressed confusion over the sorting of default course results

...what order [are they] in...I would expect [them] to be sorted by date.


*No iterations made from last cycle due to pending stakeholder approval

Missing Search Criteria*

Users were frustrated by the absence of important search criteria and were unable to search and compare courses based on key deciding factors.

3/5 users were unable to filter by one of their key deciding factors

I would go to the search and type in...Right now I don’t see it...


*No iterations made from last cycle due to technical constraints

New Issues

Insufficient Contrast

Users had trouble noticing the informational tooltip icons (though once they did, they found them helpful).

3/5 users didn’t easily notice the informational tooltip icons

Oh that little help text with the information i dot...it didn’t catch my eye at first.

Brainstorm Ideas

While brainstorming and sketching possible solutions to the remaining usability problems, I realized the interface needed to be segmented by program.

While exploring the viability of listing default course results in order of start date (to solve the unnatural listing order usability problem), I realized it was possible for the first page of results to contain only Wilderness Medicine courses (since there were 506 Wilderness Medicine courses, 333 Expeditions courses, 3 Risk Services courses, and 27 Alumni courses). If that happened, it would be very confusing to other program users and would be unacceptable to Expeditions stakeholders.
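A quick sketch makes the risk concrete. The session counts below are from the comparison in this case study, but the start-date cadences are invented purely to illustrate the failure mode: when one program's sessions start far more frequently than the others', a purely date-sorted default can fill the entire first page with that program.

```python
from datetime import date, timedelta

# Session counts match the case study (506 WM, 333 Expeditions,
# 3 Risk Services, 27 Alumni); first-start dates and cadences are
# invented assumptions for illustration.
def make_sessions(program, count, first_start, gap_days):
    return [(program, first_start + timedelta(days=i * gap_days))
            for i in range(count)]

sessions = (
    make_sessions("Wilderness Medicine", 506, date(2019, 6, 1), 1)   # near-daily
    + make_sessions("Expeditions", 333, date(2019, 7, 1), 7)         # weekly
    + make_sessions("Alumni", 27, date(2019, 7, 10), 30)
    + make_sessions("Risk Services", 3, date(2019, 7, 15), 90)
)

# Proposed default: sort all results by start date.
sessions.sort(key=lambda s: s[1])

PAGE_SIZE = 24  # hypothetical results-per-page
first_page_programs = {program for program, _ in sessions[:PAGE_SIZE]}
```

Under these assumed cadences, every result on the first page is a Wilderness Medicine session, which is exactly the scenario that pointed toward segmenting the experience by program.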

This led me to consider if the course finder experience needed to be separated by program. Comparing the filtering needs of all the programs (especially those of Expeditions and Wilderness Medicine), I quickly found that they had conflicting preferred views (list vs. grid), location data (campus vs. geolocation), and searching behaviors (browsing vs. comparing). Based on the results of my comparison, it was evident to me that separating the course finder experiences was necessary to further simplify and customize the advanced search interface.

I brought my team together to brainstorm a new page flow to segment users into separate course finder experiences based on program. We all agreed that we would need an unfiltered course finder page to bucket users and that the filters for each of the programs should be tailored to the needs of the program.

Wireframes exploring segmenting the advanced search interface by program.
Wireframes exploring how I could segment the interface based on program.

Prototype Solutions

Based on my proposed solutions, I used Sketch to update my high fidelity mockups. As part of that process, I ended up needing to create and document layout guidelines for my team in a design system.

While working on my mockups this round, I began considering the development handoff and realized that my mockups were not following any layout or spacing guidelines, and my team did not have any in place for me to follow. I quickly set to work researching, creating, and documenting these guidelines in the design system that I was building out at the time. After doing that, I reworked all my mockups to follow these new guidelines (and, in the process, deepened my use of symbols in the file to make the overhaul as efficient as possible).
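For context, layout guidelines like these are commonly built on a base-unit spacing scale. The sketch below assumes a conventional 8px base unit and made-up token names; NOLS' actual documented values may differ.

```python
# Illustrative base-8 spacing scale. The 8px base unit and the
# "space-N" token names are assumptions, not NOLS' documented tokens.
BASE_UNIT = 8  # px

SPACING_TOKENS = {f"space-{step}": BASE_UNIT * step for step in range(1, 9)}
# space-1 = 8px (tight gaps, e.g. label-to-field) up to
# space-8 = 64px (large section breaks)
```

Deriving every margin and padding from one base unit is what makes the guideline enforceable in mockups and checkable in code review.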

Spacing grid layout documentation on the NOLS design system documentation website.
Layout guidelines I helped codify on NOLS' design system documentation site.

After reworking the mockups to follow the new layout and spacing guidelines I set to work implementing my solutions:

Usability Updates

Audience Segmentation

  • Separate experience by program
  • Tailor filters and view to audience

Clarify Language

  • Expand “Initial Cert.” filter context

Make Listing Order Intuitive

  • List default results by start date

Expand Search Criteria

  • Add missing queryable data

Increase Contrast

  • Make UI elements AA compliant
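"AA compliant" here refers to the WCAG 2.x contrast minimums (4.5:1 for normal text, 3:1 for large text and graphical objects). A minimal sketch of that check using the WCAG relative-luminance formula; the #767676 icon gray is a hypothetical example, not the shipped color.

```python
def channel(c8):
    """Linearize one 8-bit sRGB channel per the WCAG definition."""
    c = c8 / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def luminance(hex_color):
    """Relative luminance of a '#rrggbb' color."""
    r, g, b = (channel(int(hex_color[i:i + 2], 16)) for i in (1, 3, 5))
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio, always between 1 and 21."""
    lighter, darker = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# Hypothetical check: a mid-gray tooltip icon on a white background.
icon_passes_aa = contrast_ratio("#767676", "#ffffff") >= 4.5
```

A gray like #767676 on white sits just above the 4.5:1 threshold, which is why barely-compliant grays are a common culprit when users report icons that don't catch the eye.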

Refinements

Simplify Visual Design

  • Cut down number of fonts
  • Flatten interface elements

Streamline Filters

  • Combine similar filters
  • Tailor filters to audience

Stakeholder Requests

Marketing Features

  • Tags for waitlist and limited space
  • Featured courses section
Grid view advanced search interface mockup from usability testing
Mockup Before (All Programs, Grid View)
A mockup of the advanced search interface in grid view for expeditions after the redesign.
Mockup After (Expeditions, Grid View)
List view advanced search interface mockup from usability testing
Mockup Before (All Programs, List View)
A mockup of the advanced search interface in list view for wilderness medicine after the redesign.
Mockup After (Wilderness Medicine, List View)

Implementation

To set the developers up for success, I created 92 high fidelity design spec mockups in InVision and provided detailed feedback on each round of implementation.

Due to the extensive nature of the proposed changes, implementation was split up into two phases. Before each phase, I created and annotated a total of 92 high fidelity design spec mockups in InVision to clearly communicate to developers how every part of the interface should look, work, and feel. These specs and annotations detailed different screen sizes, views, states, and page flows.

After each phase, I gave the developers detailed feedback (via Google Docs and GitHub Issues) on how the coded interface compared to my design spec mockups. Despite my best efforts to be as clear as possible, this process ended up being painful and time-consuming. Many details of my specs were overlooked or not executed properly, and I ended up needing to provide feedback on quite a few visual inconsistencies. In retrospect, it would have been really helpful if NOLS had had a design system in place* to smooth designer-developer collaboration.

*At the time I was spearheading building out a design system for NOLS but it wasn’t far enough along yet to truly begin translating between design and development.

Zoomed out view of desktop, tablet, and mobile mockups for handoff to developers.
Some of the desktop, tablet, and mobile mockups I created for the developers.

Github issue comment giving styling feedback on alignment and spacing.
Example of styling feedback given through GitHub Issues.

Outcome

What Happened

Two months post launch, NOLS experienced a 30% increase in application engagement year-over-year. Six months post launch, further usability testing indicated that the redesign had fixed the interface’s main usability problems.

Two months after the launch, the redesign was responsible for a 30% increase in the number of customers reaching the application phase year-over-year. This data strongly indicated that the redesign of the advanced search tool had improved the user experience and was attracting more customers. After this two month period, our team was unable to continue tracking this data because COVID-19 forced NOLS to suspend all course offerings for several months and resume only limited operations later that year.

To supplement our quantitative launch data, the team also collected qualitative data via another round of usability testing (conducted by a third party freelancer in July 2020). This testing confirmed that the redesign had fixed the main problems with the interface. The minor issues the testing identified confirmed several areas of improvement the team was already considering.

A mockup of the advanced search interface in grid view for expeditions after the redesign.
Redesign Mockup (Expeditions, Grid View)
A mockup of the advanced search interface in list view for wilderness medicine after the redesign.
Redesign Mockup (Wilderness Medicine, List View)
Advanced search interface in grid view for all programs before the redesign.
Website Before (All Programs, Grid View)
Advanced search interface in grid view for expeditions after the redesign.
Website After (Expeditions, Grid View)
Advanced search interface in list view for all programs before the redesign.
Website Before (All Programs, List View)
Advanced search interface in list view for wilderness medicine after the redesign.
Website After (Wilderness Medicine, List View)

Lessons Learned

Usability testing is humbling, persuasive, and powerful.
Conducting usability tests for the first time during this project taught me that it’s virtually impossible to design an exceptional user interface without testing it with real users.

Expertise in digital design is knowing that you need to test things.
Being a good designer for digital experiences means recognizing that you are working off assumptions that need to be validated by actual users.

Design file organization is crucial for maximum efficiency.
Keeping files organized with meaningful naming, logical foldering, and useful symbols saves you, developers, and future designers a lot of time down the road.

Smooth designer-developer collaboration does not happen by accident.
Effective communication and collaboration between design and development can only happen if there are systems, tools, and workflows in place that help translate between disciplines.

Design is never finished.
Customer expectations change over time and so must design—there’s never an “end” to iterating on digital experiences.