Complex Search Filters Redesign

Helping business operators find insights faster through a better search & filtering experience.

Background

Intelligent Assets (IA) is an enterprise platform used for asset monitoring and automated alerting. Operations teams across industries such as transportation and sustainability rely on IA to manage equipment at scale using IoT sensor data.

This project focused on redesigning the main search and filtering experience in IA, starting with the Assets page. Users depend on this crucial feature to quickly isolate and resolve potentially costly issues on their assets.

See the diagram below for key IA terms that will be referenced throughout this case study:

MY ROLE

Lead UX/UI Designer

(Planned & executed all design tasks, collaborated with stakeholders, managed design team)

TOOLS

Figma, FigJam, Figma AI, ChatGPT, Maze, Heurio, Google Suite (Docs, Sheets), Jira, Confluence, Slack

TEAM
  • 2 Junior UX/UI Designers

  • 1 Frontend Developer

  • 1 Product Manager

TIMEFRAME

6 months

The Problem

IA's old search and filtering UI made it difficult to extract actionable insights from long, dense asset datasets. Vague filter field labels, unnecessary steps, and missing filtering capabilities slowed down operations teams that needed fast, targeted problem-solving to avoid costly downtime.

The Goals

Product: Design useful, scalable filter settings patterns that can be reused across the IA platform.

Users: Make it easier and faster for users to refine searches and isolate relevant assets within large datasets.

Before & After

The Results

Business Impact

  • 36% increase in Assets page engagement.*

  • 22% reduction in time to isolate at-risk assets.*

Product Impact

  • 8 new reusable filter components were added to our design system, and later applied to the Events and Groups pages.

  • Positive feedback from key customers and stakeholders.

*Based on Google Analytics data.

See the design process below

Our Design Process

Phase 1

Discover

Defining Requirements & Scope

The initial request from stakeholders was somewhat vague: “Improve IA’s filters menus.”

So, to better clarify the project’s scope and goals, I led requirements-gathering sessions with the Services, Sales, and Engineering teams.

These discussions helped us ground our design goals in real user needs while keeping the scope within our known technical constraints.

The Main MVP Requirements:

  1. Efficiency – Reduce the number of clicks and time needed to edit filters.

  2. Functionality – Enable more advanced filtering (e.g., within/outside ranges, multi-select in categorical lists, etc.)

  3. Scalability – Create design patterns that can be migrated to other IA pages.

Interviews

We weren't able to get direct interviews with end-users due to scheduling and time constraints. So instead, we interviewed 3 internal Services Engineers (who directly interfaced with our customers/end-users on a daily basis), and also looked for relevant insights from our repository of past user interviews.

Interview Takeaways

01.

Users mainly search for assets reporting data outside of normal ranges.

02.

Users avoid the current filters menu because it's confusing and lacks essential functions.

03.

Users want to filter assets based on event data, like whether an asset has open events.

Auditing the Current UI

Next, we performed a heuristic evaluation of the existing filters menu and identified the most critical usability gaps:

  • Limited or unintuitive settings for number, true-false, location, and categorical/list fields.

  • Too much clicking – each filter is in a drawer that has to be opened individually.

  • No way to tell which filters are active when the drawers are closed.

  • Layout is not optimized for mobile.

Competitor Analysis

We searched the web for direct and indirect competitor solutions to help drive our ideation for IA's new filter settings.

Key sources of inspiration:

  • Zillow's "draw on map" feature for limiting your geographical search area.

  • Google Looker Studio's compact, searchable checkbox list filters.

  • Target's inwardly navigable filters side-drawer – a great example of progressive disclosure.

Phase 2

Define

Who We Were Designing For

We translated our research insights into the two primary personas we expected to use search filters most often in IA:

Sarah, The Supervisor – Represents our admin users.

  • Oversees several vehicle fleets across a large region. Needs to filter quickly across many asset types and groups.

Michael, The Maintainer – Represents our day-to-day power users.

  • Works on-site and uses IA to find nearby assets that need repairs.

Shared needs across both personas:

  • Combining multiple filter types to narrow results quickly.

  • Adjusting filters on-the-fly without losing context.

  • Information density > simplicity: more info visible on screen is helpful.

Info Architecture: Organizing the Data

The old filters menu gave users no context about which asset type a custom attribute belonged to. This became an even bigger problem considering a single filters menu could display dozens or even hundreds of attribute filter fields.

I led the design team in diagramming the underlying data architecture of the Assets page to make these relationships visible for our larger team and internal stakeholders.

Defining the Filter Data Types

Rather than treating every filter field as a unique UI problem, I proposed we map all fields to a set of core data types. This unlocked a much more scalable design approach.

Why this mattered: One UI pattern per data type = fewer components to design and build, and a more consistent user experience.
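To make the idea concrete, here is a hypothetical sketch of the mapping in TypeScript. The field names and the data-type union are invented for illustration; they are not IA's actual schema.

```typescript
// Each filter field maps to one of a small set of core data types,
// and the data type decides which reusable UI component renders it.
type FilterDataType = "number" | "boolean" | "category" | "location";

interface FilterField {
  label: string;            // what the user sees
  assetType: string;        // which asset type the custom attribute belongs to
  dataType: FilterDataType; // drives the UI pattern used to render the filter
}

// Illustrative catalog of attribute filter fields across asset types.
const fields: FilterField[] = [
  { label: "Engine Temperature", assetType: "Vehicle", dataType: "number" },
  { label: "Cargo Weight", assetType: "Trailer", dataType: "number" },
  { label: "Door Open", assetType: "Container", dataType: "boolean" },
  { label: "Region", assetType: "Vehicle", dataType: "category" },
];

// However many fields a page exposes, the number of UI patterns to design
// and build is bounded by the number of distinct data types in use.
const patternsNeeded = new Set(fields.map((f) => f.dataType)).size;
console.log(patternsNeeded); // 3 patterns cover all 4 fields
```

The payoff is that adding a new attribute field costs nothing on the design side as long as it fits an existing data type.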

Mapping User Flows

For each data type, we mapped out every possible interaction state and flow, making sure that:

  1. All MVP business requirements were addressed.

  2. Users would never hit a dead end while using any filter field setting.

Click here to see all user flows in full detail.

Phase 3

Ideate

Our Ideation Process

Designing a complex feature required a lot of rapid iteration and constant collaboration across teams. These were the essential parts of our workflow:

  1. Design team critiques – My junior designers and I brainstormed multiple concepts for each user story, discussing the pros and cons of every idea in detail.

  2. Check-ins with the Dev – We regularly convened with the developer to confirm design feasibility and re-work solutions when it seemed the development effort would outweigh the expected user value.

  3. Monthly cross-functional design reviews – We led company-wide UX/UI reviews with engineering, sales, services, and C-suite leadership to gather feedback and confirm user goals/needs with the customer-facing teams.

Design Iterations

Design Iterations

We went through several rounds of iterations throughout the project. Click on the images below to see how our solution evolved over time, and why we chose the final iterations.

Phase 4

Test & Iterate

Remote Usability Testing

Our user testing strategy centered around two primary research questions:

  1. Can users complete the core MVP tasks?

  2. Is the new design faster and easier than the old one?

Test Setup:

  • Participants: 11

  • Format: Maze (remote unmoderated) & Google Surveys

  • Content to Test: Figma prototypes with realistic data.

  • Scope: Desktop only (Google Analytics showed 75% of IA users are on desktop/laptop)

What we tested: Discoverability, adding/editing different filter data types, removing filters.

Results from User Testing

Quantitative Results

  • Share of participants who initially struggled to find the filters entry-point.

  • Share of users who had trouble understanding how to use the numerical filter settings.

  • Average System Usability Scale (SUS) score (>80 = Excellent).
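For reference, the SUS metric mentioned above comes from a standard 10-item questionnaire. The scoring formula is well established; this is a generic sketch, not tied to our survey tooling:

```typescript
// Standard SUS scoring: 10 Likert responses on a 1-5 scale.
// Odd-numbered items contribute (response - 1); even-numbered items
// contribute (5 - response). The sum is multiplied by 2.5 to map the
// raw 0-40 range onto a 0-100 score.
function susScore(responses: number[]): number {
  if (responses.length !== 10) throw new Error("SUS needs 10 responses");
  const sum = responses.reduce(
    (acc, r, i) => acc + (i % 2 === 0 ? r - 1 : 5 - r),
    0
  );
  return sum * 2.5;
}

console.log(susScore([5, 1, 5, 1, 5, 1, 5, 1, 5, 1])); // 100 (best possible)
console.log(susScore([3, 3, 3, 3, 3, 3, 3, 3, 3, 3])); // 50 (neutral answers)
```

Scores above roughly 80 are generally interpreted as excellent usability, which is the benchmark cited above.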

Qualitative Results (User Feedback)
🛠️ Area for Improvement:

“I’m leaning towards questioning whether Group and Asset Type [filter summary chips] should always be visible. It seems like you’d just want nothing visible if you have no filters being applied.”

– Tester #3

Positive Feedback:

“The complexity was revealed gradually, so I didn't feel overwhelmed at any point.”

– Tester #9

Post-Test Changes

After analyzing our usability test and survey results, we prioritized making the following changes to our designs:

  1. Simplified the number filter design – more flexible and easier to scan than a limited set of fixed options.

  2. Removed default filter chips – Having the "Groups" and "Asset Types" be always visible was misleading when no filters were applied. We removed them so the summary row only shows chips when filters are actually added.

  3. Improved entry-point discoverability – We moved the filters icon from the left side to the right, as users looked for it near the other top-level actions on that side of the page.

Phase 5

Deliver

The Design Handoff

The Design Handoff

We kept the developer in the loop throughout our design process so nothing in the final handoff would come as a surprise. Since we had been checking in with her about design feasibility all along, we avoided having to revise our planned solutions during these final stages.

Our Handoff Structure:

  1. Delivered in phases based on the developer's planned build order.

  2. Used consistent naming conventions across all layers and components.

  3. Included thorough design annotations and UI copy guidelines with assigned priorities, so the developer knew what was critical vs. nice-to-have.

  4. Held a handoff meeting to walk the dev through the Figma file and answer any questions before implementation began.

The result: A smoother collaboration process, fewer misunderstandings and development blockers, and accurate implementation of the final designs.

Post-Development Design Q.A.

We used Heurio to perform a design quality assurance audit on the implemented feature, making sure it met our original design goals and business requirements.

We assigned priority levels to all Heurio feedback tickets, so the developer could address the most important issues first.

Our Design Q.A. Checklist:

  1. All MVP business requirements are met.

  2. Content is responsive at all breakpoints.

  3. All filter types behave as designed.

  4. Meets accessibility requirements in both light and dark mode.

Conclusion

What Came Next?

After the Assets page's filters MVP was launched, we began expanding the redesign's reach further:

  1. Began migrating other IA pages' filters menus to the new design – Since we had accounted for cross-page scalability from the start, these migrations were fairly seamless.

  2. Added an Event Filters sub-menu to the Assets page's filters panel – This feature had been highly requested since the beginning, but we had excluded it from the Assets page MVP due to the high development effort it required.

Reflections & Learnings

In enterprise UX, strong upfront discovery often provides more value than late-stage user testing alone. In this project, two key constraints made traditional usability testing a challenge:

  • Prototyping had diminishing returns — Building a realistic, testable Figma prototype for our complex, non-linear workflows took nearly as long as building out the real thing.

  • Limited access to end-users — Enterprise customers are often large and slow-moving, with multiple stakeholders standing between us and the target users we needed to test our designs with.

Ultimately, user testing confirmed our design was on the right track, and I credit the upfront discovery I had us prioritize: accumulated insights from years of user interviews and customer calls, ongoing proxy research with internal customer-facing stakeholders, and secondary research on best practices in filter design.


The Final Results
