
Verizon Support Landing Page Redesign

As the Support Portfolio Research Lead at Verizon, I led a multi-phase research initiative to redesign the unauthenticated Support landing page, a critical entry point used by millions of customers seeking help with their devices, accounts, and network issues.

At the time, the experience was not effectively guiding users to the right support path. Many users landed on the page unsure where to begin, often defaulting to search or escalating to call and chat support. This created friction for users and increased reliance on higher-cost support channels.

The goal of this work was to design a clearer, more intuitive starting point that could efficiently guide both:

  • Intent-focused users who know what they need

  • Discovery-focused users who need help figuring it out

To address this, I led foundational research, co-design, and a qualitative A/B/C test to evaluate new approaches to structuring the Support experience. In parallel, I partnered closely with design as a co-designer, contributing to one of the concepts tested.

This work directly influenced the final product experience and drove measurable impact:

  • 26% decrease in support page search usage

  • 29% increase in click-through rates on navigation

Support Portfolio at Verizon

The Support Portfolio plays a critical role in shaping the digital support experience for millions of Verizon customers.

It includes:

  • Self-Service Troubleshooting Tool – Empowering users to independently resolve common issues without needing to call or chat with an agent.

  • Wayfinding to Support Tools – Streamlining navigation and discoverability of support features across the app and web experiences.

  • General Support Pages – Designing and refining content-rich help pages that provide clear, actionable answers to user questions.

  • Support Ticket Management – Helping users view, track, and manage their support requests with transparency and ease.

  • Device Diagnostics – Surfacing diagnostic tools that help users identify and address device-specific problems.

  • Network Status and Usage – Providing users with real-time visibility into network health, outages, and data usage.

  • Generative AI – Exploring how GenAI can enhance customer support by delivering faster, smarter, and more personalized assistance.

Previous Research

Usability Testing

To understand why the existing Support landing page was underperforming, I began with usability testing on the current experience.

The goal of this study was to identify where users were getting stuck and how they were attempting to navigate support.

What I observed:

  • Users felt overwhelmed by the number of options presented upfront

  • Many could not identify a clear starting point

  • Participants often defaulted to searching or hesitated before taking action

  • Even when tools were available, users struggled to confidently choose the right path

These findings made it clear that the issue was not a lack of functionality, but a lack of structure and guidance.

What this raised:
While we understood what wasn’t working, we still didn’t know:

  • How users expected support to be organized

  • What an intuitive starting point should look like

  • How to design for both users who know their issue and those who don’t

 

Co-Design Study

To answer those questions, I led a co-design study where participants created their ideal Support landing page.

The goal was to uncover users’ mental models and understand how they naturally organize support experiences when not constrained by the current design.

Key insight:
Across sessions, participants consistently structured support by service type first (Mobile vs Home/Internet), then narrowed down to device, issue, and solution.

This revealed a clear, repeatable pattern:
→ Users think in progressive narrowing, not in flat lists of options

What this unlocked:

  • A strong foundation for structuring the Support landing page

  • Validation that a guided, step-based experience aligned with user expectations

What questions remained:
Even with a clear direction, we still needed to understand:

  • How much structure is too much vs too little

  • Whether users prefer simplified navigation vs content-rich options

  • How a chatbot-driven experience compares to structured navigation

  • Which approach best supports both intent-focused and discovery-focused users

 

Why This Led to the A/B/C Test

To answer these remaining questions, we developed three distinct design approaches and evaluated them through a qualitative A/B/C test.

This allowed us to move from:

  • understanding user problems
    → to exploring solutions
    → to validating which direction worked best and why

Project Overview

My Role

UX Research Lead (Primary) + Co-Designer

Research Leadership

  • Owned research strategy end-to-end

  • Led usability testing, co-design studies, and the qualitative A/B/C evaluation

  • Defined hypotheses, research questions, and test plans

  • Moderated sessions and synthesized findings

  • Delivered insights that shaped product direction

Design Contribution

  • Co-designed Option A with the design team

  • Translated research insights into a structured interaction flow

  • Focused on reducing cognitive load and clarifying entry points

Timeline: 3 weeks

Research Objectives

  1. Understand which approach participants find most helpful for navigating to the right solution, and whether that changes when the customer is in one of the following mindsets:

    1. Intent-focused: The customer knows exactly what their problem is and has a specific solution in mind (e.g., a step-by-step troubleshooter or FAQ article).

    2. Discovery-focused: The customer may not fully understand their issue, or may understand it but is open to exploring solution options (e.g., their bill was different than expected, but they’re not sure why or what to do).

  2. Understand sentiment toward our proposed approaches (A vs. B vs. C).

  3. Understand what participants expect to see on a support landing page (e.g., categories, links to specific solutions, search, Contact Us info).

Key Research Questions

  1. What are users’ initial reactions to each concept?

  2. What do users expect from a support landing page?

  3. Do expectations differ between unauthenticated vs authenticated states?

  4. How do users feel about navigation vs chatbot input?

  5. What improvements would they suggest?

Our Hypothesis:

If users are presented with structured, clickable options such as simplified (Version A) or content-rich (Version B), they will navigate to solutions more confidently and with less cognitive strain than when using a chatbot-only approach (Version C). We expect intent-focused users to prefer quick, direct access, while discovery-focused users may value additional context and curated options.

Methodology

Remote A/B/C usability study

1-on-1 moderated sessions

60 minutes each

Mobile Device Modality

Participants

12 Verizon Customers​

  • 6 users given an intent-focused scenario​

  • 6 users given a discovery-focused scenario

Open to all account members

Mix of ages, incomes, ethnicities, and genders.

Tools

UserTesting - Used for participant recruitment and session moderation.

Figma - Used to build the prototypes.

Google Office Suite - Used to write test plans and create findings presentations.

Miro - Used for notetaking during sessions and for tracking insights and themes.

Design Concept Development

Three design approaches were explored collaboratively across the team, each representing a different philosophy for how users should enter and navigate support.

While each concept was owned by a different designer, I co-designed Option A, focusing on translating research insights as directly as possible into the product experience.

Option A (Co-Designed by Me) — Simplified, Guided Flow

Interaction flow:
Service → Device → Issue → Help Method

My goal with this concept was to design an experience that reflected how users actually think about support, rather than how systems or internal tools are organized.

Instead of introducing new patterns, I focused on operationalizing what users explicitly showed us in research.

My Design Approach

From usability testing and co-design, two things were clear:

  1. Users felt overwhelmed by too many choices upfront

  2. Users naturally narrowed problems step-by-step

So rather than presenting everything at once, I designed a progressive narrowing experience that:

  • Reduces cognitive load at each step

  • Helps users build confidence as they move forward

  • Guides both intent and discovery users without forcing either

Key Design Decisions (and Why)

1. Service-Based Starting Point (Mobile vs Home)

Why I made this decision:
In the co-design study, nearly every participant began by grouping their issue by service type.

This wasn’t prompted. It was consistent across sessions.

Design translation:
I reduced the entry point to just two primary options:

  • Mobile

  • Home

Tradeoff:

  • Fewer options = less flexibility upfront

  • But significantly lower cognitive load and faster decision-making

Intent:
Give users an immediate, confident starting point without needing to interpret multiple categories.

2. Progressive Narrowing (Step-by-Step Flow)

Why:
Users didn’t think in terms of “tools” or “features.”
They thought in terms of:
→ “What do I have?”
→ “What’s wrong?”
→ “How do I fix it?”

Design translation:
I structured the flow to follow that exact mental model:

  • Service → Device → Issue → Resolution

What this solves:

  • Eliminates the need for users to map their problem to a system

  • Reduces overwhelm by only showing relevant options at each step

Design principle:
Don’t make users interpret the system. Let the system adapt to how users think.
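
To make the interaction model concrete, below is a minimal TypeScript sketch of how this progressive narrowing could be represented: each step's options are derived from the choices already made, so the user only ever sees a small, relevant set. The step names, option values, and structure are assumptions for illustration only, not the shipped implementation.

```typescript
// Minimal sketch of the progressive-narrowing idea behind Option A.
// All step names, option values, and types are illustrative assumptions,
// not Verizon's production data or implementation.

type Step = "service" | "device" | "issue" | "resolution";

interface SupportSelection {
  service?: "Mobile" | "Home";
  device?: string;
  issue?: string;
}

// Each step only exposes options relevant to earlier choices,
// so users never see the full catalog at once.
const OPTIONS: Record<Step, (s: SupportSelection) => string[]> = {
  service: () => ["Mobile", "Home"],
  device: (s) =>
    s.service === "Mobile"
      ? ["Smartphone", "Tablet", "Smartwatch"]
      : ["Router", "Set-top box"],
  issue: (s) => (s.device ? ["Connectivity", "Billing", "Setup"] : []),
  resolution: () => ["Guided troubleshooter", "Help article", "Call", "Chat"],
};

// The current step is derived from what the user has already chosen.
function currentStep(s: SupportSelection): Step {
  if (!s.service) return "service";
  if (!s.device) return "device";
  if (!s.issue) return "issue";
  return "resolution";
}

// Example: a Mobile customer who has picked a device only sees issue options.
const selection: SupportSelection = { service: "Mobile", device: "Smartphone" };
console.log(currentStep(selection));                      // "issue"
console.log(OPTIONS[currentStep(selection)](selection));  // ["Connectivity", "Billing", "Setup"]
```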

 

3. Search as an Alternative Path

Why:
Usability testing showed that some users arrive with a very specific goal and don’t want to navigate.

Design decision:
Instead of forcing all users into the guided flow, I added a prominent search bar at the top.

Tradeoff:

  • Adds another entry point

  • But prevents frustration for intent-focused users

Intent:
Support both behaviors without compromising the clarity of the main experience.

4. Visible Call & Chat Options

Why:
A consistent pattern in research:
When users are frustrated, they don’t want to troubleshoot. They want to talk to someone immediately.

Design decision:
I surfaced call and chat options upfront, rather than burying them deeper in the flow.

Tradeoff:

  • Could increase support costs if overused

  • But aligns with real user behavior and reduces frustration

Intent:
Acknowledge that self-service is not always the right solution and design for that honestly.

Version A prototype screens

Evaluated Stimuli

What Makes Option A Different

Compared to the other concepts, Option A was intentionally:

  • More opinionated in structure

  • More aligned to user mental models

  • Less focused on surfacing everything upfront

It prioritized:

  • Clarity over completeness

  • Guidance over exploration

  • Confidence over flexibility

Option B — Expanded Categories + Curated Content

Option B explored a more content-rich approach.

  • Introduced more granular service categories (Fios, LTE Home, etc.)

  • Surfaced popular topics and tools upfront

  • Provided multiple entry points at once

Design philosophy:
Give users more visibility and faster access to common actions

Tradeoff:
Higher cognitive load due to increased number of choices

Option C — Chatbot-Driven Experience

Option C removed navigation entirely and relied on conversational input.

  • Users describe their issue in their own words

  • System returns personalized support

Design philosophy:
Maximize flexibility and personalization

Tradeoff:

  • Requires users to articulate their issue clearly

  • Relies heavily on trust in system accuracy

My Contribution as a Co-Designer

While each concept explored different directions, my role in Option A was to ensure that:

  • The design was directly grounded in user behavior

  • Every interaction decision could be traced back to research

  • The experience reduced friction rather than introducing new complexity

I intentionally designed Option A to reflect what users already told us they wanted, rather than introducing assumptions about what might work.

Key Takeaways

1. A clear starting point matters more than having more options

Across all participants, the biggest point of friction was not missing features, but not knowing where to begin.

Users consistently preferred experiences that:

  • Reduced the number of initial choices

  • Clearly signaled the first step

  • Helped them feel confident they were on the right path

This is why Option A was preferred by the majority: it removed ambiguity and made the first decision easy.

2. Users naturally think in a step-by-step flow

Users do not approach support as a list of tools or features. Instead, they follow a mental model of:
→ What service do I have?
→ What’s wrong?
→ How do I fix it?

Designs that matched this progressive narrowing behavior were easier to use and required less effort. This insight directly informed the structure of Option A and validated the step-based interaction model.

3. More content increases cognitive load, even when helpful

While participants appreciated the additional content and shortcuts in Option B, many described it as overwhelming.

This highlights an important tradeoff:

  • More options can improve discoverability

  • But too many options at once reduce clarity

Design needs to balance completeness with simplicity, especially at the entry point.

4. Chat alone is not enough without guidance

Option C revealed that users are hesitant to rely fully on chatbot-driven experiences.

Key concerns included:

  • Uncertainty about whether the system would understand them

  • Effort required to articulate their issue

  • Lack of visible options to guide them

Even users open to chat preferred having:

  • Suggested prompts

  • Clickable starting points

  • Popular solutions surfaced upfront

5. Users want flexibility, but not at the cost of clarity

Across all concepts, users valued having multiple ways to get help:

  • Guided navigation

  • Search

  • Direct contact options

However, flexibility only worked when anchored by a clear primary path. Without that, additional options became noise rather than support.

6. Designing for both intent and discovery requires structure, not more features

The core challenge of this project was supporting two very different user mindsets.

The solution was not adding more entry points, but:

  • Structuring the experience to guide discovery users

  • Preserving fast paths (search, shortcuts) for intent users

Option A performed best because it balanced both within a single, coherent flow.

Impact & Conclusion

When the three concepts were evaluated in the qualitative A/B/C test, 8 out of 12 participants preferred Option A, the concept I co-designed.

Participants consistently described it as:

  • Clear

  • Organized

  • Straightforward

Many noted that the simplified structure made it easier to understand where to start and gave them confidence they would reach the right solution.

Design Influence

Although the business ultimately moved forward with a version closest to Option B for broader strategic reasons, core elements from Option A were carried into the final experience, including:

  • A more simplified navigation structure

  • Clearer entry points into support pathways

  • Increased visibility of call and chat options

This ensured that user-preferred patterns were still embedded in the shipped product.

Business Impact

Following the redesign:

  • Search usage decreased by 26%

  • Click-through rates on service categories increased by 29%

These improvements indicate that users were better able to navigate directly to the right solution without relying on fallback behaviors like search.

Broader Strategic Impact

Feedback from users who were open to chatbot experiences, but still wanted clickable options, sparked a broader reevaluation of chatbot design across support portfolios.

I led a literature review of past chatbot research, which reinforced a consistent pattern:
→ Users prefer guided entry points and selectable options before engaging in conversational flows

This work influenced not just the unauthenticated mobile experience, but also authenticated and desktop support experiences, helping establish a more consistent approach to support design.

Final Reflection

This project reinforced the importance of grounding design decisions in real user behavior.

Option A was intentionally designed to reflect what users already told us they wanted, not what we assumed might work. Seeing that approach become the most preferred solution, and influence the final product, validated the value of tightly integrating research and design.
