Unauthenticated Support Landing Page: Qualitative A/B/C Test
During my time as the Support portfolio lead at Verizon, one of the projects I’m most proud of was a multi-phase research initiative to guide the redesign of the Support landing page on the Verizon website.
​
Prior to the A/B/C test in this case study, I started with usability testing on the then-current page to understand where people were getting stuck. The findings were clear: once users landed on the page, many were unsure where to begin and struggled to find a clear starting point. To reimagine the experience, I then led a co-design study to explore what an ideal support journey might look like. Across sessions, participants consistently organized support content by Verizon service type, such as Mobile or Home/Internet. That insight gave us a strong direction for how to structure the page.
​
With that foundation in place, the design team created three new versions of the Support page. I then conducted a qualitative A/B/C test to understand how users experienced and interpreted each design. This phase focused on uncovering user preferences, mental models, and expectations, giving us critical insight into not just what people preferred but why.
​
Those findings directly informed design decisions and guided the next round of research, a quantitative tree test that measured task success. One year after the new Support page launched, support page search usage had dropped by 26% and click-through rates on the service category menu had increased by 29%.
​
The remainder of this case study focuses on the qualitative A/B/C testing phase, which ultimately shaped the direction of the redesign and delivered measurable results.
Support Portfolio at Verizon
The Support Portfolio plays a critical role in shaping the digital support experience for millions of Verizon customers. The work done in this portfolio helps ensure that support tools are intuitive, effective, and aligned with business goals.
​
The Support Portfolio covers a wide range of experiences that help users get the support they need when they have questions or are trying to solve a problem with their network, account, or device. Some experiences included in the portfolio:
- Self-Service Troubleshooting Tool – Empowering users to independently resolve common issues without needing to call or chat with an agent.
- Wayfinding to Support Tools – Streamlining navigation and discoverability of support features across the app and web experiences.
- General Support Pages – Designing and refining content-rich help pages that provide clear, actionable answers to user questions.
- Support Ticket Management – Helping users view, track, and manage their support requests with transparency and ease.
- Device Diagnostics – Surfacing diagnostic tools that help users identify and address device-specific problems.
- Network Status and Usage – Providing users with real-time visibility into network health, outages, and data usage.
- Generative AI – Exploring how GenAI can enhance customer support by delivering faster, smarter, and more personalized assistance.
Project Overview
Business Situation/Problem
The Support landing page two years ago did not adequately address the needs of users with different levels of issue clarity. Some users arrive with a specific solution in mind (intent-focused), while others need help identifying the problem or exploring possible resolutions (discovery-focused). At Verizon, we wanted to explore new Support landing page approaches that would effectively guide both types of users to the right support path, while minimizing friction and cognitive effort.
​
Our Hypothesis:
If users are presented with structured, clickable options, whether simplified (Version A) or content-rich (Version B), they will navigate to solutions more confidently and with less cognitive strain than when using a chatbot-only approach (Version C). We expect intent-focused users to prefer quick, direct access, while discovery-focused users may value additional context and curated options.
​
Research Objective
- Understand which approach participants find most helpful for navigating to the right solution, and whether that changes when the customer is in one of the following mindsets:
  - Intent-focused: The customer knows exactly what their problem is and has a specific solution in mind (e.g., step-by-step troubleshooter, FAQ/article).
  - Discovery-focused: The customer may not fully understand their issue, or may understand it but is open to options for a solution (e.g., the customer's bill was different than expected, but they are not sure why or what they should do).
- Understand sentiment toward our proposed approaches (A vs. B vs. C).
- Understand what participants expect to see on a support landing page (e.g., categories, links to specific solutions, search, Contact Us info).
​
Key Research Questions
- General feedback on the proposed approaches (A vs. B vs. C)
  - Initial reactions to the page as a whole?
  - Reactions to the individual page sections?
  - Any changes they would make to each approach?
- What do participants expect to see on a support landing page?
  - Do these expectations change on an unauthenticated vs. authenticated support landing page?
  - When/why would a user be inclined to sign in on a support landing page?
- Option A (static categories, minimal choices):
  - Do customers find the limited, focused choice helpful?
  - Does it help them break down and articulate their issue, or does it limit how they feel they can express it?
  - Does it simplify their decision-making and minimize cognitive load?
  - Do the choices/categories presented meet user expectations?
  - Do they have suggestions for alternate categorizations or options they would like to see?
- Option B (static categories + curated content, heavy choice):
  - Do participants find this approach overwhelming?
  - Do customers find the curated solutions helpful? If not, why?
  - Do the solutions and choices/categories meet user expectations?
  - Do they have suggestions for alternate categorizations or options they would like to see?
  - Which option would a participant engage with for their scenario, why would they choose it, and what would they expect to see or do next after choosing it?
- Option C (AI-driven, user input required):
  - What are participants' initial reactions to this approach?
  - Is it what participants expected to see on a support page? If not, is it different in a good or bad way?
  - What would users do next if presented with this approach?
  - How do participants feel about having to input and articulate their issue?
  - What is participants' level of trust that this approach can get them to the right solution to their problem?
​
Methodology
Remote A/B/C usability study
1-on-1 individual sessions
60 minutes each
Mobile device modality
​
Participants
12 Verizon customers
- 6 given an intent-focused scenario
- 6 given a discovery-focused scenario
Open to all account members
Mix of ages, incomes, ethnicities, and genders
​
Tools
UserTesting - Used for participant recruitment and session moderation.
Figma - The prototypes were created in Figma.
Google Office Suite - Used to write test plans and create findings presentations.
Miro - Used for notetaking during sessions and for tracking insights and themes.
Evaluated Stimuli

What's the difference between versions?
Version A
This version presents two starting points (Mobile and Home services), aligning with Verizon’s primary service categories. It also includes quick access to alternative support channels like chat, call, community, and network status. Inspired by prior user feedback, this layout supports a top-down support journey: from broad service type to specific device and issue. A search bar is also included for direct input.
​
Version B
Expanding on Version A, this version offers four starting points: Mobile, Fios, 5G Home, and LTE Home. It also includes a section at the top featuring popular topics and tools, based on previous user feedback requesting quick access to commonly used resources. It reflects user feedback preferring to begin support by service type, with additional granularity for internet services. Like Version A, it retains alternative support channels and a search bar for flexibility.
​
Version C
This chatbot-based version removes selectable options entirely, relying solely on user input. Created in support of Verizon’s personalization initiatives, it aims to deliver tailored support by allowing users to describe their issues in their own words—enabling more targeted, relevant solutions.
Timeline
Week 1
- Met with stakeholders to write a research brief outlining what we wanted to achieve in the study.
- Aligned with designers on which prototypes we would test.
- Designers began creating prototypes.
Week 2
- While designers created the prototypes, I began writing the test plan for the study.
- Once the prototypes were finalized, I adjusted the test plan to match them.
- Sent the test plan to stakeholders for review, allotting 3 days for the team to review it and leave feedback.
- Started recruitment for the study on UserTesting.
Week 3
- Made final changes to the test plan based on stakeholder feedback on Monday.
- Began moderated testing sessions on Tuesday and continued moderating throughout the week.
Week 4
- Synthesized findings and created the findings presentation.
- Held a read-out meeting with stakeholders to present the findings.
Key Takeaways
- Six of the 12 users preferred Option A, 4 preferred Option B, and 2 preferred Option C.
  - The users who preferred Option A liked that it has everything Option B has but simplified, while still providing more actionable items than Option C.
- All users felt Option A has a low cognitive load, and they expressed high trust that Option A would help them find the right solution to their problem.
  - On the other hand, some users felt the number of options within Option B created a higher cognitive load, and that Option C's cognitive load would only stay low so long as the chatbot understood the question being asked.
- Overall, users want clickable, actionable items like those in Options A and B.
  - Users also appreciated Option B's curated content at the top of the page. Even the few who preferred Option C wanted clickable options integrated within the chat, such as popular solutions or an option to log in.
Impact
- User preference for simple, clickable options led business leaders to move forward with Version B, which remains in use under Verizon’s 2025 rebrand. Feedback from users who liked Version C’s chatbot but wanted clickable options sparked a broader discussion and a reevaluation of the chatbot experience across portfolios.
- I spearheaded a literature review of past chatbot research, which confirmed a consistent user desire for clickable options at the start of a chat to reduce friction and avoid miscommunication.
- This study influenced not only the mobile unauthenticated Support page but also the authenticated and desktop experiences, setting a new standard for support experiences. Comparing click rates one year after the redesign launched against the original support page, search usage was reduced by 26%, while click-through rates on service categories increased by 29%.