Executive Summary

Problem

WTFast suffered from an unsustainably low trial-to-paid conversion rate. Most new users churned within the first 24 hours, before experiencing the product's value.

My Role

I led the research and design strategy to diagnose and fix the root cause of churn, from data analysis through wireframes to implementation oversight.

Solution

I redesigned the first-time user experience (FTUE) to guide users to an immediate "win" and collaborated with engineering to fix a critical systemic bug that broke that first experience.

Validated Impact

The result was a **400% increase** in new user activation and a **tripling (200% increase)** of the conversion rate to paid subscription.


1. Situation: The Costly Mystery of Abandonment

When I took on this project, WTFast was facing a paradox. Surveys of existing users indicated that the main concern was ping optimization. However, product data showed an alarming churn rate: the vast majority of new users acquired through paid marketing were leaving without even completing a single gaming session.

These users were a blind spot. They didn’t stick around long enough to complain about ping; they just disappeared. This led me to a key hypothesis: the company was optimizing for those who stayed, while the real revenue “hemorrhage” was at the first contact with the product. The problem wasn’t long-term retention; it was initial activation.


2. Task: Diagnose the Root Cause and Design a High-Impact Solution

I proposed a design initiative to the Product Manager focused on a single metric: increasing the new user activation rate. My mandate was clear:

  • Investigate the actual behavior of users who churned to discover the “why” behind the drop-off.
  • Define a design strategy that would eliminate the fundamental friction in the onboarding (FTUE).
  • Design and prototype the solution (wireframes and flows), and collaborate closely with engineering for its implementation and with an external agency for the final UI.

My approach was simple: we needed to stop asking those who stayed and start observing those who left.


3. Action: From Observation to Solution

My process was divided into two critical phases:

Part A: Research and Discovery - The “Smoking Gun”

Using an internal session analysis tool, I examined over 2000 recordings of users who churned within the first 24 hours. I discovered two findings that changed everything:

  1. Behavioral Finding: A clear failure pattern emerged. 70% of new users ignored the automatic flow and jumped straight to manual configuration. There, they faced an overwhelming list of 200+ servers and, after less than 45 seconds of visible confusion, closed the app.
  2. Critical Technical Finding: My own experience as a gamer and user of the application made me suspect the ping data was incorrect. When I escalated this to the CTO, we discovered a systemic bug: the app ran such an aggressive ping test that ISPs were blocking it, so the app displayed erroneous data. Worse, the system then actively routed users through suboptimal connections or, at best, did nothing. The product wasn't just confusing; it was actively sabotaging the experience.

Part B: Solution Design - Guided Towards Success

With the root cause identified, I designed a two-pronged solution:

  1. Bug Fix (Collaboration with Engineering): The corrected ping test introduced a longer wait before results appeared, so I designed a loading screen that communicated value during it: "We are analyzing our global network…". This turned the wait from friction into a perception of power and reliability.
  2. FTUE Redesign: I replaced the open dashboard with a linear wizard focused on one principle: “the quickest possible win.”
    • Reduced Cognitive Load: Instead of a list of 200+ servers, the user only selected their game region.
    • Created a “Magic” Moment: The wizard automatically detected installed games, eliminating manual search.
    • Forced Success: The flow culminated in a single button: “Optimize and Play.” Manual configuration was hidden until after this first successful session.

![A comparative image would go here: the old, confusing dashboard full of options, next to the new, clean, linear wizard flow. Arrows and brief annotations would point out the resolved pain points.]


4. Result: Activation Skyrocketed, Conversion Tripled

The solution was implemented, and we monitored the results via A/B testing, which confirmed a large, direct impact.

  • The new user activation rate increased from 10% to 50%. This represents a 400% increase in the number of people who experienced the product as intended.
  • As a direct result, the trial-to-paid conversion rate tripled (a 200% increase), directly impacting the company’s main business metric.
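
Because "tripled" vs. "200% increase" (and "fivefold" vs. "400% increase") are easy to confuse, here is the arithmetic behind the headline numbers as a minimal sketch; the function name is illustrative, not part of any analysis tooling:

```python
def relative_increase(before, after):
    """Percent increase from a baseline value to a new value."""
    return (after - before) / before * 100

# Activation rate: 10% of new users -> 50% of new users.
# The rate is 5x the baseline, which is a 400% increase.
print(relative_increase(10, 50))    # 400.0

# Conversion rate: tripled (index 100 -> 300), i.e. a 200% increase.
print(relative_increase(100, 300))  # 200.0
```

In short, an N-fold multiplier corresponds to an (N − 1) × 100% increase, which is why "tripled" and "200% increase" describe the same result.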

![A powerful image here would be a simple bar chart showing the “Before” and “After” of the Activation Rate, and another for the Conversion Rate (using an index of 100 for “Before” and 300 for “After” to avoid revealing real numbers).]


5. Key Learnings

  • Behavior > Opinion: This project was a masterclass in how behavioral data can reveal problems that surveys will never show. Users said “the ping isn’t good” when the real problem was “I don’t understand how to start.”
  • Design is a Technical Partner: Collaboration with engineering wasn’t a final step; it was the key to discovery. A product designer must have the curiosity to ask “how does this work?” to find problems beyond the pixel layer.