
PlayTest AI - Feb 2024 (8 Weeks)
From Useful to Usable: Increased User Engagement of a Game Testing Dashboard from 15% to 70%.

Introduction
Company
PlayTest AI
My Role
Founding Product Designer
Timeline
8 Weeks
Impact TL;DR
Increased user engagement by 55%
Increased trial-to-paid conversion by 60% in 8 weeks
Increased feature adoption by 41%
Responsibilities
Founding Product Designer
Research, Prototyping & Testing
UX/UI & Visual Design
Overview
PlayTest AI is an AI-powered game testing platform that automates QA for games using AI bots. It replaces manual testing with real-time gameplay analysis, bug detection, and actionable insights—helping teams ship faster with less overhead.
Challenge
When I joined, the product’s dashboard was designed by the CTO and suffered from poor usability, low engagement, and a cluttered, developer-centric UI.
Key issues:
80–90% user drop-off within minutes (Hotjar data)
Users scrolled aimlessly or got stuck
Only 1–2 users engaged meaningfully (>20 minutes)
QA teams were overwhelmed, and the dashboard wasn’t helping
The goal:
Transform a confusing, developer-built dashboard into a user-friendly, production-ready platform tailored for overworked QA teams.
Dashboard Redesign
B2B Enterprise Webapp
Conversational AI Interface
Problem
While the AI-driven QA testing appealed to potential clients, many struggled to use the platform without constant hand-holding. To scale effectively and drive adoption, PlayTest AI needed to address three critical challenges:
Introduce scalability features
Add features like workspaces, build stats, and project management to support growing QA teams.
Redesign the UX/UI
Make the platform intuitive and visually cohesive to reduce friction for new users.
Establish Product Moat
Create a seamless, polished interface that stands out in a crowded B2B SaaS market.

Process
Research

Heuristic Analysis
Competitive Analysis
Interview
Define + Ideate

Persona
Sitemap
User flows
Design

Web App
Design System
Testing + Iteration

Sketches
Low-Fidelity Wireframes
High-Fidelity Wireframes

How might we design a product that boosts user engagement and grows the customer base while enhancing the overall user experience?

Solution
Redesigning for Clarity, Usability & Scale 🛠
Leveraging user behavior insights and internal research, I led a full redesign of PlayTest AI—transforming it into a truly self-serve and intuitive platform for modern QA teams.
The New Experience Enables Users To:
✅ Track test case results (pass/fail) at a glance
✅ Create and manage test scenarios—no technical skills needed, powered by AI
✅ Access detailed bug insights to support faster, smarter development decisions (Phase 2)

Outcomes
Increased User Engagement Rate by 55%
Increased the share of 30min+ user sessions from 15% to 70% by resolving usability issues, significantly reducing support tickets.
Increased Trial to Paid User Conversion by 60%
Previously, users struggled with bug detection and test case resolution, and the product fell short of its promises. After the redesign, our client base grew from 3 pre-MVP to 11.
Increased 'Generate Scenario' feature adoption by 41%
Redesigning from manual to AI-driven scenario generation increased user interactions from 30 to 56 clicks, enabling 41% more scenarios generated and tested.
Research
I started by talking to stakeholders to understand the product, how it is used, and the business goals. I then conducted a heuristic analysis of the old product and found it extremely confusing and hard to navigate.

Interviews
I wasn't the only one: after gathering feedback from our clients, I learned they shared similar experiences. I then created the personas below, along with their pain points.
"Do I have to write all the test cases myself?"
"Why does this happen?"
"Why are there so many CTAs for the same function?"
"What should I do next?"
"Can I not upload other game builds?"

Market Research
Before this project, I had very little knowledge of the gaming industry. Even after learning our customers' pain points, I wasn't sure how to solve them. To understand what I was getting into and how I could improve their experience, I read a number of articles on game analytics and user-behavior analytics tools. Through this quick research, I learned which features are common to these tools and how PlayTest AI could potentially be improved.

Define + Ideate
Designing Under Pressure: 8 Weeks to Ship 🧭
After research, I was tasked with delivering a high-fidelity redesign of PlayTest AI—including dev-ready specs—within just 8 weeks. Prioritization was critical.
To stay aligned and move fast, I held daily syncs with the CEO, CTO, Head of Engineering, and QA testers, enabling rapid feedback and real-time decision-making.
Backed by research insights, I proposed features that balanced usability, impact, and feasibility.
🎯 Simplifying information architecture to improve navigation and reduce cognitive load
🎯 Defining a lean MVP that delivered maximum value within dev constraints
🎯 Introducing a unique AI feature to create a competitive moat


Before & After

🤖 AI Scenario Generation: Smarter Testing, Less Effort
I designed the AI Scenario Generation feature to help QA testers automate and accelerate their workflow.
Once a build is uploaded, AI bots simulate player behavior, generating dynamic test cases and identifying bugs—no manual scripting required.
Why It Mattered:
Boosted Efficiency
Replaced manual test writing with AI-powered automation.
→ Result: 41% increase in generated/tested scenarios, with user interactions jumping from 30 to 56 clicks.
Created a Competitive Edge
No other platform offered this.
→ Helped us stand out and onboard new clients—strengthening our product moat.

Results

Reflection
What I learned from this experience...
Overcoming Limited User Feedback Opportunities
With only three clients and QA testers tied up with their primary roles, gathering consistent, actionable insights was a challenge.
How I navigated this constraint:
1. Leveraged alternative data sources.
2. Explored non-traditional research methods (e.g., ChatGPT, colleagues, designer friends).
3. Tapped into the UX community for heuristic evaluations.
By adapting my approach, I ensured rigorous design research and validation, even with limited direct user feedback.

Next Project
Notice of Proposal - Charlottesville City Planning Dept.
#iOS Mobile App for Residents & Development Authority

Go to Home