
Provoid: Design Thinking for a Cognitive Sports App

→ Introduction

After building the Provoid Sports App — a cognitive training platform for elite young athletes — the next critical challenge was ensuring the UI/UX actually worked for real users under real conditions. The app had been built at speed; now it needed to be validated.

I led a full Design Thinking process to identify, prioritize, and solve the most friction-heavy UX problems in the app. Using a structured 5-phase approach — Empathize, Define, Ideate, Prototype, and Test — I took the product from raw beta feedback to a validated, iterated prototype deployed to real devices via TestFlight and Google Play Beta.

✕ Challenge

The Provoid Sports App was a technically solid product, but early beta testing revealed significant friction points that threatened user engagement and training consistency.

The core UX problems identified:

  • Unclear Instructions: Multiple users across iOS and Android reported that game tutorials were too long, text-heavy, and hard to find before gameplay started.
  • Poor In-Game Feedback: Users did not receive immediate, clear feedback on whether their answers were correct — breaking the learning loop that cognitive training depends on.
  • Unintuitive Navigation: Key UI elements (buttons, modals, scroll indicators) were not discoverable without exploration, causing confusion and frustration.
  • Difficulty Balance Issues: Some games felt too long and exhausting, while others felt too easy, reducing the motivation to keep playing.
  • UI Inconsistencies: Overlapping buttons, misaligned tap targets, and layout shifts on certain screen sizes undermined the perceived quality of the product.

✓ Solution

I applied a structured Design Thinking process, working through each phase with real beta testers and real device builds — not simulated environments.

🔍 Empathize: Beta Testing & User Interviews

I distributed the prototype to 6 beta testers via TestFlight (iOS) and Google Play Beta (Android). Testers were given minimal direction — "explore the tools, report bugs and inconsistencies, tell us how each game felt" — to simulate authentic first-use conditions.

Feedback was collected via WhatsApp, email, and direct TestFlight annotations, covering users across iOS and Android devices including iPhone, iPad Air, and various Android handsets.

🗺️ Define: Affinity Map & Empathy Map

I synthesized all raw feedback into a structured Affinity Map across 8 categories: Instructions/Tutorials, Intuition/Usability, Duration/Difficulty, In-Game Feedback, UI/Visual Design, Performance/Tech, Audio, and Leaderboard/Motivation.

From this, I built an Empathy Map to understand what users were saying, thinking, feeling, and doing — as well as their key pain points. The dominant insight: users felt frustrated and uncertain, not because the cognitive games were hard, but because the interface gave them no guidance or feedback during gameplay.

💡 Ideate: HMW Statements & Brainstorming

Using "How Might We" (HMW) statements for each pain point category, I ran a focused ideation session to generate solutions. Selected ideas included:

  • Visually-driven tutorials with images instead of walls of text
  • Separate instruction screens before gameplay (no distractions during play)
  • Green flash feedback for correct answers
  • Larger, clearly defined action buttons
  • Sequential screen flow for the GameResultsModal with a visible scroll indicator

🎨 Prototype: Lo-Fi to Hi-Fi

I created low-fidelity wireframes in Figma and Miro, focusing on two high-priority surfaces: the Go/No-Go game and the GameResultsModal. These wireframes were used to align on layout and interaction flow before writing a single line of code.

The high-fidelity prototype was built directly in React Native — the production codebase — and included: a dedicated instruction screen with an image and two audio preview buttons, a restructured game layout with stats on top and a large centered action button, real-time green flash feedback on correct responses, and a scroll-cue button in the results modal.
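The green-flash feedback described above boils down to a small piece of trial logic. The sketch below is illustrative only, not the actual Provoid codebase: the names (`evaluateTrial`, `Stimulus`) and the Go/No-Go scoring rule are assumptions. In the real React Native screen, a result with `flash: "green"` would briefly raise the opacity of a full-screen overlay.

```typescript
// Hypothetical Go/No-Go trial feedback logic (illustrative names,
// not the actual Provoid implementation).
type Stimulus = "go" | "no-go";
type Flash = "green" | "none";

interface TrialResult {
  correct: boolean;
  flash: Flash;
}

// A "go" stimulus should be tapped; a "no-go" stimulus should be ignored.
// Correct responses trigger the subtle full-screen green flash; incorrect
// ones deliberately show nothing, so players are informed without being
// punished by harsh negative feedback.
function evaluateTrial(stimulus: Stimulus, tapped: boolean): TrialResult {
  const correct = (stimulus === "go") === tapped;
  return { correct, flash: correct ? "green" : "none" };
}
```

Keeping the decision logic pure like this separates it from the animation layer, which makes the feedback rule easy to unit-test on its own.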

The Hi-Fi prototype was published to Apple TestFlight so the usability test participant could experience it on their own physical device.

🧪 Test: Structured Usability Testing

I conducted a formal usability test with a participant (Youseff Ala), following a structured protocol developed after identifying errors in a previous informal session. The 20-minute in-person test included free app exploration and a recorded interview transcribed via Happyscribe.

To ensure test quality, I created a UX Test Moderation Error Checklist — covering 11 common moderation mistakes such as intervening too early, asking leading questions, and failing to maintain neutrality — all of which I tracked and avoided during the session.

★ Business Results

The usability test validated that the Design Thinking cycle successfully resolved the core UX friction points identified in beta testing.

Validation Outcomes

The structured test confirmed improvements across every major problem category identified in the Affinity Map.

  • Instructions: User confirmed the separated, visual instruction screen made the game "really understandable" from the second attempt onward.
  • In-Game Feedback: The green flash was described as "clear" and non-distracting — confirming the design decision to use a subtle full-screen color overlay.
  • Button Size & Navigation: Buttons were rated as correct size with no friction; the GameResultsModal scroll behavior was intuitive without needing the scroll indicator.

UX Quality Signal

The overall feedback from the usability test established a strong baseline for the product's UI quality before the SC Victoria pilot.

  • Visual Design Validated: Colors, contrast, and layout described as "simple, very nice" and "user-friendly" — appropriate for the target demographic.
  • Difficulty Curve Confirmed: Progressive difficulty described as "good difficulty. I think it's enough" — validating the adaptive challenge design.
  • One Open Bug Identified: A timing issue where the action button briefly disappeared between game rounds was captured and logged for resolution before the pilot launch.

↻ Project Timeline

The full Design Thinking cycle was completed in a focused sprint running parallel to the app's final pre-launch phase.

Phase 1 - Empathize

Distributed prototype to 6 beta testers via TestFlight and Google Play Beta, collecting raw feedback across iOS and Android devices.

Phase 2 - Define

Synthesized tester feedback into an Affinity Map and Empathy Map across 8 UX categories, identifying core pain points.

Phase 3 - Ideate

Generated HMW statements for each pain point category and selected the highest-impact solutions through structured brainstorming.

Phase 4 - Prototype

Built lo-fi wireframes in Figma and Miro, then implemented a hi-fi prototype directly in React Native and shipped to TestFlight.

Phase 5 - Test

Conducted a structured 20-minute usability test with a UX moderation error checklist, recorded and transcribed, with results validating all major improvements.

Ian Paniagua

Ready to transform your product?

Let's discuss how we can achieve similar results for your business. Contact me at paniagua.ian.de@gmail.com or book a call below.