An IRB-approved study combining computer vision, behavioral performance data, and machine learning to model user-product fit for gaming peripherals.
This project explores how user preference, physical constraints, and performance metrics can be combined to design a human-centered recommendation system for gaming mice. Rather than optimizing for raw performance alone, the goal was to understand how comfort, hand dimensions, and subjective preference influence user satisfaction — and to translate those insights into a system capable of recommending gaming mice to new users.
Users face a genuinely difficult decision:
From a design perspective, the challenge was to capture meaningful user signals beyond raw performance, respect subjective comfort and preference, and build a recommendation system grounded in human experience — not just optimization.
How can comfort, physical fit, and subjective preference be combined into a recommendation system that helps users find the right gaming mouse — before they buy it?
I was responsible for the project end-to-end:
The study involved in-person interviews with participants, controlled experimental testing of 14 gaming mice, surveys measuring comfort, willingness to use, and subjective preference, and performance data collected through standardized Aim Lab tasks.
Sixteen participants each tested 7 of the 14 mice. The mice were clustered by size and weight, and assignments were balanced so that every mouse received comparable coverage across the experiment.
Three key insights shaped the system design:
These insights reinforced the importance of designing a system that treats users as experiencing bodies, not abstract data points.
The experiment used an incomplete block structure in which each participant tested 7 of the 14 mice. Each participant had their hand scanned, testing order was controlled to reduce fatigue and order effects, and participants completed structured tasks and surveys for every mouse they tested. This ensured that recommendations were grounded in comparable user experiences.
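The study's actual assignment also balanced across the size/weight clusters described above; as a minimal sketch of just the coverage-balancing idea (not the study's exact procedure), a greedy assignment might look like this:

```python
# Hypothetical sketch: 16 participants each test 7 of 14 mice, chosen
# greedily so that every mouse accumulates roughly equal coverage
# (16 * 7 / 14 = 8 tests per mouse). This ignores the size/weight
# clustering and randomized ordering used in the real study.
N_PARTICIPANTS, N_MICE, PER_PARTICIPANT = 16, 14, 7

counts = {m: 0 for m in range(N_MICE)}  # times each mouse has been assigned
assignments = []                        # one list of 7 mouse indices per participant
for p in range(N_PARTICIPANTS):
    # pick the 7 least-tested mice so far (ties broken by index)
    chosen = sorted(range(N_MICE), key=lambda m: (counts[m], m))[:PER_PARTICIPANT]
    for m in chosen:
        counts[m] += 1
    assignments.append(chosen)

# with these numbers, every mouse ends up tested exactly 8 times
assert all(c == N_PARTICIPANTS * PER_PARTICIPANT // N_MICE for c in counts.values())
```

A real incomplete block design would additionally randomize within and across blocks, but the balancing constraint is the core of what makes the resulting data comparable.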
Three design goals guided every decision:
The system represents each user by normalized hand measurements and grip style, identifies similar users by proximity in that feature space, aggregates preference signals from those neighbors, and produces a ranked list of recommended mice. The design prioritized simplicity and interpretability over model complexity.
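The representation and similarity steps can be sketched as follows; the field names, measurement bounds, and grip categories here are illustrative assumptions, not the study's actual schema:

```python
import math

# Illustrative sketch: each user becomes a vector of min-max-normalized
# hand measurements plus a one-hot grip style, and similarity is plain
# Euclidean distance in that space.
def to_vector(user, bounds, grips=("palm", "claw", "fingertip")):
    """Map a user dict to a normalized feature vector (names are assumed)."""
    hand = [
        (user["hand_length"] - bounds["length"][0]) / (bounds["length"][1] - bounds["length"][0]),
        (user["hand_width"] - bounds["width"][0]) / (bounds["width"][1] - bounds["width"][0]),
    ]
    grip = [1.0 if user["grip"] == g else 0.0 for g in grips]
    return hand + grip

def nearest_users(new_user, pool, bounds, k=3):
    """Return the k study participants closest to new_user in feature space."""
    target = to_vector(new_user, bounds)
    return sorted(pool, key=lambda u: math.dist(target, to_vector(u, bounds)))[:k]
```

Normalization matters here because hand length (in millimeters) would otherwise dominate the one-hot grip features; keeping the distance interpretable is consistent with the simplicity-over-complexity goal.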
The final system takes a new user's hand measurements and grip style, finds similar users based on those characteristics, weighs subjective preference and comfort to generate recommendations, and outputs a ranked list of gaming mice. Rather than claiming to find a "perfect" mouse, the system supports better-informed decision making by narrowing choices based on human-centered signals.
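The aggregation step, weighing comfort and subjective preference from similar users into a ranked list, might look like the sketch below; the rating keys and the equal weighting are assumptions for illustration:

```python
from collections import defaultdict

# Hypothetical aggregation: average the comfort and preference ratings
# that similar users gave each mouse, then rank by that blended score.
def rank_mice(similar_users, w_comfort=0.5, w_pref=0.5):
    """Return mouse names ranked best-first by weighted average rating."""
    totals, counts = defaultdict(float), defaultdict(int)
    for user in similar_users:
        for mouse, r in user["ratings"].items():
            totals[mouse] += w_comfort * r["comfort"] + w_pref * r["preference"]
            counts[mouse] += 1
    scores = {m: totals[m] / counts[m] for m in totals}
    return sorted(scores, key=scores.get, reverse=True)
```

Because the output is a ranking rather than a single pick, the system narrows the field without overstating its certainty, matching the stated goal of supporting decisions rather than claiming a "perfect" mouse.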
Overall, the project succeeded as a proof of concept: a pipeline in which users have their hand scanned, test an array of mice, and report how each felt, so that the resulting data can meaningfully inform the mouse selection process.
The study did face real limitations. The sample of participants and mice was small relative to the broader population of gamers, and there are many variables that 16 participants and 14 devices cannot fully account for. Testing was also limited to aiming tasks, which capture one dimension of use but miss much of how people actually interact with a mouse day-to-day.
If this project were to continue, the next steps would be to expand the selection of mice and the number of participants, and to allow participants to use the mice for longer periods across multiple different environments — not just controlled aiming tasks.