Case Study № 08 · ACM CHI 2024

OPM Viz: Bridging Gameplay Data and Learning Reflection

Transforming educational game data into actionable learning insights through visualization systems that enable student reflection and peer comparison.

Role: Lead UX Researcher
Duration: 21 months
Published: ACM CHI 2024

21-Month Study · 25 Stakeholders Interviewed · Published at CHI 2024 · Open Player Model
Quick Read: the essentials in 60 seconds
01 · The Problem

Educational games capture rich data, but students never see it—leaving valuable learning opportunities unrealized.

02 · The Approach

Design-Based Research with instructor focus groups, student playtesting, and iterative visualization design.

03 · The Solution

OPM Viz enables peer comparison, metrics viewing, and synchronized replay for reflective learning.

04 · The Impact

Students identified inefficiencies, improved self-assessment, and developed curiosity about alternative approaches.

Educational games generate massive amounts of gameplay data—every decision, every strategy, every moment of struggle or breakthrough. But when students finish a level, where does that data go? Usually, nowhere. The game records it, but students never see it again.

This case study documents the development of OPM Viz, a visualization system for the 'Parallel' educational game that transforms gameplay data into learning opportunities. Rather than letting valuable performance data disappear into the void, OPM Viz makes it accessible, comparable, and actionable.

Students don't know how their approach compares to their peers', where they could improve, or what strategies might be more efficient. Through a 21-month Design-Based Research project, we set out to turn every play session into a reflective learning experience.

The Problem

Data Inaccessibility

Games like 'Parallel' record detailed information about student strategies and struggles—but this data lives in logs only researchers see. Students can't access their own performance data.

Missed Learning Opportunity

Research shows reflection and peer comparison enhance learning. Educational games generate the data needed, but don't provide tools to visualize it.

Surface-Level Dashboards

Existing dashboards show scores or completion rates, but don't reveal the "why." Students know they took 50 steps, but not which were inefficient.

"I finished it, but I have no idea if there's a better way. Am I using ten steps when I could use five?" — Student participant during playtesting

Research Approach

As Lead UX Researcher and Designer, I conducted a 21-month Design-Based Research project to create and validate OPM Viz:

01 · Instructor Focus Groups

Worked with 2 parallel programming instructors analyzing student gameplay to understand how experts identify learning moments.

02 · Student Playtesting

Observed 10 students playing 'Parallel' and conducted semi-structured interviews to identify core needs.

03 · Thematic Analysis

Coded qualitative data to understand patterns in student struggles and desires for improvement.

04 · Design & Build

Built OPM Viz enabling peer comparison, metrics viewing, and synchronized replay.

05 · Integration

Seamlessly connected the visualization with the 'Parallel' game's existing platform.

06 · Evaluation

Conducted a usability study with 8 students using a think-aloud protocol and interviews.

Figure: UXR pipeline adopted for this project (Understand → Design → Evaluate).

Understanding Users

Instructor Focus Groups

Before designing the visualization, we needed to understand how experts actually use gameplay data. We conducted workshops with two experienced parallel programming instructors who analyzed anonymized student gameplay recordings from 'Parallel'.

Chunking Into Segments

Instructors broke gameplay into conceptual chunks, not continuous streams. "Steps 2 and 3, that's where the student is trying to figure out the Switch mechanism." Effective visualization needs to segment around learning concepts.

Spatial Context Matters

Instructors constantly connected gameplay actions to specific game areas. "This is happening in the synchronization zone"—linking spatial gameplay to abstract parallel programming concepts.

Comparative Analysis

Instructors rarely analyzed a single student in isolation. "This student used a mutex here, while this one avoided it entirely"—using comparison to highlight efficiency or alternatives.

Learning-Grounded Metrics

When ranking students, instructors evaluated based on core concepts: Did students identify critical sections? Did they minimize critical section size? Did they understand concurrency patterns?

Student Playtesting Insights

To understand what students need from a visualization system, we observed 10 students playing 'Parallel' and conducted semi-structured interviews. Through thematic analysis, three core needs emerged:

Efficiency Improvement

Players weren't satisfied with just completing levels—they wanted to know if solutions were efficient. Students craved ways to identify inefficiencies and learn from more optimal peer strategies.

Multi-Scale Comparison

Students wanted both high-level overview ("Did I take more steps overall?") and granular step-by-step comparison ("What did they do differently at step 15?").

Guided Discovery

When stuck, students wanted help without complete answers. They appreciated hints or alternative solution paths—nudges forward without defeating the challenge.

Key Outcomes

The evaluation confirmed OPM Viz successfully prompted the reflection we designed it for.

CHI Publication: published at ACM CHI 2024, the premier HCI conference.
Novel Contribution: an Open Player Model approach applied to dynamic gameplay contexts.
Active Reflection: students compared, analyzed, questioned, and learned from their data.

The Solution: OPM Viz

Guided by Shneiderman's Visual Information-Seeking Mantra ("Overview first, zoom and filter, then details-on-demand"), we designed OPM Viz—an interactive visualization system that integrates with 'Parallel' to empower student reflection.

Figure: the OPM Viz interface, showing peer comparison and metrics.

The system allows students to compare their gameplay with peers, view learning-grounded performance metrics, and watch synchronized replays of different solutions.

Video Demonstrations

Video: explaining the features of the OPM Viz system.

Video: an overview of the development process and algorithms.

Evaluation: Does it Foster Reflection?

We conducted an evaluative study with 8 students using a think-aloud protocol and semi-structured interviews. Our analysis focused on whether students actually used the comparison features, whether the system prompted reflection, and what specific insights emerged.

Figure: thematic map illustrating how students used OPM Viz for reflection and learning.

Facilitated Active Comparison

Participants consistently compared strategies with peers. "I thought my solution was good until I saw someone did it in half the steps. Now I'm curious how."

Identified Specific Inefficiencies

Students identified concrete areas of inefficiency: "I see—I backtracked here three times, but this student solved it without backtracking."

Sparked Conceptual Curiosity

"These two solutions both work, but they're using different synchronization patterns. I want to understand why."

Calibrated Self-Assessment

Students developed realistic views of where they stood relative to peers and where they needed to improve.

"The key was enabling comparison across multiple dimensions simultaneously—performance metrics alongside visual replays. This multi-scale, multi-modal comparison enabled genuine reflection." — Research finding

Impact & Contributions

Bridging OLMs and Games

Introduced an Open Player Model approach, adapting Open Learner Model principles for the dynamic context of serious games.

Learning-Focused Visualization

Designed and validated a player-facing visualization system prioritizing reflection and learning over purely aesthetic goals.

Enhancing 'Parallel'

Provided a novel tool for the established 'Parallel' research platform, enabling new avenues for studying game-based learning.

Published Research

Shared findings and system design with the academic community via publication at ACM CHI 2024.

Conclusion

OPM Viz successfully demonstrated how thoughtful visualization design can bridge the gap between gameplay data and meaningful learning reflection.

By centering the design process around both instructor expertise and student needs, we created a system that not only presents data but actively facilitates comparative analysis and reflection that leads to deeper understanding.

This work contributes to learning analytics and educational technology, showing how UX research methods can inform tools that truly support learning rather than simply displaying information. The positive evaluation results and academic recognition underscore the value of this user-centered approach.

Skills & Methods Demonstrated

Research

Design-Based Research, Focus Groups, Usability Testing, Thematic Analysis, Think-Aloud Protocol

Design

Visualization Design, Information Architecture, Interaction Design, Educational Technology

Specialized

Learning Analytics, Educational Games, Reflection Systems, Open Learner Models

Impact

Academic Publishing (CHI 2024), Iterative Development, Cross-Functional Collaboration