Educational games generate massive amounts of gameplay data—every decision, every strategy, every moment of struggle or breakthrough. But when students finish a level, where does that data go? Usually, nowhere. The game records it, but students never see it again.
This case study documents the development of OPM Viz, a visualization system for the 'Parallel' educational game that transforms gameplay data into learning opportunities. Rather than letting valuable performance data disappear into the void, OPM Viz makes it accessible, comparable, and actionable.
Without such a system, students don't know how their approach compares to their peers', where they could improve, or which strategies might be more efficient. Through a 21-month Design-Based Research project, we turned every play session into a reflective learning experience.
The Problem
Data Inaccessibility
Games like 'Parallel' record detailed information about student strategies and struggles—but this data lives in logs only researchers see. Students can't access their own performance data.
Missed Learning Opportunity
Research shows that reflection and peer comparison enhance learning. Educational games generate the data needed for both, yet rarely provide tools to visualize it.
Surface-Level Dashboards
Existing dashboards show scores or completion rates, but don't reveal the "why." Students know they took 50 steps, but not which were inefficient.
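Surfacing the "why" behind a step count requires analyzing the action log itself, not just summarizing it. As a hedged sketch of the idea (the position log and the revisit heuristic below are hypothetical illustrations, not 'Parallel''s actual data or definition of inefficiency):

```python
# Hypothetical log of visited grid positions, one per step.
# Revisiting a position is used here as a simple proxy for an
# inefficient (backtracked) step; a real analysis would be richer.

def backtracked_steps(positions):
    """Return the step indices that revisit an earlier position."""
    seen = set()
    flagged = []
    for step, pos in enumerate(positions):
        if pos in seen:
            flagged.append(step)
        seen.add(pos)
    return flagged

path = [(0, 0), (0, 1), (0, 2), (0, 1), (1, 1)]
print(backtracked_steps(path))  # [3]
```

A dashboard built on this kind of per-step analysis can point at step 3 specifically, instead of reporting only a total of 50 steps.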
Research Approach
As Lead UX Researcher and Designer, I conducted a 21-month Design-Based Research project to create and validate OPM Viz:
Instructor Focus Groups
Worked with two parallel programming instructors who analyzed student gameplay, to understand how experts identify learning moments.
Student Playtesting
Observed 10 students playing 'Parallel' and conducted semi-structured interviews to identify core needs.
Thematic Analysis
Coded qualitative data to understand patterns in student struggles and desires for improvement.
Design & Build
Built OPM Viz enabling peer comparison, metrics viewing, and synchronized replay.
Integration
Seamlessly connected the visualization with the 'Parallel' game's existing platform.
Evaluation
Conducted a usability study with 8 students using a think-aloud protocol and interviews.
UXR pipeline adopted for this project: Understand → Design → Evaluate.
Understanding Users
Instructor Focus Groups
Before designing the visualization, we needed to understand how experts actually use gameplay data. We conducted workshops with two experienced parallel programming instructors who analyzed anonymized student gameplay recordings from 'Parallel'.
Chunking Into Segments
Instructors broke gameplay into conceptual chunks, not continuous streams. "Steps 2 and 3, that's where the student is trying to figure out the Switch mechanism." Effective visualization needs to segment around learning concepts.
Spatial Context Matters
Instructors constantly connected gameplay actions to specific game areas. "This is happening in the synchronization zone"—linking spatial gameplay to abstract parallel programming concepts.
Comparative Analysis
Instructors rarely analyzed a single student in isolation. "This student used a mutex here, while this one avoided it entirely"—using comparison to highlight efficiency or alternatives.
Learning-Grounded Metrics
When ranking students, instructors evaluated based on core concepts: Did students identify critical sections? Did they minimize critical section size? Did they understand concurrency patterns?
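These instructor criteria can be operationalized as metrics computed over a gameplay log. As a minimal sketch, assuming a hypothetical event schema (the action names and log format below are illustrative, not 'Parallel''s actual data), total critical-section size might be measured like this:

```python
# Hypothetical gameplay log: each event is (step, action, region).
# The action names and schema are assumptions for illustration.

def critical_section_metrics(events):
    """Count critical sections and their total size in steps."""
    sections = []      # completed (enter_step, exit_step) pairs
    open_step = None   # step at which the current section opened
    for step, action, _region in events:
        if action == "acquire_lock":
            open_step = step
        elif action == "release_lock" and open_step is not None:
            sections.append((open_step, step))
            open_step = None
    total_size = sum(end - start for start, end in sections)
    return {"sections": len(sections), "total_size": total_size}

log = [
    (1, "move", "spawn"),
    (2, "acquire_lock", "sync_zone"),
    (5, "release_lock", "sync_zone"),
    (7, "acquire_lock", "sync_zone"),
    (8, "release_lock", "sync_zone"),
]
print(critical_section_metrics(log))  # {'sections': 2, 'total_size': 4}
```

Metrics grounded this way reflect the concepts instructors actually rank on, rather than generic scores.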
Student Playtesting Insights
To understand what students need from a visualization system, we observed 10 students playing 'Parallel' and conducted semi-structured interviews. Through thematic analysis, three core needs emerged:
Efficiency Improvement
Players weren't satisfied with just completing levels—they wanted to know if solutions were efficient. Students craved ways to identify inefficiencies and learn from more optimal peer strategies.
Multi-Scale Comparison
Students wanted both high-level overview ("Did I take more steps overall?") and granular step-by-step comparison ("What did they do differently at step 15?").
Guided Discovery
When stuck, students wanted help without complete answers. They appreciated hints or alternative solution paths—nudges forward without defeating the challenge.
Key Outcomes
The evaluation confirmed that OPM Viz prompted the kind of reflection it was designed to support.
The Solution: OPM Viz
Guided by Shneiderman's Visual Information-Seeking Mantra ("Overview first, zoom and filter, then details-on-demand"), we designed OPM Viz—an interactive visualization system that integrates with 'Parallel' to empower student reflection.
The OPM Viz interface, showing peer comparison and metrics.
The system allows students to:
- View their own performance metrics against the community
- Compare solution paths against anonymized peers
- Filter peers based on performance or specific strategies
- Replay games side-by-side with synchronized playback
- Identify key differences and potential areas for improvement
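The filtering and comparison features above can be sketched roughly as follows. This is a hedged illustration only: the peer records, field names, and functions are hypothetical placeholders, not OPM Viz's actual data model or API.

```python
# Hypothetical anonymized peer records; field names are
# assumptions for illustration, not the real OPM Viz schema.
peers = [
    {"id": "peer_a", "steps": 42, "moves": ["m1", "m2", "m3"]},
    {"id": "peer_b", "steps": 25, "moves": ["m1", "m4", "m5"]},
    {"id": "peer_c", "steps": 61, "moves": ["m1", "m2", "m6"]},
]

def filter_peers(peers, max_steps):
    """Keep only peers whose solutions fit a step budget
    (the 'filter peers based on performance' feature)."""
    return [p for p in peers if p["steps"] <= max_steps]

def first_divergence(mine, theirs):
    """Index of the first step where two solution paths differ
    (supports the granular 'what did they do differently?' view)."""
    for i, (a, b) in enumerate(zip(mine, theirs)):
        if a != b:
            return i
    return min(len(mine), len(theirs))  # one path is a prefix of the other

efficient = filter_peers(peers, max_steps=45)
print([p["id"] for p in efficient])  # ['peer_a', 'peer_b']
print(first_divergence(["m1", "m2", "m3"], peers[1]["moves"]))  # 1
```

Pinpointing the first divergent step is what lets a synchronized replay jump both recordings to the exact moment where strategies split.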
Video Demonstrations
Explaining the features of the OPM Viz system.
Overview of the development process and algorithms.
Evaluation: Does it Foster Reflection?
We conducted an evaluative study with 8 students using a think-aloud protocol and semi-structured interviews. Our analysis focused on whether students actually used comparison features, whether the system prompted reflection, and what specific insights emerged.
Thematic map illustrating how students used OPM Viz for reflection and learning.
Facilitated Active Comparison
Participants consistently compared strategies with peers. "I thought my solution was good until I saw someone did it in half the steps. Now I'm curious how."
Identified Specific Inefficiencies
Students identified concrete areas of inefficiency: "I see—I backtracked here three times, but this student solved it without backtracking."
Sparked Conceptual Curiosity
"These two solutions both work, but they're using different synchronization patterns. I want to understand why."
Calibrated Self-Assessment
Students developed realistic views of where they stood relative to peers and where they needed to improve.
Impact & Contributions
Bridging OLMs and Games
Introduced an Open Player Model approach, adapting Open Learner Model principles for the dynamic context of serious games.
Learning-Focused Visualization
Designed and validated a player-facing visualization system prioritizing reflection and learning over purely aesthetic goals.
Enhancing 'Parallel'
Provided a novel tool for the established 'Parallel' research platform, enabling new avenues for studying game-based learning.
Published Research
Shared findings and system design with the academic community via publication at ACM CHI 2024.
Conclusion
OPM Viz successfully demonstrated how thoughtful visualization design can bridge the gap between gameplay data and meaningful learning reflection.
By centering the design process around both instructor expertise and student needs, we created a system that not only presents data but actively facilitates comparative analysis and reflection that leads to deeper understanding.
This work contributes to learning analytics and educational technology, showing how UX research methods can inform tools that truly support learning rather than simply displaying information. The positive evaluation results and academic recognition underscore the value of this user-centered approach.
Skills & Methods Demonstrated
Design-Based Research, Focus Groups, Usability Testing, Thematic Analysis, Think-Aloud Protocol
Visualization Design, Information Architecture, Interaction Design, Educational Technology
Learning Analytics, Educational Games, Reflection Systems, Open Learner Models
Academic Publishing (CHI 2024), Iterative Development, Cross-Functional Collaboration