OPM Viz: Bridging Gameplay Data and Learning Reflection
Educational games generate massive amounts of gameplay data—every decision, every strategy, every moment of struggle or breakthrough. But when students finish a level, where does that data go? Usually, nowhere. The game records it, but students never see it again. They don't know how their approach compared to their peers, where they could improve, or what strategies might be more efficient.
This case study documents the development of OPM Viz, a visualization system for the 'Parallel' educational game that transforms gameplay data into learning opportunities. Rather than letting valuable performance data disappear into the void, OPM Viz makes it accessible, comparable, and actionable—turning every play session into a reflective learning experience.
The Problem
Educational games capture rich data, but students can't access it: Games like 'Parallel' (which teaches parallel programming) record detailed information about how students approach problems: which strategies they use, where they struggle, how their solutions compare to optimal approaches. But this data typically lives in logs that only researchers see. Students are left wondering: "Was my solution good? How did others solve this? What could I do better?"
The missed learning opportunity: Research in learning sciences shows that reflection and peer comparison significantly enhance learning. Students learn by analyzing their own performance, seeing how others approached the same challenge, and identifying specific areas for improvement. Educational games generate the data needed for this reflection, but don't provide tools to visualize it.
The practical challenge: Existing educational games often include performance dashboards showing scores or completion rates, but these don't reveal the "why" behind performance. A student might know they took 50 steps to complete a level, but not understand which steps were inefficient or how a peer solved it in 20 steps with a different strategy.
My Approach
As Lead UX Researcher and Designer, I conducted a 21-month Design-Based Research project to create and validate OPM Viz:
- Conducted instructor focus groups: Worked with 2 parallel programming instructors who analyzed student gameplay recordings to understand how experts identify learning moments
- Led student playtesting: Observed 10 students playing 'Parallel' and conducted semi-structured interviews to identify core needs
- Applied thematic analysis: Coded qualitative data to understand patterns in student struggles and desires
- Designed interactive visualization system: Built OPM Viz enabling peer comparison, metrics viewing, and synchronized replay
- Integrated with 'Parallel': Connected the visualization with the game's existing platform
- Evaluated learning impact: Conducted a usability study with 8 students using think-aloud protocol and interviews
 
Key Outcomes
- Students identified specific inefficiencies in their parallel programming strategies
- Improved self-assessment and metacognition through comparative visualization
- Sparked curiosity about alternative approaches to problem-solving
- Demonstrated successful reflection: Students actively used the system to analyze their gameplay
- Published at ACM CHI 2024—the premier HCI conference
- Novel Open Player Model approach: Applied Open Learner Model principles to dynamic gameplay contexts
 
The Challenge: From Data to Learning
Educational games like 'Parallel', which teaches parallel programming concepts, generate vast amounts of player data. However, a significant gap exists: how can this data be effectively visualized to help students reflect on their own learning process and understand the strategies used by their peers?
Existing tools often lack features specifically designed to foster reflective learning. Our core challenge was to design a visualization system that moves beyond simple data presentation to actively guide students in analyzing gameplay, comparing strategies, and ultimately deepening their understanding of complex concepts taught through the game.
Understanding the Users: Instructors & Students
To ensure our solution was grounded in real needs, we employed a Design-Based Research approach, incorporating UX methods to understand both instructor goals and student expectations.
Instructor Focus Groups: Learning from Expert Practice
Before designing the visualization, we needed to understand how experts actually use gameplay data to support learning. We conducted workshops with two experienced parallel programming instructors, asking them to analyze anonymized student gameplay recordings from 'Parallel'.
The process: Instructors watched recordings of students playing levels, identified different strategies students used, ranked performance, and pinpointed specific moments where reflection would be valuable. Through observation and think-aloud protocol, we documented how experts naturally analyze gameplay data.
Key insights emerged from how instructors naturally worked:
- Chunking gameplay into meaningful segments: Instructors didn't analyze gameplay as continuous streams—they broke it into conceptual chunks. One instructor observed, "Steps 2 and 3, that's where the student is trying to figure out the Switch mechanism." This revealed that effective visualization needs to segment gameplay around learning concepts, not just chronological steps (a code sketch of this segmentation follows the list).
- Spatial context matters: Instructors constantly connected gameplay actions to specific game areas and programming concepts. They'd note, "This is happening in the synchronization zone," linking spatial gameplay to abstract parallel programming concepts.
- Comparative analysis drives insight: Instructors rarely analyzed a single student in isolation. They consistently compared strategies across students, noting things like "This student used a mutex here, while this one avoided it entirely," using comparison to highlight efficiency or alternative approaches.
- Performance metrics grounded in learning: When ranking students, instructors didn't just count steps or time. They evaluated based on core parallel programming concepts: Did students identify critical sections? Did they minimize critical section size? Did they understand concurrency patterns?
 
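To make the chunking insight concrete, here is a minimal Python sketch of concept-based segmentation. The step fields (index, zone, mechanic) and the sample trace are illustrative assumptions, not the actual 'Parallel' log schema, which this case study does not show.

```python
from dataclasses import dataclass
from itertools import groupby

# Hypothetical step record: the real log schema of 'Parallel' isn't
# shown here, so these fields are assumptions for illustration.
@dataclass
class Step:
    index: int      # chronological step number
    zone: str       # spatial game area, e.g. "synchronization zone"
    mechanic: str   # concept the step exercises, e.g. "switch", "mutex"

def chunk_by_concept(trace: list[Step]) -> list[tuple[str, list[Step]]]:
    """Group consecutive steps that exercise the same mechanic,
    mirroring how instructors segmented gameplay around concepts
    rather than raw chronological steps."""
    return [(mechanic, list(steps))
            for mechanic, steps in groupby(trace, key=lambda s: s.mechanic)]

trace = [
    Step(1, "start area", "movement"),
    Step(2, "switch room", "switch"),
    Step(3, "switch room", "switch"),       # "Steps 2 and 3" form one chunk
    Step(4, "synchronization zone", "mutex"),
]
for mechanic, steps in chunk_by_concept(trace):
    print(f"{mechanic}: steps {steps[0].index}-{steps[-1].index}")
```

Keeping the zone on each step preserves the spatial context instructors relied on when linking gameplay to abstract parallel programming concepts.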
Student Playtesting: Understanding Learner Needs
To understand what students actually need from a visualization system, we observed 10 students playing 'Parallel' and conducted semi-structured interviews following their gameplay. This revealed a significant gap between what games currently offer and what learners actually need.
What we observed: Students played levels, sometimes struggling for extended periods, sometimes finding elegant solutions quickly. After completion, we asked about their experience: What was hard? How did they know if their solution was good? What would help them improve? Through thematic analysis of interview transcripts and gameplay observations, three core needs emerged.
Key Insights from Students:
- Desire for efficiency improvement: Players weren't satisfied with just completing levels—they wanted to know if their solutions were efficient. One student expressed frustration: "I finished it, but I have no idea if there's a better way. Am I using ten steps when I could use five?" Students craved ways to identify inefficiencies and learn from more optimal peer strategies.
- Need for multi-scale comparison: Students wanted to compare their gameplay with peers, but at different levels of detail. Sometimes they needed a high-level overview: "Did I take more steps overall?" Other times they needed granular step-by-step comparison: "What did they do differently at step 15?" The visualization needed to support both zoomed-out summaries and detailed moment-by-moment analysis (both scales are sketched in code after this list).
- Seeking guidance without giving up: When stuck, students wanted help, but didn't want complete answers handed to them. They appreciated hints or alternative solution paths—ways to nudge them forward without defeating the challenge. After completing levels, they also valued exploring different valid approaches, understanding that there were multiple ways to solve the same problem.
 
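As a rough illustration of the two comparison scales students asked for, the sketch below contrasts a zoomed-out summary with a step-level divergence check. The playtrace representation (an ordered list of action labels) and both function names are assumptions made for this example.

```python
# Assumed data shape: a playtrace as an ordered list of action labels.

def overview(trace: list[str]) -> dict:
    """Zoomed-out scale: 'Did I take more steps overall?'"""
    return {"total_steps": len(trace), "unique_actions": len(set(trace))}

def first_divergence(mine: list[str], peer: list[str]) -> int | None:
    """Granular scale: 'What did they do differently at step N?'
    Returns the 1-based index of the first differing step, or None
    if the traces are identical."""
    for i, (a, b) in enumerate(zip(mine, peer), start=1):
        if a != b:
            return i
    # Same prefix but different lengths: diverge where the shorter ends.
    return None if len(mine) == len(peer) else min(len(mine), len(peer)) + 1

mine = ["move", "move", "lock", "compute", "unlock"]
peer = ["move", "lock", "compute", "unlock"]
print(overview(mine), overview(peer))
print("first divergence at step", first_divergence(mine, peer))
```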
Why this mattered for design: These insights directly shaped OPM Viz's features. The need for efficiency comparison led to peer performance metrics. The desire for multi-scale views informed our "overview first, then zoom and filter" approach. And the preference for hints over complete answers inspired our reflection prompts—questions that guided analysis without revealing solutions.
Designing the Visualization System
Guided by insights from instructors and students, and using Shneiderman's Visual Information-Seeking Mantra ("Overview first, zoom and filter, then details-on-demand") as a framework, we designed and developed the OPM Viz system.
Key design goals included:
- Providing an overview comparing a student's performance against the community
- Allowing users to filter and select specific peer playtraces for deeper comparison (see the filtering sketch after this list)
- Offering a synchronized, detailed side-by-side view of playtraces
- Incorporating metrics relevant to parallel programming efficiency
- Suggesting moments for reflection based on common patterns or deviations
 
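A toy sketch of the "overview, then filter" goals above, under assumed data shapes (the PeerTrace fields and function names are illustrative, not the actual OPM Viz data model):

```python
from dataclasses import dataclass

# Illustrative peer-trace summary; the fields are assumptions,
# not the actual OPM Viz data model.
@dataclass
class PeerTrace:
    player_id: str
    steps: int
    used_mutex: bool   # stand-in for a detected strategy

def beats_community_pct(my_steps: int, peers: list[PeerTrace]) -> float:
    """Overview first: the share of peers who took more steps than me."""
    return 100 * sum(p.steps > my_steps for p in peers) / len(peers)

def filter_peers(peers: list[PeerTrace], max_steps: int | None = None,
                 mutex_only: bool = False) -> list[PeerTrace]:
    """Zoom and filter: narrow the peer pool by performance or strategy
    before selecting a playtrace for detailed side-by-side comparison."""
    result = [p for p in peers if max_steps is None or p.steps <= max_steps]
    if mutex_only:
        result = [p for p in result if p.used_mutex]
    return result

peers = [PeerTrace("a", 12, True), PeerTrace("b", 30, False), PeerTrace("c", 18, True)]
print(f"I beat {beats_community_pct(20, peers):.0f}% of peers")            # 33%
print([p.player_id for p in filter_peers(peers, max_steps=20, mutex_only=True)])
```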
The Solution: OPM Viz
OPM Viz is an interactive visualization system designed to integrate seamlessly with the 'Parallel' serious game. It empowers students to reflect on their gameplay, compare strategies with peers, and gain deeper insights into parallel programming concepts.
[Figure: The features of the OPM Viz system.]
[Figure: Overview of the development process and algorithms.]
The system allows students to:
- View their own performance metrics
- Compare their solution path against anonymized peers
- Filter peers based on performance or specific strategies
- Replay their own and selected peers' games side-by-side (a minimal replay sketch follows this list)
- Identify key differences and potential areas for improvement
 
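The side-by-side replay can be pictured as two traces advanced in lockstep. OPM Viz renders actual game state; this sketch only prints action labels, and the data shape is again an assumption for illustration.

```python
from itertools import zip_longest

def replay_side_by_side(mine: list[str], peer: list[str]) -> None:
    """Advance both playtraces one synchronized step at a time,
    flagging the steps where the strategies diverge."""
    for step, (a, b) in enumerate(zip_longest(mine, peer, fillvalue="(done)"), 1):
        marker = "   <-- differs" if a != b else ""
        print(f"step {step:2}: you={a:<10} peer={b:<10}{marker}")

replay_side_by_side(
    ["move", "lock", "compute", "unlock"],
    ["move", "compute", "lock", "unlock"],
)
```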
Evaluation: Does it Foster Reflection?
Could OPM Viz actually prompt the kind of reflection and learning we designed it for? To answer this, we conducted an evaluative study with 8 student participants who had used 'Parallel' in a previous course.
The evaluation method: After playing levels in 'Parallel', participants used OPM Viz to explore their own gameplay data compared to anonymized peers. We employed think-aloud protocol—asking participants to verbalize their thoughts as they used the system. Following the session, we conducted semi-structured interviews probing deeper into their experience and what they learned.
Our analysis focused on three questions:
- Did students actually use the system's features to compare their strategies with peers?
- Did using the system prompt reflection on their own performance and learning?
- What specific insights or learning moments did the visualization enable?
 
Key Findings from Evaluation:
- Facilitated active comparison: Participants consistently used the system to compare their strategies with anonymized peers. One student noted, "I thought my solution was good until I saw someone did it in half the steps. Now I'm curious how." The visual comparison didn't just show differences—it prompted students to analyze why.
- Identified specific inefficiencies: Rather than vague feelings of "maybe I could do better," students identified concrete areas where their approaches were less efficient. They'd pinpoint moments: "Oh, I see—I backtracked here three times, but this student solved it without backtracking."
- Sparked conceptual curiosity: Exploring different solutions through the system encouraged students to think beyond "getting it done" to understanding the underlying programming concepts. A student observed, "These two solutions both work, but they're using different synchronization patterns. I want to understand why."
- Supported calibrated self-assessment: Participants used comparative data to accurately gauge their own understanding. Rather than over- or under-estimating their performance, they developed realistic views of where they stood relative to their peers and where they needed to improve.
- Highlighted design trade-offs: Observing different strategies helped students understand that parallel programming involves trade-offs. One solution might be simpler but less efficient; another might be faster but harder to understand. Students began thinking about these design decisions.
 
Overall impact: The evaluation confirmed that OPM Viz successfully prompted the kind of reflection we designed it for. Students didn't just passively view their data—they actively compared, analyzed, questioned, and learned. The system made gameplay data not just visible, but actionable for learning.
What made it effective: The key was enabling comparison across multiple dimensions simultaneously. Students could see performance metrics (steps, time) alongside visual replays of actual gameplay. They could compare at overview level ("I'm in the top third") and dive into details ("At step 7, they took a different path"). This multi-scale, multi-modal comparison is what enabled genuine reflection.
Impact and Contributions
This project demonstrates the potential of integrating player-facing visualization systems, informed by Open Learner Model principles, into serious games to enhance learning.
Key Contributions:
- Bridging OLMs and Games: Introduced an Open Player Model approach, adapting OLM principles for the dynamic context of serious games
- Learning-Focused Visualization: Designed and validated a player-facing visualization system prioritizing reflection and learning over purely aesthetic goals
- Enhancing 'Parallel': Provided a novel tool for the established 'Parallel' research platform, enabling new avenues for studying game-based learning
- Published Research: Shared findings and system design with the academic community via publication at ACM CHI 2024
 
The positive evaluation results and academic recognition underscore the value of this user-centered approach to designing learning support tools within educational games.
Conclusion
The OPM Viz project successfully demonstrated how thoughtful visualization design can bridge the gap between gameplay data and meaningful learning reflection. By centering the design process around both instructor expertise and student needs, we created a system that not only presents data but actively facilitates the kind of comparative analysis and reflection that leads to deeper understanding.
This work contributes to the growing field of learning analytics and educational technology, showing how UX research methods can inform the design of tools that truly support learning rather than simply displaying information.
Skills & Methods Demonstrated
Research: Design-Based Research • Focus Groups • Usability Testing • Thematic Analysis • User Research
Design: Visualization Design • Information Architecture • Interaction Design • Educational Technology • Prototyping
Specialized: Learning Analytics • Educational Games • Reflection Systems • Comparative Analysis • Data Visualization
Impact: Academic Publishing • Iterative Development • Cross-Functional Collaboration