OPM Viz: Bridging Gameplay Data and Learning Reflection

ACM CHI 2024 Educational Games Design-Based Research

Educational games generate massive amounts of gameplay data—every decision, every strategy, every moment of struggle or breakthrough. But when students finish a level, where does that data go? Usually, nowhere. The game records it, but students never see it again. They don't know how their approach compared to their peers, where they could improve, or what strategies might be more efficient.

This case study documents the development of OPM Viz, a visualization system for the 'Parallel' educational game that transforms gameplay data into learning opportunities. Rather than letting valuable performance data disappear into the void, OPM Viz makes it accessible, comparable, and actionable—turning every play session into a reflective learning experience.

The Problem

Educational games capture rich data, but students can't access it: Games like 'Parallel' (which teaches parallel programming) record detailed information about how students approach problems: which strategies they use, where they struggle, how their solutions compare to optimal approaches. But this data typically lives in logs that only researchers see. Students are left wondering: "Was my solution good? How did others solve this? What could I do better?"

The missed learning opportunity: Research in learning sciences shows that reflection and peer comparison significantly enhance learning. Students learn by analyzing their own performance, seeing how others approached the same challenge, and identifying specific areas for improvement. Educational games generate the data needed for this reflection, but don't provide tools to visualize it.

The practical challenge: Existing educational games often include performance dashboards showing scores or completion rates, but these don't reveal the "why" behind performance. A student might know they took 50 steps to complete a level, but not understand which of those steps were inefficient, or how a peer solved the same level in 20 steps with a different strategy.
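The gap described above — knowing a raw step count but not how it compares — can be made concrete with a small sketch. This is a hypothetical illustration only: the `LevelAttempt` fields and the `efficiency_gap` helper are assumptions for exposition, not the actual schema of 'Parallel' logs.

```python
# Hypothetical sketch of a per-level gameplay summary and a simple
# efficiency comparison. Field names are assumptions, not the game's schema.
from dataclasses import dataclass

@dataclass
class LevelAttempt:
    player_id: str
    level: int
    steps: int       # moves used to complete the level
    seconds: float   # wall-clock time to completion

def efficiency_gap(attempt: LevelAttempt, peer_attempts: list[LevelAttempt]) -> int:
    """Steps beyond the best known peer solution for the same level."""
    peer_steps = [a.steps for a in peer_attempts if a.level == attempt.level]
    return attempt.steps - min(peer_steps) if peer_steps else 0

me = LevelAttempt("s01", level=3, steps=50, seconds=312.0)
peers = [LevelAttempt("s02", 3, 20, 145.0), LevelAttempt("s03", 3, 34, 290.0)]
print(efficiency_gap(me, peers))  # 30: thirty steps beyond the best peer solution
```

A raw score of 50 steps says little on its own; a gap of 30 steps relative to the best peer solution is the kind of comparable, actionable number the dashboards described above fail to surface.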

My Approach

As Lead UX Researcher and Designer, I conducted a 21-month Design-Based Research project to create and validate OPM Viz:

Key Outcomes

The Challenge: From Data to Learning

Educational games like 'Parallel', which teaches parallel programming concepts, generate vast amounts of player data. However, a significant gap exists: how can this data be effectively visualized to help students reflect on their own learning process and understand the strategies used by their peers?

Existing tools often lack features specifically designed to foster reflective learning. Our core challenge was to design a visualization system that moves beyond simple data presentation to actively guide students in analyzing gameplay, comparing strategies, and ultimately deepening their understanding of complex concepts taught through the game.

Understanding the Users: Instructors & Students

To ensure our solution was grounded in real needs, we employed a Design-Based Research approach, incorporating UX methods to understand both instructor goals and student expectations.

UXR pipeline diagram showing Understand, Design, Evaluate process
UXR pipeline adopted for this project.

Instructor Focus Groups: Learning from Expert Practice

Before designing the visualization, we needed to understand how experts actually use gameplay data to support learning. We conducted workshops with two experienced parallel programming instructors, asking them to analyze anonymized student gameplay recordings from 'Parallel'.

The process: Instructors watched recordings of students playing levels, identified the different strategies students used, ranked performance, and pinpointed specific moments where reflection would be valuable. Through observation and a think-aloud protocol, we documented how experts naturally analyze gameplay data.

Key insights emerged from how instructors naturally worked:

Student Playtesting: Understanding Learner Needs

To understand what students actually need from a visualization system, we observed 10 students playing 'Parallel' and conducted semi-structured interviews following their gameplay. This revealed a significant gap between what games currently offer and what learners actually need.

What we observed: Students played levels, sometimes struggling for extended periods, sometimes finding elegant solutions quickly. After completion, we asked about their experience: What was hard? How did they know if their solution was good? What would help them improve? Through thematic analysis of interview transcripts and gameplay observations, three core needs emerged.

Key Insights from Students:

  1. Desire for efficiency improvement: Players weren't satisfied with just completing levels—they wanted to know if their solutions were efficient. One student expressed frustration: "I finished it, but I have no idea if there's a better way. Am I using ten steps when I could use five?" Students craved ways to identify inefficiencies and learn from more optimal peer strategies.
  2. Need for multi-scale comparison: Students wanted to compare their gameplay with peers, but at different levels of detail. Sometimes they needed a high-level overview: "Did I take more steps overall?" Other times they needed granular step-by-step comparison: "What did they do differently at step 15?" The visualization needed to support both zoomed-out summaries and detailed moment-by-moment analysis.
  3. Seeking guidance without giving up: When stuck, students wanted help, but didn't want complete answers handed to them. They appreciated hints or alternative solution paths—ways to nudge them forward without defeating the challenge. After completing levels, they also valued exploring different valid approaches, understanding that there were multiple ways to solve the same problem.

Why this mattered for design: These insights directly shaped OPM Viz's features. The need for efficiency comparison led to peer performance metrics. The desire for multi-scale views informed our "overview first, then zoom and filter" approach. The desire for guidance without answers inspired our reflection prompts—questions that guided analysis without revealing solutions.

Designing the Visualization System

Guided by insights from instructors and students, and utilizing Shneiderman's Visual Information-Seeking Mantra ("Overview first, zoom and filter, then details-on-demand") as a framework, we designed and developed the OPM Viz system.

Key design goals included:

Screenshot of the OPM Viz visualization system interface
The OPM Viz Visualization System Interface.

The Solution: OPM Viz

OPM Viz is an interactive visualization system designed to integrate seamlessly with the 'Parallel' serious game. It empowers students to reflect on their gameplay, compare strategies with peers, and gain deeper insights into parallel programming concepts.

Explaining the features of the OPM Viz system.

Overview of the development process and algorithms.

The system allows students to:

Evaluation: Does it Foster Reflection?

Could OPM Viz actually prompt the kind of reflection and learning we designed it for? To answer this, we conducted an evaluative study with 8 student participants who had used 'Parallel' in a previous course.

The evaluation method: After playing levels in 'Parallel', participants used OPM Viz to explore their own gameplay data alongside that of anonymized peers. We employed a think-aloud protocol, asking participants to verbalize their thoughts as they used the system. Following each session, we conducted semi-structured interviews probing deeper into their experience and what they learned.

Our analysis focused on three questions:

Thematic analysis results showing student interactions
Thematic map illustrating how students used OPM Viz for reflection and learning.

Key Findings from Evaluation:

Overall impact: The evaluation confirmed that OPM Viz successfully prompted the kind of reflection we designed it for. Students didn't just passively view their data—they actively compared, analyzed, questioned, and learned. The system made gameplay data not just visible, but actionable for learning.

What made it effective: The key was enabling comparison across multiple dimensions simultaneously. Students could see performance metrics (steps, time) alongside visual replays of actual gameplay. They could compare at overview level ("I'm in the top third") and dive into details ("At step 7, they took a different path"). This multi-scale, multi-modal comparison is what enabled genuine reflection.
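An overview-level judgment like "I'm in the top third" reduces to ranking one player's step count against anonymized peer counts. The sketch below is an illustrative assumption about how such a rank could be computed (lower step counts are better); it is not the system's actual metric.

```python
# Hypothetical sketch of the overview-level peer comparison: rank a
# player's step count against anonymized peers. Lower is better.

def percentile_rank(my_steps: int, peer_steps: list[int]) -> float:
    """Fraction of peers who needed at least as many steps as the player."""
    worse_or_equal = sum(1 for s in peer_steps if s >= my_steps)
    return worse_or_equal / len(peer_steps)

peers = [20, 26, 34, 41, 50, 58]
rank = percentile_rank(26, peers)
print(f"as efficient as or better than {rank:.0%} of peers")  # 83%
```

A summary figure like this supports the zoomed-out judgment, while the visual replay supports the zoomed-in one ("At step 7, they took a different path"); it is the pairing of the two that the evaluation found effective.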

Impact and Contributions

This project demonstrates the potential of integrating player-facing visualization systems, informed by Open Learner Model principles, into serious games to enhance learning.

Key Contributions:

  • Bridging OLMs and Games: Introduced an Open Player Model approach, adapting OLM principles for the dynamic context of serious games
  • Learning-Focused Visualization: Designed and validated a player-facing visualization system prioritizing reflection and learning over purely aesthetic goals
  • Enhancing 'Parallel': Provided a novel tool for the established 'Parallel' research platform, enabling new avenues for studying game-based learning
  • Published Research: Shared findings and system design with the academic community via publication at ACM CHI 2024

The positive evaluation results and academic recognition underscore the value of this user-centered approach to designing learning support tools within educational games.

Conclusion

The OPM Viz project successfully demonstrated how thoughtful visualization design can bridge the gap between gameplay data and meaningful learning reflection. By centering the design process around both instructor expertise and student needs, we created a system that not only presents data but actively facilitates the kind of comparative analysis and reflection that leads to deeper understanding.

This work contributes to the growing field of learning analytics and educational technology, showing how UX research methods can inform the design of tools that truly support learning rather than simply displaying information.

Skills & Methods Demonstrated

Research: Design-Based Research • Focus Groups • Usability Testing • Thematic Analysis • User Research

Design: Visualization Design • Information Architecture • Interaction Design • Educational Technology • Prototyping

Specialized: Learning Analytics • Educational Games • Reflection Systems • Comparative Analysis • Data Visualization

Impact: Academic Publishing • Iterative Development • Cross-Functional Collaboration