
GlassLab’s Online Handbook Integrates Evidence of Learning Into Game Design


Kristen DiCerbo recently posted Learning Game Design From the Experts to Pearson’s Research & Innovation Network. Dr. DiCerbo’s research program centers on digital technologies in learning and assessment, particularly the use of data generated from learner interactions to inform instructional decisions. A senior research scientist, she has conducted qualitative and quantitative investigations of games and simulations, with a focus on the identification and accumulation of evidence.

“You might be surprised how many curriculum groups I talk to who are working with game designers to build learning games and struggling to integrate learning, fun, and evidence of learning from data,” she says. “It is enough to make you wish there was a handbook… wait, now there is!”


In late February, GlassLab released a Game Design Handbook aimed at game developers who want to expand their skills into learning games. The handbook is actually a website, which means it can be updated and improved over time, and it’s free. The game design sections were written by Erin Hoffman and Michael John.


GlassLab’s Game Design Handbook Highlights


· What Makes a Good Learning Game?


A section on matching game mechanics to learning. As an example, in Mars Generation One: Argubot Academy, searching for evidence in the Mars environment mechanically matches searching for evidence in real-world argumentation. Equipping a robot mechanically parallels constructing an argument out of claims and evidence. And battling robots mechanically matches the back-and-forth competition of an argument. Thus, the mechanics of the game perform the competency itself, rather than attaching a completely unrelated performance to a piece of declarative knowledge.


· Core Loops: Engagement Machines


A section that discusses “core loops,” the loops of verbs that describe game action. By repeating the same sequence of actions in different contexts, designers can produce multiple representations of the same skills, reinforcing and advancing them.
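To make the idea concrete, here is a minimal sketch (not taken from the handbook) of a core loop expressed as data; the verbs, skill names, and contexts below are hypothetical, but they show how repeating the same verb sequence across varied content yields multiple representations of the same skills.

```python
from collections import Counter

# Hypothetical core loop and verb-to-skill mapping for an argumentation game.
CORE_LOOP = ["search_for_evidence", "equip_argubot", "battle"]
VERB_TO_SKILL = {
    "search_for_evidence": "identifying evidence",
    "equip_argubot": "linking claims to evidence",
    "battle": "rebutting counterarguments",
}

# Hypothetical contexts: the verbs stay the same, the representation changes.
CONTEXTS = ["water shortage dispute", "crop rotation debate", "rover repair standoff"]

practice = Counter()
for context in CONTEXTS:
    for verb in CORE_LOOP:
        # Each pass through the loop practices the same skills in a new context.
        practice[VERB_TO_SKILL[verb]] += 1
        print(f"{verb} ({context})")

print(dict(practice))  # every skill is practiced once per pass through the loop
```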


· When Data Empowers the Player


A forward-looking section on data empowering players. It imagines a future in which players use their own learning data from the game to decide what to do next to meet their goals.


· From Metrics to Evidence


A section that addresses the leap of logic required to move from gathering behavioral metrics to composing that behavioral data into meaningful and reliable indicators of a player’s performance or capability. Raw behavioral data is just noise; reliable indicators, the patterns found in the data, make music. In the language of assessment, this is the difference between data and evidence.
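As a rough illustration of that difference, here is a minimal sketch (with hypothetical event names and thresholds, not GlassLab’s actual telemetry) of how raw behavioral metrics might be composed into an indicator: individual logged actions are noisy on their own, so the summary is only reported once enough observations support treating the pattern as evidence.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class ArgumentEvent:
    """One logged player action: a raw behavioral metric."""
    player_id: str
    claim: str
    evidence: str
    evidence_supports_claim: bool  # scored by hypothetical game logic


MIN_OBSERVATIONS = 5  # hypothetical floor before a pattern is reported


def supports_claims_indicator(events: List[ArgumentEvent]) -> Optional[float]:
    """Compose raw events into an indicator of 'supports claims with evidence'.

    Returns None while there are too few observations to treat the
    pattern as evidence rather than noise.
    """
    if len(events) < MIN_OBSERVATIONS:
        return None
    supported = sum(e.evidence_supports_claim for e in events)
    return supported / len(events)


# A handful of logged events for one player.
log = [
    ArgumentEvent("p1", "ration the water", "tank levels are low", True),
    ArgumentEvent("p1", "ration the water", "the rover is red", False),
    ArgumentEvent("p1", "repair the greenhouse", "crops are failing", True),
    ArgumentEvent("p1", "repair the greenhouse", "a supply ship just arrived", True),
    ArgumentEvent("p1", "explore the crater", "sensor data shows ice", True),
]
print(supports_claims_indicator(log))  # 0.8 once enough observations accumulate
```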