Get a Feedback Loop and Listen to It


Imagine writing a novel like this: every two weeks, you gather thirty people into a room and ask them to read your draft. When they’re done, they fill out a series of questions. “Did you understand what was going on at all times? Did you understand the protagonist’s motivations? Did you feel compelled to read more? On a scale of one to five, would you recommend this novel to your friends?”

Imagine it doesn’t stop there. While the readers are reading, you watch them the entire time. How long does the average page take to finish? When does their pace slow? When do they skim? Imagine a camera on these readers’ faces constantly, tracking their eye movements across each page, data that is then aggregated and mined for trends. Imagine their brains wired, too, looking for activity related to rational reasoning, emotional response, excitement, imagination.

The data you gather feeds back into the revisions, and, two weeks later, you are testing your novel again.

In 2007 Wired magazine did a cover story on the then-upcoming Halo 3. Though the magazine could have picked from any number of interesting angles for the story– how it was the Xbox 360’s biggest punch yet in the war with the PlayStation 3, or how features like Saved Films and the Forge editor mode would bring Web 2.0-style user-generated content into the console space– its editors chose instead to focus on the playtesting lab at Microsoft and its unprecedented data-mining capabilities. The deck to the article didn’t mention Bungie at all. It read, simply, “How Microsoft Labs Invented a New Science of Play.”

When the article came out, I pointed a friend towards it, feeling proud of myself for being part of something deemed a big deal by the cover of a Condé Nast publication. But my friend came away from the article upset, even disturbed. Though the writer had almost flippantly tossed off the notion that video game development also “involves artistry, obviously,” he clearly saved his real swooning for the lab– for the thousands of hours of recorded play, the gigabytes of log files, the propellerhead science– not so much the game as an experience.

“Where has the creativity gone?” my friend asked.

Nobody really disputes, of course, that playtesting in some form or another is indispensable to making good games. Understanding how an audience may react is tricky even in a linear medium such as the novel. At least there, we can assume readers will start at page one and continue to the end. In games, though, our agency enables our habits, and our habits become our blinders. You don’t consciously know that you always strafe to the right to avoid grenades– you just do, and your design ends up reflecting that. The choice then is to ship it that way, or to show the game to a lot of people, some of whom will instinctively dodge to the left or backpedal, or who might try to bunny-hop over the damage radius (something you’d never thought to do).

It’s no surprise, then, that the playtest’s leading proponents, including developers like Valve and Bungie, are often known for the high quality of their titles. The feedback is valuable because games are systems with many moving parts, players included, and predicting how they will all interact is impossible for a single person or even a group of people. Psychologists tell us that once something is learned, it is very difficult to imagine not having known it. The playtest allows us to see what it is like to not know our game.

If design by playtest creates the most perfect games, however, then how do we reconcile this with games’ aspirations as an artistic medium? For while it is easy to accept the notion that in mass-market entertainment there may very well be an optimal action film trailer, or an optimal casino layout, to speak of art in its artiest sense as a thing that can be optimized in one way or another is to head down a tricky path. If game design is actually a series of tests, and we employ scientists to perform empirical research to determine the best path, then where (as my friend asked) has the creativity gone? Given a certain set of goals, is there a “correct” game design, an optimal design beyond which any further innovation is unnecessary?

When put this way, it is tempting to reject design by playtest as a practice that can get in the way of an auteur’s personal expression: if people find a part of my highly idiosyncratic art game frustrating, well, that’s just part of the point. (I am a frustrated artist, after all, and my art is to make you just as frustrated as I am.) If we want games to travel into this realm of meaningfulness, the argument continues, we will just have to learn to accept unorthodox controls, impossible puzzles, and general obtuseness as part of their repertoire.

Whether or not this is true, I think the uneasiness we get from a reliance on playtesting most often stems from the way it can feel like the tests are telling us what to do– that as creators we have relinquished our control to the mob of the focus group, or to some impersonally codified rules of human behavior. Over-dependence on playtesting, we fear, may lead us to a middle-of-the-road game that is the average of all games, unremarkable in every way.

These are legitimate concerns. But it is also important to remember that as personal as they can be, art, entertainment and video games are all transactions: things that occur, somehow, in the space between the creator and the audience. The playtest is a tool, one that has evolved to help us grapple with that single most important quality of games, the reason they are so beguiling and why they are so problematic: their interactivity.

Used properly, playtesting does not tell you what to do so much as it tells you what you have in front of you. It shines a light into the possibility space of the game– a light of a different color or from a different angle than you are used to, one that makes possible a better understanding of its true shape. In this sense, a game designer’s artistry is not thwarted by the playtest. The artistry is present in how he or she reacts to the results. Data is just data, and it is up to us to decide what we will do about it.