Monday, March 8, 2010

The Gamer's Dilemma

As if just for me, Jamie Madigan of The Psychology of Video Games, and a frequent contributor to GameSetWatch, has written The Glitcher's Dilemma, an astute analysis of the social dilemma, or "prisoner's dilemma," as it relates to videogames. The prisoner's dilemma is a popular scenario in political science and numerous other disciplines, and one I find incredibly interesting. By expanding Jamie's approach and problematizing the prisoner's dilemma from a political science perspective, I think we can glean some design lessons that may help minimize poor in-game behavior in the future.

I recommend readers take a look at the original article, as it offers some interesting analysis I will try not to repeat here. For the benefit of those unfamiliar with the scenario, however, I will follow Jamie's lead with this description of the prisoner's dilemma from psychologist Robyn Dawes:

"Two men rob a bank. They are apprehended, but in order to obtain a conviction the district attorney needs confessions. He succeeds by proposing to each robber separately that if he confesses and his accomplice does not, he will go free and his accomplice will be sent to jail for ten years; if both confess, both will be sent to jail for five years, and if neither confesses, both will be sent to jail for one year on charges of carrying a concealed weapon. Further, the district attorney informs each man that he is proposing the same deal to his accomplice."

The scenario mapped above is relatively simple. If the goal of each criminal is to minimize his time in prison, the most logical decision is to confess. The confessing prisoner either goes free or, at worst, avoids the ten-year sentence he would serve if he stayed silent while his partner confessed. In this scenario, two rational actors reach a sub-optimal outcome.

As Jamie illustrates, this situation mirrors numerous videogame dilemmas, from javelin glitching in Modern Warfare 2 to Zerg rushing in Starcraft 2. Every time a potentially fun environment is ruined by gamers exploiting glitches or loopholes, the dilemma plays out on a small scale.
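To make the jail-time arithmetic concrete, here is a minimal sketch (in Python, purely my choice of illustration) of the payoff matrix Dawes describes. The sentence lengths come straight from the quote above; the snippet just checks which choice minimizes a prisoner's sentence against either possible response by his partner.

```python
# A minimal sketch of the Dawes scenario. Sentences are in years (lower is better).
YEARS = {
    ("confess", "silent"): 0,    # I confess, my partner stays silent: I go free
    ("silent", "confess"): 10,   # I stay silent, my partner confesses: ten years
    ("confess", "confess"): 5,   # we both confess: five years each
    ("silent", "silent"): 1,     # we both stay silent: one year on the weapons charge
}

def best_response(partner_choice):
    """Return the choice that minimizes my sentence given my partner's choice."""
    return min(("confess", "silent"), key=lambda mine: YEARS[(mine, partner_choice)])

for partner in ("confess", "silent"):
    print(f"If my partner will {partner}, my best move is to {best_response(partner)}.")
# Confessing wins against either choice, so both prisoners confess and serve
# five years each, worse than the single year they would serve by both staying silent.
```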

For many political scientists and peace theorists, the prisoner's dilemma is deeply problematic as a predictive tool, and exploring its efficacy and overcoming its parameters is of great importance to them. After all, this simple scenario has been used by some to justify nuclear arms buildup and preemptive warfare. Despite how often cooperation actually occurs, and how rarely human beings behave as purely rational actors, many people still point to the prisoner's dilemma to explain why people do not get along.

Cooperation does occur, and is more frequent, as Jamie rightly points out, when players participate in an unknown or infinite number of games. Over time, despite the tit-for-tat scenarios discussed in the original piece, cooperation tends to normalize between actors. Theoretically, this condition is met in any online game: there is no way to tell whether you will meet the same group of players again, or how many matches you will play with those same individuals in the future. Yet the dilemma still occurs with startling frequency. Why?
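In theory the repetition effect is powerful. A quick simulation, using standard textbook point values rather than anything from either article, shows tit-for-tat players locking into cooperation, while a habitual defector gains one cheap win and then drags both scores down, which only sharpens the question of why online play so often breaks down.

```python
# An illustrative iterated prisoner's dilemma with standard textbook point values
# (higher scores are better). "C" = cooperate, "D" = defect.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(history):
    """Cooperate first, then copy whatever the opponent did last round."""
    return "C" if not history else history[-1][1]

def always_defect(history):
    return "D"

def play(strategy_a, strategy_b, rounds=100):
    history_a, history_b = [], []   # each entry is (my_move, their_move)
    score_a = score_b = 0
    for _ in range(rounds):
        a, b = strategy_a(history_a), strategy_b(history_b)
        pa, pb = PAYOFF[(a, b)]
        score_a, score_b = score_a + pa, score_b + pb
        history_a.append((a, b))
        history_b.append((b, a))
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))    # (300, 300): cooperation locks in
print(play(tit_for_tat, always_defect))  # (99, 104): one cheap win, then both stagnate
```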
The answer might lie in another critique of the prisoner's dilemma. The scenario as described by Dawes is predicated on neither prisoner being able to communicate with the other. At least on the international stage, this is unrealistic. The significance of communication is simple. Most actors, be they nation states or gamers, are not so irrational as to pursue sub-optimal outcomes for cruel or vindictive reasons. People, in general, actually want to get along. Rather, without communication, misperceptions occur.

Let's take the current Starcraft 2 beta, for example. The dominant strategy plaguing the beta is early rushing, which denies the opponent the chance to experience all the game has to offer. Rushing is the most certain way to win a match. Now imagine two players who would rather play a longer, more fulfilling game. They want a good match but, above all else, they do not want to lose to a rushing opponent. Despite these shared interests, each believes the other's goal is merely to rush and win.
This same scenario can be applied to nuclear armament. Psychologist Scott Plous does just that, calling the situation the "Perceptual Dilemma." Incorrect perceptions lead to poor decisions and sub-optimal outcomes, which is why communication is so important. Talking is how actors inform one another of their intentions and preferences, dispelling misunderstandings.
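Here is a toy model of that perceptual dilemma applied to the rushing example. The utilities are invented for illustration, but they encode the preferences described above: both players would rather have the long game, yet once either believes a rush is likely enough, rushing becomes the "rational" choice.

```python
# A toy model of the perceptual dilemma. Utilities are invented for illustration:
# both players genuinely prefer the long game, but each decides based on a belief
# about how likely the other is to rush.
UTILITY = {
    ("long", "long"): 10,   # the fulfilling match both players actually want
    ("long", "rush"): -5,   # I build up slowly, get rushed, and lose badly
    ("rush", "long"): 4,    # I win quickly, but the match is hollow
    ("rush", "rush"): 1,    # a scrappy, forgettable skirmish
}

def best_choice(p_opponent_rushes):
    """Pick the strategy with the higher expected utility, given my belief."""
    def expected(mine):
        return (p_opponent_rushes * UTILITY[(mine, "rush")]
                + (1 - p_opponent_rushes) * UTILITY[(mine, "long")])
    return max(("long", "rush"), key=expected)

for belief in (0.2, 0.6, 0.9):
    print(f"If I think there is a {belief:.0%} chance they rush, I play: {best_choice(belief)}")
# Without communication, a pessimistic belief is enough to push both players into
# rushing, even though ("long", "long") is the outcome each of them preferred.
```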

Which brings me to an interesting question: can the prisoner's dilemma occur in single-player games? Certainly not by traditional definitions, but the theory still holds when one-sided exploitation occurs. Imagine if a player found a way to cheat in Braid, beyond just looking at a manual. This hypothetical exploit could diminish the game's impact, worsening the experience for the player.

On the other hand, a frustrated player may misunderstand a puzzle, perceive the game itself as being unfair, and act upon a misconceived slight. I do not think it is much of a stretch to draw comparisons with the perceptual dilemma here. In this case, unclear communication between player and developer is to blame. The player chooses sub-optimal outcomes for themselves because they consider the game intentionally and resolutely unfair.
So what does this mean for game design? First, and most obviously, games should not be built with fatal exploits. Second, anonymity is undoubtedly a factor. That said, the effects of anonymity are diminished when online identities are persistent, because players then know when they are interacting with the same individuals again.

Similarly, mutual enjoyment is more likely when players interact with one another repeatedly. I think many designers assume that more players and bigger matchmaking pools are always better. In actuality, poor outcomes are easier to diminish within smaller pools of repeated player interaction, not larger ones. Perhaps massively-multiplayer games should be less massive.
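A back-of-the-envelope calculation, assuming purely random one-on-one pairing (real matchmaking is more complicated), suggests why pool size matters: repeat encounters, and with them reputations and reciprocity, all but vanish in very large pools.

```python
# How often should you expect to face one particular past opponent again,
# assuming purely random 1v1 pairing within a fixed pool?
def expected_rematches(pool_size, matches_played):
    """Expected number of repeat meetings with one specific opponent."""
    return matches_played / (pool_size - 1)

for pool in (50, 1_000, 100_000):
    print(f"Pool of {pool:>7,} players: ~{expected_rematches(pool, 200):.3f} rematches over 200 games")
# In a pool of fifty, reputations form quickly; in a pool of a hundred thousand,
# you will probably never see the same opponent twice.
```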

Above all else, games should foster communication, and this includes single-player games. Players should know why sub-optimal outcomes occur, whether the cause is the computer or another player. Many online games limit communication between enemies for a myriad of reasons, some of them justified. Still, we should find ways to foster communication between players before, during, and after a competitive match. This can take the form of text or voice chat, easily understood symbols or gestures, a mailing system, or even a streamlined messaging UI. It may open the door to a few bad apples eager to verbally assault other players, but it is better than the alternative.
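To be concrete about the "easily understood symbols" idea, here is a hypothetical sketch; none of these names correspond to any real game's API. A small, fixed vocabulary of signals lets players declare their intentions before a match without opening a free-form channel for abuse.

```python
# A hypothetical pre-match signal exchange. The intents and class names are
# invented; the point is that a fixed vocabulary keeps the channel civil.
from dataclasses import dataclass

INTENTS = ("long_game", "anything_goes", "just_practicing", "good_luck_have_fun")

@dataclass
class PreMatchSignal:
    player_id: str
    intent: str  # must be one of INTENTS; fixed choices keep the channel civil

    def __post_init__(self):
        if self.intent not in INTENTS:
            raise ValueError(f"unknown intent: {self.intent}")

def exchange(a: PreMatchSignal, b: PreMatchSignal) -> dict:
    """Show each player what the other declared before the match begins."""
    return {a.player_id: b.intent, b.player_id: a.intent}

# Hypothetical usage: both players see each other's declared intent up front.
print(exchange(PreMatchSignal("PlayerOne", "long_game"),
               PreMatchSignal("PlayerTwo", "anything_goes")))
```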
Those bad apples are the ultimate source of paranoia. Many gamers have come to fear online play because of a handful of immature and irrational players, and the mass exodus of reasonable players from that online space only exacerbates rude online gaming culture. Many of us now think of competitive online games as venues for immaturity.

We continue to fuel our own perceptual dilemmas by giving credence to extremist players, the few who actually enjoy traditionally sub-optimal outcomes: griefers. Third parties and ban-hammer-wielding moderators can handle some of the worst offenders. In a transparent environment free of irrational actors, with open channels of communication, it becomes easier to mitigate the prisoner's dilemma. Nation states do it every day; why can't we?

9 comments:

  1. I think that anonymity is really the sticking point in all of this. Let's hypothesize a really simple intent-communication system for SC2, maybe even built into the matchmaking system, that lets you declare "I want a long game" or "no holds barred." Anyone who wants a long game is STILL going to be disappointed because all we've done is push the prisoner's dilemma around a little bit -- the same hustlers that create bunches of new low-level Battle.net accounts to stomp on n00bs in WarCraft III will declare that they want a long game, then proceed to rush anyway.

    (This doesn't even get into the "rush-or-don't-rush" religious war; the same philosophy that holds that you deserve to lose a 2D fighting game if you're too new to defend against the same repeated spammed move states that the later stages of the StarCraft tech tree are only there as fallbacks in case both players happen to survive the initial mutual rush.)

    It's possible that you could mitigate this somewhat with some kind of reputation system, but that has its own problems. I suspect that a matchmaking preference like "only match me with players above reputation X" would lead to a sort of cloistering that would make it hard for new, unrated players to get a non-jerk opponent.

  2. Great follow-through on the concept. There are definitely things that game designers can do to short-circuit these kinds of dilemmas.

  3. The thing is, Starcraft has been supported for long enough, and the "problem" was apparent enough in the first game that you can say with absolute certainty that rushing is an intentional part of the game, and it has counters baked into the design. Yes you have to know how to counter it, but you'd think that after the first couple times it happened you'd be searching for counter strategies. I mean, I don't think Chess is broken because my opponent has learned strong openings and I haven't, I just recognize that if I want to be competitive above the very most basic beginner level I need to learn the major openings and their counters.

    There's a strong argument to be made that rushing strategies make the game better, because they mean the game is tense and interesting from the moment it starts. This is not a bug or a flaw or poor balance; it's a design choice that simply isn't for everybody. And it's okay for there to be games that aren't for everybody. I just think it's more productive to respect Starcraft for what it is than to try to decrease the diversity of games available.

  4. @ Anthony B

    Anonymity is a tough nut to crack, you're absolutely right. It gives those griefers permission to wreak havoc, and a system of better communication does not, on its own, have a way of dealing with extremists, which is a problem on the political stage as well.

    I like the idea of a reputation system, which you could also tie into matchmaking (a rough sketch of such a filter, and the cloistering problem it raises, appears after the comments). It's easy enough to know if the player who said "I want a long game" decides to Zerg rush and win in five minutes, and demoting that player's reputation would notify others of their poor behavior.

    A good example of this in action is League of Legends, which makes the number of games you quit visible to everyone. If you have a high quit percentage, then players might choose not to play with you at all. That seems like a fair punishment to me.

    @ Jamie

    Thanks again for the inspiration.

    @ Julian

    I do not want to give the impression I'm complaining about rushing in Starcraft. I actually agree with you; I have seen plenty of rushes fail. Match types meant for new players are an easy way around the problem.

    With two people who do not know the countermeasures, however, the example is an excellent illustration of the dilemma. The current Starcraft 2 beta is an even better one because, to some extent, the purpose of the beta is to playtest all levels of gameplay and usher a new audience into the game. If that is, indeed, their goal for the beta, then Blizzard is plagued with rushing because of this dilemma. Some people, maybe the very type of person they want in the beta, are simply opting not to play at all.

  5. I think the problem is that "exploits" are considered a problem, when really it's that these games are too complicated for the developers making them.

    The solution to exploits that spoil the game is to just get rid of them. There is absolutely no way to design a system of "honor" in the context of online gaming. The game designer needs to be aware of any potential exploits and design that recognition into the game.

    Does that mean a ham-fisted "Off Limits" sign? Most definitely not. But more than likely, if your code spits out something you don't want, you screwed up somewhere and you need to go back and fix it, not hope that no one finds it or that they can be shamed into not using it to their advantage.

  6. The problem that I always have with the prisoner's dilemma is with the assumptions of what it means to be "optimal" and "rational."

    For some players (and nations for that matter) winning, or at least not losing, takes precedence over everything, including personal costs. Therefore, beating your opponent, regardless of whether it is done fairly or unfairly, is both optimal and rational.

    That being said, I still think social norms have to be used along with technical measures when addressing fairness. I understand Interactive Illuminatus' skepticism regarding honor on the Internet, but it can be done. Gaming sub-communities voluntarily police the behavior of their own members to ensure everyone abides by the agreed-upon norms. For example, I'd much rather play on-line with folks from the VGHVI or Gamers With Jobs than a random group of people because those communities have both explicit and implicit values that define their community without the help of software rules.

    Sometimes, as is the case in other media, what the creator accidentally included is as interesting as what they purposefully included. I'm apprehensive about letting all exploits be patched out, as doing so limits the ways in which the system may be tested. Of course, the definition and acceptability of an exploit or glitch is a very personal decision, but perhaps the price of Wavedashing is Zerg-rushing?

  7. I agree with Scott on all counts. My point was mainly that a developer cannot *assume* that communities like the ones you mentioned will appear--that the game is still a weak game if it has debilitating exploits, even if the players choose to ignore them.

  8. Damn, you're pretty fast on the draw Illuminatus! Thanks for visiting the site, by the way. :-)

    Exhaustive playtesting is probably the key to all this stuff.

    If (when) exploits slip through, the community-based solution definitely necessitates a pluralistic population. Something like Halo is big enough to have sub-communities, but small-population games can easily be dominated by certain types of players.

  9. @ Interactive

    Yeah, removing the exploits completely is the surest way to prevent the dilemma. But I agree with Scott on this one. You can't find and root out all the bad apples, but you can create a game that rewards good behavior and provides avenues for mitigating the dilemma. To some extent, it's about changing the gamer culture in a particular game while keeping potential exploits in mind.

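Since a couple of the comments above circle around reputation-gated matchmaking, here is a rough, purely hypothetical sketch of the "only match me with players above reputation X" preference Anthony describes, along with the cloistering problem he predicts: unrated newcomers clear nobody's threshold.

```python
# A hypothetical reputation-threshold matchmaking filter. Names and numbers are
# invented; the point is the cloistering problem for unrated newcomers.
players = [
    {"name": "veteran",  "reputation": 0.9,  "min_opponent_rep": 0.5},
    {"name": "regular",  "reputation": 0.6,  "min_opponent_rep": 0.3},
    {"name": "newcomer", "reputation": None, "min_opponent_rep": 0.0},  # unrated
]

def accepts(me, other):
    """True if 'other' clears my reputation threshold (unrated players never do)."""
    return other["reputation"] is not None and other["reputation"] >= me["min_opponent_rep"]

def mutually_acceptable(a, b):
    return accepts(a, b) and accepts(b, a)

for me in players:
    pool = [p["name"] for p in players if p is not me and mutually_acceptable(me, p)]
    print(f"{me['name']} can be matched with: {pool if pool else 'nobody yet'}")
# The veteran and the regular pair up happily; the unrated newcomer matches nobody,
# which is exactly the cloistering problem raised in the first comment.
```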