My wife was once highly offended by a cartoon mole.
One of the few games I’ve ever been able to get her into was Animal Crossing on the Nintendo GameCube. I thought it would be a good gateway game for her, but one day she announced that she was angry at the game.
“That stupid mole called me a cheater,” she said, and I immediately understood. Though ostensibly there to remind players about the importance of saving their game, Mr. Resetti the mole was also Animal Crossing’s reaction to people who tried to cheat by turning the game off without saving. Doing so would let you try for a better selection of random items in the game’s general store.
But if you did this, Mr. Resetti would know. And he’d be super pissed.
Apparently my wife had assumed the game autosaved and thus just kept hitting the power button when she was done. The result was an unavoidable and lengthy lecture from “the stupid mole” that stood in stark contrast to the saccharine tone of the rest of the game. During one of his diatribes, he actually berates the player by saying “You oughta be ashamed. Huh? What’s that? Speak up, you reset-happy CHEATER.”
That stings even if you WERE purposely resetting the game, because “cheater” is a powerful label. Breaking the rules in video games covers a wide range of activities, and a lot of them aren’t as bad as installing a wallhack or using a lag switch to become an impossible-to-hit target. There are small acts of cheating that many of us are probably guilty of: using a dictionary to win at Words With Friends, editing save files in Dungeon Defenders to get impossibly awesome equipment, or dropping out of online games in order to avoid getting a loss on our records. Heck, even buying illicit gold in massively multiplayer games can be routine if you know where to look.1
What kinds of circumstances make us more or less likely to leap into such transgressions? Besides the threat of punishments like VAC-banning on Steam or getting a Battle.net account suspended, recent research has shown that the threat of having to update our own self-image as a “cheater” or “a dishonest person” can be a surprisingly strong deterrent.
Researchers Christopher Bryan, Benoit Monin, and Gabrielle Adams tested this idea directly on the campus of Stanford University.2 They approached students and asked them to participate in tasks like flipping a coin ten times while trying to use THE POWER OF THEIR MINDS to make it land on heads as much as possible. They set subjects up to be able to cheat by recruiting them online and having them perform the task at home, on their own computers, where no one could verify the results. To motivate them to consider cheating, the experimenters offered $1 for every heads the subjects reported producing.
Here’s the thing, though: half the subjects were given instructions that warned them against “cheating,” while the other half received almost identical instructions that warned against “being a cheater.” For example, one group got “PLEASE DON’T CHEAT” at the top of their self-report form, while the other got “PLEASE DON’T BE A CHEATER.” The researchers guessed that the latter would be a more effective deterrent, since it more directly attacked people’s self-concept. And indeed, that simple nudge led those in the “don’t be a cheater” condition to report significantly fewer heads. The difference wasn’t huge, but it was there: an average of 4.88 heads in the “don’t be a cheater” group versus 5.49 in the “don’t cheat” group.
Other researchers, though, have found similar and much bigger effects through other, equally simple invocations of self-image. Nina Mazar, On Amir, and Dan Ariely did a great series of experiments3 where subjects sat in a group and were given sheets of paper, each containing 20 matrices of nine numbers. Their task was to find and circle two numbers in each matrix that added up to 10. Here’s an example I recreated:
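(The exact numbers below are just illustrative; the point is that only one pair in the grid sums to exactly 10, in this case 1.82 and 8.18.)

4.23   1.82   3.59
6.91   2.47   5.06
8.18   7.34   0.65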
Not difficult, but not so trivial that people would be likely to find all 20 pairs of numbers within the 5-minute time limit they were given. In fact, the researchers had done their homework and knew that in 5 minutes most people could be expected to solve around 7 of the matrices. In addition to the papers with the matrices, subjects were also given an envelope containing cash, from which they would extract their earnings at the end of the experiment. The more matrix puzzles they solved, the more money they got.
Mazar and her colleagues ran several versions of this experiment, but the general setup was that some subjects were given a chance to cheat: they were told to destroy their papers in a shredder and then self-report how many matrices they had solved. So people had both an incentive to cheat (more pay) and the freedom to do it. A control group did the same task, but knew their answers were actually going to be scored and thus had no chance to cheat.
From previous work with this task, the researchers knew that people who could cheat would generally do so, reporting an average of 12 solved matrices versus the control group’s 7. But they were interested in whether they could manipulate the amount of cheating by either protecting or endangering subjects’ self-image, specifically their image of themselves as honest people.
In one iteration of the experiment they highlighted moral standards by having subjects write down as many of the Ten Commandments as they could.4 The result? People who were asked to write down the Commandments but had the opportunity to cheat without getting caught didn’t do it. At all. Similar results came when they had subjects indicate that they understood their conduct fell under the purview of the university’s honor code.5 This is one reason I think games would benefit from putting anti-cheating messages on loading screens, or even having players agree to an occasional “I agree not to cheat/drop out/grief/whatever” statement before joining a multiplayer match.
But it turns out that people can also be nudged into cheating MORE. In one experiment, Mazar and her colleagues wanted to make it easier for people to label their behavior as something other than cheating. To do this, they simply paid people in tokens. This was kind of silly, since subjects immediately turned around and exchanged the tokens for cash, but it worked. In fact, it REALLY worked: subjects who cheated to get more tokens reported solving, on average, almost three times as many problems as those in the control group. Just from letting them think “I’m claiming tokens” instead of “I’m stealing money.”
This may sound absurd, but it matches up with the real world quite well. Stealing cash from the register? No way. But taking an extra-long lunch break without reporting it, or padding an expense report? Those things happen a lot more often than stealing cash of equal value. And I think video games facilitate this kind of thing by their very nature: nothing in a video game is physical, or even money; it’s all abstracted. Selling gold in World of Warcraft? Duping items in Diablo 3 and then dumping them on the real-money auction house?6 If you’re saying “That’s different” or “That’s not cheating,” then you’re doing exactly what Mazar describes: protecting your self-concept as a non-cheater by recategorizing your behavior.
But here’s the positive spin on all this: even when given the chance to do otherwise, people in these experiments only cheated a little. They reported only a few extra heads-up coin tosses and falsely claimed only a few extra solved matrices. The main way people protect their self-image seems to be by putting a throttle on their cheating impulses, and reminders of (or attacks on) that self-image can often throttle those impulses down even further.
Irate cartoon moles are optional but apparently effective.
Interesting. One thing I would note is that putting in reminders of basic decency and conduct can inflame the griefers who take great satisfaction from disruption. The more they are reminded that their actions impact others, the more it can egg them on. That would be an interesting article by the way. The psychology of griefers in games and real life.
Interesting read, as always. I do wonder about that Stanford study, though. When probabilities are involved, there’s the question of how large their sample size was. Did they take into account that people with a naturally high number of heads were more likely to cheat than people with a naturally low number? The law of large numbers wouldn’t have applied to each individual’s score, and ten flips is a very small statistical sample for probabilities. Each individual had only a 37.7% chance of getting more than half heads, so I’d expect that could’ve played into things.
The bit about cheating being more common when it’s abstracted is especially fascinating. I’ll have to look into that more.
I don’t really ever think about whether my gaming behaviour constitutes cheating. What I do think about is whether what I do is in violation of the Terms of Use, or whatever other agreement I’ve signed to be able to play the game. Buying gold/currency in an MMO is, in most cases, explicitly a violation of the Terms of Use. I don’t have a problem with it when it’s part of the in-game systems, like in Guild Wars 2 or Diablo 3, where it’s no longer illicit. Many people still see it as cheating, but the way I see it, when it doesn’t affect other people’s gameplay and it’s not breaching any agreement, it’s fair game.
I’d also be really interested in what Josh mentioned about studies on the psychology of griefers or just players in general who exhibit anti-social behavior.
My first thought after reading this is that this phenomenon is probably visible in any situation in which our self-perception is challenged. I recently wrote about stereotypes in games, and there are always responses to that kind of criticism implying there’s a clear line between the deliberate use of stereotypes and the unintentional. How great a role does protecting our self-perception play in what we view as harmful stereotypes? Or, as Josh asked, in what we view as bad behavior? I suspect it plays a significant role.
I also find it interesting what Thomas said about focusing strictly on what’s legal when it comes to games, because I think it’s one of the ways amoral and/or unethical behavior is made possible. That’s a fascinating article waiting to be written on its own, but not by the likes of me 🙂
Good article. This reminds me of the GDC talk by Riot Games on how they tried to improve player behavior and had some good results with loading-screen tips: http://www.gdcvault.com/play/1017940/The-Science-Behind-Shaping-Player
Really interesting stuff. I wonder if narrative context feeds into this, too. When I’ve played Professor Layton games in the past, I’ve found that I’m more willing to look up a solution for a puzzle when it’s set by one of the supporting characters as an aside to the main story. When it comes to the big narrative ‘beats’ – solving a puzzle to break into the villain’s lair, etc. – I’m less inclined, as it feels like I’m not properly ‘earning’ the reward of more story.
Great article. I’ve cheated in games before, and sometimes it enhances the gameplay experience. Other times, I feel guilty for doing so. It’s also interesting that most new games don’t seem to have actual cheat codes built in the way they did back in the 8- and 16-bit eras.