Knapsack problems

Some computer games draw a lot of ire because of what is sometimes called “Inventory Tetris”1. This is where the items the player is carrying are represented on a (finite) grid-like structure, and it usually crops up in action/RPG games. It can quickly become fiddly and annoying for the player, who has to decide what to keep (given there isn’t enough space for all the valuable doodads they may come across).


[Image: a grid-based inventory. By SharkD – own work, GPL, via Wikipedia]

It gets especially complicated when items are not a uniform size, going from “I can carry this many items” to “I can carry these items provided I can rearrange them to fit”. Which just evokes all the joys of packing2.

Given that “Ugh!” is many people’s initial reaction to this sort of puzzle, why do so many games include it? Well, some people’s reaction to any puzzle is “Ugh!”; we need a more compelling reason for avoiding it. Solving a packing puzzle can be very satisfying—just ask anyone moving house/apartment/etc. who has (finally) managed to sort out their furniture in their new room(s). Any puzzle can be a worthwhile challenge to include in a game, provided it harmonises with other aspects of the game.

Limited inventory also adds realism3 to a game—a character is not able to carry a small village in their pockets. But realism is not the be-all-and-end-all when it comes to video games. All games abstract away details from the real world to try to capture the core of an experience (does anyone ever run out of petrol in a racing game?). As such, the relevant question to ask is not “do I/my audience like this kind of puzzle?”, but “does this fit with the game’s core experience?”.

Let me give a couple of examples of where the “inventory tetris” mechanic fits, and where it doesn’t.

Sir, You Are Being Hunted


[Image from Rock, Paper, Shotgun]

This is a game about sneaking around to collect parts for the MacGuffin that will allow you to escape. Armed robots attempt to stop you. Along the way, you scavenge necessary supplies, like food and weapons (and stuffed badgers).

The whole experience is about coping with limited resources, and a restricted inventory forces you to prioritise. If you choose to leave your majestic stuffed badger behind, you could potentially come back for it later, but just getting to where you left it can be difficult (i.e. having to fight/sneak your way past the robots).

Adventure Games

With the traditional “adventure game” genre (think “Colossal Cave”, “The Secret of Monkey Island”, “King’s Quest”, “Myst” etc.), the player’s inventory is essentially unlimited. This may be because there are only a small number of collectible items in the game anyway, but of more relevance is that retracing your steps (in this type of game) is not interesting or challenging. If a player is at the front door and needs to open a parcel, a knife being in their inventory is essentially an abstraction for remembering seeing a knife in the kitchen and going to get it.

These games often induce a sort of kleptomania; experienced players will grab anything that isn’t nailed down because it’s bound to come in handy later, and it saves them backtracking.

Occasionally, a particular puzzle will require putting limits on what the player is able to carry, but these should be treated as exceptions, and not change the normal inventory mechanic. For example, in the text-adventure Hitchhiker’s Guide to the Galaxy, solving a certain problem involves the player traversing a narrow space which gives an excuse for them to only take one item with them. In Broken Age, carrying a particular item (noted in-game as being exceptionally heavy) means the player cannot cross a particular surface without falling through.

So, as with all game mechanics, inventory tetris has a place, but can be very annoying if it is used somewhere it doesn’t fit.


1 Or, more prosaically, an Inventory Management Puzzle, but that just doesn’t have the same pizzazz.

2 You may recognise this as an example of the knapsack problem, one of many NP-complete problems for which no efficient solution is known. I may burble more on this distinction in a later musing, if anyone is interested.
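For the curious, here is a minimal sketch (in Python; the function and the numbers are mine, purely for illustration) of the classic 0/1 knapsack: each item has a size and a value, and we want the most valuable subset that fits in the bag. The dynamic-programming approach below is “pseudo-polynomial”, so it copes fine with small capacities like an inventory grid, even though the general problem remains hard.

```python
def best_loot(items, capacity):
    """0/1 knapsack: items is a list of (size, value) pairs.

    best[c] holds the highest value achievable with capacity c
    using the items considered so far.
    """
    best = [0] * (capacity + 1)
    for size, value in items:
        # Iterate capacities downwards so each item is used at most once.
        for c in range(capacity, size - 1, -1):
            best[c] = max(best[c], best[c - size] + value)
    return best[capacity]

# e.g. a 10-slot bag and some scavenged doodads as (size, value) pairs:
loot = [(3, 6), (4, 7), (5, 9), (2, 3), (7, 12)]
print(best_loot(loot, 10))  # -> 18 (e.g. the size-3 and size-7 items)
```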

3 I use the term very loosely. 🙂

Better Game Stories In One Step

…it’s just not a very simple one.


A note to start with: this is focused on games where the story is an important component. Not all games are like this. Assume that we’re talking about action/adventure/RPG/etc. games with a significant narrative.


Few would dispute that a compelling story involves the following four elements:

  1. An interesting* protagonist
  2. …who wants something
  3. …but has to overcome obstacles to get it
  4. …and either succeeds or fails**

“Traditional” storytelling media (e.g. books, films) are pretty good at ticking these boxes (literally—for example, there’s a how-to book for movie scripts).

Following the same advice and patterns has worked … okay … for video games, but runs into the usual problem with an interactive medium. The player is the protagonist. This means you have a conflict between giving the player freedom to do what they want to do, and ensuring that the protagonist does what is needed for the next part of the story.

Different games manage this with varying degrees of success, and various techniques have been used (e.g. “gating” parts of the game to make sure the player experiences things in the right order). But players of some games have reacted loudly against being “railroaded”: feeling disconnected from the game, that their actions don’t matter, that the controls may as well be “Press X to see the next scene”.

Yet it should be easy, shouldn’t it? Games are all about the player/protagonist trying to overcome obstacles to achieve a goal. And games are pretty good at making the protagonist interesting—either through being a blank slate that the player can project themselves onto, or making appropriate use of pre-existing literary/filmic character design techniques.

Whether you refer to it as “ludo-narrative dissonance”, “lack of engagement”, “railroading”, or whatever else, I suspect the underlying issue with the story is the same: the player and the protagonist have different goals. As such, story progress (tied to the protagonist’s goal) leaves the player uninterested at best. If it gets in the way of the player achieving their goal, they may come to see the narrative as another obstacle.

An example of this is in open-world games where the player wants to muck about and explore, and becomes frustrated at the game trying to get them back to the main quest. Another example is a cut scene that presents a character the protagonist needs to rescue. The player is essentially told “this is your best friend”, but they’re thinking “no, Sam is my best friend****, this is just some random NPC that I’m going to be forced to rescue. Aw man, I hope this isn’t going to be one of those escort missions…”.

To fix this, we just need to make sure the player’s goal matches (or at least is compatible with) the protagonist’s. “Oh, is that all?” you might be thinking. The difficulty is how. To support my attempt at a general answer, I submit the following example.

Think of the opening scene of “Raiders of the Lost Ark” (What do you mean, you haven’t seen it?!?). Imagine playing through something like that in a game. You have to navigate various traps to obtain the magic +3 Sword of Wompage—a significant improvement over your -1 Blunt Twig of Equivocation. You then get a brief chance to use the Sword of Wompage before, just as you’ve escaped the collapsing dungeon by the skin of your teeth, the villainous Baron Smarmy Twirlmoustache shows up and takes your new toy away. I would suggest that at this point, the goals of you (the player) and the protagonist are in perfect alignment.

So what are some general principles we can draw from this?

  • Players won’t care about something just because they’re told to
  • They will care about something that affects gameplay
  • Cut scenes are better for introducing obstacles than goals
  • Baron Twirlmoustache is kind-of a jerk

Game developers already consider the various types of player motivation they want to tap into when designing gameplay (see the Bartle taxonomy, for a formal example); the next step is considering how to align the story with it as well.


* Note: “interesting”, not “likeable”. The main character doesn’t necessarily have to be someone the audience wants to be, or would like to meet, but the audience does have to be curious about what will happen to the character*** next.

** This doesn’t necessarily align with whether the story has a “happy ending”. Sometimes the best outcome for the protagonist is not getting the thing but realising they don’t actually want/need it.

*** One of the benefits of an ensemble cast is that different audience members may be intrigued by different characters, thus keeping a wider audience tuning in than if the focus was mainly on a single protagonist.

**** Few know that Frodo was an avid gamer. There had to be something to while away those quiet, lonely nights in Bag End.

Fear and Gender

A note to begin with: this post discusses gender issues. I don’t claim to be able to speak for all men, let alone all women, and I’m well aware that people are many and varied (probably in ways I’m not even aware of). So can we agree at the start that we are dealing with the [mythical] “average man/woman”? Right then…

I’m not sure if I have much of a conclusion/point (beyond “peoples is messed up, yo”); this is more expressing some thoughts on a topic that bothers me.


I recently read an article by Caitlin Moran (in The Times Magazine—behind a paywall unfortunately) called “What men need to know about women”. The gist was that women are exhausted (from trying to live up to societal expectations—basically “Women can have it all! You don’t have it all? You slacker!”) and scared.

Scared because ~50% of the population are bigger, stronger, and more aggressive than them. Scared because if attacked there is little they can do to fight back. Scared because—quite frankly—the statistics around violence and sexual assault (male attacking female) are terrifying.

This struck a chord with me. Not because I’m a woman and have experienced this fear. Not because I’d never heard it before (there is a theory that you have to hear something several times before it really sinks in). It was probably the way it was expressed.

You see, I’m a small man. I’m roughly average height, but I have to wear heavy boots if the wind is blowing. I can definitely relate to the sense of being aware that most people in the room are bigger than me. I don’t feel entirely comfortable walking home alone late at night; not generally afraid, just extra alert and cautious. I did spend a while being afraid, following an unpleasant encounter with a boisterous drunk (though that was weirdly location-specific), so I can appreciate that regular verbal harassment and the like would quickly erode one’s sense of safety.

As well as size, though, I suspect the worry is related to the impulse (or lack thereof) to fight back, which seems more of a cultural construct. For boys, there’s a (usually unspoken) encouragement to “hit them back”. It seems to engender an odd perspective in that, apropos of nothing, you occasionally find a thought lurking in the back of your mind to the effect of “yeah, I could totally take them down”. Even though my instinctive reaction is to freeze up when threatened (the lesser-known third option of the fight-or-flight response), I still entertain fantasies of showing an assailant that I was not to be messed with (straightens monocle imperiously), lest I feel myself “less of a man”. Stupid, huh?

I did kung fu for a few years, and one of the significant factors in me stopping was a mental block about hurting others. I was fine with learning moves, practising falls, hitting a bag, etc. I still consider the board-break from one of my gradings a particular achievement. But I blanched at doing more contact sparring-type drills, especially when it sank in that I was capable of seriously injuring someone by accident. There was also getting sick of the common bumps and bruises, and the fact that I could be easily knocked around (scrawny, remember? Technique has only limited benefit when your opponent is twice your size, which is why there are weight divisions in boxing/wrestling/etc.).

In contrast, girls are socialised to not cause trouble. I remember seeing it somewhere expressed as women shrinking and men growing (in terms of imposing themselves—or not, as the case may be—on the people/space around them). This doesn’t mean they are never aggressive, but it can often manifest verbally/emotionally rather than physically. Which, strangely enough, can be a most effective avenue for wounding men (again, remember this is about generalities and stereotypes).

I guess the only real option is to try our best to forget about these boxes we’re put in and just treat other people as people. “If it is possible, as far as it depends on you, live at peace with everyone.”

What’s wrong with turtles?

A while ago I was reading a blog post on Gamasutra about how to design a game so as to discourage players from “turtling”.

Just to make sure we’re all on the same page, “turtling” is a pejorative description of someone’s style of play. This type of player is focused on defence, bunkering down and rarely attacking.

What I found interesting was that, throughout the post and several of the comments that followed, I was nodding along with the author, thinking “yes, that seems sensible”. Then one comment stopped me in my tracks by asking—in effect—”why shouldn’t players turtle if they want to?”; I suddenly realised I was mindlessly following the prevailing attitude that says turtling is inherently bad; something the game designer ought to prevent.

There are several behaviours in the same (or similar) boat. Save scumming*. Pause scumming*. Cherry tapping*. Kiting*. Camping*. Some are more acceptable than others (depending on context**), but they are generally seen as being negative, “unsporting”, or “cheap”. This also seems to be susceptible to the actor-observer effect: we accept it when we do it, because we have perfectly valid reasons; we condemn it when others do it, because they’re just cheats.

Players Behaving Badly

So, are there ways you can design a game to prevent (or at least deter) such behaviours? Sure, but you have to be very careful that you don’t create a worse problem by doing so. To make sure the change actually affects the behaviour you want, though, it pays to understand why people act that way (not just why they say they did something, but what the underlying psychological principles are).

I believe all these sorts of behaviour share a common motive: people are generally risk-averse (preferring a “sure thing”) for gains, and risk-seeking (preferring to gamble) for losses. Most games are framed in terms of gains (increasing points, winning matches, etc.) rather than losses, which predisposes people towards what they perceive*** as being the best strategy. “Playing the percentages”. Not taking undue risks.

For example, imagine if in each level of a platformer (Super Mario Bros for example) there were five bonus stars you could collect. Completing the level gives you 50 points, and each star is worth 10 points. The stars are placed in locations that challenge the player—either requiring them to navigate through dangerous terrain, or defeat/escape powerful enemies. When you examine the playtest data, you find that, while some players try for every star****, most players don’t bother risking it.

So, let’s say you reframe things. The level is now worth 100 points, but you lose 10 points for every star you miss. And you find that, now that they’re thinking in terms of losses, players become more likely to risk trying for the stars, and overall more stars are collected. Success! Right? Except that players are also unhappier and more frustrated with the game; no-one likes being penalised. Probably not a good thing overall. You’ve reduced players turtling, and got them exploring more of your levels, but maybe they’re doing more save/pause scumming.
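To be clear, the two schemes pay out identically for every outcome; it is purely the framing, and therefore (per the gains/losses asymmetry above) the player’s appetite for risk, that changes. A quick sketch (Python, using the numbers from the example):

```python
def gain_framed_score(stars_collected):
    # Completing the level gives 50 points; each bonus star adds 10.
    return 50 + 10 * stars_collected

def loss_framed_score(stars_collected, total_stars=5):
    # The level is "worth" 100 points; each missed star costs 10.
    return 100 - 10 * (total_stars - stars_collected)

# The payouts are identical for every possible outcome...
for stars in range(6):
    assert gain_framed_score(stars) == loss_framed_score(stars)
# ...only the presentation (and the player's reaction to it) differs.
```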

Players Behaving… Badly?

Maybe we need to take a step back. Sure, there are situations in which you want to discourage [some of] these behaviours, but is it a big enough issue to expend much design effort on? To clarify my point, I want to think about why we get so annoyed at these behaviours.

This doesn’t only apply to video games; there are plenty of examples in the sporting world, too. Pick your favourite sport, and you can probably think of players or teams who are “turtlers”: cautious and attritional rather than daring. They may well be top players, with enviable records. How do fans, commentators, journalists refer to them? Dependable. Hard-working. Consistent. Making the most of their talent. But are they loved? Do fans drop everything and rush to their televisions when that player walks onto the field? Not so much. They may even be seen as selfish, and overly focused on their numbers. There are exceptions, but people seem more drawn to the audacious and flamboyant players/teams, who may lose more often, but gosh darn if it isn’t exciting either way.

And I think that’s the key word: exciting. Entertaining. Dramatic. High level sport is a physical contest, but in the modern world it’s increasingly perceived as a performance as well. Hence, of course you want your team to win, but you don’t want it to be boring. We’re distracted by our deeply-ingrained sense of stories. We’re disappointed if we don’t see aspects of the “Hero’s Journey” play out: our heroes must bravely venture out to face their foes. It’s equally easy for players to get caught up in this, and try to play in a way that doesn’t reflect their strengths or their character.

Most video games are not competitive sports. How about (within reason) we give players the space to enjoy the game however they want to play it, without judging them for not playing it “right”? Maybe, if the turtles don’t feel discriminated against, they’ll be more comfortable coming out of their shells.


* Rough definitions:

Save scumming
Repeatedly reloading save games until you achieve a desired result (even though you could have continued).
Pause scumming
Repeatedly pausing during time-critical sections to gain extra time to think or react.
Cherry tapping
Using weak/joke weapons/attacks to defeat an opponent. Requires either excessive training (so your character is far stronger than necessary), or wearing the opponent down via “death of a thousand cuts”.
Kiting
Repeatedly attacking an opponent then running away, thus slowly defeating them. Can also refer to teasing out individual opponents from a group rather than facing them all at once.
Camping
Lying in wait at a particular location to either have a clear shot at opponents or grab resources when they arrive.

** Generally, they are considered more acceptable in player-vs-computer situations, and less acceptable in player-vs-player situations.

*** Not necessarily the actual best strategy; humans are bad at probability.

**** Known as “achievers” or “completionists”. See the Bartle Test for example.

What’s the point?

Players of video games—particularly role-playing games (RPGs)—will often lament the problem of grinding (I suspect named in reference to “the daily grind”, but it is also sometimes referred to as “treadmilling”). The commonly-accepted definition of grinding is having to complete tedious and/or repetitive tasks. It often arises in the context of “leveling up” a character (essentially training to improve abilities).

Various workarounds have been proposed and/or implemented (see some examples). Completely removing the potential for grind would mean completely changing the leveling systems (which are otherwise tried, true, and effective), which would have significant consequences, so the approach de rigueur is to include some sort of payoff; a gold star for having defeated 100 swamp rats. This is applying an extrinsic reward to either motivate the player to grind, or placate them after a period of grinding.

While some aspects of game design—like the diminishing returns of experience points/leveling, and the random reinforcement of loot drops—are heavily informed by psychological findings, similar findings about the poor motivational effects of extrinsic rewards seem to have been passed over. Of course, it may also be that figuring out how to tap into intrinsic motivators is not only difficult, but also means getting back into the “overhaul the whole system” approach, which isn’t what we want.

I find myself wondering, though, if this is a case where the old adage “you never understand the solution until you understand the problem” applies. We have a definition of what grinding is, but maybe we need to consider why grinding is off-putting to so many players. Think of an RPG—whether it’s Ultima, Diablo, World of Warcraft, or even Pokémon—the parts of the game that are perceived as “grinding” aren’t mechanically different from the rest; they’re the parts where your goals are different. You need to get stronger before you can overcome the next challenge. Your character still gains experience and levels while completing quests, but it’s a side-effect. “Grinding” is when leveling up becomes the main goal. And that’s just not very interesting*.

We can see something similar in the world of sports. The equivalent would be playing a match that has no impact on which team wins the trophy, so the only advantage to the players is the potential for improving their stats (though there are still ticket sales, broadcast revenue, etc. to entice the higher-ups). For example, the fifth match of a five-match series when the score is already 3-1: such a match is referred to as a “dead rubber”, and in some cases is simply abandoned.

Maybe this perspective can help. Grinding doesn’t seem like grinding if there’s another reason for doing it besides boosting stats**. Earning a gold star doesn’t help, unless it makes a difference to later gameplay. Perhaps other characters could start referring to your character as “Bane of Swamp Rats”. Perhaps swamp rats become more likely to flee rather than attack. But something beneficial—give them a reason, not an arbitrary number.
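As a toy illustration (Python; the numbers and the flee rule are invented, not taken from any particular game), making the tally feed back into the world might look something like this:

```python
import random

def swamp_rat_behaviour(rats_slain, rats_until_feared=100):
    """Hypothetical: as the player's tally grows, swamp rats become
    increasingly likely to flee rather than attack (capped at 90%)."""
    flee_chance = min(0.9, rats_slain / rats_until_feared)
    return "flees" if random.random() < flee_chance else "attacks"

# Early on, rats mostly attack; a seasoned Bane of Swamp Rats mostly sees them scatter.
print(swamp_rat_behaviour(rats_slain=5))
print(swamp_rat_behaviour(rats_slain=95))
```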


* For most players, anyway. For some, it’s the main attraction, and that’s fine, but I don’t believe that’s the case for the majority.

** Partly because I was feeling the lack of a footnote, but also because this is a genuine side-issue: granularity. Sometimes the problem isn’t that there’s no other reason to kill all those swamp rats, but that you have to kill so many before it matters. It comes down to the same thing though: if you make the player’s actions feel meaningful, they’re less likely to get bored/frustrated with their progress. This is sometimes called “breadcrumbing”—leaving a trail of small markers/rewards to lead the player onward.

Moar Morality Mechanics

(N.B.: A mechanic is just the way something works in a game context. For example, rolling dice to determine the number of squares you can move is a mechanic.)

One of my recurrent interests is the representation of morality in games*. I recently encountered a couple of interesting and creative uses that I thought were worth sharing.

I’ve heard good things about (but have not played) a phone/tablet-based game called 80 Days which puts you in the role of Passepartout in attempting the titular round-the-world trip. As far as I’m aware, it’s a largely text-based pick-a-path adventure, with apparently huge amounts of detail about the world, the cities you visit, and the Steampunk/alternate-history setting.

The aspect of the game that I want to mention is that there is some sense of reputation in the story. One of the writing team has talked about a part of the game where (potential spoilers) you can choose to take a quick ship to another city, only to find when you get there that—because the crew of that ship are slave merchants—no-one else will associate with you, and your only way to progress is on a slave-hunting journey (end spoilers). I like the concept of, rather than your character having a tally of good/evil points based on the choices you made, other people have a measure of like/dislike based on what you’ve done (or are believed to have done). Others cannot see your intentions, or the limited information/options you had at the time. I suspect in this case, it’s a narrative device and not a specific mechanic (i.e. the system is not tracking any sort of “reputation points”), but I’m glad to see game designers thinking along those lines.

The other mechanic isn’t from a video game; recently I was introduced to Lords of Waterdeep, a “German-style” board game set in the world of Forgotten Realms. The gameplay is built around a set number of rounds during which players acquire resources in order to complete quests (which give them points). Naturally, you have to compete for those resources, and there are various cards that can be played which create additional bonuses/penalties, and so on.

The expansion (Scoundrels of Skullport) introduces a “corruption” mechanic, which does a good job of presenting a moral dilemma analogous to the real world:

  • There are new moves/cards that provide greater resources, but using them also gives you corruption tokens (“Is it worth it?”).
  • There are also ways to reduce your corruption tokens, but that uses up your (limited) turns and generally costs resources as well (“Restoring your reputation takes effort”).
  • At the end, players are penalised for every corruption token, with the amount determined by the total number of corruption tokens in play (“Less benefit if everyone’s cheating”).
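To make the shape of that end-game scoring concrete, here is a rough sketch (Python; the cost-per-token formula is my own stand-in, not the game’s actual numbers), where each corruption token costs more as the total amount of corruption in play rises:

```python
def corruption_penalties(tokens_held):
    """tokens_held maps player name -> corruption tokens they hold.

    Illustrative only: assume each token costs 1 point, plus 1 more
    for every 3 tokens in play, so widespread corruption makes every
    token more expensive for everyone.
    """
    total_in_play = sum(tokens_held.values())
    cost_per_token = 1 + total_in_play // 3
    return {player: n * cost_per_token for player, n in tokens_held.items()}

print(corruption_penalties({"Ann": 4, "Bob": 1, "Cat": 0}))
# 5 tokens in play -> 2 points each: {'Ann': 8, 'Bob': 2, 'Cat': 0}
```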

This makes for a very interesting game mechanic. As the game progresses, people can get fairly concerned about how much corruption there is, and how much of a penalty they are going to get hit with. I’d like to make additions to this mechanic, but I realise they probably wouldn’t work for this particular game (they would either not fit with the setting, or would overcomplicate things):

  • A sense of reputation, based not on “how much?” corruption you have, but whether you have any. Certain choices could be unavailable to those with a sullied reputation. This adds an incentive to being “clean”, as opposed to just being less corrupt than the other players.
  • The mechanic falls down slightly in the last turn. At the very end, the potential benefits of corruption skyrocket for the less-corrupt players, mainly because they can be more certain of the size of the penalty (and know that the other players will receive a greater penalty). It could be better if the cost were unknown until after everyone has finished. Perhaps the corruption tokens in play could be randomly distributed amongst the corrupt players?

It pays to remember that any mechanic has to fit with the type of game it’s used in, and the setting and tone of that game. My suggestions/preferences lean more towards idealism than cynicism; I generally like a bit of hopeful optimism, but I realise that’s not always the effect you’re trying to achieve. Either way, it’s nice to see something other than the clichéd “good/evil points”.


* I’m mainly interested in video games, but there are lots of interesting aspects to other types of games, too. In particular, a board or card game has to keep any mechanics simple enough to be feasibly enacted by people sitting around a table, whereas a video game can get away with extensive calculations that the player need not know about. This means a successful board game is usually a good case study in game design, as the mechanics will have been carefully refined and simplified.

(Only one footnote? I must be under the weather…)

The Forgotten

It’s been a while since I’ve felt like writing anything (or at least, anything that isn’t a rant about plumbers). What I want to address today, though, is the disparity between what we view as significant or valuable, and what actually is.

I’ve encountered the same idea so many times, from people in so many different fields: “If I’ve done things right, no-one will notice. If I mess up, everyone will be glaring at me.” I’d even venture to suggest that the vast majority of jobs are like this.

Think of a rock star, strutting their stuff in a big stage show—lasers, pyrotechnics, the works. If everything goes well, the audience reaction will be “Gosh [band name] were well radical!”*. If the lighting display is out-of-sync, and the spotlight fails to follow the lead singer around the stage, everyone will be complaining about the technicians.

But just pause for a moment. Whether a concert goes well, or bombs, how many people are involved in making it work? Advertisers, ticket sellers/collectors, sound/lighting/sfx technicians, “roadies”, prima-donna wranglers, and probably heaps more, but all the acclaim goes to the handful of oddly-dressed bods on the stage.

This leads to two rather odd mental blocks relating to the actual cost and the perceived value of the performance.

Firstly, people complain about the ticket prices, insinuating that they would be cheaper if the guitarist was willing to only buy one new Lamborghini this year, apparently oblivious (unless they consciously stop and think about it) to all the behind-the-scenes folk who also deserve to get paid**. It’s not that people are unaware, but our brains will take the easy way out given half a chance (I recommend the book “Thinking Fast and Slow” for anyone curious about this phenomenon).

Secondly, regardless of how ticket prices get parcelled out, the few jobs that do receive attention also tend to receive significant remuneration. Think of the (exorbitant) pay-packets of famous athletes. They can earn hundreds of times what, say, a teacher does (though not all do, to be fair; only those who reach a high enough level in a popular enough sport). But would anyone seriously argue that kicking a ball around on television is more valuable to society than teaching the next generation so that they can be content and productive themselves? Yet capitalism says otherwise, in one of its lies that western society has internalised: money represents value, ergo if you earn more money, you are more valuable.

What’s going on? Well, in typical fashion, we are measuring what is easy to measure and disregarding what isn’t. I’ve seen it pointed out that the reason a sportsman (and sadly, it is almost always a man***) can earn so much is that their performance works regardless of the audience—how many there are, whether they’re paying close attention or just watching for the atmosphere, etc. If televised, millions could be watching. For a teacher to do their job, they need to engage with each member of the class, which is just impractical once the class gets over a certain size****.

So, in a way, maybe this is a rant about plumbers. And everyone else doing those valuable-but-hidden jobs. Because I for one am very glad that you do what you do, and that I can take a shower without having to think about how the water gets there; this is a prompt to myself, as well as anyone else, that such things shouldn’t be forgotten.


* Maybe not in those terms. I may be showing my lack-of-hip.

** Please note that I am unaware of how much of the ticket price goes to the various parties. It may well disproportionately favour the performer(s), it may vary depending on the prestige of the act. But that’s a separate issue.

*** Again, separate issue. Important, yes, but this post is long enough already.

**** I make no claims as to what the feasible upper limit of a class size is—it probably depends on who both the teacher and the students are—but it’s certainly not in the hundreds, let alone the millions.

Changing Perspective

This is kind of related to/following on from my earlier posts about the maliciousness of technology, but looking into a specific example. I’ll try to present enough of an overview so it makes sense to those unfamiliar with the topic (which is probably most of you), but be aware that I’ll be glossing over a lot of the detail.

Many high-profile sports have made the most of television coverage to improve the decisions made by the referee/umpire (just being able to watch a replay can make a huge difference, especially if it can be slowed down). Cricket (yes, it’s my favourite sport. So there) recently introduced a few other tools to assist the umpires, collectively referred to as the Decision Review System, or DRS. Mostly, the system has been well received, as it has enabled both greater accuracy for tricky decisions (did the batter nick the ball? did the ball bounce before the fielder caught it? etc.), as well as being able to correct blatantly wrong decisions by the presiding umpire*.

Not everyone is happy with it, however. The Indian cricket board are against the technology, citing concerns about its accuracy**. Unfortunately, such things are never going to be 100% accurate, but I do think the system could improve on the way it presents its results.

The particular technology I want to focus on is called Hawk-Eye (it’s also used in several other sports) which uses multiple cameras to determine the ball’s position in space and track its movement. Besides being a general analysis tool, it’s mainly used to adjudicate LBW decisions.

A quick aside for those not familiar with cricket. If you already know what an LBW is, feel free to skip the next part:

The batter can be out if they miss the ball and it hits the wickets. LBW stands for Leg Before Wicket, and is designed to prevent the batter just standing in front of the wickets to prevent themselves getting out. It’s a complicated rule, but the basic idea is if:

  1. you miss the ball (with the bat)
  2. the ball hits your legs (or any other part of your body)
  3. the ball would have hit the wickets (had you not been in the way)

then you are out.

Everyone up to speed? Okay. It’s usually fairly clear when the batter misses and the ball hits their legs; what Hawk-Eye is used for is predicting whether the ball would have hit the wickets. If you’ve watched (on tv) a cricket match in the last couple of years, you’ve probably seen the outcome of this: a nifty graphic showing the path of the ball and where the system predicts it would have gone (in a different colour—see an example on youtube).

Seems pretty useful, right? Except… the prediction, like all predictions, is fallible. You cannot know for certain where the ball would have gone (even taking into account wind, spin, variable bounce, etc.). This is a good illustration of my topic. There are two major aspects to any piece of software: the underlying processes/calculations, and the user interface (i.e. how the user inputs information and receives results). In this case, the calculations are probably doing about as well as could be expected (the ball-tracking part is apparently accurate to less than ½ cm), but the user interface could stand to be improved.

This is a common problem. A competent programming team is generally able to build (and properly test) a program that performs its calculations correctly the vast majority of the time. But a user interface doesn’t have the same clear “this input should produce this output” requirements. It should be “intuitive”, “user-friendly”, and other such subjective adjectives. This makes it a lot harder to know if it’s “correct” or not. Fortunately, there are a lot of useful guidelines to follow, and it’s a very good idea to get representative users in front of the system as much as possible to see if they can make sense of it or not. But it remains a design process, and as such is as much an art as a science.

So, what is the most easily-remedied interface problem with these LBW predictions? The fact that the predicted path is displayed as exact, which conveys to the user that the ball would definitely have ended up there. The fallout from this is evident from listening to commentators and fans discussing LBW decisions: everyone treats the path as a singular thing. In fact, there’s a margin of error***. This could easily be shown (given that they already have these fancy 3D computer graphics) by representing the predicted path as a cone-like shape, indicating that the prediction is less accurate the further it has to extrapolate. Rather than giving a binary “hitting/missing” response, it could report a percentage certainty (e.g. we are 73% sure that the ball would have hit).
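To give a feel for what that might look like, here is a back-of-the-envelope sketch (Python; the error model and every number in it are invented for illustration, and are not Hawk-Eye’s actual workings): the further the path has to be extrapolated, the wider the uncertainty, and the verdict comes out as a probability rather than a yes/no.

```python
import math

def hit_probability(predicted_offset_cm, extrapolated_distance_m,
                    base_error_cm=0.5, growth_cm_per_m=1.0,
                    stump_half_width_cm=11.4):
    """Chance the ball would have hit the stumps, given the predicted
    lateral offset from middle stump and how far the path had to be
    extrapolated. Invented model: uncertainty grows linearly with
    the extrapolation distance."""
    sigma = base_error_cm + growth_cm_per_m * extrapolated_distance_m

    def cdf(x):
        # Cumulative distribution of a normal with mean 0 and std dev sigma.
        return 0.5 * (1 + math.erf(x / (sigma * math.sqrt(2))))

    # Probability that the true position falls within the width of the stumps.
    hi = cdf(stump_half_width_cm - predicted_offset_cm)
    lo = cdf(-stump_half_width_cm - predicted_offset_cm)
    return hi - lo

# Predicted to pass 8 cm from middle stump, extrapolated over 2 m:
print(f"{hit_probability(8, 2):.0%} sure it would have hit")  # roughly 91%
```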

It may seem counter-intuitive, but this lessening of clarity would make it more useful. The most important principle of user interface design is to consider what the user wants to do (not what is most convenient for the programmers). The user is the umpire, who wants to gather evidence to make a decision. If there’s clear evidence, they can feel confident in making their ruling. If the evidence is inconclusive, traditionally the benefit of the doubt is given to the batter. Either way, the tool is doing what it’s intended to: making the umpire’s job easier and helping them make a better decision. It’s also making it clearer to the viewers that it’s not a sure thing.

False confidence does not help. If I claim that I’ve measured my piece of string to be 12.4637 cm long, but my ruler only has centimetre markings, I’m not being precise or accurate; I’m fabricating. It would be more honest to say the string is approximately 12.5 cm long.


* There is an issue surrounding whether players respect the umpires (which is undermined by being able to essentially say “No, you got it wrong, look again”), but that’s another story.

** They may well have other issues, I don’t claim an encyclopaedic knowledge of the situation, but that is the reason I’ve seen given.

*** Which the system acknowledges in that—for example—the ball being shown to just clip the wickets is not enough to overrule a “not out” decision.

The Pain Barrier

I recently saw a news article about a basketball player who suffered a fractured wrist early on in a game. He continued to play. Apparently he also spent the day on an IV drip owing to illness, and it was uncertain whether he would play at all. This got me thinking about sports injuries, and the attitude towards them from players, coaches, and spectators.

In New Zealand, there are a lot of stories of this ilk (playing on with a serious injury), mainly about various All Blacks (the NZ men’s rugby team, for those not in the know). One notable example is the current captain, who—amongst other things—admits to concealing a broken foot in order to keep playing in the 2011 World Cup. Rather than being called out for this, it enhanced his reputation as a “real man”. Similar tales have assumed somewhat mythical status (google Wayne “Buck” Shelford or Colin Meads for other examples).

Which is probably a bad thing.

Why? Well, another way in which sports injuries have been in the news recently is the treatment of concussions*. More care is being taken to ensure players don’t end up with permanent brain damage, which is a very good thing. It’s becoming expected that if you take a blow to the noggin, you get checked, and players aren’t allowed back until they’ve been medically cleared. All very sensible, and laudable.

If only the same attitude were applied to other serious issues. Sadly, there’s still too much of a “man up!” attitude in a lot of sports, and so players keep on, risking worsening their injury, or even doing themselves permanent harm, because they don’t want to be seen as weak or uncommitted.

In fairness to the players, injuries are often exaggerated (particularly by the media—the more drama the better), and I know from experience that breaking a bone may not hurt a lot at first (once you relax for a few minutes and the adrenalin wears off, though…). Plus, for minor injuries (e.g. scrapes and bruises) it is valuable to be able to ignore them and carry on. That’s why there needs to be medical staff who are able to be objective**. Take the decision away from the player (they can keep their never-back-down image intact), and either patch them up, or pull them out as necessary.

Attempting to continue with a serious injury is heroic if the alternative is death. If the alternative is losing a game, it’s foolhardy.

(By the way, this post has focused on sportsmen. I’m sure the same issue affects sportswomen, but it’s exacerbated by the whole “real men don’t show weakness” guff.)


* In contact sports, like rugby or gridiron. It’s somewhat rarer in table-tennis.

** No human being can be completely objective, but the medics will hopefully be focused on the players’ health first and foremost.

Emergent Genre

There seem to be differing views of what “genre” is, at least as it pertains to video games. I suspect partly this is due to the fact that there are differing views of what a “game” is. One viewpoint, which is probably working its way forward as we speak: “Why does it even matter?” The answer is very simple, and gets back to the entire point of language in the first place.

We use tokens (sounds, letters, images, etc.) to communicate. You cannot communicate effectively without an agreement on the meaning of the tokens. If you think that “cheese sandwich” refers to a piece of cheese between two slices of bread, but I think “cheese sandwich” means being stuck between two tv news anchors, our conversation could get rather confusing.

Of course, conversations can (and do) get confusing all the time, because lots of words have multiple meanings, nuances, and associations, so in order to have a formal* discussion on a particular topic there must first be a clear agreement of the meaning of particular terms used. This is why any field develops its own jargon, and why academic articles, textbooks, contracts, etc. can come across as characterless and bland.

Some may still be asking the same question. After all, they’re only games; why would we want to be formal about them? Thus we turn a corner and bump into the “Can games be art?” debate. For what it’s worth, I think they can be, but that’s not really germane to this topic (genre, in case you’d forgotten. I almost had). Regardless of their artistic merit, games are certainly worthy of critical** analysis. If we cannot examine what worked (or didn’t) about a particular game, how can we expect to make the next game better? Plenty of movies have the aim of entertainment, without aspiring to high art, but still benefit from a clear understanding of the art of cinematography.

So, back to the topic: How to define genres for video games? To a large extent, I suspect (as implied by the title), that genres emerge from the exploration of a medium as the creators codify aspects that work well together. They can be influenced (basically, given a head start) by pre-existing genres in another medium, but they must take into account the techniques and details of the newer medium. “Hard-boiled” detective/thriller novels served as inspiration, but noir is a uniquely filmic genre (including particular styles of plot, lighting, music, etc.).

Similarly, games must establish their own genres based on the features of the medium. Aspects of look, tone, plot, etc. can be taken from other media (film seems a particular favourite as a source of inspiration), but the “ludic” aspects must be involved as well. We can see the beginnings of this: an FPS (First-Person Shooter) would once have been referred to as a “Doom-clone”***. However, I think these “genres” are as yet too broad to be of much use; the genre of a game should give an indication of the style or tone as well as how you interact with it. For example, Doom (horror/action) and Portal (sci-fi/puzzle) could both be classed as FPS.

I’m confident we will eventually settle upon a more useful classification system; we just need to keep paying attention to what aspects of a game contribute to the overall experience.


* As opposed to a casual conversation. Most of the time, people are content to accept a little ambiguity (which is often quickly resolved by context) in order to facilitate conversation.

** That’s “critical” as in “critique”, not necessarily “critical” as in “criticism”.

*** A particularly notable example is often the starting point for establishing a category, which will probably be called “things like X” until a more suitable name is decided upon. An example of the transition can be seen in the debates over the term “rogue-like”.