Sunday, July 29, 2012

I like my women like I like my keyboard shortcuts...

I keep starting posts that turn into grand, sweeping philosophizing, and I'm trying to get away from that.  (Hence the delay in updating.)  So to make things a little more tractable, I'm going to take a small-scale approach and discuss a common gaming pet peeve: the unskippable cutscene.

There are all sorts of reasons to hate the unskippable cutscene.  For one, cutscenes in general violate the inherent participatory nature of the video game.  The Half-Life games were praised as revolutionary in their time because, unlike their contemporaries, they told the story through the game itself.  You experienced the entire story through Gordon Freeman's eyes, and it was an incredibly immersive experience as a result.  There was no fade-out and cut to a third-person view every time the game designers decided it was exposition time.  Hell, if you wanted, you could just walk away from the people talking to you and start hitting stuff (including the people talking) with your crowbar.  Gordon Freeman is a mute MIT physicist - he's clearly on the autism spectrum, and people will understand if he just does that kind of thing from time to time.

Gordon Freeman's teeth grinding and hand flapping proved just too unsettling in preliminary game testing, but if you look closely, NPCs' reactions to it are still coded into the game.

Taken from a human factors perspective, though, I can think of two major reasons why cutscenes can be so annoying.  

Requirements Analysis

First, good design involves a strong understanding of your creation's requirements.  There's quite a bit of formal theory built around requirements analysis, but I'll present one popular take.  There are basically four major requirements you have to work through when designing something (I'll sketch them out as a quick checklist right after the list):

Functional requirements: things your thing absolutely has to be able to do; if it can't do them, people will think your thing is broken.  For instance, a calculator has to be able to perform arithmetic (correctly) on numbers that you enter into it.

Indirect requirements: things that have to be present to make the functional requirements possible.  To keep going with the calculator example, you need to have a power source of some kind.  (Don't start with me, abacus nerds.)

User requirements: who's your audience?  What do they know and what do they expect to be able to do with your thing?  Are they men or women?  Big or small?  Young or old?

Environmental requirements: where is your thing being used?  A solar-powered calculator is useless to an astronomer at night.  But then she's probably using a scientific calculator or a computer anyway.

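To make that concrete (for the programmers in the audience), here's how I might jot the checklist down in code.  This is purely an illustrative sketch - the class names and category strings are just my shorthand for the framework above, not part of any formal requirements-analysis toolkit:

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    """One requirement and the category it falls under."""
    category: str      # "functional", "indirect", "user", or "environmental"
    description: str

@dataclass
class DesignSpec:
    """A bare-bones requirements checklist for a product."""
    product: str
    requirements: list = field(default_factory=list)

    def by_category(self, category):
        """Pull out every requirement of a given type."""
        return [r for r in self.requirements if r.category == category]

# The calculator example from above, expressed as a checklist.
calculator = DesignSpec("pocket calculator", [
    Requirement("functional", "performs arithmetic correctly on entered numbers"),
    Requirement("indirect", "has a power source (sorry, abacus nerds)"),
    Requirement("user", "readable display and familiar key layout for non-experts"),
    Requirement("environmental", "usable where the work happens (solar-only fails at night)"),
])

for req in calculator.by_category("functional"):
    print(req.description)
```

The point isn't the code itself; it's that every design decision - cutscenes included - should be traceable back to one of those four buckets.
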
Gameplay and cutscenes have a very tense relationship under this sort of framework.  The core gameplay mechanics serve as a functional requirement (or perhaps an indirect requirement for behavioral reinforcement...), but what is a cutscene?

In the games that use them, they can serve a range of purposes.  They provide motivation to the player.  They inform the player of the next thing they have to do.  And in story-centric games, they may even be the very thing the player is primarily interested in.  Depending on their implementation, they can be considered a functional or indirect requirement.  So then why do they get so annoying?  For one, they're annoying if they don't accomplish any of the things I just listed above.  If they're uninformative, provide superfluous narrative, or are just plain insipid, you're gonna piss people off.

Oh, my god, I don't care.  I just want to sneak around and kill terrorists.

Then there are people who have absolutely no interest in the cutscenes, and just skip them because they're standing between them and playing the game.  This can indicate that the cutscenes aren't serving a purpose in the game to begin with, and their presence becomes questionable (sorry if you just wasted hundreds of man-hours making them).  For one thing, they're failing to meet their requirements; worse, they're actually interfering with the main functional requirement(s) of the game.  As with so many things, an ounce of prevention is worth a pound of cure.  This is why game devs need to sit down and figure out what they're trying to accomplish with their game, and how they want to do it.  Is this going to be gameplay driven?  What purpose are the cutscenes serving?  Do we need them?  What constitutes a justifiable reason to have a cutscene?

But then there's another, even simpler reason why the unskippable cutscene is so annoying, and it's closely related to a concept from user-interface design: flexibility.

Flexibility

In computing, flexibility refers to the degree to which a program can provide an optimal experience to users of different expertise levels.  Keyboard shortcuts are the prototypical example.  In a well-designed program, the functions to which keyboard shortcuts are mapped will be organized in some kind of menu or toolbox structure that allows users to find or stumble upon them.  However, as a user starts using a function more and more often, navigating through menus or toolboxes becomes cumbersome.  Fortunately, through that frequent use, the user notices and becomes familiar with the keyboard shortcut listed next to the function, and can use it to access that function quickly without having to go through the UI infrastructure.  And so, keyboard shortcuts provide an optimal experience for novice and expert users alike, while providing a support structure to aid in the transition between those two levels of expertise - all without getting in the way of either kind of user.  (You'll find many games ignore that last part; the Sequelitis video I put up in the first post speaks to that already.)

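If you want to see the principle in miniature, here's a toy sketch of how a program might wire the same command to both a menu path and a shortcut.  The names (Action, ActionRegistry) are mine and hypothetical - this isn't any real GUI toolkit's API - but the idea is the same one your word processor uses:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Action:
    """A single command, reachable by two routes."""
    name: str
    menu_path: str                 # discoverable route for the novice
    shortcut: str                  # fast route for the expert
    run: Callable[[], None]        # what the command actually does

class ActionRegistry:
    """Maps both routes to the same underlying function."""
    def __init__(self):
        self._by_menu = {}
        self._by_shortcut = {}

    def register(self, action):
        self._by_menu[action.menu_path] = action
        self._by_shortcut[action.shortcut] = action

    def invoke_menu(self, path):
        self._by_menu[path].run()

    def invoke_shortcut(self, keys):
        self._by_shortcut[keys].run()

registry = ActionRegistry()
registry.register(Action("Copy", "Edit > Copy", "Ctrl+C", lambda: print("copied!")))

registry.invoke_menu("Edit > Copy")    # the novice's route
registry.invoke_shortcut("Ctrl+C")     # the expert's route - same function either way
```

Same function, two doors in: one you can browse your way to, one you can hit without looking.  That's flexibility.
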
Can you imagine how annoying it would be if you had to go to the Edit menu to copy, then again to paste, every single time you wanted to copy and paste something?  It seems so small, but just try doing that for a day.  You won't last an hour.  For an expert user, having to go through the process, as simple and short as it is, feels incredibly slow and frustrating.

Sound familiar?

As a hardcore completionist when it comes to games, I can't even relate to why someone would skip a cutscene on the first pass through.  But as a hardcore completionist, I have found myself on a fifth pass through a game wanting to stab the game dev's eyes out for making me sit through the same cutscene yet again.  Unskippable cutscenes are a pain because, at the very least, they ignore the principle of flexibility.  They ignore the fact that the player might already be an "expert," intimately familiar with the goals and story of the game, who just wants to get back to the action.

Unskippable cutscenes are just bad design.

But wait!  There's another side to this coin.  Unskippable cutscenes are becoming more and more rare these days, I'll admit, but this is also giving rise to another problem: the too-easily-skippable cutscene.  I've probably had this problem even more often than the unskippable cutscene, to be honest.

I've lost count of how many times I've been in this situation: You're sitting through a cutscene, no idea how long it is because this is your first pass through the game, and you've really gotta pee.  You don't want to miss anything, but holy crap, your kidneys are gonna explode!  You never had to leave the game mid-cutscene before.  Can you pause them?  I don't know!  This thing just keeps going...  But I can't hold it any longer...  Let's just hit start and maybe that'll--FUCK!

(Or, even simpler, you accidentally bumped a button, and poof!  Cutscene goes bye-bye.)

Now what am I supposed to do?  Granted, a well-designed game will provide you with some kind of redundancy to indicate where you're going next or what your next goal is if the cutscene served to tell you that, but what about the content itself?  That's content you paid for, gone.  If you're lucky, the game lets you re-watch cutscenes in some kind of gallery, but how often do you see that?  Now, that portion of the game is lost to you until you play through to that point again.  Well, ain't that a kick in the balls.  

(I started playing Saints Row The Third recently, and I've already managed to lose a cutscene.  It seemed to involve an energy drink mascot beating up a gangbanger, though, so I'm guessing I'm not missing anything, but the fact that I don't care speaks to the justifiability of your cutscene in the first place.)

What I'd look like if I had a sweet scar, a more chiseled face, and a better haircut.

Game devs wised up and provided an option to cater to the expert user, but now the first-time player is left hanging in the wind, which is just as bad (if not worse).  So how do we deal with that?  Once again, a simple principle from general UI design: error prevention.

Error prevention is what it sounds like: a means of keeping the user from doing something stupid.  

It's why your word processor pops up a window when you try to close a document that says something like, "Are you sure you'd like to close?  You haven't saved any of the additions you've made to your dissertation in the past five hours, you lunatic."  Or whatever.  No one reads text strings longer than about 5 words anyway.  

Error prevention is also why buttons that do awesome things in airplane cockpits and tricked-out cars have that little plastic door you have to flip up like a bad-ass before you do something that usually corresponds to the phrase, "punch it."

The ideal solution, as you can probably guess at this point, is pausable cutscenes that give you the option to skip them when you pause.  Thankfully, Square-Enix, responsible for 20% of the world's cutscene content despite producing .05% of all games annually, has implemented this in a lot of their recent games.  It's probably one of the few good game design decisions of Final Fantasy XIII.  

There's that and the Shiva sisters summon who form a motorcycle by scissoring.

Still, confirmation that you're about to do something stupid, at the very least, is a step in the right direction.  Diablo III is the most recent example of this that leaps to mind - you can't pause a cutscene, but at least it asks you if you're sure you want to skip it.  And you get a cutscene gallery in the main menu, so you don't have to worry about missing anything.

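If I were sketching the "pause first, confirm before skipping" flow as code, it might look something like the toy state machine below.  None of this comes from an actual engine - the class and method names are hypothetical - but it shows how flexibility (the expert gets a skip) and error prevention (a stray button press only pauses) can live in the same design:

```python
class CutscenePlayer:
    """Toy cutscene controller: any button press pauses; skipping requires confirmation."""

    def __init__(self, scene_name):
        self.scene_name = scene_name
        self.state = "playing"          # "playing", "paused", or "skipped"

    def on_button_press(self, confirm_skip):
        if self.state == "playing":
            # Error prevention: a bumped button only pauses - no content is lost.
            self.state = "paused"
        elif self.state == "paused":
            # Flexibility: the expert explicitly opts in to skipping.
            if confirm_skip():
                self.state = "skipped"
            else:
                self.state = "playing"  # resume for the first-time player

def ask_player():
    return input("Skip this cutscene? (y/n) ").strip().lower() == "y"

scene = CutscenePlayer("energy_drink_mascot_brawl")
scene.on_button_press(ask_player)   # accidental bump: cutscene just pauses
scene.on_button_press(ask_player)   # deliberate second press: asks before skipping
print(scene.state)
```

Tack on a cutscene gallery in the main menu, Diablo III-style, and both the fifth-pass completionist and the full-bladdered first-timer are covered.
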
Flexibility is a particularly interesting principle with a multitude of implications in game design, so look forward to discussions of that in future posts.

Tuesday, July 17, 2012

Justifying My Existence, part 3

Ok, time for the glorious conclusion of my blogging trilogy.  Wherein I tie everything from parts 1 and 2 together, then celebrate blowing up the Death Star II with a big Ewok party in the woods.

(Everything else that's wrong with Jedi aside, how could blowing up the Death Star and killing the Emperor definitively end the Empire?  I mean, did the entire enterprise really hinge on one functional but partially unconstructed starship and two top leaders?  They can't possibly be outnumbered and overpowered at that point by the Rebel Alliance - characterized as the ragtag underdog throughout the series - after a single battle, which itself resulted in great losses for the Alliance's already dwindling force.  I'm not willing to believe that the sociopolitical infrastructure that is both a requirement and product of an intergalactic empire can be undone in one stroke like that.  But, whatever.  I bet the Expanded Universe answers these questions, but frankly, it just makes me feel too nerdy to have any knowledge from the SWEU.  Anyhoo...)

So let's recap where we've been:

Human factors was born out of the military wanting to know why their best-trained pilots were still crashing the most precision-engineered planes.  The answer: the technology wasn't designed to fit the limitations of the human operator.  A bunch of people came together to figure out how to engineer the technology around the human being, and bam, we have human factors.

Video games are at an interesting point in their history.  Graphical improvements have defined the step up from one console generation to the next for the past 30 years.  But we're getting to a point where graphics are not significantly adding anything meaningful to the video game experience.  Recent trends have moved towards employing motion controls, but no one is really crazy about them except for idiots (and even then, for not very long).

So what do these things have in common?

I'm going to answer this question with a graph.  'Cause that's how we do in academics.

I know there's an official figure for this somewhere, but I can't seem to find the one I have in my mind's eye.  That makes me wonder if I made the whole thing up, but it all sounds probable enough to be real.

Anyway, discussions of human factors usually involve a graph that looks something like this:

FUCK YEAH, GRAPHS!

What this graph tells us is that aviation incidents have decreased over time thanks to advances in engineering.  More reliable hardware means planes aren't spontaneously combusting and sputtering to a halt in midair so much, so of course you see a drop in aviation incidents.  That's super, but engineering will only get you so far.  You eventually hit an asymptote at the wall of human error.  Even with the best engineering possible, you're still going to have some percentage of people mistaking one dial or switch for another and getting caught in a death spiral.  Codified safety regulations also help to bring the numbers down, but even they only do so much.  You never really reach zero.

To break through that wall, you need human factors.  You need a method to systematically study what mistakes people still make, understand why they make them so you can develop solutions, and then systematically test the solutions to see what works.  When you start engineering the human-machine (or human-human or human-human-machine) system as a whole, then you suddenly find avenues for improvement that weren't there before.

Now consider this graph which I most definitely just made up:

Human Factors is driven largely by Nyan Cat

As with aviation engineering, we're getting to a point in video games where our hardware is starting to asymptote.  Technical advances are adding less and less to the experience, even as the hardware continues to improve monotonically.

(Sometimes, they even make the experience worse.  Remember how awesome it was when Dead or Alive introduced boob jiggle physics?  Then remember how horrifying it was when Team Ninja made the boobs jiggle independently from each other in wildly different directions on the Xbox 360?  WHAT HATH OUR SCIENCE WROUGHT?!)

OH GOD, WHY IS ONE UP WHEN THE OTHER IS DOWN?!  STOP RUNNING, FOR GOD'S SAKE.  You're hurting MY tits.

Back to the point, video games - in my opinion - are hitting a technical wall similar to the one I described for aviation.  As the technology evolves, things get better, but the rate at which they're getting better is slowing down.  (Before you jump up to argue with me on this, hear me out.)

Think about the Mario series.  Break it down into its core components (I'll discuss more on this in future posts), and you have this underlying thread - running and jumping through an obstacle course - that just evolves over time.  But think about how that evolution has played out.

We saw incremental improvements going through the 2D Mario games on NES, and those were awesome.  But then the SNES came out, and the graphical leap allowed game designers to introduce enemies and environments they never could before.  Enemies could be huge or tiny, they could change form, they could respond differently to different attacks.  Your environment could shift and move around you in new ways, creating situations and challenges that people had never seen before.

Then the Nintendo 64 came out, and it was an even bigger leap.  There was not only a leap technically, but also in the experience afforded by this open, three-dimensional world.  Challenges and puzzles could come at you from all new directions, and it fundamentally altered the way you approach and move through the game.  Once again, gamers were faced with a brand new experience they had never seen before.

Then the Gamecube came out, and it was cool because it was prettier, and the worlds were a little bigger, and for some reason you had a water jetpack?  I'm not normally one to denigrate a jetpack, but honestly, can anyone say that's as big an evolutionary step as the addition of a spatial dimension?

Oh, I almost forgot: Mario got short sleeves.  Progress.

And then the Wii came out with Super Mario Galaxy, and it was like Super Mario Sunshine minus the jetpack, plus some funky gravity physics.  And the revolutionary new control scheme the Wii was supposed to give us?  Wiggling your wand at the screen picked up stars.  (Oh, if only that were true in other facets of life.)

And now we have "New Super Mario Bros." which is basically Super Mario Bros. with pretty graphics and co-op play.  The evolution has become a cycle that's eating itself.

When an iconic figure in the industry like Mario is running out of places to go, you know something's up. The point is, video games are running up against this wall I talked about before; it looks like there's nothing else to improve on, but there's still this sense that we could be doing better if we knew what was holding us back.  And that's what human factors affords us: a tool set for making bad ideas good, good ideas great, and great ideas amazing.

Motion controls were one attempt to break through the wall by taking things in another direction.  While that was a valiant approach (one that some say has ruined a glut of modern video games), it's become a joke among the core audience of gamers.  Understanding why it failed (and how to fix it) is one major application of human factors.  One reason that I've mentioned already (though there are many, in my opinion) sits at the core of human factors - the developers pushing motion control didn't think about the users.  They designed a blender for someone who wanted a toaster; sure, we have this cool new gadget, and they made some money off some protein-shake-drinking douchebags, but a lot of us are still waiting here with cold, pallid bread in hand.

And now, I've officially spent way too much time pontificating.  Enough with the mission statements.  Next post will be more like what I had originally intended for this blog.

Tuesday, July 10, 2012

Justifying My Existence, part 2

Ok, in retrospect I don't know why I teased the ending of the history lesson.  The big reveal is that the human factors engineers went back home to their universities, and continued their research.  The result was the beginnings of human factors as a field.  There were now scientists interested in the systematic study of human beings in the context of a technological system.

Although I use the term "technological," it's important to note that human factors covers more than what we would normally consider technology.  For instance, the way a restaurant's kitchen staff interact with one another to deliver food properly and efficiently can be considered a technological system to the human factors researcher.  The real interest is in understanding humans as part of a greater system rather than the human per se (cf. psychology's study of the individual's mind).  It's a multi-disciplinary field in that regard, as human factors is interested in the cognitive psychology of a person's mental processes, the biomechanics of how they move and act on the environment, the sociology of how their broader cultural contexts influence their behavior, and so on.  As a cognitive psychologist, though, you'll find I'm primarily interested in what's going on between people's ears.

I'm not talking about video games nearly as much as I want to be, so let's rein it back in.  Why should video games care about human factors?

Back in 2005(ish), Nintendo's head honcho, Satoru Iwata, explained the company's philosophy behind the Wii.  (I wish I could find the original interview, but it's getting swamped by Wii U stories.)  In essence, he explained that the defining change from one console generation to the next up to that point had been a boost in graphics, but that we were reaching a plateau.  It may have been a while since you've seen it, but check out the original Toy Story:

Back in 1995, this blew our freaking minds.

It took 300 computer processors 2-15 hours to render a single frame of the movie.  Perhaps with the exception of the self-shadow effects, the entire movie looks like a current-gen, in-engine cutscene.  For instance, check out what the PS3 pumps out in real time for the Toy Story 3 movie tie-in game:

Woody facing off with his infamous nemesis, skinny bandana man

Graphics are responsible for innovation in games insofar as they can more veridically represent a developer's vision, and the gap is rapidly closing.  We've hit a point where games are sufficiently pretty.  You're unlikely, for instance, to mistake someone's headband for their face in current-gen graphics.  

Agahnim from The Legend of Zelda: A Link to the Past

And as any retro gaming nerd will tell you, graphics don't even affect how the game plays.  Whether you're reflecting energy balls back at Agahnim with your bug net on an SNES or Ganondorf's magic blasts with a bottle on your 3DS, it still feels awesome.  And it's about as much fun in all the other Zelda games since LttP, which just about all used it, too.  (Nerd burn!)

In fact, the only example I could think of off the top of my head where a step up in graphics had a direct impact on gameplay was when Nintendo (ironically) first used Link's eye gaze as a means of providing puzzle hints in Wind Waker.

Fun fact: You automatically direct your attention toward the target of other people's (especially cel-shaded people's) eye gaze (Friesen & Kingstone, 1998).

Getting back to my point, Iwata explained Nintendo would not significantly upgrade the Wii's graphics capability over the Gamecube, and instead would focus on upgrading the means of interacting with the game.  That, he argued, would lead to the next great innovations in gaming.  Granted, it eventually became a lot of mindless waggling, but Nintendo accomplished their goal of shaking up (har har!  PUNS!) the video game industry.  The new gimmick proved profitable and now every major console company is trying to cash in with their own make-an-ass-of-yourself-by-wiggling-some-part-of-your-body-around-but-oh-god-not-that-part-of-your-body-except-maybe-for-that-one-time-you-wanted-to-see-if-your-dong-was-big-enough-to-get-detected-by-the-Kinect method of game control.

(By the way, because I know you're curious now: the answer is yes.)

But here's the thing: the tactic was profitable because Nintendo drew in a whole lot of new (*shudder*) casual gamers.  The assumption was that the Wii would be a gateway drug of sorts, luring in upstanding members of society with a few minutes of tennis here and a few minutes of bowling there.  They could never figure out how to do that on an Xbox with its multiple joysticks, 11 visible buttons, 2 invisible buttons (L3 and R3), and its XTREME 'TUDE.  But this, you're just wiggling a little white stick!  How cute.

Then in order to justify spending hundreds of dollars on what amounted to an afternoon diversion, those same people would try out something a little headier - like a Metroid or an Okami.  Then BAM, you have a new generation of gamers pre-ordering the next iteration of your system.

Or do you?

Of course you don't.  That question always means you don't.  You tapped into a completely separate market that behaves in a totally different way.  You tapped into a market that tries out that new "Dance Dance Resolution or whatever" game at the movie theater one time on a date but then never again (OMG! He thought I was such a nerd!  Lolololol).

You tapped into a market that can justify buying a $150 Netflix box or a $200 smartphone and sees games as fun, two-minute time-killers that they won't pay more than $2 for.  (Try selling those guys a 50-hour RPG for $50.)

You tapped into a market that would never understand why you'd pay $65 for a 16-bit game cartridge in 1994 no matter how awesome it was or how many times you've replayed it since then, and what does that even mean, the battery died and you can't save your game anymore?

Point is, those profits are punctate.  Nintendo didn't create repeat customers.  It's not a coincidence that console game sales are down and PC game sales are way up (though Steam offering cross-platform games probably also has something to do with that - how am I supposed to resist the complete Assassin's Creed series at 67% off?  DAMN YOU, GABEN!).  It made Nintendo a buttload of money, but it left a lot of gamers with a bad taste in their mouths.

So what role can human factors (and psychology) play in all this?  That's for the next post to explain.

Thursday, July 5, 2012

Justifying My Existence, part 1

Any discussion invoking human factors would be remiss if it didn't spend some time justifying why human factors is relevant and important.  Even in internal reports and presentations at an organization with a human factors division that regularly serves as a step in its protocol, you will see at least some form of justification for why the division should exist and why it did what it did.  Honestly, I've never seen so much dismissal of a discipline from scientists, and such frequent defense of it from its practitioners, outside of conversations that begin with someone saying, "my degree is in literature."

Anyway, we'll start with a little history lesson.  What the heck is human factors?  Like I said in my introductory post, it is - in part - applied psychology.  It's often characterized this way because of the field's historical roots.  And as with all great advances in science, it was born out of World War II.

As an aside, the field also has roots coming from "efficiency experts" of sorts such as Frank and Lillian Gilbreth of Cheaper by the Dozen fame (the original, not the godawful Steve Martin remake).  For now, I'm going to focus on the scientific aspect.

As you are no doubt aware, World War II was a golden age for the development of new ways for humans to kill each other.  The field of aviation was one such booming area.  Do a Google image search for "World War II aircraft," and then do a search for "Pokémon."  You will find the results to be eerily similar to one another.

Man, Nintendo's really getting lazy with these Pokémon designs.

Point is, the world was engineering the hell out of aircraft.  We were making them bigger, tougher, faster, more agile, painted to look like sharks.  Engineering was hitting a plateau.  (Especially with that shark thing - have you ever seen a plane painted to look like something more bad-ass than that?)  Aircraft represented the pinnacle of contemporary human technology.  And so, of course, you can understand why the military would go to great lengths to make sure the operators of their ludicrously expensive, cutting-edge tech were extensively trained to skillfully exploit it.

But here's the problem: the military kept seeing plane crashes that had nothing to do with combat.  Extensive, thorough training with these machines, all the single-minded discipline of the top pilots of the air force, and the cleverest mechanical engineering just weren't enough to keep these things intact and pilots safe to the military's standards.  What was happening?  With the razor-sharp clarity of hindsight, you probably know where this is going.  Let's take a look at a World War II era cockpit.  I never claimed to be a WWII aircraft buff, so I'm not going to pretend I know anything about this plane, but just look at this mess:

Cockpit from a B-24 Liberator, which sounds like the name of a sex toy to me.

Humans (and their brains) evolved to do some pretty awesome things.  The fact that I'm transmitting thoughts from my mind to yours through the use of visual symbols is itself staggering.  But humans, in terms of sheer hardware, evolved to basically hang out with a few other humans, pick berries, catch the occasional small animal for dinner, and run like hell from the occasional big animal looking for dinner.  

Evolution has not sufficiently operated on our bodies and minds to optimize them for driving down a road in a 2-ton death machine at 70 mph, let alone soaring above the clouds while relying on something like what you see above to keep us up there.  But I'm jumping ahead in our story.

The military brought in scientists from different fields to try to answer the question: why do our planes keep crashing?  Setting a precedent for future generations of human factors specialists to follow, the psychologists in the room stated the obvious: the cockpits are too damn confusing.  There was simply too much information being shoved down the bottleneck of human attention, and the pilots just couldn't manage it all while having to simultaneously operate the things.  The major culprit wasn't something about the mechanics of the aircraft - it was human error.

And so, the psychologists, being the amazing wizards of understanding and systematically studying human behavior that they are (HA!  Just kidding - the field didn't really get its shit together, in my opinion, until behaviorist and cognitive psychologists started duking it out in the 60s and everyone had to become that much more precise and rigorous in order to establish scientific dominance.  But these guys were still pretty good.  Oh, man, this ended up being a really long parenthesis.  Where was I?  Oh, yeah, the psychologists, as experts in studying human behavior), isolated the sources of human error and worked with engineers to prescribe remedies to the problems.

Then life for the first human factors engineers became a victory montage set to 40s big band music.  Cockpit designs evolve, there are shots of guys wiping sweat off their brows and gazing thoughtfully at blueprints, pilots start crashing less, the war ends, and then a sailor in New York creates an iconic photograph by committing what amounts to sexual assault on a random nurse in the street.

But what of our scientists?

The military had no use for them anymore and sent them home like all those drafted military men.  But they were bitten by the human factors bug.  Their heads were still full of research questions the military didn't care about or otherwise deemed "out of scope" for their purposes.  And everyone knows how dangerous it is to leave a scientist alone with a burning question that dwells within him.

And so, like David Banner in the credits of the 80's Incredible Hulk TV show, the scientists were sent hitchhiking down a desert road to some oddly maudlin piano music...

To be concluded in the next post.