Game Career Guide is part of the Informa Tech Division of Informa PLC

Learning to Play 1: Playing By Ear

    - Andrew Kuhar

    I've already broken the first rule of game design that was revealed to me:

    "Even when designing your own games, keep playing others'. They are your best learning tool. By playing games you become the 'player', and who are you making games for if not for other players?"

    This wisdom was handed down by my Swedish game design mentor, Knut Hybinette. It was one of the first lessons he shared with me as our interests in game design expanded. Even then I knew it was probably good advice, and even then I laughed in silence at how preposterous the problem sounded. Forgetting to play video games? Is that really all there is to it? Actually, no -- it isn't, at all. Making time to play games, let alone remembering to play them, can quickly turn into a serious issue when we're designing and developing our own.

    It was only last month that I recognized I hadn't bought and fully engaged in a new game in nearly a year. Coupled with the stress of self-criticism toward my own creativity, I suddenly understood how disorienting it is to lose the balance between playing games and making them. It's strange still, considering that as a college student in an art school I found myself in a natural habitat for gaming. It's an environment where games have forged a legacy as a quintessential pastime: they're socially acceptable, readily available, and they command attention amongst the bleak décor of dorm-room interior design.

    Another familiarity of college is the crash course in do-it-yourself time management, which eventually explains how this purge of play emerges -- and it happens to be the shorter story of the two. Chances are that if you're spending time making games, playing them will fall outside your budget. The longer, more pertinent story is what this situation brought about in me as the two-sided coin of designer and player began to flip, fresh out of college.

    There isn't necessarily a word for this phenomenon, and I'm wary of privileging one role over the other after discovering that each needs the other to remain meaningful. At some point we decide we want to be game designers, and then we begin to mull over when we'll actually own up to those words. If we carry the scent of a gamer, our interest in behind-the-scenes footage and developer postmortems often trails close behind. Any novelty enveloping this wears off as soon as we unearth the truth like a regular Mulder and Scully, slowly comprehending just how much is required of us to create an actual video game, and how little we knew about it.

    With the clues there all along, things start to make sense: the never-ending list of credits after every AAA game, the bugs that make it into a final build, the environments that were blocked off so that one might never lay eyes on their un-textured grotesque. I'm not trying to make excuses for the under-sung hours of game development, but making games takes unfathomable amounts of time and can easily take the place of that which enticed us to make them in the first place: playing them. Shifting over from player to maker begins having a serious effect on our perception of creativity -- at least it has for me.

    For instance, take the concept of solving a problem: solving problems in-game is very different from solving problems out-of-the-game. This is especially interesting when the problems designers face are directly involved in tailoring the in-game problems for players to solve. Gamers have been trained to look for solutions in a certain way while playing games rather consistently for the past couple of decades. We recognize that puzzles and situations can only be rectified with what's already at our disposal, placed before us, or referenced from past experiences in those virtual landscapes.

    There's this eureka moment that feels familiar in either situation, yet the events leading up to it are anything but. Process of elimination isn't as effective when working outside of games to make games, because the potential for solutions is not only intimidating, it's constantly expanding. In-game, we know there is a solution -- there has to be. There has to be an escape route somewhere, right?

    Nothing truly prepared me for that moment on the other side, when for the first time I finally had to admit: I'm not sure there is a solution -- at least not yet -- and until there is, I have to try inventing one. As a rookie especially, I run into this situation repeatedly, and the hoops I've had to jump through in both the planning and execution stages are perhaps the nerdiest thrill I've ever come across. Neither epic loot nor brain-twisting puzzles can hold a candle to the sort of voltage that ownership and originality conduct.

    Additionally, consider how we consume the culture of games. Given that the internet's role in gaming has taken a much fuller shape, the pace at which we can retrieve information we'd otherwise find in the games themselves has dramatically shifted. When I'm busy working on assets and designs for a game project, I'll often take breaks by reading articles online or listening to podcasts from various competing media outlets. All the noise helps me feel informed by a spectrum of opinions, but for a while now I've felt as though I've played most games through the experiences of others. As I've loosened my grip on the reins of playing games, I've unintentionally put them in the hands of everyone else.
