  • Student Postmortem: Northeastern University's Shortfall Digital

    [08.09.07]
    - Mark Sivak and Seth Sivak


  • What Went Wrong

    1. Scope and time management of Shortfall online. While we were brainstorming, we tossed around some amazing ideas that would have made for an incredible game. We originally slated animation and networking as two of our goals for the project, but in the end, many features had to go to the chopping block due to overambition and poor time management.

    Inexperience can breed overconfidence, and it warped our sense of how much time we really had, especially since we were two students juggling other classes and a rapidly approaching graduation date.

    Despite several days of crunch time, we missed our first play test date. The game was simply not ready. Luckily, our advisers were kind enough to let us simply discuss the game with the class, a mix of graduate and undergraduate mechanical and industrial engineers. That discussion proved incredibly helpful, and we heard some great feedback about the game despite it not being a finished product.

    Once we had a final set of features in place, we still underestimated the time it would take to actually create and test the game. As much as we can attribute this problem to our inexperience and limited knowledge of designing games in Flash, it's also an extremely common pain point even for professional video game developers.

    2. Using Adobe Flash as a platform. We had very limited experience with Adobe Flash. This became obvious during our first iteration of the game, which we attempted to finish for our first play test. We simply did not finish on time.

    We found it very challenging to have more than one developer working on the game at a time. What made it even more of a struggle was that we were pressed for time and had to work on everything in series rather than a few things in parallel. It got worse when we realized how badly we underestimated the complexity of the game itself.

    After missing the initial play test deadline, we decided to completely rework entire portions of the game. In the long run this was a valuable move: we had learned so much about programming in Flash and building browser-based games that we could go back and fix the errors we had made earlier. Reworking those portions also gave us a chance to optimize the code and clean up work we had already done.

    The released product, while not as polished as we had hoped, exceeds what we thought could be done given the timeframe.



    3. Balancing green score, money, current events, storage, and random markets. Shortfall was built as a simulation to reflect the real-life factors that go into engineering decisions. By nature, it's a very complex system with dozens of variables changing the dynamics behind the production and sale of automobiles.

    The sophistication of the real-life system made balancing the game the most challenging aspect of the entire project. We spent a great deal of time adjusting numbers and trying various strategies, but it seemed like we could never do enough. No matter how hard we tried, one strategy would end up working much better than the rest, and adjusting the variables to compensate would simply tip the balance toward a different strategy. One way such dominant strategies can be hunted down automatically is sketched at the end of this section.

    Feedback from the board game testing told us that many players felt the game was not fair and that too much was left to chance. Our goal was to eliminate any major balance issues while reinforcing that player choices, not random factors in the game mechanics, actually determined the outcome.


    Knowing that the game was going to support multiple players, with groups competing against each other, we felt that an obvious unfairness would draw attention away from the learning, as students would focus on what was unfair rather than on the actual topic of the game.
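    One way to surface a dominant strategy before a live play test is to script a few candidate strategies and run them against the same model thousands of times, comparing average outcomes. The Python sketch below is a hypothetical illustration of that idea, not our actual Flash code; the state variables, costs, and strategies are made-up stand-ins for Shortfall's far more complex model.

    ```python
    """Hypothetical balance-testing harness (not Shortfall's actual code).

    The real model has dozens of variables; this sketch keeps only the handful
    named in the postmortem (money, green score, storage, random market prices)
    to show how scripted strategies can be simulated in bulk to spot a
    dominant strategy before live play testing.
    """
    import random
    from dataclasses import dataclass


    @dataclass
    class State:
        money: float = 10_000.0
        green_score: float = 0.0
        storage: int = 0          # unsold cars held in inventory
        turn: int = 0


    def market_price(rng):
        """Random per-turn sale price, standing in for the game's random markets."""
        return rng.uniform(800, 1_600)


    def step(state, build, invest_green, rng):
        """Advance one turn: build cars, optionally invest in green tech, then sell."""
        unit_cost = 700 - 50 * min(state.green_score, 5)   # green tech lowers cost
        state.money -= build * unit_cost
        if invest_green:
            state.money -= 2_000
            state.green_score += 1
        state.storage += build
        sold = min(state.storage, rng.randint(5, 15))      # random demand
        state.money += sold * market_price(rng)
        state.storage -= sold
        state.money -= state.storage * 50                  # storage upkeep
        state.turn += 1


    def play(strategy, seed, turns=20):
        """Run one full game with a scripted strategy and return a combined score."""
        rng = random.Random(seed)
        state = State()
        for _ in range(turns):
            build, invest = strategy(state)
            step(state, build, invest, rng)
        return state.money + 1_000 * state.green_score


    # Two scripted strategies: one ignores green tech, one invests in it early.
    greedy = lambda s: (12, False)
    green_first = lambda s: (8, s.green_score < 5)

    if __name__ == "__main__":
        runs = 1_000
        for name, strat in [("greedy", greedy), ("green_first", green_first)]:
            avg = sum(play(strat, seed) for seed in range(runs)) / runs
            print(f"{name:12s} average score over {runs} runs: {avg:,.0f}")
    ```

    If one scripted strategy consistently outscores the others across many random seeds, that is exactly the imbalance a human tester would otherwise discover the hard way, and the offending numbers can be adjusted and re-simulated in minutes rather than play test sessions.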

    4. Instructions for gameplay. With the date of our first play test behind us, we made the decision to try the game out two months later on a much bigger audience: 80 mechanical and industrial engineering students.

    The day before the students play tested the game, they heard a lecture on the same subject matter as the game's content: environmental issues involved in engineering decisions.

    The day of the play test, we had 17 teams, each with its own laptop. The students took two knowledge tests, one before the game and one after, to assess what they knew coming in versus what they learned from playing, as well as their confidence level about the topics.

    We offered only a very brief introduction and tutorial, avoiding a lengthy explanation of how the game worked because we wanted to observe how students might learn through trial and error. The groups also received information packets that covered the basics of how the game's market worked.

    Unfortunately, a miscommunication prior to the printing of these packets left out the full innovation tree, which we felt would have expedited player choices and given students plenty of information to help them form strategies. If we could do it again, we would make sure the handouts included the full innovation tree. The instructions were presented as narrative memo handouts and were given to the students the day before the play test and then again on the day of the play test.
