    10 Steps To Succeed as a Freshman Game Developer

    [03.23.10]
    - Chris Ridmann

    So, you want to be a game developer? Well, starting out as a game developer is a daunting task, especially because games are perhaps the most complex pieces of software to develop. While I am by no means an expert in game development (or even out of college), I do have a unique perspective on starting out as a game developer. Last year I finished the freshman year of my Computer Science degree at the University of Illinois at Urbana-Champaign, and I was (and still am) on the Chromatactix development team, which formed within an organization at U of I called Gamebuilders.

    Chromatactix is an action-adventure game we are developing for the Xbox 360 in which you and up to three friends survive alien invasions. Gamebuilders, as the name suggests, develops video games for both the Engineering Open House at our school and the Independent Games Festival. Our presentation at the Engineering Open House was an overwhelming success; we had a steady line of people waiting to play our game throughout the entire two-day event, and kids even brought their friends back the next day to play it. Personally, it was very gratifying to see so many people enjoy our game - all of our hard work paid off!

    With a year under my belt, I'd like to share some thoughts and reflections on why our project did so well and offer advice for future projects. Just because you're a freshman or just starting out in the industry doesn't mean you can't make a killer game!

    1. Take the Initiative to Teach Yourself

    Coming into college, I had very little actual programming experience - the only experience I had was developing websites and dabbling with C++. I had never made a video game (though I thought it would be cool), and I had never even taken a programming class in high school.

    The courses you take in college will give you a very solid base of understanding; however (at least in my experience so far), they won't teach you the frameworks or engines the real world uses. You'll get a lot of good experience by applying the theory and concepts from school to practical applications. For example, I used the concepts from my "Intro to Programming" class, which taught Java and object-oriented programming, to teach myself C# - a language very similar in syntax to Java - so I could work with Microsoft's XNA Framework and start developing video games.
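    As a hypothetical illustration of how directly that Java knowledge carries over, here is a small C# class of the kind you might write for a game object. The names are invented for this example, but the class, interface, and constructor syntax should look almost identical to what an intro Java course covers.

        // Invented example: familiar OOP syntax for anyone coming from Java.
        public interface IUpdatable
        {
            void Update(float elapsedSeconds);
        }

        public class Enemy : IUpdatable
        {
            private float health;             // fields and access modifiers work as in Java
            public float Speed { get; set; }  // auto-properties are one of the few new ideas

            public Enemy(float startingHealth)
            {
                health = startingHealth;
            }

            public void Update(float elapsedSeconds)
            {
                // movement, AI, and other per-frame game logic would go here
            }
        }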

    Besides my passion for video games, knowing that what I teach myself applies to many areas of the software development industry motivates me to spend a lot of time learning outside the curriculum. Making a game isn't mutually exclusive with everything else - a game is still a piece of software, so it falls under the realm of software development. I have applied what I know from game development to a variety of other programming projects.

    Furthermore, I was able to land an internship as a software engineer through one of our career fairs - something I would never have been able to do as a freshman if I hadn't demonstrated my involvement outside of the curriculum. The experience you gain from applying what you learn and teaching yourself is invaluable and will give you a leg up on everyone else.

    2. Pick a Technology and Go With It

    When I was just beginning game development, I spent a lot of time meticulously analyzing which technologies I should use to build a game. What language should I use? Should I learn DirectX or OpenGL? Should I use a pre-built engine? What games should I make first? The research went on and on. Don't get me wrong - it's important to do your homework before you jump into the first technology you see, but there comes a point where you have to make a decision.

    One of the best pieces of advice I got was to just use the language and development environment you are most comfortable with, because at the end of the day you are more likely to grasp the concepts and enjoy what you are doing if you like the technology you are working with. I ended up choosing C# because I was much more comfortable with it than with C++ or any other language; the garbage collector and the general feel of the syntax made it far more pleasant for me to work with than C++.

    You might be thinking that most of the game industry uses C++, so why not just use that to develop your games? At least while you're learning, the difference between these technologies doesn't really matter. Sure, you have more control over memory management with C++, but chances are your first few games aren't going to be commercial products.

    The concepts related to game logic, artificial intelligence, collision detection, algorithms, and general programming paradigms are language independent, so it's best to learn with whatever you're comfortable with. The XNA Framework lets me focus on exactly those things. For example, instead of worrying about how to draw a polygon mesh on the screen, I can just call an XNA function that interacts with DirectX to draw it, letting me get to the fun stuff right away.
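    To make that concrete, here is a minimal sketch of what drawing a model looks like in an XNA Game class (the class name and the "ship" asset name are made up for this example); a single Draw call replaces all of the vertex-buffer and DirectX setup you would otherwise write yourself.

        // Minimal sketch of an XNA Game that draws a 3D model; the asset name is hypothetical.
        using Microsoft.Xna.Framework;
        using Microsoft.Xna.Framework.Graphics;

        public class DemoGame : Game
        {
            GraphicsDeviceManager graphics;
            Model shipModel;

            public DemoGame()
            {
                graphics = new GraphicsDeviceManager(this);
                Content.RootDirectory = "Content";
            }

            protected override void LoadContent()
            {
                // The content pipeline turns the exported mesh into a Model object for us.
                shipModel = Content.Load<Model>("ship");
            }

            protected override void Draw(GameTime gameTime)
            {
                GraphicsDevice.Clear(Color.CornflowerBlue);

                Matrix world = Matrix.Identity;
                Matrix view = Matrix.CreateLookAt(new Vector3(0, 5, 10), Vector3.Zero, Vector3.Up);
                Matrix projection = Matrix.CreatePerspectiveFieldOfView(
                    MathHelper.PiOver4, GraphicsDevice.Viewport.AspectRatio, 0.1f, 1000f);

                // One call handles the DirectX-level work of submitting the mesh to the GPU.
                shipModel.Draw(world, view, projection);

                base.Draw(gameTime);
            }
        }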

    To illustrate the point that most game programming (probably most programming in general) is language independent, I'd like to share an experience from our school's Game Career Night. Our team was showing off Chromatactix to several developers from Volition (Red Faction, Saints Row), and they immediately broke our game down into its components - collision detection, AI, and so on. They really liked what Microsoft was doing with XNA, saying that this technology makes it much easier to get into game development.
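    That component-level thinking is easy to see in something like collision detection: the axis-aligned bounding-box test below is written in C#, but the same handful of comparisons works essentially unchanged in C++ or Java (the Box struct here is invented purely for illustration).

        // Illustrative axis-aligned bounding-box overlap test; nothing here is XNA- or C#-specific.
        public struct Box
        {
            public float X, Y, Width, Height;

            public bool Intersects(Box other)
            {
                // Two boxes overlap only when they overlap on both the X and Y axes.
                return X < other.X + other.Width &&
                       other.X < X + Width &&
                       Y < other.Y + other.Height &&
                       other.Y < Y + Height;
            }
        }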
