Gaming Education at College (5) - Good and Bad Lessons at Colleges

posted in Master of None
Published April 25, 2023

LAST: What Students Gain from College

This is part of a series of same-titled blogs focused on the teaching of gaming above the high school level. As a professor who taught (and sometimes still teaches) gaming, I'll share my perspective on the pros and cons. I'm going to try to go in depth on what to expect and what not to expect. I'll also answer common questions from students and from prospective companies who wish to hire. I teach at community colleges in the United States, so my experience may differ from other parts of the world. Take my opinion for what it is: the most authoritative. No, just kidding, but I wouldn't mind hearing any opinions for or against my own.

That last discussion was a small subset of a larger list of reasons colleges can be beneficial. Does college have weaknesses? Of course. However, each college has a different set of weaknesses. Just as colleges have their own strengths, they have their own limits as well. Some of these limits are due to resources and tangible constraints; others are due to methodology. To understand this more fully, let's look at what good colleges teach versus what bad colleges teach.

The first thing that I believe a college should prepare students to do is to make something. This sounds trivial, but many poor colleges would rather get into discussions about game theory, arguments about elegant coding, "applying 15 rules, steps, or guidelines to a better <blank>", and so on. These topics aren't bad, and they can take accomplished students and game makers to a higher level of understanding. However, for a freshman or sophomore, these topics tend to degrade into criticisms and opinion pieces with little output. Moreover, it's hard to discuss higher-level theory without ever doing application.

A college should teach students how to model, write code, develop interactive stories, create levels, and more. This should usually start with existing software or past creations. However, once the students know the basics, they should be pushed to do better until they can create something sophisticated. It should not look sloppy, simple, half done, or like part of a nice hobby that gets us a pat on the head from mommy.

Once students have accomplished these labor-intensive pursuits, piecing the parts together to form a whole can begin. Some students may get frustrated because they want to make a game right from the start. My personal opinion is that you have to fight them on this. You wouldn't construct a car before you have full knowledge of how gears work. It may be unpleasant, but this also helps them in another way: most game companies don't hire people to make a full game. Everybody works on their part.

Learning the full game-making process in the latter part of their education will help students start to think about bigger issues. This is a good setup for advanced degrees. Here you can focus on how graphic objects interact with mechanics, artificial intelligence, and more in-depth work while working in larger groups. Even then, I think the focus should still be on doing, but with more topics and issues to balance on the students' plate.

The last thing I will mention about what a college should teach may be unpopular with industry: it should offer students a lot more than gaming. In order to handle sophisticated problems and issues, you have to grow people's minds. The worst way to do this is to use a recipe approach for everything. That is what teaching gaming would devolve into were it not for exposing students to the wide menu of courses at colleges and universities. You need to put students in other environments where they are asked to think in ways that are unnatural to them. This is how they grow.

NEXT: Stumbling Blocks of Higher Education
