Posted 28 July 2008 - 05:32 PM
*moderator hat on*
Everyone take a chill pill please. It's quite okay to have a strong opinion, particularly in the academia vs industry debate, but please try to keep the discussion polite... or at least avoid directing personal attacks at each other.
*moderator hat off*
On the original question of researching the use of NNs in games: that's quite valid. Go for it. Just don't expect anyone to actually use ANNs just because you might find a valid application for them. As has been pointed out several times in this thread, there almost always exists an alternative solution to a problem that an ANN can solve (and usually *how* it solves it is more easily and more widely understood). Doing research for research's sake is not a good use of your time. You should be looking for quantifiably useful results. That is, research must have significance AND importance. Thus, you should be looking at problems and asking "can an ANN solve this better than the existing methods?" A significant body of previous research, though, has shown that, generally, the answer is no.
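To make that concrete, here's a small, purely hypothetical sketch (the feature names and weights are my own invention, not taken from any shipped game): target selection for an NPC is a problem often pitched as an ANN application, but it's usually handled with a hand-tuned weighted utility function instead. Its chief virtue is the one mentioned above: you can see exactly *how* it arrives at an answer.

```python
# Hypothetical sketch: NPC target selection via a transparent weighted
# utility function. All names and weights are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Enemy:
    distance: float   # metres from the NPC
    health: float     # 0.0 (dead) .. 1.0 (full)
    facing_us: bool   # is this enemy currently aiming at the NPC?

def threat_score(e: Enemy) -> float:
    """Higher score = more urgent target. Each term is readable and tunable."""
    score = 0.0
    score += 10.0 / max(e.distance, 1.0)   # closer enemies matter more
    score += 3.0 * (1.0 - e.health)        # wounded enemies are easy kills
    score += 5.0 if e.facing_us else 0.0   # prioritise immediate danger
    return score

def pick_target(enemies: list[Enemy]) -> Enemy | None:
    # Returns None if there are no enemies in sight.
    return max(enemies, default=None, key=threat_score)

if __name__ == "__main__":
    squad = [Enemy(30.0, 1.0, False), Enemy(8.0, 0.4, True)]
    print(pick_target(squad))  # the close, wounded, attacking enemy wins
```

If the NPC starts picking silly targets, you can inspect which term dominated and nudge a single weight; with a trained network you'd be staring at an opaque weight matrix. That debuggability is usually why the "boring" alternative wins.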
Some comments on the parallel, off-topic discussions of game education and industry vs academic innovation...
In Australia over the past 5 years, many universities have jumped onto the game dev/design education bandwagon... 10 years ago there were only 2 places in Oz you could go to study games. Now it's more like 20. This is a recognition of two things: 1) there is strong demand for 'cool' courses (and those perceived as vocational) amongst high school graduates; and 2) there is a demand from industry for graduates who have some basic understanding of the problems that must be overcome in the production of games software.
Traditionally, though, the role of universities has been to develop scholarship and equip graduates with the skills for life-long learning. These skills can of course be learned outside of the university environment. The role of universities is not (or at least, should not be) to teach people how to do a specific job. Those skills should be learned through practice, while on the job. Unfortunately, in this modern, economically focused age, universities have been forced to sacrifice scholarship for 'graduate outcomes' (meaning employment prospects) because industry does not want to bear the expense of training workers. The result is that universities now try to cater to what industry wants and what students want, rather than to what society needs. Hence the rise in games dev programs. (There is also another driver: market growth in the games industry due to the 'leisure lifestyle' of Gen Y... but that's a discussion for another day.) We should not, though, expect universities to churn out people who are job-ready on day one. It simply isn't possible. Graduates have a lot to learn, and it's up to industry to choose those most capable of learning and employ them when the need arises.
Having said that, there ARE very good degree programs teaching game development in a computer science/software engineering framework, where students learn fundamental skills applicable across a broad spectrum of IT roles but also focus heavily on game development. One would expect that graduates from these programs are useful to industry. Sure, they're wet behind the ears and need to learn a lot... but at least they have some basic foundations from which to grow.
As for innovation...
I cannot recall the source of the data, nor the exact figures (so please, take this with a grain of salt), but I remember reading that around 95% of innovation in IT was achieved by industry rather than academia, and that this was simply because it was industry trying to solve the day-to-day problems in software development. In other words, they needed a solution, so they went out and developed one. That doesn't mean, though, that academia is a waste of space and money. The role of academia is NOT to produce commercial applications of knowledge, nor to produce knowledge with immediate commercial value (although this does happen from time to time). Indeed, because there is no inherent, immediate commercial value in what academics do, many people denounce them as useless.
On the contrary, though, academics are afforded the luxury of the time and money to investigate problems that *may* have a commercial value in the future (or may lead to an advancement of knowledge). In Australia we have two government-funded research streams, provided by the Australian Research Council, to support this: Discovery grants and Linkage grants. These are aimed, respectively, at developing fundamental knowledge (Discovery) and developing commercially viable applications of fundamental knowledge (Linkage). The latter is always done in partnership with industry. Thus, at least in Australia, the role of academics is to solve the problems, or develop the knowledge, that industry has neither the time nor the money to investigate, simply because they cannot guarantee a benefit to their bottom line. We get to look at the big picture, or the fuzzy, distorted picture that no one else can afford to look at, to find new solutions to old (or new) problems.
Principally, our aim is to inform industry of what is possible and to provide them with a strong foundation from which they can develop the solutions that they need. Both groups are necessary. Without industry, academia has no funding support (no one paying taxes that fund the research) and without academia, industry has to bear the cost of the research upon which their innovations are often based (and it's been shown time and again that industry cannot afford to do this). In the end, we all need to get along with each other, which, if I recall correctly, was the original comment in this post! ;)