Final year project

9 comments, last by Psyk60 15 years, 11 months ago
Next year is my final year of university doing a computer science course. I won't be starting work on my final year project until a fair way into the next academic year, but I think getting an early start on brainstorming ideas for it is a good idea. Doing something that interests me not only means I'm more likely to get a better grade, but could also attract potential employers.

As you've probably guessed from me posting on here, I'm interested in game development, so I want to do a project related to that. As it's a computer science course, not a game development course, a game itself isn't really appropriate for this project. I'm looking for something that is academically interesting from a computer science point of view and can also be applied to games. Another requirement is that the project must have commercial potential, as I also have to write a business plan to accompany it.

I want to come up with quite a few ideas, because I'll have to find a member of staff willing to support me on the project, and I'd be stuck if I only had one idea but no staff member with the expertise to guide me on it.

I have a few loose ideas already. One possibility is researching how a GPU could be used to perform non-graphical tasks for games, probably using CUDA. I know there's already work being done by Havok on using GPUs for physics calculations. Or I could do something to do with procedurally generated content. That could even link in with the previous idea, offloading the generation work to the GPU (there's a rough sketch of what I mean at the end of this post).

As you can see these aren't very specific or well-developed ideas, just a starting point. I'm hoping people here can give me suggestions for what could make a good project and could potentially be useful for games, so I can build up some more concrete ideas. Any suggestions?
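To make that combined idea a bit more concrete, here's roughly the kind of minimal CUDA sketch I have in mind: one thread per cell of a procedurally generated heightmap. The noise function and all of the names are made up purely for illustration and I haven't tested any of it, so treat it as a sketch rather than working code.

#include <cstdio>
#include <cuda_runtime.h>

// Toy hash-based "noise" so the example stays self-contained.
__device__ float hashNoise(int x, int y)
{
    unsigned int n = x * 73856093u ^ y * 19349663u;
    n = (n << 13) ^ n;
    n = n * (n * n * 15731u + 789221u) + 1376312589u;
    return (n & 0x7fffffffu) / float(0x7fffffff);
}

// One thread per heightmap cell: embarrassingly parallel, which is
// exactly the kind of work that should map well to a GPU.
__global__ void generateHeightmap(float* heights, int width, int height)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x < width && y < height)
        heights[y * width + x] = hashNoise(x, y);
}

int main()
{
    const int width = 1024, height = 1024;
    float* d_heights = 0;
    cudaMalloc((void**)&d_heights, width * height * sizeof(float));

    dim3 block(16, 16);
    dim3 grid((width + block.x - 1) / block.x, (height + block.y - 1) / block.y);
    generateHeightmap<<<grid, block>>>(d_heights, width, height);
    cudaDeviceSynchronize();

    // Copy one value back just to show the host/device round trip.
    float sample = 0.0f;
    cudaMemcpy(&sample, d_heights, sizeof(float), cudaMemcpyDeviceToHost);
    printf("height[0][0] = %f\n", sample);

    cudaFree(d_heights);
    return 0;
}

The interesting part of the project would then be measuring how much faster (if at all) this kind of generation runs on the GPU versus a straight CPU implementation, and how expensive the copy back to system memory is.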
Offloading work onto a GPU sounds both interesting AND socially relevant, given the recent nVidia vs. Intel fight over which is more important (GPU or CPU). Your findings might even find their way onto Slashdot ;)

If you could find a way to offload traditionally processor-intensive tasks (encoding video, for example) onto a GPU, you would definitely turn some heads.

If you were able to get the ball rolling on that project, I imagine it'd also be interesting to compare the performance of multi-GPU cards, SLI cards, etc.
Deep Blue Wave - Brian's Dev Blog.
Video encoding is a possibility. I guess I'd like something a bit more closely related to gaming, but it's not a bad idea. I would have thought it's something that could benefit from being done on a GPU, because it can be parallelised quite a bit (probably; I don't know a great deal about video codecs).

As for the actual hardware to do it on, I have a CUDA-compatible GPU in my machine, but it would be interesting to compare different setups. I'd need funding for that, though.
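From the little reading I've done, block matching (a sum of absolute differences over candidate positions) seems to be where encoders spend a lot of their time, and that part at least looks embarrassingly parallel. Something along these lines is what I imagine a CUDA version might look like, with one thread per candidate motion vector. It's untested and purely illustrative; all the names and sizes are just placeholders.

#include <cuda_runtime.h>

// Sum of absolute differences between one 16x16 block of the current frame
// and every candidate block in a search window of the reference frame.
// One thread per candidate offset: the data-parallel inner loop that
// motion estimation keeps hammering.
__global__ void sadSearch(const unsigned char* cur, const unsigned char* ref,
                          int frameWidth, int blockX, int blockY,
                          int searchRadius, unsigned int* sadOut)
{
    int dx = blockIdx.x * blockDim.x + threadIdx.x - searchRadius;
    int dy = blockIdx.y * blockDim.y + threadIdx.y - searchRadius;
    if (dx > searchRadius || dy > searchRadius) return;

    unsigned int sad = 0;
    for (int y = 0; y < 16; ++y)
        for (int x = 0; x < 16; ++x)
        {
            int c = cur[(blockY + y) * frameWidth + (blockX + x)];
            int r = ref[(blockY + dy + y) * frameWidth + (blockX + dx + x)];
            sad += (c > r) ? (c - r) : (r - c);
        }

    int idx = (dy + searchRadius) * (2 * searchRadius + 1) + (dx + searchRadius);
    sadOut[idx] = sad;
}

int main()
{
    const int W = 352, H = 288;            // CIF-sized dummy frames
    const int radius = 16;
    const int window = 2 * radius + 1;

    unsigned char *dCur, *dRef;
    unsigned int *dSad;
    cudaMalloc((void**)&dCur, W * H);
    cudaMalloc((void**)&dRef, W * H);
    cudaMalloc((void**)&dSad, window * window * sizeof(unsigned int));
    cudaMemset(dCur, 0, W * H);
    cudaMemset(dRef, 0, W * H);

    dim3 block(16, 16);
    dim3 grid((window + block.x - 1) / block.x, (window + block.y - 1) / block.y);
    // Search around the macroblock at (64, 64), safely away from the frame edges.
    sadSearch<<<grid, block>>>(dCur, dRef, W, 64, 64, radius, dSad);
    cudaDeviceSynchronize();

    cudaFree(dCur);
    cudaFree(dRef);
    cudaFree(dSad);
    return 0;
}

A real encoder would then reduce those SADs to find the best motion vector (which could also be done on the GPU), but even this much shows the sort of structure that should map well onto hundreds of threads.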
GPGPU is becoming a huge area of both academic and commercial activity, so it's definitely a great topic for your project. Check out gpgpu.org, and perhaps even the GPGPU forum over at beyond3d. I'm sure if you ask around you'll find a few people who can help you get started. Also, if you're just doing research on what's out there, you may want to look into some of the commercial packages such as Rapidmind. There's also BrookGPU, from Stanford U.

As far as a specific topic to work on related to games...one idea I'd had but never got around to implementing was running neural networks on a GPU. The training step is generally pretty slow, but if you ran it on the GPU in parallel it might be fast enough that you could keep feeding the network data from the player's actions and then use the net to govern a few AI parameters or decisions.
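Just to sketch what that might look like: a layer's forward pass is basically one independent dot product per neuron, so you can give each neuron its own thread. The kernel below is untested and purely illustrative (the names and sizes are made up), but it shows the shape of it.

#include <cuda_runtime.h>

// One thread per output neuron: each computes its weighted sum over the
// previous layer and applies a sigmoid. Backprop parallelises in much the
// same way, which is what would make on-the-fly training plausible.
__global__ void forwardLayer(const float* input, const float* weights,
                             const float* bias, float* output,
                             int numInputs, int numOutputs)
{
    int neuron = blockIdx.x * blockDim.x + threadIdx.x;
    if (neuron >= numOutputs) return;

    float sum = bias[neuron];
    for (int i = 0; i < numInputs; ++i)
        sum += weights[neuron * numInputs + i] * input[i];

    output[neuron] = 1.0f / (1.0f + expf(-sum));  // sigmoid activation
}

int main()
{
    const int numInputs = 64, numOutputs = 32;

    float *dIn, *dW, *dB, *dOut;
    cudaMalloc((void**)&dIn,  numInputs * sizeof(float));
    cudaMalloc((void**)&dW,   numInputs * numOutputs * sizeof(float));
    cudaMalloc((void**)&dB,   numOutputs * sizeof(float));
    cudaMalloc((void**)&dOut, numOutputs * sizeof(float));
    cudaMemset(dIn, 0, numInputs * sizeof(float));
    cudaMemset(dW,  0, numInputs * numOutputs * sizeof(float));
    cudaMemset(dB,  0, numOutputs * sizeof(float));

    forwardLayer<<<(numOutputs + 127) / 128, 128>>>(dIn, dW, dB, dOut,
                                                    numInputs, numOutputs);
    cudaDeviceSynchronize();

    cudaFree(dIn);
    cudaFree(dW);
    cudaFree(dB);
    cudaFree(dOut);
    return 0;
}

In a game you could batch up samples of the player's actions, run a training pass whenever there's spare GPU time, and read a handful of output parameters back to drive the AI.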
Another good thing about the GPU idea is that a lot of the theory could transfer over to using processors like Cell. I could even look into the possibility of getting access to a PS3 devkit from a local company.
Quote:Original post by Psyk60
Another good thing about the GPU idea is that a lot of the theory could transfer over to using processors like Cell. I could even look into the possibility of getting access to a PS3 devkit from a local company.


You don't even need a devkit to work with Cell, just install Linux on a retail PS3 and you can use the Cell SDK. No access to RSX though, unfortunately.
Quote:Original post by MJP
Quote:Original post by Psyk60
Another good thing about the GPU idea is that a lot of the theory could transfer over to using processors like Cell. I could even look into the possibility of getting access to a PS3 devkit from a local company.


You don't even need a devkit to work with Cell, just install Linux on a retail PS3 and you can use the Cell SDK. No access to RSX though, unfortunately.

Good point. Even so there's a local company with lots of expertise and I'm sure they'd be willing to offer support if there are no legal complications.

Quote:Original post by Psyk60
Good point. Even so there's a local company with lots of expertise and I'm sure they'd be willing to offer support if there are no legal complications.


I'm sure they can help you out, but in some ways they'll probably be bound by the NDA that Sony makes you sign before you can work with a devkit. You'd probably have to sign one as well if you were to work with one.
Quote:Original post by MJP
Quote:Original post by Psyk60
Good point. Even so there's a local company with lots of expertise and I'm sure they'd be willing to offer support if there are no legal complications.


I'm sure they can help you out, but in some ways they'll probably be bound by the NDA that Sony makes you sign before you can work with a devkit. You'd probably have to sign one as well if you were to work with one.


Yeah, there's no doubt I'd have to sign an NDA. If it's too restrictive, though, there might not be much point; there's no sense in doing all this research and then not being able to include it in my project.

They probably have a bit more freedom with the devkit than most companies, though, as they're actually part of Sony and some of the things they make might even ship with it (they make developer tools, although I'm not sure whether those are part of the devkit itself).
GPGPU and Cell stuff is probably a good idea; if your lectures are anything like mine were, then new stuff, such as offloading things to graphics cards, is practically on the level of magic.

Keep in mind that most of the marks are going to be for explaining things; my project (as linked to from my journal) wasn't technically that complicated, but it sounds as though it was, and the explanation of what was done was what was important.

