the state of GPU coding...

Started by Holland; 5 comments, last by Holland 14 years, 7 months ago
Just to preface this: I've been coding for a few years now, mostly focused on game programming. I've done little shader work, however.

So I was looking into the GPU Gems series. I have the AI equivalent as well as the standard Game Programming Gems, and I love the series of books. I started thinking, though, about the current state of GPU coding... are these GPU Gems books even going to be worth having in the near future?

With things like CUDA coming around and the increasing speed of RAM, it's pretty obvious that we are working our way towards a GPU-less world. And I don't mean GPU-less as in zero GPUs in the system, but it seems like we're trying more and more to use the GPU just like we would a CPU, which means that in the near future (if companies are smart) they'll start coming out with CPU/GPU combos and such.

So I realize that in the meantime, GPU programming using shaders is a must-have for games to perform the way they do. For that, I think the GPU Gems books would be good to have. But at the same time, I don't want to blow money on books that will be completely obsolete in a couple of years.

Long post, slightly a rant, slightly just an opinion on the way I'm viewing things, but the base question I have for you:

Is the GPU Gems series worth getting at this point in time, or would I be better off getting some books on CUDA and playing with that?
I think the underlying algorithms and ideas presented in the books will still be applicable in many ways on future hardware.

Plus, they're like nerd porn... c'mon.

( I own the whole series [grin] )
Quote:Original post by Holland
For that, I think the GPU Gems books would be good to have. But at the same time, I don't want to blow money on books that will be completely obsolete in a couple of years.


APIs go obsolete. Concepts don't.

The GPU is just one way of doing data-parallel work, and that is something that is here to stay. Whether you learn those concepts as they apply to shaders, to one of the upcoming .NET technologies, to functional languages, or to the new CPUs, it's mostly the same foundation.
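For instance (a rough sketch of my own, untested, with made-up names - not something from the books): the same element-wise operation can be written as a serial loop or as a CUDA kernel, and the data-parallel idea is the same in both.

// CPU: one thread walks the whole array.
void scale_add_cpu(float a, const float* x, float* y, int n) {
    for (int i = 0; i < n; ++i)
        y[i] = a * x[i] + y[i];
}

// CUDA: the loop body becomes a kernel; one thread handles one element.
__global__ void scale_add_gpu(float a, const float* x, float* y, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

// Launched with enough threads to cover the array, e.g.:
// scale_add_gpu<<<(n + 255) / 256, 256>>>(a, d_x, d_y, n);

A pixel shader, a .NET parallel loop, or a functional map all express that same shape of computation.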


Even Graphics Gems, while thoroughly dated, still contains useful information, as well as some techniques that are very applicable today.


At the end of the day, most of the fundamentals of CS were discovered in the '60s. These days, the focus just shifts from one part to another.

If you are just interested in a quick buck and a short stint in coding, then do the math: how much more will you earn versus the cost of the books? If you are in for the long haul, the question you should be asking is: what can I learn next?
I really think we're moving more and more towards the idea of separating game logic from graphics logic. Things like particle engines will be done completely on the GPU: you'll simply supply the starting data, and it'll do everything from then on.
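For instance (my own rough sketch, untested, names invented): one kernel steps every particle in parallel each frame, and after the initial upload the CPU never touches the per-particle data.

__global__ void step_particles(float3* pos, float3* vel, int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;           // one thread per particle
    vel[i].y -= 9.8f * dt;        // apply gravity
    pos[i].x += vel[i].x * dt;    // integrate position
    pos[i].y += vel[i].y * dt;
    pos[i].z += vel[i].z * dt;
}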

And hopefully cards will be cheap and powerful enough that two cards will be standard: one for high-end graphics and one for unbelievably realistic physics, once again letting the CPU care only about game logic.

But if the speculation of some game companies holds true, GPUs are soon to pass the speed of CPUs, and with their far higher RAM frequencies and bandwidth, I think they will be great for small processes that have to be repeated more frequently than what is happening on the CPU - i.e. physics. Very little happens in physics; it just has to happen a lot, compared to graphics and game logic.

And if things with GPUs go far enough, eventually CPUs will be merely script parsers and the GPUs will do all the work, with the CPUs deciding which GPU does what work.
[ dev journal ]
[ current projects' videos ]
[ Zolo Project ]
I'm not mean, I just like to get to the point.
Quote:Original post by Holland
Is the GPU Gems series worth getting at this point in time, or would I be better off getting some books on CUDA and playing with that?


Given that there are no books on CUDA ... that might be hard. Same for OpenCL.

Basically, learning about shaders probably won't be a whole lot of help if you want to use your GPU for numerical computing ...
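To give a flavor of the difference (a minimal sketch off the top of my head, not from any book): in CUDA you write C-like kernels and manage device memory explicitly, instead of thinking in terms of vertices, pixels, and render targets.

#include <cuda_runtime.h>

// Square every element of an array in place.
__global__ void square(float* data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= data[i];
}

int main() {
    const int n = 1024;
    float host[n];
    for (int i = 0; i < n; ++i) host[i] = (float)i;

    float* dev;
    cudaMalloc(&dev, n * sizeof(float));                               // explicit device allocation
    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);  // copy input to the GPU
    square<<<(n + 255) / 256, 256>>>(dev, n);                          // launch a grid of threads
    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);  // copy the result back
    cudaFree(dev);
    return 0;
}

No rasterizer, no render state - just kernels and memory, which is exactly why the shader-centric material only gets you partway there.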
You know they're all available for free online, right?

GPU Gems

GPU Gems 2

GPU Gems 3

Game Programming Blog: www.mattnewport.com/blog

Quote:Original post by mattnewport
You know they're all available for free online, right?

Haha... no, I did not. Thank you very much; this is most useful. Although I have to admit, there's something very desirable about flipping through the actual pages of a book. And that new book smell.

Now if only I could get a peripheral that emitted that new book smell every time I open a brand new website!

This topic is closed to new replies.
