ZomCoder

3D Graphics Hardware: How much will it do for us eventually?


Hey everyone... this is my first post here in years, after a long hiatus from game programming. I come from the "old school" of game programming and like to be responsible for every single pixel on the screen, so I enjoy writing software-rasterized graphics programs. The "new school", of course, likes to use all sorts of new technologies, graphics hardware, and so forth. It appears that the graphics hardware now does all sorts of things for us that we no longer have to code ourselves: drawing triangles, textures, lights, on and on. However, it isn't as though our job as programmers is any less interesting; in fact it may be MORE interesting with the ability to use shaders, etc. (which may give satisfaction to old-schoolers who like to be responsible at the per-pixel level).

So I'm wondering: is there any talk of making graphics hardware go beyond the level of primitives and shading? In other words, will graphics hardware eventually implement various hidden surface removal algorithms, since it is becoming apparent there are a handful of exceedingly common ones that are adaptable to many different situations?

Regards,
-Zom

Well, that 'old school' is really old :-)

Graphics hardware is becoming more general every year about which parts of the pipeline the programmer is allowed to customize; think, for example, of geometry shaders in DX10 and the programmable antialiasing features in DX10.1.

I cannot help you much; the only thing that comes to mind is to look at Intel's next graphics hardware, Larrabee, because it will be x86-based and could, in theory, give you the option to write an old-style software rasterizer...

Yeah, the interesting thing about Larrabee is that it doesn't really have a whole lot of graphics-specific hardware. What it does have is massive bandwidth to low-latency memory, and upwards of 32 x86-based cores that can each run up to 4 threads. Rumor is that each core runs at somewhere between 1.5 and 2.4 GHz, and then there's the 512-bit vector instruction set, which is enough to hold an entire 4x4 floating-point matrix in a single register -- matrix-matrix multiply instruction, anyone?
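Just to make the "4x4 matrix in one register" point concrete, here's a rough sketch of a 4x4 matrix-matrix multiply where a whole operand fits in a single 512-bit register. Larrabee's actual vector ISA isn't public, so this uses x86 AVX-512 intrinsic names purely as a stand-in; treat the specific instruction choices as assumptions, not as what the chip will actually offer:

```cpp
#include <immintrin.h>

// C = A * B for 4x4 row-major float matrices, with A held entirely in one
// 512-bit register. AVX-512 intrinsics are used here only as a stand-in for
// Larrabee's (unknown) vector instruction set.
void mat4_mul_512(const float A[16], const float B[16], float C[16])
{
    const __m512 a = _mm512_loadu_ps(A);      // all 16 floats of A in one register
    __m512 c = _mm512_setzero_ps();

    for (int k = 0; k < 4; ++k) {
        // Lane (i*4 + j) of a_col gets A[i][k]: broadcast column k per row block.
        const __m512i sel = _mm512_setr_epi32(
            0*4 + k, 0*4 + k, 0*4 + k, 0*4 + k,
            1*4 + k, 1*4 + k, 1*4 + k, 1*4 + k,
            2*4 + k, 2*4 + k, 2*4 + k, 2*4 + k,
            3*4 + k, 3*4 + k, 3*4 + k, 3*4 + k);
        const __m512 a_col = _mm512_permutexvar_ps(sel, a);

        // Lane (i*4 + j) of b_row gets B[k][j]: repeat row k of B in every 128-bit block.
        const __m512 b_row = _mm512_broadcast_f32x4(_mm_loadu_ps(B + k * 4));

        c = _mm512_fmadd_ps(a_col, b_row, c);  // accumulate A[i][k] * B[k][j]
    }
    _mm512_storeu_ps(C, c);
}
```

The point isn't the exact shuffles; it's that the whole accumulation stays in registers, which is the kind of thing a wide vector ISA makes cheap.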

Tom Forsyth, a well-known graphics programmer, was hired by Intel and is working on Direct3D for Larrabee. The Direct3D driver for Larrabee is basically a super-advanced software renderer, probably utilizing dynamic code generation.

Quote:
Original post by Ravyne
Yeah, the interesting thing about Larrabee is that it doesn't really have a whole lot of graphics-specific hardware. What it does have is massive bandwidth to low-latency memory, and upwards of 32 x86-based cores that can each run up to 4 threads. Rumor is that each core runs at somewhere between 1.5 and 2.4 GHz, and then there's the 512-bit vector instruction set, which is enough to hold an entire 4x4 floating-point matrix in a single register -- matrix-matrix multiply instruction, anyone?

Tom Forsyth, a well-known graphics programmer, was hired by Intel and is working on Direct3D for Larrabee. The Direct3D driver for Larrabee is basically a super-advanced software renderer, probably utilizing dynamic code generation.


Interesting, I didn't know they had added specific graphics instructions, though I was sure they would have to. I didn't know they were already developing the DirectX drivers either...

Quote:
Original post by ZomCoder
In other words, will graphics hardware eventually implement various hidden surface removal algorithms, since it is becoming apparent there are a handful of exceedingly common ones that are adaptable to many different situations?

Graphics cards have supported z-buffers for ages now, which is a hidden surface removal algorithm. And occlusion queries have also been supported for years as well.
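For anyone who hasn't written one by hand: the z-buffer algorithm is just a per-fragment depth compare that the hardware now does for free. A minimal sketch of the software version, using a made-up Framebuffer struct and writePixel name purely for illustration:

```cpp
#include <cstdint>
#include <limits>
#include <vector>

// Minimal sketch of what the hardware's depth test does per fragment.
// The Framebuffer struct and writePixel() are hypothetical names, not any real API.
struct Framebuffer {
    int width = 0, height = 0;
    std::vector<uint32_t> color;  // packed RGBA, one value per pixel
    std::vector<float>    depth;  // one depth value per pixel

    Framebuffer(int w, int h)
        : width(w), height(h),
          color(w * h, 0),
          depth(w * h, std::numeric_limits<float>::max()) {}

    // Write the fragment only if it is closer than what is already stored.
    void writePixel(int x, int y, float z, uint32_t rgba) {
        const size_t i = size_t(y) * width + x;
        if (z < depth[i]) {       // the "hidden surface removal" part
            depth[i] = z;
            color[i] = rgba;
        }
    }
};
```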

More specifically, I meant to ask whether graphics hardware will eventually store large data structures such as BSP trees, octrees, etc. and basically do that for us. I don't want it to, is what I'm trying to say; I like learning about and implementing algorithms like that =) I'm well aware z-buffers, at least, are present.

Quote:
Original post by ZomCoder
More specifically, I meant to ask whether graphics hardware will eventually store large data structures such as BSP trees, octrees, etc. and basically do that for us.


If anything, it will go the other way: the graphics hardware and driver will do less and less for you, and more and more will rely on your code (or on libraries you can use). Of course BSP trees and octrees will become more practical to store and traverse on the graphics hardware, but that will be your responsibility, not the driver's or the graphics card's.
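And to the OP's point, that's exactly the kind of algorithm you'd keep owning yourself. A minimal sketch of a front-to-back BSP traversal; the node layout and the visit callback are just assumptions for illustration, not any particular engine's format:

```cpp
#include <functional>
#include <memory>
#include <vector>

// Hypothetical node layout, purely for illustration.
struct Vec3  { float x, y, z; };
struct Plane { Vec3 n; float d; };               // plane: n.p + d = 0
struct BspNode {
    Plane plane;
    std::vector<int> polygons;                   // polygons lying on the splitting plane
    std::unique_ptr<BspNode> front, back;
};

static float signedDistance(const Plane& pl, const Vec3& p) {
    return pl.n.x * p.x + pl.n.y * p.y + pl.n.z * p.z + pl.d;
}

// Visit polygons in front-to-back order relative to the eye position,
// which is the order you'd want for early-out occlusion schemes.
void traverseFrontToBack(const BspNode* node, const Vec3& eye,
                         const std::function<void(const std::vector<int>&)>& visit) {
    if (!node) return;
    if (signedDistance(node->plane, eye) >= 0.0f) {
        traverseFrontToBack(node->front.get(), eye, visit);   // near side first
        visit(node->polygons);
        traverseFrontToBack(node->back.get(), eye, visit);    // far side last
    } else {
        traverseFrontToBack(node->back.get(), eye, visit);
        visit(node->polygons);
        traverseFrontToBack(node->front.get(), eye, visit);
    }
}
```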

LeGreg

