der

very special buffer


hi, what would make the interpolated vertex position, normal and UV coordinates get written to a buffer when rendering a primitive ? Hope I made myself clear enough. If not, here is an example that doesn't explain much more:

float special_buffer[ window_width * window_height * ( 3 + 3 + 2 ) ];
// 3 + 3 + 2 = 3 floats for the position + 3 for the normal + 2 more for the UV coordinates

glBegin( GL_TRIANGLES );
glNormal3f( n1.x, n1.y, n1.z );
glTexCoord2f( map1.u, map1.v );
glVertex3f( v1.x, v1.y, v1.z );
// and so on with the two other vertices...
glEnd();

// Then here, I'd like the positions, normals and mapping calculated during the
// rasterization ( really, during the "rasterization" ? is that correct ? )
// to end up in the buffer cleverly named "special_buffer".

[Edited by - Yann L on August 3, 2004 8:09:10 PM]
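What the question is after can be sketched on the CPU (a hypothetical illustration, not working GL code): a per-pixel slot of 8 floats in the buffer, plus the attribute blend the rasterizer performs, where the weights w0..w2 are the barycentric coordinates of the pixel inside the triangle. All names here are made up for illustration.

```c
#include <assert.h>

/* Hypothetical layout: one pixel's slot in "special_buffer" holds 8 floats:
 * position (3), normal (3), UV (2). */
#define FLOATS_PER_PIXEL (3 + 3 + 2)

/* Address of pixel (x, y)'s slot in a window_width-wide buffer. */
static float *pixel_slot(float *buf, int win_w, int x, int y)
{
    return buf + (y * win_w + x) * FLOATS_PER_PIXEL;
}

/* What the rasterizer does for each covered pixel: blend the three vertices'
 * attribute vectors with barycentric weights w0 + w1 + w2 = 1. */
static void interpolate_attribs(const float a0[8], const float a1[8],
                                const float a2[8],
                                float w0, float w1, float w2, float out[8])
{
    for (int i = 0; i < 8; ++i)
        out[i] = w0 * a0[i] + w1 * a1[i] + w2 * a2[i];
}
```

On real hardware this blend happens inside the rasterizer, which is exactly why the results are not normally visible to the application.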

what happened ?
GameDev doesn't take return characters at the end of lines anymore ?

At best you could fake it with a render to a pbuffer or framebuffer: instead of outputting colour values, you write the interpolated values to the buffer and then read them back later via glReadPixels, as python_regious says. This would require a fragment shader, however, plus a number of passes to get all the information out, and the readback alone would kill performance, never mind working out how to interpret that information...

why on earth would you want this information ?

Such a buffer is sometimes called a "fat buffer". It's basically a framebuffer with user-defined component mapping, and is mainly used in deferred shading techniques: the colour, position, normal, and other per-fragment parameters are stored in the fat buffer, and then reused as a texture in the deferred pass.

You can only do that on hardware supporting fragment programs. Faking it on lower class hardware is partially possible, but requires multiple passes and subsequent calls to glReadPixels(), which will severely affect your performance.
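The "faking it" route usually means remapping the data into colour range before the fragment program writes it out. A minimal sketch of that encoding, assuming 8-bit colour channels (helper names are hypothetical, not from the thread):

```c
#include <assert.h>
#include <math.h>

/* Encode one normal component n in [-1, 1] into an 8-bit "colour" channel,
 * as a fragment program writing n * 0.5 + 0.5 to the framebuffer would. */
static unsigned char encode_component(float n)
{
    float c = n * 0.5f + 0.5f;               /* [-1,1] -> [0,1] */
    return (unsigned char)(c * 255.0f + 0.5f); /* round to nearest byte */
}

/* Undo the mapping after reading the "colour" back. */
static float decode_component(unsigned char byte)
{
    return (byte / 255.0f) * 2.0f - 1.0f;    /* [0,255] -> [-1,1] */
}
```

The round trip only keeps about 1/255 of precision per component, which is one reason this trick is a poor substitute for real floating-point render targets.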

I can't believe there's no way to grab those small bits of information.
Are all those calculations lost forever after the end of the rendering process ?
What a pity.
Anyway... I was just thinking that doing some lighting calculations with the few window_width x window_height points might be less CPU-consuming than doing them for every single pixel of every triangle drawn.

So if you've heard of some way to do this without multipass and without using glReadPixels, which is really slow, I know, please let me know...

I won't try what you said since it seems to be way too slow. But could you just tell me how to render something other than colour to the buffer ? I could use this technique someday for other stuff.

Thanks

yeah, Yann L, this is my goal.
could you be more explicit please ?
which OpenGL extension does it require ? Where can I find docs about it ?
And WHY THE HELL ARE MY RETURNS EATEN ? Do I have to type \n or <br> ? I knew choosing that eye as an icon would curse my topic. I knew it !

Quote:
Original post by der
I can't believe there's no way to grab those small bits of information.
Are all those calculations lost forever after the end of the rendering process ?

No, they aren't lost. They're available in the 3D card's VRAM, and you can do whatever you want with them provided it's done on the GPU (for example, bind them as a texture). If you want them available to the CPU, then you need to squeeze them through the narrow AGP bus, and that's slow.

Quote:

Anyway... I was just thinking that doing some lighting calculations with the few window_width x window_height points might be less CPU-consuming than doing them for every single pixel of every triangle drawn.

That's the deferred shading I was talking about above. Note that the CPU will never see that data, everything is done on the GPU.

Quote:

So if you've heard of some way to do this without multipass and without using glReadPixels, which is really slow, I know, please let me know...

There is none. The only way to make that data available to the CPU is through glReadPixels(). But that's pretty useless anyway - why would you want to transfer everything back to system RAM, if everything you want to do can be done on the GPU ?

Quote:

But could you just tell me how to render something other than colour to the buffer ? I could use this technique someday for other stuff.

Deferred shading. It sounds really good, but don't get too excited about it, it's unfortunately pretty slow due to the very large bandwidth requirements. Still, it's an interesting technique one should keep in mind for future hardware.
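To make the deferred pass concrete, here is a hypothetical CPU-side sketch of the idea (the real thing stays on the GPU; all names are illustrative): one Lambert lighting term per screen pixel, read back out of an 8-floats-per-pixel fat buffer, so the cost scales with window size rather than with the number of triangles drawn.

```c
#include <assert.h>

/* Fat-buffer layout assumed here: 8 floats per pixel,
 * position in [0..2], normal in [3..5], UV in [6..7]. */

/* Lambert diffuse term: N . L, clamped so back-facing pixels go dark. */
static float lambert(const float normal[3], const float light_dir[3])
{
    float d = normal[0] * light_dir[0]
            + normal[1] * light_dir[1]
            + normal[2] * light_dir[2];
    return d > 0.0f ? d : 0.0f;
}

/* The deferred pass: one lighting evaluation per screen pixel,
 * regardless of how many triangles produced those pixels. */
static void shade_buffer(const float *fat, int pixels,
                         const float light_dir[3], float *out_intensity)
{
    for (int i = 0; i < pixels; ++i)
        out_intensity[i] = lambert(fat + i * 8 + 3, light_dir);
}
```

The bandwidth problem Yann mentions is visible even in this toy: every pixel costs a 32-byte read before any shading happens at all.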

Quote:
Original post by der
And WHY THE HELL ARE MY RETURNS EATEN ?

Good question. Let me try something...

Hmm, I edited and reinserted your post above without making any modifications, and suddenly the line breaks are there. Weird. What browser are you using ?

Yann L: "Still, it's an interesting technique one should keep in mind for future hardware."
I guessed so. But I don't want to give NVidia & friends the opportunity to make more money (*), I'd just like to use it now for my game.

About the returns... don't bother anymore.
The HTML return ( 'smaller' 'B' 'R' 'greater' ) works fine and the classic one (13) seems to work sometimes too.

(*) unless they give me a good reward and a new 3D card featuring this new GL_TRICK.
