3dmaniac

mrt slows down app 5 times??


Hello, some time ago I wrote a deferred shading demo which uses MRT. Back then I had a Radeon 9800 and it ran at ~50 fps. Now I've got an X1950 Pro and... I still get ~50 fps. Why is that??? When I disable rendering to multiple buffers I get ~250 fps... strange. Help is welcome :)

Code snippet:

```glsl
// Varyings assumed to come from the vertex shader.
varying vec3 v_viewVec;   // view vector (tangent space)
varying vec3 fragPos;     // fragment position to store in the G-buffer
varying mat3 TBN_Matrix;  // tangent-space basis

// ComputeNewCoords and GetNormalVec are helpers defined elsewhere in the
// shader (texture-coordinate offset and normal-map fetch, respectively).
void main()
{
    vec3 viewVec = normalize(v_viewVec);
    vec2 newST   = ComputeNewCoords(viewVec);
    vec3 normal  = TBN_Matrix * GetNormalVec(newST);
    gl_FragData[0] = vec4(normal, 0.0);   // render target 0: normals
    gl_FragData[1] = vec4(fragPos, 0.0);  // render target 1: positions
}
```
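For context on what "rendering to multiple buffers" involves on the host side, here is a minimal sketch of an EXT_framebuffer_object MRT setup matching the two gl_FragData outputs above. It's an illustration, not the poster's actual code: `normalTex` and `posTex` are hypothetical, already-created textures, and error checking is omitted.

```c
#include <GL/glew.h>

/* Attach two color textures to an FBO and enable both as draw buffers,
   so gl_FragData[0] and gl_FragData[1] land in normalTex and posTex. */
GLuint setup_gbuffer(GLuint normalTex, GLuint posTex)
{
    GLuint fbo;
    glGenFramebuffersEXT(1, &fbo);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);

    glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                              GL_TEXTURE_2D, normalTex, 0);
    glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT1_EXT,
                              GL_TEXTURE_2D, posTex, 0);

    /* Route fragment outputs 0 and 1 to the two attachments. */
    const GLenum bufs[2] = { GL_COLOR_ATTACHMENT0_EXT, GL_COLOR_ATTACHMENT1_EXT };
    glDrawBuffers(2, bufs);

    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);
    return fbo;
}
```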

I'm going to take a guess and say you are using Windows XP, correct?

The reason for this slowdown is a (bloody stupid) bug in ATI's OpenGL implementation: when you change the number of render targets, the driver recompiles your shaders. As you've noticed, this does evil things to your frame rate.
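If that recompile-on-count-change is indeed the trigger, one possible mitigation (a sketch of the idea only, not something verified against this driver) is to never change the render-target count at all: bind the same number of draw buffers in every pass, and have shaders that conceptually need one output still write a dummy value to the second target.

```c
#include <GL/glew.h>

/* Hypothetical workaround: keep the draw-buffer count constant for every
   pass so the driver never sees the render-target count change. */
void bind_fixed_mrt(void)
{
    const GLenum bufs[2] = { GL_COLOR_ATTACHMENT0_EXT, GL_COLOR_ATTACHMENT1_EXT };
    glDrawBuffers(2, bufs);  /* same call for the G-buffer pass and for
                                passes that only really need one output */
}

/* In the shaders of single-output passes, still write both targets:
       gl_FragData[0] = result;
       gl_FragData[1] = vec4(0.0);  // dummy write, keeps the MRT count fixed
*/
```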

The good news is that the Vista driver is fixed (I've got Vista and an ATI card); the bad news is I don't know when, or if, the Vista driver (which is the fabled OpenGL rewrite) will get back-ported to XP.

Throw Longs Peak into the mix and god knows where the driver fixes are going to go in the future.

Thanks for the answer, although it's sad news :( Yes, I'm running under WinXP. I thought I could make use of MRT in some bigger project; now I see it won't be so easy to implement some nice lighting :(
