Vertex Shader Constants on ATI


Hi,

My problem is the following: I have a laptop with an ATI Radeon 9600 card and I wanted to use Cg with OpenGL. I wrote a small application showing a triangle:

```
glBegin(GL_TRIANGLES);
glColor3f(1, 0, 0); glVertex2f(0,  0);
glColor3f(0, 1, 0); glVertex2f(0,  1);
glColor3f(1, 1, 1); glVertex2f(1, -1);
glEnd();
```

The following simple vertex program produces the correct (gradient-colored) triangle:

```
struct appdata {
    float4 position : POSITION;
    float3 color    : COLOR0;
};

struct vfconn {
    float4 pos   : POSITION;
    float4 color : COLOR0;
};

vfconn main(appdata IN, uniform float4x4 ModelViewProj)
{
    vfconn OUT;
    OUT.pos = mul(ModelViewProj, IN.position);
    OUT.color.xyz = IN.color.xyz;
    OUT.color.a = 1.0;
    return OUT;
}
```

But when I change the line

```
OUT.color.xyz = IN.color.xyz;
```

to

```
OUT.color.xyz = float3(1, 0, 0); // black
```

the triangle becomes completely black instead of red. (The triangle really is black: changing the background to a different color makes the black triangle visible.) The same problem occurs with

```
OUT.color.xyz = 2.0 * IN.color.xyz;   // black
OUT.color.xyz = 1.001 * IN.color.xyz; // black
```

whereas

```
OUT.color.xyz = IN.color.xyz + IN.color.xyz; // OK
```

works. It seems as if all these constant values are set to 0. The only constant that seems to work is

```
OUT.color.xyz = 1.0 * IN.color.xyz; // OK
```

which is probably because the compiler "optimizes the constant away". I checked the assembly output and cannot see where the problem comes from. I also tried loading precompiled Cg programs (CG_OBJECT), but that did not work either, and changing the profile did not help at all. (Not having an NVIDIA card does not leave many profile choices anyway...)
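Since the shader above already accepts ModelViewProj as a uniform without trouble, a constant color could be fed in the same way. This is only a sketch; the parameter name constantColor and the variable names are my own invention, and the host-side calls assume the standard Cg OpenGL runtime:

```
// Vertex program: the literal is replaced by a uniform (hypothetical name).
vfconn main(appdata IN,
            uniform float4x4 ModelViewProj,
            uniform float3   constantColor)
{
    vfconn OUT;
    OUT.pos = mul(ModelViewProj, IN.position);
    OUT.color.xyz = constantColor;  // instead of float3(1, 0, 0)
    OUT.color.a = 1.0;
    return OUT;
}
```

On the host side, after binding the program, the value would be set roughly like this:

```
CGparameter color = cgGetNamedParameter(program, "constantColor");
cgGLSetParameter3f(color, 1.0f, 0.0f, 0.0f);  /* red */
```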
As a last step I tried writing the output values to TEXCOORD0 and added a fragment shader that turns those texture coordinates into a color, but that did not help either. Incidentally, constants work perfectly in the fragment shader: changing the color to red there with float3(1, 0, 0) works just fine.

If I pass every constant into the vertex program as a parameter (as in the example above with the transformation matrix), the shader works, but that is an unacceptable solution! Finally I tried updating my card's driver, and it still refuses to show the colored triangle. All the older demos from the NVIDIA page (like Dawn) and recent games run perfectly, and compiling my source code under Linux on a PC with an NVIDIA card did not reproduce the problem.

Has anyone encountered this before? I would be really happy if someone could help me; I am an absolute beginner with shaders, and this is not very encouraging... If there is no way out: would I have fewer problems if I switched to GLSL? (I downloaded some demos from the 3Dlabs page, but they are amazingly slow! Even for one textured quad the frame rate dropped to 1 fps.)

I would be really happy if someone could help me! Thank you very much in advance!
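Regarding the GLSL question: the vertex program above translates fairly directly using the legacy built-ins of that era. A minimal sketch, untested on the hardware in question:

```
// GLSL vertex shader, roughly equivalent to the Cg program above
void main()
{
    gl_Position   = gl_ModelViewProjectionMatrix * gl_Vertex;
    gl_FrontColor = vec4(gl_Color.rgb, 1.0);
    // To reproduce the failing case, try:
    // gl_FrontColor = vec4(1.0, 0.0, 0.0, 1.0);  // should render red
}
```

If the literal vec4(1.0, 0.0, 0.0, 1.0) also renders black here, that would point at the driver's constant handling rather than at the Cg compiler.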
