LuckyLuciano

OpenGL [Cg] Some shaders don't work. Wrong initialisation?


Hi, I'm using Cg shaders in a project that can use either an OpenGL or a Direct3D9 renderer to render its data. All shaders work without a problem in the OpenGL renderer, but some won't work in the Direct3D9 renderer (they all compile, though). The weirdest thing is that the shaders stop working in Direct3D as soon as I try to use colored lighting (at least in the shader below). For example, I wrote this simple diffuse lighting shader, which works flawlessly in OpenGL but only works in Direct3D if I use white lighting:
//Vertex Shader
float4x4 modelViewProj;
float4 lightPos;

struct output{
  float4 position : POSITION;
  float4 texture : TEXCOORD0;
  float4 lightDir : TEXCOORD1; 
  float4 normal : TEXCOORD2;
};

output main(float4 position: POSITION,
			float4 texCoord: TEXCOORD0,
			float4 normal : NORMAL)
{
  output OUT;
  float4 tempPos;
  
  OUT.position = mul(modelViewProj, position);
  OUT.texture = texCoord;
  
  OUT.lightDir = normalize(lightPos - position);
  OUT.normal = normalize(normal);
  return OUT;
}


//Fragment shader
uniform sampler2D textureMap;

struct fs_output{
  float4 color : COLOR;
};

fs_output main(float4 lightDir: TEXCOORD1, float4 normal : TEXCOORD2, float4 texture : TEXCOORD0)
{
  fs_output OUT;
  float4 c1;
  float cosAngle = saturate(dot(normal, lightDir));
  
  c1 = tex2D(textureMap, texture.xy);

  OUT.color = cosAngle * float4(1.0f, 1.0f, 1.0f, 1.0f) * c1;
  return OUT;
}


The above shader works in both situations. When I change the light color in the pixel shader, like this: OUT.color = cosAngle * float4(1.0f, 0.0f, 0.0f, 1.0f) * c1; ...then there is no lighting whatsoever in Direct3D mode. I can still see the texture, though, so the shader does run (in a way :)).

Could someone please look at my Cg setup code? I did a good search about it, and I can't really see any mistakes. This code runs during setup of the renderer (I'm using the Cg 1.5 Toolkit, by the way):
context = cgCreateContext();
cgD3D9SetDevice(d3d9Device);


This code runs every time I load a shader:
//Load Vertex-shader
shader->vertexProfile = cgD3D9GetLatestVertexProfile();
shader->vertexProgram =	cgCreateProgramFromFile(context, CG_SOURCE, vertexPath.c_str(), shader->vertexProfile, vertexEntryName.c_str(), NULL); 

/*I removed error-checking code here, for clarity!*/
cgD3D9LoadProgram(shader->vertexProgram, TRUE, 0);

//Load fragment-shader
shader->fragmentProfile = cgD3D9GetLatestPixelProfile();
shader->fragmentProgram = cgCreateProgramFromFile(context, CG_SOURCE, fragmentPath.c_str(), shader->fragmentProfile, fragmentEntryName.c_str(), NULL);                      

/*I removed error-checking code here, for clarity!*/
cgD3D9LoadProgram(shader->fragmentProgram, TRUE, 0);


I guess I'm not using a correct shader profile in Direct3D, but there isn't a way to set this manually, is there? If someone could look at the above code, I would be much obliged. Thanks a lot in advance!
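For what it's worth, cgCreateProgramFromFile() does accept an explicit CGprofile instead of the one returned by cgD3D9GetLatestVertexProfile()/cgD3D9GetLatestPixelProfile(), which would at least make it possible to rule the profile out. A rough sketch, reusing the context and shader variables from the snippet above (the CG_PROFILE_* values here are just examples; pick targets your device actually supports):

//Sketch: pin the profiles explicitly instead of asking for the latest ones.
//CG_PROFILE_VS_2_0 / CG_PROFILE_PS_2_0 are example targets, not the only choice.
CGprofile vsProfile = CG_PROFILE_VS_2_0;
CGprofile psProfile = CG_PROFILE_PS_2_0;

shader->vertexProgram = cgCreateProgramFromFile(context, CG_SOURCE,
    vertexPath.c_str(), vsProfile, vertexEntryName.c_str(), NULL);
cgD3D9LoadProgram(shader->vertexProgram, TRUE, 0);

shader->fragmentProgram = cgCreateProgramFromFile(context, CG_SOURCE,
    fragmentPath.c_str(), psProfile, fragmentEntryName.c_str(), NULL);
cgD3D9LoadProgram(shader->fragmentProgram, TRUE, 0);

//For debugging, the assembly the compiler produced can also be inspected:
const char* compiled = cgGetProgramString(shader->fragmentProgram, CG_COMPILED_PROGRAM);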

I did some more reading about the subject, and some people create a unique VertexDeclaration for each shader program (depending on the shader's input values). Is this absolutely necessary? I just use one VertexDeclaration in my renderer, which describes how the vertex data of my objects is interleaved, that's all.

Because the input values can't be determined at load time, and because they depend heavily on the shader in question, I find it hard to believe that you have to declare a unique VertexDeclaration for each one, to be honest.

But just to be sure, could this be my problem?
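For what it's worth, the cgD3D9 runtime has helpers that should answer this directly, assuming they behave the way I understand them to: cgD3D9ValidateVertexDeclaration() checks an existing declaration against a program, and cgD3D9GetVertexDeclaration() returns the declaration the runtime would build itself. A rough sketch (rendererDecl stands in for the single D3DVERTEXELEMENT9 array the renderer already uses):

//Sketch: check the renderer's single declaration against a loaded program.
//rendererDecl is assumed to be the existing D3DVERTEXELEMENT9 array,
//terminated with D3DDECL_END().
if (cgD3D9ValidateVertexDeclaration(shader->vertexProgram, rendererDecl) != CG_TRUE)
{
    //The declaration is missing (or mismatches) an input the vertex shader expects.
}

//Or ask the runtime what declaration it would generate for this program:
D3DVERTEXELEMENT9 expected[MAXD3DDECLLENGTH];
if (cgD3D9GetVertexDeclaration(shader->vertexProgram, expected))
{
    //Compare 'expected' element by element with the renderer's own declaration.
}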

OK, this is really getting over my head. I found a workaround (well, sort of; it's ugly), but can anyone please explain to me what's going on here?

As mentioned before, the code above works for both OpenGL and Direct3D if I use a white light in my fragment shader, like this:

float cosAngle = saturate(dot(IN.normal, IN.lightDir));
c1 = tex2D(textureMap, IN.texture.xy);
OUT.color = cosAngle * float4(1.0, 1.0, 1.0, 1.0) * c1;



If I use a "colored" light, the shader stops working for Direct3D:

float cosAngle = saturate(dot(IN.normal, IN.lightDir));
c1 = tex2D(textureMap, IN.texture.xy);
//This should be a teal-colored light, but somehow it makes the pixel shader invalid for Direct3D.
OUT.color = cosAngle * float4(0.0, 1.0, 1.0, 1.0) * c1;



...BUT if I change the color values "manually", the lighting is correct in Direct3D! Like this:

float cosAngle = saturate(dot(IN.normal, IN.lightDir));
c1 = tex2D(textureMap, IN.texture.xy);

float4 color = c1;
//give the light some color (teal)
color.x *= 0.0 * cosAngle;
color.y *= 1.0 * cosAngle;
color.z *= 1.0 * cosAngle;

OUT.color = color;



Does anyone have an explanation for this behaviour, please? The workaround works, but it's ugly and I would like to change it. Could this be a bug in Cg, perhaps?
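One less ugly alternative I might try (just a sketch; I haven't confirmed it fixes anything) is to pass the light color in as a uniform parameter rather than hard-coding the literal, so there is no color constant for the Direct3D path to mishandle. In the fragment shader:

//lightColor replaces the hard-coded float4 literal.
uniform float4 lightColor;
...
OUT.color = cosAngle * lightColor * c1;

And on the application side (the parameter name is my own choice), set it before drawing:

CGparameter lightColorParam = cgGetNamedParameter(shader->fragmentProgram, "lightColor");
const float teal[4] = { 0.0f, 1.0f, 1.0f, 1.0f };
cgD3D9SetUniform(lightColorParam, teal);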

Thanks a lot!
