khanhhh89

OpenGL Black screen when sampling OpenGL texture on AMD graphics cards


My program uses QtOpenGL to draw a sphere and color it by sampling a texture in a single draw call. It works on Nvidia cards, but when I switch to AMD cards (my laptop and other laptops), it shows a black screen. (Note: it only fails with the AMD Catalyst Software Suite driver, but works with the AMD Radeon Software Crimson Edition Beta driver at this link.)
Here is the normal picture on Nvidia cards, and the black buggy picture on AMD cards.

 

Nvidia: Mcs77.png

AMD: utMHM.png
It seems to be a texture-sampling bug (not a framebuffer bug), because OpenGL draws normally when I use a simple shading method (color = dot(vertexPixelNormal, lightDirection)), as in the following picture.

nvJcN.png
 
I used CodeXL from AMD for debugging, and when I click on the texture ID in the CodeXL property view, it shows exactly my image (does that mean my image was uploaded to the GPU successfully?). Here is the OpenGL call log.
Note: You can't see the function glTextureStorage2DEXT before glTextureSubImage2D in the log because CodeXL doesn't log glTextureStorage2DEXT, which is used by QtOpenGL. I stepped through the code and confirmed that this function is called.

kdzPK.jpg
Here are the texture properties from the CodeXL property view

6evHf.jpg 
Here is the fragment shader

#version 150 core

uniform sampler2D matcapTexture;

vec3 matcapColor(vec3 eye, vec3 normal) 
{
  vec3 reflected = reflect(eye, normal);

  float m = 2.0 * sqrt(
    pow(reflected.x, 2.0) +
    pow(reflected.y, 2.0) +
    pow(reflected.z + 1.0, 2.0)
  );
  vec2 uv = reflected.xy / m + 0.5;
  uv.y = 1.0 - uv.y;
  return texture(matcapTexture, uv).xyz;
}

in vec4 fragInput;  //vec4(eyePosition, depth)

void main()
{
  vec3 n = normalize(cross(dFdx(fragInput.xyz), dFdy(fragInput.xyz))); // per-pixel normal via dFdx/dFdy
  const vec3 LightDirection = vec3(0.0f, 0.0f, -1.0f);
  vec3 fragColor = matcapColor(LightDirection, n);
  gl_FragData[0] = vec4(fragColor.x, fragColor.y, fragColor.z, 1.0f);
}

I've spent several days on this bug but can't find any clues. I hope you can help me figure out what's incorrect here. Did I do something AMD didn't expect?


@lawnjelly: thanks for your answer. The flag GL_GENERATE_MIPMAP is false when I use CodeXL to debug, and I just use the basic GLSL function texture() for retrieving texels.


Have you run CodeXL / gDEBugger or similar on the laptop where the sphere is coming out black? That might help pin down the problem. I am assuming here that your CodeXL screenshot is from your development machine, where it is working...

 

If it isn't the mipmaps, then as you suggest it may be an issue with the texture not getting created at all. I would maybe try calling glTexImage2D on first create instead of glTextureStorage2DEXT, just in case the latter isn't working... but really, getting CodeXL on the laptop is better to do first, otherwise we are just guessing...

 

Also, the warning about the requested texture pixel format is notable. Maybe this is the problem: your development machine could be silently substituting another format, and the test machine is choking on it.

 

Maybe someone with more OpenGL knowledge can answer; I find these things difficult to debug too!  :lol:

Edited by lawnjelly


khanhhh89, what are your settings for GL_TEXTURE_MIN_FILTER?

Do you call glTexParameteri() for this parameter?

Because by default the minification filter is set to GL_NEAREST_MIPMAP_LINEAR, which means mipmapping is enabled, so the texture must have a complete mip pyramid, or GL_TEXTURE_MAX_LEVEL must be set according to the number of mip levels actually available.
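In plain GL terms, either of the following (with the texture bound; a sketch, not taken from the original program) avoids the incomplete-mipmap trap:

```c
/* With the texture bound to GL_TEXTURE_2D.
   Option 1: disable mipmapped minification entirely. */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

/* Option 2: keep the default filter but clamp the mip range to the
   levels actually uploaded (here: just level 0). */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_BASE_LEVEL, 0);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAX_LEVEL, 0);
```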


lawnjelly:

- I ran CodeXL on the laptop which causes the problem, and I see the texture shown in the CodeXL explorer window. It's exactly the image I expect.

 

- For glTextureStorage2DEXT, I don't know how to switch to glTexImage2D, because I use QOpenGLTexture for binding and loading the texture. But do you think it's related to the problem, given that glTexImage2D just uploads the image to the GPU, and I can already see the image on the GPU?

 

- For the warning from CodeXL about the requested format, I did try to investigate it by stepping into QOpenGLTexture. I saw that glTextureStorage2DEXT uses the internal format GL_RGBA8, and glTextureSubImage2D uses the format GL_RGBA with the type GL_UNSIGNED_BYTE, so I don't see any conflict between these two functions. I don't know why CodeXL shows this warning. Could you give me some other clues?


vstrakh: thanks for your suggestion. I checked GL_TEXTURE_MIN_FILTER again, and it is set to GL_LINEAR, like GL_TEXTURE_MAG_FILTER. You can see that in the OpenGL call log and the texture properties from CodeXL I attached above. Do I need to set GL_TEXTURE_MAX_LEVEL? Could you help with some more clues?

Edited by khanhhh89

"You can see that in the OpenGL call log and the texture properties from CodeXL I attached above"

 

I see you're calling glTextureParameteri() while passing GL_TEXTURE_2D as the first argument. That's totally wrong.

First of all, glTextureParameteri() is an extension for anything below GL 4.5, so that call might even crash on some systems.

Second, if you really need glTextureParameteri(), then pass the texture id as the first argument, not the texture target GL_TEXTURE_2D.

If you don't care about the glTextureParameteri() details, better to call glTexParameteri() while the texture is bound to GL_TEXTURE_2D. That function has been in core since the very beginning.
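For illustration, the two correct forms side by side (a sketch; `tex` is a placeholder texture id and a current GL context is assumed):

```c
/* DSA form (core only in GL 4.5; otherwise an extension):
   the first argument is the texture *name*, not a target. */
glTextureParameteri(tex, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

/* Portable form, core since GL 1.0: bind, then set the parameter
   through the target. */
glBindTexture(GL_TEXTURE_2D, tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
```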

Edited by vstrakh


If it doesn't turn out to be glTextureParameteri as vstrakh suggests ...

If the texture is fine on the laptop and it is not using black mipmaps...

 

Then personally I'd next double-check whether the texture is returning black and there isn't a problem in your shaders (using texture() and gl_FragData[0] to match your #version 150 core shader):

gl_FragData[0] = vec4(texture(matcapTexture, uv).rgb, 1.0);

Or possibly pin the UVs to a known-good value, since you are calculating them too:

gl_FragData[0] = vec4(texture(matcapTexture, vec2(0.5, 0.5)).rgb, 1.0);

If that is returning black and the texture is white, there must be something else going on (some other test maybe?)

 

I am also assuming that you are checking OpenGL for errors regularly with glGetError() (Qt probably does this for you). Aside from this I am running out of ideas lol...  :D


Try changing:

float m = 2.0 * sqrt(
  pow(reflected.x, 2.0) +
  pow(reflected.y, 2.0) +
  pow(reflected.z + 1.0, 2.0)
);

to:

float m = 2.0 * sqrt(reflected.x * reflected.x + reflected.y * reflected.y + ((reflected.z + 1.0) * (reflected.z + 1.0)));

The pow function is undefined when the first parameter is negative. Compilers don't always convert pow(x, 2.0) to (x * x).
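An equivalent rewrite that avoids pow() entirely is to fold the expression into a single length() call, since m is just twice the length of the vector (reflected.x, reflected.y, reflected.z + 1) (a sketch against the shader above; mathematically identical):

```glsl
// Same m as above: 2 * length of (reflected.xy, reflected.z + 1)
float m = 2.0 * length(vec3(reflected.xy, reflected.z + 1.0));
```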

 
