snobaste

OpenGL Fog on integrated graphics

Hello, first time poster here :)

I have been working on an OpenGL game and have recently sent copies to friends to test out.

On my machine and every other one I've tested (all with dedicated graphics cards, ranging from an old Radeon X1600 Mobility to an NVIDIA GeForce GTX 580), the OpenGL fog displays correctly:
[media]http://img8.imageshack.us/img8/4047/sorrowsong2011071221410.png[/media]

However, when others run the game (specifically on laptops with Intel GMA onboard graphics), the fog does not render correctly:
[media]http://img37.imageshack.us/img37/1614/sorrowsong2011071221410.jpg[/media]

Here's how I'm initializing my OpenGL context:
[source]
glewInit();

glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
glDisable(GL_DEPTH_TEST);
glViewport(0, 0, screenWidth, screenHeight);

glDisable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE);
glEnable(GL_TEXTURE_2D);

glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0, screenWidth, screenHeight, 0, 0.1f, 1000.0f);

if (accumBufferSupport) {
    glClearAccum(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_ACCUM_BUFFER_BIT);
}

glMatrixMode(GL_MODELVIEW);
glLoadIdentity();

// Fog setup
glEnable(GL_FOG);
glFogi(GL_FOG_MODE, GL_LINEAR);
glFogf(GL_FOG_DENSITY, 0.55f);  // only used by GL_EXP/GL_EXP2 modes, but set anyway
glFogf(GL_FOG_START, 1000.0f);
glFogf(GL_FOG_END, 1500.0f);
GLfloat fogColor[4] = {1.0f, 1.0f, 1.0f, 1.0f};
glFogfv(GL_FOG_COLOR, fogColor);
glHint(GL_FOG_HINT, GL_NICEST);

glEnable(GL_CULL_FACE);
[/source]

Then, when each level of the game is loaded, the fog color is changed with glFogfv(GL_FOG_COLOR, fogColor) and nothing else.

Also, because the game is 2D with 3D backgrounds, I switch projection modes each frame:
[source]
// Use perspective
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluPerspective(45, 4.0f/3.0f, 0.1f, 100000.0f);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();

glPushAttrib(GL_VIEWPORT_BIT);
glViewport(...); // Viewport math left out for length... this just sets the viewport to the game rectangle on screen

// Code to render background here

// Go back to orthographic
glPopAttrib();
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0, screenWidth, screenHeight, 0, 0.1f, 1000.0f);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
[/source]

I'm guessing the problem might have something to do with Z-ordering, but I don't understand why it only happens on integrated chips. It also doesn't appear to be tied to the OpenGL version: the fog works on the Radeon X1600 (OpenGL 2.0) but fails on the Intel GMA X4500 HD (OpenGL 2.1).

I haven't been able to find anything online about this issue and was wondering if anyone knows what the problem might be.

Thanks :)

Is the background plane rendered as a single quad with fog, or do you have many passes or more complex geometry?

It's possible that it's just a driver problem, but you could also log the pixel format you actually receive and verify you aren't being handed a 16-bit z-buffer, if you believe that could cause the artifact. You should also be able to force a 16-bit z-buffer when selecting the pixel format, to try to reproduce the problem on your own system.
