Graphics glitch on Intel integrated graphics

Hi, I was testing my current project on some different machines to help ferret out problems, and I found one. Here's a picture of a satellite being rendered correctly on the debug screen in my project: [screenshot: rendered correctly]. All is well! However, as I zoom out more and more in game, I start to see artifacts on it! [screenshot: artifact!] I have two other machines with a GeForce card and an ATI card, and they do not have the problem. I feel like it's a problem with the depth buffer, but I don't know why I wouldn't see it on the other machines as well. Or could it be a problem with the integrated Intel graphics?

Looks like z-fighting indeed. What depth buffer precision does the app use with that Intel chip?

Quote:
What depth buffer precision does the app use with that Intel chip?


I'm not sure how to check this. I'm not changing any of the OpenGL setup / configuration based on the video card; is this entirely card dependent?

To see if it is z-fighting (I'm not 100% certain),
try gluPerspective( X, X, near * 10.0, far );
i.e. make the near clip plane 10x larger. Most of the depth buffer's precision sits close to the near plane, so pushing it out frees precision for distant geometry.
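
For reference, here's a minimal sketch of that diagnostic (assuming a fixed-function GLU setup; fovY, aspect, nearPlane, and farPlane are placeholder names for whatever your projection code already uses):

    // Hypothetical projection setup for testing z-fighting.
    // Depth precision is concentrated near the near plane, so pushing
    // it out 10x leaves more precision for distant geometry.
    #include <GL/gl.h>
    #include <GL/glu.h>

    void setProjection(double fovY, double aspect,
                       double nearPlane, double farPlane)
    {
        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        // was: gluPerspective(fovY, aspect, nearPlane, farPlane);
        gluPerspective(fovY, aspect, nearPlane * 10.0, farPlane);
        glMatrixMode(GL_MODELVIEW);
    }

If the artifacts disappear with the larger near plane, depth precision is the culprit.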

Quote:
Original post by scottrick49
Quote:
What depth buffer precision does the app use with that Intel chip?
I'm not sure how to check this. I'm not changing any of the OpenGL setup / configuration based on the video card; is this entirely card dependent?
You request a specific number of depth bits when you create the OpenGL context - thus the method varies depending on your windowing toolkit.

If you don't request a specific number of bits, the driver will pick a default, and the Intel driver may be defaulting lower than the ATI one. The Intel card should support a higher bit depth, though, so try requesting one. For instance, my Intel X3100 (on a Mac) supports both 16- and 24-bit depth, and appears to default to 16.
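
For instance, here's a minimal sketch assuming GLUT (the gluPerspective call above suggests the fixed-function stack; SDL, GLFW, and the native WGL/GLX/AGL paths have their own equivalents):

    // Minimal sketch with (free)GLUT: ask for a depth buffer, then
    // query how many bits the driver actually provided.
    #include <GL/glut.h>
    #include <stdio.h>

    int main(int argc, char** argv)
    {
        glutInit(&argc, argv);
        // GLUT_DEPTH requests *a* depth buffer; with freeglut you can
        // be specific: glutInitDisplayString("rgba double depth>=24");
        glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE | GLUT_DEPTH);
        glutCreateWindow("depth test");

        GLint depthBits = 0;
        glGetIntegerv(GL_DEPTH_BITS, &depthBits); // e.g. 16 or 24
        printf("Depth buffer: %d bits\n", depthBits);
        return 0;
    }

If the Intel driver reports 16 bits where the GeForce and ATI report 24, that would explain why only that machine shows the artifacts.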
