dgusain

OpenGL having issues with Intel HD 3000


Hi All,

We have an app which shows geometry in 3D using OpenGL 3.0. It seems to work fine on all machines (apparently all had NVIDIA cards and drivers) until I got a Dell machine with an Intel HD 3000 card (driver 8.15.10.2418). This machine doesn't seem to render the objects properly. Please have a look at the two images I have attached to see what I mean.

And I am not using anything fancy here. It's just strips drawn with GL_QUAD_STRIP.

Right now I have 8.15.10.2418 (which I got from the manufacturer's (Dell) website on someone's suggestion), but prior to that I had 8.15.10.2656 (the latest on Intel's website) and the results were the same.
Any pointers would be highly appreciated, as we want to support as many kinds of machines/laptops as possible.
(I haven't seen this issue on any other driver yet.)
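For reference, here is a minimal sketch of the kind of strip drawing described above; the original post does not include code, so the vertex layout and function name are assumptions, shown in old-style immediate mode for brevity.

#include <GL/gl.h>

// Hypothetical sketch of the strips described above: two parallel rows of
// 3D positions (bottom[i] and top[i]) drawn as one GL_QUAD_STRIP.
// Names and data layout are assumptions, not the original app's code.
void drawStrip(const float* bottom, const float* top, int columns)
{
    glBegin(GL_QUAD_STRIP);
    for (int i = 0; i < columns; ++i) {
        glVertex3fv(&bottom[i * 3]); // lower edge of the strip
        glVertex3fv(&top[i * 3]);    // upper edge of the strip
    }
    glEnd();
}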

Intel's drivers are famous for not supporting OpenGL properly. You could try using DirectX if you want to support Intel cards; it is supposed to work somewhat better on them.

So far as Intel is concerned, GL_QUAD_STRIP actually is quite fancy: it's one of those old GL 1.x primitive layouts that has to be converted to triangles in the driver, it doesn't exist in D3D (so it breaks the general rule of "with Intel, if it's not in D3D, don't even try doing it in GL"), and it has also been deprecated in more recent GL versions. So convert it to an indexed triangle list and it should work just fine.
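A sketch of that conversion, assuming the strip vertices are already stored in the usual GL_QUAD_STRIP order (the helper name and draw-call setup are illustrative, not from the original app):

#include <vector>
#include <GL/gl.h>

// Builds a triangle-list index buffer covering a quad strip of vertexCount
// vertices laid out in GL_QUAD_STRIP order: vertices i, i+1, i+3, i+2 form
// one quad for each even i. Each quad becomes two triangles with the same
// winding as the original quad.
std::vector<unsigned int> quadStripToTriangles(unsigned int vertexCount)
{
    std::vector<unsigned int> indices;
    for (unsigned int i = 0; i + 3 < vertexCount; i += 2) {
        indices.push_back(i);     indices.push_back(i + 1); indices.push_back(i + 2);
        indices.push_back(i + 2); indices.push_back(i + 1); indices.push_back(i + 3);
    }
    return indices;
}

// Usage with an already-bound vertex array (setup omitted):
//   std::vector<unsigned int> idx = quadStripToTriangles(vertexCount);
//   glDrawElements(GL_TRIANGLES, (GLsizei)idx.size(), GL_UNSIGNED_INT, idx.data());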

Thanks for the replies, guys.
As I tried your suggestions I figured out that the problem lies somewhere else.
If you look at the picture above, you will notice that the image has two faces. If I draw just the front face, it draws perfectly fine (with GL_QUAD_STRIP as well as with triangles), so using GL_QUAD_STRIP doesn't seem to be the problem. The weird effect appears when I draw the back face: the back face somehow interferes with the front face and produces this artifact. And this happens only with this Intel HD 3000 card.

Can you think of any reason how this can happen with this card/driver?
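A quick diagnostic sketch for this kind of "back face shows through the front face" symptom: check that a depth buffer actually exists and depth testing is enabled, and see whether culling back faces makes the artifact disappear. These are standard GL calls, not taken from the original app:

#include <GL/gl.h>

// Diagnostic state setup: confirm a depth buffer is present, enable depth
// testing, and temporarily cull back faces to see if the artifact goes away.
void setupDepthAndCullingTest()
{
    GLint depthBits = 0;
    glGetIntegerv(GL_DEPTH_BITS, &depthBits); // 0 means no depth buffer was allocated

    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_LEQUAL);

    glEnable(GL_CULL_FACE);  // test only: hide back faces entirely
    glCullFace(GL_BACK);
    glFrontFace(GL_CCW);     // adjust if the geometry is wound clockwise
}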



That is possibly z-fighting. It may be the result of poor-quality drivers (i.e. Intel). A year ago I had to work with Intel and OpenGL, and I had a really bad experience with them.
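If it is z-fighting between the nearly coincident front and back faces, two common mitigations are polygon offset on one of the surfaces and keeping the near clip plane as far out as possible so depth precision isn't wasted. A hedged sketch (the numeric values are illustrative assumptions, not tuned for the app in question):

#include <GL/gl.h>
#include <GL/glu.h>

// Nudge one of the coincident surfaces slightly deeper in the depth buffer.
void drawBackFaceWithOffset()
{
    glEnable(GL_POLYGON_OFFSET_FILL);
    glPolygonOffset(1.0f, 1.0f);   // small depth bias for this surface
    // ... draw the back face here ...
    glDisable(GL_POLYGON_OFFSET_FILL);
}

// Depth precision is concentrated near the near plane, so a near plane of
// 0.01 with a far plane of 10000 wastes most of it; pulling near out helps.
void setupProjection(double aspect)
{
    gluPerspective(45.0, aspect, 1.0, 1000.0);
}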


"Can you think of any reason how this can happen with this card/driver?"


Historically, any time you try to do something "not what people are usually doing," or in this case not what an IHV (Intel) is testing for, you get yourself into trouble. This is because infrequently used functionality is not tested very well, so bugs aren't even noticed, let alone fixed. So the advice another poster gave earlier is not wrong: try implementing a test program that uses triangle strip primitives instead of quads, and see if the problem goes away. It could be that Intel supports some quad functionality, but when you get into double-sided quads, someone slacked off on the driver implementation and it is not done well. I would point out that quads are not inherently flat, so it's easy to see how a driver could do a quick-and-dirty conversion to triangles, pick baloney normals, and thus z-fight. YMMV depending on how flat your quads really are, how "pixel perfect" the driver tries to be, etc.
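The cheapest version of that test: the vertex ordering of a quad strip is also a valid triangle strip, so you can often just swap the primitive enum in the draw call and see whether the artifact changes. A minimal sketch (the pointer/array setup is an assumption for illustration):

#include <GL/gl.h>

// Draw the same strip data as triangles instead of quads.
void drawStripAsTriangles(const float* positions, int vertexCount)
{
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, positions);

    // Previously: glDrawArrays(GL_QUAD_STRIP, 0, vertexCount);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, vertexCount);

    glDisableClientState(GL_VERTEX_ARRAY);
}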
