Sock5

Using pixel format '2' crashes SetPixelFormat, but other values don't?


Calling SetPixelFormat and passing 2 as the format index crashes, and GetLastError() returns code 2000 (ERROR_INVALID_PIXEL_FORMAT). I used the example code from Rastertek: http://www.rastertek.com/gl40tut03.html , but when I run it, it crashes. I have no idea how to proceed; I can't even find what '2' means. It's probably "RGBA, 8 bits per channel", as described in the attribute list built before the format is chosen, but why would it be invalid?
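For context, here is a minimal sketch (not the tutorial's exact code; variable names loosely follow the Rastertek sample, and loading wglChoosePixelFormatARB via a temporary context is assumed to have happened already) of the choose-then-set sequence with the error checks that catch the usual causes of ERROR_INVALID_PIXEL_FORMAT:

```cpp
// Sketch: choose a pixel format via WGL_ARB_pixel_format, then set it, with
// error checks. GetLastError() == 2000 is ERROR_INVALID_PIXEL_FORMAT, which
// usually means the index passed to SetPixelFormat is not one the device
// context supports -- e.g. wglChoosePixelFormatARB matched nothing and
// pixelFormat[0] was left holding garbage.
#include <windows.h>

// Assumed to be loaded with wglGetProcAddress() beforehand.
typedef BOOL (WINAPI *PFNWGLCHOOSEPIXELFORMATARBPROC)(
    HDC, const int *, const FLOAT *, UINT, int *, UINT *);
extern PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB;

bool SetChosenPixelFormat(HDC deviceContext, const int *attributeListInt)
{
    int pixelFormat[1] = {0};
    UINT formatCount = 0;

    // Ask the driver for a format matching the attribute list.
    if (!wglChoosePixelFormatARB(deviceContext, attributeListInt, NULL,
                                 1, pixelFormat, &formatCount)
        || formatCount == 0)
    {
        // No matching format: pixelFormat[0] is meaningless here,
        // so it must not be passed on to SetPixelFormat.
        return false;
    }

    PIXELFORMATDESCRIPTOR pfd = {0};
    pfd.nSize = sizeof(pfd);
    pfd.nVersion = 1;

    // Note: a window's pixel format can only be set once; calling
    // SetPixelFormat again on the same window fails.
    if (!SetPixelFormat(deviceContext, pixelFormat[0], &pfd))
    {
        // GetLastError() == ERROR_INVALID_PIXEL_FORMAT (2000) lands here.
        return false;
    }
    return true;
}
```

Checking `formatCount` before using `pixelFormat[0]`, and making sure SetPixelFormat is only ever called once per window, rules out the two most common causes of this error.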

result = wglChoosePixelFormatARB(m_deviceContext, attributeListInt, NULL, 1, pixelFormat, &formatCount);

 result = SetPixelFormat(m_deviceContext, pixelFormat[0], &pixelFormatDescriptor);

 

There is no '2' anywhere near the pixel formats, as far as I could find. Am I looking in the wrong place?

Yeah, the '2' is the value inside pixelFormat that I see when I hover over it at a breakpoint in the debugger. I tried using '1' and '2' directly, and '1' works, but I have no idea what they mean. I couldn't find an online table of GL pixel formats and their corresponding integers.

There is no online table because the list of pixel formats depends on your OS, your graphics card, and your graphics driver. You could try http://www.realtech-vr.com/glview/index.html to show information about your system.
Maybe there is also some modern equivalent of DescribePixelFormat you could use, but the easiest way is to think in terms of the features you want from the context and let ChoosePixelFormat (or the newer wglChoosePixelFormatARB for modern OpenGL versions) do that work for you.
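To see what an index like '2' actually means on a given machine, a minimal sketch (assuming a valid window HDC) using DescribePixelFormat to print every format the device context supports:

```cpp
// Sketch: enumerate the device context's pixel formats and print what each
// 1-based index means. DescribePixelFormat fills a PIXELFORMATDESCRIPTOR for
// a given index and returns the total number of formats supported.
#include <windows.h>
#include <stdio.h>

void DumpPixelFormats(HDC deviceContext)
{
    PIXELFORMATDESCRIPTOR pfd;
    // Querying any valid index also returns how many formats exist in total.
    int count = DescribePixelFormat(deviceContext, 1, sizeof(pfd), &pfd);

    for (int i = 1; i <= count; ++i)  // pixel format indices are 1-based
    {
        DescribePixelFormat(deviceContext, i, sizeof(pfd), &pfd);
        printf("format %d: color %d bits, depth %d bits, flags 0x%08lx\n",
               i, pfd.cColorBits, pfd.cDepthBits, (unsigned long)pfd.dwFlags);
    }
}
```

Running this against the window's DC would show whether format 2 exists at all for that DC, and what its bit depths and flags are.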
