knownonlyash

Weird problem using glGetString in Win32


Hi, hope you can help me with this. I'm moving to OpenGL in Win32 and I have created an About dialog box to show GL info about extensions. I've taken the following code from the OpenGL SuperBible glthread.c example, so I know it works, but I get an error when it compiles:

SetDlgItemText(hwnd, IDC_GLEXT, glGetString(GL_EXTENSIONS));

Open Win32.cpp(213) : error C2664: 'SetDlgItemTextA' : cannot convert parameter 3 from 'const unsigned char *' to 'const char *'

Thanks in advance.

Just cast the result from glGetString to a const char *, which is what the error message says SetDlgItemText wants.

reinterpret_cast<const char *>(glGetString(GL_EXTENSIONS))
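For example, the call from the original post would become something like this (a sketch using the hwnd and IDC_GLEXT from that code):

SetDlgItemText(hwnd, IDC_GLEXT,
    reinterpret_cast<const char *>(glGetString(GL_EXTENSIONS)));

glGetString returns a const GLubyte * (i.e. const unsigned char *), so the cast only reinterprets the pointer type; the string data itself is untouched.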


[edited by - Brother Bob on March 3, 2003 8:56:12 AM]

Cool, thanks, it works, but the problem just got weirder - I really hate VC++ sometimes... any alternatives?

I took the source back home from work and ran it, and it's as if glGetString(GL_EXTENSIONS) is still returning the results from my work PC - is it buffering the result somewhere?!

I know it's definitely different because if I run glGetString from a command-line project I get 20 more extensions and the version is different.

Please help.

Thanks again.

What kind of problem do you have? Saying you have a problem without describing it, and then asking for advice, doesn't help us much.

Check which vendor glGetString returns

glGetString( GL_VENDOR );

I'd guess you are using a software-only pixel format, in which case the above will return Microsoft.
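Something like this after the rendering context is made current will show both strings (a sketch, using the same cast as above):

const char *vendor   = reinterpret_cast<const char *>(glGetString(GL_VENDOR));
const char *renderer = reinterpret_cast<const char *>(glGetString(GL_RENDERER));
// The software implementation reports "Microsoft Corporation" / "GDI Generic";
// a hardware-accelerated format reports the card vendor instead.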

Regards

Thanks for getting back to me.

The vendor is Microsoft and the renderer is GDI Generic. I have posted the pixel format code below - is there anything else you need?


  

BOOL bSetupPixelFormat(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd, *ppfd;
    int pixelformat;

    ppfd = &pfd;

    ppfd->nSize = sizeof(PIXELFORMATDESCRIPTOR);
    ppfd->nVersion = 1;
    ppfd->dwFlags = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL |
                    PFD_DOUBLEBUFFER;
    ppfd->dwLayerMask = PFD_MAIN_PLANE;
    ppfd->iPixelType = PFD_TYPE_RGBA;
    ppfd->cColorBits = 24;
    ppfd->cDepthBits = 16;
    ppfd->cAccumBits = 0;
    ppfd->cStencilBits = 0;

    // ChoosePixelFormat only needs to be called once
    if ((pixelformat = ChoosePixelFormat(hdc, ppfd)) == 0)
    {
        MessageBox(NULL, "ChoosePixelFormat failed", "Error", MB_OK);
        return FALSE;
    }

    if (SetPixelFormat(hdc, pixelformat, ppfd) == FALSE)
    {
        MessageBox(NULL, "SetPixelFormat failed", "Error", MB_OK);
        return FALSE;
    }

    return TRUE;
}

You're only getting software rendering as suspected.

Try changing either the depth buffer to 24 or the colour buffer to 16.

The reason for this is that you can normally only use either 32 or 64 bits per pixel in the frame buffer, e.g.

16 colour + 16 depth = 32 bits
or
32 colour + 24 depth (+ optionally 8 stencil) = 64 bits
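Applied to the bSetupPixelFormat posted above, one of the two options would look something like this (a sketch; the alternative is to keep the 16-bit depth buffer and drop cColorBits to 16 instead):

ppfd->cColorBits   = 32;  // was 24
ppfd->cDepthBits   = 24;  // was 16
ppfd->cStencilBits = 8;   // optional, per the 64-bit layout above

Which exact combinations are accelerated depends on the driver, so check GL_VENDOR again after the change.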

BTW - what video hardware are you using?

[edited by - Shag on March 4, 2003 12:24:44 PM]
