Weird problem using glGetString in Win32

Started by
6 comments, last by knownonlyash 21 years, 1 month ago
hi, hope you can help me with this... I'm moving to OpenGL in Win32 and I've created an About dialog box to show GL info about extensions. I've taken the following code from the OpenGL SuperBible glthread.c example, so I know it works, but I get an error when it compiles:

SetDlgItemText(hwnd, IDC_GLEXT, glGetString(GL_EXTENSIONS));

Open Win32.cpp(213) : error C2664: 'SetDlgItemTextA' : cannot convert parameter 3 from 'const unsigned char *' to 'const char *'

thanx in advance
Just cast the result from glGetString to a const char *, which is what the error message says SetDlgItemText wants.
reinterpret_cast<const char *>(glGetString(GL_EXTENSIONS)) 


[edited by - Brother Bob on March 3, 2003 8:56:12 AM]
cool, thanx, it works - but the problem just got weirder. i really hate VC++ sometimes... any alternatives?

i took the source home from work and ran it, and it's as if glGetString(GL_EXTENSIONS) is still returning the results from my work PC - is it buffering the result somewhere?!

i know it's definitely different, because if i run glGetString from a command-line project i get 20 more extensions and the version is different.

please help

thanks again
What kind of problem do you have? Saying you have a problem and asking for advice without giving a description doesn't help us much.
Check which vendor glGetString returns

glGetString( GL_VENDOR );

I'd guess you are using a software-only pixel format, in which case the call above will return Microsoft.

Regards
thanx for getting back to me

the vendor is Microsoft and the renderer is GDI Generic. I have posted the pixel format code below - is there anything else you need?


BOOL bSetupPixelFormat(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd, *ppfd;
    int pixelformat;

    ppfd = &pfd;
    ppfd->nSize = sizeof(PIXELFORMATDESCRIPTOR);
    ppfd->nVersion = 1;
    ppfd->dwFlags = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL |
                    PFD_DOUBLEBUFFER;
    ppfd->dwLayerMask = PFD_MAIN_PLANE;
    ppfd->iPixelType = PFD_TYPE_RGBA;
    ppfd->cColorBits = 24;
    ppfd->cDepthBits = 16;
    ppfd->cAccumBits = 0;
    ppfd->cStencilBits = 0;

    if ((pixelformat = ChoosePixelFormat(hdc, ppfd)) == 0)
    {
        MessageBox(NULL, "ChoosePixelFormat failed", "Error", MB_OK);
        return FALSE;
    }

    if (SetPixelFormat(hdc, pixelformat, ppfd) == FALSE)
    {
        MessageBox(NULL, "SetPixelFormat failed", "Error", MB_OK);
        return FALSE;
    }

    return TRUE;
}
You're only getting software rendering, as suspected.

Try changing either the depth buffer to 24, or the colour buffer to 16.

The reason for this is that the frame buffer can normally use either 32 or 64 bits per pixel, e.g.

16 colour + 16 depth = 32 bits
or
32 colour + 24 depth (+ optionally 8 stencil) = 64 bits

BTW - what video hardware are you using?

[edited by - Shag on March 4, 2003 12:24:44 PM]
cool, it works now - i swapped the PIXELFORMATDESCRIPTOR for one from NeHe's

i have an S3 ViRGE Vivid XS

thanx for all your help

This topic is closed to new replies.
