OpenGL & Win32 - My code doesn't draw with glBegin


Hi, I'm trying to draw a triangle centered where I left clicked with the mouse.

When I click in the window I created, no drawing takes place.

(see below for my OpenGL class code and my WndProc)
In my WndProc message loop I handle WM_LBUTTONDOWN, and I verified with a breakpoint that the message arrives correctly, so I know this is not the problem.
I also checked glGetError() after calling my DrawTriangle function and it returns 0.

It seems like I can't draw in the WM_LBUTTONDOWN case, or perhaps my API calls are incorrectly ordered?

Basically I have a class for OpenGL; it has a protected hWnd, HGLRC, and HDC.
On WM_CREATE, I get the hWnd, then get an HDC with GetDC(hWnd), fill in the pfd structure, call ChoosePixelFormat and SetPixelFormat, then get the rendering context with wglCreateContext and make it current with wglMakeCurrent.

With that I get an OpenGL 3.3 context.
After that I call glClearColor (blue) and it works.

In WM_PAINT, I have

glViewport(0, 0, windowWidth, windowHeight);
glClear(GL_COLOR_BUFFER_BIT);
SwapBuffers(hdc);




and in WM_LBUTTONDOWN I call oRender.DrawTriangle(0.0f, 0.0f), which is:

glClear(GL_COLOR_BUFFER_BIT);

glBegin(GL_TRIANGLES);
glColor3f(1.0,0.0,0.0);

glVertex3f(0.0,0.0,0.0);
glVertex3f(150.0,0.0,0.0);
glVertex3f(75.0,100.0,0.0);

glEnd();
// then I post a message so that Windows calls WM_PAINT (for SwapBuffers())
PostMessage(hWnd, WM_PAINT, 0, 0);






For more info, here is the source of my class:

GLRenderer::GLRenderer(void) { state = 0; }

// CreateContext creates an OpenGL context and attaches it to the window identified by hwnd.
void GLRenderer::CreateContext(HWND hwnd) {
    this->hwnd = hwnd;
    hdc = GetDC(hwnd); // Get the device context for our window

    PIXELFORMATDESCRIPTOR pfd;
    memset(&pfd, 0, sizeof(PIXELFORMATDESCRIPTOR));
    pfd.nSize = sizeof(PIXELFORMATDESCRIPTOR);
    pfd.dwFlags = PFD_DOUBLEBUFFER | PFD_SUPPORT_OPENGL | PFD_DRAW_TO_WINDOW;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;
    pfd.cDepthBits = 32;
    pfd.iLayerType = PFD_MAIN_PLANE;

    int nPixelFormat = ChoosePixelFormat(hdc, &pfd);
    if (nPixelFormat == 0)
        MessageBox(NULL, _T("Call to ChoosePixelFormat failed."), _T("OpenGL Error"), NULL);

    bool bResult = SetPixelFormat(hdc, nPixelFormat, &pfd);
    if (!bResult)
        MessageBox(NULL, _T("Call to SetPixelFormat failed."), _T("OpenGL Error"), NULL);

    hrc = wglCreateContext(hdc);
    wglMakeCurrent(hdc, hrc); // Make the context current and active

    GLenum error = glewInit(); // Initialize GLEW
    if (error != GLEW_OK)
        MessageBox(NULL, _T("Failed to initialize GLEW."), _T("OpenGL Error"), NULL);
}

void GLRenderer::Reshape(int w, int h) {
    windowWidth = w;
    windowHeight = h;
}

void GLRenderer::PrepareScene(void) {
    glClearColor(0.4f, 0.6f, 0.9f, 0.0f);
}

void GLRenderer::RenderScene(void) {
    glViewport(0, 0, windowWidth, windowHeight);
    glClear(GL_COLOR_BUFFER_BIT);

    SwapBuffers(hdc);
}

// I simplified this function to verify that the problem was not caused by the params
void GLRenderer::DrawTriangle(GLfloat x, GLfloat y) {
    wglMakeCurrent(hdc, hrc);

    glClear(GL_COLOR_BUFFER_BIT);

    //glTranslatef(0.0f, 0.0f, -5.0f);

    glBegin(GL_TRIANGLES);
    glColor3f(1.0, 0.0, 0.0);

    glVertex3f(0.0, 0.0, 0.0);
    glVertex3f(150.0, 0.0, 0.0);
    glVertex3f(75.0, 100.0, 0.0);

    glEnd();
}

void GLRenderer::Cleanup(void) {
    wglMakeCurrent(hdc, 0); // Remove the rendering context from the device context
    wglDeleteContext(hrc);  // Delete the rendering context

    ReleaseDC(hwnd, hdc);   // Release the device context from the window
}

GLRenderer::~GLRenderer() {}





and here is my WndProc:

LRESULT CALLBACK WndProc(HWND hWnd, UINT message, WPARAM wParam, LPARAM lParam) {

    switch (message) {

    case WM_CREATE:
        oRender.CreateContext(hWnd);
        oRender.PrepareScene();
        break;

    case WM_PAINT:
        oRender.RenderScene();
        break;

    case WM_SIZE:
        oRender.Reshape(LOWORD(lParam), HIWORD(lParam));
        break;

    /* I/O Handlers - Start */
    case WM_LBUTTONDOWN: {
        // Not using this, just testing with 0.0, 0.0 to simplify
        GLfloat x = LOWORD(lParam);
        GLfloat y = HIWORD(lParam);

        oRender.DrawTriangle(0.0f, 0.0f);
        int error = glGetError();
        std::cout << "OpenGL error number is : " << error << "." << std::endl;
        PostMessage(hWnd, WM_PAINT, 0, 0);
    }
    break;
    /* I/O Handlers - End */

    case WM_DESTROY:
        oRender.Cleanup();
        PostQuitMessage(0);
        break;

    default:
        return DefWindowProc(hWnd, message, wParam, lParam);
    }

    return 0;
}





I tried removing the class and inlining everything with a static hdc and hrc, but that doesn't work either. I was thinking maybe there's a problem with my viewpoint? I also tried calling glFlush() after glEnd(), but that doesn't help.

I'm using GLEW, and oRender is a global GLRenderer instance.

I hope someone will be able to quickly spot my beginner's mistake!
Thanks a lot!!

You don't appear to have a projection matrix set anywhere, and I don't think the default one will actually render anything.

I'm guessing that you expect your coordinates to map to pixels on the screen (they don't by default).

Try this in your OpenGL setup/resize method:

glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0, WINDOW_WIDTH, 0, WINDOW_HEIGHT, -1, 1);
glMatrixMode(GL_MODELVIEW);


This should map your vertex locations to screen pixels.
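For context, here is a rough sketch of what the whole resize path could look like with that projection in place (this assumes your existing Reshape method, the windowWidth/windowHeight members, and the legacy fixed-function pipeline):

// Sketch only - resize handler that keeps the viewport and an
// orthographic projection in sync with the client area.
void GLRenderer::Reshape(int w, int h) {
    windowWidth = w;
    windowHeight = h;

    // Match the viewport to the new client area
    glViewport(0, 0, windowWidth, windowHeight);

    // Map x to [0, width] and y to [0, height] so vertex coordinates are pixels
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0, windowWidth, 0, windowHeight, -1, 1);

    // Leave the modelview matrix current for the drawing code
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
}

With that in place, the 150/100-unit coordinates in DrawTriangle land inside the window instead of far outside the default -1..1 range.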

Also, I'm not sure what you think clearing the color buffer does, but you're clearing the screen right before you swap buffers, so even if your triangle did draw, it would be erased as soon as RenderScene runs.


void GLRenderer::RenderScene(void) {
    glViewport(0, 0, windowWidth, windowHeight);
    glClear(GL_COLOR_BUFFER_BIT);   // <--- this clears whatever was drawn before the swap

    SwapBuffers(hdc);
}
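If it helps, here is a rough sketch of a RenderScene that clears, draws, and then swaps, in that order (it just reuses the coordinates from your DrawTriangle; whether the drawing lives here or in a call to DrawTriangle is up to you):

// Sketch: clear first, then draw, then swap, all in one place,
// so nothing erases the triangle before it reaches the screen.
void GLRenderer::RenderScene(void) {
    glViewport(0, 0, windowWidth, windowHeight);
    glClear(GL_COLOR_BUFFER_BIT);

    glBegin(GL_TRIANGLES);
    glColor3f(1.0f, 0.0f, 0.0f);
    glVertex3f(0.0f, 0.0f, 0.0f);
    glVertex3f(150.0f, 0.0f, 0.0f);
    glVertex3f(75.0f, 100.0f, 0.0f);
    glEnd();

    SwapBuffers(hdc);
}

Then the mouse handler only needs to record the click position and request a repaint, instead of issuing GL calls of its own.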



If you clear and then swap buffers without drawing anything in between, nothing will show up.

Also, I believe his triangle will still draw in clip space even if you don't specify anything.

If you don't set the projection matrix, your vertices are assumed to be in normalized device coordinates, which go from -1 to 1 on all three axes. So his triangle might still draw something to the screen, since it has a vertex at (0,0,0), but not anything like what he expects.
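For example, here is a quick sketch with no projection matrix set at all: coordinates kept inside the -1 to 1 range show up on screen, while the 150/75-unit triangle above falls almost entirely outside that cube:

// Sketch: with the default (identity) matrices, coordinates are taken as
// normalized device coordinates, so keep them within [-1, 1].
glClear(GL_COLOR_BUFFER_BIT);

glBegin(GL_TRIANGLES);
glColor3f(1.0f, 0.0f, 0.0f);
glVertex3f(-0.5f, -0.5f, 0.0f);
glVertex3f( 0.5f, -0.5f, 0.0f);
glVertex3f( 0.0f,  0.5f, 0.0f);
glEnd();

SwapBuffers(hdc);  // swap after drawing, not before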

