Archived

This topic is now archived and is closed to further replies.

Julio

CVA Problems


I'm trying to implement compiled vertex arrays in my engine. My first attempt failed, and my second attempt won't even run. I've borrowed initialization code from glVelocity, and when I get it working I plan on rewriting it. Anyway, the CVAs won't initialize. My hardware (GeForce 256) has the correct extension and is capable of CVAs. Is there anything wrong in the code?
    
GLLOCKARRAYSEXTPROC          m_glLockArraysEXT          = 0;
GLUNLOCKARRAYSEXTPROC        m_glUnlockArraysEXT        = 0;
GLARRAYELEMENTEXTPROC        m_glArrayElementEXT        = 0;
GLCOLORPOINTEREXTPROC        m_glColorPointerEXT        = 0;
GLDRAWARRAYSEXTPROC          m_glDrawArraysEXT          = 0;
GLEDGEFLAGPOINTEREXTPROC     m_glEdgeFlagPointerEXT     = 0;
GLGETPOINTERVEXTPROC         m_glGetPointervEXT         = 0;
GLINDEXPOINTEREXTPROC        m_glIndexPointerEXT        = 0;
GLNORMALPOINTEREXTPROC       m_glNormalPointerEXT       = 0;
GLTEXCOORDPOINTEREXTPROC     m_glTexCoordPointerEXT     = 0;
GLVERTEXPOINTEREXTPROC       m_glVertexPointerEXT       = 0;
GLCLIENTACTIVETEXTUREARBPROC m_glClientActiveTextureARB = 0;
  
and the initialization:
      
	m_glArrayElementEXT = (GLARRAYELEMENTEXTPROC) 
		wglGetProcAddress("glArrayElementEXT");
	m_glColorPointerEXT = (GLCOLORPOINTEREXTPROC) 
		wglGetProcAddress("glColorPointerEXT");
	m_glDrawArraysEXT = (GLDRAWARRAYSEXTPROC) 
		wglGetProcAddress("glDrawArraysEXT");
	m_glEdgeFlagPointerEXT = (GLEDGEFLAGPOINTEREXTPROC) 
		wglGetProcAddress("glEdgeFlagPointerEXT");
	m_glGetPointervEXT = (GLGETPOINTERVEXTPROC) 
		wglGetProcAddress("glGetPointervEXT");
	m_glIndexPointerEXT = (GLINDEXPOINTEREXTPROC) 
		wglGetProcAddress("glIndexPointerEXT");
	m_glNormalPointerEXT = (GLNORMALPOINTEREXTPROC) 
		wglGetProcAddress("glNormalPointerEXT");
	m_glTexCoordPointerEXT = (GLTEXCOORDPOINTEREXTPROC) 
		wglGetProcAddress("glTexCoordPointerEXT");
	m_glVertexPointerEXT = (GLVERTEXPOINTEREXTPROC) 
		wglGetProcAddress("glVertexPointerEXT");

	if (m_glArrayElementEXT  == 0 || m_glColorPointerEXT    == 0 || // here it returns false, everything is 0
		m_glDrawArraysEXT    == 0 || m_glEdgeFlagPointerEXT == 0 ||
		m_glGetPointervEXT   == 0 || m_glIndexPointerEXT    == 0 ||
		m_glNormalPointerEXT == 0 || m_glTexCoordPointerEXT == 0 ||
		m_glVertexPointerEXT == 0)
	{
		return false;
	}

	m_glLockArraysEXT = (GLLOCKARRAYSEXTPROC) 
		wglGetProcAddress("glLockArraysEXT");
	m_glUnlockArraysEXT = (GLUNLOCKARRAYSEXTPROC) 
		wglGetProcAddress("glUnlockArraysEXT");

	if (m_glLockArraysEXT == 0 || m_glUnlockArraysEXT == 0)
		return false;

	m_glClientActiveTextureARB = (GLACTIVETEXTUREARBPROC) 
		wglGetProcAddress("glClientActiveTextureARB");

	if (m_glClientActiveTextureARB == 0)
		return false;

	return true;
    
Everything compiles fine. Yes, I have glext.h included. If anybody knows what might be wrong, that'd be great.

Thanks,
Joe

Never underestimate the power of stupid people in large groups.

Edited by - Julio on July 30, 2001 9:40:09 PM

Almost all of those functions are already part of OpenGL 1.1, so there is no need to get their addresses. You are using prototypes like
GLLOCKARRAYSEXTPROC m_glLockArraysEXT = 0;
but my glext.h has
PFNGLLOCKARRAYSEXTPROC m_glLockArraysEXT = 0;

In the last one you are casting to (GLACTIVETEXTUREARBPROC) while calling wglGetProcAddress("glClientActiveTextureARB"); you probably meant wglGetProcAddress("glActiveTextureARB").

Well, all of those are defined in glext.h. Why don't I need to get the proc addresses? Could you explain more?
thanks,
Joe



Never mind, I think I see what you're saying. I found a great tutorial on this; it doesn't look like it has to be that hard. I think I'm just going to rewrite all of the code.



OK, I rewrote the code just to get something to show on the screen. This is my code:
      
bool CVertexArray::Init()
{
	glLockArraysEXT = (PFNGLLOCKARRAYSEXTPROC) wglGetProcAddress("glLockArraysEXT");
	glUnlockArraysEXT = (PFNGLUNLOCKARRAYSEXTPROC) wglGetProcAddress("glUnlockArraysEXT");

	if (glLockArraysEXT == NULL || glUnlockArraysEXT == NULL)
	{
		return false;
	}
	return true;
}

and the render:
     
void CVertexArray::Render(float *pIndicies, float *pTexCoords, unsigned int iVertexCount)
{
	glEnableClientState(GL_VERTEX_ARRAY);
	glEnableClientState(GL_TEXTURE_COORD_ARRAY);

	glVertexPointer(3, GL_FLOAT, 0, pIndicies);
	glTexCoordPointer(2, GL_FLOAT, 0, pTexCoords);

	glLockArraysEXT(0, iVertexCount);

	// Render
	glDrawArrays(GL_TRIANGLES, 0, 3);

	glDisableClientState(GL_TEXTURE_COORD_ARRAY);
	glUnlockArraysEXT();
}

I plan on getting rid of redundant state changes later, but for now I just want it to work. The problem is that after initialization both glLockArraysEXT and glUnlockArraysEXT are 0 (NULL). Why? It works fine in other programs I run. At the top of my class .cpp file I declare them:
  
typedef void (APIENTRY *PFNGLLOCKARRAYSEXTPROC) (int first, int count);
typedef void (APIENTRY *PFNGLUNLOCKARRAYSEXTPROC) (void);

PFNGLLOCKARRAYSEXTPROC glLockArraysEXT = NULL;
PFNGLUNLOCKARRAYSEXTPROC glUnlockArraysEXT = NULL;

thanks,
Joe



Edited by - Julio on August 1, 2001 1:40:47 PM

Edited by - Julio on August 1, 2001 1:42:02 PM

You need a current OpenGL rendering context before calling wglGetProcAddress. Here is everything about OpenGL extensions:
http://www.opengl.org/developers/code/features/OGLextensions/OGLextensions.html
