Access Violation upon calling an OpenGL Extension


Old topic!
Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation start a new topic.

#1 stevo86   Members   -  Reputation: 150


Posted 16 April 2010 - 08:35 AM

This is a very strange problem that I cannot seem to resolve. I've created an absolute basic windows program along the following lines:
	WNDCLASS wc;
	PIXELFORMATDESCRIPTOR pfd;
	int format;

	memset(&wc, 0, sizeof(WNDCLASS));
	memset(&pfd, 0, sizeof(PIXELFORMATDESCRIPTOR));
	
	wc.hbrBackground = GetSysColorBrush(COLOR_3DFACE);
	wc.hCursor = LoadCursor(NULL, IDC_ARROW);
	wc.hIcon = LoadIcon(NULL, IDI_APPLICATION);
	wc.lpfnWndProc = os_proc;
	wc.lpszClassName = "BaseWindow";
	wc.style = CS_OWNDC | CS_VREDRAW | CS_HREDRAW;

	if (!RegisterClass(&wc))
	{
		return false;
	}

	if (!(gwnd = CreateWindow("BaseWindow", "Window", WS_VISIBLE | WS_OVERLAPPEDWINDOW, CW_USEDEFAULT, CW_USEDEFAULT, width, height, NULL, NULL, ginstance, NULL)))
	{
		return false;
	}

	gdc = GetDC(gwnd);

	if (!gdc)
	{
		return false;
	}

	pfd.nVersion = 1;
	pfd.nSize = sizeof(PIXELFORMATDESCRIPTOR);
	pfd.iLayerType = PFD_MAIN_PLANE;
	pfd.iPixelType = PFD_TYPE_RGBA;
	pfd.cColorBits = 32;
	pfd.cStencilBits = 8;
	pfd.cDepthBits = 24;
	pfd.dwFlags = PFD_DOUBLEBUFFER | PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL;

	format = ChoosePixelFormat(gdc, &pfd);
	if (!SetPixelFormat(gdc, format, &pfd))
	{
		return false;
	}

	grc = wglCreateContext(gdc);
	wglMakeCurrent(gdc, grc);

	glViewport(0, 0, width, height);
	glMatrixMode(GL_PROJECTION);
	glLoadIdentity();

	gluOrtho2D(0, width, 0, height);

	glMatrixMode(GL_MODELVIEW);
	glLoadIdentity();

	glEnable(GL_DEPTH_TEST);
	glDepthFunc(GL_LEQUAL);
	glClearDepth(1.0f);
	glClearColor(0, 0, 0, 0);

	GLenum ret = glewInit();

	if (ret != GLEW_OK)
	{
		return false;
	}

	if (!GLEW_ARB_vertex_program)
	{
		printf("Vertex shaders are not supported\n");
		return false;
	}
	else
	{
		printf("Vertex shaders are supported\n");
	}

	unsigned int id = glCreateShaderObjectARB(GL_FRAGMENT_SHADER_ARB);
Now the odd part is that even though I'm calling glewInit(), verifying that it succeeds, and verifying that I do have shader support, the second I run the program I get the following error: "Unhandled exception in program.exe: access violation at 0xC0000005". When I hit OK, Visual Studio highlights the following line as the cause of the problem:
	unsigned int id = glCreateShaderObjectARB(GL_FRAGMENT_SHADER_ARB);
Anyone know the cause of this? I'm using an HP Pavilion d2035us laptop with an Intel 945 Express chipset.


#2 HuntsMan   Members   -  Reputation: 368


Posted 16 April 2010 - 08:51 AM

GL_ARB_vertex_program is for assembly shaders, not for GLSL. You want GL_ARB_vertex_shader and GL_ARB_shading_language_100.

#3 stevo86   Members   -  Reputation: 150


Posted 16 April 2010 - 08:54 AM

That seems to be it. Cheers!



