Lighting sparse meshes

25 comments, last by PrestoChung 13 years, 4 months ago
Ah, yes, thanks for catching that. The function in the tutorial had source as a GLchar**; I changed it to GLchar* and missed changing that part.

Doesn't seem to be related to my crash error though.

How do I check that my video card supports it? I have a newer card, a GeForce GTS 250 I think.
glGetString(GL_EXTENSIONS) gives you all supported extensions.
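Since glGetString(GL_EXTENSIONS) returns one big space-separated string, you have to match whole tokens when you search it. A minimal sketch of such a check (the sample strings in the usage are made up; in a real program you would pass the pointer glGetString returns):

```cpp
#include <cstring>

// Check for one extension token inside the space-separated string that
// glGetString(GL_EXTENSIONS) returns.
bool HasExtension(const char* extensions, const char* name)
{
    // Match whole tokens only, so "GL_ARB_shader" does not accidentally
    // match "GL_ARB_shader_objects".
    const std::size_t len = std::strlen(name);
    const char* p = extensions;
    while ((p = std::strstr(p, name)) != nullptr) {
        const bool startsOk = (p == extensions) || (p[-1] == ' ');
        const bool endsOk   = (p[len] == ' ') || (p[len] == '\0');
        if (startsOk && endsOk)
            return true;
        p += len;
    }
    return false;
}
```

For example, HasExtension(allExts, "GL_ARB_shader_objects") tells you whether glShaderSourceARB and friends should be available.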

Try hardcoding your shader code in the program to see if that's the problem.
------------------------------
Join the revolution and get a free donut!
Oh, yeah, I have looked at that, and GLEW has pulled in the proper extensions as far as I can see. I think it would tell me that the function is undefined if that were the problem.

Edit:
I just noticed in the debugger that the vshadesource and fshadesource pointers are still at 0 when I get to glShaderSourceARB, so it must be a failure of the loading function. I'll step through it and see what I can find out.
Fixed by passing the source pointer by reference. It looks funny though:
Sint32 LoadShader(char* filename_, GLchar*& source_)


Compiled successfully. Now I just need to get glDrawElements to work without crashing.

Edit: Ack! Forgot to create the shader program :)
It's working now but I'm a little confused because:

Even if my vert shader or frag shader does not compile, or contains an empty main function that does nothing, I still get some drawing.

I set 3 attrib pointers (interleaved), but nowhere do I specify that the first one is the position.

Does OpenGL automatically use the first attribute as the vertex position?

Otherwise I'm a little confused as to how it could know how to render my VBO data.

Might this have something to do with it? Quote from http://www.opengl.org/sdk/docs/tutorials/ClockworkCoders/attributes.php

Quote: In other words, on NVidia hardware these indices are reserved for built-in attributes:
gl_Vertex 0
gl_Normal 2
gl_Color 3
gl_SecondaryColor 4
gl_FogCoord 5
gl_MultiTexCoord0 8
gl_MultiTexCoord1 9
gl_MultiTexCoord2 10
gl_MultiTexCoord3 11
gl_MultiTexCoord4 12
gl_MultiTexCoord5 13
gl_MultiTexCoord6 14
gl_MultiTexCoord7 15


When I enable and set an attribute pointer
glEnableVertexAttribArray(0);
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, sizeof(TexNormVert), 0);

in this case attribute 0: whenever I use gl_Vertex in my shader source, is it looking at this attribute?

It seems to be so. When I use attribute 8 for the texture coordinate part of my Vertex
glEnableVertexAttribArray(8);
glVertexAttribPointer(8, 2, GL_FLOAT, GL_FALSE, sizeof(TexNormVert), (GLvoid*)(sizeof(GLfloat) * 6));

I suddenly get texture to display properly!
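A side note on those byte offsets: computing them with offsetof is safer than sizeof arithmetic, and it also sidesteps the sizeof(GL_FLOAT) trap (GL_FLOAT is an enum constant, so sizeof gives the size of an int, which only happens to equal sizeof(GLfloat) on common platforms). A sketch, with the TexNormVert layout assumed from the strides used in this thread (3 position floats, 3 normal floats, 2 texture coordinates):

```cpp
#include <cstddef>

// Assumed layout of the interleaved vertex used in this thread.
struct TexNormVert {
    float pos[3];   // attribute 0
    float norm[3];  // attribute 1
    float tex[2];   // attribute 2
};

// Byte offsets to pass as the last argument of glVertexAttribPointer
// (cast to GLvoid*). offsetof tracks the struct if fields ever change.
const std::size_t kPosOffset  = offsetof(TexNormVert, pos);   // 0
const std::size_t kNormOffset = offsetof(TexNormVert, norm);  // 12
const std::size_t kTexOffset  = offsetof(TexNormVert, tex);   // 24
```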

This shader code is generating an error C5052: gl_MultiTexCoord0 is not accessible in this profile

But at the same time the texture is rendering! I'm really mystified as to how this is possible.

vertshader.vert
void main()
{
    gl_Position = ftransform();
}
fragshader.frag
uniform sampler2D tex;
void main()
{
    gl_FragColor = texture2D(tex, gl_MultiTexCoord0);
}
If you use glVertexAttribPointer, do not use gl_Vertex in your shader.
NEVER use ftransform().

You must declare the attribute in your shader as shown here:
http://www.opengl.org/wiki/GlVertexAttribPointer
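Dropping ftransform() means building the projection-modelview matrix yourself on the CPU and uploading it with glUniformMatrix4fv, as the glhlib snippet below does. A minimal sketch of the column-major multiply involved (helper names are made up; column-major is the layout glUniformMatrix4fv expects when transpose is GL_FALSE):

```cpp
// Multiply two 4x4 column-major matrices: out = a * b.
void Mat4Multiply(const float a[16], const float b[16], float out[16])
{
    for (int col = 0; col < 4; ++col)
        for (int row = 0; row < 4; ++row) {
            float sum = 0.0f;
            for (int k = 0; k < 4; ++k)
                sum += a[k * 4 + row] * b[col * 4 + k];
            out[col * 4 + row] = sum;
        }
}

// Column-major translation matrix: the translation sits in elements 12..14.
void Mat4Translation(float x, float y, float z, float out[16])
{
    for (int i = 0; i < 16; ++i)
        out[i] = (i % 5 == 0) ? 1.0f : 0.0f;  // identity
    out[12] = x;
    out[13] = y;
    out[14] = z;
}
```

You would then upload the result with glUniformMatrix4fv(location, 1, GL_FALSE, out) and do the transform in the vertex shader, as in the #version 150 shaders later in this thread.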
Sig: http://glhlib.sourceforge.net
an open source GLU replacement library. Much more modern than GLU.
float matrix[16], inverse_matrix[16];
glhLoadIdentityf2(matrix);
glhTranslatef2(matrix, 0.0, 0.0, 5.0);
glhRotateAboutXf2(matrix, angleInRadians);
glhScalef2(matrix, 1.0, 1.0, -1.0);
glhQuickInvertMatrixf2(matrix, inverse_matrix);
glUniformMatrix4fv(uniformLocation1, 1, FALSE, matrix);
glUniformMatrix4fv(uniformLocation2, 1, FALSE, inverse_matrix);
Edit: updated a bit. Still no drawing visible:

Here is setup after shaders are compiled:
Shader_Program = glCreateProgramObjectARB();
glAttachObjectARB(Shader_Program, vshadehandle);
glAttachObjectARB(Shader_Program, fshadehandle);
glBindAttribLocationARB(Shader_Program, 0, "inVertex");
glBindAttribLocationARB(Shader_Program, 1, "inNormal");
glBindAttribLocationARB(Shader_Program, 2, "inTexCoord0");
glBindFragDataLocation(Shader_Program, 0, "outFragColor");
glLinkProgramARB(Shader_Program);
glUseProgramObjectARB(Shader_Program);


Here is building VAO and VBO for each object:
glGenVertexArrays(1, &Vao_Handle);
glBindVertexArray(Vao_Handle);
glGenBuffersARB(1, &Vbo_Handle);
glBindBufferARB(GL_ARRAY_BUFFER_ARB, Vbo_Handle);
glBufferDataARB(GL_ARRAY_BUFFER_ARB, sizeof(vertices[0]) * vertices.size(), &vertices[0], GL_STATIC_DRAW_ARB);
glEnableVertexAttribArray(0);
glEnableVertexAttribArray(1);
glEnableVertexAttribArray(2);
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, sizeof(TexNormVert), 0);
glVertexAttribPointer(1, 3, GL_FLOAT, GL_TRUE, sizeof(TexNormVert), (GLvoid*)(sizeof(GLfloat) * 3));
glVertexAttribPointer(2, 2, GL_FLOAT, GL_FALSE, sizeof(TexNormVert), (GLvoid*)(sizeof(GLfloat) * 6));
glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB, Vbo_Element_Index);
glBindVertexArray(0);
Then the drawing step:
glBindVertexArray(Vao_Handle);
glDrawElements(GL_TRIANGLES, NumIndices(Portal_Walls.size()), GL_UNSIGNED_BYTE, 0);
glBindVertexArray(0);

And right now the shaders are like so:
/////////// Vertex Shader
#version 150
uniform mat4 ProjectionModelviewMatrix;
in vec4 inVertex;
in vec2 inTexCoord0;
out vec2 outTexCoord0;
void main()
{
    gl_Position = ProjectionModelviewMatrix * inVertex;
    outTexCoord0 = inTexCoord0;
}

/////////// Fragment Shader
#version 150
uniform sampler2D tex;
in vec2 outTexCoord0;
out vec4 outFragColor;
vec4 mycolor;
void main()
{
    mycolor = texture2D(tex, outTexCoord0);
    outFragColor = mycolor;
}


Screen is blank, no drawing at all.

[Edited by - PrestoChung on December 12, 2010 8:22:58 AM]
Why do you use GL_UNSIGNED_BYTE?
Quote:Original post by V-man
Why do you use GL_UNSIGNED_BYTE?


That's the type of my element arrays.

GLubyte indices0[] = {  3, 1, 0, ... };
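The type argument of glDrawElements has to match the element array's storage, and GL_UNSIGNED_BYTE indices can only address vertices 0..255. A tiny sketch of picking the matching type from the vertex count (the enum values are copied from gl.h so this compiles without GL headers; the helper name is made up):

```cpp
#include <cstddef>

// Values copied from gl.h, so this sketch compiles without GL headers.
enum {
    GL_UNSIGNED_BYTE  = 0x1401,
    GL_UNSIGNED_SHORT = 0x1403,
    GL_UNSIGNED_INT   = 0x1405
};

// Pick the smallest index type that can address every vertex in the VBO.
unsigned IndexTypeFor(std::size_t vertexCount)
{
    if (vertexCount <= 256)
        return GL_UNSIGNED_BYTE;   // GLubyte indices, as in this thread
    if (vertexCount <= 65536)
        return GL_UNSIGNED_SHORT;  // GLushort
    return GL_UNSIGNED_INT;        // GLuint
}
```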

This topic is closed to new replies.
