
## error with separated matrices and GLSL


### #1 TheCreeperLawyer (Members)

Posted 01 September 2012 - 06:57 AM

Hello,

I recently started playing around with OpenGL in C++ and, after getting some of the basics down, a simple GLSL implementation. I tried to add very slight random noise to some simple shaded polygons in a pyramid, where the colors are specified at the vertices (I don't know the proper name for this).

I made it use gl_FragCoord to get the coordinates for calculating the noise value, but that uses screen pixels, so the noise stays the same no matter the perspective you're looking at the model from. I followed the description of a better method in the top answer here, but it doesn't seem to be working: I can't move the shape around, it's stuck at z=0, and I can't change any camera perspectives. I think this may be an error in my GLSL, as I'm totally new to it and it's quite alien to me.

Here's an image of my problem. The first is what it's meant to look like (without the noise); the second is what it actually looks like:

Here is my code:

frag.glsl:

```glsl
varying vec4 verpos;

float rand2d(vec2 x)
{
    uint n = floatBitsToUint(x.y * 214013.0 + x.x * 2531011.0);
    n = n * (n * n * 15731u + 789221u);
    n = (n >> 9u) | 0x3F800000u;
    return (2.0 - uintBitsToFloat(n));
}

float rand3d(vec3 x)
{
    uint n = floatBitsToUint(x.y * 214013.0 + x.x * 2531011.0 + x.z * 27644437.0);
    n = n * (n * n * 15731u + 789221u);
    n = (n >> 9u) | 0x3F800000u;
    return (2.0 - uintBitsToFloat(n));
}

void main(void)
{
    vec4 outColor = gl_Color;
    outColor *= (0.2 * rand3d(vec3(verpos.x, verpos.y, verpos.z)) + 0.9);
    outColor.w = 1.0;
    gl_FragColor = outColor;
}
```


vert.glsl:

```glsl
uniform mat4 view_matrix;
uniform mat4 model_matrix;

varying vec4 verpos;

void main()
{
    verpos = model_matrix * gl_Vertex;
    gl_FontColor = gl_Color;
    gl_Position = gl_ProjectionMatrix * view_matrix * model_matrix * gl_Vertex;
}
```


and my render method where I separate my matrices:

```cpp
void App::render(){
    start = SDL_GetTicks();
    //Clear the pixel/depth buffer(s)
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    //Create model matrix and pass to shader
    glTranslatef(x, y, -6.0f); //Modify the world matrix
    GLfloat model_matrix_array[16];
    glGetFloatv(GL_MODELVIEW_MATRIX, model_matrix_array);
    glUniformMatrix4fv(model_matrix, 1, GL_FALSE, model_matrix_array);
    //Create view matrix and pass to shader
    glRotatef(roty, 1.0f, 0.0f, 0.0f); //Modify the view matrix
    glRotatef(rotx, 0.0f, 1.0f, 0.0f); //Modify the view matrix
    GLfloat view_matrix_array[16];
    glGetFloatv(GL_MODELVIEW_MATRIX, view_matrix_array);
    glUniformMatrix4fv(view_matrix, 1, GL_FALSE, view_matrix_array);

    //Sides of the pyramid
    glBegin(GL_TRIANGLE_FAN);
    glColor4f(1.0f, 1.0f, 1.0f, 1.0f); glVertex3f( 0.0f,  0.0f,  1.0f);
    glColor4f(0.0f, 1.0f, 0.0f, 1.0f); glVertex3f(-1.0f,  1.0f, -1.0f);
    glColor4f(1.0f, 0.0f, 0.0f, 1.0f); glVertex3f( 1.0f,  1.0f, -1.0f);
    glColor4f(1.0f, 1.0f, 0.0f, 1.0f); glVertex3f( 1.0f, -1.0f, -1.0f);
    glColor4f(0.0f, 1.0f, 1.0f, 1.0f); glVertex3f(-1.0f, -1.0f, -1.0f);
    glColor4f(0.0f, 1.0f, 0.0f, 1.0f); glVertex3f(-1.0f,  1.0f, -1.0f);
    glEnd();
    //Base of the pyramid
    glBegin(GL_TRIANGLE_FAN);
    glColor4f(0.0f, 1.0f, 0.0f, 1.0f); glVertex3f(-1.0f,  1.0f, -1.0f);
    glColor4f(1.0f, 0.0f, 0.0f, 1.0f); glVertex3f( 1.0f,  1.0f, -1.0f);
    glColor4f(1.0f, 1.0f, 0.0f, 1.0f); glVertex3f( 1.0f, -1.0f, -1.0f);
    glColor4f(0.0f, 1.0f, 1.0f, 1.0f); glVertex3f(-1.0f, -1.0f, -1.0f);
    glEnd();
    SDL_GL_SwapBuffers();
    frames++;
    current = SDL_GetTicks();
}
```


I'm sorry this is such a big post, but I really hope it doesn't put people off and that someone can help me. I would really appreciate it.

Thanks.

### #2 Goran Milovanovic (Members)

Posted 01 September 2012 - 05:09 PM

> I think this may be an error in my GLSL, as I'm totally new to it and it's quite alien to me.

You're using "gl_FontColor" rather than the correct "gl_FrontColor" ... It could be only that, or maybe that's just one part of the overall problem.

To see GLSL errors, you have to get them after the shader compilation process, using glGetShaderInfoLog.
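A minimal sketch of that check, in case it helps (`shader` and `printShaderLog` are just illustrative names; GL headers/loader assumed to be included elsewhere):

```cpp
#include <cstdio>
#include <vector>

// Query compile status, then fetch and print the info log on failure.
// "shader" is an id from glCreateShader, after glCompileShader has run.
void printShaderLog(GLuint shader)
{
    GLint status = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &status);
    if (status != GL_TRUE) {
        GLint length = 0;
        glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &length);
        std::vector<GLchar> log(length > 0 ? length : 1);
        glGetShaderInfoLog(shader, (GLsizei)log.size(), NULL, log.data());
        std::printf("Shader compile failed:\n%s\n", log.data());
    }
}
```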

Also, you might want to steer clear of immediate mode (the method you're using to pass geometry and other data, with glBegin/glEnd and glVertex/glColor calls). There are links on the front page of the OpenGL forum leading to relevant documentation, which describes how to allocate the appropriate vertex/color/texcoord buffers.
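For comparison, a rough sketch of the buffered approach, still using the fixed-function vertex/color arrays so it lines up with the code above (all names illustrative):

```cpp
// Interleaved data for the first fan: x, y, z, r, g, b, a per vertex.
GLfloat verts[] = {
     0.0f,  0.0f,  1.0f,   1.0f, 1.0f, 1.0f, 1.0f,
    -1.0f,  1.0f, -1.0f,   0.0f, 1.0f, 0.0f, 1.0f,
     1.0f,  1.0f, -1.0f,   1.0f, 0.0f, 0.0f, 1.0f,
     1.0f, -1.0f, -1.0f,   1.0f, 1.0f, 0.0f, 1.0f,
    -1.0f, -1.0f, -1.0f,   0.0f, 1.0f, 1.0f, 1.0f,
    -1.0f,  1.0f, -1.0f,   0.0f, 1.0f, 0.0f, 1.0f,
};

// Once, at init time: upload the data into a buffer object.
GLuint vbo;
glGenBuffers(1, &vbo);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);

// Each frame: point the arrays into the buffer and draw the whole fan at once.
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_COLOR_ARRAY);
glVertexPointer(3, GL_FLOAT, 7 * sizeof(GLfloat), (const void*)0);
glColorPointer(4, GL_FLOAT, 7 * sizeof(GLfloat), (const void*)(3 * sizeof(GLfloat)));
glDrawArrays(GL_TRIANGLE_FAN, 0, 6);
glDisableClientState(GL_COLOR_ARRAY);
glDisableClientState(GL_VERTEX_ARRAY);
```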


### #3 TheCreeperLawyer (Members)

Posted 01 September 2012 - 07:12 PM

Thanks for that.

I'm having trouble displaying the info log, though. The issue is that I don't know how to convert from a GLchar* to an LPCWSTR for use in MessageBox(). I'm not using ATL or MFC.

Any help there would be greatly appreciated as converting between strings has given me a headache before.

Thanks.

### #4 Goran Milovanovic (Members)

Posted 02 September 2012 - 02:57 AM

You would have to convert the basic C string into a wchar (wide char) ... I think.
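If you do want the wide string, something along these lines should work (a rough sketch; `widen` is just an illustrative name, and error handling is minimal):

```cpp
#include <windows.h>
#include <string>

// Convert a narrow C string to a std::wstring via MultiByteToWideChar.
std::wstring widen(const char* s)
{
    // First call: ask for the required length, including the terminator.
    int len = MultiByteToWideChar(CP_ACP, 0, s, -1, NULL, 0);
    if (len <= 0)
        return std::wstring();
    std::wstring out(len, L'\0');
    MultiByteToWideChar(CP_ACP, 0, s, -1, &out[0], len);
    out.resize(len - 1); // drop the embedded terminating null
    return out;
}

// Usage: MessageBoxW(NULL, widen(infoLog).c_str(), L"Shader log", MB_OK);
```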

Why not simply use printf?


### #5 TheCreeperLawyer (Members)

Posted 02 September 2012 - 02:58 AM

I could use printf(), but for the sake of my learning, how do I convert a C string to wchar?

### #6 Hodgman (Moderators)

Posted 02 September 2012 - 03:14 AM

> I'm having trouble displaying the info log, though. The issue is that I don't know how to convert from a GLchar* to an LPCWSTR for use in MessageBox().

You can choose whether you're using the Unicode or ASCII version of the Windows API in your project properties, with the "Character set" property:

- Use Multi-Byte Character Set = UTF-8/ASCII (char* arguments).
- Use Unicode Character Set = UTF-16 (wchar* arguments).
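And regardless of that project-wide setting, you can always call the ANSI entry point explicitly. A one-liner, assuming `infoLog` is the char buffer you read back:

```cpp
// MessageBoxA takes plain char* directly, whatever "Character set" is set to.
MessageBoxA(NULL, infoLog, "Shader info log", MB_OK);
```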

### #7 TheCreeperLawyer (Members)

Posted 02 September 2012 - 04:12 AM

I did some research and managed to write a function to convert from char* to wchar* and then to wstring. Unless I've done something wrong, it appears that there is NO debug information at all after compiling the shaders. So this brings me back to my original problem.

Thanks for the help though. I really appreciate it!

### #8 Goran Milovanovic (Members)

Posted 02 September 2012 - 12:25 PM

Be sure to test the functions that get/print the info log, by making an intentional mistake somewhere in your GLSL code. You should definitely see something at that point, and if not, then it's likely that your code for getting/displaying the info log is broken.
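For example, something like this dropped into frag.glsl is guaranteed to fail compilation and should show up in the log:

```glsl
// Intentional error: undeclared identifier, so the compiler must complain.
vec4 broken = this_variable_does_not_exist;
```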

